Abstract
As a challenging attempt to replace a traditional data structure with a learned model, this paper proposes a learned functional Bloom filter (L-FBF) for key-value storage. The learned model in the proposed L-FBF learns the characteristics and distribution of the given data and classifies each input. Theoretical analysis shows that the L-FBF provides a lower search failure rate than a single FBF of the same memory size while providing the same semantic guarantees. For model training, character-level neural networks with pretrained embeddings are used. In experiments, four different character-level neural networks are trained: a single gated recurrent unit (GRU), two GRUs, a single long short-term memory (LSTM), and a single one-dimensional convolutional neural network (1D-CNN). The experimental results confirm the theoretical analysis and show that the L-FBF reduces search failures by 82.8% to 83.9% compared with a single FBF using the same amount of memory.
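The abstract does not give implementation details, so the following is only a minimal Python sketch of one way a learned classifier could sit in front of a backup functional Bloom filter, in the spirit of the L-FBF described above. The simplified FBF (cells holding a value or a conflict mark), the parameters `NUM_CELLS` and `NUM_HASHES`, and the `model_predict` placeholder are illustrative assumptions, not the paper's construction.

```python
# Minimal sketch of a learned functional Bloom filter (L-FBF) lookup path.
# Not the authors' implementation: the backup structure is a simplified FBF,
# and model_predict stands in for the trained character-level network
# (GRU / LSTM / 1D-CNN). All names and sizes below are assumptions.
import hashlib

NUM_CELLS = 1 << 14          # number of value cells (assumed size)
NUM_HASHES = 3               # hash functions per key (assumed)
EMPTY, CONFLICT = None, "<conflict>"

def _hashes(key: str):
    """Derive NUM_HASHES cell indices for a key from SHA-256 digests."""
    for i in range(NUM_HASHES):
        digest = hashlib.sha256(f"{i}:{key}".encode()).digest()
        yield int.from_bytes(digest[:8], "big") % NUM_CELLS

class FunctionalBloomFilter:
    """Simplified FBF: query returns the stored value, CONFLICT, or None (absent)."""
    def __init__(self):
        self.cells = [EMPTY] * NUM_CELLS

    def insert(self, key: str, value: str):
        for idx in _hashes(key):
            if self.cells[idx] is EMPTY:
                self.cells[idx] = value
            elif self.cells[idx] != value:
                self.cells[idx] = CONFLICT   # cell shared by different values

    def query(self, key: str):
        values = {self.cells[idx] for idx in _hashes(key)}
        if EMPTY in values:
            return None                      # definitely not stored
        values.discard(CONFLICT)
        return values.pop() if len(values) == 1 else CONFLICT

def model_predict(key: str):
    """Placeholder for the learned character-level classifier."""
    raise NotImplementedError

class LearnedFBF:
    """Learned model in front; backup FBF holds the keys the model misclassifies."""
    def __init__(self, kv_pairs, predict=model_predict):
        self.predict = predict
        self.backup = FunctionalBloomFilter()
        for key, value in kv_pairs:
            if predict(key) != value:        # store only the model's errors
                self.backup.insert(key, value)

    def query(self, key: str):
        stored = self.backup.query(key)
        # Keys absent from the backup FBF are answered by the model alone;
        # a CONFLICT result from the backup counts as a search failure.
        return self.predict(key) if stored is None else stored

# Example usage with a toy stand-in "model" (hypothetical data):
# lfbf = LearnedFBF([("apple", "fruit"), ("ant", "insect")],
#                   predict=lambda k: "fruit" if k.startswith("ap") else "insect")
# lfbf.query("apple")  # -> "fruit"
```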
Original language | English |
---|---|
Pages (from-to) | 1928-1938 |
Number of pages | 11 |
Journal | IEEE Transactions on Computers |
Volume | 71 |
Issue number | 8 |
DOIs | |
State | Published - 1 Aug 2022 |
Bibliographical note
Publisher Copyright: © 1968-2012 IEEE.
Keywords
- Key-value storage
- deep learning
- functional Bloom filter
- search failure