In this paper, we propose a similar image retrieval algorithm based on feature fusion and locality-sensitive hashing to address two problems: individual features represent image content inadequately, and retrieval over massive image data takes too long. Fusing global features with attention features gives the image features both color and structure information and semantic information, so they characterize the image content better. In the retrieval stage, locality-sensitive hashing is used to hash-encode the image features, cosine similarity serves as the similarity measure, and an index is built to improve retrieval efficiency. Compared with other methods, the proposed algorithm improves average retrieval precision and recall on the Caltech 256 and Corel5k datasets while greatly reducing retrieval time.
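The retrieval stage described above can be illustrated with a minimal sketch of sign-random-projection LSH, the standard locality-sensitive hash family for cosine similarity. The feature dimension, code length, and use of a single hash table below are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_hyperplanes(dim, n_bits, rng):
    """Random hyperplanes defining one LSH table (sign random projection)."""
    return rng.standard_normal((n_bits, dim))

def lsh_code(feature, hyperplanes):
    """Binary hash code: the sign of the projection onto each hyperplane.
    Vectors with high cosine similarity agree on most bits."""
    return tuple((hyperplanes @ feature) > 0)

def build_index(features, hyperplanes):
    """Bucket the database's fused feature vectors by their hash code."""
    index = {}
    for i, f in enumerate(features):
        index.setdefault(lsh_code(f, hyperplanes), []).append(i)
    return index

def query(q, features, index, hyperplanes, top_k=5):
    """Look up the query's bucket, then rank candidates by cosine similarity."""
    candidates = index.get(lsh_code(q, hyperplanes), [])
    sims = [(i, float(features[i] @ q /
                      (np.linalg.norm(features[i]) * np.linalg.norm(q) + 1e-12)))
            for i in candidates]
    return sorted(sims, key=lambda t: t[1], reverse=True)[:top_k]

# Toy usage: 1000 fused feature vectors of dimension 512, 16-bit codes (assumed sizes).
dim, n_bits = 512, 16
planes = make_hyperplanes(dim, n_bits, rng)
db = rng.standard_normal((1000, dim))
idx = build_index(db, planes)
print(query(db[0] + 0.05 * rng.standard_normal(dim), db, idx, planes))
```

In practice several hash tables are usually combined to raise recall, since a single table may miss near neighbors that fall into a different bucket.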