A visual search engine based on Elasticsearch and Tensorflow (now fully dockerized so it can be run in up-to-date development environments).
This repository contains code in Python 2.7 and utilizes Faster-RCNN (with VGG-16 as backbone)
implemented in Tensorflow 0.12.1 to extract features from images. An
Elasticsearch instance is used to store feature vectors of the corresponding images,
along with a plugin to compute distances between them.
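Conceptually, each image ends up as an Elasticsearch document that stores its feature
vector, and at query time the bundled plugin is used to score stored vectors by their
distance to the query image's vector. The snippet below is only an illustrative sketch of
that idea; the index, type, and field names (and the tiny vector) are placeholders, not the
schema this project actually uses.

# Illustrative only: index/type/field names and the vector are placeholders.
curl -XPUT 'http://localhost:9200/images/image/1' -H 'Content-Type: application/json' -d '
{
  "image_path": "images/example.jpg",
  "feature_vector": [0.12, 0.07, 0.33, 0.91]
}'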
TODO: Replace the outdated Faster-RCNN with a faster and more
accurate model (suggestions and collaborations are welcome).
The setup assumes you have a working installation of nvidia-docker and an NVIDIA driver version 367.48 or above.
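If you want to confirm that containers can see the GPU before going any further, a quick
sanity check with the nvidia-docker wrapper is shown below (nvidia/cuda is just an example
base image pulled from Docker Hub):

# Should print the same GPU table as running nvidia-smi directly on the host.
nvidia-docker run --rm nvidia/cuda nvidia-smi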
First, we need to build the Elasticsearch plugin that computes distances between
feature vectors. Make sure that you have Maven installed.
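If you are unsure whether Maven is available, a quick check is:

mvn -v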
cd elasticsearch/es-plugin
mvn install

Next, we need to create a Docker network
so that all other containers can resolve the IP address of our Elasticsearch
instance.
docker network create vs_es_net

Finally, start the Elasticsearch container. It will automatically add the plugin, create a
named Docker volume for persistent
storage, and connect the container to the network we just created:
cd ../ && docker-compose up -d

In order to populate the Elasticsearch database with images, we first need to process
them with a feature extractor (Faster-RCNN). The indexer services
can do this for any image we place inside visual_search/images.
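For example, assuming you have a folder of JPEGs somewhere (the source path below is just a
placeholder), copy them in from the repository root before running the indexer:

cp ~/my_photos/*.jpg visual_search/images/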
First we build a dockerized environment for the object detection model to run in:
cd visual_search && docker build --tag visual_search_env .

Here we use an earlier Faster-RCNN implementation by @Endernewton.
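To confirm that the visual_search_env image was built successfully, you can list it:

docker images visual_search_env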
To get the pre-trained model, visit the releases section, then
download and extract model.tar.gz into the visual_search/models/ folder. Alternatively,
you can run:
mkdir models && cd models
curl -L -O https://github.com/tuan3w/visual_search/releases/download/v0.0.1/model.tar.gz
tar -xvf model.tar.gz

To index the desired images, copy the indexer's compose file into the visual_search/ directory and start the indexing service:
cd ../ && cp indexer/docker-compose.yml .
docker-compose up

Before starting the server, copy the server's compose file into the same directory
(overwriting the one used for indexing) and start the containerized
Flask server:
cp server/docker-compose.yml .
docker-compose up -d

Now you can open http://localhost:5000/static/index.html to test the search engine.
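If the page does not load, two quick checks are to list the compose services and fetch the
page headers. The last command additionally assumes that the compose file publishes
Elasticsearch's default port 9200 to the host, which may not be the case:

# All services should be in the "Up" state.
docker-compose ps
# Expect an HTTP 200 for the search page.
curl -I http://localhost:5000/static/index.html
# Only works if port 9200 is published to the host.
curl 'http://localhost:9200/_cat/indices?v'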