Follow the instructions in the sections below, in this order:

1. local db
2. local elasticsearch (start up Docker)
3. elasticsearch mappings
4. local elasticsearch (fill with data)
5. local start
6. local users
# local db

To start the app with an in-memory db, create `server/datasources.local.json` with this content:

```json
{
  "db": {
    "name": "db",
    "connector": "memory",
    "file": "mydata.json"
  },
  "emailDs": {
    "name": "emailDs",
    "connector": "mail",
    "transports": []
  }
}
```
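A malformed config only fails at boot time, so a quick sanity check can save a restart cycle. The snippet below is a sketch: it writes the exact config shown above and validates it with Python's `json.tool` (skip the `cat` step if you have already customized the file):

```shell
# Write the example datasource config and confirm it parses as JSON.
# The path server/datasources.local.json matches the instructions above.
mkdir -p server
cat > server/datasources.local.json <<'EOF'
{
  "db": {
    "name": "db",
    "connector": "memory",
    "file": "mydata.json"
  },
  "emailDs": {
    "name": "emailDs",
    "connector": "mail",
    "transports": []
  }
}
EOF

# json.tool exits non-zero on a parse error, so this only prints on success.
python3 -m json.tool server/datasources.local.json >/dev/null \
  && echo "datasources.local.json is valid JSON"
```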
# elasticsearch mappings

Before uploading data to ES you need to set up dynamic mappings:

```shell
npm run reset
```

or, for local ElasticSearch:

```shell
npm run reset -- --url http://localhost:9200
```

# local elasticsearch (start up Docker)

To start local ElasticSearch using Docker:
```shell
docker-compose up
```

You can access it at http://localhost:9200.
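`docker-compose up` returns before the cluster is actually ready to serve requests, so a short wait loop can help before uploading data. A minimal sketch (the URL and retry count are assumptions, adjust to your setup):

```shell
# Poll the ElasticSearch HTTP port until it responds or retries run out.
wait_for_es() {
  url="${1:-http://localhost:9200}"
  retries="${2:-30}"
  i=0
  while [ "$i" -lt "$retries" ]; do
    if curl -s -o /dev/null "$url"; then
      echo "ElasticSearch is up at $url"
      return 0
    fi
    i=$((i + 1))
    sleep 1
  done
  echo "ElasticSearch did not respond at $url" >&2
  return 1
}
```

For example, run `wait_for_es` after `docker-compose up` and before the converter commands below.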
# local elasticsearch (fill with data)

To fill it with data:

```shell
npm run converter -- ./data/baseline.json --url http://localhost:9200
npm run converter -- ./data/cincinnati.json --url http://localhost:9200
npm run converter -- ./data/cincinnati-benchmarks.json --url http://localhost:9200
```

To stop ElasticSearch and clear its data:
```shell
docker-compose down -v
```

# local start

Then copy `.env.example` to `.env` and start the project normally:
```shell
npm install
npm run start
```

# local users

Start up the app, then run:
```shell
node generate-mock-users.js
```

After that, you can log in with:
```
email: [email protected]
password: 1
```
To generate JSON suitable for bulk upload to ElasticSearch, use:

```shell
npm run converter -- ./data/baseline.csv
npm run converter -- ./data/cincinnati.csv
npm run converter -- ./data/cincinnati-benchmarks.csv
```

For more options, see:
```shell
npm run converter -- --help
```

To debug:

```shell
npm run debug
```

In VSCode, use the "Attach" debug configuration.
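The bulk-upload JSON generated above is presumably in the newline-delimited form ElasticSearch's `_bulk` API accepts: alternating action and document lines. The index name and document fields below are made-up placeholders, not this project's actual schema; a sketch of building and validating such a payload by hand:

```shell
# Build a two-document bulk payload (NDJSON: one standalone JSON object per line).
# "my-index" and the document fields are placeholder assumptions.
cat > bulk.ndjson <<'EOF'
{"index":{"_index":"my-index","_id":"1"}}
{"city":"Cincinnati","metric":"baseline","value":42}
{"index":{"_index":"my-index","_id":"2"}}
{"city":"Cincinnati","metric":"benchmark","value":77}
EOF

# Every line must parse on its own; check each one before uploading.
while IFS= read -r line; do
  printf '%s' "$line" | python3 -m json.tool >/dev/null || exit 1
done < bulk.ndjson
echo "bulk.ndjson: all lines are valid JSON"
```

To upload it you would POST the file to a running cluster, e.g. `curl -H 'Content-Type: application/x-ndjson' --data-binary @bulk.ndjson http://localhost:9200/_bulk` (the `_bulk` endpoint requires a trailing newline, which the here-doc provides).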