We have many filters, so we need a smart way to handle the data structure and its indexing.
My current database is MongoDB, so I would prefer a strategy tailored to it (although I'm open to changing databases if it is really worth it).
I'm open and willing to change the data structure as needed to achieve the best possible indexing strategy and make all searches very fast.
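One common MongoDB approach for "many filters" is the attribute pattern: filterable fields are folded into a single array of {k, v} pairs so that one compound index on (attributes.k, attributes.v) can serve queries on any of them, instead of one index per field. Below is a minimal sketch of that transformation; the field names ("brand", "color", "price") are hypothetical examples, not taken from the actual data files.

```python
# Hypothetical set of filterable fields; the real list would come
# from the attached fields file.
FILTERABLE = {"brand", "color", "price"}

def to_attribute_pattern(doc: dict) -> dict:
    """Move filterable top-level fields into an 'attributes' k/v array."""
    out = {k: v for k, v in doc.items() if k not in FILTERABLE}
    out["attributes"] = [
        {"k": k, "v": doc[k]} for k in sorted(FILTERABLE & doc.keys())
    ]
    return out

record = {"_id": 1, "name": "Widget", "brand": "Acme",
          "color": "red", "price": 9.99}
print(to_attribute_pattern(record)["attributes"])
```

With pymongo, the single covering index would then be created with something like `collection.create_index([("attributes.k", 1), ("attributes.v", 1)])`, and a filter on any field becomes an `$elemMatch` on `attributes`. Whether this beats a handful of plain compound indexes depends on how many filter combinations are actually queried, so it is worth benchmarking against the 8 million-record test set.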
The estimated number of records in the production environment is about 8 million.
Attached is an example of the JSON data with 4 records, in the file "[url removed, login to view]", just to give you an idea of the current data structure.
The list of fields that need to be indexed to make them searchable is in the attached file "[url removed, login to view]".
I can generate and provide as many records as needed (using whatever data structure we decide on) to test search performance and confirm things work as expected.
In addition to the fields and indexes mentioned in the attached file, extra fields (mostly numeric) related to distances will be needed. How to include these in the strategy is also up for discussion.
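If those distance-related fields derive from coordinates, MongoDB's usual approach is to store each location as a GeoJSON Point and put a 2dsphere index on it, rather than indexing precomputed distances. The sketch below shows the assumed document shape and a haversine helper for sanity-checking query results; the "location" field name is an assumption, not the project's actual schema.

```python
import math

def geo_point(lon: float, lat: float) -> dict:
    """GeoJSON Point as a 2dsphere index expects (longitude first)."""
    return {"type": "Point", "coordinates": [lon, lat]}

def haversine_km(lon1, lat1, lon2, lat2):
    """Great-circle distance in km, useful for verifying $nearSphere results."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

print(geo_point(-3.70, 40.42))             # e.g. a record's location field
print(round(haversine_km(0, 0, 0, 1), 1))  # ~111.2 km per degree of latitude
```

The index itself would be created with `collection.create_index([("location", "2dsphere")])`, after which `$nearSphere` or `$geoNear` queries return records sorted by distance without storing distance fields at all. If the extra numeric fields are instead fixed, precomputed distances, plain ascending indexes with range queries would be the simpler option.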
The .json file had a missing comma (just in case you needed well-formatted JSON).
I have updated it as "ExampleData2.json".