How can I improve query performance in MongoDB for a collection with millions of documents, each with many searchable attributes?


There is a collection with approximately 20 million documents, each with about 200 searchable attributes.

Example:

{ "atrib001": "abc", "atrib002": "123", "atrib003": "1x3", ... "atrib200": "1zz" }

The application builds the search query dynamically, according to the options selected by the user.
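A minimal sketch (in Python, with the attribute names from the example; the function name is hypothetical) of how such a dynamic filter might be assembled before being handed to the driver:

```python
def build_filter(selected):
    """Build a MongoDB filter document from the attributes the user filled in.

    `selected` maps attribute names to the values the user typed;
    empty values are treated as "not selected" and skipped.
    """
    return {name: value for name, value in selected.items() if value}

# Only the attributes the user actually chose end up in the query:
user_choices = {"atrib001": "abc", "atrib002": "", "atrib117": "9qq"}
query_filter = build_filter(user_choices)
# query_filter == {"atrib001": "abc", "atrib117": "9qq"}
```

Because the resulting filter can name any combination of the 200 attributes, no single fixed index is guaranteed to match it.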

What is the recommended way to create indexes in MongoDB to improve the performance of this kind of search? Also, is it feasible to create one index per attribute (which in this case would mean 200 indexes) and trust MongoDB to choose the best one for each query?

asked by anonymous 04.12.2015 / 22:02

1 answer


(Note: I'm not a DBA, and you should test everything on a separate copy of the database before applying it to production.)

As far as indexes are concerned, MongoDB is not that different from relational databases: its indexes are typically B-trees.

If the query is dynamic and the user can choose which attributes to include in the search, the best option is to create the 200 indexes, one per attribute, so that whatever combination the user picks, there is an index available for the query.
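A sketch of what declaring those 200 single-field indexes could look like. The specs are built in plain Python; the `collection.create_index` call shown in the comment assumes a pymongo-style driver handle:

```python
# Generate one ascending single-field index spec per attribute
# (names atrib001 .. atrib200, matching the example document).
attr_names = [f"atrib{i:03d}" for i in range(1, 201)]
index_specs = [[(name, 1)] for name in attr_names]  # 1 = ascending order

# With a pymongo collection handle you would then create them, e.g.:
#   for spec in index_specs:
#       collection.create_index(spec)
```

Keep in mind that every index adds write overhead and consumes memory, so 200 of them is a real cost on a 20-million-document collection; measure before committing to it.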

If there were a pattern, for example if the user always included attributes 1 and 2 in the search, you could create a compound index that covers both attributes at once. But since there is no such pattern, the indexes must be created separately.
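For contrast, a compound index spec (using the example's attribute names) would look like the snippet below. One MongoDB detail worth knowing: a compound index can also answer queries on any prefix of its fields, so this single index would serve queries on `atrib001` alone as well.

```python
# A single compound index over two attributes that always appear together.
# MongoDB can use it for queries on atrib001 alone (a prefix) or on
# atrib001 + atrib002, but not for queries on atrib002 alone.
compound_index = [("atrib001", 1), ("atrib002", 1)]
```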

04.12.2015 / 23:12