MongoDB index issues

MongoDB index issues

jewzaam
Administrator
We have a generic entity being developed that can be thought of as a simple bag of properties (key + value).  The problem is that we need to create an index on {value,key}, but the values can be very large.  This causes index creation to fail in mongo 2.6.x:

index key limit: 1024 bytes
- http://docs.mongodb.org/manual/reference/limits/#Index-Key-Limit
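
For illustration, this is roughly the index that hits the limit, assuming the entity stores its bag as a "properties" array of {key, value} subdocuments (the collection and field names here are just placeholders):

    // compound index on value then key; in 2.6 this fails with a
    // "key too large to index" error once any properties.value exceeds 1024 bytes
    db.entities.ensureIndex({ "properties.value": 1, "properties.key": 1 })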

We thought for a while about getting around this with a hashed index but hit some snags there too:
- a hashed index covers a single field, supports equality queries only, and can't be built on array elements
- http://docs.mongodb.org/manual/core/index-hashed/
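
For reference, the most a hashed index would give us is something like the sketch below (same placeholder names), which is why it doesn't fit: one hashed field, equality lookups only, and it is rejected if "properties" is an array.

    // hashed index on the value alone; no compound {value,key} index,
    // no range queries, and not allowed on fields inside arrays
    db.entities.ensureIndex({ "properties.value": "hashed" })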

There is a way to keep index creation from failing by setting failIndexKeyTooLong, but:
- it allows the index to be created
- it doesn't include keys over the limit in the index data
- TBD whether the entire document is excluded from the index; didn't find that info easily
- https://jira.mongodb.org/browse/SERVER-12983
- http://docs.mongodb.org/manual/reference/parameters/#param.failIndexKeyTooLong
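
For completeness, the parameter can be flipped at startup or at runtime; the caveats above still apply, since documents whose keys exceed the limit silently drop out of the index:

    // at runtime, from the mongo shell
    db.adminCommand({ setParameter: 1, failIndexKeyTooLong: false })

    // or at mongod startup:
    //   mongod --setParameter failIndexKeyTooLong=false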


Some possible solutions discussed already:
1) split large values into multiple smaller values
2) keep a second key/value array for things too large to index and leave that array unindexed (sketched below)
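
A rough sketch of option 2, using the same placeholder names as above; small values stay in the indexed array and oversized values are moved aside:

    // oversized values go to a separate, never-indexed array
    db.entities.insert({
        properties: [ { key: "color", value: "blue" } ],
        largeProperties: [ { key: "blob", value: "some very large value" } ]
    })
    db.entities.ensureIndex({ "properties.value": 1, "properties.key": 1 })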

Any other ideas someone can think of would be appreciated!

The very short term solution, to move testing forward, was to create a simple index on only the key (which is not too big in the sample data).  This won't get us as much as indexing the values, but it allows testing to continue.
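
Concretely, the stopgap is just (same placeholder names as above):

    // key-only index; keys are small enough to stay under the 1024 byte limit
    db.entities.ensureIndex({ "properties.key": 1 })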

Re: MongoDB index issues

bserdar
Creating a text index might solve the problem. It might also require some changes to the code.
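
A rough sketch of what that could look like, assuming the same placeholder entity shape as in the original post; a text index stores individual terms rather than the whole value, so the 1024 byte limit on the raw value should no longer bite, but queries have to go through $text instead of ordinary equality matches:

    // text index over the values; queried with $text rather than equality
    db.entities.ensureIndex({ "properties.value": "text" })
    db.entities.find({ $text: { $search: "blue" } })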
