You already have a model with incredibly powerful semantic understanding. Why does the document store also need to be smart? The model can expand the search term into multiple OR clauses based on its interpretation of the context.
If you are using something like Lucene, queries are extremely fast, and the maximum number of documents a single index can hold far exceeds the limits AWS quotes here. A sketch of the expansion idea follows below.
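
To make that concrete, here is a minimal sketch (Python, with the model call stubbed out) of pushing the "smarts" into the model: it expands the user's query into alternate phrasings and folds them into one OR clause before the query ever reaches a Lucene-style index. The function names and the hard-coded expansions are illustrative assumptions, not any particular product's API.

    from typing import List


    def expand_with_model(query: str) -> List[str]:
        # Placeholder for the LLM call: prompt the model for alternate
        # phrasings of the user's query. Hard-coded here so the sketch runs
        # without an API key; swap in whatever model you actually use.
        return ["notebook no power", "computer dead on startup", "pc fails to turn on"]


    def build_lucene_or_query(field: str, query: str) -> str:
        # Fold the original query plus the model's expansions into a single
        # Lucene-style boolean clause, e.g.
        #   body:("laptop won't boot" OR "notebook no power" OR ...)
        phrases = [query] + expand_with_model(query)
        quoted = ['"{}"'.format(p.replace('"', '\\"')) for p in dict.fromkeys(phrases)]
        return f'{field}:({" OR ".join(quoted)})'


    if __name__ == "__main__":
        print(build_lucene_or_query("body", "laptop won't boot"))
        # body:("laptop won't boot" OR "notebook no power" OR "computer dead on startup" OR "pc fails to turn on")

The point is that the interpretation lives in the model, so the document store only has to answer a plain boolean query quickly.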
jakozaur•2h ago
It seems AWS is leveraging its strong S3 brand to compete directly in the vector database market.
For more context, check out the turbopuffer architecture docs and Notion's presentation from the Data Council:
https://turbopuffer.com/docs/architecture
https://www.youtube.com/watch?v=_yb6Nw21QxA
(anti-disclaimer: I'm not affiliated with turbopuffer in any way.)