Keeping latency low for an online data service with more than 5 billion records can be a challenge. Features like querying the database with natural language (https://ossinsight.io/explore) make it easier for users to access the data without having to know SQL. But the generated SQL can be pretty heavy (lots of full table scans and index scans).
We've got an in-house LOB app that boils down to a giant todo list/work history for accounts: virtually every task that needs to be done for an account gets logged into this system. A whole bunch of assorted data is pulled in from various systems for the explicit purpose of letting employees filter it, and there's a filter builder in the UI that lets them use whatever combination of criteria they might need.
Needless to say, the database servers absolutely hate this application. Because queries are completely dynamic based on the filters, there's no way to index things, so full table scans are not uncommon (we've at least got some conditional indexes that filter active accounts out from inactive ones, limiting the most common queries to just those accounts that haven't been paid down).
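For what it's worth, the conditional (partial) index trick described above can be sketched in SQLite; the table and column names here are made up for illustration, not taken from the actual app:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE accounts (
    id INTEGER PRIMARY KEY,
    status TEXT NOT NULL,       -- 'active' or 'inactive' (hypothetical)
    due_date TEXT
);

-- Partial index: only rows for active accounts are indexed, so the
-- common "active accounts only" queries touch a much smaller structure.
CREATE INDEX idx_active_due
    ON accounts (due_date)
    WHERE status = 'active';
""")

# EXPLAIN QUERY PLAN shows the partial index is picked up when the
# query's WHERE clause implies the index's WHERE condition.
plan = conn.execute("""
    EXPLAIN QUERY PLAN
    SELECT id FROM accounts
    WHERE status = 'active' AND due_date < '2024-01-01'
""").fetchall()
print(plan)
```

Any filter that doesn't include `status = 'active'` falls back to a full scan, which matches the "dynamic filters defeat indexing" complaint above.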
So yeah, database servers hate these kinds of workloads, and not much you can do to make them happy :/
Totally agree. A database with both row and column storage engines can help a lot. That's the reason an OSSInsight-like workload can run on top of TiDB easily, whether it's scanning the whole table or just part of an index.
Answers from the AI can be wrong sometimes (close to 20% of the time). The better the database schema and purpose you provide to the AI, the better the results you get. This helps the AI understand what you are asking it to do and improves the accuracy.
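A minimal sketch of that idea, with a made-up schema and prompt wording (the point is only that the prompt carries both the structure and the intent, not just the raw question):

```python
# Hypothetical DDL; a real setup would pass the actual schema.
schema_ddl = """\
CREATE TABLE github_events (
    id BIGINT,
    type VARCHAR(32),       -- e.g. 'PullRequestEvent'
    repo_name VARCHAR(255),
    created_at DATETIME
);
"""

def build_prompt(question: str) -> str:
    # Prepending the schema and a one-line purpose statement lets the
    # model map natural language onto real columns instead of guessing.
    return (
        "You translate questions into SQL.\n"
        "Schema:\n" + schema_ddl +
        "Purpose: each row is one public GitHub event.\n"
        f"Question: {question}\n"
        "SQL:"
    )

print(build_prompt("Which repo had the most pull requests in 2023?"))
```

Without the schema and purpose lines, the model has to invent table and column names, which is where a lot of that ~20% error rate comes from.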