I came across some tools like DataWalk, FalkorDB, Cognee, etc. that help create ontologies automatically, AI-driven I believe. Are they actually effective at mapping all your data to a schema and automatically building the KG? (I suspect they are, but I haven't tested them myself; I'd love to hear opinions from others' experiences.)
Apart from that, what "gaps" remain between these tools and successfully adopting KGs for AI tasks at the enterprise level?
Do these tools take care of situations like:
- Adding a new data source
- Incremental updates, schema evolution, and versioning (the first sketch after this list shows the kind of idempotent upsert I have in mind)
- Schema drift
- Was there a point where you realized there should be an "explainability" layer on top of the graph layer?
- What "engineering" problems do current tools not address, e.g. sharding, high-availability setups, and custom indexing strategies (if those even apply to KG databases; I'm pretty new, so not sure)?
- Based on your experience, which tool comes closest to accurate "automated" parsing of multiple data sources into a KG? (The second sketch below shows roughly what I mean.)
- Also, do you think KG applications will still be relevant five years down the line? My sense is that adoption is increasing and will keep increasing, but I could be wrong.
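
To make the incremental-updates bullet concrete: what I'd want from these tools is idempotent upserts, so re-running a pipeline on new or changed records updates the graph instead of duplicating nodes. Here's a minimal sketch of that pattern with the Neo4j Python driver; the connection details, `Person` label, and `id` key are just assumptions for illustration:

```python
from neo4j import GraphDatabase

# Assumed local instance and credentials; adjust for your setup.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def upsert_person(tx, record):
    # MERGE matches on the stable key and creates the node only if missing,
    # so replaying the same batch of records is safe (idempotent).
    tx.run(
        """
        MERGE (p:Person {id: $id})
        SET p.name = $name, p.updated_at = timestamp()
        """,
        id=record["id"],
        name=record["name"],
    )

with driver.session() as session:
    session.execute_write(upsert_person, {"id": "emp-42", "name": "Ada"})
driver.close()
```

Part of my question is whether the automated tools generate this kind of key-based upsert logic themselves, or just bulk-load and leave dedup/versioning to you.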
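And to clarify the "automated parsing" bullet, this is roughly the pipeline shape I'm picturing: an LLM-driven extractor turns raw text into triples, which then get written as graph edges. `extract_triples` below is a placeholder for whatever the tool's extractor does, not a real API from any of the tools I named:

```python
from dataclasses import dataclass

@dataclass
class Triple:
    subject: str
    predicate: str
    obj: str

def extract_triples(text: str) -> list[Triple]:
    # Placeholder for the tool's LLM extraction step, which would map free
    # text onto the ontology's entity and relation types.
    raise NotImplementedError

def triples_to_cypher(triples: list[Triple]) -> list[str]:
    # Emit idempotent MERGE statements so repeated extractions don't
    # duplicate nodes. (Relationship types can't be parameterized in Cypher,
    # hence the string interpolation; a real pipeline would sanitize these.)
    return [
        f"MERGE (a:Entity {{name: '{t.subject}'}}) "
        f"MERGE (b:Entity {{name: '{t.obj}'}}) "
        f"MERGE (a)-[:{t.predicate}]->(b)"
        for t in triples
    ]
```

What I can't tell from the marketing pages is how accurate that extraction step actually is across messy, heterogeneous sources.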