byandrev•1h ago
Unlike general-purpose models such as Gemini or ChatGPT, which draw on information extracted from numerous web sources and may "hallucinate," NotebookLM relies 100% on the sources you provide, such as PDFs, audio files, YouTube videos, Google Docs, or articles. Because it works exclusively with your sources, the risk of hallucinations is much lower.
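The grounding idea described above (answer only from user-supplied sources) can be sketched as a toy retrieval-augmented pipeline. This is an illustrative assumption, not NotebookLM's actual implementation, which is not public; the `retrieve` and `build_prompt` helpers and the keyword-overlap scoring are invented for the example.

```python
# Toy sketch of source-grounded answering ("RAG"), assuming a simple
# keyword-overlap retriever. NotebookLM's real pipeline is not public.

def retrieve(question: str, sources: list[str], k: int = 2) -> list[str]:
    """Rank source chunks by word overlap with the question; keep top k."""
    q_words = set(question.lower().split())
    scored = sorted(
        sources,
        key=lambda s: len(q_words & set(s.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question: str, sources: list[str]) -> str:
    """Instruct the model to answer ONLY from the retrieved excerpts."""
    excerpts = retrieve(question, sources)
    context = "\n".join(f"[{i + 1}] {s}" for i, s in enumerate(excerpts))
    return (
        "Answer using ONLY the excerpts below; "
        "if they are insufficient, say you don't know.\n"
        f"{context}\n"
        f"Question: {question}"
    )

sources = [
    "NotebookLM answers questions using only the documents you upload.",
    "The Eiffel Tower is located in Paris.",
]
print(build_prompt("What documents does NotebookLM use?", sources))
```

The key point for the hallucination discussion is the prompt restriction: the model is told to refuse rather than fall back on its internal knowledge when the excerpts don't cover the question.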
knollimar•1h ago
Huh, don't most hallucinations come from the model's internal knowledge rather than the RAG context?
burnerToBetOut•16m ago
Please clarify the Google connection.
I'm guessing it's an official Google-built product. [1]