I tested this whole concept with Mistral AI. It searches the web, aggregates its findings, and provides an answer that highlights the different perspectives or answers it found, each with a link to its source URL. As much as I hate AI, it does work great that way (since the LLM doesn’t have to pull stuff out of its butt).
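For reference, the pattern I mean is roughly the sketch below. `search_web` and `ask_llm` are made-up stand-ins, not Mistral’s actual API; I have no idea what they really run under the hood, this is just the general "search, then answer with numbered citations" idea:

```python
# Rough sketch of "search the web, then answer with citations".
# search_web() and ask_llm() are hypothetical stubs, not any real vendor API.
from dataclasses import dataclass


@dataclass
class SearchResult:
    url: str
    snippet: str


def search_web(query: str) -> list[SearchResult]:
    """Hypothetical search client; a real one would call a search API."""
    return [
        SearchResult("https://example.com/a", "one take on the topic"),
        SearchResult("https://example.org/b", "a different take on the topic"),
    ]


def ask_llm(prompt: str) -> str:
    """Hypothetical LLM call; a real one would hit a chat-completion endpoint."""
    return "Answer summarizing the sources, citing them as [1] and [2]."


def answer_with_citations(question: str) -> str:
    results = search_web(question)
    # Number the sources so every claim in the answer can point back to a URL.
    context = "\n".join(f"[{i + 1}] {r.url}: {r.snippet}" for i, r in enumerate(results))
    prompt = (
        "Answer the question using only the numbered sources below, citing them.\n"
        f"Question: {question}\nSources:\n{context}"
    )
    answer = ask_llm(prompt)
    sources = "\n".join(f"[{i + 1}] {r.url}" for i, r in enumerate(results))
    return f"{answer}\n\nSources:\n{sources}"
```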
Are you certain the answer actually came from those sources? I’ve had multiple occasions where Mistral/ChatGPT gave me sources and something felt off. When I followed the links, I couldn’t find what they claimed to have found there. I then asked them to quote the actual text they had used for the answer, and after drilling them a few more times they admitted that, yes, the thing they said wasn’t anywhere in the sources they provided.
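One way to sanity-check a citation like that is to fetch the cited page and search for the quoted text yourself. A rough sketch of that check (the URL and quote are placeholders, and it needs `requests` and `beautifulsoup4` installed):

```python
# Fetch a cited page and check whether the model's quoted text actually appears in it.
import re

import requests
from bs4 import BeautifulSoup


def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so formatting differences don't matter."""
    return re.sub(r"\s+", " ", text).strip().lower()


def quote_appears_in_source(url: str, quote: str) -> bool:
    """Download the cited page and look for the quoted text in its visible content."""
    html = requests.get(url, timeout=10).text
    page_text = BeautifulSoup(html, "html.parser").get_text(separator=" ")
    return normalize(quote) in normalize(page_text)


if __name__ == "__main__":
    # Placeholders -- substitute the source URL and the quote the model gave you.
    print(quote_appears_in_source("https://example.com/article",
                                  "the exact sentence the model quoted"))
```

It only catches verbatim quotes, not paraphrases, but it’s enough to spot the cases where the claimed text simply isn’t on the page.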
Doesn’t ChatGPT also use Google?