One of the things that really annoys me about LLMs is that they don’t tell you when they can’t do something; they simply bullshit their way around it.
My father wanted to do some work and asked me to pull the medical advice out of a YouTube video for him.
ChatGPT couldn’t do it, so I loaded Google AI Studio, pasted the link, and asked Gemini to summarize the video. Supposedly Google AI Studio can work with YouTube links easily.
It produced some good-looking bullet points. On closer inspection, though, the points were irrelevant; nothing in the summary actually appeared in the video.
When I pointed this out, Gemini said it couldn’t load the video and admitted the summary was hallucinated.
For some reason that video had closed captions and transcription turned off, so there was no text for the model to work from.
So Gemini fabricated an entire summary from the title alone.
You need to be extra careful about that.