Stop quoting ChatGPT

Last evening I watched the 60 Minutes segment on Google's various AI endeavors (one of which is Bard, very comparable to ChatGPT). One problem they haven't figured out is what they call AI 'hallucinations': the model can generate fictitious references as citations (e.g., books that don't exist), not as 'intentional' deception, but as something that simply happens for reasons they don't yet understand.

Another thing they do not have a handle on is emergent behavior. One example that surprised them: a model taught itself the Thai language after having come across only a few Thai words incidentally. This was unplanned and not the result of an intended 'feature'.
