AI Hallucination

Ghost in the Machine

AI hallucination is when an AI engine, specifically a large language model (LLM), makes up an answer. For example, an LLM may know that court cases are referred to as “word” vs “word”, but it does not understand that those words are very, very specific and refer to real-world cases. As a result, LLMs have been known to cite cases that simply do not exist. Much work is going on in the AI world both to help LLMs avoid hallucinating and to fact-check their responses.
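
One way to picture the fact-checking side is a post-hoc citation check. The sketch below (in Python, using hypothetical case names and a small stand-in “trusted list” rather than a real legal database) flags any cited case in a model’s answer that is not in a known reference set; it is an illustration of the idea, not how any particular product does it.

```python
# Minimal sketch of post-hoc fact-checking for LLM output.
# The case names and the "trusted list" below are hypothetical stand-ins,
# not a real case-law database.

import re

# Stand-in for a trusted case-law reference list.
KNOWN_CASES = {
    "Smith v. Jones",
    "Roe v. Wade",
}

def extract_citations(text: str) -> list[str]:
    """Pull out strings that look like 'Name v. Name' case citations."""
    pattern = r"[A-Z][\w.]*(?: [A-Z][\w.]*)* v\. [A-Z][\w.]*(?: [A-Z][\w.]*)*"
    return re.findall(pattern, text)

def flag_unverified_cases(llm_answer: str) -> list[str]:
    """Return citations that are not in the trusted list and need human review."""
    return [c for c in extract_citations(llm_answer) if c not in KNOWN_CASES]

llm_answer = "As held in Smith v. Jones and Miller v. Acme Corp, the claim fails."
print(flag_unverified_cases(llm_answer))  # ['Miller v. Acme Corp']
```

Real fact-checking pipelines are far more involved (retrieval against authoritative sources, model-based verification, and so on), but the basic shape is the same: compare what the model asserts against something it cannot make up.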
