Although LLMs generate coherent and plausible output, they may produce false or misleading information, a phenomenon known as hallucination. Hallucination is inherent to LLMs: it can be reduced, but not eliminated. This makes practical deployment of LLMs challenging.