• jedibob5@lemmy.world
    8 days ago

    AI’s tendency to hallucinate means that for it to be actually reliable, a human needs to double-check all of its output. If it is being used to acquire and convey information of any kind to the prompter, you might as well just skip the AI and find the information manually, as you’d have to do that anyway to validate what it told you.

    And AI hallucinations are a side effect of the fundamental way in which generative AI works - they can never be fully eliminated. When an AI generates text, it is simply predicting what word is likely to come next based on its prompt in relation to its training data. While this predictive ability has become remarkably sophisticated within the last few years (more than I thought it ever would, tbh), it is still only a predictive text generator. It’s not “translating,” “understanding,” or “comprehending” anything about whatever subject it has been asked about - it is merely predicting the likelihood of the next word in its response based on its training data.
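    The “predicting the next word” idea can be sketched in a few lines. This is a deliberately tiny bigram model (the corpus and function names are made up for illustration) - real LLMs use neural networks over billions of tokens, but the core loop is the same: count what tends to follow what, then emit the most probable continuation, with no notion of whether it’s true.

    ```python
    from collections import Counter, defaultdict

    # Toy "training data" (hypothetical example corpus).
    corpus = "the cat sat on the mat the cat ate the fish".split()

    # Count how often each word follows each other word (a bigram model).
    bigrams = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        bigrams[prev][nxt] += 1

    def predict_next(word):
        """Return the statistically most likely next word - nothing more."""
        if word not in bigrams:
            return None  # no data: the model has nothing to "say"
        return bigrams[word].most_common(1)[0][0]

    print(predict_next("the"))  # prints "cat" - it follows "the" most often
    ```

    Note the model will happily chain these predictions into fluent-sounding output whether or not the result corresponds to anything real - which is exactly where hallucinations come from.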