Hallucinations

AI

Instances where a generative AI model produces output that is not grounded in its input data, i.e. it "makes things up". This is particularly common in text generation by large language models, where the model can produce plausible-sounding but incorrect or nonsensical information.
