Dealing with Hallucinations

Balancing cost and quality in generative AI models to reduce hallucinations
Jonathan Yarkoni
CEO at Shujin.AI

Generative AI models hallucinate, and it's a problem: it is the main barrier holding back consumer-facing implementations. Even GPT-4 hallucinates about 3% of the time, and that's on general knowledge, not on questions specific to your use case. There are more than ten reasons why models hallucinate. Solving all of them is generally regarded as over-engineering, much like implementing every security measure available on the market. In this talk, Jonathan Yarkoni will take us through the different ways and reasons models hallucinate. He will explain the cost-value trade-off of each solution and showcase several through demos.

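To give a concrete flavor of the cheaper end of that cost-value spectrum, here is a minimal sketch (not from the talk itself) of one common, low-cost mitigation: grounding the model in caller-supplied context and lowering the sampling temperature, so the model is steered toward refusing rather than confabulating. It assumes the official `openai` Python SDK (v1+) and an `OPENAI_API_KEY` in the environment; the function name and prompt wording are illustrative.

```python
# Minimal sketch of one low-cost hallucination mitigation: ground the model
# in caller-supplied context and instruct it to refuse when the context
# does not contain the answer. Assumes the official `openai` Python SDK
# (v1+) with OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

def grounded_answer(question: str, context: str) -> str:
    """Answer only from `context`; refuse when the context is insufficient."""
    response = client.chat.completions.create(
        model="gpt-4",    # the model the abstract cites (~3% hallucination rate)
        temperature=0,    # low-temperature decoding reduces confabulated detail
        messages=[
            {
                "role": "system",
                "content": (
                    "Answer strictly from the provided context. "
                    "If the context does not contain the answer, reply "
                    "'I don't know based on the provided context.'"
                ),
            },
            {
                "role": "user",
                "content": f"Context:\n{context}\n\nQuestion: {question}",
            },
        ],
    )
    return response.choices[0].message.content
```

This kind of prompt-level grounding costs almost nothing to add, whereas heavier mitigations (retrieval pipelines, fine-tuning, output verification) sit further along the cost-value curve the talk examines.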
