Dealing with Hallucinations

Balancing cost and quality in generative AI models to reduce hallucinations
Jonathan Yarkoni
CEO at Shujin.AI

Generative AI models hallucinate, and it's a problem: it's the main reason holding back consumer-facing implementations. Even GPT-4 hallucinates 3% of the time, and that's on general knowledge, not on anything specific to your use case. There are more than ten reasons why models hallucinate, and solving all of them is generally regarded as over-engineering, much like implementing every security measure available on the market. In this talk, Jonathan Yarkoni will take us through the different ways and reasons models hallucinate, explain the cost-value trade-off of each solution, and showcase several through demos.
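
The talk surveys several mitigations rather than prescribing one. As a flavor of what such a solution can look like, here is a minimal sketch of one widely used technique, retrieval grounding with an explicit abstain instruction, written against the OpenAI Python client. The model name, documents, and prompt wording are illustrative assumptions, not taken from the talk.

    # Minimal sketch (illustrative, not from the talk): ground the model in
    # retrieved context and instruct it to abstain rather than guess.
    from openai import OpenAI

    client = OpenAI()  # expects OPENAI_API_KEY in the environment

    def grounded_answer(question: str, context_docs: list[str]) -> str:
        """Answer strictly from the supplied documents, or admit ignorance."""
        context = "\n\n".join(context_docs)
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model choice
            temperature=0,        # deterministic decoding curbs creative drift
            messages=[
                {"role": "system",
                 "content": ("Answer only from the provided context. If the "
                             "context does not contain the answer, reply "
                             "exactly: I don't know.")},
                {"role": "user",
                 "content": f"Context:\n{context}\n\nQuestion: {question}"},
            ],
        )
        return response.choices[0].message.content

    # Example: with relevant context the model answers; without it,
    # the system prompt pushes it to decline instead of hallucinating.
    print(grounded_answer("What is our refund policy?",
                          ["Refunds are available within 30 days of purchase."]))

Grounding trades cost (retrieval infrastructure, longer prompts) for quality, which is exactly the kind of trade-off the talk weighs across its catalog of fixes.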
