Google warns against hallucinating chatbots

Prabhakar Raghavan, Senior Vice President at Google and head of the Google Search division, has warned users about the importance of double-checking information received from AI-based chatbots.

The kind of AI in question sometimes produces fictional answers, known as hallucinations, Mr. Raghavan said. He added that minimizing such cases is one of the key tasks.

When hallucinating, AI bots provide answers that look reliable but are not grounded in facts. Developers aim to minimize such malfunctions in AI bots.

This week, researchers from the information security firm Check Point Research said that hackers have developed a way to bypass ChatGPT's restrictions and are selling services that let cybercriminals create malware and phishing emails.

It also became known that OpenAI collects a wide range of other user information: users' IP addresses; browser type and settings; data on how users interact with the site; the type of content they interact with; the features they use; and the actions they take. All of this information may be collected by OpenAI in accordance with the company's privacy policy.

/Based on the media reports cited above.