
When the algorithm thinks for you

My column this week at Invertia is entitled "Bad Times for Critical Thinking" (PDF), and tries to delve into the potential consequences of the proliferation of algorithm-based information search services that provide answers without any additional data or explanation about their origin.

The same thing comes to mind when using ChatGPT as if it were a search engine: the response is clearly brilliant, concise, well written and, depending on who you are and what you need it for, ready to copy and paste anywhere. If we compare it with Google's response, a results page (search engine results page, or SERP) usually contains, at a minimum, information about where each result came from, along with a short snippet of text, and, above all, it requires the use of critical thinking to decide which link to click on.

If we take into account that in today's society there is no small number of people who take the first result on Google practically as an article of faith and absolute truth (in fact, the search engine's "I'm Feeling Lucky" button exists for exactly that kind of person, and according to the company it is kept for sentimental reasons and because of its low usage), what happens when the answers to users' questions arrive as polished, edited text with no sources at all?

The answer is clear: if critical thinking is, unfortunately, an increasingly under-developed skill in human societies, then such a move could reduce it even further. Unless the example of Perplexity.ai holds and the companies providing such services decide to show the sources from which they compose their answer texts (and even in that case, it is conceivable that many users simply won't bother to check them), we will find ourselves with a growing tendency to ask a question, read a few lines of text that could just as easily come from a Nobel laureate as from a denialist, flat-earth, or conspiracy page, and accept the answer as if it were the absolute truth. Everything from the "as seen on TV" era of the sixties, when things were taken as true simply because we saw them on television, carried over to the web.

Furthermore, the problem is not only the loss of the ability to think critically, something whose responsibility lies not so much with technology companies as with the educational system and the upbringing parents give their children. It is also the practically limitless potential for manipulation that a system like this could have, simply by choosing its sources with some care (or by choosing its responses), over a society that receives its answers fully pre-chewed, so that it no longer has to do any additional mental processing.

From a competitive standpoint, it would not be surprising if things moved in that direction. Microsoft, after its ten-billion-dollar investment, is preparing to incorporate ChatGPT into its Bing search engine, basically because it has nothing to lose: if the results aren't good, it will be a third party's mistake and will affect only some 3% of the market, and if they are considered good, it could mean a qualitative leap in the perception the remaining 97% have of Bing. It is also getting ready to incorporate it into Office (think, for example, of what it could mean for working in a spreadsheet like Excel or Google Sheets).

However, the move isn't so straightforward for Google. The potential consequences of Google's search engine responding with nonsense, lies, or gross errors are enormous, and could erode its users' trust. In fact, the company had planned to spend much more time refining LaMDA before doing anything: when you lead the market by such a wide margin, there usually isn't much appetite for sudden changes, unless you clearly see that the moment is so critical that if you don't move, you're finished.

Where does it make sense for Google to go now? From its leadership position, and given the impact it has on the future of the web, the sensible move is to bring its own integration to market, not a simple incorporation of third-party technology like OpenAI's: a large language model (LLM) capable of Perplexity-style source attribution, plus some kind of addition that gives it a more rigorous image, something that could enhance its reputation and explain why it took so long to move in that direction.

I also talked about this topic with Carlos del Castillo of ElDiario.es, who quotes me today in his article "Artificial Intelligences Like ChatGPT Assault the Google Search Empire" (PDF).
