Facebook has announced that it is halting development of its Instagram Kids project. The announcement follows reports that the social media giant commissioned – and kept secret – internal research which found that Instagram was harmful to young people’s mental health.
The study’s findings, not to mention the fact that they were kept from the public, have only reinforced the heavy criticism the project attracted from the start. “Instagram for kids,” one headline quickly declared, is “the social media site nobody asked for”.
Who asked for what in information technology development is an interesting question. By the late 1980s, research had already highlighted that the history of computing was arguably more supply-driven than a response to genuine need. Social media is no different: it is not something we knew we wanted, yet it is now embedded in almost everything we do. And research increasingly confirms that it can also be a source of harm.
Children are at the center of this battle between usability and safety. They are the future designers of our technology – they will inherit our messes – but they are also using it right now. And they are the future customers of tech companies. Instagram chief Adam Mosseri is quick to defend the value and importance of a children’s version of the app. But can we trust Big Tech to give us what we really need, rather than manipulate us into consuming what it needs to sell?
The Advent of Usability
The concept of user experience now dominates information technology thinking. But early home computers were anything but useful, or usable, for the average person. This was mainly because they were still being designed for trained experts: they assumed technical competence in whoever switched them on.
From the early 1980s, parents were encouraged to embrace the educational potential of home computing. They saw the devices as promoting their children’s education and future employability. But the usability of this early equipment was still more conceptual than practical.
By the late 1980s, however, the idea of usability began to take hold. As IT design began to focus more on how the average person could use products effectively and efficiently, usability found a home in the computer science fields of human-computer interaction and user-centered design.
From User Experience to User Safety
Technology, of course, now underpins how we live, how we communicate, how we interact, how we work. Households are full of devices and applications that are usable, useful and in constant use. Indeed, keeping devices and the applications on them in use is central to IT design: the user is a customer, and the technology is designed to nurture that custom.
Figuring out how to provide a meaningful and relevant experience for someone using a digital product or service, from a device to a social media platform, is known as user experience (UX) design. Tech giants talk about meeting our expectations before we even know them ourselves. And how designers know what we want before we want it comes down to the data they collect on us – and our children.
A flurry of recent lawsuits, however, highlights the line that such digital innovations – driven by profit and shaped by our personal data – have crossed in terms of harm to the user. These include the case against TikTok brought by former England Children’s Commissioner Anne Longfield.
Longfield’s case alleges that the video-sharing platform uses the personal information of its underage users for targeted advertising: everything from dates of birth, email addresses and phone numbers to location data, religious or political beliefs and browsing history.
The concern now is that user safety and privacy are at risk because profit is being prioritized over protection.
The usability movement that began in the late 1980s therefore now needs to make way for computer scientists’ notion of usable security: human-centered design in which safety is prioritized. Our research shows that many online applications are not fit for purpose, because they fail to strike a balance between usability and security (and privacy).
We need to explore the possibilities of open-source designs as alternatives – those not driven by profit. And we need to foster a moral awareness of technology in young minds: they are the programmers of tomorrow. Just as important as learning to code is understanding the ethical implications of what is being coded.
This article is republished from The Conversation. Read the original article.