The Instagram Trials
Artwork from EuroNews
The recent courtroom proceedings involving Mark Zuckerberg and the leadership of Meta have forced an unusually public conversation about the impact of Instagram on young people. Much of the testimony has focused on harmful content. Parents, lawmakers, and internal company research have pointed to the way certain material, particularly content related to self-harm, eating disorders, and extreme body comparison, can reach teenagers through the platform’s recommendation systems. The legal question being debated is whether the company understood the scale of this harm and failed to respond appropriately.
This concern is serious and deserves attention. However, the public conversation risks narrowing the issue too much. When the discussion focuses only on harmful posts, it treats the platform as if it were a container that occasionally distributes dangerous material. The deeper issue is not only what appears on the platform. It is how the platform itself is built.
A useful way to think about the problem is through a simple analogy that researchers often use when talking about systems. If something in the aquarium has gone wrong, you do not begin by blaming the fish. You examine the water, the tanks, the filtration system, the lights, the conditions that shape how everything inside behaves. When the current conversation about social media focuses exclusively on bad posts, it is essentially blaming the fish instead of examining the aquarium.
As many of us are aware, Instagram is an engineered environment designed to capture and maintain attention. Every part of the interface contributes to that goal. Infinite scrolling removes natural rest points, notifications repeatedly pull users back into the application, and algorithmic ranking continuously adjusts the feed in order to surface the material most likely to keep someone engaged. These mechanisms are structural features of the product.
This matters because the platform does not only distribute information. It shapes behavior. When someone opens the app, they enter a feedback system designed to hold their attention for as long as possible. The longer a person stays, the more data is collected and the more advertisements can be delivered. Engagement becomes the core metric around which the entire environment is organized.
For adults this often produces distraction and compulsive checking. For children and teenagers the effects can be more intense. Younger users are still developing the cognitive systems that regulate impulse control and attention. When those developing systems encounter an interface deliberately optimized to prolong engagement, the contest is uneven: the user is adapting, in real time, to a machine that has been refined by enormous amounts of behavioral data.
Internal research discussed during the hearings suggests that the company has long been aware of how these systems influence young users. Studies conducted within Meta have documented correlations between heavy platform use and increased anxiety, body image distress, and depressive symptoms among teenagers. Yet the architecture of the platform has remained largely unchanged, because the same design patterns that contribute to psychological strain are also the patterns that drive growth.
This is the central tension at the heart of the trial. The platform’s economic incentives reward the very behaviors that raise concerns about well-being.
Because of this, the conversation about protecting children online cannot stop with content moderation. Removing certain categories of posts may reduce some harm, but it does not address the deeper structure that governs how attention is captured and sustained. The interface itself remains optimized for stimulation.
If society wants to take the problem seriously, the design of these systems must become part of the discussion. Other industries that influence human behavior face scrutiny at the level of structure. We evaluate how cars are built, how pharmaceuticals interact with the body, and how financial systems distribute risk. Digital platforms now operate at a scale large enough to shape everyday cognitive life, yet their underlying design choices often remain invisible to the public.
The hearings have opened an important door by placing social media companies under direct examination. That said, focusing only on harmful content risks addressing the symptom rather than the mechanism. The real power of platforms like Instagram lies in their interface design. That design determines how attention moves, how habits form, and how long people remain inside the system. Until the conversation shifts from the fish to the aquarium, the structure that produces the behavior will remain largely intact.