In 2021, I deleted my Instagram. I was an 18-year-old in college, addicted to my phone, unable to focus, and struggling with my self-esteem. After clicking “delete”, I was immediately relieved of some of the pressures I had felt since downloading the app when I was 13. I began to wonder why my Instagram feed felt so addictive that I couldn’t put my phone away. I wondered how it knew exactly what content would keep me looking.
I requested my Instagram data
Before I deleted the app, I requested my data from Instagram, and the results shocked me. Instagram sent me an email with a large HTML file. It was a detailed record of everything I’d ever done on the app, from my first download to my final delete. There was a list of every post I had ever viewed and when I viewed them. There was a list of my followers, who I followed, and when we had followed each other. What struck me most was the very long list of every ad that had appeared in my feed; I’d seen 222 ads in just the last 7 days before deleting my account.
The data was separated into lists such as posts I viewed, posts I liked, and posts I commented on. Each came with a timestamp, so I could carefully follow my activity from post to post, account to account. Sitting in the year 2023, I could map my logic from 2018 by the minute, watching content leading me to other content. Looking at my data, I could step into a time capsule of my entire mindset from any point in my 5 years as an Instagram user.
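That shape – named activity lists, each entry carrying a timestamp – is simple enough to explore with a short script. As a hypothetical sketch (the entries and field layout below are illustrative, not Instagram’s actual export schema), here is how the separate lists could be merged back into one minute-by-minute timeline:

```python
from datetime import datetime

# Illustrative stand-ins for the export's activity lists: each entry
# pairs what happened with an ISO-format timestamp of when it happened.
posts_viewed = [
    ("viewed post by @boutique_brand", "2018-09-14T18:02:00"),
    ("viewed post by @student_athlete", "2018-09-14T18:05:00"),
]
ads_viewed = [
    ("saw ad from a sportswear brand", "2018-09-14T18:06:00"),
]

def merged_timeline(*activity_lists):
    """Flatten every activity list and sort the events chronologically."""
    events = [entry for lst in activity_lists for entry in lst]
    return sorted(events, key=lambda entry: datetime.fromisoformat(entry[1]))

for event, when in merged_timeline(posts_viewed, ads_viewed):
    print(when, event)
```

Sorting across lists is what makes the “time capsule” effect possible: once views, likes, and ads share one timeline, you can watch one piece of content lead to the next.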
I could match the targeted ads to stages of my life. When I was participating in sorority recruitment, ads from boutiques and women’s fashion brands followed, catering to my recent searches for new outfits. When I befriended student athletes who lived in my dorm, I stalked them on Instagram, and soon after began receiving ads for Nike sportswear. I could see the app making connections and taking stabs at what I might buy when given an easy opportunity with a quick click.
By tracking how Instagram tracked me, I saw the version of myself that Instagram saw. This giant file Instagram had sent me was, in essence, a paper-doll version of myself, constructed out of clicks. Instagram took my person and turned it into a profile. My life was flattened into commercial opportunities, with companies clamoring on the other side for a chance to play a part.
Instagram’s business model
The business model of Instagram and most other social platforms relies on immense amounts of data. Instagram makes money by selling the ad space on your feed to companies who think they might get you to buy their product if they can be in front of your face at exactly the right time.
Advertisers rely on the data that Instagram collects about you – like which ads you’ve clicked on and what you’ve bought under what conditions – to decide where and when to buy ads. Instagram wants to sell as many slots on your feed as it can, so it curates content based on what you engage with, content that seems to keep you on the app longer. More engaging content means more time spent on the app, which means more ads viewed and more data collected. It’s a never-ending, manipulative cycle.
The engagement model of keeping us online can encourage unhealthy behaviors. Social media shows you the content it thinks you can’t look away from, and sometimes these things hurt to see. If you spend time dwelling on an ex’s profile, the algorithm learns that showing you more of their posts will keep you scrolling. Sometimes the targeted ads we’re shown can be harmful. For example, women who have lost pregnancies have come forward with heart-wrenching stories about the immense weight of being targeted with ads for maternity and baby supplies for a child they will no longer have.
Instagram can be bad for teens
Young people are a big part of Instagram’s business model; more than 40% of its users are 22 or younger. For approximately 85% of my own time on Instagram, I was a minor myself. These are the users to whom Instagram poses the biggest risks.
In 2021, leaked internal research that Facebook – which owns Instagram – had conducted on its own product concluded that Instagram was making body image issues worse for 1 in 3 teen girls. The algorithm delivered weight loss content, like shockingly skinny women giving tips on cutting more calories, to teen girls. In one study, Facebook found that of teen Instagram users who felt “unattractive”, more than 40% reported those feelings started because of the app. Teens participating in Facebook’s internal studies resoundingly blamed Instagram for increases in anxiety and depression, unprompted by researchers.
Of all the social media apps, Instagram’s algorithm – according to its own research – is more focused on bodies and lifestyles. Looking at my own Instagram data, the app had assigned me Topics that included “Beauty”, “Style and Fashion”, and “Food and Drink”. These umbrellas offer a lot of room for dangerous content to slip into the feed of an unsuspecting teenager.
Just last month, the U.S. Surgeon General issued an advisory warning about the ways that social media “presents a profound risk of harm to the mental health and well-being of children and adolescents”. Similarly, the American Academy of Pediatrics has been calling on health officials to address the mental health impacts of social media on minors, such as increased rates of anxiety and depression, advocating for the 95% of American teens who use social media.
We need to regulate Big Tech
Social media companies have a lot of power. They have huge user bases – Instagram, for example, surpassed 2 billion monthly users in 2021, meaning it collects data on nearly 1 of every 4 people in the world. This industry has also largely been unregulated to date. These companies should be held to account the same as any other industry. As Senator Ed Markey put it at a hearing in 2021, “Facebook is just like Big Tobacco, pushing a product that they know is harmful to the health of young people, pushing it to them early, all so Facebook can make money.”
The good news is that change is possible. Last year, California passed the Age Appropriate Design Code Act, which requires tech companies to make design choices that make it easier for kids to log off and that stop excessive data collection. The FTC has taken recent steps to hold Meta (the umbrella company of Instagram and Facebook) accountable for its misleading use of minors’ data. And in Congress, Senators Markey (D-MA) and Cassidy (R-LA) have reintroduced COPPA 2.0 (the Children and Teens’ Online Privacy Protection Act), which would ban targeted ads to minors altogether and limit data collection on users under 17. All of this would help rein in companies that have made the details of our personal lives the foundation of their business model.
It’s been 2 years now since I deleted my Instagram account, and I don’t regret it for one second. When I was on the app, I was competing with a super-algorithm for space in my own brain. It amplified whatever served its commercial interests, whether or not those lined up with my own. If I can spend a couple of hours combing through my Instagram data and start to see the patterns, Instagram can analyze and act on all of that data in an instant.
When it comes to social media, we’re all impacted in one way or another. All of us have reams of data collected on us that shape our feeds and in turn how we see ourselves and the world. By downloading Instagram we have inadvertently entered into a game with no clear rules and no ump. Our opponent invented the game, and we don’t even know we’re playing.
Intern, Don't Sell My Data campaign
Anastasia Micich is an intern with PIRG's Don't Sell My Data campaign.
Director, Don't Sell My Data Campaign, PIRG; Policy Analyst, Frontier Group
R.J. focuses on data privacy issues and the commercialization of personal data in the digital age. Her work ranges from consumer harms like scams and data breaches, to manipulative targeted advertising, to keeping kids safe online. In her work at Frontier Group, she has authored research reports on government transparency, predatory auto lending and consumer debt. Her work has appeared in WIRED magazine, CBS Mornings and USA Today, among other outlets. When she’s not protecting the public interest, she is an avid reader, fiction writer and birder.