Facebook’s unHappy 15th Birthday

Published by Mrs. AD

The Problem of a Fully Connected Lifestyle

In what seems to be a new trend in my discussion posting for this class, I had written an entirely different blog post based on my own experiences with social media addiction when Roger McNamee, one of the early investors in Facebook and a former mentor to Mark Zuckerberg, came on NPR’s 1A to discuss his new book, “Zucked: Waking Up to the Facebook Catastrophe.” Naturally, my head went spinning and I had to completely start over!

  • You can listen to the full interview with Roger McNamee and Alexandra Suich Bass here.
Roger McNamee (left), Mark Zuckerberg (right) (Photo credit: Boing Boing)

Last week was Facebook’s official 15th birthday, but it may well be its last, according to Roger McNamee. McNamee has been investing in tech companies since 1982 and knew Facebook was something special when he met Zuckerberg in 2006. Straight off, McNamee told Zuckerberg that someday soon one of the big tech companies would offer him an absurd amount of money to buy Facebook, but “don’t take it.” McNamee told him he could do so much better if he kept the company private. That was exactly the situation Zuckerberg was in, and that frank conversation led to McNamee becoming one of the many trusted advisors to the Facebook CEO over the next three years. Has anyone seen the HBO show Silicon Valley? That story right there was the plot of the first two episodes.

McNamee praised early Facebook (and Zuckerberg) as “extraordinary.” He attributed the early success to two things: first, users had authentic identities, since you could only join with a certain type of email address, making your identity verifiable and keeping trolling to a minimum; and second, users had full control of their privacy. Both of these have gone completely out the window in modern Facebook.

“He [Mark] believes that connecting the whole world on one network is a good idea for humanity, that it justifies whatever means are necessary to get there.”

Roger McNamee

Beacon was Facebook’s first big privacy issue, and one I had never heard of. Launched on November 6, 2007, Beacon connected to 44 other businesses and was Facebook’s first attempt at targeted advertising. It tracked where you were and what you did, then posted it on your feed, including what you purchased, where you purchased it, and how much you paid. One man bought an engagement ring for his soon-to-be fiancée, and his entire group of Facebook connections, including his girlfriend, read about it on his feed. The crux was that the user was not notified when a real-life transaction would be posted to their feed and had no way to turn it off. On December 5th, Facebook announced it would allow users to opt out of Beacon, and due to the bad publicity, it eventually pulled back from the entire Beacon idea. Why did Facebook go ahead with a technology that blatantly violated user privacy without their consent, or even their knowledge? McNamee says, “[Mark] believes that connecting the whole world on one network is a good idea for humanity, that it justifies whatever means are necessary to get there.”

Visual Representation of your ‘Filter Bubble’ (photo credit: Teleshuttle)

Facebook’s second big mistake, according to McNamee, came when Facebook and Google each began selectively showing you things that were relevant to your interests, instead of showing you everything that was available. Eli Pariser coined the term ‘filter bubbles’ for this in his TED Talk “Beware Online Filter Bubbles” (Pariser, 2011). Pariser liked seeing political news from both conservative and liberal outlets on his Facebook feed, but one day he noticed that all the conservative posts had disappeared. This was because he more frequently clicked on his friends’ liberal posts, and Facebook had noticed the trend. Based on his clicks, the algorithm would bias his feed to show more of what it thought he wanted to see. Facebook was not the only one doing this; Google, Yahoo News, and other media outlets were all creating a personalized experience of the internet based on what you had clicked on in the past. This search bias is what Pariser means when he refers to your “filter bubble.” Right now, if two people use Google to search for the word “London,” they will get very different search results, mediated by the algorithm’s personalized settings on their accounts.
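The click-driven narrowing Pariser describes can be sketched in a few lines of Python. This is an illustrative toy, not Facebook’s or Google’s actual system; the class name and the click-count scoring rule are invented for the example:

```python
from collections import defaultdict

# Toy model of a filter bubble (NOT any real platform's algorithm):
# posts from sources you click more often get ranked higher, so
# less-clicked sources gradually sink out of view.
class FilterBubbleFeed:
    def __init__(self):
        self.clicks = defaultdict(int)  # clicks recorded per source

    def record_click(self, source):
        self.clicks[source] += 1

    def rank(self, posts):
        # posts: list of (source, headline); most-clicked sources first
        return sorted(posts, key=lambda p: self.clicks[p[0]], reverse=True)

feed = FilterBubbleFeed()
for _ in range(5):
    feed.record_click("liberal_friend")   # frequent clicks
feed.record_click("conservative_friend")  # rare clicks

posts = [("conservative_friend", "Op-ed A"), ("liberal_friend", "Op-ed B")]
print(feed.rank(posts))  # liberal_friend's post now ranks first
```

Nothing here is malicious in isolation; the bubble emerges simply because the ranking only ever reinforces past behavior.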

Eli Pariser, “Beware Online Filter Bubbles” – TED Talk, May 1, 2011

I, too, remember noticing this trend, along with a trend in the ads I was seeing: they were related to items I had been looking to buy online. I searched Wayfair for sectional patio couches, and I started to see ads for them on Facebook. I looked for party dresses for New Year’s, and not only did I get ads for dresses on Facebook, I got ads for ’50s-style swing dresses just like the one I had ordered from Amazon that very same day. McNamee noted that when Pariser went public with his TED Talk, he thought it would be enough bad press for Facebook to do an about-face like it did with Beacon. It didn’t. Facebook did the opposite and doubled down.

This is where the Facebook advertising platform became so enticing: it could match products to the users it knew would be likely to buy them. For me this is a huge problem; if you didn’t already guess, I have a problem with online shopping. This ad matching does show me things I want to buy, but they are often of very poor quality and take weeks to ship from countries where the workers are exploited. Yet for some reason I still click on them when they show up, now IN my feed instead of on the sidebar. Currently on my scroll:

  • Spoonflower, a company I ordered custom fabric from;
  • organic diapers — ever since I had a baby my feed is LOADED with baby products;
  • Wonder powder foundation, perhaps because I used to be a makeup artist;
  • a game to help pick out your wedding dress (I recently searched for dresses after a friend told me about the one she bought);
  • ThredUp, an online thrift store whose ads show really cute clothes that you can never find when you go to their site;
  • the Toyota C-HR, because I googled 4-wheel drive once… you get the picture.

These are not things I want to see; they are what an algorithm thinks I want to see. The most frustrating piece is that I can’t turn them off. I can modify the interest boxes Facebook has checked off for me, which would literally take days, there are so many, but this wouldn’t stop the ads, because the next time I searched for something it would be added to my Facebook “interests.” For giggles I went into my settings to see what was listed, and a section I was not familiar with came up: “Advertisers Who Uploaded a Contact List With Your Information.” Apparently, every car dealer and real estate broker in the nation had ‘uploaded a contact list with [my] information.’ I X’ed them out one by one, each time getting a message asking me to confirm, “Do you really want to do this?” After 2.5 hours I was no closer to the end of the list.

Moving on to a more recent Facebook scandal: Cambridge Analytica. McNamee went to Zuckerberg and COO Sheryl Sandberg about 10 days before the 2016 presidential election to warn them that their data collection could be used to influence the election. He had noticed groups springing up that looked inauthentic. A company was collecting data on people interested in Black Lives Matter and then turning around and selling it to police departments. And then there was Brexit, where Facebook had possibly “enabled inflammatory messages to outperform neutral messages.” The heads of Facebook responded that they appreciated him looking out for them, but that those were isolated incidents they already had covered. That was the last time McNamee spoke to the CEO and COO. Then the Cambridge Analytica story broke, and McNamee has been a self-proclaimed “activist” against Facebook’s business model ever since. Yet he still owns considerable stock in the company. He feels that if the ship ends up going down, as someone who was there on the ground floor he should go down with it. An interesting case of “follow the money,” since he stands to lose so much of it if his “activism” succeeds.

Zuckerberg (center) testifying in front of the Senate Judiciary and Commerce Committee (Photo credit: Vox)

So why not just leave Facebook? If you don’t like it, delete your account and sign off. McNamee goes on to say that the true business model of Facebook is to “manipulate your attention so you spend more time” looking at the ads. To accomplish this, Facebook implemented a reward system via notifications and likes. These trigger our base instincts of fear and outrage, because that is what grabs our attention and fires us up to share with others, ultimately to receive feedback that others agree with our feelings. How interesting; isn’t that how news headlines have been working? Run the sensational grabber, who cares if it isn’t entirely true: it gets people to read our stuff. It gets people to spend more time interacting with Facebook. The habitual pattern of responding to notifications and the gratification of seeing people like or agree with your comments feed into behaviors of addiction. McNamee’s way of checking your Facebook addiction status is to ask yourself how early in the morning you check your phone for updates. Before you go to the bathroom? While you’re going to the bathroom? Before you get out of bed? Myself, I check it while I’m still in bed. For me it’s not just Facebook; I also check my emails, my calendar, the weather, Reddit updates, catch a Pokémon, and send out a few text messages, all before I put on pants. It really is like a gambling addiction. Casinos play on the same system of rewards and fear of missing out on a bigger score, or the anger of completely busting before you go home.

My conclusion: I’m doomed, save yourself. I went all out for that tantalizing carrot on Facebook’s string, and now everything there is to know about me is known; I will be at the mercy of their marketing skill until there are laws to stop all this. Now that I know this, what am I to do when my daughter is old enough for her own phone or computer? Should I encourage social media in my classrooms even though I know how manipulative it can be? We can’t protect our children forever, and the harder we push them away from something, the more enticing it will be for them to break through and try it for themselves. I noticed my addiction well before the scandals started to break. After the election in 2016, I throttled way back on my use of social media. I didn’t sit with Facebook open on my computer at all times; I made a point to get away from my computer games and do something in the physical world, even if it was only to trade one screen for another and watch a movie. I leave my phone in random places in the house and don’t go looking for it until I’m going to bed. I can sympathize with what Paul Miller experienced in his year offline. When I’m not on Facebook or I don’t have my phone handy, I miss all kinds of messages, events, and photos. It annoys my husband because he can’t reach me when he needs to know something important, like: do we need milk from the store? Recently I have started to worry about the amount of time my current courses will keep me online, browsing for research or experimenting on Twitter. I appreciate the moments of solitude when I ignore the beeping of my phone and enjoy the moment I am having, be it cuddling with my daughter or playing with the cat. The notifications, the emails, the workload will all still be there when I pick it back up in an hour or so.

I leave you with one more quote from the interview with McNamee, which I think sums up the core of this predicament:

“It’s not a fair test because you make the first move, you get on there [Facebook] and you start using it, but after that they [Facebook] are doing everything in their power to manipulate your attention. And, once you are addicted your ability to control is much less… There’re literally a million things that could be in your feed at any point in time, they pick the 20 that are best for their business, not best for you.”

Roger McNamee
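McNamee’s “pick the 20 that are best for their business” point can be illustrated with a toy ranker. The function, the engagement scores, and the post names below are all invented for the example; the point is only that the selection criterion is a business metric, not the user’s stated preferences:

```python
import heapq

# Toy sketch of engagement-optimized feed selection (not a real system):
# from many candidate posts, keep the top k by predicted engagement,
# a proxy for ad revenue rather than for user benefit.
def pick_feed(candidates, k=20):
    # candidates: list of (engagement_score, post_id); highest scores win
    return [post for _, post in heapq.nlargest(k, candidates)]

candidates = [(0.9, "outrage_post"), (0.2, "nuanced_post"),
              (0.7, "ad"), (0.4, "friend_update")]
print(pick_feed(candidates, k=2))  # ['outrage_post', 'ad']
```

The nuanced post never makes the cut, not because anyone censored it, but because outrage and ads score higher on the metric being optimized.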


McNamee, Roger & Johnson, Joshua. (Feb 7, 2019). Why We Keep Forgiving Facebook. 1A. NPR. Retrieved from https://the1a.org/shows/2019-02-07/why-we-keep-forgiving-facebook

Pariser, Eli. (May 1, 2011). Beware Online Filter Bubbles. TED Talks. Retrieved from https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles/transcript?language=en#t-96094

Wikipedia. Facebook Beacon. Retrieved from https://en.wikipedia.org/wiki/Facebook_Beacon

