https://medium.com/@dsearls/the-human-solution-to-facebooks-machine-produced-problems-also-won-t-work-3364656bc257


Nobody can fix this.

Facebook is doomed.

True, all companies are mortal. (Geoffrey West has been telling us how and why for years.) But Facebook is actually designed to fail in a world that stops tolerating the way Facebook works, and Facebook can’t quit working that way.

Alas, what will make it fail is what makes it work.

I first wrote about this way back in 2012, in After Facebook Fails. I had the timing wrong (expecting a crash in 2013), but the reasons right:

  1. That Facebook is based on what @EliPariser in The Filter Bubble calls “a bad theory of you”—even though the company clearly knows how to herd billions of people into its pen.
  2. That Facebook was building a massively complex machine for attracting, manipulating and advertising at people that was both uninterested in understanding itself and impossible for its operators to completely understand. In other words, it wasn’t designed or built to fully account for everything it does.
  3. That things would go badly because of #1 and #2.
  4. That the right response by the market was to let giant flawed systems such as Facebook’s fail, and finally to start putting social and business engagement tools in the hands of human beings, where they should have been in the first place.

What I didn’t factor in were —

  1. How fully business would buy into the promise of Big Data, sold by McKinsey, IBM, Salesforce, Microsoft Dynamics, Oracle and other arms merchants to big business, utterly mindless toward the inevitable collateral effects, including damage to market trust, accumulation of toxic assets, and regulatory restrictions such as the GDPR. (I unpack this in After Peak Marketing.)
  2. How in the course of doing that, business would (again encouraged by arms merchants) hand way too much budget and power to CMOs, who should have been called DTOs, since they were really Digital Targeting Officers, operating with nearly zero respect for the marketing lessons taught by Peter Drucker, Theodore Levitt, Philip Kotler, Jack Trout, Al Ries and other lighthouses on the rocky shoals of negative market sentiment, against which companies led by bad DTO navigation now feel their hulls grinding.
  3. How much bigger adtech and martech (and their biggest practitioners, Facebook and Google) would need to grow, fed by DTO-led spending, before their business methods — especially around data gathering — would begin to fully earn the collateral effects of #1 and #2.
  4. That the first large-scale responses to surveillance overreach would not come from the market (there was almost none of the investment I called for back in 2012), but from regulators, most notably with the EU’s General Data Protection Regulation (GDPR), which became enforceable in May 2018.
  5. That all the internal policies and controls Facebook could put in place would not counteract the thing it does best for its advertising customers: microtargeting. When you know as much as Facebook does about everybody on it, and you add what advertisers can also know (as did Cambridge Analytica for the Trump Campaign), you can manipulate people personally, without either the person or the advertiser knowing it. (Or with enough plausible deniability to get away with it.)

In Facebook Can’t Be Fixed, John Battelle argued on 7 January 2018 that “Facebook’s fundamental problem is not foreign interference, spam bots, trolls, or fame mongers. It’s the company’s core business model, and abandoning it is not an option.” In an earlier piece titled Lost Context: How Did We End Up Here?, John revisits how Facebook got into that business, and how things went to hell after that. His subhead explains, “Facebook and Google’s advertising platforms are out of control. That used to be a good thing. Now…not so much.”

That’s an understatement. See, Facebook’s form of advertising amounts to voyeurism for hire. That was never a good thing. At a certain point having Goliath peek into personal windows creeps out too many people, including regulators.