Sunday, July 7, 2013

To Save Everything, Click Here

by Evgeny Morozov. This was quite a read. It's not often that I painfully jot down 11 pages of notes in my ebook reader. I guess it's because he has strong opinions about most everything I'm interested in.

Quick overview: railing against "internet-centrism" and "solutionism." Internet-centrism is the modern trend to ascribe magic powers or moral judgments to "The Internet." (Piracy should be tolerated (instead of DRM) because The Internet! We should change government to be more like Wikipedia because The Internet!) Solutionism, a broader topic, is the tendency to see any phenomenon as a problem to be solved.

Caveats: he's kind of a jerk. I'll call him out on it explicitly later. It's really frustrating; I want to like this book more, but often he just resorts to bullying. And his caustic writing style is probably calculated just so people like me will get all fired up about his book! In that case, sir, good job. (I hope I never meet you.)

Anyway, solutionism. A good example of something that's maybe not a problem to be solved is cooking. He makes fun of all these kitchen technologies that help you cook "more perfectly" - which is to say, more accurately or more efficiently, silently assuming that accuracy or efficiency is what we want. "Here is modernity in a nutshell: we are left with possibly better food but without the joy of cooking."

Internet-centrism: people complain about Apple creating a closed ecosystem. But just why is open better? In Apple's case, for selling you some sweet apps, maybe closed is a better model. (Morozov's thoughts, not mine.) Arguments for openness often (not always) resort to "because openness! The Internet!"

Other things that Morozov says that I agree with him on, or am open to considering:
- Silicon Valley libertarianism is bad: hell, we wouldn't even have The Internet without public financing.
- "Open government" isn't a goal - or at least, that openness isn't a goal in itself. For a few reasons: 1. it's a huge pain to have to document every damn thing you do; 2. it opens the doors to the public to nitpick every last little spending decision (some of which are unintuitive but turn out well). Efficiency is great, but not necessarily our #1 goal in our government.
- Technological solutions often have unintended consequences. If you publish crime stats by neighborhood, sure, that helps home buyers/renters find safe neighborhoods, but it also hurts sellers in high-crime areas, which might make them less likely to report crimes in the first place.
- You can't rely on Yelp-style crowds for everything. (Hell, you can't even rely on Yelp-style crowds for Yelp.) I don't want crowds telling me where to eat, much less how to vote.
- If you publish some metrics (like senator attendance records) then people will optimize for them. (this can be problematic. maybe one senator has more important things to do one day.)
- We shouldn't take Google search results as gospel; they can be manipulated. (Interesting question then: given that we do take them as gospel, what should we do about it?)
- "Internet-centrism is at its most destructive when it recasts genuine concerns about the mismatch between what new digital tools and solutions have to offer and the problems they are trying to solve as yet more instances of Luddite and conservative resistance."
- Complicated computer algorithms, like any other decision making tool, reflect the biases of their creators. But complicated algorithms offer a (sometimes real, sometimes fake) appearance of objectivity. (for example, for police work.) (sounds like a call for intelligibility.)
- Oh man, great stuff about SCP (situational crime prevention), the law enforcement philosophy that says you should make it impossible to commit crimes, rather than just punishing people who do. This is really interesting. For example, if I decide I'm not going to eat cookies, I want SCP-style prevention there! I want to make it impossible for me to eat cookies! He agrees: as long as you make the decision yourself, there's no problem "shifting registers" (a term from Roger Brownsword) from the "moral" register (x is good or bad) to the "practicable" register (x is easy or hard) or the "prudential" register (x helps me or hurts me). So he's against "nudges," which shift that register for you. I can dig it. When we shift things out of the moral register, though, we might never even think about them again.
- Excessive quantification in the research world is a mess. People are gaming metrics, counting publications and citations too much, etc.
- Food is a good example of solutionism gone wrong. We decide fat is bad, so everyone counts grams and does all sorts of nasty tricks to call their products "low fat", only to later discover that fat's not so bad after all.
- Furthermore, Quantified-Self solutions can backfire by putting the onus back on the individual, rather than on the broken system. ("Why didn't you just count your calories?")
- Memory != preservation. And we shouldn't assume that we should preserve everything.
- We talk about "information nutrition" (e.g. The Economist is healthy, tabloids are junk food), but we really have no idea what we're talking about.
- You can't really design for serendipity.
- Gamification via points and badges is dumb. (okay, duh.)
- Check out Albert Hirschman's futility-perversity-jeopardy trio as a set of common reactions to new things.
- We should try to change people's behavior mostly by reflection, not by paternalistically "nudging" them into making the right decisions. I don't know, though; his example, some complicated parking meter thing, sounds like the 5-cent credit you get at Whole Foods for bringing your own bags, where you then have to decide where to donate it. I don't want to think about those 5 cents. Just let me get on with my day.
- Sure, everyone's always manipulating us. But I want to know when it's happening.

Anyway, the overall feeling I got from reading this is that maybe my earlier research thoughts are misguided. I basically heard the Weiser ubiquitous computing story ("your computers will fade into the background" etc) and thought "yes, let's do that." Maybe perfect efficiency isn't always the goal!

Some things that Morozov says that are stupid: (if he's going to bully people, I can bully him back)
- somehow make "open government" data (say, campaign finance) only appear on the original source (like the FEC website) so people can't re-publish it and selectively alter or highlight it.
- the "Pirate party" in Germany is losing support, therefore their ideas are failing and should be mocked.
- LiquidFeedback, this tool that sounds like Google Moderator, is a "solution to a problem that doesn't exist" - we don't need more feedback from, say, politicians. 
- partisanship isn't necessarily a problem
- Amazon might start automatically generating books! That are creepily optimized to be exactly what you like! Never mind that this is probably AI-complete!
- We should start having all algorithms be audited by qualified third parties. (oh my god. what constitutes an algorithm? geez.)
- "Once everyone is wearing Google's magic glasses, the costs of subjecting friends to a mini lie detector... are trivial."
- Argh, he totally doesn't understand some of the systems he's writing about and mocking. (e.g. Matthew Kay's Lullaby)
- Quantified Self people are weird and gross. (seriously, this whole chapter is just straight up bullying. it's really offensive.)
- Quantified Self people are all into some weird woo-woo shit about revealing the deeper truth of who we are, with numbers! What a bunch of misguided weirdos!
- Self tracking for health purposes makes a mess for the insurance industry, so we shouldn't track things about ourselves.
- If you like quantifying something, then you must be a Super Quantifier who wants to quantify everything!
- "Even though Bell doesn't quite put it this way..." (... puts words into his mouth.)
- Gordon Bell is a weird guy. I can just dismiss anything he does by mocking him.

Some comical excerpts from my notes as I was reading this:
"no, you numbskull."
"sigh"
"this section sounds fearmongery"
"the bullying in this section makes me wonder about the rest of this book."
"not what he said. you clown."
(on chapter title "Gamify or die") "I hate this chapter already"
(when he starts talking about extrinsic vs. intrinsic motivation) "sigh I knew this was coming"
"ad hominem, you're a jerk, etc"
