— Simply lovely. In Be Wrong as Fast as You Can, New York Times Magazine editor Hugo Lindgren lays it all on the line in a first-person confessional with a moral for us all. Now, please excuse me, but I must stop procrastinating by reading Everything On The Web and get back to it.
— I can’t remember how this link came my way or figure out why it resurfaced after a year (thank you, Internet) but this is a great list of advice on how to tell a good story from Pixar artist Emma Coats. These great tips are useful in so many contexts other than formal storytelling. For instance, I can totally imagine using them to try to make presentations more interesting, less dire.
— Sir James Dyson, founder of Dyson, outlines his approach to innovation, design and risk management, critical when the economic chips are down.
How To Fail Less is a nice Q&A with entrepreneur Steve Blank, who explains his “Lean Launchpad” methodology for teaching entrepreneurship. Based on the idea that initial hypotheses are most often entirely wrong, this approach to creating a business has a lot in common with the design techniques of prototyping and iterating.
[Story via Erik Van Crimmin]
Furniture company Steelcase turns 100 years old this year, and so naturally they’ve launched a website that has nothing whatsoever to do with chairs and sofas. Instead, they’ve corralled journalist John Hockenberry to front a site that canvasses dreams for the next century, asking all sorts of dignitaries to describe their best hopes for the world’s survival. Luminaries include the likes of Deepak Chopra, Roger Martin — and my boss and Doblin co-founder, Larry Keeley, who writes about his dream of reinventing participative democracy. Here’s his essay.
Most people think of the United States as a very young country. Given the childlike way we often act, this is a natural enough impulse, but they are wrong. The U.S. is the oldest participative democracy on our small blue planet.
So consider this: perhaps countries have a natural life span, and we are at the end of ours. I am not a Chicken Little alarmist saying we will suffer the inevitable decline of all empires before us. Instead my message is positive and deeply focused on innovation: let’s reinvent participative democracy for the 21st Century.
And not just for the U.S. This reinvention should be a gift to the world – equally valuable in any land and for any people; useful at any scale: team, firm, town, city, state, province, region, country, and continent. To live up to its own promise, the U.S. should create it and should adopt it first but, hey, it’s an election year, so the odds that our “leaders” will do anything useful and path-breaking during this period of national embarrassment verge on zero.
How might we do it? Three revolutions, elegantly integrated:
1. Change how we fund candidates.
Since every country does this more wisely than the U.S., there are many good models to learn from, adapt, and adopt. Lawrence Lessig has the most interesting proposals for reinventing U.S. election funding, including the simple idea that all donations should be anonymous (so that even the candidate can’t figure out who donated) and reversible, so an individual or a company can donate today and claw back the funds next week. This means that candidates will have no idea who they are supposed to suck up to – raising the odd possibility that they might actually focus on making good decisions for rational reasons. Crazy, huh?
2. Reinvent how we understand issues.
To get past tiresome sound bites and attack politics we should harness new approaches to journalism that emphasize analysis as much as news. We see bits of this all over the web, but an ideal system would harness the power of deep data, information modeling, and great information design so that we can make it easier for anyone to understand hard things: Does capital punishment reduce crime? Does gun ownership make us safer? Roll it all up through devices integrated with social media so you can deeply understand something in a few minutes – on your tablet PC or smartphone. And, hey, it gives journalists a great new role.
3. Transform how we do polling.
People are polite. Ask ’em what they think and they will tell you. For decades now, we have resorted to daily, inane polls to help our “leaders” determine what people think. Instead, using deliberative polling, we could create practices where we would learn how people’s opinions change in the presence of objective, factual information about a topic. This would help leaders know what sensible, representative groups of people believe when they are taught actual facts.
— In Spending Other People’s Money: What Professors and Doctors Have in Common, Forbes writer David Whelan outlines a proposal to deal with the problematic disconnect between the one doing the prescribing (doctor/professor) and the one actually supplying (textbook publisher/pharmaceutical manufacturer).
I loved this presentation by type specialist Jonathan Hoefler of the type foundry Hoefler & Frere-Jones, which will obviously appeal to designers but also affords those from outside the discipline a great insight into the complexity and depth of the design process. At the recent AIGA Pivot conference, Hoefler detailed the intricacies of the type design process and explained the nuances and difficulties of translating type to new media such as the various screens most of us now take for granted and use so constantly throughout our everyday lives. As he explained, the web in particular has been dominated by “pretty bad fonts”, and designers have for the most part capitulated to the demands and restrictions of the technology. According to Hoefler, designers have stopped thinking about what users/readers need and instead are content to serve up what they happen to have on hand.
Instead, he argued, there’s a real need for designers to take a step back and to create systems that can convert an idea into an appropriate form. Then he ran through the design decisions that went into Retina, a typeface Hoefler & Frere-Jones designed for the Wall Street Journal’s stock listing system. It’s totally fascinating, not least because many of the design decisions would never be noticed or experienced by the reader. And the idea of thinking about a system, not merely the tangible product or obvious outcome of a design process, is a smart one that pervades much of the most useful innovation thinking.
— I don’t flag this post for its “tell-all” inside scoop on Twitter, as in fact this seems to be one of those more standard “I’m leaving, hey are you hiring?” type of notes that have become de rigueur for those in the Valley looking to send out a signal to potential future employers. But among the platitudes, On Leaving Twitter does usefully remind us of the difficulties faced by companies charged with managing scale, particularly at the rate that companies such as Twitter have had to deal with it. The above stories, that projects are judged on the cleverness of their bird-related names or the popularity of their stakeholders, are surely not the whole truth, but they do sound some alarm bells. Certainly, they act as a signal that perhaps internal processes and systems aren’t yet robust enough to ensure that innovation and ideas can be brooked from all quarters internally. That’s an all-too-common problem that will only get bigger as the company continues to grow.
I recently finished two books on the birth and growth of the computer industry: Steven Levy’s classic Hackers, and Walter Isaacson’s biography of Steve Jobs. Both left me pondering the topic of stealing ideas. For it’s Jobs who once said so memorably:
Picasso had a saying—‘good artists copy, great artists steal’—and we have always been shameless about stealing great ideas.
Yet in Isaacson’s book, Jobs is also at his most vitriolic when talking about the brazen temerity of others daring to steal Apple’s ideas. To wit:
On Microsoft Windows:
They just ripped us off completely, because [Bill] Gates has no shame.
On Google’s Android:
Our lawsuit is saying, “Google, you fucking ripped off the iPhone, wholesale ripped us off.” Grand theft. I will spend my last dying breath if I need to, and I will spend every penny of Apple’s $40 billion in the bank, to right this wrong. I’m going to destroy Android, because it’s a stolen product. I’m willing to go to thermonuclear war on this. They are scared to death, because they know they are guilty. Outside of Search, Google’s products—Android, Google Docs—are shit.
Now one might write off this contradiction as simply being characteristic of the late, mercurial CEO. But these days, stealing ideas seems to have become an accepted, encouraged part of culture. In a great piece about the digital creative meritocracy, Brain Pickings’ Maria Popova quotes Robert Levine, author of a new book on the topic, Free Ride.
In Silicon Valley, the information that wants to be free is almost always the information that belongs to someone else.
That’s what Jobs was saying. It’s perfectly ok for Apple to steal other people’s ideas. But anyone stealing from them should hang his or her head in shame for being such a low form of humanity.
This idea of stealing ideas is also touched upon in Hackers, which concludes with the fascinating story of Richard Stallman’s efforts to subvert work at Symbolics, a company he felt violated the hacker code of the free flow of information. His revenge: to reverse engineer every development Symbolics introduced and hand it, for free, to the company’s main rival, LMI.
For me, the whole topic raises a ton more questions than it supplies answers. For example, if anyone can steal an idea and reproduce it wholesale then does that imply that the idea wasn’t that great in the first place? When is building on someone else’s idea ok (standing on the shoulders of giants)? And when is it, you know, plain old, actual theft? Would love to hear your thoughts.
Oh, this is amazing. “Onward to the Edge” is the twelfth installment of Symphony of Science, a musical project by John D Boswell designed “to deliver scientific knowledge and philosophy in musical form.” This episode features auto-tuned insights from science world luminaries such as Neil deGrasse Tyson, Brian Cox and Carolyn Porco, and it’s ridiculous and uplifting in equal measure. I particularly love Tyson’s concluding sentiment: “There are times when, at least for now, one must be content to love the questions themselves.”
— Artist, illustrator and unapologetic all-round kook, Laurie Rosenwald tells it like it is.
This weekend saw Thrilling Wonder Stories 3 take place in both London and New York. I went along to the Saturday session at Studio-X in Manhattan, and was thrilled by presentations from the likes of Washington Post writer, Marc Kaufman, talking about life on Mars, and Morris Benjaminson discussing the whys and wherefores of In Vitro Meat, or as he’d like you to call it, “Muscle Protein Production System.” (Blech.) The above quote came courtesy of materials science professor, Debbie Chachra, who was describing how she conducted research into the mysterious structure of a cellophane-like material made by bees to protect their eggs. After exhaustive research, it turned out to be protein, not polyester, and not so mysterious after all.
But her point, that if something seems original then you likely haven’t done enough research, was something of a theme of the day. Author James Fleming told entertaining stories of researching his book, Fixing the Sky, on the history and technology of weather control. He was also vehement about the fact that while we often think ideas are unprecedented, they really are no such thing. “These are not new ideas,” he said, of concepts such as hurling sulphates into the atmosphere or aiming toxic clouds at enemies. “This situation is precedented.” The moral of the story: read more; think more; learn more. All in all, not a bad lesson for a snowy Saturday afternoon.
Great piece from science fiction writer Neal Stephenson in the World Policy Journal. Innovation Starvation is a lyrical lament for the loss of imagination that Stephenson feels has accompanied so many of the technological developments in recent years. He writes:
Most people who work in corporations or academia have witnessed something like the following: A number of engineers are sitting together in a room, bouncing ideas off each other. Out of the discussion emerges a new concept that seems promising. Then some laptop-wielding person in the corner, having performed a quick Google search, announces that this “new” idea is, in fact, an old one—or at least vaguely similar—and has already been tried. Either it failed, or it succeeded. If it failed, then no manager who wants to keep his or her job will approve spending money trying to revive it. If it succeeded, then it’s patented and entry to the market is presumed to be unattainable, since the first people who thought of it will have “first-mover advantage” and will have created “barriers to entry.” The number of seemingly promising ideas that have been crushed in this way must number in the millions.
I’d write this off as beautifully written, thought-provoking overkill were it not for the fact that only last weekend, I ruined a perfectly good conversation by resorting to Google to resolve a question that came up. By now, this is pretty much standard practice for the smartphone owner. But the thing is, it would have sparked a lot more imagination and doubtless been a funnier, more memorable and interesting (though perhaps rather less accurate) discussion had we all continued just to make up answers to the question at hand. Surely the real challenge is to find a time and a place for both free-spirited invention and buttoned-down application. Still, there’s plenty of food for thought here about the state of the innovation world we live in. I also love the fact that, according to his bio, Stephenson is committed to seeing that “BSGD” (“Big Stuff Gets Done”).
Good, short video interview with Goodby Silverstein’s Gareth Kay, discussing how important it is to get clients to engage on both a functional and emotional level. Yes, he’s coming from an advertising background, but the idea is applicable in many other contexts too.