Lately I've been flirting with psychological immobility over a Dreadful Complexity. This whatever-it-is stubbornly resists definition (which of course feeds its dreadfulness), remaining tantalizingly near. Vicious.
So instead of definition, I'd like to share some things I've encountered recently that have made me go "Aha! This is related to the Dreadful Complexity!"
Below each, a few bullet points of my major takeaways.
The State of the World Conversations
Bruce Sterling and Jon Lebkowsky on the WELL
- Everything is getting more complex, exponentially faster
- This is finally starting to break humanity on many levels
- Eastern Europe might be okay, but not for long
Do We Need a Better Archive of the Internet?
PBS Idea Channel
- The Internet is getting more complex, exponentially faster
- It's extremely difficult to archive
- Notions of history, memory, and value no longer apply, and we can't create new ones fast enough
Historical Narrative, Futurism and Emergent Network Culture
Bruce Sterling at the European Graduate School
- We're living in a post-postmodern atemporality
- Network Culture is rapidly supplanting Digital Culture
- Everything is more complex in this future-present-past than even the sci-fi writers and futurists imagined
A New Culture of Immersiveness
Buddhist Geeks Podcast
- We're more constantly distracted than we've ever been
- That's super bad for us, because ads
- Don't worry, Silicon Valley has a solution coming soon!
China Miéville on Writing
Edinburgh World Writers Conference 2012
- China Miéville is a badass
- The novel is dead! Long live the novel!
- Nobody has a clue what to do, and most writers are worried too (but the poets are fine)
Meanwhile, Back in My Head
Bruce Sterling ended his Closing Remarks at the latest SXSW with a part-resigned, part-hopeful...
"I’m just gonna try and build some stuff."
To call Bruce "hopeful" is at best ironic, and at worst grossly inaccurate, but I'm starting to seriously dig his sentiments.
A meta-social Kaiju seems to be evolving faster than we can think. And something tells me we don't have the allegorical Jaeger to face it yet. Maybe Watson and Wolfram Alpha will get together to birth an AI-Moses that will lead us to the metaphysical / social / economic Promised Land. Kurzweil and his ilk are certainly banking on it. I'm not convinced. The film Her persuaded me that a superintelligence is more likely to just leave us behind than ascend us into Singulatarian Heaven. Good argument for adding Jesus-class compassion algorithms to the Three Laws.
Blogging through this has helped, a little. Is there something here? How do you feel about all this? Have you felt anything similar?
Talk to me, Others. I need you.