
Crazy Wisdom

Sep 18, 2023


  • Guest: Julie Fredrickson, Managing Partner at Chaotic Capital. Chaotic Capital is a seed-stage investment vehicle focused on ideas that adapt humanity to complexity. Follow her on Twitter.

Adapting to Complexity:

  • Julie raises the central question of the conversation: what does it actually take to adapt humanity to complexity?
  • As things change, it's vital to learn how to adapt and evolve. It's about more than just the traditional liberal-conservative divide.

Understanding Change:

  • Discussion on "Chesterton’s fence" – a principle highlighting the importance of understanding the reasons behind a tradition before discarding it. We often don't have all the information about why certain systems or structures exist.
  • The concept of a "conservative accelerationist" is introduced. What traditions or values do people want to preserve in the face of rapid change?

Historical Perspective & Our Understanding:

  • Examples from history highlight our ever-evolving understanding. A millennium ago, the earth was thought to be the center of the universe; 500 years ago, many believed it was flat.
  • The movie "Men in Black" underscores the fluid nature of human understanding, challenging our conceptions and paradigms.
  • Julie emphasizes the power of human inference – our unique ability to draw conclusions from information. Given our current knowledge, how does this skill evolve, and what questions do we need to ask to better understand our surroundings?

Role and Limitations of AI:

  • The conversation delves into whether we've provided enough context to AI systems.
  • A significant concern is AI alignment – to what or whose standards is AI being aligned? Is it the median human, societal norms, or something else?
  • The prospect of AI achieving human-like inference and skepticism is explored. Can AI be trained to question and fact-check itself?

The Internet, Open Source, and Corporations:

  • A retrospective on the internet's evolution, from closed, corporate-controlled networks to open source.
  • The dilemma of whether we'll be using corporate Large Language Models (LLMs) or open-sourced LLMs in the next five years.

Culture, Symbols, and Power:

  • Julie's career in fashion, beauty, and luxury offers insights into symbols and semiotics. Language isn't the only medium of communication. Who truly owns culture, and who holds the power?

Transhumanism and Medical Ethics:

  • Julie discusses her college years working in medical ethics, researching cloning from 2002 to 2006.
  • The topic of transhumanism is broached. Julie posits that in many ways, we are already transhuman, referencing technologies like fertility treatments.
  • There's a debate about the risks of corporate interests dominating AI and transhumanist technologies. The future is uncertain, and it's impossible to predict the outcomes fully.

Preparing for the Age of Acceleration:

  • Invoking William Gibson's observation that "the future is already here, it's just not evenly distributed," Julie ponders the accelerating pace of change.
  • She seeks solace and insights from groups focused on effective accelerationism and those termed "doomer optimists."
  • Julie touches on resource scarcity, arguing that human wants will always exceed available resources.
  • She briefly addresses the topic of prepping and the challenges associated with it.


  • Julie underscores the importance of agency. It's essential to ask who's influencing our decisions and who's potentially programming our behaviors.