
Crazy Wisdom


Aug 28, 2023

Show Notes for Crazy Wisdom Podcast Episode with Subutai Ahmad

Introduction

The episode features Subutai Ahmad, the CEO of Numenta and a pioneering figure in both neuroscience and artificial intelligence (AI). The discussion navigates the complex relationship between the human brain's architecture and contemporary AI models like deep learning systems. Topics range from the historical evolution of these disciplines to the cutting-edge research that could shape their future.

Historical Perspective

The initial inspiration for artificial neural networks came from our rudimentary understanding of how neurons and their connections work, dating back to the 1940s. Donald Hebb's ideas about how synapses strengthen with use shaped learning in neural networks all the way through to the back-propagation models developed in the 1980s. Hebb's work, combined with Hubel and Wiesel's discoveries in the late 1950s, laid the groundwork for understanding how neurons learn features of the visual world, from edge detectors up to higher-level shapes.

State of Neural Networks Today

Despite these advancements, today’s neural networks still rely on a highly simplified model of what a neuron is, and they differ fundamentally from biological systems. One glaring difference is power consumption: the human brain runs on roughly 20 watts, while running a large deep learning network can require power on the scale of an entire city.
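To make that simplification concrete, here is a minimal sketch (plain NumPy, not tied to any framework discussed in the episode) of the "point neuron" abstraction most deep learning systems use: a weighted sum of inputs passed through a single nonlinearity. The names and numbers are illustrative only.

```python
import numpy as np

def point_neuron(inputs, weights, bias):
    # The simplified "point neuron" used by most deep learning models:
    # a weighted sum of inputs passed through one nonlinearity (here ReLU).
    # Biological neurons, by contrast, have thousands of synapses arranged
    # on active dendrites, all of which this abstraction ignores.
    return max(0.0, float(np.dot(weights, inputs) + bias))

# One artificial neuron responding to a 4-dimensional input.
x = np.array([0.2, 0.8, -0.5, 1.0])
w = np.array([0.4, -0.1, 0.7, 0.3])
print(point_neuron(x, w, bias=0.1))
```

Everything interesting about dendrites, spike timing, and local learning is collapsed into that single weighted-sum line.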

Learning Modes and Algorithms

Deep learning systems usually operate in two distinct modes: training and inference. In contrast, the human brain doesn't switch between these states; it learns continuously from environmental stimuli. The training algorithms themselves, particularly back-propagation, are another point of divergence: they adjust weights to minimize a global error signal, whereas the brain adapts and learns contextually.
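To illustrate that two-phase recipe, here is a hedged sketch using a one-weight toy model in plain NumPy (not a real deep network): during training the weight is repeatedly nudged to reduce a squared-error loss, and during inference it is frozen and only produces outputs. All names and data are illustrative.

```python
import numpy as np

# A toy one-weight model trained by gradient descent on a squared-error
# loss -- the "minimize a global error" recipe that back-propagation
# scales up to deep networks.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 3.0 * x + rng.normal(scale=0.1, size=100)  # target relationship

w = 0.0
learning_rate = 0.1

# --- Training mode: the weight changes to reduce error on a fixed dataset ---
for _ in range(50):
    pred = w * x
    grad = np.mean(2 * (pred - y) * x)  # gradient of mean squared error
    w -= learning_rate * grad

# --- Inference mode: the weight is frozen; the model only produces outputs ---
print(f"learned w = {w:.3f}; prediction for x=2.0 -> {w * 2.0:.3f}")
```

The brain has no equivalent of the moment the loop stops and the weight is frozen.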

The Numenta Angle

Founded by Jeff Hawkins and Donna Dubinsky, Numenta has spent years researching the principles underlying brain function. More recently, the company has focused on applying that understanding to AI. Their approach rests on three main pillars:

  1. Efficiency: Using 'sparsity' to mimic the brain's efficient use of connections (see the sketch after this list).
  2. Neuron Model: Incorporating the complex nature of neurons for continuous learning.
  3. Cortical Columns: Employing a standardized neural circuitry model to replicate intelligence.
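
As a concrete illustration of the first pillar, here is a minimal sketch of a k-winners-take-all activation, one simple way to enforce the kind of sparse activity Numenta points to in the neocortex. The function name and parameters are assumptions for illustration, not Numenta's actual implementation.

```python
import numpy as np

def k_winners_take_all(activations, k):
    # Keep only the k largest activations and zero out the rest,
    # so only a small fraction of units in the layer stays active.
    # Illustrative sketch only, not Numenta's code.
    out = np.zeros_like(activations)
    top_k = np.argsort(activations)[-k:]   # indices of the k strongest units
    out[top_k] = activations[top_k]
    return out

# Example: a layer of 10 units where only 2 are allowed to stay active.
layer = np.array([0.1, 0.9, 0.3, 0.05, 0.7, 0.2, 0.4, 0.0, 0.6, 0.15])
print(k_winners_take_all(layer, k=2))  # only the 0.9 and 0.7 units survive
```

Because only a small fraction of units is ever active, most downstream multiplications involve zeros and can be skipped, which is where the efficiency gains come from.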

The Road Ahead

For the future, Subutai discusses the need for AI systems to be autonomous and embodied, suggesting that agency and embodiment are crucial aspects of intelligent systems. He also touches on the importance of including elements like neuromodulators and even explores the potential role of quantum physics in neural processing.

Conclusion

We are in a transformative era where AI is far from being fully realized. Organizations are still trying to grasp how to incorporate these technologies effectively. However, the future is promising, especially with interdisciplinary approaches like Numenta's that blend neuroscience with AI, focusing on understanding the brain's core principles to improve AI's capabilities.