Crazy Wisdom

Author: Stewart Alsop

Description

In his series "Crazy Wisdom," Stewart Alsop explores cutting-edge topics, particularly in the realm of technology, such as Urbit and artificial intelligence. Alsop embarks on a quest for meaning, engaging with others to expand his own understanding of reality and that of his audience. The topics covered in "Crazy Wisdom" are diverse, ranging from emerging technologies to spirituality, philosophy, and general life experiences. Alsop's unique approach aims to make connections between seemingly unrelated subjects, tying together ideas in unconventional ways.
492 Episodes

On this episode of Crazy Wisdom, Stewart Alsop talks with Agustin Ferreira, founder of Neurona, an AI community in Buenos Aires. Their conversation moves through Argentina’s history with economic crises and the rise of crypto as an alternative to failing institutions, the importance of Ethereum and smart contracts, the UX challenges that still plague crypto adoption, and how AI and agents could transform the way people interact with decentralized systems. They also explore the tension between TradFi and DeFi, questions of data privacy and surveillance, the shifting role of social networks, and even the cultural and philosophical meaning of decentralization. You can learn more about Agustin’s work through Neurona on Twitter.

Check out this GPT we trained on the conversation

Timestamps
00:05 Agustin shares how Argentina’s economic crises and the Corralito shaped interest in Bitcoin and Ethereum, with smart contracts offering a way out of broken systems.
00:10 They compare Bitcoin’s simplicity with Ethereum’s immutability and programmability, opening new use cases beyond money transfers.
00:15 The discussion shifts to crypto’s UX problem, from jargon and wallets to agents and AI smoothing the user experience, with projects like Gina Wallet and Gigabrain.
00:20 Stewart’s frustrations with NFTs and bridging tokens highlight why validators, restaking, and cross-chain complexity still matter for decentralization.
00:25 Agustin reflects on TradFi merging with DeFi, the risk of losing core values, and how stablecoins and U.S. interest could spark a spike in crypto markets.
00:30 They broaden into Web 2.0’s walled gardens, the need for alternatives, and how AI, data privacy, and surveillance raise urgency for decentralized systems.
00:35 Social networks, culture, and hypercapitalism come into focus, with Agustin questioning fantasy online lives and imagining more conscious connections.
00:40 The conversation turns philosophical, exploring religion-like markets, self-knowledge, and the hope for technology that feels more human.
00:45 Stewart and Agustin discuss off-grid living, AI as a tool for autonomy, and space exploration shaping future generations.
00:50 Agustin brings in the metaverse, both its potential to connect people more deeply and the risk of centralization, closing with Neurona’s mission in Buenos Aires.

Key Insights

One of the strongest themes Agustin brings forward is how Argentina’s long history of economic crises and the Corralito in 2001 created a natural openness to crypto. For his generation, trust in the peso was destroyed early, and holding dollars became the norm. This made decentralized alternatives like Bitcoin and later Ethereum feel less like speculation and more like survival tools.

Ethereum’s introduction of smart contracts represented a decisive leap from Bitcoin’s simple ledger into programmable, immutable agreements. For young Argentines, this opened a space to innovate and build projects that weren’t dependent on fragile local institutions, and it felt like a path to opportunity in the midst of recurring instability.

Agustin emphasizes that crypto still has a major UX problem. From confusing jargon to multiple wallets and bridges, it’s far from intuitive. He sees AI agents playing a transformative role in making transactions and investments seamless, removing technical friction so people can use crypto without even realizing the complexity beneath it.

Bridging across blockchains reveals both the promise and challenge of decentralization. Tokens must be locked, represented, and validated across chains, and while this creates resilience, it also adds layers of risk. Agustin hopes the future will feel “like magic,” where these processes disappear from the user’s view. A toy sketch of this lock-and-mint pattern follows after these insights.

The rise of TradFi players in DeFi is double-edged. On one hand, it accelerates maturity and scale, but on the other, it risks eroding the original ethos of decentralization. Agustin worries about lost principles yet also anticipates a surge of new DeFi projects and stablecoin adoption driven by U.S. financial interests.

Beyond finance, the conversation turns to the politics of data privacy and surveillance. Agustin argues that much of the motivation for decentralized systems is to resist manipulation, polarization, and weaponization of personal information—issues that AI will amplify unless paired with decentralized alternatives.

Finally, both Stewart and Agustin reflect on culture, social networks, and even the metaverse. Agustin critiques hypercapitalism’s fantasy-driven platforms and envisions technology that enables more authentic human connection. Whether through off-grid living, space exploration, or decentralized metaverse communities, he sees a need to balance innovation with deeper human and philosophical questions about freedom and meaning.
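The lock-and-mint pattern behind the bridging insight above can be made concrete with a toy model. This is a minimal sketch under assumed names (ToyBridge, lock_and_mint, burn_and_release); real bridges add validator sets, fees, fraud proofs, and the restaking schemes mentioned in the timestamps.

```python
# Toy illustration of the lock-and-mint bridging pattern described above.
# Names are hypothetical; real bridges involve validators, fees, and consensus.

class ToyBridge:
    def __init__(self):
        self.locked_on_source = 0   # native tokens held by the source-chain contract
        self.wrapped_on_dest = 0    # wrapped tokens minted on the destination chain

    def lock_and_mint(self, amount: int) -> None:
        """User locks native tokens; the bridge mints a wrapped representation."""
        self.locked_on_source += amount
        self.wrapped_on_dest += amount

    def burn_and_release(self, amount: int) -> None:
        """User burns wrapped tokens; the bridge releases the locked originals."""
        if amount > self.wrapped_on_dest:
            raise ValueError("cannot burn more than was minted")
        self.wrapped_on_dest -= amount
        self.locked_on_source -= amount

    def is_solvent(self) -> bool:
        """The invariant validators effectively attest to: wrapped supply is fully backed."""
        return self.wrapped_on_dest <= self.locked_on_source


bridge = ToyBridge()
bridge.lock_and_mint(100)
bridge.burn_and_release(40)
print(bridge.locked_on_source, bridge.wrapped_on_dest, bridge.is_solvent())  # 60 60 True
```

Each extra hop adds another contract that must hold this invariant, which is why cross-chain complexity keeps coming up as a risk in the conversation.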
In this episode of Crazy Wisdom, host Stewart Alsop talks with Michel Bauwens, founder of the P2P Foundation, about the rise of peer-to-peer dynamics, the historical cycles shaping our present, and the struggles and possibilities of building resilient communities in times of crisis. The conversation moves through the evolution of the internet from Napster to Web3, the cultural shifts since 1968, Bauwens’ personal experiences with communes and his 2018 cancellation, and the emerging vision of cosmolocalism and regenerative villages as alternatives to state and market decline. For more on Michel’s work, you can explore his Substack at 4thgenerationcivilization.substack.com and the extensive P2P Foundation Wiki at wiki.p2pfoundation.net.

Check out this GPT we trained on the conversation

Timestamps
00:00 Michel Bauwens explains peer-to-peer as both computer design and social relationship, introducing trans-local association and the idea of an anthropological revolution.
05:00 Discussion of Web1, Web3, encryption, anti-surveillance, cozy web, and dark forest theory, contrasting early internet openness with today’s fragmentation.
10:00 Bauwens shares his 2018 cancellation, deplatforming, and loss of funding after a dispute around Jordan Peterson, reflecting on identity politics and peer-to-peer pluralism.
15:00 The cultural shifts since 1968, the rise of identity movements, macro-historical cycles, and the fourth turning idea of civilizational change are unpacked.
20:00 Memories of 1968 activism, communes, free love, hypergamy, and the collapse of utopian experiments, showing the need for governance and rules in cooperation.
25:00 From communes to neo-Reichian practices, EST seminars, and lessons of human nature, Bauwens contrasts failed free love with lasting models like kibbutzim and Bruderhof.
30:00 Communes that endure rely on transcendence, religious or ideological foundations, and Bauwens points to monasteries as models for resilience in times of decline.
35:00 Cycles of civilization, overuse of nature, class divisions, and the threat of social unrest frame a wider reflection on populism, Eurasian vs Western models, and culture wars.
40:00 Populism in Anglo vs continental Europe, social balance, Christian democracy, and the contrast with market libertarianism in Trump and Milei.
45:00 Bauwens proposes cosmolocalism, regenerative villages, and bioregional alliances supported by Web3 communities like Crypto Commons Alliance and Ethereum Localism.
50:00 Historical lessons from the Roman era, monasteries, feudal alliances, and the importance of reciprocity, pragmatic alliances, and preparing for systemic collapse.
55:00 Localism, post-political collaboration, Ghent urban commons, Web3 experiments like Zuzalu, and Bauwens’ resources: 4thgenerationcivilization.substack.com and wiki.p2pfoundation.net.

Key Insights

Michel Bauwens frames peer-to-peer not just as a technical design but as a profound social relationship, what he calls an “anthropological revolution.” Like the invention of writing or printing, the internet created trans-local association, allowing people across the globe to coordinate outside of centralized control.

The conversation highlights the cycles of history, drawing from macro-historians and the “fourth turning” model. Bauwens explains how social movements rise, institutionalize, and collapse, with today’s cultural polarization echoing earlier waves such as the upheavals of 1968. He sees our era as the end of a long cycle that began after World War II.

Bauwens shares his personal cancellation in 2018, when posting a video about Jordan Peterson triggered accusations and led to deplatforming, debanking, and professional exclusion. He describes this as deeply traumatic, forcing him to rethink his political identity and shift his focus to reciprocity and trust in smaller, resilient networks.

The episode revisits communes and free love experiments of the 1970s, where Bauwens lived for years. He concludes that without governance, rules, and shared transcendence, these communities collapse into chaos. He contrasts them with enduring models like the Bruderhof, kibbutzim, and monasteries, which rely on structure, ideology, or religion to survive.

A major theme is populism and cultural polarization, with Bauwens distinguishing between Anglo-Saxon populism rooted in market libertarianism and continental populism shaped by Christian democratic traditions. The former quickly loses support by privileging elites, while the latter often maintains social balance through family and worker policies.

Bauwens outlines his vision of cosmolocalism and regenerative villages, where “what’s heavy is local, what’s light is global.” He argues that bioregionalism combined with Web3 technologies offers a practical way to rebuild resilient communities, coordinate globally, and address ecological and social breakdown.

Finally, the episode underscores the importance of pragmatic alliances across political divides. Bauwens stresses that survival and flourishing in times of systemic collapse depend less on ideology and more on reciprocity, concrete projects, and building trust networks that can outlast declining state and market systems.
In this episode, Stewart Alsop speaks with Nico Sarian, Executive Director of the Eternity Foundation and PhD candidate in Religious Studies, about the strange currents that run through Armenian history, the fractured birth of early Christianity, and the survival of Gnostic and Hermetic traditions into the Renaissance. The conversation weaves through questions of empire and nation state, mysticism and metaphysics, the occult roots of modern science, and the unsettling horizon of accelerationism, drawing unexpected lines between the ancient world, the bureaucratic order critiqued by David Graeber, and our present entanglement with surveillance and identity. For more on Nico’s work, see The Eternity Foundation at eternity.giving.

Check out this GPT we trained on the conversation

Timestamps
00:00 Stewart Alsop introduces Nico Sarian and sets the stage with Armenian history and the legacy of empire.
05:00 The rise of early Christianity is traced, showing its fractures, Gnostic currents, and the persistence of esotericism.
10:00 Hermeticism enters the frame, connecting mystical knowledge with the scientific spirit of the Renaissance.
15:00 Empire versus nation state is explored, touching on bureaucracy, power, and identity.
20:00 Mysticism and metaphysics are tied to questions of apocalypse, renewal, and hidden traditions.
25:00 Nico brings in David Graeber, critiquing modern bureaucracy and how systems shape consciousness.
30:00 Accelerationism surfaces, framed as both danger and possibility in modernity.
35:00 Surveillance and identity are examined, echoing ancient struggles for meaning.
40:00 Esotericism and religious syncretism are reconsidered as resources for navigating technological upheaval.
45:00 The conversation closes with reflections on continuity, rupture, and the strange endurance of wisdom.

Key Insights

One of the central insights from Nico Sarian’s conversation with Stewart Alsop is that Armenian history carries a unique vantage point on the ancient world, positioned between empire and nation, East and West. Its survival under domination reveals how smaller cultures can preserve mysticism, ritual, and identity even within overwhelming imperial structures.

The episode underscores how early Christianity was never monolithic but a field of competing visions. Gnostics, proto-orthodox bishops, and other sects fought over scripture, ritual, and authority, leaving traces of suppressed traditions that still haunt religious and philosophical discourse today.

A powerful thread emerges around Hermeticism and Renaissance science, where occult traditions did not oppose but actively shaped early scientific inquiry. The magical and the rational were not enemies; rather, they grew together in ways that modern categories tend to obscure.

Sarian and Alsop discuss empire versus the nation state, showing how forms of political order encode metaphysical assumptions. Empires sought transcendence through universality, while nation states leaned on identity and bureaucracy, each carrying spiritual implications for those living under them.

Another insight is the role of mysticism and apocalypse as recurring frameworks for understanding collapse and renewal. Whether in ancient prophetic traditions or modern accelerationism, there is a yearning for rupture that promises transformation but also carries danger.

David Graeber’s critique of bureaucracy becomes a lens for seeing how systems shape human consciousness. What appears as neutral administration actually molds imagination, desire, and even metaphysical assumptions about what is possible in the world.

Finally, the episode points to the enduring tension between surveillance, identity, and esotericism. Just as ancient sects guarded secret knowledge from empire, modern individuals navigate the exposure of digital systems, suggesting that hidden wisdom traditions may offer unexpected resources for our technological present.
In this episode of Crazy Wisdom, Stewart Alsop speaks with Samuel, host of The Remnant Podcast, about the intersections of biblical prophecy, Gnostic traditions, transhumanism, and the spiritual battle unfolding in our age. The conversation moves from Dr. David Hawkins’ teachings and personal encounters with the Holy Spirit to questions of Lucifer, Archons, and the distortions of occult traditions, while also confronting timelines of 2025, 2030, and 2045 in light of technological agendas from Palantir, Neuralink, and the United Nations. Together they explore the tension between organic human life and the merging with machines, weaving in figures like Blavatsky, Steiner, and Barbara Marx Hubbard, and tying it back to cycles of history, prophecy, and the remnant who remain faithful. You can find Samuel’s work on The Remnant Podcast YouTube channel and follow future updates through his Instagram once it’s launched.

Check out this GPT we trained on the conversation

Timestamps
00:00 Stewart Alsop welcomes Samuel of The Remnant Podcast, connecting through Dr. David Hawkins’ work and reflecting on COVID’s effect on consciousness.
05:00 Samuel shares his discovery of Hawkins, a powerful encounter with Jesus, and shifting views on Lucifer, Gnosticism, Archons, and Rudolf Steiner’s Ahriman.
10:00 They trace Gnosticism’s suppression in church history, Frances Yates on occult revival, the Nicene Creed, Neoplatonism, and the church’s battle with magic.
15:00 Discussion of Acts 4, possessions, Holy Spirit, and Gnostic inversion of God and Lucifer; Blavatsky, Crowley, occult distortions, and forbidden knowledge in Enoch.
20:00 Hawkins’ framework, naivety at higher states, Jesus as North Star, synchronicities, and the law of attraction as both biblical truth and sorcery.
25:00 Transhumanism, homo spiritus, Singularity University, Barbara Marx Hubbard, hijacked timelines, Neuralink, and Butlerian Jihad.
30:00 Attractor patterns, algorithms mimicking consciousness, Starlink’s omnipresence, singularity timelines—2025, 2030, 2045—and UN, WEF agendas.
35:00 Organic health versus pod apartments and smart cities, Greg Braden’s critiques, bio-digital convergence, and the biblical remnant who remain faithful.
40:00 Trump, MAGA as magician, Marina Abramović, Osiris rituals in inaugurations, Antichrist archetypes, and elite esoteric influences.
50:00 Edward Bernays, Rockefeller, UN history, Enlightenment elites, Nephilim bloodlines, Dead Sea Scrolls on sons of light and darkness, Facebook’s control systems.
55:00 Quantum dots using human energy, D-Wave quantum computers, Geordie Rose’s tsunami warning, Samuel’s book As It Was in the Days of Noah: The Rising Tsunami.

Key Insights

The episode begins with Stewart Alsop and Samuel connecting through their shared study of Dr. David Hawkins, whose work profoundly influenced both men. Samuel describes his path from Hawkins’ teachings into a life-altering encounter with Jesus, which reshaped his spiritual compass and allowed him to question parts of Hawkins’ framework that once seemed untouchable. This shift also opened his eyes to the living presence of Christ as a North Star in discerning truth.

A central thread is the nature of Lucifer and the entities described in Gnostic, biblical, and esoteric traditions. Samuel wrestles with the reality of Lucifer not just as ego, but as a non-human force tied to Archons, Yaldabaoth, and Ahriman. This leads to the recognition that many leaders openly revere such figures, pointing to a deeper spiritual battle beyond mere metaphor.

The discussion examines the suppression and resurgence of Gnosticism. Stewart references Frances Yates’ historical research on the rediscovery of Neoplatonism during the Renaissance, which fused with Christianity and influenced the scientific method. Yet, both men note the distortions and dangers within occult systems, where truth often hides alongside demonic inversions.

Samuel emphasizes the importance of discernment, contrasting authentic spiritual awakening with the false light of occultism and New Age thought. He draws on the Book of Enoch’s account of fallen angels imparting forbidden knowledge, showing how truth can be weaponized when separated from God. The law of attraction, he argues, exemplifies this duality: biblical when rooted in faith, sorcery when used to “become one’s own god.”

Transhumanism emerges as a major concern, framed as a counterfeit path to evolution. They compare Hawkins’ idea of homo spiritus with Barbara Marx Hubbard’s transhumanist vision and Elon Musk’s Neuralink. Samuel warns of “hijacked timelines” where natural spiritual gifts like telepathy are replaced with machine-based imitations, echoing the warnings of Dune’s Butlerian Jihad.

Technology is interpreted through a spiritual lens, with algorithms mimicking attractor patterns, social media shaping reality, and Starlink rendering the internet omnipresent. Samuel identifies this as Lucifer’s attempt to counterfeit God’s attributes, creating a synthetic omniscience that pulls humanity away from organic life and into controlled systems.

Finally, the conversation grounds in hope through the biblical concept of the remnant. Samuel explains that while elites pursue timelines toward 2025, 2030, and 2045 with occult enlightenment and digital convergence, those who remain faithful to God, connected to nature, and rooted in Christ form the remnant. This small, organic community represents survival in a time when most will unknowingly merge with the machine, fulfilling the ancient struggle between the sons of light and the sons of darkness.
On this episode of Crazy Wisdom, I, Stewart Alsop, sit down with Sweetman, the developer behind on-chain music and co-founder of Recoup. We talk about how musicians in 2025 are coining their content on Base and Zora, earning through Farcaster collectibles, Sound drops, and live shows, while AI agents are reshaping management, discovery, and creative workflows across music and art. The conversation also stretches into Spotify’s AI push, the “dead internet theory,” synthetic hierarchies, and how creators can avoid future shock by experimenting with new tools. You can follow Sweetman on Twitter, Farcaster, Instagram, and try Recoup at chat.recoupable.com.

Check out this GPT we trained on the conversation

Timestamps
00:00 Stewart Alsop introduces Sweetman to talk about on-chain music in 2025.
05:00 Coins, Base, Zora, Farcaster, collectibles, Sound, and live shows emerge as key revenue streams for musicians.
10:00 Streaming shifts into marketing while AI music quietly fills shops and feeds, sparking talk of the dead internet theory.
15:00 Sweetman ties IoT growth and shrinking human birthrates to synthetic consumption, urging builders to plug into AI agents.
20:00 Conversation turns to synthetic hierarchies, biological analogies, and defining what an AI agent truly is.
25:00 Sweetman demos Recoup: model switching with Vercel AI SDK, Spotify API integration, and building artist knowledge bases.
30:00 Tool chains, knowledge storage on Base and Arweave, and expanding into YouTube and TikTok management for labels.
35:00 AI elements streamline UI, Sam Altman’s philosophy on building with evolving models sparks a strategy discussion.
40:00 Stewart reflects on the return of Renaissance humans, orchestration of machine intelligence, and prediction markets.
45:00 Sweetman weighs orchestration trade-offs, cost of Claude vs GPT-5, and boutique services over winner-take-all markets.
50:00 Parasocial relationships with models, GPT psychosis, and the emotional shock of AI’s rapid changes.
55:00 Future shock explored through Sweetman’s reaction to Cursor, ending with resilience and leaning into experimentation.

Key Insights

On-chain music monetization is diversifying. Sweetman describes how musicians in 2025 use coins, collectibles, and platforms like Base, Zora, Farcaster, and Sound to directly earn from their audiences. Streaming has become more about visibility and marketing, while real revenue comes from tokenized content, auctions, and live shows.

AI agents are replacing traditional managers. By consuming data from APIs like Spotify, Instagram, and TikTok, agents can segment audiences, recommend collaborations, and plan tours. What once cost thousands in management fees is now automated, providing musicians with powerful tools at a fraction of the price.

Platforms are moving to replace artists. Spotify and other major players are experimenting with AI-generated music, effectively cutting human musicians further out of the revenue loop. This shift reinforces the importance of artists leaning into blockchain monetization and building direct relationships with fans.

The “dead internet theory” reframes the future. Sweetman connects IoT expansion and declining birth rates to a world where AI, not humans, will make most online purchases and content. The lesson: build products that are easy for AI agents to buy, consume, and amplify, since they may soon outnumber human users.

Synthetic hierarchies mirror biological ones. Stewart introduces the idea that just as cells operate autonomously within the body, billions of AI agents will increasingly act as intermediaries in human creativity and commerce. This frames AI as part of a broader continuity of hierarchical systems in nature and society.

Recoup showcases orchestration in practice. Sweetman explains how Recoup integrates Vercel AI SDK, Spotify APIs, and multi-model tool chains to build knowledge bases for artists. By storing profiles on Base and Arweave, Recoup not only manages social media but also automates content optimization, giving musicians leverage once reserved for labels.

Future shock is both risk and opportunity. Sweetman shares his initial rejection of AI coding tools as a threat to his identity, only to later embrace them as collaborators. The conversation closes with a call for resilience: experiment with new systems, adapt quickly, and avoid becoming a Luddite in an accelerating digital age.
In this episode of Crazy Wisdom, host Stewart Alsop sits down with Hannah Aline Taylor to explore themes of personal responsibility, freedom, and interdependence through her frameworks like the Village Principles, Distribution Consciousness, and the Empowerment Triangle. Their conversation moves through language and paradox, equanimity, desire and identity, forgiveness, leadership, money and debt, and the ways community and relationship serve as our deepest resources. Hannah shares stories from her life in Nevada City, her perspective on abundance and belonging, and her practice of love and curiosity as tools for living in alignment. You can learn more about her work at loving.university, on her website hannahalinetaylor.com, and in her book The Way of Devotion, available on Amazon.

Check out this GPT we trained on the conversation

Timestamps
00:00 Stewart Alsop welcomes Hannah Aline Taylor, introducing Loving University, Nevada City, and the Village Principles.
05:00 They talk about equanimity versus non-duality, emotional mastery, and curating experience through boundaries and high standards.
10:00 The focus shifts to desire as “who do I want to be,” identity as abstraction, and relationships beyond monogamy or labels.
15:00 Hannah introduces the Empowerment Triangle of anything, everything, nothing, reflecting on reality as it is and the role of perception.
20:00 Discussion of Nevada City’s healing energy, community respect, curiosity, and differences between East Coast judgment and West Coast freedom.
25:00 Responsibility as true freedom, rebellion under tyranny, delicate ecosystems, and leadership inspired by the Dao De Jing.
30:00 Love and entropy, conflict without enmity, curiosity as practice, and attention as the prerequisite for experience.
35:00 Forgiveness, discernment, moral debts, economic debt, and reframing wealth consciousness through the “princess card.”
40:00 Interdependence, community belonging, relationship as the real resource, and stewarding abundance in a disconnected world.
45:00 Building, frontiers, wisdom of indigenous stewardship, the Amazon rainforest, and how knowledge without wisdom creates loss.
50:00 Closing reflections on wholeness, abundance, scarcity, relationship technology, and prioritizing humanity in transition.

Key Insights

Hannah Taylor introduces the Village Principles as a framework for living in “distribution consciousness” rather than “acquisition consciousness.” Instead of chasing community, she emphasizes taking responsibility for one’s own energy, time, and attention, which naturally draws people into authentic connection.

A central theme is personal responsibility as the true meaning of freedom. For Hannah, freedom is inseparable from responsibility—when it’s confused with rebellion against control, it remains tied to tyranny. Real freedom comes from holding high standards for one’s life, curating experiences, and owning one’s role in every situation.

Desire is reframed from the shallow “what do I want” into the deeper question of “who do I want to be.” This shift moves attention away from consumer-driven longing toward identity, integrity, and presence, turning desire into a compass for embodied living rather than a cycle of lack.

Language, abstraction, and identity are questioned as both necessary tools and limiting frames. Distinction is what fuels connection—without difference, there can be no relationship. Yet when we cling to abstractions like “monogamy” or “polyamory,” we obscure the uniqueness of each relationship in favor of labels.

Hannah contrasts the disempowerment triangle of victim, perpetrator, and rescuer with her empowerment triangle of anything, everything, and nothing. This model shows reality as inherently whole—everything arises from nothing, anything is possible, and suffering begins when we believe something is wrong.

The conversation ties money, credit, and debt to spiritual and moral frameworks. Hannah reframes debt not as a burden but as evidence of trust and abundance, describing her credit card as a “princess card” that affirms belonging and access. Wealth consciousness, she says, is about recognizing the resources already present.

Interdependence emerges as the heart of her teaching. Relationship is the true resource, and abundance is squandered when lived independently. Stories of Nevada City, the Amazon rainforest, and even a friend’s Wi-Fi outage illustrate how scarcity reveals the necessity of belonging, curiosity, and shared stewardship of both community and land.
On this episode of Crazy Wisdom, Stewart Alsop sits down with Abhimanyu Dayal, a longtime Bitcoin advocate and AI practitioner, to explore how money, identity, and power are shifting in a world of deepfakes, surveillance, automation, and geopolitical realignment. The conversation ranges from why self-custody of Bitcoin matters more than ETFs, to the dangers of probabilistic biometrics and face-swap apps, to the coming impact of AGI on labor markets and the role of universal basic income. They also touch on India’s refinery economy, its balancing act between Russia, China, and the U.S., and how soft power is eroding in the information age. For more from Abhimanyu, connect with him on LinkedIn.

Check out this GPT we trained on the conversation

Timestamps
00:00 Stewart Alsop opens with Abhimanyu Dayal on crypto, AI, and the risks of probabilistic biometrics like facial recognition and voice spoofing.
05:00 They critique biometric surveillance, face-swap apps, and data exploitation through casual consent.
10:00 The talk shifts to QR code treasure hunts, vibe coding on Replit and Claude, and using quizzes to mint NFTs.
15:00 Abhimanyu shares his finance background, tying it to Bitcoin as people’s money, agent-to-agent payments, and post-AGI labor shifts.
20:00 They discuss universal basic income, libertarian ideals, Hayek’s view of economics as critique, and how AI prediction changes policy.
25:00 Pressure, unpredictability, AR glasses, quantum computing, and the surveillance state future come into focus.
30:00 Open source vs closed apps, China’s DeepSeek models, propaganda through AI, and U.S.–China tensions are explored.
35:00 India’s non-alignment, Soviet alliance in 1971, oil refining economy, and U.S.–India friction surface.
40:00 They reflect on colonial history, East India Company, wealth drain, opium wars, and America’s rise on Indian capital.
45:00 The conversation closes on Bitcoin’s role as reserve asset, stablecoins as U.S. leverage, BRICS disunity, and the geopolitics of freedom.

Key Insights

A central theme of the conversation is the contrast between deterministic and probabilistic systems for identity and security. Abhimanyu Dayal stresses that passwords and private keys—things only you can know—are inherently more secure than facial recognition or voice scans, which can be spoofed through deepfakes, 3D prints, or AI reconstructions. In his view, biometric data should never be stored because it represents a permanent risk once leaked. A toy sketch contrasting the two verification styles follows after these insights.

The rise of face-swap apps and casual facial data sharing illustrates how surveillance and exploitation have crept into everyday life. Abhimanyu points out that companies already use online images to adjust things like insurance premiums, proving how small pieces of biometric consent can spiral into systemic manipulation. This isn’t a hypothetical future—it is already happening in hidden ways.

On the lighter side, they experiment with “vibe coding,” using tools like Replit and Claude to design interactive experiences such as a treasure hunt via QR codes and NFTs. This playful example underscores a broader point: lightweight coding and AI platforms empower individuals to create experiments without relying on centralized or closed systems that might inject malware or capture data.

The discussion expands into automation, multi-agent systems, and the post-AGI economy. Abhimanyu suggests that artificial superintelligence will require machine-to-machine transactions, making Bitcoin an essential tool. But if machines do the bulk of labor, universal basic income may become unavoidable, even if it drifts toward collectivist structures libertarians dislike.

A key shift identified is the transformation of economics itself. Where Hayek once argued economics should critique politicians because of limited data, AI and quantum computing now provide prediction capabilities so granular that human behavior is forecastable at the individual level. This erodes the pseudoscientific nature of past economics and creates a new landscape of policy and control.

Geopolitically, the episode explores India’s rise, its reliance on refining Russian crude into petroleum exports, and its effort to stay unaligned between the U.S., Russia, and China. The conversation recalls India’s Soviet ties during the 1971 war, while noting how today’s energy and trade policies underpin domestic improvements for India’s poor and middle class.

Finally, they critique the co-optation of Bitcoin through ETFs and institutional custody. While investors celebrate, Abhimanyu argues this betrays Satoshi’s vision of money controlled by individuals with private keys. He warns that Bitcoin may be absorbed into central bank reserves, while stablecoins extend U.S. monetary dominance by reinforcing dollar power rather than replacing it.
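The distinction Abhimanyu draws between deterministic secrets and probabilistic biometrics can be sketched in a few lines. This is an illustrative comparison, not any production scheme: the cosine-similarity check and the 0.92 threshold are assumptions, and a real password system would use a salted, slow hash rather than bare SHA-256.

```python
# Deterministic verification: the stored hash either matches exactly or it does not.
# Probabilistic verification: a scan is accepted if it is "similar enough", so a
# sufficiently good spoof that clears the threshold is also accepted.

import hashlib
import math

def verify_password(stored_hash: str, attempt: str) -> bool:
    # Exact-match check (real systems add salting and a slow hash like bcrypt).
    return hashlib.sha256(attempt.encode()).hexdigest() == stored_hash

def verify_biometric(enrolled: list[float], scan: list[float], threshold: float = 0.92) -> bool:
    # Cosine similarity between an enrolled template and a fresh scan.
    dot = sum(a * b for a, b in zip(enrolled, scan))
    norm = math.sqrt(sum(a * a for a in enrolled)) * math.sqrt(sum(b * b for b in scan))
    return (dot / norm) >= threshold if norm else False

stored = hashlib.sha256(b"correct horse battery staple").hexdigest()
print(verify_password(stored, "correct horse battery staple"))   # True
print(verify_password(stored, "correct horse battery stapl"))    # False
print(verify_biometric([0.1, 0.9, 0.3], [0.12, 0.88, 0.33]))     # True (close enough)
```

The asymmetry is the point of the insight above: a leaked password can be rotated, while a leaked biometric template is a permanent target for anything that can land above the threshold.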
In this episode of Crazy Wisdom, host Stewart Alsop speaks with Robin Hanson, economist and originator of the idea of futarchy, about how conditional betting markets might transform governance by tying decisions to measurable outcomes. Their conversation moves through examples of organizational incentives in business and government, the balance between elegant theories and messy implementation details, the role of AI in robust institutions, and the tension between complexity and simplicity in legal and political systems. Hanson highlights historical experiments with futarchy, reflects on polarization and collective behavior in times of peace versus crisis, and underscores how ossified bureaucracies mirror software rot. To learn more about his work, you can find Robin Hanson online simply by searching his name or his blog overcomingbias.com, where his interviews—including one with Jeffrey Wernick on early applications of futarchy—are available.

Check out this GPT we trained on the conversation

Timestamps
00:05 Hanson explains futarchy as conditional betting markets that tie governance to measurable outcome metrics, contrasting elegant ideas with messy implementation details.
00:10 He describes early experiments, including Jeffrey Wernick’s company in the 1980s, and more recent trials in crypto and an India-based agency.
00:15 The conversation shifts to how companies use stock prices as feedback, comparing public firms tied to speculators with private equity and long-term incentives.
00:20 Alsop connects futarchy to corporate governance and history, while Hanson explains how futarchy can act as a veto system against executive self-interest.
00:25 They discuss conditional political markets in elections, AI participation in institutions, and why proof of human is unnecessary for robust systems.
00:30 Hanson reflects on simplicity versus complexity in democracy and legal systems, noting how futarchy faces similar design trade-offs.
00:35 He introduces veto markets and outcome metrics, adding nuance to how futarchy could constrain executives while allowing discretion.
00:40 The focus turns to implementation in organizations, outcome-based OKRs, and trade-offs between openness, liquidity, and transparency.
00:45 They explore DAOs, crypto governance, and the need for focus, then compare news-driven attention with deeper institutional design.
00:50 Hanson contrasts novelty with timelessness in academia and policy, explaining how futarchy could break the pattern of weak governance.
00:55 The discussion closes on bureaucratic inertia, software rot, and how government ossifies compared to adaptive private organizations.

Key Insights

Futarchy proposes that governance can be improved by tying decisions directly to measurable outcome metrics, using conditional betting markets to reveal which policies are expected to achieve agreed goals. This turns speculation into structured decision advice, offering a way to make institutions more competent and accountable. A toy version of the decision rule is sketched after these insights.

Early experiments with futarchy existed decades ago, including Jeffrey Wernick’s 1980s company that made hiring and product decisions using prediction markets, as well as more recent trials in crypto-based DAOs and a quiet adoption by a government agency in India. These examples show that the idea, while radical, is not just theoretical.

A central problem in governance is the tension between elegant ideas and messy implementation. Hanson emphasizes that while the core concept of futarchy is simple, real-world use requires addressing veto powers, executive discretion, and complex outcome metrics. The evolution of institutions involves finding workable compromises without losing the simplicity of the original vision.

The conversation highlights how existing governance in corporations mirrors these challenges. Public firms rely heavily on speculators and short-term stock incentives, while private equity benefits from long-term executive stakes. Futarchy could offer companies a new tool, giving executives market-based feedback on major decisions before they act.

Institutions must be robust not just to human diversity but also to AI participation. Hanson argues that markets, unlike one-person-one-vote systems, can accommodate AI traders without needing proof of human identity. Designing systems to be indifferent to whether participants are human or machine strengthens long-term resilience.

Complexity versus simplicity emerges as a theme, with Hanson noting that democracy and legal systems began with simple structures but accreted layers of rules that now demand lawyers to navigate. Futarchy faces the same trade-off: it starts simple, but real implementation requires added detail, and the balance between elegance and robustness becomes crucial.

Finally, the episode situates futarchy within broader social trends. Hanson connects rising polarization and inequality to times of peace and prosperity, contrasting this with the unifying effect of external threats. He also critiques bureaucratic inertia and “software rot” in government, arguing that without innovation in governance, even advanced societies risk ossification.
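The conditional-market mechanism behind futarchy can be illustrated with a toy decision rule. The prices below are invented for illustration, and the construction (the price of a joint claim divided by the price of the branch claim approximating a conditional probability, with trades on the untaken branch reverted) follows the generic description of conditional markets rather than any specific deployment Hanson mentions.

```python
# Toy futarchy decision rule built from conditional claim prices.
# A joint claim "M&A" pays 1 if the agreed outcome metric beats its target AND
# the proposal is adopted; claim "A" pays 1 if the proposal is adopted.
# Then price(M&A) / price(A) approximates P(metric beats target | adopt).
# In practice such markets often revert trades on whichever branch is not taken.

def conditional_probability(price_metric_and_branch: float, price_branch: float) -> float:
    """Market-implied probability of hitting the metric, conditional on the branch."""
    if price_branch <= 0:
        raise ValueError("branch price must be positive")
    return price_metric_and_branch / price_branch

# Illustrative (made-up) prices for the two branches of one proposal.
p_success_if_adopt = conditional_probability(0.42, 0.60)   # ~0.70
p_success_if_reject = conditional_probability(0.22, 0.40)  # ~0.55

decision = "adopt" if p_success_if_adopt > p_success_if_reject else "reject"
print(round(p_success_if_adopt, 2), round(p_success_if_reject, 2), decision)  # 0.7 0.55 adopt
```

The messy implementation details Hanson flags (veto markets, executive discretion, which metric to maximize) all sit around this core comparison rather than inside it.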
On this episode of Crazy Wisdom, Stewart Alsop sits down with Brad Costanzo, founder and CEO of Accelerated Intelligence, for a wide-ranging conversation that stretches from personal development and the idea that “my mess is my message” to the risks of AI psychosis, the importance of cognitive armor, and Brad’s sovereign mind framework. They talk about education through the lens of the Trivium, the natural pull of elites and hierarchies, and how Bitcoin and stablecoins tie into the future of money, inflation, and technological deflation. Brad also shares his perspective on the synergy between AI and Bitcoin, the dangers of too-big-to-fail banks, and why decentralized banking may be the missing piece. To learn more about Brad’s work, visit acceleratedintelligence.ai or reach out directly at brad@acceleratedintelligence.ai.

Check out this GPT we trained on the conversation

Timestamps
00:00 Brad Costanzo joins Stewart Alsop, opening with “my mess is my message” and Accelerated Intelligence as a way to frame AI as accelerated, not artificial.
05:00 They explore AI as a tool for personal development, therapy versus coaching, and AI’s potential for self-insight and pattern recognition.
10:00 The conversation shifts to AI psychosis, hype cycles, gullibility, and the need for cognitive armor, leading into Brad’s sovereign mind framework of define, collaborate, and refine.
15:00 They discuss education through the Trivium—grammar, logic, rhetoric—contrasted with the Prussian mass education model designed for factory workers.
20:00 The theme turns to elites, natural hierarchies, and the Robbers Cave experiment showing how quickly humans split into tribes.
25:00 Bitcoin enters as a silent, nonviolent revolution against centralized money, with Hayek’s quote on sound money and the Trojan horse of Wall Street adoption.
30:00 Stablecoins, treasuries, and the Treasury vs Fed dynamic highlight how monetary demand is being engineered through crypto markets.
35:00 Inflation, disinflation, and deflation surface, tied to real estate costs, millennials vs boomers, Austrian economics, and Jeff Booth’s “Price of Tomorrow.”
40:00 They connect Bitcoin and AI as deflationary forces, population decline, productivity gains, and the idea of a personal Bitcoin denominator.
45:00 The talk expands into Bitcoin mining, AI data centers, difficulty adjustments, and Richard Werner’s insights on quantitative easing, commercial banks, and speculative vs productive loans.
50:00 Wrapping themes center on decentralized banking, the dangers of too-big-to-fail, assets as protection, Bitcoin’s volatility, and why it remains the strongest play for long-term purchasing power.

Key Insights

One of the strongest insights Brad shares is the shift from artificial intelligence to accelerated intelligence. Instead of framing AI as something fake or external, he sees it as a leverage tool to amplify human intelligence—whether emotional, social, spiritual, or business-related. This reframing positions AI less as a threat to authenticity and more as a partner in unlocking dormant creativity.

Personal development surfaces through the mantra “my mess is my message.” Brad emphasizes that the struggles, mistakes, and rock-bottom moments in life can become the foundation for helping others. AI plays into this by offering low-cost access to self-insight, giving people the equivalent of a reflective mirror that can help them see patterns in their own thinking without immediately needing therapy.

The episode highlights the emerging problem of AI psychosis. People overly immersed in AI conversations, chatbots, or hype cycles can lose perspective. Brad and Stewart argue that cognitive armor—what Brad calls the “sovereign mind” framework of define, collaborate, and refine—is essential to avoid outsourcing one’s thinking entirely to machines.

Education is another theme, with Brad pointing to the classical Trivium—grammar, logic, and rhetoric—as the foundation of real learning. Instead of mass education modeled on the Prussian system for producing factory workers, he argues for rhetoric, debate, and critical thinking as the ultimate tests of knowledge, even in an AI-driven world.

When the discussion turns to elites, Brad acknowledges that hierarchies are natural and unavoidable, citing experiments like Robbers Cave. The real danger lies not in elitism itself, but in concentrated control—particularly financial elites who maintain power through the monetary system.

Bitcoin is framed as a “silent, nonviolent revolution.” Brad describes it as a Trojan horse—appearing as a speculative asset while quietly undermining government monopoly on money. Stablecoins, treasuries, and the Treasury vs Fed conflict further reveal how crypto is becoming a new driver of monetary demand.

Finally, the synergy between AI and Bitcoin offers a hopeful counterbalance to deflation fears and demographic decline. AI boosts productivity while Bitcoin enforces financial discipline. Together, they could stabilize a future where fewer people are needed for the same output, costs of living decrease, and savings in hard money protect purchasing power—even against the inertia of too-big-to-fail banks.
In this episode of Crazy Wisdom, host Stewart Alsop sits down with Juan Samitier, co-founder of DAMM Capital, for a wide-ranging conversation on decentralized insurance, treasury management, and the evolution of finance on-chain. Together they explore the risks of smart contracts and hacks, the role of insurance in enabling institutional capital to enter crypto, and historical parallels from Amsterdam’s spice trade to Argentina’s corralito. The discussion covers stablecoins like DAI, MakerDAO’s USDS, and the collapse of Luna, as well as the dynamics of yield, black swan events, and the intersection of DeFi with AI, prediction markets, and tokenized assets. You can find Juan on Twitter at @JuanSamitier and follow DAMM Capital at @DAMM_Capital.

Check out this GPT we trained on the conversation

Timestamps
00:05 Stewart Alsop introduces Juan Samitier, who shares his background in asset management and DeFi, setting up the conversation on decentralized insurance.
00:10 They discuss Safu, the insurance protocol Juan designed, and why hedging smart contract risk is key for asset managers deploying capital in DeFi.
00:15 The focus shifts to hacks, audits, and why even fully audited code can still fail, bringing up historical parallels to ships, pirates, and early insurance models.
00:20 Black swan events, risk models, and the limits of statistics are explored, along with reflections on Wolfram’s ideas and the Ascent of Money.
00:25 They examine how TradFi is entering crypto, the dominance of centralized stablecoins, and regulatory pushes like the Genius Act.
00:30 DAI’s design, MakerDAO’s USDS, and Luna’s collapse are explained, tying into the Great Depression, Argentina’s corralito, and trust in money.
00:35 Juan recounts his path from high school trading shitcoins to managing Kleros’ treasury, while Stewart shares parallels with dot-com bubbles and Webvan.
00:40 The conversation turns to tokenized assets, lending markets, and why stablecoin payments may be DeFi’s Trojan horse for TradFi adoption.
00:45 They explore interest rates, usury, and Ponzi dynamics, comparing Luna’s 20% yields with unsustainable growth models in tech and crypto.
00:50 Airdrops, VC-funded incentives, and short-term games are contrasted with building long-term financial infrastructure on-chain.
00:55 Stewart brings up crypto as Venice in 1200, leading into reflections on finance as an information system, the rise of AI, and DeFi agents.
01:00 Juan explains tokenized hedge funds, trusted execution environments, and prediction markets, ending with the power of conditional markets and the future of betting on beliefs.

Key Insights

One of the biggest risks in decentralized finance isn’t just market volatility but the fragility of smart contracts. Juan Samitier emphasized that even with million-dollar audits, no code can ever be guaranteed safe, which is why hedging against hacks is essential for asset managers who want institutional capital to enter crypto.

Insurance has always been about spreading risk, from 17th century spice ships facing pirates to DeFi protocols facing hackers. The same logic applies today: traders and treasuries are willing to sacrifice a small portion of yield to ensure that catastrophic losses won’t wipe out their entire investment.

Black swan events expose the limits of financial models, both in traditional finance and crypto. Juan pointed out that while risk models try to account for extreme scenarios, including every possible tail risk makes insurance math break down—a tension that shows why decentralized insurance is still early but necessary.

Stablecoins emerged as crypto’s attempt to recreate the dollar, but their design choices determine resilience. MakerDAO’s DAI and USDS use overcollateralization for stability, while Luna’s algorithmic model collapsed under pressure. These experiments mirror historical monetary crises like the Great Depression and Argentina’s corralito, reminding us that trust in money is fragile. A toy vault health check illustrating overcollateralization follows after these insights.

Argentina’s history of inflation and government-imposed bank freezes makes its citizens uniquely receptive to crypto. Samitier explained that even people without financial training understand macroeconomic risks because they live with them daily, which helps explain why Argentina has some of the world’s highest adoption of stablecoins and DeFi tools.

The path to mainstream DeFi adoption may lie in the intersection of tokenized real-world assets, lending markets, and stablecoin payments. TradFi institutions are already asking how retail users access cheaper loans on-chain, showing that DeFi’s efficiency could become the Trojan horse that pulls traditional finance deeper into crypto rails.

Looking forward, the fusion of AI with DeFi may transform finance into an information-driven ecosystem. Trusted execution environments, prediction markets, and conditional markets could allow agents to trade on beliefs and probabilities with transparency, blending deterministic blockchains with probabilistic AI—a glimpse of what financial Venice in the information age might look like.
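The overcollateralization design contrasted above with Luna’s algorithmic model can be shown with a toy vault health check. The numbers and the 150% liquidation ratio are illustrative assumptions, not MakerDAO’s actual parameters.

```python
# Toy health check for an overcollateralized stablecoin vault: the value of the
# locked collateral must stay well above the stablecoin debt it backs, or the
# vault becomes eligible for liquidation. Parameters are illustrative only.

def collateral_ratio(collateral_amount: float, collateral_price: float, debt: float) -> float:
    """Value of locked collateral divided by stablecoin debt."""
    return (collateral_amount * collateral_price) / debt

def is_safe(ratio: float, liquidation_ratio: float = 1.5) -> bool:
    """Vault must stay at or above the liquidation ratio to avoid liquidation."""
    return ratio >= liquidation_ratio

# 10 ETH locked at $2,000 against 10,000 units of stablecoin debt: 200% collateralized.
ratio = collateral_ratio(10, 2_000, 10_000)
print(ratio, is_safe(ratio))   # 2.0 True

# If ETH falls to $1,400, the same vault drops to 140% and becomes liquidatable.
ratio = collateral_ratio(10, 1_400, 10_000)
print(ratio, is_safe(ratio))   # 1.4 False
```

An algorithmic design like Luna’s had no such external collateral buffer, which is why a loss of confidence could spiral in a way an overcollateralized vault is structurally protected against.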
In this episode of Crazy Wisdom, Stewart Alsop sits down with Derek Osgood, CEO of DoubleO.ai, to talk about the challenges and opportunities of building with AI agents. The conversation ranges from the shift from deterministic to probabilistic processes, to how humans and LLMs think differently, to why lateral thinking, humor, and creative downtime matter for true intelligence. They also explore the future of knowledge work, the role of context engineering and memory in making agents useful, and the culture of talent, credentials, and hidden gems in Silicon Valley. You can check out Derek’s work at doubleo.ai or connect with him on LinkedIn.

Check out this GPT we trained on the conversation

Timestamps
00:00 Derek Osgood explains what AI agents are, the challenge of reliability and repeatability, and the difference between chat-based and process-based agents.
05:00 Conversation shifts to probabilistic vs deterministic systems, with examples of agents handling messy data like LinkedIn profiles.
10:00 Stewart Alsop and Derek discuss how humans reason compared to LLMs, token vs word prediction, and how language shapes action.
15:00 They question whether chat interfaces are the right UX for AI, weighing structure, consistency, and the persistence of buttons in knowledge work.
20:00 Voice interaction comes up, its sci-fi allure, and why unstructured speech makes it hard without stronger memory and higher-level reasoning.
25:00 Derek unpacks OpenAI’s approach to memory as active context retrieval, context engineering, and why vector databases aren’t the full answer.
30:00 They examine talent wars in AI, credentialism, signaling, and the difference between PhD-level model work and product design for agents.
35:00 Leisure and creativity surface, linking downtime, fantasy, and imagination to better lateral thinking in knowledge work.
40:00 Discussion of asynchronous AI reasoning, longer time horizons, and why extending “thinking time” could change agent behavior.
45:00 Derek shares how Double O orchestrates knowledge work with natural language workflows, making agents act like teammates.
50:00 They close with reflections on re-skilling, learning to work with LLMs, BS detection, and the future of critical thinking with AI.

Key Insights

One of the biggest challenges in building AI agents is not just creating them but ensuring their reliability, accuracy, and repeatability. It’s easy to build a demo, but the “last mile” of making an agent perform consistently in the messy, unstructured real world is where the hard problems live.

The shift from deterministic software to probabilistic agents reflects the complexity of real-world data and processes. Deterministic systems work only when inputs and outputs are cleanly defined, whereas agents can handle ambiguity, search for missing context, and adapt to different forms of information.

Humans and LLMs share similarities in reasoning—both operate like predictive engines—but the difference lies in agency and lateral thinking. Humans can proactively choose what to do without direction and make wild connections across unrelated experiences, something current LLMs still struggle to replicate.

Chat interfaces may not be the long-term solution for interacting with AI. While chat offers flexibility, it is too unstructured for many use cases. Derek argues for a hybrid model where structured UI/UX supports repeatable workflows, while chat remains useful as one tool within a broader system.

Voice interaction carries promise but faces obstacles. The unstructured nature of spoken input makes it difficult for agents to act reliably without stronger memory, better context retrieval, and a more abstract understanding of goals. True voice-first systems may require progress toward AGI.

Much of the magic in AI comes not from the models themselves but from context engineering. Effective systems don’t just rely on vector databases and embeddings—they combine full context, partial context, and memory retrieval to create a more holistic understanding of user goals and history. A minimal sketch of this assembly step follows after these insights.

Beyond the technical, the episode highlights cultural themes: credentialism, hidden talent, and the role of leisure in creativity. Derek critiques Silicon Valley’s obsession with credentials and signaling, noting that true innovation often comes from hidden gem hires and from giving the brain downtime to make unexpected lateral connections that drive creative breakthroughs.
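The context-engineering idea Derek describes (combining instructions, retrieved memory, and recent conversation under a budget, rather than leaning on a vector store alone) can be sketched roughly as below. Function names, the word-overlap relevance score, and the word budget are all assumptions for illustration, not DoubleO’s implementation.

```python
# Minimal sketch of context assembly: rank stored memories by relevance to the
# current query, then pack system instructions, memories, and recent messages
# into a bounded context window. A real system would use embeddings, recency,
# and token counts instead of this crude word-overlap score and word budget.

def score_memory(memory: str, query: str) -> int:
    # Crude relevance score: number of words shared with the query.
    return len(set(memory.lower().split()) & set(query.lower().split()))

def build_context(system: str, memories: list[str], recent: list[str],
                  query: str, budget_words: int = 120) -> str:
    ranked = sorted(memories, key=lambda m: score_memory(m, query), reverse=True)
    parts, used = [system], len(system.split())
    for chunk in ranked + recent:
        words = len(chunk.split())
        if used + words > budget_words:
            continue  # skip what does not fit; the budget is the hard constraint
        parts.append(chunk)
        used += words
    return "\n".join(parts)

ctx = build_context(
    system="You are an assistant that drafts release plans for artists.",
    memories=["Artist prefers Friday drops.", "Last tour focused on Berlin and Warsaw."],
    recent=["User: plan the next single release."],
    query="plan the next single release",
)
print(ctx)
```

The point Derek makes is that this selection and packing step, not the embedding lookup by itself, is where most of the perceived intelligence of an agent comes from.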
In this episode of Crazy Wisdom, Stewart Alsop speaks with Juan Verhook, founder of Tender Market, about how AI reshapes creativity, work, and society. They explore the risks of AI-generated slop versus authentic expression, the tension between probability and uniqueness, and why the complexity dilemma makes human-in-the-loop design essential. Juan connects bureaucracy to proto-AI, questions the incentives driving black-box models, and considers how scaling laws shape emergent intelligence. The conversation balances skepticism with curiosity, reflecting on authenticity, creativity, and the economic realities of building in an AI-driven world. You can learn more about Juan Verhook’s work or connect with him directly through his LinkedIn or via his website at tendermarket.eu.

Check out this GPT we trained on the conversation

Timestamps
00:00 – Stewart and Juan open by contrasting AI slop with authentic creative work.
05:00 – Discussion of probability versus uniqueness and what makes output meaningful.
10:00 – The complexity dilemma emerges, as systems grow opaque and fragile.
15:00 – Why human-in-the-loop remains central to trustworthy AI.
20:00 – Juan draws parallels between bureaucracy and proto-AI structures.
25:00 – Exploration of black-box models and the limits of explainability.
30:00 – The role of economic incentives in shaping AI development.
35:00 – Reflections on nature versus nurture in intelligence, human and machine.
40:00 – How scaling laws drive emergent behavior, but not always understanding.
45:00 – Weighing authenticity and creativity against automation’s pull.
50:00 – Closing thoughts on optimism versus pessimism in the future of work.

Key Insights

AI slop versus authenticity – Juan emphasizes that much of today’s AI output tends toward “slop,” a kind of lowest-common-denominator content driven by probability. The challenge, he argues, is not just generating more information but protecting uniqueness and cultivating authenticity in an age where machines are optimized for averages.

The complexity dilemma – As AI systems grow in scale, they become harder to understand, explain, and control. Juan frames this as a “complexity dilemma”: every increase in capability carries a parallel increase in opacity, leaving us to navigate trade-offs between power and transparency.

Human-in-the-loop as necessity – Instead of replacing people, AI works best when embedded in systems where humans provide judgment, context, and ethical grounding. Juan sees human-in-the-loop design not as a stopgap, but as the foundation for trustworthy AI use.

Bureaucracy as proto-AI – Juan provocatively links bureaucracy to early forms of artificial intelligence. Both are systems that process information, enforce rules, and reduce individuality into standardized outputs. This analogy helps highlight the social risks of AI if left unexamined: efficiency at the cost of humanity.

Economic incentives drive design – The trajectory of AI is not determined by technical possibility alone but by the economic structures funding it. Black-box models dominate because they are profitable, not because they are inherently better for society. Incentives, not ideals, shape which technologies win.

Nature, nurture, and machine intelligence – Juan extends the age-old debate about human intelligence into the AI domain, asking whether machine learning is more shaped by architecture (nature) or training data (nurture). This reflection surfaces the uncertainty of what “intelligence” even means when applied to artificial systems.

Optimism and pessimism in balance – While AI carries risks of homogenization and loss of meaning, Juan maintains a cautiously optimistic view. By prioritizing creativity, human agency, and economic models aligned with authenticity, he sees pathways where AI amplifies rather than diminishes human potential.
On this episode of Crazy Wisdom, host Stewart Alsop speaks with Michael Jagdeo, a headhunter and founder working with Exponent Labs and The Syndicate, about the cycles of money, power, and technology that shape our world. Their conversation touches on financial history through The Ascent of Money by Niall Ferguson and Walter Bagehot’s The Money Market, the rise and fall of financial centers from London to New York and the new Texas Stock Exchange, the consolidation of industries and the theory of oligarchical collectivism, the role of AI as both tool and chaos agent, Bitcoin and “quantitative re-centralization,” the dynamics of exponential organizations, and the balance between collectivism and individualism. Jagdeo also shares recruiting philosophies rooted in stories like “stone soup,” frameworks like Yu-Kai Chou’s Octalysis and the User Type Hexad, and book recommendations including Salim Ismail’s Exponential Organizations and Arthur Koestler’s The Act of Creation. Along the way they explore servant leadership, Price’s Law, Linux and open source futures, religion as an operating system, and the cyclical nature of civilizations. You can learn more about Michael Jagdeo or reach out to him directly through Twitter or LinkedIn.
Check out this GPT we trained on the conversation
Timestamps
00:05 Stewart Alsop introduces Michael Jagdeo, who shares his path from headhunting actuaries and IT talent into launching startups with Exponent Labs and The Syndicate.
00:10 They connect recruiting to financial history, discussing actuaries, The Ascent of Money, and Walter Bagehot’s The Money Market on the London money market and railways.
00:15 The Rothschilds, institutional knowledge, and Corn Laws lead into questions about New York as a financial center and the quiet launch of the Texas Stock Exchange by Citadel and BlackRock.
00:20 Capital power, George Soros vs. the Bank of England, chaos, paper clips, and Orwell’s oligarchical collectivism frame industry consolidation, syndicates, and stone soup.
00:25 They debate imperial conquest, bourgeoisie leisure, the decline of the middle class, AI as chaos agent, digital twins, Sarah Connor, Godzilla, and nuclear metaphors.
00:30 Conversation turns to Bitcoin, “quantitative re-centralization,” Jack Bogle, index funds, Robinhood micro bailouts, and AI as both entropy and negative entropy.
00:35 Jagdeo discusses Jim Keller, Tenstorrent, RISC-V, Nvidia CUDA, exponential organizations, Price’s Law, bureaucracy, and servant leadership with the parable of stone soup.
00:40 Recruiting as symbiosis, biophilia, trust, Judas, Wilhelm Reich, AI tools, Octalysis gamification, Jordan vs. triangle offense, and the role of laughter in persuasion emerge.
00:45 They explore religion as operating systems, Greek gods, Comte’s stages, Nietzsche, Jung, nostalgia, scientism, and Jordan Peterson’s revival of tradition.
00:50 The episode closes with Linux debates, Ubuntu, Framer laptops, PewDiePie, and Jagdeo’s nod to Liminal Snake on epistemic centers and turning curses into blessings.
Key Insights
One of the central insights of the conversation is how financial history repeats through cycles of consolidation and power shifts. Michael Jagdeo draws on Walter Bagehot’s The Money Market to explain how London became the hub of European finance, much like New York later did, and how the Texas Stock Exchange signals a possible southern resurgence of financial influence in America. The pattern of wealth moving with institutional shifts underscores how markets, capital, and politics remain intertwined.
Jagdeo and Alsop emphasize that industries naturally oligarchize. Borrowing from Orwell’s “oligarchical collectivism,” Jagdeo notes that whether in diamonds, food, or finance, consolidation emerges as economies of scale take over. This breeds syndicates and monopolies, often interpreted as conspiracies but really the predictable outcome of industrial maturation.
Another powerful theme is the stone soup model of collaboration. Jagdeo applies this parable to recruiting, showing that no single individual can achieve large goals alone. By framing opportunities as shared ventures where each person adds their own ingredient, leaders can attract top talent while fostering genuine symbiosis.
Technology, and particularly AI, is cast as both chaos agent and amplifier of human potential. The conversation likens AI to nuclear power—capable of great destruction or progress. From digital twins to Sarah Connor metaphors, they argue AI represents not just artificial intelligence but artificial knowledge and action, pushing humans to adapt quickly to its disruptive presence.
The discussion of Bitcoin and digital currencies reframes decentralization as potentially another trap. Jagdeo provocatively calls Bitcoin “quantitative re-centralization,” suggesting that far from liberating individuals, digital currencies may accelerate neo-feudalism by creating new oligarchies and consolidating financial control in unexpected ways.
Exponential organizations and the leverage of small teams emerge as another key point. Citing Price’s Law, Jagdeo explains how fewer than a dozen highly capable individuals can now achieve billion-dollar valuations thanks to open source hardware, AI, and network effects. This trend redefines scale, making nimble collectives more powerful than bureaucratic giants. (A quick back-of-envelope illustration of Price’s Law follows after these notes.)
Finally, the episode highlights the cyclical nature of civilizations and belief systems. From Rome vs. Carthage to Greek gods shifting with societal needs, to Nietzsche’s “God is dead” and Jung’s view of recurring deaths of divinity, Jagdeo argues that religion, ideology, and operating systems reflect underlying incentives. Western nostalgia for past structures, whether political or religious, risks idolatry, while the real path forward may lie in new blends of individualism, collectivism, and adaptive tools like Linux and AI.
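As a rough editorial aside (not from the episode), Price’s Law is usually stated as: roughly the square root of the number of contributors produces half of the output. A tiny Python check of what that implies at different team sizes:

# Price's Law, as commonly stated: ~sqrt(N) contributors account for half the output.
import math

for team_size in (10, 100, 1000, 10000):
    core = math.sqrt(team_size)
    share = 100 * core / team_size
    print(f"{team_size:>6} people -> roughly {core:.0f} produce half the output ({share:.1f}% of headcount)")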
In this episode of Crazy Wisdom, Stewart Alsop talks with Paul Spencer about the intersection of AI and astrology, the balance of fate and free will, and how embodiment shapes human experience in time and space. They explore cultural shifts since 2020, the fading influence of institutions, the “patchwork age” of decentralized communities, and the contrasts between solar punk and cyberpunk visions for the future. Paul shares his perspective on America’s evolving role, the symbolism of the Aquarian Age, and why philosophical, creative, and practical adaptability will be essential in the years ahead. You can connect with Paul and explore more of his work and writings at zeitvillemedia.substack.com, or find him as @ZeitvilleMedia on Twitter and YouTube.
Check out this GPT we trained on the conversation
Timestamps
00:00 Stewart Alsop and Paul Spencer open with a discussion on AI and astrology, exploring fate versus free will and how human embodiment shapes the way we move through time and space.
05:00 Paul contrasts the human timeline, marked by death, with AI’s lack of finality, bringing in Bryan Johnson’s transhumanism and the need for biological embodiment for true AI utility.
10:00 They explore how labor, trade, food, and procreation anchor human life, connecting these to the philosophical experience of space and time.
15:00 Nietzsche and Bergson’s ideas on life force, music, and tactile philosophy are discussed as alternatives to detached Enlightenment thinking.
20:00 The conversation shifts to social media’s manipulation, institutional decay after 2020, and the absence of an “all clear” moment.
25:00 They reflect on the chaotic zeitgeist, nostalgia for 2021’s openness, and people faking cultural cohesion.
30:00 Paul uses Seinfeld as an example of shared codes, contrasting it with post-woke irony and drifting expectations.
35:00 Pluto in Aquarius and astrological energies frame a shift from heaviness to a delirious cultural mood.
40:00 Emotional UBI and the risks of avoiding emotional work lead into thoughts on America’s patchwork future.
45:00 They explore homesteading, raw milk as a cultural symbol, and the tension between consumerism and alternative visions like solar punk and cyberpunk.
50:00 Paul highlights the need for cross-tribal diplomacy, the reality of the surveillance state, and the Aquarian Age’s promise of decentralized solutions.
Key Insights
Paul Spencer frames astrology as a way to understand the interplay of fate and free will within the embodied human experience, emphasizing that humans are unique in their awareness of time and mortality, which gives life story and meaning.
He argues that AI, while useful for shifting perspectives, lacks “skin in the game” because it has no embodiment or death, and therefore cannot fully grasp or participate in the human condition unless integrated into biological or cybernetic systems.
The conversation contrasts human perception of space and time, drawing from philosophers like Nietzsche and Bergson who sought to return philosophy to the body through music, dance, and tactile experiences, challenging abstract, purely cerebral approaches.
Post-2020 culture is described as a “patchwork age” without a cohesive zeitgeist, where people often “fake it” through thin veneers of social codes. This shift, combined with Pluto’s move into Aquarius, has replaced the heaviness of previous years with a chaotic, often giddy nihilism.
America is seen as the primary arena for the patchwork age due to its pioneering, experimental spirit, with regional entrepreneurship and cultural biodiversity offering potential for renewal, even as nostalgia for past unity and imperial confidence lingers.
Tensions between “solar punk” and “cyberpunk” visions highlight the need for cross-tribal diplomacy—connecting environmentalist, primitivist, and high-tech decentralist communities—because no single approach will be sufficient to navigate accelerating change.
The Aquarian Age, following the Piscean Age in the precession of the equinoxes, signals a movement from centralized, hypnotic mass programming toward decentralized, engineering-focused solutions, where individuals must focus on building beauty and resilience in their own worlds rather than being consumed by “they” narratives.
In this episode of Crazy Wisdom, Stewart Alsop talks with Cathal, founder of Poliebotics and creator of the “truth beam” system, about proof of liveness technology, blockchain-based verification, projector-camera feedback loops, physics-based cryptography, and how these tools could counter deepfakes and secure biodiversity data. They explore applications ranging from conservation monitoring on Cathal’s island in Ireland to robot-assisted farming, as well as the intersection of nature, humanity, and AI. Cathal also shares thoughts on open-source tools like Jitsi and Element, and the cultural shifts emerging from AI-driven creativity. Find more about his work and Poliebotics on GitHub and Twitter.
Check out this GPT we trained on the conversation
Timestamps
00:00 Stewart Alsop introduces Cathal, starting with proof of liveness vs proof of aliveness and deepfake challenges.
05:00 Cathal explains projector-camera feedback loops, Perlin noise, cryptographic hashing, blockchain timestamps via Rootstock.
10:00 Discussion on using multiple blockchains for timestamps, physics-based timing, and recording verification.
15:00 Early Bitcoin days, cypherpunk culture, deterministic vs probabilistic systems.
20:00 Projector emissions, autoencoders, six-channel matrix data type, training discriminators.
25:00 Decentralized verification, truth beams, building trust networks without blockchain.
30:00 Optical interlinks, testing computational nature of reality, simulation ideas.
35:00 Dystopia vs optimism, AI offense in cybersecurity, reputation networks.
40:00 Reality transform, projecting AI into reality, creative agents, philosophical implications.
45:00 Conservation applications, biodiversity monitoring, insect assays, cryptographically secured data.
50:00 Optical cryptography, analog feedback loops, quantum resistance.
55:00 Open source tools, Jitsi, Element, cultural speciation, robot-assisted farming, nature-human-AI coexistence.
Key Insights
Cathal’s “proof of liveness” aims to authenticate real-time video by projecting cryptographically generated patterns onto a subject and capturing them with synchronized cameras, making it extremely difficult for deepfakes or pre-recorded footage to pass as live content. (A simplified sketch of this loop appears after these notes.)
The system uses blockchain timestamps—currently via Rootstock, a Bitcoin sidechain running the Ethereum Virtual Machine—to anchor these projections in a decentralized, physics-based timeline, ensuring verification doesn’t depend on trusting a single authority.
A distinctive six-channel matrix data type, created by combining projector and camera outputs, is used to train neural network discriminators that determine whether a recording and projection genuinely match, allowing for scalable automated verification.
Cathal envisions “truth beams” as portable, collaborative verification devices that could build decentralized trust networks and even operate without blockchains once enough verified connections exist.
Beyond combating misinformation, the same projector-camera systems could serve conservation efforts—recording biodiversity data, securing it cryptographically, and supporting projects like insect population monitoring and bird song analysis on Cathal’s island in Ireland.
Cathal is also exploring “reality transform” technology, which uses projection and AI to overlay generated imagery onto real-world objects or people in real time, raising possibilities for artistic expression, immersive experiences, and creative AI-human interaction.
Open-source philosophy underpins his approach, favoring tools like Jitsi for secure video communication and advocating community-driven development to prevent centralized control over truth verification systems, while also exploring broader societal shifts like cultural speciation and cooperative AI-human-nature systems.
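To make the first insight above more concrete, here is a heavily simplified Python sketch of the projection-and-capture idea as described: derive a pattern from a public blockchain anchor, project it, and commit to a hash of the captured frame. The Rootstock anchoring, the neural discriminator, and the projector and camera hardware are all stubbed out, and every name here is illustrative rather than taken from the Poliebotics code.

# Simplified illustration of a proof-of-liveness loop (not the actual Poliebotics implementation).
import hashlib
import numpy as np

def pattern_from_anchor(anchor_hex, shape=(64, 64)):
    # Derive a deterministic pseudo-random pattern from a public anchor
    # (e.g. a recent block hash), so anyone holding the anchor can regenerate it.
    seed = int.from_bytes(hashlib.sha256(bytes.fromhex(anchor_hex)).digest()[:8], "big")
    return np.random.default_rng(seed).random(shape)

def frame_commitment(frame):
    # Hash the captured frame so the recording can later be timestamped on-chain.
    return hashlib.sha256(frame.tobytes()).hexdigest()

anchor = "ab" * 32                               # placeholder standing in for a real block hash
projected = pattern_from_anchor(anchor)          # what the projector would emit
noise = 0.01 * np.random.default_rng(1).normal(size=projected.shape)
captured = projected + noise                     # stand-in for the synchronized camera capture
print("frame commitment:", frame_commitment(captured)[:16], "...")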
In this episode of Crazy Wisdom, host Stewart Alsop talks with Zachary Cote, Executive Director of Thinking Nation, about how history education can shape citizens who think critically rather than simply memorize facts. They explore the role of memory, the ethics of curation in a decentralized media landscape, and the need to rebuild trust in institutions through humility, collaboration, and historical thinking. Zachary shares insights from his teaching experience and emphasizes intellectual humility as essential for civic life and learning in the age of AI. You can learn more about his work at thinkingnation.org and follow @Thinking_Nation on social media.
Check out this GPT we trained on the conversation
Timestamps
00:00 – Zachary introduces Thinking Nation’s mission to foster critical thinking in history education, distinguishing memory from deeper historical discipline.
05:00 – They unpack the complexity of memory, collective narratives, and how individuals curate their own realities, especially in a decentralized media landscape.
10:00 – Zachary explains why epistemology and methodology matter more than static facts, and how ethical curation can shape flourishing societies.
15:00 – Discussion turns to how history is often used for cultural arguments, and the need to reframe it as a tool for understanding rather than judgment.
20:00 – They explore AI in education, contrasting it as tool vs. crutch, and warning about students’ lack of question-asking skills.
25:00 – The conversation shifts to authority, institutions, and tradition as “democracy extended to the dead.”
30:00 – Stewart and Zachary reflect on rebuilding trust through honesty, humility, collaboration, and asking better questions.
35:00 – They consider the decentralizing effects of technology and the urgency of restoring shared principles.
40:00 – Zachary emphasizes contextualization, empathy, and significance as historical thinking skills rooted in humility.
45:00 – They close on the challenge of writing and contributing meaningfully through questions and confident, honest articulation.
Key Insights
Zachary Cote argues that history education should move beyond memorization and focus on cultivating thinking citizens. He reframes history as a discipline of inquiry, where the past is the material through which students develop critical, ethical reasoning.
The concept of memory is central to understanding history. Zachary highlights that we all remember differently based on our environment and identity, which complicates any attempt at a single, unified national narrative. This complexity invites us to focus on shared methodologies rather than consensus on content.
In an age of media fragmentation and curated realities, Zachary emphasizes the importance of equipping students with epistemological tools to evaluate and contextualize information ethically, rather than reinforcing echo chambers or binary ideologies.
The conversation calls out the educational system’s obsession with data and convenient assessment, arguing that what matters most—like humility, critical thinking, and civic understanding—is often left out because it’s harder to measure.
Zachary sees AI as a powerful tool that, if used well, could help assess deeper thinking skills. But he warns that without training in asking good questions, students may treat AI like a gospel rather than a starting point for inquiry.
Authority and tradition, often dismissed in a culture obsessed with novelty, are reframed by Zachary as essential democratic tools. Citing Chesterton, he argues that tradition is “democracy extended to the dead,” reminding us that collective wisdom includes voices from the past.
Humility emerges as a recurring theme—not just spiritual or social humility, but intellectual humility. Through historical thinking skills like contextualization, empathy, and significance, students can learn to approach the past (and the present) with curiosity rather than certainty, making room for deeper civic engagement.
In this episode, Stewart Alsop speaks with Edouard Machery, Distinguished Professor at the University of Pittsburgh and Director of the Center for Philosophy of Science, about the deep cultural roots of question-asking and curiosity. From ancient Sumerian tablets to the philosophical legacies of Socrates and Descartes, the conversation spans how different civilizations have valued inquiry, the cross-cultural psychology of AI, and what makes humans unique in our drive to ask “why.” For more, explore Edouard’s work at www.edouardmachery.com.
Check out this GPT we trained on the conversation
Timestamps
00:00 – 05:00 Origins of question-asking, Sumerian writing, norms in early civilizations, authority and written text
05:00 – 10:00 Values in AI across cultures, RLHF, tech culture in the Bay Area vs. broader American values
10:00 – 15:00 Cross-cultural AI study: Taiwan vs. USA, privacy and collectivism, urban vs. rural mindset divergence
15:00 – 20:00 History of curiosity in the West, from vice to virtue post-15th century, link to awe and skepticism
20:00 – 25:00 Magic, alchemy, and experimentation in early science, merging maker and scholarly traditions
25:00 – 30:00 Rise of public dissections, philosophy as meta-curiosity, Socratic questioning as foundational
30:00 – 35:00 Socrates, Plato, Aristotle—transmission of philosophical curiosity, human uniqueness in questioning
35:00 – 40:00 Language, assertion, imagination, play in animals vs. humans, symbolic worlds
40:00 – 45:00 Early moderns: Montaigne, Descartes, rejection of Aristotle, rise of foundational science
45:00 – 50:00 Confucianism and curiosity, tradition and authority, contrast with India and Buddhist thought
50:00 – 55:00 Epistemic virtues project, training curiosity, philosophical education across cultures, spiritual curiosity
Key Insights
Curiosity hasn’t always been a virtue. In Western history, especially through Christian thought until the 15th century, curiosity was viewed as a vice—something dangerous and prideful—until global exploration and scientific inquiry reframed it as essential to human understanding.
Question-asking is culturally embedded. Different societies place varying emphasis on questioning. While Confucian cultures promote curiosity within hierarchical structures, Christian traditions historically linked it with sin—except when directed toward divine matters.
Urbanization affects curiosity more than nationality. Machery found that whether someone lives in a city or countryside often shapes their mindset more than their cultural background. Cosmopolitan environments expose individuals to diverse values, prompting greater openness and inquiry.
AI ethics reveals cultural alignment. In studying attitudes toward AI in the U.S. and Taiwan, expected contrasts in privacy and collectivism were smaller than anticipated. The urban, global culture in both countries seems to produce surprisingly similar ethical concerns.
The scientific method emerged from curiosity. The fusion of the maker tradition (doing) and the scholarly tradition (knowing) in the 13th–14th centuries helped birth experimentation, public dissection, and eventually modern science—all grounded in a spirit of curiosity.
Philosophy begins with meta-curiosity. From Socratic questioning to Plato’s dialogues and Aristotle’s treatises, philosophy has always been about asking questions about questions—making “meta-curiosity” the core of the discipline.
Only humans ask why. Machery notes that while animals can make requests, they don’t seem to ask questions. Humans alone communicate assertions and engage in symbolic, imaginative, question-driven thought, setting us apart cognitively and culturally.
In this episode of Crazy Wisdom, host Stewart Alsop sits down with astrologer and researcher C.T. Lucero for a wide-ranging conversation that weaves through ancient astrology, the evolution of calendars, the intersection of science and mysticism, and the influence of digital tools like AI on symbolic interpretation. They explore the historical lineage from Hellenistic Greece to the Persian golden age, discuss the implications of the 2020 Saturn-Jupiter conjunction, touch on astrocartography, and reflect on the information age's shifting paradigms. For more on the guest's work, check out ctlucero.com.
Check out this GPT we trained on the conversation
Timestamps
00:00 Stewart Alsop introduces C.T. Lucero; they begin discussing time cycles and the metaphor of Monday as an unfolding future.
05:00 Astrology’s historical roots in Hellenistic Greece and Persian Baghdad; the transmission and recovery of ancient texts.
10:00 The role of astrology in medicine and timing; predictive precision and interpreting symbolic calendars.
15:00 Scientism vs. astrological knowledge; the objective reliability of planetary movement compared to shifting cultural narratives.
20:00 Use of AI and large language models in astrology; the limits and future potential of automation in interpretation.
25:00 Western vs. Vedic astrology; the sidereal vs. tropical zodiac debate and cultural preservation of techniques.
30:00 Christianity, astrology, and the problem of idolatry; Jesus' position in relation to celestial knowledge.
35:00 The Saturn-Jupiter conjunction of 2020; vaccine rollout and election disputes as symbolic markers.
40:00 The Mayan Venus calendar and its eight-year cycle; 2020 as the true “end of the world.”
45:00 Media manipulation, air-age metaphors, and digital vs. analog paradigms; the rise of new empires.
50:00 Astrocartography and relocation charts; using place to understand personal missions.
Key Insights
Astrology as a Temporal Framework: C.T. Lucero presents astrology not as mysticism but as a sophisticated calendar system rooted in observable planetary cycles. He compares astrological timekeeping to how we intuitively understand days of the week—Sunday indicating rest, Monday bringing activity—arguing that longer astrological cycles function similarly on broader scales.
Historical Continuity and Translation: The episode traces astrology’s lineage from Hellenistic Greece through Persian Baghdad and into modernity. Lucero highlights the massive translation efforts over the past 30 years, particularly by figures like Benjamin Dykes, which have recovered lost knowledge and corrected centuries of transcription errors, contributing to what he calls astrology’s third golden age.
Cultural and Linguistic Barriers to Knowledge: Lucero and Alsop discuss how language borders—historically with Latin and Greek, and now digitally with regional languages—have obscured access to valuable knowledge. This extends to old medical practices and astrology, which were often dismissed simply because their documentation wasn’t widely accessible.
Astrology vs. Scientism: Lucero critiques scientism for reducing prediction to material mechanisms while ignoring symbolic and cyclical insights that astrology offers. He stresses astrology’s predictive power lies in pattern recognition and contextual interpretation, not in deterministic forecasts.
Astrology and the Digital Age: AI and LLMs are starting to assist astrologers by generating interpretations and extracting planetary data, though Lucero points out that deep symbolic synthesis still exceeds AI's grasp. Specialized astrology AIs are emerging, built by domain experts for richer, more accurate analysis.
Reevaluating Vedic and Mayan Systems: Lucero asserts that Western and Vedic astrology share a common origin, and even the Mayan Venus calendar may reflect the same underlying system. While the Indian tradition preserved techniques lost in the West, both traditions illuminate astrology’s adaptive yet consistent core.
2020 as a Historical Turning Point: According to Lucero, the Saturn-Jupiter conjunction of December 2020 marked the start of a 20-year societal cycle and the end of a Mayan Venus calendar “day.” He links this to transformative events like the vaccine rollout and U.S. election, framing them as catalysts for long-term shifts in trust, governance, and culture.
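As a brief editorial aside (not from the episode), the roughly 20-year rhythm of Saturn-Jupiter conjunctions falls out of basic orbital arithmetic; a minimal Python check using the approximate orbital periods:

# Synodic period of two co-orbiting bodies: 1 / (1/P_inner - 1/P_outer).
jupiter_years = 11.86  # approximate orbital period of Jupiter
saturn_years = 29.46   # approximate orbital period of Saturn
synodic = 1 / (1 / jupiter_years - 1 / saturn_years)
print(f"Jupiter-Saturn conjunctions recur about every {synodic:.1f} years")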
In this episode of Crazy Wisdom, host Stewart Alsop speaks with Ryan Estes about the intersections of podcasting, AI, ancient philosophy, and the shifting boundaries of consciousness and technology. Their conversation spans topics like the evolution of language, the impact of AI on human experience, the role of sensory interfaces, the tension between scientism and spiritual insight, and how future technologies might reshape power structures and daily life. Ryan also shares thoughts on data ownership, the illusion of modern VR, and the historical suppression of mystical knowledge. Listeners can connect with Ryan on LinkedIn and check out his podcast at AIforFounders.co.
Check out this GPT we trained on the conversation
Timestamps
00:00 – Stewart Alsop and Ryan Estes open with thoughts on podcasting, conversation as primal instinct, and the richness of voice communication.
05:00 – Language and consciousness, bicameral mind theory, early religion, and auditory hallucinations.
10:00 – AI, cognitive ergonomics, interfacing with tech, new modes of communication, and speculative consciousness.
15:00 – Scientism, projections, and authenticity; ownership of hardware, software, and data.
20:00 – Tech oligarchs, Apple, Google, OpenAI, and privacy trade-offs.
25:00 – VR, escapism, illusion vs. reality, Buddhist and Gnostic parallels.
30:00 – Magic, Neoplatonism, Copernicus, alchemy, and suppressed knowledge.
35:00 – Oligarchy, the fragile middle class, democracy’s design, and authority temptation.
40:00 – AGI, economic shifts, creative labor, vibe coding, and optimism about future work.
45:00 – Podcasting's future, amateur charm, content creation tools, TikTok promotion.
Key Insights
Conversation is a foundational human instinct that transcends digital noise and brings people together in a meaningful way. Ryan Estes reflects on how podcasting revives the richness of dialogue, countering the flattening effects of modern communication platforms.
The evolution of language might have sparked consciousness itself. Drawing on theories like the bicameral mind, Estes explores how early humans may have experienced internal commands as divine voices, illustrating a deep link between communication, cognition, and early religious structures.
AI is not just a tool but a bridge to new kinds of consciousness. With developments in cognitive ergonomics and responsive interfaces, Estes imagines a future where subconscious cues might influence technology directly, reshaping how we interact with our environment and each other.
Ownership of software, hardware, and data is emerging as a critical issue. Estes emphasizes that to avoid dystopian outcomes—such as corporate control via neural interfaces—individuals must reclaim the stack, potentially profiting from their own data and customizing their tech experiences.
Virtual reality and AI-generated environments risk becoming addictive escapes, particularly for marginalized populations. Estes likens this to a digital opiate, drawing parallels to spiritual ideas about illusion and cautioning against losing ourselves in these seductive constructs.
The suppression of mystical traditions—like Gnosticism, Neoplatonism, and indigenous knowledge—has led to vast cultural amnesia. Estes underscores how historical power structures systematically erased insights that modern AI might help rediscover or recontextualize.
Despite the turbulence, AI and AGI offer a radically optimistic future. Estes sees the potential for a 10x productivity boost and entirely new forms of work, creativity, and leisure, reshaping what it means to be economically and spiritually fulfilled in a post-knowledge age.
In this episode of Crazy Wisdom, Stewart Alsop sits down with the masked collective known as the PoliePals—led by previous guest Cathal—to explore their audacious vision of blending humans, nature, and machines through cryptographic reality verification and decentralized systems. They talk about neural and cryptographic projector-camera technologies like the “truth beam” and “reality transform,” analog AI using optical computing, and how open protocols and decentralized consensus could shift power away from corporate control. Along the way, they share stories from Moad’s chaotic tinkering workshop, Meta’s precise Rust-coded Alchemy project, and Terminus Actual’s drone Overwatch. For links to their projects, visit Poliebotics on Twitter and Poliebotics on GitHub.
Check out this GPT we trained on the conversation
Timestamps
00:05 Neural and cryptographic projector-camera systems, reality transform for art and secure recordings, provably unclonable functions.
00:10 Moad’s GNOMAD identity, chaotic holistic problem-solving, tinkering with tools, truth beam’s manifold mapping.
00:15 Terminus Actual’s drone Overwatch, security focus, six hats theory, Lorewalker’s cryptic mathematical integrations.
00:20 Analog AI and optical computing, stacked computational layers, local inference, physical reality interacting with AI.
00:25 Meta’s Alchemy software, music-driven robotics, precise Rust programming, contrast with neural network unpredictability.
00:30 Decentralization, corporate dependency critique, hardware ownership, open protocols like Matrix, web of trust, Sybil attacks.
00:35 Truth beam feedback loops, decentralized epistemology, neo-feudalism, Diamond Age references, nano drone warfare theory.
00:40 Biotech risks, lab truth beams for verification, decentralized ID systems, qualitative consensus manifolds.
00:45 Maker culture insights, 3D printing community, iterative prototyping, simulators, recycling prints.
00:50 Investment casting, alternative energy for classic cars, chaotic hardware solutions, MoAD workshop’s mystical array.
00:55 Upcoming PoliePals content, Big Yellow Island recordings, playful sign-offs, decentralized futures.
Key Insights
The PoliePals are pioneering a system that combines cryptographic models, neural projector-camera technologies, and decentralized networks to create tools like the “truth beam” and “reality transform,” which verify physical reality as a provably unclonable function. This innovation aims to secure recordings and provide a foundation for trustworthy AI training data by looping projections of blockchain-derived noise into reality and back.
Moad’s character, the GNOMAD—a hybrid of gnome and nomad—embodies a philosophy of chaotic problem-solving using holistic, artful solutions. His obsession with edge cases and tinkering leads to surprising fixes, like using a tin of beans to repair a broken chair leg, and illustrates how resourcefulness intersects with decentralization in practical ways.
Terminus Actual provides a counterbalance in the group dynamic, bringing drone surveillance expertise and a healthy skepticism about humanity’s inherent decency. His perspective highlights the need for security consciousness and cautious optimism when developing open systems that could otherwise be exploited.
Meta’s Alchemy project demonstrates the contrast between procedural precision and chaotic neural approaches. Written entirely in Rust, it enables music-driven robotic control for real-world theater environments. Alchemy represents a future where tightly optimized code can interact seamlessly with hardware like Arduinos while remaining resistant to AI’s unpredictable tendencies.
The episode explores how decentralization could shape the coming decades, likening it to a neo-feudal age where people consciously opt into societies based on shared values. With open protocols like Matrix, decentralized IDs, and webs of trust, individuals could regain agency over their data and technological ecosystems while avoiding corporate lock-in.
Optical computing experiments reveal the potential for analog AI, where stacked shallow computational layers in physical media allow AI to “experience” sensory input more like a human. Though still speculative, this approach could produce richer, lower-latency responses compared to purely digital models.
Maker culture and hardware innovation anchor the conversation in tangible reality. Moad’s MoAD workshop, filled with tools from industrial sewing machines to 3D printers and lathes, underscores how accessible technologies are enabling chaotic creativity and recycling systems. This grassroots hardware tinkering aligns with the PoliePals’ broader vision of decentralized, cooperative technological futures.