Voices of VR
Author: Kent Bye
Subscribed: 5,439 · Played: 58,073
Description
Since May 2014, Kent Bye has published over 1500 Voices of VR podcast interviews featuring the pioneering artists, storytellers, and technologists driving the resurgence of virtual & augmented reality. He's an oral historian, experiential journalist, & aspiring philosopher, helping to define the patterns of immersive storytelling, experiential design, ethical frameworks, & the ultimate potential of XR.
319 Episodes
IDFA DocLab is the selection of non-fiction digital and immersive stories that is part of the International Documentary Film Festival Amsterdam (IDFA), now in its 19th edition this year. DocLab founder Caspar Sonnen has been doing an amazing job of tracking the frontiers of new forms of digital, interactive, and immersive storytelling since 2007, and he joined me along with co-curator Nina van Doren to talk about the ten pieces in the DocLab Competition for Immersive Non-Fiction, the nine pieces in the DocLab Competition for Digital Storytelling, portions of the DocLab Spotlight, the DocLab at the Planetarium: Down to Earth program, the DocLab Playroom prototype sessions, and the DocLab R&D Summit.
In trying to describe the types of immersive art and storytelling works that DocLab curates, they have started to use the term "Perception Art." This year's theme is "Off the Internet," which speaks both to works that critique and analyze the impacts of online culture on our lives and to projects that were born on the Internet and are being given an IRL physical installation art context in which to view them. I'll be on site seeing the selection of works and interviewing various artists who are on the frontiers of experimentation with these new forms of "perception art."
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
The VRChat worlds by DrMorro are truly incredible. They're vast landscapes made of surreal mash-ups of various architectural styles and symbols that feel like walking through a waking dream. His Organism Trilogy (Organism, Epilogue 1, and Epilogue 2) is a true masterpiece of VR worldbuilding. His latest, Ritual, is one of the biggest and most impressive single worlds on VRChat, feels like walking through a fever dream, and is probably the closest thing to Meow Wolf's style of immersive art. His Raindance Immersive award-winning Olympia was truly his first vast world, and they've been getting bigger and more impressive ever since. He's got a keen ear for sound design, with soundtracks that help set the eerie mood of his sometimes unsettling and liminal worlds. In short, spending 4-5 hours going through one of DrMorro's worlds is a completely unique and singular experience, as he's in a class of his own when it comes to VRChat worldbuilding.
https://www.youtube.com/watch?v=L4AfYsmHQB8
I have long wanted to conduct an interview with DrMorro doing a comprehensive retrospective of his works, but he's an anonymous Russian artist who doesn't speak English. He's only done one other interview, with the Russian Del'Arte Magazine, but otherwise he's a pretty mysterious and cryptic figure. I managed to get ahold of him through a mutual friend, and he suggested that we do a "19th-century-style written correspondence" where I would send questions over text chat over the course of a week. He would use an AI translator to translate what I said into Russian, and then translate his Russian response back into English. For this podcast, I used the open source Boson AI Higgs Audio with Russian actor Yul Brynner's voice to bring DrMorro's personality to life, but the full transcript of our edited chat is down below if you prefer to read it as I experienced it.
You can support DrMorro's work through Boosty, and you can support the Voices of VR podcast through Patreon.
Kent Bye: Alright! Can you go ahead and introduce yourself and what you do in the realm of VR?
DrMorro: Hello! The name's DrMorro – or well, that's my alias, to be precise. That's the name I'm known by as the creator of all those strange worlds in VRChat. For now, that's my only real achievement in the VR sphere. Other than that, I'm a 2D and 3D artist, which is my main profession.
Kent Bye: Awesome. Well, this is my first interview that I’ve done via text. Can you give a bit more context for why you prefer to do the interview in this way?
DrMorro: Honestly, I'm a pretty closed-off person, and it's easier for me to write than to talk. It’s just a character trait. Especially since I can't even imagine communicating through a voice translator. When I write, I can at least somehow control the translation. I don't know spoken English, but I manage fine in writing. So, no conspiracy theories. It's just how I'm used to communicating. Though it's strange because by nature, I'm a staunch introvert and I make worlds about total solitude. In ORGANISM, how many entities did you even find there besides the hat-wearing figure? And then suddenly, this popularity falls on me, and constant communication becomes the norm. Aaaahhh!
Kent Bye: Well, I very much appreciate you taking the time to do what you describe as a “19th-century-style written correspondence” with me over the next week or so. And it makes sense that you could have a little bit more control in how you can express yourself via written text through a translator.
Alright. So I always like to hear what type of design disciplines folks are bringing into VR, and so can you provide a bit more context about your background and journey into working with VR?
DrMorro: To put it briefly, my journey is that I essentially work in architectural visualization. But that's more of a day job to keep myself afloat and pay the bills.
My main interest,
Charles Melcher's new book "The Future of Storytelling: How Immersive Experiences Are Transforming Our World" was released on November 4, 2025, and I had a chance to take an early look and interview Melcher. The book is broken up into six main chapters where Melcher argues that the future of storytelling is agentic, immersive, embodied, responsive, social, and transformative.
Melcher covers over fifty different "living stories" across different genres including virtual reality stories, location-based entertainment, immersive stories, immersive theatre, immersive art, experiential brand activations, and interactive experiences. He told me that he's had a chance to experience around 80 to 85% of the experiences featured in his book, most of which are site-specific and often time-limited immersive exhibitions that are not always easy to get into. He's been traveling to different locations around the world with his Future of Storytelling Explorer's Club to see many of these experiences, as well as to engage with the creators behind them.
In his book, he shares brief trip reports on over 50 different experiences, along with some very high-quality, official photo documentation of these projects. This serves to document many of these ephemeral projects, but it also ties together some of the common elements that help to define and elucidate what exactly is meant by "immersive."
Melcher and I also talk about the founding of The Future of Storytelling Summit back in October 2012, as well as the start of his Future of Storytelling podcast in March 2020, which has published over 120 interviews since it started during the pandemic. Around 20% of the projects and creators that have appeared on his podcast are featured in his book as what he considers to be a canon of work that exemplifies these deeper trends of immersive storytelling and living stories.
While the book does provide a lot of valuable documentation, one complaint that I have is that it is not always easy to tell where Melcher is sourcing his quotes from project creators. The majority of quotations are coming from either private interviews that he personally conducted or from public conversations that he's featured on his podcast. But sometimes he uses quotes of creators from other publications without full attribution. So if there's a second edition, then I hope to see a more detailed set of footnotes and perhaps an index to make it an even more useful piece of documentation.
The way that Melcher is breaking down the different foundational qualities of immersive experiences also closely mirrors my own elemental approach, but with some slight deviations or different categorizations. His agentic qualities are equivalent to what I call active presence, his embodied is the same as my embodied presence, and his social is the same as my social presence.
I also have emotional presence and environmental presence, which he classifies as emotional and physical subsets of immersive qualities. Melcher also has a participatory subset under immersive qualities, which I consider to just be a part of active presence and what he is already classifying as agentic.
For me, "immersive" is more of an umbrella term that includes all of the various qualities of presence, while Melcher proposes a sort of rating system judging the degree of immersiveness across physical, emotional, and participatory dimensions. But Melcher doesn't list social as its own vector of immersiveness, as he told me he considers social to be a subset of the emotional dimension, whereas I consider social qualities to be distinct from emotional ones.
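Melcher's proposed rating system could be sketched roughly as follows. This is a hypothetical model: the book proposes the three dimensions but, as far as I describe here, not a numeric scale or an aggregation rule, so the 0-5 scale and the unweighted mean below are my own assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class ImmersivenessRating:
    # Hypothetical 0-5 scores along Melcher's three proposed dimensions.
    physical: float
    emotional: float
    participatory: float

    def overall(self) -> float:
        # Unweighted mean is an assumption; the dimensions are Melcher's,
        # but the aggregation formula is not specified in the book.
        return (self.physical + self.emotional + self.participatory) / 3
```

For example, a piece rated 4 on physical, 5 on emotional, and 3 on participatory would score 4.0 overall under this sketch. Note that social is deliberately absent as its own vector, matching Melcher's framing rather than mine.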
Melcher also highlights the "responsive" qualities of a piece of work, which I see as both connected to ways of amplifying agency, but also something that contributes to Slater's Plausibility Illusion of an experience or a suspension of disbelief, which I classify under mental presence.
The Matrix at Cosm in LA opened on June 6th, 2025. It leverages Cosm's 87-foot, 12K+ LED immersive dome to show the classic film within a 16x9 frame while the space beyond the frame is filled with over 50 different scenes, expanding the worldbuilding beyond the frame. I finally had a chance to see it last month and was really impressed with how much this additional space increased the level of immersion, amplified key emotional beats within the film, and created some truly awe-inspiring moments.
I had a chance to speak with Alexis Scalice, Cosm’s vice president of business development and entertainment, about Cosm's collaboration with Little Cinema, MakeMake, and Warner Brothers to launch their inaugural "Cinematic Shared Reality" immersive experience. The Matrix has a few more weeks of screenings before their second film Willy Wonka and the Chocolate Factory (1971) opens on November 21, 2025.
You can also hear more context in Noah Nelson's No Proscenium podcast interview with Little Cinema's Jay Rinsky, conducted ahead of the world premiere. And I also share some impressions of the two enhanced cinema productions of The Black Phone and M3GAN within the Blumhouse Enhanced Cinema Quest App. These films have some similarities to what The Matrix at Cosm is doing, but at a much smaller scale and not nearly as effective as the expanded immersive worldbuilding in one of the greatest science fiction films of all time. The Matrix at Cosm is setting a quality high bar for this type of format that is going to be difficult to match. You can see more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I had a chance to catch up with Wevr CEO and co-founder Neville Spiteri, whose company has been making location-based VR experiences for the last decade in what he calls "New Cinema." See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I did an interview with Joe Darko, Global Head of Developer Relations at Snap, at Snap's Lensfest developer conference. See more context in the rough transcript below.
You can also check out all 11 episodes in this Snap Lensfest series here:
#1667: Kickoff of Snap Lensfest 2025 Coverage & SnapOS 2.0 Announcements
#1668: Snap Co-Founders Community Q&A about Specs 2026 Launch Plan
#1669: Snap's Resh Sidhu on the Future of AR Commerce & Developer-Centered Innovation
#1670: Snapchat's Embodied Gaming Innovations with AR Developer Relations Head
#1671: Reflecting on Snap's AR Platform & Developer Tools Past and Future with Terek Judi
#1672: Niantic Spatial's Project Jade Demo Shows Latest Location-Aware, AI Tour Guide Innovations
#1673: Snap Lensfest Announcement Reflections from AR Gaming Studio DB Creations
#1674: 3rd Place Spectacles Lensathon Team: Fireside Tales Collaborative Storytelling with GenAI
#1675: 2nd Place Spectacles Lensathon Team: CartDB Barcode-Scanning Nutrition App
#1676: 1st Place Spectacles Lensathon Team: Decisionator Object-Detection AI Decision-Maker
#1677: Snap's AR Developer Relations Plan for 2026 Specs Consumer Launch with Joe Darko
Here are some concluding deep thoughts that I just posted on LinkedIn.
Reflections on Snap Lensfest XR & AI Trends Covered in Latest Voices of VR Podcast Series
Snap brought me down to LA to cover their Lensfest developer conference, where they made a lot of AR developer platform announcements, held a hackathon featuring those new capabilities, and are gearing up for their 2026 consumer launch of Specs, their fully 6-DoF, hand-tracking-enabled AR glasses. It's been a full year since their Spectacles dev kit was announced and made available to developers, and I feel like Snap is on the bleeding edge of where the overall XR industry may be headed.
These latest 11 Voices of VR podcast episodes, spanning nearly 7 hours, dig into deeper trends that go beyond the headline announcements from Snap Lensfest. I recorded five interviews with various Snap employees, and I had a chance to catch up with some of the leading AR developers in the space, including Niantic Spatial's latest VPS guided tour experience on the Spectacles with an AI virtual being. I also served as a preliminary hackathon judge, getting hands-on with all of the AR projects exploring what's possible with the latest Snap Cloud announcements, and I'm featuring interviews with the top three Lensathon teams from the Spectacles track.
Snap's Latest AR Developer Platform Announcements
Snap is gearing up for a 2026 launch of Specs, by which point the Spectacles dev kits will likely have been available for nearly two full years. So this Lensfest marks a halfway point toward a consumer release, and the product team has been busy rapidly iterating on their bespoke AR app production pipeline. Dedicated AR glasses are very resource-constrained, so Snap has been continuing to evolve their Lens Studio developer tool and optimizing their SnapOS platform for Spectacles. Snap didn't share any news on their target specifications for the Specs, but they shipped eight significant releases of their development tools over the past year, with some of the biggest announcements serving as the primary focus at Lensfest.
Snap is launching Snap Cloud, based on a Supabase deployment of its open source, PostgreSQL-hosted solution. This will allow developers to dynamically load assets, call edge functions, and more easily set up database backends. This will hopefully help Spectacles AR lenses go beyond bite-sized entertainment and rapidly prototyped experiments into more fully-featured applications that also leverage cutting-edge AI models and computer vision. Spectacles developers have been limited by a 25MB lens size limit, but the Snap Cloud announcements make it so that larger assets can be dynami...
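To make the size constraint concrete, here is a hypothetical planning helper (not any Snap API) that splits a lens's assets into ones bundled within the 25MB package limit and ones that would need to be dynamically loaded at runtime from a backend like Snap Cloud:

```python
LENS_SIZE_LIMIT_MB = 25.0  # Spectacles lens package limit mentioned above

def plan_asset_loading(asset_sizes_mb: dict, limit_mb: float = LENS_SIZE_LIMIT_MB):
    """Greedily bundle the smallest assets until the size budget is
    exhausted; everything else is flagged for remote loading.
    Illustrative sketch only -- asset names and sizes are hypothetical."""
    bundled, remote, used = [], [], 0.0
    for name, size in sorted(asset_sizes_mb.items(), key=lambda kv: kv[1]):
        if used + size <= limit_mb:
            bundled.append(name)
            used += size
        else:
            remote.append(name)
    return bundled, remote
```

With assets of 4MB, 10MB, and 18MB, for instance, the 4MB and 10MB assets fit in the bundle while the 18MB one would be fetched from the cloud backend at runtime.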
At Snap's Lensfest developer conference, I did an interview with the 1st-place team in the Snap Spectacles Lensathon, Decisionator, including Candice Branchereau, Marcin Polakowski, Volodymyr Kurbatov, and Inna Horobchuk. I also summarize the other 10 Spectacles Lensathon projects after serving as a preliminary judge for the competition. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
At Snap's Lensfest developer conference, I did an interview with the 2nd-place team in the Snap Spectacles Lensathon, CartDB, including Guillaume Dagens, Nigel Hartman, and Uttam Grandhi (the other team member, Nicholas Ross, had prior commitments). See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
At Snap's Lensfest developer conference, I did an interview with the 3rd-place team in the Snap Spectacles Lensathon, Fireside Tales, including Stijn Spanhove, Pavlo Tkachenko, and Yegor Ryabtsov. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I did an interview with DB Creations co-founders Dustin Kochensparger and Blake Gross at Snap's Lensfest developer conference. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I did an interview with Alicia Berry, Executive Producer at Niantic Spatial, and Asim Ahmed, Head of Product Marketing at Niantic Spatial, at Snap's Lensfest developer conference about their latest Project Jade Spectacles demo. See more context in the rough transcript below.
https://twitter.com/tweetsfromasim/status/1981830288771887606
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
At Snap's Lensfest developer conference, I did an interview with Terek Judi, who works on Spectacles Product at Snap focusing on SnapOS, the platform, and developer tools. See more context in the rough transcript below, and if you'd like to check out the two interviews with Matt Hargett that I reference in the intro, then be sure to check out episode #1311 and episode #1660.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I did an interview with Raag Harshavat, AR Developer Relations at Snapchat, at Snap's Lensfest developer conference. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I did an interview with Resh Sidhu, Senior Director of Innovation of Specs and Developer Marketing at Snap, at Snap's Lensfest developer conference. See more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
Snap co-founders CEO Evan Spiegel and CTO Bobby Murphy typically hold a community-driven Q&A after their Lensfest keynote where they field over a dozen questions from attendees. I'm including this in my coverage again this year as it's a really great set of questions about the consumer release of Specs AR glasses next year, some of their thinking about the role of AI at Snap, and reflections on their 10 years of working with AR lenses, going back to the vomiting-rainbows facial filter released in 2015.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
This interview with Spectacles Community Manager Jesse McCulloch kicks off my coverage of Snap's developer conference, Lensfest. Snap is gearing up for a consumer release of their Snap Specs AR glasses sometime next year, and they've been busy frequently updating their underlying operating system and platform tools like Lens Studio. No new details about the Snap Specs themselves have been shared yet, but I did cover the biggest announcements at Lensfest throughout this series and in this interview with McCulloch.
I also had a chance to interview five different Snap employees exploring different aspects of their AR strategy, as well as some AR developers from the Snap ecosystem. Snap also brought me down to cover the 25-hour Lensathon, where I had a chance to be a judge for the 10 different Spectacles-based hackathon projects, so I'll be featuring the top 3 finalists in the series. I also interviewed the AR game developers from DB Creations and covered the latest AI-assistant guided tour demo from Niantic Spatial.
Here is a list of the 11 episodes and nearly 7 hours of coverage from Snap's Lensfest:
#1667: Kickoff of Snap Lensfest 2025 Coverage & SnapOS 2.0 Announcements
#1668: Snap Co-Founders Community Q&A about Specs 2026 Launch Plan
#1669: Snap's Resh Sidhu on the Future of AR Commerce & Developer-Centered Innovation
#1670: Snapchat's Embodied Gaming Innovations with AR Developer Relations Head
#1671: Reflecting on Snap's AR Platform & Developer Tools Past and Future with Terek Judi
#1672: Niantic Spatial's Project Jade Demo Shows Latest Location-Aware, AI Tour Guide Innovations
#1673: Snap Lensfest Announcement Reflections from AR Gaming Studio DB Creations
#1674: 3rd Place Spectacles Lensathon Team: Fireside Tales Collaborative Storytelling with GenAI
#1675: 2nd Place Spectacles Lensathon Team: CartDB Barcode-Scanning Nutrition App
#1676: 1st Place Spectacles Lensathon Team: Decisionator Object-Detection AI Decision-Maker
#1677: Snap's AR Developer Relations Plan for 2026 Specs Consumer Launch with Joe Darko
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I did an interview with VRChat co-founder and CEO Graham Gaylor at Meta Connect 2025 where we talk about the various monetization strategies that VRChat has been exploring with their user-generated content platform. VRChat announced layoffs of 30% of their employees back on June 12, 2024, and this is the first time I've had a chance to interview any VRChat executives since then.
I used to have a pretty consistent streak of interviewing VRChat leaders or employees at various VR conferences from 2014 through 2019, but after the pandemic they were not giving as many public interviews. I did, however, recently cover the VRChat Avatar Marketplace, as well as have a conversation with VRChat's new Trust and Safety lead Jun Young Ro about his plans to overhaul and modernize VRChat's Trust and Safety processes, especially as users like Harry X were pointing out gaps in their moderation processes.
I had a chance to chat with Gaylor about some of the early decisions in VRChat for making custom avatars easily uploadable since version 0.3.5 on March 16, 2014 when co-founder Jesse Joudrey made his first public contributions to the project. Joudrey elaborated on his vision of what he considered to be "one of the corner stones of virtual reality and any cyberpunk offshoot... Customization. I don't want any limit on who or what I can be in virtual reality."
I had dug up these dates and posts in the write-up for episode #1408, where I went down a deep rabbit hole tracing the origin story of VRChat. Gaylor had actually passed along some early emails and documentation from the early days of VRChat for that write-up.
The decision to make avatars completely customizable has been part of the magic and success of VRChat. But centralized and controlled identity has traditionally been one of the core pathways for monetization. In a conversation with VRChat community members after the June 2024 layoffs, qDot told me, "You cannot put the asset genie back in the bottle for VRChat. They can't just come up with an asset system that works this sort of centrally-regulated way now. Everyone is used to throwing these assets around, selling them on Gumroad, selling them on Booth." So I had a chance to talk with Gaylor about his paradox of customizable identity being both the secret sauce of VRChat, but also the clearest traditional path for monetization.
You can see more context in the rough transcript below.
This also happens to wrap up my coverage of Meta Connect 2025, and here's a recap of the different stories and coverage if you'd like to dig into more details of other things that were announced this year.
#1652: Kick-off of Meta Connect Coverage with Meta Ray-Ban Display Glasses Insights from Norm Chan
#1653: XR Analyst Anshel Sag on Meta's AI Glasses Strategy
#1654: CNET's Scott Stein's Reflections on Meta Ray-Ban Display Glasses Implications
#1655: Meta Horizon Studio News and Virtual Fashion with Paige Dansinger
#1656: Kiira Benz Part 1: "Runnin'" Large-Scale Volumetric Music Video (2019)
#1657: Kiira Benz Part 2: "Finding Pandora X" Bringing Immersive Theatre to VRChat (2020)
#1658: Kiira Benz Part 3: Immersive Storytelling Career Retrospective (2025)
#1659: VR Gaming Career Retrospective of Chicken Waffle's Finn Staber
#1660: Enabling JavaScript-Based Native App XR Pipelines with NativeScript, React Native, and Node API with Matt Hargett
#1661: State of VR Gaming with Jasmine Uniza's Impact Realities and Flat2VR Studios
#1662: Meta Connect Highlights & Meta Horizon News with JDun and JoyReign
#1663: ShapesXR Updates & Neural Band Design Implications of Transforming Your Hand into a Mouse
#1664: Resolution Games CEO on Apple Vision Pro Launch + Gaze & Pinch HCI Mechanic in Game Room (2024)
#1665: Resolution Games' "Battlemarked" Blends Mixed Reality Social Features with Demeo and D&D Gameplay
I did an interview with Gustav Stenmark at Meta Connect 2025 talking about their latest game Demeo x Dungeons & Dragons: Battlemarked, which enables some pretty interesting co-located mixed reality social features while also letting individual players have their own mixed reality or VR POV. You can see more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I did an interview with Resolution Games CEO Tommy Palm soon after the Apple Vision Pro launch last year where we talk about Game Room, the game commissioned by Apple, as well as their exploration of the relatively new gaze-and-pinch mechanic enabled by the eye tracking of the Apple Vision Pro.
After seeing the Neural Band at Meta Connect, I'm reminded of how the gaze-and-pinch mechanic is ultimately a lot more efficient and more optimized for quickly selecting items in a fully volumetric context. Meta's Neural Band, announced at Meta Connect in the context of the Meta Ray-Ban Display glasses, only operates within a 2D, head-locked HUD display, and so using the Neural Band feels a lot like navigating TV menus with a remote control, except that your thumb and the side of your index finger are transformed into a two-axis D-pad. Again, the ultimate form factor is likely going to come back to gaze and pinch, but that will require shipping with eye tracking. And so this unpublished conversation with Tommy Palm takes on a new context as we reflect upon the latest HCI innovations announced at Meta Connect and where the ultimate form factor may be headed.
Resolution Games also has quite a history of launching games on newly released XR devices, and this conversation with Palm is in that same spirit. We'll be diving into Battlemarked in the next conversation. You can also see more context in the rough transcript below.
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality
I did an interview with Michael Markman at Meta Connect 2025 talking about all of the latest updates to the VR design and prototyping tool ShapesXR, and then we dive into some of his hot takes after he got a chance to try out the Meta Ray-Ban Display glasses and the associated Neural Band. He sees the Neural Band as essentially transforming your hand into a mouse that provides a simplified navigation system (probably closer to a D-pad on a TV remote), with index-finger-to-thumb serving as a functional left click and middle-finger-to-thumb as a functional right click. That has been enough to build the foundation of most modern HCI for computer software for the last 57 years, since The Mother of All Demos debuted the mouse in 1968. See more context in the rough transcript below.
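The hand-as-mouse mapping Markman describes can be sketched as a small lookup table. The gesture and event names here are illustrative placeholders of my own, not identifiers from any Neural Band SDK:

```python
from typing import Optional

# Hypothetical mapping from recognized thumb pinches to mouse events,
# mirroring the left/right click analogy described above.
PINCH_TO_MOUSE = {
    "index_pinch": "left_click",
    "middle_pinch": "right_click",
}

def to_mouse_event(gesture: str) -> Optional[str]:
    """Translate a recognized pinch gesture into a conventional mouse event;
    unmapped gestures return None rather than raising."""
    return PINCH_TO_MOUSE.get(gesture)
```

The point of the sketch is how little is needed: two pinch gestures reproduce the two-button mouse vocabulary that most desktop HCI has been built on since 1968.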
This is a listener-supported podcast through the Voices of VR Patreon.
Music: Fatality






this was very alarming to hear. let's take steps backwards for inclusivity! come on.
love this podcast!
VR treadmill is an awesome idea and I think it will do good for people who like walking the treadmill
She seemed rather defensive in her interview. Maybe that's just me, but it's def the way I interpreted it.
The comment about Sat morning cartoons really made me think. It's easy to forget that our march forward technologically can undo certain protections we used to enjoy.
Welp, turns out this Rob guy is an F'N idiot
Free Roam is such an amazing technology and potential for amazing content! Can't wait to try and develop some of that content myself. There seems to be a huge dearth of certain content, like magic, that would be even more appealing than just more immersive shooters.
this guy is super out of touch
Idk if you read these comments.. But I'd love to help you out however I can!!! I work at a VR arcade and found you a few weeks ago. Really like this episode and this podcast and am soooo excited and ready to help shape these positive, experiential futures!!
Of course there's other big opportunities to replace the often unethical and undesirable ad tyranny. It just requires ideas from people outside the industry. I think we can and should take the monetization of customization and expression that we see in (free to play) games, and extrapolate from there. That's usually an incredibly profitable, scalable, and relatively high margin revenue model that is also much more consumer friendly if done ethically (ie no loot boxes or skinner boxes). Imagine we create tools and platforms where people can easily customize every aesthetic of their augmented or virtual reality and then share/sell that unique cosmetic thumbprint as a product or even a service. This allows us to turn the whole world into a connected market of user generated content, where everyone can profit, companies are not incentivized to play fast and loose with private data, and reality itself becomes more personalized.