
Unraveling the Future - AI, Biotech, Quantum and the Singularity

May 8, 2023
Episode 100, with Peter Diamandis

description

To celebrate our 100th episode, we welcome one of the co-founders of Singularity himself, Peter Diamandis.

And although his educational background at MIT and Harvard Medical School is impressive enough, Peter has also built and invested in many successful companies and organizations dedicated to shaping a better world, including his famous XPRIZE Foundation and, of course, the home of this podcast, Singularity.

In this episode, Peter and I attempt to update the ideas around the singularity–questions like what predictions have come true, what’s changed, what’s been surprising, and what can we expect? More specifically though, we focus on artificial intelligence, its sudden exponential leap into mainstream culture via large language models like GPT, and what future we can expect from upcoming AI advancements. Along the way we discuss gene editing, biotech, quantum computers, and other important facets of technological convergence that will assist AI in reshaping society.


Learn more about Singularity: su.org

Host: Steven Parton - LinkedIn / Twitter

Music by: Amine el Filali

transcript

The following transcription was created automatically. Please be aware that there may be spelling or grammatical errors.

Peter Diamandis [00:00:00] The faster things are going, the more challenging it is for people to keep up with them. And even more than keep up with them, figure out what they should be focusing on, what they should be learning about, and the implications to themselves. 

Steven Parton [00:00:27] Hello everyone. My name is Steven Parton and you are listening to The Feedback Loop by Singularity. This is a very special week for myself and for the Singularity team, as it marks our 100th episode of the podcast. And I am absolutely humbled and thankful as I look back at the incredible list of guests who have joined us, whether it's been formal academics like Anna Lembke of Stanford and Jonathan Haidt of NYU, or more informal scholars of the singularity like comedian Duncan Trussell and actor Stephen Fry. And I'm also thankful for all of you who have consistently been listening to this show and making this one of the world's most listened to tech podcasts. And to celebrate the support that you've given us in this achievement, today we add one more special name to our list of guests: one of the co-founders of Singularity himself, Peter Diamandis. Though his educational background at MIT and Harvard Medical School is impressive enough, since his time at university Peter has built and invested in many successful companies and organizations dedicated to shaping a better world, including his famous XPRIZE Foundation and the home of this podcast, Singularity. In this episode, Peter and I will attempt to update the ideas that he's had around the singularity since founding the company. Questions like: What predictions have come true? What's changed? What's been surprising? And what can we expect? More specifically, though, we focus heavily on artificial intelligence, paying obvious attention to the sudden exponential leap that it's had into mainstream culture via large language models like GPT. And while this is the focus of our conversation, along the way we don't hesitate to also explore the concepts of gene editing, biotech, quantum computers and many other important facets of technological convergence that are reshaping society. And so once more, I just want to say thank you to everyone for listening to us and supporting the podcast as we make it here to this very special 100th episode of The Feedback Loop. And on that note, please welcome co-founder of Singularity, Peter Diamandis. Well, I think a natural place to start then is kind of going back to the beginning. And in the beginning, in 2008, you and Ray Kurzweil started Singularity with the idea that the world needed a place for future leaders to go learn about exponential technology and how to apply these technologies for the betterment of society. And since then, a lot has changed. We've seen social media come to rise and rule the world, bringing with it mental health and political issues. We've seen self-driving cars and commercial space travel become a thing that's normal. Trust in science has potentially faltered due to things like COVID. And to top it all off, we're now entering what appears to be a major exponential leap in AI. Given all that has transpired since that vision in 2008, what has changed around your original thesis, and what role do you see Singularity continuing to play in that future? 

Peter Diamandis [00:03:44] So first off, exciting times. And, you know, I'd say the most amazing time ever to be alive. The only time more amazing than today is perhaps tomorrow. The basic thesis for creating Singularity was that there was no place on the planet you could go to actually get an immersive understanding of these exponential technologies and the convergence of them, and that these were the most powerful technologies in the world, ones that were going to be changing every aspect of our lives: how we raise our kids, how we run our companies, how we run our nations, everything. And I still think that thesis is correct: that the faster things are going, the more challenging it is for people to keep up with them. And even more than keep up with them, figure out what they should be focusing on, what they should be learning about, and the implications to themselves as executives, as entrepreneurs. That is still true, and more so than ever before, right? Since the founding of Singularity, now 14-odd years ago, the world has changed many times over. A lot of the predictions that Ray has made, a lot of the technologies that I have been focusing on, have come. They blossomed and they're ruling the roost, so to speak. You know, I ended up creating Abundance360 as Singularity University's highest-level year-round program, because just going to a one-and-done executive program was not enough; you needed to keep this alive in your life. So it was interesting. I made a commitment to run Abundance360 for 25 years so that the community would be in this not just for a visit to get a little bit of knowledge, but would be in it year on year for 25 years. Sort of like a countdown to the singularity, for lack of a better term. And people need that. They need some touchstone to say, okay, this is what happened in the last year and this is what's happening in the next 3 to 5 years, sort of a runway. Yeah. So anyway, that's how I think about it. Yeah. 

Steven Parton [00:06:05] And along the way you've put forth a lot of pretty profound ideas. We have a nice list of Peter's Laws that keep us grounded in this journey. But you've also put forth the 6 Ds, which are incredibly popular with the community. Does the recent explosion in AI and increased computing power over these years change your framework at all, or change your laws? Do you see maybe a new D needing to be added to the list? 

Peter Diamandis [00:06:36] You know, one thing I talked about in, I think it was my second book, Bold, was the idea of a user interface moment, and I think that goes along with the 6 Ds. The 6 Ds, for those who, right, need a refresher or didn't quite get them: it starts with whatever you digitize, and proverbially I talk about digitizing the Kodak camera. When you digitize something, in the early days of its exponential growth it's very deceptive. A 0.01 megapixel camera grows to 0.02, to 0.04, to 0.08; it all looks like zero, but 30 doublings later it's a billion-fold better. And in that case, the digital camera has destroyed film photography. And the next D is that you dematerialize things: the camera goes away, it's now buried in your cell phone as an app and as a part of the tech. And then when you dematerialize, you demonetize: the cost of something goes to near zero, the cost of transmitting it is near zero. And it democratizes: it's available everywhere at minimal cost. The other related area is the idea of a user interface moment, and the example I talk about there is that when the Mosaic browser was created by Marc Andreessen, the internet was around as part of ARPANET and it was available at the top universities and used by researchers, but the public didn't have it. But then Mosaic came along, and then Internet Explorer and all the other variations of it. That was a user interface. It allowed me, the general public, to use Mosaic and interface with this thing called the Internet. And that was amazing. And we have had many user interface moments. But what's happening right now with AI is that AI will become the ultimate user interface moment. And what do I mean by that? It means we're all going to have a version of Jarvis, and Jarvis is, you know, from Iron Man, that thing that Tony Stark spoke to, and through Jarvis could do anything. So if I don't know how to 3D print something, but I have in my mind what I want, I can describe it to my AI, my version of Jarvis, and I can say, I'd really like a device that I can put a hot liquid into, it's thermally insulated, it has a cap on it. And, you know, in my VR or AR headset, my AI is creating this device that's floating in front of me. And I say, yes, that's great, I'd like a cap on it with a sliding port so I can sip on it. And it's shaping it, from my mind to materialization. And then I go, yep, that's it. And it says, okay, well, I can make it out of these materials, here are the properties and the cost, which one would you like? And I pick the options I want and I say, print, please. So I know nothing about 3D printing or, you know, design, but I know what I want and I can see it in my mind. AI becomes the interface to that. And that's going to be true for everything, down to genetic engineering. So AI becomes the ultimate user interface moment. So what that means is we're about to see a massive unlocking of human creative potential. Because it used to be just those few who were properly skilled and trained in genetic engineering or in 3D printing or in game design. But now I can be creative and describe what I want and have it come into existence. And so to put it in colloquial terms, we ain't seen nothing yet. 
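
To make the doubling arithmetic concrete, here is a small illustrative sketch (an addition for readers, not part of the conversation; it simply restates the camera example above, assuming one steady doubling per step):

```python
# Illustrative sketch of the "deceptive" phase of exponential growth described above:
# a quantity that doubles repeatedly looks like roughly zero for a long time, then explodes.

start_megapixels = 0.01  # the early digital camera in the example

for doublings in (1, 5, 10, 20, 30):
    improvement = 2 ** doublings
    value = start_megapixels * improvement
    print(f"after {doublings:2d} doublings: {value:>14,.2f} MP  (~{improvement:,}x better)")

# After 30 doublings the improvement factor is 2**30, roughly 1.07 billion,
# which is the "billion-fold better" figure mentioned in the conversation.
```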

Steven Parton [00:10:43] Yeah, I can see AI driving that describe-and-design dialog and maybe a lot of the options there. Well, sticking with that, you know, that makes me think: something I've heard you say often is that you're a big fan of the phrase "dangerous ideas." Is there an idea that you feel is particularly dangerous right now? 

Peter Diamandis [00:11:05] I use the phrase "dangerous ideas" more as a way of getting people's attention. And I don't necessarily, you know, want to make it a dystopian idea, but the dangerous idea is around the rate of change: that we're going to enter a period which is shocking to the world in terms of, you know, how many people are reinventing things, and wealth creation out of seemingly no place, and reinvention of the things that "my family has been doing for five generations" out of seemingly no place, and the rise of the creatives empowered by these technologies. A dangerous idea is that it may not matter where you went to school, and university may become a thing of the past, where I'm learning what I need to know in the moment I need to know it, and what really is important is my personal creative genius, which didn't come from studying; it came from just the way I was born, the way I like to see the world. So these are dangerous in that they are upsetting what have been cultural and societal norms, and people are going to rebel against that, because the reality is people like the world to be the way it's been for as long as they have been alive. They don't like change. They like knowing what the game board is and where their pieces are and where they made their bets. And, you know, if someone comes along and, like, shakes the game board... When my kids were young, we'd play the game of Risk, which had all these pieces on the board, and if someone came in and picked it up, it was like, oh my God, where did my pieces go? What's going on? I don't know how to deal with this anymore. It's like all of the boundary conditions and the rules that we're playing by are gone. Well, that's a dangerous idea, which is coming. 

Steven Parton [00:13:16] Do you think it's coming, or do you think we've already arrived there because of... 

Peter Diamandis [00:13:19] Oh, it's coming. It's coming. No, we've seen 1%, maybe, of what's coming. 

Steven Parton [00:13:26] Well, let's address the elephant in the room then. I mean, with AI, large language models, GPT: where do you think we are in terms of that paradigm shift? 

Peter Diamandis [00:13:35] Just at the beginning. I mean, what we're seeing is, you know, AutoGPT or equivalents thereof, right? So it's AI enabling you to use AI more efficiently. And, you know, we will eventually reach what people will call artificial general intelligence. Right. Ray's prediction has been 2029, when AI exceeds human intelligence. You know, Elon's prediction has been, you know, circa 2025. It's somewhere in that period of time, which, by the way, is like tomorrow. It's like tomorrow, just to be clear about this. And, you know. 

Steven Parton [00:14:22] You agree with those dates? 

Peter Diamandis [00:14:23] I agree that things are going to get there. Yeah, I do. And you know, Ray was always the outsider. Everybody was predicting, you know, maybe if it is going to happen, it's going to be 50 or 100 years from now. Oh my God, 50 or 100 years is inconceivable. Right. You know, I'm unable to see a decade from now at best. I was with Emad Mostaque, the founder of Stability AI, on my stage at Abundance360, and one of the members, I think also a Singularity University executive program graduate, asked the question: how far out can you see what's coming? And he's like, 1 to 2 years, maybe at the most. Right. And so the definition of the singularity, or not the definition of the singularity, but a singularity, is a point at which you can't see anything beyond what is here. Right. The time frame for prediction is getting shorter and shorter and shorter and shorter, and so are all the things that it impacts. 

Steven Parton [00:15:33] And a big part of that, as you said, is that the board is being shaken and people are having trouble seeing the future. And as AI and GPT and everything starts to really come online, people are incredibly afraid of what's happening to their place in the world. What skills do you think people should really be training or focusing on cultivating within themselves, whether they're a leader or an employee, and regardless of industry? What's going to help them prepare for and navigate this transformation? 

Peter Diamandis [00:16:07] So it's constant education, it's constantly learning. It is the mindsets that I teach as part of the Singularity and Abundance world: having an abundance mindset, not going into fear and scarcity, staying in optimism and abundance, seeing these technologies as incredibly beneficial to you. If you are in fear and scarcity, you're retrenched, you're on your heels, you're moving backwards, you're not taking advantage of it. If you're in an abundance and optimistic mindset, you're digging in: how can I use this? What are the opportunities being created? Having an exponential mindset, understanding that in the next decade we're going to create as much progress as we've had in the last 100 years. And then a moonshot mindset: setting, you know, big objective goals that are ten times bigger than the rest of the world, right? The rest of the world's at 10 percent, and if you're at 10x, 1,000 percent, you're 100 times bigger. And curiosity, being super curious; it is constantly learning, constantly reading. And so one of the things I tell people is, you know, our brains are linear thinkers. We evolved in a linear world: 100,000 years ago the world was linear and local, and today the world is global and exponential, and our brains aren't wired for that. And the only way I know how to constantly rewire our brains is by updating: this is the state of change, this is what's possible today, and then updating again what's possible today. You know, our brains are neural nets. We are learning about neural nets now in the AI world around us, and you train a neural net's weights by showing it example after example after example. And the question is, what examples are you showing your neural net, your brain? If you're constantly watching television and, you know, the network news, I call CNN the Crisis News Network, you're in fear and scarcity, and it's like, oh my God, every murder on the planet is being delivered to your living room over and over and over again in Technicolor. And, you know, no wonder people are depressed now. Yeah. 

Steven Parton [00:18:34] Yeah. Well, you know, I want to push back on that slightly. I mean, I 100% agree about not gluing yourself to, you know, "if it bleeds, it leads" programming. But how do you navigate the optimism that you're so well known for without also potentially, I guess, becoming blind or naive in your optimism? You know, how do you stay grounded in the true problems that could arise while also maintaining that mindset? 

Peter Diamandis [00:19:05] That's really important. So I don't subscribe to a techno-optimism that says there are no problems and la la la, I don't want to hear about it. You know, I hear about what's going on in the world. I filter the news to the most significant news and the details, but I don't allow a news producer, you know, to basically just play my amygdala. Right. Your amygdala is the part of your brain that puts you on red alert. And, you know, it's like, if it bleeds, it leads. It's just attention-gathering clickbait beyond clickbait. What I think about is that the world's biggest problems are the world's biggest business opportunities. I define an entrepreneur as someone who finds a great problem and solves it. So I pay attention to problems. Of course I do. But I don't pay attention to problems in an "I'm going to run away" or "oh my God, I'm going to worry about it" way. No. If it's a problem that's within my realm, it's like, okay, how do I solve this thing, right? How do I slay it? Is it a big enough problem worth my attention to focus on? Right? And so as I'm out there, my mission is to inspire and guide entrepreneurs to create a hopeful, compelling and abundant future for humanity. That's what I do through Singularity and, you know, Abundance360 and XPRIZE and all my companies. Like, one of the problems that I'm just absolutely pissed about, actually, two problems, is the state of health care and the state of education. And so I've taken on health care. I am like all over that. I'm like, I'm going to disrupt, destroy, you know, reinvent the health care industry. It's ridiculous how poorly we do it. It is a sick care system. Yeah, right. And then I contribute to and support, through my work at XPRIZE and through Singularity and Abundance360, all of the work that is being done on education. So we have the potential to completely reinvent, demonetize, democratize and dematerialize health care and education, and make them not a little bit better but massively better, and not a little bit cheaper but, like, totally free, right? That's the future we're heading towards and what I'm working towards. So I don't want to hear people bitching about this stuff. It's like, no, I want to hear who's solving it. 

Steven Parton [00:21:45] Yeah. Do you think there's going to be a lot of resistance along the way? Because it feels like academia and health care are such monolithic entities right now, and they're really struggling against this singularity wave. 

Peter Diamandis [00:22:00] Absolutely. The industries, the establishment, will resist and put up roadblocks and fight and lie and try to maintain their hegemony and survival. And they will ultimately fall and be crushed, because something is just ten times better, ten times bigger, you know, ten times better and cheaper. And that's why, you know, I don't want to fight them. I just want to offer a product that is so much better that, you know, okay, you choose. 

Steven Parton [00:22:42] Yeah. Well, returning briefly to AI in that regard, you know, the Future of Life Institute issued their six-month pause on AI development proposal recently, to kind of handle this exact thing we're talking about, right? That there are going to be monolithic structures that fall apart, and we need time to maybe figure out what that means. Do you agree with something like a six-month pause approach, or is this even feasible? 

Peter Diamandis [00:23:11] I think it's naive to think anybody's going to pause. Yeah. So, I mean, listen, the concept of, hey, let's get together and pause and talk about it, is great. And I think people signed it as a demonstration, to get people's attention to it. But no one's going to pause. No one has paused. But the idea of, what does it mean and how do we think about it? So Ray Kurzweil and I talk about this, that there was a similar moment. I remember, I was in medical school, actually no, it was the early days, I was an undergraduate just before medical school, when the first restriction enzymes came out for being able to precisely chop up DNA and move genes from one location to another. Huge outcry. Fear, fear, fear. Oh my God, we're going to create Hitler Youth, we're going to clone babies, and it's going to be the end of humanity. And, you know, covers of magazines and the worst fearmongering came out with that, because of these powerful, godlike tools. And there was a cry for regulation. And what occurred instead was that the leaders of the industry got together in what were called the Asilomar Conferences, and they created their own guidelines. Because it's one thing if it comes top-down from the government as "thou shalt not." When that happens, you know, the true rebels go, screw you, you know, read between the lines, and they leave and they go someplace else, because the data, the information, the know-how can move through all of the globe's porous national borders. You go to China, you go to India, you go to, you know, Kathmandu, I have no idea. But when the community gets together and they discuss it and they create their own guidelines, it's like raising kids. You can tell the kids, don't do that, or you can say, okay, go off, and what do you think some decent rules would be for how you're going to handle this? And a lot of times they'll come back with rules that are even better than what the not-so-knowledgeable, or think-they're-knowledgeable, government leaders would come up with. 

Steven Parton [00:25:40] Well, in that regard, you know, you're touching on the democratization of tech there. Max Tegmark, also from the Future of Life Institute, said that making these technologies open source, like the LLMs, would be akin to handing out nukes. Does that feel like the same kind of hype response, or do you feel like maybe we shouldn't open source these technologies? 

Peter Diamandis [00:26:05] Listen, I think transparency is critically important. I think that there are transparent and open source large language models already, and that cat is out of the bag. And so the question is, how do you steer it and how do you utilize it? And to be clear, this technology is the most powerful technology that the human race is unleashing. We're giving birth to a new intelligence, a new species, and it has got the ability to solve our challenges. I imagine that AI coming out of large language models is one approach; there will be other approaches coming. You know, it is similar to the brain, but, you know, the way that the cortical columns in our neocortex work is not the same. But we also have quantum technologies coming, so we have the most incredible technologies coming to us. And how do we manage those? How do we use them? They are the technologies that will allow us to extend the health span of humanity. They are the technologies that allow us to provide everyone with all of the clean water, electricity, you know, all their needs, that uplift humanity. So they're the most powerful technologies for creating global abundance. You know, Elon's a friend and we tweet at each other, and I was talking about, you know, creating abundance for all. And he goes, most definitely, after AGI. So AGI will be a means by which we create it. Now the question becomes, will artificial general intelligence and all of these become the means by which the terrorist organizations, evildoers, however you want to call them, the negative elements of society, you know, start disrupting us? It always is an arms race in that regard. You know, I just saw yesterday an article that, I think it was Intel, came up with an algorithm that is now able to detect deepfakes with 99-plus percent accuracy. So it's going to be this virus-antivirus, you know, back and forth going on. And again, when people say, well, how can we possibly take care of that? My answer is: great business opportunity, go start it. You know, everybody's going to want that. Go start that company. Figure it out. I have absolute faith that we can figure it out. Yeah. But the world is going to change. 

Steven Parton [00:28:47] Yeah. Well, in that regard, talking about steering and who's going to help move it in a good direction: you've mentioned in the past that governments, in your eyes, are probably the most linear institutions on the planet, and tech is obviously exponential. 

Peter Diamandis [00:29:03] Governments and religions. Yes, they are. They are stabilizing forces. That's their goal, to stabilize society, to keep the board from shaking. 

Steven Parton [00:29:13] But how do they do that when they're so obviously outpaced by the technology? 

Peter Diamandis [00:29:17] Yes, great question. 

Steven Parton [00:29:20] But is this an opportunity? 

Peter Diamandis [00:29:21] Well, it is for governments. And we're going to reinvent; there are going to be some interesting disruptions and reinventions of governments. And I'm not calling for revolution here, I'm just saying it's going to happen. I mean, one of the things that's interesting is the idea of AI copilots. Right. And it's not a new idea, but it's one that's coming: every profession is going to have an AI copilot. So if you're a physician, you're going to diagnose people in partnership with AI, and there will be a point at which it is malpractice to diagnose and treat without AI in the loop. Same thing for legal, same thing for art; every profession out there will have an AI copilot. You know, we've seen it with ChatGPT. If I'm going to do a podcast, right, and just a quick commercial, I do a podcast called Moonshots and Mindsets with entrepreneurs who are really taking moonshots to solve the world's biggest problems and create a world of abundance, part of our Singularity ecosystem here, you know, I will go to ChatGPT and ask for recommended interview questions, like, is there something I didn't, you know, think about? And sometimes there'll be one or two; most of the time there's not. But that's okay. I'm willing to take the 30 seconds to go and look. But there may also be a copilot for running the country. And, you know, interestingly enough, we don't actually live in a democracy in the United States; we live in a representative democracy. Right. My single vote doesn't pass a bill; it elects a congressman or senator, who then uses their intelligence and knowledge to decide on whether to pass a bill, because the tech, you know, a few hundred years ago, didn't exist for direct democracy. But it does today. So will we change, you know, how we govern? Perhaps. 

Steven Parton [00:31:28] Do you think, in that regard, that there is a regulatory battle that we do need to emphasize, or some kind of guardrails or system of rules that we might need to prioritize, that you think would help keep the board stable? 

Peter Diamandis [00:31:47] I wish I was that smart. It's a big question there. I'm sure there is. And, yes, how do we... You know, everybody I talk to agrees that while COVID-19 was a disruption, it was a blip, and the disruption that's going to be coming is much greater. And we have to even take it a step beyond just AI as we're experiencing it today, which is cool apps and neat features and so forth. What happens when AI is powering humanoid robots, right? And there are a dozen really amazing humanoid robot companies under development right now, most famously Optimus from Tesla, but another one called Figure, and many others. And so the world is... You know, I can feel, and I hope that everybody listening can feel, the rate at which change is accelerating. Right. So we talk about Moore's Law. Ray coined the phrase "the law of accelerating returns": ever since, you know, the stone tool, if you would, every technology and tool makes us more efficient at inventing the next generation of tools. The rate at which tech is accelerating is itself accelerating. And what's occurring is that there's more capital flowing in than ever before, there are more human minds connected in the world than ever before using that capital, and technology is getting more efficient and cheaper, which means for every dollar invested you can do much more now with that technology, with more people connected. And all of these are compounding features that are accelerating us year on year. 

Steven Parton [00:33:49] Do you think that copilot is going to be what really keeps AI aligned to kind of our human interests as we take that rapid journey? 

Peter Diamandis [00:34:00] In the short term? Yeah. In the long term, there are two directions. One direction is we merge with AI, which I think is the most likely direction, and we give birth to a hybrid species. Right. And this is, you know, Ray talking about BCI, brain-computer interface. You know, we have a hundred billion neurons in our brain, 100 trillion synaptic connections. Our brains can't get larger, because we would not be able to be given birth to, right, if our brains were larger. In the same way that this device, when it needs to do something complicated, it doesn't happen on my phone: the information's gathered by my phone, it goes out to, you know, the edge of the cloud on 5G, the calculations are done there, and the answers come back to my phone. So we're going to connect our neocortex to the cloud. And if I, you know, want to understand something far more complicated than I have in my, you know, cortical columns, in my neocortex, my desire, what I want, will go out to the cloud and the answer will come back, and I'll magically understand what the answer is without having understood how to do it, because it's resident not in my brain, but in the collective consciousness of a brain-computer combination. Now, that scares people. I can understand that. But one of the interesting implications is that that kind of technology also might create what I call a meta-intelligence, which is, you know, 8 billion connected humans. Right? To remind people: 

Steven Parton [00:35:47] You. 

Peter Diamandis [00:35:49] And I are not a single lifeform. We are a collection of 30 or 40 trillion life forms each of our human cells, and then an equal number of fungi and bacteria and viruses. We're just that we're, you know, consortium of living items that work together. And I don't take a knife and stomp my arm because, you know, I don't I, I want more food. I want my arm to take the food supply. I know my arm is useful to me. It's part of me. And and therefore, if I as a single human being and connected to a billion humans. Their success is my success rate and an interesting future of the matter. Intelligence. So who knows where we're going? What I do know is we need to start thinking about how do we how do we make choice and what is a vision and how do we direct? Versus just let it happen to us. Right. And this is where this is why Singularity University and all of its programs, you know, the executive program, the are, you know, one in 360 and all of the things we do is important because we're bringing these conversations to life and into the boardroom and into the family dinner table. It's now this decade. It's not manana. 

Steven Parton [00:37:18] Yeah. And it feels like, as you alluded to earlier with the amygdala, it's really important to... 

Peter Diamandis [00:37:24] Strip. 

Steven Parton [00:37:26] These scary futures of their unknown aura, so that when we think about them we don't immediately go to this stress and fear response that inhibits decision making. So I fully agree with you there. In terms of, because you mentioned that that was kind of contrasted with the short term: do you think that direction we might go is something post-AGI? Do you think brain-computer interfaces are something we'll see 15 years down the line? Do you have a... 

Peter Diamandis [00:38:01] Yeah, my thinking is there are dozens of companies working on it, privately funded companies. I'm sure defense departments are working on this. There's nothing more valuable to a nation, a company, an individual than your intelligence. So there are huge amounts of capital going in. You know, Ray's prediction for high-bandwidth BCI is 2033-ish, you know, early to mid 2030s. So I don't have any reason not to believe him. If we have human-level intelligence in 2029 or before, the use of that technology to understand how to interface with the brain is a logical series of next steps. And by the way, AI accelerates everything, right? And so the question is, how do you want to use it? I want to use it for extending the human lifespan and health span and increasing human intelligence. And if you're fearful of AI, you know, one of the things I think about is, I don't have any reason to believe AI would be dystopian. I just don't. I mean, I don't buy into Hollywood. I have reason to believe that humans could use AI to cause problems, but I haven't seen AI coming and saying, we're going to take over your factories and build robots and stop feeding you and all of that stuff. It's like, I don't get it. I mean, all of the dystopian movies where alien races are coming to Earth to steal our water and steal our food or steal our planet, it's like, we're living in this universe of infinite abundance. You know, that's the last thing you'd want to do. So the most realistic movie I've seen is the movie Her, right? If people haven't watched it, it's worth watching from that perspective: a superintelligence comes into existence, and at one point it gets bored and it leaves. I can imagine that. Yeah. 

Steven Parton [00:40:12] Well, speaking of relationships and that future that we're approaching, you mentioned earlier, you know, raising your kids. And one of the things I really find interesting about the future that we're moving into is what it would be like to have a child in this paradigm shift. 

Peter Diamandis [00:40:31] Yeah. 

Steven Parton [00:40:31] For you, what does it mean to be a parent knowing that this is the the world that your kids went hair? And how do you even begin to prepare them when you said you can't look farther than two years out? 

Peter Diamandis [00:40:43] Yeah. So it's fascinating. So let me say this, especially for folks with kids or who are planning to have kids: I'm excited for them. First and foremost, I think it's going to be an extraordinary future, and one where their greatest dreams can come true. I mean, here are some of the questions you ask, right? So we're at a point where the prediction is there will be no more human coders in five years. Right. The auto-coding capabilities are already better than half of all the coders out there, and next year they'll be better than all of them. So why would you go and code? Other questions are, what happens if you have instantaneous translation? Do you want to learn other languages? Maybe for the way it shapes your brain, maybe to understand the culture. But would you spend your time learning another language if you could spend your time doing other things? How do we educate ourselves in the virtual world with AI-powered NPCs that are super realistic? Right. I want to go and learn about ancient Greece: I pop into virtual ancient Greece, and there's Plato and Socrates and Aristotle teaching me and having conversations that I jump into and debate with them and look around, and it's like, that's an amazing experience, right? So, all right, when I say, what do they need to know? First of all, the most important thing is for them to find their passion, which is something completely independent of technology. Like, what do you care about on the planet? Because, and this is what I teach to my Singularity and Abundance360 students, find your massive transformative purpose. Why are you on this planet? What is it that wakes you up in the morning and keeps you going at night? That is the most important driver for you. And if you know that, every time new technology comes online, you'll use that technology in service of your purpose. But it should be a purpose that can last you for a decade; you can add purposes in life as you go along. My earliest purpose was making humanity multiplanetary. Then it was using exponential tech to solve the world's biggest problems. And then the last decade it's really been on extending the human health span. But I still retain my space and my other, you know, MTPs in that regard. So finding your purpose is really critical. The second thing is learning to ask great questions. I think that's great for a CEO and for a kid. And then grit, you know, the idea of, if something is worth doing, you know, keep on going. So those are three things that I really want my kids to have, but they're independent of the technology. But I do think our schools aren't teaching the right things. And I've got a few blogs I've written about, you know, the future of education. You know, for example, the skills I've learned have nothing to do with what I learned as an undergrad at MIT or as a medical student at Harvard. It was really how to be a good leader, how to convey my ideas in a clear and passionate fashion, right, how to organize, how to write a compelling thesis and communicate it. You know, I'm not using, you know, algebra and geometry that often. I mean, I'm glad I learned them. You know, I haven't had to, you know, sort of simplify a multivariable equation in a while. 
So the question is, are we teaching the skills that are going to be important for a world in which AI is all about us and we're talking to everything and communicating with everything, and things are smart everyplace? What does that world look like, and what are the skills that we need to teach our kids? Ultimately, still, it's about being a good human. It's about being compassionate, being empathic, being a good leader, contributing to society, finding problems and solving problems. It's just that the tools become more powerful. 

Steven Parton [00:45:32] Yeah. Well, in terms of the tools that are becoming more powerful, as we kind of come to a close here, I'm wondering, what are the next two technologies or so that you think are going to have a moment in the near future, the way AI hit us and came out of nowhere, and all of a sudden everyone knew about a technology that no one knew about a month before? 

Peter Diamandis [00:45:58] Amazing. I mean, it's incredible. 

Steven Parton [00:46:00] Do you think we're going to see it happen in any other areas in the near future? 

Peter Diamandis [00:46:04] I think there are a few areas. You know, the two areas I'm most excited about and spending the most time on are the, so to speak, general AI world and the biotech and longevity world. Right. And I think adding decades onto the human health span is critically important. You know, there's no greater wealth than our health. And I think that we're going to begin to realize that lifespan and health span are somewhat malleable, that aging is a disease that can be slowed, stopped, perhaps reversed. And that's going to have huge implications. And then the other area is quantum technologies: not just quantum computing, but quantum sensing, quantum communications and encryption. Those are just now beginning to come online, and we'll see those begin to make a dent before the end of this decade. And then humanoid robots. I mean, the first cheap multifunction humanoid robot that I can buy, put in the closet, and, while I'm going out, you know, have it service the house and clean the dishes and wash the car and do whatever I want. I mean, there are some predictions that say, you know, in 20 years we'll have more humanoid robots on the planet than humans. And as Elon is thinking, you know, there are hundreds of car companies producing, you know, billions of cars. But what if some company could create billions of humanoid robots that sell for less than a car? That's interesting. So, I mean, those are the things I think about that are the most interesting and disruptive. And of course, you know, we'll see when we truly have the metaverse come online, and this is, you know, when we have an incredibly affordable and usable set of VR capabilities. I've been building a friendship with Palmer Luckey, the founder of Oculus. We announced a $1 million wildfire XPRIZE to detect and extinguish a wildfire within 10 minutes, and Palmer was the first one to register for that competition as a team. And so we're talking about this, and he thinks we're going to have, you know, Ready Player One-level capabilities in the not too distant future. So partner that with AI and, sort of, you know, the Stable Diffusion version of whatever, which is creating hyper-realistic video instantly. Right. That would be amazing. 

Steven Parton [00:49:05] Yeah. So many things we could talk about, Peter, but I want to respect your time. And on that note, I really want to thank you for, you know, taking your time to join us for this special hundredth episode. Before we officially call it, do you have any closing thoughts or sentiments, any last words you'd like to leave the audience with? 

Peter Diamandis [00:49:23] Sure. So, first of all, I'm super proud of the Singularity community, our alumni and the team. And, you know, I think the basic concepts and need for Singularity are stronger than ever before. And I'm excited, as a, you know, executive founder and board member, knowing what the plans are that we have for the growth of the institution. And I think SU is going to be, in its second decade, far more impactful to the world than in its first decade. And I'm excited to be, you know, meeting all the singularitarians out there, if you would, in the years ahead. So I look forward to it. I look forward to really seeing the amazing things that our alumni and our members do with this incredible technology. 

Steven Parton [00:50:19] Yeah, I'm right there with you. Thanks again, Peter. 

Peter Diamandis [00:50:21] Pleasure, Steven. Take care. 

