The Hidden Costs of AI: How Technology is Reshaping Our Relationships with Stephanie Antonian
This podcast episode features a compelling discussion on the significant impact of AI on family relationships, emphasising the alarming trend of family atomisation in the digital age.
Host Iyabo Oba and guest Stephanie Antonian, founder and CEO of Aestora, delve into the emotional consequences of technology on children and the disconnect it creates within families.
They explore the necessity for parents to regain control over their children's online experiences and the urgent call for a Digital Health Score, a tool designed to help families navigate the emotional landscape of online content.
Antonian shares her extensive research on the relationship between AI and family dynamics, highlighting the importance of community, compassion, and forgiveness in fostering human connections.
The conversation ultimately challenges listeners to rethink the role of technology in their lives and to prioritise nurturing relationships over the allure of digital distractions.
Takeaways:
- The impact of AI on family dynamics is a critical area that requires more research.
- We need to empower parents to protect their children from online harms using AI tools.
- The Digital Health Score aims to provide insights into the psychological effects of online content.
- Social media shapes youth identities, often exposing them to harmful content without parental awareness.
- Emotional well-being and community support are essential for families to thrive in today's digital age.
- AI can be harnessed for positive outcomes by promoting creativity and healthy relationships.
Companies mentioned in this episode:
- Aestora
- DeepMind
- Google X
- NASA
- Accenture
- Meta
Transcript
Hi everyone and welcome to Relationships with AI. I'm your host Iyabo Oba. This podcast explores the real-world impact of AI on human relationships, work, romance, family, politics and more.
We'll hear from thought leaders and disruptors in AI as they share their insights on how AI is shaping society through the lens of human connection. Let's get into this week's episode.
Voiceover:You're listening to WithAI FM.
Iyabo Oba:Welcome, welcome, welcome. How are you doing Stephanie?
Stephanie Antonian:I'm good, thank you so much for having me.
Iyabo Oba:Yeah, well, happy New Year.
Stephanie Antonian:Happy New Year.
Iyabo Oba:This is our first episode of the new year. I hope you've all had great Christmas breaks, and I'm looking forward to kicking off Relationships with AI, part of the WithAI FM network. We've got so many amazing things to discuss in today's episode, as you heard a little earlier. My guest today is Stephanie Antonian, and we'd like to open up the floor for her to introduce herself. So Stephanie, take it away.
Stephanie Antonian:Oh well, hi, thank you for having me. So I'm Steph, I'm the founder and CEO of Aestora which is an AI think tank.
Before that I worked with DeepMind and Google X and Google proper, and in research with NASA. I used to work for Accenture.
I've gone round the corporate world picking up all the shiny Pokémon before I had to just accept defeat and set something up myself, which was very disappointing. But there we go.
Iyabo Oba:Well, those shiny Pokémon are very impressive. You've got quite a glittering career.
I think you've scored many, many high-value points in your collection, in your foraging, so to speak. You and I first met at the Digitally Curious book launch for Andrew Grill.
We just had a really brilliant discussion about the importance of relationships and AI, particularly the impact on the family. So before we start anything, let's get into it.
What's the most significant or important relationship for you?
Stephanie Antonian:Well, I mean there's many but I think the most formative have been with my family. So my parents and my sister who spent her whole life trolling me and keeping me humble. I guess.
Iyabo Oba:She older or younger than you?
Stephanie Antonian:Older. So when I set up Aestora and put up the website, the first email I ever got was from my sister being like, you are so full of rubbish. What is this?
So yeah, my sister and my parents, who have kept me ambitious but are able to pull me right back down, like, in the blink of an eye. Yeah, exactly. Just a total smackdown. Definitely.
Iyabo Oba:Okay. Some WWE references there.
Stephanie Antonian:Okay.
Iyabo Oba:It's okay.
Stephanie Antonian:Not physically, just emotionally.
Iyabo Oba:Okay. Well, I just hope that's all helped to establish you. Oh, well, clearly it has. So it's done some great things for you.
But yes, going back: we first met at Andrew Grill's book launch for Digitally Curious, and we connected over what we were talking about.
I was telling you about my new show and the fact that I'm looking at the way AI impacts relationships, and you came straight at me with a whole body of research you'd put together, that you've penned yourself, particularly fired up around the area of family.
So would you like to share with us and the listeners the extensive research you've done in this particular area, and how you see AI and its impact on the family? Could you expand on that?
Stephanie Antonian:Yes.
So, as I said, I worked across all these different places, and what I found was that we don't really look at AI and humanity, and what makes humans thrive. We don't really look at the issues of forgiveness and compassion, community, relationship, mercy. All these things that are so vital in a family.
Like, I can make jokes about how they bring me back to Earth, but I can still be in community. My ego can only go so far in a family, which means I actually get to enjoy being part of something.
And family is so crucial. And one of the things I thought was strange was that I hadn't really read much about the impact of AI on family.
And given how many AI researchers there are right now, that seemed a bit strange. And I have two nephews, and they're so wonderful.
And I was reading more and more about teenagers who were committing suicide and who were struggling, and looking at what the response to parents was, which seemed so awful to me: just sort of gaslighting and blaming parents, which is so not the answer. And the question of what AI is actually doing to our ability to protect our families and to be in this communion together was really on my mind.
And so I started to research, and then write about, what the implications of today's AI are for the family, and what we could do about it.
And so one of the things I started to write about was how, in the industrial revolution, the family nuclearised. So we used to have big extended families, and then they became much more nuclear so that people could go to factories and work.
And that made the family smaller but also more affluent, with more time together. There were pros and cons. And in this revolution, what we're not paying attention to is how the family is atomising.
So you can be a family of four, you can be the most well-intentioned parents in the world, you've provided a safe house and good food and all that stuff. And your child can be upstairs in their room, and you can't protect them.
The average teenager is now spending five hours a day online, and you have no idea what that is.
And not only do you not know what that is, the government doesn't know what it is and the tech companies don't know what it is. And that's crazy. We're on our own now; we've been split by the screen and we don't know how to come together.
And that was something that was really shocking to me: as we've allowed social media and these use cases of AI into the home and into the schools, we have allowed someone else to come in and shape our children.
Children used to be a representation of their family, sort of, and now it's the Internet. And that was wild to me.
So that's why I started looking at it and writing about it, and that's been a lot of the research: the problem and what we do about it. And I think it's very easy to talk about online harms. It's very easy to talk about child safety and to describe the problem.
But what I don't see people talking about is why we need big tech to solve it.
Iyabo Oba:Right.
Stephanie Antonian:Instead of why they're involved with our kids anyway.
Like, you know, the public conversation about child safety is about what technology companies and governments should do to protect children. But there is nobody more passionate about their children than the parents.
What hasn't been discussed a lot is how we've just forgotten that, just given up on that. And then it's like: okay, my children aren't safe online.
So I need Meta, or I need YouTube or Google, to protect them. But it's like, well, no, that's not very good. That's not the best case there; we're sort of setting our children up for disaster. Yeah, that was the main research there.
Iyabo Oba:That is fascinating. I mean, lots of questions come to mind in response to that.
So firstly, what has been the response to your research once you published it, particularly on your platforms? And if you were ruling the world, how would you try to shape things to make the outcomes different?
So firstly, what's been the response to that piece of research since you produced it?
Stephanie Antonian:Well, the piece of research gives a solution, which is the Digital Health Score, which I'll come to a bit later. So most of the response has been about how we get the Digital Health Score off the ground and up and running.
And typically, on the essays that I write, the response to the solution is: oh yeah, this is so obvious, why aren't people talking about it? And I mean, one of the reasons why I set up Aestora was just that the incentives
of the whole AI industry are such that we can't ask the obvious questions.
Iyabo Oba:Yeah.
Stephanie Antonian:Because they are funded mostly by industry. And, you know, I love industry, I was in industry and stuff like that. Or, you know, there are still think tanks that are also funded with grants from industry, where you already know the answer.
I used to be in those jobs. I already know the answer that I have to write in those papers. And there isn't a lot of just organic thinking.
There's no financial incentive in giving parents more power. But that doesn't mean that's not the right answer. So, yeah, the most common feedback is: wow, this is really obvious.
And then the next piece is, okay, how do we get the Digital Health score?
Iyabo Oba:Yeah.
Stephanie Antonian:And so when I write, I'm also interested in coming up with solutions. So not just: oh, this is awful.
Iyabo Oba:Yes. Ringing.
Stephanie Antonian:Yeah, yeah. But, you know, my hypothesis is that if you can really get to the root of what the actual problem is, and what the actual system challenges are, then you can design solutions, real solutions that can make change, and you can use AI to do that.
So if the real challenge here is that the family has now broken down into too small a piece, and that individuals and families have lost their right to protect their small unit, which is so against human nature and against what we're okay with, then we need a solution for that. And so the Digital Health Score is a very simple solution.
It's just a 1 to 5, green to red score that shows you the likely psychological impact of your online content consumption.
Iyabo Oba:Right.
Stephanie Antonian:And so it's like cinema ratings, food ratings. In the world, whenever we have ambiguity, we just come up with rating systems, so people can see roughly what it is.
So yeah, we've invented that for the Internet. So you can just see in a split second should you be concerned. And you can see that for yourself and you can see that for your child.
That's what we're building right now. It's a Chrome extension, so anybody can download it while we're improving it to get it to the point where we could allow children to use it.
And we're working with the Anna Freud Centre at UCL to validate it. And we're really in this community, collective-validation mode.
So if anybody wants to download it and give feedback, that'd be really helpful.
Iyabo Oba:That's very. That's a great point. Because basically we'll definitely share all of the links to that in the show notes because certainly there'll be.
I'm sure there'll be a great swell of people that would be very interested in being able to contribute.
Stephanie Antonian:Yeah.
Iyabo Oba:So please do. I'll certainly share those. That information and links to everything in the show notes.
But as you were saying, you're collaborating with various bodies as well.
So could you explain, or share some more about that?
Stephanie Antonian:Well, it was very important to do this well, to validate it, and to do it with community. And, you know, we're not incentivised to answer this question properly.
So right now, in this whole debate about online harms and stuff, it's so noisy.
Iyabo Oba:Yeah.
Stephanie Antonian:That you can't really say that it's nonsense when it is. So like, right now,
you have companies like Meta saying that the solution is way too complicated for them to solve, but at the same time parents should have worked it out and should be able to protect their kids. And those two things don't make sense. But we're okay with them both being out there together as an answer.
And we're trying to come up with solutions in this middle bit that doesn't really make sense. So it's noisy. But we don't have the right financial incentives for the tech to make this work in a meaningful and obvious way.
So what we've been doing is working with students at UCL, Imperial, Brighton, Sussex and St Andrews, firstly to create the product and then to test it. Because when it comes to sentiment analysis, we've never really asked the community: what do you think the emotion is in this?
What do you think the impact is? We've never even considered that. I mean, most sentiment analysis is just about how you use people's emotions to get them to buy more.
Iyabo Oba:Right.
Stephanie Antonian:Sentiment analysis is advanced; it's just focused on sales, not on human enlightenment. Like, you know, we talk about AI like it's a rational thing, absolute rationality, but it's actually just emotional exploitation.
That's why AI is used to make you feel worse, in small nudges that nudge you towards worse. That's what social media is. These systems have worked out that if you make humans feel worse and worse, you optimise faster.
So if I show you things that make you feel bad about yourself, you click more, you spend more time online, you stay at home more, you become more depressed, and they make more money. And it's not necessarily conscious, but it's what's out there. It wouldn't be profitable in the same way for social media
and all these use cases of AI to make you feel good about yourself. You're not going to stay at home clicking if you feel great about yourself.
And so that's where sentiment analysis is and has been used: how you get people to buy more, engage with your stuff more, and be more malleable to your influence. So what we've had to do is get a community together to say: okay, this website, we think it's this emotion.
So on the Chrome extension now, because we're still in early development, we say, okay, this is what we think the score is for the Digital Health score. And actually, this is what our AI model thinks that the main emotion of this website is. But do you agree?
And so you can say no, and then you can tell us what you think the actual emotion is.
And that's an industry first: to actually just ask people, the people who are the consumers of the Internet, what emotion is it triggering in you, rather than it being something subtle behind the scenes that you're not allowed to see or know, or being determined by a small group of people in an office.
Iyabo Oba:Well, that is absolutely fascinating. Again, unpacking the whole sort of psychological processes upon which decisions are made.
Marketing decisions are made, sales decisions are made, about the technology we consume, and AI is used to power that process as well. So it's interesting, and it's been enlightening to hear
how you've shone a light on the underbelly, so to speak, of how the tech companies use these specific tools in the different ways you've mentioned to help shape and form society. You could argue it creates some form of groupthink in how you respond.
And if you have younger minds accessing it, then there's potential for sort of increased levels of malleability and dependence upon it.
Just from the things that you've said, if all of these factors are in place, it makes for a very sort of heady cocktail of dependence upon technology.
Stephanie Antonian:And that's absolutely why it's being done. It's not accidental that the family is being atomised; it's that it's more profitable.
And it's just happening so silently that we're not even allowed to discuss it. You have all these big AI researchers and these think tanks that won't even look at AI and family.
Because the most unimaginable, the most unnatural thing we can imagine is a society where your children are taken away from you. I mean, when I was young, and this makes me sound very smart, but I don't know why, I read Plato's Republic.
When I was a teenager, like 18 or something. Young in comparison.
Iyabo Oba:We'll put that in the show notes as well. Do you like to show the intellectual breadth and. And caliber of our. Of our guests?
Stephanie Antonian:Yeah. And basically I was so into it, and I was like, yes, Plato, yes.
Until he starts talking about how you have to take all the kids away and have them trained by the state so that they're loyal. And I thought: oh, you've gone too far now. You've lost me.
Because the idea of somebody saying, right, we're going to take your children away from you and we're going to train them however we want, is so insane. But that's what we've allowed. That's actually what's happening now. Your children are taken away from you.
And it's not even to become nationalistic. It's not even because they're going to wave Union Jacks or say, I'm going to fight for the king, whatever. No one actually knows why.
It's whoever's going to pay the most to target them with whatever they want. It's so insane, and so inhumane, what's happening now, that I wonder if it just doesn't even reach our consciousness.
It's so shocking we can't even process it.
But basically, through technology, organisations have been able to isolate individuals to make them much more malleable, to get them to do whatever they want, and to break down that safety unit of a family. Because for most people, of course not all, but for most people, family is the ultimate safety unit.
Iyabo Oba:Yeah.
Stephanie Antonian:You know, when I've got a problem, the first person I call is my sister. There is safety in that. I don't know if people will ever care about me as much as my parents did.
And that's okay, because that's what happens. But we're breaking that down. And so, as much as you might love your child, you can't do that much.
There are limitations now to what you can do to protect your child. And that is shocking. That is really shocking. And that's what stood out to me with the teenagers who've committed suicide. That is horrific.
And it's also horrific what happened to their parents.
It's also horrific that they got gaslit into being told there were things they could do, when the honest truth is, no, there were always limits to what they could do. And that's the problem to address now. Not "parents, you should be doing this". Like, what's a parent going to do?
Spend five hours just going through what their child did? It's not possible. None of these things are possible.
But the reality of where we are, which is the total breakdown, is too much to even discuss. Yeah.
Iyabo Oba:So just moving the conversation on: how do you use AI in your day-to-day to produce all of this research?
Stephanie Antonian:I don't really use it that much. I mean, obviously there's AI embedded in Gmail and stuff like that that I'm using subconsciously, and that's great.
Occasionally I'll use an LLM to help write an email that I can't be bothered to write; I'll be like, make this more peppy. But I don't really use it otherwise. Obviously our tool, the Digital Health Score, is built with AI. It's an AI solution to an AI problem.
Iyabo Oba:Yes, yeah, indeed.
Stephanie Antonian:Yeah, yeah.
Iyabo Oba:Could you, could you explain or expand a bit more about how the Digital Health Score uses AI technology and how it, how it helps? This is a good use, good AI for good, so to speak.
Stephanie Antonian:Yeah. Okay, so let me start. Like AI is just a tool and it can be anything we want it to be.
So, like I was saying, when I was working in industry, I felt like we were maybe building AI on the wrong paradigm, and so things don't make sense.
Like, we build these huge communication tools, and people are more lonely. We invest loads in health tech, and life expectancy is declining. Is it really working? Things don't seem to make sense. And my theory is that we build on the wrong hypothesis.
You know, we're not actually meant to be hyperproductive individuals. The markets are complex, new, man-made things; they're not natural to us.
And so if we were looking at how you build AI to help humans thrive, how you help humans become more peaceful and more joyous and more loving, we would come up with different answers. And so on this piece it was like: okay, if we take the problem of online harms and child safety, then what?
How could we use AI to help people protect themselves and be more loving and have the best of the Internet? Because social media can also be good. Like yeah, you know, it's like free access to education.
It's very informative, it's fun, it's very creative; there are lots of great things. So it's like: how do we help people get the best and thrive? That was the idea.
And so the way that the Digital Health Score works is that we have a model that conducts AI sentiment analysis and predicts the emotional load. And then we work with experts to set the benchmark for what the average person can cope with in terms of positivity and negativity.
We do a whole load of calculations and then give you a daily, weekly and monthly score, so the same piece of content can get totally different scores. Because what matters is what's happening in the day, in the week, in the month, across your online life.
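[Editor's sketch: the mechanism Stephanie describes, per-page sentiment mapped to an emotional load, benchmarked against expert thresholds, and rolled up into daily, weekly and monthly 1-to-5 scores, could look roughly like the following. Every name, weight and threshold here is hypothetical, for illustration only, and is not Aestora's actual model.]

```python
# Hypothetical sketch of a rolling "digital health" score.
# None of these thresholds or formulas come from Aestora; they only
# illustrate the daily/weekly/monthly scoring idea from the episode.
from collections import deque
from datetime import datetime, timedelta


def emotional_load(sentiment: float) -> float:
    """Map a sentiment in [-1, 1] (negative..positive) to a load in [0, 1]."""
    return (1.0 - sentiment) / 2.0  # more negative content -> higher load


def score_1_to_5(avg_load: float) -> int:
    """Bucket an average load into the 1 (green) .. 5 (red) scale."""
    # Real cut-offs would be set with experts; these are placeholders.
    for threshold, score in [(0.2, 1), (0.4, 2), (0.6, 3), (0.8, 4)]:
        if avg_load <= threshold:
            return score
    return 5


class RollingScore:
    """Average the per-page loads seen inside one time window."""

    def __init__(self, window: timedelta):
        self.window = window
        self.events: deque = deque()  # (timestamp, load) pairs

    def add(self, when: datetime, sentiment: float) -> None:
        """Record one visited page and drop events older than the window."""
        self.events.append((when, emotional_load(sentiment)))
        cutoff = when - self.window
        while self.events and self.events[0][0] < cutoff:
            self.events.popleft()

    def score(self) -> int:
        if not self.events:
            return 1  # no recent browsing -> green
        avg = sum(load for _, load in self.events) / len(self.events)
        return score_1_to_5(avg)
```

One instance per window (a day, a week, a month) fed with the same page events gives the three scores, which is why, as she notes next, the same piece of content can score differently depending on what surrounds it.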
You know, you can look at suicide once. Maybe you're doing a research project or something. But are you looking every day?
Iyabo Oba:Yeah.
Stephanie Antonian:Are you? If we both see a negative news story, do I go on to look at puppies, where it's clear it hasn't impacted me, or am I spiralling as I go?
Like, you know, it's January, so am I asking how do I lose the Christmas weight,
or am I looking at body stuff all the time?
Iyabo Oba:Sure.
Stephanie Antonian:Those are the things that are important, for us to be able to flag ourselves when we're falling down. But also for parents to know: how good or how bad is it?
And also for teenagers and children themselves to know, because the younger you are, the less context you have on what is normal.
So if you start searching "how do I lose weight really quick", and then you start clicking on ads for intermittent fasting, and then ads for a five-day water fast, then you're going to start seeing stuff about anorexia, and it's going to be more and more and more.
But that's what you think is normal because that's all you see.
Iyabo Oba:Yeah.
Stephanie Antonian:So you don't realise how extreme it is. So even for young people themselves, to have something that says: hello, you've fallen down a rabbit hole here and you're in a place of darkness.
That's how we use AI, to just help people find the light, basically.
Iyabo Oba:This sounds like a really excellent use of this particular tool. Like you say, it's a tool like any other, so it can help shape the research you're doing.
But again, it's for the betterment of society in a very tangible way. I suppose my next question is: what next?
What are the next areas for you in your research? What do you want to push forward with regard to technology, or AI specifically, and its positive or negative impact on society? What things are you looking at?
Stephanie Antonian:Well, we've got lots more product ideas. This one digital health score is the first one we brought to life.
And I'm very much interested in getting it over the line, putting it into public consciousness, as in: hey, this makes so much sense.
Iyabo Oba:Yeah.
Stephanie Antonian:And let's come together in community, bring different groups together to really work out how we create this framework, and then find more interesting ways, like gamifying it. Something we're looking at is, for younger people, how do you gamify it so that they stay positive? So that we don't always have to be like: suicide, the dangers. But just: hey, here's a little guide that's going to help you stay in the one-to-three category.
It's going to help you enjoy the best of the Internet, enjoy the best of social media, and reduce this fear, so that we can be freer people.
So there are the products that spin off from an actual framework being cemented, and then there are lots of other fields. I mean, there are not a lot of people actually looking at how you use AI for love. There are a lot of people looking at how you make AI less bad.
Iyabo Oba:Yes.
Stephanie Antonian:But if you just ignore the whole conversation and then just do your own thing of, you know, how. How can life be better for my nephews? Or how could life be better for anybody's kids? That's still actually an open field.
I mean, when I think about my nephews, I. And I think about the world in 10 years and what I wish for them. I will be totally honest. I couldn't care less if there are drones, and I.
I couldn't care less if there's a metaverse. But what I hope for them is that they're happy.
Iyabo Oba:Yeah.
Stephanie Antonian:And they're healthy and, you know, there'll be teenagers then. I hope they'll be happy to see me and.
Iyabo Oba:Yeah.
Stephanie Antonian:That they have opportunity and things like that. And I don't really feel like we're leaning towards that. I feel like it's likely that life will be harder.
There'll be less money, they'll have their heads in a games headset, and I won't really know what to talk about with them, because I won't have any real view of what they're seeing. So, yeah, those are some more things that inspire me.
Okay, there's all this negative stuff and we know it. But what would it look like to build stuff, and use AI, to just make children more smiley and more free and more playful?
Sorry, this is a bit of a ramble, but I spent so much on my career, and I was so driven and so pushy. And I'm very lucky that I have family who, after a certain point, really didn't care about it at all.
And it was like, I'd lost myself trying to be exceptional, but I just really wanted to be part of the family. I wanted to get on with my family and to have a community.
And, you know, the DeepMind stuff, all the big brand stuff, it didn't get me what I wanted. It didn't lead me to a life with more love and more meaning and more compassion and all that stuff. It sort of took me away from it. So I'm back to the smackdown thing.
Sounds aggressive, but it's been very lucky for me to have a family who's like, what are you doing?
Iyabo Oba:Yeah. Like, very basic questions.
Stephanie Antonian:Yeah. Like, what are you doing? What's the point of this? Yeah, exactly. And then it's like: oh gosh, it doesn't mean anything.
But the stuff that does mean something is our relationships and our communities. And that was a big turning point: like, I'm going the wrong way.
Iyabo Oba:Right.
Stephanie Antonian:I'm trying to create all these big things to show you aren't I smart. Like, didn't everybody love me? And actually, I'm just alone. So it was like, okay, forget it. I don't know anything, man. Yeah, teach me the basics.
It's amazing. Yeah. That like 90 of the population seems to know what's up.
Iyabo Oba:Yeah.
Stephanie Antonian:But that's why they're not allowed to speak.
Iyabo Oba:With that in mind, and thinking about the whole concept of relationships and the technology of AI: if you were queen, or ruler of the world for a day, what things would you implement that would help foster that connection? And how would you use the power of AI, or not?
Stephanie Antonian:That was actually quite easy for me, in that I would just hack all of the ads to be affirmations.
Young people are now subconsciously seeing ads and messaging of negativity; I would just flip that to the other side.
Iyabo Oba:Right.
Stephanie Antonian:And give messages of, you know, just subconscious subliminals of I prove myself.
Iyabo Oba:Yeah.
Stephanie Antonian:Worthy. I'm. I'm worth loving. Like, everybody loves me. My family loves me. You know, like Snoop Dogg's Affirmations for kids. Great song. I would.
I would flood the Internet with positive messaging and.
Iyabo Oba:And it would be. And it wouldn't have an issue of being fake news either, because everybody would be receiving it.
Stephanie Antonian:Yeah. And also, I mean, I guess that's what's difficult to truth, but I think everybody is laughable.
Iyabo Oba:Yeah.
Stephanie Antonian:Everybody is worthy. So I'm like, that those things are true to me. And I think we know how we get the best out of humans. This isn't rocket science. It's not complicated.
When humans are loved, they thrive.
Iyabo Oba:Yes.
Stephanie Antonian:When our basic needs are met, we thrive. Like in you.
When I talk about love, people are like, but it's like in Maslow's hierarchy of needs, like, love is the pivotal point that determines like whether you thrive or not. Like, not to mention like religious scriptures and things like that or, or just evidence in sociology.
It's like in every single field we're told love is the critical point, but we don't want to accept it. So, you know, we very much know what it takes to make humans thrive. It's love.
And we know how you help people to learn to love themselves, which is affirmations, care, reaching out, community. But we don't want to do it.
Iyabo Oba:Wow, that's certainly taken us on a deep philosophical journey. That's been really good.
Lots and lots of food for thought in our discussion, hearing your passion and all the different things you've had to say about your interactions with this piece of technology, and how you've seen it both positively and negatively impact society. What do you see in the future?
What future innovations do you see in this space now, and what would you like to see?
Stephanie Antonian:Okay, that's a great question. I think that the future of the AI industry is going to be AI to fix the problems of AI.
I wish it would be more exciting, but AI is actually quite old. I mean, it's 75 years old as an industry.
So you don't need to be a futurist to predict what will happen, because it will just be more of the same, which is more disappointment moving us further away from solving real-world issues like hunger and poverty and climate change. And so the focus, I think, will just be on building tools that can make AI less damaging. So my recommendation to any entrepreneur is: it's AI for AI now.
Stephanie Antonian:Although when I say "an AI for an AI", the world goes blind. But that's, I guess, the path we're on. So how do we reduce the energy AI uses? How do we reduce the harm AI creates? That stuff.
And then I think the real future is past that, because there are really great use cases of AI, like in health tech. There are lots of great things that will come up, I have no doubt. So this isn't an "AI, the whole thing is bad" position.
But I think the incentives are a bit strange. Still, I think the next big wave we'll see is just real human creativity.
So I think we're in this big midst of fear now, and we're all like AI is going to take over everything, and like, as a collective, our whole self-esteem is like really low.
Iyabo Oba:Yeah.
Stephanie Antonian:And we're going down this path of, you know, AI is going to take our jobs, it's going to take the creativity. And actually, because of the incentives of the industry, even where that's happening, it will just be less creative, less interesting. So in film, it's like, okay, yes, you can use AI to write films, but it's going to be Fast and Furious 27.
It's going to be based on what we've seen before.
It's going to stick to what has made money in the past and move that way, and that's good for a while, but then it's really boring.
Iyabo Oba:Yeah.
Stephanie Antonian:So I think there's going to be a huge resurgence of creativity. There's just a magic to human creativity, and we're so thirsty for it, so hungry for it.
And so, if you can keep the faith and keep your self-esteem high enough to be creative, then I think that's where the next big industry is going to be. But I'm like, private equity has bought it all now. No more coffee shops to buy.
Iyabo Oba:Sorry.
Stephanie Antonian:No more restaurants, really. You know, it's all buying and eating itself.
Iyabo Oba:Yeah.
Stephanie Antonian:And we could get bogged down in the fear of that, or just recognise there's a really big opportunity. In five to ten years, people are going to be looking for what's next, what's new, who has come up with ideas and been nurturing ideas that are exciting, that are novel, that are outside the system. And that really could be anything, but it will be predominantly human creativity.
Iyabo Oba:So going back to the source, as it were, a general source.
Stephanie Antonian:Yeah. There's a magic to humans. Like there's a magic to being alive. Like life is incredible.
It's not a burden. It's not awful to write, it's not awful to create art. It's wonderful. And when you see things, there's an energy, there's an intangible. That's what moves you.
It's not just because art is pretty, or what have you. But the problem is humans can only create when they have high enough self-esteem. So that's why I'm like, no, everybody, humanity's great.
Iyabo Oba:But also I think humans often create out of real dire situations.
Stephanie Antonian:That's true, actually.
Iyabo Oba:There's the extremity of something that has caused an innovation to happen, some form of, you know, you've heard that phrase, forged by fire.
So, you know, you've experienced some situation that's so adverse that you have to come out rising like a phoenix from the ashes and create something so beautiful and so distinctly different because of the environment you've been in.
There is a sense that creativity is going to happen because the spirit is always there, pushing forward and making it happen.
Stephanie Antonian:Yeah, that makes me feel more hopeful as well. I'm like, yes, actually, that is how I believe it happens.
And I think we will realise just how important relationships are, with ourselves and with others, because we'll miss it. In fact, we already are missing it, and there is no substitute, and that's okay.
Yeah, we can just enjoy each other then. We can just accept that it didn't work, on some levels it didn't all work, and enjoy relationships again.
Iyabo Oba:Lots of thoughts, lots of tantalising thoughts that I've been left with through our discussion.
So, I just want to say thank you so much, Steph.
It's just been really great to hear about the work that you're doing, and also about the Digital Health Score product that you're creating, amongst the whole suite of other things you've got coming through your pipeline. Please could you share with everybody the best way to contact you? What are your socials? What's your website?
Stephanie Antonian:Yeah, so the best way to contact me is through the Aestora website, www.aestora.com, that's A-E-S-T-O-R-A, and it's me on the other side. So I'll respond. And then I'm on LinkedIn, but other than that, I'm not on.
I don't have any socials. I just realised life is very freeing without them. Not so good for spreading a message, but yeah, that's what it is.
And if you want to know more about Digital Health Score, you can download it from the Chrome store by just searching Digital Health Score, and you can see our spin-out website at digitalhealthscore.io.
Iyabo Oba:Well, that's so good. Great having you on.
Thank you so much for sharing your thoughts, and also about your use of AI in this particular space, and the challenge you've given us all on how it impacts relationships. But thank you again so much for your time, and I look forward to speaking again. Anyway, take care. Bye bye.
Iyabo Oba:Thank you for listening to this week's episode of Relationships with AI. If you haven't already and you like what you heard, click on the subscribe button and leave a review.
Let me know how the discussion made you think and feel. Share this podcast with others and let people know that this is valuable content.
You can find the show on Apple, Spotify, YouTube and all places that you get your podcasts. You can follow on socials too. Find our page on LinkedIn at Relationships with AI and on IG at @WithAIFM. Until the next time.