Google BLEEDING Staff after DEI Disaster!!!

Posted in: Dr. Steve Turley, News, Patriots


Summary

➡ Google is planning to lay off around 12,000 employees, which is about 6% of its workforce. This comes after the controversial launch of their new AI tool, Gemini, which has been criticized for its perceived bias. The tool, which was intended to promote diversity, has been accused of rewriting history and promoting a leftist agenda. The article also discusses the ethical responsibilities of tech companies and the importance of maintaining objectivity in AI development.
➡ The book “Killer Tech and the Drive to Save Humanity” discusses how technology can both help and harm us. It suggests that we should have control over our digital lives, including our photos and personal data, similar to how we control our cryptocurrency. The book also warns about the potential job loss due to artificial intelligence (AI) and suggests that we should learn to program AI as a future skill. Lastly, it encourages local communities to create tech centers to keep talent local and provide job opportunities.
➡ The article talks about a new book called “Killer Tech and the Drive to Save Humanity” by Mark Stross, which discusses how modern technology affects us. The author encourages readers to buy a copy for themselves and for their friends and family, suggesting it would make a great early Christmas gift.

Transcript

Google, if you can believe it, is bleeding staff. Just weeks ago, the tech giant announced that it was about to lay off upwards of 12,000 employees, the equivalent of 6% of its workforce. And the CEO, Sundar Pichai, has warned that more layoffs are to come in 2024. Now, this, of course, is all happening on the heels of Google’s premiere of their new AI tool, Gemini. You remember that disaster, don’t you? Let’s just say that rollout frankly shocked everyone.

And it turns out that Gemini is little more than a DEI-infused tool that’s earned both the ire and the ridicule of social media users all across the planet. Google’s co-founder, Sergey Brin, tried to just play it off, saying, well, we don’t really know why it leans left in so many cases; it’s certainly not our intention. Yeah, BS. What’s really happening, most likely, is that these companies are using their diversity hires to come up with the most liberal-loving content and technology that they can.

But what really upsets me in the end, gang, is that they’re training this AI technology to rewrite history, to turn AI into a historical revisionist tool that refashions figures like our nation’s founding fathers after their own leftist and frankly, lunatic image. Now, of course, I’m all for advancements in technology, but that involves only those advancements that are actually human affirming, that benefit the flourishing of our humanity rather than threaten it.

But of course, that raises the question so many of us are asking today: how do we faithfully navigate that often treacherous path between what’s technologically promising versus what’s technologically perilous? Well, joining me today is chief technology officer, sponsor, and fellow Turley Talker, my good friend, Mark Stross. Mark is the author of an amazing, brand-new book, Killer Tech and the Drive to Save Humanity, which you can get just by clicking on that link below or going to markstross.com.

Mark. Welcome, my friend. Great to see you again. Thank you, Steve. You know, it is such a privilege to be on your show today. Your show taught me so much about the actual foundation that my book is based on. It’s based on the idea of the mass society turning into the network society, and you’re the gentleman that brought those concepts to my little head and got me started down this rabbit hole. And when I went down the rabbit hole of looking at how technology was having an impact on humanity, what you just talked about is really important.

First of all, I want everyone to appreciate that Google’s AI runs the whole of Google. So the AI is actually really impressive and runs really well. I want to say that again: the Gemini experiment that we saw publicly was not representative of the AI that Google has created. What you saw was the search engine, that is, the way we actually negotiate with, or actually use, an AI. We use it through search.

We create search terms, and then the AI regurgitates something that it comes up with. But what people don’t realize is those search terms have to be translated into the AI’s language. And it was in those translations that the AI was instructed, for example, not to show any whites in American history, only to show all other ethnicities. The DEI was actually introduced at the search level, not inside the AI itself.
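As an illustration of what the guest is describing, here is a toy sketch of a query-translation layer that injects policy text between the user and the model. Everything in it (the function name, the policy string) is hypothetical; this is not Google’s actual pipeline, just the shape of the mechanism being claimed:

```python
def translate_query(user_query: str, policy_instructions: list[str]) -> str:
    """Hypothetical middleware: prepend policy text to the user's query
    before it reaches the model. The guest's claim is that the bias
    entered at this translation step, not inside the model itself."""
    if not policy_instructions:
        # An "honest search": no translation, the query passes through as-is.
        return user_query
    preamble = " ".join(policy_instructions)
    return f"{preamble}\n\nUser request: {user_query}"

# Example: a made-up policy string silently reshapes the request
# before the model ever sees it.
prompt = translate_query(
    "Portrait of an American founding father",
    ["Depict a diverse range of ethnicities in every image."],
)
```

The point of the sketch: the model itself is unchanged; with an empty policy list the query arrives untouched, which is the “honest search” case the guest describes.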

If you were to ask the AI, with an honest search that had no DEI translation in it, to do the search, it would have come out positive and correct. So I want to make that clear. Wow. Well, again, this is why we have you on the show. You’re such an expert in this. I told you, I’ve been thumbing through this book.

It’s so fascinating. It’s so good. I love the action items that you have at the end of each chapter, and I want you to talk about that at some point. But first, I am really curious. You know, you and I have spent some time personally, one on one, chatting into the wee hours, which was quite wonderful. You know your stuff. I’m just curious: what’s your assessment of the impact of this unbelievably, absurdly biased AI coming out of Google on its users? That’s what’s so interesting about Killer Tech.

What’s so interesting about Killer Tech is that you’re fascinated by the effects of technology on us that we may not even be aware of. So I’m curious what you think about that. Well, as a philosopher, I realized that there are really very few people dealing with the philosophy of technology. So we’re dumping a whole new tier of philosophy onto the historic scene. And really, who is out there looking at this and actually documenting what the impact of this technology is philosophically? How does it impact our humanity? That is super important.

You know, when you’re talking about Google and you’re talking about choices and decisions, people at Google made decisions in order to make a woke AI. But did they ever think that, in their attempt to make a biased machine that only spits out information they like, they were going to scare 50% of the population away from using AI? So ultimately they lose, because they lose their jobs, as they did.

So ultimately, when I look at this, the ultimate point here is: if you are actually fair, you find that you actually do good business. You know, it’s interesting: in Vegas, people always assume the slot machines are biased. No, actually, they’re truly random, and the house still always wins, because the payouts are set below the odds. So the truth is you don’t always have to cheat in order to win. You can win without cheating.
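The Vegas point can be checked with a quick sketch. The payout table below is made up for illustration (not real casino odds); it shows how a machine can be perfectly random and still favor the house, because total payouts are set below the wagers taken in:

```python
import random

# Hypothetical payout table: (probability, payout multiplier per $1 wagered).
# Expected return = 0.001*500 + 0.05*5 + 0.15*1 = 0.90, so even a truly
# random machine keeps ~10% for the house on average.
PAYTABLE = [
    (0.001, 500),   # jackpot
    (0.050, 5),
    (0.150, 1),     # push: you get your dollar back
    (0.799, 0),     # losing spin
]

def expected_return(table):
    """Average payout per $1 wagered."""
    return sum(p * m for p, m in table)

def spin(table, rng):
    """One truly random spin drawn from the payout distribution."""
    r = rng.random()
    cumulative = 0.0
    for p, m in table:
        cumulative += p
        if r < cumulative:
            return m
    return table[-1][1]

def simulate(n=500_000, seed=42):
    """Average payout over n honest (unbiased) spins."""
    rng = random.Random(seed)
    return sum(spin(PAYTABLE, rng) for _ in range(n)) / n
```

No cheating is involved anywhere in the spin; the edge lives entirely in the payout table, which is the guest’s point about winning without rigging the randomness.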

Wow, that’s brilliant. It’s so neat you say that, because I’ve always found that liberalism just entails its own futility. And I love how you’re saying, in effect, this kind of DEI nonsense is a sin that entails its own punishment, its own judgment, as it were. We literally have moral laws just like we have physical laws. Pride really does go before the fall, right? Things akin to that.

And I love the way you’re tracing that out. I mean, do you think tech companies have a responsibility not just to prevent these ideological biases from taking hold in their AI in the first place, but to go above and beyond that, to actually assure their customers that they’ve gone to great lengths to be as objective and as accommodating to all as possible with AI? Well, yes, they do have a responsibility, because they sought government protection with Section 230.

So if you seek that protection and you’re not going to be a publisher, you’re going to be a utility, then you cannot exercise editorial bias and become a publisher. And that’s what most of these tech companies have done. You were being published. You didn’t realize it, but you had an editorial board on YouTube that decided to demonetize you. That’s publishing, because they editorially decided your content was not favorable to their audience.

However, let’s take this to the next step. Who nominated those people to be judging you? If your audience judged you, you’d get an A. But their audience judged you, and you got a D. My problem with that is: who are they to judge? Right? I prefer not to judge. Right. Right. It’s so interesting, because you’re getting into the ethical nature of things, which you get into in the book: how these technocrats, in effect, have an ethical responsibility in terms of how they’re using this technology.

Can you flesh that out a bit for us? What role do you play in the development of this type of innovation? In the book I talk about a Digital Bill of Rights. It’s super important to me. The Digital Bill of Rights establishes ownership. We have lost ownership. You probably heard the famous quote from a game developer that said: get used to it, you don’t have to own anything.

I mean, frankly, I’m not going to get used to that, because ownership establishes a bond between human beings and the things they own. We used to own our photo albums. We used to be able to go to our death knowing the photo album was going to be handed down to the family. We never went to our death realizing that all our photographs are owned by a corporation that has an exclusive license on how to use them.

So when you think of the difference between ownership 30 years ago and ownership today in the digital world, corporations are basically saying: abdicate your ownership, abdicate what you own, and also abdicate your thoughts, because we know better. We know what you should own and what you shouldn’t own. For example, PlayStation. This, Steve, is insane. You bought a whole bunch of games and you bought some movies from PlayStation, and then PlayStation does not renew the rights to these movies and these games.

And suddenly you get a letter from PlayStation stating: we’re sorry, but we just lost the rights to 200 titles, and you lose all your titles. So my question is: that user, who should be called a human being instead of a user, did that human being just get told that the game he owns is no longer owned by him, even though he owns it? So in my future, we’re going to have a digital wallet, like in crypto.

And you have a lot of crypto people that come on your show and talk about crypto. Well, they use a general wallet, and that wallet is transparent and available for anyone to look at, so you know what different people have in their crypto wallets. Well, in that same way, we should have a general ledger for everyone’s images and photographs. And you can either have them out there in the big bad world, or you can rescind access.

And you should be able to withdraw those photographs from Meta or from X or from any of the platforms, and those pictures should just come down. Because if we agree that crypto is such a cool way to bring freedom to the idea of fiat currency, the idea that you can own something that the government doesn’t have access to, then couldn’t we use that same technology to bring back ownership? True ownership.
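The general-ledger idea can be sketched as a toy model. This is hypothetical; it is not a real blockchain or any platform’s actual API, just what revocable, owner-first access could look like:

```python
class OwnershipLedger:
    """Toy sketch of the 'general ledger for everyone's images' idea:
    every asset has one human owner, and platform access is an explicit,
    revocable grant instead of a perpetual corporate license.
    (Hypothetical; not a real blockchain or any platform's actual API.)"""

    def __init__(self):
        # asset_id -> {"owner": name, "grants": set of platforms}
        self._assets = {}

    def register(self, asset_id, owner):
        """Record that a human being owns this asset."""
        self._assets[asset_id] = {"owner": owner, "grants": set()}

    def grant(self, asset_id, platform):
        """The owner lets a platform display the asset."""
        self._assets[asset_id]["grants"].add(platform)

    def revoke(self, asset_id, platform):
        """Withdraw the photo: the platform loses access immediately."""
        self._assets[asset_id]["grants"].discard(platform)

    def can_display(self, asset_id, platform):
        """A platform may display an asset only while a grant exists."""
        entry = self._assets.get(asset_id)
        return entry is not None and platform in entry["grants"]
```

Under this model, “rescinding access” is a single revoke call, and the default state is that no platform can display anything the owner hasn’t explicitly shared.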

That’s one of the essential tenets of my book: we need to own stuff again. Own. Right, exactly. An ownership-based economy. The Internet that we have, right, Internet 2.0, is just not an ownership-based economy. We need an Internet 3.0, and the bill of rights, the user’s bill of rights, and so on. Gang, this is the book: Killer Tech and the Drive to Save Humanity. Just click on the link below.

Order your copy. You can go to markstross.com directly to order a copy. What I love about it is that it doesn’t just explore the types of technology infiltrating our lives, and the dangers, and the like. It’s hopeful. You love tech. This is what you do. You’re a CTO, you love tech, and you give people a roadmap on how to take back control. And that seems to be the key. If I may, a theme that comes out of this book is that it’s a matter of control.

Do we control tech, or does tech control us? So tell us a little bit about that. Can you flesh out what you were really trying to convey overall in Killer Tech, in terms of the positive message there? Yes. The positive message is that we’re human beings, and we should have a relationship with technology that actually augments us, not depreciates us. The best way for me to describe what I want is this: I would, for example, like a positive TikTok instead of a negative TikTok.

And it is possible, because China’s version of TikTok is positivity. It actually ferrets out the very best representations of Chinese culture, whereas our version of TikTok ferrets out the very basest parts of our culture. So what I want is a future where technology represents human beings in a civil, human way, and doesn’t take away our humanity or deconstruct us into just zeros and ones. That’s where we are today.

So: a future where our digital lives are respected. A future in which we confront our digital dependency and start to move away from addiction. And you heard what I said, dependency, Steve, because most of us are, let’s face it, a little bit addicted to our devices. Sure. Yeah. And we all need those devices in order to do our work. So I’m not suggesting that we suddenly stop using those devices. But, like with soda: we went from a society that didn’t think sugar water was bad for us to one where we all agree that sugar water should be consumed in moderation, and we have now incorporated sugar water into our lives in a much more holistic way.

We can do that with our phones, we can do that with technology, but it’s going to take people like myself and yourself bringing these topics up, making sure that we have a human discussion about it and don’t alienate people. And I want to make it clear: how can I call any of you guys addicted when I am probably more addicted than you are? Because I use these tools in order to stay current and to talk to you, Steve.

And yet these tools, and this is super important, everyone: these tools today gin you up. “Ginned up” is a term we use in the book to describe when you have too much information, you’re overflowing with information, and you get anxious and you get mad about it, because you want to tell the world about your truth. The only problem is, it’s your truth. And the unbelievable part of our society today is that social media has allowed us to design our own narratives, our own news.

We only see what we like to see, because we’ve designed it that way. The reason that occurs is social media wants to keep your eyeballs on the medium. And the way they do that is they make sure you addict yourself, by making sure you see all the content you want to see. And the algorithms make sure that you spend as much time as you possibly can in your own echo chamber.

Right. Wow. Gang, I’ll just give you a little insight into some of the chapters. I love these titles. Chapter one, Human Productification. Chapter two, Life After Bandwidth. Chapter three, The Hacking of Critical Infrastructure. Chapter five, The Unseen Cost of Technology. Just jumping around here. I like chapter six, Cyber Warfare in Your Kitchen. Chapter eight, The Illusion of Ownership. And I love chapter eleven, Your Digital Hygiene. And then, of course, the final chapter.

Chapter twelve, Bringing Humanity Back. And as I mentioned, at the end of it all you have what you call Tech Action, where you list out action items at the end of each chapter: proactive, positive steps we can all take, parents or professionals, to better harness the human-affirming aspects of technology. Do you want to talk a little bit about the reasoning behind Tech Action? Yes. When I was writing the book, actually, you had something to do with this.

And in some of your podcasts after the 2020 election, you talked about how we had to take action. You kept talking about this, so this is actually on you. And I suddenly realized: I wrote the book, but where was the action? What were we going to do? I mean, you can’t just write a book and talk about all this gloom and doom and not come up with action steps and not give hope.

So what I’m hoping with Killer Tech and the Drive to Save Humanity is that we have given hope. We give you guys a way out, and we give you guys a way to fight back. This is super important. One thing: local communities should have incubators. Steve, I really believe in this. We have too much talent that leaves the home base. They graduate high school and they leave the local community, because the local community has nothing for them.

So one thing I’m advocating in the book is that we create local tech centers, where you take local community programmers and people that could actually do very well in the open market, and try to entice them to open businesses in their local community, and have local towns support them. Believe it or not, there’s enough money in local towns and cities in the United States, with grants and other programs, that we could create successful incubator programs that would get kids off the street, that would actually start to support technology and teach people how to program, because I think that’s a very important skill for the future.

Not just understanding technology, but understanding how to program AI, for example. That’s going to be an incredibly important field. And finally, to really point the audience to why this book is important: AI is going to erode 40% of the jobs that are out there in the world in the next ten years. I mean, 40% of the jobs that you see today will not exist in ten to 25 years.

Another 30% would be gone in 40 to 55 years; they’re predicting 50% of the job market is gone. Wow. We’re talking everyone from factory workers to surgeons. It is that wide, because what AI is doing is allowing us to bring in, for example, garbage collectors that aren’t union workers, so they can pick up the garbage 24/7. Why would you not want that for your city, when you can save money and do better trash pickup? But that also ultimately means you’re going to lose those jobs.

They’re going to be gone. And those are very high-paying jobs today. Surgeons: anybody that does something like surgery, where it relies on a list of known symptoms and a history of symptoms, can be better served by AI, which sees the whole history of every symptom known to man. And if you have a surgeon working on you, would you prefer the AI surgeon that doesn’t come in with a hangover and hasn’t had a bad night, or the surgeon that’s human? Right? Right.

Wow, gang, this is amazing stuff. Killer Tech and the Drive to Save Humanity by Mark Stross. It’s a spectacular book full of insights and actual steps you and I can take to ensure that the emerging technology around us is helping humanity instead of harming it. It’s a wonderful resource written by a fellow patriot, a fellow Turley Talker. So click on that link below to get your copy, or just go straight to markstross.com.

That’s markstross.com. And grab your own copy today. While you’re at it, get one for all the patriots in your life, your friends, your loved ones. Make it an early Christmas. They’ll love you for it. And you’ll love this book: Killer Tech by Mark Stross. Mark, thank you, brother. It’s awesome seeing you again. Let’s do this again real soon. Wow. Thank you, Steve.

What a privilege. Thank you.

See more of Dr. Steve Turley on their Public Channel and the MPN Dr. Steve Turley channel.


Tags

benefits of modern technology, control over digital lives, ethical responsibilities of tech companies, Google AI tool Gemini controversy, Google employee layoffs, job loss due to AI, Killer Tech as Christmas gift, Killer Tech book review, local tech centers for job opportunities, Mark Stross Killer Tech, objectivity in AI development, perceived bias in AI tools, programming AI as a future skill
