Microsoft accidentally revealed why people don't trust tech companies

Mar 26, 2023 Hi-network.com

Useful in all the right ways? (Image: Anadolu Agency/Getty Images)

Trust.

That elusive notion that humans rush to embrace. At their peril, that is.

When trust is broken, it can be the worst of feelings. When you hold someone up as a superior, reliable form of human and they turn out to be just as putridly porous as the next human, the disappointment can be severe.

What could possibly lead me to such a maudlin musing?

Also: How to use the new Bing (and how it's different from ChatGPT)

Well, I've just caught up with Microsoft and its latest poetic use of words. Or, depending on your view, its twisting of the English language to serve a tortured ideal.

The company recently launched something called Copilot. This is a lump of AI that has apparently been trained for the job of taking the weight off your mind.

It's there to help steer you to your destination. It's there to free you to focus on steering your life. And it's there to help you land on the perfect version of you, the one that does more in order to, I don't know, be more.

Also: Microsoft just launched a Notion AI competitor called Loop

There's one difference, though, between Microsoft's Copilot and, say, an American Airlines co-pilot.

Hark the words of Microsoft VP of Modern Work and Business Applications Jared Spataro: "Sometimes, Copilot will get it right. Other times, it will be usefully wrong, giving you an idea that's not perfect, but still gives you a head start."

I wonder how long it took for someone to land on the concept of "usefully wrong." 

You wouldn't want, say, the steering wheel on your car to be usefully wrong. Any more than you'd want your electrician to be usefully wrong.

Somehow, though, one is supposed to cheer that a piece of AI (hurriedly) slipped into one's most basic business tools can be utterly mistaken.

Also: ChatGPT vs. Bing Chat: Which AI chatbot should you use?

A little like autocorrect, then?

Usefully wrong.

But this really isn't a micro-issue, is it?

The whole tech industry is seemingly built upon the hubris that whatever it does makes the world a better place. Even if, after a few uses and several years, it may do the opposite.

Everything from full self-driving to Facebook has been lauded as the next, greatest coming of an unfathomably glorious, correct future -- until, that is, it's revealed to either be utter hokum or, perhaps worse, something largely counter-productive to humanity.

A pause to consider whether then-Microsoft CEO Steve Ballmer was usefully wrong when he laughed at the first iPhone. Perhaps he was.

Also: The 5 best iPhones

Too often, the impulse toward new -- driven by the impulse toward money -- clouds the impulse to stop, think, and wonder what effect this may all have on humanity.

Humans are pathetically weak. They're impressionable. They readily gravitate toward new toys, in the hope that those toys make their lives better, richer, and more rewarding.

Only later might they discover that those new toys merely served to make their lives a whole lot more frustrating, while making the companies that created the toys a whole lot wealthier.

Of course, all these companies -- Microsoft, too -- claim they're being responsible in the way they create their new offerings.

Wait, didn't Microsoft just lay off its entire AI ethics and society team?

More Microsoft

  • Is Windows 10 too popular for its own good?
  • The best Windows laptop models: Comparing Dell, Samsung, Lenovo, and more
  • Here's why Windows PCs are only going to get more annoying
  • How to downgrade from Windows 11 to Windows 10 (there's a catch)
