
Microsoft was warned that adding GPT-4 into Bing would get weird

If it seems like some companies are needlessly rushing to be the first to implement AI in their products, well, you’re right. It’s happening across the industry: some companies, like Apple, are taking small steps such as using it in the next iPhone keyboard, while others are going all in and putting a GPT-powered chat assistant into basically every product they offer. That has yielded some wild results. While usage of Microsoft’s Bing search tools hasn’t seen a dramatic increase, the chatbot did receive a lot of attention, and that attention revealed that Bing’s AI was quite the manipulator and liar.
It’s been a problem with ChatGPT and other AI chatbots since they first started showing up: they don’t know what they don’t know, so they draw on their database of information to make up something that seems plausible enough and then state it with supreme confidence. But with Bing, that took a bizarre turn, with numerous reports of the chatbot claiming to have fallen in love, claiming to spy on Microsoft employees, and insulting users. It got to the point where Microsoft limited how long an interaction you could have with the Bing AI, since it was almost invariably bound to go off the rails if you started treating it like an actual intelligence you could reason with.
OpenAI made the GPT-4 AI model powering Bing — and they told Microsoft it wasn’t ready yet.
If you tried it and thought that Bing AI wasn’t ready for primetime, you’re not alone. In fact, OpenAI, the company that made the GPT-4 multimodal model powering Bing chat, told Microsoft exactly that. Thanks to a multi-billion-dollar early investment that netted them a 49% stake in OpenAI, Microsoft had an inside track to getting their hands on GPT-4 before it was publicly released, and, as reported by The Wall Street Journal, they were warned by OpenAI that it wasn’t yet ready for such an audacious implementation. But Microsoft didn’t listen and went ahead with its plans. And, well, you saw how bizarre it got.
This is all part of a perceived “rush” to get AI out in public and have your company’s name associated with it. A lot of what Microsoft has tried in recent years to stay relevant hasn’t gone particularly well, though its foundational businesses of Windows, Office, and Xbox remain as solid as ever. But having been partly responsible for the computing revolution that changed modern life, Microsoft sees the potential for AI to be the next big thing that does the same, sweeping the old companies out the door in the process. They don’t want AI to do to them what they did to IBM.
Bing AI chat was just a taste of Microsoft’s AI ambitions. A big part of this will be Microsoft 365 Copilot, an upcoming AI tool that will bring content generation to core Office productivity apps like Word and Excel, as well as a Business Chat AI that can pull together data “from across your documents, presentations, email, calendar, notes, and contacts” to summarize information and generate new content.
But unlike with Bing, where Microsoft ended up with a lot of publicity but also egg on its face as it worked to fix the system live, the company is taking a more cautious approach to 365 Copilot. After all, if Bing crashed and burned, would anybody notice? But if Microsoft completely messes up Excel by integrating AI, that’s something a lot of very important customers would notice and remember. It’s been 25 years since Clippy first graced our screens, and everybody who lived through that headache still groans when the old Office assistant is mentioned.