
Groq AI model goes viral and rivals ChatGPT, challenges Elon Musk’s Grok — TradingView News



Groq, the latest artificial intelligence (AI) model to come onto the scene, is taking social media by storm with its response speed and new technology that may dispense with the need for GPUs.

Groq became an overnight sensation after its public benchmark tests went viral on the social media platform X, showing its computation and response speeds outperforming those of the popular AI chatbot ChatGPT.

The first public demo using Groq: a lightning-fast AI Answers Engine.

It writes factual, cited answers with hundreds of words in less than a second.

More than 3/4 of the time is spent searching, not generating!

The LLM runs in a fraction of a second. https://t.co/dVUPyh3XGV https://t.co/mNV78XkoVB

Feb 19, 2024

This is because the team behind Groq developed its own custom application-specific integrated circuit (ASIC) chip for large language models (LLMs), allowing it to generate roughly 500 tokens per second. In comparison, ChatGPT running GPT-3.5, the publicly available version of the model, generates around 40 tokens per second.
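To put those throughput figures in perspective, the quick sketch below estimates how long each system would take to generate a few-hundred-word answer. The ~0.75 words-per-token ratio is a common English-text rule of thumb, not a figure from the article, and the helper function is purely illustrative.

```python
# Rough illustration of why token throughput drives perceived latency.
# Figures from the article: ~500 tokens/s (Groq) vs ~40 tokens/s (GPT-3.5).
# The 0.75 words-per-token ratio is an assumed rule of thumb, not a measurement.

def generation_time(num_words: float, tokens_per_second: float,
                    words_per_token: float = 0.75) -> float:
    """Estimated seconds needed to generate `num_words` of output text."""
    num_tokens = num_words / words_per_token
    return num_tokens / tokens_per_second

answer_words = 300  # a "hundreds of words" answer, as in the demo tweet

groq_seconds = generation_time(answer_words, 500)
gpt35_seconds = generation_time(answer_words, 40)

print(f"Groq:    {groq_seconds:.1f} s")   # 0.8 s
print(f"GPT-3.5: {gpt35_seconds:.1f} s")  # 10.0 s
```

Under these assumptions, a 300-word answer is about 400 tokens, which is well under a second at 500 tokens per second but roughly ten seconds at 40, consistent with the "less than a second" claim in the demo.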

Groq Inc., the developer of the model, claims to have created the first language processing unit (LPU), on which it runs its model instead of the scarce and costly graphics processing units (GPUs) typically used to run AI models.

Wow, that’s a lot of tweets tonight! FAQs responses.

• We’re faster because we designed our chip & systems

• It’s an LPU, Language Processing Unit (not a GPU)

• We use open-source models, but we don’t train them

• We are increasing access capacity weekly, stay tuned

Feb 19, 2024

However, the company behind Groq is not new. It was founded in 2016 and trademarked the name Groq that same year. Last November, when Elon Musk’s own AI model, also called Grok — but spelled with a “k” — was gaining traction, the developers behind the original Groq published a blog post calling out Musk over the choice of name:

“We can see why you might want to adopt our name. You like fast things (rockets, hyperloops, one-letter company names) and our product, the Groq LPU Inference Engine, is the fastest way to run large language models (LLMs) and other generative AI applications. However, we must ask you to please choose another name, and fast.”

Since Groq went viral on social media, neither Musk nor the Grok page on X has commented on the similarity between the names of the two models.

