Forget ChatGPT - Groq is a new AI platform that beats it with unparalleled computational speed

With custom hardware built specifically to run AI language models, Groq is on a mission to deliver faster AI

Speed matters when using AI. When you are talking to an AI chatbot, you want it to respond in real time; when you ask an AI to compose an email, you want the result in seconds so you can send it and move on to the next task.

Groq (not to be confused with Elon Musk's Grok chatbot, and the company is not too happy about the similar name) specializes in developing high-performance processors and software for AI, machine learning (ML), and high-performance computing applications.

So while the Mountain View-based company does not (at this time) train its own AI language models, it can run models developed by others very fast. Groq uses different hardware than its competitors, and the hardware they use is designed for the software they run, not the other way around.

They have built a chip they call a language processing unit (LPU), designed to handle large language models (LLMs). Other AI tools typically run on graphics processing units (GPUs), which, as the name implies, are optimized for parallel graphics processing.

Even when running chatbots, AI companies have used GPUs because they can perform technical calculations quickly and are generally very efficient. To stick with the chatbot example, an LLM such as GPT-3 (one of the models used by ChatGPT) works by analyzing a prompt and generating text based on a series of predictions about which word should follow the previous one.
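To make that prediction loop concrete, here is a deliberately tiny Python sketch, not Groq's or any real model's code: it generates text one word at a time from a hand-written probability table. A real LLM does the same thing, except a neural network scores tens of thousands of possible tokens at every step.

```python
# Toy illustration of autoregressive text generation: repeatedly predict
# the next word from the words generated so far. Entirely hypothetical data.
NEXT_WORD = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "sat": {"down": 0.9, "<end>": 0.1},
    "dog": {"ran": 0.8, "<end>": 0.2},
    "ran": {"<end>": 1.0},
    "down": {"<end>": 1.0},
}

def generate(prompt: str, max_words: int = 10) -> str:
    words = prompt.lower().split()
    for _ in range(max_words):
        candidates = NEXT_WORD.get(words[-1], {"<end>": 1.0})
        # Pick the single most probable next word (a real LLM samples from
        # a distribution over a huge vocabulary instead).
        next_word = max(candidates, key=candidates.get)
        if next_word == "<end>":
            break
        words.append(next_word)
    return " ".join(words)

print(generate("the"))  # -> "the cat sat down"
```

Each step depends on the one before it, which is why the speed of a single prediction matters so much for how fast the whole answer appears.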

Groq's LPUs are specifically designed to handle sequences of data (think DNA, music, code, and natural language) and, for that workload, perform much better than GPUs. According to the company, users are already running LLMs up to 10 times faster than on GPU-based alternatives using its engine and API.

You can try it out for free in your browser with an ordinary text prompt, without installing any software.

Groq is currently running Llama 2 (created by Meta), Mixtral 8x7B, and Mistral 7B.
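For developers, a call to the API mentioned above might look roughly like the following Python sketch. It assumes Groq's OpenAI-compatible chat completions endpoint, an API key stored in a GROQ_API_KEY environment variable, and a model name matching one of the hosted models listed above; check Groq's documentation for the current URL, model identifiers, and response format.

```python
# Minimal sketch of calling a Groq-hosted model over HTTP (assumptions:
# OpenAI-compatible endpoint, GROQ_API_KEY set, model id still offered).
import os
import requests

response = requests.post(
    "https://api.groq.com/openai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['GROQ_API_KEY']}"},
    json={
        "model": "mixtral-8x7b-32768",  # assumed id for the Mixtral 8x7B model
        "messages": [
            {"role": "user", "content": "Explain what an LPU is in one sentence."}
        ],
    },
    timeout=30,
)
response.raise_for_status()
# Assuming an OpenAI-style response body with a "choices" list.
print(response.json()["choices"][0]["message"]["content"])
```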

Posting on X, Tom Ellis, who works at Groq, said that custom models are in the works, but that for now the company is focused on building out open-source models.
