Meta's Llama 3 is scheduled for release this summer, but a smaller version may be released next week

Meta reportedly could release the next version of its large language model, Llama 3, as early as next week.

According to The Information, a smaller version of Llama 3 will be released early, and the full open-source model will arrive as early as July, positioning it to compete with Claude 3 and GPT-4.

Meta, Instagram's owner, has spent billions building advanced AI systems, including purchasing hundreds of thousands of Nvidia H100 GPUs to train Llama and other models.

Llama 3 is a large language model that will range in size from very small, competing with Claude 3 Haiku and Gemini Nano, to large, full-capability models on the level of GPT-4 and Claude 3 Opus.

Little is known about Llama 3, other than that it is expected to be open source like its predecessor and is likely to be multimodal, able to understand visual as well as textual input.

Llama 3 will likely come in a variety of versions and sizes, ranging from as small as 7 billion parameters to over 100 billion. Still, that would be smaller than the trillion-plus parameters GPT-4 is reported to have.

Llama 3 is also likely to be less cautious than its predecessor, which was criticized for excessive moderation and overly strict guardrails.

Meta released Llama 2 last July, and a summer launch for Llama 3 would most likely keep that release schedule consistent.

Launching a small version of a forthcoming AI model early builds hype about its capabilities; Anthropic's small model Claude 3 Haiku, for example, matches OpenAI's much larger GPT-4 on some tasks.

The AI model space is growing rapidly and competition is intense, including in open source, with new models from Databricks, Mistral, and Stability AI. Smaller models are becoming increasingly valuable to companies because they are cheaper to run, easier to fine-tune, and in some cases can run on local hardware.