Llama 3.1 70B

Llama 3.1 70B is a state-of-the-art large language model from Meta, featuring 70 billion parameters and an auto-regressive transformer architecture. It officially supports eight languages and handles context windows of up to 128K tokens. Trained on approximately 15 trillion tokens and aligned using supervised fine-tuning (SFT) and reinforcement learning from human feedback (RLHF), it performs strongly across a wide range of NLP tasks, outperforming many existing models. Designed for both commercial and research applications, it is optimized for dialogue and interactive systems, with an emphasis on safety and helpfulness. The model is available for exploration and implementation on Hugging Face.

Information

Family: Llama 3.1
Released: July 23, 2024
Parameters: 70B
Context: 128K
Variant: base
Expert: None
Repository: HuggingFace
Knowledge Cutoff:
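
The Hugging Face availability noted above can be sketched with the `transformers` library. This is a minimal sketch, not an official usage snippet: it assumes `transformers` and `torch` are installed, that you have accepted Meta's license for the gated `meta-llama/Llama-3.1-70B` repository, and that you have hardware capable of hosting a 70B-parameter model (roughly 140 GB of accelerator memory in bf16, less with quantization).

```python
# Hedged sketch: loading Llama 3.1 70B (base variant) via Hugging Face
# transformers. The repository id below matches the model described in
# this card; license acceptance and authentication are assumed.

MODEL_ID = "meta-llama/Llama-3.1-70B"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Lazily load the model and complete `prompt`.

    Imports are deferred so merely defining this function does not
    require torch or a model download. `device_map="auto"` shards the
    weights across available accelerators.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # halves memory vs. fp32
        device_map="auto",
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    # Base (non-instruct) models do raw text completion, so a
    # continuation-style prompt works better than a chat turn.
    print(generate("The capital of France is"))
```

Because this is the base variant rather than an Instruct model, it is best used for raw text completion; for chat-style interaction, the corresponding Instruct checkpoint with its chat template is the usual choice.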