Meta, Long an A.I. Leader, Tries Not to Be Left Out of the Boom
Meta's Llama 2 family of large language models (LLMs) is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. The models are intended for commercial and research use in English.
The Llama 2 models were pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples.
The Llama 2 models have been evaluated on standard academic benchmarks, and they have shown strong performance on a variety of tasks, including code, commonsense reasoning, world knowledge, reading comprehension, math, and safety.
The Llama 2 models are still under development, and Meta is working to improve their performance and safety. The models are available for free download and use, and Meta encourages the community to contribute to their development.
Here are some of the key features of the Llama 2 models:
They are large-scale LLMs, with up to 70 billion parameters.
They are pretrained on a massive dataset of text and code.
Fine-tuned variants (Llama 2-Chat) are optimized for dialogue use cases using instruction datasets and human-annotated examples.
They are available for free download and use.
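Since the models are free to download and use, the fine-tuned chat variants can be prompted directly once their expected chat template is applied. Below is a minimal sketch in Python of that template (the `[INST]` / `<<SYS>>` tag format described in Meta's llama GitHub repository) for a single-turn prompt; the helper function name and its defaults are illustrative, not part of any official API.

```python
from typing import Optional

# Tag strings from the Llama 2 chat prompt format described in Meta's
# llama repository.
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_prompt(user_message: str, system_message: Optional[str] = None) -> str:
    """Wrap a single-turn user message (and optional system message)
    in Llama 2 chat tags. Illustrative helper, not an official API."""
    content = user_message.strip()
    if system_message:
        # The system message is nested inside the first [INST] block.
        content = f"{B_SYS}{system_message.strip()}{E_SYS}{content}"
    return f"{B_INST} {content} {E_INST}"

print(build_prompt("What is 2 + 2?", system_message="Answer briefly."))
```

The resulting string would then be passed to the tokenizer and model of a Llama 2 chat checkpoint; multi-turn conversations repeat the `[INST] ... [/INST]` blocks with model replies in between.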
If you are interested in learning more about the Llama 2 models, please visit the following resources:
Llama 2 website: https://ai.meta.com/llama/
Llama 2 GitHub repository: https://lnkd.in/diUGNUuT
Llama 2 paper: https://lnkd.in/dh6amv76
Data Science and Advanced Analytics Enthusiast - Graph Builder
Yes, I am really impressed, specifically by the Llama 3 model. Not only is it a quality model, but there are tools available to support building what is needed: https://huggingface.co/dbands. Most of this can be done on free GPU servers. What bugs me is that I do not hear people discussing how this actually outperforms well-established closed models, even before fine-tuning. Well done; in my world this was / is / will be a huge contributor to moving forward.