- cross-posted to:
- meta@lemdro.id
- auai@programming.dev
- technews@radiation.party
Blog post: https://about.fb.com/news/2023/07/llama-2/
Is there any actual information here other than the number 2 being added to the name?
I gather it’s better optimized for Windows, which is a welcome enough improvement imo. Other than that, they just go on about how “safe” it is.
I understand the concern about “AI safety” (at least in theory), but I wouldn’t exactly call it a selling point.
Double the training data, double the trained context (4096 tokens now), a chat-tuned variant, the omission of the 34b model for now (it apparently isn’t “safe” enough), and commercial use is allowed (not that most of the people using LLaMA care about licensing).
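If you want to poke at the chat-tuned variant and the 4k context yourself, here’s a minimal sketch using Hugging Face transformers. The hub id and gated-access setup are assumptions on my part; adjust to whatever Meta/HF actually publish.

```python
# Minimal sketch: load the chat-tuned variant via transformers.
# "meta-llama/Llama-2-7b-chat-hf" is an assumed hub id; 13b/70b would follow the same pattern.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so it fits on a consumer GPU
    device_map="auto",
)

# The doubled context window should show up in the model config.
print(model.config.max_position_embeddings)  # expected: 4096

prompt = "Explain what changed between LLaMA and Llama 2 in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```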
Thanks, that sounds more exciting than just “safety” haha.
There are a few things; two of the biggest are the commercial license and the 4k context.
Nice, thanks for the info!
It also appears on the HF leaderboard now, so you can get a very general idea of how it compares (at least the 70b model): https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard