#DataLeak – Llama 3.1 leaked before official release? A 4chan user claims to have leaked the base model of Meta Llama 3.1 405B a few days (possibly only hours, according to speculation) before the official release of Llama 3.1.
From the details provided, the weights were allegedly obtained from a test repository that was accidentally made public on Hugging Face.
In April 2024, Meta launched Llama 3, its next generation of state-of-the-art open-source large language models. The first two models, Llama 3 8B and Llama 3 70B, set new benchmarks for LLMs of their size. In the three months since, however, several other LLMs have surpassed their performance.
Meta has already revealed that its largest Llama 3 model will have over 400 billion parameters and is still in training. Today, early benchmarks of the upcoming Llama 3.1 8B, 70B, and 405B models leaked on the LocalLLaMA subreddit. The leaked figures suggest that Meta Llama 3.1 405B could outperform the current leader, OpenAI's GPT-4o, on several key AI benchmarks. If confirmed, this would be a significant milestone for the open-source AI community: the first time an open-source model beats the current state-of-the-art closed-source model.
Download:
Size: 764 GiB (~820 GB; GiB counts 2^30 bytes, GB counts 10^9)!
HF link: https://huggingface.co/cloud-district/miqu-2
Magnet: magnet:?xt=urn:btih:c0e342ae5677582f92c52d8019cc32e1f86f1d83&dn=miqu-2&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A80
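If you want to inspect the magnet link before feeding it to a torrent client, its fields (info-hash, display name, tracker) can be pulled out with Python's standard library. This is only a sketch of generic magnet-URI parsing; the `parse_magnet` helper name is hypothetical, and this is not a downloader.

```python
from urllib.parse import urlparse, parse_qs

def parse_magnet(uri: str) -> dict:
    """Extract the info-hash, display name, and tracker list from a magnet URI."""
    params = parse_qs(urlparse(uri).query)
    xt = params.get("xt", [""])[0]          # e.g. "urn:btih:<40-hex-char info-hash>"
    return {
        "info_hash": xt.rsplit(":", 1)[-1],
        "name": params.get("dn", [""])[0],
        "trackers": params.get("tr", []),   # parse_qs already URL-decodes these
    }

# The magnet URI from the post above
magnet = ("magnet:?xt=urn:btih:c0e342ae5677582f92c52d8019cc32e1f86f1d83"
          "&dn=miqu-2&tr=udp%3A%2F%2Ftracker.openbittorrent.com%3A80")
info = parse_magnet(magnet)
print(info["info_hash"])   # c0e342ae5677582f92c52d8019cc32e1f86f1d83
print(info["name"])        # miqu-2
print(info["trackers"][0]) # udp://tracker.openbittorrent.com:80
```

Checking the info-hash and display name this way lets you confirm the torrent matches what the poster claims before committing ~820 GB of disk space to it.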
What do you think?
We'd love to hear your opinion. Leave a comment.