LLaMA Weights
Facebook

folder LLaMA (26 files)
  tokenizer_checklist.chk        0.05 kB
  tokenizer.model              499.72 kB
  llama.sh                       1.93 kB
  7B/params.json                 0.10 kB
  7B/consolidated.00.pth        13.48 GB
  7B/checklist.chk               0.10 kB
  13B/params.json                0.10 kB
  13B/consolidated.01.pth       13.02 GB
  13B/consolidated.00.pth       13.02 GB
  13B/checklist.chk              0.15 kB
  30B/params.json                0.10 kB
  30B/consolidated.03.pth       16.27 GB
  30B/consolidated.02.pth       16.27 GB
  30B/consolidated.01.pth       16.27 GB
  30B/consolidated.00.pth       16.27 GB
  30B/checklist.chk              0.26 kB
  65B/params.json                0.10 kB
  65B/consolidated.07.pth       16.32 GB
  65B/consolidated.06.pth       16.32 GB
  65B/consolidated.05.pth       16.32 GB
  65B/consolidated.04.pth       16.32 GB
  65B/consolidated.03.pth       16.32 GB
  65B/consolidated.02.pth       16.32 GB
  65B/consolidated.01.pth       16.32 GB
  65B/consolidated.00.pth       16.32 GB
  65B/checklist.chk              0.48 kB
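Each model directory ships a checklist.chk alongside its weight shards. These appear to be md5sum-style manifests (an assumption based on the official download flow, which verifies shards with `md5sum -c`). A minimal verification sketch, assuming each line has the `<md5hex>  <relative filename>` format:

```python
import hashlib
from pathlib import Path

def verify_checklist(checklist_path):
    """Check files against an md5sum-style manifest.

    Assumed line format: '<md5hex>  <relative filename>'.
    Returns a dict mapping each listed filename to True/False.
    """
    base = Path(checklist_path).parent
    results = {}
    for line in Path(checklist_path).read_text().splitlines():
        if not line.strip():
            continue
        expected, name = line.split(maxsplit=1)
        h = hashlib.md5()
        # Hash in 1 MiB chunks so multi-GB .pth shards fit in memory.
        with open(base / name, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        results[name] = (h.hexdigest() == expected)
    return results
```

For example, `verify_checklist("65B/checklist.chk")` would report whether each of the eight consolidated.*.pth shards downloaded intact before you attempt to load them.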
Type: Dataset

Bibtex:
@article{,
  title    = {LLaMA Weights},
  journal  = {},
  author   = {Facebook},
  year     = {},
  url      = {https://github.com/Elyah2035/llama-dl},
  abstract = {https://github.com/facebookresearch/llama
              https://github.com/Elyah2035/llama-dl},
  keywords = {chatgpt nlp llama},
  terms    = {},
  license  = {},
  superseded = {}
}

Hosted by users
