LLaMA Weights
Facebook

folder LLaMA (26 files)
  tokenizer_checklist.chk        0.05 kB
  tokenizer.model              499.72 kB
  llama.sh                       1.93 kB
  7B/params.json                 0.10 kB
  7B/consolidated.00.pth        13.48 GB
  7B/checklist.chk               0.10 kB
  65B/params.json                0.10 kB
  65B/consolidated.07.pth       16.32 GB
  65B/consolidated.06.pth       16.32 GB
  65B/consolidated.05.pth       16.32 GB
  65B/consolidated.04.pth       16.32 GB
  65B/consolidated.03.pth       16.32 GB
  65B/consolidated.02.pth       16.32 GB
  65B/consolidated.01.pth       16.32 GB
  65B/consolidated.00.pth       16.32 GB
  65B/checklist.chk              0.48 kB
  30B/params.json                0.10 kB
  30B/consolidated.03.pth       16.27 GB
  30B/consolidated.02.pth       16.27 GB
  30B/consolidated.01.pth       16.27 GB
  30B/consolidated.00.pth       16.27 GB
  30B/checklist.chk              0.26 kB
  13B/params.json                0.10 kB
  13B/consolidated.01.pth       13.02 GB
  13B/consolidated.00.pth       13.02 GB
  13B/checklist.chk              0.15 kB
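Each model directory ships a checklist.chk next to its checkpoint shards. In the official facebookresearch/llama download script these are md5sum-format manifests (one `<md5hex>  <filename>` pair per line) verified with `md5sum -c`. A minimal Python sketch of that verification step, assuming the md5sum line format; the helper name `verify_checklist` is an illustration, not part of the release scripts:

```python
import hashlib
from pathlib import Path

def verify_checklist(checklist_path):
    """Check files listed in an md5sum-style manifest.

    Assumes each non-empty line reads '<md5hex>  <relative filename>'
    (the format written by the `md5sum` utility) and that the listed
    files live next to the checklist. Returns {filename: bool}.
    """
    checklist = Path(checklist_path)
    base = checklist.parent
    results = {}
    for line in checklist.read_text().splitlines():
        if not line.strip():
            continue
        expected, name = line.split(maxsplit=1)
        name = name.strip()
        # Hash the referenced file and compare against the manifest entry.
        digest = hashlib.md5((base / name).read_bytes()).hexdigest()
        results[name] = (digest == expected.lower())
    return results
```

For a multi-gigabyte .pth shard you would hash in chunks rather than with a single `read_bytes()` call, but the comparison logic is the same.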
Type: Dataset

Bibtex:
@article{,
  title      = {LLaMA Weights},
  journal    = {},
  author     = {Facebook},
  year       = {},
  url        = {https://github.com/Elyah2035/llama-dl},
  abstract   = {https://github.com/facebookresearch/llama
                https://github.com/Elyah2035/llama-dl},
  keywords   = {chatgpt nlp llama},
  terms      = {},
  license    = {},
  superseded = {}
}


