LLaMA Weights
Facebook

folder LLaMA (26 files)
  tokenizer_checklist.chk        0.05 kB
  tokenizer.model              499.72 kB
  llama.sh                       1.93 kB
  7B/params.json                 0.10 kB
  7B/consolidated.00.pth        13.48 GB
  7B/checklist.chk               0.10 kB
  65B/params.json                0.10 kB
  65B/consolidated.07.pth       16.32 GB
  65B/consolidated.06.pth       16.32 GB
  65B/consolidated.05.pth       16.32 GB
  65B/consolidated.04.pth       16.32 GB
  65B/consolidated.03.pth       16.32 GB
  65B/consolidated.02.pth       16.32 GB
  65B/consolidated.01.pth       16.32 GB
  65B/consolidated.00.pth       16.32 GB
  65B/checklist.chk              0.48 kB
  30B/params.json                0.10 kB
  30B/consolidated.03.pth       16.27 GB
  30B/consolidated.02.pth       16.27 GB
  30B/consolidated.01.pth       16.27 GB
  30B/consolidated.00.pth       16.27 GB
  30B/checklist.chk              0.26 kB
  13B/params.json                0.10 kB
  13B/consolidated.01.pth       13.02 GB
  13B/consolidated.00.pth       13.02 GB
  13B/checklist.chk              0.15 kB
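Each model directory ships with a checklist.chk file, which appears to be an md5 checksum list for the files in that directory. Assuming standard `md5sum` format (this is an assumption; inspect your checklist.chk to confirm), a downloaded directory can be verified with `(cd 7B && md5sum -c checklist.chk)`. A self-contained sketch of the same workflow using a stand-in file:

```shell
# Build a stand-in "model directory" with one weight file and a checklist
# in md5sum format, then verify it the way a real directory (7B/, 13B/,
# 30B/, 65B/) would be checked after download.
mkdir -p demo
printf 'stand-in weights\n' > demo/consolidated.00.pth
(cd demo && md5sum consolidated.00.pth > checklist.chk && md5sum -c checklist.chk)
# prints "consolidated.00.pth: OK"
```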
Type: Dataset
Tags: chatgpt nlp llama
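The weights are sharded: per the listing above, 7B has a single consolidated.00.pth, 13B has two shards, 30B has four, and 65B has eight (one .pth file per model-parallel rank). A small stdlib-only sketch that maps each model size to the checkpoint filenames you should expect on disk — useful for a quick completeness check before attempting to load anything:

```python
# Shard counts per model size, taken from the file listing above
# (one consolidated.NN.pth shard per model-parallel rank).
SHARDS = {"7B": 1, "13B": 2, "30B": 4, "65B": 8}

def shard_names(size: str) -> list[str]:
    """Expected checkpoint filenames for a given model size."""
    return [f"consolidated.{i:02d}.pth" for i in range(SHARDS[size])]

print(shard_names("13B"))  # ['consolidated.00.pth', 'consolidated.01.pth']
```

Loading a sharded checkpoint then means calling `torch.load` on each shard in rank order and concatenating the parallel-split tensors, as the reference inference code in facebookresearch/llama does.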

Bibtex:
@article{,
  title    = {LLaMA Weights},
  journal  = {},
  author   = {Facebook},
  year     = {},
  url      = {https://github.com/Elyah2035/llama-dl},
  abstract = {https://github.com/facebookresearch/llama
              https://github.com/Elyah2035/llama-dl},
  keywords = {chatgpt nlp llama},
  terms    = {},
  license  = {},
  superseded = {}
}
