LLaMA Weights
Facebook

LLaMA/ (26 files)
  tokenizer_checklist.chk        0.05 kB
  tokenizer.model              499.72 kB
  llama.sh                       1.93 kB
  7B/params.json                 0.10 kB
  7B/consolidated.00.pth        13.48 GB
  7B/checklist.chk               0.10 kB
  13B/params.json                0.10 kB
  13B/consolidated.00.pth       13.02 GB
  13B/consolidated.01.pth       13.02 GB
  13B/checklist.chk              0.15 kB
  30B/params.json                0.10 kB
  30B/consolidated.00.pth       16.27 GB
  30B/consolidated.01.pth       16.27 GB
  30B/consolidated.02.pth       16.27 GB
  30B/consolidated.03.pth       16.27 GB
  30B/checklist.chk              0.26 kB
  65B/params.json                0.10 kB
  65B/consolidated.00.pth       16.32 GB
  65B/consolidated.01.pth       16.32 GB
  65B/consolidated.02.pth       16.32 GB
  65B/consolidated.03.pth       16.32 GB
  65B/consolidated.04.pth       16.32 GB
  65B/consolidated.05.pth       16.32 GB
  65B/consolidated.06.pth       16.32 GB
  65B/consolidated.07.pth       16.32 GB
  65B/checklist.chk              0.48 kB
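Each model size ships its weights as one or more consolidated.NN.pth shards: 7B has 1, 13B has 2, 30B has 4, and 65B has 8. A minimal sketch of enumerating the expected shard paths for a given size (the helper name and the size-to-shard-count map are assumptions read off the listing above, not part of the release):

```python
# Shard counts inferred from the file listing above (assumption: one
# consolidated.NN.pth file per model-parallel partition of each model).
SHARDS = {"7B": 1, "13B": 2, "30B": 4, "65B": 8}

def shard_paths(size: str) -> list[str]:
    """Return the expected consolidated.*.pth paths for a model size."""
    return [f"{size}/consolidated.{i:02d}.pth" for i in range(SHARDS[size])]

print(shard_paths("65B"))
```

Each shard would then typically be opened with `torch.load(path, map_location="cpu")` before the partitions are combined for inference.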
Type: Dataset
Tags: chatgpt nlp llama

Bibtex:
@misc{llama_weights,
  title    = {LLaMA Weights},
  author   = {Facebook},
  url      = {https://github.com/Elyah2035/llama-dl},
  note     = {https://github.com/facebookresearch/llama},
  keywords = {chatgpt nlp llama}
}
