Collection: LLM Weights — 3 torrents, 343.06GB total
LLaMA (26 files)
File | Size
tokenizer_checklist.chk | 0.05kB
tokenizer.model | 499.72kB
llama.sh | 1.93kB
7B/params.json | 0.10kB
7B/consolidated.00.pth | 13.48GB
7B/checklist.chk | 0.10kB
65B/params.json | 0.10kB
65B/consolidated.07.pth | 16.32GB
65B/consolidated.06.pth | 16.32GB
65B/consolidated.05.pth | 16.32GB
65B/consolidated.04.pth | 16.32GB
65B/consolidated.03.pth | 16.32GB
65B/consolidated.02.pth | 16.32GB
65B/consolidated.01.pth | 16.32GB
65B/consolidated.00.pth | 16.32GB
65B/checklist.chk | 0.48kB
30B/params.json | 0.10kB
30B/consolidated.03.pth | 16.27GB
30B/consolidated.02.pth | 16.27GB
30B/consolidated.01.pth | 16.27GB
30B/consolidated.00.pth | 16.27GB
30B/checklist.chk | 0.26kB
13B/params.json | 0.10kB
13B/consolidated.01.pth | 13.02GB
13B/consolidated.00.pth | 13.02GB
13B/checklist.chk | 0.15kB
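Each model directory follows the same checkpoint layout: params.json holds the architecture hyperparameters, the consolidated.NN.pth files are the weight shards (one for 7B, two for 13B, four for 30B, eight for 65B), and checklist.chk lists checksums for verifying the download. Below is a minimal Python sketch of how such a directory might be inspected, assuming the files above are extracted under a local LLaMA/ directory and that each .pth shard is a PyTorch-saved state dict, as in the reference facebookresearch/llama loader; paths are hypothetical.

```python
import json
from pathlib import Path

import torch

# Hypothetical local path to one of the model directories listed above.
ckpt_dir = Path("LLaMA/7B")

# params.json holds the architecture hyperparameters (dim, n_layers, n_heads, ...).
params = json.loads((ckpt_dir / "params.json").read_text())
print(params)

# consolidated.NN.pth files are the weight shards; 7B ships a single shard,
# 13B two, 30B four, and 65B eight, matching the listing above.
shards = sorted(ckpt_dir.glob("consolidated.*.pth"))
state_dict = torch.load(shards[0], map_location="cpu")
print(f"shard 0: {len(state_dict)} tensors, first key: {next(iter(state_dict))}")
```

The checklist.chk files appear to be standard md5 checksum lists; if so, running `md5sum -c checklist.chk` inside each model directory would confirm that the shards downloaded intact.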
Type: Dataset
Tags: chatgpt nlp llama
Bibtex:
@article{,
  title = {LLaMA Weights},
  journal = {},
  author = {Facebook},
  year = {},
  url = {https://github.com/Elyah2035/llama-dl},
  abstract = {https://github.com/facebookresearch/llama https://github.com/Elyah2035/llama-dl},
  keywords = {chatgpt nlp llama},
  terms = {},
  license = {},
  superseded = {}
}