en_1000_no_stem.tar.gz | 8.63GB
Type: Dataset
Tags: Wikipedia, nlp, word2vec, english, gensim, deeplearning, natural language, wiki
Bibtex:
@misc{idio2015enwiki,
  title    = {Enwiki Word2vec Model, 1000 Dimensions},
  author   = {Idio},
  year     = {2015},
  url      = {https://github.com/idio/wiki2vec},
  abstract = {Gensim word2vec model built on the English Wikipedia; 1000 dimensions, CBOW with window size 10, no stemming.},
  keywords = {Wikipedia, nlp, word2vec, english, gensim, deeplearning, natural language, wiki}
}
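Since the archive holds a gensim-saved Word2vec model, it can be loaded with gensim's standard Word2Vec.load after extraction. The sketch below is a minimal, non-authoritative example: the inner file name "en.model" is an assumption based on the wiki2vec repository layout, and a model saved with a 2015-era gensim may need an older gensim release (pre-4.0) to load cleanly.

    # Minimal sketch, assuming the extracted archive contains a gensim model
    # file named "en.model" (file name is an assumption, not confirmed here).
    from gensim.models import Word2Vec

    model = Word2Vec.load("en_1000_no_stem/en.model")

    # Query the 1000-dimensional vectors (gensim >= 4.0 keyed-vector API).
    vector = model.wv["king"]                          # numpy array, shape (1000,)
    similar = model.wv.most_similar("king", topn=5)    # nearest neighbours by cosine similarity
    print(similar)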