Acceleration of word2vec using GPUs

Seulki Bae, Youngmin Yi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review



Word2vec is a widely used word embedding toolkit that generates word vectors by training on an input corpus. Since word vectors can represent an exponential number of word clusters and enable reasoning about words with simple algebraic operations, they have become a widely used representation for subsequent NLP tasks. In this paper, we present an efficient parallelization of word2vec using GPUs that preserves accuracy. With two K20 GPUs, the proposed acceleration technique achieves 1.7M words/sec, a speedup of about 20× over a single-threaded CPU execution.
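The "reasoning with simple algebraic operations" mentioned in the abstract refers to word analogies solved by vector arithmetic and cosine similarity. A minimal sketch with hypothetical toy vectors (real word2vec embeddings are learned from a corpus; these values are made up for illustration):

```python
import numpy as np

# Hypothetical toy word vectors; real word2vec vectors are learned from a corpus.
vecs = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Analogy "king - man + woman ≈ ?" solved by vector algebra:
# take the nearest vocabulary word (excluding the query words) to the result.
target = vecs["king"] - vecs["man"] + vecs["woman"]
best = max((w for w in vecs if w not in {"king", "man", "woman"}),
           key=lambda w: cosine(vecs[w], target))
print(best)  # → queen
```

In a trained model the same arithmetic over high-dimensional learned vectors recovers semantic analogies, which is the property the paper's GPU parallelization must preserve.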

Original language: English
Title of host publication: Neural Information Processing - 23rd International Conference, ICONIP 2016, Proceedings
Editors: Seiichi Ozawa, Kazushi Ikeda, Derong Liu, Akira Hirose, Kenji Doya, Minho Lee
Publisher: Springer Verlag
Number of pages: 11
ISBN (Print): 9783319466712
State: Published - 2016
Event: 23rd International Conference on Neural Information Processing, ICONIP 2016 - Kyoto, Japan
Duration: 16 Oct 2016 - 21 Oct 2016

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9948 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: 23rd International Conference on Neural Information Processing, ICONIP 2016


Keywords

  • CUDA
  • Machine learning
  • Natural language processing
  • Neural network
  • Word embedding
  • Word2vec


