Literally, **word embeddings** represent the semantics of a word with dense vectors. **Researchers have shown that by comparing the distances between these word vectors, we can approximate how "humans" understand the meanings of words**. For example, given a corpus, if we compare the distance between "taxes" and different social groups ("conservatives", "socialists"), then semantically "taxes" should be closer to "socialists", since the money collected is meant to serve the general public and thus carries a socialist flavor. Word vectors in the embedding space also carry richer information, such as analogies: Spain is to Madrid as Germany is to Berlin and France is to Paris.
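
To make this concrete, here is a minimal sketch of both ideas, distance comparison and analogy, using pretrained vectors loaded through gensim. The choice of the "glove-wiki-gigaword-100" model is an assumption for illustration; any pretrained word-vector set would do.

```python
# Minimal sketch: compare word distances and solve an analogy with pretrained embeddings.
# Assumes gensim is installed; "glove-wiki-gigaword-100" is downloaded on first use.
import gensim.downloader as api

wv = api.load("glove-wiki-gigaword-100")  # illustrative choice of pretrained vectors

# Distance comparison: cosine similarity between "taxes" and two social groups.
print("taxes ~ socialists:   ", wv.similarity("taxes", "socialists"))
print("taxes ~ conservatives:", wv.similarity("taxes", "conservatives"))

# Analogy: Spain is to Madrid as Germany is to ?
# vector("madrid") - vector("spain") + vector("germany") should land near "berlin".
print(wv.most_similar(positive=["madrid", "germany"], negative=["spain"], topn=3))
```

The `most_similar` call implements the classic vector-offset analogy: the top result should be "berlin" if the embedding space has captured the country-to-capital relation.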