How to get word vectors from a Keras Embedding layer

Updated: 2023-12-01 23:50:16

You can get the word embeddings by using the get_weights() method of the Embedding layer (i.e. the weights of an Embedding layer are essentially the embedding vectors):

# if you have access to the embedding layer explicitly
embeddings = embedding_layer.get_weights()[0]

# or access the embedding layer through the constructed model
# the first `0` is the position of the Embedding layer in `model.layers`,
# and the trailing `[0]` selects the weight matrix returned by get_weights()
embeddings = model.layers[0].get_weights()[0]

# `embeddings` has a shape of (num_vocab, embedding_dim) 

# `word_to_index` is a mapping (i.e. dict) from words to their index, e.g. `love`: 69
words_embeddings = {w:embeddings[idx] for w, idx in word_to_index.items()}

# now you can use it like this for example
print(words_embeddings['love'])  # possible output: [0.21, 0.56, ..., 0.65, 0.10]
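
For context, here is a minimal, self-contained sketch (not part of the original answer; the model, dummy data, and word_to_index mapping are assumed purely for illustration). It builds a tiny model whose first layer is an Embedding layer, trains it for one epoch, and then extracts the per-word vectors exactly as shown above:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, GlobalAveragePooling1D, Dense

vocab_size = 100      # assumed vocabulary size
embedding_dim = 8     # assumed embedding dimension

# a tiny model whose first layer is the Embedding layer
model = Sequential([
    Embedding(input_dim=vocab_size, output_dim=embedding_dim),
    GlobalAveragePooling1D(),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

# dummy integer sequences and labels, just so the weights get updated once
x = np.random.randint(0, vocab_size, size=(32, 5))
y = np.random.randint(0, 2, size=(32,))
model.fit(x, y, epochs=1, verbose=0)

# the Embedding layer sits at index 0, so its weight matrix is:
embeddings = model.layers[0].get_weights()[0]  # shape: (vocab_size, embedding_dim)

# hypothetical word_to_index mapping; in practice this comes from your tokenizer
word_to_index = {'love': 69, 'hate': 13}
words_embeddings = {w: embeddings[idx] for w, idx in word_to_index.items()}
print(words_embeddings['love'].shape)  # (embedding_dim,) -> here (8,)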