lasagne, word-embedding

How to give a fixed embedding matrix to EmbeddingLayer in Lasagne?


I have implemented a deep learning architecture that uses Lasagne's EmbeddingLayer.

I already have word vectors learned with word2vec, and I do not want them to be trainable parameters of my network.

After reading the documentation, I understand that the numpy array provided as the 'W' parameter is only the initial value of the embedding matrix.

How can I declare the EmbeddingLayer in my code so that it uses the given weight matrix as a fixed matrix of word vectors?
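
For concreteness, a minimal sketch of the kind of setup in question (the sizes and the name W_word2vec are illustrative, not from the original post):

    import numpy as np
    import theano.tensor as T
    import lasagne

    vocab_size, embedding_dim = 10000, 300  # hypothetical sizes
    # stand-in for the learned word2vec matrix, shape (vocab_size, embedding_dim)
    W_word2vec = np.random.rand(vocab_size, embedding_dim).astype('float32')

    l_in = lasagne.layers.InputLayer(shape=(None, None),
                                     input_var=T.imatrix('tokens'))
    # W here is only the *initial* value; by default it remains trainable
    l_emb = lasagne.layers.EmbeddingLayer(l_in, input_size=vocab_size,
                                          output_size=embedding_dim,
                                          W=W_word2vec)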


Solution

  • This can be solved by passing the 'trainable=False' tag for the weight parameter of a custom layer defined to work as the embedding layer, as in the sketch below.
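
A minimal sketch of that approach, assuming a custom layer that mirrors Lasagne's built-in EmbeddingLayer (the name FixedEmbeddingLayer is illustrative):

    import lasagne

    class FixedEmbeddingLayer(lasagne.layers.Layer):
        """Embedding lookup whose weight matrix is excluded from training."""
        def __init__(self, incoming, input_size, output_size, W, **kwargs):
            super(FixedEmbeddingLayer, self).__init__(incoming, **kwargs)
            self.output_size = output_size
            # Passing trainable=False leaves the 'trainable' tag off W, so
            # lasagne.layers.get_all_params(net, trainable=True) skips it and
            # the update rules never touch the embedding matrix.
            self.W = self.add_param(W, (input_size, output_size),
                                    name='W', trainable=False)

        def get_output_shape_for(self, input_shape):
            return input_shape + (self.output_size,)

        def get_output_for(self, input, **kwargs):
            return self.W[input]

Equivalently, the stock EmbeddingLayer can be kept and its tag removed after construction:

    l_emb.params[l_emb.W].remove('trainable')

Both variants have the same effect: the weight matrix is still used in the network's computation, but it is no longer returned among the trainable parameters, so the optimizer never updates it.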