python · huggingface-transformers · allennlp

In AllenNLP, how can we set the "sub_module" argument for PretrainedTransformerMismatchedEmbedder?


As the title says. I want to use BART or T5 as the embedder, and I'd like to use PretrainedTransformerMismatchedEmbedder because I need to perform some token-level operations after embedding. However, I found that the "sub_module" argument, which seems to be required when using encoder-decoder pretrained language models as embedders, is not available in the mismatched embedder. How should we set this argument for the mismatched embedder? Or is there a reason this argument is incompatible with it?
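For reference, this is the kind of usage I have in mind with the regular (non-mismatched) embedder, which does accept sub_module; the model name and the "encoder" value are just illustrative:

```python
from allennlp.modules.token_embedders import PretrainedTransformerEmbedder

# The regular embedder lets you pick a sub-module of an encoder-decoder model,
# e.g. only the encoder stack of BART (model name here is just an example).
embedder = PretrainedTransformerEmbedder(
    model_name="facebook/bart-base",
    sub_module="encoder",
)
```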


Solution

  • You are absolutely right; we should have a sub_module argument for the mismatched embedder as well. If you make a PR to add it, I'll gladly review it and get it merged. (A rough sketch of what the change could involve follows below.)
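A minimal sketch of what such a change could look like, assuming the mismatched embedder keeps wrapping PretrainedTransformerEmbedder internally; the class name and defaults here are illustrative and may differ from the eventual PR:

```python
from allennlp.modules.token_embedders import (
    PretrainedTransformerEmbedder,
    TokenEmbedder,
)


class PatchedMismatchedEmbedder(TokenEmbedder):
    """Illustrative only: forwards a hypothetical sub_module argument to the wrapped embedder."""

    def __init__(
        self,
        model_name: str,
        max_length: int = None,
        sub_module: str = None,  # hypothetical new argument
        train_parameters: bool = True,
    ) -> None:
        super().__init__()
        # The mismatched embedder delegates the actual transformer forward pass
        # to a matched embedder, so the new argument only needs to be passed through.
        self._matched_embedder = PretrainedTransformerEmbedder(
            model_name,
            max_length=max_length,
            sub_module=sub_module,
            train_parameters=train_parameters,
        )
```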