Dimension must equal, MatMul with input shapes: [?,53],[150,200]

I want to feed two vectors of different dimensions into a Bi-LSTM and then concat the results, but I get the error above.

with tf.variable_scope("bi-lstm"):

cell_bw = LSTMCell(self.hidden_dim)

cell_fw = LSTMCell(self.hidden_dim)

(output_fw_seq_e, output_bw_seq_e), _ = tf.nn.bidirectional_dynamic_rnn(

cell_fw=cell_fw,

cell_bw=cell_bw,

inputs=self.word_embeddings,

sequence_length=self.sequence_lengths,

dtype=tf.float32)

output_e = tf.concat([output_fw_seq_e, output_bw_seq_e], axis=-1)

(output_fw_seq_d, output_bw_seq_d), _ = tf.nn.bidirectional_dynamic_rnn(

cell_fw=cell_fw,

cell_bw=cell_bw,

inputs=self.features_dim,

sequence_length=self.sequence_lengths,

dtype=tf.float32)

output_d = tf.concat([output_fw_seq_d, output_bw_seq_d], axis=-1)

output = tf.concat([output_e, output_d], axis=-1)

output = tf.nn.dropout(output, self.dropout_pl)

self.hidden_dim: 50
self.word_embeddings: 100 dimensions
self.features: 3 dimensions

Please help!
Adding the weight and bias code below:

with tf.variable_scope("proj"):

W = tf.get_variable(name="W",

shape=[2 * self.hidden_dim, self.num_tags],

initializer=tf.contrib.layers.xavier_initializer(),

dtype=tf.float32)

b = tf.get_variable(name="b",

shape=[self.num_tags],

initializer=tf.zeros_initializer(),

dtype=tf.float32)

s = tf.shape(output)

output = tf.reshape(output, [-1, 2 * self.hidden_dim])

pred = tf.matmul(output, W) + b


Answer:

[?,53] and [150,200] cannot be matrix-multiplied: for MatMul, the second dimension of the first matrix must equal the first dimension of the second. So either the output of the previous step has the wrong shape, or the model itself is wired up wrong.
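A likely root cause, going by the dimensions listed in the question: both bidirectional_dynamic_rnn calls share the same cell_fw/cell_bw objects. The first call builds each LSTM kernel for the 100-dimensional embeddings, i.e. shape [100 + 50, 4 * 50] = [150, 200]; the second call then feeds those same cells a 3-dimensional input, whose per-step width is 3 + 50 = 53, which matches the reported [?,53] vs [150,200] MatMul exactly. A minimal sketch of one possible fix is to give each input its own cells and scope (this assumes self.features is the 3-dimensional tensor; the posted code passes self.features_dim):

with tf.variable_scope("bi-lstm-e"):
    # cells for the 100-dim word embeddings
    cell_fw_e = LSTMCell(self.hidden_dim)
    cell_bw_e = LSTMCell(self.hidden_dim)
    (output_fw_seq_e, output_bw_seq_e), _ = tf.nn.bidirectional_dynamic_rnn(
        cell_fw=cell_fw_e, cell_bw=cell_bw_e,
        inputs=self.word_embeddings,
        sequence_length=self.sequence_lengths,
        dtype=tf.float32)
    output_e = tf.concat([output_fw_seq_e, output_bw_seq_e], axis=-1)  # [?, T, 100]

with tf.variable_scope("bi-lstm-d"):
    # fresh cells for the 3-dim features, so a kernel of the right width gets created
    cell_fw_d = LSTMCell(self.hidden_dim)
    cell_bw_d = LSTMCell(self.hidden_dim)
    (output_fw_seq_d, output_bw_seq_d), _ = tf.nn.bidirectional_dynamic_rnn(
        cell_fw=cell_fw_d, cell_bw=cell_bw_d,
        inputs=self.features,  # assumption: the 3-dim feature tensor
        sequence_length=self.sequence_lengths,
        dtype=tf.float32)
    output_d = tf.concat([output_fw_seq_d, output_bw_seq_d], axis=-1)  # [?, T, 100]

output = tf.concat([output_e, output_d], axis=-1)  # [?, T, 200]

Note that output is then 4 * self.hidden_dim = 200 wide, so the projection would also need W of shape [4 * self.hidden_dim, self.num_tags] and a reshape to [-1, 4 * self.hidden_dim], not 2 * self.hidden_dim as in the posted "proj" scope.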

Try generating a diagram of the network structure; the effect looks like this: http://ethereon.github.io/net...
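For a TensorFlow 1.x graph like this one, another way to see where the shapes diverge is TensorBoard's Graph tab; a minimal sketch, run after the model graph is built ("./logdir" is just a placeholder path, not from the original answer):

import tensorflow as tf

# dump the graph definition so TensorBoard can render it and show each op's input shapes
writer = tf.summary.FileWriter("./logdir", tf.get_default_graph())
writer.close()
# then:  tensorboard --logdir ./logdir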
