
I am trying to train a CNN with two input branches (b1, b2), which are to be merged into a densely connected layer of 256 neurons with a dropout rate of 0.25. This is what I have so far:

import keras
from keras.models import Sequential
from keras.layers import Conv1D, MaxPooling1D, Dropout, Flatten, Dense
from keras.layers import concatenate

batch_size, epochs = 32, 3
ksize = 2
l2_lambda = 0.0001


### My first model (b1)
b1 = Sequential()
b1.add(Conv1D(128*2, kernel_size=ksize,
              activation='relu',
              input_shape=(xtest.shape[1], xtest.shape[2]),
              kernel_regularizer=keras.regularizers.l2(l2_lambda)))
b1.add(Conv1D(128*2, kernel_size=ksize, activation='relu',
              kernel_regularizer=keras.regularizers.l2(l2_lambda)))
b1.add(MaxPooling1D(pool_size=ksize))
b1.add(Dropout(0.2))

b1.add(Conv1D(128*2, kernel_size=ksize, activation='relu',
              kernel_regularizer=keras.regularizers.l2(l2_lambda)))
b1.add(MaxPooling1D(pool_size=ksize))
b1.add(Dropout(0.2))

b1.add(Flatten())

### My second model (b2)

b2 = Sequential()
b2.add(Dense(64, input_shape=(5000,), activation='relu',
             kernel_regularizer=keras.regularizers.l2(l2_lambda)))
b2.add(Dropout(0.1))


### Merging the two models
model = Sequential()
model.add(concatenate([b1, b2], axis=-1))
model.add(Dense(256, activation='relu', kernel_initializer='normal',
                kernel_regularizer=keras.regularizers.l2(l2_lambda)))
model.add(Dropout(0.25))
model.add(Dense(num_classes, activation='softmax'))

But when I concatenate, I get the following error:

(screenshot of the error traceback)

I first tried using the following command:

  model.add(Merge([b1, b2], mode = 'concat'))

But that gave me ImportError: cannot import name 'Merge'. I am using Keras 2.2.2 and Python 3.6.

  • Yes, I tried that too but I got the same error as before. And I am not sure how to change it to functional API. I am new to Keras and machine learning. Commented Sep 14, 2018 at 13:49
  • Aha! Use b1.output and b2.output instead of b1 and b2. Commented Sep 14, 2018 at 14:11
  • I just tried it and I got the following error now: TypeError: The added layer must be an instance of class Layer. Found: Tensor("concatenate_1/concat:0", shape=(?, ?), dtype=float32) Commented Sep 14, 2018 at 14:39
  • I'm not sure if this will help. But xtest.shape[1] is 5000 and xtest.shape[2] is 208. Commented Sep 14, 2018 at 14:41
  • Sorry! My initial comment was a bit wrong (as a result I deleted it to prevent further confusion). Please take a look at my answer. Commented Sep 14, 2018 at 15:07

1 Answer


You need to use the functional API to achieve what you are looking for. You can use either the Concatenate layer or its functional equivalent, concatenate:

concat = Concatenate(axis=-1)([b1.output, b2.output])
# or you can use the functional api as follows:
#concat = concatenate([b1.output, b2.output], axis=-1)

x = Dense(256, activation='relu', kernel_initializer='normal',
          kernel_regularizer=keras.regularizers.l2(l2_lambda))(concat)
x = Dropout(0.25)(x)
output = Dense(num_classes, activation='softmax')(x)

model = Model([b1.input, b2.input], [output])

Note that I have only converted the last part of your model to functional form. You can do the same for the other two models, b1 and b2 (in fact, what you are defining is a single model consisting of two branches that are merged together). At the end, use model.summary() to inspect and double-check the architecture of the model.
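For completeness, here is a sketch of what the whole two-branch model could look like when written entirely in the functional API. The shapes are taken from the comments above (xtest.shape[1] is 5000 and xtest.shape[2] is 208); num_classes is set to 10 here purely as a placeholder, so substitute your real value:

```python
import keras
from keras.models import Model
from keras.layers import (Input, Conv1D, MaxPooling1D, Dropout,
                          Flatten, Dense, Concatenate)

l2_lambda = 0.0001
ksize = 2
num_classes = 10  # placeholder; use your actual number of classes

# Branch 1: the convolutional branch (same layers as your b1)
in1 = Input(shape=(5000, 208))
x1 = Conv1D(128*2, kernel_size=ksize, activation='relu',
            kernel_regularizer=keras.regularizers.l2(l2_lambda))(in1)
x1 = Conv1D(128*2, kernel_size=ksize, activation='relu',
            kernel_regularizer=keras.regularizers.l2(l2_lambda))(x1)
x1 = MaxPooling1D(pool_size=ksize)(x1)
x1 = Dropout(0.2)(x1)
x1 = Conv1D(128*2, kernel_size=ksize, activation='relu',
            kernel_regularizer=keras.regularizers.l2(l2_lambda))(x1)
x1 = MaxPooling1D(pool_size=ksize)(x1)
x1 = Dropout(0.2)(x1)
x1 = Flatten()(x1)

# Branch 2: the dense branch (same layers as your b2)
in2 = Input(shape=(5000,))
x2 = Dense(64, activation='relu',
           kernel_regularizer=keras.regularizers.l2(l2_lambda))(in2)
x2 = Dropout(0.1)(x2)

# Merge the two branches and classify
merged = Concatenate(axis=-1)([x1, x2])
x = Dense(256, activation='relu', kernel_initializer='normal',
          kernel_regularizer=keras.regularizers.l2(l2_lambda))(merged)
x = Dropout(0.25)(x)
out = Dense(num_classes, activation='softmax')(x)

model = Model(inputs=[in1, in2], outputs=out)
model.summary()
```

When training, pass the two inputs as a list, e.g. model.fit([xtest_branch1, xtest_branch2], y, ...), in the same order as the inputs given to Model.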


1 Comment

Thank you. My network works fine now. You saved the day for me :)
