
I am working on a classification problem. My input data is labels, and the expected output is also labels.

Label    Count
1        94481
0        65181
2        60448

I have made X/y pairs by shifting X by one time step; y is converted to a categorical (one-hot) value.

Create X/y pairs:

from pandas import concat
from sklearn.preprocessing import LabelEncoder
from sklearn.model_selection import train_test_split
from tensorflow.keras.utils import to_categorical

df1 = df['Data_Used']
# pair each label with the previous time step's label
df1 = concat([df1, df1.shift(1)], axis=1)
df1.dropna(inplace=True)

   X    Y
   2  1.0
   1  2.0
   1  1.0
   2  1.0
   2  2.0

values = df1.values
encoder = LabelEncoder()
test_labels = to_categorical(encoder.fit_transform(values[:, 1]), num_classes=3)
train_X, test_X, train_y, test_y = train_test_split(values[:, 0], test_labels, test_size=0.30, random_state=42)
print(train_X.shape)
print(train_y.shape)
print(test_X.shape)
print(test_y.shape)

(154076,)
(154076, 3)
(66033,)
(66033, 3)
Converting the inputs to the LSTM input format (samples, timesteps, features):

train_X = train_X.reshape(train_X.shape[0],1,1)
test_X = test_X.reshape(test_X.shape[0],1,1)
# configure network
n_batch = 1
n_epoch = 10
n_neurons = 100

Model architecture:

import tensorflow as tf
from tensorflow.keras import regularizers

tf.keras.backend.clear_session()
model = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(n_neurons, batch_input_shape=(n_batch, train_X.shape[1], train_X.shape[2]), stateful=True),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(100, activation='relu', kernel_regularizer=regularizers.l2(0.0001)),
    tf.keras.layers.Dense(3, activation='softmax')
])
model.summary()
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['acc'])
history = model.fit(train_X, train_y, validation_data=(test_X, test_y),
                    epochs=n_epoch, batch_size=n_batch, verbose=1, shuffle=False)

The validation accuracy is not changing:

Epoch 1/5
154076/154076 [==============================] - 356s 2ms/step - loss: 1.0844 - acc: 0.4269 - val_loss: 1.0814 - val_acc: 0.4310
Epoch 2/5
154076/154076 [==============================] - 354s 2ms/step - loss: 1.0853 - acc: 0.4256 - val_loss: 1.0813 - val_acc: 0.4310
Epoch 3/5
154076/154076 [==============================] - 355s 2ms/step - loss: 1.0861 - acc: 0.4246 - val_loss: 1.0814 - val_acc: 0.4310
Epoch 4/5
154076/154076 [==============================] - 356s 2ms/step - loss: 1.0874 - acc: 0.4228 - val_loss: 1.0825 - val_acc: 0.4310
Epoch 5/5
154076/154076 [==============================] - 353s 2ms/step - loss: 1.0887 - acc: 0.4208 - val_loss: 1.0828 - val_acc: 0.4310
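A quick sanity check on these numbers, using the label counts from the top of the question: the stuck validation accuracy (0.4310) is very close to the share of the majority class, which suggests the model has collapsed to predicting label 1 for everything.

```python
# Sanity check: compare the stuck validation accuracy with the
# majority-class share implied by the label counts in the question.
counts = {1: 94481, 0: 65181, 2: 60448}
total = sum(counts.values())
majority_share = max(counts.values()) / total
print(f"majority-class share: {majority_share:.4f}")  # ~0.4292, close to val_acc 0.4310
```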

What changes can I make to improve the model?

asked Jun 28, 2020 at 12:08
3 Comments:
  • Can you share the part of the code that downloads/loads the values? Also, does increasing num_epochs have any effect? Commented Jun 28, 2020 at 15:59
  • @ankk I have updated the code; even with more epochs, my validation accuracy does not change. Commented Jun 28, 2020 at 16:26
  • Can you share your data? Commented Sep 2, 2020 at 3:01

1 Answer


One possible reason for this could be unbalanced data: ideally you would have roughly the same number of examples per label. If you don't, you can weight the loss per class; in Keras this is the `class_weight` parameter of `model.fit()`.
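As a sketch of the weighting idea (in Keras, per-class weights are usually passed as a `class_weight` dict to `model.fit()`; scikit-learn can compute balanced weights, here reconstructed from the label counts in the question):

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

# Rebuild a label array from the counts in the question (0: 65181, 1: 94481, 2: 60448)
y = np.repeat([0, 1, 2], [65181, 94481, 60448])

# 'balanced' gives each class a weight inversely proportional to its frequency
weights = compute_class_weight('balanced', classes=np.array([0, 1, 2]), y=y)
class_weight = dict(enumerate(weights))
print(class_weight)  # rarer classes get weights > 1, the majority class < 1
```

You would then pass this dict to training, e.g. `model.fit(train_X, train_y, class_weight=class_weight, ...)`.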

Also, I noticed you are using rmsprop as the optimizer. Try the Adam optimizer instead; it is generally a strong default.

If doing all of the above doesn't change anything and the results are the same, remove the intermediate Dense() layers and keep only the final Dense() layer, i.e. just the softmax output layer.
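A minimal sketch of that simplified architecture, assuming the same stateful LSTM and input shape as in the question (batch size 1, one timestep, one feature):

```python
import tensorflow as tf

n_batch, n_neurons = 1, 100

# Same stateful LSTM as in the question, but with only the softmax output layer
model = tf.keras.models.Sequential([
    tf.keras.layers.LSTM(n_neurons, batch_input_shape=(n_batch, 1, 1), stateful=True),
    tf.keras.layers.Dense(3, activation='softmax'),
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['acc'])
```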

This should improve the model. But if it still doesn't change anything, then have a look here

answered Oct 24, 2020 at 6:23
