TensorFlow 'NoneType' object is not subscriptable during fit() #41
Comments
The code you have provided is correct. Since you don't have access to the original data, the code does not run, and I cannot provide the dataset details. Is the error due to the definition of the LSTM layer? I have looked extensively and could not find any other solutions to this error.
I have the same issue, and have created an MWE here:

I've found that iterating over the dataset and fitting on batches manually works, but fitting on the dataset directly does not. However, running eagerly works (with a performance penalty). A rough sketch of all three cases is below.
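A minimal sketch of the three cases, with a placeholder model and dataset standing in for the original MWE:

```python
import tensorflow as tf

# Placeholder model and dataset, purely for illustration
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(None, 8)),  # variable-length sequences
    tf.keras.layers.LSTM(4),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy')

dataset = tf.data.Dataset.from_tensor_slices(
    (tf.random.normal((32, 10, 8)), tf.random.uniform((32, 1)))
).batch(4)

# 1. Iterating over the dataset and fitting on batches manually works:
for x_batch, y_batch in dataset:
    model.train_on_batch(x_batch, y_batch)

# 2. Fitting on the dataset directly is what reportedly fails:
model.fit(dataset, epochs=1)

# 3. Running eagerly works, with a performance penalty:
model.compile(optimizer='adam', loss='binary_crossentropy', run_eagerly=True)
model.fit(dataset, epochs=1)
```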
Is there any progress on this? Do RNNs work on other platforms, or not at all? I am trying to fit a simple model but I am getting this issue on every RNN layer.

Env:

Reading in the time dimension of a spectrogram with shape (time=None, frequencies=257, channels=1):

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Input, Lambda, LSTM, Dense

# Initial layers
model = Sequential()
model.add(Input(shape=(None, 257, 1)))
# Drop the channels dimension -> (batch, time, frequencies)
model.add(Lambda(lambda x: x[:, :, :, 0]))
# Add an RNN layer (LSTM) to process along the time dimension
model.add(LSTM(5))
# Final output layer
model.add(Dense(1, activation='sigmoid'))
# ...
model.fit(train, epochs=1, validation_data=test)
```

Error:

Only works when
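The train and test datasets are not shown in the comment above. Purely for illustration, a hypothetical way to build variable-length spectrogram datasets that exercise this model could look like the following (the shapes, label scheme, and padded batching are assumptions):

```python
import numpy as np
import tensorflow as tf

def spectrogram_generator(n_examples=8):
    # Hypothetical data: spectrograms with a variable number of time frames
    for _ in range(n_examples):
        time_frames = np.random.randint(50, 200)
        spectrogram = np.random.rand(time_frames, 257, 1).astype('float32')
        label = np.float32(np.random.randint(0, 2))
        yield spectrogram, label

output_signature = (
    tf.TensorSpec(shape=(None, 257, 1), dtype=tf.float32),
    tf.TensorSpec(shape=(), dtype=tf.float32),
)
train = tf.data.Dataset.from_generator(
    spectrogram_generator, output_signature=output_signature
).padded_batch(2)
test = tf.data.Dataset.from_generator(
    spectrogram_generator, output_signature=output_signature
).padded_batch(2)

# model is the Sequential model defined above; whether fit() fails
# depends on the platform, as discussed in this thread
model.compile(optimizer='adam', loss='binary_crossentropy')
model.fit(train, epochs=1, validation_data=test)
```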
I am building a TensorFlow model that takes 4 inputs and gives 2 outputs.
I first start with a pd.DataFrame:
Then, I use a generator to create the TensorFlow Dataset:
Here is what one "row" of train_ds looks like:
({"text": [1, 0, 0, ..., 1, 1], "prompt_question": [1, 0, 0, ..., 1, 1], "prompt_title": [1, 0, 0, ..., 1, 1], "prompt_text": [1, 0, 0, ..., 1, 1]}, {"content": 2, "wording": 1}))
Every value is a tensor.
Note that I use a TextVectorization layer from keras:
At this point, train_ds contains no None values.
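The generator and vectorization code are not quoted above, so the following is only a hypothetical sketch of how a dataset with this structure could be built from the DataFrame (here called df) with a TextVectorization layer applied to the text columns; the vocabulary size and sequence length are made-up values:

```python
import tensorflow as tf

TEXT_COLUMNS = ["text", "prompt_question", "prompt_title", "prompt_text"]
TARGET_COLUMNS = ["content", "wording"]

# Hypothetical vectorizer; vocabulary size and sequence length are placeholders
vectorizer = tf.keras.layers.TextVectorization(
    max_tokens=20000, output_sequence_length=256
)
vectorizer.adapt(tf.constant(df["text"].tolist()))  # df is the pd.DataFrame mentioned above

def row_generator():
    # Yield one (inputs, targets) pair of raw values per DataFrame row
    for _, row in df.iterrows():
        inputs = {col: row[col] for col in TEXT_COLUMNS}
        targets = {col: float(row[col]) for col in TARGET_COLUMNS}
        yield inputs, targets

output_signature = (
    {col: tf.TensorSpec(shape=(), dtype=tf.string) for col in TEXT_COLUMNS},
    {col: tf.TensorSpec(shape=(), dtype=tf.float32) for col in TARGET_COLUMNS},
)
train_ds = tf.data.Dataset.from_generator(
    row_generator, output_signature=output_signature
)

# Vectorize the text inputs so every value in the dataset is a tensor of token ids
def vectorize(inputs, targets):
    return {col: vectorizer(inputs[col]) for col in TEXT_COLUMNS}, targets

train_ds = train_ds.batch(32).map(vectorize)
```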
Here is my model:
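The model definition itself is not quoted here; a hypothetical functional-style sketch matching the description (four tokenized text inputs, a shared embedding and Bidirectional LSTM, and two regression heads named content and wording) could look roughly like this, where the layer sizes and the shared-weights choice are assumptions:

```python
from tensorflow import keras

TEXT_COLUMNS = ["text", "prompt_question", "prompt_title", "prompt_text"]

# Hypothetical sizes; the real vocabulary size comes from the TextVectorization layer
VOCAB_SIZE = 20000
SEQUENCE_LENGTH = 256

inputs = {
    col: keras.Input(shape=(SEQUENCE_LENGTH,), name=col, dtype="int64")
    for col in TEXT_COLUMNS
}

# Shared embedding and recurrent encoder applied to each tokenized input
embedding = keras.layers.Embedding(VOCAB_SIZE, 64)
encoder = keras.layers.Bidirectional(keras.layers.LSTM(32))

encoded = []
for col in TEXT_COLUMNS:
    embedded = embedding(inputs[col])
    encoded.append(encoder(embedded))

x = keras.layers.Concatenate()(encoded)
x = keras.layers.Dense(64, activation="relu")(x)

# Two regression heads, matching the output names used in compile() below
content = keras.layers.Dense(1, name="content")(x)
wording = keras.layers.Dense(1, name="wording")(x)

model = keras.Model(inputs=inputs, outputs={"content": content, "wording": wording})
```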
I compile it like this:
```python
model.compile(loss={"content": "mean_squared_error", "wording": "mean_squared_error"}, optimizer="adam")
```
The error occurs when I try to fit the model:
```python
history = model.fit(x=train_ds, batch_size=32, epochs=20, callbacks=[callbacks])
```
Here is the traceback:
It appears that the error arises because of the LSTM layer:

```python
x = keras.layers.Bidirectional(keras.layers.LSTM(32))(embedded)
```
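For reference, the workaround reported earlier in this thread (running eagerly, at a performance cost) would amount to compiling like this:

```python
model.compile(
    loss={"content": "mean_squared_error", "wording": "mean_squared_error"},
    optimizer="adam",
    run_eagerly=True,  # reported above to avoid the failure, with a performance penalty
)
```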