ValueError: The layer sequential_22 has never been called and thus has no defined output
I am trying to run Grad-CAM and it raises the error above. I am confused as to why, since the Sequential model is clearly being called. I have tried uninstalling TensorFlow, restarting the kernel, and trying other environments. Any help is appreciated.
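One common cause in this situation (an assumption, since the full traceback is not shown) is that a nested Sequential submodel has never been traced, so its layers have no defined symbolic output yet. A minimal sketch of a workaround: call the model once on real input before wiring up the Grad-CAM gradient model. The file name, input shape, and the layer name "conv_out" are placeholders:

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

model = keras.models.load_model("model.h5")   # placeholder path

# Calling the model once defines the symbolic outputs of every
# nested layer, including layers inside an inner Sequential.
dummy = np.zeros((1, 224, 224, 3), dtype="float32")
_ = model(dummy)

# "conv_out" stands in for the actual name of the last conv layer.
grad_model = keras.Model(
    inputs=model.inputs,
    outputs=[model.get_layer("conv_out").output, model.output],
)

with tf.GradientTape() as tape:
    conv_maps, preds = grad_model(dummy)
    class_idx = int(tf.argmax(preds[0]))      # eager mode, so int() is fine
    top_class = preds[:, class_idx]

grads = tape.gradient(top_class, conv_maps)
weights = tf.reduce_mean(grads, axis=(1, 2))  # pool gradients per channel
cam = tf.reduce_sum(conv_maps * weights[:, None, None, :], axis=-1)
```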
Model performance stuck below 50% on this simple dataset
I am a newbie learning Keras, following this tutorial.
Wrapping tf functions in layers: model.fit not working
I’ve been following along with a video series from 2020 that shows how to build a variational autoencoder in Python, and I’ve had to reformat the code in some places to get it to work with the modern version of TensorFlow. I’ve run into an issue where I try to use K.function on a dense layer (called self.mu), but I keep getting an error saying that I can’t pass a Keras tensor into a TensorFlow function. For example, here is some of the problematic code:
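For reference, the usual modern replacement for K.function here is to wrap the computation in a custom Layer so it runs inside the model's graph. A minimal sketch under assumptions (the layer sizes and names below are illustrative, not the tutorial's):

```python
import tensorflow as tf
from tensorflow import keras

class Sampling(keras.layers.Layer):
    """Reparameterisation step: z = mu + exp(0.5 * log_var) * epsilon."""
    def call(self, inputs):
        mu, log_var = inputs
        eps = tf.random.normal(tf.shape(mu))
        return mu + tf.exp(0.5 * log_var) * eps

latent_dim = 2                                # illustrative size
encoder_in = keras.Input(shape=(784,))
h = keras.layers.Dense(256, activation="relu")(encoder_in)
mu = keras.layers.Dense(latent_dim, name="mu")(h)
log_var = keras.layers.Dense(latent_dim, name="log_var")(h)
z = Sampling()([mu, log_var])                 # runs in-graph; no K.function
encoder = keras.Model(encoder_in, [mu, log_var, z], name="encoder")
```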
Why does Keras predict on tensors much faster than on DataFrames?
I notice a huge difference in performance when running model.predict(x) for large datasets (~10 million rows).
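A hedged illustration of the usual workaround: convert the DataFrame to a single contiguous numpy array (or a tf.data pipeline) once, so predict() is not coercing a pandas object internally. `df`, `model`, and the batch size below are placeholders:

```python
import numpy as np
import tensorflow as tf

# Convert once, up front, instead of letting predict() coerce pandas data.
x = np.ascontiguousarray(df.to_numpy(dtype="float32"))
preds = model.predict(x, batch_size=8192)

# Alternatively, stream it through tf.data with prefetching.
ds = tf.data.Dataset.from_tensor_slices(x).batch(8192).prefetch(tf.data.AUTOTUNE)
preds = model.predict(ds)
```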
Why does the loss curve become smooth starting from (TensorFlow 2.16.0rc0, Keras 3.0.0), compared to the older versions (TensorFlow 2.13.1, Keras 2.13.1)?
In a machine learning exercise done with (TensorFlow 2.13.1, Keras 2.13.1) and binary cross-entropy, I obtain the following loss curve for the training and validation parts.
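One way to put the two setups on an equal footing (a sketch, not a diagnosis) is to record the loss value Keras reports after every batch with a small callback; note that the number fit() prints is already a running average within each epoch, which itself smooths the curve. The names below are illustrative:

```python
import tensorflow as tf

class BatchLossLogger(tf.keras.callbacks.Callback):
    """Collects the loss Keras reports after each training batch.

    Note: this reported value is a running average over the current
    epoch, not the raw per-batch loss.
    """
    def on_train_begin(self, logs=None):
        self.batch_losses = []

    def on_train_batch_end(self, batch, logs=None):
        self.batch_losses.append(logs["loss"])

logger = BatchLossLogger()
# model.fit(x_train, y_train, validation_data=(x_val, y_val),
#           epochs=10, callbacks=[logger])
# plt.plot(logger.batch_losses)   # compare across the two version pairs
```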
Comparing transfer learning styles for creating a model
I would like to ask a question about one thing: for instance, a common form for transfer learning is this:
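Presumably the form in question is something like the following sketch (an assumption, since the original snippet is not shown; the ResNet50 backbone, input shape, and class count are all placeholders): freeze a pretrained base and train a new head on top.

```python
import tensorflow as tf
from tensorflow import keras

base = keras.applications.ResNet50(
    include_top=False, weights="imagenet",
    input_shape=(224, 224, 3), pooling="avg",
)
base.trainable = False                      # feature-extraction phase

inputs = keras.Input(shape=(224, 224, 3))
x = base(inputs, training=False)            # keep BatchNorm in inference mode
outputs = keras.layers.Dense(10, activation="softmax")(x)
model = keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```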
Keras/TensorFlow: How to avoid the “OperatorNotAllowedInGraphError: Iterating over a symbolic `tf.Tensor` is not allowed” error
I’m using Keras 3/TensorFlow 2.16.1 and would like to implement this example (https://keras.io/examples/vision/semisupervised_simclr/) for semi-supervised image classification using contrastive pretraining, but on 3D data instead of 2D images.
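This error usually means a Python-level `for x in tensor:` loop (or tuple unpacking of a tensor) is running inside a tf.function-compiled step, where symbolic tensors cannot be iterated. A minimal sketch of the graph-safe alternative, with made-up shapes for the 3D case:

```python
import tensorflow as tf

# A Python for-loop over a symbolic tensor fails at trace time; instead,
# map a graph-safe function over the leading axis with tf.map_fn
# (or use tf.unstack when the leading dimension is statically known).
@tf.function
def per_volume_means(volumes):              # volumes: (batch, d, h, w, c)
    return tf.map_fn(tf.reduce_mean, volumes)

x = tf.random.normal((4, 16, 16, 16, 1))
print(per_volume_means(x))                  # one mean per 3D volume
```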
Load InceptionResnetV2 CBAM model
I have trained InceptionResnetV2 using CBAM. Here's the CBAM I am using:
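As a hedged aside on the loading step (the module path, file name, and class name below are stand-ins for whatever the training script actually defined): a saved model containing custom layers such as a CBAM block must have those classes registered when it is loaded.

```python
from tensorflow import keras

# Assumed location of the custom layer code used at training time.
from my_cbam_module import CBAM

model = keras.models.load_model(
    "inception_resnet_v2_cbam.h5",          # placeholder file name
    custom_objects={"CBAM": CBAM},          # register every custom class used
)
model.summary()
```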
How to access the tensors between layers in Keras as numpy arrays
I have a Keras model defined using the Functional API. I also have its weights (an .h5 file) and a single example input tensor (stored in a .npy file). I am trying to reimplement this exact model in PyTorch, and I am getting a completely different result. I would like to see where I went wrong by comparing the tensors between each layer, but I can't for the life of me extract those tensors.
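One standard way to do this with a Functional model is to build a probe model whose outputs are every layer's output, then predict once on the example. A sketch, with the file names as placeholders for the .h5 weights and .npy input mentioned above:

```python
import numpy as np
from tensorflow import keras

model = keras.models.load_model("model.h5")
x = np.load("example_input.npy")
if x.ndim == 3:                      # add a batch axis if it is missing
    x = x[None, ...]

probe = keras.Model(
    inputs=model.inputs,
    outputs=[layer.output for layer in model.layers],
)
activations = probe.predict(x)       # list of numpy arrays, one per layer

for layer, act in zip(model.layers, activations):
    print(layer.name, act.shape)     # compare these against the PyTorch side
```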
Can the memmap returned by numpy.load be passed directly to tf.keras.model.fit()?
I’m using these lines to load data from .npy files:
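For reference, a sketch of two options, assuming the usual (features, labels) pair of .npy files (all names and the float32 dtype below are placeholders): a memmap is a numpy array subclass, so fit() accepts it directly, but the default data adapter may copy it into memory; wrapping it in tf.data.Dataset.from_generator keeps the reads lazy.

```python
import numpy as np
import tensorflow as tf

x = np.load("features.npy", mmap_mode="r")   # placeholder file names
y = np.load("labels.npy", mmap_mode="r")

# Option 1: pass the memmaps straight to fit(); works, but the data may
# be materialized in memory when Keras builds its internal dataset.
# model.fit(x, y, batch_size=256, epochs=10)

# Option 2: a generator keeps reads lazy and off-memory until batching.
def gen():
    for i in range(len(x)):
        yield x[i], y[i]

ds = tf.data.Dataset.from_generator(
    gen,
    output_signature=(
        tf.TensorSpec(shape=x.shape[1:], dtype=tf.float32),
        tf.TensorSpec(shape=y.shape[1:], dtype=tf.float32),
    ),
).batch(256).prefetch(tf.data.AUTOTUNE)
# model.fit(ds, epochs=10)
```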