Tag Archive for python, tensorflow, machine-learning

batch size and epochs in machine learning?

I'm using TensorFlow for image/pattern recognition from stock graphs. I've created a directory of about 20,000 images with examples of patterns that precede sharp increases or decreases in prices. What batch size and how many epochs should I use with an 80/20 training/validation split?
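
A common starting point, sketched below with tf.keras (the directory name and image size are assumptions, not from the question): a batch size of 32-64 and a generous epoch budget with early stopping on validation loss, so the 80/20 split decides when training ends.

    # Illustrative setup only: batch size 32 and early stopping instead of a fixed epoch count.
    # The directory name and image size are placeholders, not from the question.
    import tensorflow as tf

    train_ds = tf.keras.utils.image_dataset_from_directory(
        "stock_patterns/",        # hypothetical directory holding the ~20,000 images
        validation_split=0.2,     # 80/20 training/validation split
        subset="training",
        seed=42,
        image_size=(128, 128),    # assumed image size
        batch_size=32,            # a common default to start from
    )
    val_ds = tf.keras.utils.image_dataset_from_directory(
        "stock_patterns/",
        validation_split=0.2,
        subset="validation",
        seed=42,
        image_size=(128, 128),
        batch_size=32,
    )

    # Allow many epochs and let validation loss decide when to stop.
    early_stop = tf.keras.callbacks.EarlyStopping(
        monitor="val_loss", patience=5, restore_best_weights=True)
    # model.fit(train_ds, validation_data=val_ds, epochs=100, callbacks=[early_stop])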

Vector approximation program malfunction

I created a program with a friend that is supposed to approximate a vector, with the input data being some points from that vector. My friend wrote a line of code that I cannot understand, and it is malfunctioning:

How to change a neural network function from TensorFlow 1.X to TensorFlow 2.X

layer_1 = tf.nn.relu(tf.add(tf.matmul(x, w_1), b_1))
layer_1_b = tf.layers.batch_normalization(layer_1)
layer_2 = tf.nn.relu(tf.add(tf.matmul(layer_1_b, w_2), b_2))
layer_2_b = tf.layers.batch_normalization(layer_2)
layer_3 = tf.nn.relu(tf.add(tf.matmul(layer_2_b, w_3), b_3))
layer_3_b = tf.layers.batch_normalization(layer_3)
y = tf.nn.relu(tf.add(tf.matmul(layer_3, w_4), b_4))
g_q_action = tf.argmax(y, axis=1)

# compute loss
g_target_q_t = tf.placeholder(tf.float32, None, name="target_value")
g_action = tf.placeholder(tf.int32, None, name='g_action')
action_one_hot = tf.one_hot(g_action, n_output, 1.0, 0.0, name='action_one_hot')
q_acted = […]
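
For reference, a hedged sketch of what the same stack might look like in TensorFlow 2.x with tf.keras. The layer widths (n_hidden, n_output) and the q_acted line are assumptions, since the original weight shapes are not shown and the quoted code is truncated.

    # Possible TensorFlow 2.x equivalent (eager mode, no placeholders or sessions).
    import tensorflow as tf

    n_hidden = 64    # assumed hidden width; the original w_1..w_4 shapes are not given
    n_output = 4     # assumed number of actions

    # Dense + BatchNormalization replaces the manual matmul/add pairs and
    # tf.layers.batch_normalization from the 1.x code.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(n_hidden, activation="relu"),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Dense(n_hidden, activation="relu"),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Dense(n_hidden, activation="relu"),
        tf.keras.layers.BatchNormalization(),
        tf.keras.layers.Dense(n_output, activation="relu"),
    ])

    def q_loss(x, target_q_t, action):
        # Placeholders become ordinary function arguments in TF 2.x.
        y = model(x, training=True)
        q_action = tf.argmax(y, axis=1)                       # replaces g_q_action
        action_one_hot = tf.one_hot(action, n_output, 1.0, 0.0)
        # Assumed completion of the truncated q_acted line (common DQN pattern).
        q_acted = tf.reduce_sum(y * action_one_hot, axis=1)
        loss = tf.reduce_mean(tf.square(target_q_t - q_acted))
        return loss, q_action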

Sequential Model not Building

I'm trying to build a model for training data in Python using TensorFlow, but it's failing to build. Does anyone see any problems?
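
The failing code isn't shown here, but for comparison, a minimal Sequential model that does build and compile might look like the sketch below; the input shape and layer sizes are placeholders, not taken from the question.

    # Minimal tf.keras Sequential model that builds and compiles (illustrative only).
    import tensorflow as tf

    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28, 1)),   # assumed input shape
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    model.summary()   # the model is built once the input shape is known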