Machine Learning Using TensorFlow Cookbook by Alexia Audevart, Konrad Banachewicz, Luca Massaron

Author: Alexia Audevart, Konrad Banachewicz, Luca Massaron
Language: eng
Format: epub
Tags: COM037000 - COMPUTERS / Machine Theory, COM004000 - COMPUTERS / Intelligence (AI) & Semantics, COM018000 - COMPUTERS / Data Processing
Publisher: Packt
Published: 2021-02-04T06:27:24+00:00


Next, we need to declare our batch size, our seed (in order to have reproducible results), and our input data layer, as follows:

batch_size = 90
seed = 98
np.random.seed(seed)
tf.random.set_seed(seed)
x_data = tf.keras.Input(dtype=tf.float64, shape=(7,))
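
The batch_size declared above only comes into play once the data is fed in batches. As a minimal sketch (assuming hypothetical placeholder arrays x_vals and y_vals with 7 features per row, not the recipe's actual data), the seed and batch size could drive a tf.data input pipeline like this:

import numpy as np
import tensorflow as tf

# Hypothetical placeholders for the recipe's feature matrix and binary labels;
# only the shapes (7 features, one 0/1 target per row) matter here.
num_rows = 200  # arbitrary placeholder row count
x_vals = np.random.rand(num_rows, 7).astype(np.float64)
y_vals = np.random.randint(0, 2, size=(num_rows, 1)).astype(np.float64)

# Shuffle deterministically with the same seed, then draw batches of batch_size
dataset = tf.data.Dataset.from_tensor_slices((x_vals, y_vals))
dataset = dataset.shuffle(buffer_size=num_rows, seed=seed).batch(batch_size)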

As before, we now need to declare the functions that initialize a variable and a layer in our model. To create a better logistic function, we need a function that returns a logistic layer on an input layer. In other words, we will just use a fully connected layer and return a sigmoid element-wise for each layer. It is important to remember that our loss function will include the final sigmoid, so we want to specify, on our last layer, that we will not return the sigmoid of the output, as follows:

# Create variable definition
def init_variable(shape):
    return tf.Variable(tf.random.normal(shape=shape, dtype="float64", seed=seed))

# Create a logistic layer definition
def logistic(input_layer, multiplication_weight, bias_weight, activation=True):
    # We separate the activation at the end because the loss function will
    # implement the last sigmoid necessary
    if activation:
        return tf.keras.layers.Lambda(
            lambda x: tf.nn.sigmoid(tf.add(tf.matmul(x, multiplication_weight), bias_weight)))(input_layer)
    else:
        return tf.keras.layers.Lambda(
            lambda x: tf.add(tf.matmul(x, multiplication_weight), bias_weight))(input_layer)
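
To illustrate why the last layer skips the sigmoid, here is a minimal sketch of how these helpers could be chained and trained; the hidden-layer width of 14 units and the use of tf.nn.sigmoid_cross_entropy_with_logits with SGD are assumptions for illustration, not necessarily the book's next step:

# Hypothetical layer shapes: 7 inputs -> 14 hidden units -> 1 output logit
A1 = init_variable(shape=[7, 14])
b1 = init_variable(shape=[1, 14])
A2 = init_variable(shape=[14, 1])
b2 = init_variable(shape=[1, 1])

def forward(x):
    hidden = logistic(x, A1, b1)                        # sigmoid applied inside
    return logistic(hidden, A2, b2, activation=False)   # raw logits, no sigmoid

optimizer = tf.keras.optimizers.SGD(learning_rate=0.01)

def train_step(x_batch, y_batch):
    # x_batch: float64 of shape (batch, 7); y_batch: float64 of shape (batch, 1)
    with tf.GradientTape() as tape:
        logits = forward(x_batch)
        # The loss applies the missing sigmoid itself, which is why the last
        # logistic() call above uses activation=False
        loss = tf.reduce_mean(
            tf.nn.sigmoid_cross_entropy_with_logits(labels=y_batch, logits=logits))
    gradients = tape.gradient(loss, [A1, b1, A2, b2])
    optimizer.apply_gradients(zip(gradients, [A1, b1, A2, b2]))
    return loss

Equivalently, tf.keras.losses.BinaryCrossentropy(from_logits=True) could serve as the loss, since from_logits=True tells it to apply the missing sigmoid internally before computing the cross-entropy.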


