Lesson 1

Hello and welcome to the journey of understanding and implementing neural networks using Python! Neural networks play an essential role in **machine learning** and **AI**, paving the way for groundbreaking innovations in numerous fields. By the end of this lesson, you will be able to create and define a simple neural network using *Keras* in *TensorFlow*, and understand the components of a neural network, their layers, and the role of weights, biases, and activation functions.

Neural networks are computational systems inspired by the human brain. They consist of neurons (the most basic unit), which are assembled in layers to form the network. Each neuron in one layer is connected to neurons of the next layer through `synaptic weights`. Moreover, each neuron has a bias that allows shifting the neuron's activation threshold.
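To make the weighted-sum idea concrete, here is a minimal sketch of a single neuron's pre-activation value in plain Python. The numbers are made up for illustration and are not part of the lesson's model:

```python
# Illustrative sketch: how one neuron combines inputs, synaptic weights, and a bias.
# All values here are invented for demonstration purposes.

def neuron_output(inputs, weights, bias):
    """Weighted sum of inputs plus bias (the neuron's pre-activation value)."""
    return sum(x * w for x, w in zip(inputs, weights)) + bias

inputs = [1.0, 2.0, 3.0]
weights = [0.5, -0.25, 0.1]  # synaptic weights connecting to this neuron
bias = 0.3                   # shifts the neuron's activation threshold

print(neuron_output(inputs, weights, bias))  # approximately 0.6
```

An activation function would then be applied to this value before it is passed to the next layer.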

An activation function regulates the output of a neuron given a set of inputs and the weights associated with them. One popular activation function we will be using is the `ReLU` (Rectified Linear Unit) activation function. Neurons and layers play essential roles in neural networks, so let's understand them in detail while learning to build our neural network.
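As a quick illustration of what `ReLU` does (a sketch, separate from the Keras code later in this lesson), it simply passes positive values through unchanged and clamps negative values to zero:

```python
# Illustrative sketch of the ReLU activation function.

def relu(x):
    """Rectified Linear Unit: returns max(0, x)."""
    return max(0.0, x)

print([relu(v) for v in [-2.0, -0.5, 0.0, 1.5, 3.0]])  # [0.0, 0.0, 0.0, 1.5, 3.0]
```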

*[Image: a simple neural network with an input layer, a hidden layer, and an output layer.]*

In the above image, the input layer receives the data, the hidden layer processes it, and the output layer provides the final result. The hidden layer is where the magic happens, as it transforms the input data into a form that can be used to make predictions.

In the graphical representation, each circle represents a neuron, and the lines connecting them represent the weights. The weights are adjusted during the training process to minimize the error in the model's predictions.
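To give a feel for how a weight gets adjusted to reduce prediction error, here is a minimal gradient-descent sketch with one weight and one invented data point. This is a hand-rolled illustration with made-up numbers, not how we will train models in this course (Keras handles this for us):

```python
# Illustrative sketch: one-weight gradient descent minimizing squared error
# on a single made-up training example.

x, target = 2.0, 4.0  # invented data point: input and desired output
w = 0.5               # initial weight
lr = 0.1              # learning rate

for _ in range(50):
    pred = w * x
    grad = 2 * (pred - target) * x  # derivative of (pred - target)**2 w.r.t. w
    w -= lr * grad                  # adjust the weight to reduce the error

print(round(w, 3))  # converges to 2.0, since 2.0 * x equals the target
```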

Such a network can be used for various real-world applications, such as image recognition and natural language processing. For example, it could predict the price of a house based on its features, or classify an image as a cat or a dog.

We will build a neural network using the powerful libraries *TensorFlow* and *Keras* in Python. Let's start by importing the required libraries:

```python
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input
```

Now that we have our libraries, we will accomplish the following steps:

- Initialize a Sequential Neural Network.
- Add an input layer.
- Add fully connected (dense) layers.
- Compile the model.

A neural network can be thought of as a sequence of layers. In TensorFlow, we can easily define this using the `Sequential` class.

```python
model = Sequential()
```

The input layer forms the starting point of our network. It's where we feed in our data. For this model, we assume that we have 20 input features.

```python
model.add(Input(shape=(20,)))
```

In a fully connected layer, each neuron is connected to every neuron in the previous layer through the weights. The `ReLU` activation function is employed here, introducing non-linearity into the output of a neuron.

```python
model.add(Dense(64, activation='relu'))  # Hidden Layer
model.add(Dense(10))                     # Output Layer
```

We essentially have an input layer, followed by a hidden layer with 64 neurons, and finally an output layer with 10 neurons. The number of neurons in the output layer typically corresponds to the number of classes in a classification problem.

Now, we will compile the model. This step involves defining the loss function and the optimizer. The `loss` function measures how accurate the model is during training, and the `optimizer` dictates how the model is updated.

```python
model.compile(loss='mean_squared_error', optimizer='adam')
```

Let’s print the summary of our model:

```python
print(model.summary())
```

The output of the above code will be:

```text
Model: "sequential"
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━┓
┃ Layer (type)                    ┃ Output Shape           ┃       Param # ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━┩
│ dense (Dense)                   │ (None, 64)             │         1,344 │
├─────────────────────────────────┼────────────────────────┼───────────────┤
│ dense_1 (Dense)                 │ (None, 10)             │           650 │
└─────────────────────────────────┴────────────────────────┴───────────────┘
 Total params: 1,994 (7.79 KB)
 Trainable params: 1,994 (7.79 KB)
 Non-trainable params: 0 (0.00 B)
None
```

This summary showcases the architecture of the neural network we just created. It includes the layer types, output shapes, and the number of parameters at each layer and in total. Observing this summary helps in understanding the model's complexity and its learning capacity.
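As a quick sanity check (my own arithmetic, not output from Keras), the parameter counts in the summary follow from `inputs × neurons + biases` for each `Dense` layer:

```python
# Parameter counts per Dense layer: weights (inputs * neurons) plus biases (one per neuron).
hidden = 20 * 64 + 64  # 1,344 parameters: 20 input features feeding 64 neurons
output = 64 * 10 + 10  # 650 parameters: 64 hidden outputs feeding 10 neurons

print(hidden, output, hidden + output)  # 1344 650 1994
```

This matches the 1,994 total trainable parameters reported by the summary.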

We've now successfully created a simple neural network with Keras and TensorFlow!

That's a wrap for this lesson! You've learned about neural networks, their components, and their architecture. You've also implemented a neural network using *Keras* and *TensorFlow*, defined layers, compiled the model, and interpreted its summary.

As we move forward, you will encounter exercises reinforcing these concepts and providing hands-on experience with this powerhouse combination of Python, TensorFlow, and Keras. Remember, becoming proficient takes practice and persistence, so keep experimenting and coding!