
Hyperparameter Tuning: Neural Networks 101

How to improve the “learning” and “training” of neural networks through hyperparameter tuning

Neural-network icons created by Vectors Tank, Flaticon. https://www.flaticon.com/free-icons/neural

In my previous post, we discussed how neural networks predict and learn from data. There are two processes responsible for this: the forward pass and the backward pass, also known as backpropagation. You can learn more about them here:
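To make those two processes concrete, here is a minimal PyTorch sketch of a single training step. The one-layer model, MSE loss, and learning rate are illustrative assumptions, not taken from the original post:

```python
import torch
import torch.nn as nn

# Hypothetical one-layer model, just to illustrate the two passes.
model = nn.Linear(3, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

x, y = torch.randn(8, 3), torch.randn(8, 1)  # dummy data: 8 examples, 3 features

prediction = model(x)          # forward pass: compute outputs from inputs
loss = loss_fn(prediction, y)  # measure how wrong the outputs are

optimizer.zero_grad()          # clear gradients from any previous step
loss.backward()                # backward pass (backpropagation): compute gradients
optimizer.step()               # gradient descent: update the weights
```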

This post will dive into how we can optimise this “learning” and “training” process to increase the performance of our model. The areas we will cover are computational improvements and hyperparameter tuning, and how to implement them in PyTorch!

But before all that good stuff, let’s quickly jog our memory about neural networks!

Neural networks are large mathematical expressions that try to find the “right” function that can map a set of inputs to their corresponding outputs. An example of a neural network is depicted below:

A basic two-hidden-layer multilayer perceptron. Diagram by author.
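As a minimal sketch, a network like this one can be written in PyTorch roughly as follows. The layer sizes and the ReLU activation are illustrative assumptions, not details from the diagram:

```python
import torch
import torch.nn as nn

# A two-hidden-layer multilayer perceptron.
# Sizes (4 inputs, 8 hidden units, 1 output) are placeholders.
model = nn.Sequential(
    nn.Linear(4, 8),   # inputs -> first hidden layer
    nn.ReLU(),         # non-linear activation
    nn.Linear(8, 8),   # first -> second hidden layer
    nn.ReLU(),
    nn.Linear(8, 1),   # second hidden layer -> output
)

x = torch.randn(1, 4)  # a single example with 4 input features
print(model(x))        # the network's prediction for x
```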

Each hidden-layer neuron carries out the following computation (written out as an equation after the list below):

The computation carried out inside each neuron. Diagram by author.
  • Inputs: These are the features of our dataset.
  • Weights: Coefficients that scale the inputs. The goal of the algorithm is to find the most optimal coefficients through gradient descent.
  • Linear Weighted Sum: Sum up the products of the inputs and weights and add a bias/offset term, b.
  • Hidden Layer: This is where multiple neurons are housed to learn patterns in the dataset. The superscript refers to the layer and the subscript to the number of the neuron in that layer.
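Putting those pieces together, the per-neuron computation can be sketched as follows, assuming the diagram’s notation (superscript for layer, subscript for neuron), with x_i the inputs and f the neuron’s activation function:

$$
z_j^{(1)} = \sum_i w_{j,i}^{(1)} x_i + b_j^{(1)}, \qquad a_j^{(1)} = f\left(z_j^{(1)}\right)
$$

The output a_j^{(1)} of each neuron then becomes an input to the neurons in the next layer.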
