Machine learning theory & Neural networks
Throughout the first two weeks of my Artificial Intelligence module, I have been learning about the theory behind machine learning and neural networks.
Defining machine learning - Tom Mitchell presents a key definition: "A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E."
FORMULA:
Data (input) -> guess = output (prediction) -> feedback -> new guess (repeat)
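The loop above can be sketched in a few lines of Python. This is a minimal, illustrative example only: the single-weight model, the training pairs, and the learning rate `lr` are my own assumptions, not something from the module.

```python
# Sketch of the data -> guess -> feedback -> new guess loop,
# fitting a single weight w to made-up pairs that follow y = 2x.
data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = 0.0   # initial guess for the weight
lr = 0.1  # learning rate: how strongly feedback adjusts the guess

for epoch in range(50):
    for x, target in data:
        guess = w * x           # output (prediction)
        error = target - guess  # feedback
        w += lr * error * x     # new guess, then repeat

print(round(w, 3))  # w converges toward 2.0
```

Each pass through the data nudges the weight in the direction that shrinks the error, which is the "feedback -> new guess" step of the formula.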
We worked through simple examples of neural networks and how they are used to make predictions from input data.
Neural Networks are made up of layers of nodes. Each node takes in multiple inputs, applies weights to them, sums them up, and passes the result through an activation function to produce an output.
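A single node as described above (weighted inputs, summed, passed through an activation function) can be sketched like this. The sigmoid is one common choice of activation; the specific inputs and weights here are made-up values for illustration.

```python
import math

def sigmoid(z):
    # A common activation function, squashing any value into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def node(inputs, weights):
    # Apply a weight to each input, sum the results,
    # then pass the sum through the activation function.
    z = sum(w * x for w, x in zip(weights, inputs))
    return sigmoid(z)

output = node([0.5, -1.0], [0.8, 0.2])
print(round(output, 3))  # -> 0.55
```

Stacking layers of such nodes, where each layer's outputs become the next layer's inputs, gives the full network.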
Neural Networks Visualised