ANN: You are so artificial
- Charles Stoy
- Jan 9, 2023
- 10 min read
An artificial neural network (ANN) is a computational model inspired by the structure and function of the biological neural networks that constitute animal brains. ANNs are designed to recognize patterns, classify data, and make decisions based on input data. They are composed of interconnected units called artificial neurons, which process input data through weighted connections to other neurons, ultimately producing output through a process known as forward propagation. ANNs can be trained on labeled data, adjusting the weights of their connections to improve their performance on tasks such as classification or prediction. They are a powerful tool for machine learning and are widely used in applications including image and speech recognition, natural language processing, and more.
It is possible to build an artificial neural network (ANN) using Excel, although it is more challenging than using a language with specialized libraries, such as Python. Here are the steps you can follow to build an ANN in Excel:
First, you need to gather your data and organize it in an Excel spreadsheet. Make sure to include a column for the output or label that you want to predict.
Next, you will need to create a new sheet to build your neural network. In this sheet, you can use formulas to create the input layer, hidden layer(s), and output layer of your ANN. You can use Excel's SUMPRODUCT and SUM functions to calculate the dot product of the weights and inputs, and a sigmoid formula such as =1/(1+EXP(-x)) (Excel has no built-in sigmoid function) to squash the output of each neuron.
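In code terms, the SUMPRODUCT-plus-sigmoid formula for a single neuron is just a dot product followed by the logistic function. A minimal Python sketch of that calculation (the weight and input values below are made up purely for illustration):

```python
import math

def sigmoid(x):
    # Logistic squashing function, equivalent to =1/(1+EXP(-x)) in Excel
    return 1.0 / (1.0 + math.exp(-x))

def neuron_output(weights, inputs):
    # Equivalent of applying the sigmoid to SUMPRODUCT(weights, inputs)
    net = sum(w * x for w, x in zip(weights, inputs))
    return sigmoid(net)

# Illustrative values: two inputs feeding one hidden neuron
hidden = neuron_output([0.1, 0.2], [1.0, 0.5])
```

Each neuron in a layer repeats this same calculation with its own column of weights, which is why a layer in the spreadsheet is just a row of SUMPRODUCT formulas.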
After you have created your neural network, you will need to train it using your data. You can use Excel's Solver tool to adjust the weights of the network to minimize the error between the predicted output and the actual label.
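Solver works without explicit gradients: it perturbs the weight cells and keeps changes that shrink the error cell. That role can be imitated by any derivative-free search. A rough, hypothetical sketch of the idea, where a one-weight linear model stands in for the network to keep things short:

```python
import random

def squared_error(w, data):
    # data: list of (input, target) pairs for a one-weight linear model
    return sum((w * x - t) ** 2 for x, t in data)

def solver_like_fit(data, steps=2000, scale=0.1, seed=0):
    # Randomly perturb the weight and keep the change whenever the error
    # drops, loosely imitating what Solver does to the error cell.
    rng = random.Random(seed)
    w = 0.0
    best = squared_error(w, data)
    for _ in range(steps):
        cand = w + rng.uniform(-scale, scale)
        err = squared_error(cand, data)
        if err < best:
            w, best = cand, err
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # underlying relationship: t = 2x
w = solver_like_fit(data)
```

Solver's GRG Nonlinear engine is considerably smarter than this random search, but the contract is the same: vary the weight cells, minimize the error cell.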
Once your ANN is trained, you can use it to make predictions on new data by inputting the values into the input layer and using the trained weights to calculate the output.
Keep in mind that building an ANN in Excel can be time-consuming and may require a strong understanding of both Excel and ANNs. You may find it easier to use specialized software such as Python to build and train an ANN.
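For comparison, the same workflow — organize data, build a one-hidden-layer network, train, predict — fits in a short Python script. This is a minimal sketch, not the article's Excel model: it trains on the logical AND function with plain stochastic gradient descent, and the layer sizes, learning rate, and epoch count are illustrative choices:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class TinyNet:
    # One hidden layer, sigmoid activations throughout, bias via appended 1.0
    def __init__(self, n_in, n_hid, seed=1):
        rng = random.Random(seed)
        self.w1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in + 1)]
                   for _ in range(n_hid)]
        self.w2 = [rng.uniform(-0.5, 0.5) for _ in range(n_hid + 1)]

    def forward(self, x):
        xb = x + [1.0]  # append bias input
        self.h = [sigmoid(sum(w * v for w, v in zip(ws, xb))) for ws in self.w1]
        hb = self.h + [1.0]
        return sigmoid(sum(w * v for w, v in zip(self.w2, hb)))

    def train_step(self, x, t, lr=0.5):
        # Backpropagation for a single example
        y = self.forward(x)
        delta_o = (y - t) * y * (1 - y)
        delta_h = [delta_o * self.w2[j] * h * (1 - h)
                   for j, h in enumerate(self.h)]
        hb = self.h + [1.0]
        xb = x + [1.0]
        for j in range(len(self.w2)):
            self.w2[j] -= lr * delta_o * hb[j]
        for j, dh in enumerate(delta_h):
            for i in range(len(xb)):
                self.w1[j][i] -= lr * dh * xb[i]

net = TinyNet(2, 2)
data = [([0.0, 0.0], 0.0), ([0.0, 1.0], 0.0),
        ([1.0, 0.0], 0.0), ([1.0, 1.0], 1.0)]  # logical AND
for _ in range(2000):
    for x, t in data:
        net.train_step(x, t)
```

After training, `net.forward([1.0, 1.0])` is close to 1 and the other inputs map close to 0, mirroring what the Excel version does with Solver-adjusted weight cells.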
VBA and ANN
There are a few different approaches you can take when implementing an artificial neural network (ANN) in Visual Basic for Applications (VBA). Here are a few options you might consider:
Use an existing library: There are a number of pre-existing libraries available that you can use to build an ANN in VBA. One example is the Neuronet library, which is specifically designed for this purpose.
Write your own code: If you're interested in implementing an ANN from scratch, you'll need to understand the basic principles of how neural networks work: the structure of a network, how to initialize the weights and biases, how to forward-propagate inputs through the network, and how to backpropagate errors to update the weights and biases. You'll also need to be familiar with the basics of linear algebra and calculus.
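The calculus involved is mostly the chain rule. For the logistic (sigmoid) activation, the derivative has a convenient closed form, s'(x) = s(x) * (1 - s(x)), which is why backpropagation code multiplies the error by output * (1 - output) when computing deltas. A small sketch that sanity-checks that identity against a numerical derivative:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    # Closed-form derivative used when backpropagating deltas
    s = sigmoid(x)
    return s * (1.0 - s)

def numeric_derivative(f, x, eps=1e-6):
    # Central finite difference, for sanity-checking the closed form
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)
```

At x = 0 both give 0.25, the sigmoid's steepest slope; the same trick (an activation whose derivative is cheap to express in terms of its output) is what makes the delta loops in the VBA listing below so compact.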
VBA implementation
Here is an example of a simple artificial neural network implemented in VBA. This code creates a feedforward neural network with a single hidden layer and a sigmoid activation function. It includes the logic for training the network and for making predictions with the trained model. I used the model developed by bquanttrading (credited below) because I am not a VBA programmer.
Option Explicit
Option Base 1

Public Sub ANN_1HL_GDR()

    Dim ControlSht As Worksheet: Set ControlSht = Worksheets("Control")
    Dim DataSht As Worksheet: Set DataSht = Worksheets("DataInput")
    Dim ResultsSht As Worksheet: Set ResultsSht = Worksheets("TrainingResults")

    ' Read in NN topology and training data location
    Dim Num_Inputs As Integer: Num_Inputs = ControlSht.Range("Num_Input_Units").Value
    Dim Num_H1_Nodes As Integer: Num_H1_Nodes = ControlSht.Range("Hidden_Layer1_Num_Nodes").Value
    Dim HiddenLayer1_Bias As Integer: HiddenLayer1_Bias = ControlSht.Range("Hidden_Layer1_Bias").Value
    Dim Num_Output_Nodes As Integer: Num_Output_Nodes = ControlSht.Range("Output_Layer_Num_Nodes").Value
    Dim OutputLayer_Bias As Integer: OutputLayer_Bias = ControlSht.Range("Output_Layer_Bias").Value
    Dim HL1_ActivationFunc As String: HL1_ActivationFunc = ControlSht.Range("Hidden_Layer1_ActivationFunction").Value
    Dim Out_ActivationFunc As String: Out_ActivationFunc = ControlSht.Range("Output_Layer_ActivationFunction").Value
    Dim MaxEpoch As Single: MaxEpoch = ControlSht.Range("Max_Epoch").Value
    Dim TrainingInstances As Integer: TrainingInstances = ControlSht.Range("Training_Data_Instances").Value
    Dim ValidationInstances As Integer: ValidationInstances = ControlSht.Range("Validation_Data_Instances").Value
    Dim Epsilon As Double: Epsilon = ControlSht.Range("Epsilon").Value
    Dim LearningRate As Double: LearningRate = ControlSht.Range("Learning_Rate").Value
    Dim Momentum As Double: Momentum = ControlSht.Range("Momentum").Value

    ' Assign activation function derivative names for calculation of deltas
    Dim HL1_dAFunc_dNet As String
    If HL1_ActivationFunc = "Logistic" Then
        HL1_dAFunc_dNet = "dLogit_dNet"
    ElseIf HL1_ActivationFunc = "HyperbolicTangent" Then
        HL1_dAFunc_dNet = "dTanh_dNet"
    Else
        HL1_dAFunc_dNet = "dLin_dNet"
    End If

    Dim Out_dAFunc_dNet As String
    If Out_ActivationFunc = "Logistic" Then
        Out_dAFunc_dNet = "dLogit_dNet"
    ElseIf Out_ActivationFunc = "HyperbolicTangent" Then
        Out_dAFunc_dNet = "dTanh_dNet"
    Else
        Out_dAFunc_dNet = "dLin_dNet"
    End If

    ' Prepare output sheets for model output
    ControlSht.Range("Hidden_Layer1_Num_Nodes").Offset(ControlSht.Range("Num_Hidden_Layers").Value, 0) _
        .Resize(5 - ControlSht.Range("Num_Hidden_Layers").Value, 3).ClearContents ' removes superfluous hidden layer data
    ResultsSht.Range("B3").Resize(ResultsSht.Range("C:C"). _
        Cells.SpecialCells(xlCellTypeConstants).Count, 3).ClearContents ' remove MSE data from TrainingResults tab

    ' Count number of input nodes and create arrays
    Dim NumWghts_I_H1 As Integer: NumWghts_I_H1 = (Num_Inputs + HiddenLayer1_Bias) * Num_H1_Nodes
    ReDim Weights_I_H1(Num_Inputs + HiddenLayer1_Bias, Num_H1_Nodes) As Double            ' weight from input i to node j
    ReDim WeightsChange_I_H1(Num_Inputs + HiddenLayer1_Bias, Num_H1_Nodes) As Double      ' gradient based weight adjustment
    ReDim WeightsChange_I_H1_Prev(Num_Inputs + HiddenLayer1_Bias, Num_H1_Nodes) As Double ' used if momentum is required

    ' Count number of hidden layer nodes and create arrays
    Dim NumWghts_H1_O As Integer: NumWghts_H1_O = (Num_H1_Nodes + OutputLayer_Bias) * Num_Output_Nodes
    ReDim Weights_H1_O(Num_H1_Nodes + OutputLayer_Bias, Num_Output_Nodes) As Double       ' weight from node i to output node j
    ReDim WeightsChange_H1_O(Num_H1_Nodes + OutputLayer_Bias, Num_Output_Nodes) As Double
    ReDim WeightsChange_H1_O_Prev(Num_H1_Nodes + OutputLayer_Bias, Num_Output_Nodes) As Double

    ' Initialise weights
    Dim ii As Integer: ii = 1
    Dim jj As Integer: jj = 1
    For jj = 1 To Num_H1_Nodes
        For ii = 1 To (Num_Inputs + HiddenLayer1_Bias)
            Weights_I_H1(ii, jj) = Rnd - 0.5
        Next ii
    Next jj
    For jj = 1 To Num_Output_Nodes
        For ii = 1 To (Num_H1_Nodes + OutputLayer_Bias)
            Weights_H1_O(ii, jj) = Rnd - 0.5
        Next ii
    Next jj

    ' Starting weights to replicate blog example
    ' Weights_I_H1(1, 1) = 0.1: Weights_I_H1(1, 2) = -0.3: Weights_H1_O(1, 1) = -0.35
    ' Weights_I_H1(2, 1) = 0.2: Weights_I_H1(2, 2) = -0.2: Weights_H1_O(2, 1) = 0.35
    ' Weights_I_H1(3, 1) = 0.3: Weights_I_H1(3, 2) = -0.1: Weights_H1_O(3, 1) = 0.15

    ' Declaration of counters and summation holders
    Dim Epoch As Single: Epoch = 1
    Dim Obs As Integer: Dim Obs_Val As Integer
    Dim TempNet As Double: Dim TempDeltaSum As Double
    Dim AvgSqrError As Double: Dim AvgSqrError_Val As Double

    ' Main loop: the network is trained until the maximum number of Epochs
    ' is reached or the MSE has converged to Epsilon
    For Epoch = 1 To MaxEpoch

        ' Model uses batch training mode. For each sample we loop through:
        '   1) Load in the training set
        '   2) Feed the training set forward through the network
        '   3) Calculate the error
        '   4) Calculate the deltas (dError/dNet)
        '   5) Calculate the weight change
        For Obs = 1 To TrainingInstances

            ' Load target output
            ReDim TargetNodeValue(Num_Output_Nodes) As Double
            For ii = 1 To Num_Output_Nodes
                TargetNodeValue(ii) = DataSht.Range(ControlSht.Range("Target_Data_Range").Value).Offset(Obs - 1, ii - 1).Value
            Next ii

            ' Load input nodes and include bias node value if selected
            ReDim INodeValue(Num_Inputs + HiddenLayer1_Bias) As Double
            If HiddenLayer1_Bias = 1 Then
                INodeValue(1) = 1#
            End If
            For ii = (1 + HiddenLayer1_Bias) To (Num_Inputs + HiddenLayer1_Bias)
                INodeValue(ii) = DataSht.Range(ControlSht.Range("Input_Data_Range").Value).Offset(Obs - 1, ii - 1 - HiddenLayer1_Bias).Value
            Next ii

            ' First of two feedforward loops: input layer to first hidden layer
            ReDim H1NodeValue(Num_H1_Nodes + OutputLayer_Bias) As Double
            If OutputLayer_Bias = 1 Then
                H1NodeValue(1) = 1#
            End If
            For jj = (1 + OutputLayer_Bias) To (Num_H1_Nodes + OutputLayer_Bias)
                TempNet = 0#
                For ii = 1 To (Num_Inputs + HiddenLayer1_Bias)
                    TempNet = TempNet + Weights_I_H1(ii, jj - OutputLayer_Bias) * INodeValue(ii)
                Next ii
                H1NodeValue(jj) = Run(HL1_ActivationFunc, TempNet)
            Next jj

            ' Feedforward from hidden layer to output layer and calculate output node error
            ReDim OutputNodeValue(Num_Output_Nodes) As Double: ReDim OutputNodeErrorValue(Num_Output_Nodes) As Double
            For jj = 1 To Num_Output_Nodes
                TempNet = 0#
                For ii = 1 To (Num_H1_Nodes + OutputLayer_Bias)
                    TempNet = TempNet + Weights_H1_O(ii, jj) * H1NodeValue(ii)
                Next ii
                OutputNodeValue(jj) = Run(Out_ActivationFunc, TempNet)
                OutputNodeErrorValue(jj) = TargetNodeValue(jj) - OutputNodeValue(jj)
                AvgSqrError = AvgSqrError + OutputNodeErrorValue(jj) * OutputNodeErrorValue(jj)
            Next jj

            ' Output node deltas
            ReDim OutputNodesDelta(Num_Output_Nodes) As Double
            For ii = 1 To Num_Output_Nodes
                OutputNodesDelta(ii) = OutputNodeErrorValue(ii) * Run(Out_dAFunc_dNet, OutputNodeValue(ii))
            Next ii

            ' Hidden layer node deltas
            ReDim H1NodesDelta(Num_H1_Nodes) As Double
            For ii = 1 To Num_H1_Nodes
                TempDeltaSum = 0#
                For jj = 1 To Num_Output_Nodes
                    TempDeltaSum = TempDeltaSum + OutputNodesDelta(jj) * Weights_H1_O(ii + OutputLayer_Bias, jj)
                Next jj
                H1NodesDelta(ii) = TempDeltaSum * Run(HL1_dAFunc_dNet, H1NodeValue(ii + OutputLayer_Bias))
            Next ii

            ' Hidden layer -> output node weight changes
            For jj = 1 To Num_Output_Nodes
                For ii = 1 To (Num_H1_Nodes + OutputLayer_Bias)
                    WeightsChange_H1_O(ii, jj) = WeightsChange_H1_O(ii, jj) + H1NodeValue(ii) * OutputNodesDelta(jj)
                Next ii
            Next jj

            ' Input layer -> hidden layer node weight changes
            For jj = 1 To Num_H1_Nodes
                For ii = 1 To (Num_Inputs + HiddenLayer1_Bias)
                    WeightsChange_I_H1(ii, jj) = WeightsChange_I_H1(ii, jj) + INodeValue(ii) * H1NodesDelta(jj)
                Next ii
            Next jj

        ' Continue above calculations for every observation in the training set
        Next Obs

        ' Second loop: calculate MSE for the validation set to check the out-of-sample fit of the model
        For Obs_Val = 1 To ValidationInstances

            ' Load target output
            ReDim TargetNodeValue(Num_Output_Nodes) As Double
            For ii = 1 To Num_Output_Nodes
                TargetNodeValue(ii) = DataSht.Range(ControlSht.Range("Target_Data_Range").Value).Offset(TrainingInstances + (Obs_Val - 1), ii - 1).Value
            Next ii

            ' Load input nodes and include bias node value if selected
            ReDim INodeValue(Num_Inputs + HiddenLayer1_Bias) As Double
            If HiddenLayer1_Bias = 1 Then
                INodeValue(1) = 1#
            End If
            For ii = (1 + HiddenLayer1_Bias) To (Num_Inputs + HiddenLayer1_Bias)
                INodeValue(ii) = DataSht.Range(ControlSht.Range("Input_Data_Range").Value). _
                    Offset(TrainingInstances + (Obs_Val - 1), ii - 1 - HiddenLayer1_Bias).Value
            Next ii

            ' Feedforward from input layer to first hidden layer
            ReDim H1NodeValue(Num_H1_Nodes + OutputLayer_Bias) As Double
            If OutputLayer_Bias = 1 Then
                H1NodeValue(1) = 1#
            End If
            For jj = (1 + OutputLayer_Bias) To (Num_H1_Nodes + OutputLayer_Bias)
                TempNet = 0#
                For ii = 1 To (Num_Inputs + HiddenLayer1_Bias)
                    TempNet = TempNet + Weights_I_H1(ii, jj - OutputLayer_Bias) * INodeValue(ii)
                Next ii
                H1NodeValue(jj) = Run(HL1_ActivationFunc, TempNet)
            Next jj

            ' Feedforward from hidden layer to output layer and calculate output node error
            ReDim OutputNodeValue(Num_Output_Nodes) As Double: ReDim OutputNodeErrorValue(Num_Output_Nodes) As Double
            For jj = 1 To Num_Output_Nodes
                TempNet = 0#
                For ii = 1 To (Num_H1_Nodes + OutputLayer_Bias)
                    TempNet = TempNet + Weights_H1_O(ii, jj) * H1NodeValue(ii)
                Next ii
                OutputNodeValue(jj) = Run(Out_ActivationFunc, TempNet)
                OutputNodeErrorValue(jj) = TargetNodeValue(jj) - OutputNodeValue(jj)
                AvgSqrError_Val = AvgSqrError_Val + OutputNodeErrorValue(jj) * OutputNodeErrorValue(jj)
            Next jj

        ' Continue above calculations for every observation in the validation set
        Next Obs_Val

        ' Calculate and print to the output sheet the avg squared error for the
        ' training and validation sets for this epoch
        AvgSqrError = AvgSqrError / TrainingInstances: ResultsSht.Range("C2").Offset(Epoch, 0).Value = AvgSqrError
        If ValidationInstances = 0 Then
            AvgSqrError_Val = 0
        Else
            AvgSqrError_Val = AvgSqrError_Val / ValidationInstances: ResultsSht.Range("D2").Offset(Epoch, 0).Value = AvgSqrError_Val
        End If
        ResultsSht.Range("B2").Offset(Epoch, 0).Value = Epoch

        ' Update weights from first hidden layer to output layer (H1 -> O)
        For jj = 1 To Num_Output_Nodes
            For ii = 1 To (Num_H1_Nodes + OutputLayer_Bias)
                Weights_H1_O(ii, jj) = Weights_H1_O(ii, jj) + LearningRate * WeightsChange_H1_O(ii, jj) + Momentum * WeightsChange_H1_O_Prev(ii, jj)
            Next ii
        Next jj
        WeightsChange_H1_O_Prev = WeightsChange_H1_O
        ReDim WeightsChange_H1_O(Num_H1_Nodes + OutputLayer_Bias, Num_Output_Nodes) As Double

        ' Update weights from input layer to first hidden layer (I -> H1)
        For jj = 1 To Num_H1_Nodes
            For ii = 1 To (Num_Inputs + HiddenLayer1_Bias)
                Weights_I_H1(ii, jj) = Weights_I_H1(ii, jj) + LearningRate * WeightsChange_I_H1(ii, jj) + Momentum * WeightsChange_I_H1_Prev(ii, jj)
            Next ii
        Next jj
        WeightsChange_I_H1_Prev = WeightsChange_I_H1
        ReDim WeightsChange_I_H1(Num_Inputs + HiddenLayer1_Bias, Num_H1_Nodes) As Double

        ' Check if training set avg square error is less than Epsilon, exit if true
        If AvgSqrError < Epsilon Then
            Epoch = MaxEpoch
        End If
        AvgSqrError = 0     ' reset avg square error total
        AvgSqrError_Val = 0 ' reset avg square error for validation sample

    ' One Epoch of batch training mode is complete; repeat the process until the network is trained
    Next Epoch

    ' Insert a column of model output in the DataInput tab next to the original data
    For Obs = 1 To TrainingInstances + ValidationInstances

        ' Load input nodes and include bias node value if selected
        ReDim INodeValue(Num_Inputs + HiddenLayer1_Bias) As Double
        If HiddenLayer1_Bias = 1 Then
            INodeValue(1) = 1#
        End If
        For ii = (1 + HiddenLayer1_Bias) To (Num_Inputs + HiddenLayer1_Bias)
            INodeValue(ii) = DataSht.Range(ControlSht.Range("Input_Data_Range").Value). _
                Offset(Obs - 1, ii - 1 - HiddenLayer1_Bias).Value
        Next ii

        ' Feedforward from input layer to first hidden layer
        ReDim H1NodeValue(Num_H1_Nodes + OutputLayer_Bias) As Double
        If OutputLayer_Bias = 1 Then
            H1NodeValue(1) = 1#
        End If
        For jj = (1 + OutputLayer_Bias) To (Num_H1_Nodes + OutputLayer_Bias)
            TempNet = 0#
            For ii = 1 To (Num_Inputs + HiddenLayer1_Bias)
                TempNet = TempNet + Weights_I_H1(ii, jj - OutputLayer_Bias) * INodeValue(ii)
            Next ii
            H1NodeValue(jj) = Run(HL1_ActivationFunc, TempNet)
        Next jj

        ' Feedforward from hidden layer to output layer and print to DataInput sheet
        ReDim OutputNodeValue(Num_Output_Nodes) As Double: ReDim OutputNodeErrorValue(Num_Output_Nodes) As Double
        For jj = 1 To Num_Output_Nodes
            TempNet = 0#
            For ii = 1 To (Num_H1_Nodes + OutputLayer_Bias)
                TempNet = TempNet + Weights_H1_O(ii, jj) * H1NodeValue(ii)
            Next ii
            OutputNodeValue(jj) = Run(Out_ActivationFunc, TempNet)
            DataSht.Range(ControlSht.Range("Target_Data_Range").Value).Offset(Obs - 1, Num_Inputs + Num_Output_Nodes + jj - 1).Value = OutputNodeValue(jj)
        Next jj
    Next Obs

    ' Return trained network weights to sheet Trained_NetworkWeights
    Worksheets("Trained_NetworkWeights").Range("B2:Z100").Clear
    Worksheets("Trained_NetworkWeights").Range("B2") = "I"
    If HiddenLayer1_Bias = 1 Then
        Worksheets("Trained_NetworkWeights").Range("B3").Value = 1
        Worksheets("Trained_NetworkWeights").Range("B3").Interior.Color = 255
    End If
    For ii = (1 + HiddenLayer1_Bias) To (Num_Inputs + HiddenLayer1_Bias)
        Worksheets("Trained_NetworkWeights").Range("B2").Offset(ii, 0).Value = _
            DataSht.Range(ControlSht.Range("Input_Data_Range").Value). _
            Offset(TrainingInstances + ValidationInstances - 1, ii - 1 - HiddenLayer1_Bias).Value
        Worksheets("Trained_NetworkWeights").Range("B2").Offset(ii, 0).Interior.Color = 5287936
    Next ii
    For jj = 1 To Num_H1_Nodes
        Worksheets("Trained_NetworkWeights").Range("C2").Offset(0, jj - 1).Value = "I->N" & jj
        For ii = 1 To (Num_Inputs + HiddenLayer1_Bias)
            Worksheets("Trained_NetworkWeights").Range("C2").Offset(ii, jj - 1).Value = Weights_I_H1(ii, jj)
        Next ii
    Next jj

    If OutputLayer_Bias = 1 Then
        Worksheets("Trained_NetworkWeights").Range("B2").Offset(0, Num_H1_Nodes + 1).Value = "H1"
        Worksheets("Trained_NetworkWeights").Range("B3").Offset(0, Num_H1_Nodes + 1).Value = 1
        Worksheets("Trained_NetworkWeights").Range("B3").Offset(0, Num_H1_Nodes + 1).Interior.Color = 255
    End If
    For ii = 1 To Num_H1_Nodes
        Worksheets("Trained_NetworkWeights").Range("B3").Offset(OutputLayer_Bias + ii - 1, Num_H1_Nodes + 1).FormulaR1C1 = _
            "=" & HL1_ActivationFunc & "(sumproduct(R[" & -(OutputLayer_Bias + ii - 1) & "]C[" & -(Num_H1_Nodes - (ii - 1)) & "]:R[" & _
            (Num_Inputs + HiddenLayer1_Bias - (ii + OutputLayer_Bias)) & "]C[" & -(Num_H1_Nodes - (ii - 1)) & "],R[" & _
            -(OutputLayer_Bias + ii - 1) & "]C2:R[" & (Num_Inputs + HiddenLayer1_Bias - (ii + OutputLayer_Bias)) & "]C2))"
        Worksheets("Trained_NetworkWeights").Range("B3").Offset(OutputLayer_Bias + ii - 1, Num_H1_Nodes + 1).Interior.Color = 255
    Next ii
    For jj = 1 To Num_Output_Nodes
        Worksheets("Trained_NetworkWeights").Range("C2").Offset(0, Num_H1_Nodes + jj).Value = "H->O" & jj
        For ii = 1 To (Num_H1_Nodes + OutputLayer_Bias)
            Worksheets("Trained_NetworkWeights").Range("C2").Offset(ii, Num_H1_Nodes + jj).Value = Weights_H1_O(ii, jj)
        Next ii
    Next jj

    For ii = 1 To Num_Output_Nodes
        Worksheets("Trained_NetworkWeights").Range("B2").Offset(ii - 1, Num_H1_Nodes + Num_Output_Nodes + 2).Value = "Output"
        Worksheets("Trained_NetworkWeights").Range("B3").Offset(ii - 1, Num_H1_Nodes + Num_Output_Nodes + 2).FormulaR1C1 = _
            "=" & Out_ActivationFunc & "(sumproduct(R[" & -(ii - 1) & "]C[" & -(Num_Output_Nodes - (ii - 1)) & "]:R[" & _
            (Num_H1_Nodes + OutputLayer_Bias - ii) & "]C[" & -(Num_Output_Nodes - (ii - 1)) & "],R[" & _
            -(ii - 1) & "]C[" & -(Num_Output_Nodes + ii) & "]:R[" & (Num_H1_Nodes + OutputLayer_Bias - ii) & "]C[" & -(Num_Output_Nodes + ii) & "]))"
        Worksheets("Trained_NetworkWeights").Range("B3").Offset(ii - 1, Num_H1_Nodes + Num_Output_Nodes + 2).Interior.Color = 15773696
    Next ii

    ' Delete all arrays to free up memory
    Erase INodeValue: Erase H1NodeValue
    Erase OutputNodeValue: Erase TargetNodeValue
    Erase Weights_H1_O: Erase Weights_I_H1
    Erase WeightsChange_I_H1: Erase WeightsChange_H1_O
    Erase WeightsChange_I_H1_Prev: Erase WeightsChange_H1_O_Prev
    Erase H1NodesDelta: Erase OutputNodesDelta
    Erase OutputNodeErrorValue

End Sub
(thanks to bquanttrading at https://asmquantmacro.com/2015/08/13/artificial-neural-network-with-backpropagation-training-in-vba)