Deep learning and neural networks are very interesting subjects, and the Go language supports this technology through the Gorgonia framework.

Gorgonia works like similar frameworks, for example Tensorflow, allowing the creation of graphs of operations and tensors. There isn't something like Keras for Gorgonia yet, although the Golgi project is promising.

The programming is a little low level, because we have to create the graphs containing the operations and the tensors ourselves, so there is no concept of neuron or layer, as exists in other frameworks.

In this post, I will show a very basic example of a Multilayer Perceptron (MLP) that tries to find the weights approximating the exclusive disjunction function, or XOR.

## Installation

First, we have to install the necessary libraries. Oh, and for starters, your environment has to be Go 1.12 or higher!

```
$ go version
go version go1.13.5 linux/amd64
```

Then, install the dependency package:

```
go get gorgonia.org/gorgonia
```

There are other very useful packages, such as:

- gonum: like the numpy library, but for Go;
- gota: like the pandas library, but for Go.

But in this example I will use only gorgonia.

## The example

I will create a two-layer network, like the model below.

(figure: two-layer network, 2 inputs, a 2-node hidden layer and 1 output)

To simplify things, this model doesn't include bias, which may make the network take a little longer to converge. A better model would include bias nodes, like this:

(figure: the same network with bias nodes added)

We have an input sequence of 4 pairs of numbers: {1,0}, {0,1}, {1,1}, {0,0}, with shape 4x2 (four rows and two columns). That is, two input nodes, with four samples.

For this, we will have a hidden layer of 2 nodes, so we will have a weight matrix with shape 2x2 (two rows and two columns) between the inputs and the hidden layer.

And we have one output node, so we have a weight column with shape 2x1 between the hidden layer and the output.

The expected result of the XOR operation over these inputs would be: {1,1,0,0}.

## Model assembly and training

The example file imports the required libraries (the snippets below use gorgonia's identifiers unqualified, which assumes a dot import of gorgonia.org/gorgonia). I will start with the interesting part, which is creating a struct to represent our neural network model:

```go
type nn struct {
	g       *ExprGraph
	w0, w1  *Node
	pred    *Node
	predVal Value
}
```

This struct contains pointers to the operations graph (g), the weight layer nodes (w0, input/hidden, and w1, hidden/output), the output node (pred) and its value (predVal).

I created a method to return the weight matrices, or learnables: what the model is expected to learn. This makes the backpropagation part much easier:

```go
// learnables returns the weight nodes that training must update.
func (m *nn) learnables() Nodes {
	return Nodes{m.w0, m.w1}
}
```

I also created a factory method to instantiate the neural network:

```go
func newNN(g *ExprGraph) *nn {
	// Create nodes for the weights.
	// dt is a package-level tensor.Dtype (tensor.Float64 in this example).
	w0 := NewMatrix(g, dt, WithShape(2, 2), WithName("w0"), WithInit(GlorotN(1.0)))
	w1 := NewMatrix(g, dt, WithShape(2, 1), WithName("w1"), WithInit(GlorotN(1.0)))
	return &nn{
		g:  g,
		w0: w0,
		w1: w1}
}
```

Here we create two gorgonia matrices, informing their shapes and initializing them with random numbers (using the Glorot algorithm).

We're just creating nodes in the graph! Nothing will actually be executed by gorgonia yet!
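Before the forward pass can be wired up, main() needs an expression graph and a model instance. The post doesn't show this wiring explicitly, so the snippet below is only a minimal sketch of how I assume it fits together (package-level dt, dot import of gorgonia, and the newNN factory above are assumptions):

```go
package main

import (
	. "gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

// dt is the data type used by newNN when creating the weight nodes.
var dt tensor.Dtype = tensor.Float64

func main() {
	g := NewGraph() // empty expression graph; nodes are only declared, not executed
	m := newNN(g)   // adds the w0 and w1 weight nodes to the graph

	_ = m // the input tensors, forward pass, loss, gradients and training loop follow
}
```

Because gorgonia builds the graph lazily, nothing is computed at this point; execution only happens later, when the tape machine runs the graph.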
I created a method for the forward propagation that takes the input node and passes the values across the network:

```go
func (m *nn) fwd(x *Node) (err error) {
	var l0, l1, l2 *Node
	var l0dot, l1dot *Node

	// Input layer
	l0 = x

	// Multiplication by the weights
	l0dot = Must(Mul(l0, m.w0))

	// Sigmoid: input to the hidden layer
	l1 = Must(Sigmoid(l0dot))

	// Multiplication by the weights
	l1dot = Must(Mul(l1, m.w1))

	// Output layer
	l2 = Must(Sigmoid(l1dot))

	m.pred = l2
	Read(m.pred, &m.predVal)
	return nil
}
```

We multiply the inputs by the weights, apply the Sigmoid and move to the hidden layer, repeating the process until we reach the output.

Finally, in the main() function we instantiate our input matrix and our expected-result vector:

```go
// Set input x to network
xB := []float64{1, 0, 0, 1, 1, 1, 0, 0}
xT := tensor.New(tensor.WithBacking(xB), tensor.WithShape(4, 2))
x := NewMatrix(g,
	tensor.Float64,
	WithName("X"),
	WithShape(4, 2),
	WithValue(xT),
)

// Define validation data set
yB := []float64{1, 1, 0, 0}
yT := tensor.New(tensor.WithBacking(yB), tensor.WithShape(4, 1))
y := NewMatrix(g,
	tensor.Float64,
	WithName("y"),
	WithShape(4, 1),
	WithValue(yT),
)
```

Append the forward pass to the graph:

```go
// Run forward pass
if err := m.fwd(x); err != nil {
	log.Fatalf("%+v", err)
}
```

Append the loss function (MSE):

```go
// Calculate cost with MSE
losses := Must(Sub(y, m.pred))
square := Must(Square(losses))
cost := Must(Mean(square))
```

And append the gradient nodes, for backpropagation, to the graph:

```go
// Add gradient nodes w.r.t. the learnables
if _, err = Grad(cost, m.learnables()...); err != nil {
	log.Fatal(err)
}
```

Finally, instantiate a gorgonia virtual machine and a solver, and run the graph:

```go
// Instantiate VM and Solver
vm := NewTapeMachine(g, BindDualValues(m.learnables()...))
solver := NewVanillaSolver(WithLearnRate(0.1))

for i := 0; i < 10000; i++ {
	vm.Reset()
	if err = vm.RunAll(); err != nil {
		log.Fatalf("Failed at iteration %d: %v", i, err)
	}
	solver.Step(NodesToValueGrads(m.learnables()))
	vm.Reset()
}
fmt.Println("\n\nOutput after Training: \n", m.predVal)
```

I repeated the training many times by running the graph with vm.RunAll().

This is the training result:

```
Output after Training:
C[0.6267103873881292  0.6195071561964745  0.47790055401989834  0.3560452019123115]
```

You can create any neural network model with Gorgonia. This is just a kickstart. I didn't worry about regularization or performance, but that's a topic for another post!

Cleuton Sampaio, M.Sc.