I recently went through a very detailed and well-explained Python-based project/lesson by karpathy called micrograd. It is a tiny scalar-valued autograd engine with a neural net on top of it, and his video explains how to build such a network from scratch: https://www.youtube.com/watch?v=VMj-3S1tku0

The project above is, as expected, written in Python. For learning purposes, I wanted to see how such a network might be implemented in TypeScript, and came up with the 🤖 micrograd-ts repository (and also with a demo of how the network may be trained).

Trying to build something on your own very often gives you a much better understanding of the topic. So this was a good exercise, especially taking into account that the whole thing is just ~200 lines of TypeScript code with no external dependencies.

The micrograd-ts repository might be useful for those who want to get a basic understanding of how neural networks work, using a TypeScript environment for experimentation.

With that being said, let me give you some more information about the project.

## Project structure

- `micrograd/` — this folder is the core/purpose of the repo
  - `engine.ts` — the scalar `Value` class that supports basic math operations like `add`, `sub`, `div`, `mul`, `pow`, `exp`, `tanh`, and has a `backward()` method that calculates the derivative of the expression, which is required for the back-propagation flow.
  - `nn.ts` — the `Neuron`, `Layer`, and `MLP` (multi-layer perceptron) classes that implement a neural network on top of the differentiable scalar `Value`s.
- `demo/` — a demo React application to experiment with the micrograd code
  - `src/demos/` — several playgrounds where you can experiment with the `Neuron`, `Layer`, and `MLP` classes.

## Micrograd

See the 🎬 "The spelled-out intro to neural networks and back-propagation: building micrograd" YouTube video (shared above) for a detailed explanation of how neural networks and backpropagation work. The video also explains in detail what the `Value`, `Neuron`, `Layer`, and `MLP` classes do.
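To make the `Value` idea more concrete, here is a heavily simplified sketch of what such a scalar autograd class can look like. This is an illustrative assumption, not the actual `engine.ts` code: it only implements `add`, `mul`, and `backward()`, while the real class supports more operations.

```typescript
// Minimal sketch of a scalar autograd value (simplified; names hypothetical).
class Value {
  data: number;
  grad = 0;
  private children: Value[];
  private backwardStep: () => void = () => {};

  constructor(data: number, children: Value[] = []) {
    this.data = data;
    this.children = children;
  }

  add(other: Value): Value {
    const out = new Value(this.data + other.data, [this, other]);
    out.backwardStep = () => {
      // d(a + b)/da = 1 and d(a + b)/db = 1, scaled by the upstream gradient
      this.grad += out.grad;
      other.grad += out.grad;
    };
    return out;
  }

  mul(other: Value): Value {
    const out = new Value(this.data * other.data, [this, other]);
    out.backwardStep = () => {
      // d(a * b)/da = b and d(a * b)/db = a, scaled by the upstream gradient
      this.grad += other.data * out.grad;
      other.grad += this.data * out.grad;
    };
    return out;
  }

  backward(): void {
    // Topologically sort the graph, then apply the chain rule in reverse.
    const topo: Value[] = [];
    const visited = new Set<Value>();
    const build = (v: Value): void => {
      if (visited.has(v)) return;
      visited.add(v);
      v.children.forEach(build);
      topo.push(v);
    };
    build(this);
    this.grad = 1; // d(out)/d(out) = 1
    for (const v of topo.reverse()) v.backwardStep();
  }
}

// Computation graph for a * b + c:
const a = new Value(2);
const b = new Value(3);
const c = new Value(4);
const out = a.mul(b).add(c);
out.backward();
console.log(out.data, a.grad, b.grad, c.grad); // 10 3 2 1
```

The key design point (also used by the real engine) is that every operation returns a new node that remembers its inputs and a small closure describing its local derivative, so `backward()` can replay the whole expression in reverse.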
Briefly, the `Value` class allows you to build a computation graph for an expression that consists of scalar values. Here is what the computation graph for the expression `a * b + c` looks like:

Based on the `Value` class, we can build a `Neuron` expression `X * W + b`. Here we're simulating a dot-product of the matrix `X` (input features) and the matrix `W` (neuron weights):

Out of `Neuron`s, we can build the `MLP` network class, which consists of several `Layer`s of `Neuron`s. The computation graph in this case may be a bit too complex to display here, but a simplified version might look like this:

The main idea is that the computation graphs above "know" how to do automatic back-propagation (in other words, how to calculate derivatives). This allows us to train the MLP network for several epochs and adjust the network weights in a way that reduces the ultimate loss:

## Demo (online)

To see the online demo/playground, check the following link: 🔗 trekhleb.dev/micrograd-ts

## Demo (local)

If you want to experiment with the code locally, follow the instructions below.

### Setup

1. Clone the repo locally.
2. Switch to the demo folder: `cd ./demo`
3. Set up Node.js v18 using nvm (optional): `nvm use`
4. Install dependencies: `npm i`
5. Launch the demo app: `npm run dev`

The demo app will then be available at http://localhost:5173/micrograd-ts

## Playgrounds

Go to the `./demo/src/demos/` folder to explore several playgrounds for the `Neuron`, `Layer`, and `MLP` classes.

I hope that playing around with the micrograd-ts code above and watching Karpathy's video will be helpful to at least some of you, learners.
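As a closing aside, the training idea mentioned above — compute a loss, back-propagate, then nudge the weights against the gradient — can be sketched even without the autograd engine. The following hypothetical example trains a single linear neuron `y = w * x + b` with hand-derived gradients (micrograd-ts derives these automatically via `backward()`); all names and data here are made up for illustration:

```typescript
// Gradient descent on y = w * x + b with hand-written derivatives.
// Target data follows y = 2x + 1, so w should approach 2 and b approach 1.
const xs = [1, 2, 3, 4];
const ys = [3, 5, 7, 9];

let w = 0;
let b = 0;
const lr = 0.05; // learning rate

for (let epoch = 0; epoch < 1000; epoch++) {
  let dw = 0;
  let db = 0;
  for (let i = 0; i < xs.length; i++) {
    const pred = w * xs[i] + b;
    const err = pred - ys[i]; // loss = mean of err^2 over the dataset
    dw += (2 * err * xs[i]) / xs.length; // d(loss)/dw
    db += (2 * err) / xs.length;         // d(loss)/db
  }
  // Step against the gradient to reduce the loss.
  w -= lr * dw;
  b -= lr * db;
}

console.log(w, b); // close to 2 and 1
```

Writing the two derivative lines by hand is easy for one neuron, but it quickly becomes unmanageable for a multi-layer network — which is exactly the bookkeeping that the `Value` computation graph automates.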