A collection of from-scratch implementations to master Deep Learning and Large Language Model fundamentals.
This repository hosts educational projects designed to demystify modern AI. By building core components from the ground up, we aim to understand what happens under the hood of libraries like PyTorch.
| Project | Folder | Description |
|---|---|---|
| Michigrad | `Michigrad-from-scratch/` | A lightweight scalar-valued autograd engine implementing dynamic computational graphs and backpropagation. |
| LLMs | `Llms-from-scratch/` | From tokenization and n-grams to self-attention and Transformer architectures. |
Python 3.8+ is recommended.
```bash
git clone https://github.com/igna-s/Michigrad-Autograd-Engine.git
cd Michigrad-Autograd-Engine
```
```python
from michigrad.engine import Value  # Michigrad's scalar autograd node

# Build a small computational graph of scalar Values.
a = Value(2.0, label='a')
b = Value(-3.0, label='b')
c = Value(10.0, label='c')
e = a * b    # e = -6.0
d = e + c    # d = 4.0
L = d * 2.0  # L = 8.0

# Backpropagate: fills in .grad for every node in the graph.
L.backward()

print(f"Loss: {L.data}")   # Loss: 8.0
print(f"dL/da: {a.grad}")  # dL/da = 2*b = -6.0
```
To explore the LLM notebooks:

```bash
cd Llms-from-scratch
jupyter notebook
```
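The notebooks progress from tokenization and n-gram counting up to attention. As a taste of the end point, here is a minimal single-head scaled dot-product attention in NumPy; the function and weight names are hypothetical, and this is a sketch of the mechanism rather than the notebooks' own code:

```python
import numpy as np

# Minimal single-head scaled dot-product attention (illustration only).
def self_attention(x, Wq, Wk, Wv):
    """x: (seq_len, d_model); Wq, Wk, Wv: (d_model, d_head)."""
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])         # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # (seq_len, d_head)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))                  # 4 tokens, d_model = 8
Wq, Wk, Wv = (rng.normal(size=(8, 4)) for _ in range(3))
print(self_attention(x, Wq, Wk, Wv).shape)   # (4, 4)
```

Each output row is a weighted average of the value vectors, with weights derived from query-key similarity; stacking this with projections, feed-forward layers, and residual connections yields a Transformer block.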
Developed as part of a Large Language Models Workshop.
- Joaquín Bogado (GitHub: @jwackito)
- Inspired by Andrej Karpathy's micrograd
Created with ❤️ to learn AI by breaking things.