Train Your Own LLM from Scratch

293 points - today at 4:09 AM


jvican today at 5:22 AM
If you're interested in this resource, I highly recommend checking out Stanford's CS336 class. It covers this curriculum in much more depth and introduces you to a lot of theoretical aspects (scaling laws, intuitions) and systems thinking (kernel optimization/profiling). For that, you have to do the assignments, of course... https://cs336.stanford.edu/
NSUserDefaults today at 6:16 AM
Been doing it since the day I was born. The beginnings were hard but I’m getting there.
JoeDaDude today at 7:18 AM
Coincidentally, I just started on Build a Large Language Model (From Scratch), a repo/book/course by Sebastian Raschka [0][1][2]. Maybe it's a good problem to have: having to decide which learning resource to use.

[0] https://github.com/rasbt/LLMs-from-scratch

[1] https://www.manning.com/books/build-a-large-language-model-f...

[2] https://magazine.sebastianraschka.com/p/coding-llms-from-the...

Miles_Stone today at 11:53 AM
This is a really interesting direction. Thanks for sharing!
antirez today at 7:16 AM
Context: he is one of the MLX developers, a skilled ML researcher.
y42 today at 9:37 AM
shameless plug:

A series of Jupyter notebooks explaining the whole machine learning mechanism, from the beginning

https://github.com/nickyreinert/DeepLearning-with-PyTorch-fr...

and of course also how to build an LLM from scratch

https://github.com/nickyreinert/basic-llm-with-pytorch/blob/...

kriro today at 7:30 AM
I did it back in the day when fast.ai was relatively new, with ULMFiT. This must have been when BERT was SOTA. The architecture lets you train a base model and then specialize it with a head. I used the entire Wikipedia for the base and then some GBs of tweets I had collected through the firehose. I had access to a lab with 20 game-dev computers, must have been roughly RTX 2080s. One training cycle took about half a day for the tokenized Wikipedia, so I hyperparameter-tuned by running a different setting on each computer and then moving on with the winner as the starting point for the next day. It was always fun to come to work the next morning and check the results.

The engineering was horrible and very ad-hoc but I learned a lot. Results were ok-ish (I classified tweets) but it gave me a good perspective on the sheer GPU power (and engineering challenges) one would need to do this seriously. I didn't fully grasp the potential of generating output but spent quite some time chuckling at generated tweets (was just curious to try it).

ofsen today at 6:46 AM
This looks like an exact copy of this Andrej Karpathy video ( https://youtu.be/kCc8FmEb1nY ) but in written format. Am I wrong?
fabian_shipamax today at 9:36 AM
If anyone is interested, I am giving short courses with a walkthrough on how to train your LLM from scratch via AI Study Camp.
steveharing1 today at 7:15 AM
The documentation is helpful enough to get started.
hiroakiaizawa today at 5:48 AM
Nice. What scale does this realistically reach on a single machine?
iamnotarobotman today at 4:55 AM
This looks great for a first introduction to training LLMs, and it looks simple enough to try this locally. Great job!
baalimago today at 5:28 AM
Train your LM from scratch*

I doubt you have a machine big enough to make it "Large".
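For a rough sense of why "Large" is out of reach locally, here's a back-of-envelope sketch. The ~16 bytes/parameter figure is an assumption for mixed-precision Adam (fp16 weights and gradients, fp32 master weights plus two optimizer moments); activation memory is ignored entirely:

```python
def training_memory_gb(n_params, bytes_per_param=16):
    # Assumed mixed-precision Adam footprint: ~16 bytes per parameter
    # (2 fp16 weights + 2 fp16 grads + 4 fp32 master + 4 + 4 for the
    # two Adam moments). Activations and framework overhead excluded.
    return n_params * bytes_per_param / 1e9

# A 7B-parameter model: ~112 GB just for weights, gradients and
# optimizer state -- far beyond any single consumer GPU.
print(training_memory_gb(7e9))
```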

DeathArrow today at 9:22 AM
I would start with linear algebra, some calculus and statistics, and understand how a neural network - which really is just one type of ML - works, then learn the basics of CNNs and RNNs, then transformers and LLMs.

But that's just me. I think it's more useful to understand the how and why before training an LLM.
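The "understand how a neural network works" step can be condensed into a toy example: a two-layer network trained on XOR with backprop written out by hand. This is a hypothetical NumPy sketch to illustrate the idea, not code from any of the linked resources:

```python
import numpy as np

# Tiny two-layer network learning XOR, with the forward pass,
# chain-rule backward pass, and gradient-descent update all explicit.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses, lr = [], 0.5
for step in range(5000):
    # forward pass
    h = np.tanh(X @ W1 + b1)          # hidden activations
    p = sigmoid(h @ W2 + b2)          # predicted probabilities
    losses.append(np.mean((p - y) ** 2))
    # backward pass (chain rule, by hand)
    dp = 2 * (p - y) / len(X)         # d(loss)/d(p)
    dz2 = dp * p * (1 - p)            # through the sigmoid
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T
    dz1 = dh * (1 - h ** 2)           # through the tanh
    dW1 = X.T @ dz1; db1 = dz1.sum(0)
    # gradient-descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

Everything a framework autograd engine does is already here in miniature, which is why this is a useful stop before CNNs, RNNs, and transformers.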

yjaspar today at 7:58 AM
That’s actually super interesting
rithdmc today at 9:21 AM
I know it's a bit of a joke, but "I Built a Neural Network from Scratch in SCRATCH" gave me, a complete outsider, a lot of insight into how neural networks work.

https://www.youtube.com/watch?v=5COUxxTRcL0
