AI Content Generation, Part 1: Machine Learning Basics
Borges's Library of Babel is my favorite treatment of the “what does it mean to say that numbers really exist?” question.
Jon, the real-ness of numbers reminded me of the twins in Oliver Sacks' "The Man Who Mistook His Wife for a Hat"... The relevant section is posted here:
https://empslocal.ex.ac.uk/people/staff/mrwatkin/isoc/twins.htm
Some are skeptical of the story... But there have got to be patterns we humans don't (yet?) perceive...
For those looking for more on the concept of latent space:
detailed:
https://towardsdatascience.com/understanding-latent-space-in-machine-learning-de5a7c687d8d
and ofc:
https://en.wikipedia.org/wiki/Latent_space
Thanks, Jon, for the useful framework. I’m stoked for more of the series and to use it as inspiration / guide to get my hands dirty on some of these models. Super relevant content.
Great primer. This might be interesting for you or your readers: https://github.com/rickhull/backprop
It covers the basics of neural nets, backpropagation, and gradient descent, along with minimal and simple Ruby code, based on Andrej Karpathy's micrograd.
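For anyone curious what a micrograd-style core looks like, here is a minimal Ruby sketch of the idea (my own illustrative code, not the backprop gem's actual API — the `Value` class and `backward_fn` name are assumptions): each scalar remembers its inputs and a local rule for pushing gradients backward, and `backward` replays those rules in reverse topological order.

```ruby
# Minimal scalar reverse-mode autodiff, in the spirit of micrograd.
class Value
  attr_accessor :data, :grad, :children, :backward_fn

  def initialize(data, children = [])
    @data = data
    @grad = 0.0
    @children = children
    @backward_fn = -> {} # leaves have nothing to propagate
  end

  def +(other)
    out = Value.new(@data + other.data, [self, other])
    out.backward_fn = lambda do
      self.grad  += out.grad # d(a+b)/da = 1
      other.grad += out.grad # d(a+b)/db = 1
    end
    out
  end

  def *(other)
    out = Value.new(@data * other.data, [self, other])
    out.backward_fn = lambda do
      self.grad  += other.data * out.grad # d(a*b)/da = b
      other.grad += self.data * out.grad  # d(a*b)/db = a
    end
    out
  end

  # Topologically sort the graph, then apply each node's local
  # backward rule from the output down to the inputs.
  def backward
    topo = []
    visited = {}
    visit = lambda do |v|
      next if visited[v]
      visited[v] = true
      v.children.each { |c| visit.call(c) }
      topo << v
    end
    visit.call(self)
    @grad = 1.0
    topo.reverse_each { |v| v.backward_fn.call }
  end
end

# Example: f = a*b + a, so df/da = b + 1 and df/db = a.
a = Value.new(2.0)
b = Value.new(3.0)
f = a * b + a
f.backward
# a.grad is now 4.0, b.grad is 2.0
```

A gradient descent step is then just `a.data -= learning_rate * a.grad` for each parameter, after zeroing the grads between steps.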
Thank you, Jon. I'm eager for the next one.
Great read. I wrote a small piece on the creator economy here: https://clarioncall.substack.com/p/the-book-that-predicted-the-creator