3 Comments
Feb 28 · Liked by Jon Stokes

Here is a link to a TechCrunch article with the quote:

"According to OpenAI co-founder and CEO Sam Altman, ChatGPT’s operating expenses are “eye-watering,” amounting to a few cents per chat in total compute costs."

https://techcrunch.com/2023/02/01/openai-launches-chatgpt-plus-starting-at-20-per-month/
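
As a rough back-of-envelope illustration of how "a few cents per chat" adds up (a sketch only: the per-chat cost is the quoted figure, but the daily chat volume below is a purely hypothetical number, not anything from the article):

```python
# Back-of-envelope sketch of why "a few cents per chat" is eye-watering at scale.
# cost_per_chat_usd follows the quote; the daily volume is hypothetical, for illustration only.

cost_per_chat_usd = 0.03                  # "a few cents" per chat (assumed midpoint)
hypothetical_chats_per_day = 10_000_000   # illustrative volume, not a real figure

daily_cost = cost_per_chat_usd * hypothetical_chats_per_day
print(f"Daily compute cost under these assumptions: ${daily_cost:,.0f}")
# -> Daily compute cost under these assumptions: $300,000
```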

Feb 25 · Liked by Jon Stokes

Some people in neuroscience think the hippocampus functions as a sequence model / generator, in a way that seems related to causal LM objectives: https://www.cell.com/trends/cognitive-sciences/fulltext/S1364-6613(18)30166-9

The details look pretty different, though, so it doesn't seem biologically plausible that the hippocampus *is* a transformer.
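
For readers unfamiliar with the term, a causal LM objective is just next-token prediction over a sequence. Here is a minimal sketch of that loss, using a toy bigram-style model (not a transformer, and certainly not a claim about how the hippocampus works):

```python
# Minimal sketch of a causal LM objective: predict the next token at each position.
import torch
import torch.nn as nn

vocab_size, embed_dim = 100, 32
# Toy model: each token's embedding directly produces logits for the next token.
model = nn.Sequential(
    nn.Embedding(vocab_size, embed_dim),
    nn.Linear(embed_dim, vocab_size),
)

tokens = torch.randint(0, vocab_size, (1, 16))  # a toy token sequence
logits = model(tokens[:, :-1])                  # predictions made from positions 0..14
targets = tokens[:, 1:]                         # the actual "next token" at each position

# Cross-entropy between predicted next-token distributions and the real next tokens
loss = nn.functional.cross_entropy(
    logits.reshape(-1, vocab_size), targets.reshape(-1)
)
print(f"causal LM loss: {loss.item():.3f}")
```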


Your comparison of LLMs and brains is a nice attempt to highlight a dimension most discussion seems to ignore. However, it seems hard to assign probabilities to the options: they don't feel evenly weighted, but I'm not aware of much research that would let us say which of them are more likely. Some models of cognition (like Friston's free energy principle or global workspace theory) do bear on these weights, but it's not immediately obvious to me in what way. Hoel's forthcoming book might also be useful.
