[X.com] by @sebkrier
In theory, if you had infinite time, no hobbies, and were a genius at math, you could run a language model by hand: looking up token embeddings in giant tables and doing matrix multiplications with thousands of numbers, applying mathematical functions like softmax at each step. Each word would take days of calculation as you transform vectors through layers of attention and feed-forward networks, all to finally get probabilities for what word should come next. Once you reach that last step and calculate the resulting token, congratulations, you have your analog handwoven slop. In this entire process, where and why would anyone see consciousness arising in any meaningful way?
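To make the "by hand" point concrete, here is a toy sketch of the arithmetic involved in one step: embedding lookup, a single attention head, and a softmax over the vocabulary. All numbers, the 4-word vocabulary, and the 2-dimensional weight matrices are made up for illustration; a real model repeats this across thousands of dimensions and dozens of layers, which is exactly what you'd be grinding through with pencil and paper.

```python
# Toy, hand-checkable version of one transformer step: embed a token,
# run single-head self-attention over a two-token context, and turn
# the result into next-token probabilities via softmax.
# Every weight here is invented purely for illustration.
import math

def matvec(M, v):
    # Plain matrix-vector multiplication, the operation you'd repeat by hand.
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def softmax(xs):
    # Numerically stable softmax: exponentiate and normalize to probabilities.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical 4-token vocabulary with 2-dimensional embeddings.
vocab = ["the", "cat", "sat", "down"]
embed = {"the": [1.0, 0.0], "cat": [0.0, 1.0],
         "sat": [1.0, 1.0], "down": [0.5, 0.5]}

# Made-up 2x2 projection matrices for queries, keys, and values.
W_q = [[1.0, 0.0], [0.0, 1.0]]
W_k = [[0.0, 1.0], [1.0, 0.0]]
W_v = [[1.0, 1.0], [0.0, 1.0]]
# Unembedding: maps the 2-d hidden state to 4 vocabulary logits.
W_out = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]

context = ["the", "cat"]
xs = [embed[t] for t in context]

# Attention for the last position: compare its query against every key.
q = matvec(W_q, xs[-1])
ks = [matvec(W_k, x) for x in xs]
vs = [matvec(W_v, x) for x in xs]
scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(len(q)) for k in ks]
weights = softmax(scores)
hidden = [sum(w * v[d] for w, v in zip(weights, vs)) for d in range(2)]

# Final softmax over logits gives the next-token probability distribution.
logits = matvec(W_out, hidden)
probs = softmax(logits)
print({t: round(p, 3) for t, p in zip(vocab, probs)})
```

Every line is ordinary arithmetic; nothing in the trace is anything other than lookups, multiplications, additions, and exponentials, which is what makes the question above bite.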
Functionalism is weird; the temporal disconnection and the perfectly traceable, mechanical nature of each step make it hard to see where anything like subjective experience would arise. What other interpretations are there?