Can’t LLMs take an insane number of tokens as context now? (I think we’re up to 1M.)
Anywho, he just like me fr
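For what it’s worth, a quick way to sanity-check whether something would actually fit in one of those huge context windows is just to count tokens. A minimal sketch, assuming the tiktoken package and its cl100k_base encoding as a rough stand-in (the exact tokenizer and the 1M figure both depend on the model, so treat the numbers as placeholders):

```python
# Minimal sketch: count tokens and compare against a claimed context window.
# Assumes the `tiktoken` package; the 1_000_000 limit is the figure from the
# thread, not a guarantee for any particular model.
import tiktoken

CONTEXT_WINDOW = 1_000_000  # tokens, per the "up to 1M" claim above


def fits_in_context(text: str, limit: int = CONTEXT_WINDOW) -> bool:
    enc = tiktoken.get_encoding("cl100k_base")  # rough proxy; real models vary
    n_tokens = len(enc.encode(text))
    print(f"{n_tokens} tokens out of {limit}")
    return n_tokens <= limit


if __name__ == "__main__":
    fits_in_context("Can an LLM really hold a whole codebase in context?")
```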
Rust (and Python) developer. Pretty good with the beep-boop computery stuff
Fuck, I kinda wanna make this (for the funny of course)