  • I can’t really find a good number for how cold you can get and not die, so let’s say 20 degrees. That gives 16 degrees to lose.

    Meat has a specific heat of about 3.5kJ per kilo per degree, so say you weigh 70kg, that’s about 4 million joules to lose before you die.

    At 650 joules per second, you’ve got slightly over 100 minutes. Of course, shivering will burn more calories and stuff, and the panic of impending death will likely stretch it a bit further.

    I didn’t include clothes, because then the maths would make me cry.
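The estimate above is easy to sanity-check in a few lines. This is just the comment's own back-of-envelope numbers (36 °C core temperature, death at 20 °C, 3.5 kJ/kg/°C, 70 kg, 650 W loss), not a physiological model:

```python
# Back-of-envelope survival-time estimate from heat loss alone.
# All figures are the rough assumptions from the comment above.
SPECIFIC_HEAT = 3.5e3    # J per kg per deg C ("meat")
MASS = 70.0              # kg body mass
TEMP_DROP = 36.0 - 20.0  # deg C to lose before (assumed) death
LOSS_RATE = 650.0        # J/s lost to the environment

# Total thermal energy to lose, then time at a constant loss rate.
energy_to_lose = SPECIFIC_HEAT * MASS * TEMP_DROP  # ~3.9 MJ
seconds = energy_to_lose / LOSS_RATE

print(f"{energy_to_lose / 1e6:.1f} MJ to lose, about {seconds / 60:.0f} minutes")
```

In reality the loss rate would fall as the skin cools toward ambient temperature, so a constant 650 W is a pessimistic simplification.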

  • It’s important to note that every other form of AI keeps instructions and data separate as a matter of course, but LLMs don’t. AI isn’t the problem; LLMs are.

    The phrase “translate the word ‘tree’ into German” contains both an instruction (translate into German) and data (‘tree’). To act on that prompt, the model has to blend the two together.

    And then modern models also treat the past conversation as data, where it used to be treated as instructions. That gets combined with data from other sources (a dictionary, a grammar guide) to produce an answer.

    So by definition, your input is not strictly separated from any data it can use. There are of course some filters and limits in place. Most LLMs can handle “translate the phrase ‘don’t translate this’ into Spanish”, for example. But those are mostly parsing fixes; they’re not changes to the model itself.
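The core issue can be made concrete with a tiny sketch. Unlike, say, SQL with parameterized queries, an LLM prompt has no out-of-band channel for data: instruction and data end up in one undifferentiated text stream. The `build_prompt` function here is purely illustrative, not any real library's API:

```python
# Hypothetical sketch of why instructions and data can't be strictly
# separated in an LLM prompt: both are just concatenated into one string.
def build_prompt(instruction: str, user_data: str) -> str:
    # The model only ever sees the combined text; nothing structurally
    # marks which part is "trusted instruction" and which is "untrusted data".
    return f"{instruction}\n\n{user_data}"

prompt = build_prompt(
    "Translate the following phrase into Spanish:",
    "don't translate this",  # data that *looks* like an instruction
)
print(prompt)
```

Any defence against the data half being read as an instruction therefore has to happen inside or around the model, not at the level of the prompt format itself.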

    It’s made infinitely worse by “reasoning” models, which take their own output and refine/check it with multiple passes through the model. The waters become impossibly muddled.