

The things Samsung has to resort to, to be able to afford RAM for its phones after Samsung cut Samsung off from its supply.


These services usually have the ability to debit whatever your bill is, and then suddenly their system fucks up, or you get hacked and someone commits fraud, and before you know it a $5,000 payment comes out of your account instead of the expected $30.
It's better to have that set up on a credit card, so that if something happens you have a much better chance to dispute it.


That's why any good terms of service has a severability clause: if any part of the agreement is deemed unenforceable or illegal, the rest of the terms remain intact. I guess at some point in time, people were getting entire documents thrown out based on one thing.


I hadn't really thought about how the furniture is arranged. I wonder if that's something they sell to designers so they can see what's trending. Some of them don't use cameras but lidar instead, yet even an overall shape of things would seem useful to a designer.


Your D:\ has been deleted. I'm very sorry.


We have such copious amounts of excess power, but it's all in off-peak times. We need to build batteries or other storage methods so we can capture it during off-peak hours for use during peak hours. It also helps stabilize and strengthen the grid.
We should force these data centers to help foot the bill for that instead of doing the stupid shit they're doing, like portable generators, bringing coal plants back online, and whatnot.


In this case wouldn't it be the leopard eating itself?
A nice roasted tail, maybe?


Australia isn’t the greatest spot to run a data centre in general in terms of heat, but I do understand the need for sovereign data centres, so this obviously can’t work everywhere.
What makes you think $3.5 million can't be profitable? A mid-sized hospital's heating bill can get into the many hundreds of thousands, or even into the millions, especially if it's in a colder environment. A 5-6 year payback on that wouldn't be terrible and would be worth an upfront investment. Even a 10-year payback isn't terrible.
These colder locations are the ideal locations for the data centres in the first place, because they generally want a cooler climate to begin with, so they will gravitate to them when possible.
Edit: And if you build a data centre with this ability to recoup heat, you could start building further commercial things in the area and keep the heat redistribution very close. You don't need to travel very long distances. You do need to put some thought into where they go, though, and what's around or will be built around them.
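To put rough numbers on that payback claim, here's a minimal back-of-envelope sketch. The $3.5M figure is from the discussion above; the annual heating bill is an assumed illustrative number, not from any real hospital.

```python
# Back-of-envelope simple payback on a heat-recovery build-out.
# The upfront cost is the $3.5M figure discussed above; the annual
# heating bill is an assumed "many hundreds of thousands" figure.
upfront_cost = 3_500_000
annual_heating_bill = 600_000  # assumed

payback_years = upfront_cost / annual_heating_bill
print(f"Simple payback: {payback_years:.1f} years")  # ~5.8 years
```

At a $600k/year bill that lands right in the 5-6 year range; a $350k/year bill would stretch it to the 10-year case, which is still within the "isn't terrible" window.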


For the record, my stance triggering this large comment chain was based on what OP wrote about AI:
AI is a crutch for dumb people.
I never said you had to like it, or that not liking it makes you an idiot.
If you want to say I was calling people who say "AI is a crutch for dumb people" idiots, I'll accept that accusation.


What does Samsung's memory division think is going to happen to their phone division if they won't sell them RAM? Wow.


I wonder if they realized there weren't enough GPUs, so they decided "let's just build a massive RAM/CPU farm" to do the job at 1/100th the speed and waste money on the inefficiency.


Are you fucking kidding me? Holy fucking hell.


I just wanted to add one other thing on the hardware side.
These H200s are power hogs, no doubt about it. But the next generation, H300 or whatever it is, will be more efficient as the process node (or whatever it's called) gets smaller and the hardware is optimized to run things faster. I could still see NVIDIA coming out and charging more $/FLOP, or whatever the comparison would be, even if it is more power efficient.
But that could mean that the electricity costs to run these models start to drop if they truly have plateaued. We might not be following Moore's law on this anymore (I don't actually know), but we're not completely stagnant either.
So IF we are plateaued on this one aspect, then costs should start coming down in future years.
Edit: but they are locking in a lot of overhead costs at today's prices, which could ruin them.
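As a toy illustration of that point (every figure here is made up for the sketch): if the workload plateaus, a more power-efficient chip cuts the electricity bill for the same number of tokens, even if the chip itself costs more per FLOP.

```python
# Toy model: electricity cost to serve a fixed workload falls as
# energy per token improves. All numbers are assumptions.
def electricity_cost(tokens_millions, joules_per_token, price_per_kwh=0.08):
    """Dollar cost of electricity to generate the given tokens."""
    kwh = tokens_millions * 1e6 * joules_per_token / 3.6e6  # joules -> kWh
    return kwh * price_per_kwh

gen_now  = electricity_cost(tokens_millions=1000, joules_per_token=2.0)  # assumed
gen_next = electricity_cost(tokens_millions=1000, joules_per_token=1.2)  # assumed ~40% better

print(round(gen_now, 2), round(gen_next, 2))  # 44.44 26.67
```

Same billion tokens, ~40% lower power bill; that saving exists regardless of what NVIDIA charges per FLOP, which is the separate capex question.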


My point is it’s not early anymore. We are near or past the peak of LLM development.
I think we’re just going to have to agree to disagree on this part.
I’ll agree though that IF what you’re saying is true, then they won’t succeed.


You > the trends show that very few want to pay for this service.
Me > These companies have BILLIONS in revenue and millions of customers, and you’re saying very few want to pay
Me > … but you're making it sound like no one wants it
You > … That’s all in your head, mate. I never said that nor did I imply it.
Pretty sure it’s not all in my head.
The heat example was just one small example of things these large data centers (not just AI ones) can do to help lower costs, and it's a real thing that's being considered. It's not a solution to their power-hungry needs, but it is a small step forward in how we can do things better.
https://www.bbc.com/news/articles/cew4080092eo
1Energy said 100 gigawatt hours of energy would be generated through the network each year, equivalent to the heat needed for 20,000 homes.
Edit: Another that is in use: https://www.itbrew.com/stories/2024/07/17/inside-the-data-center-that-heats-up-a-hospital-in-vienna-austria
This system “allows us to cover between 50% and 70% of the hospital’s heating demand, and save up to 4,000 tons of CO2 per year,” he said, also noting that “there are virtually no heat losses” since “the connecting pipe is quite short.”
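Quick sanity check on those BBC figures, just a unit conversion using the article's own numbers:

```python
# 100 GWh/year of heat spread across 20,000 homes -> heat per home per year
total_heat_gwh = 100
homes = 20_000

mwh_per_home = total_heat_gwh * 1_000 / homes  # GWh -> MWh
print(f"{mwh_per_home:.0f} MWh per home per year")  # 5 MWh = 5,000 kWh
```

5,000 kWh of heat per home per year is a plausible order of magnitude for household heating demand, so the "20,000 homes" claim holds up arithmetically.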


If your stance is
AI is a crutch for dumb people.
You're right on track to be that 70-year-old raising his cane in the air, ranting about the useless AI stuff going on, and now you can't figure out how to get your social security check because it uses that new AI-based system.


Why on earth do you think things can’t be optimized on the LLM level?
There are constant improvements being made there; they are not in any way, shape, or form fully optimized yet. Go follow the /r/LocalLlama sub, for example: there are constant breakthroughs happening, and then a few months later you see an LLM utilizing them come out, and they're suddenly smaller, or you can run a larger model in a smaller memory footprint, or you can get a larger context on the same hardware, etc.
This is all so fucking early; to be so naive or ignorant as to think that they're as optimized as they can get is hilarious.


These companies have BILLIONS in revenue and millions of customers, and you’re saying very few want to pay…
The money is there; they just need to optimize the LLMs to run more efficiently (this is continually progressing), and on the hardware side, work on reducing hardware costs as well (including electricity usage / heat generation). If OpenAI can build a datacenter that re-uses all its heat, for example to heat a hospital nearby, that's another step towards reaching profitability.
I'm not saying this is an easy problem to solve, but you're making it sound like no one wants it and they can never do it.


That's definitely a UK/EU thing then. If you get a $5,000 cellphone bill in NA because someone did long-distance fraud and you have pre-authorized debits set up, $5,000 is coming out of your account in Canada and the USA.
Edit: assuming you have $5k and/or have overdraft on the account. Not sure what happens if you have less than $5k and no overdraft. I don't know if it'd take you to $0, or fail and charge you an insufficient funds fee.