7/10. Too many multisyllabic words to be a Trump post.
I have seen SQL written by professional Oracle DBAs. What I learned is that I do not want to look at SQL written by professional Oracle DBAs.
On delivery, not when they’re out in the field. Where would they even put a whole slaughtered cow? Especially with all the salt they’d need in pre-refrigerated times. Even with a wagon, that’s not an efficient way to bring food with you.
I have the most boring “this edible ain’t shit” story. Tried a bunch, and they just put me to sleep for a good 10 hours. I think they had a lot of CBD in them. They were sold as sativa. While research says there’s no difference between sativa and indica, I suspect things labeled as one or the other tend to have other things in them, like CBD, that make them live up to their reputation. There probably is no difference when you prepare them all the same under lab conditions.
Don’t start like that. Do your research, and start with a low dose and work your way up.
Later, on stuff labeled indica, I started having light forms of ego death. Not the melting into the universe that you might get from proper hallucinogens, but more like a feeling of becoming part of my environment. My tolerance levels are too high now to get that sort of thing, though.
Are you under the impression that cowboys eat the herd they’re driving? No, those are for other people.
Cowboys were quite literally in the middle of nowhere with nothing but beans. What’s the British excuse? “Oh, my great-grandmother cooked this in the Blitz, and we forgot rationing ended decades ago.”
Zombicide has entered the chat.
Except people take that method seriously all the time. All the objections I raised were ones people have actually thrown back at me.
I’m tons of fun at parties. Everybody loves seeing my collection of bottles from defunct soda companies.
You know, maybe we shouldn’t be taking estimation advice from a 1980s science fiction movie, especially when that advice amounts to a systematic method of lying.
Yes, I’ve used it before. Yes, you can hope everything averages out in the end. Yes, project managers demand estimates. None of these is a good defense of something so fundamentally flawed.
Also, in practice, they’re usually only good at one or two of the things on the list (at best) and hack their way through the rest. As much as people make fun of overspecialization, it happens in every field for a reason.
Neither is all that great in practice.
Gopher has many problems as a protocol. The original versions of HTTP had much the same problems, such as signaling the end of a transfer by closing the connection rather than with a length header or an explicit end-of-response marker. HTTP went on to fix most of those problems, but Gopher never got the chance. Gopher+ started fixing things up, but it was a victim of bad timing: the Mosaic browser was released shortly after Gopher+, and everyone started switching over. To my knowledge, nobody has ever implemented Gopher+ in either a client or a server. Not even after over 20 years of a “revival” movement.
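To make the close-on-EOF problem concrete, here’s a minimal sketch (assumptions: plain TCP with no TLS, placeholder host/path arguments, a server that actually sends Content-Length, and no real error handling). The Gopher client literally cannot tell “server finished” from “connection dropped”; the HTTP client can.

```python
import socket

def gopher_fetch(host: str, selector: str = "") -> bytes:
    """Gopher (RFC 1436): send a selector, then read until the server
    closes the connection. With no length header, a connection dropped
    mid-transfer is indistinguishable from a complete response."""
    with socket.create_connection((host, 70)) as s:
        s.sendall(selector.encode() + b"\r\n")
        chunks = []
        while data := s.recv(4096):
            chunks.append(data)
        return b"".join(chunks)  # complete? truncated? no way to tell

def http_fetch(host: str, path: str = "/") -> bytes:
    """HTTP/1.1: Content-Length says how many body bytes to expect,
    so truncation is detectable. (Sketch assumes the server replies
    with Content-Length rather than chunked encoding.)"""
    request = f"GET {path} HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    with socket.create_connection((host, 80)) as s:
        s.sendall(request.encode())
        buf = b""
        while b"\r\n\r\n" not in buf:
            buf += s.recv(4096)
        head, _, body = buf.partition(b"\r\n\r\n")
        length = next(
            int(line.split(b":", 1)[1])
            for line in head.split(b"\r\n")
            if line.lower().startswith(b"content-length:")
        )
        while len(body) < length:
            chunk = s.recv(4096)
            if not chunk:
                raise IOError("truncated: connection closed early")
            body += chunk
        return body[:length]
```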
Gemini intentionally limits things, such as not having inline images. The idea is to keep out mechanisms that have historically been used to track users, but tracking doesn’t work that way. I can just as easily send my logs to a data broker without using a tracking pixel if that’s what I want to do.
In the end, you can just use HTTP with a static web page, zero cookies, and no JavaScript. That’s what I ended up doing for my old blog (after offering a Gemini version for a while), including converting a bunch of YouTube <iframe> tags to linked screenshots so you don’t even get YouTube cookies.
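The conversion is roughly this shape; the video ID and screenshot path here are placeholders, not my actual markup:

```html
<!-- Before: embedding YouTube's player sets YouTube cookies on page load -->
<iframe src="https://www.youtube.com/embed/VIDEO_ID"></iframe>

<!-- After: a locally hosted screenshot wrapped in a plain link; no request
     goes to YouTube until the reader actually clicks through -->
<a href="https://www.youtube.com/watch?v=VIDEO_ID">
  <img src="/images/video-screenshot.png" alt="Watch on YouTube">
</a>
```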
Attempting to replace people in the workplace without changing society so that people can live without work.
Companies are expected to make money, not revolutionize the world.
I’d like to believe that, but I don’t think investors have caught on yet. That’s where the day of reckoning will come.
AI is a field that’s gone through boom and bust cycles before. The 1960s were a boom era for the field, and the money largely came from the DoD via DARPA. This was awkward for a lot of the university pre- and post-grads in AI at the time, as they were often part of the anti-war movement. Then the anti-war movement started to win and the public turned against the Vietnam War. That, in turn, caused the DARPA money to dry up, and it wasn’t replaced with anything from elsewhere in the government. The result was an AI winter.
Just to be clear, I like AI as a field of research. I don’t at all like what capitalism is doing with it. But what did we get from that time of huge AI investment? Some things that can be traced directly back to it are optimizing compilers, virtual memory, Unix, and virtual environments. Computing today would look entirely different without it. We may have eventually invented those things otherwise, but it would have taken much, much longer.
. . . with 10% increase in performance rather than 50 or 60% like we really need
Why is this a need? The constant push for better and better has not been healthy for humanity or the planet. Exponential growth was always going to hit a ceiling. The limit on Moore’s Law has been more economic than any hard physical cap on packing in transistors.
We still don’t have the capability to play games in full native 4K 144 Hertz. That’s at least a decade away
Sure you can, today, and this is why:
So many gaming companies are incapable of putting out a successful AAA title because . . .
Regardless of the reasons, the AAA space is going to have to pull back. Which is perfectly fine by me, because their games are trash. Even the good ones are often filled with microtransaction nonsense. None of them have innovated anything in years; that’s all been done at the indie level. Which is where the real party is at.
Would it be so bad if graphics were locked at the PS4 level? Comparable hardware can run some incredible games drawing on 50 years of development. We’re not even close to exhausting the new types of games that could run on that. Planet X2 is a recent RTS game that runs on a Commodore 64. The genre didn’t really exist when the C64 was current, and the control scheme is a bit wonky, but it’s playable. If you can essentially backport a genre to the C64, what could we do with PS4-level hardware that we just haven’t thought of yet?
Yeah, there will be worse graphics because of this. Meh. You’ll have native 4K/144Hz just by nature of pulling back on pushing GPUs. Even big games like Rocket League, LoL, and CS:GO have been doing this by not pushing graphics as far as they can go. Those games all look fine for what they’re trying to do.
I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.
Let me guess, you also think mushrooms are addictive.
Goth girls grow up so fast these days.
People used to care a lot. The GNU utils absorbed everything all the old Unix vendors did, which made them comparatively hefty back when a high-end workstation might have had 64MB of RAM.
Now that Chrome takes up gigabytes per tab, nobody cares except a few old Unix curmudgeons.
Meh. It’s a nice bragging right, but that’s all it is at this point. Linux killed off almost all the old Unix vendors for a reason.
Electronics work usually calls for tighter temperature control than a butane soldering iron can manage. Fine for plumbing, though. Conversely, electronics soldering irons usually don’t have the thermal mass to handle plumbing work.
My biggest complaint about the TS100, Pinecil, and the iFixit station is that the tips are specialized and rather expensive.
Mostly, yes.
I’d like to find a better way to phrase “why aren’t you . . . ” questions. They carry an accusatory tone in text, even if you don’t intend that. The answer is almost invariably going to be either “I didn’t know it existed” or “because reason X”, and neither case justifies the accusatory tone. Maybe it would if the “I didn’t know it existed” answer concerned something so basic that they really should have known about it, but probably not even then.