Nah, just the sad message of “Pretty please love me (because we sunk a bunch of money into this).”
One of the things I initially liked about Pixels was that I could uninstall/disable a lot of the proprietary garbage that would be mandatory on other phones. But now it looks like Google is abandoning that flexibility in favour of shoehorning Gemini into everything.
My only interaction with Gemini so far was telling it to kick rocks when it sent me an unsolicited text message. I also barely use Assistant to begin with. So once my current phone dies, I guess I’ll have to find something new.
Furthermore, there is an option to destroy the special “gift” if you can resist accepting it. However, all you get for doing so is a few brief lines from the Emperor. Your companions don’t seem to notice, and there isn’t even a quest log update.
As I said in another topic, this is the only way to play FF3 in its original form (or at least close to it) and in a language other than Japanese, outside of emulation. The DS remake is fine, but it is definitely a different experience.
The Stone Angel.
It’s a miserable story about a dying old woman regretting all her life choices. It’s also required reading in Canadian high schools because the author is Canadian.
And then, on top of all that, my teacher absolutely insisted that its only major theme was “hope” and docked marks for having any other interpretation.
I’ve never experienced that, and I’ve definitely told Google Assistant to fornicate with itself on multiple occasions.
I had never heard of Humane until I read this article. After also reading Engadget’s review of the thing, it sounds like an absolute nightmare to use.
Maybe I’m too old-school and impatient, but I’ve never been able to make voice assistants work for me. It’s a feedback loop: the assistant fails to do a task, so I become resistant to using it in the future. Even the thing I’ve used an assistant for the most, playing music out of a Nest speaker, seems to still be hit-or-miss after years of trying, and in some ways seems to be getting worse.
The gestures also sound awful. As with voice assistants, I’ve never gotten comfortable with smartphone gestures beyond the most rudimentary. I strictly use 3-button navigation on my phone, and I use Connect as my Lemmy app of choice because it allows me to disable all the swipe commands for upvote/downvote.
Briefly: I didn’t.
More substantively: I never owned a cell phone growing up, even though I was at the right age when they became a common thing for teenagers to have. It wasn’t a money thing, nor a household rule, as my sisters got phones when they were in high school. The biggest reason was probably just how I communicate. I wasn’t big into IM services either, and I preferred email or face-to-face, or a (landline) phone call if it was an urgent matter.
Then there was also my adolescent brain thinking I was making a bold counter-culture statement by steadfastly resisting the march of technology. In reality, I was probably just being a pain in the neck for my friends and family, and I probably unnecessarily endangered myself at least once.
I did finally, begrudgingly, get an old hand-me-down flip-phone in my final year of university, but that was out of necessity, and I used it to make maybe only a dozen calls in the 2.5 years I had it before getting a smart device.
To bring it full circle: I did try sending a text message with that flip-phone exactly once, at the insistence of my family. That message was predictably a garbled mess, and to this day my sisters still wonder how I managed to get a number to appear in the middle of the “word”.
I have a number of other somewhat amusing stories about people’s reactions to my lack of a cellphone, but this post is long enough already.
Am I the only person in my generation who never learned to type on a number pad? It wasn’t the only thing I didn’t recognize from the “test”, but it stuck out to me.
Well, there’s the fact that outrage seems to drive more activity than other types of content. YouTube sees it as more profitable to advertise a Very Angry Gamer™ to you, even if you aren’t interested. I guess they assume that you’ll find something to watch anyhow, but they’ll profit even more if they can hook you into the outrage machine.
Then there’s my personal hypothesis that in order to enable this, YouTube’s algorithm weights your demographics, subscriptions, and viewing history much more heavily than your manual inputs.
My wife and I had this conversation the other day. Our kid is only two right now, but as we’ve learned, these milestones sneak up on you.
I used my own life as a guide to my opinion, and so landed on age eight or so. That’s around the age I remember being able to go to the park or to a friend’s house within the neighbourhood on my own.
Other questions about how much functionality the phone would have and how much access they would have to it at home are still to be determined.
I see where the disconnect is now.
I, and presumably others, associate obsession with religious minutiae with religious fervour. I have a lot of first-hand experience with this, as some of the most ardent Christians I knew were also the ones who were eyeballs deep in apologetics and church history (and also adult converts). It makes a certain amount of logical sense too, as you wouldn’t expect a casual church-goer to care that much about all that.
With that in mind, it isn’t a big leap to connect the original post to the phenomenon of the zeal of the convert.
What it comes down to, then, is that the original post has more than one layer to it. Rather than focus on the difference between charity and dogmatism, I chose instead to highlight the contrast between the simplicity [of charity] and the convolution [of dogmatism]. Once again, my personal experiences informed the way I approached this post.
I’m completely lost. How and when did this become about religious people behaving badly? I am 99.9% sure that the point of the original topic was a commentary on how recent converts tend to be more enthusiastic about their faith than people raised in the church, regardless of what the individual beliefs actually are. The example beliefs from the original post (“feed the poor” and “women shouldn’t drive”) are just examples to help characterize this dichotomy in an amusing way.
In fact, that second example, about women and driving, is almost certainly not an actual Catholic doctrine. Any search for the full phrase leads only to reposts of this image, and I’d wager it was made by just stringing together some Christian buzzwords for humorous effect. While I don’t doubt some Catholics do believe women shouldn’t drive, I also very much doubt they’d use the phrasing and justification found in the original post.
Why would I need to be more specific about the different branches of Catholicism? The author in the screenshot doesn’t do that either. They simply point out their observation that lifelong Catholics tend to value broad teachings that aren’t necessarily specific to Catholicism, while adult converts become fanatical about doctrinal minutiae. In other words, the former is relaxed about their faith, while the latter is zealous.
I then related that to my own experiences, where someone who is raised in a belief system tends to be less aggressive about those beliefs than someone who converts later in life - i.e. the “zeal of the convert.” This observation isn’t exclusive to Catholicism; it’s just being made in relation to it in this instance. This phenomenon isn’t even exclusive to religion, as one can observe it with political beliefs as well.
I don’t think anything here requires a differentiation between branches of Catholicism, because the observations are about the act of converting, not about what specific belief systems the converts are moving to and from.
The “belief” in this case is Catholicism.
I’d say this is part of the “zeal of the convert” phenomenon, where someone who converts to a belief tends to be more fanatical than someone raised in that belief.
There’s probably bias in this observation, as a couple of very loud people can drown out dozens of others and make a trend seem more prevalent than it actually is, but I also have personal experience here.
I mean, I didn’t develop my own musical taste until my mid-20s. My parents only played Christian worship music, while all my friends in high school and university were various flavours of music snob. I was literally convinced that no one actually liked pop music because everyone I knew seemed to hate it.
I don’t know if I was ever a “people pleaser,” in that I never pretended to like a band or song just because everyone else did. However, I definitely avoided saying anything negative about the music I was exposed to for fear that I’d be ostracized all the same.
It took me a long time to overcome all that, and it took even longer to admit my tastes publicly.
I am but one man whose only education in programming was a first year university course in C from almost two decades ago (and thus I am liable to completely botch any explanation of CS concepts and/or may just have faulty memories), but I can offer my own opinion.
Most basic programming concepts I was taught had easily understood use cases and produced observable effects. There were a lot of analogous concepts to algebra, and functions like printf did things that were concrete and could be immediately evaluated visually.
Pointers, on the other hand, felt like they were designed purely by and for programming. Instead of directly defining a variable by some real-world concept I was already familiar with, it was a variable defined by a property of another variable, and it took some thinking to even comprehend what that meant. Even reading the Wikipedia page today I’m not sure if I completely understand.
Pointers also didn’t appear to have an immediate use case. We had been primarily concerned with using the value of a variable to perform basic tasks, but none of those tasks ever required the location of a variable to complete the calculations. We were never offered any functions that used pointers for anything, either before or after, so including them felt like busywork.
It also didn’t help that my professor basically refused to offer any explanation beyond a basic definition. We were just told to arbitrarily include pointers in our work even though they didn’t seem to contribute to anything, and I really resented that fact. We were assured that we would eventually understand if we continued to take programming courses, but that wasn’t much comfort to first year students who just wanted to pass the introductory class they were already in.
And if what you said is true, that later courses are built on the assumption that one understands the function and usefulness of pointers despite the poor explanations, then it’s no wonder so many people bounce off of computer science at such a low level.
That I will never enjoy the taste of wine.
I figured out I would never like coffee in my teens, and had the same realization about beer in my 20s.
But it wasn’t until this year, in my mid-thirties, that I finally accepted that I don’t like the taste of wine and probably never will. After years of trying the full spectrum of wines, I had to admit that it wasn’t the “notes” that were turning me off, nor was it a problem with the quality of the wine. It was the fundamental “wine-ness” that I disliked, the same as I don’t like the “beer-ness” of beer or the “coffee-ness” of coffee.