Has this impacted your self-hosted instances of Immich? Are you hosting Immich via a subdomain?
Related:
Was also flagged recently.
In my case it was the root domain, which is:
- Geofiltered to only my own country in Cloudflare
- Geofiltered to only my country in my firewall
- Protected by Authelia (except the root domain, which returns a 404 when accessed)
So… IDK what they want from me :p My domain doesn't serve public websites (like a blog) meant for public consumption…
Why are the Immich team's internal deployments available to anyone on the open web? If you go to one of the links they provide in the article, they have an invalid SSL certificate, which Google rightly flags as a security risk, warns you about, and stops you from visiting without manual intervention. This is standard behaviour, and no one should want Google to stop doing this.
I was going to install Linux on an old NUC to run Immich some time soon, but I think I'll first check whether it has been audited by a legitimate security company. How do they not see that this issue is of their own making?
It is for pull requests. A user makes a change to the documentation, and they want to be able to see the changes on a web page.
If you don’t have them on the open web, developers and pull request authors can’t see the previews.
The issue they had was being marked as phishing, not the SSL certificate warning page.
It is for pull requests. A user makes a change to the documentation, and they want to be able to see the changes on a web page.
So? What does that have to do with SSL certificates? Do you think GitHub loses SSL when you view PRs?
If you don’t have them on the open web, developers and pull request authors can’t see the previews.
You can have them in the open, but without SSL you can't be sure what you're accessing, i.e. it's trivial to make a malicious site that takes its place and MitM whoever tries to access the real one.
The issue they had was being marked as phishing, not the SSL certificate warning page.
Yes, a website without SSL is very likely a phishing attack, it means someone might be impersonating the real website and so it shouldn’t be trusted. Even if by a fluke of chance you hit the right site, all of your communication with it is unencrypted, so anyone in the path can see it clearly.
Yes, a website without SSL is very likely a phishing attack, it means someone might be impersonating the real website and so it shouldn’t be trusted. Even if by a fluke of chance you hit the right site, all of your communication with it is unencrypted, so anyone in the path can see it clearly.
No, Google has hit me with this multiple times for subdomains where the subdomain is the name of the product and has a login page.
So, for example, if I have emby running at emby.domain.com they’ll mark it as a phishing site. You have to add your domain to their web console and dispute the finding which is probably automated. I’ve had to do this at least three times now.
All my certs were valid.
Yes, Google has misreported my websites in the past, all of which were valid, but the person I'm replying to seemed to assume that no SSL is a requirement of the feature, and he doesn't understand that a wrong/missing SSL certificate is indistinguishable from a phishing attack, and that the SSL error page is the one that warns you about phishing (with reason).
The issue they had was being marked as phishing, not the SSL certificate warning page.
Have you seen what browsers say when you have a look at the SSL certificate warning page?
It is for pull requests. A user makes a change to the documentation, and they want to be able to see the changes on a web page.
Why is a user-made PR publishing a branch to Immich's domain for the user to see?
You could just host it inside your network and do an always on VPN. That’s what I do.
Now imagine you’re running a successful open source project developed in the open, where it’s expected that people outside your core team review and comment on changes.
How would that work? The use case is for previews for pull requests. Somebody submits a change to the website. This creates a preview domain that reviewers and authors can see their proposed changes in a clean environment.
Cloudflare Pages gives this behavior out of the box.
Ah, I missed that part
Stop using google. Don’t you know their motto? “Be evil”
OP is impacted by Google Safe Browsing, which various websites use.
Easier said than done, if your end users run Chrome. Because Chrome will automatically block your site if you’re on double secret probation.
The phishing flag usually happens because you have the Username, Password, Log In, and SSO button all on the same screen. Google wants you to have the Username field, the Log In button, and any SSO stuff on one page. Then if you input a username and go to start a password login, Google expects the SSO to disappear and be replaced by the vanilla Log In button. If you simply have all of the fields and buttons on one page, Google flags it as a phishing attempt. Like I guess they expect you to try and steal users’ Google passwords if you have a password field on the same page as a “Sign in with Google” button.
Firefox ingests Google SafeBrowsing lists.
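For context, a rough sketch of how a browser consumes those lists, under the Safe Browsing "Update API" model (this is a simplified toy; the real client canonicalizes URLs into many more host/path expressions and syncs the prefix lists incrementally):

```python
import hashlib

def url_expressions(host: str, path: str) -> list[str]:
    """Generate the host/path combinations a client would hash (simplified)."""
    hosts = [host]
    parts = host.split(".")
    # Also try the shorter host suffix, e.g. "immich.app" for "preview.immich.app"
    if len(parts) > 2:
        hosts.append(".".join(parts[-2:]))
    paths = ["/"] if path == "/" else ["/", path]
    return [h + p for h in hosts for p in paths]

def prefix(expr: str, n: int = 4) -> bytes:
    """Truncated SHA-256 hash prefix, as stored in the locally synced list."""
    return hashlib.sha256(expr.encode()).digest()[:n]

def locally_flagged(host: str, path: str, local_prefixes: set[bytes]) -> bool:
    # A local prefix hit is NOT a verdict: the real client then queries the
    # full-hash endpoint, and only that query reveals anything to Google.
    return any(prefix(e) in local_prefixes for e in url_expressions(host, path))

# Pretend the synced list contains the prefix for a hypothetical bad URL.
bad = {prefix("bad.example/login")}
print(locally_flagged("bad.example", "/login", bad))  # True -> needs full-hash check
print(locally_flagged("good.example", "/", bad))      # False -> no network query
```

The hash-prefix indirection is why a false positive on one hostname can end up blocking it in every browser that syncs the same list.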
If you are falsely flagged as phishing (like I was), then you are fucked regardless of what you use (unless you use curl). I couldn't even bypass the Safe Browsing warning in Firefox on my Android phone.
Immich users flag Google sites as dangerous
Google, protecting you from privacy
Also protecting you from the East Wing of the White House.
Google protecting Google from FOSS.
They’re right too, after using Immich I don’t want to go back.
Jellyfin had a similar issue for a long time with servers exposed to the internet. Google would always re-block the domains soon after they were unblocked. I think they solved it in the latest update. Basically, Google's scraping bots think that all Jellyfin servers are scams that imitate a "real" website.
What is the use case for exposing Jellyfin to the outernet anyway?
What's the use case for Netflix? Same case.
But the malvertisements on Google’s front page are ok, I guess
I smell fear.
The URLs mentioned in their blog article all have a wrong certificate (different hostname).
I am sure that if they fix it, Google's system will reclassify the sites as safe.
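A minimal sketch of why a certificate issued for a different hostname fails validation: the browser matches the requested hostname against the certificate's SAN dNSName entries, wildcards included. (The hostnames below are made up for illustration; real validation per RFC 6125 has more rules.)

```python
def hostname_matches(hostname: str, san_entries: list[str]) -> bool:
    """Match a hostname against certificate SAN entries, allowing a
    single-label left-most wildcard like *.example.com (simplified)."""
    labels = hostname.lower().split(".")
    for entry in san_entries:
        pattern = entry.lower().split(".")
        if len(pattern) != len(labels):
            continue  # wildcard covers exactly one label, so lengths must match
        head, rest = pattern[0], pattern[1:]
        if (head == "*" or head == labels[0]) and rest == labels[1:]:
            return True
    return False

# A cert for another domain presented on a hypothetical preview hostname:
print(hostname_matches("docs-preview.immich.app", ["pages.dev", "*.pages.dev"]))  # False
print(hostname_matches("docs-preview.immich.app", ["*.immich.app"]))              # True
```

The first case is the "wrong certificate" situation: the TLS handshake succeeds, but the name check fails, so the browser throws the full-page warning.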
Yeah, sure. Five years after Google flagged one of the sites I hosted, some firewalls (including ISP-level blocks) still mark the domain as unsafe. Google removed the block after more than a week, but the stink continues to this day.
It was also a development domain and we were forced to change it.
I think that marking things as "safe" could have more complications than this, depending on their definition, but I think you're right that that's probably all this issue is. This is almost the only sane comment here. Everyone else seems to be frothing at the mouth, and I'm guessing it's a decent mix of not understanding how these systems work (plus blindly running tutorials, for those that do self-host) and blind ideology (big companies are bad; any practice that restricts my personal freedom in any way is bad).
I don't blame people for thinking that something is off after reading the linked blog post. This wouldn't be the first time Google has done something like this to OSS that poses some kind of potential threat to its business model (this is also mentioned in the post).
Google
I have identified the problem.
Google marks half the apps on my phone as dangerous. Google are evil xxxxxx’s
They've also started warning against Android apps from outside repos. Basically they want to force people to use their AI-filled bullshit apps.
this is why you disable google “safe browsing” in librewolf and use badblock instead
Thanks for sharing this nice blocklist :)
Librewolf has Google “safe browsing” to disable…? Google?
Firefox has Google Safe Browsing API protection; LibreWolf disables it by default under LibreWolf's settings.
https://support.mozilla.org/en-US/kb/safe-browsing-firefox-focus
I got a ‘dangerous site’ warning and then prompts for crap on my Vaultwarden instance (didn’t see it on Immich but this was a while ago). I think I had to prove I owned the domain with some DNS TXT records then let them “recheck” the domain. It seems to have worked.
Fuck you, Google. I can't see YouTube videos with my browser because Google wants me to sign in. Tells me it is protecting the community.
BULLSHIT.
Because Google doesn't make me sign in to view or edit someone else's Google Docs that they're sharing. Which one is more important, Google? Assholes.
I can’t see youtube videos with my browser because google wants me to sign in. Tells me it is protecting the community.
I’m guessing the videos are age restricted 18+ videos? You don’t have to be signed in to watch any other videos.
No, not age restricted.
Happens most frequently with using any VPN, which we use all the time at work and I often use at home or while traveling.
But sometimes it just does it without.
I think most people are signed into their Gmail account, or have been recently, so the cookie is set. It's crazy how hard Google pushes you when you don't have one.
Nope, sometimes it asks for normal videos as well, it really depends on the case since there’s a lot of background stuff happening, making the experience vary between users.
Same when you try to deviate from the approved path of email providers or, dog forbid, even self-host email.
This is why I always switch off that “block potentially dangerous sites” setting in my browser - it means Google’s blacklists. This is how Google influences the web beyond its own products.
edit: it’s much more complex than simple blocklists with email
This is why I always don’t use Chrome or Google Search
I wouldn’t recommend turning off safe browsing
If a page is blocked it is very easy to bypass. However, the warning page will make you take a step back.
For instance, someone could create a fake Lemmy instance at fedit.org to harvest credentials.
Just use uBlock Origin and a proper password manager. Google Safe Browsing means Google sees what sites you browse.
@possiblylinux127 @A_norny_mousse ungoogled-chromium disables safe browsing, and for Debian’s chromium package I keep going back and forth about whether to pull that patch in or not.
@Andres4NY@social.ridetrans.it
Running Debian Stable, I have installed ungoogled-chromium which is also in the repos.
But Librewolf is my main browser, Chromium a rarely used secondary.
What I’m talking about is how these blocklists are used by many other browsers/softwares (e.g. Firefox) as well.