Hacker News

>HTTPS-Only mode ... will also ask for your permission before connecting to a website if it doesn't support secure connections.

This is being done with the best of intentions but browsers scaremongering over HTTP sites as if they are dangerous is a bad thing. There is more to the web than commercial transactions!

Individual people cannot feasibly run certificate authorities; only corporations can. When browsers will only display sites that are authorized by corporations, we're eventually going to be in big trouble. Yes, LetsEncrypt is a benevolent corporation, and yes, there are even options beyond LE, but as we know from the dot-org fiasco, as long as there is potential money to be made these benevolent dictatorships will eventually go very bad. And that's ignoring that all this centralization makes a very juicy target for government censorship.

Encrypt, yes. But also allow plaintext. The potential for a downgrade attack on a "secure" site is worth far less to the world than being able to communicate person to person without a corporate intermediary approving every bit.



I don't really agree. Firefox itself could run a free Certificate Authority if it ever came to that. Your scenario is much less likely, and much more easily fixable than the status quo where most websites don't bother exposing their site over HTTPS, creating huge security and privacy risks for consumers of said sites.

There is absolutely no reason not to have your site functioning over HTTPS these days, except negligence.


try running `python3 http.server 8080` on a safe directory on your laptop, and see if you can easily access a file from it from your mobile.

With Firefox for Android, it doesn't permit you to connect to a machine on your own LAN, even if you try to add an exception.

With the stock browser it does.

I'm sure there are better ways to serve a file between a machine and a phone, but that python snippet was my easiest go-to.

No more


> try running `python3 http.server 8080` on a safe directory on your laptop, and see if you can easily access a file from it from your mobile

Works without a problem (I just tried python -m SimpleHTTPServer). You just have to type out the http:// part explicitly. Wow, that would be annoying if they dropped http completely.


? I just tried it and it works fine. Firefox 84.1.1 on Android, connect to 192.168.x.y:8080 and it comes up and shows me a directory listing that I can navigate and open files from. (you missed the -m, BTW; it's `python3 -m http.server 8080`)


wouldn't it be much easier to use scp for that? there are plenty of android clients, and plenty of windows servers.

I just actually use synced storage (Synology Drive from a Synology NAS), so I would probably use a Dropbox like service for that functionality.

There's an app called droid transfer that basically does what you did with python (setup a server and have a simple download client), and there's also wireless ADB (which will require you to use the USB cable for the initial setup)


I think the point is e.g. being able to quickly share a file with a nearby smartphone that is not yours, not telling your co-workers to install some unfamiliar app on their phones.


> Firefox itself could run a free Certificate Authority if it ever came to that

how would that be any better?


Not to mention many non-https sites have zero reason to be. A plain HTML portfolio site, for example, should not be flagged with a scary "insecure" logo because its content is not encrypted.


We need to protect the integrity of the information so that it's not abused to serve malware or pages that exploit zero-day vulnerabilities to the end-user when they browse on an insecure network (which, if Snowden's publications are anything to go by, is every network).

HTTPS is not just confidentiality - security is confidentiality, integrity and availability.


HTTPS ensures the data isn't tampered with during transport, but it doesn't ensure the integrity of the data itself. That's why there are things like Subresource Integrity (which doesn't apply to top-level resources like HTML).
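For reference, an SRI value is just a base64-encoded digest of the file, declared in the tag that loads it. A minimal sketch of computing one with openssl (the file name and content here are made up for illustration):

```shell
# Write a small script file to hash (hypothetical content)
echo 'console.log("hi");' > example.js

# The SRI value is the base64 of the SHA-384 digest, prefixed with the algorithm name
HASH="sha384-$(openssl dgst -sha384 -binary example.js | openssl base64 -A)"

# The value goes into the integrity attribute; the browser refuses the
# resource if the fetched bytes don't hash to this value
echo "<script src=\"example.js\" integrity=\"$HASH\" crossorigin=\"anonymous\"></script>"
```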

However, there's no way to ensure the files we download are created by who they say they are. A domain, for example, can change hands, and existing links, say on HN, can be loaded with unexpected, potentially malicious, content. Same for hacked servers.

IMO we need some form of standard page signing to enforce actual integrity of information, not just transport. I made a proof-of-concept Web Extension to show how that might be possible using PGP [1]. Of course PGP has its own issues, but it's just an experiment.

[1] https://webverify.jahed.dev/


Having a valid ssl cert does not stop a website from serving malware.


No, but it stops third parties from adding it


No it doesn't.


Okay, I'll bite. Let's say I'm loading a page over https, and you control the network but not the server (mitm). How exactly are you adding content to the page?


Attack the client's (or some sub-population of clients at some ISP) DNS resolver and point it towards other IPs that serve up a clone of the web interface with their own LE signed HTTPS but now with some new content too.

This is the same argument technique being used when people say, "Oh, but you can MITM HTTP!" Yes, a targeted attack on something beyond your webserver is bad. But it applies to HTTP and HTTPS alike.


Yeah, no; there's a world of difference between subverting the uplink for a single client (all that you need for altering unencrypted traffic) and subverting the network for your target/victim and multiple LE servers that are designed to resist this attack [0]. If your proposed attack worked, it would be front-page news because it'd undermine the security of e.g. gmail/banks. HTTPS moves the bar from "bored script kiddies can do this"[1] to "would need a nation-state-level attack to subvert that many networks at once, and we'd know about it because that scale of network redirection would not fly under the radar".

[0] https://letsencrypt.org/2020/02/19/multi-perspective-validat...

[1] http://codebutler.github.io/firesheep/


1. You're the one that told me I had control of the client network as per MITM; don't move the goalposts.

2. You're overthinking this. I'm not talking about hijacking established sessions. I'm talking about never letting the authentic session start in the first place.

As soon as the attacker controls the DNS resolver it doesn't matter what security you have in place. The bank and the LE servers and all that can be perfectly secure. But if the client is going to the wrong IPs they never will interact with them. They'll only interact with the perfectly valid HTTPS hosts the attacker sets up.


My initial statement was that HTTPS would prevent a third party from adding malware to a site (implicitly, "as it was loaded"), to which pbalau responded "No it doesn't." Are you suggesting that our attacker can "just" compromise the victim's machine or the hosting server? Because that's true, but HTTPS is effective against network attacks. Deploying HTTPS also won't prevent someone from mugging you IRL, but that doesn't mean that it's not an effective security measure for what it does.

If we're talking about SSL Stripping, then 1. we're back to you needing to control the network, so apparently my goalposts are exactly in the right place, 2. that is at best partially effective (AFAIK, requires the victim to start from a non-HTTPS page, so again, we really want 100% of sites on HTTPS), and 3. that works specifically by getting the victim back onto insecure HTTP, so if you need that then it's proof of the effectiveness of HTTPS.


You're missing your very own point again. You guys are up in arms about potential attacks against the client or the network. Attacks that have nothing to do with the web server.

I am giving examples of how that class of attacks is not mitigated by HTTPS.

I am not talking about SSL stripping. I am talking about not even letting the client talk to the remote host because in the scenario where you MITM you have control of the network.


Assuming you own the DNS server and redirect traffic to a different IP address, how will your fake host provide the CA-verified certificate to the client?


I thought about this a lot. You're right. There's no way. I was wrong. Sorry.


How would you get a LE cert for a domain you don't control? Your proposed attack is thwarted by ACME challenges.

You could redirect the user to an HTTP site, but 1. that can be defeated by adding the domain to the HSTS preload list, and 2. this isn't replacing the content of an HTTPS site, but replacing an HTTPS site with an HTTP one.

To actually pull your attack off, you'd need to add your own root certificate to the client device (which means you either tricked the user into doing it and could just as well have tricked them into letting you take control of their device, or actually had control of their device - in both cases MITM is pointless at that point), or trick a CA into issuing you a certificate for a domain you don't own, or steal a CA's private keys - both of which are things that can easily kill a CA (see DigiNotar, which stopped existing the same month the security breach was reported), and therefore obviously aren't easy to pull off.
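For context, the ACME HTTP-01 challenge boils down to proving control of the web server a domain points to. A toy sketch of the shape of that proof (the token and directory here are made up; a real client like certbot does this automatically):

```shell
# The CA issues a random token; the applicant must serve it back at a
# well-known path on the domain being validated.
mkdir -p webroot/.well-known/acme-challenge
echo "made-up-token.made-up-key-thumbprint" \
  > webroot/.well-known/acme-challenge/made-up-token

# A real CA would now fetch
#   http://example.com/.well-known/acme-challenge/made-up-token
# from several network vantage points, and only issue the certificate
# if the expected content comes back.
cat webroot/.well-known/acme-challenge/made-up-token
```

An attacker who merely controls one victim's DNS resolver can't answer that challenge from the CA's vantage points, which is why they can't get a valid cert for the domain.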


Would the third part not need to serve a valid cert to do this? I'd like to hear if it is easier or harder to do with http vs https


Yes, you'd need either a valid cert, or for the user to click through the big scary warning to accept the invalid (or self-signed) cert, either of which raises the bar way above unencrypted HTTP, where this attack is trivial (ex. https://en.wikipedia.org/wiki/Firesheep and https://stopthecap.com/2013/04/03/isp-crams-its-own-ads-all-... )


And, congrats, your portfolio now has injected banner ads and was used to deliver a drive-by exploit. HTTPS is confidentiality and integrity.


Why can't the portfolio page inject banner ads via HTTPS?


Because the modified page wouldn't be signed with that key? This is one of the basic things https does


HTTPS also protects against spying on the specific sub-page you're on for example. That's valuable.

Besides installing a Let's Encrypt cert is straightforward these days.


accessibility is important too. privacy is already a lost cause today for all but the ultra-vigilant techies, and https won't help much there.


Certs can be purchased for the price of a coffee or two and are generally trivial to implement — there’s really no reason NOT to upgrade these days, no matter how benign the content. And I say that as someone who still has a non-HTTPS portfolio site :)


And, therefore, a cert is about as meaningful as a coffee or two.

Someone not attesting the validity of the data served from their site is, effectively, lying if they provide a cert.


What exactly do you think https certs do?


Not very much. That is the point.


Totally agree. I have an iPad app that runs an embedded http server, so that other clients on the same lan can connect to it. Making that into an embedded https server would be a big hassle.


> Making that into an embedded https server would be a big hassle.

Why? HTTPS is an open protocol anyone can implement.


The certificate management is the problem here. Let's say the server is running on 192.168.1.73, how do I get a certificate for that? On the next start of the app, the server might be running on 192.168.1.76, so I need another certificate for that. You see the problem? Also take into account that the entire infrastructure for solving that problem needs to run inside the iPad App, which needs to be accepted to the Apple store ...
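To illustrate why this is a dead end: you can mint a self-signed certificate that covers both of those hypothetical LAN addresses as IP SANs, but no client will trust it without manual intervention, which is exactly the hassle being described. A sketch (requires OpenSSL 1.1.1+ for -addext; the CN and IPs are just the ones from this example):

```shell
# Self-signed cert listing both possible LAN addresses as IP SANs
openssl req -x509 -newkey rsa:2048 -nodes -days 30 \
  -keyout lan.key -out lan.crt \
  -subj "/CN=ipad-app-server" \
  -addext "subjectAltName=IP:192.168.1.73,IP:192.168.1.76"

# Show the SAN entries the cert actually carries
openssl x509 -in lan.crt -noout -text | grep -A1 "Subject Alternative Name"
```

No public CA will sign this for you, since private IPs aren't globally unique, so every client device would have to be told to trust it explicitly.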


It's still a much bigger hassle to implement HTTPS than just HTTP.


The biggest hassle would be that the embedded HTTPS server needs a trusted certificate, but certificates are only issued for hostnames, not IP addresses. So the embedded HTTPS server's certificate and LAN clients would need to know the hostname of the device running the app.


Cloudflare manages to provide a certificate for https://1.1.1.1. (But not for reserved IPs; see [1].)

Aside from that, what is the purpose of this comment supposed to be? (The general tone reads as if it's meant to refute the parent; did you intend it to be a reply to chrisseaton instead—and thus a defense of the HTTPS-complicates-things position?)

1. https://news.ycombinator.com/item?id=16717849


> https://1.1.1.1

That's correct, some CAs issue certificates for public IPs. You're never ever going to get a cert for a private IP, since these are not globally unique.


This isn't saying anything not already covered in the material I referenced in my comment—although it does say it less precisely, so I'd argue on those grounds that all things considered this comment need not have been posted...

(In any case, I'm totally mystified about why my own comment that includes that link and corrects the untrue statement about it not being possible to get certificates for IPs was deemed to offend someone's sensibilities. Surely the offense, if there is one, is in the comment that makes an outright, verifiably untrue claim?)


That’s interesting about the 1.1.1.1 certificate.

I wasn’t refuting jackewiehose‘s comment about HTTPS hassles. I was just sharing a specific example of a hassle.


Don't your libraries implement it?


Open doesn't imply it's easy.


I live in a rural area in North America, a stone's throw away from a municipality of 160,000 people, and our internet is marginal. The constant push for https everywhere is making the web almost unusable for us. Pages now deliver fonts, javascript libraries, and images over https from multiple domains, each requiring its own SSL negotiation. Usually one of these fonts/js/images times out, and as a result the entire page doesn't load.


I turned this on recently, and I don't think it's a good idea, unless your risk profile demands it - in which case you should perhaps be using something like Tails.

One interesting find was that close to 100% of all email tracking links are still on HTTP. None of the major email providers support HTTPS easily for those.


> When browsers will only display sites that are authorized by corporations we're eventually going to be in big trouble

Likewise they removed support for unsigned add-ons. So you can't just write a small private add-on without sending them your private(!) code. Who needs freedom when you are so well "protected" :-(


One reason to always use HTTPS is that HTTP connections can be hijacked to carry out DDoS attacks. See https://en.wikipedia.org/wiki/Great_Cannon


Generating a local CA and installing its certificate into the local browser(s) is not hard. Not harder than setting up an HTTP server on your LAN.
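A minimal openssl sketch of that local-CA setup (the CA name and hostname here are placeholders; the resulting ca.crt is what you'd import into your browsers' trust stores):

```shell
# 1. Create the local CA (a self-signed root)
openssl req -x509 -newkey rsa:2048 -nodes -days 30 \
  -keyout ca.key -out ca.crt -subj "/CN=My Local CA"

# 2. Create a key and signing request for the LAN server
openssl req -newkey rsa:2048 -nodes \
  -keyout server.key -out server.csr -subj "/CN=myhost.lan"

# 3. Sign the server's request with the local CA
openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -days 30 -out server.crt

# 4. Check that the chain verifies against the local root
openssl verify -CAfile ca.crt server.crt
```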

OTOH localhost is finally exempt from the "not https" warning.


https creates many barriers for accessibility, and I agree completely.

https is helpful but also a tool for more centralized control.



