W3.org Cert Expired (w3.org)
133 points by rockwotj on June 2, 2021 | 119 comments


It was a huge mistake to have browsers treat an expired cert with so much force. An expired-cert website is, at the least, just as good as an http site, and yet entirely unencrypted sites only see a few warnings on login forms while expired-cert sites are inaccessible.


Actually, there are certain cases where visiting a site under an expired certificate is strictly more vulnerable than visiting a normal http site. Certain web features only work in a secure context[1], which an http site does not expose. Therefore, a hacker who can convince the user to accept their invalid certificate can extract more information from the user.

Whether it was a good idea to limit those functionalities to a secure context or not, I don't know. I'm also a bit opposed to this forced HTTPS everywhere mentality.

[1] https://developer.mozilla.org/en-US/docs/Web/Security/Secure...


Seems to me, though, that the right solution would be to turn those features off as opposed to denying the whole site. (If the site relies on those features to work and breaks as a result, so be it.)


Disabling these features for expired certificates and limiting secure cookies to a single session sounds reasonable as a "limp home" degraded functionality mode. Obviously one wouldn't want the padlock icon to be displayed in the address bar in this case.


I wish browsers showed a large warning like "This site's TLS certificate EXPIRES IN FIVE DAYS", or something.

It would be less shameful to have it than an expired certificate. But most of all, anyone handling the site would have an advance warning, instead of a catastrophic failure. This would lower the number of actual expirations significantly.


I think that a bunch of things like the JavaScript console, DOM editor, etc. should only be enabled when a browser developer mode has been activated, and such a mode would also enable a variety of warnings like this. It shouldn't be difficult to find or enable the developer mode, but it shouldn't be on by default.

That way, people who know (or should know) what these sorts of warnings mean will see them without ordinary users getting unnecessarily scared or confused. Hopefully a site's own developers regularly view their own site in the same browser they've used in the past for debugging the site.


It would be terrible for visitors who have no clue what a certificate is, or even what HTTP is. It could be scary for them; they could even think they were "hacked".


A certificate is either valid or it's not. This is more for administering it, or for users that rely on it and want to know within build systems; at least that's where it hit me.

The failure, then, is the same.

I wondered, though, if there is some use in knowing about an upcoming expiration.

  # curl + GNU date + GNU grep -P
  $ echo "cert days left: ~$(( ($(date +%s -d "$(curl -IvsS https://www.w3.org 2>&1 >/dev/null | grep -Po '(?<=^\*  expire date: ).*')")-$(date +%s))/86400 ))"
  cert days left: ~396

But then what should that information give me? That my build breaks, say, next Tuesday? And what if, before then, the certificate's expiration is extended?


Maybe there should be a "warned from" extension for certificates, which contains a date within the validity period from which on user agents are supposed to display warnings?


I believe the "Not Valid After" field should be used as a "Warn for x days after this date" value where x is a small value (such as 3).


Why? The date says “Not valid after”, and the site operator can monitor it, or just set a calendar invite when it’s issued. Making the “real” expiration some arbitrary number of days after the marked expiration date seems like it’s unlikely to modify cert owner behaviors, but it does make validation way more complicated for browsers.


I agree, but it is also overkill to display a warning page saying that "you are in danger" just because the certificate expired today. There should be a middle ground.


Why? There’s already a clear better option: teams that run HTTPS sites need to monitor for cert expiry.

It’s the easiest outage to avoid because the moment the cert is generated, you know the exact time it will expire.
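
A minimal sketch of such a check, assuming openssl and GNU date are available (the host and threshold are placeholders):

  # warn when a host's certificate expires within $threshold days
  host=www.example.org; threshold=14
  end=$(echo | openssl s_client -servername "$host" -connect "$host:443" 2>/dev/null \
    | openssl x509 -noout -enddate | cut -d= -f2)
  days=$(( ($(date +%s -d "$end") - $(date +%s)) / 86400 ))
  [ "$days" -lt "$threshold" ] && echo "WARNING: cert for $host expires in $days days"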


Yes, it's a better option for the site admins. It doesn't help the consumer though.

Imagine you can't use your banking site because the certificate expired a few minutes ago and the browser displays an (unnecessarily dramatic) error message. Not everyone is tech savvy enough to ignore those messages.

The browsers already let you visit the site if you want to, so I don't think it's a big deal.


The user shouldn't trust a bank that let their cert expire, easy enough. For a bank, an expired cert is a nightmare in multiple departments.


I think that would cause issues during various security reviews, because it would imply that the browser is accepting expired certificates, even if there is a warning.

Also, CAs might not revoke expired certificates any more, as they are already expired, which hurts security as well if there is a reason to revoke the certificate, but no means to do so.

With an extension, this feature can be introduced gently, without risking any security issues.


It all depends on context and browsers have to have defaults that work for every context.

Sure, a recently expired cert is probably the least severe issue a TLS cert can have, but still - expired certs that are compromised usually aren't revoked. If I'm visiting my bank, I definitely want things to err on the side of not working.


It's the combination of defaults that's problematic. If the site requires https, because it's e.g. a bank, then sure, require non-expired cert. But my static sites which have no auth, payment, or even subpages (path-obscuration being another of the touted benefits of https-everywhere), do not require https. Except because of the defaults Google's overzealous security team decided to inflict on the world, now I have to have a process that reaches out to LE every 3 months. For a static website which otherwise never needs updating.


Well, according to others on this thread, w3.org enabled HSTS - so they specifically opted into strict mode. They were not using the defaults. So that criticism does not apply here.


HTTPS also ensures that the connection has not been tampered with by an ISP. It's quite stupid, especially considering you're paying them already, but it used to be common when most of the web was http. Also, router malware has been seen injecting JS into http pages to mine crypto.

https://www.infoworld.com/article/2925839/code-injection-new...

https://www.privateinternetaccess.com/blog/comcast-still-use...

https://blog.avast.com/mikrotik-routers-targeted-by-cryptomi...


>now I have to have a process that reaches out to LE every 3 months.

This is not particularly hard. Most static hosting services even do it for you.


Serve it both on ports 80 and 443.


Well from their perspective it was great. Why have an extra code path to worry about when you can just make the thing less useful?

I tried to remove a specific cookie from my browser session recently. The only way is to go into Developer Tools, find what tab has "Storage", find the cookies, select the cookie you want, right click and delete it. You can't do it from the 4 other user-friendly screens that already show you the individual cookies, because why would a user ever want to delete an individual cookie?

I think this is the same reason for all of the poor browser UX over the years, like personal certs. The web would be a lot more secure if personal certs had supplanted passwords 15 years ago. But then we'd have to build something other than a single pop-up box to manage them.

Sure, we got a built-in password manager, and we suggest random passwords for users, and save them locally, and the user needs to back them up (or reset them via e-mail). But doing the same thing for certificates might be confusing, meaning, somebody would have to actually talk to a user (until it became common knowledge). Better to wait for something way more complicated and expensive to show up, and ignore UX there too. (https://security.stackexchange.com/questions/1430/is-anybody...)


I totally agree w.r.t. password managers. Luckily, with browsers now having built-in password managers, it's less UI friction if we come up with some standard for user certificate workflow that's close to password workflow.

In a world where users rarely see their actual passwords, it's much less of a UI change to (nearly) silently replace password changes with certificate signings and (nearly) silently replace password logins with certificate presentations. A small extra attribute in the HTML input tag could signal to the password manager that it should perform the certificate workflow instead of the password workflow.

Ideally, instead of specifying a specific mechanism, the extra input tag attribute would signal the password manager to actually perform a SPNEGO mechanism negotiation, so the password manager and the server could negotiate if they were using certificates, Kerberos, or some future mechanism. Though, this would also require adding certificate support to GSSAPI. The upside would be that future changes could be done without any changes to HTML.


I can't agree. It should be enforced with extreme prejudice. Some people's lives may depend on it.


And their lives depend less when the site uses no certificate at all?


Chrome wasn't even giving me the option to proceed, something about HSTS... I'm not sure, I just know the option was not there.


Right, the website can opt-in to requiring TLS for connections. Browsers can choose to honor this and disallow all plain HTTP connections... and all TLS connections with "invalid" certificates. https://en.wikipedia.org/wiki/HTTP_Strict_Transport_Security It's a great way for site admins to turn a minor certificate issue into a complete disaster :(


If you don’t use HSTS you open your users up to downgrade attacks.

https://auth0.com/blog/preventing-https-downgrade-attacks/


That’s the whole point of HSTS. It won’t work in Safari either. I don’t use Chrome and won’t recommend it, but if you have to view it you can clear HSTS for that domain in chrome://settings


The point is taking the option away from users to proceed if there is a cert issue? Really it's not.


That’s exactly right. Really it is - please review https://en.m.wikipedia.org/wiki/HTTP_Strict_Transport_Securi...


The main event is preventing silent security downgrades. Hiding proceed behind a secret code like it's a 90s NES game is a side show. They should have just made it the Double Dragon cheat code.


This option is likely there for testing only and not meant for use by regular users. Hence it’s not present in the regular settings UI, AFAIK. Would you prefer a DRM-style lock? That’s clearly going overboard.


It will let you in if you type `thisisunsafe`.


Nice tip! I tried that, but then the company firewall blocked it for the same reason. (zscaler Access denied due to bad server certificate)


The reason: if something is http-only, they don’t give a damn about security. If I choose (or am redirected to) https and it doesn’t work, it might be an attack.


No, it was a mistake not to implement everything in DNS ... since that is what cryptographically determines ownership of a domain anyway. Any other certificate mechanism is just middle men selling snake oil and causing additional administrative overhead.


The greater mistake isn't being perpetrated by the big browser companies scaremongering about bad certs. The fault lies with every sysadmin or web dev that chooses to 301-redirect from HTTP to HTTPS rather than serving HTTP+HTTPS. You can do your part to make this a non-problem by always serving both HTTP and HTTPS.


And I guess you think TLS-stripping attacks are unimportant?


For many sites, and probably most non-commercial sites, yes. Anyone thinking about this issue can decide if that threat model applies to them: do they even have a login, or is it just html files and jpegs in directories? etc.


Well that depends on what the jpegs and html files contain.

For example, the fact that you are reading the Wikipedia article on the Tiananmen Square incident might be very sensitive if you live in China (ignoring the part where they block Wikipedia). Other places might object to various other speech, etc.

Although to be fair, the mass surveillance threat model is probably less likely to involve an active attack like TLS stripping. But anonymity is all about hiding in the crowd; only securing the connection when you have something to hide from eavesdroppers means that you have no crowd to hide in when you need to.


How do you determine a user's privacy requirements by site alone? The content does not imply relevance to privacy abuse; abusers come in all forms and may take abusive objection to content, like w3.org, that would seem innocuous to most people.


It's pretty easy when you are the person that made the website (like most personal websites).

Again, I am not advocating for not having HTTPS so I don't know why you think HTTP+HTTPS is less private. It is exactly as private only it is also human readable and requires no centralized authority's lease to be visitable.


Do you know how I can allow http connections to my website (on GitHub Pages with a custom domain), while ensuring that anyone who types "example.com" without a protocol gets the https version by default? I wanted to set up my website that way, but I couldn't figure out how, since many browsers still default to http.


In many cases this isn't the behavior you as the consumer would want. For an example, if your bank's certificate expires it's not ideal for the end user to be unwittingly redirected to http, and inadvertently access their account over an unencrypted connection.


What's HTTP+HTTPS?


Serve the same content over HTTP and HTTPS. Many web admins today don’t serve content over http; instead they 301-redirect all http requests to https.


It's worse than that. Google Chrome dislikes some sites more than others. On most you can simply click "proceed to danger" and ignore the bad cert. On some others, like Cryptome, they don't even give you the option to continue.

https://cryptome.org/


This is generally enforced by HSTS - one of the features is that, if the site has it enabled and the site has a TLS-related error, the browsers prevent users from bypassing the SSL error screen (at least, without typing `thisisunsafe` in chrome).

https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/St...

https://security.stackexchange.com/q/195688/96942


Like others said, the site is using HSTS:

  > curl -sSk -D- https://cryptome.org/ -o /dev/null | grep strict-transport-security
  strict-transport-security: max-age=31536000
But even then, Chrome 91 on both macOS and Windows totally does allow me to

  Proceed to cryptome.org (unsafe)
Maybe it's only offered because I never visited the site in the past?


I'm glad I saw this comment - what can make a site behave like that?

At work we use Chrome as our general browser, and we've had several issues with expired certs before. Some websites allowed you to expand the box and choose "Continue", but some simply didn't have the option. What's the difference?


HTTP Strict Transport Security (HSTS) is enabled at the DNS level which tells modern browsers "I'm a modern website and want to only be served on valid certificates, otherwise refuse to allow access to my website because something must be very wrong for this to happen".

The assumption is "must be very wrong" is an attack you don't want people to "continue" past. Occasionally it bites back like this if you don't maintain your certificates.

Offering HTTP transport invites attackers to inject advertisements, malware, or viruses into your packet stream. ISPs like Comcast and AT&T are notorious for doing this.

Allowing falsified or expired certificates invites attackers as well.

HSTS is a good thing.


DNS? You announce it with a header or get on the preload list. I don't think there's a DNS method.


Yeah, there is no DNS mechanism. Although, IIRC, it is possible to place an entire TLD on the preload list - which still doesn't use DNS, but it's a mechanism to enable HSTS where the website itself doesn't do anything.
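
To be concrete, HSTS is announced via an HTTP response header (and the preload list additionally wants the includeSubDomains and preload directives); a quick way to see what a site sends, with a placeholder host:

  # the header looks roughly like:
  #   Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
  curl -sI https://example.org/ | grep -i '^strict-transport-security'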


Sites can opt into "do not let somebody continue if the cert is bad" via HSTS.


Firefox 88 here, cryptome.org loads with a red exclamation warning icon in the address bar, w3.org refuses to load and when I click on "advanced" there is not even an "accept risk and continue" option. I don't even know how to load w3.org right now if I had to.


Typing `thisisunsafe` in Chrome to ignore the certificate gave me a 503 error. Perhaps the problem isn't quite what it seems?


Interestingly, it also doesn't allow the CA that it's serving, at least per the CAA records:

  w3.org.                 3599 IN SOA ns1.w3.org. hostmaster.w3.org. (
                                  2021060201 ; serial
                                  28800      ; refresh (8 hours)
                                  3600       ; retry (1 hour)
                                  604800     ; expire (1 week)
                                  1800       ; minimum (30 minutes)
                                  )
  w3.org.                 3599 IN CAA 0 issuewild "sectigo.com"
  w3.org.                 3599 IN CAA 0 iodef "mailto:sysreq@w3.org"
  w3.org.                 3599 IN CAA 0 issue "amazon.com"
  w3.org.                 3599 IN CAA 0 issue "sectigo.com"
  w3.org.                 3599 IN CAA 0 issue "letsencrypt.org"
It does look like the records were updated recently.

The spf records are also interesting, but I know nothing about their mail setup.

A new Gandi cert was just issued: https://crt.sh/?id=4630660228 Perhaps it just needs a kick, or it'll be fixed soon.


CAA only applies at time of issue. It's totally valid for your CAA records to not match the cert you are using, as long as CAA was OK at issue time.


Which it appears that it was not, at the time that cert was issued!

I have no idea if the CA/B Forum has made CAA checking a requirement, but I presume not.



Oh, self-correction: Gandi is a Sectigo delegate.


They might use the same certificate on the backends as well, and have the frontend load balancer verifying certificates to prevent MITM.


Wow what a neat little trick!


"Neat" is one word for it, sure.

There's nothing wrong with a user deciding "I realize something is wrong, and I shouldn't trust that anything on this website is legitimate, and I shouldn't give it any information, but I'd still like to read the story/article/blog."

There should be a low-opacity button inside the "Advanced" menu.


I have a button there: https://i.judge.sh/each/Star/chrome_qziy165xYV.png

But if the button isn't there, that's probably because w3 has HSTS enabled (for past visitors), which tells the browser to never load the site insecurely. The `thisisunsafe` workaround is there as a middle ground between actually making it impossible to load and letting users casually click through to an insecure page when they shouldn't.


People don't read buttons. However, I share your disdain for Chrome's chosen method; it's very gate-keepery. It requires you to be "in the know"; those who understand the danger but haven't heard of the trick are deliberately left out.

I think a reasonable compromise would be a button that pulls up a dialog prompting you to type out a consent message. The dialog tells you what to write instead of keeping this as esoteric knowledge, but the user is still compelled to at least read the warning instead of blindly pressing buttons.


Compare to Firefox, which has no bypass.

This is the sort of thing where, if you're not in the know, then you probably don't understand the consequences.


> This is the sort of thing where, if you're not in the know, then you probably don't understand the consequences.

Understanding the premise and importance of HTTPS and knowing about some esoteric hidden UX of some Google software are not at all the same thing.


Kinda agree, but they don't have a button probably to prevent people from habitually pressing the button to continue. Training your users to bypass security measures is not good security.


For the same UI related reasons that makes confirmation dialogs useless.

(For them, undo is a much better replacement. Even gmail's 'fake' undo for unsending email.)


They added a button in the form of a hackers-only keyboard command.

Engineers can be so out of touch sometimes.


Does anyone know how to "thisisunsafe" on mobile Chrome?


I clicked Advanced and then it shows Proceed (unsafe)...


I don't get that option, and instead get a message referencing HSTS, presumably because at some point in the past my browser picked up the HSTS setting for the site.


No, look at the cert - it did expire today.


Thing is, it’s not just the certificate that’s expired; the website behind it is also down. The two are almost certainly connected, though cause and effect is uncertain.


Yeah, if I had to guess, the backend is using the same cert.


Same thing.


Happened to w3schools a few years ago. A sign that there may be room for improvement as far as the process for setting up SSL certificates goes?


ACME/Let's Encrypt has made this process streamlined enough that no manual involvement is needed any more:

https://en.wikipedia.org/wiki/Automated_Certificate_Manageme...
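
For example, with certbot (one ACME client; a sketch assuming nginx and a placeholder domain), the whole lifecycle can be automated:

  # obtain a certificate and let certbot configure nginx and schedule renewals
  sudo certbot --nginx -d example.org -d www.example.org
  # renewals then run from a cron job/systemd timer; a dry run verifies the setup
  sudo certbot renew --dry-run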

I think the main issue now is increasing deployment of ACME. Let's Encrypt has been issuing hundreds of millions of certificates for small websites, but how large is their market share in the serious-websites market (Alexa ranks w3.org in the top 10k)? It seems to me that that market is covered by different CAs. I wonder which angle Let's Encrypt or other ACME/automation-based CAs need to improve upon to make themselves attractive to that market.


Attempting to view w3.org currently crashes Safari 14.1 on an M1 Mac running Big Sur 11.3.


SSL certs are such a pain to manage in orgs.


And because I violently agree with this: even w/ Let's Encrypt this is still a pain. (Though it is at least a better-engineered pain, and theoretically tractable.) DNS providers that have bugs¹, services that want to host on port 80 and won't let me proxy them through nginx, auditors that want to block port 80 (guess what ACME uses), services that generate malformed X.509 certs (golang's standard library…), programs that want certs in weird formats (PKCS#12 & Java, I'm looking at you), bad OpenSSL defaults, the rate limits (sorry LE, they're still too low), and there are probably only about 2 dozen people who really understand X.509 & web PKI; to the rest of my company it is wizardry (it really isn't).

¹a certain large provider for the longest time failed to handle having two TXT records, which ACME requires in some circumstances.
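
For instance, getting a PEM cert/key pair into the PKCS#12 bundle a Java keystore expects is a two-step dance (a sketch; file names are placeholders):

  # bundle the PEM cert, key and chain into PKCS#12, then import into a Java keystore
  openssl pkcs12 -export -in cert.pem -inkey key.pem -certfile chain.pem -out bundle.p12
  keytool -importkeystore -srckeystore bundle.p12 -srcstoretype PKCS12 -destkeystore keystore.jks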


> services that generate malformed X.509 certs (golang's standard library…)

In which way are those certificates malformed? Is it because they don't have the extensions needed to pass certlint?


Malformed subject, IIRC.

It's not every cert, mind you (I didn't intend to imply that); it's just the ones that happen to not specify some element. Unfortunately, it's the default of a utility that we use, that's written in Go. It generates a very long-lived cert on start, and by default, it'll be malformed.

We have another utility that crawls certs looking for ones near expiry. That cert trips it up. (And it now has a configurable ignore-list specifically so that we can ignore that cert.)

But more and more I appreciate libraries who (produce well-formed output) xor (error).

Go also has a "simpler" interface to X.509 in its standard library that will easily produce the wrong (but well-formed; i.e., wrong semantically, not syntactically - not what any user is going to want) output under very easy-to-accomplish circumstances.


This describes so much of my work.

"We have a thing that does X. Except sometimes it doesn't, so we have another thing Y that checks X. Except sometimes it doesn't, so we .... "


Thanks for this explanation! I'm always curious about such user stories because I maintain a library to generate certificates.


I found the bug: https://github.com/bitnami-labs/sealed-secrets/issues/398 — it's actually an empty Issuer DN.

Seems like it was recently fixed too, which is nice. Now I'll have to find a way to upgrade…


A grace period would have been a lovely feature to bake into this from the beginning. It's too late now, of course... oh well. I feel for you, w3.org devs; I've been on the other side of this many times.


There's not really any reasonable action for a visitor to take if a certificate is almost expired, so I don't know what a grace period would get you.

Some CAs are nice enough to issue certs with the Not Before date a day or so before the time at issuance, which is super handy for all the devices with wrong time and software that does odd things with timezone conversions.
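
If you want to see both dates for a live site, something like this works (a sketch using openssl; any host will do):

  # print the certificate's notBefore/notAfter validity window
  echo | openssl s_client -servername www.w3.org -connect www.w3.org:443 2>/dev/null \
    | openssl x509 -noout -dates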


The easiest way to manage them I've found so far: put the site behind Cloudflare. Caching, cert management, www. redirect handling (with SSL support), basic protection - it all comes for free. You would need to install their own cert on your infra, but it has 10 or 15 years to expire.


Then you're shunning anyone who uses a VPN, anyone who uses privacy plugins, etc etc.

I despise Cloudflare these days because I'm pretty much required to use a VPN to access the web and get around geolocked content everywhere, and then Cloudflare blocks me from browsing even basic static sites.


Interesting, I've never tried this. I'll check it out.


CloudFlare, being a reverse proxy, has many limitations (such as forced cookies and the upload limit) but I agree that it is ok for many use cases. Let's Encrypt is easily automated and doesn't have any of those downsides.


Are there still "forced cookies" after this change? https://blog.cloudflare.com/deprecating-cfduid-cookie/


Not an option for Australian clients on the free tier - it gets routed through Hong Kong! :(

In any case, CloudFront has been a cheap value option with an easy DNS-based certificate renewal process.


>it has 10 or 15 years to expire

Don't browsers reject certs with longer than one year of validity?


They’re talking about the cert between edge and origin (I.e. cloudflare and your server). The client never sees that cert because they don’t talk directly to your origin.


This is the internal traffic between CF and you. So it's never exposed to browsers.


Without HTTPS, it does work currently:

    curl 'http://www.w3.org/'
I'm one of the people who really think that HTTPS should be enabled by default and the only HTTP traffic that should be allowed are permanent redirects with a pinning header.
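
A rough way to check whether a site behaves that way (hypothetical host; only the headers matter):

  # plain http should answer with nothing but a permanent redirect to https...
  curl -sI http://example.org/ | grep -iE '^(HTTP|location)'
  # ...and the https response should carry the HSTS ("pinning") header
  curl -sI https://example.org/ | grep -i '^strict-transport-security'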


Just a few years ago, that was prohibitively expensive, but the browsers are pushing everything to https only pretty hard now.

Most setups are nearly seamless for getting a free cert going now, but there'll always be a few old sites that no one will ever update.


Maybe they'll switch to Let's Encrypt now.


Gandi Standard Wildcard SSL, apparently.


This is more than a certificate problem…

503 Service Unavailable

No server is available to handle this request.


They probably use the same certificate on their backend servers and their load balancers refuse to proxy them.


Bad cert, great website.


Bring back 3 year certs please


Stop treating self-issued certs as inherently insecure, please.


This is impossible since you can't be certain that a self-signed cert was issued by $isp or $government versus the website owner.


if I issued the cert myself, for sure I can ;)


What really is the difference between a $5 certificate from ssls.com, a self-signed cert, a $60 GoDaddy cert, and a Let's Encrypt cert? Just about nothing, right? So why not trust self-signed certs?


Everything other than a self-signed certificate has technical checks requiring that the person requesting issuance for the domain actually has control over the domain, via DNS or HTTP challenges. Without those, you (or your ISP or government) could self-issue a cert for accounts.google.com (or api.stripe.com or bankofamerica.com) and MITM people to steal information that would otherwise be presumed private from middlemen.
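
Roughly, those checks look like this (a sketch; example.org and TOKEN stand in for what the CA actually chooses):

  # http-01: the CA fetches a token file the requester must serve from the domain
  curl http://example.org/.well-known/acme-challenge/TOKEN
  # dns-01: the CA looks up a TXT record the requester must publish
  dig +short TXT _acme-challenge.example.org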


Let's be clear. Google insists on https because it keeps ISPs from intruding on their ads business. The other threats are small potatoes.


> Google insists on https because it keeps ISPs from [injecting any of their horseshit into my traffic.]

Huh, that's precisely why I insist on https.


In India, even government-run ISPs inject their ads into http sites.


Just encryption would do that; you don't need certs, right?


Then the ISP would simply decrypt the traffic, insert their ads, and encrypt again.



