Thanks @mitra for this.
Long-lived certificates were one of the options we thought could be explored.
Though registrars could be willing to help, for the general case of anyone who might not have a relationship with them (or the time to build one), it is enough to know that a valid option is to get one of these certs every 2-3 years.
I still think that an automated LetsEncrypt cert every three months is a possible approach, if fetching and deploying the cert are also automated.
I haven't come across a community so disconnected that no member gets a connection to the outside world within three months.
If that were the case, you could stream the TLS certificate over satellite with something like Othernet/Toosheh to beam down the certificate.
Sure Nico - theoretically possible, but you've got to have all of that work seamlessly for non-technical users, and if they fail to renew in time, the system goes down until they get a new cert.
Indeed…
So then the options would be:
- Paid long-lived certs: simple and (maybe) expensive.
- Paid shared long-lived cert: simple and cheap, but no security between the people who share it.
- LetsEncrypt cert: requires a regular renewal that, if it doesn't happen, breaks the system until the cert gets updated.
- Self-signed cert: you need to accept the dreaded browser warnings.
The other option, if you have the chance, is not using the browser at all and using your own app, where you can do SSL pinning. This has its own pros and cons, and introduces the complexity of how you distribute your own app (and mobile/desktop support).
Any other option to consider?
Good summary @nicopace
The most important issue is what @devon mentioned: use a fully qualified domain name, such as myofflinedomain.com. But beyond that, also set up a host on the Internet with some information about the site, so that users who reach it through another connection by accident know what it is - or know that it is zero-rated or free, or whatever the case may be, when accessed from their network. (If they're accessing it on the offline network, a simple script on your page can tell them, by checking whether the IP address of the request is in the network's range.)
Then on your main router you can redirect the traffic to the local site via NAT, or you can use split DNS to point it: e.g. myofflinedomain.com to the local IP address of the server with the content.
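For the split-DNS variant, a single line in dnsmasq (which many community routers already run) is enough. This is a minimal sketch, reusing the example domain from this thread and assuming the content server sits at 10.0.0.5; adjust both to your network:

```shell
# /etc/dnsmasq.conf on the main router:
# answer queries for myofflinedomain.com (and subdomains)
# with the local content server's address instead of the
# public one returned by upstream DNS.
address=/myofflinedomain.com/10.0.0.5
```

Clients inside the network resolve the domain to the local server, while the same name still resolves to your Internet host everywhere else.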
The fact that technicians have to click past certificate warnings when configuring wireless access points sets a very bad precedent. I want to STRONGLY URGE every community network to help their communities understand how computers are different from the real world, and how anybody can steal information that you might not think is valuable, but that is valuable to, for example, the financial world or the police. Users should never click past a certificate warning, and we should never use self-signed certificates or teach anybody in the network how to install a CA in their browser, for the simple reason that this makes it look like an acceptable thing to do. It is not: a criminal can prompt someone to do exactly the same thing in order to steal their identity and take out loans in their name or with their banking details, for which the victim, not the criminal, will be held responsible. If clicking past a warning is not something they have ever done, and is something they have been warned against, they will be more resilient against exploitation.
I think the best option is automated and periodic cert updates via LetsEncrypt. It requires one trusted server (or more if you like) in the CN that is connected to the Internet, with the responsibility of getting updated certs regularly and pushing them out to the local servers.
For example, you may have a CN node at 10.0.0.5 hosting a chat server that you want any device in the CN to access, even if those devices don't have Internet access. Let's say you own the domain example.com and have a local DNS server in the CN that points chat.example.com → 10.0.0.5. In this case, not even the chat server at 10.0.0.5 needs Internet access; it just needs to connect to the cert server that has Internet access.
To do this, you'd want to set up the cert server to use LetsEncrypt's DNS-01 challenge, which basically means publishing a TXT record as proof that you own example.com. Once LetsEncrypt sees your challenge response via the TXT record, it will issue you a cert, which you then push out to 10.0.0.5 (via the CN, or by running a USB key over; that doesn't matter). Note that the CA doesn't care what IP address ends up serving the cert, as the verification is completely decoupled from the web server.
When someone in the CN asks a DNS server, whether the local one or an Internet DNS server, you'd want to direct them to 10.0.0.5. The client will receive from 10.0.0.5 the valid cert issued by the LetsEncrypt CA and be happy with it.
This setup is common in home networks (I do this for my own home devices with local IPs), or in companies that run a lot of web servers and delegate a single node to handle certs for everyone. Of course, these are all trusted environments; in this CN case the cert server basically has the capability to act as anyone who depends on it, but it also handles all the complexity on behalf of the local servers.
I have some code here on how to set up the cert server to get certs for a list of domains. It uses dehydrated to do LetsEncrypt's DNS-01 challenge, and Digital Ocean APIs to automate the TXT publishing. Everything should be in the [ crontab, dehydrated, nginx ] folders and should only take a couple of hours to set up, assuming you already have a domain name to use and are ok with pointing NS to Digital Ocean. The part I don't have here is how to push the fetched certs to 10.0.0.5, but that distribution strategy is up to the CN, and should be straightforward.
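As a sketch of one such distribution strategy over the CN itself: assuming the chat server at 10.0.0.5 runs nginx and is reachable over SSH (the paths below are dehydrated's defaults and an assumed nginx cert directory, not anything from the repo above):

```shell
# Push the freshly issued cert and key to the local server,
# then reload nginx so it picks them up.
scp /etc/dehydrated/certs/chat.example.com/fullchain.pem \
    /etc/dehydrated/certs/chat.example.com/privkey.pem \
    root@10.0.0.5:/etc/nginx/certs/
ssh root@10.0.0.5 'systemctl reload nginx'
```

Wrapping these two commands in dehydrated's deploy hook would make the push happen automatically on every renewal.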
Lastly, this may be a useful read; it's essentially the same problem. I think I am months late to this thread, but hope this helps.
Thanks @benhylau for joining the conversation.
The biggest challenge lies in networks that don't have reliable access to the Internet, or have no Internet access at all.
I like the DNS TXT challenge approach, as it is very simple to set up, and then you can take the cert wherever you want.
And thanks everyone for working together to find ways around a tricky issue that appears because CNs (and offline systems, among others) are not taken into consideration when these structures are designed.
I hope other encryption and trust chains can be developed in the future, where these discussions can inform the processes that shape them.
There is a very interesting discussion thread on wicg.io about exploring new protocols for trust in the local networks, without relying on internet infrastructure.
I encourage you all to jump in and contribute.
Just implemented SSL for the local server in my community network.
- Get a domain. I got moinho.app because it's short and cheap.
- On a cloud server, generate a wildcard certificate using certbot. Instructions here.
- Copy the certificates and generate `crt` and `key` files:
  mv fullchain.pem moinho.app.crt
  openssl pkey -in privkey.pem -out moinho.app.key
  Change `moinho.app` for your domain.
- Copy the `crt` and `key` files to the `certs` directory of the server (in my case nginx-proxy).
- Done!
All my local services have SSL, and when you access moinho.app online you see a slightly different version of the app, also with SSL.
The next step would be to automate this; it seems @benhylau has done some work in that direction.
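A minimal automation sketch for the steps above, assuming certbot is already set up on the cloud server and the local server is reachable from it (e.g. over a VPN); the host, user, and destination path below are made up for illustration:

```shell
# Crontab entry on the cloud server: attempt renewal weekly;
# certbot only runs the deploy hook when a cert was actually
# renewed, pushing the new files to the local server's certs dir.
0 3 * * 1 certbot renew --quiet --deploy-hook \
  'scp /etc/letsencrypt/live/moinho.app/fullchain.pem /etc/letsencrypt/live/moinho.app/privkey.pem user@local-server:/path/to/nginx-proxy/certs/'
```

You'd still need to convert/rename the files and reload the proxy on the local side, which could go in a small script called from the same hook.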
Interesting thread on HackerNews
FWIW, I've been using the howto at Lets Encrypt for internal hostnames | jsavoie.github.io for local https services and it works pretty cleanly.
And now there is https://www.getlocalcert.net/ with an accompanying debate on HackerNews