Privacy and security fans have long flocked to Swiss security enclaves, hoping for maximum protection against prying government eyes, much to the ire of those seeking to poke legal holes to get access to information on bad actors.
One such maneuver was to argue that email and messaging providers should be classified as telecom providers, which under Swiss law would subject them to data retention rules granting law enforcement access. One such provider, ProtonMail, has long maintained that it cannot read specific email accounts because the contents are encrypted, and being forced to keep a readable store seemed a step too far – so the company took the Swiss government to court, and won.
We’re often asked whether security can be too good – that is, so good that it denies law enforcement access, even in exigent circumstances, to data about imminent threats. But the devil is in the details. To allow access is to allow access, and security companies don’t want to be in the business of shipping deliberately weakened code with privileged backdoors, acting as a sort of meta-arbiter of intent. Instead, they focus on building strong security without the defects that could let anyone in.
But companies have to operate legally in every jurisdiction where they do business and are subject to each of those jurisdictions' laws, which is why some head to Switzerland, long perceived as a safe haven for digital security.
The battle for email privacy has been a long one, with various providers shuttering altogether rather than granting authorities access they were uncomfortable with. Meanwhile, new technology platforms continue to roll out hoping to solve security issues while absolving the provider from potential liability.
One way is through the use of a zero-knowledge (sometimes called zero-access) model. When a provider doesn’t know something about its customers, such as the contents of their email accounts, it can’t reasonably be compelled to produce that information. It also means customers can trust the provider not to hand over the data in question, because they never entrusted the provider with it in the first place.
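To make that concrete, here’s a minimal sketch in Python of client-side encryption using the widely available cryptography package’s Fernet recipe. It is not ProtonMail’s actual design, which is OpenPGP-based and far more involved, but it shows why a provider holding only ciphertext has nothing readable to hand over.

```python
# Minimal sketch of the zero-knowledge idea: the key lives only on the
# user's device, so the server (and anyone compelling the server) holds
# nothing but opaque ciphertext. Illustrative only, not a real mail system.
from cryptography.fernet import Fernet

# Generated and kept on the user's device; the provider never sees it.
user_key = Fernet.generate_key()
mailbox_cipher = Fernet(user_key)

# The client encrypts before upload; the provider stores only this token.
stored_on_server = mailbox_cipher.encrypt(b"Meet at noon. Bring the documents.")
print("What the provider holds:", stored_on_server[:40], b"...")

# Only the key holder can turn it back into plaintext.
print("What the user sees:", mailbox_cipher.decrypt(stored_on_server))
```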
This and other potential single points of failure in the email chain are tricky problems to solve. One is the certificate authority: if compromised, it can signal unwarranted trust to email systems and thereby allow rogue actors to siphon off information along the way. One proposed fix distributes certificate verification across a mesh of nodes, which is far more difficult to game. But email security will always be a game of cat-and-mouse.
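For illustration, here’s a rough Python sketch of that distributed-verification idea. It doesn’t reproduce any particular deployed protocol, and the observer names and fingerprints are hypothetical placeholders; the point is simply that a certificate is trusted only if a quorum of independent observers report having seen the same one, so a single compromised authority is no longer enough.

```python
# Rough sketch of distributed certificate verification: accept a certificate
# only if a quorum of independent observers report the same fingerprint.
# The observer reports below are hypothetical placeholders, not real data.
import hashlib
import ssl

def cert_fingerprint(host: str, port: int = 443) -> str:
    """Fetch the server's certificate and return its SHA-256 fingerprint."""
    pem = ssl.get_server_certificate((host, port))
    der = ssl.PEM_cert_to_DER_cert(pem)
    return hashlib.sha256(der).hexdigest()

def verify_with_observers(host: str, observer_reports: dict[str, str], quorum: int) -> bool:
    """Trust the certificate only if at least `quorum` observers saw the same one."""
    seen = cert_fingerprint(host)
    agreeing = sum(1 for fp in observer_reports.values() if fp == seen)
    return agreeing >= quorum

# Hypothetical reports from independent vantage points (placeholder values).
reports = {
    "observer-eu": "placeholder-fingerprint",
    "observer-us": "placeholder-fingerprint",
    "observer-asia": "placeholder-fingerprint",
}
print(verify_with_observers("example.com", reports, quorum=2))
```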
That’s because email is worth so much to someone seeking to reverse engineer your life. It’s not just the content; it’s the frequency and identity of the other parties on the email that suggest compelling, actionable patterns of life. This kind of evidentiary pattern matching can be too tempting for law enforcement to ignore.
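A small illustrative script makes the point. The mailbox file name below is an assumption, and no message body is ever read, yet the headers alone reveal who you correspond with, how often, and when.

```python
# Illustrative only: build a "who, how often, when" picture of a mailbox
# from headers alone -- message bodies are never read.
# "inbox.mbox" is an assumed local mailbox file for the example.
import mailbox
from collections import Counter
from email.utils import parseaddr, parsedate_to_datetime

contacts = Counter()
hours = Counter()

for msg in mailbox.mbox("inbox.mbox"):
    sender = parseaddr(msg.get("From", ""))[1]
    if sender:
        contacts[sender] += 1
    try:
        hours[parsedate_to_datetime(msg["Date"]).hour] += 1
    except (TypeError, ValueError):
        pass  # skip messages with missing or malformed Date headers

print("Most frequent correspondents:", contacts.most_common(5))
print("Most active hours of day:", hours.most_common(3))
```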
Some government agencies have even gotten more granular, seeking to classify encryption as a form of weaponry and restrict its use and export across unfriendly borders. But that’s devilishly tricky to do. Cryptography, after all, is all about implementing a series of math equations on generic technology platforms. How would they reasonably restrict the use of math to certain geopolitical locales? It doesn’t really work.
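The “it’s just math” point is easy to demonstrate: a toy Diffie-Hellman key exchange is nothing but a few modular exponentiations that run anywhere Python does. The numbers below are deliberately tiny and insecure; real deployments use far larger parameters.

```python
# Toy Diffie-Hellman key agreement: nothing here but modular exponentiation,
# which any general-purpose computer can do. Parameters are deliberately tiny
# and insecure -- a math demonstration, not something to deploy.
p, g = 23, 5          # small public prime and generator (textbook values)

alice_secret = 6      # private values, never transmitted
bob_secret = 15

alice_public = pow(g, alice_secret, p)
bob_public = pow(g, bob_secret, p)

# Each side combines its own secret with the other's public value...
alice_shared = pow(bob_public, alice_secret, p)
bob_shared = pow(alice_public, bob_secret, p)

# ...and both arrive at the same shared key without ever sending it.
assert alice_shared == bob_shared
print("Shared secret:", alice_shared)
```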
What about thwarting “people who do really bad things with technology”? That’s certainly a field of interest for lots of technologists, but what should be considered a step too far for privacy? There has to be a balance, and those nuances will be weighed by smart people for years to come. For technologists, though, the remit centers on simply writing the best, most secure code with the fewest bugs and vulnerabilities, not on determining intent. Good code is hard enough to keep us busy.
For now, at least, security and privacy just got a leg up in one small mountainous part of the world. We’ll have to watch the ripples spread out from there.