PersonalEncrypter – New and Improved

A couple of months ago I updated the PersonalEncryptorCLI project to remove information leakage in the form of the original filename of the encrypted data. In my original post about this project I alluded to moving away from the Microsoft encryption libraries and implementing the libSodium library instead.

Well, I’m happy to say that I’ve found some time to do this and moved to .NET Standard at the same time – yep, the future is here.

The initial project was really a quick and dirty exercise to demonstrate the ease with which these applications can be developed. It didn’t really lend itself well to extension – it was only meant to be a proof of concept after all.

In the new project I have created a class library to handle the mechanics of the encryption and a separate project for the CLI. There is also a skeleton project for a UWP desktop application which I’m currently working on (and committed in error – but hey, I’m only human). The UWP application will initially be aimed at Windows 10, but the ultimate aim is for a version of this project to be installable on just about any platform, e.g. Mac, Linux, iOS and Android.

Note: The libSodium project uses a different encryption algorithm to the Microsoft libraries, so it will require a new key pair to be generated.
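For anyone curious, here’s a minimal sketch of what key generation looks like with the libsodium-net bindings – the file names and structure are mine for illustration, not necessarily what the project does:

```csharp
using System;
using System.IO;
using Sodium; // libsodium-net bindings

class KeyGenSketch
{
    static void Main()
    {
        // libSodium's crypto_box uses Curve25519 key pairs rather than the
        // RSA XML key pairs produced by the Microsoft libraries - hence the
        // need to generate a fresh pair.
        var keyPair = PublicKeyBox.GenerateKeyPair();

        File.WriteAllText("public.key", Convert.ToBase64String(keyPair.PublicKey));
        File.WriteAllText("private.key", Convert.ToBase64String(keyPair.PrivateKey));
    }
}
```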

So, I’m now working on the UWP application, which may even find its way onto the Microsoft Store (if they’ll have it) – after all, that’s the aim, isn’t it – to get encryption into the hands of the masses so that they can have some semblance of privacy.

I’ll be pushing updates to the public Github repo and announcing major milestones here so … watch this space!

Personal Encryptor CLI v1.1.0 Released

It’s been a good seven months since I released the initial version of the PersonalEncryptorCLI project and in that time I’ve had more than a handful of emails about the utility. Most of these asked if I could/would be creating a Desktop version, i.e. something with a GUI. It seems only geeks like the Command Line – who knew 😉

Well, the answer is yes, but first I needed to clear the remaining issue that I had identified with version 1.0.0 – the need for the recipient to know the context of the encrypted content, i.e. whether it was a text file, an image or a Word document.

This was due to the initial release of the utility requiring the filename to be specified during the decryption process – including its extension. Now, if the user didn’t know that it was a Word document, how would they know to specify a filename with a .docx (or .doc) extension?

Sure, the sender could include that in the email (or via some other mechanism) but this is information leakage, and it could potentially assist so-called ‘bad actors’ in determining the contents – regardless of whether they could access the actual contents or not.

Well, today I managed to get a couple of hours to update the code to include the original filename in the Encrypted Packet (the value being encrypted, of course). During the decryption process this value is extracted and used to create the file for the decrypted data to be written to.
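Conceptually the packet now looks something like this – a sketch only, with illustrative property names rather than the actual serialised format:

```csharp
public class EncryptedPacket
{
    public byte[] EncryptedData { get; set; }             // the file contents, encrypted
    public byte[] EncryptedOriginalFilename { get; set; } // new in 1.1.0 - also encrypted, so nothing leaks
    public byte[] EncryptedSessionKey { get; set; }       // session key, encrypted with the recipient's public key
    public byte[] Iv { get; set; }                        // initialisation vector for the symmetric cipher
    public byte[] Hmac { get; set; }                      // integrity check
    public byte[] Signature { get; set; }                 // confirms who encrypted the packet
}
```

The filename only ever travels inside the packet in encrypted form, so nothing is leaked in transit.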

There have been a couple of tweaks to the project Readme.md file to handle the changes to the command line options for the decryption operation, but it’s fairly trivial – just dropping the filename from the output path parameter.

“But what if I have encrypted packages that were created with the previous version – which don’t contain the filename?”

Don’t fret – I have a few of these too, so the code will fall back to a temporary filename in this instance (hopefully you remember what the file was!).

I’ll be decrypting my now-legacy data and re-encrypting with the new version of the utility – you may want to do the same 😉

You can access the main project page here or the new 1.1.0 release here.

Now that the decks are clear – I can turn my attention to the Desktop client, which may or may not look something like this:

Taking control of my Domain

Some time ago I was watching a Pluralsight course called ‘Master Your Domain’ where Rob Conery explained how to break your reliance on the major service providers for email, source code, blogs and file-sharing and create your own domain to host your data.

Following the course I started hosting my own Git server, Blog and File Sharing service, but Email … well, that was too big a step for me to take at the time. However, times change, and when I started experiencing issues with my email that was the trigger for me to take the plunge.

What was the problem with Google Mail?

When GMail moved to Inbox I have to say I was less than impressed. For my personal email it was fine – I didn’t mind moving to ‘Inbox Zero’ – but, call me a dinosaur, it just jarred with me when it came to my business account.

Now, I really didn’t want the hassle of moving email providers, so as many of the ‘problems’ were to do with the Inbox application I decided to use an alternative email client called WeMail on my Android phone, and this served me very well for a couple of years.

Recently, however, I started noticing multiple ‘draft’ messages hanging around (basically ‘snapshots’ of messages taken as I was typing, even after they had been sent) and issues with message signatures – sometimes they were added, sometimes not. Was this a problem with WeMail or Google – who knows?

What about Google Docs?

I was also not overly impressed to see that Google had blocked ‘large numbers’ of users from accessing their files on Google Docs. Admittedly Google were trying to filter out malicious content, but the fact remains that they were scanning everybody’s files for what they deemed to be inappropriate. What if content in my files triggered a false positive and they blocked my access to an important document? And what about my email? I have all sorts of information in there, from company accounts and confidential client discussions to inane conversations with recruiters and colleagues.

Making the move

After deciding to make the move I had a look at what I actually stored on Google services and what needed to be migrated.

Obviously there were a couple of gigabytes of email, but I also had a lot of stuff in Google Docs – from invoices to memes – where was I going to put it all?

Files and Stuff

As already mentioned above, following Rob Conery’s course I had configured my own Dropbox-like File Sharing service using OwnCloud, and it had been running fine for a while. I had the server installed on a Raspberry Pi sitting on top of a 250GB SSD in an external drive enclosure. With the appropriate DNS configuration and port forwarding on my router this setup worked well for me, allowing me to share folders with clients for them to upload large media files and letting me transfer files between my Windows workstation & laptops as well as my iMac and MacBook Pro.

Email

As Rob mentions in his course, it’s really not viable to host your own email these days. Messages coming from unknown sources are blacklisted by many services in an effort to reduce the level of spam reaching our inboxes. So I needed to find an alternative provider; one that gave me all the features I already had (spam protection, mobile access etc) but with some increased confidence in the privacy of my inbox.

In the course Rob recommends Fastmail, and after reading their Privacy Policy I was happy to give them a try – they offer a 30 day free trial. I had actually tried them previously, but not ‘in anger’ as it were, i.e. I created an account, sent test messages, added appointments etc but never actually used it on a daily basis.

After exporting my Calendar and Contacts from GMail I set about the import process from within Fastmail. The process itself was pretty straightforward, with clear instructions and troubleshooting advice. I experienced no real problems but I’m sure that Fastmail support would have been on the case if I had.

The only ‘grumble’ I had was that my Gmail data was imported into a folder called ‘migrated’ – I was expecting my Gmail Inbox messages to appear in my new Inbox. This caused a bit of consternation at the time, but looking at it now I’m not so sure it’s a problem – all the data is there and I can easily move things around if I so desire.

Re-configuring my DNS to redirect email to the Fastmail servers was also straightforward, and a couple of weeks into my trial I’m very happy with the service I’m receiving, so I will definitely be signing up for the full plan.

So what about Backup?

So I now have my email hosted successfully and my files are back under my control – we’re all good, yes?

Well not quite.

One of the things we don’t really think about is that, on top of storing all our information and making it available to us online, Google are actually backing this stuff up. If one server were to totally fail then the data is ‘simply’ pulled from another and we never know there was a problem.

Well, the data is now sitting on a drive in my office – what happens if it fails, or the office burns down? How will I get that data back? I need a regular, offsite backup.

The answer was fairly simple and conforms with my need to keep my information private.

I had previously bought a Mac Mini for developing my Xamarin iOS applications (it was later replaced with an iMac), so I fired it up and installed the OwnCloud client onto it. This was set to sync everything to its local drive – and yes, it’s still sitting in my office, so at this point I’ve gained nothing.

I then signed up for a SpiderOak account using their 21 day trial – initially 250GB, though they later increased this to 400GB. Their ‘SpiderOak One’ client was then installed onto the Mac Mini and configured to back up everything in the OwnCloud sync folder.

I’ve also installed the One client on my workstation and mounted a couple of folders from my Synology NAS onto the Mac Mini for good measure. So far I have backed up almost 100GB of data, so there is plenty of headroom for future expansion.

Going Forward

Ok, some of you may be asking about the cost of all this and yes, there is some additional outlay – my Google Apps account was created when they were free and, to their credit, Google have honoured this long after they started charging for new accounts. But the cost to the business is minimal – and even as a personal user it’s certainly not prohibitive.

The backup solution I have in place does have its downsides – we had a power cut here a while back and I totally forgot to reboot the Mac Mini, so there were no backups for a while.

But the fact is that I now have control over my data and if this takes a little more work and expense then such is life.

The Personal Encryptor 1.0 Released

Following on from my post about the UK Government’s campaign to erode our privacy by demanding that tech companies put back doors in their encrypted products, I have created a simple utility to demonstrate how easy it is for a reasonably competent developer to create their own encryption tool using standard development tools and libraries.

Now, I’m not expecting the UK Government to take a blind bit of notice but the fact is that encryption is out there, it’s only mathematics after all, and it’s not going away. You cannot feasibly make maths illegal – although the US did classify encryption as a weapon until 2000 (and to some degree still does).

Anyway, following my commitment to watch at least one Pluralsight course a month during 2018 I opted for Practical Cryptography in .NET by Stephen Haunts to give myself some suitable background.

The course was a minute under four hours and took me a couple of evenings to get through. Cryptography is not the most stimulating subject, but Stephen did his best to keep the information flowing. At times I did feel frustrated at how he seemed to labour some points, but the upshot is that by doing this the information did seem to get through and stick. During the course he slowly increased the complexity, developing and enhancing C# code to demonstrate the principles.

It is this code which I have used as a base to create the ‘Personal Encryptor’ (hereafter referred to as PE) – a command line application that can be used to generate encryption keys, encrypt and, of course, decrypt data into files that can be safely sent over the Internet. Only someone with the required public and private keys will be able to decrypt the file and view the original data.

I’ll probably put another post together shortly diving a bit deeper into the functionality and explaining the contents of the output file – but I highly recommend you watch the above course, as Stephen knows the subject inside out and does a great job of explaining it.

Why would I need/want to encrypt a file?

Imagine the following scenario:

Alice and Bob want to exchange private messages with each other; maybe they are planning a surprise birthday party or sharing ideas about a new business venture. Whatever the messages contain, they are Alice and Bob’s business and nobody else’s.

  1. Alice and Bob both download the PE application and copy it to a location on their Windows PC (Mac version coming soon).
  2. They then use the utility to generate a Public and Private key pair – which will create two XML files (see the sketch after this list).
  3. They send each other their PUBLIC keys (these are just XML files and can be freely sent over the Internet or via Email).
  4. Both Alice and Bob copy their PRIVATE keys to a safe location (maybe a secure USB key – or a normal USB key which is stored in a safe).
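For the technically minded, step 2 boils down to something like the following – a minimal sketch assuming RSA keys exported as XML, which is how the Microsoft libraries handle it:

```csharp
using System.IO;
using System.Security.Cryptography;

class KeyPairSketch
{
    static void Main()
    {
        using (var rsa = new RSACryptoServiceProvider(2048))
        {
            // The PUBLIC key - safe to email or post anywhere.
            File.WriteAllText("publicKey.xml", rsa.ToXmlString(false));

            // The PRIVATE key (which includes the public part) - keep it safe!
            File.WriteAllText("privateKey.xml", rsa.ToXmlString(true));
        }
    }
}
```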

Now Alice wants to encrypt a file – a PowerPoint presentation for their new product – and send it to Bob:

  1. Alice uses the PE application to encrypt the file using Bob’s PUBLIC key.
  2. The PE application Digitally Signs the encrypted data using Alice’s PRIVATE key (both steps are sketched after this list).
  3. A text file is created containing the encrypted data and everything needed to verify that the contents have not been tampered with and to confirm that Alice encrypted it.
  4. Alice then emails the file to Bob as she normally would if she were sending a photo of her cat!
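Under the hood this follows the hybrid pattern taught in the course: AES encrypts the data, RSA encrypts the one-off AES session key and produces the signature. Here’s a simplified sketch (the method and variable names are mine; the real utility also serialises the IV, an HMAC and the signature into the output file):

```csharp
using System.IO;
using System.Security.Cryptography;

static class EncryptSketch
{
    public static void EncryptAndSign(string inputPath,
                                      string recipientPublicKeyXml,
                                      string senderPrivateKeyXml)
    {
        byte[] plaintext = File.ReadAllBytes(inputPath);

        using (var aes = Aes.Create())
        {
            // 1. Encrypt the file with a one-off AES session key.
            byte[] ciphertext;
            using (var encryptor = aes.CreateEncryptor())
                ciphertext = encryptor.TransformFinalBlock(plaintext, 0, plaintext.Length);

            // 2. Encrypt the AES session key with Bob's PUBLIC key (OAEP padding).
            byte[] encryptedSessionKey;
            using (var rsa = new RSACryptoServiceProvider())
            {
                rsa.FromXmlString(File.ReadAllText(recipientPublicKeyXml));
                encryptedSessionKey = rsa.Encrypt(aes.Key, true);
            }

            // 3. Sign the ciphertext with Alice's PRIVATE key.
            byte[] signature;
            using (var rsa = new RSACryptoServiceProvider())
            {
                rsa.FromXmlString(File.ReadAllText(senderPrivateKeyXml));
                signature = rsa.SignData(ciphertext, "SHA256");
            }

            // A real implementation would now write ciphertext, aes.IV,
            // encryptedSessionKey and signature out to the packet file.
        }
    }
}
```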

Bob receives the message and downloads the encrypted file to his computer:

  1. Bob uses PE to decrypt the file by specifying the location of his PRIVATE key and Alice’s PUBLIC key (the flow is sketched after this list).
  2. The PE utility will check the digital signature using Alice’s PUBLIC key to confirm that it was signed with her PRIVATE key.
  3. It will then check the integrity of the package to ensure that it has not been tampered with in transit.
  4. If all is well then the PE application will decrypt the file and write the contents out to a location that Bob has specified.
  5. Bob can now enjoy (or endure) Alice’s PowerPoint presentation.
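The receiving side, again as a simplified sketch with illustrative names – note that the signature is verified before anything is decrypted:

```csharp
using System.IO;
using System.Security.Cryptography;

static class DecryptSketch
{
    public static byte[] VerifyAndDecrypt(byte[] ciphertext, byte[] encryptedSessionKey,
                                          byte[] iv, byte[] signature,
                                          string senderPublicKeyXml,
                                          string recipientPrivateKeyXml)
    {
        // 1. Verify the signature with Alice's PUBLIC key before doing anything else.
        using (var rsa = new RSACryptoServiceProvider())
        {
            rsa.FromXmlString(File.ReadAllText(senderPublicKeyXml));
            if (!rsa.VerifyData(ciphertext, "SHA256", signature))
                throw new CryptographicException("Signature check failed - do not trust this package.");
        }

        // 2. Recover the AES session key with Bob's PRIVATE key.
        byte[] sessionKey;
        using (var rsa = new RSACryptoServiceProvider())
        {
            rsa.FromXmlString(File.ReadAllText(recipientPrivateKeyXml));
            sessionKey = rsa.Decrypt(encryptedSessionKey, true);
        }

        // 3. Decrypt the data with the recovered session key and IV.
        using (var aes = Aes.Create())
        using (var decryptor = aes.CreateDecryptor(sessionKey, iv))
        {
            return decryptor.TransformFinalBlock(ciphertext, 0, ciphertext.Length);
        }
    }
}
```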

Of course, if Alice (or Bob) just wanted to encrypt a file for their own personal use and not for sharing, it is perfectly feasible to provide their own Private AND Public keys to encrypt the data. The same keys will then be required to decrypt it.

And that’s it, privacy restored/reclaimed.

I can now safely download my LastPass vault in plain text, encrypt it and save it to any cloud drive I like, secure in the knowledge that, as long as my private key remains under my control, nobody can decrypt it to access its contents. Nothing illegal there – these are passwords to legitimate sites (Amazon, Pluralsight, Microsoft, Apple etc) and need to be protected. A valid use of The Personal Encryptor.

Going Forward

Yes, at the moment it requires users to have some familiarity with the Command Line, but this project was always intended to be a proof of concept. The original aim was to explore encryption so that I could implement it in an existing mobile Chat application.

Creating a simple GUI would certainly be possible – a simple Winforms or WPF application to collect file paths and call out to the command line utility shouldn’t take too long for a competent developer to write. That said, I’m probably going to focus my attention elsewhere.

While using the Microsoft libraries is perfectly valid in my opinion, I am aware that many people will wince just a little bit. With this in mind I intend to investigate using the libSodium Crypto Library which Steve Gibson is such a fan of – so that’s good enough for me.

You can download the latest version of the Personal Encryptor application by clicking here. Alternatively you can download the full source from Github.

WhatsApp – a Haven for Paedophiles and Terrorists?

Yep – thought that would get your attention!

It’s headlines like this that the UK Government (and the press) are throwing around in order to drum up support for one of the most intrusive and privacy damaging campaigns to date.

The premise is that bad people use these services, which make heavy use of encryption to keep messages private, and by doing so hamper the security services who can no longer access private information in order to monitor them and stop them from doing bad things.

Now I’m not denying that these bad people do use WhatsApp (and similar applications) to enable them to communicate without their messages being intercepted. But I use WhatsApp and so do my wife and kids and we are not bad people. If WhatsApp are expected to put a backdoor into their systems to allow access to the content by so-called ‘authorised agencies’ then what about our privacy?

When I discuss this with people many will say “well, if you’re not doing anything wrong then what’s the problem?”. However, when I ask them for their email and social media passwords they are somewhat reluctant to hand them over – “but if you are not doing anything wrong then why do you care?”, I ask.

The answer is simple: their email and social media feeds are private and none of my business. Just because something is private does not mean it’s illegal or even wrong – just private.

We may be discussing our medical history, financial details, travel plans or just what time we will be home for tea but that’s our business, it’s private and nobody else’s business except ours and whoever we’re talking to.

So while I am willing to accept that bad people use these platforms in an effort to hide their activities, I’m pretty sure that they make up a tiny percentage of the 1,000,000,000 (and increasing) WhatsApp users. Do we all have to give up our right to privacy for the sake of these people and will it even make a difference?

The Snoopers’ Charter

In 2016 the Investigatory Powers Act, or Snoopers’ Charter as it was dubbed, was passed into law, and with it the privacy of every UK citizen was eroded a little more.

Did you know that under this legislation your Internet Service Provider now has to keep your browsing history for 12 months and provide it on demand to authorised agencies?

If you did then you may have assumed that as long as you are not “doing anything wrong” then you have nothing to worry about as the Police and Security Services are only looking for bad guys.

Well, did you also know that on the list of agencies that can access these records are:

  • HMRC (the tax man)
  • The Department of Work and Pensions
  • The Department of Transport!
  • The Welsh Ambulance Services and National Health Service Trust!!
  • The Food Standards Agency!!!

Now what on earth does the Food Standards Agency need with my internet browsing history? What possible use could it be to them?

If the UK Government were to enforce a backdoor into WhatsApp and other platforms like it – who would be able to access the information and how secure would it be?

But that’s not all. If the Government weakens encryption and demands backdoors be created in otherwise secure systems, who knows who can gain access to the information that was once protected?

If SSL certificates (which put the padlocks in your browser’s address bar to indicate that the page is secure) become less secure, how safe are you when you are accessing your online banking or shopping on Amazon?

The truth of the matter is that if the UK Government gets its way it’s not really them that we have to worry about – it’s the hackers. They will have a field day with all this insecure data flying over the wire. All it would take is one poorly implemented backdoor and all bets are off. If Government agencies cannot even secure their own data, what chance do they have of securing the keys to ours?

A Developer’s Viewpoint

So, apart from being a UK citizen, what has this got to do with me and why am I ranting about it?

Well, as a developer I know that writing a chat application is not really that hard – in fact I recently read a book which guided the reader through cross-platform Xamarin development, and the target project was a cross-platform chat application. Moreover, the source code is actually on Github, so there’s a starting point right there.

Currently that XamChat application stores and sends data in plain text, so it is neither secure nor private. But how difficult would it be to upgrade the app to use encryption? Even though I am not a cryptographer by any stretch of the imagination, I’m guessing not that hard at all.

And that’s the point – if I can do this then any reasonably competent developer could do it too. If the UK Government were to make it unattractive for the bad guys to use secure apps like WhatsApp, then there is nothing stopping them from writing their own end-to-end encrypted messaging system using state of the art encryption that cannot be broken with today’s technology.

Meanwhile, the rest of us will be using insecure systems that leak information and make us vulnerable to malicious hackers keen to exploit these weaknesses, gather personal information and use it to their own ends.

Going Forward

In an effort to prove my point, I’m going to take a run at this. Ultimately I’m going to see just how hard bolting encryption onto the XamChat application really is.

I’m not expecting (or intending) to create a WhatsApp killer or even anything that polished – just something usable to prove the point.

First thing to do is to get up to speed on encryption, especially in .NET. There’s a 4 hour course on Pluralsight, so I can kill two birds with one stone: meet my commitment to watch one Pluralsight course a month, and create a Command Line application to generate Encryption Keys and Encrypt & Decrypt text data in preparation for creating SecureXamChat.

Edit – 15th Feb 2018: Subsequent to me posting this there was a great article in The Guardian which (obviously) makes a much better job of getting the point across and is well worth a read.

Moving to new Virtual (really) Private Server provider

With all the post-Snowden privacy concerns I recently started to consider how much of my data was being stored on US servers, and hence fell under US legislation. It didn’t really surprise me that the answer was ‘quite a lot’, when you consider GMail, Drive, Dropbox, GitHub, Squarespace etc, but what was the alternative?

I’ve hosted my own systems, such as Joomla, Drupal, Subversion and Redmine, on Virtual Private Servers (VPS) running Linux for years, but some time ago I scaled this back in favour of the above services. This was not a cost issue – the two servers I had cost less than £10 a month – but I’m not really an Infrastructure guy, and configuring/patching servers and applications is not really what I do. But with all this mass snooping going on I thought that maybe I should at least take a look at moving my data back under my control.

I’d previously watched a Pluralsight course called Mastering Your Own Domain which presented a couple of interesting applications:

  • OwnCloud – a Dropbox alternative
  • GitLab – as the name suggests, a self-hosted source control system similar to Github

There were a number of other very interesting points made during the course so if you have access to Pluralsight then I highly recommend watching it.

Having all but stopped using the two VPS that I had, it was quite simple for me to re-image one of them and have a play with the above applications – and I must say I was very impressed (still am). But then came the bombshell – my VPS provider emailed out of the blue to say that they were merging with a US company with immediate effect (immediate meaning just that!). Now, the servers were still hosted in the UK, but did they come under US legislation? If the NSA requested all of my data (or all of the data on the appliance that my VPS was running on) would it simply be handed over? If so, what was the point in effectively moving from one US service to another?

I raised a support ticket and received a reply saying:

“.. buying a service from a non-US company gives you very little protection in any case..” 

“.. Remember GCHQ in the UK were collecting lots of data on behalf of the NSA, and vice versa”

Not very reassuring I think you will agree, so I pressed the point and asked them straight out:

“If you received a request for my data (or all data from an appliance) from the US authorities, would it be handed over?”

I received no response and the ticket remained open until I ultimately closed my account a few months later.

With my investigations into OwnCloud and GitLab now well underway, and being happy with how they were panning out, I decided to look for an alternative provider. Taking everything into account, I knew that the new service would have to be provided by a company that was neither in the US nor the UK.

After quite a bit of searching around I finally decided on Bahnhof – a company based in Sweden which also hosted WikiLeaks. Their Privacy Policy says it all really. Yes, they cost a little more than my previous provider, but not excessively so.

I have been running my new server for a couple of months now and have had zero issues. OwnCloud and GitLab are running fine – although there are a few configuration niggles to iron out (which reminded me why I started using hosted services in the first place) – but on the whole I’m happy.

I’ve also configured both services to run under SSL and used SSL Labs to help me configure the systems to gain an A+ rating. Not bad for a Developer 🙂

The whole process, and the ongoing commitment to maintaining these servers, has reinforced to me that privacy is something that needs work. It’s all too easy to do nothing and to say “I’ve got nothing to hide” – but as Edward Snowden says:

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”