New OpenPGP Key

Over the next few months I will be slowly transitioning to a new OpenPGP (GPG) key. The reasons for this are as follows:

  1. In light of the recent information regarding the NSA, GCHQ, ASIO and others spying on citizens of the world, I believe a larger key size will increase security against attacks (even if the increase is small).
  2. I read about a patch to GnuPG to allow creation of larger key sizes and wanted to try it out.
  3. I wanted to have a clean slate with completely separate subkeys and good key hygiene (in regards to how the private key and revocation certificates were stored).

I have created a new 8K-length certification master key (0xB341C361CE04C603) with the following subkeys:

  • 4K Signing key (for signing documents and emails)
  • 4K Encryption key (for encrypting files)
  • 4K Authentication key (for logging in to systems, though in practice this isn’t really useful yet)

The reason for the separate 8K certification key (for the uninitiated, this is a huge key size that is overkill for current technology) is so that I can keep that key safely on my home systems, protected from the wild, whilst still being able to carry my signing, encryption and authentication keys around on my laptop without too much trouble. Since the certification key is used for signing other keys and being signed by other keys (i.e., building the web of trust), it is a good thing if this key is both well protected and doesn’t change much.

The authentication key is interesting – in theory the underlying key data is such that you can use it for SSH logins, but it is such a pain in the arse to get the key data out and into a format that the SSH client can use that nobody bothers.
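For what it’s worth, GnuPG 2.1 and later (released after this post was written) grew `--quick-*` commands that make this exact key layout much less painful to set up, and since 2.1.11 the authentication subkey can even be exported directly in a format the SSH client understands. A minimal sketch, using a hypothetical user ID and deliberately small rsa2048 keys so it runs quickly (the author’s actual keys were 8K/4K RSA, the former requiring the patched GnuPG mentioned above):

```shell
# Scratch keyring so this example doesn't touch your real one
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Certification-only master key ([C]), kept offline in the scheme above
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'Example User <example@example.invalid>' rsa2048 cert never

# Grab the new key's fingerprint from the machine-readable listing
fpr=$(gpg --list-keys --with-colons | awk -F: '$1 == "fpr" { print $10; exit }')

# Separate signing, encryption and authentication subkeys, as in the post
for usage in sign encr auth; do
  gpg --batch --pinentry-mode loopback --passphrase '' \
      --quick-add-key "$fpr" rsa2048 "$usage" never
done

gpg --list-keys            # one [C] master key with [S], [E] and [A] subkeys

# GnuPG 2.1.11+ can emit the authentication subkey in OpenSSH public-key format
gpg --export-ssh-key "$fpr"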

My old key (0xF3EABD1AAC83D520) no longer has a valid encryption key and I will be revoking the master key within the next few weeks.

You may also be interested to read my OpenPGP policies.

linux.conf.au 2014 – January 6

I’m (sporadically and with much delay) blogging my yearly pilgrimage to linux.conf.au, this year being held at the University of Western Australia in Perth.



We began the first day of the conference with the morning keynote, which was presented by Suelette Dreyfus. She talked about some of the statistics around people’s feelings towards privacy, whistle-blowing and government surveillance. The thing I found most interesting was that the ordinary citizen supports whistle-blowing and doesn’t support government surveillance, which leads to one of two conclusions:

  1. The government will soon have to start actually listening to citizens and do something about all this.
  2. The government is actually entirely controlled by the spy agencies and we’re all screwed.

Yay for freedom and democracy! :/

Rocketry & Radios

The next talks I attended were from the open radio miniconf, where Bdale Garbee and Keith Packard talked about the hardware and software they use for rocket-to-ground radio communications, which they are successfully selling through their fully open-source business. I found a few points interesting:

  • RF circuit board design is hard. There are some serious smarts that go into designing those boards so that everything doesn’t interfere with everything else (especially in such a small package, with two radios within a centimetre of each other).
  • Here is yet another FOSS small business that is clearly surviving and not a complete drain on the pocket (one assumes, you can never be sure). That’s good news, as the world needs more businesses to cross that divide between open-source and the commercial world.
  • Rockets are fun!

The Sysadmin Miniconf

Between lunch and afternoon tea I sat in on the sysadmin miniconf (there’s a mantra at linux.conf.au: if you’re in doubt as to what to see, tend towards the left-hand side of the schedule). The most interesting talk for me was from Elizabeta Sørensen on RatticDB, which looks like a pretty cool password management tool that would have been amazingly useful in my last job (where I worked as a sysadmin rather than being a programmer like I am now). Despite being immature software, it has a lot of promise and I’ll definitely be trialling it for my own uses.

I also found the talk on Husk by Phillip Smith to be very interesting. Writing iptables rules is a pain, and writing them twice (once for IPv4 and again for IPv6) is a complete pain. Husk looks great because it lets you write rules once for both network stacks and re-use variables and rulesets. It’s basically SCSS for firewalls.
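To make that duplication concrete, here’s the sort of thing you end up maintaining by hand without a tool like Husk (illustrative rules of my own, not from the talk, and applying them requires root):

```shell
# Accept inbound SSH and HTTP – every rule must be written once per stack
iptables  -A INPUT -p tcp --dport 22 -j ACCEPT
ip6tables -A INPUT -p tcp --dport 22 -j ACCEPT
iptables  -A INPUT -p tcp --dport 80 -j ACCEPT
ip6tables -A INPUT -p tcp --dport 80 -j ACCEPT
# Change a port and you have to remember to change it in two places.
```

A compiler like Husk takes a single ruleset and emits both the IPv4 and IPv6 versions, so the two stacks can’t silently drift apart.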


After afternoon tea I went to the talk given by David Rowe on modems and how they work in a basic sense. Unfortunately I was completely out of my depth and I had no idea how the modem algorithm fit into the stack of hardware and software. Is the mixer hardware or software? Where is forward error correction done? No idea. More reading for me to do!


By this stage I was pretty exhausted, having not got much sleep the night before. I therefore retreated to the dorm room and had a quick nap, a cup of tea and a shower (Perth is hot!) before dinner. I went out with a few friends (new and old) to a great pub we’ve found nearby that does good pizza and amazing crème brûlée. Hopefully an early night tonight so I don’t get too exhausted before the week is out.

Going from Windows to Linux

A typical Linux Mint desktop (from ExtremeTech)


I’ve recently installed Linux Mint on my laptop, replacing a horribly broken install of Windows 8.1 Preview. There have been good and bad things:

The good:

  • The Windows 8.1 Preview broke the wireless connectivity on my laptop horribly. Every time the laptop booted up or awoke from sleep, I would have to uninstall the wireless card from the device manager and then scan for new hardware to add it again. I would then have to key back in all the wireless keys for the networks I used before I could connect again. This got a bit annoying after a while. Installing Linux Mint, I had no issues with drivers or network connectivity, even with sound drivers, which is something that has plagued the Linux desktop world for years. It just works, and that is truly great.
  • With all the attention being given recently to the NSA’s spying on the citizens of the world, it’s nice using an operating system that gives you a little more protection (even if it isn’t very much more) from the spooks. I am still using many cloud services (including accounts with Facebook, Google, Twitter, Microsoft and Apple) so I still have a long way to go, but I can now PGP encrypt my mail with little effort, and should the need arise I can inspect every line of code on my system for back doors (though, it might take a while).
  • The GUI can actually be described as beautiful. While I’m a big fan of the classic Windows look (circa 2000 and XP) and I’m also a big fan of the Windows 8 Metro theming, the horrible combination of the two that most Windows 8 apps seem to have leaves much to be desired. In addition, most GNU/Linux distributions (looking at you especially, Ubuntu) have completely unusable GUIs. Linux Mint takes a beautiful looking GTK+ theme and marries it with a window manager (called Cinnamon) that is just stunning. It’s what Linux should have been like for years. And no Unity in sight.
  • Steam now works on Linux, and I can play Counter-Strike: Source again. This is a big deal, and it’s a great benefit to “Linux on the desktop”.
  • It uses Ubuntu’s package repositories, which in turn use Debian’s awesome apt-based package management system. This gives you access to all of Ubuntu’s packages (a massive collection) and it uses familiar Debian configuration files. It’s a rock-solid (less stable than Debian Stable, but so are most nuclear reactors) core system.

The bad:

  • Over recent months I’ve done a lot of software development in Visual Studio. VS 2012 is a great IDE, and nothing on Linux comes even close. Netbeans (my preference on Linux) is a pretty powerful IDE, but VS still blows it out of the water in every way. As with Evolution vs. Outlook, there are still a few killer applications on Windows that make it the default choice for getting things done.
  • Firefox and Thunderbird look ugly as sin on Linux Mint compared to Windows. I’m really disappointed as everything else is so good looking in comparison.
  • There’s no good replacement for MetroTwit. I’ve tried most of the Twitter clients for Linux, and they all suck in various ways. MetroTwit, as far as I’m concerned, is pretty much where it’s at with Twitter clients. It’s awesome.

Overall, I’m very impressed with Linux Mint. If you haven’t tried a GNU/Linux distribution in a while, give it a go. I think you’ll be pleasantly surprised.

linux.conf.au 2012 – Day 5 (Friday)

This week I’m at linux.conf.au, the southern hemisphere’s premier open-source conference. This year it is being held in Ballarat, about an hour’s travel from Melbourne. I’ll be documenting the trip and conference as much as I can given the limits of my enthusiasm and awakeness.

Friday 20th January:

Friday is the last day of the conference, and everybody is starting to look tired; it’s a full-on week. But, before we all go home, there are just a few more excellent talks to attend. The first of these was Friday’s keynote, given by Jacob Appelbaum, and what an amazing keynote it was. Jacob talked about the state of surveillance states. He explained what they are doing to keep track of all of their citizens, and the special measures that have been put in place in the last few years (mostly since September 11) that significantly curtail our freedoms in the name of safety and security. A few choice quotes from the talk:

Free software is for freedom, open source is for business solutions.

Be the trouble you want to see in the world. [It's in my notes, but I'm pretty sure it was actually just written on his shirt]

90s Nihilism: I have nothing to hide.

The data kept about you in [server] logs around the world tells a story that is not necessarily true, but is made up of facts.

This talk flowed on nicely from Senator Ludlam’s talk at the Penguin dinner.

After morning tea, I watched the talk by Rusty Russell and Matt Evans about why UNIX has been getting bigger over time (in terms of binary bloat). It’s mostly due to new features, but also because of the infrastructure that modern systems have and the libraries that are statically linked in these days (glibc is basically just bloatware). Also in this session I attended the talk by Simon Horman on Open vSwitch. It’s really interesting content, but the presentation was a bit dry. It’s definitely something I want to check out when I get home though, as it could be quite useful for me when I have VMs set up in Linux. The support for VLANs makes it a much better choice than standard Linux network bridges.

During lunchtime there was a meeting between a group of Tasmanian delegates, and it was decided that the Hobart Linux User’s Group should be started up again. So if you’re reading this, like Linux and live in Hobart, get in touch!

After lunch was the best-of sessions. These were talks voted for by the delegates, who wanted to see them again or had missed them the first time around. I watched two fabulous talks. The first was on Codec2 (presented by David Rowe), an audio speech codec that needs only 1400 bits/sec for transmission, which is a 500x improvement on raw 16-bit 44.1 kHz audio. Very impressive. The second was on the FreedomBox project (presented by Bdale Garbee), which is a platform for developing easy-to-use home servers oriented towards federated social networking services (such as Diaspora). This followed on nicely from Appelbaum’s talk that morning, giving a solution to some of the problems that were outlined.
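That 500x figure is easy to sanity-check: raw mono PCM at 44.1 kHz and 16 bits per sample is 705,600 bit/s, and 705,600 / 1,400 is about 504. In shell arithmetic:

```shell
# Raw 16-bit, 44.1 kHz mono PCM versus Codec2 at 1400 bit/s
raw_bps=$((44100 * 16))                    # 705600 bit/s of raw audio
codec2_bps=1400
echo "raw PCM: ${raw_bps} bit/s"
echo "ratio: $((raw_bps / codec2_bps))x"   # prints "ratio: 504x"
```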

The final session of the conference was the lightning talks. The real highlight was watching Paul Fenwick jump up on stage between the lightning talks and try to give a several-minute-long presentation in thirty seconds. He failed, but it was funny to watch. After the lightning talks was the closing ceremony. The main reason for this is to hand out a few awards and thank some people, but also to find out where the next linux.conf.au is going to be held. Next year, it’s in Canberra!

Review: The Googlization of Everything

The Googlization of Everything Front Cover


As you might be aware, I’m not a huge fan of Google or, indeed, cloud-based apps in general. When I saw The Googlization of Everything (and why we should worry) in my local campus bookstore, I decided on the spot that it would be a good read. It was a good read, and here’s why.

I will admit upfront that I’m probably not the most unbiased person in the world when it comes to what is really a review of a review of Google. If you dislike bias, go watch the Pakistani cricket team instead.

I’d always been a bit hesitant to use Google products (or any cloud product) because of lax privacy. How do I know that Google won’t be using my information against me? This book (partially) confirmed my suspicions. They are using my information, both for and against me. Every time somebody performs a Google search, Google stores the query along with information about them (their IP address, location, browser, etc.) and uses that to tune search results for them. On the surface this appears to be fine. I like it when I search “cat” on Google and Google knows what I really want the first result to be (I use the “cat” Google search as a quick method to test Internet browser connectivity; I don’t know why). But Vaidhyanathan (the author) raises other points about this. Firstly, how long is our data kept? And who else is it being shared with? But perhaps most disturbingly, Google might prevent me from seeing new information because it’s too busy telling me about what I think I want. For instance, if a new species of cat were discovered in the jungles of Peru, I might miss it because Google is too busy customising my results with Internet connectivity tests. While that may not matter much, on other (more important) subjects this splitting up of information based on what we think we want to see is disturbing.

Vaidhyanathan raises many other points in his book too. The Google Books project is designed to give everybody in the world access to out-of-print books, instead of having them sit on dusty shelves in university libraries. It’s a nice goal. However, the program is structured in such a way that nobody else could possibly compete with it, due to arcane copyright law and out-of-court settlements. Do we want Google to be the sole provider of this service? Shouldn’t this be done by a public or community organisation instead? These are difficult questions, and the fact that we haven’t even started considering them should worry us.

Another similar situation exists for Google Scholar. Google has obtained agreements with universities to provide academic articles for inclusion in Google’s archives. The idea, similarly to Google Books, is to allow more people to see things they wouldn’t have otherwise seen. A noble goal. Again, however, there are several problems with the project’s implementation. Arcane agreements and laws prevent universities from easily collaborating with an alternative archive agent. Even more worrying is the fact that to most users (those without access to the paywalled sites that the articles actually reside on) only the abstract of an article is available. This results in a broadening but a shallowing of the information available to most people. This is another of those projects that might be better taken on by the people, for the people. I know of one user on an IRC channel I frequent who is collecting datasheets and manuals for PC components from the 1980s, before these datasheets become extinct. While not legal, and while he hasn’t made this public, it’s the right direction to go in.

Then there’s the shallowing of our knowledge due to Google. This is a huge topic, and so many authors have covered it in so many various degrees of rigour that I won’t even begin to scratch the surface. However, here’s the gist of the idea: Because we have access to the largest library in human history (the web) at our fingertips at any point we’re in front of a computer (which for those of us with a smartphone, is constantly), we don’t remember information like previous generations did. I’m still not confident that this is a bad thing; I am a lot more knowledgeable than I would be if the only learning resource I had was a paper encyclopædia. I don’t know a lot of facts, but I know where to find them. In today’s world, that’s what counts. Still, it’s something we should look into further.

A good portion of the arguments put forward in this book are more general than Google and apply to the Internet at large (such as the shallowing of our knowledge). Most of the other arguments could be taken directly to Google’s legal department in a court showdown (which would almost certainly be the court case of the century). Whichever way you stand on the issues, more information is never a bad thing.