Atari is 40 years old.

Wow, I’m feeling old.

Last week, Atari celebrated its fortieth anniversary. For those not aware, not only was Atari the biggest name in arcade and home gaming in the late 1970s and early 1980s, it could be considered the founder of arcade gaming. Its very first video game, Pong, was the first widely successful arcade game. After that came a string of hits including Asteroids, Breakout, Battlezone, Centipede, and others. Atari was also responsible for the extremely popular Atari 2600, the first big home video game console.

I’m feeling old because my very first console was an Atari 2600, or rather, a Sears Video Arcade.

When I was four or five years old, I was a big fan of Space Invaders. When I turned seven (way back in 1980), my parents got me a Sears Video Arcade, which was an authorized rebrand of the Atari 2600, along with a copy of Space Invaders (which, like some other Atari 2600 games, was also a Sears rebrand). They had an ulterior motive for getting me the console: I had fairly poor hand-eye coordination, so they hoped that playing the games would be a fun way for me to improve it. Part of me thinks they kind of regret having done so, as I became obsessed with video games afterwards. My hand-eye coordination did improve, though.

Of course, history shows that Atari’s dominance ended with the video game crash of 1983. The entire industry crashed and burned; one of the biggest reasons was that anyone could make games for the Atari 2600, so everyone did. That led to a glut of shovelware on the market, causing the good games to be lost among the chaff. Atari’s own decisions were pretty poor, too, with an unfaithful port of Pac-Man and a reviled adaptation of E.T. the Extra-Terrestrial damaging their brand badly. The video game industry didn’t recover until the release of the Nintendo Entertainment System, whose arcade-quality graphics and lockout chip (which only allowed licensed games to run) restored confidence in the industry. The company now known as Atari is Atari in name and licenses only.

To celebrate their fortieth anniversary, Atari made it so that anyone who installed and/or used their iOS game Atari’s Greatest Hits on June 27th would have access to the entire downloadable game library (a $10 value) until the app is removed or reinstalled. (I don’t believe the offer is still valid.) I went ahead and installed it that day and gave it a try.

The app, while interesting, is something of a mixed bag. The game selection covers both arcade games and Atari 2600 games; the arcade selection only goes up to about 1984, when Warner Communications sold off the arcade division of Atari, which subsequently became Atari Games. Therefore, classics such as Gauntlet and A.P.B. are not included. (The licenses for those games are held by WB Games, and they are included in Midway Arcade.) For several of the arcade games, both the arcade and Atari 2600 versions are included. Of particular note is Tempest, whose Atari 2600 port was never released but is included anyway. Also, four of the Atari 2600 games use the Sears Tele-Games box art. Three are understandable, as they were Sears exclusives, but the fourth, Pong Sports, was simply a renamed version of Video Olympics. For obvious licensing reasons, games from other arcade manufacturers and third-party cartridges are not included.

The downside of this app is the controls. As might be expected, the games are controlled by a virtual controller on the touch screen. On a screen the size of an iPhone’s (I don’t own an iPad), it can be awkward. While a controller such as an iCADE or a JOYSTICK-IT would work nicely for the joystick-based games, several games emulate a trackball, which I imagine would not work well with those controllers. It does take some getting used to, especially with the Atari 2600 games. Even with some practice, I still have difficulty with many of the games I used to be halfway decent at, simply due to the controls.

Still, the app is an interesting look back at the old Atari games, both arcade and console. It brought out quite a bit of nostalgia for me, and I’ve found myself trying out the odd game every so often. While Atari’s heyday is definitely long in the past, it’s good to see people recognizing their importance. More importantly, it’s fun to play the games again and see how many of them still hold up, even after a few decades.

Sorry, Windows Phone 7 users. No WP8 upgrade for you.

Right about the same time I was writing my last blog post, Microsoft officially announced Windows Phone 8, the new version of their phone-only operating system. I kept half an eye on Engadget’s coverage of the announcement via their Twitter feed, and one detail caught my eye: Windows Phone 8 will not be available for current Windows Phone 7 devices. Instead, people with those devices will get Windows Phone 7.8, which will support a number of the new features from Windows Phone 8, like the new Start screen, but will not run apps written for Windows Phone 8.

Again, this makes me wonder whether they’re really learning the right lessons from Apple. Recently there has been a glut of commercials for the Nokia Lumia 900, Nokia’s flagship Windows Phone, stating that “the smartphone beta test is over.” The premise is that the Lumia 900 is the ultimate smartphone. However, those who bought one now get to learn that they won’t even be allowed to run the most recent version of Windows Phone. By comparison, the iPhone 3GS was released in 2009 and can still comfortably run the most recent version of iOS (5.1.1). What’s more, the 3GS is a supported platform for iOS 6. If a nearly three-year-old iPhone can run the latest and greatest iOS version, why can’t a Windows Phone handset released two months ago get the latest and greatest version of Windows Phone?

Sorry, Nokia and Microsoft. It seems to me that you’ve still got quite a bit of beta testing to go.

Scratching below the Surface of Microsoft’s apparent “Me Too” syndrome…

Two days ago, Microsoft announced the Surface, their new in-house tablet. Running Windows 8 or Windows RT (depending on the architecture), it appears to be a standard tablet with one interesting unique feature: the screen cover is in fact a keyboard, supposedly capable of telling the difference between typing and a hand or arm lying across it.

Microsoft posted the announcement trailer on YouTube for all to see.

Right off the bat, I noticed two problems. The first is that, according to reports, the Surface will not be sold everywhere; it can only be purchased online and through the Microsoft Store’s retail locations, as opposed to the iPad’s wide availability. The second is that Surface is a quickly recycled name. Until recently, “Surface” was the name of what is now called PixelSense, a technology for large touchscreen displays built into environments such as furniture. Harrah’s and Microsoft, for example, made a big deal over the Microsoft Surface/PixelSense displays installed in the iBar at the Rio in Las Vegas.

Of course, some quickly started crying that Microsoft was ripping off Apple once again, which prompted an image meme I saw on a friend’s Facebook page yesterday.

To be perfectly honest, there’s some truth in both sides of the argument.

Microsoft is often actually first to market in areas of technology like tablets and phones, and I don’t think anyone can seriously argue otherwise. The “rip off” (if one wishes to use such a loaded and inaccurate term) is quite a bit more subtle. Microsoft gets to market first, but its implementation of the technology is clunky at best. Eventually, Apple brings its own version to market, which becomes very popular due to Apple’s attention to quality and usability. Microsoft then looks at what Apple did and re-engineers its products to fix the problems that Apple’s implementation corrected.

For example, let’s look at cell phones. I’ve personally used three different phones running Microsoft’s Windows Mobile operating system: the Cingular 8125, the Cingular 3125, and the AT&T Tilt. While I enjoyed using them because the available software was more varied than on the BlackBerry devices of the time, the interfaces tended to be awkward, and stability was never a sure thing. Once the iPhone 3G came out (the first model with Exchange ActiveSync support), I switched over and marveled at how intuitive it was to use. After that, Windows Mobile devices seemed even more painful to use. Since then, Microsoft has abandoned Windows Mobile and released Windows Phone 7, a far more efficient phone operating system using the Metro UI planned for Windows 8.

Tablets are another example. I’ve seen tablets as far back as 2004, when we purchased a tablet PC for the COO of the company I was working for at the time. To say that the tablet was terrible is, in my opinion, an understatement. While the unit had a touchscreen, a special stylus was needed to write on it, and the operating system was Windows XP Professional. While Windows XP is a great desktop OS, it’s absolutely miserable to use on a touchscreen. I’m not sure how the COO tolerated using that machine outside of a docking station. Of course, once the iPad came out, Microsoft’s tablet ambitions shifted toward Windows 8/RT and the Metro interface.

Of course, there’s no guarantee that the Surface will be a success like the Xbox 360 or a failure like the Zune. I’ll admit to some trepidation, if only because of Microsoft’s OS strategy. Unlike Apple, which has one OS line for its desktop/laptop machines (Mac OS X) and another for its phones and tablets (iOS), Microsoft is using Windows Phone for phones only, while Windows 8 will be used on tablets, desktops, and laptops. I’m not sure how well that will work out for them; while Metro works great as a touchscreen interface, I’ve tried it in a desktop environment (a VMware virtual machine running the Windows 8 Consumer Preview), and it was absolutely terrible. Worse, Metro is the default UI, and disabling it is not possible. With a keyboard and mouse, Metro is unintuitive and frustrating to figure out.

In the end, only time will tell whether Microsoft’s strategy works out for them. Unfortunately, while they look to Apple to see how to fix their earlier missteps, I fear they haven’t learned the proper lessons or implemented the proper corrections.

Consolidation via virtualization…

As I’ve stated in previous posts, I host servers at my house. I suppose I could just have email and web services for my domain hosted by something like what Google offers, but to be honest, I like having full control over my mail and web services. After the incident a few months back that forced me to upgrade my main server, I started looking into upgrading my two remaining servers. I reasoned that if a total hardware failure on old equipment could happen to my main server, it could happen to the other two. After all, they were even older: the DNS server was fifteen years old, and the test server was ten.

However, the main problem I saw was one of costs and resources. While the hardware for both servers was ancient in computer terms, it still did the trick as far as requirements were concerned. Neither server ran anything resource-intensive, so buying two brand-new servers seemed like serious overkill. It finally occurred to me that I could migrate both servers onto one new box using virtualization. I originally planned on using VMware vSphere Hypervisor, which is free and which I knew well from my last job. However, I soon realized that in order to use it, I’d have to purchase hardware quite a bit more expensive than I was willing to pay for. In the end, I opted for KVM running on top of CentOS.

I ended up ordering an HP ProLiant server from Newegg that was similar to the previous server I had purchased, albeit a newer model, along with 4 GB of additional RAM. Once it arrived, I shut down the test server and put the new server in its place. Installing the operating system went fine, as did making sure virtualization was enabled.

When it came time to set up the virtual DNS server, though, I hit a couple of snags. The first was that Slackware Linux (the distro I use for the DNS server) didn’t want to boot under KVM after installation; I eventually found a tutorial that helped me get around that problem. The second was that once the old DNS server was shut down and the new one started, the new one couldn’t pass any traffic to the internet. It wasn’t until the next morning that I realized that simply power-cycling the cable modem would fix the issue, since the modem still had the old server’s MAC address cached in its ARP table for that IP. As soon as the virtual DNS server was online, I quickly put together the virtual test server and shortly had it online, too.

I’ve since done a secure wipe of the old servers’ hard drives, and they’re sitting in a corner of my office waiting to be taken in for recycling. The only part not being recycled is the test server’s 320 GB hard drive, and that’s simply because my father wants it for his home PC.
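For the curious, day-to-day management of the guests is scriptable, too. Here’s a minimal sketch of checking on and starting a guest through the libvirt Python bindings on the CentOS host; the domain name “dns1” is a hypothetical stand-in for my actual VM, and “virsh start” from the shell does the same job:

```python
import libvirt

# Connect to the local KVM hypervisor managed by libvirt.
conn = libvirt.open("qemu:///system")
try:
    dom = conn.lookupByName("dns1")  # hypothetical name for the virtual DNS server
    if not dom.isActive():
        dom.create()  # boot the guest if it isn't already running
    state, _reason = dom.state()
    print("dns1 running:", state == libvirt.VIR_DOMAIN_RUNNING)
finally:
    conn.close()
```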

I have to say, I’m pretty happy with how this has turned out. For one thing, replacing the two older servers with the one virtualization host has caused the temperature in my office to drop considerably; it’s now comfortable to sit in there for extended periods. In addition, whereas the test server couldn’t sit on the battery backup due to load concerns, all of my servers are now covered by it. All in all, I’m spending less on electricity (for both power and A/C), I have more stable and up-to-date hardware, the cart with my servers is less cluttered, and everything runs as stably as it did before. I can pretty much consider this migration a success. :-)

A quick followup to the previous post…

Early last week, I wrote a blog post arguing why we would not be seeing Mortal Kombat released for PC. Essentially, it boiled down to high piracy rates combined with low legal demand, meaning that the projected proceeds would not cover the cost of porting. In response, several people argued that there was demand for Mortal Kombat on PC, and that there would be enough sales to cover the “low” cost of porting the game. One item brought up by more than one person was that the soon-to-be-released PC version of Mortal Kombat Arcade Kollection would prove that there was high demand for Mortal Kombat on PC.

Well, yesterday, I was informed by a knowledgeable friend that Mortal Kombat Arcade Kollection has been leaked to download sites by pirates, and MK fans have already been spreading the word about it.

Thanks, assholes. Way to prove me right.

Mortal Kombat on PC? Don’t bet on it.

Due to recent events over at Mortal Kombat Online, I’ve started monitoring the site’s official Twitter account. For the most part, it’s about the same as I remember, with one slightly annoying difference: a few people have been constantly flooding our “mentions” with requests for a PC version of Mortal Kombat, especially now that Mortal Kombat: Komplete Edition (with all of the DLC characters and skins included) has been announced. While we’re not the only people getting these messages, it’s still irritating to see the flood.

What I’m about to say is my own opinion, not that of Mortal Kombat Online and (definitely) not that of NetherRealm Studios. The chances of Mortal Kombat coming to PC are non-existent, with the exception of Mortal Kombat Arcade Kollection. Even then, the PC version of MKAK has been delayed by several months.

I’ve already seen the most common arguments for porting Mortal Kombat to PC. One is that there’s an online petition with something like 13,000 signatures, showing demand for the game. The problem is that when it comes to showing demand, online petitions don’t mean much. Anyone can add a signature, but that doesn’t mean they actually intend to buy the game. A secondary problem is that 13,000 is in fact a pretty small number: by comparison, 732,000 copies of the game were sold for consoles in the first week alone. I would hazard that for the petition to be taken seriously, it would need 50 to 100 times the signatures it currently has (somewhere between 650,000 and 1.3 million, in the neighborhood of those first-week console sales).

Another argument I’ve heard is that Capcom has released Street Fighter 4 and Street Fighter 4: Arcade Edition for PC, and is planning to release Street Fighter X Tekken as well. Supporters also claim that the game’s popularity numbers are high, indicating a lot of sales. Unfortunately, they never give exact sales figures, and the argument doesn’t account for how many sales there were compared to how heavily the game has been pirated. I’ve heard it argued that the SF4 games on PC have DRM to prevent piracy, but the pirates rather quickly defeated that DRM. I’ve even heard stories of tournaments using pirated copies of SF4 for PC instead of licensed or console copies.

That said, some will still point to the PC sales and say, “They’re still making money!” The question becomes, “How much money?” If NetherRealm were to release a PC version of Mortal Kombat, they would have to put time and resources (in other words, money) into porting the game. If the game’s PC sales don’t cover the cost of porting, it’s not worth the trouble for them. Unlike consoles, PC hardware runs the gamut of manufacturers, drivers, and capabilities, and they would have to take all of that into account.

So, one may ask, why was Capcom able to do it? The answer is simple: they already HAD a Windows version. Unlike Mortal Kombat, the Street Fighter 4 games were released for arcades, and the arcade hardware they ran on, Taito’s Type X line, uses commodity PC components and runs Windows XP Embedded instead of proprietary hardware. In other words, a very good chunk of the porting work had to be done anyway, so the extra effort needed was covered by the PC sales.

That doesn’t change the fact that the Street Fighter 4 games are very heavily pirated. While it’s true that console games are pirated as well, the barrier to entry for pirating console games is much higher. Any PC can run a pirated PC game, but game consoles have to be modified before they can run pirated console games. Modifying console hardware is a risky venture: not only can it ruin your console (rendering it a brick) should something go wrong, but the console’s security systems will get you banned from online services if the modification is detected. While I’ve heard it said that only paying customers can play Street Fighter 4 online, I personally find that doubtful, as I’ve never heard how that is supposedly accomplished outside of a “Kombat Pass” system like Mortal Kombat uses. It wouldn’t be the first time I’ve heard that claim only for the reality to be that many of a game’s online players were pirates.

The sad thing is that none of this is really anything new. When I started at MK Online (then called MK5.ORG) back in 2002, Mortal Kombat: Deadly Alliance was in development, and we got a lot of people asking if and when MK:DA would be released for PC. So, when I went to E3, I asked a couple of people in Midway’s marketing department whether a PC version was in development. They said there wasn’t, simply because they had never made any money off of the PC versions. That drove home just how heavily pirated the PC releases were: I knew a lot of people with MK games for PC, yet the games never made any money. In addition, at least one person who demanded a PC version in the site’s chat channel inadvertently admitted that the only reason he wanted it was so he could pirate it.

So, in essence, it really comes down to money. NetherRealm and Warner Bros. apparently do not feel that porting Mortal Kombat to PC would be worth it. While the Street Fighter 4 games are out for PC, they were effectively ported ahead of time for different reasons, so the comparison is not valid. I’m sure Warner Bros. has already done the research and crunched the numbers; if there were a true profit to be made, we’d be seeing a port. Unfortunately, the petitions have too few signatures, the projected PC sales aren’t high enough, and the piracy would be rampant.

With all of the factors taken into account, there simply isn’t enough of a legal market to make the port worth the effort.

Of webs and androids…

It’s Saturday night, and I’m sitting in the living room watching Mythbusters while Jennifer dozes on the couch. All in all, it’s a good evening.

Up until recently, though, I would be found in my office for at least part of the evening, as I’d get the urge to browse the net. I could use my phone, but I like to keep it in the kitchen, where it charges. I also have a work laptop, but it’s a little unwieldy sitting in my lap in the living room. That was all taken care of recently by my mother-in-law, who managed to get us an HP TouchPad. At $150, it was extremely inexpensive yet very functional. Both Jennifer and I use the tablet, and it now comes with us on trips instead of the work laptop.

Also, as Jennifer puts it, “I now see a lot more of Scott in the evening.”

However, while we’ve been very happy with the TouchPad, we’ve come to realize it has one major problem. The TouchPad runs webOS, HP’s own mobile operating system (which they got when they acquired Palm). While webOS is a nice operating system, with the discontinuation of the TouchPad it is increasingly seen as a dead OS. App support has been slight, with no real additions to the lineup. What I needed most was a remote access app for work purposes, and I ended up jury-rigging a proprietary solution.

Obviously we needed something better. Fortunately, a solution presented itself Thursday.

My friend Rigo Cortes posted on Twitter that a new build of CyanogenMod had been released for the TouchPad. Intrigued, I looked into it. CyanogenMod is a community-written, third-party distribution of Android. The installation process looked simple and kept the original webOS install in place. I asked Rigo, and he assured me that while it was labeled as alpha software, it was still very stable.

So, last night I downloaded the needed software, and I installed it this morning.

How did it go? It turned out to be as easy as I had read, and as stable as Rigo had said. I’ve downloaded all sorts of software that I couldn’t get for webOS, like Fruit Ninja, Netflix, Trillian, and others. I even have remote access software installed. I’ve only used it today, but it’s working great.

So, now I’ve got a much more useful tablet. It’s too bad webOS had to be replaced, as we liked using it. However, app support is much better on Android, and app support is what’s most important. Jennifer gets her Fruit Ninja, and I get my other apps.

We’re both happy campers. :-)

Ultraviolet? More like ultra-annoying.

By now, it’s pretty obvious to people who know me that I like to have digital copies of my movies. I guess one could argue I started relatively early (compared to current consumer demand, that is), given that as far back as 2006, I was making digital copies of Mortal Kombat to watch on my smartphone. Nowadays, when a movie comes out on DVD/Blu-ray, I buy the version of the disc that comes with the digital copy. I like having the movie in iTunes, where I can watch it later on my iPhone or HP TouchPad.

However, the studios have started offering a “new service” for the digital copies that really annoys me.

I’ve noticed that several movies featuring a digital copy have a label on the packaging advertising an “Ultraviolet digital copy.” This refers to Ultraviolet, a digital locker service accessed through Flixster that sells access to streamed versions of movies and lets you redeem codes for movies you’ve bought. You can watch the movies via the Flixster app on iOS and Android, and there’s also an app (based on Adobe AIR) for PCs. Sounds great, right?

In my case, it’s anything but.

The primary reason I have digital copies of movies is that I want to watch them while traveling. As I brought up earlier, I like watching Mortal Kombat when on trips for Mortal Kombat Online. Another example: when traveling from Dubai to Houston, I watched the entire Lord of the Rings film trilogy on the flight. And on a drive from New Orleans to Houston, Jennifer and I once watched The Dark Knight. In other words, I mostly watch my digital copies on portable devices; PC playback support, while nice, is less important to me.

There are really three problems here. The first is that I like to use iTunes to manage the media on my iPhone. It tells me how much space is available on the device, so when I’m picking and choosing which movies I want on the phone, I can decide what I’m willing to remove to make space, like shrinking the music playlists. Furthermore, with the movies on the hard drive, I can simply copy them to the phone over USB. With Ultraviolet, while I can copy the media to my devices, I can’t actually copy it to iTunes. I can copy it to my PC, but iTunes will not recognize it as a viewable movie. I can download it straight to my iPhone, but then I have to figure out ahead of time how much space I need to clear. Worse, according to Flixster’s website, I can’t delete the movie afterwards without deleting the entire app first.

The second problem is the one that REALLY irks me: there’s no support at all for the iPod Classic or the HP TouchPad. There isn’t a Flixster app for webOS (and I’m not holding out hope that one will come out), and the iPod Classic doesn’t have net access at all. The iPod Classic is what I bring on trips when I don’t know beforehand what I’ll want to watch, as my entire media library fits on its 160 GB hard drive. One might think it’s annoying to watch a movie on a screen the size of the iPod Classic’s, but if I’m not sharing with another person, it’s not that bad.

The third problem is that all of this depends on the Flixster and Ultraviolet services being completely in sync, which doesn’t seem to be the case. During the setup process for my account (done while trying to redeem a digital copy of Green Lantern), it set up a Flixster account and asked me for an email address and password for the Ultraviolet account. However, I was never told what the sign-in name for Ultraviolet was, and there’s no way to find it out on Ultraviolet’s website. I’ve got a support ticket in, and I hope they can tell me what it is. In addition, the Flixster app for iOS isn’t giving me access to the movies in Ultraviolet. The FAQ on Flixster’s site says that if the Flixster and Ultraviolet accounts are linked, I should have access to the movies; I guess the accounts weren’t linked after all (despite what the setup process said).

All in all, I’m pretty fed up. When movies came with iTunes-compatible digital copies, all I needed to do was enter the code into iTunes, and the movie would simply download to my PC and work right away. All of these extra hoops I’m having to jump through, with extra accounts, new players, and the like, do nothing to make things easier for the consumer. It’s crap like this that helps increase piracy: the people who “do the right thing” get the shaft, while the pirates get their copies of the movie far more easily and without any of the hassle.

I guess I should just look at the bright side: worst case, I can take the DVD and use HandBrake to create my own digital copies. Still, it’s frustrating, and I’m definitely going to think twice before buying any film that features an “Ultraviolet” digital copy.
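If it comes to that, at least the HandBrake route is easy to script. Here’s a rough sketch of batch-converting ripped discs with HandBrakeCLI; the folder paths are placeholders, and the preset name is an assumption, so check the output of “HandBrakeCLI --preset-list” for what your version actually ships with:

```python
import subprocess
from pathlib import Path

SOURCE_DIR = Path("/media/dvd_rips")        # placeholder: folder of ripped disc images
OUTPUT_DIR = Path("/media/digital_copies")  # placeholder: where the converted files go

for source in sorted(SOURCE_DIR.glob("*.iso")):
    output = OUTPUT_DIR / (source.stem + ".m4v")
    subprocess.run(
        ["HandBrakeCLI",
         "-i", str(source),         # input disc image
         "-o", str(output),         # output movie file
         "--preset", "Universal"],  # an iTunes/iPhone-friendly preset (assumed name)
        check=True,
    )
```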

Not really how I wanted to upgrade…

Some (but not many) people are probably wondering why it took me so long to put up the blog post announcing our marriage. The answer is that, well… it wasn’t exactly by choice.

A couple of days after we arrived in the Bahamas, I noticed that I wasn’t receiving email from my main server. A quick check showed that the web services weren’t responding and that the server itself wasn’t responding to pings. I wasn’t too worried; I just figured the system had suffered a kernel panic (the Linux equivalent of a BSOD) and needed to be rebooted. I sent an email to the pet sitter asking him to hit the reset button on the server, and left it at that.
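For the record, the “quick check” was nothing fancy; from a remote machine it amounts to something like the sketch below, with the hostname as a placeholder:

```python
import socket

def port_open(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder hostname; check the web and mail services.
for service, port in [("http", 80), ("smtp", 25)]:
    print(service, "answering:", port_open("server.example.com", port))
```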

A day or so later, he emailed back saying he had done so. I checked, and it was still down. I still wasn’t too concerned, as I figured he had probably hit the wrong button. I ended up asking Jennifer’s mom to power-cycle the box. After the now-in-laws returned home, I got an email saying she had done so. I checked the server and still got nothing. Now I began to worry, thinking a hard drive might have failed.

When we got home, one of the first things I did was check on the server. It was worse than I thought: the system wouldn’t even bring up video. I sighed, and decided to check to see if I had a replacement motherboard. I did, but it was in an old case and wouldn’t come out.

I think it was at about that point that I decided “frak this” and started checking Newegg to see how much a new server would cost. To my surprise, there was an HP ProLiant server that fit my needs for only $450. I talked to my dad, and he bought it for me with next-day delivery (as I host his business email as well, and have done so for years).

We ended up learning a life lesson: when Newegg says “next day delivery,” they mean “delivery the day after the order is processed.” Until now, all of my orders had been processed within two hours. This time, it took two DAYS to process. The end result was that the box arrived on Wednesday instead of Monday. Dad was unhappy, even after Newegg graciously refunded the shipping costs.

As soon as the server arrived, I added a second hard drive (to hold backups), added the users and groups, installed the backup software, and copied the backup archives to the server. Thankfully, the backups restored with no problems. After that I got mail up and running (and was promptly buried under an avalanche of two weeks of undelivered email). Once email was up, I could bring everything else back online.
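The restore itself was mechanical. Assuming plain tar-style archives (my actual backup software differs, and the paths here are placeholders), it boils down to something like this sketch:

```python
import subprocess
from pathlib import Path

ARCHIVE_DIR = Path("/backup/archives")  # placeholder: where the backup archives were copied

for archive in sorted(ARCHIVE_DIR.glob("*.tar.gz")):
    # -p preserves permissions; -C / extracts files back to their original paths
    subprocess.run(["tar", "-xzpf", str(archive), "-C", "/"], check=True)
```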

The new server is pretty nice, too. It’s a lot faster than the old box (which was running ten-year-old hardware), has a smaller case, and is far quieter. The only disadvantage is that I don’t have it set up for RAID, but I don’t need that just yet. Besides, the RAID on the old server caused me issues when I was trying to read the drives from my Windows PC for recovery purposes. :-)

Ah, well. The server, barring a few last minor problems, is completely up and running. I’m just thankful the restore (once started) went as quickly as it did, and that the friends who do secondary MX for me were so willing to accommodate two weeks’ worth of queued mail on their servers. :-)

No point in signing the mail if no one verifies it.

It’s pretty common knowledge that I tend to be something of a nut about security.

For the past several years, I’ve been using GnuPG to cryptographically sign all of my outgoing email. The digital signature was attached to every outgoing message as a way of verifying that I was the one who wrote it. Adding GnuPG support to my email wasn’t hard: mutt has GnuPG support built in, while Mozilla Thunderbird gets support via the Enigmail extension. However, after using it for several years, I decided this weekend to stop automatically signing all outgoing email.
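For the curious, the signing step itself is simple; mutt and Enigmail drive the gpg binary directly, but here’s a minimal sketch of the same idea using the python-gnupg wrapper. The home directory, key ID, and passphrase are placeholders, not my real values:

```python
import gnupg

gpg = gnupg.GPG(gnupghome="/home/user/.gnupg")  # placeholder GnuPG home directory
body = "This is the body of an outgoing email."

# Clearsign the message; the result wraps the body in a
# "BEGIN PGP SIGNED MESSAGE" block that recipients can verify.
signed = gpg.sign(body, keyid="0xDEADBEEF", passphrase="placeholder")
print(str(signed))
```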

The first reason is that for a couple of years now, I’ve been accessing my email via my iPhone, which doesn’t support GnuPG or any sort of PGP natively, so I can’t send signed emails from it anyway. The second – and more important – reason is that really… outside of me, no one cares. Most people I know use some sort of webmail or Outlook to access their email, so to them my digital signature looks like a weird text attachment. They pretty much figure the message came from me anyway and aren’t concerned about it. Also, most (if not all) of my emails aren’t even important enough to be worth signing; I’d been signing them pretty much out of sheer habit. For the most part, there’s nothing that would require me to later prove that I actually sent them.

So, I’ve decided to stop digitally signing my emails, so the friends I send email to won’t have to wonder about those weird text attachments. For the most part, I’m sure they won’t care, and I’m really not going to lose any sleep over it.

Though, I will admit, it’ll be nice not having to type in my signing key’s passphrase with every email I send. :-)