
PSA: Prefer to use AdoptOpenJDK’s jdk-11 builds for embedding in Mac Apps

If you are planning to distribute a Java app on the Mac, you should avoid using the JDK builds from jdk.java.net, as they won’t necessarily work on Mac OS versions older than 10.13. This is because libjvm.dylib is built with MACOSX_MIN_VERSION set to 10.13. This doesn’t necessarily cause a problem until you try to run a signed app on Yosemite (10.10) or older: your app just won’t open. Checking the logs, you’ll see an error like:

​Error: dl failure on line 542
Error: failed /Applications/MyApplication.app/Contents/Java/jre//lib/server/libjvm.dylib, because dlopen(/Applications/MyApplication.app/Contents/Java/jre//lib/server/libjvm.dylib, 10): no suitable image found.  Did find:
    /Applications/MyApplication.app/Contents/Java/jre//lib/server/libjvm.dylib: code signature invalid for '/Applications/MyApplication.app/Contents/Java/jre//lib/server/libjvm.dylib'

Now, you might be fine if you’re building the app on 10.10 or older, but I’m not sure. This particular issue is a combination of:

  1. libjvm.dylib being built with a minimum version of 10.13.
  2. codesign on 10.11 and higher automatically signing libraries that target 10.11 and higher with a signature format that can’t be understood by Gatekeeper before 10.11.
  3. Gatekeeper barfing when it hits this signature.

So, if you’re building (signing) your app on the latest Mac OS and you want to be able to distribute it to older versions of OS X, you need to make sure that all of your libraries are built with MACOSX_MIN_VERSION set to 10.10 or lower.

You can verify this using otool. Inside the standard OpenJDK build from jdk.java.net, go into the Contents/Home/lib directory and run:

$ otool -l  */libjvm.dylib | grep VERSION -A 5 | grep version
  version 10.13
  version 0.0

(Note: libjvm.dylib is the only problematic one; all the other dylibs are built with a 10.8 minimum version.)

However, if you download the build from AdoptOpenJDK and do the same thing, you’ll find:

$ otool -l  */libjvm.dylib | grep VERSION -A 5 | grep version
  version 10.8
  version 0.0

Just another reason to use AdoptOpenJDK for your Java distro.

Entitled OSS Users and the Xamarin RoboVM acquisition

It was announced that RoboVM has been acquired by Xamarin, and that it will no longer be open source.

Wow.

It only took five minutes for the forum posts and reddit threads to start up, condemning the move as some sort of robbery. The RoboVM team was accused of luring unsuspecting users into its community on the promise of open source, only to pull a switcheroo and sell out to big business. Some users were demanding that the RoboVM team continue to share their work for free, because … that would only be fair.

RoboVM, on the other hand, explained that they had been open source for a few years and had received little to no contributions from the community, so there wasn’t much incentive to continue with that approach. My personal experience with managing open source projects is consistent with theirs. I released the first version of Xataface in 2005. Since then it has had hundreds of thousands of downloads, and it is still used in many enterprises as the backbone of their web information systems (I don’t have an exact count since most apps built with Xataface are internal). In all that time, I can count the number of community contributions on my fingers and toes. I’m thankful to all of the users who did contribute. But let’s be real: the case for open sourcing a project because the community will contribute is not compelling.

Shut up and Fork it!

No, really. The source (albeit a couple of months out of date) is still on GitHub, and it is licensed under the GPL. That repository represents countless hours of high-quality work by incredibly skilled individuals. That is one hell of a contribution to the open source community. Let them move on; and if you want your open source RoboVM, you can build on this fantastic source base.

Personally, I think it is highly likely that the last open source version of RoboVM will continue to circulate for a long time to come. At least in its core as an AOT Java VM, it should be maintainable by people on the outside, because most of the heavy lifting is already done there. It is the value-added components, like the iOS API bindings and tool support, that will be difficult for the community to maintain going forward. These things are evolving too fast for volunteers to keep up with.

Dependent Tools

If you are an iOS developer who just uses RoboVM to build iOS apps in Java, then the move to close the source probably won’t affect you – except that your costs may be going up some. I wonder more about the impact that this has on other developer tools that have made RoboVM an integral part of their tool chain. I’m thinking about companies like Gluon, which provides JavaFX support for iOS and Android and uses RoboVM for its iOS builds. DukeScript, which allows you to write Java apps with an HTML5 UI and deploy to iOS (and other platforms), also uses RoboVM for its iOS builds. How will they respond?

I had argued as recently as 6 months ago that we (at Codename One) should incorporate RoboVM into our toolchain rather than maintain our own Java VM. But we ultimately decided that there was too much risk in that approach because “what if RoboVM closes down, or goes closed source?” 20/20 hindsight shows that we made the right choice, and our new iOS VM is now quite mature, performant, and robust. But most importantly, we are not dependent on external factors for maintaining it.

What Open Source VMs are Left for iOS?

RoboVM wasn’t the only open source VM for iOS. It was just the most active, and provided the best and most comprehensive bindings to the iOS native APIs. But there are alternative VMs that the open source community may turn to for their supply chain. For example:

  1. Codename One (proper) – (Full disclosure, I work for Codename One)… Codename One is open source and provides a full cross-platform Java solution for write once run anywhere.
  2. Codename One’s VM – Codename One has developed its own Java VM for iOS that works as a cross-compiler from Java to C. This is open source and is a good option for Java tools that need a path to iOS.
  3. Avian – Avian is an AOT Java compiler that can be used to compile java directly to iOS binaries. It is written in C++, and has a very permissive license.
  4. XMLVM – This project has been discontinued, but I mention it for completeness in case people want to revive it.
  5. OpenJDK for iOS – An iOS port of OpenJDK has been approved for development. This may also present a long-term option, but it is still only in the planning stage.
  6. J2ObjC – A transpiler that converts Java source code into Objective-C source code.
  7. JUniversal – Java source transpiler to C# and C++ that includes a runtime library to help with portability.

OSCON 2014 Reflections

I’m sitting in Portland International Airport waiting for the chariot that will return me to Vancouver, so I thought I’d pass the time by reflecting on my experience at OSCON. I am not generally the kind of guy that gets the fullest experience out of a conference. I attend the talks, maybe meet a few people, and return to my hotel room to watch some Netflix. But even a social caterpillar like me has fun at these things. I thoroughly enjoyed all of the talks that I attended. I learned a lot from the tutorials, and I found the keynotes engaging.

Three talks particularly stood out to me, for various reasons:

Using D3 for Data Visualizations

The first tutorial that I attended (on Sunday) was led by Josh Marinacci on the topic of HTML5 Canvas, SVG, and D3 for data visualization. It was based on his online book HTML Canvas Deep Dive. I found the teaching style well structured and engaging. I picked up a few tips on style that I adopted for my own tutorial, which I gave the following day.

It is amazing how far Javascript has come. D3 has brought data visualization to the point where every researcher (i.e. people who are producing data) should at least try to learn it. I have recommended it to my wife, who has only taken one programming course in her life and does not program on a regular basis. She will be my guinea pig to see if it’s easy enough for non-devs to pick up.

Here are the results of three exercises in the tutorial:

  1. Agricultural Productivity By State
  2. Bar Chart Using Canvas
    • A bar chart drawn with HTML5 Canvas
  3. Pie Chart
    • A Pie Chart drawn using SVG

OpenUI5

I attended a talk by some of the OpenUI5 team. OpenUI5 is an HTML5 framework developed by SAP for cross-platform/mobile development. It provides a large number of widgets and development tools that you can use for developing a cross-platform app. And the only dependency it has is a single Javascript file.

The things that I like about it are:

  1. Light-weight. Single JS include.
  2. Really nice looking controls and layouts.
  3. Apache License
  4. Backed by a big company (so it has a better chance of survival than some of the other little promising HTML UI kits out there).

More about this talk

AsciiDoc and AsciiDoctor

I attended a talk by Dan Allen on JRuby where he demonstrated some cool aspects of the language, compared its performance with MRI (the canonical Ruby), and showed some tips on making Ruby and Java work nicely together. Dan is one of the developers behind AsciiDoc which, until OSCON, I hadn’t been aware of. AsciiDoc looks like an excellent tool for developing documentation and writing books. I have experimented with lots of solutions in this space over the past several years, including (but not limited to) Doxygen, TeX, DocBook, JSDoc, phpDocumentor, reStructuredText, and, more recently, Markdown.

I will definitely be giving AsciiDoc a go, as it appears to provide the simplicity of Markdown with the power of DocBook. The fact that it is a format supported by O’Reilly for their authors lends weight to its viability for arbitrary documentation projects.

Mirah

OK, there weren’t any talks on Mirah per se, but the JRuby talk that I mentioned above reminded me of my unfinished NetBeans module for Mirah. I ended up spending most of my evening hours at OSCON getting the Mirah module ready for release.

I fell in love with Mirah at first sight. It deserves a lot more attention than it is getting. Hopefully the NetBeans module will convince a Java developer or two to take a look. At the very least, it will enable me to start writing code in Mirah that I would otherwise write in Java. And nobody will be the wiser 🙂

My plans for Mirah center around Codename One. It is uniquely positioned to provide an alternate language for developing Codename One applications. I plan to use its macro ability to provide a sort of DSL specifically for removing the ceremony and boilerplate (inherent in Java) surrounding all of the common functions. I think I can improve productivity on CN1 apps by at least a factor of 2, and perhaps even more.

I’ll be posting more on that later.

Some Keynotes that You Should Watch

Andrew Sorensen : The Concert Programmer

This was really amazing to watch. This guy uses a special programming language to compose and sequence music. He codes up a pretty cool song right in front of your eyes.

Simon Wardley : Introduction to Value Chain Mapping

Simon demonstrates a really cool method to visually analyze and depict the value-chain in a company. I’m not really a management guy, but this technique looks like it could be applied to quite a few things. Watch it. You’ll learn something.

My Own Talk

Oh yeah. I led a tutorial on Codename One also. I’ll talk more about that in a separate post.

Speaking Cocoa From Java

Java on the Mac recently got a major boost to its profile as a platform for building Mac software when JavaFX Ensemble was accepted into the App Store. This was not the first Java app to make it into the App Store, but it was the first that used JavaFX. Having recently attended a number of talks on JavaFX at JavaOne, I can tell you that it is really something to get excited about, especially if you are a Java developer. The combination of its fast, rich graphics and media support with a first-rate visual GUI builder (JavaFX Scene Builder) makes for a platform on which you can be both productive and creative.

So let’s get busy writing Mac software in JavaFX. Well….. not so fast. In order to comply with App store guidelines, and to provide a native user-interface experience for Mac users, you still need to be able to talk to some native frameworks.

One major problem is that JFileChooser is not compatible with the Mac App Store because it doesn’t work inside the App Sandbox. The current recommended approach is to use the old java.awt.FileDialog class, but this has some serious limitations. For example, if your user enters a file name with an incorrect extension for your purposes, you are not allowed to programmatically fix it, and FileDialog doesn’t include any features to force a certain file extension. In cases like this it would be really nice to just be able to use the native NSSavePanel class to show a modal save dialog without having to jump through too many hoops. In a previous post, I described how to use JNI to build a Java wrapper around NSSavePanel, and this is a viable way to go, but it is quite a lot of hassle just to be able to use a simple file dialog.
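
For reference, the FileDialog approach looks roughly like this. Note that all we can do is read back whatever name the user typed; we can’t make the panel itself enforce an extension:

    import java.awt.FileDialog;
    import java.awt.Frame;

    public class SaveDialogExample {
        public static void main(String[] args) {
            Frame parent = new Frame();
            FileDialog dialog = new FileDialog(parent, "Save Document", FileDialog.SAVE);
            dialog.setFile("Untitled.txt");   // only a suggestion; the user can type anything
            dialog.setVisible(true);          // blocks until the dialog is dismissed

            String dir = dialog.getDirectory();
            String name = dialog.getFile();   // null if the user cancelled
            if (name != null) {
                System.out.println("User chose: " + dir + name);
            }
            parent.dispose();
        }
    }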

The NSSavePanel example is only the tip of the iceberg. When you start looking through the Mac OS X frameworks, it becomes clear that there are a lot of fun toys to play with, and a few other essentials that you cannot live without. (E.g. if you want to allow in-app purchase, you’ll need to be able to interact with the StoreKit framework).

Do we really have to battle with JNI every time we want to make a call to an Objective-C library? Luckily, the answer is no. We do have some easier options. The Rococoa framework is an open source successor to Apple’s discontinued Java-Cocoa bridge. It sits atop JNA and provides access to some core Mac OS X frameworks by way of generated Java class stubs. JDK 7 and JDK 8 include a little-documented library called JObjC that also provides the core OS X frameworks as generated Java class stubs. Unfortunately, both of these libraries suffer from a serious lack of documentation. In Rococoa’s case, there is some documentation, but most of it was written in 2005 and much of it no longer works. JObjC doesn’t seem to have any documentation to speak of aside from the unit tests (it doesn’t even include any javadoc comments).

The biggest pitfall of both of these libraries (in my opinion) is that they rely on you to build class stubs for the frameworks that you require. There are build scripts for this (e.g. JNAerator for Rococoa, and JObjC has its own Ruby script that generates class stubs using the OS X bridge support XML files), but they don’t work out of the box. And frankly, after spending 3 days fighting with JNAerator and not getting it to successfully build any stubs, I’m not even sure the current version works. The trouble with having to run build scripts is that you need to have your build environment set up just right for them to work. If you reach a point in developing a Java application where you just want to call a native method, or access a native framework, you don’t really want to have to go and build an entire mess of class structures from source. Or maybe that’s just me.

So then what? Is there a simpler option? Not that I could find. So I built one myself.

Some background on me: I am primarily a Java developer, but I have read numerous books on Objective-C and Cocoa with the anticipation that, some day, I might actually get around to writing a Cocoa App. I, therefore, prefer to write more code in Java and less in Objective-C, but I’m quite familiar with how the Objective-C runtime works and what the Cocoa frameworks have to offer.

One nice thing about Objective-C is that it has a dynamic runtime that you can easily interact with by way of just a few C functions. It provides a means of passing messages to objects. This is the entire basis for the “Objective” part of Objective-C. JNA is a fantastic library that allows us to call C functions directly from Java. Therefore, using JNA we can interact with the Objective-C runtime via its C API.

When it comes right down to it, all I really need is a few functions:

  1. objc_msgSend(), to pass messages to objects.
  2. objc_getClass(), to get a pointer to a class.
  3. sel_getUid(), to get a pointer to a selector.

And there are a few other useful functions, but I won’t mention them here. The first thing I needed to find out was how an Objective-C object looked inside of Java (using the JNA API). As it turns out, an Objective-C object can be simply represented by its address as a long variable. JNA provides a useful wrapper class, com.sun.jna.Pointer, around an address that can be used interchangeably in your JNA method wrappers.

With just this tiny piece of information, you can start to see what would be required to build a generalized wrapper for Objective-C from Java. By obtaining a pointer to a class via the objc_getClass() function and a selector via the sel_getUid() function, you can create an instance of any class by passing the selector to the class via the objc_msgSend() function.

You can then pass messages to the resulting object using the objc_msgSend() function again.

This is almost enough to form the foundation of a one-way, working bridge. But there are, of course, a few caveats.

For example, objc_msgSend() only works for messages that have simple return types (e.g. pointers, ints, etc.). If a message returns a structure, then you need to use the objc_msgSend_stret() function, and messages that return a double or float value need to be sent with the objc_msgSend_fpret() function.

But, for the most part, this understanding is sufficient for creating a one-way bridge.

After a little JNA magic, we have a simple, low-level API for sending messages to the Objective-C runtime. The following snippet is from a unit test that is testing out this low-level API:

However exciting it may be to be able to send messages to Objective-C objects, this level of abstraction is still too low to be “fun”. So I built a little bit of abstraction on top of this layer to, for example, allow sending messages without having to first look up the class and selector.

The following sketch shows messages being sent at a slightly higher level of abstraction (it builds on the runtime interface above; the helper names are illustrative):
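
    import com.sun.jna.Pointer;

    // Convenience wrappers so callers can use plain strings for classes and selectors.
    // (Builds on the ObjCRuntime interface sketched above; the names are illustrative.)
    public final class ObjC {
        private static final ObjCRuntime RT = ObjCRuntime.INSTANCE;

        // Send a message to a class, looked up by name.
        public static long msg(String className, String selector, Object... args) {
            return RT.objc_msgSend(RT.objc_getClass(className), RT.sel_getUid(selector), args);
        }

        // Send a message to an existing object, addressed by its raw pointer value.
        public static long msg(long receiver, String selector, Object... args) {
            return RT.objc_msgSend(new Pointer(receiver), RT.sel_getUid(selector), args);
        }
    }

    // Used from a test, this looks like:
    long array = ObjC.msg("NSMutableArray", "array");
    long hello = ObjC.msg("NSString", "stringWithUTF8String:", "Hello");
    ObjC.msg(array, "addObject:", new Pointer(hello));
    long count = ObjC.msg(array, "count");   // 1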

And this can be improved even more; with a static import of the msg() helpers, the calls read almost like Objective-C:
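
    // Assuming a static import of the msg() helpers above
    // (e.g. import static objcbridge.ObjC.msg; — the package name is hypothetical):
    long array = msg("NSMutableArray", "array");
    msg(array, "addObject:", new Pointer(msg("NSString", "stringWithUTF8String:", "Hello")));
    long count = msg(array, "count");   // 1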

At this point, we can quite easily interact with Cocoa without too much pain. But we still aren’t quite at the level of abstraction that we’d like. The next step would be to build an object-oriented wrapper around the Objective-C objects so that we’re not dealing directly with Pointers and long addresses to objects.

For this, I created the Client and Proxy classes. The Client class is a singleton that allows you to interact with the Objective-C runtime. It allows you to send messages to objects, and it provides automatic type mapping for both inputs and outputs so that the correct message function is used. E.g., using Objective-C type encodings for messages, it determines the return type and input argument types of a message so that the correct variant of objc_msgSend() is used and the output is in a format of our liking.

Objective-C objects are automatically converted to and from Proxy objects (which are just wrappers around the Objective-C object pointers). NSString objects are automatically converted to and from Java Strings. It also introduces a set of methods for sending messages that expect a certain return type. E.g., the sendInt() method is for sending messages that return ints, and the sendString() method is for sending messages that return Strings.

The Proxy class is a thin wrapper around an Objective-C object. It uses a Client instance to actually send the messages, and it provides wrappers around all of the relevant send() methods.

The following sketch shows what interacting with the Objective-C runtime looks like at this slightly higher level of abstraction (the method names follow the description above; the exact signatures are approximate):
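
    import ca.weblite.objc.Client;   // package name is approximate
    import ca.weblite.objc.Proxy;

    public class ClientProxyExample {
        public static void main(String[] args) {
            Client c = Client.getInstance();

            // Create an NSMutableArray and get back a Proxy wrapper instead of a raw pointer.
            Proxy array = c.sendProxy("NSMutableArray", "array");

            // Java Strings are converted to NSStrings automatically on the way in...
            array.send("addObject:", "Hello");
            array.send("addObject:", "World");

            // ...and back to Java Strings on the way out.
            String first = array.sendString("objectAtIndex:", 0);
            int count = array.sendInt("count");

            System.out.println(first + " (" + count + " items)");
        }
    }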

At this level of abstraction, it is pretty easy to add little bits of Cocoa functionality into our Java application. But there is one major thing missing: Objective-C to Java communication. This is critical if you want to be able to write delegate classes in Java that can be passed to Objective-C objects to handle some callbacks.

Calling Java From Objective-C

One-way communication will only get you so far. If you need to pass a callback to an Objective-C message, or write a delegate class in Java that can be called from Objective-C, you’ll need full two-way communication.

There are a few ways to accomplish this, but I chose to make use of the NSProxy class and a little bit of JNI to create an Objective-C object that acts as a proxy to a Java object. Whereas the Proxy Java class that we defined before contains a one-way link to its native peer object, the NSProxy subclass provides its own one-way link to its Java peer. Combining these two strategies, we end up with a two-way bridge between Java and Objective-C, such that we can call Java methods from Objective-C, and Objective-C methods from Java.

I’ll go into detail on how this was achieved in a future post, but for now I’ll give an example of how the resulting API looks from the Java side of things.

The following is a sketch of a sample application that opens an NSOpenPanel dialog and registers itself (a Java class, which is a subclass of the NSObject Java class) as a delegate for the dialog. (The annotation attributes and constructor details shown here are approximate.)
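
    import ca.weblite.objc.Client;        // package and constructor details are approximate
    import ca.weblite.objc.NSObject;
    import ca.weblite.objc.Proxy;
    import ca.weblite.objc.annotations.Msg;

    public class OpenPanelSample extends NSObject {

        public OpenPanelSample() {
            super("NSObject");   // back this Java object with an Objective-C peer
        }

        // Exposed to Objective-C as the NSOpenSavePanelDelegate callback
        // panelSelectionDidChange: ("v@:@" = void return; self, _cmd, one object argument).
        @Msg(selector = "panelSelectionDidChange:", signature = "v@:@")
        public void panelSelectionDidChange(Proxy panel) {
            System.out.println("Selection changed in " + panel);
        }

        public void open() {
            Proxy panel = Client.getInstance().sendProxy("NSOpenPanel", "openPanel");
            panel.send("setDelegate:", this);   // our Java object becomes the delegate
            panel.send("runModal");
        }

        public static void main(String[] args) {
            new OpenPanelSample().open();
        }
    }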

We introduced a special annotation, @Msg, to mark Java methods that should be accessible as Objective-C messages. This allows us to specify the selector and signature for the message in a way that the Objective-C runtime understands. Notice that we can even use the Proxy.send() method to send messages to our Java class. This actually sends the message to the Objective-C runtime, which pipes it back to the Java class to be handled — and if the Java class doesn’t define a method to handle the message, it will pipe it back to Objective-C to be handled by the superclass.

To find out more and download the library…

You can download the source or binaries for this project on GitHub. I have also posted the Javadocs here.

In Depth with JavaFX Scene Builder

This post is just a collection of my observations after watching a talk at JavaOne by Jean-François Denise about the JavaFX Scene Builder application.

One word summary: Wow!!
Two word summary: This rocks!

Scene Builder is a GUI application for editing FXML files (an XML format for defining user interfaces in JavaFX). It provides a drag-and-drop canvas for placing components in a scene. It really is almost as easy as Photoshop when it comes to creating pretty UI designs. In fact, it may be an effective tool for mocking up UIs. You can drag an image directly out of the Finder onto a scene and then start resizing it, rotating it, etc., right on the stage.

I was particularly impressed by how well it uses CSS for customizing the look of an interface. The CSS works just like normal CSS on the web, except it has its own directives prefixed by “-fx-”. CSS stylesheets can be added at any level of the scene graph: you can attach a stylesheet to the whole stage, or to a single panel, or even a single button. You can also add inline CSS styles to an individual node. Two differences from HTML CSS are that JavaFX also allows you to style elements with a theme and with bean properties. The order of precedence is:
1. Inline
2. CSS Stylesheets
3. Bean properties (e.g. properties set in the node inspector).
4. Themes
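
For example, here is a minimal JavaFX sketch showing three of those levels applied to the same button – an inline style, a scene-level stylesheet, and a bean property (the “demo.css” stylesheet name is hypothetical):

    import javafx.application.Application;
    import javafx.scene.Scene;
    import javafx.scene.control.Button;
    import javafx.scene.layout.StackPane;
    import javafx.stage.Stage;

    public class CssPrecedenceDemo extends Application {
        @Override
        public void start(Stage stage) {
            Button button = new Button("Hello");

            // 3. Bean property, as you would set it from the node inspector.
            button.setPrefWidth(200);

            // 1. Inline style: the highest precedence of the levels listed above.
            button.setStyle("-fx-base: #336699;");

            StackPane root = new StackPane();
            root.getChildren().add(button);

            Scene scene = new Scene(root, 320, 200);
            // 2. Stylesheet attached at the scene level; it overrides bean properties
            //    but loses to the inline style above. ("demo.css" is a hypothetical file.)
            scene.getStylesheets().add("demo.css");

            stage.setScene(scene);
            stage.show();
        }

        public static void main(String[] args) {
            launch(args);
        }
    }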

Scene Builder luckily keeps the styles sorted out so that you don’t have to pull your hair out trying to figure out where the “effective” style is defined. If you have overridden a bean property in a stylesheet, the bean inspector will no longer allow you to edit the property, but rather will link to the exact part of the stylesheet where the property is set.

There is also a powerful CSS analyzer that allows you to see all of the CSS properties for each node, and where they are defined. If you make a change to a CSS file, the preview of your UI will be updated instantly.

Denise briefly showed a demonstration of how to add a custom type to the scene builder, although apparently there is still work going on in this area to make it easier to add custom components. Currently you can only use the set of built-in components – and there are some notable components still lacking.

So far I’ve only played around with Scene Builder and JavaFX briefly and sparsely. However, I am looking forward to digging in.

Rogers/Fido Redemption

In my previous post, I described a situation where my wife was overcharged for transferring from Fido to Rogers in 2009. I spoke to both Rogers and Fido customer support at multiple levels and was told that because the overcharge was too long ago they were unwilling to refund it. I filed a complaint with the Rogers Presidents office and was told that they would not refund the money because it was Fido who had originally charged it and it was too long ago. Finally I filed a complaint with the CCTS but their investigation found that it was outside of their mandate because the overcharge was too long ago.

I cancelled our Rogers accounts and moved to a different provider, swearing to never deal with Rogers or any of their companies again.

UPDATE

Today we received a refund cheque from Rogers for $560. There was no explanation of the refund with the cheque, but I phoned customer support and they confirmed that it was a refund for the 2009 overcharge.

I am pleased that in the end the company did the right thing. It is unfortunate that they took so long to come to this conclusion, and in the meantime I was forced to cancel my account. However, with this gesture, they have redeemed themselves to the extent that I am willing to lift my personal ban on dealing with Rogers in the future. The next time I’m looking for services for which Rogers is one of the providers, I will be willing to consider becoming their client again.

Open Letter to Fido and Rogers

My wife and I have recently been forced to cancel our cell phone accounts with Rogers and we will never be returning to either Fido or Rogers for any service in the future. Over the past 5 years we have spent over $12,000 for our cell phone and data services and it is likely that we would be spending more than that amount in the next 5 years. I also have a growing business that will be requiring cell and data services in the coming years. Rogers and Fido will be receiving none of this business.

Why We Are Cancelling Our Service

In September 2009 my wife and I were both using Fido for our cell services. She was a little over 1 year into a 3 year contract and I was not under a contract. I was looking to upgrade to a smart phone so that I could receive email on my phone, so I went into the local Wireless Wave to learn about my options. The salesman informed me that he could save us money by switching to Rogers from Fido. He told me that because Fido was owned by Rogers, they had a migration program that meant we wouldn’t have to pay the full penalty for breaking my wife’s Fido contract. The penalty, he said, would be only $100.

Based on these numbers, I decided to switch to Rogers per the salesman’s advice. Unfortunately, the final Fido bill went to my wife and it was set up for automatic payment on her credit card. On this final bill the cancellation fee charged to her was $500. That is $400 more than the promised price. She thought this was high, but she wasn’t aware of what the salesman had told me (about the $100 fee), so we didn’t pursue it.

Fast forward to January 2011.

My wife mentions how expensive switching from Fido to Rogers was and I am shocked to find out that we had been charged $500. I was certain that it was supposed to have been $100. There must be a mistake.

When I phoned Fido customer service, they verified that we had been charged $500 for cancellation and that no refund was ever made. They also confirmed that they do have a migration program and that she should have qualified for the $100 cancellation fee.

However …

Since the mistake was made more than 90 days ago, they were not willing to correct the mistake in any way. I spoke with 4 separate people at Fido. They all gave me the same line. They suggested that since we no longer had an account with them, there was nothing they could do and that we should check with Rogers to see if they would help us out – since we were still customers of Rogers.

Unfortunately Rogers follows the same play book as Fido. I spoke with 3 people at Rogers customer service. They all quoted me the same thing: Any billing mistakes are assumed to be accepted by the client if they don’t object within 90 days.

Fair enough. We could have shown more diligence in monitoring our bills. We failed in this respect. However, we are still operating under the same 3 year contract for which the overcharge was made. The mistake was theirs – even if I made the mistake of not “catching” their mistake, I expect them to rectify it.

The Result …

1. I cancelled both of our services with Rogers and paid the penalty for the remainder of our contract. The value of this portion of the contract would have been approximately $2000. Instead they will receive a mere $450 penalty payment.

2. We signed with Koodo, which did not require us to sign any contract. As it turns out, Koodo will save us approximately $100 per month for even more service than we were receiving through Rogers. (Please be advised that we never even would have looked for an alternative option had we not been mistreated by Rogers.)

3. We will never again do any business of any kind with Rogers or any of its companies, including Fido. None of my businesses will ever again do any business with Rogers or any of its companies, including Fido.

The Balance Sheet ….

1. Fido (owned by Rogers) is up $400 for the money that they overcharged us initially.
2. Rogers is up $450 for the cancellation fees we paid to get out of our contracts.
3. Rogers is down $2000 over the next 9 months from the lost revenue they would have received had they rectified the problem.
4. Rogers is down at least $2500 per year in perpetuity from the forgone revenue from my patronage due to their failure to rectify this problem.

Forecast:

Over 9 months: Rogers is down $1150
Over 21 months: Rogers is down $3650
Over 33 months: Rogers is down $6150
…

Seeing how your arrogant policies hit your balance sheet, perhaps you will reconsider your ways for the future.

Best regards

Steve Hannah

UPDATE:

Rogers ended up refunding the overcharge. Read about it here.

So Steve’s leaving us…. but is he really?

I received my November issue of Mac Life in the mail yesterday. It includes an article on Steve Jobs entitled “Steve’s Gone… Now What?”. It went over some of the Wall Street reaction to his “leaving” and recapped some of the accomplishments and qualifications of his successor, Tim Cook. The article conveyed optimism for Apple’s future and didn’t seem to dwell too much on its founder’s demise. It wasn’t until I reached a second article that I realized I was reading tragically old news hot off the press:

“So Steve’s leaving us … but is he really?”, the article began.

Hmmm.. Uh yes.. pretty sure he’s really leaving.

Then the next line cleared up the conflict of context:

“What does it mean to switch from being CEO to Chairman of the Board?”

Ah .. that explains it.

A Week On Android

Two years ago I purchased an HTC Magic, which runs on Android. I used it for a day and decided I didn’t like it, so I switched to my wife’s old iPhone 3G instead. I never looked back, until …

Last week my wife’s new iPhone 3G got water poisoning, so I loaned her my iPhone (her old phone). My plan was to dust off the old HTC and try using that instead. After all, my needs are quite simple: I read books with the Kindle app, I use email, and I listen to music. The only part where I anticipated difficulty was the music portion, because I had a number of albums that had been purchased from iTunes.

Kindle Block

After charging the phone up a bit, I moved on to trying to install the Kindle app. Unfortunately, I didn’t see it in the Android marketplace. So I did a Google search to see if I could download it from the web. I found it and downloaded it, but was prompted that it was from an untrusted source and that I needed to turn off some setting in order to be able to install it. After disabling this security setting, I tried to install it… but failed with the error “This app could not be installed on this phone”. No explanation – just failure.

I repeated this procedure on a few different websites before realizing that my version of Android (1.5) was too old to run the software. In fact it was also too old to even work with the Android marketplace on the web.

Upgrade Maze

My next mission was to find out if there were any updates for my phone. So I did some Google searches for variations on my phone’s model and my phone carrier, which led to an updates page with about 8 or 9 different updates that apparently could be downloaded. There was a date next to each update, but it wasn’t clear which update I needed. Did I need them all, or did later updates overlap with earlier updates? One of the updates appeared to be an Android 2.1 update, but I wasn’t absolutely sure about that. So I downloaded that update – which was only available as an EXE file (i.e., it won’t run on a Mac).

So I dusted off my Windows laptop and installed the update software. Unfortunately, this software said that it couldn’t detect my Android phone. After a while I figured out that I also needed to install the phone sync application for the computer. Once that was installed, I was able to run the updater program. However, the update failed halfway through with a non-specific error about my phone not supporting this update.

This led to a period of research where I scoured Google to try to find all the information I could about my phone and updates, and errors with updates. There was a wealth of contradictory information – some reports claimed that the phone could not be updated, other reports claimed that it could. In the end I just went back to the same HTC update page and tried downloading some older updates. I downloaded each update and applied them in order. Most seemed to work – but none of them yet brought my phone up to Android 2.1, which is what was needed to run Kindle.

Once all of the prerequisite updates had been installed correctly, I tried again to install the 2.1 update. But it failed again.

Luckily, at this point (with one of the updates I had installed) I was able to get the 2.1 update using a built-in update feature on the phone. So after about 5 hours I had updated my phone to 2.1.

Still No Love

Now, with 2.1 installed, I had renewed vigour to get Kindle up and running. I was able to download it, but when I tried to run it, it informed me that I didn’t have enough memory. Apparently the HTC Magic doesn’t have much internal memory, but it has a MicroSD card slot. So I marched over to London Drugs and bought a 16 GB card to load into the phone.

This time around, I was able to install Kindle. Phew!

The Easy Part

Android 2.1 is better in every way than its 1.5 ancestor. Importing my contacts from the iPhone and transferring the music turned out to be quite easy.

The Good and The Bad

Most of my gripes (such as the lack of features like pinch-zoom) were resolved, leaving the UI almost as fluid as the iPhone interface. I do like its multi-tasking ability for productivity. I don’t like the multi-tasking ability for battery life.

Turns Out Ben Affleck Survived the War After All

After a few days of Android, we discovered that my wife’s iPhone had recovered from the water poisoning and was ready to resume its duties as her faithful communicator. So I was faced with a dilemma. I had put so much work into reforming the Android phone, and it had come such a long way. It would seem like a betrayal to return to my old phone after all we’d been through.

But it’s just a phone. I’m sure it will understand that no matter how much it tries to look and act like my old phone – it will never be an iPhone.

So I returned to my iPhone this morning and all is well in paradise again.

Footnote: The Android really does come close now – but in the end it was the battery life that tipped the scale and convinced me to return to my old phone.

Amazon EC2 Micro Instance By Accident

I recently had a problem with my Amazon EC2 server that caused me to have to fire up another fresh instance and transfer the files over. My original server was a “small” instance (which was the smallest available when I first set it up). I mistakenly chose a “micro” instance for the new server thinking that it was just another name for “small”.

3 days of misery ensued.

The server kept maxing out with high server loads. (The load average was hovering between 4 and 20 — ideal is around 0.3.) However, there were no processes using high CPU or memory resources according to the top command. With a server load of over 10, the most intensive process was using 0.1% of the CPU – which puzzled me. There was only one reading that appeared to be topping out (other than the load average), and that was %st (steal time) – it was sitting at over 98%.

Output of Top Command on Micro Instance

After some research, I discovered that “steal time” is the percentage of time that the virtual CPU is made to wait before the hypervisor allows it to continue. This means that there was some throttling going on under the covers. This led me to this excellent review of Amazon EC2’s micro instances, where the author describes the exact same experience with his simple blog.

After converting the instance back to a small instance, all is well again.

Output of Top Command on Small Instance