All posts by shannah

Steve Hannah is a 28-year-old software developer currently studying and working at Simon Fraser University in beautiful Vancouver, British Columbia. He specializes in web information systems and prefers Java, PHP, and Python as programming languages. He is a Christian and worships at Christ Church of China in Vancouver.

CodenameOne Project : Finally Java on the iPhone

I periodically monitor the progress of multi-platform development solutions for the mobile space. As a Java developer, I’m especially interested in solutions to bring Java to the iPhone. Even better would be a Java solution that will run on all of the major devices (Android, iOS, and Blackberry).

CodenameOne is the most promising project to appear in this space so far. It includes an API, a GUI designer, and a build/deployment solution so that you can develop your application in 100% pure Java and then deploy it as a native application on most major platforms (including iOS, Android, RIM, Windows Phone 7, and J2ME). The API forms a thin abstraction over the native libraries on each platform, so many components are heavyweight. This allows applications to take on the native look and feel of the host system. The underlying mechanics work differently on each platform: on Android, for example, the Java runs on the built-in VM, whereas iPhone builds use XMLVM to convert the Java bytecode to native code.

So far I haven’t done very much with it (I just discovered it last night). I downloaded the Netbeans plugin, which adds the ability to create a “Codename One Project” within Netbeans. This includes a GUI editor with a few themes to jump-start you with some common application structures. My initial test application was just a default tabbed application. It took me two minutes to create it.
The Netbeans plugin includes an iPhone emulator so that you can test the application right inside Netbeans. In my initial tests, this seemed to work quite well. Building the application was a single click.

I’m a little wary of the build process, as it occurs on their server. Apparently the Java is compiled to byte-code on your local machine, then it is sent to their server to have it converted to a native application. This requires that you sign up for a free account. After signing up for an account, and logging in, it said that I could perform up to 80 builds before upgrading. It didn’t say whether this is 80 per month, 80 per year, or 80 total — but I didn’t look into it too deeply. The build process took a little while (more than 20 minutes — I just requested the build, then waited 20 minutes and went to bed). In the morning the build was ready for download.

I was quite impressed with the file size. It was only 6.9 megs. Yes this was just a simple app with 4 tabs, some buttons, and forms, but this size is still quite good. Especially considering that it includes all code necessary from the Java libraries in order to run. I was expecting it to come in around 50 megs, as I fully expected the whole JVM to be statically compiled into it. Luckily, it looks like they use some optimizations to remove dead and unused code before building it.

Summary

Pros: Java. Support for most major devices. Good documentation. Good IDE support (both Netbeans and Eclipse). Easy one-click builds. Support for signing so you can submit apps to the respective app stores. Open source (GPL 2 with Classpath Exception – free for commercial and non-commercial use).

Cons: Building is done in the cloud. This is convenient, but it opens the door to problems in that you are now dependent upon them. They say that you can build it yourself and that there is information on the net on how to do this, but they offer no support for it.

Hopefully I’ll have the opportunity to build my first app with this in the next couple of weeks.

References

  • Codename One Website
  • How CodenameOne Works (a Stackoverflow discussion).
  • XMLVM – The underlying tool that allows them to cross-compile Java to native code.
  • LWUIT – A toolkit for user interfaces on mobile devices, of which CodenameOne is a descendant.

Is Gatekeeper the writing on the wall?

I just updated one of my Macs to Mountain Lion so that I have a test machine for my software. As expected, none of my software worked out of the box. When I tried opening PDF OCR X, it came up with a message saying that the application was damaged and should be dragged to the trash.

It turned out that the problem was that I hadn’t signed the application with my Apple Developer key, and Mountain Lion is set up by default to disallow applications from running if they aren’t signed with a valid developer key (which you can only get with a paid membership in the Mac Developer program). There are workarounds (described in this support article), but this was still a roadblock for a lot of people.

Luckily I have a Mac Developer membership and a key. It only took me an afternoon to sit down, figure out how to sign all of my code, and upload an updated version. However, this looks like it may be the writing on the wall. In the security settings pertaining to this “Gatekeeper” feature, there are 3 options:

  1. Allow Only Applications downloaded from the Mac App Store
  2. Allow Only applications by identified developers (i.e. they pay to be in the developer program)
  3. Allow all programs

The default is #2 (allow only applications from identified developers). If you try to select #3, it pops up with a warning message to discourage you from choosing this option.

So probably most people will continue to operate with setting #2. Is this just the beginning though? Can we look forward to future releases of OS X having a default of #1 – or worse, not having an option at all? My experience with developing Mac software has been one of keeping up with the moving goal posts.

When I first started developing Mac apps, all you had to do was put your app on the Apple download site (it was free) and people would find you. Then Apple came out with the App Store and placed many restrictions on applications – and of course took their 30% cut. I chose not to add my applications to the App Store because of the large cut that Apple took – and because it would have been a significant amount of work to make them comply with the App Store guidelines (I use Java). When Java finally reached the point where it could be bundled into an app (and thus be accepted into the App Store), I contemplated adding my application. But alas, my choice was taken away as the goal posts moved again: sandboxing.

Now applications must comply with Apple’s new sandboxing model, which basically locks each application inside a sealed box, unable to interact with the rest of the file system. Since my application is a utility that is meant to allow users to batch convert large numbers of files, it simply cannot be done under the sandbox model.

Users should be concerned too. If they have purchased an application from the App Store and then accidentally click “Update” to update to the latest version, they may have features taken away, never to be returned. This happened on my iPad when I accidentally updated my Kindle app. The old version contained a “buy books” button that took me directly to the Kindle store. However, Apple has since introduced a rule saying that Kindle couldn’t link to its website directly from the app (because Apple wanted a piece of all sales), so the link had to disappear.

In order for markets to function, consumers must have confidence in the market. Apple has proven time and time again that it will move the goal posts as far as it can get away with – so there can be no confidence that purchases I make today from a third-party developer in the Mac App Store will not be taken away from me at a later date.

I, personally, will not be purchasing any software from the Mac App Store for this reason. I will purchase directly from the vendor if that is possible. If it is not possible, I will find an alternative. (Of course there are some things for which there is no alternative – e.g. Apple Software).

JavaFX Has Finally Arrived

JavaFX 2.1/2.2 finally gives me the tools that I need to build the applications that I want to build. This release provides 2 missing pieces that make all the difference:

  1. Scene Builder – Finally a good graphical GUI tool for Java. This makes building GUIs almost as easy as Apple’s Interface Builder.
  2. JavaFXPackager Native Bundles – Now you can instantly build native application bundles for Mac, Windows, and Linux.

The only question is whether these features are too late to the party to make a difference. Sun dropped the ball with desktop Java a long time ago. It survived the past 10 years almost entirely on the backs of JavaEE (i.e. web/server programming) and, more recently, Android. Desktop application developers have been fending for themselves for the most part.

Now we have the tools that we need to make some serious applications. The upcoming release of JDK 1.7 will also finally enable Java developers to get their applications into the Mac App Store. Currently the bundle size for an application is quite large (over 50 megs for my hello-world test application), because it needs to contain the entire Java Runtime Environment, but this should improve over time as we get better at compressing and splicing the JRE to suit specific purposes.

I downloaded Netbeans 7.2, JavaFX Scene Builder, and JavaFX 2.1 (included with JDK 1.7.0u6) to try to build a simple web browser application. This proved to be super easy, and it gives me a lot of faith in the power of technologies such as FXML (an XML format for representing user interfaces). Some key points that impressed me during my short test:

  1. Netbeans now has an option to create a Swing-wrapped JavaFX project. This was previously a real pain: JavaFX has lots of cool toys, but it wouldn’t give you a full desktop experience. For example, you still needed Swing to get menus at the top of the screen on a Mac. Swing still does a lot of things well, so if you wanted to use JavaFX, you would generally need to rig up the Swing JFXPanel yourself. Having this as a default project option is nice. No more messing around.

  2. Using a controller for the FXML file makes it very easy to achieve full separation of the view and logic. The @FXML annotation also makes it incredibly easy to reference elements of the FXML UI from the controller. Attaching events for controls was quite simple.
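To make this concrete, here is a minimal FXML sketch for a toy browser window. The controller class name, the fx:id values, and the handler name are my own invented examples, not taken from any official tutorial:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?import javafx.scene.control.*?>
<?import javafx.scene.layout.*?>
<?import javafx.scene.web.*?>
<!-- fx:controller binds this layout to a Java class; "BrowserController",
     the fx:id values, and "#loadPage" are hypothetical names -->
<BorderPane xmlns:fx="http://javafx.com/fxml" fx:controller="BrowserController">
    <top>
        <HBox>
            <TextField fx:id="urlField" HBox.hgrow="ALWAYS"/>
            <Button text="Go" onAction="#loadPage"/>
        </HBox>
    </top>
    <center>
        <WebView fx:id="webView"/>
    </center>
</BorderPane>
```

In the matching controller class, a field declared as `@FXML private WebView webView;` is injected automatically by the FXMLLoader when the document is loaded, and `onAction="#loadPage"` resolves to an `@FXML`-annotated `loadPage` method on the controller.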

In order to truly be successful, JavaFX still needs to get a mobile presence. It currently doesn’t run on Android or iPhone (despite demonstrations to the contrary at previous JavaOne conferences). If they can somehow port it to these platforms, I think it will really take off.

Look Both Ways Before Adding Jars to Extensions Directory

I recently ran into a problem with a customer’s Java environment where some of the classes that PDF OCR X requires were being overridden by out-of-date classes. The error wasn’t exactly intuitive:


java.lang.AbstractMethodError: java.lang.AbstractMethodError: javax.xml.parsers.DocumentBuilderFactory.setFeature(Ljava/lang/String;Z)V

This was happening when I was trying to parse an XML document, and it only happened on this one customer’s machine. The problem was that he had an old version of Xerces installed in one of his Java extensions directories. He had probably installed it there years ago (or a program did it for him) as a shortcut to make sure that Xerces was always available without having to add it to the classpath for each project. Whatever the reason, it was causing problems for me in the present day.

I actually include my own copy of Xerces as part of the application and it is included in the classpath. So shouldn’t that override any jars in the system extensions directory?

Apparently, No.

Java looks first in the bootstrap classes, then in the extensions directories, and only if it can’t find the class in one of those two places does it look in your classpath.
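When hunting this kind of problem, it helps to ask the JVM directly where a class was loaded from. The following is a small diagnostic sketch (the factory class is the one from my error, but the technique works for any class):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.security.CodeSource;

public class WhichJar {
    public static void main(String[] args) {
        // The concrete factory class that the JVM actually resolved
        Class<?> cls = DocumentBuilderFactory.newInstance().getClass();
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        // A null CodeSource means the class came from the bootstrap
        // class loader (i.e. the JDK itself), not a jar on your classpath
        System.out.println(cls.getName() + " loaded from: "
                + (src == null ? "<bootstrap>" : src.getLocation()));
    }
}
```

If this prints a jar sitting in an extensions directory instead of the Xerces jar you bundled, you’ve found your culprit.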

This means that it is a really bad idea to add libraries to the Java extensions directories unless you are willing to keep them up to date. You may have problems down the road, and they might not be easy to solve.

This reminds me of a current discussion on the Mac Java development mailing list regarding the pros and cons of bundling your own Java runtime when you distribute applications. If you do decide to rely on the system’s runtime, you are at the mercy of the user’s environment. They can break your application in a thousand different ways by installing old, broken, and dirty libraries into their extensions directories.

Best practice: If Oracle or Apple doesn’t put something in the extensions directory, then you shouldn’t either. Period.

SHOW TABLE STATUS vs SHOW TABLES vs INFORMATION_SCHEMA

I have been battling with some performance issues on one of my Xataface applications of late and I think I just found the cause of periodic slow-downs:

SHOW TABLE STATUS

The output of this command includes useful information about all of the tables in the database including such things as creation time, update time, average row length, and number of rows. In Xataface I primarily use this command to:

  1. Find the modification time of tables so that I can perform smarter caching operations.
  2. Determine if a view exists.

When working exclusively with MyISAM tables, this command is very fast, as all of the information returned is cached all the time. However, when we start to throw InnoDB tables (and possibly views – I haven’t looked into it yet) into the mix, this command becomes quite slow, because much of the data returned needs to be calculated (e.g. the number of rows). I was facing an issue where this command could take upwards of 10 seconds to return when the application hadn’t been used in a while. It would also periodically hang even when the application was in frequent use. Presumably this is because MySQL does some caching of the values behind this command, but the cache doesn’t last long.

In addition, InnoDB doesn’t keep track of modification times, so despite the fact that this command performs calculations, it still returns NULL for table update times – which renders it altogether useless for finding modification times of InnoDB tables.

Xataface has long had a back-up strategy for keeping modification times in InnoDB tables. It keeps its own table of modification times. This table is updated whenever a record is updated from within Xataface. It doesn’t work for updates performed outside of the application. In most cases this is good enough. Even with this back-up solution, the primary method of retrieving table modification times was still the “SHOW TABLE STATUS” mysql command.

Solution #1: Use the Information Schema

My first attempt to rectify the problem involved a direct query of the INFORMATION_SCHEMA. You can obtain the modification times of all tables in the database with the following query:

select TABLE_NAME as Name, UPDATE_TIME as Update_time from information_schema.tables where TABLE_SCHEMA='my_database_name'

Unfortunately, it turns out that this query is also quite slow (though a bit faster than SHOW TABLE STATUS). My initial tests showed that on a database with about 60 tables it would take about 0.2 seconds to return. Far too slow for an operation that doesn’t contribute directly to the building of the page. What I need is something that returns in less than 0.01 seconds so that it is effectively negligible.

Solution #2: Use SHOW TABLES

SHOW TABLES simply returns a list of the tables in the current database. It doesn’t include any stats or information about those tables other than the table names. It is also very fast (it generally returns in 0.00 seconds, i.e. too quickly to measure). This is enough information to build my own modification times or to check for the existence of a table or view.
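The shadow-table approach described earlier can be sketched in SQL. The table and column names here are illustrative, not Xataface’s actual schema:

```sql
-- A shadow table recording when each application table was last touched.
CREATE TABLE IF NOT EXISTS modification_times (
    table_name VARCHAR(64) NOT NULL PRIMARY KEY,
    last_modified TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
        ON UPDATE CURRENT_TIMESTAMP
);

-- The application touches the shadow row after every write it performs:
INSERT INTO modification_times (table_name, last_modified)
    VALUES ('my_table', NOW())
    ON DUPLICATE KEY UPDATE last_modified = NOW();

-- Reading all modification times back is then a cheap indexed lookup:
SELECT table_name, last_modified FROM modification_times;
```

Combined with SHOW TABLES to detect table creations and drops, this gives modification times in a single inexpensive query, with the caveat noted above that writes made outside the application go unrecorded.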

Rogers/Fido Redemption

In my previous post, I described a situation where my wife was overcharged for transferring from Fido to Rogers in 2009. I spoke to both Rogers and Fido customer support at multiple levels and was told that because the overcharge was too long ago they were unwilling to refund it. I filed a complaint with the Rogers Presidents office and was told that they would not refund the money because it was Fido who had originally charged it and it was too long ago. Finally I filed a complaint with the CCTS but their investigation found that it was outside of their mandate because the overcharge was too long ago.

I cancelled our Rogers accounts and moved to a different provider, swearing to never deal with Rogers or any of their companies again.

UPDATE

Today we received a refund cheque from Rogers for $560. There was no explanation of the refund with the cheque, but I phoned customer support and they confirmed that it was a refund for the 2009 overcharge.

I am pleased that in the end the company did the right thing. It is unfortunate that they took so long to come to this conclusion, and in the meantime I was forced to cancel my account. However, with this gesture, they have redeemed themselves to the extent that I am willing to lift my personal ban on dealing with Rogers in the future. The next time I’m looking for services for which Rogers is one of the providers, I will be willing to consider becoming their client again.

Open Letter to Fido and Rogers

My wife and I have recently been forced to cancel our cell phone accounts with Rogers and we will never be returning to either Fido or Rogers for any service in the future. Over the past 5 years we have spent over $12,000 for our cell phone and data services and it is likely that we would be spending more than that amount in the next 5 years. I also have a growing business that will be requiring cell and data services in the coming years. Rogers and Fido will be receiving none of this business.

Why We Are Cancelling Our Service

In September 2009 my wife and I were both using Fido for our cell services. She was a little over one year into a 3-year contract and I was not under a contract. I was looking to upgrade to a smartphone so that I could receive email on my phone, so I went into the local Wireless Wave to learn about my options. The salesman informed me that he could save us money by switching from Fido to Rogers. He told me that because Fido was owned by Rogers, they had a migration program that meant we wouldn’t have to pay the full penalty for breaking my wife’s Fido contract. The penalty, he said, would be only $100.

Based on these numbers, I decided to switch to Rogers per the salesman’s advice. Unfortunately, the final Fido bill went to my wife, and it was set up for automatic payment on her credit card. On this final bill the cancellation fee charged to her was $500. That is $400 more than the promised price. She thought this was high, but she wasn’t aware of what the salesman had told me (about the $100 fee), so we didn’t pursue it.

Fast forward to January 2011.

My wife mentions how expensive switching from Fido to Rogers was and I am shocked to find out that we had been charged $500. I was certain that it was supposed to have been $100. There must be a mistake.

When I phoned Fido customer service, they verified that we had been charged $500 for the cancellation and that no refund was ever made. They also confirmed that they do have a migration program and that she should have qualified for the $100 cancellation fee.

However …

Since the mistake was made more than 90 days ago, they were not willing to correct the mistake in any way. I spoke with 4 separate people at Fido. They all gave me the same line. They suggested that since we no longer had an account with them, there was nothing they could do and that we should check with Rogers to see if they would help us out – since we were still customers of Rogers.

Unfortunately Rogers follows the same play book as Fido. I spoke with 3 people at Rogers customer service. They all quoted me the same thing: Any billing mistakes are assumed to be accepted by the client if they don’t object within 90 days.

Fair enough. We could have shown more diligence in monitoring our bills. We failed in this respect. However, we were still operating under the same 3-year contract on which the overcharge was made. The mistake was theirs – even if I made the mistake of not catching their mistake, I expect them to rectify it.

The Result …

1. I cancelled both of our services with Rogers and paid the penalty for the remainder of our contract. The value of this portion of the contract would have been approximately $2000. Instead they will receive a mere $450 penalty payment.

2. We signed with Koodo, who did not require us to sign any contract. As it turns out, Koodo will save us approximately $100 per month for even more service than we were receiving through Rogers. (Please be advised that we would never even have looked for an alternative had we not been mistreated by Rogers.)

3. We will never again do any business of any kind with Rogers or any of its companies, including Fido. None of my businesses will ever again do any business with Rogers or any of its companies, including Fido.

The Balance Sheet ….

1. Fido (owned by Rogers) is up $400 for the money that they overcharged us initially.
2. Rogers is up $450 for the cancellation fees we paid to get out of our contracts.
3. Rogers is down $2000 over the next 9 months from the lost revenue they would have received had they rectified the problem.
4. Rogers is down at least $2500 per year in perpetuity from the forgone revenue from my patronage due to their failure to rectify this problem.

Forecast:

Over 9 months: Rogers is down $1150
Over 21 months: Rogers is down $3650
Over 33 months: Rogers is down $6150
…

Seeing how your arrogant policies hit your balance sheet, perhaps you will reconsider your ways for the future.

Best regards

Steve Hannah

UPDATE:

Rogers ended up refunding the overcharge. Read about it here.

A Bug Hunter’s Diary: A Guided Tour Through the Wilds of Software Security

A Bug Hunter’s Diary is written as a journal of a “Bug Hunter”. It takes us through seven bugs that the author (Tobias Klein) has discovered in the past five years, giving the reader a glimpse of his process and an awareness of how easily bugs can creep into even the most carefully written software.

All of the bugs discussed in this book are memory errors that lead to proof-of-concept exploits involving overwriting the program counter (and thus hijacking subsequent program execution with arbitrary code). This narrow focus sets the stage for a book that is very heavy on assembly code, debuggers, and a few easily compromised C functions.

Some of the bugs involve open source software, so we have source code to step through, but others provide nothing more than the finished binary to step through with a debugger and a disassembly of the machine code. For someone like myself, who doesn’t do much C programming anymore and hasn’t done assembly in over 10 years, the journey can be quite daunting. The guide on this journey stops occasionally to explain his intuition for certain decisions, but we’re largely left to follow him along without really knowing how he knew to check for certain things and not others. After following him on seven missions, however, some patterns do become evident. This repetition lets the reader feel like more of a partner in the later journeys, in contrast to his bystander role in the first few.

Being primarily a Java, PHP, and Javascript developer of late, the tools and languages used in this book (assembly and debuggers) really stretched my ability to follow along in real time. It reminded me a little of the first few computer programming books I picked up as a newbie, when I would read pages and pages of code, waiting (hoping) for the eureka moment when all the code would make sense. Like listening to an old song from my past, this feeling of having to follow the master and just have faith that things would make sense if I stared a little longer opened a flood of memories of my first days in programming. I’m thankful to the author for that.

If you think of my train of thought during the reading of this book as a process, then this process had a few different threads running in parallel. In one thread, I was just trying to follow the bug hunter and understand the lessons as they were presented. In another thread, I was trying to identify analogs of these lessons in my own domain of programming (e.g. web programming and database programming). There was also a third thread that periodically assessed the author’s style, the book’s relevance to various target audiences, and its importance to software.

Thread 1: Just trying to keep up

If I had read this book as a tutorial, actually doing as the author says, it would have taken me a lot longer to read. At a certain point I just settled for getting the gist of what he was doing and devoted more mental cycles to Thread 2.

Thread 2: Applying the Lessons to My Domain

The process that Klein uses to track down bugs serves as a forceful reminder of how bugs could slip into my own software if I’m not careful. In trying to exploit memory errors, Klein would focus on the interfaces between user space and kernel space, as logically any exploit must pass through one of these bridges at some point. There is an analog in any programming environment: in web development you look for the entry points where user input is passed to the application. This book is a reminder that you must always be conscious of where data could possibly have come from before working with it, as it is often a combination of small glitches that results in a large exploit down the road.

Thread 3: The Author’s Style

Klein writes much of this book as a sort of diary of his actions. As such it is more like a day in the life of a bug hunter than it is a tutorial on how to become a bug hunter. This style works well for imparting an authentic view of what a bug hunter really does – and as such might serve as inspiration for those who want to follow this craft.

At times (more early on than later) I struggled to grasp the direction Klein was taking us. Is he just reciting his exploits as a way of bragging, or is he recounting his steps and building up to an eventual purpose? Throughout the second and third bugs I was bogged down a little with thoughts like “this is way too dense in assembly for the average person,” but these would be countered with rebuttals like “but this isn’t intended for the average person…”. As I settled in, though, these thoughts were replaced by less critical internal commentary like “wow, it would be really easy to make that mistake” or “that’s really neat!”

By the time I reached the sixth and seventh bugs and had seen the same types of bugs appear over and over, I had become a true believer. Not only was this book interesting; it was essential! Despite the lip service paid to “good programming practice” in computing science programs, I have never been made so acutely aware of some of the simple programming mistakes that can result in dreadful consequences. As I finished the seventh bug, the one clear capstone thought was that this book should be required reading for anyone who wants to write any software. Just as engineers are required to take an ethics course, computer scientists should be required to take a “bug hunter’s” course – if for no other reason than to raise their awareness of the nature of bugs.

When I think about the “buggy” code that I have read from new university grads, it makes me shiver. Much of this buggy code is in production around the world, and quite often the developer has no idea that the bugs are there. This book highlights a few specific functions to handle with great care in C, but every language has its own set of traps that new programmers fall into (e.g. registered global variables in PHP, or unvalidated input to SQL queries leading to SQL injection). Computer books very rarely spend much time highlighting these traps, and as a result young programmers regularly fall into them. Unfortunately, with software these traps often don’t give immediate feedback. Instead they linger for years before they are exploited – sometimes to great disaster.

Summary

I meandered a bit in this review, but to summarize: if you can slog through the dense debugger readouts and assembly code and latch onto the core message of this book, you will become a better software developer. Even if you don’t learn anything new, it will serve to raise your awareness and perhaps prevent a bug.

And one more thing: This book should be required reading for new software developers.

Survey Builder 0.1 Released

I have just released Survey Builder on Sourceforge. Survey Builder is an application that allows you to design and host surveys using pure HTML. It is the first application built with the new Xataface 2.0 look and feel.

Check out the sourceforge project at https://sourceforge.net/projects/surveybuilder/
Check out the documentation at http://www.fcat.sfu.ca/websurvey/doc_output/html
Check out a sample survey at http://www.fcat.sfu.ca/websurvey/sample

Sourceforge Much Improved

I just went to start up a new Sourceforge project for the Survey Builder application I just created. I guess it’s been a while since I created a project, because the system is totally different than it used to be. Much more powerful, it seems. They now give you a blog, an issue tracker, and a wiki (as well as version control if you want it), which is all that I generally need. A long time ago I decided to move most of my project infrastructure off of Sourceforge (except for download hosting) because their facilities were clunky and slow.

There’s still time for me to re-form a negative opinion, but for now it all looks good. I may not even set up a separate site for this project at all; I’ll just do it all on Sourceforge.