Posts about software development. Generally I use Java, PHP, and Python for development but occasionally I delve into other things as well.
Found this great IE blog article explaining the cryptic “Operation Aborted Error” in IE.
Click Here.
Another reminder of why I don’t use IE, and of how the fact that so many people do use it makes my life as a developer miserable.
I am building some SFU websites using WordPress and I need to set them up to use the SFU central authentication service (CAS). A quick search on Google found this plugin:
cas-authentication
I downloaded and installed it without trouble, but couldn’t seem to get it working. Likely it has something to do with the fact that it was written for WordPress 2.5.1, and I’m using the latest (2.6.1). The errors I was getting were unhelpful, basically just stating that the CAS authentication failed and that I was not logged in. I tried in vain to debug the script for an hour or so.
Then I came across a thread in a WordPress message forum indicating that I was not the only one having these problems. Through this thread, I was able to find another CAS plugin (written by Casey Bisson) that claimed to work with WordPress 2.6.1 – and the claims were true.
The plugin can be found at http://wordpress.org/extend/plugins/wpcas/
(Note to self: SFU uses CAS 1.0, not 2.0.)
The only thing that this new plugin lacked was the ability to automatically create accounts when users log in. I made the following change to the wpcas.php file to add this behavior.
function authenticate() {
    global $wpcas_options, $cas_configured;

    if ( !$cas_configured )
        die( __( 'wpCAS plugin not configured', 'wpcas' ));

    if ( phpCAS::isAuthenticated() ) {
        // CAS authentication was successful
        if ( $user = get_userdatabylogin( phpCAS::getUser() )) {
            // the CAS user already has a WP account
            wp_set_auth_cookie( $user->ID );
            if ( isset( $_REQUEST['redirect_to'] ))
                wp_redirect( function_exists( 'site_url' ) ? site_url( $_REQUEST['redirect_to'] ) : $_REQUEST['redirect_to'] );
            wp_redirect( function_exists( 'site_url' ) ? site_url( '/wp-admin/' ) : '/wp-admin/' );
        } else {
            // the CAS user _does_not_have_ a WP account
            /** BEGIN CHANGES TO ADD AUTO ACCOUNT CREATION **/
            if ( function_exists( 'wpcas_nowpuser' )) {
                wpcas_nowpuser( phpCAS::getUser() );
            } else {
                // Auto-registration: the user passed CAS, so they are
                // authorized -- add them to the WordPress database.
                require( dirname( __FILE__ ) . '/../../../wp-includes/registration.php' );
                $username = phpCAS::getUser();
                $password = md5( 'testing' ); // placeholder; CAS handles authentication, so this password is never used
                $user_email = $username . '@sfu.ca'; // every SFU account is username@sfu.ca
                $user_info = array();
                $user_info['user_login'] = $username;
                $user_info['user_pass'] = $password;
                $user_info['user_email'] = $user_email;
                wp_insert_user( $user_info );
                $user = get_userdatabylogin( phpCAS::getUser() );
                wp_set_auth_cookie( $user->ID );
                if ( isset( $_REQUEST['redirect_to'] ))
                    wp_redirect( function_exists( 'site_url' ) ? site_url( $_REQUEST['redirect_to'] ) : $_REQUEST['redirect_to'] );
                wp_redirect( function_exists( 'site_url' ) ? site_url( '/wp-admin/' ) : '/wp-admin/' );
                // original plugin behavior: die( __( 'you do not have permission here', 'wpcas' ));
            }
            /** END CHANGES TO ADD AUTO ACCOUNT CREATION **/
        }
    } else {
        // not yet authenticated -- send the user to the CAS login page
        phpCAS::forceAuthentication();
        die();
    }
}
Now it works like a charm.
There are hundreds of reasons not to use IE (Internet Explorer), but the common folk still seem to use it en masse, so I am forced to deal with its bugs when developing web pages.
A couple of bugs that I ran into today in IE 6:
1. No support for the onload handler on script tags (which makes dynamically loading scripts a pain).
2. No support for alpha transparency in PNG files. I have a beautiful logo that looks great in all browsers except IE. IE just shows a gray background where it is supposed to be clear.
Haven’t gotten my hands directly on an IE 7 machine to test out, but by all indications, these problems still exist in IE7.
If you are still using IE, I beg you to switch to something else – anything else: Firefox, Safari (yes, it is available for Windows now), Opera, or Google Chrome (Google’s shiny new browser). Just stop using IE and make the world a better place.
When the statistics show that fewer than 2% of users are on IE, I can stop wasting my time on workarounds for it and focus on developing.
Problem:
I have a website where users log in at http://example.com.
I have an iframe that embeds pages from example.com on http://mysite.com.
IE7 won’t retain the session cookie (i.e. it keeps asking me to log in inside the iframe)!
Workarounds:
1. Have your IE7 users set Internet Options >> Privacy >> Advanced >> Check “Override Automatic Cookie Handling” and “Always allow session cookies”
This is not so good because it is inconvenient most times to have all of your users make this change.
2. Use a P3P header. IE7 treats cookies set by a framed page from another domain as third-party cookies and blocks them unless the framed site declares a compact privacy policy (the W3C P3P standard). Send this header just after session_start(); in PHP:
session_start(); // start the session
header('P3P: CP="IDC DSP COR CURa ADMa OUR IND PHY ONL COM STA"');
For more on P3P see:
http://www.sitepoint.com/article/p3p-cookies-ie6/2/
For more on the IE7 Bug (or rather annoyance!!) see:
http://aspnetresources.com/blog/frames_webforms_and_rejected_cookies.aspx
I have just revamped my translation website, shifting the focus to providing machine translations of web sites. The site will allow users to upload their websites as a ZIP file, and have them automatically converted into nearly 30 languages.
I have leveraged quite a bit of open source software to make this happen, including Xataface, and this is only phase one. Xataface already allows developers to easily convert monolingual web applications written in PHP into multilingual applications complete with Google Translate integration, and support for human translation also. I am currently working on some modules to improve interoperability with other translation tools using the XLIFF document standard.
For those people who require a high quality human translation, I have provided a quote form from Translated.Net to get an instant quote. Ultimately translate.weblite.ca will become a portal with all kinds of tools and information about website internationalization. One step at a time ….
I was tired of having to resize my images before uploading them to the web. I also wanted to be able to host more video on my website in a simple way. So I created an application to manage and serve all of my videos and images – Internet Media Manager.
Now I manage all of my media from a central location, and I can easily embed images and video into any of my web pages by copying and pasting a snippet of code.
Here is a brief guided tour (this video is hosted using Internet Media Manager):
Some notable features that are included:
For the past couple of years I have been dabbling in the art of screencasting. I think that providing video tutorials for my software can be much more helpful to my users than producing standard web page documentation. However, I have struggled to find any good solutions. Prior to my discovery of Screenflow, I had experimented with Snapz Pro X, iShowU, VLC, and many other forgettable solutions. The best of this bunch was iShowU, but it is so far behind Screenflow that it almost isn’t worth the comparison. I’m still bitter towards Snapz Pro X because of how difficult it seems to be to uninstall (and the pop-up box every 10 minutes asking me to purchase the software is a real pain).
So why is Screenflow so much better?
1. I can’t figure out how it works so well. You can play high definition video on your Mac and record it with Screenflow without losing any apparent quality. It just works. This enables me to more easily take a clip or two from a DVD or streaming video source to include in my work without jumping through any hoops.
2. No more file size/video quality trade-offs. When recording it doesn’t have a zillion configuration options. You just press “record” and it works. What’s more, it doesn’t seem to bog down the computer like all the other solutions did. You can start it recording and not even notice that it is running.
3. Integrated webcam and Keynote. I have been longing for a good solution in this realm for a long time. Screenflow will automatically record from your webcam while you perform your screencast. In post-production it is easy to alternate between the webcam and the screen, and you can even do picture-in-picture. The Keynote feature is really cool too, as it is a common requirement to have a running Keynote presentation alongside the screencast.
4. Effects and post-production. It has all the toys I need for polishing the screencast when I’m done: zooming in on windows, highlighting sections of the screen, dimming out irrelevant portions of the screen, etc.
I give this software a 10 out of 10. It is the very first of its kind to get it all right.
This is yet another example of patents out of control. A company named Blackboard apparently “invented” the idea of allowing a single user to operate with different roles for different courses in an online learning system.
How, oh how, can this be considered patentable? The concept of roles, and the ability to assign the same user different roles in different contexts, has been around since the dawn of information systems. This is not an original idea!
All I can hope is that Obama’s promised patent reforms will address issues like this one.
More and more, I’m leaning toward the position of “NO SOFTWARE PATENTS”. This is based on the fact that I have yet to see a software patent that actually has merit.
I have recently moved from a managed hosting service to a dedicated web server where I essentially have my own machine. This is nice because it gives me full root access to install whatever I like and the performance is superb. However I’m on my own for all of the little things that I used to take for granted.
The first thing that I faced was the need for a DNS server. I’ll leave that adventure for another post. The next thing I realized was that I am responsible for my own backups. After pricing around for a backup server, I decided that it would be just as well to do the backups locally (i.e. periodically download copies of important files to a local machine). I have been living in a laptop world for the past 4 or 5 years so my desktop computer situation is sorely lacking – and for this sort of thing I do need a desktop with a persistent internet connection.
At first I was thinking of just purchasing a cheap old G4 tower (I would rather at least have a mac). I recall that they used to be pretty fast, and you don’t need much for back-ups. Then it occurred to me that if I’m going to put a desktop computer into my little one-bedroom apartment, I might as well multi-task it. Craigslist has a good selection of G4 towers in the 400MHz range for about $150 or $200. Then I saw a Quicksilver 800Mhz model for only $220, and I started to think a little bigger.
“If I had a Quicksilver model, it would run Leopard, and ….” Well, the Quicksilver guy never returned my email (of course he didn’t post a phone number), so I started to reconsider my decision to upgrade to a Quicksilver. I asked the question: “How much faster is a Quicksilver than the Sawtooth (400MHz) model?” Thanks to lowendmac.com for providing benchmarks for all of the old Macs so I could make this comparison. I noticed that Lowendmac listed something called a “Geekbench” score for almost all of its computers. Geekbench is a cross-platform benchmarking program that rates a computer’s performance on a number of factors. I then stumbled across this little goldmine of a page that compares the performance of all Macs G4 and up.
The power mac G4 450MHz scores a 309
The Quicksilver 867 MHz scores 415
This is a small improvement, but nothing to write home about.
This is where trouble started brewing in my mind. I currently use a Powerbook G4 1.67GHz for my job and my fiancée has a Macbook 2.0 GHz intel core duo, so I thought I would see how they ranked so that I could make a comparison in tangible terms that I could relate to.
My powerbook scored a 774.
The macbook scored a 2534!!!
Wow. I didn’t realize just how much of a performance difference there is between the G4 models and the current Intel Core Duo models. There is a huge difference.
In fact, up until last night, I was living in a world where I still thought that the G5 would hold its own with just about anything out there. The single-processor G5 1.8GHz scores a 1049 on the Geekbench. These machines are currently going for around $800 to $1000 on craigslist. Even the dual-processor 2.7GHz only scores a 2251 – less than my fiancée’s laptop. Just a note: the G5 duals go for anywhere from $1000 to $2500 on craigslist.
So I got another thought. For a long time, I considered the smaller computers like iMacs and Mac minis to be inferior to the tower computers. Hence for some reason I had the idea that it was better value to buy an older G5 tower than to buy a new Mac mini. Well, the Mac mini 1.83GHz Intel Core Duo with 1 GB of RAM retails for $649! And its Geekbench score is 2365! Higher than the mammoth G5 of yesteryear.
In fairness the latest towers are now ranking over 10,000 on the Geekbench charts, but you’ll be spending well over $5000 to get that sort of machine. For average home use and amateur video editing I can’t imagine that the added power would warrant the added price tag.
So, long story .. well… long… I bought a new Mac Mini. It will serve as my universal business machine. I will be primarily using it for backups, but it will work out nicely for video editing, email, and pretty much everything else too.
And its footprint is negligible. It’s about the size of a cable modem, and it is silent!
So where does that leave the old towers of yesteryear that people are asking $1000 and $2000 for? Sure, they have more expandability, but most people out there will never want to expand. And in the era of USB 2.0 and FireWire 800 it is easy to expand externally. E.g. I’m running two external hard drives on my Mac mini for my backups.
The moral of this long, drawn-out story is to think before spending money on an older beefed-up tower when you can get better performance, a smaller footprint, and a sexier look from a new Mac mini.
If you’ve used the internet even casually over the past few years you have probably experienced CAPTCHA already. From wikipedia:
A CAPTCHA (IPA: /ˈkæptʃə/) is a type of challenge-response test used in computing to determine whether the user is human.
It is common to see an image like this:
and be asked to type the letters you see into a text field. If you answer correctly then your input is accepted. Otherwise you are assumed to be a robot, and your input is rejected.
CAPTCHA is an annoyance to users because it costs them extra time every time they submit a form on the internet. It is, however, a necessary one, thanks to spammers.
This “annoyance” sparked an idea in some researchers at Carnegie Mellon University, to try to derive some good out of this situation. They made a key observation about CAPTCHAs:
Over 60 million CAPTCHAs are solved every day by people around the world. reCAPTCHA channels this human effort into helping to digitize books from the Internet Archive. When you solve a reCAPTCHA, you help preserve literature by deciphering a word that was not readable by computers.
If this catches on with some high-traffic web sites (say Facebook, or Gmail), imagine the productivity we could attain in digitizing these old books.
I have to say that this is one of the cleverest ideas I have seen in a long time. It takes wasted energy and transforms it into useful energy.