Sunday, October 23, 2011

Unsung Heroes - Dennis Ritchie

Coming a week after the death of Steve Jobs, it was announced that Dennis Ritchie had passed away at the age of 70.

There is a huge media focus on Jobs, quite understandably and rightly so, but Ritchie, in my view, contributed far more to the world of technology we see around us today.

To be fair, a number of mainstream media (MSM) outlets did run the story, including an obituary in the UK Guardian.

Ritchie joined Bell Labs in 1967 to work on Multics - the pioneering OS started by MIT and Bell Labs in the 60s and taken over by Honeywell in the 70s.

Ritchie joined the Multics programme at Bell at a point of turmoil: Multics was failing to deliver, and Bell dropped out of the project in 1969. But Ritchie, with fellow "co-conspirators" Thompson, McIlroy and Ossanna, knew there was a need for a time-sharing OS to support their programming and language development interests.

During 1969, Thompson had been developing a game called Space Travel on Multics, but with the shutdown of the Multics programme he'd lose his game, so he started to port it to FORTRAN on a GE-635. The game ran slowly on the 635 and was costly to play, as computer time was charged by the hour in those days.

So, to keep his gaming interest alive, Thompson got access to a PDP-7 minicomputer that had, for the time, a good graphics processor and terminal. It wasn't long before Ritchie and Thompson had programmed the PDP-7 in assembler to get the raw performance they wanted for Space Travel. In essence, they had to build an OS on the PDP-7 to support the game's development. They called this OS Uniplexed Information and Computing Service (UNICS) as a reference to, and pun on, their ill-fated Multics programme. UNICS got shortened to "Unix".

In 1970 Bell Labs got a PDP-11 and Ritchie and the team began to port Unix to it. By this time the features and stability of the OS were growing. By 1971 Bell Labs had started to see commercial potential in what Ritchie and the team had put together on the PDP-11, and by the end of '71 the first release of Unix was made.

Bell Labs' parent, AT&T, was essentially a state-sanctioned monopoly in the telecoms space and was not allowed to profit commercially outside it, so, basically, they gave Unix away free to academic and government institutions. Given that the period also coincided with the birth of large-scale networking and the TCP/IP protocol, it's no coincidence that Unix became synonymous with the growth of the Internet.

Once Unix was ported to the PDP-11, Ritchie and the team set about getting a high-level language up and running on Unix. Thompson initially set out to develop a FORTRAN compiler, but during this development was influenced by earlier work at MIT on a language called BCPL, and what emerged became known simply as B. The goal of the language was to bridge the traditional high-level languages of FORTRAN and COBOL with the low-level systems capabilities of assembler.

Through a number of iterations B morphed into C, and the language we know today was pretty much complete by 1973.

Ritchie's work on C culminated in the classic text The C Programming Language, first published in 1978. I purchased a copy while (attempting) to teach myself C in 1984. In fact I still have that copy of the book.

Look at any computing device today, from a mobile phone (iPhone / Android) to the flight control computers on a UAV, and the operating system running it can be traced directly to Ritchie's pioneering work in the late 60s and early 70s.

In terms of the legacy of Ritchie's work on C, it's the basis of numerous modern programming languages in wide use today, from C#, Java and JavaScript, through to influencing scripting languages like Python, Ruby and Groovy.

Steve Jobs can certainly be credited with the turnaround of Apple and with bringing design and aesthetics to consumer technology, but it is Dennis Ritchie we should remember as providing the core foundations of computing today.

Dennis MacAlistair Ritchie, computer scientist, born 9 September 1941; died 12 October 2011.

Dennis Ritchie's Home Page on Bell Labs website.

Friday, February 4, 2011

Android Development Made Easy

I treated myself (and my family) to a Samsung Galaxy Tab for Christmas last year. It's a nice device; I prefer it over the iPad which, for me, is just too big to be portable and also lacks the full browsing experience.

To date I've only had hands-on with Android using friends' phones - I have a BlackBerry myself. Android's a great platform, with some cool applications and an expanding app store environment that's starting to rival Apple's. For really cool apps I recommend you get yourself a copy of Google Sky Map - absolutely amazing.

In my work, I've been exploring the application of mobile devices to Equipment Maintenance and Asset Management. I thought, seeing as I had a Galaxy Tab to hand, I'd take a look at prototyping some concepts on it.

So, I headed over to the Android Development Site and got the SDK, Eclipse plug-in and device emulator. For accurate Galaxy Tab emulation you need the Galaxy Tab AVD profile from Samsung's Mobile Innovator site.

Naturally, the native platform for Android development is Java and the SDK is very comprehensive. But, as you'd expect, it is quite heavyweight and there's a substantial amount of code needed just to get a basic application up and running. I guess this is okay if you're a full-time professional Android developer or you're really set up to dedicate huge amounts of time to the platform, but for me it was all too big a learning curve given all the projects I currently have on the go.

After a quick Google I came across Android Scripting, which used to be called ASE and is now SL4A - Scripting Layer for Android. SL4A has been around for a year, but it's not a core supported component of the SDK; rather, it's a Google Code project released under an Apache License.

And what a great project it is. Essentially SL4A is all about lowering the barrier to developing simple Android apps, supporting a number of common scripting languages such as JavaScript, Python and Lua. To install, simply scan the barcode (more on barcodes later) and, assuming you've got a Net connection, the core SL4A package will install. The basic SL4A runtime comes installed with only HTML and JavaScript interpreters.

In terms of settling on a scripting language, I chose Python. I've been doing a fair bit in Python recently on a collaboration with some work colleagues and I'm really liking it for its productivity. Also, the Python interpreter is written in C and compiled natively, so in a lot of cases it's just as quick as the JVM.

And productive it is. Just to show you how much you can do with very little code, take a look at the following:
import android

droid = android.Android()
# Fire up the barcode scanner and pull out the decoded value
barcode = int(droid.scanBarcode().result["extras"]["SCAN_RESULT"])
# Google Books ISBN search (the exact URL here is a guess; the
# original was lost from the post)
url = 'http://books.google.com/books?q=isbn:%d' % barcode
# Hand the URL to the browser via an Android VIEW intent
droid.startActivity('android.intent.action.VIEW', url)
As you can probably guess from reading the code, this little app invokes the Tab's camera-based barcode scanner, gets the barcode value, concatenates it with the Google Books URL and displays a web page for the book you've just scanned! To support the barcode API call you do need the ZXing Barcode Scanner app installed. Still, pretty cool though.

Wednesday, January 19, 2011

NASA Conference on Intelligent Data Understanding

In October last year I had the privilege of attending the NASA Conference on Intelligent Data Understanding (CIDU) at the Intelligent Systems Division, Ames Research Center, in Mountain View, CA.

CIDU is focused on applying Data Mining, Knowledge Discovery and Machine Learning techniques to a number of NASA-relevant domains, including aviation, earth sciences and astronomy. These core areas of NASA's mission all share a common problem: an exponential increase in the data generated, whether from the increased fidelity and resolution of next-generation telescopes such as the James Webb, or from high-resolution satellite data for near real-time earth coverage mapping.

The main focus of the conference was algorithm development: both improvements in the accuracy and computational efficiency of existing algorithms, and the development of specific algorithms to solve certain classes of problem. Some of the applications were, for me, quite spectacular - in particular the attempts to carry out real-time classification of astronomical events from data streaming out of digital sky surveys such as SDSS.

George Djorgovski, Co-Director for Advanced Computing Research at Caltech, gave a lecture on applying a number of advanced data mining techniques and algorithms to identify events such as supernovae. The really impressive aspect was the sheer scale of the data being processed - petabytes - and the fact that the goal was to be able to react to these events in real time, directing the right telescope and observation resources to investigate and gather detailed data.
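Nothing like the real pipelines in scale, of course, but the core idea - flagging a reading that deviates wildly from everything a stream has shown so far - can be sketched in a few lines of Python. The threshold and warm-up values here are invented purely for illustration:

```python
import math

def make_detector(threshold=5.0, warmup=20):
    """Return a closure that flags readings deviating strongly from
    the running statistics of the stream seen so far (a crude
    stand-in for real-time transient detection)."""
    n, mean, m2 = 0, 0.0, 0.0

    def observe(x):
        nonlocal n, mean, m2
        # Welford's online update: running mean / variance in O(1) space
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
        if n < warmup:
            return False  # not enough history to judge yet
        std = math.sqrt(m2 / (n - 1))
        return std > 0 and abs(x - mean) / std > threshold

    return observe

detect = make_detector()
stream = [10.0 + 0.1 * ((i * 7) % 5) for i in range(100)]  # quiet baseline
stream.append(50.0)                                        # a sudden "flare"
flags = [detect(x) for x in stream]                        # only the flare is flagged
```

The point is that the detector never stores the stream - exactly the property you need when the data arrives faster than you could ever archive and revisit it.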

Closer to my profession (and the reason I was there) were techniques to help drive diagnostics and prognostics in the aviation domain. Honeywell gave an interesting presentation on the work they're doing in collaboration with NASA on a next-generation Vehicle Level Reasoning System. GE presented their research on prognostics and anomaly detection for jet engines. The key difference between the earth and space science fields and systems engineering is that, in general, the systems engineering problems tend to require combined (fused) data-driven and model-based (physics) approaches, rather than data mining on its own.
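The fused idea can be shown in miniature: a physics-style model supplies the expected sensor reading, and the data-driven part is simply a check on the residuals. Everything below - the model, the parameter names, the numbers - is invented for illustration, not taken from the Honeywell or GE work:

```python
def expected_egt(throttle):
    """Toy 'physics' model: predicted exhaust gas temperature (deg C)
    for a throttle setting in 0..1. Entirely made up."""
    return 400.0 + 450.0 * throttle

def residual_anomalies(readings, tolerance=25.0):
    """Flag samples whose measurement strays from the model prediction
    by more than the tolerance: the model supplies the expectation,
    the data provides the test."""
    return [
        (t, measured, measured - expected_egt(throttle))
        for t, throttle, measured in readings
        if abs(measured - expected_egt(throttle)) > tolerance
    ]

# (time, throttle, measured EGT) - synthetic samples
flight_data = [
    (0, 0.50, 627.0),   # model expects 625.0 -> healthy
    (1, 0.75, 739.5),   # model expects 737.5 -> healthy
    (2, 0.75, 787.5),   # model expects 737.5 -> 50 deg hot, flagged
    (3, 0.25, 510.0),   # model expects 512.5 -> healthy
]
alerts = residual_anomalies(flight_data)
```

A pure data-mining approach would need to learn the throttle/temperature relationship from history; here the physics hands it to you, and the data only has to answer the much easier question "does reality match?".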

I guess to the casual observer these computing techniques may appear esoteric and only really relevant to high-end science and complex engineering systems, but I believe the exponential growth of data in the business and consumer spaces will require the application of these techniques to have any chance of "making sense" of it all. This is very much IBM's current viewpoint with their Smarter Planet theme.

The really exciting part of this conference, though, was the knowledge that, as esoteric as they may seem, these computational capabilities are available to everyone, open source, through Apache Mahout. Mahout provides a lot of the core algorithms that the researchers presenting at CIDU were improving and/or extending. With Mahout sitting on top of the Hadoop platform and being available on Amazon EC2, everyone has (reasonable) access to their very own NASA Ames supercomputing facility!
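To give a flavour of what's on offer, here's k-means clustering - one of the staple algorithms in Mahout's catalogue - written out in plain Python on a toy 1-D dataset. Mahout's value, of course, is running this kind of algorithm over Hadoop at enormous scale; this sketch is just the algorithm itself:

```python
import random

def kmeans(points, k, iterations=20, seed=42):
    """Textbook k-means on 1-D data: alternate assigning points to
    their nearest centroid and moving each centroid to its cluster mean."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # pick k points as starting centroids
    for _ in range(iterations):
        # Assignment step: each point joins its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step (keep the old centroid if a cluster emptied out)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

data = [1.0, 1.2, 0.8, 9.9, 10.1, 10.0]
centers = kmeans(data, 2)  # converges to roughly [1.0, 10.0]
```

Swap the toy list for billions of feature vectors and the same alternation of assign/update steps is what Mahout distributes as map and reduce jobs across a cluster.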

All in all a very fruitful event. Here's hoping I get to attend next year!