Monday, December 17, 2007

Why is Javascript almost Mandatory?

My Comment on Groklaw

I just posted this on Groklaw in response to PJ asking why JavaScript is necessary for the web. These are my thoughts on the matter; not right, not wrong, but mine.

JavaScript might not be mandatory in some parts of the internet, but it allows web pages to leap out of the bad GUI design of the previous generation.

Web sites characterised by page after page of detailed content, which are not changed from refresh to refresh, turn users off. The guts of an easy-to-use site is a FAST, responsive interface, which can be easily done with JavaScript (not impossible with other tools, and easy with Flash and other rich clients). There is a reason the early internet had such dismal 'stickiness': without a good interface, the content had to be sooooo much more attractive to keep the audience. Site owners might be able to mitigate responsiveness issues with hardware/software/bandwidth/content delivery networks, but cannot fix them without active code being run on the client.

NOTE: Geeklog (and blogs generally) allows you and the reader to optimise pages into very large 'chunks' of comments; the only disconnect comes when we feel the need to comment, as we lose the context of our comment when posting.

NOTE2: Successful sites almost universally have trivial-to-use GUIs with very little overhead, and that is not an accident.

Thursday, November 15, 2007

If Social Networking Sites *Really* Wanted to Interoperate

This is funny with all the hoo-hah around OpenSocial, Facebook, walled gardens and so on. O'Reilly really has great content, nay the best on the internet.
If Social Networking Sites *Really* Wanted to Interoperate

Tuesday, November 13, 2007

SOA vs Distributed Objects and performance

I don’t think this article goes into enough detail, but the author delves into why object systems (such as EJB) have required architectural additions to simplify and control their performance problems, i.e. EJB session beans to optimise and control access to EJB entity beans.

http://www.acmqueue.com/modules.php?name=Content&pa=showpage&pid=507

Also I think he is onto an important point: the design of a service oriented architecture is more suited to optimisation on a network, because it makes the network nature, and the trade-offs of a call over a network, intuitive (entirely my interpretation of his words).

And if the caller of a service is requesting data, then a large graph of data can be moved entirely into the caller's context for rapid use, managing the overhead and failure possibilities once rather than hundreds of times (literally). This is exactly why SQL over SQL*Net can be easily optimised for network traffic.
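To make the point concrete, here is a small sketch (the schema, table and column names are my invention, not from the article): one coarse-grained query moves a whole graph of order data across the network in a single round trip, instead of one fine-grained call per child row.

```shell
# Invented schema for illustration: one join pulls the whole order graph
# across the network at once, rather than N fine-grained calls.
SQL='SELECT o.order_id, o.placed_at, l.line_no, l.product_id, l.qty
FROM orders o
JOIN order_lines l ON l.order_id = o.order_id
WHERE o.customer_id = 42'

# One round trip over SQL*Net would look something like this
# (connection details omitted):
#   sqlplus -s scott@mydb <<EOF
#   $SQL;
#   EOF
echo "$SQL"
```

The same graph fetched object-by-object through fine-grained remote calls pays the network latency once per row, which is the performance problem session facades and coarse-grained services exist to avoid.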

When looking at a problem in a SOA way, it seems that you naturally evolve a more optimal way to use the network. I don’t have enough experience with SOA to know all the pitfalls of that orientation, though there will definitely be some: marshalling/demarshalling overhead, lack of data locking, brittleness of implementation in the face of change, cascading hidden service dependencies, and service versioning issues.

When I went to an IBM seminar on SOA, it was quite illuminating when they talked about a SOA ‘call’ with hundreds of parameter data elements. I actually thought it was rubbish to design a call with so many parameters, but I really don’t know enough to comment on the design of these systems. It might indeed be ‘good’(tm) to create such a call when the service is ‘state-free’, i.e. implements an algorithm that acts only on parameter-supplied data. But a service/interface with so many parameters is sure to be volatile, and that volatility could affect callers of the service.

What do other people think? Am I off the mark?

Monday, November 5, 2007

Celebrate because Movember is upon us

The month formerly known as November has just got a facelift.
Sponsor my Mo for Prostate Cancer
Movember donation.

Tuesday, October 9, 2007

Agile Indian vs Mainstreet Cowboy

I feel the need to distinguish the “Agile Indian” from the “Mainstreet Cowboy”.

Agile is an interesting and powerful set of methodologies, but I hate to see distinct terms collapsed into one, especially to the detriment of Agile. I also hate to compete inside my organisation against cowboy coding masquerading as Agile.

Let us start by distinguishing “Cowboy” coding.


  • Just leap into the project and start programming as fast as you can.
  • No thought to testing each function.
  • The emergent behaviours and corner cases in the product may not be considered up-front; only when they are proven to be a problem do they get addressed.
  • Whack-a-Mole style development: just hit the problem in front of you.

Then distinguishing “Agile” programming.

A set of methods that allow the following results.

Small / Quick / Short feedback loops:

Allowing any mistake or wrong direction to be quickly and cheaply rectified.

Techniques: continuous testing, client on site, fast iterations, small teams

Optimal process control:

Any process that is not required increases cost, reduces morale and increases risk.

Techniques: no quotes; stories rather than endless/perfect a-priori requirements; optimal metrics; one methodology per project.

I hope this makes sense, comments are encouraged.

Friday, September 21, 2007

Rant about an "Issue Aggregator"

When at BarCampWellingtonNZEGov, I had occasion to have a bit of a rant about an aggregator that was required for enhanced citizen participation in e-government.

I just saw Jason Ryan's post about Media Monitoring over at The Network of Public Sector Communicators, where he talked about the open source Buzzmonitor tool.
I was suggesting a news aggregator (similar to news.google.com) that could bring government/authoritative and private-space comments together to provide a common view of the various sides of an issue, ranging up and down the long tail to provide visibility: applying to the Transit NZ discussions in Auckland over roads, and equally to the liver-bellied spotted newt's wildlife sanctuary proposed for just west of the median barrier on highway one north of Foxton.

Buzzmonitor seems to be a very good start to that aggregator. I plan to have a play with it myself.

I have thought harder on the problem and see major problems in my idea, among them: the inherent bias of the 'controlling' entity; the difficulty of defining the issue accurately without misrepresentation; and the lack of enough visibility into authoritative government sites (my assumption; apologies to government webmasters out there).

I suppose they are all solvable, likely by a bit of magic "Wisdom of the Crowds" fairy dust, but by someone with more free time than me.

Wednesday, September 19, 2007

Agile: Test Infected, Living in a Test free world

I'm trying to think of what I could talk about for The Agile BarCamp currently scheduled for 1st December 2007 in Wellington.

Agile development is a lot of things to a lot of people but one of the key quotes I have heard on the subject is:

"XP matches observations of programmers in the wild"

Apparently the quote above was uttered by one of Kent Beck's converts (link).

I am not really a talker, or an Agile practitioner, but I am interested in and infected by 'testing'.

I got extremely interested in testing by reading the Test Infected article at the SourceForge JUnit site.

Ever since then I have wanted to work in a development team that used a suite of unit tests.

Wherever I have worked since then, I have continually re-invented bash scripts to test my code in a JUnit/xUnit manner.

The latest rewrite I remember was when I was using a series of programs to validate and transform xml files.
  • I used GNU sed (the stream editor) to transform some tricky characters,
  • I used tidy from the W3C to clean the XML,
  • I used Oracle's XSL program to check that the XML was valid,
  • and I used the Multi Schema Validator (MSV) from Sun to validate the XML against my XML Schema.

So, in order to test this unholy mashup, I wrote some candidate XML files:
  • some for fatal errors,
  • some for warnings,
  • some that validated OK,
  • some empty files,
  • some with invalid dates (MSV from Sun does not check that dates are valid ... 31st of February, anyone?),
  • XML files with DTDs,
  • XML schemas,
  • extra namespaces,
  • other valid dialects.

I think I used over 40 XML files in the end, all sorted into their various directories, with one shell script to test the lot and in the darkness bind them to a 'Test OK' or 'Test Failed:' message.
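A minimal sketch of what such a harness can look like (this is my reconstruction, not the original script; the validator here is a stub standing in for the real sed/tidy/MSV pipeline):

```shell
#!/bin/sh
# Reconstruction of an xUnit-style shell harness (not the original script).
# validate_one is a stub standing in for the real sed/tidy/MSV pipeline:
# it returns 0 for "valid" input and non-zero otherwise.
validate_one() {
  grep -q '<ok/>' "$1"
}

pass=0
fail=0
run_case() {  # run_case <file> <expected exit: 0=valid, 1=invalid>
  validate_one "$1"
  actual=$?
  if [ "$actual" -eq "$2" ]; then
    pass=$((pass + 1))
  else
    fail=$((fail + 1))
    echo "Test Failed: $1 (expected $2, got $actual)"
  fi
}

# Candidate files sorted into directories by expected outcome,
# as described in the post.
mkdir -p cases/valid cases/fatal
printf '<ok/>' > cases/valid/simple.xml
printf '<bad'  > cases/fatal/truncated.xml

for f in cases/valid/*.xml; do run_case "$f" 0; done
for f in cases/fatal/*.xml; do run_case "$f" 1; done

if [ "$fail" -eq 0 ]; then
  echo "Test OK ($pass cases)"
else
  echo "Test Failed: $fail of $((pass + fail)) cases"
fi
```

The directory-per-expected-outcome layout is the whole trick: adding a new regression test is just dropping a file in the right directory, no code change needed.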

This is satisfying stuff when I am programming on my lonesome.
But when I have to deliver it to others, it doesn't fit in the hand-crafted versioning system we use; no developer looks at it when they maintain the software; people complain it is too complex.

OK I admit the last is probably true ;-).

So some questions I'll be bringing to BarCampAgileWellington:
  • Is Agile possible in Government?
  • To what extent does Agile exist already in the 'necessary hacks' that are used to get around shortcomings in our non-agile methodologies?
  • What can we do to document these hacks and give them the name 'Agile'?
  • Agile, doesn't that mean slipshod, cowboys, broken and 'beta'?
  • How do you train people in Agile? Agile rigour?

Update:
I won't be able to make it now; a holiday interrupts :(.

Wednesday, September 5, 2007

Jane Goodall talk at TED

I just heard Dr Jane Goodall's talk at TED, discussing the great apes, man and development in the jungles of Africa, and finally her Roots and Shoots venture for kids.
I found her Institute's Gombe Chimp Blog, which is a lovely geo-located blog (Google Earth, Maps and GeoRSS). Unfortunately the automatically updating Google Earth feed wasn't valid, but it can be loaded into Google Earth using the KML link here. Each blog post has its individual link to content to be viewed in Google Earth.

Dan Dennett talk on Memes

Here is an interesting Video of a talk by Dan Dennett about the care and feeding of memes at TED.

Wednesday, August 22, 2007

Reply to: Open XML Crunch Time

Rod Drury Blogged today on Office Open XML

My Answer to Rod on his blog:

Rod respectfully, I suspect that you have been talking to far too many people who have already hitched themselves to the Microsoft horse.

I work in Government and there is serious concern throughout government that this standard will perpetuate monopoly and achieve nothing else, except get around some mandating of standards in Government purchasing.

Microsoft has documented OOXML, great.
OOXML has different use-cases to ODF, great.
OOXML preserves legacy document formatting, great. Let's hope that Microsoft Office will actually display it correctly across versions; it often fails.

But why do we need to standardise something that is looking backwards, not forwards? Standards are designed and discussed in order to serve the requirements of the future, and this standard wasn't designed to address the future and is badly suited to it. Why do we have to allow a standard that has massive holes in internationalisation, documentation, flexibility and implementability?

Ask your Microsoft friends to please throw out this insulting 'promise not to sue' and actually give us a licence (irrevocable, perpetual, worldwide, sublicensable). We don't even have the right to amend the documentation for the standard to fix the gaping holes.

The standard will go through even with my vote of "No, with comments", because Microsoft and the ISO working group will fix many of the issues and re-submit it. This isn't about not having a standard; it is about having a good one.

Andrew.

Additionally, it is named (or perhaps misnamed) Office Open XML, which is confusing, unpronounceable and misleading alongside OpenOffice.org (OOo). The earlier name, "MS Office Open XML", was much clearer and more accurate.

Monday, August 20, 2007

OECD seems to favour Open Statistical Data

Don't you just read your own interests into everything you read on the web?

When browsing the BarCamp Wellington Google Group, I saw a reference to the second OECD World Forum on Statistics.
Their Istanbul Declaration has a paragraph that resonates with my goal of Open Government Data.

A culture of evidence-based decision making has to be promoted at all levels, to increase the welfare of societies. And in the “information age,” welfare depends in part on transparent and accountable public policy making. The availability of statistical indicators of economic, social, and environmental outcomes and their dissemination to citizens can contribute to promoting good governance and the improvement of democratic processes. It can strengthen citizens’ capacity to influence the goals of the societies they live in through debate and consensus building, and increase the accountability of public policies.


Information can be locked into silos in government departments and limited to a privileged few who know about and pay for the data.

Alternatively, information can be provided free or at low cost, which allows it to be pushed, prodded, bent, analysed and re-mixed, taking a thousand paths towards creating real value for the citizen / global citizen.

Hans Rosling mentioned at Govis that the cost of gathering statistical data far outweighs the maximum amount the government could charge for it.

I am personally aware of the value of having geo-data right to hand (inside my organisation) without having to write a business case or justification to get it. The benefits of having open geo-data in New Zealand far outweigh the costs of supplying the data free.

I note that we almost have free geo-data from Land Information New Zealand. I have been tempted to get a licence ($270 per quarter) to the LINZ data and see how cheaply I could on-sell/licence it. It would go well with an open source project to create transformation scripts to load and integrate the standard LINZ data into open source GIS programs (and Google Earth too).

I am convinced of the value of the above to the New Zealand public. Maybe we can start to make it happen from BarCamp. I hear in the distance the loud sucking sound of my free time being vacuumed up.

Bar Camp Wellington NZ E-Government




Saturday, 15th September 2007



This BarCamp is about making a difference to e-government in New Zealand. We are a small country with a very well connected, vibrant web community. Government 2.0 can happen here!


3 Queens Wharf (map) | mailing list





The rules of BarCamp (as listed by barcamp.org) are:

  • 1st Rule: You do talk about Bar Camp.
  • 2nd Rule: You do blog about Bar Camp.
  • 3rd Rule: If you want to present, you must write your topic and name in a presentation slot.
  • 4th Rule: Only three word intros.
  • 5th Rule: As many presentations at a time as facilities allow for.
  • 6th Rule: No pre-scheduled presentations, no tourists.
  • 7th Rule: Presentations will go on as long as they have to or until they run into another presentation slot.
  • 8th Rule: If this is your first time at BarCamp, you HAVE to present. (Ok, you don't really HAVE to, but try to find someone to present with, or at least ask questions and be an interactive participant.)

Friday, August 3, 2007

Office Open XML

What can I add to the debate?

I have listened to Groklaw, Rob Weir, Andy Updegrove, Bob Sutor and others on the internet for a while, and I finally find myself asked for my position on the subject, or more accurately for my input into the NZ Government's opinion on the subject.

What additional point of view can I bring? Which hitherto unexplored insight can I highlight in the debate that will make a difference?

I don't know, but I intend to find out, wish me luck.

To those who don't already know, I am talking about Microsoft's proposal that the ECMA and ISO organisations rubber-stamp its new XML data format as a standard.

Tuesday, July 17, 2007

Technical Debt, the real costs of corner cutting

Found on Martin Fowler's Bliki yesterday:

http://martinfowler.com/bliki/TechnicalDebt.html

Technical Debt is a term invented by Ward Cunningham to describe the effects of code that is badly designed for its current circumstances (my paraphrasing), for whatever reason.

In our case this debt is incurred by not re-working our software for new situations or platforms. The system as a whole isn't bad, but we have a few cranky old parts that need a bit of TLC (Tender Loving Care). Whenever we need to look at those screens, we see how much work it would be to rewrite them the way we all know we want to; but every time a project affects them we 'um' and 'ah' and finally decide that the project can't afford the rewrite, or schedules don't allow for it, or there is no testing resource for it.

Monday, July 16, 2007

Forever minus a day Copyright, splutter splutter!

I came across a breath of fresh air today, courtesy of Groklaw's latest news picks.

Rufus Pollock has written a paper, Forever Minus a Day? Some Theory and Empirics of Optimal Copyright, analysing the optimal level of copyright in the current technological environment to balance the financial incentives of the author/publisher against the public good.

This gives a bit more ammunition to offset the power of the entertainment lobby, which constrains culture with its need to control content mercilessly.

This control of culture and, more broadly, our physical lives by corporations is one of the worst evils in the world today; not that I can do anything about it, of course ;)

The corporations are somewhat forced into it by the imperative to maximise their profits at the expense of culture, environment, customers, employees, ethics and government. It is a fundamental weakness of the checks and balances in our capitalist/democratic/consumerist society.

More later.

Tuesday, July 10, 2007

Google Earth Integration Story: The Data

OK, so I had the GIS locations of some of my company's properties. I thought they were latitude and longitude, but I was wrong.
I could program in Java, and I could extract the data into the KML XML structure; that was to be my first project.
I created the KML OK, but the x and y co-ordinates didn't work.
After a lot of fluffing around the internet, I realised that the projection was wrong: I was using NZMG projections, not WGS84 latitude and longitude.
It was a world of hurt to convert, so I finally imported the data into PostGIS and converted it using the standard PostGIS functions.
I now know that Oracle has the Locator feature, which would be an easy way to do this complex conversion, but PostGIS was the first mechanism I found to do it myself.
So, I succeeded in getting all the properties into a Postgres database so I could use Java to extract the KML; hooray!
I had placemarks all over my Google Earth, although it was a bit slow with so many in the one file.
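For anyone wanting the recipe: the reprojection step can be done with one PostGIS expression. A sketch, assuming a `properties` table with NZMG easting/northing columns (the table and column names are my invention; EPSG:27200 is NZMG, EPSG:4326 is the WGS84 lat/long that KML expects):

```shell
# Table/column names are invented for illustration. EPSG:27200 = NZMG
# (New Zealand Map Grid), EPSG:4326 = WGS84 lat/long as used by KML.
SQL='SELECT id,
  ST_X(ST_Transform(ST_SetSRID(ST_MakePoint(easting, northing), 27200), 4326)) AS lon,
  ST_Y(ST_Transform(ST_SetSRID(ST_MakePoint(easting, northing), 27200), 4326)) AS lat
FROM properties'

# Run it against the PostGIS database when available, e.g.:
#   psql properties_db -c "$SQL"
echo "$SQL"
```

ST_SetSRID tags the raw x/y pair with its source projection, and ST_Transform does the actual datum/projection conversion; note that KML coordinates are written longitude first.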

Google Earth Integration Story: The Vision

It all started with a decision I made.
I was reading about all these cool technologies but not really understanding them at a level I was satisfied with; I was never actually using them to solve real problems.
I needed to get my feet wet in a non-trivial way and actually make something.
Then I saw Google Earth for the first time, and I was smitten.
I loved how the earth looked, and the awesome level of technical talent put into making it all work smoothly.
So, I decided to try to put some of the data at my disposal into Google Earth. This was in November 2005.

Thursday, July 5, 2007

Optimisation, afterthought or not

It is funny how placing your ideas on an open stage makes one re-consider ones position.

I mentioned in this post on How Google Earth Really Works about how optimisation at my workplace is an afterthought.

Ahhhh, no!

I started to realise that optimisation of our applications has such a solid foundation that it 'blends into the woodwork' of my daily existence. It was enlightening looking at another problem domain's optimisations.

We develop business software in Oracle, SQL, PL/SQL, Forms and Java with all sorts of other things like Perl, PHP and Shell scripting, even some old PRO*C code.

So what has blended into the background:

  • SQL statements over networks are coarse-grained calls, which work well.
  • Indexes are always used for primary keys and foreign keys.
  • Oracle Forms uses those indexes and has reasonable optimisation defaults.
  • We tend to design using denormalised large tables; normalisation can have a drastic effect on performance, even on today's crazy supercomputers.
  • Probably a lot more that I can't see for the trees at the moment.


p.s. My spellchecker makes me aware that I am not writing US English, I am writing in UK/NZ English. For those who don't know, the words above do not end in 'ize' :).

The Tyranny of Subscription Living

In the beginning we lived from the land, with calluses and broken bodies to prove it.
These days companies are trying to nickel-and-dime us to death with the rise of the tyrannical subscription model. You pay $29/month for telecom service, $1 per day for your power service, $78 per month for internet service; you pay the same way for mobile phones and cable TV, and software is also being converted to subscriptions.

This is a terrible thing to do to the poor citizen-consumers of the world: to tie us to products that we cannot purchase outright and must keep on paying for till we die, and probably beyond.

Besides the raw cost of living in such a model which is demonstrably more expensive than purchase living, there is the brittleness of living with dependencies on all these services. This brittleness is hidden until there is a disaster, but it should be uncovered and removed if possible.

A great example here in New Zealand (and elsewhere) is the preferred way to create a photovoltaic/solar-powered house. The preferred method is to hook the house to the grid and use the grid as a big battery, thus perpetuating the dependency cycle. For an extra investment, the home owner could use batteries to disconnect entirely from the grid and eliminate its per-diem cost.

Similarly, energy-efficient hybrid cars tie the consumer to buying petrol, lacking the built-in ability to re-charge overnight. This is a slightly larger issue, as it is a cross-industry effort to keep consumers sucking at the petrol teat. If an equivalent teat could be constructed, the corporations would jump on it.

One of the underlying issues seems to be the lack of a democratic force in the absence of a corporation buying the politicians' favour. Just as with open source software, very few people are willing to go in to bat in the political arena for the right solutions if there isn't a money spigot attached. I see the problem as world-wide, and it is a fundamental weakness in the political systems of democratic countries.

The Matrix says it best: You are just a battery for the soulless machines.

Wednesday, July 4, 2007

How Google Earth Really Works

One of the key original developers of Google Earth (nee Keyhole) explains how Google Earth works.

A fascinating romp through the ins and outs of one of my favorite toys.

How Google Earth [Really] Works, on realityprime.com

It is funny how as a developer I sometimes consider optimisation as an afterthought and unimportant, but deep thinking about it can develop radically new ways of looking at the world.

I will post in the future about some of my projects integrating dynamic KML into Google Earth, shout out to Andrew Hallam at Digital Earth for giving me a serious leg-up for these projects.

Saturday, June 16, 2007

Statistics NZ Opens its Data, Hurrah!!!!

The Free Our Data blog asks why Statistics NZ opened its data; this blog may have a hint. It was possibly due to Hans Rosling at Govis 2007.


This is great news: the availability of this data will enable many more innovative uses of this information about ourselves, and also enable international mashups like the ones that Hans Rosling does with Gapminder.org.


Thank you, Statistics NZ, on behalf of all New Zealanders, and of course the Statistics Minister, Clayton Cosgrove.

Friday, May 25, 2007

Hans Rosling at Govis 2007

Hans Rosling gave a great talk to close Govis 2007. He brought statistics to life with amazing Flash graphs that compare worldwide statistical indices.

Doesn’t sound interesting?

Go to the web-site and hit the ‘play’ button to play the statistics through time, and see what happens to the various countries. Change the measures on the axis. Have a play around, you’ll be surprised.

It is a great tool for getting statistical truth into any discussion, rather than believing the ‘top-of-your-head’ claims that we all make. At work we have been googling for the truth, which keeps us much more real, and this resource can do the same for statistics and the relationships between them.

Statistics are used to befuddle and mislead much more often than they are used to inform and lead to insightful conversations; this resource is a very important step in the right direction.

Apparently Google is supporting the work of Gapminder now, and taking the technology forward. Expect more from this crowd in the future.



It is interesting that Hans is leading a quest to make worldwide statistical data available for free, which would allow many more innovative uses of the data. He pointed out that the charges various government organisations levy right now recover only a small percentage of the cost of acquiring the statistics, and are very likely stifling innovative and socially desirable uses of the data. To this end, he apparently talked to Statistics New Zealand when he was here and encouraged them to open their data.
I am aware of the drive internationally to open up access to Geographical Data around the world, and exactly the same issues apply.
If anyone from Statistics New Zealand reads this: please, please open up the data, and allow us NZ citizens free access to our own data.

Wednesday, May 23, 2007

Tech Futures - John Smart

I listened to a podcast today from IT conversations.
It was John Smart from the 2005 Accelerating Change conference, How to be a Tech Futurist.
He painted a very optimistic future in which the exponential growth of some very small areas of technology (computing, nanotech, nanotube ribbon, etc.) is likely to pull us out of the problems we find ourselves in now (climate change, energy overuse and so on).

It is a heady vision that I would like to believe; a software developer always has an eye for the next great thing, a bigger, better, more complicated way of putting technology together ;).

I usually tend towards the cynical, morose, negative end of the spectrum after listening to a lot of material on Global Warming, Kyoto protocol, Renewable Energy, Oil Peaking, World Without Oil etc.

It is quite a change to get a dose of positivity. I like the change in focus, but I think more psychic and pocketbook pain needs to be felt globally to get us moving on renewable/sustainable energy. I look at the oil shocks of the seventies, which I only dimly remember, and the lack of actual improvement arising from the pain felt back then, and I despair.

A negative world view can motivate us to more action; or so I say as I sit on the fence, looking at energy improvements that are possible in my home while not taking action.

Tuesday, May 22, 2007

Forgive me Father for I have Sinned

It has been 37 years since my last confession.

I'll be posting regularly to this blog, at least until I run out of things to say.

To connect, to train and improve myself.

To tame the voracious Internet beast within.