Thursday 24 December 2009

What did I learn in 2009?


As another year draws to an end, I thought I would pass on some of the lessons I've learned or had reinforced to me this year. Naturally, there's not room to go into every one of them (I've probably forgotten some of them anyway), so I'll list the important ones.

These are listed in no particular order.

  • The importance of your family and friends - While I have always valued my family and friends, it is still worth making special mention of them here. As with everyone, I have experienced my fair share of low points this year, and it is thanks to my circle of family and friends that I have got through them. They have given me non-judgemental, non-partisan advice, a listening ear, or a shoulder to cry on. I would not wish to embarrass them by mentioning their names in this post, but you know who you all are!
  • You cannot please everyone all of the time - In today's busy world with its many demands on my time and energy, I have learned to become more philosophical about how I use that time and energy. I would much rather focus what limited resources I have on doing something that will actually achieve a result, whether for myself or for someone else. I would much rather spend my time on those things in my life that will be appreciated by others than on those that will simply go unnoticed and/or not be appreciated.
  • I can't do any more than give something my best shot - You will not always achieve what you set out to achieve. I do not necessarily see that as a failure. However far you get, you will have achieved something, and hopefully learned something along the way. Failure is when you do not meet your objectives and learn nothing from the experience! Give everything your best shot, but don't beat yourself up if you don't meet your objectives.
  • Stay positive - Easier said than done, I admit, but staying positive really is important. Something positive can be found in practically every experience; even a negative experience can yield a valuable lesson. I try to remain positive in everything I do, and to be a positive influence on those around me. Having a positive attitude to life affects everything I do. Do I want to bring happiness to a room when I enter it, or when I leave it?

This will be my last post of 2009. I hope you have all enjoyed reading my posts, and will continue to read them in 2010.

A big thank you to you all!

Friday 18 December 2009

Merry Christmas everyone!


As we head towards Christmas and New Year, I would like to take the opportunity to say a heartfelt Merry Christmas and Happy New Year to you all!

I would like to thank all those people who have followed my blog, read its content, commented on posts, or been the inspiration for some of them!

To all of you, have a very Merry Christmas and Happy New Year!

Thursday 10 December 2009

City Bankers angry at tax on their bonuses


Alistair Darling has announced in his pre-budget report that City bankers are to face a one-off tax on their bonuses. As we have all come to expect from this shameless bunch of spivs, they are all up in arms and threatening to leave the country.

Not satisfied with causing a global meltdown through their unrestrained greed, their sense of entitlement now dictates that they should still receive obscene bonuses. While ordinary people are facing job losses, the loss of their homes and money worries, bankers are more worried about losing out on their bonuses.

I have no doubt that bankers perform a valuable service to this country, and that they generate a lot of business and commerce through their activities. However, that does not grant them immunity from their responsibility for rectifying the mess they have collectively caused.

To threaten to leave the country, and take their services elsewhere is tantamount to blackmail. When you have done something wrong, most people with any sense of integrity and decency try to correct it. Most people would not demand to be paid vast sums of money for making a gargantuan mistake, and then threaten to emigrate when those vast sums are not forthcoming.

As someone who works in local government, I have seen first-hand how the recession has affected the delivery of services to local communities. Every single local authority has had to face up to the financial crisis, and plan how it aims to keep delivering the same level of services with less money.

But despite the pain and suffering that the bankers have inflicted on the rest of us, they still seem to think they are entitled to their obscene bonuses.

To those who wish to leave the country, please hurry up and go. If you lack the integrity, decency and responsibility to clear up the mess you have made with your unrestrained greed, then quite frankly, the country is far better off without you!

Monday 7 December 2009

What use is a general purpose manager?

A while ago, I wrote an article about what makes a good manager. One of the assets that I thought a good manager should possess is domain knowledge. They should understand (even if only in broad terms) what the team does. A full understanding and appreciation of what your team does is important.

I would therefore fully expect a Finance manager to have a good understanding of finance, a Human Resources manager to have a good understanding of Human Resources, and so on. It seems pretty obvious, but there seem to be managers who do not possess this domain knowledge.

Being able to monitor your team's finances, manage their appraisals and carry out other such managerial duties is the bare minimum I would expect from my manager. All managers should possess these skills. Above and beyond these basic managerial skills though, I would expect them to have a sound and broad knowledge of their particular domain.

As the manager of a Development Team, I have a broad and deep understanding of technology. I work hard to keep myself abreast of new and emerging technologies. I fully understand what my team does, what we are delivering, how it can be delivered and what is possible.

More importantly, I am capable of making decisions. This is one of the vital duties of a manager. Fully understanding what my team does, and taking an active part in its day-to-day work, gives me a deep understanding of what we do, which allows me to make decisions on the team's behalf.

I am very much a hands-on manager, and get involved in much of the day-to-day work, because this is the work that I enjoy the most. As a manager, I also have to undertake all the other managerial duties as well though.

I genuinely fail to see how you can make decisions without understanding what your team does. I could of course ask the team, and I would expect any manager to do this anyway. However, the final decision would need to be made by the manager, as this is ultimately their responsibility, and is what they are being paid for.

If they can't make those decisions, then why are they in charge?

Tuesday 1 December 2009

How Linked Data can improve web search



What is Linked Data?
Whilst at the recent Jadu Experience Day, the keynote speech included an emphasis on the concept of what is known as Linked Data. This is a method of making data visible, allowing it to be shared, and connecting to it using the web.

The idea of Linked Data was first described by Sir Tim Berners-Lee in his description of the Semantic Web. There are four key principles which need to be met for Linked Data to happen:

  • Assign every resource on the web a unique identifier (called a URI – Uniform Resource Identifier)
  • Use web based (HTTP – HyperText Transfer Protocol) URIs so that these resources can be referred to and looked up on the Internet
  • Provide useful information (a structured description or metadata) about the resource so that people finding the resource know what it is
  • Include links to other, related URIs to improve discovery of other related information on the Web

So in essence, all resources on the web should be uniquely identified (via a URI), using the web as the protocol for looking them up (HTTP). The resources should contain descriptive information about themselves (metadata) so that people can tell what each resource is and what it can be used for. Finally, they should contain links to other related resources that may also be of use, so every resource can help you find further, related resources. This is similar to the ‘Related Information’ or ‘See Also’ links that you frequently find in web pages.

Extracted from Linked Data
“Linked Data is about using the Web to connect related data that wasn’t previously linked, or using the Web to lower the barriers to linking data using other methods. More specifically, Wikipedia defined Linked Data as a term used to describe a recommended practice for exposing, sharing and connecting pieces of data, information, knowledge on the Semantic Web using URIs and RDF (Resource Description Framework).”
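
To make those principles a little more concrete, here is a minimal sketch of how a single resource might be described and linked, written in Python using the rdflib library. The data.example.gov URIs, the property choices and the crime statistics dataset are purely hypothetical, invented only to illustrate the four principles above.

    # A minimal Linked Data sketch using Python and rdflib.
    # The data.example.gov URIs and the dataset itself are hypothetical,
    # invented purely to illustrate the four principles described above.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import DCTERMS, RDFS

    EX = Namespace("http://data.example.gov/crime/")    # hypothetical namespace

    g = Graph()
    dataset = URIRef(EX["uk-crime-rates-2008"])          # 1 & 2: a unique, HTTP-based URI

    # 3: useful, structured metadata describing what the resource is
    g.add((dataset, RDFS.label, Literal("UK recorded crime rates, 2008")))
    g.add((dataset, DCTERMS.publisher, Literal("Example Government Department")))

    # 4: links to other, related resources to aid discovery
    g.add((dataset, RDFS.seeAlso, URIRef(EX["uk-crime-rates-2007"])))
    g.add((dataset, RDFS.seeAlso, URIRef("http://dbpedia.org/resource/Crime_statistics")))

    # Serialise the description so it can be published on the web and looked up over HTTP
    print(g.serialize(format="turtle"))

Publishing descriptions like this alongside the raw data is what turns isolated datasets into a web of linked, discoverable resources.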

For any of this to work, the web needs to have as much data exposed to it as possible. Every single organisation, business, research laboratory, government, school, hospital, university and so on has data. With the exception of personal data, all data should be available to the web. Once it is on the web, then it can become Linked Data.

To repeat the mantra first uttered by Sir Tim Berners-Lee – we want raw data now!

For a scientist working on a cure for cancer, the more data you have at your disposal, the better. You can cross-check results from other studies, and link the data together to form a more complete picture. As each resource provides links to other related resources, all manner of discoveries become possible, including some that were never originally anticipated.



Intelligent searching
It is not difficult to see how Linked Data is critical to search discovery, and will allow search engines to become much more intelligent. If all data is available and exposed on the web, if all resources are identified and described, and if all resources provide links to other, related resources, then suddenly your search engine becomes much more powerful.

Some types of question simply cannot be answered by current search engines. Where the question is simple and easily framed, current search engines can return relevant results. But where the question is more complex, or not so easy to frame, they may return irrelevant results.

Search engines use complex algorithms to return results, but one key mechanism they all use is trying to match the specified search terms to the content (which will also include metadata). Where the combination of search terms yields a significant result, a match is returned. This is simple word matching. I am sure we have all tried in vain to find a useful result for a question that does not fit that mould.

For example, try to find meaningful results for the question "what were the crime rate fluctuations in the UK from 2000 to the present", and current search engines will struggle to return anything useful, as the question, while easily stated in English, is not easily framed for a search engine to work with.

If government data on crime rates is available to the search engine, and this data is linked to other related data, such as crime patterns and statistics, as well as the raw data itself, then the question becomes much easier to answer, and search engines become far more able to return meaningful results.
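
As a rough sketch of what this looks like in practice, the crime rate question can be expressed as a structured query (in SPARQL, the query language of the Semantic Web) rather than a bag of keywords. The example below, again in Python with rdflib, uses a tiny in-memory graph with an invented ex: vocabulary and made-up figures, purely for illustration.

    # Querying linked crime data with SPARQL via rdflib.
    # The ex: vocabulary and the figures below are hypothetical, for illustration only.
    from rdflib import Graph

    TURTLE = """
    @prefix ex: <http://data.example.gov/crime/> .
    ex:obs2000 ex:year 2000 ; ex:recordedCrimeRate 88.2 .
    ex:obs2001 ex:year 2001 ; ex:recordedCrimeRate 90.1 .
    ex:obs2002 ex:year 2002 ; ex:recordedCrimeRate 86.7 .
    """

    g = Graph()
    g.parse(data=TURTLE, format="turtle")

    # "How have crime rates fluctuated since 2000?" becomes a precise, structured query.
    results = g.query(
        """
        PREFIX ex: <http://data.example.gov/crime/>
        SELECT ?year ?rate
        WHERE {
            ?obs ex:year ?year ;
                 ex:recordedCrimeRate ?rate .
            FILTER (?year >= 2000)
        }
        ORDER BY ?year
        """
    )

    for year, rate in results:
        print(f"{year}: {rate}")

The point is not the particular library, but that a question which is awkward for keyword matching becomes trivial once the data is structured and linked.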

All kinds of questions become possible to answer with Linked Data. It empowers people. Search engines become much more intelligent, and capable of answering even the most complex of questions.

We want raw data now!

Thursday 26 November 2009

Some considerations with Open Source


What is Open Source?
Open Source is the practice within software development whereby access to the source code is granted. This allows a software developer to take the original source code and modify it for their own specific requirements, or to extend the original application to perform some function that it did not originally perform. In keeping with the ethos of Open Source, all such changes and modifications must also be made openly available for other software developers to use.

So the Open Source movement is all about the sharing of ideas and code, by and for software developers. This is most certainly a good thing. The entire LAMP (Linux, Apache, MySQL, PHP) software bundle is based around Open Source, and this is a very widely used development platform.

Another principle of the Open Source movement is that software should be either free, or incur only minimal costs. Open Source is the antithesis of for-profit software, and profiteering would be seen as breaking the spirit of the movement.

Open Source is driven by the ethos of freely sharing your application code and resources amongst others within the software development community.

A work colleague recently asked me why, when selecting a Content Management System for my place of work, I had not considered looking for an Open Source solution.

Given the many benefits of Open Source already described, it might seem a lack of due diligence not to have considered one in the selection process. While I most certainly have a great deal of time and respect for Open Source, it just doesn't always provide a solution that perfectly fits every requirement.

While software bundles such as LAMP (see above) are in common use, and have large communities surrounding them that can provide support, assistance, patches and so on, this does not exist for every Open Source product or application.

Risk and accountability
Using LAMP in a commercial environment does not pose any serious risk. It is very well supported within the Open Source community, so while there may be no formal Service Level Agreements in place with a specified supplier, the community is so large, with so many developers and users, that the risk is heavily mitigated.

Open Source applications such as Drupal and Joomla are both excellent examples of Open Source Content Management Systems. They are also both well supported by their respective development communities.

However, a web site represents a substantial investment (and by extension an asset) for any organisation. It will be used to store the organisation's content, documents, images and information. These are things you probably want to ensure are well supported and looked after. With an Open Source system there is little comeback if it fails in some way, so this poses a more serious risk to the organisation. The lack of accountability is the key drawback.

In a commercial setting (public or private sector), web site downtime is critical. So are the threats from the various types of attack that a web site can be subject to including Denial of Service (DoS), spamming and cross-site scripting (XSS) to name a few.

I would be reluctant to place my investment, costing tens of thousands of pounds, in the hands of a community of unaccountable people. This is not to say that the community would not help in my hour of need, but the key point is that they are under absolutely no obligation - financial, moral or otherwise - to do so.

With a non Open Source solution, there is accountability. If I need support, I can pick up the telephone and speak to their support team. If the system fails in some way, there is accountability that entitles me to certain contractual and legal rights, including compensation for any inconvenience that may have been caused.

Conclusion
Risk appetite is a very personal thing. For some, the risk I have outlined may be tolerable, especially bearing in mind the tremendous benefits that come with an Open Source solution. For others, the risk is just too great to accept.

Open Source Content Management Systems certainly have their place, and the examples I have given above clearly show just how professional and feature rich they can be. When considering any solution though, you need to have full access to the facts to make an informed decision, and it is worth pointing out that despite the many benefits of Open Source, it is not a magic bullet solution, and has its own drawbacks that must be carefully considered.

Friday 20 November 2009

Using Landing Pages in a web site

What are Landing Pages?
To be clear from the outset, I am not referring to advertising Landing Pages used in pay-per-click ad campaigns. Instead, I am referring to the lower level Home Page variety of Landing Page. This is a Home Page that sits beneath the main web site Home Page in the overall hierarchy or structure of the web site. To differentiate between a web site's root (or top level) Home Page, and a lower level Home Page, these lower level pages are typically referred to as Landing Pages.

Every web site has a home page, or root page. This is the page that loads when you first navigate to the web site without specifying any other criteria, and without arriving there from a search engine result (which may take you to some other lower level page on the site relating to your search).

In the hierarchical structure of a web site, a Landing Page would be positioned beneath the main Home Page. Its purpose is to form a portal for related information, so that all content related to a particular subject or item is grouped together into a suite of pages which collectively form a Landing Page.

The Landing Page does not in itself necessarily need to contain all the information related to the subject or item, but it should at least provide the portal from where all information that relates to the subject or item can be found. So a Landing Page may be composed of content, links, downloads, images, FAQs and so on. A Landing Page should provide a one-stop shop for its chosen subject or item.

Using Landing Pages is different to the document-centric approach, where the web site is built from many related, but separate, web pages. Instead, all related information is grouped together under a single Landing Page, so it is obvious that the content is related, and it is easier to signpost and navigate.

So just to quickly recap, a web site will have only one Home Page, and one or more Landing Pages.

Why use Landing Pages?
As should be clear from their description, Landing Pages provide a one-stop shop for related information, and therefore allow a user easy navigation and signposting for searching and finding the information they are looking for.

If related information is instead spread over various parts of the web site, with no clear signposting that it is in fact related, the user will quickly become confused and give up trying to find what they were looking for. If your web site is a web shop, and the user has left without making a purchase because they couldn't find what they were looking for, then that spells disaster!

Designing a web site that uses Landing Pages
The web site design will largely dictate what information gets grouped into the Landing Pages, so it's well worth taking the time and effort to think about how you want your web site's content to be structured, what information should appear on the Landing Pages, and how the Landing Pages are signposted to get the maximum traffic.

I find that mapping the web site structure out first, using a hierarchical or family tree type diagram works well. Each Landing Page should be represented, and all links between them should be clearly displayed.
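
To illustrate the kind of map I mean, here is a small sketch in Python of a site hierarchy with one Home Page and a handful of Landing Pages beneath it. The page names are hypothetical examples (I have borrowed local government style services, which, as discussed below, suit the Landing Page approach particularly well).

    # A rough sketch of a site hierarchy: one Home Page, with Landing Pages beneath it.
    # The page names are hypothetical examples; each Landing Page groups together all
    # the content, links, downloads and FAQs for its subject.
    site = {
        "Home": {
            "Waste and Recycling": ["Bin collection dates", "Recycling centres", "Bulky waste FAQs"],
            "Councillors": ["Find your councillor", "Meeting minutes", "Councillor surgeries"],
            "Planning": ["Submit an application", "Search applications", "Planning policy documents"],
        }
    }

    def print_tree(node, indent=0):
        """Print the hierarchy as a simple family-tree style diagram."""
        if isinstance(node, dict):
            for name, children in node.items():
                print("  " * indent + name)
                print_tree(children, indent + 1)
        else:  # a list of leaf pages beneath a Landing Page
            for page in node:
                print("  " * indent + page)

    print_tree(site)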

Using Landing Pages in Local Government web sites
Local government web sites provide a natural fit for a Landing Page approach. They use structured taxonomies such as the LGNL (Local Government Navigation List), in which the services they deliver are broken down hierarchically into a service-related structure, so each service can clearly have its own Landing Page. For example, you could have Landing Pages for Waste and Recycling, Councillors, Planning and so on. While a document-centric approach would still work, a Landing Page based approach is perhaps a better fit.

Summary
So when designing a web site, it's worth considering how you want it to be structured, how you intend to group information together, and how it should be linked for easier signposting and navigation.

Wednesday 18 November 2009

Reasons for using Design Patterns

What is a Design Pattern?
Following on from my previous post about Why design is critical to software development, I would like to tackle a slightly more advanced aspect of software design: Design Patterns. As with my previous post, the idea for this post came about during a discussion concerning the merits of software design. The other party to the discussion was of the opinion that Design Patterns are too time-consuming to be of use within the field of commercial software development. My intention here is to demonstrate why I believe that to be wrong.

I will not go into any details about the mechanics or implementation of any particular Design Pattern. There are many excellent sources for these available elsewhere.

So getting started then, what exactly is a Design Pattern? Here are a couple of definitions for the term:

Extracted from Wikipedia:
"A design pattern in architecture and computer science is a formal way of documenting a solution to a design problem in a particular field of expertise. "

Extracted from Data & Object Factory:
"Design patterns are recurring solutions to software design problems you find again and again in real-world application development. Patterns are about design and interaction of objects, as well as providing a communication platform concerning elegant, reusable solutions to commonly encountered programming challenges. "

Extracted from Data & Object Factory:
"The Gang of Four (GoF) patterns are generally considered the foundation for all other patterns. They are categorized in three groups: Creational, Structural, and Behavioral."

So a Design Pattern is a general purpose, abstract solution to a recurring problem, which can be tailored to a specific situation. As software developers tend to solve many similar problems, it makes sense that any software solution would incorporate similar elements from other solutions. Why reinvent the wheel?

Well documented and understood
As Design Patterns are well documented and understood by software architects, designers and developers, then their application within a specific solution will likewise be well understood (although given my reasons for writing this post, I should perhaps add the caveat that they will be understood 'only' by experienced software architects, designers and developers).

Design Patterns give a software developer an array of tried and tested solutions to common problems, thus allowing the developer to save time by not having to think up a new solution from scratch.

To borrow an analogy from the field of civil engineering (which, as I stated in my post Why design is critical to software development, has close similarities to software engineering), think of the problem of crossing a river. This is a recurring problem for civil engineers, to which there are a couple of well documented and understood solutions. The civil engineers may build a bridge (of which there are many different kinds, but for the purposes of this exercise, let's just refer to them collectively as a bridge), or a tunnel.

Close parallels with civil engineering
Why would a civil engineer try to solve this problem from scratch when there are real world solutions that can be referred to? There are close parallels between the civil engineer solving the river problem, and the software engineer solving a software problem:

  • The solutions (bridge or tunnel) are both well understood and documented
  • The solutions (bridge or tunnel) solve recurring civil engineering problems
  • The solutions (bridge or tunnel) are not deterministic or prescriptive, but are abstract and can be tailored to the specific problem (the bridge or tunnel building materials for example can be selected for their alignment to the specific problem)

The argument that Design Patterns are not suitable for commercial use because they take too long to implement does not hold up. Design Patterns save time by giving the developer tried and tested solutions to many of their problems.

The only issue I have come across with Design Patterns is that they take time to learn. Some of them can be difficult to grasp and comprehend. However, it is worth taking the time to fully understand them, as they will quickly form one of your greatest assets.

Design Patterns reduce complexity, and therefore the solution becomes easier to comprehend.

Because Design Patterns are tried and tested solutions, the developer does not need to start from scratch, and can hit the ground running with a solution that has been proven to work (as long as the Design Pattern is being used to solve a similar problem; it would be wrong to expect a bridge to solve the problem of crossing an ocean, where a bridge would simply be unsuitable).

Whilst working as a Senior Software Engineer at Pegasus Software, I got my first exposure to Design Patterns in the work place, not just the theory from books. Much of the software framework used to underpin their products was developed using a variety of Design Patterns, including the Class Factory, Decorator, Template Method and Chain of Responsibility. The resultant code was far easier to comprehend, maintain and extend in the future.
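
To give a flavour of what one of these patterns looks like in code, here is a minimal sketch of the Template Method pattern (one of those listed above) in Python. The report-export scenario and the class names are my own invented example, not taken from the Pegasus framework.

    # A minimal sketch of the Template Method pattern. The base class fixes the overall
    # algorithm (the "template method"); subclasses supply only the steps that vary.
    # The report-export example is invented purely for illustration.
    from abc import ABC, abstractmethod

    class ReportExporter(ABC):
        def export(self, records):
            """The template method: the invariant skeleton of the algorithm."""
            header = self.render_header()
            body = [self.render_record(r) for r in records]
            return "\n".join([header, *body])

        @abstractmethod
        def render_header(self) -> str: ...

        @abstractmethod
        def render_record(self, record) -> str: ...

    class CsvExporter(ReportExporter):
        def render_header(self) -> str:
            return "name,amount"

        def render_record(self, record) -> str:
            return f"{record['name']},{record['amount']}"

    class HtmlExporter(ReportExporter):
        def render_header(self) -> str:
            return "<h1>Report</h1>"

        def render_record(self, record) -> str:
            return f"<p>{record['name']}: {record['amount']}</p>"

    # Adding a new output format means adding a subclass, not changing existing code.
    records = [{"name": "Widgets", "amount": 10}, {"name": "Gadgets", "amount": 4}]
    print(CsvExporter().export(records))
    print(HtmlExporter().export(records))

The calling code relies only on the export method, which is exactly what makes solutions built on patterns like this easier to comprehend, maintain and extend.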

Conclusion
Design Patterns, despite their initial learning curve, are a very worthwhile investment. They will enable you to develop tried and tested solutions to problems, thus saving time and effort during the implementation stage of the software development lifecycle. By using well understood and documented solutions, the final product will be much easier to comprehend. And if the solution is easier to comprehend, then by extension, it will also be easier to maintain.

Tuesday 10 November 2009

Why design is critical to software development


A good software engineer should always design
I recently had a discussion about the merits of design within the field of software development. The other party was of the opinion that design is not something that should be taught in the early stages of becoming a software developer, and that design was largely a matter of common sense anyway. Their opinion of design patterns was that they were too time-consuming to be of use in commercial applications. I will address each of these points over the course of this post.

As a professional software engineer with over a decade's worth of commercial experience, and having worked in both good and bad development environments, I know the benefits that come with good design.

I don't intend to describe each of the various design methodologies or design patterns as there are numerous books and articles already available on such subjects. I intend instead to describe why design is a crucial part of the software development life cycle.

Software is an engineering discipline
I always like to use the analogy of a software engineer and civil engineer. Both should employ engineering concepts to achieve their objectives. What this means is that when designing a software application, you should borrow engineering elements from the field of civil engineering.

Here is a definition of software engineering taken from Wikipedia:

"Software engineering is the application of a systematic, disciplined, quantifiable approach to the development, operation, and maintenance of software, and the study of these approaches; that is, the application of engineering to software".

When designing a suspension bridge to span a mile-wide river, would you really want to just proceed without giving any thought to how you intended to build it? Just turn up one Monday morning and start bolting steel girders together? The obvious answer is of course not, but this same level of diligence and engineering does not always seem to apply to software development, even, it would seem, among some of those working within the industry. It would be classed as utterly negligent to build a suspension bridge without having a design in place first.

The earlier you are taught design, the better. Imagine turning up to a course on car building, where the lecturer said "This week I'd like you all to begin building your cars please, and next week I'll show you how to design them". I'm sure you would think there was something wrong with the way the course was being taught.

Before you build something, anything, you design it first.

Software design is not a trivial process
Whilst some elements of software design may be common sense, that does not mean it is all common sense. In fact, it is fair to say that the design process can be difficult and demanding. It can take years to develop the experience, knowledge and skills to become expert at it.

Design constraints, including extensibility, reusability, fault tolerance, coupling, cohesion and reliability to name a few, must all be carefully considered. There will also be many trade-offs and compromises that must be weighed as part of the design. To say that all of this is simply common sense is nonsense.

The design process also manages the complexity that is inherent when building a fully fledged software application. With so many considerations to take into account, the human brain can quickly become overwhelmed with information. This needs to be broken down into manageable chunks that a designer can understand and work with.

Design is the key to communicating your intent
If you don't have a design, how do you and your team know how you are going to build the solution? Design takes time to get right, and can be fraught with problems. I spent several years as an Analyst Programmer, where part of my time would be spent visiting customers to document their requirements. I would then work this into a document explaining as clearly as possible how I had interpreted those requirements. This would also include use-case diagrams drawn in the formal notation of the Unified Modeling Language (UML).

I never got this right the first time. I would send it to the customer for their input, and they would invariably point out the omissions and inaccuracies, and I would then revise the requirements document. After several iterations, we would arrive at a set of requirements that the customer was happy with (for now, but that's a whole other story).

A similar process can and should be employed in the design process, taking the requirements document, and iteratively shaping this into a working design. The software design should form the key communication tool amongst the software team.

What do you mean you don't have time?
I have heard many excuses for why the design process is often skipped, the most common one being not having time. My reply to this is "you don't have time NOT to." Seriously, spending time getting the design right, just like spending time getting the requirements right, will always save you much more time over the longer term than it will cost you in the short term.

To rectify a design fault can be costly, but to rectify a source code fault is costlier still. The later in the software development lifecycle you find a defect, the costlier it will be to fix. So one way of looking at the need for design is to reduce the costs of defects that will be found in the implementation and testing phases (finding a defect in the testing phase is the costliest of all).

By spending time designing, you are actively trying to reduce the number of defects that will be found in the later stages of the software development lifecycle. You are also mitigating any risks by actively seeking out a working solution, rather than proceeding regardless, only to find later on that something is not possible, or must be compromised.

Design can reduce the risks and costs to your software project.

Benefits from adopting software design:
There are many benefits to incorporating design into your software development lifecycle. Here are a few examples:

  • Communication - the proposed solution is visible to the team and all project stakeholders, and so everyone has a clear idea of what is required.

  • You can find problems with the solution before you start the implementation stage, where re-development will be far more costly.

  • Maintaining the application will be far easier when you have a series of design documents to refer to. This makes handing the maintenance of the application over to another team or third party much easier.
Conclusion
The design process is crucial to the success of any software development project. It forms the central communication document for the development team. It reduces the risks and costs to the project. It helps to manage the complexity that is inherent to the development lifecycle.

Wednesday 4 November 2009

The positive benefits of social media

Media distortion of the negatives
I am sure we've all heard the negative news stories relating to social media, especially within the workplace. No matter how unfounded or untrue some of these accusations may be, they have been widely circulated by a media that has little grasp of Web 2.0 or social media.

In an attempt to redress the balance, I thought I'd outline just a few of the positive benefits of social media, giving some examples of how I use it.

Staying up to date
First of all, as a professional software developer, I use it to stay in touch with the latest technological developments, be they related to software, social media, applications, architectures and so on. On Twitter, I follow many technologists, from a diverse range of technical backgrounds. Some are published authors and speakers. All are interesting and worth following if you have an interest in technology.

Often, I have sent a message or question to one of them, and more importantly, received a reply. This is a great way to engage with someone on a particular topic. Not only do I get to read the reply, but so does anyone else who has access to their social media channels too, so the information is circulated to a much wider audience.

Promotion
Social media can be used to promote yourself, your business or your campaigns and causes. This is something I do regularly. My Facebook and Twitter channels are full of links to my blogs, campaign and charity groups or causes I want to champion (such as PETA). I may simply link to an article which criticises some element of government legislation, I may ask a question to get others to think about the issue, I may link to a video showing animal abuse, or anything else I want to draw people's attention to.

This is a very powerful tool, and can be very effective. If you run a business or charity, you will be much more effective if you incorporate social media channels into your overall marketing and promotion strategy. Just to be clear, I am not advocating replacing your current marketing and promotion channels, but suggesting you add social media to complement your existing channels.

Spread the word
With real-time social media channels such as Twitter, news and information can be spread instantly. Organisations such as BBC News and Downing Street use Twitter to great effect, to get the news out quickly, and usually before the more traditional news channels can. After all, it is quicker to type a brief message into Twitter and hit send than to put out the same story on the radio or television, which requires much more co-ordination and organisation.

The downside to this, of course, is that it is just as quick to spread disinformation, whether deliberately or otherwise. This is not a fair criticism of the technology itself though, rather of its misuse by disingenuous people.

Power to the people!
During the Iran election protests earlier this year, many Iranian citizens were able to share their experiences with the Western world via social media channels. These were spread quickly around the Internet, and reported by Western news agencies. Spreading information this quickly would have been almost impossible before. As long as you have Internet access, you have a voice, and more importantly, you have an audience.

And finally...
While certain sections of the media may indulge themselves by concentrating on the more negative aspects of social media, practically all of which relate not to the technologies themselves but to their misuse by human beings, we should not focus too heavily on them. With a little common sense, we can all use social media safely and sensibly.

Sunday 1 November 2009

Public Sector and Cloud Computing

Introduction
If you are not familiar with the concept of Cloud Computing, then please read my previous article entitled What the heck is cloud computing. In essence, Cloud Computing uses virtualisation technologies and the Internet as the platform for deploying applications. One of the key drivers for deploying applications to the Cloud is to make cost savings.

Private sector vs public sector
With the US and Japan already a part of the Cloud Computing public sector revolution, it was high time the UK followed suit. There is a lot of potential for long term change in the way in which public services are delivered in the UK, as well as relieving the financial burden on the tax payer.

Cloud Computing has already been widely adopted within the private sector, with many large and diverse enterprises taking advantage of the cost reductions and increased productivity that are on offer. Cloud Computing will usually involve renting a service on a pay-as-you-go basis. In comparison, procuring the system outright involves purchasing the necessary hardware and software licences, along with additional support, training and other operational costs. With the cloud model, the application in its entirety is deployed to a third party supplier, leaving the client to get on with their business. This frees the client from a heavy up-front IT investment.

The story is different within the public sector, where the reputation of IT projects and their delivery has been less than outstanding. The long list of troubled IT projects includes such well known establishments as the DWP, the MoD and the NHS. Even worse, the situation doesn't seem to be getting any better.

Cloud Computing could be a possible solution to these sorts of publicly funded IT projects, by introducing genuine transformation to the delivery of government services.

Possible incentives
The recent Digital Britain report has recommended that the UK should deploy what is known as the G-Cloud, a government-wide Cloud Computing platform that would allow local and central government to share centrally hosted applications. This would lead to substantial savings in public spending. Key savings would come from reducing the number of data centres, reducing overall IT spend (hardware, licences), and lowering maintenance and security costs.

There would also be other positive side effects to such an endeavour. The creation of a single application platform would encourage the adoption of increased levels of sharing, as well as standardisation of IT services across multiple departments. It could also lead to better service delivery during periods of peaks and troughs of demand, which is critical for e-Government service delivery.

Putting the theory into practice
This theory is currently being put into practice. A role has been created within the Cabinet Office with responsibility for formulating and managing the implementation of the G-Cloud. The Conservative Party has committed to reviewing, and possibly replacing, the NHS National Programme for IT with a Cloud Computing based alternative in the event that they win the next election. If applied successfully to one such project, there is no reason to suppose it could not be applied with equal success to others.

Monday 26 October 2009

The rise and rise of the BNP

With the BNP having appeared on the BBC's flagship politics programme Question Time, it seems that the BNP have made it into the mainstream political landscape.

It is really not surprising to see how this has happened. Over the last decade, we have seen rising numbers of immigrants arriving on our shores, with few if any checks to see whether they were entitled to do so. The tide started under the leadership of Tony Blair's New Labour government.

While there is nothing wrong with having an immigration policy that lets in those who genuinely want to be here to build a better life and contribute to the country, there is everything wrong with a policy of letting everyone in, whether or not they have any intention of putting anything back into the country. And this is precisely where we now find ourselves.

The British public for the most part are decent, hard working and tolerant. The issue most people have with immigration is the strain it causes on our services (NHS, housing, schools), and the cost to the tax payer for footing the enormous bill.

Labour have failed to stem the rising tide of immigration. Our borders have been wide open for over a decade. While they may have recently amended this policy failure, it is too little, too late.

Amid all of this, a party such as the BNP cannot fail to exploit the public dissatisfaction. People who are fed up with what they perceive as the failure of mainstream politics to resolve the issue will be pushed further and further into the political verges and recesses that parties such as the BNP inhabit.

The rise of the BNP can be blamed fairly and squarely on the incompetence of the mainstream parties. From the recent scandals over their expenses to their failure to curb the obscene bonus culture within the banking sector, it is little wonder that people are turning away from mainstream politics.

There was much controversy over Nick Griffin's appearance on Question Time. As a non-partisan reporter of the news, the BBC has to ensure all political parties are treated with the same level of impartiality. This has absolutely nothing to do with the BNP's policies, and everything to do with living in a democratic society.

The real question for me though is this. How can a party as overtly racist as the BNP have ever been formed in the first place? How can a political party get away with having a 'whites only' membership policy? If the BNP has passed all the checks and balances for forming a political party, then what on earth does that say about the rules we have for forming political parties in Britain?

If the BNP is a legitimate party, then that speaks volumes about the complete lack of regulation we have surrounding the forming of a political party. That you can create a political party as racist and vile as the BNP without breaking a single rule is, to me, more disgusting than the question of whether or not they should appear on programmes such as Question Time.

Wednesday 21 October 2009

An easily offended nation

Hardly a week goes by without a story emerging from the media about someone who has been offended in some way. People seem to be getting offended in increasing numbers, and are getting offended by an ever increasing list of issues and subjects.

While I am most certainly not advocating deliberately offending someone, I do think that as a self-proclaimed tolerant nation, we need to take a step back and re-evaluate why we get so offended, and so easily.

With compensation culture now a steadfast part of Britain, and a rising awareness of human rights issues, we also seem to have seen a rise in people taking offence. In my opinion, these are related. People are aware that they can seek compensation if they feel aggrieved, and they can claim that their human rights have been violated.

However, the biggest factor in the rise of offence being taken is political correctness. It is now practically impossible to mention subjects relating to race, religion and so on without falling foul of it. This is one of the greatest causes of polarisation within this country. It does little to create unity, but does everything to cause division. Again, I am not advocating deliberately offending someone.

Comedians are treading on ever-thinner ice. The list of topics they can poke fun at is getting smaller by the day. Instead of laughing at ourselves, we take offence and file a complaint, or even a lawsuit.

Unless you stay inside your house permanently, do not read a single newspaper or magazine and do not watch television, then you are going to be offended. The only way an enlightened society can progress is to accept that other people have differing beliefs and opinions, and that yours are no more valid than theirs.

I once heard someone say the phrase "No one has the right NOT to be offended". I couldn't agree with this sentiment more.

Wednesday 14 October 2009

Restoring faith in British democracy Part II

Following on from my earlier article about how to restore faith in British politics, which was written as a response to the MPs expenses scandal, I now find myself writing an article on the scandal concerning the payback of those expenses.

The current review of MPs expenses, being led by Sir Thomas Legg, is to determine what should be repaid. Already, over three hundred letters have been sent out to MPs to either ask for further information to clarify a claim, or to demand that they return the money.

What is galling about this whole sorry episode is the fact that some MPs are so incensed that they are threatening to refuse to pay the money back, with some threatening to take legal action (presumably at the taxpayer's expense).

Having been caught red-handed with their trotters well and truly in the till, having suckled at the taxpayer's teat, they now have the arrogance to whinge when asked to return what they were never entitled to in the first place. Their bleating that it is unfair and unjust is lamentable.

They were caught flipping their houses for personal gain, claiming for luxurious home appliances, moat cleaning and a duck pond amongst the most contemptible of their claims. In practically any other situation, they would have been charged with fraud. Now that they have been asked to return their ill gotten gains to the tax payer, it is appalling that some are so out of touch as to feel hard done by.

We are in the middle of a recession, money is tight, firms are closing all around, people are losing their jobs and homes, and yet some MPs are stamping their tiny little feet because they are being asked to repay their exorbitant claims.

Let's be clear: none of the MPs' claims were just and reasonable, even under the old system. It just so happened that the system was so riddled with corruption and inefficiency that these claims ever made it through in the first place. Even judged against the old system, many of their claims would have had little if anything to do with their roles as serving MPs.

MPs are well paid, and are supposed to act in the interests of the public whom they are asked to serve. At least the party leaders seem to have correctly gauged the public outrage and disgust, and have demanded that their members return whatever cash is asked of them.

In my opinion, a failure to return what was not yours to take in the first place, should result in your swift exit from the party, and from politics. If they want to be voted in by their constituents, they had better get out their cheque books, and quickly.

Becoming an atheist

What is an atheist?
Having already written an article about becoming a vegetarian, I thought it was high time I wrote an article explaining why I am an atheist.

Firstly, what exactly is an atheist? An atheist is someone who does not believe in the existence of a deity, who lacks religious faith, and who has a naturalistic worldview free from superstition and mysticism. Just to clarify from the outset, I am not a practising scientist, but I do think like one. I don't want to give the false impression that atheism is confined to scientists only. Atheists can and do come from all walks of life.

So why do I reject the notion that there is a higher being who watches over us? An omnipotent, omniscient deity who created the universe and our planet in a handful of days, and created every single species to boot.

Atheism is aligned with science
To turn the question round slightly, instead of asking why I am an atheist, a better question is why should I believe in the notion of a deity? We have explanations for the formation of the universe and the planets. While the birth of our universe may not yet be fully understood, this does not mean we have to resort to superstition to fill the gaps. Thanks to branches of science including astronomy and cosmology, we understand how planets and stars are formed. Our knowledge of such formations grows daily.

We understand gravity, atoms, chemistry. To anyone who denies that science has shed light on our understanding of the world, I would urge you to step out of an aeroplane at 30,000 feet without a parachute. The fact that an aeroplane flies at all is also a testament to science.

Where is the evidence?
As a dyed-in-the-wool sceptic, I demand to be shown evidence to substantiate a claim, especially one as grand as an omnipotent deity. Am I really to believe that our universe, planets, animals, seas and life were all created by a single omnipotent being? This leads to the obvious question: who or what created the creator?

We know from the physical sciences that you cannot conjure up complexity from nothing. This violates the first law of thermodynamics, an expression of the conservation of energy which states that "energy can be transformed (changed from one form to another), but cannot be created or destroyed".

That a being complex enough to create planets, stars and life was spontaneously created out of thin air is therefore impossible. Complexity does not arise from nothing; it arises from simplicity. The creationist myth that the Earth is several thousand years old is pure fantasy. We know that the age of the Earth is somewhere between 4 and 5 billion years.

Extracted from Wikipedia:
"This age has been determined by radiometric age dating of meteorite material and is consistent with the ages of the oldest-known terrestrial and lunar samples." http://en.wikipedia.org/wiki/Age_of_the_Earth

Anyone who denies the age of the Earth is either scientifically ignorant, or pushing their religious ideology.

Morality
It is often claimed that atheists are immoral people, as they do not have the moral framework and guidance of a sacred text. As I was at pains to point out in my earlier article Do atheists have the moral high ground?, it is perfectly possible to be a good, decent and moral person without religion. For religion to make claims of morality is absurd, when you look at fundamentalists who fly planes into buildings, or murder physicians who perform abortions. The fact that these are extreme acts is irrelevant, they were nonetheless carried out under the name of religion.

As an atheist, I do not believe in the notion of going to hell for being a bad person. I therefore do not perform acts of kindness to curry favour with a vengeful deity, who will smite me down and let me burn in hell for all eternity if I do something wrong (so much for an all loving deity).

If you need to rely on a sacred text to tell you that murder is wrong, that is deeply worrying. Without resorting to any sacred text, an atheist innately knows it is wrong. If theists could steel themselves to throw away their sacred texts, which are nothing more than moral crutches, they would realise that they knew it too.

I perform acts of kindness because they are the right and moral thing to do. Not out of fear of a vengeful deity, or because I have read it somewhere in a sacred book.

Atheism is the only true path to enlightenment
I am far happier living my life as I see it. My life is not based around superstition, fear, vengeance and all the other artefacts that constitute religion. It is based on common sense, evidence and logic.

I do not need to have a meaning to my life ascribed to me by a sacred text. My life already has meaning. I have family and friends, goals and objectives, love and happiness. All these things give my life meaning.

If being a husband, father and friend is not enough to give your life meaning, then I do not know what is.

There is no place or need for religion in my life. We have the scientific methodology to explain the world around us, and our place within it. We are all born with an innate sense of morality. We do not need religion to explain these things. The only thing religion teaches us is to be satisfied with not understanding the world.

Religion is superfluous, surplus to requirements, irrelevant.

I am perfectly happy without it.

Sunday 11 October 2009

Blocking web sites does not stop time wasters

Many companies and organisations cite reduced productivity as a driver for blocking web sites, particularly social media sites such as Facebook, Twitter and the like.

It should be obvious to anyone who has given this more than a moment's thought that blocking access to certain web sites will not stop a time waster from wasting their time - they will simply waste their time in other ways.

As I have said in previous articles, people spending too much time on social media sites during working hours should be dealt with by measuring their performance against set targets, i.e. by measuring their performance indicators. Time wasting should NOT be dealt with by technological intervention. Technology is a very blunt instrument which does not adequately deal with the problem.

There are many reasons why an individual may waste their time. They may be lacking in motivation, suffering from stress or a personal issue, or any number of other things. What is important is to get to the bottom of the underlying problem, not simply tackle its after-effects. Using technology to deal with a potential lack of motivation does not deal with the lack of motivation.

The reason there is an appetite for blocking social media sites is that many people simply do not understand social media. I think I can safely predict that over the next few years, social media will become as ingrained in our collective technological consciousness as email and the Internet currently are.

Tuesday 6 October 2009

Extreme Daredevils

For anyone who has not watched this on television so far, it is a "series of visually stunning films following some of the world's most extreme individuals, who risk their lives pushing themselves to the physical and mental limits of human capability" - Extracted from the Daredevils web site

I have watched the last two in this series - The Ice Man and The Skywalker. The former programme centres on Wim Hof, who is able to cope with freezing temperatures. In sub-zero conditions 200 miles north of the Arctic Circle, the Dutch daredevil attempts to run a full 26-mile marathon clad only in shorts and a pair of sandals.

In The Skywalker, Dean Potter is the only person in the world who does extreme slack-lining (similar to tightrope walking). We see him crossing a canyon some 3000 feet up on a one-inch-thick stretch of rope without using either a pole for balance, or any kind of safety harness.

The important question for me watching this series has not so much been why they do these extreme sports (although that would still be an interesting question to ask) but how. I find myself asking the question "could someone be trained to learn how to do that?"

They are probably not physically different from you or me; what sets them apart is their mental capacity to switch off their surrounding environment and completely focus on what they are doing. Dean Potter repeatedly talked about how slack-lining at such incredible heights would induce a state of hyper-awareness, where he felt in complete control. Watching him closely as he crossed a 3000-foot canyon in Yosemite, you could see his face was a mask of concentration. Wim Hof would meditate and get his mind into the "zone", as he called it, before immersing himself in freezing Arctic water and doing an underwater dive.

On the one hand, these are mortals made of flesh and blood just like the rest of us. What sets them apart is their ability to completely and utterly focus their minds on a task, to the exclusion of everything else. As the series has also highlighted, in their personal lives they are just as focused, and have found it difficult to form and keep normal relationships with people around them. This must be the trade-off for having such enormous mental strength - shunning those around you, and focusing on a task to the exclusion of everyone else.

Listening to them talk about what they did, there was always a sense of logic to their thought processes. I found myself agreeing with much of their rationality and reasoning. Despite their rather extrovert behaviour, they seem to be introverted people in real life, not particularly wanting the attention that their stunts generate.

This has so far been a fascinating series, showing just how powerful the human mind is and what it can achieve. It has also shown something of the character of such people, and given us a glimpse of what drives them to push themselves to such incredible limits!

Sunday 4 October 2009

Innovation vs opportunity

I was having a discussion recently with a work colleague about how some companies are very good at releasing innovative products to the market, whether or not they are fully fit for purpose.

An example of this can be clearly seen with the release of the Apple iPhone. In many ways, this was an innovative product, but it's fair to say that it was released before it was ready. It lacked many of the features that are now found in the current version. In fact, many industry experts have stated that the iPhone 3 represents the version that ought to have been released initially, as only now does it contain the features that make it a fully functioning product.

Other phones now contain similar features to the iPhone, but they have lost ground to Apple, who got there first.

It's easy to see why companies release innovative products early. They corner the market, putting their competitors on the back foot and giving themselves an immediate advantage. Releasing an innovative product represents an opportunity, and to fully exploit it you have to get there first. Coming second in a race to release an innovative product is a major blow.

Another example is the games console market. Sony got there first with its PlayStation, long before Microsoft with its Xbox. Even though the Microsoft offering is arguably just as good, it has conceded a huge amount of market share to Sony.

There are some interesting products that may yet prove to be an exception to this rule. The Microsoft search engine Bing has made a big impact in a short space of time, and it will be interesting to see how much of Google's market share it takes. It will likewise be interesting to see how much of a dent Google makes in Microsoft's position with its Chrome browser and operating system (targeted at the netbook market).

I'll stick my neck out and say that Google will continue to lose search market share to Microsoft, but will remain the dominant player (in keeping with the rule of releasing first). I will also predict that Chrome will win the battle to be the dominant operating system and browser, but only in the netbook market; Microsoft will continue to be the dominant force across all other hardware platforms (thereby breaking the release-first rule, but only partially).

While it is certainly possible to dominate a particular product or service space by releasing later, doing so is much harder. The key rule should always be: release first!

Wednesday 30 September 2009

Reign of terror costs mother and daughter their lives

The harrowing story of Fiona Pilkington is one that has unfortunately come to typify modern Britain for all the wrong reasons. The single mother doused her car in petrol and set light to it. The car also contained her teenage daughter, who had the mental age of four. Both occupants were killed in the ensuing fire.

At first glance, this could have been mistaken for a distressed mother who was at the end of her tether. But piece by piece, the full story has come to light. Fiona Pilkington and her family had endured over two years of intimidation and abuse from their neighbours, and she eventually snapped, killing herself and her daughter.

The abuse began when Ms Pilkington's son, who suffers from dyslexia, fell out with one of the sons of a neighbouring family. The Pilkingtons were then terrorised not just by the younger members of that family, but by children and teenagers from other families too.

During the inquest that followed, the jury heard that the gang, some as young as ten, would pelt their house with eggs, flour and stones, as well as putting fireworks through the letterbox and shouting obscene insults from the street.

Ms Pilkington contacted Leicestershire police on thirty-three occasions, but they did nothing to stop the abuse. They have subsequently admitted that her plight was viewed at the time as a low priority. With nothing to stop them, the thugs continued their campaign of terror and ruined the lives of the Pilkington family.

It is behaviour like this that is becoming more common up and down Britain. While such behaviour is still not ubiquitous, it is growing more prevalent, more violent, more cruel and more terrifying.

The police in this particular case sat on their hands and did nothing; they viewed it as a low priority. It has been reported that at least one of the thugs involved in the abuse was heard to shout "We can do whatever we want, and there's nothing you can do to stop us". People like that obviously have no fear of the police, and indeed are full of contempt for them.

That a mother should take such action is a sad testament to the current state of Britain. It reflects both the level of callousness that is becoming ever more prevalent, and the inefficacy of the police. In days gone by, such thugs would have found themselves in court, with the very real prospect of doing hard time. Now the only deterrent is the risible badge of honour known as an ASBO (Anti-Social Behaviour Order).

We need to get tough on this sort of behaviour and put a stop to all the pandering and pussy-footing, which just gives these thugs a green light and no deterrent ever to stop. If we don't put real measures in place, there will unfortunately be plenty more people just like Fiona Pilkington.

Tuesday 29 September 2009

It's only a theory?

The above phrase is one I have heard endlessly when discussing the theory of evolution with theists. The usual retort is "yes, but it's only a theory". Yet evolution is a fact: it has a staggering amount of supporting evidence, and is well established within the scientific community.

The word theory has at least two very distinct meanings. One refers to a systematic body of knowledge that coincides with observation. This is its more formal usage, as used within the scientific community, including the natural sciences. Other well-known theories include Computational theory, Quantum theory and Chaos theory (to name just a few). These are well understood, and have considerable evidence to support them.

The other meaning of the word refers to a hypothesis, an idea or a conjecture. This meaning does not carry the supporting evidence or systematic body of knowledge that the previous definition does. It is this interpretation that theists jump upon when criticising Darwin's theory of natural selection. Whether through scientific ignorance or to create deliberate confusion, theists continually attempt to dilute the usage of the word to mean the latter.

Extracted from the Oxford English Dictionary:


  • Meaning 1 - A scheme or system of ideas or statements held as an explanation or account for a group of facts or phenomena; a hypothesis that has been confirmed or established by observation or experiment, and is propounded or accepted as accounting for the known facts; a statement of what are held to be the general laws, principles or causes of something known or observed.

  • Meaning 2 - A hypothesis proposed as an explanation; hence, a mere hypothesis, speculation, conjecture; an idea or set of ideas about something; an individual view or notion.

Meaning 1 is obviously the one which accommodates evolution; Meaning 2 is the one seized upon by theists. There are further meanings defined, but these are the two of interest to this discussion.

Several American schools have received demands from parents and/or school governors to have stickers added to the front of books on evolution stating that "Evolution is just a theory". Using the stricter definition of the word 'theory', i.e. Meaning 1 above, the statement is anything but a criticism: any scientist would wholeheartedly agree that evolution is a theory, and a very good one at that. To say that evolution is 'just' a theory reveals a lack of comprehension of scientific enquiry, because it misunderstands how the word theory is used within science.

Theists simply cherry-pick their targets; they presumably make no demands to have similar stickers placed on books relating to mathematics, physics or computing (Chaos, Quantum and Computational theories respectively).

Evolution is a fact. Period. It is backed by a huge body of supporting evidence, including radiometric dating and genetics. No serious scientist disputes it. While new knowledge will continue to come to light that lets scientists refine their understanding of some of its details, none of that takes away from the fact that, as an over-arching theory, it is fundamentally correct.

The theory of evolution is both elegant and simple. It has stood up to one hundred and fifty years of criticism from theists. It is surely one of the most important theories ever discovered, with huge implications for our understanding of ourselves and our place within the natural world.


Impossible is nothing

I read this quote recently from the boxing legend Muhammad Ali, and liked it so much I thought I'd share it:

"Impossible is just a big word thrown around by small men who find it easier to live in the world they've been given than to explore the power they have to change it. Impossible is not a fact. It's an opinion. Impossible is not a declaration. It's a dare. Impossible is potential. Impossible is temporary. Impossible is nothing." — Muhammad Ali.

With quotes like that, it's no wonder the man is a legend!

Wednesday 23 September 2009

Selecting a Content Management System

As part of the web site improvement project for East Northamptonshire Council, I have been involved in the selection process for a replacement Content Management System (CMS). A CMS allows an organisation's content authors to publish content to its web site.

The current CMS was no longer fit for purpose, having reached the end of its usable shelf life. It lacked many of the features now found in a modern CMS, such as RSS feeds, social bookmarking (which allows visitors to link content to their preferred social media site) and other Web 2.0 features such as podcasts and blogs.

When selecting a new CMS, there is no fixed set of criteria or blueprint for what constitutes a requirements specification, but there are common themes that should be considered. The list that follows is not intended to be comprehensive, or to go into great detail, but to serve as a starting point from which you can derive your own CMS requirements specification.

  • Technical specification - This should include the preferred operating system, application server, web server, database technology, programming language (if you intend to extend or modify the CMS) and any other appropriate technical requirements. You need a CMS that is compatible with the tools and technologies used within your organisation; the market leaders may not be fit for purpose if they rely on unfamiliar technologies.
  • Content creation - Specify how content should be created, i.e. what is required of the authoring environment. This could include features such as WYSIWYG ("what you see is what you get") authoring, where content appears during design exactly as it will once published, drag and drop, a spell checker, separation of content from formatting, content reusability and metadata creation. The authoring environment should make it as simple as possible for a non-technical person to create engaging and professional-looking content, quickly and simply.
  • Content management - Specify what tools you need to manage your content. For example: version control, so you can see when a piece of content has changed and by whom; audit trails of the activity that has taken place within the system; automatic notification of when content should be reviewed; a Draft-Submit-Approve workflow model; the ability to manage the style(s) applied to the content; and to-do lists.
  • Publishing - All styling should be applied to the content during the publishing phase, leaving the author free to create their content without having to worry about how it looks. This should be achieved through the use of style sheets and page templates, which make it easy to separate the look and feel of a web page from its content (see the sketch after this list). You may also want the ability to publish to multiple sites or to a staging server.
  • Presentation - It should be possible to view the published content in any of the major browsers. You should also consider the amount of client-side scripting the CMS requires. Remember, you have no control over how visitors have configured their computers (except perhaps in intranet scenarios, where group policies may restrict such configuration), so you may want to keep technologies such as JavaScript to a minimum. The HTML that is created should conform to the latest W3C HTML specification, and the metadata created for each page must be sufficient for use with an appropriate taxonomy and for searching and indexing the content.
  • Taxonomies and metadata - If you are using a taxonomy (such as LGNL or IPSV if working within local government), then you will need to ensure that it is fully supported. Can content authors add additional information (metadata) to the content to allow it to be found more easily?
  • Integration - Do you need your CMS to integrate with any of your back office systems, and if so, which ones? For example, you may need your CMS to integrate with your Customer Relationship Management (CRM) system, or with your finance system if you are taking online payments.
  • Compatibility and accessibility - The web pages produced should be compatible with a variety of different media, and should comply with the latest web accessibility standards. The pages should be fully functional, well-performing and compliant. For example, within the public sector it is important that web pages meet AA accessibility standards as a minimum.
  • Reporting requirements - The CMS should allow interrogation of the data held within it. Information should be available on when each item of website content was updated, and by whom. It may also be useful to provide website usage statistics and to publish selected information to the website automatically. The CMS should ideally provide a rich set of standard reports which can then be easily customised by the content author and/or IT department.
  • Administration - The administrators of the system will need to create users (content authors) and grant them access to the various parts of the CMS in line with their needs. Permissions should be as granular as possible, so that access can be granted at the lowest practical level. Administration should also ideally include the ability to reset passwords, amend the site styling, and configure and run any scheduled reports, including a broken links report.
  • Maintenance and support - This should cover both upgrades and patches. You will need to ensure you have read the support agreement and are happy with it. What are the supplier's hours of business? How can they be contacted? How do you raise an incident with them? Are you happy with the terms of their Service Level Agreement (SLA)? How often can you expect an upgrade, and how is it delivered?
  • Data migration - If you have an existing CMS, is the new supplier able to migrate this content into their CMS on your behalf? Are there any additional costs associated with this?
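
As referenced in the Publishing item above, here is a minimal sketch of what separating content from presentation might look like. It is not tied to any particular CMS: the content fields, the page template and the site.css style sheet are purely illustrative, but it shows the principle of the author supplying only content, with the look and feel applied at publish time.

    # A minimal, hypothetical publishing step: the author supplies only the
    # content fields, and the look and feel comes from a page template plus
    # an external style sheet (site.css is illustrative).
    from string import Template

    PAGE_TEMPLATE = Template("""<html>
    <head>
      <title>$title</title>
      <link rel="stylesheet" href="site.css">  <!-- all styling lives in the CSS -->
    </head>
    <body>
      <h1>$title</h1>
      <div class="article">$body</div>
    </body>
    </html>""")

    def publish(content):
        """Apply the page template to a piece of authored content."""
        return PAGE_TEMPLATE.substitute(title=content["title"], body=content["body"])

    article = {"title": "Bin collection dates",
               "body": "<p>Collections resume on Monday.</p>"}
    print(publish(article))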

This is far from a full CMS requirements specification, but it should serve as a starting point for thinking about what you need from your own CMS.
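
Once you have a requirements specification along these lines, one common way to compare candidate systems is a simple weighted scoring exercise. The sketch below assumes each candidate has already been scored out of ten against each requirement area; the weights, scores and candidate names are purely illustrative.

    # A minimal, hypothetical weighted scoring exercise for comparing
    # candidate CMS products against requirement areas like those above.
    def weighted_score(weights, scores):
        """Total for one candidate: sum of (weight x score) per requirement area."""
        return sum(weight * scores.get(area, 0) for area, weight in weights.items())

    weights = {
        "Content creation": 4,
        "Publishing": 3,
        "Compatibility and accessibility": 5,
        "Maintenance and support": 4,
    }

    candidates = {
        "Candidate A": {"Content creation": 7, "Publishing": 8,
                        "Compatibility and accessibility": 6, "Maintenance and support": 7},
        "Candidate B": {"Content creation": 9, "Publishing": 6,
                        "Compatibility and accessibility": 9, "Maintenance and support": 5},
    }

    # Rank the candidates from highest to lowest weighted total.
    for name, scores in sorted(candidates.items(),
                               key=lambda item: weighted_score(weights, item[1]),
                               reverse=True):
        print(name, weighted_score(weights, scores))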


Sunday 20 September 2009

Using blogging as a learning tool

When deciding what to write my next blog about, I often take a topic that I'm not very familiar with, but want to investigate and explore in more detail. This can be anything from a technology, to a debate, to a trending topic.

In order to write an article about these sorts of subjects, I am forced to undertake the necessary research to understand the topic in sufficient detail. This is a great way of really getting to understand a topic. Sometimes I may spend several days researching a topic before I feel confident enough to write about it.

When I want to understand a new or emerging technology, I will write a blog post about it. So while I may not fully understand the subject matter when I begin, by the time I have investigated it and written the post, my understanding will have progressed enormously.

Writing about a subject, any subject, is a great way to learn about it and understand it, and is one of the key ways by which I develop my knowledge on a huge variety of subjects. I would recommend it to anyone who is keen to extend their knowledge, and has the motivation to do so.

Thursday 17 September 2009

Agnosticism vs Atheism

There is confusion between the meanings of these two terms. They are sometimes used interchangeably, as if they meant the same thing. They don't.

An atheist does not believe in the existence of a deity, for the reason that there is no evidence for its existence. This is in line with scientific enquiry, and is the rational position to take. An agnostic is not committed either way to belief or disbelief in a deity: an agnostic holds that a deity may or may not exist, without fully committing to either position.

Atheists often define themselves as agnostics because of the prejudices that surround atheism; agnosticism is regarded as the more reasonable position. Why atheists should wish to dilute their position in this way is unclear, given that most theists do not seem to have similar reservations about their own religious beliefs.

I have heard the argument that atheists are as closed-minded as theists, that they simply take the polar opposite view: that an atheist holds as firmly to the position that there is no god as a theist does to the position that there is one.

For a start, atheism does not constitute a 'belief system' in the sense that theism does. Believing something is not the same as subscribing to a belief system. I believe I will enjoy my weekend; I believe I will finish the book I am reading. I believe these things, but neither amounts to a belief system. We need to be careful how we use these words, because they do not mean the same thing, and using them interchangeably - whether deliberately or accidentally - just clouds the discussion.

As I stated earlier, although an atheist does not believe in the existence of a deity, given that there is no evidence for one, this does not mean that an atheist dismisses the possibility out of hand. Given sufficient evidence, most atheists would change their minds, for the simple reason that atheism is the scientifically rational position to take.

Agnosticism is compatible with both atheism and theism; the agnostic simply does not claim a position either way. One may be an atheist and uncertain, or a theist and uncertain.

There is also a double standard at play. Theists often claim that being an atheist is dogmatic and closed-minded; but if disbelieving in the existence of a god is dogmatic, then so surely is believing in one.

It is common for someone to be both agnostic and atheist. An agnostic atheist won't claim to know for certain whether or not a deity exists, but equally they won't actively believe that such an entity exists in the first place.

Tuesday 8 September 2009

Public sector life vs private sector life

Background
Having worked in both the public and private sectors, I feel I can comment on their similarities and differences. Most of my professional life has been spent in the private sector, working in software houses developing applications. I've now worked in the public sector since April 2007 as a Senior Systems Developer for East Northamptonshire Council. I have already written an article describing what I do, so I will not repeat that here. I thought it would be interesting to compare the two structures, to see if they are really so different.

This article will concentrate on the areas of job security and career prospects. To describe all the differences between the two structures would necessitate an entire book, so I will probably revisit this subject in the future, looking at different areas of comparison.

What are public and private sector?
First of all, what exactly do we mean by the public and private sectors? The public sector is that part of a nation that is run by national, regional or local government, and usually includes services such as defence, national security, the emergency services (police, fire brigade, ambulance etc), town planning and revenue services, to name a few.

The private sector comprises those organisations that are not run by, or on behalf of, government. They are funded not by the taxpayer through taxation, but by private investment.

So the key difference is that private sector industries are largely driven by the pursuit of maximising profits, whereas the public sector is aimed at delivering cost effective services to the community or society at large.

A useful analogy is the BBC and ITV, where the BBC represents the public sector and ITV the private sector. The BBC is paid for by the licence fee payer, whereas ITV is funded by private revenue streams.

Career prospects
One key difference between the public and private sectors is in the structuring of salaries and progression. Government employment provides more security, but promotion is largely based on seniority, whilst private industry tends to award salary increases and promotion based on performance.

The corporate culture of government is also more structured. The organisational structure is hierarchical, whereas in the contemporary private sector a move towards teamwork and project-oriented management is becoming common.

In certain areas such as health care, many professionals prefer the private sector because of the higher salaries. The private sector is the better alternative if you are looking for quick promotion, while promotion may take longer in government. In the public sector, however, you can climb right to the top of the hierarchy, so there are good long-term prospects for promotion if you have the qualifications, combined with years of service and patience.

Job security
As stated above, public sector employees are on average more secure in their jobs than those in the private sector, sometimes by as much as fifty per cent. During the current recession, the private sector has been hit harder than the public sector; as the private sector aims to maximise profits, this is to be expected.

It is common practice for employees to start their careers in the public sector, where entry-level jobs are more readily available. Government positions are also seen - rightly or wrongly - as the safer option, and as stated already, they carry greater job security. This is largely because government is responsible for employment law, and so has a particular obligation to adhere to it.

Summary
Neither structure is better than the other; they are simply different and offer different benefits. The public sector offers greater job security and the chance to climb to the top of the ladder; the private sector offers quicker promotion and higher salaries.

Wednesday 2 September 2009

Governance using social media in the work place

The debate
There is currently a great deal of debate, not to mention dispute, surrounding the use of social media within the workplace. Approaches range from an outright ban on all social media sites, to an open policy allowing their use, and everything in between.

The general reason given by employers who ban social media sites is that they lead to a decrease in productivity. Employees who have access to sites such as Facebook or Twitter will, allegedly, waste valuable time on them rather than on their day job.

Employers need to be forward-thinking
Many organisations are adopting social media as part of their long-term marketing strategies, so it does seem rather short-sighted to ban its use. How are employees going to learn to use such sites effectively, and build the necessary relationships, if they are banned from using them in the workplace? If you are networking through social media sites, then you really need to be using them during working hours.

To quote David Wilde, Chief Information Officer at the London Borough of Waltham Forest: 'For managers it can be difficult to know what exactly their employees are doing. But the organisation needs to be outcome-based, and I don't think we should be using technology to prevent access to social networking sites. If there are staff performance issues, we should address them directly'.

The solution
The solution, then, is to address performance issues as they arise. There are many reasons why an employee may not be productive, and placing the blame on their use of social media may miss a more fundamental issue. Blaming social media for a loss of productivity is a very blunt way of diagnosing what may be a complex problem. Lack of training, lack of confidence, bullying and domestic problems can all have a negative impact on an employee's productivity; compared to these, social media usage seems trivial.

Social media boosts productivity
However, contrary to the popular perception of social media sites lowering productivity, recent research suggests that they can actually increase it. Studies have suggested that everything from simply surfing the Internet to the specific use of social media can boost productivity.

Conclusion
With appropriate staff performance policies in place, there is no reason why an employer cannot allow their employees access to social media sites. If an employer is using social media as part of its networking or marketing strategy, there is even less excuse for banning such sites. What is needed is an open-minded and progressive attitude to their use, rather than the blunt instrument of a ban.

Tuesday 25 August 2009

What the heck is Cloud Computing?

Introduction
The term 'Cloud Computing' takes its name from the fact that the Internet is traditionally drawn as a cloud on computing diagrams (such as top-level architectural diagrams). The cloud hides the technical detail: the Internet is really just a big network containing servers, routers, virtual machines and so on, and all of that detail is abstracted away for clarity and simplicity.

So the term 'Cloud' simply refers to the Internet.

So what is Cloud Computing?
There is no single, comprehensive definition of Cloud Computing. In its most basic form, it is an architectural paradigm in which services and applications are delivered over the Internet, independent of the underlying platform and hardware.

So rather than having applications such as your word processor and spreadsheet installed locally, these applications would be delivered over the Internet. All your data files would be stored there too, and all processing and computation would take place on remote servers.

All the local machine has to do is connect to the Internet.
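
To make that concrete, here is a minimal sketch of such a 'dumb client'. The URL and the shape of the request and response are hypothetical, purely for illustration: the point is that the local machine holds no application logic or data of its own, it simply sends a request across the Internet and displays the result.

    # A hypothetical dumb client: all computation happens on a remote server,
    # the client only serialises the request and reads back the response.
    import json
    import urllib.request

    def count_words_in_the_cloud(text):
        payload = json.dumps({"text": text}).encode("utf-8")
        request = urllib.request.Request(
            "https://example.com/api/word-count",  # hypothetical cloud endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(request) as response:
            return json.load(response)["count"]

    if __name__ == "__main__":
        print(count_words_in_the_cloud("All the local machine has to do is connect"))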

This is not an entirely new concept. The idea of delivering applications and services via the Internet goes back to its early days, circa the mid 1990s. However, there were too many technical problems for this to become a reality, such as the lack of available bandwidth. Whilst broadband may be commonplace now, a decade ago bandwidth was nowhere near its current capacity.

In its simplest form, Cloud Computing has simply extended the concept of the thin client. The idea of the thin client is to free up the local machine by keeping data and files on a remote server, typically on the same internal network. A thin client is therefore basically a client/server application, with files and computation residing on the remote server.

If you take this concept to its logical conclusion, the thin client becomes a dumb client, with the local machine holding no files or data whatsoever, and all applications and services delivered by remote servers (the Cloud).

Application Virtualisation
Cloud Computing is dependent on a technology called application virtualisation. This is a technology that allows applications and infrastructure to become independent of the underlying platform or operating system.

A fully virtualised application can run on any platform.

Virtualising an application means ensuring the application is completely self-sufficient and contains everything it needs to run. This self-contained, virtualised application can then be deployed and run anywhere. And if it can run anywhere, it can be deployed to the Cloud.
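
As a loose, simplified illustration of that self-containment idea (real application virtualisation goes much further, capturing the runtime and operating system dependencies as well), the sketch below bundles an application, together with any dependencies copied into its source directory, into a single archive that can be shipped and run as one unit. The directory name and entry point are hypothetical.

    # Hypothetical example: bundle the contents of "myapp_src" (the application
    # plus any dependencies copied into that directory) into one archive.
    import zipapp

    zipapp.create_archive(
        "myapp_src",          # application source plus copied-in dependencies
        target="myapp.pyz",   # single self-contained archive to deploy
        main="app:main",      # hypothetical entry point: myapp_src/app.py, function main()
    )
    # The resulting myapp.pyz can then be run anywhere a compatible Python
    # interpreter is available:  python myapp.pyz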

Benefits
One major benefit of Cloud Computing is not having to worry about the servers, hardware and infrastructure required to deliver the application or service. These run in the Cloud, and are therefore someone else's problem to worry about.

There is no longer a requirement to devote time and effort to developing and maintaining applications and infrastructure we are not expert in. Because Cloud Computing applications run in self-contained virtual application environments, this burden shifts to the provider.

Drawbacks
The one-size-fits-all approach typical of Cloud Computing applications does not always work for large enterprises with complex requirements. They are general-purpose applications for solving general-purpose problems; for enterprises with very specific problems, Cloud Computing may not be the answer.

For many enterprises and organisations, having data outside the firewall may pose a security risk. Remember, everything runs in the Cloud, and as we know all too well, communicating over the Internet is not without its risks.

One of the major criticisms aimed at Cloud Computing is the loss of control over data once it is in the Cloud. If the service provider is having technical problems, there may be delays in accessing your data; in the worst case, you may not be able to access your data at all.

Many civil liberties advocates have stressed that Cloud Computing forces organisations and individuals to hand over their privacy and personal data to a third party. This is a particular issue for users of social networking sites such as Facebook and MySpace.

Summary
Cloud Computing as a paradigm has been built on top of more traditional models such as thin client computing, taking them to their logical conclusion. One of its key underlying technologies is application virtualisation, whereby an application is encapsulated so that it is self-contained and can therefore be deployed on any platform, including the Cloud. While it has several benefits, the loss of control over privacy and personal data to a third party is a fundamental criticism that needs to be addressed if Cloud Computing is ever going to compete seriously with more traditional models of deployment.