February 2009

"Network effects from user contributions are the key to market dominance in the Web 2.0 era."

Tim O'Reilly

In "What is Web 2.0?", Tim O'Reilly identifies the characteristics that distinguish Web 2.0 from the first generation of Web applications. One key aspect is participation. Instead of users simply consuming information, Web 2.0 technology enables all of us to participate in building content. The power of Web 2.0, in Tim O'Reilly's words, is that it provides a platform for "harnessing collective intelligence". Perhaps the best known example of this is Wikipedia, which is distinguished from other online encyclopedias by the fact that its content is provided by users rather than a small group of experts. This model has been so successful that even the Encyclopedia Britannica has adopted a Web 2.0 approach.

A critical challenge to participation is interoperability--integrating the islands of technology that characterize most information technology (IT) organizations. There have been earlier attempts to create a standard framework for distributed computing such as CORBA and DCOM, but the complexity of these environments has limited their adoption. A more recent and simpler approach is Representational State Transfer (REST). In this article, we begin with an examination of the critical challenges facing organizations responsible for maintaining our utility, telecommunications and transportation infrastructure, outline how open standards are helping to address these challenges, and then discuss how geospatial data and services can be exposed over the Web. We introduce REST, outline a RESTful implementation of geospatial Web services that provides simple and open access to geospatial data over the Web using standard Web protocols, and describe a prototype web site developed using RESTful Web services by the City of Nanaimo.

Critical Challenges

One of the most serious challenges facing organizations responsible for managing infrastructure, including water, waste water, power, gas, telecommunications, roads, and highways, is increasing the productivity of the field force. This challenge has become particularly urgent in North America where, as a recent study of the power utility industry documented, industry is facing the problem of an aging field force. Within the next few years, half of the field force, with their deep knowledge of network facilities, will retire, to be replaced by young, inexperienced workers. In some sectors, the situation is dire. We recently chatted with an employee of an Arizona utility who said that 50% of the work force at his firm is eligible to retire this year. This aging work force represents a huge loss of collective intelligence. The challenge for these organizations over the next few years is to transfer the knowledge about the network infrastructure currently resident in the heads of experienced, and soon to retire, field workers into the organization's collective knowledge base. Only then can that collective intelligence be harnessed by all workers, and most critically, younger workers, to improve productivity in the future.

Another critical challenge is interoperability. For example, organizations with an engineering focus typically have islands of technology such as CAD, mobile, GIS, and tabular financial and business systems. Many of these systems are proprietary, often legacy, developed by different vendors, and incompatible with each other. Productivity and efficiency are the business drivers pushing IT organizations to look for ways to break down interoperability barriers.

Open Standards

One of the most important technical advances to provide a foundation for interoperability is open standards. The Web, which has become the world's operating system, is based on standards from the IETF, W3C, and ECMA. Geospatial standards from the Open Geospatial Consortium (OGC) allow the exchange of spatial data. Web applications from the major geospatial vendors are still for the most part proprietary. But there are open source projects that are moving in the direction of an open Web 2.0 platform.

Shortly after the formation of the Open Source Geospatial Foundation (OSGeo), Autodesk released the source code of the Feature Data Objects (FDO) application programming interface (API) and the MapGuide Open Source platform to the open source community. FDO is different from other programming interfaces: it was designed to support the editing and versioning of spatial data. FDO provides consistent access to a large number of spatial data stores including Oracle Spatial, SHP, ArcSDE, SDF, and GDAL/OGR, as well as open standards like KML and WFS. The C++ code for FDO, which is available from OSGeo, has been compiled to run on Windows and Linux. Similarly, the source code for MapGuide Open Source is available from OSGeo.

But there is still the issue of how to expose these applications in a general way on the Web. For example, you can wrap a PHP, JSP, or ASP programming interface around an application with a JavaScript client, but this approach will be different for each Web application. A more general approach is to wrap the C++ code with standards-based Web services. This not only allows client applications to access geospatial data and services in a standard way, but also allows geospatial data and services to be integrated with other Web services using orchestration technologies such as the Business Process Execution Language (BPEL).
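To make the idea of a standards-based wrapper concrete, here is a minimal sketch in Python. The names (`query_parcels`, the `/parcels/` URL path) are hypothetical, and the stub stands in for a real call into a library such as FDO or MapGuide; the point is only that, once wrapped behind plain HTTP, the same interface works for any client.

```python
# Minimal sketch of wrapping a geospatial query behind an HTTP
# interface. query_parcels is a stand-in for a real call into a
# geospatial library; handle_request dispatches (method, path)
# pairs the way a Web services wrapper would.
import json


def query_parcels(city):
    # Placeholder for a call into the underlying spatial data store.
    return [{"id": 1, "city": city}]


def handle_request(method, path):
    """Dispatch an HTTP request to the wrapped geospatial service.

    Returns (status_code, content_type, body)."""
    if method == "GET" and path.startswith("/parcels/"):
        city = path.split("/")[2]
        body = json.dumps(query_parcels(city))
        return 200, "application/json", body
    return 404, "text/plain", "not found"
```

Because the wrapper speaks only HTTP and JSON, the same endpoint could equally be called from a browser, a BPEL orchestration engine, or another Web service.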

Earlier attempts to create a standard framework for distributed computing included CORBA and DCOM, but the complexity of these environments has limited their adoption. Two more recent approaches are: i) the W3C's Simple Object Access Protocol (SOAP) which is supported by application development tool makers such as IBM, BEA Systems, and Microsoft; and ii) REST which has been used by Amazon, Google, and others to create interfaces to their Web services.


The term REST was introduced by Roy Fielding in his Ph.D. dissertation and describes an architectural style for networked systems. REST relies on the simplicity of the HTTP protocol, with data exchange based on XML and MIME types. Because REST uses standard HTTP methods, RESTful applications are not hindered by firewalls. By linking to open source components such as MapGuide, FDO, and other open source libraries, a geospatial Web services framework, tentatively named king.rest, has been developed. It enables a site administrator with no programming experience to deploy HTML, KML, and other representations of geospatial data, together with metadata pages that expose information about the Web services provided by the site in a form that is easily crawlable by search engines and easily understood by anyone wanting to access the data for other applications.
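The essence of the REST style is that the standard HTTP methods act as a uniform interface to resources identified by URLs. As a sketch (the example URL is hypothetical, not a documented king.rest address):

```python
# In REST, a resource is identified by a URL and manipulated through
# the standard HTTP methods. For a hypothetical parcel resource:
RESOURCE = "http://example.org/geo/parcels/1092"

# The four main HTTP methods map onto create/read/update/delete:
HTTP_CRUD = {
    "POST":   "create a new resource under a collection URL",
    "GET":    "retrieve a representation (KML, JSON, PNG, ...) of the resource",
    "PUT":    "replace the resource's stored geometry and attributes",
    "DELETE": "remove the resource",
}
```

Because these four methods are the same ones every browser and proxy already understands, no special middleware is needed on the client side, which is why RESTful services pass through firewalls that block other RPC protocols.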

Using the king.rest framework, the City of Nanaimo has implemented a prototype geospatial web services site where all of the City's public data will be exposed through a single URL. The first incarnation provides read-only access to the city's geospatial data. However, with an FDO data provider that supports geospatial RESTful Web services, any application that supports FDO will have edit capabilities over the Web.

The index is a static HTML page that allows users to search for data. A street index enables Web crawlers to access every property in the city. Geospatial data in different representations, such as KML, JSON, XML, or PNG, can be accessed simply through a URL.
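A sketch of how such representation URLs might be formed, assuming a king.rest-style scheme in which the format is selected by a file extension (the host, path pattern, and helper function here are our assumptions for illustration, not the City of Nanaimo's actual URLs):

```python
# Hypothetical sketch: each representation of a feature is reached by
# appending a format extension to the feature's URL.
BASE = "http://example.org/geo"  # placeholder host, not the real site

# MIME types for the representations mentioned in the article.
MIME = {
    "kml":  "application/vnd.google-earth.kml+xml",
    "json": "application/json",
    "xml":  "application/xml",
    "png":  "image/png",
}


def representation_url(layer, feature_id, fmt):
    """Build the URL for one representation of one feature."""
    if fmt not in MIME:
        raise ValueError("unsupported representation: " + fmt)
    return "%s/%s/%s.%s" % (BASE, layer, feature_id, fmt)
```

Since every representation is just a URL, a Web crawler, a browser, or Google Earth can each fetch the format it understands with a plain HTTP GET.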


RESTful geospatial Web services can provide simple and open access to geospatial data over the Web using FDO and standard Web protocols. Because FDO, unlike many other programming interfaces, was designed specifically to support the editing of spatial data, a RESTful implementation of FDO enables full edit access to geospatial data and provides a Web 2.0 platform that can help address the challenges of the aging work force and interoperability.
