South African Computer Journal - Volume 2009, Issue 43, 2009
Source: South African Computer Journal 2009, pp 1–2 (2009)
The current issue of SACJ includes a selection of papers from the Proceedings of the academic track of the 2008 Free and Open Source Software for Geospatial (FOSS4G 2008) Conference, incorporating the Geo-Information Society of South Africa (GISSA) 2008 Conference, held in Cape Town from 29 September to 4 October 2008 (Coetzee et al., 2008). The Open Source Geospatial Foundation (OSGeo, http://www.osgeo.org/) has organized annual FOSS4G conferences since 2006. OSGeo is a not-for-profit organization with the mission of supporting and promoting the collaborative development of open geospatial technologies and data. The FOSS4G conferences bring together the people who create, use, and support open geospatial software. In a change from the norm, FOSS4G 2008 incorporated one of the biggest GIS events in South Africa, the biennial GISSA conference, providing an opportunity for GISSA members to network with geospatial professionals from around the world. The conference was the first major open source geospatial event on the African continent and hence had the theme: Open Source Geospatial: An Option for Developing Nations.
Source: South African Computer Journal 2009, pp 3–16 (2009)
Sensor Web Enablement (SWE) is a consistent standardization effort of the Open Geospatial Consortium (OGC) to cope with an environment of pervasive sensor networks. The standards suite is web service centric and defines a number of XML schemas to be used in information exchange. Here we report on an experiment to replace a conventionally programmed web service with one that uses a semantic network as its backend.
Source: South African Computer Journal 2009, pp 17–24 (2009)
Geographic Information Systems (GIS) enable users to capture, process and visualize geodata. In the age of the web and internet technologies, GIS are moving from monolithic systems to distributed, web-based infrastructures. These so-called Spatial Data Infrastructures (SDIs) allow users to exchange their geodata across organizational and technical boundaries. SDIs are technically based on Web Service technologies. In the context of GIS, several Web Service standards have been created, which address the requirements of GIS in particular. However, the movement towards SDIs is still ongoing, and the current challenge is to integrate existing geoprocessing functionality of GIS, such as generalization of complex geometric data structures or advanced analysis of satellite images, into SDIs. Integrating such existing GIS functionality into SDIs unlocks the full potential of GIS on the web and prevents reimplementation. In this context the Open Geospatial Consortium (OGC) Web Processing Service (WPS) interface specification plays an important role, as it describes a Web Service standard for publishing geoprocessing functionality on the web. Thus, wrapping existing GIS functionality behind a WPS interface is an effective approach. This so-called wrapping of functionality is demonstrated using the example of the popular Open Source desktop GIS GRASS. Additionally, the exposed functionality is applied to a complex geoprocessing scenario, which is realized as a Web Services chain using a BPEL-based approach (BPEL - Business Process Execution Language). In particular, the paper demonstrates a scenario that monitors droughts in a Sub-Saharan region from multispectral satellite images. The GRASS-enabled WPS has been developed by the Geoprocessing Community (www.52north.org/wps) at the 52°North initiative. The applied software and the implemented approach are available under an Open Source software license.
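A WPS-wrapped process such as the one described above is typically invoked via an HTTP Execute request; the WPS 1.0.0 specification defines a key-value-pair (KVP) encoding for this. The sketch below builds such a request URL. The endpoint and process identifier are hypothetical, chosen only for illustration; the 52°North GRASS-enabled WPS would advertise its own process names via GetCapabilities.

```python
# Sketch of an OGC WPS 1.0.0 Execute request in KVP (key-value-pair) encoding.
# Endpoint and process identifier below are hypothetical placeholders.
from urllib.parse import urlencode

def build_wps_execute_url(endpoint, process_id, inputs):
    """Build a WPS 1.0.0 Execute request URL in KVP encoding."""
    # DataInputs are semicolon-separated "name=value" pairs per the spec.
    datainputs = ";".join(f"{k}={v}" for k, v in inputs.items())
    params = {
        "service": "WPS",
        "request": "Execute",
        "version": "1.0.0",
        "identifier": process_id,
        "datainputs": datainputs,
    }
    return endpoint + "?" + urlencode(params)

url = build_wps_execute_url(
    "http://example.org/wps",                 # hypothetical WPS server
    "grass.raster.ndvi",                      # hypothetical process name
    {"red": "band3.tif", "nir": "band4.tif"}, # hypothetical inputs
)
print(url)
```

A BPEL engine can chain several such Execute calls, feeding one process's output reference into the next process's DataInputs, which is essentially how the drought-monitoring scenario above composes its workflow.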
Source: South African Computer Journal 2009, pp 25–34 (2009)
Mobile GISs allow users to move through and explore the territory by means of maps and geo-database representations. The mobile GIS proposed here goes further, because it combines pure GIS capabilities with real-time positioning and context awareness in a scenario from which many application fields (public safety, planning, public works, fleet monitoring, recreation) can benefit. NAMGIS (Nomadic Adaptive Mobile GIS) is a context-aware Web GIS specifically designed for small-screen, CPU-limited hand-held devices.
The main components of NAMGIS are NAMGIS Core, the mobile GIS, which is an extension of the University of Minnesota MapServer, and SAF (Situation Aware Framework), the context-aware framework that supports the acquisition and processing of data describing a user's "context". Context data include GPS measurements, used to track the user's position and adapt the contrast/brightness level of the interface in outdoor scenarios, and RFID (Radio Frequency Identification) data, used to detect the user's proximity to features equipped with RFID tags.
The two components can also be used independently and are released under Open Source licenses (LGPL for SAF, GPL for NAMGIS Core). The paper presents the NAMGIS prototype and two examples from archaeological contexts.
Sensitivity analysis of Voronoi-based sensor deployment and reconfiguration algorithms : research article
Source: South African Computer Journal 2009, pp 35–43 (2009)
This study examines the effects of location inaccuracies on two movement-assisted Voronoi-based sensor deployment and reconfiguration algorithms, VEC and VOR, due to Wang et al. To examine the extent to which the deployment and reconfiguration algorithms are capable of reducing coverage holes, a simulation environment was set up using a custom-designed simulation tool. By integrating the environment with that of a GIS application, real-world distances and scaling can be applied, allowing the algorithms to be assessed in a virtual world mimicking a real-world deployment.
The simulation results suggest the VOR algorithm is reasonably robust if the location inaccuracies are somewhat lower than the sensing distance, and also if a high degree of inaccuracy is limited to a relatively small percentage of the nodes. The VEC algorithm is considerably less robust, but prevents nodes from drifting beyond the boundaries in the case of large inaccuracies.
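The central quantity such a simulator measures is the coverage achieved by a sensor set, with uncovered regions constituting coverage holes. The following is a minimal sketch of that measurement, not the authors' simulator: it samples a rectangular field on a grid and reports the fraction of sample points within sensing range of at least one sensor. All coordinates and radii are illustrative.

```python
import math

def coverage_fraction(sensors, sensing_radius, width, height, step=1.0):
    """Fraction of grid sample points within sensing_radius of any sensor.

    sensors: list of (x, y) positions; the field is [0, width] x [0, height].
    """
    covered = total = 0
    y = 0.0
    while y <= height:
        x = 0.0
        while x <= width:
            total += 1
            # A point is covered if any sensor can sense it.
            if any(math.hypot(x - sx, y - sy) <= sensing_radius
                   for sx, sy in sensors):
                covered += 1
            x += step
        y += step
    return covered / total

# Two sensors partially covering a 10 x 10 field; the rest is a coverage hole.
frac = coverage_fraction([(2.5, 5.0), (7.5, 5.0)], 3.0, 10.0, 10.0)
print(f"coverage: {frac:.2%}")
```

Running the measurement before and after a deployment algorithm moves the nodes, and again after perturbing the reported node positions, gives exactly the kind of sensitivity comparison between VEC and VOR that the study performs.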
Using supply chain management to enable GIS units to improve their response to their customers' needs : research article
Author: Peter Schmitz
Source: South African Computer Journal 2009, pp 44–57 (2009)
One of the challenges, according to Dangermond (1999) and Tomlinson (2000), for GIS units in the 21st century is to be able to manage themselves successfully in order to deliver the right product at the right time to the right customer. Tomlinson (2000) also indicated that there will be several management tools available to assist GIS units to enable them to manage themselves successfully. This paper discusses such a management tool, namely supply chain management. Supply chain management is used in the manufacturing industry to manage the production of motor vehicles, electronic goods and pharmaceutical products. A supply chain consists of suppliers to a firm, the firm itself and customers at the other end of the chain. Goods flow up the chain from the suppliers, through the firm to the customers. Information and money flow up and down this chain. The management of this supply chain is known as supply chain management. The aim of supply chain management is to manage the suppliers, manufacturers and customers as a single entity to ensure that the whole chain is competitive at the least cost to the whole chain. Goods in the context of this paper are spatial data sets (GIS products) that have been produced by a GIS unit. In order to produce a GIS product, a GIS unit needs to source spatial and non-spatial data from various suppliers, verify the data, manipulate and transform the data into a new GIS product, test and validate the new GIS product and deliver it to the customer. ESI-GIS at Eskom is used as an example to demonstrate this management culture.
Factors leading to success or abandonment of open source commons : an empirical analysis of Sourceforge.net projects : research article
Source: South African Computer Journal 2009, pp 58–65 (2009)
Open source software is produced cooperatively by groups of people who work together via the Internet. The software produced usually becomes the "common property" of the group and is freely distributed to anyone in the world who wants to use it. Although it may seem unlikely, open source collaborations, or "commons," have grown phenomenally to become economically and socially important. But what makes open source commons succeed at producing something useful, or alternatively, what makes them become abandoned before achieving success? This paper reviews the theoretical foundations for understanding open source commons and briefly describes our statistical analysis of over one hundred thousand open source projects. We have found that leadership, clear vision and software utility may be causes of success early in a project's lifetime, and that building a community of software developers and users around an increasingly useful software product appears to be key to success for most projects later in their lifetimes.
A comparison of data file and storage configurations for efficient temporal access of satellite image data : research article
Source: South African Computer Journal 2009, pp 66–74 (2009)
Satellite data volumes have seen a steady increase in recent years due to improvements in sensor technology and increases in data acquisition frequency. The gridded MODIS data products, spanning a region of interest of approximately 10° by 10° for a single tile, are stored as images containing almost six million pixels, with data in multiple spectral bands for each pixel. Time series analysis of a sequence of such images in order to perform automated change detection is a topic of growing importance. Traditional storage formats store such a series of images as a sequence of individual files, with each file internally storing the pixels in their spatial order. Consequently, the construction of a time series profile of a single pixel requires reading from several hundred large files, resulting in substantial performance overheads that severely constrain high-throughput analyses. We aim to minimize this performance limitation by restructuring the storage scheme for typical satellite imagery as temporal sequences in order to reduce overheads and improve throughput. Models are developed to compute the expected query time for both the time-sequential and the traditional image-based representations. These models are used to demonstrate the benefits of using a time-sequential representation. Four data structures (using the Hierarchical Data Format (HDF5), the Network Common Data Format (netCDF) and a native file system approach) are implemented and compared in a series of experimental read tests to determine which format is most appropriate for implementation in the CSIR Cluster Computing Centre's facilities.
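The performance gap between the two representations comes down to access locality, which can be illustrated independently of any particular file format. The sketch below (hypothetical sizes, not the paper's model) lists the element offsets that must be read to assemble one pixel's time series under each layout: the traditional image-major layout scatters the reads millions of elements apart across the archive, while a time-major layout makes them one contiguous run.

```python
def pixel_series_offsets(n_images, n_pixels, pixel_index, layout):
    """Element offsets (not bytes) needed to read one pixel's time series."""
    if layout == "image_major":
        # Traditional layout: whole images stored one after another,
        # so successive observations of a pixel are n_pixels apart.
        return [t * n_pixels + pixel_index for t in range(n_images)]
    elif layout == "time_major":
        # Restructured layout: all observations of a pixel stored together,
        # so the whole series is one contiguous run.
        start = pixel_index * n_images
        return list(range(start, start + n_images))
    raise ValueError(f"unknown layout: {layout}")

T, P = 4, 6_000_000  # illustrative: 4 images of ~6 million pixels each
image_major = pixel_series_offsets(T, P, pixel_index=0, layout="image_major")
time_major = pixel_series_offsets(T, P, pixel_index=0, layout="time_major")
print(image_major)  # widely separated offsets -> one scattered read per image
print(time_major)   # contiguous offsets -> a single sequential read
```

With several hundred images rather than four, the image-major layout forces one seek-and-read per file per pixel, which is the overhead the chunked HDF5/netCDF and native-file reorganizations compared in the paper are designed to eliminate.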