Even though one may think that software is just a tool, this particular tool is paramount when it comes to dealing with the huge amount of information contained in remote sensing images. Fortunately, there are a number of commercial tools to choose from: just search Google for "remote sensing image software" and you will see.
These commercial packages offer many interesting functionalities for processing images, and they come with support from their vendors. They are therefore a good choice for many users.
There are also free software packages: free in the sense of gratis, but also free in the sense of allowing you to access the source code and modify it to suit your needs. I will not try to advocate here for the use of free software, even if my preference goes that way.
However, when new types of sensors become available, the end user suffers from a long delay between data availability and software support for the new type of data. I am not talking about image formats or metadata access: software vendors are usually proactive and provide this kind of functionality rather quickly. I am talking about the algorithms and techniques developed in research labs which make it possible to exploit the specificities of the new data.
The usual way of proceeding is the following: a lab develops an algorithm and writes some code in order to validate it on test data; then comes the publication, in the form of a journal article. After that, software vendors have the tough work -- and responsibility -- of deciding what may be of interest to users, implementing it, and validating the new algorithms on large amounts of data before making them available. This procedure is long and expensive and constitutes a significant risk for software vendors.
In many research fields -- other than geoscience and remote sensing -- scientists are eager to share data and software in order to cross-validate and compare different approaches (see the Public Library of Science or the Insight Journal, for instance). This means that, together with the publication, the source code and the data used for it are made available under licenses which allow their use, modification, and distribution.
This approach allows a rapid evolution from concepts and prototypes to operational systems available to end users. This is not a bad thing for software vendors either, since they have found ways to integrate this kind of free software into their commercial packages (via plugins, external modules, etc.). It also opens up markets linked to education, application tuning for specific users, and so on.
Of course, scientists in labs are not professional software developers, so the code they make available needs to be improved and optimized. Free software allows people to inspect how the code is written, so anybody can find bugs and correct them. Furthermore, in many fields, software vendors themselves contribute to free software development, because they need this approach in order to provide interesting functionalities to their users. Some of them even sponsor initiatives such as the OSGeo Foundation.
In the field of remote sensing, most free software projects are related to data access and pre-processing steps (see here for a list of projects), and there are few projects whose goal is to provide tools for information extraction. This is no doubt due to the fact that these tasks may be specific to each application and difficult to automate. At the other end of the information production chain, there are lots of free software tools for mapping, GIS, etc.
At CNES, within the framework of a program called ORFEO, which aims at helping future users of the Pléiades and Cosmo-Skymed systems get ready to use the data, we are developing a set of algorithmic components that capitalize on methodological know-how, following an incremental approach that builds on the results of methodological research.
The ORFEO Toolbox (OTB) is free software and can be downloaded here.
OTB is based on the medical image processing library ITK and offers specific functionalities for remote sensing image processing in general and for high spatial resolution images in particular.
Currently, OTB provides:
- image access: optimized read/write access for most remote sensing image formats, metadata access, simple visualization;
- filtering: blurring, denoising, enhancement;
- feature extraction: interest points, alignments, lines;
- image segmentation: region growing, watershed, level sets;
- classification: K-means, SVM, Markov random fields, Kohonen maps;
- multiscale analysis;
- spatial reasoning;
- change detection.
Many of these functionalities are provided by ITK and have been tested and documented for use with remote sensing data. We also use other libraries such as GDAL and OSSIM.
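To give a flavour of what this looks like in practice, here is a minimal sketch of an OTB processing pipeline which reads an image, applies a simple mean filter (one of the denoising options listed above), and writes the result. It follows the standard ITK pipeline style; the class names match the OTB/ITK releases I know of, but the exact details may evolve between versions.

    #include "otbImage.h"
    #include "otbImageFileReader.h"
    #include "otbImageFileWriter.h"
    #include "itkMeanImageFilter.h"

    int main(int argc, char* argv[])
    {
      if (argc < 3)
        return 1; // usage: program input_image output_image

      // 2-D image with unsigned short pixels, typical for optical data
      typedef otb::Image<unsigned short, 2>              ImageType;
      typedef otb::ImageFileReader<ImageType>            ReaderType;
      typedef otb::ImageFileWriter<ImageType>            WriterType;
      typedef itk::MeanImageFilter<ImageType, ImageType> FilterType;

      // The reader deduces the file format from the file itself
      ReaderType::Pointer reader = ReaderType::New();
      reader->SetFileName(argv[1]);

      // Simple denoising: mean filter over a 3x3 neighbourhood
      FilterType::Pointer filter = FilterType::New();
      ImageType::SizeType radius;
      radius.Fill(1);
      filter->SetRadius(radius);
      filter->SetInput(reader->GetOutput());

      // Plugging in the writer completes the pipeline; Update() runs it
      WriterType::Pointer writer = WriterType::New();
      writer->SetFileName(argv[2]);
      writer->SetInput(filter->GetOutput());
      writer->Update();

      return 0;
    }

The same pattern -- instantiate, connect inputs to outputs, call Update() -- applies to all the filters listed above, so swapping the mean filter for a segmentation or classification filter does not change the structure of the program.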