Wednesday, August 27, 2008

New sensors, new missions, new challenges in VHRRS

Very high spatial resolution Earth observation systems are becoming available in both the active and the passive sensor fields: QuickBird, WorldView, TerraSAR-X, COSMO-SkyMed, Pléiades, etc.

Although spatial resolution is catching the attention of many end users, there are other dimensions in which resolution is increasing and giving access to new kinds of information: the spectral dimension and the temporal one.

The spectral dimension is usually limited to the visible plus the near infrared, but more and more systems are extending these capabilities to superspectral (fewer than 20 bands) and hyperspectral sampling. An analogy can be drawn between the increase in the number of spectral bands in optical systems and the increased polarimetric capabilities of SAR systems.

In the temporal dimension, thanks to constellations or to specific orbital configurations, revisit times of one day or even less are becoming available. The monitoring of quickly evolving phenomena will therefore be possible.

These new sensors have been developed in order to fulfill user requirements in terms of applications. However, new applications and uses will be imagined by remote sensing scientists, as has been the case in the past (the use of ERS-1 for SAR interferometry, the water color products derived from SPOT-Vegetation, etc.).

Beyond the use of a given sensor or system for applications for which it has not been designed, the great amount of newly available data with high resolution in different dimensions (spatial, spectral, temporal) allows us to imagine new synergies between different types of data.

Finally, the availability of spatial and geographical information is steadily increasing. From professional and commercial databases to free (Google Earth, Virtual Earth, SRTM) and even open ones (OpenStreetMap, GeoNames), many sources of high quality data are available to users for fusion and synergy with remotely sensed data.

On Monday, September 8th, I will be giving a talk about this subject at the International Summer School on Very High Resolution Remote Sensing in Grenoble, France.

The talk will give a detailed overview of very high resolution remote sensing in its three dimensions (spatial, temporal, spectral). New sensors and upcoming missions and systems will be presented. The new challenges in terms of image processing, data synergies, potential applications and end-user expectations will be analyzed.

Saturday, July 26, 2008

ORFEO Toolbox (OTB) on YouTube

Emmanuel Christophe has produced a very interesting video about ORFEO Toolbox.

The animation has been generated using an open source tool called Code Swarm (http://vis.cs.ucdavis.edu/~ogawa/codeswarm/). You can watch it here:


This video shows OTB's history in terms of changes to the source code. Each dot represents a file gravitating around the developer who edited it.

It is interesting to note the increase in activity during the weeks before a release and the decrease during the summer holidays (the development team lives in France ...).

Does anybody want to join the adventure?

Saturday, July 19, 2008

Summer School in Very High Resolution Remote Sensing

You may be interested in the upcoming 2008 Summer School on Very High
Resolution Remote Sensing, which will be held in Grenoble, France, over a
week (September 8-12).


I will be giving a talk about "New sensors, new missions and new challenges in very high resolution remote sensing imagery".

It is a pleasure to have been invited by the organizing committee, led by Jocelyn Chanussot, to participate alongside international experts such as:
  • Lorenzo Bruzzone, University of Trento, Italy
  • Paolo Gamba, University of Pavia, Italy
  • Antonio Plaza, University of Extremadura, Spain
  • Kostas Papathanassiou, DLR, Germany
  • Irena Hajnsek, DLR, Germany

CNES will take care of organizing lab sessions using the ORFEO Toolbox
(an open source library for remote sensing image processing algorithms developed by CNES).

You can find more information about the program and the registration procedure here.

OTB Live CD World Release

Last week, two of the main international remote sensing symposiums took place: IGARSS (in Boston) and ISPRS (in Beijing).

The OTB team seized the opportunity, and 100 live CDs with the latest version of OTB were distributed in Boston and Beijing. At the end of the week, we still had some unhappy people asking for their copy when there were no CDs left!

We hope that the Live CD will allow people to test OTB easily and that they will find it useful for remote sensing image processing.

Saturday, November 3, 2007

OTB 1.6 is available

It's been a long time since I wrote my last post. This is mainly because I have been very busy with the new release of the ORFEO Toolbox.

ORFEO Toolbox (OTB) is an open source library of image processing
algorithms. OTB is based on the medical image processing library ITK and
offers particular functionalities for remote sensing image processing in
general and for high spatial resolution images in particular. OTB is
distributed under a free software licence, CeCILL (similar to the GPL), to
encourage contribution from users and to promote reproducible research.

You can go to <http://otb.cnes.fr> and follow the download link there.

With this new release, we have started to seriously pave the road
towards OTB 2.0. As you will notice, the OTB Development Team has worked
very hard in order to make available plenty of very interesting things:

- Geometric sensor modeling, cartographic projections and SRTM/DTED DEM
handling using OSSIM.
- Vegetation indices plus encapsulation of the 6S Radiative Transfer
Code, which will be used in OTB 2.0 for radiometric calibration.
- Integration of external contributions: Bayesian image fusion (J.
Radoux, UCL), Stochastic Expectation Maximization and user defined SVM
kernels (G. Mercier, ENST-Bretagne).
- Several new demo applications: road network extraction, interactive
registration, ortho-registration and enhanced viewer.


The detailed documentation, which now includes a tutorial section, can
also be downloaded from <http://otb.cnes.fr>. The PDF file is 540 pages
long, so think twice before printing it (then, choose not to print it).

For details on what is new in this release, see below:

-----------------------------------------
OTB-v.1.6.0 - Changes since version 1.4.0 (2007/10/25)
-----------------------------------------

*Applications:

- Added the otbImageViewerManager application, which makes it
possible to open multiple images, configure viewers and link
displays.

- Added the otbRoadExtraction application, which demonstrates
the road extraction algorithm implemented in the
FeatureExtraction module.

- Added the otbOrthoRectifAppli application, which makes it
possible to ortho-rectify images from the command line using
the brand new Projections module of the Orfeo ToolBox. The old
ortho-rectification application has been moved to
otbPseudoOrthoRectif.

- Added an option in CMakeLists.txt to enable or disable the
use of VTK (enabling or disabling the following application).

- Added the Pireo registration application (VTK needed).

*Radiometry:

- The 6S Radiative Transfer Code compiles within OTB.
- Added the Radiometry directory, containing everything that
has to do with image radiometry.
- Added the NDVI and ARVI (3 input bands) vegetation index
filters.
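
As an illustration of what such a vegetation index filter computes,
here is a minimal sketch written with a hand-rolled functor on top of
the standard ITK binary functor filter, since the changelog does not
give the exact OTB class names:

    #include "otbImage.h"
    #include "otbImageFileReader.h"
    #include "otbImageFileWriter.h"
    #include "itkBinaryFunctorImageFilter.h"

    // Pixel-wise NDVI = (NIR - R) / (NIR + R).
    template <class TInput, class TOutput>
    struct NDVIFunctor
    {
      TOutput operator()(const TInput& r, const TInput& nir) const
      {
        const double sum = static_cast<double>(nir) + static_cast<double>(r);
        if (sum == 0.0)
          return static_cast<TOutput>(0);  // avoid division by zero
        return static_cast<TOutput>((static_cast<double>(nir) - r) / sum);
      }
      bool operator!=(const NDVIFunctor&) const { return false; }
      bool operator==(const NDVIFunctor& o) const { return !(*this != o); }
    };

    int main()
    {
      typedef otb::Image<double, 2>           ImageType;
      typedef otb::ImageFileReader<ImageType> ReaderType;
      typedef otb::ImageFileWriter<ImageType> WriterType;
      typedef itk::BinaryFunctorImageFilter<ImageType, ImageType,
          ImageType, NDVIFunctor<double, double> > NDVIFilterType;

      ReaderType::Pointer red = ReaderType::New();   // red band (placeholder file)
      red->SetFileName("red.tif");
      ReaderType::Pointer nir = ReaderType::New();   // near infrared band
      nir->SetFileName("nir.tif");

      NDVIFilterType::Pointer ndvi = NDVIFilterType::New();
      ndvi->SetInput1(red->GetOutput());
      ndvi->SetInput2(nir->GetOutput());

      WriterType::Pointer writer = WriterType::New();
      writer->SetFileName("ndvi.tif");
      writer->SetInput(ndvi->GetOutput());
      writer->Update();  // triggers the whole pipeline
      return 0;
    }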

*Projections:

- Added the Projections directory, containing everything that
has to do with image projections.

- Added an otb::DEMHandler to fetch the elevation value in
SRTM/DTED directories (see the sketch at the end of this
section).

- Added an otb::DEMToImageGenerator to generate an elevation
map.

- Added an otb::OrthoRectificationFilter to perform
orthorectification of geo-referenced images.

- Added the forward and inverse sensor model projection.

- Added several map projection transforms (Eckert4,
LambertConformalConic, TransMercator, Mollweid, Sinusoidal,
UTM).
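
To give an idea of how the DEM classes fit together, here is a
hypothetical elevation query with the otb::DEMHandler named above; the
method names (OpenDEMDirectory, GetHeightAboveMSL) follow the OTB
examples of that era, but the exact signatures should be taken as
assumptions:

    #include <iostream>
    #include "itkPoint.h"
    #include "otbDEMHandler.h"

    int main()
    {
      // Point the handler at a directory containing SRTM or DTED
      // tiles (hypothetical path).
      otb::DEMHandler::Pointer demHandler = otb::DEMHandler::New();
      demHandler->OpenDEMDirectory("/data/SRTM");

      // Geographic position to query (longitude, latitude, in degrees).
      itk::Point<double, 2> geoPoint;
      geoPoint[0] = 1.44;
      geoPoint[1] = 43.60;

      std::cout << "Elevation: "
                << demHandler->GetHeightAboveMSL(geoPoint) << " m"
                << std::endl;
      return 0;
    }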

*Fusion:

- Added the Fusion directory, containing everything that has to
do with image fusion.

- Added the otb::BayesianFusionImageFilter, a pan-sharpening
filter whose algorithm has been kindly contributed by Julien
Radoux.


*Learning:

- Added methods to access the alpha values, the number of
support vectors, the support vectors themselves, and the
distances to the hyperplanes.

- Added the otb::SEMClassifier, implementing the Stochastic
Expectation Maximization algorithm to perform an estimation of
a mixture model.

*ChangeDetection:

- Added the otb::KullbackLeiblerDistanceImageFilter to compute
the Kullback-Leibler distance between two images (see the
sketch at the end of this section).

- Added the otb::KullbackLeiblerProfileImageFilter to perform a
multi-scale change detection using the Kullback-Leibler
distance.
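
Here is the hypothetical usage sketch announced above for the first of
these two filters. Only the class name comes from the changelog; the
two-input setters and the radius setter follow the usual ITK
conventions for binary neighborhood filters and are assumptions:

    #include "otbImage.h"
    #include "otbImageFileReader.h"
    #include "otbImageFileWriter.h"
    #include "otbKullbackLeiblerDistanceImageFilter.h"

    int main()
    {
      typedef otb::Image<double, 2>           ImageType;
      typedef otb::ImageFileReader<ImageType> ReaderType;
      typedef otb::ImageFileWriter<ImageType> WriterType;
      typedef otb::KullbackLeiblerDistanceImageFilter<ImageType,
          ImageType, ImageType>               KLFilterType;

      ReaderType::Pointer before = ReaderType::New();  // image at date 1
      before->SetFileName("date1.tif");
      ReaderType::Pointer after = ReaderType::New();   // image at date 2
      after->SetFileName("date2.tif");

      KLFilterType::Pointer change = KLFilterType::New();
      change->SetInput1(before->GetOutput());
      change->SetInput2(after->GetOutput());
      change->SetRadius(3);  // assumed setter: local estimation window

      WriterType::Pointer writer = WriterType::New();
      writer->SetFileName("change_map.tif");
      writer->SetInput(change->GetOutput());
      writer->Update();
      return 0;
    }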


*MultiScale:

- Various name changes and bugfixes in the morphological
pyramid segmentation classes.

*BasicFilters:

- Added the StreamingVectorStatisticsImageFilter to compute
the second order statistics of a large vector image.

- Added the MatrixTransposeMatrixImageFilter to compute
the product of the matrix of vector pixels from image 1
(as rows) with the matrix of vector pixels from image 2
(as columns), for large vector images.

- Added the otb::VectorImageTo3DScalarImageFilter which
transforms a vector image into a 3D scalar image where each
band is represented in a layer of the 3rd dimension.

- Added the otb::ImageListToVectorImageFilter and
otb::VectorImageToImageListFilter to convert a vector image
from/to an image list.

- Added the otb::ImageListToImageListApplyFilter, which applies a
given scalar image filter to a list of images.

- Added the otb::PerBandImageFilter, which applies a given
scalar filter to each band of a VectorImage. This is not the
optimal approach for most processing tasks, but it allows the
use of almost every scalar filter on vector images (see the
sketch at the end of this section).

- Added the otb::StreamingResampleImageFilter, which is a
streaming-capable version of the itk::ResampleImageFilter.
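
Here is the sketch announced in the otb::PerBandImageFilter item above:
applying a scalar mean filter to every band of a vector image. The
template parameters and the SetFilter() call are assumptions based only
on the class name, not a checked signature:

    #include "otbImage.h"
    #include "otbVectorImage.h"
    #include "otbImageFileReader.h"
    #include "otbImageFileWriter.h"
    #include "otbPerBandImageFilter.h"  // assumed header name
    #include "itkMeanImageFilter.h"

    int main()
    {
      typedef otb::VectorImage<double, 2>              VectorImageType;
      typedef otb::Image<double, 2>                    BandType;
      typedef itk::MeanImageFilter<BandType, BandType> SmoothingType;
      typedef otb::PerBandImageFilter<VectorImageType,
          VectorImageType, SmoothingType>              PerBandType;

      typedef otb::ImageFileReader<VectorImageType> ReaderType;
      typedef otb::ImageFileWriter<VectorImageType> WriterType;

      ReaderType::Pointer reader = ReaderType::New();
      reader->SetFileName("multispectral.tif");

      // The same scalar smoothing filter is applied to each band in turn.
      PerBandType::Pointer perBand = PerBandType::New();
      perBand->SetFilter(SmoothingType::New());  // assumed setter
      perBand->SetInput(reader->GetOutput());

      WriterType::Pointer writer = WriterType::New();
      writer->SetFileName("smoothed.tif");
      writer->SetInput(perBand->GetOutput());
      writer->Update();
      return 0;
    }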



*Common:

- Added an otb::Polygon, which represents a closed polyline on
which intersection or point interiority can be tested.
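
For readers unfamiliar with the point interiority test, the classical
technique is ray casting: count how many polygon edges a horizontal ray
from the point crosses; an odd count means the point is inside. The
snippet below is a plain standalone version of that standard technique,
not the otb::Polygon API:

    #include <cstddef>
    #include <utility>
    #include <vector>

    typedef std::pair<double, double> Point;  // (x, y)

    // Returns true if p is inside the closed polygon defined by the
    // vertices (ray casting: odd number of edge crossings => inside).
    bool IsInside(const std::vector<Point>& poly, const Point& p)
    {
      bool inside = false;
      const std::size_t n = poly.size();
      for (std::size_t i = 0, j = n - 1; i < n; j = i++)
      {
        // Does the edge (poly[j], poly[i]) cross the horizontal ray?
        const bool crosses =
          ((poly[i].second > p.second) != (poly[j].second > p.second)) &&
          (p.first < (poly[j].first - poly[i].first) *
                         (p.second - poly[i].second) /
                         (poly[j].second - poly[i].second) +
                     poly[i].first);
        if (crosses)
          inside = !inside;
      }
      return inside;
    }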

*IO:

- Added an otb::DEMHandler to fetch the elevation value in
SRTM or DTED directories.

- Added an otb::DEMToImageGenerator to generate an elevation
map.

- Added a new tiling streaming mode.

- Added the otb::ImageGeometryHandler, which allows seamless
handling of the image geometry information.

- Fixed a bug in the otb::MSTARImageIO.





*Documentation:

- Added various documented examples in the SoftwareGuide for
the new classes.

- Added a Tutorial section in the SoftwareGuide.

*Utilities:

- Added the 6S library which will soon play a role in the
radiometry module.

- Updated the internal version of ITK to 3.4.0.

*Platforms:

- Fixed the random segfault of
otbInteractiveChangeDetectionAppli under Visual 8.0.



-------------------------------------------------------------------

Do not hesitate to use the users' list
<http://groups.google.com/group/otb-users> if you need support, or
contact us directly at <otb@cnes.fr>.

Enjoy OTB!

Friday, May 25, 2007

Image analysis software

In the field of remote sensing image analysis, there are three main components which have to be combined in order to achieve good results: data, software and thematic expertise.

Even though one may think that software is just a tool, this particular tool is paramount when it comes to dealing with the huge amount of information contained in the images. Fortunately, there exist a number of commercial tools to choose from. Just try searching on Google for "remote sensing image software" and you will see.

These commercial packages offer a set of functionalities allowing users to perform very interesting processing tasks on images. They also offer support from their vendors. Therefore, they are a good choice for many users.

There also exist free software packages: free in the sense of gratis, but also free in the sense of allowing you to access the source code and modify it to suit your needs. I will not try here to advocate for the use of free software, even if that is where my preference goes.

However, when new types of sensors become available, the end user suffers from a long delay between data availability and software support for the new type of data. I am not talking about image formats or metadata access; software vendors are usually proactive and provide these kinds of functionalities rather quickly. I am talking about algorithms and techniques developed in research labs which make it possible to exploit the specificities of the new data.

The usual way of proceeding is the following: a lab develops an algorithm and writes some code in order to validate it on test data; then comes the publication in the form of a journal article. After that, software vendors have the tough work -- and responsibility -- of deciding what can be interesting for users, coding it, and validating the new algorithms on large amounts of data before making them available to their users. This procedure is long and expensive and constitutes an important risk for software vendors.

In many research fields -- other than geoscience and remote sensing -- scientists are eager to share data and software in order to validate and compare different approaches (see the Public Library of Science or the Insight Journal, for instance). That means that, together with the publication, the source code and the data used for it are made available under licences which allow their use, modification and distribution.

This approach allows a rapid evolution from concepts and prototypes to operational systems available to end users. This is not a bad thing for software vendors, since they have found ways to integrate this kind of free software into their commercial packages (via plugins, external modules, etc.). It also opens up a market linked to education, application tuning for specific users, etc.

Of course, scientists in labs are not professional software developers, so the code they make available needs to be improved and optimized. Free software allows people to look at how the code is written, so anybody can find bugs and correct them. Furthermore, in many fields, software vendors themselves contribute to free software development, because they need this approach in order to provide interesting functionalities to their users. Some of them even sponsor initiatives such as the OSGeo Foundation.

In the field of remote sensing, most free software projects are related to data access and pre-processing steps (see here for a list of projects), and there are few projects whose goal is to provide tools for information extraction. This is no doubt due to the fact that these tasks may be specific to each application and difficult to automate. At the other end of the information production chain, there are lots of free software tools for mapping, GIS, etc.

At CNES, in the frame of a program called ORFEO, which aims at helping future users of the Pléiades and COSMO-SkyMed systems get ready to use the data, we are developing a set of algorithmic components which make it possible to capitalize on the methodological know-how, and therefore to use an incremental approach to profit from the results of methodological research.

The ORFEO Toolbox (OTB) is free software and can be downloaded here.

OTB is based on the medical image processing library ITK and offers particular functionalities for remote sensing image processing in general and for high spatial resolution images in particular.

Currently, OTB provides:

- image access: optimized read/write access for most remote sensing
image formats, meta-data access, simple visualization;
- filtering: blurring, denoising, enhancement;
- feature extraction: interest points, alignments, lines;
- image segmentation: region growing, watershed, level sets;
- classification: K-means, SVM, Markov random fields, Kohonen's map;
- multiscale analysis;
- spatial reasoning;
- change detection.

Many of these functionalities are provided by ITK and have been tested
and documented for use with remote sensing data. We also use other libraries, such as GDAL and OSSIM.
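
To give a flavour of what an OTB program looks like, here is a minimal
pipeline in the ITK style: read an image, filter it, write the result.
The file names are placeholders and the smoothing filter comes from
ITK:

    #include "otbImage.h"
    #include "otbImageFileReader.h"
    #include "otbImageFileWriter.h"
    #include "itkMeanImageFilter.h"

    int main()
    {
      typedef otb::Image<unsigned short, 2>              ImageType;
      typedef otb::ImageFileReader<ImageType>            ReaderType;
      typedef otb::ImageFileWriter<ImageType>            WriterType;
      typedef itk::MeanImageFilter<ImageType, ImageType> FilterType;

      ReaderType::Pointer reader = ReaderType::New();
      reader->SetFileName("scene.tif");

      FilterType::Pointer filter = FilterType::New();
      filter->SetInput(reader->GetOutput());

      ImageType::SizeType radius;
      radius.Fill(1);            // 3x3 smoothing window
      filter->SetRadius(radius);

      WriterType::Pointer writer = WriterType::New();
      writer->SetFileName("scene_smoothed.tif");
      writer->SetInput(filter->GetOutput());
      writer->Update();          // triggers the whole pipeline
      return 0;
    }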

We have some tens of users and we expect that the toolbox will receive contributions from researchers and end users.

Tuesday, May 22, 2007

Multi-resolution change detection

Yesterday I was at the PhD thesis defence of Amandine Robin. The subject of the thesis is "Sub-pixel classification and change detection using multi-resolution remote sensing images" and the work was co-funded by CNES (the French Space Agency) and EADS-Astrium.

The goal of the work was to develop techniques able to combine the high revisit frequency (1 day to 10 days) of low resolution (100 m to 1 km) sensors with the high spatial resolution (10 m to 30 m) of sensors with longer revisit periods (from 30 days to several months). This means trying to combine SPOT XS images with MERIS data, or Landsat with SPOT VGT.

In terms of application fields this is very interesting since, for instance, in the case of precision farming, one may want to update a land cover map several times in one season, but due to cloud cover it may be difficult to obtain a series of high spatial resolution data with a good temporal sampling. In this case, the approach proposed in the thesis was to use a high resolution geometric reference for the regions of the study area (the agricultural plots) and to perform a classification of each region using the low spatial resolution temporal series.

Typically, this means using a SPOT XS image acquired at the beginning of the season in order to perform a segmentation (a classification is nearly impossible with one single date, since most of the classes are distinguished by their temporal evolution rather than by their radiometry at a given date). Then, using several MERIS images acquired during the season, the classification can be performed. It is interesting to note that the final classification has the SPOT resolution (20 m), while the temporal series has pixels of 300 m by 300 m (a resolution factor of 15 by 15). Amandine has shown that the results remain good up to resolution factors of about 40 by 40.

The other application was detecting changes between a low spatial resolution temporal series and a high resolution reference classification. That means that, given a high resolution classification obtained with the method presented above, you can use low resolution images (even a single acquisition; no need for a long temporal series) to detect which low resolution pixels have changed. This change alert can be used to trigger a new classification or even to schedule a high spatial resolution acquisition.

Amandine's work is based on a sound theoretical approach, and the results can be applied to optical sensors either having the same spectral bands in the high and low resolution data, or even having different bands, as long as they allow computing the same vegetation index (NDVI, Leaf Area Index, etc.).

I am persuaded that these results are ready to be used in operational chains.

Amandine's homepage has some material available for download which is worth looking at.