Sunday, January 30, 2011

A teenager's contribution to remote sensing

Check out this article. A teenager proposes an instrument concept for measuring the temperature at the bottom of clouds. NOAA is testing it.

Monday, January 24, 2011

Multi-temporal series simulations


As I mentioned in a previous post, last September I participated in the Recent Advances in Quantitative Remote Sensing Symposium. I presented several posters there. One of them was about assessing the classification accuracy of the Venus and Sentinel-2 sensors for land cover map production.

While the results of the study are interesting, I think that the most important thing this paper shows is how a time series with realistic reflectance values can be simulated.

The idea here is to find a good balance between image synthesis (low accuracy) and physically sound simulation (which needs ancillary data and is computationally complex). The choice made here is to use a real time series of Formosat-2 images (only 4 spectral bands) in order to simulate Venus and Sentinel-2 time series with the same temporal sampling but with more than 10 spectral bands.

The Formosat-2 time series is used in order to:


  1. Estimate the LAI (leaf area index) for each pixel
  2. Give a spatial distribution using a land-cover map
A database containing leaf pigment values for different types of vegetation is used together with the above-mentioned LAI estimation in order to drive a reflectance simulator. The simulated reflectances are then convolved with the relative spectral responses of the sensors in order to generate the simulated images.
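The last step above can be sketched as follows. This is a toy illustration only: the wavelength grid, the vegetation-like reflectance curve and the Gaussian spectral response are made-up stand-ins, not the actual Venus or Sentinel-2 data.

```python
import numpy as np

def simulate_band(reflectance, srf):
    """Band reflectance as the average of the spectrum weighted by the
    sensor's relative spectral response (SRF), sampled on the same grid."""
    weights = srf / srf.sum()  # normalize the SRF so the weights sum to 1
    return float((reflectance * weights).sum())

wl = np.arange(400.0, 1000.0, 1.0)  # wavelength grid in nm
# Vegetation-like spectrum with a red edge around 700 nm (illustrative)
refl = 0.05 + 0.4 / (1.0 + np.exp(-(wl - 700.0) / 10.0))
# Hypothetical Gaussian SRF for a green band centered at 560 nm
srf_green = np.exp(-0.5 * ((wl - 560.0) / 20.0) ** 2)

print(simulate_band(refl, srf_green))
```

Repeating this for each band's spectral response turns one simulated reflectance spectrum into one multi-spectral pixel of the simulated image.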

The poster presented at RAQRS 2010 is here.


Monday, January 3, 2011

Reproducible research

I have recently implemented the Spectral Rule Based Landsat TM image classifier described in:

Baraldi et al. 2006, "Automatic Spectral Rule-Based Preliminary
Mapping of Calibrated Landsat TM and ETM+ Images", IEEE Trans. on Geoscience and Remote Sensing, vol 44, no 9.

This paper proposes a set of radiometric combinations, thresholds and logic rules to distinguish more than 40 spectral categories in Landsat images. My implementation is available in the development version of the Orfeo Toolbox and should be included in the next release.

One interesting aspect of the paper is that all the information needed for the implementation of the method is given: every single value for thresholds, indexes, etc. is written down in the paper. This was really useful for me, since I was able to code the whole system without getting stuck on unclear things or hidden parameters.
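To give a flavour of what such a rule-based classifier looks like, here is a minimal sketch. The indices, thresholds and categories below are purely illustrative; they are not the actual rules or values of Baraldi et al.

```python
def classify_pixel(blue, green, red, nir, swir1, swir2):
    """Toy spectral rules on TM-like surface reflectances in [0, 1].
    Thresholds are illustrative, not those of the paper."""
    ndvi = (nir - red) / (nir + red + 1e-9)     # vegetation index
    ndsi = (green - swir1) / (green + swir1 + 1e-9)  # snow index
    if ndsi > 0.4 and nir > 0.11:
        return "snow/ice"
    if ndvi > 0.5:
        return "dense vegetation"
    if ndvi < 0.0 and blue > red > nir:
        return "water"
    return "bare soil or other"

print(classify_pixel(0.03, 0.05, 0.04, 0.40, 0.15, 0.08))  # dense vegetation
```

The real system chains many more combinations and rules, but the principle is the same: with every threshold written down in the paper, each rule can be coded directly.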

This is so rarely found in the image processing literature that I thought it was worth posting about. But this is not all.

Once my implementation was done, I was very happy to get some Landsat classifications, but I was not able to decide whether the results were correct or not. Since the author of the paper seemed to want his system to be used and gave all details for the implementation, I thought I would ask him for help for the validation. So I sent an e-mail to A. Baraldi (whom I had already met before) and asked for some validation data (input and output images generated by his own implementation).

I got something better than only images. He was kind enough to send me the source code of the very same version of the software which was used for the paper – the system continues to be enhanced and the current version seems to be far better than the one published.

So now I have everything needed for reproducible research:
  1. A clear description of the procedure with all the details needed
    for the implementation.
  2. Data in order to run the experiments.
  3. The source code so that errors can be found and corrected.
I want to publicly thank A. Baraldi for his kindness and I hope that this way of doing science will continue to grow.

If you want to know more about reproducible research, check this site.



Tuesday, November 23, 2010

Change detection of soil states

As I mentioned in a previous post, last September I participated in
the Recent Advances in Quantitative Remote Sensing Symposium. I
presented several posters there. One of them was about the work done
by Benoît Beguet for his master thesis while he was at CESBIO earlier
this year.

The goal of the work was to assess the potential of high temporal and
spatial resolution multispectral images for the monitoring of soil
states related to agricultural practices.

This is an interesting topic for several reasons, the main ones being:


  1. a bare soil map at any given date is useful for erosion forecast
    and nitrate pollution estimations

  2. the knowledge about the dates of different types of agricultural
    soil work can give clues about the type of crop which is going to
    be grown
The problem was difficult, since we used 8 m resolution images (so no
useful texture signature is present) and we only had 4 spectral bands
(blue, green, red and near infrared). Without short-wave infrared
information, it is very difficult to infer anything about the early
and late vegetation phases.

However, we obtained interesting results for some states and, most of
all, for some transitions – changes – between states.
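As a toy illustration of how a transition can be flagged with only the red and near-infrared bands, here is a sketch; the reflectance values and the change threshold are made up for the example and are not from the study.

```python
import numpy as np

def ndvi(red, nir):
    """Normalized difference vegetation index."""
    return (nir - red) / (nir + red + 1e-9)

# Two hypothetical acquisitions (red, NIR reflectances) of two pixels
red_t1, nir_t1 = np.array([0.30, 0.05]), np.array([0.35, 0.40])
red_t2, nir_t2 = np.array([0.28, 0.20]), np.array([0.33, 0.22])

delta = ndvi(red_t2, nir_t2) - ndvi(red_t1, nir_t1)
# A large NDVI drop can flag a transition such as harvest or ploughing;
# the -0.3 threshold is illustrative only.
transition = delta < -0.3
print(transition.tolist())  # [False, True]
```

The point is that even with 4 bands, a change between two dates can carry more information than either date alone.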


You can have a look at the poster we presented at RAQRS here.


Sunday, October 31, 2010

Massive Remote Sensing

Some weeks ago I had the chance to attend the 3rd Symposium on
Recent Advances in Quantitative Remote Sensing, RAQRS, in València,
Spain.

It was the first time I attended such a conference. I've been to
nearly all IGARSS since 1998, but I had never been to a conference
where the main topic was the physics of remote sensing of continental
surfaces. All in all, it was a very interesting and inspiring
experience and I learned a lot of things.

Many of the talks and posters dealt with applications relying on
multi-temporal data at metric to decametric resolutions. This is due to
the fact that most of the phenomena of interest (as, for instance, the
Essential Climate Variables, ECVs for short) are best monitored at
those resolutions.

It was in this context that I heard the expression massive remote
sensing. As I understand it, it refers to the coming flow of data
produced mainly by ESA's Sentinel missions. Indeed, with these sensors
(and others, such as NASA's LDCM) frequent, complete coverage of the
Earth's surface by high resolution sensors will be available. And, in
order for these data to be useful, fast and efficient automatic
processing methods will be needed.


This last sentence may seem like nothing new with respect to what has
been said in recent years about very high spatial resolution sensors,
but I think that there are now several issues which make it really
crucial:


  1. Always on: the Sentinels (at least 1 and 2) will always be
    acquiring data, so the volume of imagery will be huge.


  2. Data really available: I don't know if this has been officially
    confirmed by ESA, but, as far as I know, the images will be free of
    charge or available at minimal cost.


  3. Physical reality: the sensors will not just be taking pictures,
    but will provide many spectral bands which cannot easily be
    analyzed visually.

So I think it's time to start taking this challenge seriously and
addressing the tough points such as:


  1. How to produce global land-cover maps without (or with very little)
    ground truth?


  2. How to develop models and methods which can be ported from one
    site to another with minimal tuning?


  3. How to exploit the synergy between image data and ancillary data, or
    between image modalities (Sentinel-1 and Sentinel-2, for instance)?


Thursday, September 30, 2010

Sweet Remote Sensing

For a few months now, I have been taking a look at software tools designed for kids' education. There are many of them available, but I have been focusing on the work of the community founded by S. Papert and Alan Kay, inventors of the Logo and Smalltalk programming languages respectively. Papert and Kay drew inspiration from the pioneering figures of pedagogy Piaget and Montessori, and took constructivism as the guiding principle for their developments.

The main idea behind constructivism is that kids (but also adults, as a matter of fact) learn by doing and by exploring concepts. Papert and Kay adapted this theory to the use of computers: computers are simulators of physical reality and, as such, they make it possible to learn by doing even for concepts for which a physical-world experience would not be feasible. The Squeak eToys environment is a good example of this.

One thing that struck me is that these ideas lead to (among others) the following conclusion: give a kid a laptop loaded with open source software and an Internet connection, and she can learn whatever she wants. This is not just theory. The best example of the application of this idea is the OLPC (One Laptop Per Child) project and its success.

The OLPC software is based upon a software suite called Sugar which is made of lots of so-called activities for different disciplines: math, languages, physics, etc. This open source environment allows a kid to learn things, not just by using the activities, but also by allowing her to look at the source code (how things are done).

The strength of this kind of environment is that it can be used at several levels, which allows the user to choose the optimal trade-off between power and ease of use.

I would like to see an analogous environment for Remote Sensing Image Processing and Analysis. The Orfeo Toolbox is of course a very good candidate for this, and many steps have been taken in this direction.
The architecture work and the integration of the best existing open source libraries (ITK, OSSIM, 6S, libSVM, etc.) have provided the kitchen sink, the power. The creation of bindings (Python, Java) has expanded the ways in which this power can be accessed. Finally, the GUI-based application Monteverdi has made point-and-click use of (a limited part of) this power possible. The availability of QGIS plugins demonstrates that other paths can be imagined to make OTB usable by non-programmers.

However, there is still a lot of work to be done in order to have a completely integrated environment where, from the same starting point, a user can navigate through the different levels of complexity and power. Even if this is not an easy goal to achieve, I think that all the building blocks are available to progress in this direction. This may only be a (very) long-term goal, but I think that the existence of this vision can help to federate a community and motivate people to contribute.

Thursday, July 1, 2010

Community building

In my last post I wrote about open approaches for Earth Observation. Open (science/source/whatever) means community. I am in the middle of reading "The Art of Community" and it is very inspiring. In some ways, it reminds me very much of Chapter 4, "Social and Political Infrastructure", of Karl Fogel's "Producing Open Source Software - How to Run a Successful Free Software Project", although Bacon's book is more general.

Anyway, both books give very good insight into the issues and tricks involved in a community-based project, and even into the particular case of projects which are created and managed by companies.

This is an interesting point, since one could think that a project funded by a company has more chances to succeed than a purely volunteer-based one, but many real examples show that this is not the case. The main pitfalls of a corporate open source project are related to decision making and communication. For a project to succeed, the community has to be respected, and open discussions and meritocracy are the two pillars of community.

For example, all decisions involving the development have to be explained and discussed in a way that involves all developers. The books give examples of closed discussions held at the company level which leave out the volunteer contributors.

Another interesting example is the one given by Fogel about meritocracy: any developer should earn write access to the source repository by proving their value. Usually, in corporate environments, developers have commit access from day one, while external contributors, who are often much more capable, have to wait a long time for it.

An important player in any open source project is the Benevolent Dictator (BD). I will cite Fogel verbatim here:

"It is common for the benevolent dictator to be a founder of the project, but this is more a correlation than a cause. The sorts of qualities that make one able to successfully start a project—technical competence, ability to persuade other people to join, etc.—are exactly the qualities any BD would need. And of course, founders start out with a sort of automatic seniority, which can often be enough to make benevolent dictatorship appear the path of least resistance for all concerned.

Remember that the potential to fork goes both ways. A BD can fork a project just as easily as anyone else, and some have occasionally done so, when they felt that the direction they wanted to take the project was different from where the majority of other developers wanted to go. Because of forkability, it does not matter whether the benevolent dictator has root (system administrator privileges) on the project's main servers or not. People sometimes talk of server control as though it were the ultimate source of power in a project, but in fact it is irrelevant. The ability to add or remove people's commit passwords on one particular server affects only the copy of the project that resides on that server. Prolonged abuse of that power, whether by the BD or someone else, would simply lead to development moving to a different server.

Whether your project should have a benevolent dictator, or would run better with some less centralized system, largely depends on who is available to fill the role. As a general rule, if it's simply obvious to everyone who should be the BD, then that's the way to go. But if no candidate for BD is immediately obvious, then the project should probably use a decentralized decision-making process ..."

And I will end this post by using the opening quote of Bacon's chapter 9:

"The people to fear are not those who disagree with you, but
those who disagree with you and are too cowardly to let you
know."
—Napoléon Bonaparte

Food for thought.