

{"id":79,"date":"2021-03-17T14:52:51","date_gmt":"2021-03-17T13:52:51","guid":{"rendered":"https:\/\/project.inria.fr\/sixp\/?page_id=79"},"modified":"2026-02-05T16:47:44","modified_gmt":"2026-02-05T15:47:44","slug":"imagery","status":"publish","type":"page","link":"https:\/\/sixp.inria.fr\/en\/research\/imagery\/","title":{"rendered":"Aerial imagery and its interpretation"},"content":{"rendered":"<p><\/p>\n<div class=\"page\" title=\"Page 4\">\n<div class=\"layoutArea\">\n<div class=\"column\">\n<p><strong>What is aerial imagery and how can it be interpreted with AI?<\/strong><\/p>\n<p>Airborne systems equipped with sensing devices can acquire digital observations of our planet at low cost, over wide and sometimes hard-to-access areas (e.g. mountainous environments). Technological advances have led to lighter, smaller systems that provide finer observations. In this project, the aerial imagery is acquired by UAV and consists of color images, multispectral images (which also cover invisible wavelengths such as the infrared), and 3D point clouds.<\/p>\n\n\t\t<style type=\"text\/css\">\n\t\t\t#gallery-2 {\n\t\t\t\tmargin: auto;\n\t\t\t}\n\t\t\t#gallery-2 .gallery-item {\n\t\t\t\tfloat: left;\n\t\t\t\tmargin-top: 10px;\n\t\t\t\ttext-align: center;\n\t\t\t\twidth: 50%;\n\t\t\t}\n\t\t\t#gallery-2 img {\n\t\t\t\tborder: 2px solid #cfcfcf;\n\t\t\t}\n\t\t\t#gallery-2 .gallery-caption {\n\t\t\t\tmargin-left: 0;\n\t\t\t}\n\t\t\t\/* see gallery_shortcode() in wp-includes\/media.php *\/\n\t\t<\/style>\n\t\t<div id='gallery-2' class='gallery galleryid-79 gallery-columns-2 gallery-size-medium'><dl class='gallery-item'>\n\t\t\t<dt class='gallery-icon landscape'>\n\t\t\t\t<a href='https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_3.jpg'><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"300\" src=\"https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_3-300x300.jpg\" class=\"attachment-medium size-medium\" alt=\"\" aria-describedby=\"gallery-2-257\" 
srcset=\"https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_3-300x300.jpg 300w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_3-150x150.jpg 150w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_3-768x768.jpg 768w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_3.jpg 787w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a>\n\t\t\t<\/dt>\n\t\t\t\t<dd class='wp-caption-text gallery-caption' id='gallery-2-257'>\n\t\t\t\tVisible images (visible red, green and blue) over Bocard area\n\t\t\t\t<\/dd><\/dl><dl class='gallery-item'>\n\t\t\t<dt class='gallery-icon landscape'>\n\t\t\t\t<a href='https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_2.jpg'><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"300\" src=\"https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_2-300x300.jpg\" class=\"attachment-medium size-medium\" alt=\"\" aria-describedby=\"gallery-2-256\" srcset=\"https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_2-300x300.jpg 300w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_2-150x150.jpg 150w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_2-768x768.jpg 768w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pic_2.jpg 787w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a>\n\t\t\t<\/dt>\n\t\t\t\t<dd class='wp-caption-text gallery-caption' id='gallery-2-256'>\n\t\t\t\tMultispectral image (false color composition with infra red) over Bocard area\n\t\t\t\t<\/dd><\/dl><br style=\"clear: both\" \/><dl class='gallery-item'>\n\t\t\t<dt class='gallery-icon landscape'>\n\t\t\t\t<a href='https:\/\/sixp.inria.fr\/files\/2021\/05\/pic.jpg'><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"300\" src=\"https:\/\/sixp.inria.fr\/files\/2021\/05\/pic-300x300.jpg\" class=\"attachment-medium size-medium\" alt=\"\" aria-describedby=\"gallery-2-255\" srcset=\"https:\/\/sixp.inria.fr\/files\/2021\/05\/pic-300x300.jpg 300w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pic-150x150.jpg 150w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pic-768x768.jpg 768w, 
https:\/\/sixp.inria.fr\/files\/2021\/05\/pic.jpg 787w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a>\n\t\t\t<\/dt>\n\t\t\t\t<dd class='wp-caption-text gallery-caption' id='gallery-2-255'>\n\t\t\t\tLiDAR elevation map over the Chichoue area\n\t\t\t\t<\/dd><\/dl><dl class='gallery-item'>\n\t\t\t<dt class='gallery-icon landscape'>\n\t\t\t\t<a href='https:\/\/sixp.inria.fr\/files\/2021\/05\/pc.jpg'><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"300\" src=\"https:\/\/sixp.inria.fr\/files\/2021\/05\/pc-300x300.jpg\" class=\"attachment-medium size-medium\" alt=\"\" aria-describedby=\"gallery-2-266\" srcset=\"https:\/\/sixp.inria.fr\/files\/2021\/05\/pc-300x300.jpg 300w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pc-150x150.jpg 150w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pc-768x768.jpg 768w, https:\/\/sixp.inria.fr\/files\/2021\/05\/pc.jpg 787w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a>\n\t\t\t<\/dt>\n\t\t\t\t<dd class='wp-caption-text gallery-caption' id='gallery-2-266'>\n\t\t\t\t3D LiDAR point cloud over the Chichoue area\n\t\t\t\t<\/dd><\/dl><br style=\"clear: both\" \/>\n\t\t<\/div>\n\n<p>The increasing availability of massive amounts of remotely sensed data also comes at a cost: the need for automated methods to analyze such data and to generate meaningful knowledge from it. Recently, this need has largely been met by Artificial Intelligence techniques, which have proven successful in a wide range of fields. When applied to digital images, such techniques can help, for instance, to recognize a scene and the objects it contains. Still, their application to ecological data remains limited, possibly due to the numerous\u00a0scientific challenges it raises. 
In this project, AI techniques will help to automatically assess plant spatial distribution.<\/p>\n<p><strong>Objectives of this research axis:<\/strong><\/p>\n<p>The objective is to combine airborne data and in-situ species characterization in a deep learning framework to achieve species mapping. We assume that using such multiple sources in a multi-task deep neural network should make it possible to derive high-resolution information beyond the standard land cover mapping achieved with semantic segmentation networks (which usually fail to extract precise object contours). It will enable the analysis of the emerging patterns of multiple plant interactions at the community scale, and illustrate the potential of Artificial Intelligence (AI) in ecology.<\/p>\n<div class=\"page\" title=\"Page 10\">\n<div class=\"layoutArea\">\n<div class=\"column\">\n<p>To reach this objective, we will carry out the following tasks: acquiring the multispectral data; designing a deep network able to perform semantic segmentation at ultra-high spatial resolution; learning to unmix or disentangle multispectral images in order to distinguish different species within species mixtures; coupling optical and LiDAR information in a multi-modal, multi-task deep architecture; and coupling deep learning with physical models for biophysical parameters (i.e. 
plant traits) estimation.<\/p>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<p><strong>Main researchers:<\/strong><\/p>\n<p>S\u00e9bastien Lef\u00e8vre, IRISA lab \/ OBELIX team, Full Professor at the University of South Brittany (UBS).<\/p>\n<p>Thomas Corpetti, LETG lab \/ OBELIX team, Senior Scientist at CNRS.<\/p>\n<p>Thomas Dewez, Scientist at BRGM.<\/p>\n<p>Benjamin Pradel and Marie Deboisvilliers, Engineers at L&#8217;Avion Jaune.<\/p>\n<p>Florent Guiotte, IRISA lab, Postdoc, developing novel AI methods for ultra-high resolution imagery, with the help of the L&#8217;Avion Jaune company.<\/p>\n<p>Ho\u00e0ng-\u00c2n L\u00ea, IRISA lab, Postdoc, developing novel AI methods for processing 3D point clouds.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Main achievements:<\/strong><\/p>\n<p>We first demonstrated that existing methods for semantic segmentation or object detection, trained on general-purpose image datasets, are largely inadequate for mapping plants from sub-centimeter resolution images. While semantic segmentation is the standard remote sensing task for producing land cover maps, we observed that object detection is more suitable in this case, since the goal is to identify plants and their locations rather than to assign a class to each pixel (whose size is on the order of a few millimeters).<br \/>We also validated the relevance of a data partitioning strategy in a semi-supervised framework, which facilitates the design of field measurement campaigns. 
To address the spatial noise present in the labels, we developed a label fuzzification approach (Figure, panel A), although the results remain insufficient for large-scale deployment.<br \/>We also developed several models (Figure, panel B), including a new semi-supervised approach, to estimate biophysical variables (plant height, volume and cover, biomass), and obtained very promising results by leveraging unlabelled images alongside a few annotated ones.<br \/>Finally, we designed a method combining 2D rasterization and generative adversarial models to learn how to produce a digital terrain model from a 3D point cloud.<\/p>\n<p><!--more--><a href=\"https:\/\/sixp.inria.fr\/files\/2026\/02\/WP3-1.jpeg\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-medium wp-image-406\" src=\"https:\/\/sixp.inria.fr\/files\/2026\/02\/WP3-1-300x166.jpeg\" alt=\"\" width=\"300\" height=\"166\" srcset=\"https:\/\/sixp.inria.fr\/files\/2026\/02\/WP3-1-300x166.jpeg 300w, https:\/\/sixp.inria.fr\/files\/2026\/02\/WP3-1-1024x567.jpeg 1024w, https:\/\/sixp.inria.fr\/files\/2026\/02\/WP3-1-768x425.jpeg 768w, https:\/\/sixp.inria.fr\/files\/2026\/02\/WP3-1-1536x851.jpeg 1536w, https:\/\/sixp.inria.fr\/files\/2026\/02\/WP3-1-150x83.jpeg 150w, https:\/\/sixp.inria.fr\/files\/2026\/02\/WP3-1.jpeg 1936w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>Sorry, this entry is only available in Fran\u00e7ais.<\/p>\n<p> <a class=\"continue-reading-link\" href=\"https:\/\/sixp.inria.fr\/en\/research\/imagery\/\"><span>Continue reading<\/span><i class=\"crycon-right-dir\"><\/i><\/a> 
<\/p>\n","protected":false},"author":1989,"featured_media":0,"parent":87,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"templates\/template-twocolumns-right.php","meta":{"footnotes":""},"class_list":["post-79","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/pages\/79","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/users\/1989"}],"replies":[{"embeddable":true,"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/comments?post=79"}],"version-history":[{"count":25,"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/pages\/79\/revisions"}],"predecessor-version":[{"id":408,"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/pages\/79\/revisions\/408"}],"up":[{"embeddable":true,"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/pages\/87"}],"wp:attachment":[{"href":"https:\/\/sixp.inria.fr\/en\/wp-json\/wp\/v2\/media?parent=79"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}