List of datasets, papers, and codes related to multimodal/multisource/multisensor remote sensing classification


Awesome Multi-modal/Multi-source/Multi-sensor Remote Sensing Classification

With a particular focus on hyperspectral and LiDAR images.

1. Dataset

  • LCZ Data (Multispectral and SAR data)

    The LCZ data sets are collected from the Sentinel-2 and Sentinel-1 satellites: the former acquires multispectral (MS) data with ten spectral bands, while the latter provides dual-polarimetric SAR data organized as the commonly used PolSAR covariance matrix (four components). Paper: Hong et al., 2020

  • DFC2018 Dataset (Multispectral LiDAR and Hyperspectral Data)

    ◗ Multispectral light detection and ranging (LiDAR) data were acquired simultaneously at three different optical wavelengths. For accessibility to various users, the data are available both as point clouds and as digital surface models (DSMs) at a 0.5-m ground sampling distance (GSD).

    ◗ Hyperspectral data at a 1-m GSD cover the 380–1,050-nm spectral range with 48 contiguous bands.

    ◗ Very-high-resolution red-green-blue imagery is provided at a 5-cm GSD. All data are georeferenced and cover a geographic area of more than 4 km². Paper: Saux et al., 2018

  • MUUFL Gulfport (Hyperspectral and LiDAR Data)

    The original MUUFL Gulfport data set (campus 1 scene) contains 325 × 337 pixels across 72 bands. Due to noise, the first four and last four bands were removed, leaving a hyperspectral image with 64 bands. The lower-right corner of the original image contains an invalid area, so only the first 220 columns were used for the ground-truth mapping; the cropped hyperspectral imagery is therefore 325 × 220 × 64. The ground-truth map was produced by manually labeling the pixels in the scene with the following classes: trees, mostly-grass ground surface, mixed ground surface, dirt and sand, road, water, buildings, shadow of buildings, sidewalk, yellow curb, cloth panels (targets), and unlabeled points. Paper: Du et al., 2017
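    The preprocessing described above (dropping the four noisy bands at each spectral end and cropping to the first 220 columns) can be sketched as follows. This is a minimal illustration: the random array is a stand-in for the original 325 × 337 × 72 cube, since loading the actual release (distributed as MATLAB files) is omitted here.

    ```python
    import numpy as np

    # Stand-in for the original MUUFL Gulfport cube (325 x 337 pixels, 72 bands);
    # in practice this would be loaded from the official data release.
    raw = np.random.rand(325, 337, 72).astype(np.float32)

    # Drop the four noisy bands at each end of the spectrum: 72 -> 64 bands.
    hsi = raw[:, :, 4:-4]

    # Keep only the first 220 columns to exclude the invalid lower-right area.
    hsi = hsi[:, :220, :]

    print(hsi.shape)  # (325, 220, 64)
    ```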

  • Houston2013 (Hyperspectral and LiDAR Data)

    The data sets distributed for the contest included an HSI and a LiDAR-derived digital surface model (DSM), both at the same spatial resolution (2.5 m), as well as the LiDAR point cloud. The HSI has 144 bands in the 380–1,050-nm spectral region. The corresponding co-registered DSM represents elevation in meters above sea level (per the Geoid 2012A model). The "las" file of the LiDAR point cloud was also provided. Paper: Debes et al., 2014
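    Because the Houston2013 HSI and DSM are co-registered at the same 2.5-m GSD, a common fusion baseline simply stacks the elevation as an additional band. A minimal sketch, using small placeholder arrays rather than the real scene:

    ```python
    import numpy as np

    # Placeholder spatial size (not the real scene dimensions);
    # the HSI has 144 spectral bands, the DSM is a single elevation channel.
    H, W = 64, 64
    hsi = np.random.rand(H, W, 144).astype(np.float32)  # stand-in HSI cube
    dsm = np.random.rand(H, W).astype(np.float32)       # stand-in DSM (meters)

    # Channel-wise stacking: append the elevation as a 145th band.
    fused = np.concatenate([hsi, dsm[..., None]], axis=-1)
    print(fused.shape)  # (64, 64, 145)
    ```

    Per-pixel classifiers can then be trained directly on the 145-dimensional feature vectors, which is the simplest way to exploit both modalities jointly.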

2. Paper

2.1 Survey Papers

2.2 Deep Learning

2.3 Traditional Method

3. Related Repositories
