
Multi-Sensor and Multi-Temporal Remote Sensing: Specific Single Class Mapping

$20.00

1st Edition 

by Anil Kumar (Author), Priyadarshi Upadhyay (Author), Uttara Singh (Author) 

This book elaborates on fuzzy machine learning and deep learning models for mapping a specific single class from multi-sensor, multi-temporal remote sensing images while handling mixed pixels and noise. It also covers approaches for pre-processing and spectral dimensionality reduction of temporal data. Further, it discusses the ‘individual sample as mean’ training approach for handling heterogeneity within a class. The appendix includes case studies such as mapping crop types, forest species, and stubble-burnt paddy fields.

Key features:

  • Focuses on the use of multi-sensor, multi-temporal data while handling spectral overlap between classes
  • Discusses a range of fuzzy and deep learning models capable of extracting a specific single class while separating noise
  • Describes pre-processing using spectral, textural, and CBSI indices, as well as the backscatter coefficient/Radar Vegetation Index (RVI)
  • Discusses the role of training data in handling heterogeneity within a class
  • Supports multi-sensor and multi-temporal data processing through the in-house SMIC software
  • Includes case studies and practical applications for single class mapping

This book is intended for graduate/postgraduate students, research scholars, and professionals working in environmental studies, geography, computer science, remote sensing, geoinformatics, forestry, agriculture, post-disaster and urban transition studies, and other related areas.
Year: 2023
Pages: 178
Language: English
Format: PDF
Size: 10 MB
ISBN-10: 1032428325
ISBN-13: 978-1-032-42832-1, 978-1-032-44652-3, 978-1-003-37321-6
ASIN: B0C1NZYC76