Author Archives: WQ

About WQ

I received my PhD (2013) in Remote Sensing, Earth and Space Science from the Dept. of Aerospace Engineering Sciences, University of Colorado Boulder, USA, under a Fulbright fellowship. Currently, I am an Assistant Professor in the Dept. of Space Science at the Institute of Space Technology (IST), Islamabad, Pakistan, where I am a founding member of the Geospatial Research & Education Lab (GREL). My general expertise is in remote sensing, and I have worked with a variety of remote sensing datasets throughout my career. For my PhD thesis, I worked specifically on SAR (Synthetic Aperture Radar) remote sensing and oceanography, developing techniques to measure ocean surface currents from space-borne SAR intensity images and interferometric data. My research interests are: remote sensing; SAR imagery and interferometric data processing and analysis; visible/infrared/high-resolution satellite image processing and analysis; oceanography; Earth system study and modelling; LIDAR data processing and analysis; and scientific programming. I am a reviewer for IEEE Transactions on Geoscience & Remote Sensing, Forest Ecosystems, GIScience & Remote Sensing, Journal of African Earth Sciences, and Italian Journal of Agronomy. I am an alumnus of the Pakistan National Physics Talent Contest (NPTC), the Lindau Nobel Laureate Meetings, and the Fulbright Program, and the Pakistan National Point of Contact for the Space Generation Advisory Council (SGAC). I was an invited speaker at the TEDxIslamabad event held in November 2014, and I have served as a mentor at the NASA International Space Apps Challenge Islamabad events in April 2015 and April 2016.

Aperture Synthesis and Azimuth Resolution in Synthetic Aperture Radar – Lecture Notes

Teaching the fundamentals of Synthetic Aperture Radar (SAR) system design and imaging to remote sensing students and professionals is always a difficult task. They generally do not have an in-depth background in signal processing and radar system design, and as an instructor I always have to weigh how much to tell them about SAR system design without diving into the detailed mathematics of signal processing and image formation. Normally, I go in depth on the imaging geometry and the Doppler history curve, and only briefly cover signal-processing-heavy concepts like pulse compression and matched filtering. A good fundamental understanding of SAR system design, imaging geometry, and image formation gives remote sensing students and professionals the background context they need when they select SAR data and process and analyze it for different remote sensing applications.
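For readers curious about the pulse compression and matched filtering mentioned above, here is a minimal NumPy sketch of the idea: a linear FM chirp is embedded in a noisy record, and correlating with the conjugated, time-reversed chirp compresses it into a sharp peak at the target delay. All parameters (sample rate, bandwidth, delay) are illustrative assumptions, not those of any real SAR system.

```python
import numpy as np

# Illustrative, assumed parameters (not a real SAR system)
fs = 1e6          # sample rate, Hz
T = 1e-3          # chirp duration, s
B = 100e3         # chirp bandwidth, Hz
t = np.arange(0, T, 1 / fs)                  # 1000 samples
chirp = np.exp(1j * np.pi * (B / T) * t**2)  # linear FM (chirp) pulse

# Received echo: the chirp buried at an assumed delay, plus noise
rng = np.random.default_rng(0)
echo = np.zeros(4000, dtype=complex)
echo[1200:1200 + len(t)] = chirp
echo += 0.1 * (rng.standard_normal(4000) + 1j * rng.standard_normal(4000))

# Matched filter = conjugated, time-reversed replica of the transmitted pulse
mf = np.conj(chirp[::-1])
compressed = np.convolve(echo, mf, mode="same")

# The correlation peak marks the target delay; the compressed pulse width
# scales with 1/B rather than the full duration T (pulse compression)
peak = int(np.argmax(np.abs(compressed)))
print(peak)
```

The point for students is that range resolution comes from the chirp bandwidth, not the pulse length, which is why SAR can transmit long pulses yet still resolve fine range detail.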

For the past few years, I have been teaching a graduate course in Radar Remote Sensing and running an annual Summer School on Earth Remote Sensing with SAR at our research group, GREL. One of the core ideas in understanding the aperture synthesis process is the need to enhance the azimuth (along-track) resolution. It is always interesting to discuss in class how, in a conventional real-aperture imaging radar, the azimuth resolution depends inversely on the antenna along-track length, while in fully-focussed SAR the azimuth resolution becomes half of the antenna along-track length. This is a significant reversal: in a real-aperture radar, we need a longer antenna in the along-track dimension to get better azimuth resolution, while in SAR, the shorter the antenna in the along-track dimension, the better the azimuth resolution.
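The contrast is striking when put in numbers. The sketch below compares the real-aperture azimuth resolution (wavelength times slant range divided by antenna length) with the fully-focussed SAR result (half the antenna length), using assumed example values of a C-band wavelength, a typical low-Earth-orbit slant range, and a 10 m antenna:

```python
# Assumed example values, for illustration only
wavelength = 0.056   # m (C-band)
slant_range = 850e3  # m, assumed slant range to the target
antenna_len = 10.0   # m, antenna along-track length D

# Real-aperture radar: azimuth resolution = wavelength * range / D
real_aperture_res = wavelength * slant_range / antenna_len

# Fully-focussed SAR: azimuth resolution = D / 2, independent of range
sar_res = antenna_len / 2

print(f"Real aperture: {real_aperture_res:.0f} m")  # 4760 m
print(f"SAR:           {sar_res:.1f} m")            # 5.0 m
```

The same 10 m antenna that resolves only a few kilometres in azimuth as a real aperture resolves 5 m once the aperture is synthesized, and the SAR figure does not degrade with range.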

[Figure: Azimuth resolution in SAR]

To explain how aperture synthesis reduces the azimuth resolution to half of the along-track antenna length, I have made detailed notes for my ongoing graduate class on Radar Remote Sensing. The notes require only basic geometry, algebra, and sum series in mathematics. I would like to share them with the wider scientific audience; please access the PDF notes here: Aperture Synthesis and Azimuth Resolution.

[Figure: Aperture synthesis geometry]

The synthetic aperture length is defined in the figure above. The azimuth resolution in fully-focussed SAR becomes half of the antenna along-track dimension.
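The figure's definition of synthetic aperture length can be checked numerically: the real antenna's beam illuminates an along-track footprint of length wavelength times range over antenna length, and that footprint is the longest synthetic aperture available. The values below are the same assumed illustrative numbers as before (C-band, 850 km slant range, 10 m antenna), not taken from the notes:

```python
# Assumed illustrative values
wavelength = 0.056  # m
R = 850e3           # m, slant range
D = 10.0            # m, antenna along-track length

# The real beam's along-track footprint sets the maximum
# synthetic aperture length
L_syn = wavelength * R / D

# The two-way signal path doubles the angular sensitivity of the
# synthesized aperture, giving resolution lambda*R / (2*L_syn),
# and the range and wavelength terms cancel to leave D/2
azimuth_res = wavelength * R / (2 * L_syn)

print(L_syn, azimuth_res, D / 2)
```

The cancellation of both wavelength and range in the final step is exactly why the fully-focussed SAR azimuth resolution ends up as a range-independent D/2.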

I drew on two excellent resources on SAR remote sensing in developing these notes:

For a more in-depth understanding of how SAR is used for remote sensing, you can consider attending the next Summer School on Earth Remote Sensing with SAR, which I will be offering this summer. The summer school is coming up in July 2018 and will be open to international participants; formal dates will be announced soon. Keep watching the GREL website for updates.

Big Earth Data Documentary

Recently I ran into this wonderful documentary about how scientists are handling the huge amounts of remote sensing and Earth science data being collected in the current age.

The documentary spends a lot of time on how remote sensing is used for oceanography and maritime security monitoring, looking at concerns like monster waves, oil spills, surface ice content, and ship routing through polar oceans.

The EarthServer project is mentioned, which establishes “big earth data analytics, rapid ad-hoc processing and filtering on massive geodata.” Satellite images are also shown to be useful for automatically counting houses or camps and for disaster damage assessment, and the use of the GRACE satellite system for Earth gravimetry and water content measurement is mentioned.

For background information regarding the documentary, go here.

 

Summary: ISNET / NARSS Workshop on SAR Remote Sensing, 27th Nov. – 1st Dec., 2016

The Inter-Islamic Network on Space Sciences & Technology (ISNET), in collaboration with the National Authority for Remote Sensing & Space Sciences (NARSS), held a 5-day workshop on “Earth Remote Sensing with Synthetic Aperture Radar (SAR)” from 27 November to 1 December 2016 at the NARSS premises in Cairo, Egypt. The workshop was supported by the OIC Ministerial Standing Committee for Scientific and Technological Cooperation (COMSTECH) and the Islamic Development Bank (IDB).

Teaching complex numbers NARSS SAR workshop

Reviewing complex numbers, which form the basis of SAR imaging.

The initial part of the workshop comprised seminar and research presentations on SAR remote sensing applications. This was followed by 2.5 days of extensive tutorial modules on SAR fundamentals and hands-on training sessions on the different software tools required for SAR remote sensing applications. The tutorial and workshop sessions were led by me, and I was honoured to be invited by ISNET and NARSS to conduct them.

Group picture NARSS SAR workshop

Participants of the hands-on training workshop sessions.

The hands-on workshop modules were conducted with actual SAR remote sensing imagery to give participants experience in processing and analyzing SAR data. Open-source software tools made specifically for SAR data processing, such as the ESA Sentinel Application Platform (SNAP) and ASF MapReady, were used so that the hands-on modules would be accessible to the large number of participants. The hands-on modules covered topics such as identifying distortions in SAR imagery (topographic, radiometric, geometric), data pre-processing, SAR sub-surface imaging and SAR-optical data fusion, interpreting SAR data over the ocean, understanding complex SAR data, and the basics of interferometry.

Overall, more than 60 participants took part in the training workshop. Although I have taught a graduate course on Radar Remote Sensing and conducted a SAR remote sensing summer school at our research group for the past two years, this was my first experience conducting an international SAR workshop. I got great feedback, and more motivation to continue forward on my SAR journey.

 

Ocean Eddies & Slicks in SAR Imagery

In a recent post, I talked about observing an eddy in the Arabian Sea in L-band ALOS PALSAR SAR imagery. In this post, I want to talk briefly about the physical interaction between SAR signals and eddies.

Spiral eddies are often convergence zones and act as accumulators of surface slicks. These surface slicks (which can be biogenic films, natural oil seeps, mineral oil, etc.) form a thin film over the ocean surface and dampen the short surface waves through a phenomenon called Marangoni damping (see this seminal paper by Alpers and Hühnerfuss). Since SAR backscatter from the ocean comes mainly from these short waves, the slick-covered eddy arms appear dark in the imagery.

However, an eddy may sometimes appear brighter in SAR imagery than the surrounding ocean, due to wave-current and shear interactions.

In my paper on ocean currents from sequential SAR imagery, I talk about this phenomenon in the introduction, and you can also find some good references therein.

For further interest, here are a few other seminal papers on the science of ocean wave damping by surface slicks:

A comparison of Landsat 7 ETM+, Landsat 8 OLI, and Sentinel 2A MSI over the visible and near-infrared parts of the spectrum

Scientia Plus Conscientia

How do different sensors perform across the electromagnetic spectrum? This question bears practical importance when we want to combine data acquired by different sensors. I therefore thought it would be interesting and fun to do a simulation of how different common sensors see the same feature.

We could in principle do this using subsets of images of the same region captured by different sensors, but it is actually easier to compare them using a given spectral signature, the reflectance (or emittance) of a certain material as a function of wavelength.

I therefore went to the ASTER spectral library and downloaded several datasets corresponding to different spectral signatures. In the following example, we use that of common lawn grass:

Spectral signature of lawn grass. Source: ASTER spectral library.

How do Landsat 7 ETM+, Landsat 8 OLI, and Sentinel 2A MSI “see” this grass? To answer this question…
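One way to sketch an answer is to band-average a reflectance spectrum over each sensor's band limits, which is what "seeing" a signature through a sensor amounts to (assuming, for simplicity, a flat spectral response within each band). In the sketch below the band edges are approximate assumptions, and the spectrum is a synthetic stand-in with a vegetation-like red edge, not the actual ASTER library grass data:

```python
import numpy as np

# Synthetic stand-in spectrum on a 1 nm grid over the VNIR range:
# low reflectance in the visible, rising sharply past ~0.68 um
# (a toy "red edge", not the ASTER grass signature)
wavelength_um = np.linspace(0.4, 1.0, 601)
reflectance = np.clip((wavelength_um - 0.68) * 4, 0.05, 0.5)

# Approximate (assumed) VNIR band edges in micrometres
bands = {
    "Landsat8_OLI_red":  (0.64, 0.67),
    "Landsat8_OLI_nir":  (0.85, 0.88),
    "Sentinel2_MSI_red": (0.65, 0.68),
    "Sentinel2_MSI_nir": (0.78, 0.90),
}

def band_average(lo, hi):
    """Mean reflectance within [lo, hi] um (flat spectral response)."""
    mask = (wavelength_um >= lo) & (wavelength_um <= hi)
    return float(reflectance[mask].mean())

simulated = {name: band_average(lo, hi) for name, (lo, hi) in bands.items()}
for name, value in simulated.items():
    print(f"{name}: {value:.3f}")
```

Even with a toy spectrum, the exercise shows why band placement matters when fusing sensors: the two NIR bands straddle the red edge differently, so they report different values for the same material.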
