A Time Delay Calibration Technique for Improving Broadband Lightning Interferometer Locating.

Remote Sens. (2023)

Abstract
This article introduces a time delay calibration technique for processing broadband lightning interferometer data, aimed at the problem of increased noise in location results when the length of the data analysis window is reduced. Using a cloud-to-ground lightning record as an example, the location performance for three analysis window lengths, 1024 ns, 256 ns, and 128 ns, was compared and analyzed. Without time delay calibration, location noise increased significantly as the analysis window shortened; after calibration, the problem was eliminated. Statistical analysis of the least squares residuals and the signal correlation coefficients within the analysis windows showed that, overall, the distribution of residuals did not change significantly after applying the time delay calibration, but the correlation coefficients improved markedly. The results indicate that the technique improves the correlation of signals within the analysis window and thereby greatly reduces the number of invalid location results produced when the window is narrowed. The article also analyzes the location results, correlation coefficients, and signal strength characteristics at an analysis window of 32 ns, the smallest reported to date; even at such a small window, the time delay calibration method maintains computational stability. The analysis further suggests that, depending on the intended data use, the correlation coefficient can be flexibly applied as a quality control condition on the location results.
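The abstract does not give implementation details, but the core operation it describes, estimating inter-antenna time delays from windowed signals and using the normalized peak correlation coefficient as a quality metric, can be illustrated with a minimal sketch. All names, the sampling rate, and the estimator below are assumptions for illustration, not the authors' method:

```python
import numpy as np

def estimate_delay(sig_a, sig_b, fs=1e9):
    """Illustrative cross-correlation delay estimate between two antenna
    signals within one analysis window (assumed approach, not the paper's).

    Returns (delay_seconds, peak_correlation_coefficient); a positive delay
    means sig_b arrives later than sig_a. fs is the assumed sample rate (Hz).
    """
    # Remove DC offsets so the correlation reflects the waveform shape.
    a = sig_a - sig_a.mean()
    b = sig_b - sig_b.mean()

    # Full cross-correlation; index (len(b) - 1) corresponds to zero lag.
    xc = np.correlate(a, b, mode="full")
    lag = int(np.argmax(xc)) - (len(b) - 1)

    # Normalized peak value: a correlation coefficient in (0, 1] that can
    # serve as a quality-control condition on the located result.
    coeff = xc.max() / (np.linalg.norm(a) * np.linalg.norm(b))
    return -lag / fs, coeff
```

For example, a windowed signal delayed by 5 samples at a 1 GHz sample rate should yield a delay estimate of 5 ns, with a coefficient near 1; windows whose coefficient falls below a chosen threshold could then be discarded, as the abstract suggests.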
Key words
time delay calibration technique, interferometer