
Leveraging joint sparsity in 3D synthetic aperture radar imaging

• *Corresponding author: Dylan Green

This work is partially supported through the Autonomy Technology Research Center by AFRL contracts #FA8650-18-2-1645 (DG & JJ) and #FA8650-22-C-1017 (DG & JJ), and by NSF grant DMS #1912685 (AG), AFOSR grant #FA9550-22-1-0411 (DG & AG), DOE ASCR grant #DE-AC05-00OR22725 (AG), and ONR MURI grant #N00014-20-1-2595 (AG).

  • Three-dimensional (3D) synthetic aperture radar (SAR) imaging is an active and growing field of research with applications in both military and civilian domains. Sparsity-promoting computational inverse methods have proven effective in providing point estimates for the volumetric image. Such techniques have been enhanced by leveraging sequential joint sparsity information from nearby aperture windows. This investigation extends these ideas by introducing a Bayesian volumetric approach that leverages the assumption of sequential joint sparsity. In addition to obtaining a point estimate, our new approach also enables uncertainty quantification. As demonstrated in simulated experiments, our approach compares favorably to currently used methodology for point estimate approximations, and has the additional advantage of providing uncertainty quantification for two-dimensional projections of the volumetric image.

    Mathematics Subject Classification: 15A29, 62F15, 65F22, 94A12.
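    To make the modeling assumption concrete, the following NumPy sketch shows the generic alternating scheme that joint hierarchical Bayesian learning (JHBL) methods of this kind build on: each aperture window gets its own weighted least-squares image update, while the per-pixel variance hyperparameters are shared across windows, which is what encodes joint sparsity. This is a minimal illustration under an inverse-gamma hyperprior, not the paper's Algorithm 3; the function name and all parameter values are placeholders.

    ```python
    import numpy as np

    def joint_hbl_sketch(As, bs, L, n_iter=50, alpha=1.0, beta=1e-4, sigma2=1e-2):
        """Minimal JHBL-style sketch (real-valued for simplicity; SAR data are
        complex). Each (A, b) pair is one sequential aperture window; the
        hyperparameters `theta` are shared across windows (joint sparsity)."""
        n = As[0].shape[1]
        theta = np.ones(L.shape[0])
        for _ in range(n_iter):
            # Image update per window: weighted least squares with shared prior
            D = L.T @ (L / theta[:, None])
            xs = [np.linalg.solve(A.T @ A / sigma2 + D, A.T @ b / sigma2)
                  for A, b in zip(As, bs)]
            # Hyperparameter update: closed form under the inverse-gamma
            # hyperprior, pooling the sparsifying transform of every window
            s = sum((L @ x) ** 2 for x in xs)
            theta = (0.5 * s + beta) / (alpha + 0.5 * len(As) + 1.0)
        return xs, theta
    ```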

  • Figure 1.  A graphical depiction of Theorem 2.3

    Figure 2.  A graphical depiction of the SAR phase history data (PHD) $ \hat{{\mathit{\boldsymbol{g}}}} $ (3) in $ k $-space, as well as the partitioning of the data into $ N_\theta $ partitions according to the azimuthal angle sets $ \Theta_n $ given by (9)
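    Concretely, this partitioning amounts to splitting the azimuth grid into $ N_\theta $ contiguous sets. A minimal sketch using illustrative values taken from Tables 1 and 2:

    ```python
    import numpy as np

    # Azimuth grid from Table 1: 0 to 359.9 degrees in 0.1-degree steps
    phi = np.arange(0.0, 360.0, 0.1)
    N_theta = 36  # number of partitions used by the 3D SRCI runs (Table 2)
    Theta = np.array_split(phi, N_theta)  # contiguous azimuthal sets Theta_n
    ```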

    Figure 3.  Different views at various dB thresholds of the 3D reconstruction of the synthetic cube data set in the ideal case; the ground truth point cloud is displayed in black. The threshold values were chosen to best demonstrate reconstruction quality
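    For reference, the dB thresholding used throughout these figures keeps a voxel when its magnitude is within the stated number of dB of the image peak. A minimal sketch (a hypothetical helper, not code from the paper):

    ```python
    import numpy as np

    def db_mask(img, thresh_db):
        """Keep voxels with 20*log10(|img| / max|img|) >= thresh_db (negative)."""
        mag = np.abs(img)
        db = 20.0 * np.log10(mag / mag.max() + 1e-300)  # epsilon avoids log(0)
        return db >= thresh_db
    ```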

    Figure 4.  Cross-sections of the (left) 2D IRB and (middle) 3D SRCI reconstructions of the cube data set with no additional noise. (right) MHD values at various dB threshold values when either technique is used on the cube data set with no additional noise; the minimum MHD value calculated for the 2D IRB method is 0.7017 cm, and for the 3D SRCI, the minimum MHD is 0.6014 cm
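    The MHD reported in these comparisons is the modified Hausdorff distance of Dubuisson and Jain: the larger of the two mean nearest-neighbor distances between the reconstructed point cloud and the ground truth. A minimal SciPy sketch (the paper's exact preprocessing and units may differ):

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    def mhd(A, B):
        """Modified Hausdorff distance between point clouds A, B of shape (n, 3)."""
        d_ab = cKDTree(B).query(A)[0].mean()  # mean nearest distance, A to B
        d_ba = cKDTree(A).query(B)[0].mean()  # mean nearest distance, B to A
        return max(d_ab, d_ba)
    ```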

    Figure 5.  Different views at various dB thresholds of the 3D reconstruction of the B747 data set with no added noise; ground truth CAD model is displayed in black

    Figure 6.  Cross-sections of the (left) 2D IRB and (middle) 3D SRCI reconstructions of the B747 data set with no additional noise. (right) MHD values at various dB threshold values when either technique is used on the B747 data set with no additional noise; the minimum MHD value calculated for the 2D IRB method is 1.832 cm, and for the 3D SRCI, the minimum MHD is 1.440 cm

    Figure 7.  Threshold value vs. MHD for the cube (left) and B747 (right) data sets comparing the 2D IRB and 3D SRCI for both the JHBL and MLE approximations. (top) SNR $ \approx 0 $ dB; (bottom) SNR $ \approx -24 $ dB. In all plots, the dashed blue lines are the MLE MHD values, while the solid red lines are the JHBL MHD values. In all cases, it is straightforward to infer the rest of the characterization of the MHD values by continuing the trends in (A)-(H)

    Figure 8.  Slices of the (left) MLE and (right) JHBL reconstructions of the B747 with SNR of -30 dB using the 3D SRCI approach
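    To reproduce experiments at a prescribed SNR, one standard recipe is to scale circular complex Gaussian noise against the mean power of the phase history data; the sketch below assumes this noise model, which may differ in detail from the paper's:

    ```python
    import numpy as np

    def add_noise(g, snr_db, rng=None):
        """Add complex Gaussian noise to phase history data g at roughly snr_db."""
        rng = np.random.default_rng(0) if rng is None else rng
        p_noise = np.mean(np.abs(g) ** 2) / 10.0 ** (snr_db / 10.0)
        noise = np.sqrt(p_noise / 2.0) * (rng.standard_normal(g.shape)
                                          + 1j * rng.standard_normal(g.shape))
        return g + noise
    ```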

    Figure 9.  Slices of the (left) 2D IRB and (right) 3D SRCI reconstructions of the B747 with SNR of approximately -34 dB. Note that for interpretability, the threshold dB scale is different for each figure

    Figure 10.  Different views at various dB thresholds of the 3D reconstruction of the B747 sub-sampled data set using our reconstruction techniques with no additional noise added; ground truth CAD model is displayed in black

    Figure 11.  Cross-sections of the (left) 2D IRB and (middle) 3D SRCI reconstructions of the sub-sampled B747 data set using the parameters in Tables 4 and 5. (right) MHD values at various dB threshold values

    Table 1.  Parameters of data sets used for experimentation

    Parameter            Dataset Value
    Elevation Range      $[-3^\circ, 3^\circ]$
    Elevation Sampling   $0.5^\circ$
    Frequency Range      [27, 39] GHz
    Frequency Sampling   50 MHz
    Bandwidth            12 GHz
    Center Frequency     33 GHz
    Azimuth Range        $[0^\circ, 359.9^\circ]$
    Azimuth Sampling     $0.1^\circ$

    Table 2.  Sizes of the inputs and outputs of Algorithm 3 for our numerical experiments

    Parameter         2D IRB (Algorithm 4)   3D SRCI (Algorithm 5)
    Image Size        201×201                201×201×201
    Data Size         241×13×2               241×13×100
    Data Partitions   1800                   36
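    These sizes follow directly from the Table 1 sampling parameters; the quick check below reproduces them under our reading of the tables, with the 2D IRB pairing 2 azimuth angles per partition and the 3D SRCI using 100:

    ```python
    n_freq = int(12e9 / 50e6) + 1      # 241 frequency samples over the 12 GHz band
    n_elev = int(6.0 / 0.5) + 1        # 13 elevation angles over [-3, 3] degrees
    n_azim = round(360.0 / 0.1)        # 3600 azimuth angles
    print(n_azim // 2, n_azim // 100)  # -> 1800 and 36 data partitions
    ```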

    Table 3.  Minimum MHD (cm) achieved across tested dB thresholds

                       2D IRB (Algorithm 4)   3D SRCI (Algorithm 5)
                       MLE       JHBL         MLE       JHBL
    Cube, High SNR     0.7281    0.6774       0.5925    0.5222
    Cube, Low SNR      0.6820    0.6408       0.5542    0.5570
    B747, High SNR     1.868     2.150        1.354     1.401
    B747, Low SNR      2.728     2.882        1.357     1.369

    Table 4.  Parameters of sub-sampled data set used for experimentation

    Parameter            Sub-sampled Value
    Elevation Range      $[-3^\circ, 3^\circ]$
    Elevation Sampling   $0.5^\circ$
    Frequency Range      [31, 35] GHz
    Frequency Sampling   150 MHz
    Bandwidth            4 GHz
    Center Frequency     33 GHz
    Azimuth Range        $[0^\circ, 359.9^\circ]$
    Azimuth Sampling     $0.3^\circ$

    Table 5.  Sizes of the parameter inputs and outputs for Algorithm 3 for the sub-sampled data experiments

    Parameter         2D IRB (Algorithm 4)   3D SRCI (Algorithm 5)
    Image Size        201×201                201×201×201
    Data Size         27×13×2                27×13×30
    Data Partitions   600                    40
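    The same bookkeeping as for Table 2 recovers these sizes from the Table 4 parameters (again our reading, with 2 and 30 azimuth angles per partition, respectively):

    ```python
    n_freq = int(4e9 / 150e6) + 1     # 27 frequency samples over the 4 GHz band
    n_azim = round(360.0 / 0.3)       # 1200 azimuth angles
    print(n_azim // 2, n_azim // 30)  # -> 600 and 40 data partitions
    ```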
