
Blind Image Inpainting with Mixture Noise Using ℓ0 and Total Regularization.

Xiaowei Xu1, Shiqi Geng1.   

Abstract

The blind image inpainting problem needs to be handled when faced with a large number of images, especially medical images in health care. A nonconvex sparse optimization model is proposed, and a proximal alternating direction method of multipliers (PADMM) is designed to solve it. First, ℓ0 sparse regularization is imposed on the binary mask, since the missing pixels are sparse in our experiments. Second, a total variation term is utilized to describe the underlying clean image. Finally, ℓ2 regularization of the fidelity term is used to handle the given blind inpainting problem. Experiments show that this method performs better than a traditional method and can deal with the blind image inpainting problem.
Copyright © 2022 Xiaowei Xu and Shiqi Geng.

Year:  2022        PMID: 36238477      PMCID: PMC9553350          DOI: 10.1155/2022/3180612

Source DB:  PubMed          Journal:  Comput Math Methods Med        ISSN: 1748-670X            Impact factor:   2.809


1. Introduction

Medical images can directly reflect the function and health status of human tissues and have become one of the standards for diagnosis and medical intervention. With the increasing availability and utilization of modern medical imaging, such as disease databases, X-ray films, and magnetic resonance imaging, the demand for automatic processing of medical image data is growing, and automatic medical image analysis has become one of the hot directions of contemporary medical imaging research [1, 2]. During image capture, broken imaging sensors or errors in information transmission may leave some pixels missing or corrupted by impulse noise [3-8]. In this work, an image is defined as a vector of n pixels in lexicographic ordering, x ∈ ℝ^n. The observation can be represented mathematically as

y = Ax,     (1)

where y ∈ ℝ^n is the observed image, x ∈ ℝ^n is the clean image, and A ∈ ℝ^{n×n} is a binary diagonal mask matrix (the identity matrix when no pixels are missing). If the clean image x is also corrupted by additive Gaussian noise ηg, the model is updated to

y = A(x + ηg).     (2)

When, in addition, the unobserved pixels of y are corrupted by impulse noise ηs, the blind image inpainting model with mixture (Gaussian and impulse) noise is finally described as follows [9]:

y = A(x + ηg) + (I − A)ηs.     (3)

How to solve Eq. (3) efficiently and effectively is the most important issue, and it is clearly quite challenging: it has three unknown terms, A, ηg, and ηs. The goal of this paper is to estimate the clean image x from the partial observation y without knowing the mask A or the noises ηg and ηs. In earlier research, the inpainting problem was solved by many image-reconstruction approaches based on sparse modeling [10-13]. A more common strategy for removing impulse noise in blind inpainting problems is to estimate an approximate A by computing the support set of the noisy pixels with an outlier-detection method, and then to apply a reconstruction method for the now-known mask A.
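As an illustration of the mixture-noise degradation model in its pixel-wise form, the corruption process can be simulated in a few lines of NumPy. This is our own sketch (not the authors' MATLAB code); the image, mask ratio, and noise levels are arbitrary stand-ins:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 64 * 64                      # image stored as a vector in lexicographic order
x = rng.uniform(0, 255, n)       # random stand-in for the clean image

a = (rng.random(n) > 0.3).astype(float)   # binary mask: 1 = observed, 0 = missing
eta_g = rng.normal(0, 15, n)              # Gaussian noise, sigma = 15
eta_s = rng.choice([0.0, 255.0], n)       # salt-and-pepper impulse noise

# Mixture-noise observation: y = a ⊗ (x + ηg) + (1 − a) ⊗ ηs
y = a * (x + eta_g) + (1 - a) * eta_s

# Because a is binary, a ⊗ (y − x) contains only the Gaussian noise on
# observed pixels, and (1 − a) ⊗ y contains only the impulse noise:
assert np.allclose(a * (y - x), a * eta_g)
assert np.allclose((1 - a) * y, (1 - a) * eta_s)
```

These two identities are exactly what motivates the fidelity and ℓ1 terms of the model built in Section 2.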
The difficulty of reconstructing an image with missing data and mixture noise lies mainly in detecting the locations of the outliers. Some filtering-based methods estimate the missing values, such as the adaptive median filter [14] and the adaptive center-weighted median filter [15]. Two-phase methods for blind inpainting first estimate the mask A with an outlier-detection approach; after the mask estimation, the inpainted image is generated by an image-reconstruction step, implemented as a standard convex optimization. Several convex methods have been utilized to estimate the inpainted images [16], and the total variation (TV) regularizer [17] has also been used for this purpose. Xiao et al. [18] proposed a combination of ℓ1-norm and ℓ0-norm regularizers for simultaneously removing impulse noise and learning a dictionary after the mask estimation. In addition, the authors of [19] presented an approach for mixed impulse and Gaussian noise removal, in which a logarithmic transformation converts the multiplication between the image and the binary mask into a sum; the image and mask terms are then estimated iteratively with TV regularization applied to the image. Notably, that method can also be extended to impulse noise removal by relaxing the regularizer from the ℓ0 norm to the ℓ1 norm. Other approaches estimate the mask and the impulse-noise field within a single iterative process instead of a separate mask-detection step, such as a low-rank matrix recovery method [20]. The approach proposed here belongs to this category of simultaneous estimation. To address the challenging blind inpainting task with mixture noise, a novel model is proposed that imposes ℓ0 sparse regularization on the binary mask; the model can be efficiently solved by a designed proximal alternating direction method of multipliers (PADMM).
The main contributions of this work are as follows: 1) a new model that fits the practical situation of the blind inpainting problem is proposed; 2) the new model handles the challenging blind inpainting task with mixture noise; 3) an efficient algorithm is given to solve the proposed model effectively. The outline of this paper is as follows. The proposed model is built in Section 2. The designed algorithm and the solution of each subproblem are described in Section 3. Section 4 shows the experimental results and analysis. Finally, conclusions are drawn in Section 5.

2. Model Building

This paper proposes a minimization model to solve the blind image inpainting problem with mixture noise. Denoting by a the vector form of the mask A, Eq. (3) can be expressed as

y = a ⊗ (x + ηg) + (1 − a) ⊗ ηs,     (4)

where ⊗ stands for the element-wise product between vectors and 1 is the vector with all entries equal to 1. Here x ∈ ℝ^n is the vector form of a matrix X ∈ ℝ^{n1×n2} with n = n1 × n2 in (3), and a, y are vectors of the same dimension as x. Since a marks the observed pixels, 1 − a represents the locations of the impulse noise in the image. The following observations motivate the proposed model for blind image inpainting with mixture noise:
1. Since a ⊗ (y − x) mainly represents the Gaussian noise, we use the squared ℓ2 norm ‖a ⊗ (y − x)‖2² as the fidelity term.
2. (1 − a) ⊗ y only contains impulse noise (e.g., salt-and-pepper or random-valued), so we use the ℓ1 sparse regularization ‖(1 − a) ⊗ y‖1 to describe it.
3. If the impulse noise is relatively dense, we impose ℓ0 regularization on a, i.e., ‖a‖0; if the impulse noise is sparse and the condition of minimizing ‖a‖0 does not hold, a small parameter can easily be set to control this term.
4. Finally, we employ the (anisotropic) total variation (TV) regularization ‖∇x‖1 on the underlying clean image x; TV regularization is quite popular and useful in image-processing applications.

As discussed above, we formulate the final proposed model for the inpainting task as

min_{x,a} (1/2)‖a ⊗ (y − x)‖2² + λ1‖(1 − a) ⊗ y‖1 + λ2‖a‖0 + λ3‖∇x‖1.     (5)

Following [8], the ℓ0 term can be equivalently described as ‖a‖0 = min_v ⟨1, 1 − v⟩ such that v ⊗ |a| = 0 and 0 ≤ v ≤ 1. With the auxiliary variables z = ∇x and w = (1 − a) ⊗ y, Eq. (5) can be expressed as the constrained problem

min (1/2)‖a ⊗ (y − x)‖2² + λ1‖w‖1 + λ2⟨1, 1 − v⟩ + λ3‖z‖1
s.t. z = ∇x,  w = (1 − a) ⊗ y,  v ⊗ |a| = 0,  0 ≤ v ≤ 1.     (6)

Instead of the constrained minimization (6), we consider the following augmented Lagrangian problem:

ζ(x, a, z, w, v, π1, π2, π3) = (1/2)‖a ⊗ (y − x)‖2² + λ1‖w‖1 + λ2⟨1, 1 − v⟩ + λ3‖z‖1 + ⟨π1, ∇x − z⟩ + (β1/2)‖∇x − z‖2² + ⟨π2, (1 − a) ⊗ y − w⟩ + (β2/2)‖(1 − a) ⊗ y − w‖2² + ⟨π3, v ⊗ |a|⟩ + (β3/2)‖v ⊗ |a|‖2²,     (7)

where π1, π2, and π3 are Lagrange multipliers, and β1, β2, and β3 are three positive penalty parameters.
The Lagrangian problem ζ(x, a, z, w, v, π1, π2, π3) can be minimized alternately and iteratively via the subproblems given in Section 3.
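The variational characterization of the ℓ0 term (‖a‖0 expressed through an auxiliary variable v with v ⊗ |a| = 0 and 0 ≤ v ≤ 1) can be checked numerically. In this small NumPy sketch (our own illustration, not part of the paper), the optimal v simply flags the zero entries of a:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.choice([0.0, 0.0, 1.0, -2.5], size=20)   # a sparse vector

# Optimal v for:  min_v <1, 1 - v>  s.t.  v ⊗ |a| = 0,  0 <= v <= 1.
# The constraint forces v_i = 0 wherever a_i != 0; minimizing <1, 1 - v>
# then sets v_i = 1 wherever a_i == 0.
v = (a == 0).astype(float)

assert np.all(v * np.abs(a) == 0)            # feasibility: v ⊗ |a| = 0
assert np.sum(1 - v) == np.count_nonzero(a)  # <1, 1 - v> equals ||a||_0
```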

3. Model Solution Method

We add the proximal term (1/2)‖x − x^k‖_D² to the x-subproblem derived from Eq. (7), where ‖x‖_D² = xᵀDx for a chosen positive definite matrix D, and obtain the proximal x-subproblem

x^{k+1} = argmin_x (1/2)‖a^k ⊗ (y − x)‖2² + ⟨π1^k, ∇x − z^k⟩ + (β1/2)‖∇x − z^k‖2² + (1/2)‖x − x^k‖_D²,     (8)

which is quadratic; its solution follows in closed form from the first-order optimality condition, i.e., from the linear system

(diag(a^k ⊗ a^k) + β1∇ᵀ∇ + D) x = a^k ⊗ a^k ⊗ y + ∇ᵀ(β1 z^k − π1^k) + D x^k.

The a-subproblem is

a^{k+1} = argmin_a (1/2)‖a ⊗ (y − x^{k+1})‖2² + ⟨π2^k, (1 − a) ⊗ y − w^k⟩ + (β2/2)‖(1 − a) ⊗ y − w^k‖2² + ⟨π3^k, v^k ⊗ |a|⟩ + (β3/2)‖v^k ⊗ |a|‖2².

Because of the absolute value |a|, it is solved component-wise by discussing the two cases a > 0 and a < 0 and taking the candidate with the smaller objective value, which gives a closed-form reformulation. The z-subproblem can be written as the minimization problem

z^{k+1} = argmin_z λ3‖z‖1 + (β1/2)‖∇x^{k+1} + π1^k/β1 − z‖2²,

which has a closed-form solution by soft-thresholding [7], z^{k+1} = soft(∇x^{k+1} + π1^k/β1, λ3/β1), where soft(u, τ) = sign(u) ⊗ max(|u| − τ, 0). Similarly, the w-subproblem

w^{k+1} = argmin_w λ1‖w‖1 + (β2/2)‖(1 − a^{k+1}) ⊗ y + π2^k/β2 − w‖2²

holds the closed-form soft-thresholding solution w^{k+1} = soft((1 − a^{k+1}) ⊗ y + π2^k/β2, λ1/β2). The v-subproblem

v^{k+1} = argmin_{0 ≤ v ≤ 1} λ2⟨1, 1 − v⟩ + ⟨π3^k, v ⊗ |a^{k+1}|⟩ + (β3/2)‖v ⊗ |a^{k+1}|‖2²

is separable and also admits a closed-form solution, obtained by projecting the unconstrained minimizer of each component onto [0, 1] [9]. We finally update the Lagrange multipliers by

π1^{k+1} = π1^k + β1(∇x^{k+1} − z^{k+1}),
π2^{k+1} = π2^k + β2((1 − a^{k+1}) ⊗ y − w^{k+1}),
π3^{k+1} = π3^k + β3(v^{k+1} ⊗ |a^{k+1}|).

With the initial guesses x^0 = v^0 = a^0 = 0, we may effectively obtain the solution of the constrained model (5). We summarize the above steps in the following Algorithm 1:
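The soft-thresholding operator behind the closed-form z- and w-updates can be sketched as follows. This is a generic NumPy illustration (not the authors' implementation); the update shape quoted in the comment, z = soft(∇x + π1/β1, ·), is the usual form such ADMM updates take:

```python
import numpy as np

def soft_threshold(u, tau):
    """Closed-form solution of  min_z  tau*||z||_1 + 0.5*||z - u||_2^2,
    applied element-wise: shrink u toward zero by tau, zeroing small entries."""
    return np.sign(u) * np.maximum(np.abs(u) - tau, 0.0)

# Updates like z = soft(∇x + π1/β1, λ/β1) have exactly this element-wise form.
u = np.array([3.0, -0.5, 1.2, 0.0])
z = soft_threshold(u, 1.0)
# Each entry is shrunk by 1 and small entries are zeroed: [2.0, 0.0, 0.2, 0.0]
```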
Algorithm 1

Solve the optimization model (5) by PADMM.

Although Algorithm 1 involves several parameters, they are in practice not sensitive and are easy to select. We also compute the energy (objective value) at each iteration; if the decrease of the energy falls below a given tolerance, the iteration stops and the final result is output. In the next section, we exhibit the experimental results to demonstrate the effectiveness of the proposed method.
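The energy-based stopping rule can be illustrated with a generic iterative scheme. This minimal sketch is our own illustration (not Algorithm 1 itself): it uses plain gradient descent on a toy quadratic, stopping once the relative change of the energy drops below a tolerance:

```python
import numpy as np

def minimize_with_energy_stop(grad, energy, x0, step=0.1, tol=1e-6, max_iter=1000):
    """Generic iteration with an energy-based stopping rule: stop when the
    relative decrease of the energy is below tol."""
    x = x0
    e_prev = energy(x)
    for k in range(max_iter):
        x = x - step * grad(x)
        e = energy(x)
        if abs(e_prev - e) <= tol * max(abs(e_prev), 1.0):
            return x, k + 1
        e_prev = e
    return x, max_iter

# Toy objective: E(x) = 0.5*||x - 1||^2, minimized at x = 1.
x, iters = minimize_with_energy_stop(lambda x: x - 1.0,
                                     lambda x: 0.5 * np.sum((x - 1.0) ** 2),
                                     np.zeros(3))
assert np.allclose(x, 1.0, atol=1e-2)
```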

4. Experimental Results and Analysis

The numerical experiments in this section are implemented in MATLAB (R2016a) on both simulated and real images. The experimental computer has 2 GB RAM and an Intel(R) Core(TM) i3-2370M CPU @ 2.40 GHz. Since the literature on blind image inpainting with mixture noise is limited, we compare the proposed method only with one recent state-of-the-art blind inpainting approach [6], denoted "ASInpaint". In Figure 1, we present the whole degradation process of an image jointly corrupted by Gaussian noise and impulse noise; the goal of this work is to recover the clean image X from the degraded image Y. To evaluate the quantitative performance of the compared approaches, we employ two metrics: peak signal-to-noise ratio (PSNR) and structural similarity (SSIM) [21].
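Of the two metrics, PSNR has a simple closed form for 8-bit images. The following sketch (our own illustration, not the paper's evaluation code) shows the standard definition:

```python
import numpy as np

def psnr(clean, recovered, peak=255.0):
    """Peak signal-to-noise ratio in dB for images with values in [0, peak]."""
    mse = np.mean((clean.astype(float) - recovered.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

x = np.full((8, 8), 100.0)
y = x + 10.0                  # uniform error of 10 gray levels
# MSE = 100, so PSNR = 10*log10(255^2 / 100) ≈ 28.13 dB
print(round(psnr(x, y), 2))
```

SSIM is more involved (it compares local luminance, contrast, and structure [21]) and is best computed with an established implementation.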
Figure 1

The flowchart of how the input image is simulated. Note that A and X are both unknown and need to be computed.

In the experiments, we assume that the pixel values lie in the interval [0, 255]. The added salt-and-pepper impulse noise ηs takes a value of either 0 or 255, and the pixel values under Gaussian noise are likewise kept within the interval [0, 255]. For the parameters of Algorithm 1, we empirically set λ1 = 0.9, λ2 = 0.08, λ3 = 0.08, and β1 = β2 = β3 = 200 in all experiments. Note that the parameters could be tuned to get better results; we fix them here to illustrate the stability of the given method. For the parameters of "ASInpaint", we keep the default settings of the provided code. In Figure 2, we illustrate the visual performance of the two compared methods on four different simulated images with mixture noise (see Figure 2(a)), named "Lena", "Cameraman", "Phantom", and "Satellite". We added Gaussian and impulse mixture noise to the images (see Figure 2(b)). In particular, we evaluate the effectiveness of the proposed method on the same image with different levels of mixture noise (see the 1st and 3rd rows of Figure 2). Although the ASInpaint approach also obtains competitive results (see Figure 2(d)), the proposed method achieves better visual performance, especially on the shape profiles of the images (see Figure 2(e)). We must emphasize that the visual performance of both methods may not match the recovered images reported in other literature: the problem addressed here is quite challenging, with four unknown variables, namely the underlying clean image X, the mask A, the impulse noise Ni, and the Gaussian noise Ng. Although the formulation reduces these to only two unknown variables, X and A, it is still very difficult to recover the underlying clean image X. Nevertheless, the given model recovers relatively good visual results with the given algorithm.
Figure 2

The visual comparisons between ASInpaint and the proposed method. (a) The ground-truth image; (b) The mask for the missing pixels; (c) The degraded image by Gaussian and impulse noise; (d) The recovered image by ASInpaint [6]; (e) The recovered image by the proposed method.

Meanwhile, denoising results on a real color image by all competing methods (the NLH, NSNR, and WNNM methods) are shown in Figure 3. The 15 test images used in the image denoising experiments are shown in Figure 4. Inpainting results on the image Starfish by different methods (random mask with 75% missing values) are shown in Figure 5, and inpainting results on the image Monarch by different methods (text mask) are shown in Figure 6. These results show that the PADMM algorithm achieves high image denoising performance.
Figure 3

Denoising results on a real color image by all competing methods.

Figure 4

The 15 test images used in image denoising experiments.

Figure 5

Inpainting results on images Starfish by different methods (Random mask with 75% missing values).

Figure 6

Inpainting results on images Monarch by different methods (Text mask).

The quantitative comparisons of all methods are reported in Table 1, which indicates that the PADMM algorithm improves performance and yields the best quantitative results. Meanwhile, we test the PSNR and SSIM performance of the NLH, KSVD, BM3D, and WNNM algorithms on the 15 pictures under different values of σn; the experimental results are shown in Tables 2 and 3. The results show that as the value of σn increases, the PSNR and SSIM of each picture gradually decrease, and the performance of these algorithms is significantly worse than that of the PADMM algorithm.
Table 1

The quantitative performance in Figure 2 for the two compared methods under the corresponding noise setting, i.e., the missing proportion for the impulse noise Ni and the σn for the Gaussian noise Ng.

Image       Noise setting           ASInpaint [6]       PADMM
                                    PSNR     SSIM       PSNR     SSIM
Lena        50% missing, σn = 15    21.82    0.6347     22.78    0.6650
Cameraman   30% missing, σn = 19    19.13    0.5435     19.60    0.5630
Lena        10% missing, σn = 15    24.29    0.7175     24.49    0.7240
Phantom     10% missing, σn = 0.4   49.25    0.9657     49.35    0.9666
Satellite   10% missing, σn = 15    22.41    0.6446     22.54    0.7662
Table 2

Denoising results (PSNR, SSIM) by competing methods on 15 test images.

            NLH                KSVD               BM3D               WNNM
            PSNR      SSIM     PSNR      SSIM     PSNR      SSIM     PSNR      SSIM

σn = 15
C.man       32.0054   0.9001   31.4074   0.8926   31.9152   0.9007   32.1768   0.9036
House       35.2832   0.8981   34.3080   0.8758   34.9447   0.8907   35.1533   0.8909
Peppers     32.9416   0.9087   32.2062   0.8987   32.7017   0.9064   32.9740   0.9098
Straws      28.5721   0.9285   28.3231   0.9262   28.5618   0.9317   29.1396   0.9396
Leaves      32.0951   0.9697   30.8806   0.9562   31.7233   0.9659   32.8266   0.9735
StarFish    31.4140   0.9007   30.7377   0.8931   31.1458   0.9007   31.8255   0.9081
Monarch     32.1065   0.9388   31.3864   0.9291   31.8597   0.9360   32.7178   0.9424
Airplane    31.4084   0.9025   30.7955   0.8937   31.0768   0.8995   31.4004   0.9029
Ma          31.9838   0.8657   31.4910   0.8544   31.9293   0.8667   32.1230   0.8701
J.Bean      36.1662   0.9708   35.5188   0.9635   35.7038   0.9678   36.5642   0.9735
Couple      31.9414   0.8692   31.4498   0.8540   32.1087   0.8761   32.1818   0.8746
Parrot      31.3826   0.8919   31.0367   0.8915   31.3760   0.8944   31.6071   0.8968
Barbara     32.8384   0.9216   32.4214   0.9099   33.1141   0.9228   33.6114   0.9277
Boat        31.9944   0.8483   31.7033   0.8410   32.1401   0.8534   32.2800   0.8549
Lena        34.1902   0.8953   33.7410   0.8851   34.2716   0.8950   34.3822   0.8973

σn = 30
C.man       28.8607   0.8402   28.0158   0.8157   28.6377   0.8366   28.7827   0.8399
House       32.4570   0.8502   31.1754   0.8305   32.0871   0.8474   32.5510   0.8523
Peppers     29.5743   0.8540   28.7910   0.8407   29.2799   0.8500   29.4916   0.8567
Straws      24.4253   0.8038   24.3021   0.7964   24.8358   0.8320   25.2457   0.8497
Leaves      28.1228   0.9333   26.9665   0.9118   27.8111   0.9275   28.6083   0.9389
StarFish    27.8924   0.8331   27.2325   0.8130   27.6535   0.8286   28.0689   0.8357
Monarch     28.7220   0.8891   28.0109   0.8717   28.3641   0.8817   28.9135   0.8926
Airplane    27.9736   0.8439   27.2595   0.8252   27.5592   0.8366   27.8176   0.8438
Ma          28.9999   0.7803   28.3244   0.7514   28.8597   0.7798   28.9798   0.7818
J.Bean      32.0428   0.9321   31.6162   0.9227   31.9669   0.9350   32.5005   0.9438
Couple      28.9726   0.7964   28.9726   0.7463   28.8691   0.7943   28.9679   0.7945
Parrot      28.3200   0.8319   27.5551   0.8186   28.1184   0.8313   28.3202   0.8346
Barbara     29.8374   0.8746   28.6006   0.8226   29.8136   0.8682   30.3086   0.8812
Boat        29.1663   0.7785   28.4093   0.7440   29.1172   0.7791   29.2262   0.7792
Lena        31.3194   0.8474   30.4192   0.8245   31.2621   0.8443   31.4315   0.8502

σn = 50
C.man       26.3466   0.7903   25.7361   0.7451   26.1130   0.7822   26.4176   0.7848
House       30.5178   0.8306   27.9468   0.7602   29.6939   0.8116   30.3325   0.8231
Peppers     27.0524   0.8063   26.0368   0.7695   26.6834   0.7932   26.9123   0.8008
Straws      21.6929   0.6308   21.3263   0.5800   22.2874   0.6898   22.7261   0.7305
Leaves      25.3567   0.8907   24.2136   0.8571   24.6818   0.8677   25.4721   0.8925
StarFish    25.2100   0.7492   24.3876   0.7125   25.0443   0.7429   25.4327   0.7596
Monarch     26.2902   0.8354   25.1663   0.7937   25.8186   0.8196   26.3170   0.8350
Airplane    25.3611   0.7821   24.6200   0.7431   25.1022   0.7716   25.4244   0.7850
Ma          26.8762   0.7031   26.0308   0.6625   26.8081   0.7051   26.9373   0.7090
J.Bean      29.6937   0.9114   28.1745   0.8526   29.2595   0.8998   29.6351   0.9098
Couple      26.4604   0.7057   25.3037   0.6309   26.4638   0.7064   26.6436   0.7135
Parrot      25.9856   0.7810   25.4187   0.7540   25.8984   0.7804   26.0926   0.7847
Barbara     27.4833   0.8128   25.5600   0.7191   27.2254   0.7942   27.7887   0.8199
Boat        26.8625   0.7032   25.9357   0.6569   26.7808   0.7050   26.9693   0.7083
Lena        29.2023   0.8069   27.8701   0.7606   29.0502   0.7989   29.2512   0.8059

σn = 75
C.man       24.7529   0.7492   23.1804   0.6550   24.3254   0.7334   24.5520   0.7353
House       28.5325   0.7963   25.3369   0.6800   27.5085   0.7640   28.2378   0.7887
Peppers     25.1632   0.7510   23.5163   0.6835   24.7341   0.7364   24.9152   0.7418
Straws      20.4422   0.5214   19.2792   0.3604   20.5588   0.5440   21.0039   0.6040
Leaves      23.0093   0.8306   20.7623   0.7296   22.4889   0.8070   23.0594   0.8350
StarFish    23.2220   0.6642   22.1093   0.6027   23.2746   0.6667   23.4720   0.6801
Monarch     24.4982   0.7831   22.9080   0.7183   23.9073   0.7553   24.3075   0.7754
Airplane    23.6914   0.7289   22.3293   0.6611   23.4749   0.7145   23.7407   0.7302
Ma          27.4366   0.8824   25.4310   0.7616   27.2153   0.8565   27.4233   0.8707
J.Bean      27.4366   0.8824   25.4310   0.7616   27.2153   0.8565   27.4233   0.8707
Couple      24.9190   0.6422   23.5776   0.5511   24.6988   0.6257   24.8577   0.6369
Parrot      24.3794   0.7421   23.3786   0.6820   24.1856   0.7302   24.3698   0.7410
Barbara     25.6379   0.7430   23.0497   0.6032   25.1238   0.7108   25.8123   0.7486
Boat        25.3021   0.6487   23.9756   0.5795   25.1196   0.6407   25.2951   0.6465
Lena        27.5996   0.7706   25.7484   0.6939   27.2569   0.7510   27.5432   0.7657

σn = 100
C.man       23.5329   0.7050   21.6712   0.5762   23.0813   0.6922   23.3579   0.6968
House       26.7203   0.7589   23.6751   0.6186   25.8723   0.7196   26.6640   0.7536
Peppers     23.8028   0.7076   21.8289   0.6238   23.3946   0.6876   23.4485   0.6978
Straws      19.4043   0.4004   18.3801   0.2655   19.4303   0.4223   19.6878   0.4537
Leaves      21.5963   0.7844   18.2896   0.5934   20.9095   0.7481   21.5658   0.7884
StarFish    22.1677   0.6158   20.9669   0.5397   22.0977   0.6051   22.2263   0.6170
Monarch     23.1498   0.7322   20.5568   0.6154   22.5185   0.7017   22.9500   0.7257
Airplane    22.6891   0.6953   20.8416   0.5773   22.1094   0.6710   22.5529   0.6854
Ma          24.4735   0.6139   23.3894   0.5482   24.2237   0.5975   24.3584   0.6048
J.Bean      26.1801   0.8561   23.6984   0.6911   25.8010   0.8175   26.0293   0.8337
Couple      23.7407   0.5835   22.6183   0.4992   23.5107   0.5661   23.5597   0.5702
Parrot      23.1367   0.7053   21.8362   0.6820   22.9593   0.6892   23.1863   0.7045
Barbara     24.4712   0.6960   21.8834   0.5332   23.6243   0.6426   24.1098   0.6862
Boat        24.2023   0.6073   22.7806   0.5246   23.9703   0.5932   24.1098   0.5981
Lena        26.4509   0.7428   24.3508   0.6387   25.9548   0.7085   26.2127   0.7256
Table 3

The average PSNR and SSIM values by competing methods on the 15 test images. The best results are highlighted in bold.

        σn = 15            σn = 30            σn = 50            σn = 75            σn = 100
        PSNR      SSIM     PSNR      SSIM     PSNR      SSIM     PSNR      SSIM     PSNR      SSIM
BM3D    32.3048   0.9072   28.9490   0.8448   26.4607   0.7779   24.6126   0.7120   23.2972   0.6575
KSVD    31.8271   0.8977   28.3099   0.8223   25.5818   0.7332   23.2684   0.6372   21.7845   0.5685
NLH     32.4216   0.9073   28.3099   0.8459   26.6928   0.7826   24.9342   0.7269   23.4999   0.6747
WNNM    32.7309   0.9116   29.2809   0.8517   26.8235   0.7908   24.9360   0.7300   23.6013   0.6761

5. Conclusions

In this paper, we present a novel optimization model and design the corresponding algorithm to address the challenging blind inpainting task with mixture noise. There are three main contributions in this work: 1) the model integrates ℓ0 sparse regularization on the binary mask, a total variation term on the underlying clean image, and ℓ2 regularization for the fidelity term; 2) a proximal alternating direction method of multipliers (PADMM) is designed and implemented to solve the optimization problem; 3) experiments on simulated examples with complex mixture noise are carried out, and the visual and quantitative results demonstrate that the proposed method outperforms the compared method.
References (5 in total)

1.  Image quality assessment: from error visibility to structural similarity.

Authors:  Zhou Wang; Alan Conrad Bovik; Hamid Rahim Sheikh; Eero P Simoncelli
Journal:  IEEE Trans Image Process       Date:  2004-04       Impact factor: 10.856

2.  Sparse MRI: The application of compressed sensing for rapid MR imaging.

Authors:  Michael Lustig; David Donoho; John M Pauly
Journal:  Magn Reson Med       Date:  2007-12       Impact factor: 4.668

3.  Blind inpainting using l0 and total variation regularization.

Authors:  Manya V Afonso; Joao Miguel Raposo Sanches
Journal:  IEEE Trans Image Process       Date:  2015-07       Impact factor: 10.856

4.  Prediction of fetal weight at varying gestational age in the absence of ultrasound examination using ensemble learning.

Authors:  Yu Lu; Xianghua Fu; Fangxiong Chen; Kelvin K L Wong
Journal:  Artif Intell Med       Date:  2019-11-17       Impact factor: 5.326

5.  Adaptive Swarm Balancing Algorithms for rare-event prediction in imbalanced healthcare data.

Authors:  Jinyan Li; Lian-Sheng Liu; Simon Fong; Raymond K Wong; Sabah Mohammed; Jinan Fiaidhi; Yunsick Sung; Kelvin K L Wong
Journal:  PLoS One       Date:  2017-07-28       Impact factor: 3.240

