
Light pollution (from street lighting, etc.) and moon glow are two examples of how varying levels of background light can affect images during exposure. PixInsight has two tools to deal with this: Automatic Background Extraction (ABE) and Dynamic Background Extraction (DBE). The steps below show the latter.

The sample boxes placed over the background should produce smoothly transitioning levels in the background model (left-hand image), and with greyscale images the 'FlatContourPlot' script can help as a comparison with the background model (right-hand image). If the level transitions are not smooth, the sample points will need changing or moving. In general, a smooth appearance is achieved by removing the points over anomalous areas.
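To illustrate the idea, here is a minimal Python sketch of fitting a smooth background model to sample-point values. This is only an analogue: DBE itself uses surface splines rather than the simple polynomial fit below, and the function name `fit_background` is just for illustration.

```python
import numpy as np

def fit_background(samples, shape, degree=2):
    """Fit a 2-D polynomial background model to sample-point values.

    samples: list of (x, y, value) taken from star-free background areas.
    Returns a full-frame background image evaluated from the fit.
    (Illustrative only -- DBE uses surface-spline interpolation.)
    """
    xs = np.array([s[0] for s in samples], float)
    ys = np.array([s[1] for s in samples], float)
    vs = np.array([s[2] for s in samples], float)
    # Design matrix of polynomial terms x^i * y^j with i + j <= degree
    terms = [(i, j) for i in range(degree + 1)
                    for j in range(degree + 1 - i)]
    A = np.column_stack([xs**i * ys**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, vs, rcond=None)
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]]
    return sum(c * xx**i * yy**j for c, (i, j) in zip(coeffs, terms))

# A synthetic linear sky gradient recovered from nine sample points:
gradient = lambda x, y: 0.1 + 0.0005 * x + 0.0002 * y
pts = [(x, y, gradient(x, y)) for x in (0, 50, 99) for y in (0, 50, 99)]
bg = fit_background(pts, (100, 100))
```

Subtracting (or dividing out) `bg` from the image is the "correction" step; the smoothness check described above is just verifying that the model contains no star residue.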
LRGB image processing is covered immediately below; narrowband image processing starts after that.
With LRGB images, the next step is 'LinearFit', which matches the mean background and signal levels of the master images (L, R, G and B) before they are combined into a single colour image. 'LinearFit' needs one of the individual frames selected as a reference, to which the others will be corrected; it's usual to select the one with the most signal. Hover the mouse over the image on the left to compare the histograms for each filter (green being the strongest).
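The underlying operation is a least-squares straight-line fit of each channel against the reference; a simplified sketch (PixInsight's LinearFit additionally applies rejection limits, omitted here for clarity):

```python
import numpy as np

def linear_fit(target, reference):
    """Match target's background and signal levels to reference with a
    linear transform y = a + b*x, fitted by least squares.  This is the
    core idea of PixInsight's LinearFit (minus its rejection limits)."""
    b, a = np.polyfit(target.ravel(), reference.ravel(), 1)
    return a + b * target

rng = np.random.default_rng(1)
ref = rng.random((64, 64))      # e.g. the green master (strongest signal)
tgt = 0.02 + 0.5 * ref          # a dimmer channel of the same scene
fitted = linear_fit(tgt, ref)   # now on the same scale as ref
```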

This is straightforward: ensure RGB is selected under 'Color Space', assign the correct linear-fitted master frames to their respective channels, and 'Apply Global' (F6).
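In array terms this step is simply stacking the three fitted masters into one colour image; a trivial sketch of the RGB colour-space case:

```python
import numpy as np

# After LinearFit the three masters share the same scale, so channel
# combination is just stacking them into an (H, W, 3) colour array:
r = np.full((4, 4), 0.3)
g = np.full((4, 4), 0.5)
b = np.full((4, 4), 0.2)
rgb = np.stack([r, g, b], axis=-1)
```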

Deconvolution is an optional step that tries to correct for loss of definition from atmospheric distortion, and it is only applied to the luminance channel/image. The first step is to create a point spread function image, which has been made much easier by the introduction of the 'PSFImage' script.
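PixInsight's regularised algorithms are beyond a short example, but plain Richardson-Lucy deconvolution shows the principle. This sketch assumes a full-frame, centred, normalised PSF (in practice the PSF would come from the PSFImage script) and uses circular FFT convolution:

```python
import numpy as np

def rl_deconvolve(image, psf, iterations=30):
    """Plain Richardson-Lucy deconvolution.  'psf' is a full-frame,
    centred, normalised kernel; convolution is circular via the FFT.
    (PixInsight's Deconvolution adds regularisation on top of this.)"""
    psf_ft = np.fft.rfft2(np.fft.ifftshift(psf))
    conv = lambda a, ft: np.fft.irfft2(np.fft.rfft2(a) * ft, s=a.shape)
    estimate = np.full_like(image, image.mean())
    for _ in range(iterations):
        blurred = conv(estimate, psf_ft)
        ratio = image / np.maximum(blurred, 1e-12)
        # Correlation with the flipped PSF = conjugate in Fourier space
        estimate = estimate * conv(ratio, np.conj(psf_ft))
    return estimate

# Synthetic check: a star on a faint background, blurred by a Gaussian PSF
n = 32
yy, xx = np.mgrid[0:n, 0:n]
psf = np.exp(-((xx - 16)**2 + (yy - 16)**2) / (2 * 2.0**2))
psf /= psf.sum()
scene = np.full((n, n), 0.01)
scene[16, 16] = 1.0
blurred = np.fft.irfft2(np.fft.rfft2(scene) *
                        np.fft.rfft2(np.fft.ifftshift(psf)), s=scene.shape)
sharp = rl_deconvolve(blurred, psf)
```

Each iteration compares the current estimate (re-blurred by the PSF) against the observed image and multiplies in a correction, which is why a good PSF measurement matters so much.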

Using the 'StarMask' process, reduce the 'Noise threshold' to about 0.02 and increase the small-scale 'Structure Growth' to 3, which should be sufficient to produce a suitable star mask.
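Conceptually, a star mask is a thresholded detection image that is then grown outwards. StarMask works on multiscale structures, so the sketch below is only a toy analogue of the 'Noise threshold' and 'Structure Growth' settings:

```python
import numpy as np

def star_mask(image, noise_threshold=0.02, growth=3):
    """Toy star mask: threshold above the background, then grow
    (dilate) the detections.  A crude analogue of StarMask's
    'Noise threshold' and small-scale 'Structure Growth' controls."""
    mask = (image > noise_threshold).astype(float)
    for _ in range(growth):       # naive plus-shaped binary dilation
        padded = np.pad(mask, 1)
        mask = np.maximum.reduce([
            padded[1:-1, 1:-1], padded[:-2, 1:-1], padded[2:, 1:-1],
            padded[1:-1, :-2], padded[1:-1, 2:]])
    return mask

img = np.zeros((9, 9))
img[4, 4] = 1.0                   # a single bright star
mask = star_mask(img)             # grows into a small protected region
```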

Hover the mouse over the image to see the difference

Switching attention back to the RGB master image, it now needs to be colour corrected using the 'PhotometricColorCalibration' process, which also allows background neutralisation to be carried out at the same time. Most of the parameters can be left at their defaults, but a couple of items need updating:
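The background-neutralisation part amounts to equalising the sky level of the three channels; here is a minimal sketch under that assumption (the white-balance side of PhotometricColorCalibration additionally uses photometry of catalogued stars, which is well beyond a short example, and `neutralise_background` is just an illustrative name):

```python
import numpy as np

def neutralise_background(rgb, bg_region):
    """Shift each channel so the median of a star-free background
    region is equal in R, G and B -- the basic idea behind the
    background-neutralisation step.  bg_region is a
    (row_slice, col_slice) pair selecting an area of plain sky."""
    meds = np.array([np.median(rgb[bg_region + (c,)]) for c in range(3)])
    target = meds.mean()
    return rgb + (target - meds)    # broadcast per-channel offsets

rng = np.random.default_rng(0)
img = rng.random((32, 32, 3)) * 0.01
img[..., 0] += 0.05                 # red-biased sky glow
region = (slice(0, 16), slice(0, 16))
balanced = neutralise_background(img, region)
```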
Previously I used 'MultiscaleLinearTransform' and 'TGVDenoise', but since the introduction of the 'EZ Processing Suite' scripts I've started using 'EZ Denoise', mostly with the default settings, and found it works extremely well in most cases. It is applied to both the luminance and RGB master images.
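As a rough illustration of the multiscale idea behind these tools: split the image into a smooth component and a detail residual, then attenuate the residual. This is only a toy box-blur analogue; EZ Denoise wraps far more capable algorithms.

```python
import numpy as np

def soft_denoise(image, iterations=2, amount=0.5):
    """Toy noise reduction in the spirit of multiscale methods:
    separate a smooth component (3x3 box blur) from the detail
    residual and scale the residual down by 'amount'."""
    out = image.astype(float)
    for _ in range(iterations):
        padded = np.pad(out, 1, mode="edge")
        smooth = sum(padded[i:i + out.shape[0], j:j + out.shape[1]]
                     for i in range(3) for j in range(3)) / 9.0
        out = smooth + amount * (out - smooth)
    return out

rng = np.random.default_rng(2)
noisy = 0.5 + 0.05 * rng.standard_normal((64, 64))
clean = soft_denoise(noisy)       # noticeably lower noise variance
```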

To get a good balance between the luminance and RGB images when they are combined, the luminance component of the RGB image should match the levels of the luminance image. The following steps are taken directly from Juan Conejero's forum post.
Apply the initial nonlinear histogram transformations to RGB and L:
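The nonlinear stretch referred to here is HistogramTransformation's midtones transfer function (MTF): for a midtones balance m it maps m to 0.5 while pinning 0 and 1 in place, which is what lifts faint linear data into a visible range.

```python
import numpy as np

def mtf(m, x):
    """PixInsight's midtones transfer function, the nonlinear stretch
    applied by HistogramTransformation.  m is the midtones balance;
    x = m maps to 0.5, while 0 -> 0 and 1 -> 1."""
    x = np.asarray(x, dtype=float)
    return np.where((x <= 0) | (x >= 1), np.clip(x, 0, 1),
                    ((m - 1) * x) / ((2 * m - 1) * x - m))

# A low midtones balance strongly brightens faint linear data:
linear = np.array([0.0, 0.01, 0.05, 0.5, 1.0])
stretched = mtf(0.01, linear)
```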