LRGB Editing Workflow for PixInsight

One of the hardest parts of learning PixInsight is knowing which processes to use and when. While there can never be a one-size-fits-all guide, this one outlines the workflow I typically use when processing my LRGB images, and you may find it a helpful reference as you edit your own images. 

This guide assumes you:

  1. Are familiar with basic PixInsight navigation (e.g., how to apply AutoSTF, the difference between linear and nonlinear images). This guide doesn't walk through the mechanics of each tool in detail; that is best learned via video tutorials, PixInsight's documentation, or hands-on experimentation.
  2. Have already stacked your master lights using something like WBPP.
  3. Have the following repositories installed (or are comfortable using alternatives):
(Put these in Resources > Updates > Manage Repositories.)

| Plugin | Repository | Alternative |
| --- | --- | --- |
| BlurXTerminator | https://www.rc-astro.com/BlurXTerminator/PixInsight/ | I'm not impressed with the alternatives at the time of writing. You can skip this if you don't have it, or ask a friend to run it for you. |
| StarXTerminator | https://www.rc-astro.com/StarXTerminator/PixInsight/ | StarNet |
| GeneralizedHyperbolicStretch | https://www.ghsastro.co.uk/updates/ | Whatever other stretching method you're comfortable with |
| DeepSNR | https://pixinsight.deepsnrastro.com/ | NoiseXTerminator |
| RC Astro's TensorFlow GPU Accelerator (Windows only; Linux instructions exist) | https://www.rc-astro.com/TensorFlow/PixInsight/GPU | You can also just go without this |

 

Linear Workflow

Step 0: Open your stacked master L, R, G, and B

Close the integration weight images and rejection maps that open alongside your master lights. AutoSTF each channel. 

Step 1: Rename your images to L, R, G, and B. 

Step 2: Use LRGBCombination 

Combine just your R, G, and B channels using the LRGBCombination process. Leave L unchecked. Apply an unlinked stretch to see the result. 

RGB Image Steps:

The following steps are to be applied to the RGB image you just created. Steps for the Luminance image will come later. 

Step 3: Gradient removal with DynamicBackgroundExtraction

While you can use GradientCorrection, AutomaticBackgroundExtraction, or GraXpert, DynamicBackgroundExtraction is the superior background extraction method. It takes practice, and I recommend you look up a detailed guide or ask an experienced friend for help, but here's a high-level overview of how DBE works:

In DBE, you place sample points on regions of your image that you want to make a uniform brightness; typically you place these on the "background" parts of your image where there is just empty space. DBE averages the pixels within each sample point, ignoring pixels that are too dark or bright, as defined by the Tolerance parameter. It then takes those averaged samples and calculates a smooth, interpolated background model between them. Then, it subtracts (or divides) that background model from your original image, leaving you with a corrected image where all of the sample regions you selected are now the same brightness/color. 
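To make the interpolation idea concrete, here's a simplified numpy sketch of a DBE-style subtractive correction. It fits a low-order polynomial through the sample values; PixInsight actually uses surface splines, and this toy version skips the Tolerance-based pixel rejection, so treat it as an illustration rather than the real algorithm.

```python
import numpy as np

def fit_background(img, samples, degree=2):
    """Fit a smooth 2D polynomial background model through sample points.

    img     : 2D array (one channel)
    samples : list of (row, col) background sample positions
    degree  : polynomial degree (1 handles planar gradients, 2 curved ones)

    PixInsight's DBE uses surface splines; a low-order polynomial is a
    simplified stand-in that captures the same idea.
    """
    ys, xs = np.array(samples).T
    vals = np.array([img[y, x] for y, x in samples])

    # Build polynomial terms x^i * y^j with i + j <= degree.
    terms = [(i, j) for i in range(degree + 1)
                    for j in range(degree + 1 - i)]
    A = np.column_stack([xs**i * ys**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, vals, rcond=None)

    # Evaluate the fitted model over the full image grid.
    yy, xx = np.mgrid[:img.shape[0], :img.shape[1]]
    return sum(c * xx**i * yy**j for c, (i, j) in zip(coeffs, terms))

# A synthetic frame: flat sky at 0.1 plus a left-to-right additive gradient.
yy, xx = np.mgrid[:100, :100]
img = 0.1 + 0.002 * xx

samples = [(10, 10), (10, 90), (90, 10), (90, 90), (50, 50)]
bg = fit_background(img, samples, degree=1)

# Subtractive correction, re-adding the model's median to keep brightness.
corrected = img - bg + np.median(bg)
print(np.allclose(corrected, np.median(bg)))  # True: gradient fully removed
```

Note that the correction re-adds the model's median rather than subtracting `bg` outright, which keeps the overall background level from dropping to zero.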

Some notes:

  1. Set "Correction" to either "Subtract" or "Divide" based on what type of background you are correcting. Divide is for removing multiplicative gradients (e.g., vignetting from your lens or nonuniform flats). Subtract is for additive gradients (e.g., light pollution). If you're unsure which to pick, it doesn't matter too much; just roll with one and see if it looks good. 
  2. Enable "Normalize" and "Replace target image"
  3. Place your sample points on regions with no stars. Turn your "Tolerance" setting up such that the background is selected (shown as white in the preview window) and any stars are rejected (black). 
    1. For simple gradients, a handful of sample points around the edges/corners of your image should suffice. Just make sure you're not placing them on the structure of the nebula/galaxy. 
    2. If you're a pro, you can remove the stars prior to this step, more easily generate a background model using the starless image, undo the background extraction and star removal, then re-subtract/divide the star-containing image by the background model you generated from the starless image. The details are a bit advanced for this guide, but for those interested, the PixelMath is `$T - bg + med(bg)` or `$T * med(bg) / bg`, depending on whether you want to subtract or divide your background model. 
  4. Don't tell anyone I told you this, but sometimes AutomaticBackgroundExtraction does an OK job. Just turn the function degree down to 1 for linear gradients or 2 for anything more complex. 
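For the divide case, here's a small numpy demonstration of the `$T * med(bg) / bg` identity. The `falloff` map is a hypothetical stand-in for vignetting, and `bg` plays the role of a background model generated from the starless image:

```python
import numpy as np

# Multiplicative gradient (e.g., vignetting): true sky times a falloff map.
rng = np.random.default_rng(0)
truth = 0.2 + 0.01 * rng.random((64, 64))     # hypothetical flat sky signal
falloff = np.linspace(1.0, 0.6, 64)[None, :]  # brightness drops to the right
img = truth * falloff

# Pretend this model came from DBE run on the starless version of img.
bg = falloff * np.median(truth)

# Divide form of the correction: $T * med(bg) / bg
corrected = img * np.median(bg) / bg

# The falloff cancels out; only the true signal's variation remains,
# rescaled by a constant so overall brightness is preserved.
print(np.allclose(corrected / truth, np.median(falloff)))  # True
```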

Step 4: Background Neutralization

To fix color cast, create a preview of a starless portion of the background whose color you'd like to turn a neutral grey. You can create a preview with the New Preview Mode toolbar button (Alt+N), then dragging over the region.

Then apply BackgroundNeutralization using that preview window as your reference. After this is complete, switch to a linked AutoSTF.
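Conceptually, BackgroundNeutralization adjusts each channel so the reference region reads neutral. Here's a minimal additive numpy sketch of that idea; the `neutralize_background` helper and `ref_box` are mine (standing in for the preview), and the real process offers scaling modes beyond this simple offset:

```python
import numpy as np

def neutralize_background(rgb, ref_box):
    """Shift each channel so the reference region's median is the same in
    R, G, and B, i.e., a neutral grey. A simplified additive version of
    what BackgroundNeutralization accomplishes.

    rgb     : HxWx3 array in [0, 1]
    ref_box : (y0, y1, x0, x1) bounds of the background preview
    """
    y0, y1, x0, x1 = ref_box
    meds = np.median(rgb[y0:y1, x0:x1], axis=(0, 1))  # per-channel background level
    target = meds.mean()                              # common neutral level
    return np.clip(rgb + (target - meds), 0, 1)

# A background with a red cast: R sits higher than G and B.
rgb = np.zeros((50, 50, 3))
rgb[..., 0], rgb[..., 1], rgb[..., 2] = 0.15, 0.10, 0.09

out = neutralize_background(rgb, (0, 20, 0, 20))
print(np.median(out, axis=(0, 1)))  # all three channel medians now equal
```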

Step 5: Color Calibration

Use either SpectrophotometricColorCalibration or ColorCalibration. I personally advocate for CC, but SPCC is favored by some. For CC, use the same preview as the previous step for your background reference. 

Step 6: Deconvolution

Run BlurXTerminator. 

Step 7: Star Removal

Run StarXTerminator (or StarNet). Keep the extracted stars. Leave "unscreen stars" unchecked, since your image is still linear.

Luminance Steps:

Step 8: Gradient removal, just like with the RGB image

Step 9: Deconvolution

For luminance, I run BlurXTerminator at a higher strength than what I use on my RGB image. 

Step 10: Star Removal

Run StarXTerminator (or StarNet). Discard the extracted stars. 

Step 11: LinearFit your Luminance to your RGB

For the upcoming LRGB combination step, you need your Luminance image to be roughly the same brightness as your RGB image. To do this, we'll extract the luminance from your RGB image, and then match the real Luminance image to this extracted Luminance image using LinearFit. 

Extract the luminance from your RGB image using the Extract CIE L* component toolbar button (or the ChannelExtraction process).

Then use this extracted luminance as the reference image in LinearFit. Turn your "Reject low" up slightly so that it ignores the influence of any uncropped, black stacking artifacts in either of your images. It needs to be above zero, but still below the darkest non-zero pixel in your image. 
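The fit itself is just a least-squares line through corresponding pixel values. Here's a numpy sketch of the idea, including the reject-low masking; the `linear_fit` helper is mine, not PixInsight's:

```python
import numpy as np

def linear_fit(target, reference, reject_low=1e-4):
    """Match target's brightness scale to reference, in the spirit of
    PixInsight's LinearFit. Pixels at or below reject_low in either image
    (e.g., black stacking artifacts) are ignored when solving the fit.
    """
    mask = (target > reject_low) & (reference > reject_low)
    a, b = np.polyfit(target[mask], reference[mask], 1)  # least-squares line
    return a * target + b

rng = np.random.default_rng(1)
ref = rng.uniform(0.05, 0.3, (80, 80))  # luminance extracted from RGB
lum = 2.0 * ref + 0.01                  # real L, brighter by a linear factor
lum[:, :4] = 0.0                        # uncropped black edge artifact

fitted = linear_fit(lum, ref)
print(np.allclose(fitted[:, 4:], ref[:, 4:]))  # True: brightness now matches
```

The masking is why "Reject low" matters: without it, the zero-valued edge pixels would drag the fitted line away from the true relationship.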

 

Nonlinear Workflow

Step 12: Stretching

Now that your Luminance and RGB images are matched in brightness, you can stretch them. I use GeneralizedHyperbolicStretch, but you can use whatever method you're familiar with. The important thing here is that you should apply the same stretch to both your RGB and Luminance images. They need to stay matched in brightness. You may feel the itch to stretch one (usually Luminance) more than the other's data can handle, but resist this urge. 

You don't need a fully dialed-in stretch just yet. A rough approximation of your final stretch will do. 
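Whatever stretch you pick, the key is applying identical parameters to both images. As a simple stand-in for GHS, here's PixInsight's midtones transfer function applied with the same midtones balance to both Luminance and RGB:

```python
import numpy as np

def mtf(m, x):
    """PixInsight's midtones transfer function: maps x = m to 0.5,
    with 0 -> 0 and 1 -> 1. Any stretch works here; the point is to
    apply the *same* one to both images so they stay matched."""
    return (m - 1) * x / ((2 * m - 1) * x - m)

rgb = np.full((4, 4, 3), 0.02)  # dim linear RGB
lum = np.full((4, 4), 0.02)     # dim linear Luminance, already LinearFit to RGB

m = 0.02  # midtones balance: pixels at 0.02 land at 0.5
rgb_s, lum_s = mtf(m, rgb), mtf(m, lum)
print(lum_s[0, 0], rgb_s[0, 0, 0])  # both ~0.5: still matched after stretching
```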

Step 13: Denoising

Use DeepSNR (or NoiseXTerminator) on the RGB and Luminance images. You can denoise RGB pretty aggressively without affecting much of the final detail; that detail will come back once you add the Luminance back in.

Step 14: LRGB Combination

Next, I add my Luminance to my RGB image. Uncheck all the channels except "L", type in the name of your Luminance image, and then apply the process to your RGB image.
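For intuition, here's a crude numpy sketch of what lending L to RGB accomplishes. PixInsight performs the combination in CIE L*a*b* space; this luminance-ratio version is a simplification of mine for illustration, using Rec. 709 luminance weights:

```python
import numpy as np

def apply_luminance(rgb, L, eps=1e-6):
    """Crude LRGB combination: rescale each pixel's RGB so its luminance
    matches the supplied L channel. PixInsight does this in CIE L*a*b*;
    this ratio-based version is a simplified stand-in that preserves hue.
    """
    # Rec. 709 luminance weights
    Y = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    scale = L / np.maximum(Y, eps)
    return np.clip(rgb * scale[..., None], 0, 1)

rgb = np.zeros((2, 2, 3))
rgb[...] = [0.2, 0.1, 0.1]  # a reddish, heavily denoised RGB
L = np.full((2, 2), 0.3)    # sharper, deeper Luminance

out = apply_luminance(rgb, L)
Y_out = 0.2126 * out[..., 0] + 0.7152 * out[..., 1] + 0.0722 * out[..., 2]
print(np.allclose(Y_out, L))  # True: brightness from L, color from RGB
```

This is also why aggressive RGB denoising is safe in the previous step: the fine detail in the result comes from the L channel, not from the RGB.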

Some Additional Tweaks

These next steps can be done in any order; do whatever looks good to you. Feel free to repeat steps, too. But here's the rough order of operations that I usually follow:

Step 15: Curves

CurvesTransformation is the most powerful tool in PixInsight. I use it for everything:

  • Want to increase the contrast? Curves.
  • Want to adjust your stretch beyond your initial stretch? Curves.
  • Want to tweak the hue? Curves.
  • Want to eliminate a background color cast? Curves.
  • Want to adjust your background brightness? Curves.

Step 16: Saturation

Curves is excellent for adjusting saturation globally, but if you want to target a specific hue, the ColorSaturation script is the way to go. 

Step 17: LHE

LocalHistogramEqualization is very easy to overdo. If you use it at all, you should have your "Amount" slider turned way down. 

Add the stars back

Step 18: Stretch the Stars

Use your preferred stretching method to stretch your stars. 

Step 19: Saturate Stars

The stars usually need a hint of saturation. I use the CurvesTransformation process. 

Step 20: PixelMath to Add Stars Back

Use the following PixelMath expression to add the stars back to your background:

`~(~stars*~starless)`
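In PixelMath, `~x` means `1 - x`, so this expression is a screen blend. A quick numpy check of why it beats simple addition:

```python
import numpy as np

# ~(~stars*~starless) expands to 1 - (1 - stars) * (1 - starless).
# Unlike simple addition, it can never push a pixel above 1, so bright
# star cores landing on nebulosity don't clip.
stars = np.array([0.0, 0.3, 0.9])
starless = np.array([0.2, 0.2, 0.8])

screen = 1 - (1 - stars) * (1 - starless)
added = stars + starless

print(screen)  # roughly [0.2, 0.44, 0.98]: always stays in range
print(added)   # roughly [0.2, 0.5, 1.7]: the last pixel would clip
```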

Step 21: Crop & Rotate

Technically, you could do this anytime, but I like saving it for last. Use DynamicCrop and FastRotate to get your desired framing. 
