Just my opinions.
How much compromise in data quality results from using WBPP? Everyone's mileage will vary, not least, I believe, with the level of post-processing skill. Personally, a small loss will hardly be noticeable in my final image; even so, I'll still try to extract as much as possible from the data I have. I'd say very little is lost simply as a result of using the script in question, though that has only been true recently. The following is just how I currently approach calibration.
I use the other tools in PixInsight with caution, to avoid culling data too early in the process. First, a fairly quick pass with Blink to identify individual subs, or groups of subs, affected by cloud or (very occasionally) misalignment; I can then eliminate those early on. After calibration but before alignment I usually run SubframeSelector, but only to identify a good sub or two for aligning all the data. In addition, any subs that stand out as excessively bad from the peaks on the graphs I'll inspect before excluding them. Using a suitable algorithm to make this selection on a statistical basis is almost certainly better, but I prefer the more personal approach before obliterating anything.
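For anyone wondering what that more statistical route might look like, here's a rough sketch in plain Python with made-up FWHM numbers (this is not PixInsight's actual weighting or rejection logic, just the general idea of sigma-clipping a quality metric across subs):

```python
import statistics

# Hypothetical per-sub FWHM measurements (arcsec), as you might read
# from SubframeSelector's measurements table.
fwhm = [2.1, 2.3, 2.2, 2.0, 2.4, 2.2, 4.8, 2.3, 2.1, 3.9]

mean = statistics.fmean(fwhm)
sd = statistics.stdev(fwhm)
K = 2.0  # rejection threshold in standard deviations (tune to taste)

# Keep subs within K sigma of the mean; flag the rest for manual review.
keep = [i for i, f in enumerate(fwhm) if abs(f - mean) <= K * sd]
drop = [i for i, f in enumerate(fwhm) if abs(f - mean) > K * sd]

print("keep subs:", keep)
print("drop subs:", drop)  # the 4.8" sub stands out here
```

Real tools typically iterate the clipping (recomputing the mean and sigma after each pass) and combine several metrics (FWHM, eccentricity, SNR) into a weight rather than a hard cut, which fits the "look before excluding" approach above.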
For many months now I have been using the NormalizeScaleGradient (NSG) script after alignment to identify the poorer subs more objectively and exclude them from the integration. Any suggestions for better ways of doing any of the above would be very welcome.
So what are the penalties for using WBPP to do the calibration, and does convenience result in compromise? I'm currently aware of one situation where narrowband data ought to be examined before calibration to see whether any data might be lost, but that has only come to light recently.
Before then, WBPP had no way to deal with such a loss, which may explain a recent criticism that it discards much of your data. Since this summer the script has been changed to let us mitigate the particular situation that could give rise to the loss. Admittedly, before that change, provided you were aware of the issue and knew how to deal with it, the recommendation would have been not to use the script.
As just mentioned, I'm now aware of just one issue, arising solely (or mainly) with narrowband data. As far as I understand it, where the signal is weak, some pixels may end up being set to zero after dark-frame subtraction. As for broadband and OSC data, I'm not currently aware of anything WBPP may be doing to lose data, so I'd be most grateful if someone could reply and explain how such a loss might arise, so that I can look for solutions. Until then I'll stick with WBPP.
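To make the clipping issue concrete, here's a minimal sketch with synthetic NumPy data (not WBPP's actual implementation): subtracting a master dark from a sub with very weak signal drives many pixels below zero, and those get clamped to zero when written to an unsigned format, biasing the result. Adding a pedestal before the clamp, which is the kind of mitigation WBPP now supports, preserves the noise floor and the faint signal. The pedestal value here is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic example: a master dark plus a very weak narrowband signal (~2 ADU).
dark = rng.normal(100.0, 5.0, size=100_000)
light = dark + rng.normal(2.0, 5.0, size=100_000)

# Naive calibration: subtract the dark; negative pixels are clamped to zero
# when the result is stored in an unsigned format.
naive = np.clip(light - dark, 0.0, None)

# With a pedestal added before the clamp, pixels near zero survive and the
# faint signal is preserved (the pedestal is tracked and removed later).
PEDESTAL = 100.0  # hypothetical value, for illustration only
with_pedestal = np.clip(light - dark + PEDESTAL, 0.0, None)

print(f"fraction clipped to zero (naive): {np.mean(naive == 0.0):.2f}")
print(f"recovered mean signal (naive):    {naive.mean():.2f}")
print(f"recovered mean signal (pedestal): {with_pedestal.mean() - PEDESTAL:.2f}")
```

With these numbers, roughly a third of the pixels are zeroed in the naive case and the mean is biased high, whereas the pedestal version recovers the true ~2 ADU signal.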
Hope this is understandable.
I'd welcome some feedback if there are still scenarios that would make use of the script undesirable.
Cheers,
Ray