Long time lurker, first time poster of a "Process My Stack" post. I've had a lot of trouble pulling any color out of my hour on the Carina Arm of the Milky Way with Siril 1.0.5 (the latest rev) from my unmodded Canon 600D/T3i. There are those who claim it can be done. Here's what I was able to pull with calibration in RawTherapee; if you have some Siril calibration magic, I'm all ears…

Did you, by any chance, use a duoband filter? That passes two narrow peaks, and the unmodded camera decimates the red one, taking out perhaps 90% of the Ha emission from the nebulae. A light averaged about 550 ADU on a 16,000 ADU scale (14 bits for both), so you're only using about 1/3 of the camera's dynamic range, and there are no saturated pixels when there should always be some. I did a quick process of the Siril stack in PixInsight, doing my best to emphasize color: the stars and the Milky Way have a ton of it, but the red nebulae just isn't there, and I don't think it's a Siril issue. Could be wrong. Edited by bobzeq25, 16 September 2022 - 11:25 PM.

My takeaway is that Siril and PixInsight require much longer subs for them to render color, or at least the light data has to be farther away from the floor than 30 seconds of f/3.5 data on 4.3µm pixels on a stock DSLR from literally the brightest Ha target in the night sky; StarTools and RawTherapee, on the other hand, do not. I assume this comes from the respective noise control implementations: the Siril and PI approach will eliminate the red blotchiness I have to remove from my backgrounds, but it obliterates the color from my existing data.

Please show your StarTools and RawTherapee versions again, but crop in to 100% of your developed image. I'm confident what you are going to see is a lot of noise and mottling.

Honestly, dehazing is likely the wrong tool for this job. Dehazing algorithms generally need to deal with varying levels of haze caused by variations in depth; and while the variations in distance between different stars and the Milky Way are gigantic, they are pretty much irrelevant because most of the scene is empty space. If you want to take Milky Way shots, IMO stacking is your biggest friend (a rough mean-stacking sketch follows at the end of this post). It will give you a relatively clean image to work with, which can then be white-balanced, pushed using curves and some form of local contrast enhancement and denoising, and of course saturation boosts. This is the result I got doing pretty much what was described above, using a fairly cheap APS-C camera (a6000) and a non-exotic lens (24/1.8).

It's something I wanted to do for quite some time. I take a lot of shots when paragliding or generally in the mountains, and haze is pretty much always an issue. I currently solve it using either Lab curves or local white balance for the color part, and curves plus some form of local contrast for the luminance part of the equation, trying to reconstruct depth using painted and parametric masks. What I want to try when I find some time is to basically use the local contrast (variance) as an estimation of depth/haze, filter it with some edge-preserving filter (a guided or bilateral blur or something), and then use that, together with an auto-detected (or user-provided) haze color, to subtract the haze. I lack the time and some familiarity with the tools to rapidly prototype the idea, so if someone else wants to give it a shot, be my guest.
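For anyone who does want to give that idea a shot, here is a minimal, untested Python sketch of it. It assumes a linear-ish float RGB image in [0, 1], treats low local variance as "more haze", smooths the haze map with an edge-preserving bilateral filter (a guided filter would also work), and subtracts a user-supplied or crudely auto-detected haze color. The function names and parameter values are illustrative guesses, not taken from any particular tool.

```python
# Rough sketch only: local contrast (variance) as a haze/depth proxy,
# edge-preserving smoothing of the haze map, then haze-color subtraction.
# Assumes `img` is a float32 RGB array in [0, 1]; parameter values are guesses.
import numpy as np
import cv2


def local_variance(gray, ksize=15):
    """Per-pixel variance of a grayscale image over a ksize x ksize window."""
    mean = cv2.blur(gray, (ksize, ksize))
    mean_sq = cv2.blur(gray * gray, (ksize, ksize))
    return np.clip(mean_sq - mean * mean, 0.0, None)


def dehaze_by_local_contrast(img, haze_color=None, strength=0.8):
    gray = cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)

    # 1. Treat low local contrast as "more haze" (0 = clear, 1 = fully hazy).
    var = local_variance(gray)
    haze = 1.0 - var / (var.max() + 1e-6)

    # 2. Edge-preserving blur of the haze map so the estimate does not
    #    bleed across depth discontinuities (a guided filter would do too).
    haze = cv2.bilateralFilter(haze.astype(np.float32), d=9,
                               sigmaColor=0.1, sigmaSpace=25)

    # 3. Haze color: user-provided, or the mean color of the haziest pixels.
    if haze_color is None:
        mask = haze > np.quantile(haze, 0.95)
        haze_color = img[mask].mean(axis=0)

    # 4. Subtract the haze contribution, weighted by the per-pixel estimate.
    out = img - strength * haze[..., None] * np.asarray(haze_color)[None, None, :]
    return np.clip(out, 0.0, 1.0)
```

Applied globally like this it is obviously cruder than the mask-driven Lab-curve workflow described above, but it shows the shape of the idea: estimate haze density from local contrast, keep the estimate edge-aware, then remove a haze-colored veil in proportion to it.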
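And here is the mean-stacking sketch mentioned earlier, also hedged: it assumes the frames have already been registered (alignment and outlier rejection are the hard parts that dedicated tools like Siril handle), and the folder name and file pattern are placeholders. Averaging N frames with uncorrelated noise reduces that noise by roughly a factor of sqrt(N), which is why stacking gives such a clean starting point.

```python
# Mean-stack already-aligned frames; random noise drops roughly as sqrt(N).
# Registration/alignment is assumed to have been done by another tool.
from pathlib import Path

import cv2
import numpy as np


def mean_stack(paths):
    """Average a list of already-aligned frames into one low-noise image."""
    acc, n = None, 0
    for p in paths:
        frame = cv2.imread(str(p), cv2.IMREAD_UNCHANGED).astype(np.float64)
        acc = frame if acc is None else acc + frame
        n += 1
    return acc / n


# Placeholder path and pattern; point this at wherever the aligned frames live.
frames = sorted(Path("aligned_frames").glob("*.tif"))
stacked = mean_stack(frames)
cv2.imwrite("stacked.tif", np.clip(stacked, 0, 65535).astype(np.uint16))
```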