Thank you for your request!!
We do not have Sinar 4-shot and 16-shot samples, so we cannot test this support (and can only hope the dcraw-derived code works right).
Could you please provide several 4-shot and 16-shot samples for testing? (Use any file sharing service and send the link to info@libraw.org, or post the link in the comments here.)
The file you've shared is only 772415 bytes in size (unless I downloaded it incorrectly; please check on your side too).
The original RAW_CANON_40D_SRAW_V103.CR2 from Rawsamples.ch (same preview, so I'm sure it is the same file) is 6800865 bytes.
Looks like your sample is truncated.
Sorry for the delayed publishing of this topic; summer is vacation time even for the libraw.org moderator :)
Could you please provide an image sample to test it with? open_bayer should work with 8-bit images, but it looks like it needs additional testing.
Version 0.19 is expected to be released this year or in Q1 2019 (we will definitely publish 1-2 snapshots before that). Snapshots are used in our applications (RawDigger, FastRawViewer) and are stable for production use.
Indeed: https://www.libraw.org/news/libraw-0-18-released
I agree, aligned scanlines are good.
We're open to patches/contributions.
To get the same Temp/Tint as in ACR you need to do the following (see the sketch after this list):
1. CameraNeutral[3] = {G1/R, 1, G1/B}; // R, G1, B are the white balance multipliers
2. use NeutralToXY from dng_sdk_1_4\dng_sdk\source\dng_color_spec.cpp
3. use Set_xy_coord from dng_sdk_1_4\dng_sdk\source\dng_temperature.cpp to get Temp/Tint
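A minimal C++ sketch of those three steps against the DNG SDK 1.4 headers; the helper name, the pre-built dng_color_spec, and the multiplier variables are assumptions for illustration, not from the original post:

```
// Sketch only: 'spec' must already be constructed from the camera profile
// (dng_color_spec's constructor takes a dng_negative and a dng_camera_profile);
// rMul/g1Mul/bMul are the camera white balance multipliers.
#include "dng_color_spec.h"
#include "dng_matrix.h"
#include "dng_temperature.h"
#include "dng_xy_coord.h"

void WBMultipliersToTempTint(dng_color_spec &spec,
                             double rMul, double g1Mul, double bMul,
                             double &temperature, double &tint)
{
    // Step 1: camera-neutral vector {G1/R, 1, G1/B}
    dng_vector_3 cameraNeutral(g1Mul / rMul, 1.0, g1Mul / bMul);

    // Step 2: neutral -> xy chromaticity (dng_color_spec.cpp)
    dng_xy_coord xy = spec.NeutralToXY(cameraNeutral);

    // Step 3: xy -> correlated color temperature / tint (dng_temperature.cpp)
    dng_temperature tempTint;
    tempTint.Set_xy_coord(xy);

    temperature = tempTint.Temperature();
    tint        = tempTint.Tint();
}
```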
I rewrote the code and got the same result as in ACR. But there is a problem: I have to get the CameraCalibration matrix for the RAW file. The other matrices I get from Adobe DCP files (from DNG Converter).
You may disable automatic brightening and use a manual exposure shift (the exp_correc, exp_shift, exp_preser parameters). The exact amount of shift will differ between cameras and ISO values (see, for example, this article: https://photographylife.com/adobes-silent-exposure-compensation ).
'Not exposed normally' means that the gray point value is below 18% by some amount.
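A minimal sketch of setting those exposure parameters before dcraw_process(); the helper name and the +1 EV shift are illustrative assumptions, not recommended values:

```
// Configure a LibRaw instance before calling dcraw_process().
#include "libraw/libraw.h"

void disableAutoBrightWithManualShift(LibRaw &raw)
{
    raw.imgdata.params.no_auto_bright = 1; // turn off automatic brightening
    raw.imgdata.params.exp_correc = 1;     // enable manual exposure correction
    raw.imgdata.params.exp_shift = 2.0;    // linear factor; 2.0 is roughly +1 EV
    raw.imgdata.params.exp_preser = 0.8;   // highlight preservation amount (0..1)
}
```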
Using no_auto_bright=0 made the resulting image brighter than the CR2 looks.
The histogram of the image changes. It's generally the same shape, but shifted. What parameters of libraw can I modify to try to keep the histogram in the same place? Would modifying the tone curves that libraw uses achieve this?
What do you mean by "not exposed normally"? I used a 1/10 second exposure time, f-number 5.6, ISO 160.
I downloaded both packages again today (0.18.2 and rawspeed-master) and recompiled them from scratch in two separate folders, with and without RawSpeed. The TIFF generated by dcraw_emu.exe with RawSpeed shows a blue sunflower, while the one without RawSpeed shows a proper yellow sunflower.
I did not change anything in the configuration except whatever was needed to get it to compile.
Here is a comparison image, from left to right original, with rawspeed and without.
https://www.dropbox.com/s/wlfwna06nf67ktn/rawspeed.JPG?dl=0
I could zip up my working folder and upload it if that would help.
So how should this be fixed? I presume it'll involve detecting whether this is a SuperCCD camera and setting a different white level?
If you use no_auto_bright=1, then any brightening is disabled, resulting in a dark image if your shot is not exposed normally.
With auto_bright, LibRaw will push 1% of pixels (with default options) to saturation.
Today's cameras usually place the gray point at 12%, 10%, or even 7-8% of maximum (so up to 4 stops from saturation). Raw converters (DPP, ACR, etc.) usually use custom tone curves to compress the highlights and bring the gray point back to 18% (so a 0.7-1.5 EV gray point move is typical).
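For reference, those stop figures follow directly from the ratios: a gray point at 12% of maximum is log2(0.18/0.12) ≈ 0.6 EV below the 18% target, at 7% it is log2(0.18/0.07) ≈ 1.4 EV below, and a 7% gray point sits log2(1.00/0.07) ≈ 3.8 stops below saturation, which is where the "up to 4 stops" figure comes from.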
I do not see any problems with RawSpeed and the RAW_SAMSUNG_NX300.SRW sample.
What RawSpeed version do you use?
Thank you for the problem report.
This code line should work for old Fuji cameras (SuperCCD), but not for the current line (X-Trans and current Bayer models).
Looks like ./configure has missed the -lc++ library. Try adding it manually to the linker options.
I've also tried converting the .CR2 to .TIF using Digital Photo Professional and Irfan View, and the files these create don't lose any brightness.
I originally wrote Rawshack in 2011 but have recently been updating it with new features. It's GPL v3 and runs on Windows, OSX, and Linux. Here's the home page:
http://testcams.com/rawshack/
Thanks for the additional information and openness Alex, much appreciated.
Yes, FastRawViewer uses LibRaw (+ RawSpeed + Adobe DNG SDK) to decode raw data and its own (post)processing code (written in SSE3/AVX2 assembly for speed; that's why FRV is Fast).
RawDigger uses LibRaw (+ the libraries listed above) for decoding and dcraw_process() (with bilinear 'demosaic', fast but far from perfect), because the RGB rendering is only for user reference (we plan to replace the demosaic with our own code in future releases to provide better rendering, because many users use RawDigger as a raw processor :).
I'm not familiar with Apple Core Image. It is very likely it will accept the (cropped?) imgdata.rawdata.raw_image[] array as input to CIRawFilter (this is a flat array of unprocessed raw values decoded from the camera).
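For reference, a minimal sketch of getting at that flat array after unpack(); the file name is illustrative, field names as in libraw_types.h:

```
#include <cstdio>
#include "libraw/libraw.h"

int main()
{
    LibRaw raw;
    if (raw.open_file("sample.nef") != LIBRAW_SUCCESS) return 1; // illustrative file name
    if (raw.unpack() != LIBRAW_SUCCESS) return 1;

    // Flat array of unprocessed 16-bit sensor values for Bayer-sensor files,
    // raw_width x raw_height in size, including the masked border pixels.
    unsigned short *bayer = raw.imgdata.rawdata.raw_image;
    int full_w = raw.imgdata.sizes.raw_width;
    int full_h = raw.imgdata.sizes.raw_height;
    // Visible (active) area offsets, if cropping is needed before handing
    // the data to another pipeline:
    int top   = raw.imgdata.sizes.top_margin;
    int left  = raw.imgdata.sizes.left_margin;
    int vis_w = raw.imgdata.sizes.width;
    int vis_h = raw.imgdata.sizes.height;

    std::printf("raw %dx%d, visible %dx%d at +%d+%d, first value %u\n",
                full_w, full_h, vis_w, vis_h, left, top, (unsigned)bayer[0]);
    return 0;
}
```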
Thanks Alex. Given your reply, am I correct in assuming that applications like FastRawViewer are not using dcraw for processing and are instead using their own algorithms?
On a related note, in Apple's Core Image framework for macOS, they have a filter called CIRawFilter that can take RAW data as input. The header file states that the following formats are supported:
If I want to try and use Core Image for post-processing, is there a way to get the unprocessed image data from LibRaw in any of the formats listed above?
dcraw_process() is derived (without any significant changes) from the dcraw.c code.
It is slow, and its image processing options are limited.
You may play with dcraw_emu sample and different command line options to get output that suits your needs. Each dcraw_emu command-line parameter translates into LibRaw.imgdata.params value.
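As a hedged illustration of that mapping (the specific option values and the file name below are arbitrary examples), a command line like dcraw_emu -w -q 3 -o 1 -T file.cr2 corresponds roughly to:

```
#include "libraw/libraw.h"

int main()
{
    LibRaw raw;
    raw.imgdata.params.use_camera_wb = 1; // -w : use camera white balance
    raw.imgdata.params.user_qual = 3;     // -q 3 : AHD demosaic
    raw.imgdata.params.output_color = 1;  // -o 1 : sRGB output color space
    raw.imgdata.params.output_tiff = 1;   // -T : write TIFF instead of PPM

    if (raw.open_file("file.cr2") != LIBRAW_SUCCESS) return 1;
    raw.unpack();
    raw.dcraw_process();
    raw.dcraw_ppm_tiff_writer("file.tiff");
    return 0;
}
```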
Oops, I didn't even think of that but, you're right. Allocating a new LibRaw object dynamically solves the problem. Thanks for the quick response and for your work on LibRaw.
What is the stack size in (GCD) background threads?
LibRaw objects are large; you may need to allocate them dynamically to save stack space.
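For illustration, a heap-allocated instance (sketch only; the helper name is hypothetical):

```
#include <memory>
#include "libraw/libraw.h"

// A worker suitable for a GCD background queue, where the default stack can be
// much smaller than the main thread's: the LibRaw object (which embeds large
// fixed-size tables) lives on the heap instead of the stack.
void decodeOnBackgroundQueue(const char *path)
{
    auto raw = std::make_unique<LibRaw>();      // heap allocation
    if (raw->open_file(path) != LIBRAW_SUCCESS) return;
    raw->unpack();
    raw->dcraw_process();
    // ... hand the result off to the caller ...
}
```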
The imgdata.color.pre_mul[] coefficients, but only after dcraw_process().
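A sketch of reading those coefficients once dcraw_process() has run with use_auto_wb=1 (or a greybox) set; the helper name is hypothetical:

```
#include <cstdio>
#include "libraw/libraw.h"

// Call after dcraw_process(); the multipliers actually applied by
// scale_colors() end up in imgdata.color.pre_mul[].
void printAppliedWB(LibRaw &raw)
{
    const float *wb = raw.imgdata.color.pre_mul; // four multipliers, typically R G B G2
    std::printf("WB multipliers: %g %g %g %g\n", wb[0], wb[1], wb[2], wb[3]);
}
```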
Hi,
If I set "use_auto_wb" to 1, how can I retrieve the final set of white balance coefficients that is calculated by "libraw"? also, if I use "greybox" method to calculate the white balance coefficients, is there a variable through which I can get the final values?
Thanks a lot,
Mio
WB selection logic is in scale_colors() (this call will print the multipliers if compiled with DCRAW_VERBOSE defined).
You may copy-paste this logic into your own code, or, if you do not use all the possible ways to set WB coefficients, you may simplify it.