Imagine a scene containing red and blue surfaces (patches), shot through a camera with color (Bayer) filters.
The Red/Blue (pixel value) ratio on the red patch will be much higher than the R/B ratio on the blue patch. This is how a color camera works.
There is no way to fix that with single per-channel gains: R/B(red) will always be higher than R/B(blue), regardless of the gains used.
So demosaicing (i.e. recovering the missing color values), then RGB -> grayscale conversion, looks like the only way to go; see the sketch below.
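A minimal sketch of that route using LibRaw's usual processing calls; the input file name and the Rec.709 luma weights below are illustrative assumptions, not anything the library prescribes:

#include <libraw/libraw.h>
#include <vector>
#include <cstdio>

// Decode + demosaic with LibRaw, then collapse interleaved RGB to grayscale.
int main()
{
  LibRaw proc;
  if (proc.open_file("input.nef") != LIBRAW_SUCCESS) return 1; // hypothetical file name
  if (proc.unpack() != LIBRAW_SUCCESS) return 1;
  if (proc.dcraw_process() != LIBRAW_SUCCESS) return 1;        // demosaic, WB, colour conversion

  int err = 0;
  libraw_processed_image_t *img = proc.dcraw_make_mem_image(&err);
  if (!img || img->colors != 3 || img->bits != 8) return 1;

  // Weighted sum of R, G, B per pixel (Rec.709 weights, chosen as an example).
  std::vector<unsigned char> gray((size_t)img->width * img->height);
  for (size_t i = 0; i < gray.size(); i++) {
    const unsigned char *p = img->data + 3 * i;
    gray[i] = (unsigned char)(0.2126 * p[0] + 0.7152 * p[1] + 0.0722 * p[2]);
  }

  printf("grayscale image: %d x %d\n", (int)img->width, (int)img->height);
  LibRaw::dcraw_clear_mem(img);
  return 0;
}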
I was just wondering if you'd had any success with this? I'm considering my options for creating multiple full-resolution, non-Bayer, non-interpolated, monochrome/grayscale cameras for a Raspberry Pi. I could try to remove the Bayer filters from all of them, but this is tricky and could easily cause damage. I'd much rather process the raw files from the cameras (v2 8MP) to create monochrome, but with the red, green and blue scaled correctly...
A Google search with site:libraw.org works well enough that we do not maintain our own search engine :)
LibRaw provides the LibRaw::open_bayer() call for such tasks.
Unfortunately, the only place where it is documented is the Changelog.txt file; it is missing from the docs (to be fixed).
Also, look into samples\openbayer_example.cpp and the sketch below.
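Since open_bayer() is only documented in the Changelog, the call below is a rough sketch modelled on samples\openbayer_example.cpp; the buffer contents, frame dimensions and the RGGB pattern constant are assumptions for illustration, and the exact parameter list should be verified against that sample for your LibRaw version:

#include <libraw/libraw.h>
#include <vector>

// Feed a bare 16-bit Bayer mosaic (no container format) to LibRaw.
int main()
{
  const unsigned short width = 3280, height = 2464;    // e.g. Pi camera v2 frame (assumed)
  std::vector<unsigned short> mosaic(width * height);  // fill with your sensor data

  LibRaw proc;
  int rc = proc.open_bayer((unsigned char *)mosaic.data(),
                           (unsigned)(mosaic.size() * sizeof(unsigned short)),
                           width, height,
                           0, 0, 0, 0,             // left/top/right/bottom margins
                           0,                      // procflags
                           LIBRAW_OPENBAYER_RGGB,  // assumed CFA layout
                           0, 0, 0);               // unused bits, other flags, black level
  if (rc != LIBRAW_SUCCESS) return 1;
  if (proc.unpack() != LIBRAW_SUCCESS) return 1;
  if (proc.dcraw_process() != LIBRAW_SUCCESS) return 1;
  return 0;
}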
There are a lot of different RAW formats (data formats, metadata formats, metadata values), so it's pretty hard to discuss them all at once.
In general, you may assume that unpack() provides linear data with the black level not subtracted (see the sketch below).
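A minimal sketch of what that means for the caller: after unpack() the mosaic is available in imgdata.rawdata.raw_image as linear values, and the black level from imgdata.color still has to be subtracted by your own code (per-channel offsets in cblack[] are ignored here to keep the sketch short; the file name is a placeholder):

#include <libraw/libraw.h>

// Subtract the black level from the linear data left by unpack().
int main()
{
  LibRaw proc;
  if (proc.open_file("input.nef") != LIBRAW_SUCCESS) return 1; // hypothetical file name
  if (proc.unpack() != LIBRAW_SUCCESS) return 1;

  unsigned short *raw = proc.imgdata.rawdata.raw_image;
  if (!raw) return 1;                                   // non-Bayer data lands in other buffers

  const unsigned black = proc.imgdata.color.black;      // per-channel values also exist in cblack[]
  const int rows  = proc.imgdata.sizes.raw_height;
  const int cols  = proc.imgdata.sizes.raw_width;
  const int pitch = proc.imgdata.sizes.raw_pitch / 2;   // raw_pitch is in bytes

  for (int r = 0; r < rows; r++)
    for (int c = 0; c < cols; c++) {
      unsigned short &v = raw[r * pitch + c];
      v = (unsigned short)(v > black ? v - black : 0);
    }
  return 0;
}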
Thanks, Alex!
If the camera records raw in linear space, there must be a way to know that during the "unpack()" stage, correct? Is it indicated somewhere in the metadata of the RAW file?
LibRaw applies (some) linearization data during the unpack() phase.
Thanks a lot!
Just wondering, do some cameras actually record raw data in linear space? If that is true, can “libraw” tell us whether the raw data is already linear?
White balance is, in most cases, applied to linear data, so linearization is done before white balance. The traditional approach to demosaicking is also to apply it to linear data, so linearization quite often precedes demosaicking. That makes linearization the first step of raw processing (the ordering is sketched below).
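To make the ordering concrete, here is a toy, self-contained illustration (nothing here is LibRaw API; the black/white levels and RGGB gains are made up, and the simple scaling stands in for a real linearization curve):

#include <cstdio>
#include <vector>

int main()
{
  const float black = 64.0f, white = 1023.0f;        // assumed sensor levels
  const float wb[4] = {2.0f, 1.0f, 1.0f, 1.5f};      // illustrative RGGB gains
  std::vector<float> mosaic = {500, 300, 280, 450};  // one RGGB quad of raw values

  // 1. Linearization first: everything downstream assumes linear values.
  for (float &v : mosaic) v = (v > black ? v - black : 0.0f) / (white - black);

  // 2. White balance on linear, still-mosaiced data.
  for (int i = 0; i < 4; i++) mosaic[i] *= wb[i];

  // 3. Demosaicking would run here, also on linear data; gamma/tone mapping comes last.
  for (float v : mosaic) printf("%.4f\n", v);
  return 0;
}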
Hi Alex,
Just wondering where "linearization" stands in the raw processing pipeline?
Thanks a lot,
Mio
As soon as we're able to decode it.
Any help is highly appreciated.
LibRaw allows you to set any white balance, including the one that comes from the camera metadata (use imgdata.params.user_mul[] for that; see the sketch below).
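A minimal sketch of setting an explicit white balance through user_mul[]; copying the camera's as-shot cam_mul[] values is just one example, any four multipliers can be supplied (the file name is a placeholder):

#include <libraw/libraw.h>
#include <cstdio>

int main()
{
  LibRaw proc;
  if (proc.open_file("input.nef") != LIBRAW_SUCCESS) return 1; // hypothetical file name

  // Use the as-shot (metadata) multipliers as the explicit white balance.
  for (int c = 0; c < 4; c++)
    proc.imgdata.params.user_mul[c] = proc.imgdata.color.cam_mul[c];

  if (proc.unpack() != LIBRAW_SUCCESS) return 1;
  if (proc.dcraw_process() != LIBRAW_SUCCESS) return 1;        // processing uses user_mul

  printf("WB multipliers: %.3f %.3f %.3f %.3f\n",
         proc.imgdata.params.user_mul[0], proc.imgdata.params.user_mul[1],
         proc.imgdata.params.user_mul[2], proc.imgdata.params.user_mul[3]);
  return 0;
}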
The question was a two-parter.
First: "So when you say daylight colour profile, it's a preset that comes with libraw vs what the camera thinks it should be?"
Second: if there is no way to correctly white balance to 6500K, then why do we even have these numbers to begin with?
There are other reasons for wanting a correct D65 white balance than matching a physical lamp. For example, if I have to work with raw files from several cameras: if they could all reliably be set to a 6500K white balance, they would at least all match each other regardless of the actual lamp temperature, saving oneself the work of having to grade each camera individually.
Also look at the rawtoaces project. It is specifically made to convert raw files to the ACES colour space, which is calibrated around a 6000K (?) white point. Unfortunately, not all cameras have had their sensors analysed for spectral sensitivity.
Your question is 'is there any reason the derived values should be more correct....'
My question is: 'more correct for WHAT?'
A real scene is lit by some real (daylight) light source, not an (imaginary/synthetic) black body at 6500K.
Both settings are 'not correct' for the real image/real scene.
Sorry, but that doesn't really answer the question.
Unless your scene is lit by an exact D65 (imaginary) light source, both values are 'not correct'.
So when you say daylight colour profile, it's a preset that comes with libraw vs what the camera thinks it should be? Is there any reason the derived values should be more correct than the makernote values?
The derived D65 multipliers are calculated from the daylight color profile, while the makernotes multipliers are recorded by the camera; both sets can be inspected as sketched below.
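Both sets are exposed in imgdata.color and can simply be printed side by side; a minimal sketch with a hypothetical file name:

#include <libraw/libraw.h>
#include <cstdio>

// cam_mul[] - as-shot white balance recorded by the camera (makernotes)
// pre_mul[] - daylight multipliers derived from the embedded colour profile
int main()
{
  LibRaw proc;
  if (proc.open_file("input.nef") != LIBRAW_SUCCESS) return 1; // hypothetical file name

  const libraw_colordata_t &C = proc.imgdata.color;
  printf("cam_mul (as shot):  %.4f %.4f %.4f %.4f\n",
         C.cam_mul[0], C.cam_mul[1], C.cam_mul[2], C.cam_mul[3]);
  printf("pre_mul (daylight): %.4f %.4f %.4f %.4f\n",
         C.pre_mul[0], C.pre_mul[1], C.pre_mul[2], C.pre_mul[3]);
  return 0;
}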
Thank you for confirming that.
Yes, the LibRaw (0.19) object is significantly larger than in (0.18). Looks like we need to move curve[0x1000] to dynamic allocation to save stack space.
It was a stack problem. The memory footprint must have changed from 0.18.x to 0.19.
So, please close the issue.
There is very similar report related to digiKam: https://github.com/LibRaw/LibRaw/issues/186
Quote from my last reply in that thread:
1st: the backtraces point to LibRaw::LibRaw (constructor), so the problem is not related to any specific RAW file.
2nd: I've tried XCode 6, XCode 8 and XCode 9 builds (using make -f Makefile.dist) and was unable to reproduce the problem with either dcraw_emu (single-threaded) or half_mt (multithreaded). Both samples work fine with the DSC_1796.NEF sample (link above). The sample was multiplied into many (identical) files: DSC_1796-[0-9].NEF
So, I still suspect that this is not a LibRaw problem, but a not-enough-stack problem. LibRaw objects are big (e.g. several 16-bit curves), so the default stack size may not be enough (it is better to allocate the LibRaw object dynamically to avoid that; a sketch follows below).
Is there any way to check whether enough stack space is available [in your app]?
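A minimal sketch of the dynamic-allocation advice above: keep the (large) LibRaw object on the heap instead of the stack, which matters especially in worker threads with small default stacks (the file name is a placeholder):

#include <libraw/libraw.h>
#include <memory>

int process_one(const char *path)
{
  // Heap allocation instead of a stack-resident "LibRaw proc;".
  std::unique_ptr<LibRaw> proc(new LibRaw());
  if (proc->open_file(path) != LIBRAW_SUCCESS) return -1;
  if (proc->unpack() != LIBRAW_SUCCESS) return -1;
  if (proc->dcraw_process() != LIBRAW_SUCCESS) return -1;
  return 0;                                   // proc is released automatically
}

int main() { return process_one("DSC_1796.NEF") == 0 ? 0 : 1; }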
OK, I got it. Thank you.
Makefile.mingw already defines -DLIBRAW_NODLL, which disables DllDef (and, of course, DLL builds). A static-link usage sketch follows the quoted header code below.
DllDef is defined in libraw_types.h:
#ifdef WIN32
#ifdef LIBRAW_NODLL
#define DllDef
#else
#ifdef LIBRAW_BUILDLIB
#define DllDef __declspec(dllexport)
#else
#define DllDef __declspec(dllimport)
#endif
#endif
#else
#define DllDef
#endif
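As a usage note, a client that links LibRaw statically on Windows can define LIBRAW_NODLL itself (before the include or via the compiler command line) so that DllDef expands to nothing rather than __declspec(dllimport); a minimal sketch:

// Define LIBRAW_NODLL when linking statically so the class declarations
// carry no DLL import/export decoration.
#define LIBRAW_NODLL
#include <libraw/libraw.h>

int main()
{
  LibRaw proc;                               // ordinary, statically linked object
  return proc.versionNumber() > 0 ? 0 : 1;
}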
Sorry, I think I misunderstood.
The problem is that, because of the error, I cannot compile the lib to test it. I thought you told me that by removing the flag it would be OK.
Sorry,
I cannot understand the question. What feature do you want to disable?