~25% exposure bump in some processed images?

I'm using raw images in an industrial measurement process, converting them to 16-bit linear images with LibRaw using the following parameters...

    libRaw.imgdata.params.use_camera_wb = 1;
    libRaw.imgdata.params.no_auto_bright = 1;
    libRaw.imgdata.params.gamm[0] = 1;          // linear gamma
    libRaw.imgdata.params.gamm[1] = 1;          // linear gamma toe slope
    libRaw.imgdata.params.output_bps = 16;      // 16 bit output
    libRaw.imgdata.params.highlight = highlights; // Highlight mode (0=clip, 1=unclip, 2=blend, 3+=rebuild).

...with highlights set to 1 or 3.
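
For context, the surrounding conversion is essentially the standard LibRaw open/unpack/process/write sequence. A minimal sketch (the function name, file paths, and error handling here are illustrative, not my exact production code):

    #include <libraw/libraw.h>
    #include <cstdio>

    // Sketch: open a raw file, apply the parameters shown above, and write a
    // 16-bit linear TIFF. Returns LIBRAW_SUCCESS (0) on success.
    int convertToLinearTiff(const char *rawPath, const char *tiffPath, int highlights)
    {
        LibRaw libRaw;

        libRaw.imgdata.params.use_camera_wb  = 1;
        libRaw.imgdata.params.no_auto_bright = 1;
        libRaw.imgdata.params.gamm[0]        = 1;           // linear gamma
        libRaw.imgdata.params.gamm[1]        = 1;           // linear gamma toe slope
        libRaw.imgdata.params.output_bps     = 16;          // 16 bit output
        libRaw.imgdata.params.highlight      = highlights;  // 0=clip, 1=unclip, 2=blend, 3+=rebuild
        libRaw.imgdata.params.output_tiff    = 1;           // write TIFF instead of PPM

        int ret;
        if ((ret = libRaw.open_file(rawPath))              != LIBRAW_SUCCESS ||
            (ret = libRaw.unpack())                        != LIBRAW_SUCCESS ||
            (ret = libRaw.dcraw_process())                 != LIBRAW_SUCCESS ||
            (ret = libRaw.dcraw_ppm_tiff_writer(tiffPath)) != LIBRAW_SUCCESS)
        {
            std::fprintf(stderr, "LibRaw error: %s\n", libraw_strerror(ret));
            libRaw.recycle();
            return ret;
        }

        libRaw.recycle();
        return LIBRAW_SUCCESS;
    }

(output_tiff = 1 is set so dcraw_ppm_tiff_writer() produces a TIFF rather than a PPM.)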

What I find is that, over most exposures, the code values in my exported TIFFs track those in the raw files, as expected. However, when I'm close to my optimal exposure (1/3 stop below clipping), the code values are about 25% higher than expected; my image is nearly as bright as a 1/3-stop-brighter image (see attached graph).

Am I doing something wrong, or is this a bug?

Details: libraw 0.21.3 on Mac and Windows. Changing user_qual (interpolation) has no effect; changing highlight has no effect (unless highlight=0; see below). The samples are Sony ARW files, but I've seen the same thing from Panasonic GH5 raw and Canon EOS RP raw files.

I captured a ramp of exposures at ISO 800, f/4, varying shutter speed in 1/3 stop steps from 1/100 to 1/20, using a Sony A7Siii. I processed the ARW files using dcraw_emu -4 -w -T -H1 [filename] to generate TIFFs, echoing what my code does.

Source files: https://www.dropbox.com/scl/fi/hzz3dntjl2f3i9f30o652/LibRaw-0.21.3-Test....

I opened the ARWs in RawDigger and measured the green code values in a central area (screenshot). I opened the TIFFs in Affinity Photo 2, cropped to the same central area, and measured the average green value to 0.5 precision (I sweep the mouse around the area; if G stays at 22 it's recorded as 22, but if it oscillates evenly between 22 and 23 it gets recorded as 22.5). I then scaled the TIFF values to match the raw values and plotted the results.

The raw values (and the corresponding images) increase in the expected, nearly linear manner as exposure increases. The TIFF values increase similarly, except at the 1/40 sec exposure, where the code values jump by about 24% and the 1/40 sec TIFF is nearly as bright as the 1/30 sec TIFF (a full 1/3 stop is a factor of 2^(1/3) ≈ 1.26, so a ~24% jump is almost an entire extra 1/3 stop).

So, what am I doing wrong?

[Side issue: when highlight == 0, code values are about 2.53x higher than when highlight != 0. I compensate in my code with:

    // if clipping highlights, compensate for the ~2.53x brighter rendering when doing so
    if (highlights == 0) libRaw.imgdata.params.bright = 0.395;
    else libRaw.imgdata.params.bright = 1.0;

But this is a separate issue and one I can easily work around.]

Attachment: Screenshot of RawDigger with test image (327.18 KB)

You got hit by the 'automated maximum adjustment' code, implemented in LibRaw's postprocessing many years ago to avoid the 'pink clouds' problem: the real data range is not present in the RAW metadata, while the real data maximum is lower than the format maximum.

To turn the feature off, use:

    imgdata.params.adjust_maximum_thr = 0;

(the corresponding dcraw_emu command-line switch is -c 0)

Here is your 0018 file processed via dcraw_emu -w -T and dcraw_emu -w -c 0 -T, respectively:
https://www.dropbox.com/scl/fo/8bgts0j5m5aa8007czqkf/AC8-URS3v42QVYOD_wk...

-- Alex Tutubalin @LibRaw LLC

That was it: I set adjust_maximum_thr = 0 and now my captures are consistent. Thanks very much!
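
For anyone hitting the same thing later: the change was a single line added alongside the parameters in my first post (sketch, same object naming as above), or -c 0 on the dcraw_emu command line:

    // disable the automated maximum adjustment described above
    libRaw.imgdata.params.adjust_maximum_thr = 0;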