Recent comments

Reply to: About LibRaw   15 years 6 months ago

Is there an easy way to use LibRaw to move the whole processing pipeline into Matlab?

Thanks.

Reply to: Out of order call to libraw function   15 years 6 months ago

Thank you, much appreciated.

Reply to: Out of order call to libraw function   15 years 6 months ago

You should call open() on each iteration (P1.raw_count is only known after open()):

for (x = 0; ; x++) {
    RawProcessor.open(filename);
    if (x >= P1.raw_count)
        break;
    RawProcessor.unpack();
    process();
}

Reply to: Out of order call to libraw function   15 years 6 months ago

I should mention that the loop for getting multiple images
looks something like:

for (x = 0; x < P1.raw_count; x++) {
    OUT.shot_select = x;
    RawProcessor.unpack();
    RawProcessor.dcraw_process();
}

Reply to: libopenraw   15 years 7 months ago

libopenraw is a completely different project.

Reply to: Ideas... Any idea box somewhere ?   15 years 7 months ago

The *decoding* process is very simple and limited by file (disk) bandwidth.

Postprocessing (demosaicing, color conversion, noise suppression, etc.) is another story. A GPU may help a lot. On the other hand, only very high-quality postprocessing code should be accelerated; there is no need for very fast but very low-quality postprocessing. We think the current LibRaw postprocessing implementation is not good enough for that...

Reply to: Usage Examples   15 years 7 months ago

Thank you for your suggestion.

We'll rename the identify sample to libraw_identify.

Reply to: Usage Examples   15 years 7 months ago

Maybe you could rename the identify sample program. Anybody who installs it will have a conflict with ImageMagick's command of the same name.

Reply to: Download   15 years 8 months ago

You're right.

We'll include the original dcraw.c in the next release.

Reply to: Download   15 years 8 months ago

because the #line directives in various files point to it. Unfortunately, downloading dcraw.c separately doesn't help, because it changes all the time and is therefore out of sync with LibRaw.

Reply to: makefile for compiling on mingw   15 years 9 months ago

Unfortunately, the file is not found:

wget http://www.jellofishi.com/apps/libraw/mingw/Makefile
--2009-03-01 17:29:09-- http://www.jellofishi.com/apps/libraw/mingw/Makefile
Resolving www.jellofishi.com... 67.15.172.4
Connecting to www.jellofishi.com|67.15.172.4|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2009-03-01 17:29:10 ERROR 404: Not Found.

Reply to: C++ API   15 years 9 months ago

Yes, for '99%' of cameras (Bayer-pattern ones) half_size is lossless. For most RGB Bayer patterns there are four different pixel positions in a 2x2 block, so if you want to recover the exact pixel position you need to examine the "filters" field, which describes the exact layout.

There are some special cases (CMYG cameras, Leaf cameras with 8x2 blocks), but most cases are simple.

Also, we'll refactor this part of LibRaw at some point. Having multiple planes will make it simpler to deal with multi-plane cameras: Fuji SuperCCD (2 images in one file), Sinar 4-shot and 16-shot...

Sometime around LibRaw 0.9.

Reply to: C++ API   15 years 9 months ago

I understand the philosophy of "one size fits all". The DNG-SDK pursues the same strategy, but uses more than one plane for the remaining 1% of sensors, i.e. instead of using four values for one pixel it uses up to four pixel planes. But I certainly understand that you have to follow the design of dcraw.c closely. That will allow you to incorporate new cameras very easily into libraw.
Does 'half-size' processing mean that the top-left corner, which may look as follows:
RG
GB
is merged into one pixel with four values? In other words, is 'half-size' processing lossless?

Reply to: C++ API   15 years 9 months ago

dcraw's (and LibRaw's) internal structures are 'one size fits all': image[][4] can hold every RAW type found in the wild (1-color Bayer, 3-color Foveon, 4-color Sinar 4-shot). Sure, ~99% of RAW images are simply Bayer-pattern RGB, but we should not ignore the remaining 1%.

If you want to save memory, you can use 'half-size' processing at the unpack() stage. In this case each 2x2 block of four raw pixels is packed into one image[][4] item. You have to take the exact Bayer pattern layout into account to restore the original pixel positions.

Reply to: C++ API   15 years 9 months ago

I remember FC(row,col) from the times when I tried to dig through dcraw.c. I didn't succeed, because after a pause I had to start all over again. I'm very grateful for your work here. Being frustrated by dcraw.c, I turned to the DNG-SDK, which also has the ability to write raw files, which is crucial for my RawImageClearer ( http://www.RawImageClearer.de.tf ). The DNG-SDK has lots of pitfalls and is very bloated, I have to say.
I would like to implement a new kind of demosaicing with libraw. Rather than creating an RGB 4:4:4 image, it directly creates a YUV 4:2:0 image, which is exactly the color space used for JPEG compression. Thus I do not really fill in the remaining 2 or 3 values as is done with the usual RGB demosaicing methods.

Reply to: C++ API   15 years 9 months ago

There is an FC(row,col) function, which returns the index (in the 0..3 range) of a pixel's value within image[i][4].
Look in samples/unprocessed_raw.cpp for a usage example.

You're right that a separate raw_image[row*col] plus processed_image[row*col][3] may be a better alternative. Sometime in the future, LibRaw may be refactored this way.
There are several reasons for the current memory allocation pattern:
* (the main one) LibRaw is derived from the dcraw sources, so the memory pattern is preserved

* there are some special cases (Foveon, Sinar 4-shot) where more than one component is needed, so the 4-component image[] is more general and fits all cases

* most LibRaw users use the postprocessing stage too, and with postprocessing there is no advantage in 2-phase allocation (one buffer for raw data and one for postprocessing)

Reply to: C++ API   15 years 9 months ago

Thank you for your prompt information.
If only one of the four possible values is filled with data by unpack, then there must either be some information about the mosaic pattern, or the unfilled items must be marked by a special value?
It would be nice to have a low-level function which just writes the sensor data (only 1 ushort per pixel) to a specified buffer:
unpack(unsigned short *buffer, int pitch)
Of course, the user has to ensure that enough memory is allocated. This rather flexible function also has the advantage that the user can pad the data appropriately, which is often necessary. The DNG-SDK also unpacks the raw data in stage 1 with just one 16-bit value per pixel, if it's not a Foveon sensor. At least internally, such a function should precede LibRaw::unpack(). It would be nice to have such a function available to LibRaw users as well.

Reply to: C++ API   15 years 9 months ago

Each pixel is 4-component, 16 bits per component.

After unpack(), only one component per pixel is filled. There are two exceptions:
* if the half_size parameter is set, the image is downsized two-fold (in each dimension), and all 4 components of image[i] are filled with data
* for Foveon-based sensors, three components are filled.

At the postprocessing stage the remaining components are interpolated 'in place', without reallocating image[].

Reply to: C++ API   15 years 9 months ago

LibRaw::unpack fills imgdata.image with values. However, imgdata.image is of type ushort (*image)[4] rather than ushort *image, although raw files from Bayer sensors have only one 16-bit value per pixel. What do I misunderstand here?

Reply to: how to get XYZ colors?   15 years 9 months ago

You can use the cam_xyz matrix. It is 4x3 because Bayer data contains 4 channels (in general).
For cameras with two identical green channels this matrix is effectively 3x3 (the last row is zeros).

Reply to: Region of Image   15 years 9 months ago

Current LibRaw postprocessing routines (derived from dcraw) can only develop the entire image area.

You may use unpack() to load the entire Bayer pixel array, then do the demosaicing etc. yourself for the region you need.

Reply to: Two Paths Leading Nowhere   15 years 10 months ago

Peter,

my experiments show two interesting things:
1) Some DNG implementations use camera-specific processing paths, i.e. one way of calculating black level for Canon cameras and another way for brand 'xxx'. So these DNG engines are not camera-independent.

2) The most common DNG converter calculates a wrong Black Level tag. This is not a specification problem but an implementation one. Anyway, from a practical point of view, the DNG spec is theory, while Adobe DNG Converter is the reality.

So, in the current reality it is not possible to build a good RAW processor for DNG files that uses only Adobe-documented DNG tags. At least for DNGs produced from Canon .CR2 files, a good RAW engine should ignore the DNG-specific black level tags and calculate the black level itself. If an application relies on the DNG tag, the result is very bad rendering in Zone V and below (also, the dynamic range drops from 8-11 stops to 5-6).

Of course, DNG is better documented than the usual (vendor-undocumented) RAW formats. This is definitely a step in the right direction. But the step is not big enough: DNG processors must use vendor-specific EXIF fields for good rendering; vendor-independent DNG fields are not enough.

Some DNG problems are related not to the spec itself but to the particular Adobe DNG implementation. On the other hand, there are not many DNG converters on the market, so in the public mind Adobe DNG Converter problems and DNG format problems are not distinguished.

Reply to: Two Paths Leading Nowhere   15 years 10 months ago

To whoever you are,
Since you are not responding to any points I'm bringing up, other than to pick them apart, this will be my last post here.

> The rendering tools will inevitably use differing sub-components.
> The only way to stop that is to forbid progress in rendering tools.
Once again you are trying to sidestep the issue at hand. The problem is not what we want to do, but what DNG forces us to do: ignore the contained information to ensure normal processing.

And I asked, can you not find that information in the DNG? Is it not in the EXIF space for you to find? This is an actual question.

> The DNG spec has been submitted to the ISO for consideration,
> so there is some discussion happening there, I assume.
Thank you for being so open about the nature of that discussion. It sounds especially unusual coming from a person who founded the Digital Standards and Practices Committee with the American Society of Media Photographers.

I don't understand your point here. All I happen to know about this is that the DNG spec has been submitted to the black hole that is the ISO organization. That's public knowledge. If you're interested, why don't you ask the people that do know?

> I can tell you that Adobe is most interested in speaking to people who are
> at least willing to acknowledge the good work they have already done
You seem not to realize that Adobe are not the only party here to decide. Somehow I doubt that you gave here an accurate profile to Adobe, too; and that you really understand that it is not just about Adobe doing good job with their applications, but about the future of our archives, and the quality of our conversions being on par to the money invested into the lenses and cameras and studios etc, all other things being equal.

I'm acutely aware of that. It's just that Adobe is the only company that has bothered to try and create a universal raw container. I assume that you would never consider saving a file as a PSD, which is far more proprietary and undocumented than DNG.

>If only Adobe ACR/LR would be the best converter in the world, DNG might look much more attractive. But it is not.

No argument there. "Best" is a matter of personal taste, to a large extent.

> Camera Manufacturers unwillingness to support DNG is a political issue,
> not a technical one
Please try to support your statement with some evidence.

Have you spoken to any camera manufacturers about it? I have. And I've never heard of any technical barrier. If there is one, please feel free to present it. Saying they have not done it is not the same as saying that there is a technical limitation.

> As to usable, DNGs can be read by most Adobe products, Capture 1, Bibble 5,
> Lightzone, Apple on system level, and Windows on a system level,
> and nearly all DAM products.
To read is not everything. There are at least two questions here: firstly, is the rendition in third-party raw converters based solely on DNG data fields; and secondly, how different is that rendition when compared to the rendition from the original raw data?

The only thing that I think you're saying is that you need to bypass the DNG's Adobe rendering tags in favor of those in the EXIF data that is also contained in the file. If it's something different, please correct me.

> I am curious as to who exactly "we" is.
Once again in your own words - we like people who have done their homework first.
I remember well your writing from the past, September 2005: "While DNG is not guaranteed to be around forever it has a better chance than any particular individual camera format currently available". You continue than: "As more photographers see its benefits, the number of DNG files in existence will dwarf any other single format." That is, you were acknowledging that DNG can go to oblivion yet suggesting to use it as widely as possible. Sound logic and good advice.

Hmm, I would think that a format person would understand that "going into oblivion" is pretty unlikely for an openly documented format. Adobe could go under tomorrow, and DNGs could still be opened by third parties. And I do believe that there are probably more DNGs in existence than any other single raw format.

Since you won't identify yourself, or your own financial interest, I'm out.
Peter
