In-camera HDR: it is time to innovate. An open question to all camera manufacturers.

One of the trending techniques in photography right now is HDR, or High Dynamic Range.  The basic concept of HDR photography comes from the fact that today's camera technology, or the camera's sensor to be specific, cannot capture as wide a range of exposure levels as the human eye can.  So the HDR technique combines a set of images taken at multiple exposure levels and creates a new one that contains all of those exposure levels.  Trey Ratcliff is probably the king of HDR; his work is where you should start if you want to learn about HDR photography.  What I'm writing here isn't about how to make HDR, but rather a rant about what I find in today's cameras regarding HDR.
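The combining step can be sketched in a few lines of Python. This is a toy illustration of my own, not how any real HDR tool works: it weights each bracketed frame's pixel by how close it is to mid-gray, then takes a weighted average, so each pixel leans on whichever exposure captured it best.

```python
import math

def fuse_exposures(frames):
    """Toy exposure fusion: for each pixel position, weight each bracketed
    frame by how close that frame's pixel is to mid-gray (0.5 on a 0..1
    scale), then take the weighted average.  Real tools use far more
    sophisticated tone mapping; this only shows the basic idea."""
    fused = []
    for pixels in zip(*frames):  # same pixel position across all frames
        weights = [math.exp(-((p - 0.5) ** 2) / (2 * 0.2 ** 2)) + 1e-6
                   for p in pixels]
        fused.append(sum(w * p for w, p in zip(weights, pixels)) / sum(weights))
    return fused

# Three bracketed "exposures" of the same four pixels: under, normal, over.
under  = [0.02, 0.10, 0.20, 0.45]
normal = [0.08, 0.40, 0.80, 1.00]
over   = [0.30, 0.90, 1.00, 1.00]
hdr = fuse_exposures([under, normal, over])
```

Dark pixels end up dominated by the overexposed frame and bright pixels by the underexposed one, which is the whole point of bracketing.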

The truth is I don't really do HDR photography.  I have thought about trying it someday.  Sometimes I bracket shots with the mindset that I might do HDR someday.  And yet I'm still lazy enough to not actually put those shots together.  Nowadays, some cameras do support HDR, meaning the camera will take 3 shots, do all the technical work internally, and spit out the 3 images it originally captured plus a 4th that combines the highlight and shadow details of the 3 originals, a.k.a. the HDR image.  With that said, the result usually doesn't come out as good as manually doing an HDR on the computer using a specialized tool like Photomatix.

I'm not complaining about that, nor about why today's sensors still can't capture an HDR-like exposure range.  Considering how competitive the market for photography tools is (notice I didn't use the word camera here) nowadays, I doubt the camera manufacturers are slacking off.  The old and trusted names, Canon and Nikon, are very competitive in all the market segments, from high-end dSLRs (digital Single Lens Reflex) to point-and-shoots and everything in between, including entry-level dSLRs and enthusiast P&S models.  Then there is Sony, a giant electronics company with tons of cash and a fast-growing share in this area, and then all the other companies that make EVIL (Electronic Viewfinder Interchangeable Lens) cameras, like Olympus, Panasonic, Samsung and Fujifilm.  Nowadays, almost all P&S cameras have to compete with smartphones as well, maybe with the exception of Samsung, which decided to merge the two together instead.

This is how we do HDR today.  We take 3 shots of the same subject(s) from the same point of view:
Key Frame -2: image captured at -2 EV
Key Frame +0: image captured at the normal exposure level
Key Frame +2: image captured at +2 EV


Regardless of whether you use in-camera HDR or do it on the computer later, you then create an HDR image.

**Simulated images shown for illustration purposes only.

Let’s take a look at the timeline here.

Set the camera to Aperture Priority with bracketing, or set the exposures manually one after another, and let's assume the correct shutter speed for the normal exposure level is 1/30 sec at an aperture of f/8 and ISO 100:

Set the shutter speed to 1/120 sec, capture the first image.
Set the shutter speed to 1/30 sec, capture the second image.
Set the shutter speed to 1/7.5 sec, capture the third image.
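The bracketing arithmetic behind those three speeds is simple doubling: each EV step halves or doubles the exposure time. A quick sketch, using the base values from the example above:

```python
def bracket_shutter_speeds(base_seconds, ev_steps):
    """Each EV step doubles (+1) or halves (-1) the exposure time."""
    return [base_seconds * (2 ** ev) for ev in ev_steps]

# Base exposure from the example: 1/30 s at f/8, ISO 100.
speeds = bracket_shutter_speeds(1 / 30, [-2, 0, +2])
# -2 EV -> 1/120 s, 0 EV -> 1/30 s, +2 EV -> 4/30 s (about 1/7.5 s)
```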


Now, here is my rant: why (insert your own connotation for special impact here) do we need to do it that way in the digital age?

Seriously, why haven't camera manufacturers improved things like this already?  The camera doesn't need to expose the sensor 3 separate times to capture 3 images.  It only needs to expose once to capture 3 or more images.  First, let me explain by expanding the timeline.

Clear the sensor; set the shutter speed to 1/120 sec and capture; read the sensor out to memory; export that as the first image.
Clear the sensor; set the shutter speed to 1/30 sec and capture; read the sensor out to memory; export that as the second image.
Clear the sensor; set the shutter speed to 1/7.5 sec and capture; read the sensor out to memory; export that as the third image.

This is what it should be:

Clear the sensor; the camera sets the exposure to 1/120 sec and starts the capture; pause the sensor and copy its readout to memory; export that as the first image.
The camera extends the exposure by another 3/120 sec (1/40 sec, for a running total of 1/30 sec) and continues the capture; pause the sensor and copy its readout to memory; export that as the second image.
The camera extends the exposure by another 12/120 sec (1/10 sec, for a running total of 1/7.5 sec) and continues the capture; pause the sensor and copy its readout to memory; export that as the third image.
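The timing arithmetic of this scheme is easy to check. Since each readout is non-destructive, the camera only needs the incremental time between target totals, so the whole burst takes no longer than the +2 EV shot alone. A sketch (purely hypothetical, since no current camera exposes such an API):

```python
def readout_schedule(target_totals):
    """Given the total exposure time wanted at each non-destructive
    readout (shortest first), return the extra integration time needed
    before each readout.  Hypothetical illustration of the timing
    arithmetic, not any real camera's firmware."""
    schedule, elapsed = [], 0.0
    for total in sorted(target_totals):
        schedule.append(total - elapsed)
        elapsed = total
    return schedule

# Targets from the example: 1/120 s, 1/30 s (4/120) and 1/7.5 s (16/120).
segments = readout_schedule([1 / 120, 1 / 30, 1 / 7.5])
# -> [1/120, 3/120, 12/120]; the whole burst totals just 1/7.5 s
```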

I think Canon, Sony and Fujifilm produce their own image sensors for their own cameras.  This kind of thing should already be possible.  The whole process would take about the same time as, or only slightly more than, taking the +2 EV image alone.

The benefit of capturing the images this way for HDR is that it reduces the total time it takes to capture all the necessary exposure levels, which reduces camera shake and thus improves image quality.  And in the case of a dSLR, exposing this way would only require the mirror to flip once, reducing the vibration caused by mirror flipping.  I know the latest and greatest gear like the Canon EOS 5D Mark III already has a feature to combine images from different exposure levels, so this isn't a stretch from what is already available.  Newer cameras now have some sort of HDR available, yet they still do it by capturing 3 images with 3 separate exposures.  I don't believe there is a downside to the camera making, or assisting in making, the HDR this way.  In an age where most cameras can shoot video, i.e. camera sensors are flexible enough to work differently from what they were originally designed to do, I don't see why this is impossible.

I believe this is an area that camera manufacturers can improve upon.  There is no reason for the digital camera to be stuck working around the old way we captured images in the film days.  It can work differently, for the better.


Red in the NAS box?

I recently listened to the TWiCH (This Week in Computer Hardware) podcast episode 252, which talked about red drives.  They talked about the difference between green and red drives and pointed to an article.  This wasn't the first time I had heard about red drives, but it was the first time I decided to go read the article.  I presume it is the article Patrick Norton and Ryan Shrout, the hosts of TWiCH, talked about.

The article brought up a very easy-to-understand sequence of events in which the green drives fail:

  • Array starts off operating as normal, but drive 3 has a bad sector that cropped up a few months back. This has gone unnoticed because the bad sector was part of a rarely accessed file.
  • During operation, drive 1 encounters a new bad sector.
  • Since drive 1 is a consumer drive it goes into a retry loop, repeatedly attempting to read and correct the bad sector.
  • The RAID controller exceeds its timeout threshold waiting on drive 1 and marks it offline.
  • Array is now in degraded status with drive 1 marked as failed.
  • User replaces drive 1. RAID controller initiates rebuild using parity data from the other drives.
  • During rebuild, RAID controller encounters the bad sector on drive 3.
  • Since drive 3 is a consumer drive it goes into a retry loop, repeatedly attempting to read and correct the bad sector.
  • The RAID controller exceeds its timeout threshold waiting on drive 3 and marks it offline.
  • Rebuild fails.
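The sequence above can be reduced to a toy simulation: the rebuild fails the moment any surviving drive has a sector it can't read, because a consumer drive hangs retrying and gets marked offline. This is my own simplified model, not any real controller's logic:

```python
def rebuild_succeeds(surviving_drives):
    """Toy RAID 5 rebuild: reconstructing the replaced drive from parity
    requires reading every sector of every surviving drive.  A consumer
    drive that hangs retrying a bad sector exceeds the controller's
    timeout, gets marked offline, and the rebuild fails.
    (Hypothetical model for illustration only.)"""
    return all(all(drive) for drive in surviving_drives)

# Drives modeled as lists of per-sector "readable?" flags.
drive2 = [True] * 8                         # healthy
drive3 = [True] * 3 + [False] + [True] * 4  # one latent bad sector
ok = rebuild_succeeds([drive2, drive3])     # False: the rebuild fails
```

One unnoticed bad sector on an otherwise healthy drive is enough to sink the whole array, which is exactly the scenario in the bullet list.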

Before I continue, I should point out that the article outlines a legitimate problem that anyone building, or planning to build or buy, a NAS box should be aware of.  The gist of the problem is that the number of errors (bad sectors) exceeds the tolerance provided by the RAID level, e.g. 1 drive in RAID 5 and 2 drives in RAID 6.  Simple enough concept.  Some people already wrote in the article's comments that it's still possible to run some sort of hard-drive utility like SpinRite on the drives before bringing the NAS back online to rebuild.  Technically there is nothing wrong with this solution, except that it is not ideal for business.  What I mean is that, generally speaking, consumers and businesses don't necessarily have the same objectives.  Yes, both want some sort of backup/redundancy system, but a business can spend more money (and at least in the US it's tax-deductible, as it's considered a cost of doing business), while at the same time the recovery process needs to be quick.  For consumer use, it's almost the opposite.

Another thought I had is that this problem is preventable.  Looking at the first bullet point, it's easy to see that this is where the problem starts: a bad sector in a file that is not commonly accessed, and thus not detected.  Why does it have to be like that?  Imagine a home-built RAID using something like FreeNAS, NAS4Free, unRAID or whatever system you want to use: why can't they add a feature to read those files when the NAS isn't busy?  Especially on a Linux-based NAS solution, people could write a script to read the least-accessed files when CPU or I/O usage is low.  Ideally, this could be built into the NAS distro as a feature.  It's also still possible for manufacturers to do this in off-the-shelf upgradable NAS systems like Synology or Drobo.  Even if the files or the filesystem are encrypted, a read would still either succeed or fail, and a failure would generate some sort of log or trigger SMART.  If the unreadable sector can be discovered early, then we wouldn't run into this problem.  I'm sure this blog isn't exactly a popular one, nor the most useful one.  I just wonder why nobody has brought this up before.
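The script idea is small enough to sketch. This is a minimal stand-in, assuming only standard filesystem access (a real version would run from cron when the box is idle, throttle its I/O, and feed failures into the NAS's alerting instead of returning a list):

```python
import os

def scrub_least_accessed(root):
    """Toy scrub: read every file under root, least recently accessed
    first, so a latent bad sector surfaces as an I/O error now instead
    of months later during a RAID rebuild."""
    candidates = []
    for dirpath, _, names in os.walk(root):
        for name in names:
            path = os.path.join(dirpath, name)
            try:
                candidates.append((os.stat(path).st_atime, path))
            except OSError:
                continue  # file vanished or is unstatable; skip it
    bad = []
    for _, path in sorted(candidates):       # oldest access time first
        try:
            with open(path, "rb") as f:
                while f.read(1 << 20):       # force every block off disk
                    pass
        except OSError:
            bad.append(path)                 # the early warning we want
    return bad
```

One caveat: on filesystems mounted with `noatime`/`relatime`, the access times the sort relies on may be stale, but the full read pass still does its job of surfacing unreadable sectors.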

Nook HD+

I just got hold of a Nook HD+ a few days ago.  I think most people already know what it is; if you don't, google it.  I'm not doing an unboxing either, as I'm sure others have already done that.  This is basically a tidy-up of my first impressions, including some things that others have missed.  Also, the images illustrate how to skip the OOBE (Out Of Box Experience) on this Nook HD+ unit, and maybe on the HD unit as well.

(image 01)
This is what you will see when it's powered OFF.

(image 02)
This is what you will see when turning it ON.

(image 03)
While powered off, hold the Home button (that lowercase "n") and the power button, then release them when the word "nook" appears in the middle of the screen (image 02), and you will get this Factory Reset screen.

(image 04)
Factory Reset confirmation page.

(image 05)
If you did the factory reset, you will see this "nook" screen with a status bar underneath it.

(image 06)
More status percentage page.

(image 07)
Welcome screen; this is what you would normally see the first time you boot, or after a factory reset.

(image 08)
At the welcome screen (image 07), if you hold down the volume-up button (top-RIGHT) and tap the bottom-RIGHT of the touchscreen (yes, the actual screen, the white portion), you can get into the Factory Setting page.

(image 09)
Alternatively, at the welcome screen (image 07), you can hold down volume-down (top-LEFT) and tap the bottom-LEFT of the touchscreen (yes, the actual screen, the white portion) to enable ADB.  Let me make it clear: I have no idea what it will do, since I haven't gotten ADB to work yet.  It has been a pain for me.  Maybe somebody else will make a guide for this.

(image 10)
You can do both on the same screen.

(image 11)
Hitting the "Factory" on-screen button, as shown in images 08 and 10, will get you some juicy info.  If you're going to ask whether all of the images are doctored, then yes.  But of course I did so to remove the device-specific information; I'm not that nuts.

(image 12)
Skip OOBE: while on the Factory Setting page (image 11), hold down the volume-up button (top-RIGHT) and tap the bottom-RIGHT of the touchscreen (yes, the actual screen, the white portion), like you did to reveal the "Factory" button, and you will now see the "Skip Oobe" button.  This gets you into Test mode to play around without having to register first.  Beware: once you skip the OOBE, it stays in that mode.  It will survive soft and hard reboots until you redo the factory reset.

(image 13)
After hitting that "Skip Oobe" button, this is what comes next.

(image 14)
A cold reboot doesn't get rid of "Skip OOBE" mode; it stays until a factory reset.

Now, on a side note, I think the Nook HD+ feels very good in the hand.  It doesn't feel as solid as some other devices, but it doesn't feel cheap either.  Holding it, you can feel the sense of design put into it.  That said, I can also feel some roughness in the design.  For example, the front trim feels like a separate piece glued onto the front; it feels like you could just rip it off.  You can see the edge of the front glass when looking from an angle.  I suppose whoever designed it wanted to cut the weight to the limit, but seeing the edge of the glass isn't very appealing.  Also, I don't know what type of screen it is, but when the screen is off or showing a dark static background, I can see an array of dots arranged in a matrix when looking from certain angles.  The button layout isn't the best in my personal opinion, but that could just be a matter of personal taste.  I don't mind the power button, which could go on the right or the top, but I think the volume rocker is better off on the side.  The home button may take some getting used to if you're coming from a non-Nook device.  I think it does a decent job as a movie-playing and book-reading device; the form factor is good for both, and of course the screen resolution justifies it.  If you plan to use the tablet for gaming, or to use a camera in any way, whether taking pictures or video calling, this is just not for you.  But for other normal tasks I think it will do just fine.