Hi,
I got a brand new U2713HM a couple of days ago. It looked a little too bright in the mid tones compared to my previous monitors, and profiling the uncalibrated monitor showed it has a gamma of ~1.86. This was done from two different computers with two different NVIDIA graphics cards (over both HDMI and VGA), with the help of a DTP-94 and Argyll/dispcalGUI.
I have also inspected the accompanying calibration card, which shows a gamma curve going from 0 to 350 cd/m2, passing through a reading of around 95 cd/m2 at grey level 128. I've computed the native gamma from the calibration card by curve fitting, and it also shows a best fit of ~1.87, consistent with what I've measured.
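To show the arithmetic, here's a minimal sketch of the single-point estimate, using the two readings off my card (350 cd/m2 at full white, ~95 cd/m2 at grey level 128) and assuming a pure power-law response; the full curve fit over all the card's points is what gives the ~1.87:

```python
import math

# Readings from my calibration card.
white = 350.0  # cd/m2 at full white (level 255)
mid = 95.0     # cd/m2 at grey level 128

# For a pure power-law response L = white * (level/255)^gamma,
# a single mid-tone reading pins down gamma:
gamma = math.log(mid / white) / math.log(128 / 255)
print(f"estimated gamma: {gamma:.2f}")  # → estimated gamma: 1.89
```

The single mid-grey point lands at ~1.89; fitting the whole curve pulls it down to ~1.87, so the two methods agree within the measurement noise.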
I've seen a couple of other calibration cards posted on the web, and they all show a higher gamma than my monitor's, much closer to 2.2 (or to the sRGB curve itself).
The gamma does not change with the mode (Custom or sRGB), nor with adjustments to the brightness setting (I've tested the range from 28 to 75, corresponding to roughly 100 to 250 cd/m2).
The gamma is thus quite far from the spec'ed sRGB mode's value of close to 2.2*) over most of the range on my monitor.
Do I need to ask for a replacement unit, or is there some "hidden" setting that I can adjust?
I tend to believe this is an issue with the screen itself, and I'm wondering why the sRGB factory calibration didn't catch such a large deviation -- it looks like the unit is out of spec compared to the advertised sRGB mode.
Thanks in advance,
-- Per.
*) Yes, I'm aware that sRGB is really not gamma 2.2 but ((x+a)/(1+a))^2.4 (with a = 0.055), but they are quite close in the mid tone areas :-)
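For anyone curious just how close they are, here's a quick sketch comparing the sRGB transfer function (constants from IEC 61966-2-1) against a plain 2.2 power law at mid grey:

```python
def srgb_eotf(x: float) -> float:
    """sRGB electro-optical transfer function (IEC 61966-2-1)."""
    a = 0.055
    if x <= 0.04045:
        return x / 12.92          # linear toe segment near black
    return ((x + a) / (1 + a)) ** 2.4

# Compare with a plain gamma-2.2 power law at grey level 128:
x = 128 / 255
print(srgb_eotf(x), x ** 2.2)  # both come out around 0.22, within ~2%
```

They only diverge noticeably near black, where sRGB switches to its linear segment -- which is why a 1.86 mid-tone gamma sticks out so clearly against either curve.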