Debugging nVidia EDID resolutions

After installing an nVidia card into my HTPC I ran into a problem I had never had before: the driver will not allow me to use the full 1280×720 resolution of my TV. Running xrandr on the console confirms which resolutions Xorg reports as supported.
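For reference, querying the modes is just the stock xrandr call, no options needed (run from a terminal inside the X session):

xrandr -q

On my setup that prints: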

Screen 0: minimum 320 x 240, current 720 x 480, maximum 1920 x 1080
default connected 720x480+0+0 0mm x 0mm
720x480        50.0*    51.0
680x384        52.0     53.0
640x480        54.0     55.0
512x384        56.0
400x300        57.0     58.0
320x240        59.0
480i/60        30.0
480p/60        60.0
720p/60        60.0
1080i/60       30.0
1080p/60       60.0

Now if you read the above closely, you can see that the display reports modes up to 1920×1080 (interlaced in my case, but still). This proves that the EDID information (which should allow your graphics card to auto-detect the monitor's properties) is coming through, yet the driver is not acting on it properly. Read on to solve this problem.

After enabling debug support in the xorg.conf file (add Option "ModeDebug" "TRUE" to the Monitor section) I get a full dump of all modes supported by my TV, including the missing 720p and 1080i modes!
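If it helps, the change is a single line in the Monitor section; a minimal sketch (your Identifier will differ):

Section "Monitor"
  Identifier "Monitor0"
  # dump mode validation details to Xorg.0.log
  Option     "ModeDebug" "TRUE"
EndSection

With debugging enabled, Xorg.0.log contains a validation report for every mode. For example: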

(II) NVIDIA(0):   Validating Mode "1920x1080":
(II) NVIDIA(0):     1920 x 1080 @ 60 Hz
(II) NVIDIA(0):     Mode Source: EDID
(II) NVIDIA(0):       Pixel Clock      : 74.18 MHz
(II) NVIDIA(0):       HRes, HSyncStart : 1920, 2008
(II) NVIDIA(0):       HSyncEnd, HTotal : 2052, 2200
(II) NVIDIA(0):       VRes, VSyncStart : 1080, 1084
(II) NVIDIA(0):       VSyncEnd, VTotal : 1094, 1124
(II) NVIDIA(0):       H/V Polarity     : +/+
(II) NVIDIA(0):       Extra            : Interlace
(II) NVIDIA(0):     Mode (1920 x 1080) is too large for DFP Native Resolution
(II) NVIDIA(0):         (Max: 720 x 480); mode will not be allowed to scale to
(II) NVIDIA(0):         the DFP's native resolution.
(II) NVIDIA(GPU-0):     BestFit Scaled and BestFit AspectScaled are identical;
(II) NVIDIA(GPU-0):         collapsing BestFit AspectScaled.
(II) NVIDIA(GPU-0):     BestFit Centered and BestFit Scaled are identical;
(II) NVIDIA(GPU-0):         collapsing BestFit Scaled.
(II) NVIDIA(GPU-0):     BestFit Centered         1920x1080
(II) NVIDIA(GPU-0):       Horizontal Taps        0
(II) NVIDIA(GPU-0):       Vertical Taps          0
(II) NVIDIA(GPU-0):       Base SuperSample       1
(II) NVIDIA(GPU-0):       Base Depth             32
(II) NVIDIA(GPU-0):       Distributed Rendering  1
(II) NVIDIA(GPU-0):       Overlay Depth          32
(II) NVIDIA(0):     Mode is valid.

Here you can clearly see that the mode came from the EDID data and was verified as a valid mode (the clock ranges match the requirements for the resolution). All is well until the line “Mode (1920 x 1080) is too large for DFP Native Resolution”.

It seems the nVidia driver has a design flaw that screwed me over: it assumes the first detailed timing in the EDID block is the physical (native) resolution of the attached panel. For most monitors this is indeed the case. However, if I read the Xorg.0.log file from my HTPC, I find the following in the EDID dump (remember that you need to enable debugging to see all of this):

(--) NVIDIA(0): Detailed Timings:
(--) NVIDIA(0):   720  x 480  @ 60 Hz
(--) NVIDIA(0):     Pixel Clock      : 27.00 MHz
(--) NVIDIA(0):     HRes, HSyncStart : 720, 736
(--) NVIDIA(0):     HSyncEnd, HTotal : 798, 858
(--) NVIDIA(0):     VRes, VSyncStart : 480, 489
(--) NVIDIA(0):     VSyncEnd, VTotal : 495, 525
(--) NVIDIA(0):     H/V Polarity     : -/-
(--) NVIDIA(0):   1920 x 1080 @ 60 Hz
(--) NVIDIA(0):     Pixel Clock      : 74.25 MHz
(--) NVIDIA(0):     HRes, HSyncStart : 1920, 2008
(--) NVIDIA(0):     HSyncEnd, HTotal : 2052, 2200
(--) NVIDIA(0):     VRes, VSyncStart : 1080, 1084
(--) NVIDIA(0):     VSyncEnd, VTotal : 1094, 1124
(--) NVIDIA(0):     H/V Polarity     : +/+
(--) NVIDIA(0):     Extra            : Interlaced

The first detailed timing is 720×480@60Hz and the next one is the 1080i mode. The nVidia driver interprets this as: the native resolution is 480p, and 1080i (plus 40-odd more resolutions) merely happen to be supported. The problem is that no matter what your Xorg configuration asks for, it is limited to the modes the nVidia driver decides to allow.
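As a sanity check you can also read the EDID block without going through the nVidia driver at all. A quick sketch using the read-edid tools (assuming the package is installed; on Gentoo it lives in x11-misc/read-edid, and get-edid needs root):

# get-edid reads the raw EDID block from the monitor,
# parse-edid decodes it into human-readable timings
get-edid | parse-edid

The detailed timings it prints should match the driver's dump above, confirming that the TV really advertises 480p as its first mode.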

Now we are going to solve this problem by simply disabling EDID and specifying the proper settings by hand. First, find the beginning of the EDID information in the log file. This is the section for my HDTV:

(--) NVIDIA(0): --- EDID for LG 26LC2R-ZJ (DFP-0) ---
(--) NVIDIA(0): EDID Version                 : 1.3
(--) NVIDIA(0): Manufacturer                 : GSM
(--) NVIDIA(0): Monitor Name                 : LG 26LC2R-ZJ
(--) NVIDIA(0): Product ID                   : 22046
(--) NVIDIA(0): 32-bit Serial Number         : 5918
(--) NVIDIA(0): Serial Number String         :
(--) NVIDIA(0): Manufacture Date             : 2006, week 7
(--) NVIDIA(0): DPMS Capabilities            :
(--) NVIDIA(0): Prefer first detailed timing : Yes
(--) NVIDIA(0): Supports GTF                 : No
(--) NVIDIA(0): Maximum Image Size           : 920mm x 520mm
(--) NVIDIA(0): Valid HSync Range            : 25.0 kHz - 50.0 kHz
(--) NVIDIA(0): Valid VRefresh Range         : 45 Hz - 65 Hz
(--) NVIDIA(0): EDID maximum pixel clock     : 80.0 MHz

See the horizontal and vertical sync ranges? Put those in your xorg.conf file and disable EDID: the driver will then use the ranges to figure out which resolutions are valid, and you can add the ones you want by hand (read on to see how to create your own ModeLines):

Section "Monitor"
  Identifier     "Monitor0"
  VendorName     "LG"
  ModelName      "26LC2R-ZJ"
  Option         "DPMS"

  # Enable (mode) debugging to see what is going on
  Option         "ModeDebug" "TRUE"

  # Disable EDID as we will end up with a wrong native resolution otherwise
  Option         "UseEdid" "FALSE"
  # Specify the sync values from the EDID information
  HorizSync       25.0 - 50.0
  VertRefresh     45.0 - 65.0

  # Override the DPI to the EDID value
  Option         "DPI" "19x23"

  # Add the HDTV modes by hand, 480p and 720p
  # 720x480 @ 60 Hz (EDID) HSync: 31.4685 kHz
  ModeLine "720x480" 27.00 720 736 798 858 480 489 495 525 -HSync -VSync
  # 1280x720 @ 60 Hz (EDID) HSync: 45 kHz
  ModeLine "1280x720" 74.25 1280 1390 1430 1650 720 725 730 750 +HSync +VSync
EndSection

Note: make sure you keep a copy of the log file from a session with EDID still enabled, so you can look up values later on. The sync ranges and DPI above, for example, are copied straight from Xorg.0.log.
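Something as simple as this will do (the log path may differ per distribution):

# preserve the EDID-enabled log before changing xorg.conf
cp /var/log/Xorg.0.log /root/Xorg.0.log.with-edid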

The tricky bit is the ModeLine entries: they are what actually adds the desired resolutions to the resulting mode list. The sync ranges give the driver a sandbox to fit resolutions into, but on its own it will only try standard 4:3 resolutions.
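If you ever need a mode that is not in the EDID dump at all, you can generate a candidate ModeLine instead of copying one. As a sketch, the gtf utility that ships with Xorg computes generic timings from a resolution and refresh rate (note that my TV reports "Supports GTF: No" above, so for HDTV modes the exact EDID timings below are the safer choice):

gtf 1280 720 60
# prints a ready-to-paste line, roughly:
#   Modeline "1280x720_60.00" 74.48 1280 1336 1472 1664 720 721 724 746 -HSync +Vsync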

To add your fancy 16:9 HDTV resolutions, add ModeLine entries to the Monitor section, composed from the EDID debug information. Look at the examples below:

(--) NVIDIA(0): Detailed Timings:
(--) NVIDIA(0):   720  x 480  @ 60 Hz
(--) NVIDIA(0):     Pixel Clock      : 27.00 MHz
(--) NVIDIA(0):     HRes, HSyncStart : 720, 736
(--) NVIDIA(0):     HSyncEnd, HTotal : 798, 858
(--) NVIDIA(0):     VRes, VSyncStart : 480, 489
(--) NVIDIA(0):     VSyncEnd, VTotal : 495, 525
(--) NVIDIA(0):     H/V Polarity     : -/-

Becomes: ModeLine "720x480" 27.00 720 736 798 858 480 489 495 525 -HSync -VSync

(--) NVIDIA(0):   1280 x 720  @ 60 Hz
(--) NVIDIA(0):     Pixel Clock      : 74.25 MHz
(--) NVIDIA(0):     HRes, HSyncStart : 1280, 1390
(--) NVIDIA(0):     HSyncEnd, HTotal : 1430, 1650
(--) NVIDIA(0):     VRes, VSyncStart : 720, 725
(--) NVIDIA(0):     VSyncEnd, VTotal : 730, 750
(--) NVIDIA(0):     H/V Polarity     : +/+

Becomes: ModeLine "1280x720" 74.25 1280 1390 1430 1650 720 725 730 750 +HSync +VSync

(--) NVIDIA(0):   1920 x 1080 @ 60 Hz
(--) NVIDIA(0):     Pixel Clock      : 74.25 MHz
(--) NVIDIA(0):     HRes, HSyncStart : 1920, 2008
(--) NVIDIA(0):     HSyncEnd, HTotal : 2052, 2200
(--) NVIDIA(0):     VRes, VSyncStart : 1080, 1084
(--) NVIDIA(0):     VSyncEnd, VTotal : 1094, 1124
(--) NVIDIA(0):     H/V Polarity     : +/+
(--) NVIDIA(0):     Extra            : Interlaced

Becomes: ModeLine "1920x1080" 74.25 1920 2008 2052 2200 1080 1084 1094 1124 +HSync +VSync Interlace
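Before trusting a hand-made ModeLine, it is worth checking that it actually fits the sync ranges you configured: the horizontal sync rate is the pixel clock divided by the horizontal total, and the refresh rate is the pixel clock divided by both totals. A quick check with bc, using the 1280×720 values above:

# HSync in kHz: 74.25 MHz / 1650 total pixels per line = 45 kHz
echo "scale=2; 74250 / 1650" | bc
# VRefresh in Hz: 74.25 MHz / (1650 * 750) = 60 Hz
echo "scale=2; 74250000 / (1650 * 750)" | bc

Both values fall within the HorizSync and VertRefresh ranges from the EDID, so the driver should accept the mode.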

Save your xorg.conf and restart the X server; if you are lucky, the new resolutions are picked up. If not (like me), you may have to disable the check against the native resolution altogether by adding the following option (it goes in the Screen or Device section):

Option "ModeValidation" "NoDFPNativeResolutionCheck"

Some people have reported that this was enough for them. It only helps, however, if the nVidia driver decided on a native DFP size larger than the resolution you are trying to add. If it reverts to the first available mode as the default (640×480), you are out of luck, as the mode will be rejected once again:

(WW) NVIDIA(0):     Unable to use mode "1680x1050" for DFP-0; cannot compute
(WW) NVIDIA(0):         backend DFP timings (mode is larger than native
(WW) NVIDIA(0):         backend 640 x 480).
(WW) NVIDIA(0):     Mode is rejected: Unable to determine BestFit backend DFP
(WW) NVIDIA(0):     timings.

The problem here is that the default DFP Native Resolution is even worse than the EDID one, making it impossible to get the new modes validated.

Troubleshooting

If you are still reading this, you are probably having the same stubborn driver issues I was having, and it is time to bring out the big guns: “ExactModeTimingsDVI”. This option tells the nVidia driver to disable its checking of the available modes and just apply them.

BIG FAT WARNING:
You *will* be able to damage your TV if you start experimenting with mode line values. If you created your own as shown in the examples above, you should be fine; if you have a fancy TV you will most likely be fine as well, as the electronics tend to have built-in protection against insane modes (for example 640×480@500Hz).

Besides setting the ExactModeTimingsDVI option, you also need to list the available modes in the Screen section. The resulting configuration for my TV is as follows:

Section "Monitor"
  Identifier     "Monitor0"
  VendorName     "LG"
  ModelName      "26LC2R-ZJ"
  Option         "DPMS"

  # Enable (mode) debugging to see what is going on
  Option         "ModeDebug" "TRUE"

  # Disable EDID as we will end up with a wrong native resolution otherwise
  Option         "UseEdid" "FALSE"
  # Specify the sync values from the EDID information
  HorizSync       25.0 - 50.0
  VertRefresh     45.0 - 65.0

  # Override the DPI to the EDID value
  Option         "DPI" "19x23"

  Option     "ExactModeTimingsDVI"     "True"

  # Add the HDTV modes by hand, 480p and 720p
  # 720x480 @ 60 Hz (EDID) HSync: 31.4685 kHz
  ModeLine "720x480@60" 27.00 720 736 798 858 480 489 495 525 -HSync -VSync
  # 1280x720 @ 60 Hz (EDID) HSync: 45 kHz
  ModeLine "1280x720@60" 74.25 1280 1390 1430 1650 720 725 730 750 +HSync +VSync
EndSection

Section "Device"
  Identifier     "Device0"
  Driver         "nvidia"
  VendorName     "NVIDIA Corporation"
EndSection

Section "Screen"
  Identifier     "Screen0"
  Device         "Device0"
  Monitor        "Monitor0"
  DefaultDepth    24
  SubSection     "Display"
   Depth       24
   Modes      "1280x720@60" "720x480@60"
  EndSubSection
EndSection

Save the file, restart Xorg, and you should be looking at the High Def resolution your TV was made to display. Let's hope nVidia wises up and ships a simpler fix for this problem soon.
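For the record, on Gentoo restarting X usually means bouncing the display manager init script (assuming you start X through xdm):

/etc/init.d/xdm restart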

FYI: the driver in this case was version 185.xx, running on a GeForce 9500 hooked up to the TV with an HDMI-to-DVI cable.
