
Multi-monitor with KDE 4 and XRandR

One of the annoyances I’ve had with *nix so far was the apparent lack of multi-monitor control from the GUI. I have experience with TwinView configured through xorg.conf, which worked fine but was quite a hassle to set up.

A side effect of this was that, once set up, I never touched the configuration again. For a computer with two screens hooked up permanently, that is fine. For a laptop that gets dragged along and connected to projectors, TVs and monitors, not so much.

The good news is that I run a modern Linux distribution, which led me to believe that the multi-monitor stuff should work out of the box. However, I am not using Kubuntu (which would probably auto-enable all the new toys for me) but Gentoo, so it took some digging to figure out what is going on these days and how to use it. On a side note: you could also just run nvidia-settings after plugging something in, but I’d rather use the automatic method.

First a word of warning: the binary NVidia drivers do not support RandR 1.2. I have the 195.xx drivers installed, and NVidia stated back in 2007 that RandR 1.2 support was a ‘priority’ feature; one that apparently needs more than 3 years to be released. This means that the binary ‘nvidia’ driver gives you only RandR 1.1, which does not support hotplugging displays on the fly. Instead you are forced to use TwinView, define meta-modes for every possible configuration and switch between them using ‘xrandr’ (or run the nvidia-settings tool each time).
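For completeness, the RandR 1.1 workaround looks roughly like this; the MetaModes line is only an illustration for a 1680x1050 panel next to a 1280x1024 monitor, so adjust it to your own setup:

# xorg.conf, Screen section (binary 'nvidia' driver only):
Option "TwinView" "true"
Option "MetaModes" "1680x1050,1280x1024; 1680x1050,NULL"

# Each meta-mode shows up as one big virtual resolution;
# list them and switch by index (or by size) with:
xrandr -q
xrandr -s 0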

Since this sort of defeats the purpose of having RandR 1.2 support in Xorg altogether, I decided to ditch the binary nvidia driver in favor of ‘nouveau’: the open-source replacement for ‘nv’ with decent 2D acceleration support (although I couldn’t resist enabling the highly experimental Gallium3D support as well).
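For the Gentoo users: the switch roughly comes down to the steps below. Treat this as a sketch; package names and the kernel side (enabling the nouveau module with KMS) may differ on your system.

# /etc/make.conf: build the open-source driver stack
VIDEO_CARDS="nouveau"

# Install the nouveau X driver, drop the binary driver and
# point the system OpenGL implementation back at Mesa:
emerge -av x11-drivers/xf86-video-nouveau
emerge -C x11-drivers/nvidia-drivers
eselect opengl set xorg-x11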

To start off, trim your Xorg configuration down to a minimum; note that a recent Xorg is needed (I have 7.4 at the time of writing). This is needed to enable the auto-detection of as much as possible, or rather: manually specifying properties will override the auto-detected settings and cripple Xorg’s ability to handle everything on its own. As an example, I have included my own xorg.conf below.

Section "ServerLayout"
  Identifier "Default Layout"
  Screen 0 "Screen 0"
EndSection

Section "ServerFlags"
  Option "AutoAddDevices" "true"
  Option "AutoEnableDevices" "true"
EndSection

Section "Device"
  Identifier "nVidia_8600M_GS_nouveau"
  Driver "nouveau"
  Boardname "GeForce 8600M GS"
EndSection

Section "Screen"
  Identifier "Screen 0"
  Device "nVidia_8600M_GS_nouveau"
  Monitor "Monitor 0"
  SubSection "Display"
    Modes "1680x1050"
  EndSubSection
EndSection

Section "Monitor"
  Identifier "Monitor 0"
  VendorName "Primary Monitor"
  Option "DPMS"
EndSection

Section "Extensions"
  Option "Composite" "Enable"
EndSection

The ‘ServerFlags’ section is pretty simple: use every auto-detection mechanism Xorg has to hotplug devices. This covers both monitors and input devices like mice and keyboards.

Next up are the two sections for my NVidia graphics card. Since I am using ‘nouveau’ instead of ‘nvidia’, make sure you are not loading ‘glx’ anywhere anymore (if you were using ‘nvidia’ in the past like me, you most likely have that somewhere).
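If you are not sure whether an old ‘Module’ section is still lingering in your configuration, a quick check like the one below helps; the leftover shown is hypothetical, just the kind of thing a previous ‘nvidia’ setup tends to leave behind:

# Show any Module section still present in the configuration:
grep -i -A 5 'Section "Module"' /etc/X11/xorg.conf

# A leftover like this should go when switching to nouveau:
#   Section "Module"
#     Load "glx"
#   EndSection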

The last two sections are fairly simple as well, but note that I do specify the native resolution of my LCD panel as the preferred resolution.

That’s it! Restart your X server and run xrandr to see the result. I plugged a VGA monitor into my notebook and this is the result:

cyberwizzard@cyberxps ~ $ xrandr
Screen 0: minimum 320 x 200, current 2960 x 1050, maximum 8192 x 8192
LVDS-1 connected 1680x1050+0+0 (normal left inverted right x axis y axis) 331mm x 207mm
1680x1050      60.0*+   60.0
1400x1050      60.0
1280x1024      59.9
1280x960       59.9
1152x864       60.0
1024x768       59.9
800x600        59.9
640x480        59.4
720x400        59.6
640x400        60.0
640x350        59.8
HDMI-1 disconnected (normal left inverted right x axis y axis)
VGA-1 connected 1280x1024+1680+0 (normal left inverted right x axis y axis) 312mm x 234mm
1680x1050      74.9 +   60.0
1280x1024      85.0*+   75.0     60.0
1792x1344      60.0
1920x1200      59.9
1600x1200      75.0     70.0     65.0     60.0
1400x1050      85.0     74.9     60.0
1440x900       84.8     75.0     59.9
1280x960       85.0     60.0
1360x768       60.0
1280x800       84.9     74.9     59.8
1152x864       75.0
1280x768       84.8     74.9     59.9
1024x768      100.0     85.0     75.1     75.0     70.1     60.0     43.5     43.5
832x624        74.6
800x600        85.1     72.2     75.0     60.3     56.2
848x480        60.0
640x480        85.0     75.0     72.8     72.8     66.7     60.0     59.9
720x400        85.0     87.8     70.1
640x400        85.1
640x350        85.1
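The nice part is that this layout can now be changed on the fly. For example, the commands below (output names taken from the listing above) put the VGA monitor at its preferred resolution to the right of the laptop panel, and turn it off again before unplugging:

xrandr --output VGA-1 --auto --right-of LVDS-1
xrandr --output VGA-1 --off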

Debugging nVidia EDID resolutions

After installing an nVidia card in my HTPC, I ran into a problem I had never seen before: the driver would not allow me to use the full 1280×720 resolution of my TV. After running xrandr on the console, I can confirm that Xorg reports the following supported resolutions:

Screen 0: minimum 320 x 240, current 720 x 480, maximum 1920 x 1080
default connected 720x480+0+0 0mm x 0mm
720x480        50.0*    51.0
680x384        52.0     53.0
640x480        54.0     55.0
512x384        56.0
400x300        57.0     58.0
320x240        59.0
480i/60        30.0
480p/60        60.0
720p/60        60.0
1080i/60       30.0
1080p/60       60.0

Now if you read the above closely, you can see that the display should support modes up to 1920×1080 (interlaced in my case, but still). This proves that the EDID information (which should allow your graphics card to auto-detect the monitor’s properties) is in fact coming through, but that it is not being used properly. Read on to solve this problem.
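Before diving into the fix, it helps to see what the driver actually makes of the EDID block. The commands below are one way to inspect it; note that read-edid is a separate package and that ‘ModeDebug’ is an option of the binary nvidia driver, so treat this as a starting point rather than the exact route I took:

# The X log contains the raw EDID block and the mode validation results:
grep -i edid /var/log/Xorg.0.log

# The read-edid tools dump and decode the EDID directly:
get-edid | parse-edid

# xorg.conf, Device section: make the nvidia driver log why modes are rejected
Option "ModeDebug" "true"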