Building OpenELEC with LIRC audio_alsa

Since OpenELEC 2/3 does not ship with ‘audio_alsa’, here are instructions to build OpenELEC with the ‘audio_alsa’ lirc driver. This driver is used with home-brew IR receivers that plug into the sound card audio input.

Use the standard build guide from the OpenELEC wiki. After cloning the Git repository, modify the lirc build script so it builds the correct driver.

Open packages/sysutils/remote/lirc/build and replace the line that selects the userspace driver:

--with-driver=userspace \

with:

--with-driver=audio_alsa \
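The change can also be applied with sed; a sketch, demonstrated here on a scratch file so it is safe to run anywhere (point the same sed expression at packages/sysutils/remote/lirc/build in the real tree):

```shell
# Demonstrate the substitution on a scratch copy of the relevant line;
# in the real source tree the target file is packages/sysutils/remote/lirc/build.
printf '  --with-driver=userspace \\\n' > build.scratch
sed -i 's/--with-driver=userspace/--with-driver=audio_alsa/' build.scratch
cat build.scratch   # now reads: --with-driver=audio_alsa \
```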

Update: I tried various things, but for some reason I could not get ALSA support to simply be added to the standard build (this seems to be a bug in LIRC which should have been fixed upstream). You can try adding the line instead of replacing it (note the trailing backslash!)

All done: now start the system build as you normally would: PROJECT=Generic make release
Sit back and wait; on a Core2Duo system the build takes several hours and close to 9 GB of space. Follow the guides to modify the lirc configuration in userspace, or modify the ‘install’ script to install the correct configuration into the read-only system image.

Updates Break LIRC!

Your custom-built OpenELEC will attempt to update itself with standard releases. You can disable automatic updates or place the LIRC binaries on the /storage mount where they will survive updates; just make sure you call the correct binaries.

Note that this will only keep working as long as the binary compatibility of OpenELEC does not change. It seems plausible that this will hold at least between major releases, but if you want to be really sure, switch to manual updates to prevent sudden breakage…

Audio Setup

Some notes on setting up the audio capture correctly, as the relevant bits seem to be scattered across the internet.

First off, mute the capture on all inputs except the jack the receiver is connected to, so only the correct input is enabled. On OpenELEC all inputs appear to be muted by default, so you only need to unmute the correct input from within /storage/.config/

# Select the correct input channel
amixer set "Capture",0 cap
# Set the volume of the capture
amixer set "Capture",0 20

In case you wonder which channel is the correct one or how loud it should be, use arecord. You most likely will not have this program on OpenELEC, but it is in fact the same binary as ‘aplay’: create a symlink to ‘aplay’ called ‘arecord’ and you are good to go.
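For example (mktemp stands in for a writable location here; on OpenELEC you would use something like /storage/.bin and make sure it is on your PATH):

```shell
# 'aplay' and 'arecord' are the same binary, which dispatches on the
# name it was invoked under, so a symlink named 'arecord' is enough.
bindir=$(mktemp -d)                  # stand-in for e.g. /storage/.bin
ln -s /usr/bin/aplay "$bindir/arecord"
ls -l "$bindir/arecord"
```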

Use ‘arecord’ to display the volume of the incoming signal (using the built-in VU meter) and tweak it to whatever your system needs:

arecord -vv -D hw:0 -c 2 -r 48000 -f S16_LE > /dev/null

I experimented with the recording amplification until I had near-zero output when idle (2 to 3% on the VU meter) and 30% to 35% peak volume when holding a button down. At first I had the volume tweaked so input would trigger 10% on the VU meter. At that point irrecord seemed to work, but it failed on the ‘toggle bit’ detection, where you have to push a button as often as possible. Increasing the volume solved this (and simply opening the volume up to the max will break it again – calibrate it properly or ‘irrecord’ will fail!).

And for reference, this is how I start lircd on OpenELEC:

/storage/.bin/lircd --driver=audio_alsa --output=/var/run/lirc/lircd \
 --pidfile=/var/run/lirc/ -d hw@48000,l /storage/.config/lircd.conf
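OpenELEC executes /storage/.config/autostart.sh at boot, so the invocation can live there. A minimal boot-script sketch (I omit the pidfile option here and rely on lircd's default):

```shell
#!/bin/sh
# /storage/.config/autostart.sh (sketch): start the custom lircd at boot
mkdir -p /var/run/lirc
/storage/.bin/lircd --driver=audio_alsa --output=/var/run/lirc/lircd \
  -d hw@48000,l /storage/.config/lircd.conf
```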

You could add --uinput to generate Linux input events besides the standard lirc events. This effectively means XBMC would see something like keyboard strokes instead of LIRC events. It is personal preference, but I still use the old-fashioned LIRC events.


Sharing Sound From Ubuntu to XBMC

Summary: in this article I explain how to stream music (audio) from your Ubuntu computer to XBMC using UPnP. While this recipe also lets you play every MP3 file on your computer, it is aimed at getting the audio from your speakers to your home cinema set.

While test-driving XBMC 12 during the pre-beta phase I noticed how I could use AirPlay with iTunes under Windows to get sound from my crappy laptop speakers to the home cinema setup powered by XBMC.

Thinking this should also be possible from Ubuntu Linux sent me on a wild goose chase that seemed to lead nowhere.

From the ‘paprefs’ utility you can actually connect to the AirPlay capabilities of XBMC. But instead of sharing sound, you instantly crash or hang XBMC. Even better: since this functionality is managed by PulseAudio, merely logging in to Ubuntu will hang XBMC as PulseAudio keeps connecting.

I read multiple stories about how this is a RAOP issue (I assume RAOP is the Linux name for AirPlay) where UDP should be used but TCP is preferred, or something along those lines. The libshairplay website (which PulseAudio seems to use) says UDP has supposedly been working for quite some time now. I gave up on this.

Next up is sharing local audio using UPnP: paprefs => ‘Network Server’ => ‘Make local sound devices available as DLNA/UPnP Media Server’. Fire up XBMC, search for audio using UPnP aaaaaand… nothing.

It turns out that enabling this option only loads a module which creates an HTTP audio sink. Put another way: the audio from your sound card is shared on the network, it is just not advertised so nothing can find it.

This is the point where you need Rygel. While the description tells you about transcoding media for Xboxes and PlayStations, the core of Rygel is a UPnP media server. You need this functionality to advertise the audio sources to the local network.

If you now start Rygel (3) by typing ‘rygel’, you will get a listing of any media found: the Rhythmbox music collection (if it is running) and the internal sound cards (1).

But all is not well yet: when you point XBMC at the internal sound card (or the special DLNA device if you enabled it in ‘paprefs’), XBMC will crash.

Hopefully this bug will get fixed soon, but in the meantime it leaves us without streaming audio. Luckily there is another solution in the form of the GstLaunch (2) plugin for Rygel. Do note that this is a separate package from Rygel itself (it took me 10 minutes to figure out why the plugin was seemingly AWOL when it was in fact not installed).

The GstLaunch plugin allows you to connect a GStreamer pipeline to the network as a UPnP source. While this in theory opens up loads of fun stuff to play with, it can also be used to stream audio from the internal sound card to XBMC.

Open up your rygel.conf (I removed the one in ~/.config/rygel.conf and put everything in /etc/rygel.conf) and put the following in there (comment out the existing GstLaunch section):


[GstLaunch]
enabled=true
launch-items=pulseaudioflac;pulseaudiomp3;pulseaudiopcm

pulseaudioflac-title=FLAC on @HOSTNAME@
pulseaudioflac-launch=pulsesrc device=upnp.monitor ! flacenc

pulseaudiomp3-title=MP3 on @HOSTNAME@
pulseaudiomp3-launch=pulsesrc device=upnp.monitor ! lamemp3enc

pulseaudiopcm-title=PCM on @HOSTNAME@
pulseaudiopcm-launch=pulsesrc device=upnp.monitor ! wavenc

Feel free to cut this list down to just the formats you want (do not forget to modify launch-items accordingly). MP3 is lossy, which lowers audio quality, and has high latency. FLAC is lossless and should have low latency, but XBMC refuses to play the FLAC stream: it complains about the lack of metadata. Similar issues have been solved in GStreamer 0.11 while I am using 0.10, so I assume a newer Ubuntu will fix this. Finally, PCM is uncompressed wave but requires the most bandwidth – I am using this option.

You will end up with something like this:

         +--> (1) PulseAudio => HTTP stream from internal audio --+ 
Ubuntu --+--> (2) Rygel => HTTP audio stream from GstLaunch ------+--> network -> XBMC
         +--> (3) Rygel UPnP server ------------------------------+

Note that (1) would be the preferred option as it is built into Ubuntu, but it currently does not work with XBMC. Start Rygel and find the GstLaunch entry in the UPnP listing. Pick one of the streams to start listening. Enjoy!


Thomson TG787v and PXE boot

If you have a Thomson TG787v modem, like we have from KPN Business in the Netherlands, you might want to add a PXE server to your network to quickly install computers without the hassle of running around with CDs or thumb drives.

I will not explain how to set up the PXE environment itself (which is in fact a TFTP service), but will stick to the modification of the modem/router. This modification can NOT be done from the web-based configuration panel.

We will log in using telnet, create option templates for the DHCP server, install those option templates and finally add them to the active pool so they will actually be in use.

If you think that is rather a lot of work to get option 66 (TFTP server address) and option 67 (bootfile name) into your modem, please complain to Thomson and insist they a) make it more intuitive; b) improve their documentation; c) implement these things in their web-based configuration; d) all of the above.

  • Log in to the CLI via telnet to your router on port 23 and enter your credentials
    Note: on the unlocked versions, your login is the username from the web-based configuration with no password. Try “Administrator”.
  • Install the new template for the PXE server address:
    dhcp server option tmpladd name=tftp_server_name optionid=66
  • Install the new template for the PXE bootfile name:
    dhcp server option tmpladd name=bootfile optionid=67
  • Instantiate the templates:
    dhcp server option instadd name=tftp_server_name tmplname=tftp_server_name value=(addr)
    dhcp server option instadd name=bootfile tmplname=bootfile value=(ascii)pxelinux.0

    Note: The file name can be configured to whatever you need.

  • List all DHCP server pools: dhcp server pool list
  • Optionally, add the option to the desired pool:

    dhcp server pool optadd name=LAN_private instname=tftp_server_name
    dhcp server pool optadd name=LAN_private instname=bootfile

It seems that the template instantiation is added by default to the “LAN_private” pool.
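For concreteness, here is the instantiation step again with example values filled in (the IP address is a placeholder for your own TFTP server; the file name matches the note above):

```
dhcp server option instadd name=tftp_server_name tmplname=tftp_server_name value=(addr)192.168.1.10
dhcp server option instadd name=bootfile tmplname=bootfile value=(ascii)pxelinux.0
```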


Force HTC Hero 1.5 to update

Since it took me hours to find this answer, I decided to put it up clear and simple: this is how you ‘force’ your stock HTC Hero to check for updates.

First off, this is for the original HTC Hero – not the Sprint or CDMA version, but the GSM version (with the chin). Most sites tell you to use the “Check for updates” option; the original HTC Hero firmware does not have this option.

Instead, you need to install the first (of three) updates to get the updater. The original update mechanism checks for an update every two weeks. If you need it to check right away, you can move the clock two weeks forward or enter this number in the dialer: *#*#682#*#*. If it worked, the number will disappear from the dialer and almost instantly a popup will tell you that an update is available.

After this update, force the phone to check for future updates by doing:

  1. From the Home screen, press MENU, and then tap Settings.
  2. Scroll down the screen, and then tap About phone > System software updates.
  3. On the System software updates screen, tap Check now.

Gentoo KVM virtual machine support plus networking

The Gentoo Wiki article about KVM uses the less flexible networking setup with dedicated TAP devices like ‘tap0’. While this setup works fine, other distributions use the bridge device in a different way. By creating a virtual bridge and forcing the host system to connect through this bridge, virtual machines can simply connect to the bridge device and ‘plug in’. No need to manually create TAP devices. In this guide we will set up host and guest networking using the virtual bridge device and DHCP for wired ethernet.
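As a sketch of what the host side looks like in Gentoo's /etc/conf.d/net (the interface names and the array syntax of the then-current baselayout are assumptions; adjust to your setup):

```shell
# /etc/conf.d/net (sketch): bridge br0 carries the host's address,
# the physical NIC eth0 gets no address of its own.
config_eth0=( "null" )
bridge_br0="eth0"
config_br0=( "dhcp" )
```

Guests then attach their TAP interfaces to br0 (for example from a qemu ifup script doing `brctl addif br0 $1`) instead of using a fixed, pre-created tap0 device.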


Upgrading LIRC to ir-core and imon

While the title may sound a tad cryptic, it matches my first impression of the new infrared remote subsystem used in Ubuntu 10.10 and kernel 2.6.36+ based distributions. Linux 2.6.36 introduces a new subsystem for remote controls. This subsystem is a partial replacement for LIRC, as it features complete IR drivers for numerous devices.

In reality, the glass is only half full, as some IR receivers (and transmitters) deal with raw data. Writing universal drivers for such devices can be bothersome or even impossible, for example when dealing with universal receivers. As such, LIRC is not completely written off: it can still be used for userspace IR processing as before. The exact details are unknown to me, as I have a SoundGraph iMon receiver (device ID 0038 – LCD), which does not deliver raw sensor input but complete scancodes.

When I upgraded from Ubuntu 10.04 running lirc 0.8.6 to Ubuntu 10.10 and lirc 0.8.7, the entire IR subsystem died on me. To my surprise, while the remote was dead, the LCD screen was working fine. Further inspection showed that the ‘lirc_imon’ driver was not loaded at all.

With the move to the new input layer driver for supported IR devices in the kernel, the new ‘imon’ driver (version 0.8 at the time of writing) no longer provides a ‘lirc0’ device. As such, lircd will fail when attempting to claim it and loading ‘lirc_imon’ (provided you still have it) will not work as the device is already claimed by the ‘imon’ driver.

The fastest way to get up and running again is as follows. The ‘imon’ receiver will now be a HID input device. I got the impression it would send keystrokes to a graphical application but I didn’t get that working. Instead, we will use ‘lircd’ again using a special driver called ‘devinput’. This driver reads the key strokes from the input layer device and converts them into LIRC events. All your LIRC capable programs will then use LIRC like they used to – we only need to use the new button names.

To find the iMon receiver in the list of input devices, let's create a udev rule to symlink the device. This way we do not have to rely on obscure device names like ‘input1’. Make sure the device name matches your hardware. I put this rule in a file called /etc/udev/rules.d/99-imon.rules:

KERNEL=="event*", SYSFS{name}=="iMON Remote (15c2:0038)", NAME="input/%k", SYMLINK="input/imon_remote", MODE="0666"

Let's begin by stopping lirc if you still have it running. Next, reconfigure LIRC to use the devinput driver – when it asks which device you have, DO NOT SELECT THE SOUNDGRAPH IMON but ‘Linux input layer (/dev/input/eventX)’. In the next screen select ‘None’, as you probably do not use a transmitter.

/etc/init.d/lirc stop
sudo dpkg-reconfigure lirc
> Linux input layer (/dev/input/eventX)
> None
/etc/init.d/lirc start

In case you are not using Ubuntu Linux, I will explain the key parts of the LIRC configuration. The /etc/lirc/lircd.conf file now has this part in it:

#Configuration for the Linux input layer (/dev/input/eventX) remote:
include "/usr/share/lirc/remotes/devinput/lircd.conf.devinput"

The lircd.conf.devinput file is a standard (generated) file with key codes, as shipped with LIRC. There is no need to manually set up a file with button scan codes, as the new ir-core kernel subsystem generates standard codes.

The file called /etc/lirc/hardware.conf has something like this in it:

REMOTE="Linux input layer (/dev/input/eventX)"

All that remains is to use the new button names in your LIRC-capable programs. For example, this is part of my XBMC configuration (Lircmap.xml) for the SoundGraph iMon receiver:

  <remote device="devinput">
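To give an idea of the shape of such a mapping, here is a minimal Lircmap.xml sketch; the button names are the standard devinput names from lircd.conf.devinput, and which XBMC control each one maps to is my own choice:

```xml
<!-- Lircmap.xml sketch: map devinput button names to XBMC controls -->
<lircmap>
  <remote device="devinput">
    <up>KEY_UP</up>
    <down>KEY_DOWN</down>
    <left>KEY_LEFT</left>
    <right>KEY_RIGHT</right>
    <select>KEY_ENTER</select>
    <back>KEY_ESC</back>
    <menu>KEY_MENU</menu>
  </remote>
</lircmap>
```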

New phonon-vlc backend for KDE 4

According to the Amarok devs, the new VLC-based backend for Phonon delivers better sound quality than the Xine backend. Something like this just begs to be explored, of course; the instructions below are based on the generic ones but are specific to Gentoo. Please note that the backend is alpha: it is incomplete and changes daily.

Install the VLC 1.1 pre-release by installing the live ebuild (note that VLC 1.1 is about to be released any time now, so it might be available by the time you read this). Note that I had to disable the Gentoo patches in the vlc-9999 ebuild as they prevented VLC from compiling.

ACCEPT_KEYWORDS="**" emerge =vlc-9999

Note to self: add the keywords to the /etc/portage/package.keywords file. 😉

Next, fetch the Phonon-VLC backend from Git. Note that it should become ‘stable’ soonish (it said somewhere that it should be moving to kdereview in the near future), so it should be safe to assume it sort of works.

cyberwizzard@cyberxps ~ $ mkdir kde
cyberwizzard@cyberxps ~ $ cd kde
cyberwizzard@cyberxps ~/kde $ mkdir src
cyberwizzard@cyberxps ~/kde $ cd src
cyberwizzard@cyberxps ~/kde/src $ git clone git://

Initialized empty Git repository in /home/cyberwizzard/kde/src/phonon-vlc/.git/
remote: Counting objects: 483, done.
remote: Compressing objects: 100% (400/400), done.
remote: Total 483 (delta 330), reused 116 (delta 66)
Receiving objects: 100% (483/483), 102.22 KiB, done.
Resolving deltas: 100% (330/330), done.

cyberwizzard@cyberxps ~/kde/src $ cd phonon-vlc/
cyberwizzard@cyberxps ~/kde/src/phonon-vlc $ cmake -DCMAKE_BUILD_TYPE=debugfull -DCMAKE_INSTALL_PREFIX=/usr $HOME/kde/src/phonon-vlc/

...configure here...

cyberwizzard@cyberxps ~/kde/src/phonon-vlc $ make

...wait a bit more...

cyberwizzard@cyberxps ~/kde/src/phonon-vlc $ su -c "make install"

[100%] Built target phonon_vlc
Install the project...
-- Install configuration: "debugfull"
-- Installing: /usr/lib/kde4/plugins/phonon_backend/
-- Set runtime path of "/usr/lib/kde4/plugins/phonon_backend/" to "/usr/lib:/usr/lib64/qt4"
-- Installing: /usr/share/kde4/services/phononbackends/vlc.desktop

cyberwizzard@cyberxps ~/kde/src/phonon-vlc $ su -c "kbuildsycoca4 --noincremental"
...password and lot of output here...

And that's it! Just fire up Amarok -> Settings -> Configure Amarok -> Playback -> Configure Phonon (or go through System Settings in KDE) and select the VLC backend (it currently shows up as version 0.2).


Multi-monitor with KDE 4 and XRandR

One of the annoyances I’ve had with *nix so far is the apparent lack of multi-monitor control from the GUI. I have experience with TwinView using xorg.conf, which worked fine but was quite a hassle to set up.

A side effect of this is the fact that once set up, I never touched the configuration again. For a computer with 2 screens hooked up permanently, this is fine. For a laptop which is dragged along and connected to beamers, TVs and monitors – not so much.

The good news is that I run a modern Linux distro, which led me to believe that the multi-monitor stuff should be enabled by default. However, I am not using Kubuntu (which would probably auto-enable all the new toys for me) but Gentoo, so it took some digging to figure out what is going on these days and how to use it. On a side note: you could use nvidia-settings after plugging something in, but I’d rather use the automatic method.

First a word of warning: the binary NVidia drivers do not support RandR 1.2. I have the 195.xx drivers installed and NVidia expressed back in 2007 that RandR 1.2 support was a ‘priority’ feature. One that apparently needed more than 3 years to be released. This means that using the binary ‘nvidia’ driver will give you RandR 1.1 which does not support the on-the-fly hotplug for displays. Instead you are forced to use TwinView, define meta-modes for every possible configuration and switch to those using ‘xrandr’ (or use the nvidia-settings tool each time).

Since this sort of defies having RandR 1.2 support altogether in Xorg, I decided to ditch the binary nvidia driver in favor of ‘nouveau’: the open-source replacement for ‘nv’ with decent 2D acceleration support (although I couldn’t resist and enabled the highly experimental Gallium3D support as well).

To start off: trim down your Xorg configuration to a minimum; note that a recent Xorg is needed (I have 7.4 at the time of writing). This is required to enable auto-detection: manually specifying properties will override auto-detected settings and cripple Xorg’s ability to handle everything on its own. As an example, I have included my own xorg.conf below.

Section "ServerLayout"
  Identifier "Default Layout"
  Screen 0 "Screen 0"
EndSection

Section "ServerFlags"
  Option "AutoAddDevices" "true"
  Option "AutoEnableDevices" "true"
EndSection

Section "Device"
  Identifier "nVidia_8600M_GS_nouveau"
  Driver "nouveau"
  Boardname "GeForce 8600M GS"
EndSection

Section "Screen"
  Identifier "Screen 0"
  Device "nVidia_8600M_GS_nouveau"
  Monitor "Monitor 0"
  SubSection "Display"
    Modes "1680x1050"
  EndSubSection
EndSection

Section "Monitor"
  Identifier "Monitor 0"
  VendorName "Primary Monitor"
  Option "DPMS"
EndSection

Section "Extensions"
  Option "Composite" "Enable"
EndSection

The ‘ServerFlags’ section is pretty simple: use every auto-detection mechanism known to Xorg to hotplug devices. This covers both monitors and input devices like mice and keyboards.

Next up are the 2 sections for my NVidia display card. Since I am using ‘nouveau’ instead of ‘nvidia’, make sure you are not loading ‘glx’ somewhere (if you were using ‘nvidia’ in the past like me, you most likely have that somewhere).

The last 2 sections are fairly simple as well but note that I do specify the native resolution of my LCD screen as the preferred resolution.

That's it! Restart your X server and run xrandr to see the result. I plugged a VGA monitor into my notebook and this is the result:

cyberwizzard@cyberxps ~ $ xrandr
Screen 0: minimum 320 x 200, current 2960 x 1050, maximum 8192 x 8192
LVDS-1 connected 1680x1050+0+0 (normal left inverted right x axis y axis) 331mm x 207mm
   1680x1050      60.0*+   60.0
   1400x1050      60.0
   1280x1024      59.9
   1280x960       59.9
   1152x864       60.0
   1024x768       59.9
   800x600        59.9
   640x480        59.4
   720x400        59.6
   640x400        60.0
   640x350        59.8
HDMI-1 disconnected (normal left inverted right x axis y axis)
VGA-1 connected 1280x1024+1680+0 (normal left inverted right x axis y axis) 312mm x 234mm
   1680x1050      74.9 +   60.0
   1280x1024      85.0*+   75.0     60.0
   1792x1344      60.0
   1920x1200      59.9
   1600x1200      75.0     70.0     65.0     60.0
   1400x1050      85.0     74.9     60.0
   1440x900       84.8     75.0     59.9
   1280x960       85.0     60.0
   1360x768       60.0
   1280x800       84.9     74.9     59.8
   1152x864       75.0
   1280x768       84.8     74.9     59.9
   1024x768      100.0     85.0     75.1     75.0     70.1     60.0     43.5     43.5
   832x624        74.6
   800x600        85.1     72.2     75.0     60.3     56.2
   848x480        60.0
   640x480        85.0     75.0     72.8     72.8     66.7     60.0     59.9
   720x400        85.0     87.8     70.1
   640x400        85.1
   640x350        85.1
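With RandR 1.2 the layout itself is also one command away. A sketch based on the output names in the listing above (shown as a dry run that only prints the command; drop the echo to execute it on the actual machine):

```shell
# Place the external VGA monitor at its preferred mode to the right of
# the laptop panel; output names (VGA-1, LVDS-1) come from 'xrandr'.
cmd="xrandr --output VGA-1 --auto --right-of LVDS-1"
echo "$cmd"
```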

Adding to Nokia N95

So you want to use your Nokia N95 with the popular service to make VoIP calls? You have come to the right place. Note that these instructions will most likely apply to all Symbian S60 phones, like the Nokia E65. Also note that, in the case of the N95, this even works for phones without a SIM card, making it possible to call with those phones as well!

In this article I assume that you already have an account and have set up an extension for your Nokia N95; if not, please look at this guide.

Let's create the SIP profile to connect to:

  1. Create the profile:

    1. Open the menu
    2. Open Tools
    3. Open Settings
    4. Select Connection
    5. Scroll down and select SIP Settings
    6. Click Options
    7. Select New SIP profile
  2. Set up the new profile:

    1. Select Use default profile
    2. Change the profile name to something familiar:
    3. Leave Service Profile set to IETF
    4. Set Default access point to your current Wifi network
    5. Set Public user name to where you substitute myuser for your account name and 800 for the extension number you created for this phone
    6. Leave Use compression set to No
    7. Set Registration to Always on to force the phone to connect to the SIP service when starting up (or leave it if you are not planning on using it all the time)
    8. Leave Use security on No
    9. Leave Proxy server empty
  3. Set up the registrar:

    1. Enter Registrar server
    2. Set Registrar server address to
    3. Set Realm to
    4. Set User name to myuser-800 and substitute like before
    5. Set Password to the password you set on for this extension number
    6. Set Transport type to UDP
    7. Leave Port on 5060
  4. Set up internet calling:

    1. Click Back to return
    2. Click Back to return to SIP Settings
    3. Click Back to return to Connection
    4. Select Internet tel
    5. Click Options
    6. Select New profile
    7. Change the name if desired.
    8. Select the SIP profiles to use, if you only added it will be selected by default.
    9. Click Back to “Internet telephone”
    10. Click Back until you reach the “Tools” menu.
    11. Select Internet tel.
    12. Click Options.
    13. Select Connect to service

The phone will now connect to the Wi-Fi network you specified earlier, and a globe-with-phone icon will appear showing that the registration was successful. When you want to call using the SIP account, use Options and then Internet call to activate SIP mode.

To switch to SIP calling by default do the following:

  1. Open the menu.
  2. Open Tools.
  3. Open Settings.
  4. Select Phone.
  5. Select Call.
  6. Scroll down and set Default call type to Internet call

Next time you call a number, it will use your SIP account automatically! (Note: if the SIP account fails – no Wi-Fi, etc. – the phone will fall back to the normal calling modes.)

Common pitfall: ‘Address not in use’ when you try to call a number (while you can receive calls fine). This happened to me because I did not specify a Default route for the newly added extension; after making the standard route truly global, all was well.


Debugging upload problems on JoomGallery

After installing JoomGallery and writing a migration script to migrate from Menalto Gallery2 (G2), I finally got around to uploading photos. But for some mysterious reason, every attempt to upload images would fail.

The manual upload did not work (blank page), the FTP upload did not work (blank page), the batch upload did not work (blank page, or a session timeout sending me back to the login) and the Java upload did not work either:

wjhk.jupload2.exception.JUploadExceptionUploadFailed: wjhk.jupload2.policies.PictureUploadPolicy.checkUploadSuccess(): The string "JOOMGALLERYUPLOADSUCCESS" was not found in the response body
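In situations like this, a blank page usually means a fatal PHP error with display_errors switched off. Assuming Apache with mod_php and AllowOverride enabled, a first debugging step is to surface the errors through the gallery's .htaccess (remove this again when done):

```
# .htaccess sketch: show PHP errors instead of a blank page
php_flag display_errors on
php_value error_reporting 2147483647
```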