Archive for the ‘informatik’ Category

Increase avatar size in bbpress

I needed to increase the image dimensions for the avatars in bbpress. There is no direct filter that lets you change the size for a specific avatar location; from the forums, I gather that you are supposed to edit the template. I managed to make it work with the filter system instead. The filters get called for every avatar location, so I base my heuristic on the requested avatar size. The default „small“ avatar size is 14px and nearly useless; these are displayed inline. The faces next to the postings are 80px by default; I increased them to 110px. Bigger images would likely break the formatting.

function my_bbp_change_avatar_size( $author_avatar, $topic_id, $size ) {
    $author_avatar = '';
    if ( $size == 14 ) {
        $size = 24;
    }
    if ( $size == 80 ) {
        $size = 110;
    }
    $topic_id = bbp_get_topic_id( $topic_id );
    if ( ! empty( $topic_id ) ) {
        if ( ! bbp_is_topic_anonymous( $topic_id ) ) {
            $author_avatar = get_avatar( bbp_get_topic_author_id( $topic_id ), $size );
        } else {
            $author_avatar = get_avatar( get_post_meta( $topic_id, '_bbp_anonymous_email', true ), $size );
        }
    }
    return $author_avatar;
}

/* Add priority (default=10) and number of arguments */
add_filter('bbp_get_topic_author_avatar', 'my_bbp_change_avatar_size', 20, 3);
add_filter('bbp_get_reply_author_avatar', 'my_bbp_change_avatar_size', 20, 3);
add_filter('bbp_get_current_user_avatar', 'my_bbp_change_avatar_size', 20, 3);

Note that we hook up three different filters. You can play around with different avatar locations this way, but I couldn’t be bothered to find out the details. The first one is for some inline avatars (likely for the topic author only), the second one probably for inline avatars for reply authors. The last one probably is for the avatars displayed next to full postings.

We also need some CSS:

/* We increased the tiny avatar size, so adjust the position */
#bbpress-forums p.bbp-topic-meta img.avatar, #bbpress-forums ul.bbp-reply-revision-log img.avatar, #bbpress-forums ul.bbp-topic-revision-log img.avatar, #bbpress-forums div.bbp-template-notice img.avatar, #bbpress-forums .widget_display_topics img.avatar, #bbpress-forums .widget_display_replies img.avatar {
    margin-bottom: -2px;
}

/* Increase max-width for the big avatars */
#bbpress-forums div.bbp-forum-author img.avatar, #bbpress-forums div.bbp-topic-author img.avatar, #bbpress-forums div.bbp-reply-author img.avatar {
    /* margin: 12px auto 0; */
    max-width: 110px;
}

There is a filter available that will simply change the default avatar size, but that is only available for the user detail page – e.g. the profile. The code in BBPress looks like this:

echo get_avatar( bbp_get_displayed_user_field( 'user_email', 'raw' ),
                 apply_filters( 'bbp_single_user_details_avatar_size', 150 ) );

I wish the authors had included this for every avatar location.
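If the profile avatar is all you want to change, hooking that filter is a one-liner. An untested sketch (the 200px value is my choice, not bbPress's):

```php
// Bump the profile-page avatar from the default 150px to 200px.
// Assumes bbPress is active; only affects the single-user details view.
add_filter( 'bbp_single_user_details_avatar_size', function ( $size ) {
    return 200;
} );
```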

Bonus – Better Editor

By default, current BBPress releases only include the plain HTML editor toolkit. Very useful. To get TinyMCE back, install bbpress-enable-tinymce-visual-tab and enable TinyMCE in Settings->Forums. With the twentythirteen theme, the mode-switching tabs (text<->visual) look weird. I also found the image insertion dialog horribly broken. Some CSS fixes that:

/* Fix tab switch in TinyMCE BBpress */
.wp-editor-tabs .wp-switch-editor {
    padding-top: 0px;
}

/* Hide image embedding widget in TinyMCE BBpress.
 * It's just horribly broken. */
#bbp_topic_content_image {
    display: none;
}
Kategorien:informatik, oss

Web2py, JQuery Mobile and Caching

So I’ve been playing around with JQuery Mobile to build a nice mobile version of my web2py app. Once I had it running, I wanted to make the website load a bit faster, because waiting about five seconds is way too much! Time to install the Google Pagespeed extension into my trusty Firefox, set the UA string to something iPhone-ish and look at the recommendations!


The Pagespeed extension complained about my files not being compressed. Some googling turned up the necessary .htaccess magic to enable mod_deflate for HTML, CSS and JS files. The transfer size went down from 448kB to 124kB! That’s an insane improvement. Still, there was some work left. Speaking from memory, this took page load time from about 5s down to 3.4s.
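I no longer have the exact lines I used, but the .htaccess magic looks roughly like this (the MIME type list is my assumption):

```apache
# Compress text resources on the fly with mod_deflate
# (requires mod_deflate to be enabled in the Apache config)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript text/javascript
</IfModule>
```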

Proper Caching

Another complaint by the pagespeed extension was the caching: I needed to set the expires header! By setting the timestamp in the expires header, you tell the browser not to worry for a while and just fetch the file from cache. If you don’t set that header (or „cache-control“, alternatively), the browser will perform a conditional request, asking the server „hey, did that file change since last time I downloaded it?“. This is not a lot of overhead, but still, setting the „expires“ header can save us quite a few network requests.
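In .htaccess terms, setting the expires header boils down to something like this via mod_expires (the durations here are my choice, not a recommendation):

```apache
# Far-future expires headers for static assets via mod_expires
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
    ExpiresByType image/png "access plus 1 month"
</IfModule>
```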

Modifying the header via .htaccess did not work because web2py was already setting it in gluon/ Apparently, if there is a timestamp included in the file name, the expires header is set to a date in the far future. This technique is called URL fingerprinting. I decided not to care and just hardcoded the expires header for all static files. This will eventually come back to bite me. The result for this tweak: page load time went down from 3s to 1.85s with a warm cache! The 3s case was with a warm cache and conditional requests – each request took 500ms to execute and not all of them were parallel. Page load time with cold cache was 3.4s!

To Do

All requests are currently handled via the CGI interface. Letting Apache handle static files directly would likely increase performance and magically make the expires header work without having to hack web2py.
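A hypothetical fragment for that (application name and filesystem paths are placeholders; access-control directives depend on the Apache version and are omitted):

```apache
# Serve web2py's static files directly from Apache, bypassing the CGI handler
Alias /myapp/static/ /home/www-data/web2py/applications/myapp/static/
```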

The real WTF

JQuery: 91.4kB, JQuery Mobile: 141.1kB, JQuery Mobile CSS: 92.4kB. That’s about 300kB of code. I have not done any benchmarking yet to determine parsing and execution time, but it would seem that 300kB of code is a bit excessive for a puny mobile device.


Kategorien:informatik, oss

FHEM 5.3 on OpenWRT with WOL

I’ve finally installed the FHT80b thermostat in the bathroom. While I was fiddling with FHEM, I figured I wanted to use WOL for my media box. This required an update to FHEM 5.3.

On the OpenWRT box, some additional steps are required. My FHEM instance does not run with root privileges, so I’m also setting some +s bits.

# opkg install etherwake
# ln -s /usr/bin/etherwake /usr/bin/ether-wake
# chmod +s /usr/bin/etherwake

Ping is also provided by busybox, but I did not want to +s the whole busybox binary, so I installed a dedicated ping binary and removed the symlink from /bin/ping to /bin/busybox.

# opkg install iputils-ping
# chmod +s /usr/bin/ping
# rm /bin/ping
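With the binaries in place, the FHEM side is a single define. A sketch (MAC and IP are placeholders; check the FHEM commandref for the WOL module's exact options):

```
# fhem.cfg — the WOL module shells out to ether-wake, hence the symlink above
define mediabox WOL 01:23:45:67:89:ab 192.168.1.23
```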

Kategorien:informatik, oss, wohnung

MythTV, VDPAU, Studio Levels and the Philips 42PFL7406

I haven’t touched my calibration settings in a while. I finally got sufficiently bored again, so here we go. The basic problem is getting adequate black levels throughout the pipeline; several knobs can be tweaked for that. I have now settled on the following: the VGA card is configured to output limited RGB range (16-235), the VDPAU studio-levels filter is off and the PC mode on the TV set is off. In theory this causes some quality degradation as (my theory) the video decoder expands the RGB range while the nvidia driver re-compresses it, but meh. Settings on the TV set are now: brightness 64, contrast 80, color 50. On the APL clipping test video, I can’t get any bars beyond 18 or 233 to show up, which is not perfect but usable. I might have to revisit this in the future…

As for color: red is way over-saturated in „warm“ mode, clipping way too early. I have been unable to adjust this on the TV set itself. The graphics driver itself provides better adjustment knobs with the nvidia-settings utility. I set brightness for the red channel to -0.15; the result is nicely differentiated shades of red in the color clipping pattern. Be aware, though, that you need to set the VDPAU_NVIDIA_NO_OVERLAY=1 environment variable for the settings to work in VDPAU playback.

edit: Turns out I get tearing with overlay disabled. I’m a sad Panda and using XV for video rendering again. (As a side note, I apparently had limited RGB output enabled the whole time in my Xorg.conf; not sure if it stuck as nvidia-settings was not picking it up)

Kategorien:informatik, leben, wohnung

Free disk space monitoring: munin & MythTV

Sometimes, the root file system on my MythTV box gets full. This causes all kinds of fail, including the inability to watch TV.

I wanted to set up some monitoring. I already have munin set up, so why not let munin tell me when things get critical? Regarding the general notification setup, we have two nice documents: one by Jason and the official munin documentation, which also explains how to fire scripts instead of emails. Go read both.
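For the thresholds themselves, something like this in munin.conf should do (hostname, address and field name are placeholders; the df plugin derives its field names from the device, so check your own graph):

```
# /etc/munin/munin.conf — warn at 80% usage, go critical at 90%
[mythtv.home]
    address 192.168.1.10
    df._dev_root.warning  80
    df._dev_root.critical 90
```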

Two possible ways of providing notifications come to mind:  via the OSD on the TV or via email.

As for the OSD, I’d use the MythMessage interface, which gives me a nice remote-controllable popup. Unfortunately, I ran into bug 10815 while setting up the notifications via mythutil --message. As an alternative, we can use MythMessage via the remote control interface like this:

echo "message low disk space on / " | nc localhost 6546

There is also the possibility to invoke MythMessage over the Services interface, but I have not tried that.

With the whole OSD thing, there is also the question whether you really want a pop-up on the TV while you’re cuddled up with the girlfriend. That’s why I might prefer email notifications, but I do not have an MTA set up on the MythTV box, so there’s that. I’m in a bit of a regeneration cycle right now, so I will just go back to the couch and come back to this post the next time the disk fills up.

Kategorien:informatik, wohnung

RTF: Fail and you

As with any programming gig, you are bound to encounter some interesting fails.

One of our customers is providing RTF files generated by a third party. To do the silly little things we computational linguists tend to do – a bit of information extraction here, some sentiment analysis there – I convert the RTF to HTML using rtf2html and process it with BeautifulSoup.

But, of course, the umlauts are broken: any umlaut in the generated HTML is preceded by an @ sign in my editor, which turns out to be 0 – a null byte. This caused some pre-emptive facepalms because I knew I was in for some pain.

Looking at the relevant section in the RTF document, I see the following:

\uc2 M\u228\'00\'E4rz

This is supposed to be „März“. I found the download for the 1.9.1 RTF spec, where you can choose between .doc and .docx. Microsoft, this is bad and you should feel bad.

The spec tells me that RTF initially was not Unicode-aware. When Unicode support was introduced in a later revision, it was decided that any character outside ASCII should be represented by its Unicode code point preceded by \u. This Unicode escape is followed by a fallback representation of the character in the document’s declared codepage. In our case, the fallback consists of two characters: \'00 and \'E4. The \uc2 keyword tells Unicode-aware RTF readers to skip the 2 characters after each Unicode escape because they’re obviously redundant.

The Unicode representation here is \u228, which correctly points to „ä“. RTF readers that are not Unicode-aware simply ignore the unknown \u keyword and use the regular \'xx representations in the declared codepage.
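To convince myself I had read the spec right, here is a toy decoder for just this escape mechanism – a sketch, not a full RTF parser (function name and structure are mine):

```python
import re

def decode_rtf_unicode(rtf, skip=1):
    """Toy sketch: expand \\uN escapes and drop the legacy fallback
    characters that follow them, honouring the \\ucN keyword."""
    out = []
    i = 0
    while i < len(rtf):
        m = re.match(r"\\uc(\d+) ?", rtf[i:])
        if m:  # \ucN sets how many fallback chars follow each \uN
            skip = int(m.group(1))
            i += m.end()
            continue
        m = re.match(r"\\u(-?\d+) ?", rtf[i:])
        if m:  # \uN carries the character's (signed 16-bit) code point
            out.append(chr(int(m.group(1)) % 65536))
            i += m.end()
            for _ in range(skip):  # skip the fallback, e.g. \'E4
                fb = re.match(r"\\'[0-9a-fA-F]{2}", rtf[i:])
                i += fb.end() if fb else 1
            continue
        out.append(rtf[i])
        i += 1
    return "".join(out)

print(decode_rtf_unicode(r"\uc2 M\u228\'00\'E4rz"))  # → März
```

Running it on the snippet from the document yields the expected „März“, broken two-byte fallback and all.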

So we have two fails here:

  • rtf2html does not support Unicode. This is no big deal per se, as the RTF spec is backwards compatible in this regard, but being a big fan of Unicode, I will consider it a minor fail. If rtf2html supported Unicode, the broken document would be displayed properly.
  • the alternate representation of the Unicode character is broken. The declared codepage is cp1252, so why would you ever use a two-byte replacement? Interestingly enough, \’E4 is indeed the correct cp1252 code for „ä“.

Specs are hard. Let’s go shopping. Do people actually test their software before they deploy it? I’m either going to patch rtf2html to ignore null bytes (which will probably break other documents in equally hilarious ways) or add proper Unicode support, which should not be too hard.
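The quick-and-dirty variant of the first option – stripping the stray null bytes before parsing – would look something like this (the sample bytes mirror the broken output described above):

```python
# rtf2html output for our sample, with the stray null byte before the umlaut
broken = b"M\x00\xe4rz"
# Drop the bogus \x00 and decode in the document's declared codepage (cp1252)
fixed = broken.replace(b"\x00", b"").decode("cp1252")
print(fixed)  # → März
```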

Kategorien:informatik, oss

SSDs, ATA trim and long sync() times

Fun times – I just enabled ATA TRIM on my laptop, which has been running an OCZ Vertex 2 for a while now. I stumbled across a blog entry by Patrick Nagel on the impact of the ‚discard‘ option (which essentially enables ATA trim) on delete speed, or rather on sync() speed following file deletions. It turns out that discard support may cause significant overhead. I managed to replicate his findings with my SSD, but I am not yet sure about the real-world implications there. Head over to his blog for the full discussion.