WP Super Cache and mod_pagespeed

So I finally got a chance to try mod_pagespeed on this server. I particularly wanted to know if it behaved well with WP Super Cache as I’d read reports that it causes problems.

Unfortunately those problems are real, but I’ve been told that a new release will be out shortly to address a few bugs, so perhaps that will help.

If you’d like to try mod_pagespeed, make sure you disable compression in WP Super Cache and clear the cache first. Even though the docs state that the module always generates uncompressed HTML, it appears to do the opposite. In fact, it even tries to load mod_deflate:

# more pagespeed.load
LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so

# Only attempt to load mod_deflate if it hasn't been loaded already.
<IfModule !mod_deflate.c>
LoadModule deflate_module /usr/lib/apache2/modules/mod_deflate.so
</IfModule>

When things were working, supercached files were processed by mod_pagespeed correctly. I noticed inline JavaScript was modified to remove whitespace, and I presume other changes were made too, but since I already minify things and serve static files off another domain, the changes made to my pages are probably smaller than they would be elsewhere.

The changes made by mod_pagespeed, like minifying inline JavaScript, are not cached by WP Super Cache, so your server has to make them every time a page is served. That’s much like mod_deflate, which doesn’t cache the gzipped page content but compresses the page each time it’s served. Mod_pagespeed does, however, provide its own caching mechanism, so there’s a good chance those changes are cached there. I haven’t looked at the code, so I don’t know for sure.
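If you’re curious whether those rewrites are being cached to disk, the module’s file cache is worth a look. The paths below are the defaults from the Debian/Ubuntu package as far as I can tell, so treat them as assumptions:

# Where is the file cache configured? (path to pagespeed.conf is an assumption)
grep -i ModPagespeedFileCachePath /etc/apache2/mods-available/pagespeed.conf

# How much has it cached so far? (default cache location, again an assumption)
du -sh /var/cache/mod_pagespeed/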

I did have problems with dynamic pages. A simple phpinfo() page quite often refused to load, and backend requests sometimes became stuck. Load on the server skyrocketed occasionally, usually when the module’s cache directory was emptied.

For now I’ve turned mod_pagespeed off, but that might change as this is a young project and it’s maturing fast! I’ll update this post when that happens.
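If you want to do the same on a Debian/Ubuntu-style Apache install, it’s just a matter of disabling the module and restarting Apache (the module name matches the pagespeed.load file shown above):

# Disable mod_pagespeed and restart Apache
sudo a2dismod pagespeed
sudo /etc/init.d/apache2 restart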

I’m slowly making my way through all th…

I’m slowly making my way through all the video editing software in Linux. I’ve tried Cinelerra, LiVES, Kdenlive, OpenShot and a few others, but for one reason or another they’re all lacking. Mainly bugs, unfortunately.

Of the ones I tried I liked Kdenlive and OpenShot, and I was able to do some decent multi-track editing of a video. Kdenlive crashed quite a few times and I don’t think I ever produced a finished video with it. OpenShot did really well, but then I triggered an odd bug where some clips in the video simply froze: the audio continued but the image displayed was the first frame. I went searching and found a discussion of the very problem. It seems slightly glitchy 1080×720 video will trigger it.

Keep searching …

My phone is faster than yours

If you’re using a Samsung Galaxy S or one of its variants then my phone may well be twice as fast as yours, or even faster! How? It’s all rather simple actually.

First of all, I downloaded Quadrant Standard from the Android market. This is a benchmarking app that you can use to find out how fast your phone is. Run a benchmark and note the performance figure for your phone. Now, go look for “One Click Lag Fix” in the market and install that too.


This little app will root your phone and create a new ext2 partition on it. The default Galaxy S filesystem isn’t that hot at running apps. The new partition is used to store cache data, and because ext2 is supposedly better at caching, your apps will load faster and you’ll experience less or no lag when opening them. That was my experience with it, anyway. It makes a significant difference to the phone’s performance.

In recent updates to OCLF two new options were added, “Alter Minfree” and “Change Scheduler”. Adjusting these will make a huge difference to your phone. Each one is explained briefly in the app, with a recommended setting. I followed that advice and it’s like my phone is on steroids now! Apps open faster than ever, and I’m just waiting for it to dance a jig, it’s so fast and responsive.

Please be aware that running OCLF means rooting your phone and invalidating your warranty. You may brick your phone, which means it won’t work any more and can’t be fixed. That more than likely won’t happen, and I haven’t read of it happening, but you should be aware of the risks involved.

Bonus tip: If you’re running Linux on your desktop computer, the scheduler can be changed on that too. Must give that a go some time.
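For anyone who wants to try it on their desktop, the Linux I/O scheduler is exposed through sysfs, so you can check and change it per disk without rebooting. The sda device below is just an example:

# Show the available I/O schedulers for the first disk; the active one is in [brackets]
cat /sys/block/sda/queue/scheduler

# Switch to the deadline scheduler for this session (reverts on reboot)
echo deadline | sudo tee /sys/block/sda/queue/scheduler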

Who's abusing your website?

I wanted to know what IP addresses were hitting my website. I’d done this before, and it only took a moment or two to recreate the following command. Still, here it is for future reference.

grep -v "wp-content" access.log|grep -v wp-includes|cut -f 1 -d " "|sort|uniq -c|sort -nr|less

This code:

  • Excludes “wp-content” and “wp-includes” requests.
  • Uses “cut” to cut out the IP address.
  • Sorts the list of IP addresses.
  • Uses “uniq” to count the occurrences of each IP.
  • And finally reverse sorts the list again, by number of occurrences, with the largest number at the top.

You’ll probably find Google and Yahoo! bots near the top of the list, but I also found the “Jyxobot/1” bot was quite busy today.
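If one address or bot stands out, a similar pipeline shows what it was actually requesting. This assumes the default combined log format, and the IP address below is only a placeholder:

# Replace 192.0.2.1 with the suspicious address from the list above
grep "^192.0.2.1 " access.log | cut -f 7 -d " " | sort | uniq -c | sort -nr | less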

Admin 101: Postfix smtp limits

I’ve just moved all my sites onto a new install of Ubuntu on one of my VPSes. This site and In Photos are now on the same server again and the VPS has finally calmed down.

Between configuring Apache (turning off keepalives and reducing the number of child processes), installing xcache and the WordPress object cache and configuring them, and configuring MySQL, I totally forgot about Postfix.

I did install Postgrey of course, but when Blacknight switched the ocaoimh.ie web and mail traffic to this server things started to go screwy. The load average shot up. I thought it was Apache and spent quite some time playing with the number of child processes, all to no avail. I didn’t immediately notice the large number of smtp processes when I ran a “ps auxw” because I was looking at Apache.

What was happening was a Rumpelstiltskin attack on my server. Rogue bots all over the Internet try to send spam to mail servers using randomly generated addresses in the hope of guessing a correct one. It happens all the time, and I had configured Postfix for it in the past, but I forgot this time.
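If you suspect the same thing is happening on your server, a quick count of the running smtpd processes makes it obvious (the square brackets in the pattern just stop grep from matching its own process):

# Count the smtpd processes currently running
ps auxw | grep -c '[s]mtpd'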

So, if your server is suffering under the strain of too many Postfix smtp processes, open up /etc/postfix/master.cf and look for the smtp line:

smtp      inet  n       -       -       -       -       smtpd

Change the last dash (the maxproc column) to a number; try a small one first, depending on how much mail traffic your server gets. I changed mine to 3, restarted Postfix, and the server is humming along nicely now. Postfix was actually using up more resources than Apache during those attacks! It’s unfortunate that Ubuntu (and probably every other Linux distro) allows so many smtp processes by default.
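In other words, the line ends up looking something like this (3 is just the value that suited my traffic, so pick your own):

smtp      inet  n       -       -       -       3       smtpd

Then restart Postfix with “sudo /etc/init.d/postfix restart” and watch the number of smtp processes drop.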

Oh, I’m hosted at Linode. Yes that’s an affiliate link, but I’ve been using them for years and been very happy with them.

Ubuntu Linux and the Canon MP492 printer

I bought a shiny new combination scanner/printer/copier last weekend. It’s the Canon MP492 and I expected it would work just fine in Linux. I mean, it’s just a printer, right?

Nope. The printer was detected as a Canon MP490, but unfortunately the CUPS system used by Ubuntu Linux didn’t support that particular model. It supported the 520 and others, but there was no sign of my new purchase. To be honest, I was dumbfounded. I even configured it on my MacBook and thought about sharing it over the network, but the MacBook is on the wireless network while everything else is wired, and I didn’t feel like making things more complicated.

So I went searching again and eventually found this helpful thread (it didn’t show up on my first searches, Google refresh?). Drivers are available for the printer here on the Canon Thailand website. Thankfully the instructions were all in English, and the .deb package installed and configured correctly. Drivers for the scanner are listed on that forum post too but I don’t have an immediate need for that so I didn’t test them out.
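For anyone doing the same, installing the driver package is the usual dpkg routine. The filename below is made up; use whichever .deb the Canon site gives you:

# Install the downloaded driver package (filename is illustrative only)
sudo dpkg -i cnijfilter-mp490series_*.deb

# Pull in any dependencies it complains about
sudo apt-get -f install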

Phew.

The printer itself is an average photocopying machine that does the job. Here’s the back cover of the December 1954 issue of the National Geographic. The copy is pretty good except for banding on big blocks of colour like that in the Coca Cola logo.

I must admit, after reading this review …

I must admit, after reading this review (and this one) I want a GP2X Wiz.


Quake 1 running on the GP2X Wiz (lots more)

You can buy the unit at Play Asia where it’s described as:

GP2X is going in the next round with a completely reworked new gadget, the GP2X Wiz!

Powered by a 533Mhz 3D flash engine, the GP2X Wiz is the long-awaited update to the beloved Linux based handheld, offering updated multi-media features, a long serving rechargeable battery, a touch screen and a new sleek design.

Play games, read e-books, see videos and play music files. Let the GP2X Wiz be your all-round multimedia partner for all situations.

‘Course my only problem is that I work at home; when I go anywhere I’m driving, and when I get there it’s usually with family, so I’ll either have a better computer nearby or won’t have an opportunity to play it. Might suit my son Adam as a portable gaming platform?

Fill and span DVD archives with Discspan

I have a huge archive of photos. I shoot tens of thousands of photos every year. Storage requirements for all those photos were bad enough when I shot in JPEG, but then I switched to RAW and space usage jumped! Here’s what the last 3 years look like:

169GB of data is a lot of stuff to store. Originally I had it all duplicated on two external drives, but then I bought a 500GB internal drive for my laptop for speedier access. Unfortunately that drive simply wasn’t big enough. I need to convert some of my RAW files to JPEG to save space, but to preserve the original RAW files I want to archive them somewhere permanent first. I have a DVD writer, so that was the obvious choice.

Burning data to lots of DVDs is tiresome. You can use tar, zip or another archiver to split the data, but then you have to run through all the DVDs to pick out a file to restore. I like having the files directly accessible, but that means endlessly selecting files, making sure they’re as close to the DVD size as possible, burning them, and moving on to the next bunch. In the bad old DOS days I had a program that would fill floppy disks if you pointed it at a directory, but I’ve spent years searching for a similar Linux script. Last week I found one.

Enter Discspan. My 2007 archive was already burned to DVD, and I wish I’d had this script while doing it. I burned my 2008 archive with Discspan and it was a doddle: point it at the right directory, feed it some details about the DVD drive, and let it go. 26 DVDs later, my 2008 archive is safe on disc!

The script scans the directory, figures out how many DVDs are required, and fills each one with data, spanning my digital archive over multiple discs.
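To give an idea of what that involves, here’s a minimal sketch of the same greedy fill-and-span idea. This is not Discspan itself; the capacity figure, the /dev/dvd device and the growisofs options are all assumptions:

#!/bin/bash
# Minimal fill-and-span sketch (not Discspan): group files into
# DVD-sized batches and burn each batch with growisofs.
SRC="$1"                          # directory to archive
CAPACITY=$((4400 * 1024 * 1024))  # stay under the ~4.38GiB capacity of a 4.7GB DVD

batch=()
used=0
disc=1

burn_batch() {
    echo "Insert a blank DVD for disc $disc, wait for it to be detected, then press Enter"
    read -r _ < /dev/tty
    growisofs -Z /dev/dvd -r -J -V "archive-$disc" "${batch[@]}"
    disc=$((disc + 1))
    batch=()
    used=0
}

# Walk the files largest-first so each disc fills up fairly evenly
while read -r size file; do
    if [ ${#batch[@]} -gt 0 ] && [ $((used + size)) -gt $CAPACITY ]; then
        burn_batch
    fi
    batch+=("$file")
    used=$((used + size))
done < <(find "$SRC" -type f -printf '%s %p\n' | sort -rn)

# Burn whatever is left over
[ ${#batch[@]} -gt 0 ] && burn_batch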

Be aware when using it that you should let Linux detect the next blank DVD before pressing return. The first time I ran it, the script bombed out when growisofs didn’t see any media to write to. You also need to patch it because it doesn’t detect the right size for DVD+Rs, but it’s a simple one-liner.

Another Linux project, Brasero, promises to span discs too, but it didn’t work for me. It’s the default CD/DVD burner in Ubuntu now, and it’s a shame this functionality is broken in it.

Hopefully Brasero will be fixed for the next release. I’d offer to help but my C/C++ is very rusty.