Category Archives: Internet

bind refuses to restart, debian squeeze

After an upgrade, I’ve noticed a few times that bind has refused to restart or reload, saying:

Stopping domain name service: named
rndc: connect failed: connection refused

This seems to be a permissions bug in Debian, and quite a long-standing one. To cheat-fix it quickly, I do the following:

chown bind:root /etc/bind/rndc.key
chmod 660 /etc/bind/rndc.key
/etc/init.d/bind9 restart
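A quick way to check the change took (a sketch; your size and date will differ):

ls -l /etc/bind/rndc.key
# should show something like: -rw-rw---- 1 bind root ... /etc/bind/rndc.key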

That seems to fix it well enough. I think the problem is that bind starts as one user but runs as another. It may be that 440 is all the permissions that are necessary. The Debian bug report is here:

Magento Session Files

Magento (the popular open-source online shop system) likes to store its PHP session files in ~/public_html/var/session/

Most Debian servers don't cover that directory in the cron job that deletes old session files.

So, you probably want to set it to store its session files in the default location (/var/lib/php5), or alter your cron job (/etc/cron.d/php5).
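If you go the cron route, something along these lines would cover Magento's directory. This is just a sketch, modelled on Debian's php5 cron job: the path, the sess_ filename prefix and the 24-hour lifetime are assumptions you should adapt to your own setup.

# hypothetical /etc/cron.d/magento-sessions: delete Magento session files
# older than 24 hours (1440 minutes)
09,39 * * * *   root   find /home/*/public_html/var/session/ -type f -name 'sess_*' -cmin +1440 -delete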


Eaccelerator mirror / downloads

eAccelerator is insanely useful in my line of work. However, their main downloads are down right now, so I'm mirroring the latest version here:

You can see the files' sha1sums here:

Alternatively, if you’re scripting (we are), you can use the following to get my (‘up-to-date’) version:
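As a sketch of the idea (the URL and filename here are placeholders, not my actual mirror link):

wget http://example.com/mirror/eaccelerator.tar.bz2
# check it against the published sha1sums before trusting it:
sha1sum eaccelerator.tar.bz2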

.bz2, because that's the version we use here 😉


Web 3.0

Web 3.0 is coming soon…

IMHO the Web 3.0 revolution will consist of websites and web apps from the 2.0 era growing closer together.
I think it will become easier to link content across websites to create new forms of content.

The Web 2.0 revolution was helped along by blogs, with authors linking together information in posts. (This, I might add, has been very useful in combating the slew of dodgy sites that sit high in Google's results but just spit back the search terms as results, nullifying your search. Nowadays I find myself including 'blog' in search terms, especially when looking for reviews.)

I can't wait until someone puts together a really good way of visualizing all this data. As the internet grows, the ability to sift through the available data and collate it into collections on particular topics is becoming paramount.

I have been looking out for a system to visualize my internet links in some kind of subject-oriented way with a timeline / time axis. So far the only thing that comes close is Basket Notes for KDE (screenshots). If only that were a web app! (If I had the motivation and focus, I'd turn my meagre PHP programming skills to the task myself, but, like my sketched design for a social networking site, written in my design book before the advent of Facebook, I think I'll leave it to someone else!)

I guess the closest similar web-based system (that I'm aware of) currently in operation is Wikipedia!

Look at the useful plugin Ubiquity, and the fantastically useful cross-platform application and search launcher Launchy, for example. Both of these are designed to give us quicker access to, and better search over, our data.

Making computers integrate seamlessly into our lives rather than interrupting them.
Today the focus of computing is shifting from individual applications to the workflow: how we get things done. I think this is essential, because your average end user doesn't care how things get done, just as long as they can get done.

Digital photographers often use a prescribed workflow when working on digital photos, 'developing' them as it were to bring out the best. PC Pro magazine suggests levels and curves first, then colour adjustment, followed by sharpening. But I'm talking about more than just the best sequence of events to achieve the best quality output. I'm talking about the process itself.

Our brains think sequentially: each action is broken down step by step, and the steps are performed one after another. A break in our concentration, or 'flow', impacts our effectiveness. This is especially true for people with ADHD (like me), which is why reducing the need for context switching matters so much.

“Consider that it takes 15 minutes for a developer to enter a state of flow.  If you were to interrupt a developer to ask a question and it takes five minutes for them to answer, it will take a further 15 minutes for them to regain that state of flow, resulting in a 20 minute loss of productivity. Clearly, if a developer is prevented from flowing several times during the day their work rate declines substantially. “

(Retrieved from

For example, downloading pictures from your digital camera and uploading them to Facebook. Recently I've been using 'Windows Live Photo Gallery'. Ugh, I know, but the point is that Vista offered it to me, and there was an easy-to-find plugin that lets me upload directly to Facebook, where most of my photos end up these days.

To download the pictures I simply pop the SD card out of my camera and insert it into my laptop's SD card slot (useful laptop buying advice).

And that's the point: people will take the path of least resistance/effort.

The path of least effort principle
According to my observations, everybody operates on the principle of least effort. Like people walking down the high street striving to avoid collisions with other pedestrians: the person you are approaching will take the path that requires the least diversion from their original course in order to avoid a collision, while you attempt to do the same.

How does this come back to Web 3.0?

How many clicks does it take, while searching for some long-forgotten but relevant piece of information, before a user will get bored and move on? [research: advertising, Google hotspots, number of clicks] Could it be as low as 3, and as high as 8?

Unified User Interface
Take Facebook, for example. I was trying to find my note on laptops to include a link in this article, but alas, clicking Notes from the home page only brought up a 'feed' of notes. Where, I ask, are the filter options that preside on everyone's profiles? Why can't I select 'Just Garreth' here too?

If something like that is useful, it should also be unified; that is, available everywhere!

In the time it took me to discover the 'workflow' for accessing my notes, my poor overloaded ADHD brain (video: ADHD's impact on life), in this fast/bitesize/information-obsessed age, might easily have become bored, frustrated and, more importantly, distracted, and moved on…

Cloud computing and Rich Web Applications (Blog: Google and Rich Web Application)

Organisation of Data

It's an inverse law: as our attention spans decrease, the conciseness of the data we consume must increase, ceteris paribus.

Why do my spidey senses tell me Facebook, not Google, may be the winner of the Web 3.0 revolution?

  1. Reduce the need for context switching.
  2. Make data transfer between devices, programs and operating systems simpler and more unified.
  3. Make data easier to locate and retrieve.
  4. Make locating an open program/context switching easier and more natural, in doing so reducing the impact on flow by automatically knowing how to get back to the other program and where it is.
  5. Design and create more natural interfaces, e.g. Apple's iPhone and iPod Touch.
  6. Consider how context switching works in our heads and apply this to UI design.
  7. Work on unified user interfaces so as not to interrupt flow.

What do you think? Leave some comments with your vision, and what you think of my ideas.

Windows Command Line Ping Replacement

So the Windows version of ping is really stupid.

I was writing a batch script to mount up a network share that involved checking to ensure my NAS unit was turned on. The script is scheduled to run after the computer resumes.

What I found out is that the built-in version of ping.exe is terrible at telling you whether the ping has returned successfully or not. I was checking the ERRORLEVEL (%ERRORLEVEL%) variable to find out what ping was returning. It should be 0 for success and 1 or higher for a failure.
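For reference, here's a minimal sketch of the kind of check my script does; the NAS address, share name and drive letter are placeholders, not my actual setup:

@echo off
rem ping the NAS once; errorlevel should be 0 on success, 1 or higher on failure
ping -n 1 >nul
if errorlevel 1 (
    echo NAS appears to be offline, not mounting.
) else (
    rem the NAS replied, so mount the share
    net use Z: \\\backup
)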

What I found was, I was getting replies from the local PC (dunno why, leave me a comment if you know), and ping was reporting a success even though the correct PC had failed to reply. The solution?
Replace the Windows ping.exe with Fping. It has a lot more options and appears, from some initial quick tests, to correctly report the errorlevel.

Kudos to Wouter Dhondt for developing it. I’ll update this post with any more news!


Fping vs ping errorlevel return values

PC Gamer Rips off Rock Paper Shotgun

Back in June of this year, PC Gamer launched a new website. The design appears to be a rip-off of the one used by Rock Paper Shotgun. With all the images that follow, click through for a larger version.

But let's roll back, shall we? Rock Paper Shotgun launched in September 2007, though their first post goes back to July 2007. They were a novel PC gaming blog, trying to do something different in the gaming scene. They concentrated on PC games and only PC games, with running jokes. They have a small enough set of writers that you can pick up the personality of each. (Kieron takes the weird ones; VERY NSFW: example.)

Back in 2007, the domain redirected to a sub-site of a larger one. Since then, they haven't altered the design at all. Now, it redirects to the new site. Looking at the two reveals this:


As an ex-web-developer, it looks to me like someone decided they quite liked the RPS-style website and said 'make me a website like that, but in this style', then tweaked the mock-ups (and site designs) a few times until what they had looked remarkably like what we see now.

That said, of course, this is quite a standard design style. It comes about quite easily when you use WordPress as your back-end engine, as this blog does, and as RPS does. However, they've not just used the WordPress site layout as a base; they've decided to publish all of their posts in the same sort of format as RPS, with the same aim of getting discussions going around their posts via the commenting.

A little birdie 1 tells me that someone at Future (the company behind PC Gamer) really might hate Rock Paper Shotgun, and would rather they disappeared. It's almost as if they've finally decided to fight this sphere of influence with money and lots of people; finally decided that maybe their website is worth working on and taking care of.

What annoys me, is that the big guy is trying to kill the little guy 🙁

Here are a whole load of screenshots, to save you finding them. Some are from the Wayback Machine, some are from the website directly.

The old website, up till June. This image was recovered with a lot of hard work by webpigeon (thanks!), since PC Gamer used some really horrible website coding, which broke the Wayback Machine copy. This has to be one of the ugliest websites I've seen, though not the worst. You could switch the big image, and below it was a list of recent stories.

How the old PC Gamer site looked until June.

And, if you scroll down a bit..:

The old PC Gamer site in the Wayback Machine, scrolled down.

They seem to be trying to throw links at you, lots and lots and lots of them, in a really small space. Check it out for yourself.

Rock Paper Shotgun’s footer:

PC Gamer’s footer:

OOo… don't they look similar? Apart from the 'we must keep up with the cool kids' twitter panels and lots and lots of post links (which RPS doesn't force on you, or puts in the right-hand panel). This mess could also be due to Search Engine Optimization, that dark art in which you try to trick search engines into putting you higher up their listings than your arch rivals.

Now, I work for the company that keeps RPS online. I like the guys that work there, I think they do a good job, especially considering they’re not getting paid much from it.

Also interesting is the fact that PC Gamer seems to have thrown money at this venture. I work with some high-load WordPress-powered sites, and there are some very obvious things you do to make them work fast. Very fast. PC Gamer isn't doing at least one of the most obvious, which suggests that instead they've thrown cash at keeping it online, with a cluster of computers working on it. Don't know how a website works? Find out here 2

  1. Source, not related to RPS
  2. All images are Fair Use under the DMCA.

The fallacy of bandwidth limits

Currently, according to mainstream media, bandwidth is defined as the quantity of data you download from or upload to the internet over a month. So, for example, your ISP will tell you the maximum bandwidth limit is 100GB. Or whatever.

That, however, is not its true definition. Its true definition is:
a data transmission rate; the maximum amount of information (bits/second) that can be transmitted along a channel 1

This is the secret thing about bandwidth. ISPs don’t care about how much you upload to the web over a given period. We care about how fast you upload it.

When you pay for a high-level connection to the internet, the kind you use to connect houses or web-serving computers to, you do not pay in quantity over time. You pay in speed. So, for example, 1 gigabit per second. If you go over that speed for longer than an allowed 'burst' period, you pay an overage charge, always assuming that your network is even capable of going over that speed.

Think of bandwidth like gas going through a pipe. (Terrible, terrible analogy, I know, but it's the easiest way to explain.) The gas can only flow so fast, and only so much can fit in the pipe at any one time. We don't particularly care if you use 100GB by taking a trickle out of the system. We do care if you take a torrent.
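To put rough numbers on 'trickle', here's a back-of-the-envelope sketch using the 100GB figure from earlier:

# 100 GB spread evenly over a 30-day month, as a sustained rate in Mbit/s:
echo "scale=3; 100 * 10^9 * 8 / (30*24*3600) / 10^6" | bc
# prints .308 - roughly a third of a megabit per second, a genuine trickle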

Realistically though, customers never notice bandwidth. They're too busy playing with computer-resource-hungry things, like WordPress, to be able to consume all of their allocated bandwidth. Only very, very rarely do we actually start thinking about bandwidth rather than computing resources. Normally it's podcasts. A static file, needing almost no server resources to send out onto the internet, but it eats bandwidth. Most episodes are ~50-80 megabytes. Get enough people downloading that simultaneously, and we're going to start noticing…
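Again with rough numbers (the episode size and listener count here are made up for illustration):

# 1000 listeners each grabbing a 70 MB episode within the same ten minutes:
echo "1000 * 70 * 10^6 * 8 / 600 / 10^6" | bc
# prints 933 - enough Mbit/s to saturate a gigabit uplink on its own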

As long as the current trend continues (the more computing power we have available to serve your shiny websites, the more the people creating those websites waste), the mainstream will never notice this secret.

More often than not, the reason we ask people to upgrade off our shared servers is not that they've reached some arbitrary bandwidth limit, although we may use that as a guide to identify them. It's that they're using too much CPU time.


Easy on the eyes

Just a quick post here…

Recently my eyes have been a little strained from using the computer. I think it probably has something to do with my misplacing my reading glasses somewhere at university. Hopefully I'll find them before my Mum finds out and goes nuts lol.

Anyway, to reduce browser-related eye strain, I found a handy script for Greasemonkey (in Firefox) that kinda inverts the webpage, making it less white and a bit easier to read (higher contrast). It's not perfect, but it's a handy hack until I can do some more hunting for my glasses!

Anyway, enough text; here are the links:

Invert web page colours (lifehacker)

Direct link to Greasemonkey script

Options are customisable, so you can restrict the websites it works on…

Oh, and here’s a screenshot:


Prevent Adobe Acrobat Crashing Firefox

I'm using Adobe Acrobat (for compatibility's sake only; please post your favourite PDF program in the comments below!), but I've been rather annoyed recently by its tendency to hang Firefox if I try to open more than one PDF file from the internet.

Simple fix/hack: make Firefox save PDF files rather than open them.

  1. Open Options (Tools \ Options in Windows and Edit \ Preferences in Linux)
  2. Open the Applications tab
  3. Under 'Adobe Acrobat Document', change the value of the dropdown to 'Save file'

The Firefox Applications options tab. Vista, I know!

  4. OK the change
  5. All done. Hopefully that's one less annoying crash to worry about!

PS: get Session Manager to save yourself losing a window full of tabs, or having to do a horribly manual procedure like recovering tabs from an accidentally closed Firefox window.

Denial of Service attacks

I don't like them. But at least some of them I can work around using bash and logfiles; awk, grep, tail, cut and netstat are handy tools 🙂
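For example, a classic one-liner along those lines (a sketch; the column positions assume Linux netstat output):

# count connections per remote IP, busiest first
netstat -ntu | tail -n +3 | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn | head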

That’s all for today, I’m afraid. Not much else to talk about 🙁

Hopefully I’ll be able to produce some proper posts next week!