Invalid method in request \x16\x03\x01

So, I ran into this one. Firefox was throwing this error, along with ‘SSL_ERROR_RX_RECORD_TOO_LONG’. (That \x16\x03\x01 is the first three bytes of a TLS handshake record — record type 0x16, version 3.1 — which Apache was trying to parse as an HTTP method.) Turns out Apache was serving plain HTTP on port 443, as it hadn’t been given a default SSL config.

Other causes may be: a corrupted SSL cert (rare), a misconfigured proxy, or forgetting “SSLEngine On” after configuring an SSL cert. But mostly, you’re trying to talk HTTPS to a webserver that’s speaking plain HTTP.

`a2ensite default-ssl` (on Debian) fixed it. Well, ‘fixed’ it in that the default server now has a snake-oil self-signed cert, but, you know, fixed it. 🙂
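For reference, the parts that matter in Debian’s stock default-ssl site are roughly as below (paths recalled from a squeeze-era box, so double-check yours):

```apache
# /etc/apache2/sites-available/default-ssl (abridged)
<IfModule mod_ssl.c>
<VirtualHost _default_:443>
    DocumentRoot /var/www

    # Without this line, Apache answers in plain HTTP on :443,
    # giving exactly the errors above.
    SSLEngine on

    # Debian's self-signed "snake-oil" pair, from the ssl-cert package.
    SSLCertificateFile    /etc/ssl/certs/ssl-cert-snakeoil.pem
    SSLCertificateKeyFile /etc/ssl/private/ssl-cert-snakeoil.key
</VirtualHost>
</IfModule>
```

You’ll still need `a2enmod ssl` and an Apache restart if mod_ssl wasn’t enabled at all.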

Possible missing firmware on Debian Squeeze

If you get these errors:
W: Possible missing firmware /lib/firmware/rtl_nic/rtl8168f-2.fw for module r8169
W: Possible missing firmware /lib/firmware/rtl_nic/rtl8168f-1.fw for module r8169
W: Possible missing firmware /lib/firmware/rtl_nic/rtl8105e-1.fw for module r8169
W: Possible missing firmware /lib/firmware/rtl_nic/rtl8168e-3.fw for module r8169
W: Possible missing firmware /lib/firmware/rtl_nic/rtl8168e-2.fw for module r8169
W: Possible missing firmware /lib/firmware/rtl_nic/rtl8168e-1.fw for module r8169
W: Possible missing firmware /lib/firmware/rtl_nic/rtl8168d-2.fw for module r8169
W: Possible missing firmware /lib/firmware/rtl_nic/rtl8168d-1.fw for module r8169

Do this as root (or using sudo):

# apt-get install firmware-realtek


If you have other firmware issues, n1md4 suggested in the comments installing firmware-linux and firmware-linux-nonfree (thanks!)
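One gotcha: firmware-realtek lives in Debian’s non-free archive area, so sources.list needs non-free enabled first (the mirror URL below is just an example):

```
# /etc/apt/sources.list -- note the trailing "non-free"
deb http://ftp.debian.org/debian squeeze main contrib non-free
```

Then run apt-get update before the install.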

eAccelerator mirror / downloads

eAccelerator is insanely useful in my line of work. However, their main download site is down right now, so I’m mirroring the latest version here:

You can see the files’ sha1sums here:

Alternatively, if you’re scripting (we are), you can use the following to get my (‘up-to-date’) version:

bz2.. because that’s the version we use here 😉
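For the scripted route, here’s a fetch-and-verify sketch; the URL and checksum below are placeholders, so substitute the real mirror link and sha1sum from above:

```shell
#!/bin/sh
# Fetch-and-verify sketch. URL and SUM are placeholders, not the
# real mirror values -- fill those in from the links above.
verify_sha1() {   # verify_sha1 FILE EXPECTED_SUM
    actual=$(sha1sum "$1" | awk '{print $1}')
    [ "$actual" = "$2" ]
}

URL="http://example.com/eaccelerator-0.9.6.1.tar.bz2"    # placeholder
SUM="0000000000000000000000000000000000000000"           # placeholder

# To actually fetch (once URL/SUM are real):
#   wget -qc "$URL" -O eaccelerator.tar.bz2 \
#       && verify_sha1 eaccelerator.tar.bz2 "$SUM" \
#       && echo "download verified"
```

The -c flag lets wget resume a partial download, which matters when you’re scripting retries.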


Linux Beep Music

Just a silly short post about a beep song I was making while waiting for a partition resize to go through.

This should run on pretty much any linux system, just copy and paste 😉

beep -f 1000 -n -f 1500 -n -f 600 -n -f 500 -n -f 100 -r 2 -l 10 -n -f 50 -r 2 -l 200 -n -f 40 -r 2 -l 300 -n -f 60 -r 3 -n -f 50 -r 3

Thanks gparted and System Rescue CD (Linux)

Please continue my little ditty in the comments!

PS: modern computers may need speakers plugged in and switched on to make the magic happen, but shouldn’t need sound drivers.



Edit: check out the followup post here:

The fallacy of bandwidth limits

Currently, according to mainstream media, bandwidth is defined as the quantity of data you download or upload to the internet over a month. So, for example, your ISP will tell you the maximum bandwidth limit is 100GB. Or whatever.

That, however, is not its true definition. Its true definition is:
a data transmission rate; the maximum amount of information (bits/second) that can be transmitted along a channel 1

This is the secret thing about bandwidth. ISPs don’t care about how much you upload to the web over a given period. We care about how fast you upload it.

When you pay for a high-level connection to the internet (the kind you use to connect houses, or web-serving computers), you do not pay by quantity over time. You pay by speed. So, for example, 1 gigabit per second. If you go over that speed for longer than an allowed ‘burst’ period, you pay an overage charge, always assuming your network is even capable of exceeding it.

Think of bandwidth like gas going through a pipe. (Terrible, terrible analogy, I know. But it’s the easiest way to explain.) That gas can only flow so fast, and only so much can be fit in the pipe at any one time. We don’t particularly care if you use 100GB by taking a trickle out of the system at any one time. We do care if you take a torrent.
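To put a number on the trickle: moving a 100GB monthly cap at a perfectly even rate needs surprisingly little speed (decimal units assumed here):

```shell
# Average rate needed to shift 100 GB spread evenly over a 30-day month:
awk 'BEGIN {
    bits    = 100 * 1000 * 1000 * 1000 * 8   # 100 GB in bits (decimal)
    seconds = 30 * 24 * 3600                 # one month of seconds
    printf "%.2f Mbit/s\n", bits / seconds / 1000000
}'
# prints "0.31 Mbit/s"
```

So a customer would have to hold a constant ~0.31 Mbit/s all month to hit 100GB; what actually hurts us is everyone pulling at full line rate at the same moment.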

Realistically though, customers never notice bandwidth. They’re too busy playing with computer-resource-hungry things, like WordPress, to even be able to consume all of their allocated bandwidth. Only very, very rarely do we actually start thinking about bandwidth rather than computing resources. Normally, it’s podcasts. A static file. Almost no server resources required to send it out onto the internet. But it eats bandwidth. Most are ~50-80 megabytes per episode. Get enough people downloading that simultaneously, and we’re going to start noticing…

As long as the current trend continues (the more computing power we have available to serve your shiny websites, the more the people building those shiny websites waste it), the mainstream will never notice this secret.

More often than not, the reason we ask people to upgrade off our shared servers is not that they’ve hit any arbitrary bandwidth limit, although we may use that as a guide to identify them. It’s that they’re using too much CPU time.


Charity shop thievery

I know that someone stole from the charity shop today. Found the remnants of a plastic tag broken by teeth on the floor in the changing room. Thought there was someone doing something suspicious in there earlier, but got distracted by people paying.

Not the first time either; we had a set of known thieves three weeks ago. I think they probably succeeded, as someone found a destroyed tag outside the shop.

I wonder, do they steal from need, from dependence on stealing, or for the excitement, the thrill of the crime?
Having never stolen anything physical, to my knowledge, I don’t know.

Guess this will be one area of curiosity never sated. I just have to keep an eye out for them.

Photo below, Gerald the Giraffe enjoying a glass of coke, having just been rescued from the kidnapping admin team at one of our offices 😉

I hate book study.

I really hate studying from a book. My learning style is much more practical, hands-on. My mind just does not want to read and make notes on this boring technical book, and I can’t keep myself from getting sidetracked.

Case in point: at page 235 of my LPIC-1 textbook (awful, by the way; don’t get LPIC-1 in Depth by Michael Jang, it’s useless, honestly), I decided to browse through my photo archive after pulling that shot out yesterday. And this is what I came up with:

Sunset photo, taken 20th April 2007 in Wales, UK.
Click to see full size image

Right, mind. Back to shell scripting. (While loops.)
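Speaking of while loops, here’s a minimal sketch of the sort of thing the book is on about; it counts down from 5, then prints “liftoff”, one word per line:

```shell
#!/bin/sh
# Minimal while-loop example: count down from 5, then announce liftoff.
i=5
while [ "$i" -gt 0 ]; do
    echo "$i"
    i=$((i - 1))
done
echo "liftoff"
```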

To the Oxfordshire Police Force

I was impressed by the tobogganing policemen, for having a sense of humour and getting involved with the local community. As a UK taxpayer I wholeheartedly endorse this use of my money.

I was not impressed by them being told off.

Have fun (or not, as the case is much more likely to be).


Help: firefox, wget and ssh shell script

I’m trying to create a script to allow me to command a remote server to download a file from firefox.

There are various reasons for this, mainly to do with connection speed.

What I have at the moment is:
terminator -x ssh wget -qc -t 3 -o ~/wget_testlog \\& \& &

I want it to kick off, ask for a password to login via ssh and then go away…
I would like to be able to set the location for the download to ~/www/files/

I was planning to place this script in /usr/bin and install it in firefox using the code/link provided on this blog: Wget from firefox

Can anyone complete my solution with the correct syntax, or provide a better solution (preferably KISS)?
I’m more of a hacker than an expert, IMO, and I know when I’m out of my depth!
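For what it’s worth, here’s a hedged sketch of one way the script could look. The host, destination and log path are placeholders, the terminator wrapper is dropped (run it from any terminal, or re-wrap with `terminator -x` yourself), and it’s untested against a real remote:

```shell
#!/bin/sh
# Hypothetical sketch, not a known-good answer. Host, destination
# and log path are placeholders. Usage: remote-wget URL
REMOTE="user@remote.example.com"   # placeholder host
DEST="www/files"                   # relative to the remote home dir

remote_fetch() {   # remote_fetch URL
    # -t allocates a tty so ssh can prompt for the password; nohup
    # plus the trailing & lets wget keep running after ssh exits.
    ssh -t "$REMOTE" \
        "cd $DEST && nohup wget -qc -t 3 -o ~/wget_testlog '$1' >/dev/null 2>&1 &"
}

if [ $# -gt 0 ]; then
    remote_fetch "$1"
fi
```

Dropped into /usr/bin and hooked up via the firefox integration linked above, this should kick off, ask for the ssh password, and go away, as described.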