This year Google have made a $20 million donation to charity, rather than sending goodies to publishers and ad partners. Some cynics have claimed that this is less about doing good for charities and more about reducing their tax liability, but what the heck - at least the dosh is going to good causes.
Google have provided a list of the charities which are to benefit from Google's $20 million Christmas present, and most of them are lesser-known charities. One name, however, stood out (at least for those of us in the UK) and that is "Loud Against Nazis."
They appear to be a German charity who are trying to tackle "the increasing right-wing radicalism" which, apparently "is a problem in everyday German life."
If you are looking for more information on the charity, a Google search brings up a rather cool looking skin-head egg-cup, but not much else.
This is because the charity is actually called "Laut Gegen Nazis" (it is, after all, a German charity).
More info at their home page.
Merry Christmas!
Thursday, 24 December 2009
Wednesday, 30 September 2009
RetroVaders 1.34 - coming soon
After receiving a bug report that RetroVaders crashes on Windows 7 (and finding that it also seems to do the same on XP Service Pack 3) I've delved back into the code and fixed a couple of other minor issues.
The new version will be available once Virgin Media sort out write access into my damn FTP site.
Friday, 11 September 2009
A week without Broadband
Broadband Internet access is something that I tend to take for granted these days, so when it is taken away it comes as a bit of a shock to the system.
What triggered this tale of woe was when I contacted Virgin Media to try and get the date of my direct debit changed. This is the second attempt at this as the first one resulted in them changing the billing period on my bills, but the direct debit date remaining the same.
This has (hopefully) been changed now, but while I was on the phone I was offered a free upgrade to the 20 meg service (well, free for 12 months). I'm actually quite happy on the 10 meg service, but what the Hell, it's free so I'll have it!
Time passed and my broadband speed didn't increase. I phoned again and got them to check, yes, I'm on the 20 meg service, but I was still only getting 10 meg speed. "I'll give it 24 hours, reset the modem and see if the speed increases," I thought.
That very afternoon something happened to my modem. All connectivity went and the only lights showing on the box were the power, ethernet and a rapidly flashing "sync" light.
Following the instructions on the modem I powered it down, waited 30 seconds and brought it back online... Still no sync.
Tried again, waiting three minutes this time... Nope.
As there was the chance that Virgin might have been experiencing problems I left it overnight and tried again... and still it didn't work. At this point I called the technical support line, who arranged for a technician to come out on Friday - after, of course, they talked me through turning off the modem, waiting a couple of minutes and turning it back on. Three days with no broadband access! Dear God NO!!!
The technician arrived at the appointed time, swapped the modem and left. Total time taken: 3 minutes. It took a couple of attempts to get the sign-up working, but after that - full speed internet! YAY!
However, there was a "hidden problem", but I'll post about that later.
Monday, 7 September 2009
Mono gets useful
Regular readers will know that I am not a fan of restrictive formats, and especially DRM. The reasons against them are too numerous to go into right now, but the main one is that they tend to prevent Linux (and often Mac) users from accessing content.
A case in point is the ITV catch up / watch live TV service. Available to Linux users? Nope, because it uses Microsoft's Silverlight. Yes, there are third party "solutions" to this, but let's face it, nothing beats being able to just go to the site and view the videos as originally intended.
Although Moonlight - the Open Source re-implementation of Silverlight - has been around for quite a while now (generating controversy wherever it goes) it hasn't supported enough of the Silverlight 2 standard to allow the multimedia content to work on the ITV website.
Until now.
Yes, if you go to the Moonlight site and download the latest beta plugin then that will allow you to play the video (you may need to right-click on the video and allow it to install the Microsoft codecs first).
It is still a bit of a CPU hog - even when compared to the likes of Flash Player on Linux - but it does actually work! How much of a CPU hog? Well, even on a 2.8GHz P4 it will still take up most of your CPU time when trying to play windowed video. Full screen? Forget it.
Still, as a first step it is good to have it working, and at least now I can watch Coronation Street without leaving Linux.
Erm.
Maybe I'd better just uninstall it after all!
Wednesday, 26 August 2009
Vista Probs - Slow Internet Speed
I had a phone call from my younger sister today - her fiancé's PC was running strangely. Sometimes the internet was fine, other times it ground to a halt. They'd checked for spyware and viruses - but couldn't find anything wrong. A bit of research pointed at Vista's TCP Receive Window Auto-Tuning feature, which is known to cause exactly this sort of intermittent slowdown with some routers. Here's how to check and fix it.
Whilst logged in as an administrator, open a command prompt (or use the "Run as administrator" option on the command prompt shortcut).
Type the following command:
netsh interface tcp show global
This should return a fair bit of information, but the line we are interested in is the one that starts:
Receive Window Auto-Tuning Level
If this shows anything other than "disabled" then you can try entering this:
netsh interface tcp set global autotuninglevel=disabled
This should return "OK".
At this point we tried browsing the web - and lo and behold it was browsing at a more normal (i.e. fast) speed.
If you want to set it back to the way it was, you can try this:
netsh interface tcp set global autotuninglevel=normal
A quick fix, but one that is worth remembering.
Chaos Caverns II - Dev Diary - Part 2
Updated 28/08/2009
As the original post is getting a bit on the long side, I'll be continuing the Chaos Caverns 2 dev diary here.
I've been taking a couple of days off to think through what direction I want to take the game in. There are still a couple of things that I'd like to add in, conveyor belts for example, but nothing that is likely to stop me from finishing the game.
I've been toying with the idea of releasing a demo of the game so far, but to be honest I'd rather release a finished (and more importantly tweaked and polished) game this time around. With that in mind I'll be putting out a call for volunteers to beta-test the game once the game is a bit further along. Interested? Then feel free to get in touch either via the comments on this blog post, the development thread on Retro Remakes or by email at dans.remakes(AT)googlemail.com.
Right, enough gabbing for now - back to coding!
28/08/2009
One of the problems with being less than artistically gifted is that doing the graphics for a game takes a lot longer than you might expect. Trying to make animated sprites takes even longer.
Rather than trying to create the sprites from scratch I'm (mainly) shrinking and editing images from photographs. This is giving the game a much more "unique" look and certainly makes up for my lack of artistic talent.
Saturday, 22 August 2009
Sky+ problems: Freezing picture
Sky+ is one of the most useful gizmos that I've ever bought. It really has changed the way that my family and I watch TV. Being able to pause and rewind live TV and the ease of recording stuff is just fantastic.
I've had relatively few problems with the box. Last year I had some severe picture problems, but a slight adjustment of the angle of the receiver dish solved that. Recently, however, our Sky+ box has started locking up whilst viewing live TV. Rewinding a few seconds seemed to resolve it. Until today, that is, when even recorded programs were locking up, followed by the Sky+ box shutting itself down.
These days if I've got the choice between phoning customer support or checking on Google, customer support comes second. One thing that is obvious is that an awful lot of people are reporting the same problem.
Now, I'm assuming if you've read this far you are interested in how to fix it, yes?
Captain Obvious says: "Kids, if you follow instructions posted on websites and things go wrong it is YOUR PROBLEM, not theirs."
Step 1: Power off the Sky+ box at the mains, and leave it powered down for at least two minutes.
Step 2: Restore the power to the box and start up. If your box is like mine it may take a few minutes to start up.
Step 3: Press the "Services" button on your Sky+ remote and go into the "System Setup" menu.
Step 4: Press "0" on the Sky+ remote, then press "1" and finally press "Select". This calls up the "Super Secret Hidden Menu" (seriously, it does!).
Step 5: At this point there are two options you can use. One will remove everything from your system and wipe the internal hard drive, the other performs a disk check, fixes file-system problems and hopefully gets you back to a working state.
So, if this is the first time you've tried this, you will want to highlight the option "Sky+ Planner Rebuild" and then press "Select". This should run the maintenance routines on your box, which will power itself off once it has completed.
Once you go back into Sky+ you will need to check through your list of recorded programs as any recovered but corrupted programs will need deleting. It is also worth noting that some people report problems when their Sky+ box is over 90% full, so deleting some unwanted programs may not be a bad idea either.
If this doesn't work then you can try the final option which will wipe EVERYTHING from the planner, and that is "System Reset."
Luckily for me deleting some old programs and performing a planner rebuild seems to have fixed my Sky+ box, so I've not needed to nuke the system. Apparently this is the routine that Sky customer service will talk you through, but seeing as I didn't phone them I can't confirm that.
So if you are having the same issues as I did (and assuming you've taken note of the sage advice of Captain Obvious) then this might be worth trying.
Saturday, 1 August 2009
Chaos Caverns II - Dev Diary
Updated 21/08/2009
A couple of months ago there was a discussion on Retro Remakes relating to creating platform games. The discussion grew to the point that I got a bit carried away and added an early (very early in fact) version of the source code to Chaos Caverns, as well as writing some pseudo-code to demonstrate how simple it is to make a basic platform game.
The thing is that once I'd written the pseudo code I found myself wondering if it would actually work, and the only way to test this is to bloody well write it. And so, using my pseudo code as a template I wrote a basic platform game engine. It works too. In fact in some respects it works better than the code for the original Chaos Caverns.
So now I'm in a bit of a quandary as I'd already been planning an update of Chaos Caverns and was in the middle of fixing up the old code.
So on the one hand is the original Chaos Caverns, which has already been expanded to allow me to create a flip-screen arcade adventure in a similar style to Jet Set Willy, and which pretty much only needs the levels completing to turn it into a finished game.
On the other is the new code sitting in the Testplatform2 folder. This isn't quite as complete as the other but has some important improvements:
1 - It has been upscaled to 640x480. This is double the resolution of the original and does look a lot nicer.
2 - I've written the new version using Object Oriented methodology, rather than the spaghetti code of the first one. It has taken longer trying to remove things like the level transitions from the old one than it has to write the new one from scratch. Also I've got the benefit of a couple of years experience compared to when I started writing Chaos Caverns.
3 - Even on the short test level I've created it feels rather nice to play.
4 - It should allow me to create a much better game, more in a 16-bit retro style than the 8-bit style of the first one.
Point 4 is the decider really. Even though I could knock out a quick game using the old code base I wouldn't be as happy with it as I think I could be with the new game engine. So I'm re-implementing the functions that are still missing, which is a slightly frustrating exercise, but the new features already in place (such as animated tiles, and each tile being available in multiple states) make up for this.
So here we are at the birth of what I'm currently calling "Chaos Caverns II".
I'll be updating this post with the development diary as time goes on, so stay tuned!
2/8/2009
Level switching code has been added to the game, so now I can start building the game proper. I've put together three screens as a test and it all works rather well. At this point it is starting to feel a bit more like a proper game.
Next step: Items and enemies.
12/8/2009
Summer time isn't the most productive time for programming, but I have managed to track down and fix some bugs, including an extremely annoying one which caused the player to fall through the floor, but only when exiting the screen to the left. Exiting from the right worked fine. I finally managed to fix it by re-arranging the movement code to make sure that the move down / platform checking code ran before the left / right movement and exit code.
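The ordering fix described above can be sketched in code. This is a hypothetical reconstruction in Python (the actual game is not written in Python, and every name here is invented): the point is simply that the gravity / platform check must run before the horizontal movement and screen-exit check.

```python
# Hypothetical reconstruction of the ordering fix (illustrative names,
# not the real Chaos Caverns II source). Running the gravity / platform
# check before the horizontal move and exit check means the player has
# settled relative to the CURRENT screen before we decide whether they
# have left it.

SCREEN_W = 640

class Player:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.falling = False

    def apply_gravity(self, platforms):
        # platforms: list of (x, y, width); land if one is underfoot.
        for (px, py, pw) in platforms:
            if px <= self.x < px + pw and self.y == py:
                self.falling = False
                return
        self.falling = True
        self.y += 4

    def move_and_check_exit(self, dx):
        self.x += dx
        if self.x < 0:
            self.x += SCREEN_W          # wrap onto the previous screen
            return "exit_left"
        if self.x >= SCREEN_W:
            self.x -= SCREEN_W          # wrap onto the next screen
            return "exit_right"
        return None

    def update(self, dx, platforms):
        # The fix: gravity / platform check FIRST, then movement + exit.
        self.apply_gravity(platforms)
        return self.move_and_check_exit(dx)
```

With the old ordering (exit first, gravity second) the gravity check ran against the platforms of the screen the player had just left, which is exactly the kind of thing that produces a "falls through the floor, but only in one direction" bug.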
I've also spent some time tidying the code up a tad, and have started to add in the code for the enemies.
I'm dealing with the enemies in a different way to how I did in Chaos Caverns. This time all the enemies from all the levels are read into a single array when the game initialises - rather than refreshing the list at each level change. This allows me to "kill" enemies and for them to stay "dead" between screen changes.
I'll be dealing with objects in the same way. Some objects will allow you to destroy enemies, so for example, collecting a pin will allow you to destroy the giant beach balls. More on that later...
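A rough sketch of that "load everything once" approach, in Python with invented names (the real data layout will differ): every enemy from every screen goes into one list at start-up, and a kill just clears a flag, so the enemy stays dead between screen changes.

```python
# Sketch of the "one big enemy list" approach (hypothetical names).
# Enemies are loaded once at game initialisation rather than per
# screen, so their dead/alive state naturally persists.

def load_all_enemies(level_data):
    """level_data maps screen number -> list of (x, y, kind) tuples."""
    enemies = []
    for screen, entries in level_data.items():
        for (x, y, kind) in entries:
            enemies.append({"screen": screen, "x": x, "y": y,
                            "kind": kind, "alive": True})
    return enemies

def enemies_on_screen(enemies, screen):
    # Only living enemies on the current screen get updated and drawn.
    return [e for e in enemies if e["screen"] == screen and e["alive"]]

def kill(enemy):
    enemy["alive"] = False      # persists for the rest of the game
```

Objects would work the same way: one master list, with a "collected" flag instead of an "alive" one.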
To make the game feel a bit more like a game and less like a demo I've added a place-holder title screen.
OK, so that's a long way from brilliant, but at least it makes testing the game a whole lot easier in that once you've lost all your lives you return to the title screen rather than just quitting or jumping straight back to the first screen.
I've also added a music track for the title screen - again this will probably change before the game is finished - but for now I'm using Mozart's Rondo Alla Turca (thank God for classical music in the public domain).
As a side note - if any budding musicians are reading this and feel like donating a music track or two then please get in touch.
As of midnight I've got a moving enemy on screen (at last!) and can move onto the enemy collision detection routines tomorrow. *yawn*
13/8/2009
Oh no, Thursday the Thirteenth!
I'm not superstitious about the number thirteen, but today hasn't been a great day, work-wise.
At lunchtime we lost access to the internet, email, our advertising system AND our page transmission system - while we were half-way through transmitting a newspaper that had a 1pm deadline. It was down for a couple of hours due to an alleged power-cut in Oxford.
So we had three hours of dealing with complaints about something that we've got no control over.
Wonderful.
Follow that up with me being on call, and getting called back in twice (although as luck would have it I was able to deal with both calls remotely).
On the bright side I've got collisions with the first (and currently only) enemy in the game working. So that's nice.
15/8/2009
Motivation is often an issue when writing software for yourself. When things get tricky, I try and add "one feature a day". This at least gives some impression of the game developing rather than just lurking there waiting for me to complete it.
OK, so sometimes you miss a day - but today's feature is one that makes the game a bit more tricky - and that is allowing the player to die if they fall from too great a height. Not a difficult thing to add but something that I had deliberately left out of the original Chaos Caverns.
This time, however, if the level designs that I have in mind are going to work then this is a feature that I need - otherwise there won't really be any challenge.
Features remaining to be added (and this isn't a full list):
Other things to do:
16/08/2009
Today's new feature: Lifts. In getting the lifts working I've had a think about how I'm coding the game, and more importantly, how to reduce the number of routines that I need whilst achieving the features that I require.
The lifts are a case in point. I'd considered making a new game object for the lifts, but then realised the obvious: "a lift is just a platform that moves." So by adding some new parameters to the platform object and creating a "move" method I've got a basic lift working with very little new code required.
I've still got to code the interactions between the player and the moving platform (i.e. if the platform is moving left or right then the player's movement follows it). Easy for left to right, slightly more complex for up and down. Especially as I need to find a way of slowing down the lifts a bit, as by default they are a bit nippier than I'd like.
I was also going to add another object for "traps" - basically killer objects. Thinking about it, what is a killer object? In the terms of this game it is an enemy that doesn't move, so if I just use the existing enemy routines with a static enemy then the job is done!
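The "a lift is just a platform that moves" idea can be sketched like this (Python, with hypothetical names - a static platform simply leaves its speed at zero and never changes):

```python
# Sketch of reusing one platform object for both static platforms and
# lifts (illustrative names only). A lift is a platform with a non-zero
# speed and travel limits; move() bounces it between those limits.

class Platform:
    def __init__(self, x, y, width, dy=0, min_y=None, max_y=None):
        self.x, self.y, self.width = x, y, width
        self.dy = dy                        # 0 for a static platform
        self.min_y, self.max_y = min_y, max_y

    def move(self):
        if self.dy == 0:
            return                          # static platform: nothing to do
        self.y += self.dy
        # Reverse direction at the ends of the lift's travel.
        if self.y <= self.min_y or self.y >= self.max_y:
            self.dy = -self.dy
```

The same "reuse what exists" logic gives the traps for free: a trap is an enemy whose speed happens to be zero, handled by the existing enemy collision routines.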
18/08/2009
Still working on the lifts. As they were running a bit faster than I'd like I've added in some code to create a delay between each move - so the lift will only move every x number of frames. This works well and allows me to have fast and slow lifts as required.
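The frame-delay trick might look something like this sketch (Python, invented names): the lift only moves once every `delay` frames, so a larger delay gives a slower lift without touching the per-move distance.

```python
# Sketch of slowing a lift down by only moving it every `delay` frames
# (hypothetical names). update() is called once per game frame.

class SlowLift:
    def __init__(self, y, dy, delay):
        self.y, self.dy = y, dy
        self.delay = delay       # move once every `delay` frames
        self.counter = 0

    def update(self):
        self.counter += 1
        if self.counter >= self.delay:
            self.counter = 0
            self.y += self.dy    # the actual move, at reduced rate
```

Fast and slow lifts then just become different `delay` values in the level data.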
Moving and jumping whilst on the lift works but... sometimes whilst jumping you fall through the lift, and I can't see why. Ah well, tracking down and squashing bugs is all part of the fun.
I've also been re-jigging the player movement code a bit so it should be a wee bit more optimised, as well as a bit smaller. For those interested, at present on a 2.8GHz P4 the game uses around 4% of the CPU time.
Update: Bug fixed, so the lifts are now fully working (vertically at least). I've adapted the second screen so that it can only be traversed by using the lift - and it all seems to work nicely so far.
19/08/2009
Now that the lifts are working it is time to start expanding on the code to make this less of a demo and more of a game. With that in mind I've started adding a fairly basic score / level / lives display at the bottom of the screen.
The strange thing is that this revealed another bug: if you dropped from a height into water you'd lose two lives instead of just one.
After one hell of a lot of hunting through the code I finally realised that I was forgetting to reset the fall height, so when the water code killed you the fall counter still contained a value, and because you re-spawned on a platform it treated it as if you'd landed after a long fall, which kills you a second time.
An easy one to fix, but a bugger to track down.
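For illustration, here's a minimal Python reconstruction of the bug and its fix (all names invented): without resetting the fall counter on death, respawning on a platform registers as landing after a huge fall and costs a second life.

```python
# Sketch of the double-death bug described above (hypothetical names).
# The fix is the single line that zeroes fall_height on death.

FATAL_FALL = 48     # falls higher than this kill the player

class PlayerLives:
    def __init__(self):
        self.lives = 3
        self.fall_height = 0

    def die_and_respawn(self):
        self.lives -= 1
        self.fall_height = 0     # THE FIX: forget the fall on death

    def land(self):
        # Called when the player touches a platform, including the
        # respawn point. A stale fall_height here kills them again.
        if self.fall_height > FATAL_FALL:
            self.die_and_respawn()
        self.fall_height = 0
```

Without the reset line, dying in water mid-fall followed by the respawn landing would subtract two lives instead of one.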
20/08/2009
Today I've been expanding the code a touch (extra baddie movement types, horizontally moving platforms and some other bits).
I've also put together a short video showing the game in action. Not a lot to see so far, but you can get the general idea of the progress so far.
Tomorrow I'm planning to add the code for collectible objects into the game. Once this has been done then the game engine will be basically complete and it will be time to start adding enemies and designing some proper levels.
21/08/2009
Found and eliminated another bug - one that allowed you to pass through a solid block if you hit it precisely on the top corner whilst falling.
I've amended the second screen to add in a horizontally moving platform - which worked first time without problems. Mind you, once you've got the code working for the vertical lifts there really isn't a lot you need to change to get it working for horizontal ones.
I've also posted about this game on the Retro Remakes forum - so a big hello to anyone visiting from there.
Update: Collectible items are now working. Quite simple to add really, providing you remember to use methods rather than functions for your reading / drawing code. Why? Because functions can't access an object's fields (the variables that the object uses). This is a "it's too late at night for this shit" type of bug.
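For anyone unfamiliar with the distinction: in languages like BlitzMax (which, I'm assuming, is the sort of language in play here), a Function declared inside a Type has no access to the instance's fields, while a Method does. The nearest Python analogue is a `@staticmethod`, which receives no `self` - this sketch (hypothetical names) shows the same failure mode:

```python
# Illustrative Python analogue of the method-vs-function gotcha
# (invented names). A normal method receives `self`, so it can read
# the object's fields; a @staticmethod does not, so any field access
# blows up at runtime.

class Collectible:
    def __init__(self, x, y):
        self.x, self.y = x, y

    def draw(self):
        # Method: has self, so the fields are available.
        return ("draw", self.x, self.y)

    @staticmethod
    def draw_static():
        # "Function": no self in scope, so self.x raises NameError.
        try:
            return ("draw", self.x, self.y)
        except NameError:
            return None
```

The symptom is much the same in any language: the drawing code compiles (or loads) fine and then silently fails to see the object's data.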
22/08/2009
Today has been mainly spent "enjoying" a bout of either man-flu or hayfever. Either one gives the same result - my IQ has dropped so far that it now has a negative value, my nose and eyes are taking it in turns to drip, different parts of me are competing in the "most painful" competition (I think that my back is winning so far). Not surprisingly I've not been especially productive on the old programming front.
That said, I've started expanding the baddie routines a bit, so the beach-ball enemy now rotates as it bounces (which looks cool), I've added a balloon as another enemy, which is also semi-transparent. This also looks pretty cool in action.
I've also started tidying up some of the routines, and simplifying the code in a couple of places.
Continued in Part 2
A couple of months ago there was a discussion on Retro Remakes relating to creating platform games. The discussion grew to the point that I got a bit carried away and added an early (very early in fact) version of the source code to Chaos Caverns, as well as writing some pseudo-code to demonstrate how simple it is to make a basic platform game.
The thing is that once I'd written the pseudo code I found myself wondering if it would actually work, and the only way to test this is to bloody well write it. And so, using my pseudo code as a template I wrote a basic platform game engine. It works too. In fact in some respects it works better than the code for the original Chaos Caverns.
So now I'm in a bit of a quandry as I'd already been planning an update of Chaos Caverns and was in the middle of fixing up the old code.
So on the one hand is the original Chaos Caverns, which has already been expanded to allow me to create a flip-screen arcade adventure in a similar style to Jet Set Willy and pretty much only needs the levels completing to make it into a completed game.
On the other is the new code sitting in the Testplatform2 folder. This isn't quite as complete as the other but has some important improvements:
1 - It has been upscaled to 640x480. This is double the resolution of the original and does look a lot nicer.
2 - I've written the new version using Object Oriented methodology, rather than the spaghetti code of the first one. It has taken longer trying to remove things like the level transitions from the old one than it has to write the new one from scratch. Also I've got the benefit of a couple of years experience compared to when I started writing Chaos Caverns.
3 - Even on the short test level I've created it feels rather nice to play.
4 - It should allow me to create a much better game, more in a 16bit retro style than the 8bit style of the first one.
Point 4 is the decider really. Even though I could knock out a quick game using the old code base I wouldn't be as happy with it as I think I could be with the new game engine. So I'm re-implementing the functions that are still missing, which is a slightly frustrating exercise, but the new features (such as animated tiles, and each tile being available in multiple states) already there makes up for this.
So here we are at the birth of what I'm currently calling "Chaos Caverns II".
I'll be updating this post with the development diary as time goes on, so stay tuned!
2/8/2009
Level switching code has been added to the game, so now I can start building the game proper. I've put together three screens as a test and it all works rather well. At this point it is starting to feel a bit more like a proper game.
Next step: Items and enemies.
12/8/2009
Summer time isn't the most productive time for programming, but have managed to track down and fix some bugs, including an extremely annoying one which caused the player to fall through the floor, but only when existing the screen to the left. Exiting from the right worked fine. Finally managed to fix it by re-arranging the movement code to make sure that the move down / platform checking code ran before the left / right movement and exit code.
I've also spent some time tidying the code up a tad, and starting to add in the code for the enemies.
I'm dealing with the enemies in a different way to how I did in Chaos Caverns. This time all the enemies from all the levels are read into a single array when the game initialises - rather than refreshing the list at each level change. This allows me to "kill" enemies and for them to stay "dead" between screen changes.
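The load-everything-once idea above can be sketched like this (a hypothetical Python illustration; the names and data layout are assumptions). Because there is one master list built at start-up, flipping an enemy's `alive` flag persists across screen changes:

```python
class Enemy:
    def __init__(self, level, x, y):
        self.level, self.x, self.y = level, x, y
        self.alive = True

def load_all_enemies(level_data):
    """Called ONCE at game init, not on every level change.
    level_data: one list of (x, y) spawn points per level."""
    return [Enemy(lvl, x, y)
            for lvl, spawns in enumerate(level_data)
            for (x, y) in spawns]

def enemies_on_level(enemies, level):
    """Filter the master list for the current screen; dead enemies
    stay dead because the master list is never rebuilt."""
    return [e for e in enemies if e.level == level and e.alive]
```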
I'll be dealing with objects in the same way. Some objects will allow you to destroy enemies, so for example, collecting a pin will allow you to destroy the giant beach balls. More on that later...
To make the game feel a bit more like a game and less like a demo I've added a place-holder title screen.
OK, so that's a long way from brilliant, but at least it makes testing the game a whole lot easier in that once you've lost all your lives you return to the title screen rather than just quitting or jumping straight back to the first screen.
I've also added a music track for the title screen - again this will probably change before the game is finished - but for now I'm using Mozart's Rondo Alla Turca (thank God for classical music in the public domain).
As a side note - if any budding musicians are reading this and feel like donating a music track or two then please get in touch.
As of midnight I've got a moving enemy on screen (at last!) and can move onto the enemy collision detection routines tomorrow. *yawn*
13/8/2009
Oh no, Thursday the Thirteenth!
I'm not superstitious about the number thirteen, but today hasn't been a great day, work-wise.
At lunchtime we lost access to the internet, email, our advertising system AND our page transmission system - while we were half-way through transmitting a newspaper that had a 1pm deadline. It was down for a couple of hours due to an alleged power-cut in Oxford.
So we had three hours of dealing with complaints about something that we've got no control over.
Wonderful.
Follow that up with me being on call, and getting called back in twice (although as luck would have it I was able to deal with both calls remotely).
On the bright side I've got collisions with the first (and currently only) enemy in the game working. So that's nice.
15/8/2009
Motivation is often an issue when writing software for yourself. When things get tricky, I try and add "one feature a day". This at least gives some impression of the game developing rather than just lurking there waiting for me to complete it.
OK, so sometimes you miss a day - but today's feature is one that makes the game a bit more tricky - and that is allowing the player to die if they fall from too great a height. Not a difficult thing to add but something that I had deliberately left out of the original Chaos Caverns.
This time, however, if the level designs that I have in mind are going to work then this is a feature that I need - otherwise there won't really be any challenge.
Features remaining to be added (and this isn't a full list):
- Collectible items
- Killer items
- Special items
- Lifts
- Conveyor belts
Other things to do:
- Add more levels
- Add more enemies
- Add more graphics and sounds
- Write a proper front-end to the game
- Add an end sequence
16/08/2009
Today's new feature: Lifts. In getting the lifts working I've had a think about how I'm coding the game, and more importantly, how to reduce the number of routines that I need whilst achieving the features that I require.
The lifts are a case in point. I'd considered making a new game object for the lifts, but then realised the obvious: "a lift is just a platform that moves." So by adding some new parameters to the platform object and creating a "move" method I've got a basic lift working with very little new code required.
I've still got to code the interactions between the player and the moving platform (ie. if the platform is moving left or right then the player's movement follows it). Easy for left to right, slightly more complex for up and down. Especially when I need to find a way of slowing down the lifts a bit as by default they are a bit nippier than I'd like.
I was also going to add another object for "traps" - basically killer objects. Thinking about it, what is a killer object? In the terms of this game it is an enemy that doesn't move, so if I just use the existing enemy routines with a static enemy then job done!
18/08/2009
Still working on the lifts. As they were running a bit faster than I'd like I've added in some code to create a delay between each move - so the lift will only move every x number of frames. This works well and allows me to have fast and slow lifts as required.
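The frame-delay trick above is simple enough to sketch (names are illustrative). A lift is just a platform with a move method, and it only actually moves once its frame counter wraps, so a bigger delay value gives a slower lift:

```python
class Lift:
    """A platform that moves, throttled to one move per `delay` frames."""

    def __init__(self, y, speed, delay):
        self.y = y
        self.speed = speed      # pixels moved per step
        self.delay = delay      # move once every `delay` frames
        self.counter = 0

    def update(self):
        # Called every frame; the lift only steps when the counter wraps.
        self.counter += 1
        if self.counter >= self.delay:
            self.counter = 0
            self.y += self.speed
```

With `delay=1` you get the original full-speed lift, so fast and slow lifts come from the same code path.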
Moving and jumping whilst on the lift works but... sometimes whilst jumping you fall through the lift, and I can't see why. Ah well, tracking down and squashing bugs is all part of the fun.
I've also been re-jigging the player movement code a bit so it should be a wee bit more optimised, as well as a bit smaller. For those interested, at present on a 2.8GHz P4 the game uses around 4% of the CPU time.
Update: Bug fixed, so the lifts are now fully working (vertically at least). I've adapted the second screen so that it can only be traversed by using the lift - and it all seems to work nicely so far.
19/08/2009
Now that the lifts are working it is time to start expanding on the code to make this less of a demo and more of a game. With that in mind I've started adding a fairly basic score / level / lives display at the bottom of the screen.
The strange thing is that this revealed another bug: if you dropped from a height into water you'd lose two lives instead of just one.
After one hell of a lot of hunting through the code I finally realised that I was forgetting to reset the fall height, so when the water code killed you the fall counter still contained a value, and because you re-spawned on a platform it treated it as if you'd landed after a long fall, which kills you a second time.
An easy one to fix, but a bugger to track down.
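A hypothetical reconstruction of that double-death bug in Python (the real code and names will differ): the fall counter has to be reset on every respawn, otherwise the leftover value kills the player a second time the moment they land:

```python
MAX_SAFE_FALL = 48  # illustrative fall-distance threshold in pixels

class Player:
    def __init__(self):
        self.lives = 3
        self.fall_height = 0

    def die_and_respawn(self):
        self.lives -= 1
        self.fall_height = 0   # <-- the missing reset that fixed the bug

    def land(self):
        # Landing after too long a fall is fatal. Without the reset
        # above, a water death followed by respawning onto a platform
        # would trigger this a second time.
        if self.fall_height > MAX_SAFE_FALL:
            self.die_and_respawn()
        self.fall_height = 0
```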
20/08/2009
Today I've been expanding the code a touch (extra baddie movement types, horizontally moving platforms and some other bits).
I've also put together a short video showing the game in action. Not a lot to see yet, but you can get the general idea of the progress so far.
Tomorrow I'm planning to add the code for collectible objects into the game. Once this has been done then the game engine will be basically complete and it will be time to start adding enemies and designing some proper levels.
21/08/2009
Found and eliminated another bug - one that allowed you to pass through a solid block if you hit it precisely on the top corner whilst falling.
I've amended the second screen to add in a horizontally moving platform - which worked first time without problems. Mind you, once you've got the code working for the vertical lifts there really isn't a lot you need to change to get it working for horizontal ones.
I've also posted about this game on the Retro Remakes forum - so a big hello to anyone visiting from there.
Update: Collectible items are now working. Quite simple to add really, providing you remember to use methods rather than functions for your reading / drawing code. Why? Because functions can't access an object's fields (the variables that the object uses). This is a "it's too late at night for this shit" type bug.
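The post doesn't name the language, but the same trap exists in Python: a staticmethod (effectively a plain function attached to a class) gets no `self`, so it can't read the instance's fields, while a real method can. A small illustration with assumed names:

```python
class Item:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.collected = False

    def draw(self):
        # A method receives `self`, so it can read the item's fields.
        return f"item at ({self.x}, {self.y})"

    @staticmethod
    def draw_broken():
        # No `self` here: referring to self.x in this body would raise
        # a NameError at call time -- the late-night bug in question.
        return "item at (?, ?)"
```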
22/08/2009
Today has been mainly spent "enjoying" a bout of either man-flu or hayfever. Either one gives the same result - my IQ has dropped so far that it now has a negative value, my nose and eyes are taking it in turns to drip, different parts of me are competing in the "most painful" competition (I think that my back is winning so far). Not surprisingly I've not been especially productive on the old programming front.
That said, I've started expanding the baddie routines a bit, so the beach-ball enemy now rotates as it bounces (which looks cool), and I've added a semi-transparent balloon as another enemy. This also looks pretty cool in action.
I've also started tidying up some of the routines, and simplifying the code in a couple of places.
Continued in Part 2
Tuesday, 21 July 2009
On the Road again - Part 6
On the Road (and off the road) and back on the road again!
The new bike arrived on the expected date. The absolutely HUGE box with the bike parts in it was sat in our dining room ready for me to build my trusty steed v2. Putting it together didn't take too long. I'm getting quite experienced at doing this now, so it only took around an hour to build.
I discovered the problems pretty quickly. First, the gears weren't set up correctly. This isn't that unusual - I've often had to "tweak" the gears to get them right with a new bike. These, however, were well out. The front gears were nearly impossible to change, the back ones were sticky, to say the least. More worrying was that the front wheel was slightly warped - giving it a distinct wobble when riding.
At this point I decided that rather than playing around with it myself I'd take it to a bike shop to make sure it was set up correctly. Twenty four hours, a couple of phone calls and £20 later and I had a fully functioning and safety tested bike with a straight front wheel. Or so I thought.
I hadn't ridden the bike too far, in fact I'd only done around six or seven miles over the following couple of days when something unexpected happened - the left crank fell off whilst I was riding to work. Luckily I was able to keep control and not fall into the oncoming traffic, but as accidents go this could easily have been very nasty.
Once I'd got my breath (and the crank arm) back I found that the nut that holds the crank onto the crank shaft had fallen off. Some safety check that turned out to be! I free-wheeled the bike home and walked to work - feeling a certain amount of animosity to bike engineers in general.
On my way home I managed to find the missing nut, so I was able to re-attach the crank before walking the bike back to the shop to have a gentle word or two about the quality of their service.
Luckily they were most apologetic (them: "that shouldn't have happened, we're very particular about checking that sort of thing", me: "no kidding!") and took the bike back in for another go. They were as good as their word and the bike was back in my hands within three hours - with the crank securely attached. At least I hope it is securely attached.
It's going to be a while before I can put my trust back in this bike.
Buying a bike from a catalogue can be a bit hit-and-miss. I've bought a couple this way and this is the first time that I've really had any problems. Paying a proper bike shop to safety check and tweak your build is a good idea, as long as you can be sure that they have actually checked everything properly. You might need to be prepared to re-check everything yourself, just in case.
I'll leave the last word to the bike shop, as the assistant said to me as I was pushing the bike out to take it home. "This is why we don't sell this kind of bike."
Monday, 20 July 2009
On the Road Again - Part 5
There is a kind of bitter sweet element to this post. Originally it was intended to mark the one-year anniversary of my return to cycling - with a short update on general health, fitness, weight loss etc.
Unfortunately (as mentioned in my previous post) the biggest weight loss was the loss of my bike - now relegated to memory and a "crime reference number."
In a strangely ironic twist, my replacement bike should arrive on the one-year anniversary of my buying the original. Sadly I did need to spend a little over the £50 mark this time (the cheapest budget bike I could get in a hurry was £139). This gives me an 18 speed, rigid frame mountain bike with front suspension.
The bike itself is a Townsend "Dark Mesa" mountain bike. Cheap (relatively speaking) but should get me to work on time. As I've recently been playing Half-Life 2 (Xbox 360 Orange Box release) this has raised a smile. I may have to see if I can get a Black Mesa logo onto the frame somewhere...
Getting back on topic, my weight has stayed constant at 12 stone (as said before, down from 13.5 stone) and I'm feeling fitter than I have done in years. I don't really get out of breath on the ride to and from work (unless I REALLY push myself) and can comfortably make the journey in five minutes (traffic allowing, and yes, that's even including the hill on the way back).
After doing some rough-and-ready calculations I've cycled over 1000 miles in the last year (more or less) and I've developed some nice muscles where once was flab.
Speaking of flab, my "love handles" vanished at some point over the last six months. No idea when, but they've gone.
So overall (and without harping on about the loss of my bike too much) it's been a good year for my health and fitness, has given me more time at home, and has also given me some quality time going out on rides with my daughter.
So for anyone who may be considering getting a bicycle, I can honestly say I'd recommend it without reservation. Just make sure you keep it in a safe place.
Thieving gits.
Tuesday, 14 July 2009
Off the road (for a bit)
Well, thanks to the twat that stole my mountain bike from outside my house last night I'm now a pedestrian again.
I'd claim on insurance, but by the time I've paid my excess it's just not worth it. I have, of course, reported it to the police and got a crime incident number, but let's be honest here - the chances of me actually getting the bike back again are somewhere between slim and none.
Ho-hum.
My bike - July 21st 2008 - July 14th 2009. R.I.P. (Rust In Pieces).
Small update: Out of sheer bloody-mindedness I checked with my insurers about claiming for the bike and found out the following:
I'm not covered for it.
Yes folks, pedal cycles (as they quaintly put it in the small print) are not covered by the home and contents insurance. So if you do own a bike and imagine that your home insurance covers it you might want to double-check with them BEFORE it goes walkies.
Friday, 26 June 2009
RIP: Michael Jackson
Pop Superstar Michael Jackson has died of a suspected heart attack, aged 50.
I'm not going to claim to be a big fan of his music, but I can't deny the impact he had on the music scene. He certainly had talent, even if he was a bit "odd" at times.
One thing that has already been mentioned in the press is the accusations of his alleged impropriety with minors. It is worth remembering that when this did go to court he was found not guilty - and that allegedly one of the accusers is well known as a "grifter" (or con-artist) and has been sent to prison for it.
So was he guilty of anything? Probably only of being "different", and that should never be treated as a crime.
Friday, 29 May 2009
Thoughts on Unix File System Structure
There is a school of thought that believes that the Unix File System layout (or Filesystem Hierarchy Standard) is needlessly complicated. Some would go as far as to claim that it is fundamentally broken.
The recent post on OSNews by Thom Holwerda (and especially many of the comments) provide some good examples of the perceived problems and the oft touted solutions.
There is a counter argument though, which goes something like this: "the Unix File System has been evolving for well over 30 years, isn't it strange that no-one noticed just how broken it is?"
Let's look at some of the arguments for and against the current system.
To make this easier, I'll pick up on some of the comments and see if they can be answered.
One quick point. In most cases I'm going to refer to Unix where this covers all Unix based operating systems. If I say something specific to GNU/Linux or another operating system then I'll name it.
"I'm just trying to explain that many people are put off diving further into the intricacies of the computer simply because of how daunting everything is. By making a system easy to use and understand not only at the very highest level (the UI) but also all the levels below that, we might enable more people to actually *understand* their computers better, which would be beneficial to *all* of us.
"I am of the strong belief that there is no sane reason WHATSOEVER why we couldn't make computers easier to use and understand on ALL levels, and not just at the top - other than geek job security."
This is a good place to start. The layout is there to simplify maintenance of the system, not to complicate it. This is nothing to do with "Job Security" - more to do with making a usable, maintainable system. Having people dipping into the OS structure (whether it be Windows, Unix or MacOS) would create MORE work for the support geek, not less.
I'll give you a real life example. Around ten years ago I installed Linux for a friend. This was back in the days that installing it was still a bit of an art. At the time getting XWindows up and running was cause for celebration, and as for working sound, IN YOUR DREAMS BUDDY!
After an hour or so of fiddling around with the config files, xf86config (remember that?), making sure that the correct packages were installed I gave him a quick run-through of how the system worked. As he had come from a DOS/Windows background I'd configured everything to look pretty similar, and showed how the basic commands worked (use "ls" instead of "dir", "cd" works about the same, "rm" instead of "del" and so forth) and gave a quick guided tour of XWindows, X11Amp and the other installed goodies.
He collared me the next day: "I thought you said this Linux stuff was stable. I restarted it and now I can't get back in! It's shit!"
On further investigation what he had done became apparent. He'd had a wander through the file system, picked some files that "didn't look important" (including /etc/passwd in this case) and deleted them to free up a bit of space.
This doesn't just happen with Linux. A year or so later the Telecoms manager at my company phoned me because his PC wouldn't boot any more. He'd been trying to upgrade Internet Explorer to the latest version and had run out of space on his C:\ drive. He'd managed to find a folder that didn't look that important but "had a lot of stuff in" it that he "didn't need" and deleted it. His PC had crashed part-way through and now it wouldn't start any more. Sadly the junk folder he'd chosen was called C:\WINDOWS.
OK, so these are vaguely amusing war stories but what is my point? Well, my point is this: Users don't understand operating systems. I'd go as far as to say that they shouldn't actually have to. In the majority of cases the best thing that a user can do is to not mess with the underlying OS at all. Hiding as much of it as possible from them is A Very Good Thing Indeed.
As is traditional at this point, let's turn to our old friend the analogy. Many people drive cars. You sit down, turn the ignition, grab the steering wheel, press down the accelerator and off you go (yes, I know that there is a little more to it than that, but you get the general idea). Now, how many drivers could strip an engine? How about the gears, know how they work? Could you strip the gearbox down if you needed to and reassemble it in a working condition afterwards?
The fact is that you don't need to know the mechanics of a car in order to drive one. Although anyone could pop open the bonnet and have a root around inside most people don't. If there is a problem, they take it to a garage.
Of course, some people DO tinker with their cars. They take a great deal of pride in being able to maintain and even customise their car. Is what they do easy? Of course not. Can anyone do it? No. Only an idiot would imagine that everyone can do everything, some degree of knowledge or learning may be required. This isn't meant to be an insult, but it is a fact.
To come back to the point of the analogy, is the Car any less useful because people don't understand how it works? Of course not.
This follows through to computers. Most people can use their computer quite happily with no idea of the underlying mechanisms. If they have problems then they can get in touch with their friendly neighbourhood technician. There is nothing stopping them learning about it if they want, just don't expect it to be easy. Just like a car, an operating system (and its component parts) is made to fulfil a function, not to be played around with.
"pulseaudio is yet another layer on top of a broken audio foundation. Adding layers does not make things better, it just hides it a little longer."
Another good example of mistaken thinking. Abstraction can be a very good thing, and pulseaudio is an excellent example of this. Let's see how this works.
Just for the sake of argument, let's say we were trying to write a simple audio player on GNU/Linux. Now, how do you make it play sounds? At a very basic level you might write directly to /dev/dsp. So now your app plays sounds. It might lock the /dev/dsp device but hey, this is just a simple example.
Let's up the stakes a bit and try and port the app to, say, Windows. What happened to /dev/dsp? It doesn't exist. How about MacOS X? Nope, not likely to work here either.
How does this relate to abstraction? Well, if our audio app uses pulseaudio to play its sound it will now work on any platform that pulseaudio is supported on. For something like KDE that is aiming to be a cross-platform environment this makes coding your apps an awful lot easier.
In other words the GNU/Linux audio foundation isn't broken, it just doesn't exist on Windows.
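The abstraction argument can be boiled down to a toy sketch (backend names here are purely illustrative, not real pulseaudio API): the player talks to one interface, and each platform supplies its own backend, so porting means swapping the backend rather than rewriting the player.

```python
class AudioBackend:
    """What the application codes against; never /dev/dsp directly."""
    def play(self, sample):
        raise NotImplementedError

class OssBackend(AudioBackend):
    # Stand-in for a Unix backend that would write to /dev/dsp.
    def play(self, sample):
        return f"oss: {sample}"

class WindowsBackend(AudioBackend):
    # Stand-in for a Windows backend (waveOut, DirectSound, etc.).
    def play(self, sample):
        return f"win: {sample}"

def make_player(backend: AudioBackend):
    # The player never mentions a device file, so it runs unchanged on
    # any platform that provides a backend.
    def play_track(sample):
        return backend.play(sample)
    return play_track
```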
"Why bin? Because that's where your 'binaries' are, right? oh, except there are programs now that are text files run through an interpreter, so that doesn't really apply. A user's files aren't under /usr, my webserver by default isn't under /svr, it's under /var/www. /etc? Yeah, something about etcetera really says 'config files'. Seriously, who thought /etc was a good name?"
This is the biggie. To answer this, it is necessary to look at and understand where Unix came from.
First, another quick experiment. Try and find a Unix reference manual from any time in the last twenty years or so. The command references are still likely to work. Any shell scripts (providing you are using the modern version of the same interpreter) are also likely to work without any changes.
In the earliest days of Unix space was at a premium. Shorter command names meant shorter scripts (and less space in the file allocation tables). This is why the "base" commands are only two characters long, for example, "ls", "cd", "rm", "du" and so forth. Although we don't have the same physical limits these days there are a lot of scripts out there that rely on the short versions of the file names. Keeping them the same means that people don't have to re-learn all their skills with each new release of the OS (something that Microsoft could learn from).
This also follows through to the file system layout (again, I'm going to simplify this a bit, but hopefully you'll get the idea).
At the root of our Unix system we find the main folders: /bin, /sbin, /lib and so on.
These are the most basic parts of your Unix system. These are the base commands and libraries that are required to give you a bootable system with access to a network.
Moving down the tree, we come to /usr.
This is the next level up. /usr is NOT where user files are stored, or for user generated versions of applications. In this case "usr" stands for Unix System Resources (although originally this was the location of users' home directories). This is where the vendor provided files live (the stuff that isn't part of the standard base files). For those who argue about everything being shoved into /usr by Ubuntu, RedHat or whoever, this is actually where they SHOULD go. Anything in here should have been provided by the distro maintainers. Between them, / and /usr should contain everything that your operating system needs. All applications, all configuration files, everything.
So what about /usr/local?
The /usr/local section of the file system is where any binaries that YOU create are stored, along with their configuration files. If you wanted to create a custom version of any application it should appear in here. This keeps your stuff separate from what the vendor provides, and in theory prevents you from permanently damaging the operating system. If you do manage to balls things up totally then deleting /usr/local should be enough to fix it again (as all the vendor provided files should still be intact and untouched).
Another benefit of this approach is that once your root system is installed, the actual location of /usr becomes irrelevant. It could just as easily be on a shared network drive as it could be on your local disk. If disk space is at a premium this can be a very effective way of working. It also means that every user has the same base system, because they are running the same apps from the same place.
OK, so that's not as useful for a single user system, but it is still functionality that is used in some places. Just because YOU don't use it, doesn't mean it isn't useful.
Before anyone pipes up: yes, I am fully aware of /opt, /var, /tmp, /dev and so forth. All of these have their uses, but are not relevant for the purposes of this discussion.
"That's a big giant gaping hole in Linux, not in Thom's proposed filesystem layout. There's no such distinction in a Linux distro, as there's no such thing as "the OS" vs "the user apps". Once someone gets the balls to stand up and say "this is the OS" and package it separately from "the user apps", the FHS will never change."
Actually GNU/Linux and Unix already do separate the OS from the user apps. Remember our three levels? The bottom level is the OS - the bit you need to get a working system (ie. /bin, /sbin, /lib and so on). Anything in /usr or above is a user app. Yes, you may see XFree86 as essential, but GNU/Linux can run without it. Same for Mozilla, Firefox and anything else in /usr or /usr/local.
* * *
The biggest problem there is with operating systems in general (not just GNU/Linux) is that for some reason people assume that it should all be easy. The desktop is easy to use therefore the underlying system should also be easy to use.
This is a very strange form of logic. Simplifying where necessary is a good thing, providing it doesn't impact on functionality or reliability. To go back to our car analogy there would be an argument for simplifying the innards of the car to make it much easier to understand and maintain for the common user. As a thought experiment, let's try it.
Let's start with the gearbox. Much too complicated and a potential point of failure, choosing a good default gear should do away with the need for that. How about a petrol engine? All that internal combustion malarkey sounds a bit dangerous to me. Running a vehicle based on small controlled explosions? Stuff that for a game of soldiers! Let's replace that with an electric one. But wait, maybe some people don't understand how an electric motor works either. So on second thoughts, let's replace it with a pedal driven one.
Hmm, it's a bit heavy to pedal, so lets remove most of the metal bodywork, a canvass roof should suffice (plus it's easy to repair or replace).
Anti-lock brakes? They'd have to go as well. Disc brakes are much simpler. Power streering? Not really needed now, drop that too. We can also leave out the airbags as we won't be going that fast now anyway.
So what are we left with? Basically a four-wheeled bicycle. Handy in some circumstances, easy to maintain but not necessarily as useful as what we started with.
Yes, this is taking it to the extremes, but that is the equivalent of what people are suggesting is done to the Unix file system. Let's remove everything that we don't understand the reasons for and just use what is left. Sadly what is left may be easy to understand, but its functionality would likely be crippled.
* * *
Does any of this mean that people (like GoBoLinux for example) shouldn't experiment and try different things? Of course not. Finding new (and potentially better) ways of doing things is something that can end up as a benefit to everyone. But making changes for the sake of being different is not so good.
Looking closer at GoBoLinux it is adding one hell of a lot of complexity to the system in order to just keep things working (have a check of http://www.gobolinux.org/index.php?page=at_a_glance and ask yourself about all the symlinks), whilst loosing some of the benefits of the traditional Unix system.
Reading http://www.gobolinux.org/index.php?page=doc/articles/clueless gives plenty of information on why GoBoLinux have chosen their approach. It also re-inforces some of the points made above, especially with regard to the three-tier approach of traditional Unix.
* * *
In the end, used properly the current Unix File System Layout actually works rather well, changing to something else isn't going to solve the problems of people ignoring a standard. All it will achieve is change for the sake of it, and chances are some benefits will be lost in the process.
The recent post on OSNews by Thom Holwerda (and especially many of the comments) provides some good examples of the perceived problems and the oft-touted solutions.
There is a counter argument though, which goes something like this: "the Unix File System has been evolving for well over 30 years, isn't it strange that no-one noticed just how broken it is?"
Let's look at some of the arguments for and against the current system.
To make this easier, I'll pick up on some of the comments and see if they can be answered.
One quick point. In most cases I'm going to refer to Unix where this covers all Unix based operating systems. If I say something specific to GNU/Linux or another operating system then I'll name it.
I'm just trying to explain that many people are put off diving further into the intricacies of the computer simply because of how daunting everything is. By making a system easy to use and understand not only at the very highest level (the UI) but also all the levels below that, we might enable more people to actually *understand* their computers better, which would be beneficial to *all* of us.
I am of the strong belief that there is no sane reason WHATSOEVER why we couldn't make computers easier to use and understand on ALL levels, and not just at the top - other than geek job security.
This is a good place to start. The layout is there to simplify maintenance of the system, not to complicate it. This is nothing to do with "Job Security" - more to do with making a usable, maintainable system. Having people dipping into the OS structure (whether it be Windows, Unix or MacOS) would create MORE work for the support geek, not less.
I'll give you a real life example. Around ten years ago I installed Linux for a friend. This was back in the days that installing it was still a bit of an art. At the time getting XWindows up and running was cause for celebration, and as for working sound, IN YOUR DREAMS BUDDY!
After an hour or so of fiddling around with the config files, xf86config (remember that?), and making sure that the correct packages were installed, I gave him a quick run-through of how the system worked. As he had come from a DOS/Windows background I'd configured everything to look pretty similar, and showed him how the basic commands worked (use "ls" instead of "dir", "cd" works about the same, "rm" instead of "del" and so forth) and gave a quick guided tour of XWindows, X11Amp and the other installed goodies.
He collared me the next day: "I thought you said this Linux stuff was stable. I restarted it and now I can't get back in! It's shit!"
On further investigation what he had done became apparent. He'd had a wander through the file system, picked some files that "didn't look important" (including /etc/passwd in this case) and deleted them to free up a bit of space.
This doesn't just happen with Linux. A year or so later the Telecoms manager at my company phoned me because his PC wouldn't boot any more. He'd been trying to upgrade Internet Explorer to the latest version and had run out of space on his C:\ drive. He'd managed to find a folder that didn't look that important but "had a lot of stuff in it" that he "didn't need", and deleted it. His PC had crashed part-way through and now it wouldn't start any more. Sadly the junk folder he'd chosen was called C:\WINDOWS.
OK, so these are vaguely amusing war stories but what is my point? Well, my point is this: Users don't understand operating systems. I'd go as far as to say that they shouldn't actually have to. In the majority of cases the best thing that a user can do is to not mess with the underlying OS at all. Hiding as much of it as possible from them is A Very Good Thing Indeed.
As is traditional at this point, let's turn to our old friend the analogy. Many people drive cars. You sit down, turn the ignition, grab the steering wheel, press down the accelerator and off you go (yes, I know that there is a little more to it than that, but you get the general idea). Now, how many drivers could strip an engine? How about the gears - know how they work? Could you strip the gearbox down if you needed to and reassemble it in a working condition afterwards?
The fact is that you don't need to know the mechanics of a car in order to drive one. Although anyone could pop open the bonnet and have a root around inside most people don't. If there is a problem, they take it to a garage.
Of course, some people DO tinker with their cars. They take a great deal of pride in being able to maintain and even customise their car. Is what they do easy? Of course not. Can anyone do it? No. Only an idiot would imagine that everyone can do everything, some degree of knowledge or learning may be required. This isn't meant to be an insult, but it is a fact.
To come back to the point of the analogy, is the car any less useful because people don't understand how it works? Of course not.
This follows through to computers. Most people can use their computer quite happily with no idea of the underlying mechanisms. If they have problems then they can get in touch with their friendly neighbourhood technician. There is nothing stopping them learning about it if they want, just don't expect it to be easy. Just like a car, an operating system (and its component parts) is made to fulfil a function, not to be played around with.
pulseaudio is yet another layer on top of a broken audio foundation. Adding layers does not make things better, it just hides it a little longer.
Another good example of mistaken thinking. Abstraction can be a very good thing, and pulseaudio is an excellent example of this. Let's see how this works.
Just for the sake of argument, let's say we were trying to write a simple audio player on GNU/Linux. Now, how do you make it play sounds? At a very basic level you might write directly to /dev/dsp. So now your app plays sounds. It might lock the /dev/dsp device, but hey, this is just a simple example.
Let's up the stakes a bit and try and port the app to, say, Windows. What happened to /dev/dsp? It doesn't exist. How about MacOS X? Nope, not likely to work here either.
How does this relate to abstraction? Well, if our audio app uses pulseaudio to play its sound it will now work on any platform that pulseaudio is supported on. For something like KDE, which is aiming to be a cross-platform environment, this makes coding your apps an awful lot easier.
In other words the GNU/Linux audio foundation isn't broken, it just doesn't exist on Windows.
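To make the abstraction point concrete, here's a hypothetical shell helper in the same spirit. The `play_sound` function is made up for illustration (it isn't part of pulseaudio), but `paplay`, `aplay` and the old OSS device are real: the caller just asks for a sound, and the helper works out which backend actually exists on this machine.

```shell
# Hypothetical helper: play a sound through whichever backend exists,
# so the calling script never hard-codes a device like /dev/dsp.
play_sound() {
    if command -v paplay >/dev/null 2>&1; then
        paplay "$1"               # PulseAudio
    elif command -v aplay >/dev/null 2>&1; then
        aplay "$1"                # ALSA
    elif [ -w /dev/dsp ]; then
        cat "$1" > /dev/dsp       # raw OSS device, last resort
    else
        echo "no audio backend found" >&2
        return 1
    fi
}

play_sound startup.wav || echo "could not play startup sound"
```

Swap the machine, swap the backend, and the calling code never changes - which is exactly the job an abstraction layer like pulseaudio does for real applications.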
Why bin? Because that's where your 'binaries' are, right? oh, except there are programs now that are text files run through an interpreter, so that doesn't really apply. A user's files aren't under /usr, my webserver by default isn't under /srv, it's under /var/www. /etc? Yeah, something about etcetera really says 'config files'. Seriously, who thought /etc was a good name?
This is the biggie. To answer this, it is necessary to look at and understand where Unix came from.
First, another quick experiment. Try to find a Unix reference manual from any time in the last twenty years or so. The command references are still likely to work. Any shell scripts (provided you are using the modern version of the same interpreter) are also likely to work without any changes.
In the earliest days of Unix space was at a premium. Shorter command names meant shorter scripts (and less space in the file allocation tables). This is why the "base" commands are only two characters long, for example, "ls", "cd", "rm", "du" and so forth. Although we don't have the same physical limits these days there are a lot of scripts out there that rely on the short versions of the file names. Keeping them the same means that people don't have to re-learn all their skills with each new release of the OS (something that Microsoft could learn from).
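That longevity is easy to demonstrate. Here's a hypothetical throwaway script written in the classic terse style - nothing in it needs anything newer than a 1980s Bourne shell, and it runs unchanged on a modern system:

```shell
#!/bin/sh
# A throwaway script in the classic terse style. Every command used
# here ("cd", "ls", "wc", "rm") is decades old and works today.
cd /tmp || exit 1
mkdir demo_$$ && cd demo_$$   # $$ keeps the scratch directory unique
ls /etc > listing.txt         # "ls", not "list-directory-contents"
wc -l < listing.txt           # how many entries did we find?
rm listing.txt                # tidy up after ourselves
cd /tmp && rmdir demo_$$
```

Hand that to a 1985 Unix box or a 2009 Linux distro and it behaves the same - that's the payoff of not renaming things between releases.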
This also follows through to the file system layout (again, I'm going to simplify this a bit, but hopefully you'll get the idea).
At the root of our Unix system we find these main folders:
/ -- root
/bin -- binaries
/sbin -- system tools (e.g. fdisk, hdparm, fsck)
/lib -- libraries
/etc -- configuration files / scripts / anything that doesn't fit in the other directories
These are the most basic parts of your Unix system. These are the base commands and libraries that are required to give you a bootable system with access to a network.
Moving down the tree, we come to /usr.
/usr -- root
/usr/bin -- binaries
/usr/sbin -- system tools
/usr/lib -- libraries
/usr/etc -- configuration files / scripts / anything that doesn't fit in the other directories
This is the next level up. /usr is NOT where user files are stored, or where user-generated versions of applications go. In this case "usr" stands for Unix System Resources (although originally it was the location of users' home directories). This is where the vendor-provided files live (the stuff that isn't part of the standard base files). For those who argue about everything being shoved into /usr by Ubuntu, RedHat or whoever, this is actually where they SHOULD go. Anything in here should have been provided by the distro maintainers. Between them, / and /usr should contain everything that your operating system needs. All applications, all configuration files, everything.
So what about /usr/local?
/usr/local -- root
/usr/local/bin -- binaries
/usr/local/sbin -- system tools
/usr/local/lib -- libraries
/usr/local/etc -- configuration files / scripts / anything that doesn't fit in the other directories
The /usr/local section of the file system is where any binaries that YOU create are stored, along with their configuration files. If you wanted to create a custom version of any application it should appear in here. This keeps your stuff separate from what the vendor provides, and in theory prevents you from permanently damaging the operating system. If you do manage to balls things up totally then deleting /usr/local should be enough to fix it again (as all the vendor provided files should still be intact and untouched).
Another benefit of this approach is that once your root system is installed, the actual location of /usr becomes irrelevant. It could just as easily be on a shared network drive as on your local disk. If disk space is at a premium this can be a very effective way of working. It also means that every user has the same base system, because they are all running the same apps from the same place.
OK, so that's not as useful for a single-user system, but it is still functionality that is used in some places. Just because YOU don't use it, doesn't mean it isn't useful.
Before anyone pipes up: yes, I am fully aware of /opt, /var, /tmp, /dev and so forth. All of these have their uses, but are not relevant for the purposes of this discussion.
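The vendor/local split also has a practical payoff that is easy to demonstrate: /usr/local/bin traditionally comes before /usr/bin in PATH, so your custom build of a tool shadows the vendor's copy without touching it. Here's a sketch using a scratch directory standing in for the real tree (the "hello" command is made up purely for the demonstration):

```shell
#!/bin/sh
# Build a scratch copy of the vendor/local split and show that the
# "local" binary shadows the "vendor" one via normal PATH ordering.
root=$(mktemp -d)
mkdir -p "$root/usr/bin" "$root/usr/local/bin"

printf '#!/bin/sh\necho vendor\n' > "$root/usr/bin/hello"
printf '#!/bin/sh\necho custom\n' > "$root/usr/local/bin/hello"
chmod +x "$root/usr/bin/hello" "$root/usr/local/bin/hello"

# /usr/local/bin is searched first, just as in a real PATH.
(PATH="$root/usr/local/bin:$root/usr/bin"; hello)
# prints "custom" -- delete the local copy and the untouched
# vendor version comes straight back.
```

Which is exactly the "delete /usr/local and the system heals itself" property described above, in miniature.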
For a start, it has a gaping hole: he doesn't explain how you separate "System" from "Programs".
That's a big giant gaping hole in Linux, not in Thom's proposed filesystem layout. There's no such distinction in a Linux distro, as there's no such thing as "the OS" vs "the user apps". Until someone gets the balls to stand up and say "this is the OS" and package it separately from "the user apps", the FHS will never change.
Actually, GNU/Linux and Unix already separate the OS from the user apps. Remember our three levels? The bottom level is the OS - the bit you need to get a working system (i.e. /bin, /sbin, /lib and so on). Anything in /usr or above is a user app. Yes, you may see XFree86 as essential, but GNU/Linux can run without it. Same for Mozilla, Firefox and anything else in /usr or /usr/local.
* * *
The biggest problem with operating systems in general (not just GNU/Linux) is that for some reason people assume that it should all be easy. The desktop is easy to use, therefore the underlying system should also be easy to use.
This is a very strange form of logic. Simplifying where necessary is a good thing, providing it doesn't impact on functionality or reliability. To go back to our car analogy, there would be an argument for simplifying the innards of the car to make it much easier to understand and maintain for the common user. As a thought experiment, let's try it.
Let's start with the gearbox. Much too complicated and a potential point of failure; choosing a good default gear should do away with the need for that. How about the petrol engine? All that internal combustion malarkey sounds a bit dangerous to me. Running a vehicle based on small controlled explosions? Stuff that for a game of soldiers! Let's replace it with an electric one. But wait, maybe some people don't understand how an electric motor works either. So on second thoughts, let's replace it with a pedal-driven one.
Hmm, it's a bit heavy to pedal, so let's remove most of the metal bodywork; a canvas roof should suffice (plus it's easy to repair or replace).
Anti-lock brakes? They'd have to go as well. Disc brakes are much simpler. Power steering? Not really needed now, drop that too. We can also leave out the airbags as we won't be going that fast now anyway.
So what are we left with? Basically a four-wheeled bicycle. Handy in some circumstances, easy to maintain but not necessarily as useful as what we started with.
Yes, this is taking it to the extremes, but that is the equivalent of what people are suggesting is done to the Unix file system. Let's remove everything that we don't understand the reasons for and just use what is left. Sadly what is left may be easy to understand, but its functionality would likely be crippled.
* * *
Does any of this mean that people (like GoBoLinux for example) shouldn't experiment and try different things? Of course not. Finding new (and potentially better) ways of doing things is something that can end up as a benefit to everyone. But making changes for the sake of being different is not so good.
Looking closer at GoBoLinux, it is adding one hell of a lot of complexity to the system just to keep things working (take a look at http://www.gobolinux.org/index.php?page=at_a_glance and ask yourself about all the symlinks), whilst losing some of the benefits of the traditional Unix system.
Reading http://www.gobolinux.org/index.php?page=doc/articles/clueless gives plenty of information on why GoBoLinux have chosen their approach. It also reinforces some of the points made above, especially with regard to the three-tier approach of traditional Unix.
* * *
In the end, used properly, the current Unix file system layout actually works rather well; changing to something else isn't going to solve the problem of people ignoring a standard. All it will achieve is change for the sake of it, and chances are some benefits will be lost in the process.
Tuesday, 26 May 2009
Puncture-resistant Bike Tires
In the year(ish) that I've been a commuting cyclist I've been lucky enough to have avoided getting any punctures. My daughter has been less lucky. A couple of weeks ago she managed to get a staple in her back tire which caused multiple small punctures.
Trying to repair a puncture on the back tire is about as big a pain in the arse as you can get. You have the choice between trying to find the puncture without removing the wheel (tricky, time-consuming and runs the risk of tearing the tube), or having to remove the rear brake, chain, gears and the wheel to get the inner-tube out, and that's before you even start trying to find and repair the puncture.
I'd reached the point where I was ready to call it quits, buy an inner-tube from Halfords and pay them the extra to fit it ('cos I'm a lazy git when it boils down to it).
Although I've not had any punctures I have managed to shatter the right pedal on my bike. I'm not sure how I managed it but rest assured it was totally wrecked, so today I spent my lunchbreak looking for a pair of standard pedals (in other words cheap, 'cos not only am I lazy, but I'm also a tight-wad).
Our local hardware store was selling pedals at £2.99 a pair, which isn't too bad really. They also had tubes of this strange-looking orangey-pink stuff called "Doctor Sludge."
According to the label it repairs punctures and makes the inner-tube puncture-proof for the lifetime of the tube. Did I hear a "yeah right?" I thought so. As it was going for just over four quid I decided to give it a shot.
What does this involve? Well, first you unscrew and remove the valve from the inner-tube (you get a small tool for this with the sludge), next attach the bottle of sludge using the extendable tube and pour about half the 250ml bottle into the tire. After that you replace the valve, pump up the tire and that's it, job done.
It works by the air pressure in the tire forcing the sludge out through any holes, at which point the sludge sets solid. Any (small) punctures after the tire has been treated should heal straight away.
Well, I've just tried this with my daughter's bike. Bear in mind that I've had four attempts at repairing the tire the old-fashioned way. I've probably wasted a couple of hours and had a pretty frustrating time into the bargain messing with the bowl of water, rubber glue, repair patches and so forth. To make it worse, after all that effort it still wasn't fixed (it would deflate after about half an hour).
The Doctor Sludge method took around ten minutes, and so far (twelve hours later) the tire has stayed fully inflated. Not only that, but I've got enough left to sort out her front tire as well.
Before anyone asks, no, I'm not associated with the Doctor Sludge people. I think I'll be buying some more of this for my own bike (unless anyone from Doctor Sludge is reading this and feels like sending me a couple of free bottles. . .).
Saturday, 9 May 2009
Dell GX270
I've normally got mixed feelings about Dells. On the up side they tend to be fairly reliable (apart from some well-publicised problems with capacitors bursting on some motherboards). The downside is that they tend to be fairly hard to upgrade.
So why change my computer for an old GX270? Well, the fact is that it is significantly faster than the venerable old Athlon XP 2200 which has served me well for the past doo-de-dah years (where doo-de-dah is a number between one and too-many).
One thing that is taking some getting used to is the noise, or lack of it. My original PC sounded not unlike a Harrier jump jet taking off when it was started, and made a constant noise whilst in use. Yes, you get used to it but My God was it loud!
To give you an idea of how quiet the GX270 is, my LCD monitor makes more noise. Seriously, this thing is whisper quiet. I'd go as far as to say it is one of the quietest PCs that I've ever used (and that is including some of those with specialised cooling equipment).
Really, the only question mark over this model of Dell is reliability. Hopefully this will be one of the PCs that doesn't suffer from swollen capacitors - think of it as the Xbox 360 of the PC world.
I don't like playing Russian Roulette with my equipment, but I'm going to take a chance and hope for the best (I'm just going to have to put my original PC "on ice" as a backup in case of emergencies).
Friday, 8 May 2009
Back Again. . .
I've been kind of busy of late - hence the lack of blogging.
In fact I've written a couple of pieces (that I'm going to have to finish and post) but, well, sometimes life just gets in the way, and recently life has been getting in the way in quite a large way.
So, two funerals later (seriously) and I'm back.
To get the ball rolling again, here are some of the things that I would have been blogging about under normal circumstances.
Hackintosh Havoc
I've had a play with a home-made hackintosh. What do I mean by that? Well, basically it is MacOS X running on a standard Intel PC. Was it worth the effort? Probably not. To cut a long blog short, you'd be better off buying a second-hand Mac than going to the amount of effort needed to get it working properly (and in case you are wondering - it never really worked properly, although it was damn close).
In case any Apple lawyers are reading, the hard disk used was humanely destroyed after the experiment. No copies of OSX Tiger were harmed in attempting this.
Ubuntu Blues
I've recently acquired a Dell OptiPlex GX270 (Pentium 4, 2.8GHz folks!) which I'm going to be using as a replacement for my venerable 1.8GHz Athlon XP 2200. The strange thing about this PC is that although it supports Hyper-Threading (sort of a cheap alternative to Dual Core) this isn't enabled by default. Strange but true. Once it has been enabled the PC performs rather well with Windows XP.
Of course, I also installed Ubuntu alongside it. Unfortunately the "newest" version of Ubuntu I had to hand was an 8.04 install. Not to be deterred by this I installed it (reducing the XP partition down to 20 gig) then upgraded it to 8.10 and then again to 9.04.
One problem with the OptiPlex GX270 is that it needs a low-profile AGP graphics card - and the only one I was able to rustle up at short notice was an ancient 16 meg ATI Rage 128 Pro. Not the best card by a long shot - but better than the onboard 8 meg(!) Intel one.
Luckily I was able to borrow a 128 meg GeForce FX 5200 PCI card - which worked well in Windows (after a little bit of fiddling to disable the onboard card). It also worked well in Ubuntu - up to the point where I rebooted and the boot-time disk check started, but claimed that it couldn't repair the file system. After a couple of hours of running fsck on Ubuntu's root partition (and watching hundreds of errors appear on screen) I gave up, downloaded the Ubuntu 9.04 install CD and reinstalled from scratch (this time as ext4).
As the PCI graphics card was still too slow I bought a cheap-and-cheerful low-profile AGP card (another GeForce 5200), which now runs like a charm. Ubuntu accepted the new card without problems (booted up and hey! It works!) - Windows, however, took about an hour to get everything running correctly (which included uninstalling and re-installing the drivers, swearing, and playing around with different configuration options to get the card recognised correctly, and then to get the resolution to change from 640x480 up to 1280x1024).
Ubuntu 9.04 boots extremely quickly (just like Windows XP), and is ready to work as soon as the desktop is displayed (unlike XP, which is unusable for the first minute or two).
SCO is Gonna Go
It has been recommended by the U.S. Trustee that SCO be moved out of Chapter 11 bankruptcy into full Chapter 7. This basically means that it has been agreed that there is no future for SCO as a company, and so its assets are to be sold and the proceeds given to the creditors.
Could this be the end of the extremely long running SCO vs The World saga?
Probably not, seeing as SCO, as usual, are planning to oppose the decision (gosh, really?).
So what did Darl McBride have to say? “We are reviewing the motion that was filed in Delaware today with counsel and will have a detailed response for the court in due course. We plan to oppose the motion and present our own suggested course of action to the court."
Time will tell if this lawsuit still has legs.
In fact I've written a couple of pieces (that I'm going to have to finish and post) but, well, sometimes life just gets in the way, and recently life has been getting in the way in quite a large way.
So, two funerals later (seriously) and I'm back.
To get the ball rolling again, here are some of the things that I would have been blogging about under normal circumstances.
Hackintosh Havoc
I've had a play with a home-made hackintosh. What do I mean by that? Well, basically it is MacOS X running on a standard Intel PC. Was it worth the effort? Probably not. To cut a long blog short you would be better buying a second hand Mac than the amount of effort you will need to go to in order to get it working properly (and in case you are wondering - it never really worked properly, although it was damn close).
In case any Apple lawyers are reading, the hard disk used was humanely destroyed after the experiment. No copies of OSX Tiger were harmed in attemping this.
Ubuntu Blues
I've recently aquired a Dell OptiPlex 270 (Pentium 4, 2.8ghz folks!) which I'm going to be using as a replacement for my venerable 1.8Gghz Athlon XP 2200. The strange thing about this PC is that although it supports Hyper Threading (sort of a cheap alternative to Dual Core) this isn't enabled by default. Strange but true. Once it has been enabled the PC performs rather well with Windows XP.
Of course, I also installed Ubuntu alongside it. Unfortunately the "newest" version of Ubuntu I had to hand was an 8.04 install. Not to be deterred by this I installed it (reducing the XP partition down to 20 gig) then upgraded it to 8.10 and then again to 9.04.
One problem with the OptiPlex 270 is that it needs a low profile AGP graphics card - and the only one I was able to rustle up at short notice was an ancient 16 meg ATI Rage Pro 128. Not the best card by a long straw - but better than the onboard 8 meg(!) Intel one.
Luckily I was able to borrow a 128 meg Geforce FX 5200 PCI card - which worked well in Windows (after a little bit of fiddling to disable the onboard card). It also worked well in Ubuntu - up to the point where I rebooted and checkdisk started - but claimed that it couldn't repair the file system. After a couple of hours of running FSCK on Ubuntu's root partition (and watching hundreds of errors appear on screen) I gave up, downloaded the Ubuntu 9.04 install CD and reinstalled from scratch (this time as EXT4).
As the PCI graphics card was still too slow I bought a cheap-and-cheerful low profile AGP card (another Geforce 5200) which now runs like a charm. Ubuntu accepted the new card without problems (booted up and hey! It works!) - Windows however took about an hour to get everything running correctly (which included uninstalling and re-installing the drivers, swearing and playing around with different configuration options to get the card recognised correctly, and then to get the resolution so that it would change from 640x480 up to 1280x1024).
Ubuntu 9.04 boots extremely quickly (just like Windows XP), and is ready to work as soon as the desktop is displayed (unlike XP which is unusable for the first minute or two).
SCO is Gonna Go
It has been recommended by the U.S. Trustee that SCO be moved out of Chapter 11 bankruptcy into full Chapter 7. This basically means it has been agreed that there is no future for SCO as a company, so its assets are to be sold and the proceeds given to the creditors.
Could this be the end of the extremely long running SCO vs The World saga?
Not if SCO can help it: as usual, they are planning to oppose the decision (gosh, really?).
So what did Darl McBride have to say? “We are reviewing the motion that was filed in Delaware today with counsel and will have a detailed response for the court in due course. We plan to oppose the motion and present our own suggested course of action to the court."
Time will tell if this lawsuit still has legs.
Tuesday, 31 March 2009
The "New" Rob Enderle
Over the years no pundit has taken quite as much flak as Rob Enderle (some less charitable than myself would probably say deservedly).
Although I still think that his anti-Linux stance really didn't do him any favours it is fair to say that he certainly isn't a pro-Microsoft shill. One of his latest articles could almost be proof of that.
A well written article mixing Apple Computers, Politics and Dancing with the Stars? You betcha!
So is Rob now on the side of good? Reading further through his output unfortunately reveals that his anti-Linux stance is as much in evidence as ever, whilst he can still churn out pro-Microsoft stuff with the best of them.
Whilst I still don't think that he is a paid shill (although he does a darn tooting impression of one I must admit) some of this anti-Free Software stuff just doesn't do him any favours. Many companies can (and do) support free software solutions mixed in with the commercial ones. These days it is so common that it hardly even warrants a mention.
The company I work with supports both Open Office AND Microsoft Office, although we only install Microsoft Office where there is a damn good business case for spending money on the software licence. Does Open Office cost us any more to support than Microsoft Office? Of course not. Don't be silly.
You see, in a support role you often end up supporting desktop applications that you haven't had full (or in some cases any) training on. It doesn't matter whether it is a Microsoft, Sun or Joe Bloggs Open Source program; part of a good technician's skill set is being able to deal with new stuff "on the fly", as it were, and learn (and remember) enough to be able to support the users.
So what Open Source stuff gets widely used? Apart from the obvious such as FireFox and OpenOffice, tools such as ImageMagick are in heavy use (often as part of the back-end software for commercial products). Some of the more useful Unix command-line utilities have been ported to Windows and, again, can often be found as part of the "glue" behind larger commercial products. Samba is so common that you are almost certain to find it installed on any given Unix server.
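That "glue" role usually amounts to nothing more exotic than piping one tool into another. A trivial sketch (the log format here is made up purely for illustration):

```shell
# Count occurrences of each error code in a (made-up) log format -
# exactly the sort of one-liner that ends up buried inside bigger products.
printf 'ERR=404\nOK\nERR=500\nERR=404\n' |
    grep '^ERR=' | sort | uniq -c | sort -rn
```

Nothing clever, but multiply it by a few dozen scripts and you've got the plumbing of many a "commercial" product.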
And as for Linux not working on the desktop? Some of us switched to it years ago, and I can tell you this: even with some of the issues I've encountered over the years I'm not switching back! (Using the beta version of Windows 7 has made that decision a damn sight easier to handle!)
Monday, 30 March 2009
Hide KMix at start-up on KDE 4
Although I've rather got to like KDE4, one thing has been a constant (although minor) irritation to me - KMix doesn't minimise at start-up.
Well, it can be done. In ~/.kde4/Autostart create a file called kmix which contains the following script:
#!/bin/bash
kmix; qdbus org.kde.kmix /kmix/KMixWindow close
Make sure that the script is executable
chmod u+x kmix
Now when KDE starts kmix will launch and minimise straight to the dock.
Friday, 27 March 2009
Ubuntu 9.04 Jaunty Jackalope - First Impressions and problems
I've had a bit of a love / hate relationship with Ubuntu in the year that I've been using it as my main desktop.
The big problem for me has been the fact that the pata_via driver that they insist on using isn't reliable on my hardware, causing random lock-ups. Not nice, and certainly not what I expect from Linux.
I've managed to get around it by using an older 2.6 kernel that was patched to use the original via82cxxx module, which is a less than satisfactory solution really (and yes, I'm already using 80 wire IDE cables thank you very much).
With all this in mind it was with some trepidation that I took the plunge and upgraded to the new Ubuntu 9.04 beta. If you want to do this, then simply run update-manager -d and follow the on-screen prompts to upgrade.
The upgrade itself was (as usual) painless, although it did take about two hours for the download / upgrade process to complete. Once the upgrade was complete I rebooted and. . . .
I was faced with a nice, shiny KDE 4.2.1 desktop. There was a problem though, which I discovered about ten seconds after launching Amarok and trying to update my collection. Yes folks, the system locked solid. Trying to launch the old version of the kernel left me without a graphical desktop, so back into the latest kernel and a second attempt to update my MP3s in Amarok - followed by a system lock after a few seconds.
I should be able to get around this problem by recompiling the kernel to allow me to revert to the original via drivers (like I've got the time for this shit. . .). In the meantime (and after consulting my good friend Google) I've added "defoptions=all-generic-ide" to /boot/grub/menu.lst and restarted, which so far seems to have done the trick as I've been able to update my collection without any lock-ups.
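For anyone wanting to try the same workaround: on GRUB legacy the defoptions line lives in /boot/grub/menu.lst and is deliberately commented out - update-grub reads it anyway and applies it to every kernel entry it regenerates. The relevant bit looks roughly like this (the surrounding comments are along the lines of the stock Ubuntu file):

```shell
## additional options to use with the default boot option, but not with
## the alternatives
## e.g. defoptions=vga=791 resume=/dev/hda5
# defoptions=all-generic-ide
```

Remember to run sudo update-grub afterwards so the option actually makes it onto the kernel lines.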
I'm going to have to see how well the system holds up. If this sorts it then great, but if not then I'll have to look at using a different Linux distro.
. . .
As it turns out this didn't sort the problem at all. Spooling a video to my XBox caused the system to lock solid - so onto Plan B - recompiling the kernel.
There are some pretty straight-forward instructions here on how to recompile the kernel for Ubuntu. One thing that shocked me is just how long it takes to recompile the Ubuntu way. With Slackware, compiling a kernel took less than an hour; with Ubuntu? I gave up waiting after a couple of hours. God only knows what it is doing, but it is certainly taking its time about it.
Adding in support for the old via82cxxx module is reasonably straight-forward, and after the compile had finished I installed the new modules, checked the blacklist to make sure that the pata_via module wasn't going to load, rebooted and. . .
No difference. Running lsmod didn't show pata_via, but it also didn't show via82cxxx. In fact, I couldn't see any modules relating to the IDE drives. Checking /proc/bus/pci/devices showed that pata_via had actually been loaded. What?
Checking back through the kernel configuration shows why - they've compiled support for pata_via directly into the kernel rather than as a module. Bastards!
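If you want to check which situation you are in, the give-away is a driver that lsmod doesn't list but that is still bound to the hardware. A little sketch of that logic (the lsmod and PCI listings passed in below are stand-ins, not real output):

```shell
#!/bin/sh
# Classify a driver as a loadable module, built into the kernel, or absent,
# given (1) lsmod output and (2) a device listing that names the bound
# driver (e.g. the contents of /proc/bus/pci/devices).
driver_state() {
    drv=$1; lsmod_out=$2; devices_out=$3
    if printf '%s\n' "$lsmod_out" | grep -q "$drv"; then
        echo "module"        # lsmod lists it: it's a loadable module
    elif printf '%s\n' "$devices_out" | grep -q "$drv"; then
        echo "built-in"      # bound to the device, but not a module
    else
        echo "absent"
    fi
}

# The situation described above: nothing in lsmod, but the PCI device
# listing still names pata_via - so it must be compiled into the kernel.
driver_state pata_via "" "0088 11030571 ... pata_via"   # -> prints "built-in"
```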
After setting it back to being a module (rassen frassen Ubuntu) I did a clean recompile as per the Ubuntu instructions. So now I'm going to spend the morning waiting for the compile to complete, and hoping that the PC doesn't lock up in the meantime.
Fun, I don't think.
*** Three hours later ***
OK, so I've recompiled the kernel, reinstalled it and rebooted and yes! We are no longer using pata_via, the drive names have reverted from sdXX back to hdXX, and hopefully that should be it for the lock-ups.
If (like me) you are using the NVidia drivers then you will also need to recompile the Ubuntu restricted drivers.
One further problem was that the kernel package I'd created never seemed to complete installing (even though it was added to the grub menu and I could boot the system using it). After a fair bit of faffing around I traced the problem to the file /etc/kernel/postinst.d/nvidia-common throwing errors due to the new kernel. To get around this I removed the nvidia-common package, at which point the custom kernel completed installing without any errors, then I re-installed nvidia-common. Easy when you know how. . .
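In case anyone hits the same thing, the dance went roughly like this (the dpkg --configure -a step is my guess at how the half-finished kernel package gets retried; package names as on my 9.04 box):

```shell
# Remove the package whose postinst hook chokes on the custom kernel
sudo apt-get remove nvidia-common
# Let the half-configured kernel package finish installing cleanly
sudo dpkg --configure -a
# Put nvidia-common back now that the kernel package is in
sudo apt-get install nvidia-common
```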
So, what's new? Faster start-ups, at least it seems faster to me. A new notification system (for Gnome at any rate), updated apps, a 2.6.28-based kernel and an awful lot more "polish" to the desktop experience. Oh, and the splash screen is slightly different too.
For those of us using KDE we now get KDE 4.2.1 - which is a seriously nice upgrade to the KDE4 series. One bone of contention over the past couple of years is that Ubuntu's tools tended to be better integrated into Gnome than KDE - but this time round I'd say that both desktops are on an equal footing - and personally I'd say that KDE4 may even have a bit of an edge.
I've been so impressed with the latest release that I've upgraded my Toshiba Satellite Pro from Ubuntu 8.04 to 9.04. This has been, shall we say, a time-consuming exercise. Firstly I had to upgrade to 8.10 and from there to 9.04. Each upgrade took around four hours.
Upgrading to 8.10 was fine, everything worked without problems; however, upgrading to 9.04 killed knetworkmanager. The program launches and sees the local wireless network but won't connect. As the network connection still worked from Gnome I had a search around and found that, yes, knetworkmanager is stuffed on the latest KDE.
I finally got the wireless network running from the network pane of System Settings.
One further bonus is that 3D acceleration now works on the laptop's Trident Cyberblade/XP card. It isn't fast, I'll admit, but at least it now works.
Friday, 20 March 2009
DanO vs XBox 360 and 3 Red Lights Part 2
Well, it finally happened - a couple of in-game lock-ups followed by three red lights that just won't go away. As I mentioned before, Microsoft have extended the warranty for the Red Ring of Death to three years, so even though the standard warranty is over, registering the console with Microsoft (via www.xbox.com) still allowed me to request a repair.
Sending back the XBox
Depending on where you live and how old your XBox is, one of three things will now happen:
A: You'll receive the return packaging from Microsoft.
B: Microsoft will post you the UPS packaging slip.
C: You'll be emailed the link to a UPS Packaging Slip and a delivery receipt.
In my case I received the link to download the prepaid packaging slips. Once you've printed them out you'll need to find a box big enough to hold the XBox (don't use the original XBox packaging as it won't be returned) and something to keep it securely in place.
When you've packaged it up neatly (and remembered to write your name and the customer service request ID on the outside of the box) it is time to phone UPS and arrange a pick-up - or, to be more exact, a day on which they can collect it, as they refuse to be narrowed down to a time and won't collect at weekends.
You can track the package at the UPS website. From the UK it takes around two days for the package to make its way to the repair center at Frankfurt.
As well as checking the UPS tracking status, you can also check the repair status with Microsoft, again via the XBox Live website. A word of warning to the impatient (like me): there is likely to be a delay from when UPS delivers the parcel and the status changing from "Waiting for device" to "Device received". In my case it took a worrying 20 hours. I'm guessing it depends on just how busy they are.
Of course, you don't have to keep checking as Microsoft will email you once the status changes.
The next email should be when the repair has (hopefully) been completed.
A couple of days later. . .
After checking the repair status today I found that my repair status had changed from "Device Received at Service Center" to "No pending repair". . .
What!?!??
As I haven't had an email to state that the console had been repaired or replaced this is more than a bit worrying.
I decided to contact Microsoft's customer support (0800 587 1102) and see what had happened.
After negotiating their phone system and finding out that the voice recognition system isn't keen on a Northern accent ("Check status. . . I'm sorry, I didn't understand that. Check Status!. . . I'm sorry, I didn't understand that. CHECK STATUS!!!") I was transferred through to a customer support agent.
Apparently my 360 has been replaced and is being shipped back to me as we speak - along with a one-month gold card for XBox Live. Hopefully I should get an email and tracking number later on today - and, all being well, the console before the weekend.
I'd also like to mention that the customer service guy was extremely helpful and very thorough in dealing with my request - so full marks to Microsoft there.
Later that afternoon. . .
Still no email from Microsoft, but the repair status has now changed to "Device shipped to customer - Your console has been repaired or replaced. We are shipping it back to you," and there is a tracking number for UPS too.
Something else has changed too. The console has gone from being "out of warranty" to "in warranty". It is back under full warranty until June of this year.
A couple of days later. . .
UPS came to visit today, "Hi, here's your XBox." Yes!!!
So I've finally got my 360 back! Well, not my original 360 but a refurbished one. How can you tell that it is a refurb? Look at the tag on the back of the console. A console that has never been repaired should have a "manufactured date" on it. A refurb has a "serviced date" instead.
Also, the documentation that comes back with the console states that the console is a refurbished one, and that it has a different serial number to the original one - although all warranty / registration details will be automatically updated for me.
So how long did it take? From being collected by UPS on a Tuesday afternoon, I received the replacement back the following Thursday. Nine days isn't a bad turnaround in my opinion.
I've quickly tested the console out - and so far, so good. It actually sounds less like a washing machine on the spin cycle than the original 360 did, and has played the games I tried it on without any problems.
So is it worth sending your 360 back? I'll say a resounding YES to that. For God's sake, if you have any warranty left send it back before trying to mess around with the "alternative" methods.
The benefits of sending it back?
1 - You get a properly repaired 360.
2 - You get an additional three months of warranty on the repaired / replaced machine (longer if your machine was still under the regular warranty).
3 - You get one month's XBox Live Gold subscription as a "sorry" from Microsoft.
It certainly looks like Microsoft are starting to get their act together when it comes to dealing with customer issues, although there are some things to look out for.
Don't take their word for it about the confirmation emails. Although I got the one to say that they'd received the console - that was it. I had to check the website to see how things were progressing (and eventually phone customer services).
The customer services phone system stinks, but once I'd got through, the customer services guy was pretty good.
Overall - not a bad experience, although not without its problems.
Update: I've just received an email from Microsoft telling me that my XBox 360 has been shipped and should be with me within the week, the same XBox that arrived yesterday. . .
One of the things mentioned in the email was that all licenses should be automatically transferred from the old console to the new one; this would enable me to use my paid-for content when not online. All I should need to do would be to re-download the content. Unfortunately it didn't work.
There are a couple of things you can do at this point. One is to phone customer support; another is to use the license transfer tool on XBox Live. Using it is actually quite simple. When you run it for the first time you are shown how many XBoxes your content is assigned to, and all licenses are transferred to your Live account; then, when you log onto your Live account from your XBox, all licenses are transferred back to the console.
After I'd run through this I redownloaded Space Giraffe, logged out of Live and was still able to play the full version of the game.
Sunday, 15 February 2009
The Pros and Cons of second hand software
Regular readers may have already read about my newly resurrected XBox 360. Having a console (for free) is A Good Thing, but having something to play on it is even better.
The Xbox 360 can allegedly play some XBox games. Some, but not all. Unfortunately for me, you need to have at least the 20gig hard drive for this to work, and all I have is a 256meg memory card.
Also, just to make life that bit more fun, I didn't have a controller either. My USB keyboard works the 360 Dashboard, but does sod all in games. Something I've got to say about the 360 is this - the controllers for it are bloody expensive, even second hand. After a bit of shopping around I managed to snag a wired controller for £15 (a wireless one would have been around £25).
As for cheap games, there are a couple of places where you can get cheap software. Aside from your friendly neighbourhood back-street games shop, there are a couple of places on the high street that specialise in second hand stuff.
Game (formerly Electronic Boutique) sells a mix of new and second hand gear. This was my first stop to get something to test my 360 with. After a brief hunt I walked away with a copy of Blue Dragon for £4.
As it turns out the game is, essentially, Final Fantasy 7 with Dragonball Z artwork. Spread over three DVDs it is a really nice, well paced little RPG, I'm actually surprised at just how good it is.
Of course, one game is never enough, so for our next jaunt I headed over to CeX. A bit unusual this lot, as all of their stock is second hand. You can also drop in your old DVDs, consoles or games and either get back cash, or use them against purchases in store (you get back more for your stuff this way).
I came out with Project Gotham 3 for a fiver, and the XBox 360 Live Arcade compilation for £4. Sadly Project Gotham 3 didn't work, but the Live Arcade disk did, and apart from me playing Uno into the small wee hours on XBox Live, the rest of the family were hooked on Boom Boom Rocket.
Trying to return Project Gotham was a bit, well, strange. I still had the receipt for the game so I didn't foresee any difficulties in taking it back. As it turns out the game was physically damaged, and after a fair bit of muttering, huddled employees examining the game disk and "are you SURE you bought it from us?" they replaced the game. I've not had the chance to play it yet (hopefully later on tonight) so fingers crossed.
Looking closely at the game that I've now got in my grubby hands, I've actually ended up with Project Gotham 4 in a Project Gotham 3 box. As this retails for £12 then I'm quids in - assuming that it actually plays!
I also traded in some of my old DVD box sets and used the money to buy a 20 Gig hard drive for the 360 for £35. As with Project Gotham I'll be testing that later on tonight, once I get chance to get onto the 360.
(Later)
It looks like my luck has held out - both Project Gotham 4 and the hard drive work perfectly. So, yay for that!
Thursday, 12 February 2009
XBox 360 Controller on Ubuntu 8.10
Seeing as I've now got an XBox 360 which uses a USB controller, I thought I'd give it a shot on Linux (or on Ubuntu to be exact).
Most devices these days are plug-and-play in Linux, and this controller is no exception - but. . .
It detects it as a mouse! The left stick causes the mouse pointer to move, which is less than helpful for playing games.
However, help is at hand. To get the gamepad working, open a command window and enter the following:
xinput list
This should give you a list that contains the XBox controller details:
"Microsoft X-Box 360 pad" id=5 [XExtensionPointer]
Num_buttons is 32
Num_axes is 2
Mode is Absolute
Motion_buffer is 256
Axis 0 :
Min_value is -32768
Max_value is 32767
Resolution is 10000
Axis 1 :
Min_value is -32768
Max_value is 32767
Resolution is 10000
Type the following, replacing [device] with the device ID provided above (in our example 5):
sudo xinput set-int-prop [device] 'Device Enabled' 32 0
And that's it, at least until you reboot.
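If you want to avoid reading the device ID out of the list by hand every time, the lookup can be scripted. This is just a sketch of the idea, not an official tool: it pulls the numeric `id=` value out of the "Microsoft X-Box 360 pad" line that `xinput list` prints (shown above), then feeds it to the disable command.

```shell
# Grab the line for the pad as xinput reports it (sample shown for illustration).
line='"Microsoft X-Box 360 pad" id=5 [XExtensionPointer]'

# Extract the numeric device ID from the id=N field.
id=$(printf '%s' "$line" | sed -n 's/.*id=\([0-9]*\).*/\1/p')
echo "$id"

# With the ID in hand, the disable step from above becomes:
#   sudo xinput set-int-prop "$id" 'Device Enabled' 32 0
```

In practice you would replace the sample `line=` assignment with `line=$(xinput list | grep 'X-Box 360 pad')` so it picks up whatever ID your system assigned.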
Update: For those of you on 9.04, good news! The controller works properly without any additional configuration. Nice one!
Saturday, 7 February 2009
DanO vs XBox 360 and 3 Red Lights
I've just been given a "broken" XBox 360. Basically I've been told "if you can fix it, you can have it." As my future brother-in-law has already bought a replacement this was a nice challenge.
Update: Before we go any further with this there is something worth mentioning. Microsoft have extended the warranty on ALL XBox 360s to three years for this problem (and this problem only), so if your XBox 360 is less than three years old and you have the Red Ring of Death then call Microsoft for a replacement first. Read more about this in part 2.
Powering the console on gives 3 Red Lights, otherwise known as The Red Ring of Death (or to be more exact a general hardware failure).
There are a lot of rumours going around on the causes and fixes to the Red Ring of Death, so this looked like a good opportunity to maybe put the myths to bed.
Apparently the "R.R.O.D." can be caused by the soldered connections cracking on the GPU. This can be fixed by re-soldering the GPU (if you can get hold of a micro-solder kit), replacing the X-Clamp that holds the GPU in place with one that gives a tighter fit, or by "baking" the console.
Now, I've already sent off for a replacement X-Clamp (courtesy of EBay), but in the meantime I've been having a bit of a read-up on how the repair takes place.
According to popular rumour, the baking method goes something like this:
Step 1: Remove the hard drive, video cable, memory cards, in fact everything but the power supply from the XBox.
Step 2: Power on and wait for the three lights (making sure the XBox is lying flat).
Step 3: Taking three large DRY bath towels, wrap the XBox up completely, and leave for around 25 minutes.
Step 4: Unwrap the XBox which should still be showing three red lights and power it off.
Step 5: Wait for the XBox to cool completely (30-60 minutes) and power back on.
The theory goes that this causes the XBox to get so hot that the solder melts fixing any cracked joints.
Seeing as the XBox hasn't cost me anything, I'm going to try the "baking" method.
. . .
So after following the above instructions I powered the XBox back on and. . .
Still three red lights.
Just in case it hadn't been hot enough I re-wrapped it again, making sure it was nice and snug, and this time. . .
SUCCESS!!!
One working XBox! At least it was for around three minutes before it locked up. At this point I turned it off for another half-an-hour to make sure that it had cooled completely, and then tried playing a game - after half-an-hour of playing Dead or Alive 4 it was still working perfectly.
Nice one!
It is worth mentioning a couple of things at this point. One, if you choose to do this you do so at your own risk. Two, keep an eye on the XBox, as a house fire isn't fun for anyone concerned. Three, if you have a warranty on your XBox, then for Gods sake use that rather than messing around with alternative methods.
Also, this isn't necessarily a permanent fix. Some people report having to do this every week or so.
And - just in case you do want to try this yourself. . .
The Towel Fix
NOTE: No, that isn't me in the video. Just so you know.
Another alternative fix - The Penny Fix
And, of course, the X-Clamp fix
Update 2 - Four days later. . .
So, the big question is "Is the XBox still working?"
All went swimmingly for a day or two, right up to the point that I tried to sign up for XBox Live. Half way through the sign-up process the XBox froze. Powering off and back on again brought me back to the three red lights.
Aaaaargh!
One more go with the towels (wrapped up for 30 minutes this time) and I had a working XBox 360 again. Once the XBox had cooled I went back and signed up for XBox Live, downloaded a couple of demo games, watched some movie trailers, played a couple of DVDs, set up a UPnP server on my PC, watched a couple of episodes of Family Guy, and basically messed around for a couple of hours whilst waiting for the inevitable return of the red lights.
Not one lock-up.
On Monday I bought Blue Dragon (which I highly recommend to any Final Fantasy fans out there) which, in common with most RPGs shows the game play time on the saves. To date I've had over nine hours of game play with no lock-ups or other problems.
So does that mean that this time the fix is permanent? Who knows, although probably not. Since it turns out that this XBox is still covered by Microsoft's extended warranty for the Red Light problem it will be winging its way back to them should the issue re-occur.
Update 3: 1 week after the "towel fix"
Strange but true - the XBox 360 is still going. I did have one moment of "aaaargh" when the power cable got knocked partly out, followed by the familiar three red lights, but powering off and back on again with the cable securely in place did the trick.
The system has coped with what I'd call normal usage for a console (around three hours usage a night) without any problems.
In a fit of purest optimism I've bought another couple of games, and more importantly a 20 gig hard drive, so hopefully I'll be able to play some of my old XBox games on it too.
I'll be posting some more on my first impressions of the 360 in a different post, so stay tuned.
Update - Around three weeks since the last "towel fix"
The RROD has come back, so this time I'll be sending it back under warranty rather than messing around. Want to know how this went?
Read on in Part 2
Sunday, 1 February 2009
The Best (and worst) of Cooking
The Internet is a great resource for anyone interested in cooking. There are so many different recipes out there, many of which would be unlikely to appear in any regular cook books.
Some, like Latka, have become a regular part of our cooking, others such as Loco Moco have given me a new favourite "junk food" type snack.
There are some great resources out there. Old Scrote's Real Food Cookbook is one of the best old-fashioned cook books going, with a range of UK and International recipes and cooking tips, with the added benefit of a good dollop of British humour.
So, as you can tell, I enjoy cooking, and I definitely enjoy trying out new recipes. Today's new recipe was one for Chocolate Mousse. This is a really easy one to make from scratch and proved to be a hit with the whole family (again, a big thank you to Old Scrote for that one).
Out of interest I had a bit of a search around to see what alternative recipes for Chocolate Mousse there were, and happened upon this "gem".
The crux of the above recipe is this: make up a packet of instant mousse, pour into glasses and eat.
I hate to point this out but THAT IS NOT COOKING!!!
This is one of my pet hates - convenience food that isn't convenient. Let me give you an example: Yorkshire Pudding mix, just add milk and an egg. In other words what you've got is a small packet of very expensive flour with a touch of salt. It is cheaper and just as fast to make a Yorkshire Pudding from scratch. More convenient? Not really.
As Old Scrote appears to be off-line at the moment I'll finish with his recipe for Chocolate Mousse - enjoy!
Update: Old Scrote himself has been in touch, his site is now available at http://www.oldscrote.talktalk.net/.
Chocolate Mousse
There are not many recipes for desserts in Scrote's repertoire but this one is a cracker, very easy, impressive and despite the tiny portions, devastatingly rich. You need a whisk of some sort and a bowl which you can set over a pan of boiling water (i.e. not plastic) to melt the chocolate. For four servings you need a large ½lb bar of plain chocolate (200g) and four eggs. The eggs must be very fresh.
For each person allow 1 egg & 2 ounces of ordinary plain chocolate. (Do not use cooking chocolate, it is a completely different thing.) Break the chocolate into small pieces and set to melt in a bain marie - a pyrex glass bowl over a pan of boiling water. Meanwhile separate the yolks from the whites of the eggs, into separate containers. When the chocolate has completely melted, remove it from the heat and mix into it just the egg yolks (the whites are used later). The chocolate will become stiffer and take on a 'greasy' appearance - this is correct. Cover & set aside for 15 minutes to cool a little. In the meantime beat the egg whites with a balloon whisk until eventually the mixture becomes white & stiff, so that it stands up in soft peaks. After the 15 minutes, fold the beaten egg-white a little at a time into the chocolate mixture, mixing very thoroughly.
Pour the resulting creamy chocolate mousse mixture into serving glasses (short glass tumblers or wine glasses are ideal) & put in the fridge to chill for 2 hours and set. Serve with a generous pouring of single cream. (Ruth puts a dash of Armagnac in and then adds the cream to keep the alcohol in!)
Tuesday, 27 January 2009
William the Conqueror
Hey kids! Annoy your history teacher with this fun fact: "William the Conqueror" (b. 1028, d. 1087) was known at the time as "William the Bastard" (Guillaume le Bâtard).
He got this name because he was the illegitimate son (i.e. a bastard) of "Robert the Devil", Duke of Normandy.
So you could say that the bastard did well for himself...
Friday, 23 January 2009
Fun Things to do around the house #1: Repair a washing machine
As you've probably gathered from the sidebar (assuming you've read it), I'm the proud parent of three little nippers. Their ages range from just over three months, to nearly nine years old for our eldest daughter.
Recently our eldest has decided to help out around the house - which is nice. Well, usually it is.
The other day she decided to help mummy out by filling the washing machine. OK, so she didn't sort out the colours from the whites, and God only knows how much washing powder she used, but hey, it's A Good Thing that she is willing to help.
Unfortunately that was when the washing machine started doing a pretty good impression of a machine gun, and then stopped whilst half-full of water, which my wife only discovered when she opened the washing machine's door and flooded the kitchen.
Trying to run the machine again (in a vain attempt to get it to drain the water) just resulted in a red flashing light on the front of the machine. Luckily the machine is only around six months old so it's still under warranty, and we had purchased an extra couple of years extended warranty on top of that.
Of course, where there is an upside, there is also a downside. Where the heck did we put the warranty information? Without it they won't even come to look at the machine.
Whilst in the process of searching around the house I decided to check online to see if anyone else had reported the same sort of problem with their machine and managed to get hold of the user manual which gave me the details on what the flashing light actually meant. . . .
"Unable to drain water."
Gosh, that was helpful. I KNOW that the water isn't draining, my wet feet are a testament to that fact. More useful was the troubleshooting information - check the outlet hose and the filter, and a further note on how to remove the filter.
Removing the filter was a slow process, especially as there was still water in the machine, so it was a case of loosening the filter casing, catching most of the water that gushed out in a tin tray (the only thing thin enough to slide under the filter casing), emptying that into a bucket, and repeating.
Half-an-hour and two full buckets later. . .
Once all the water had gone I was able to completely remove the filter, and out fell a five pence piece which must have been left in someone's pocket. Sensing victory, I put the filter back in, crossed my fingers, selected the "Spin" mode and turned it back on.
It worked! Sadly it still sounded like a machine gun, and also made some quite worrying grinding noises when it started up.
I waited for it to finish, eased the tray back under the machine and re-opened the filter, and this time a rather battered hair clip fell out.
Feeling slightly braver, this time I selected a "Rinse" cycle, and it worked, didn't sound like it was ready to explode, and finished without any problems.
We'll still have to find where the warranty is, just in case our eldest decides to help again (aaargh!), but hopefully she'll wait to be shown what to do next time.