Tuesday, December 28, 2010

A MythTV box for the lounge room - Part 1: getting the display to work

The first hurdle:

After a number of years of trying to get a picture on my extremely fussy lounge room TV from the computer, I finally bit the bullet and ordered a VGA to component transcoder - an Audio Authority 9A60A. This is because I am still using a Panasonic TX-76PW200A CRT TV, which has only component inputs. Also, since it is not a US model, the refresh rate is 50Hz, rather than the 60Hz that the nvidia drivers support out of the box. I mentioned in an earlier post that getting a working resolution to the TV was going to be a task, and eventually something was found. Initially I used the program Powerstrip, with my desktop system set up in the lounge. A couple of settings worked, but with pretty bad overscan. Every time I tried tweaking them a little, the display went out of range and I was greeted with a blank screen. Disheartened, I packed up the PC and contemplated putting the transcoder on ebay.

After a couple of days' break, I thought about trying again, this time with my laptop, which runs Linux and has a Mobility Radeon graphics chipset. Using the xrandr command, I was able to feed it a modeline I had found on another forum, where an owner of the same TV had managed to get a good picture after a bit of experimenting. Lo and behold, I got a picture up!

From memory, the xrandr commands used for the laptop were:

xrandr --newmode "1920x1080i_50" 78.300 1920 2464 2520 2784 1080 1102 1117 1125 interlace +hsync +vsync

That created a new mode for the display to use.

xrandr --addmode VGA-0 1920x1080i_50

This assigns the mode to a display device - in this case, the laptop's external VGA output.

xrandr --output VGA-0 --mode 1920x1080i_50

And that final command sent the 1920x1080 resolution to the VGA output - where the ubuntu desktop came up on the TV, much to my relief!

Knowing that the TV was capable of accepting the resolution, and that the transcoder was able to output the correct timings to the TV, it was back to the main box to see how the output worked from an Nvidia card.

The Nvidia binary driver unfortunately does not accept the xrandr commands I had used with the radeon driver on the laptop, so another method was needed. After some initial lack of success manually editing the xorg.conf file, I had a go with nvidia-xconfig, Nvidia's utility designed for adding things to that file. Using the command:

sudo nvidia-xconfig --mode=1920x1080i_50

added a mode to the xorg.conf file. I opened that file in a text editor and had a look at where it turned up. When I found it, I added the following line straight below it:

Modeline "1920x1080i_50" 78.300 1920 2464 2520 2784 1080 1102 1117 1125 interlace +hsync +vsync

I restarted the machine to let nvidia-settings pick up the modeline. From there it was selectable in the resolution dropdown box. It worked again!

The final step was getting it to work when the TV was the only display hooked up. To summarise, the xorg.conf file needed to be edited to remove all references to the PC monitor I had connected as well, so the TV was the only device it detected. Have a look at the xorg.conf file here, for your reference.
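In case that file isn't accessible, the relevant parts ended up looking roughly like this - a trimmed sketch, with the sync ranges and identifiers as illustrative placeholders rather than copies of my exact file:

Section "Monitor"
    Identifier "TV"
    # placeholder ranges that cover a 50Hz 1080i signal
    HorizSync 15.0 - 46.0
    VertRefresh 49.0 - 61.0
    ModeLine "1920x1080i_50" 78.300 1920 2464 2520 2784 1080 1102 1117 1125 interlace +hsync +vsync
EndSection

Section "Device"
    Identifier "nvidia0"
    Driver "nvidia"
EndSection

Section "Screen"
    Identifier "Screen0"
    Device "nvidia0"
    Monitor "TV"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1920x1080i_50"
    EndSubSection
EndSection

The real file also has the usual ServerLayout and input sections, plus whatever Option lines nvidia-xconfig generates.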

One other thing I had to do was set the PC to log in automatically - since this is to be a Home Theatre PC, there will not be a keyboard connected for typing in a password, and a login prompt isn't what you want on an HTPC anyway. This option can be found under System -> Administration -> Login Screen.

In a later entry I'll go over the setup of the actual system, and what I've learned in its first week of use.

Saturday, December 18, 2010

New version 1.0.10 of Dropbox released

Just a couple of days ago, a new version of the online file sync and backup utility Dropbox was released. It now supports selective sync, allowing only certain folders to be synchronised across different PCs, instead of everything. I've just installed it on two of my Linux machines with no ill effects. I followed the instructions shown on the Dropbox forums here, with a slight modification as shown here. I'll describe it here as well, in a way that doesn't involve using the command line (if that isn't your bag, baby):

Step 1: Delete the .dropbox-dist directory, found in your home directory. You may have to hit control+h to show the hidden files and directories in Nautilus.

Step 2: Run Dropbox, from the Applications -> Internet -> Dropbox menu

Dropbox will then start, popping up a box asking whether you would like to download the proprietary daemon. Click Yes to that (else it won't work), and it should start up.
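For the terminal-inclined, the same two steps are roughly the following, assuming the dropbox helper command from the Ubuntu package is available on your system:

rm -rf ~/.dropbox-dist
dropbox start -i

The -i switch tells it to download the proprietary daemon again before starting.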

I'll be interested to see if it has overcome the problem the previous version had on my laptop, where it didn't seem to like the encrypted home directory and would often crash on startup. I've tried one reboot so far and it came up OK, so I'll keep an eye on it.

Sunday, November 28, 2010

Getting a media centre remote working with LIRC, in Ubuntu 10.10

Recently I bought a new remote control for my PC, to replace the one that came with my TV tuner card. I picked up a Hauppauge Media Center Remote Control Kit, model 335, which has more logically labelled buttons than what I had. It looks a bit nicer, too, an important consideration as I am attempting to finally build a Home Theatre PC to go in the lounge room.

While the previous Technisat remote worked fine in Ubuntu 10.10, I had read of other people having problems getting remotes to work - namely the Media Centre-compatible ones. When I plugged in my new one, I hit the same problem.

A bit of trawling around on the internet uncovered a thread at the ever-reliable Ubuntu forums. It turns out that the newer kernel includes its own module for the media centre remote (mceusb), which clashes with the lirc_mceusb module that LIRC wants to use. When both modules are loaded, neither works.

The fix is to make sure only one of them gets loaded. The user uniden9 posted an excellent how-to in the thread linked above, describing the two available methods - disabling the old LIRC module, or disabling the newer, built-in kernel module. Based on other responses in the thread, most people had more luck with the second method - disabling the newer one. I guess the MythTV key bindings etc. work better with the old version.

I have included the relevant quote from uniden9's post here, in case there is a problem accessing the forums:

2. Blacklist the kernel source drivers and use the old dkms built drivers.
-Well you already have lirc-sources-modules install and probably had this working on 10.04 or older version of ubuntu.
A. Edit /etc/modprobe.d/blacklist.conf , create it if it doesn't exist.
Add>
blacklist mceusb
line to the file and save it.
B. Update initramfs, #sudo update-initramfs -u -k all
C. reboot
D. Reconfigure lirc if its not already working.
#sudo dpkg-reconfigure lirc
On the Remote, Select "Windows Media Center Transceiver/Remote (All)
E. Thats it.
This is the method I chose after wrestling with button repeats on option 1. I use XBMC and could not figure out how to disable Ubuntu treating the remote as a keyboard device.
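Condensed into terminal commands, that second method boils down to roughly the following (the blacklist file is created if it doesn't already exist):

echo "blacklist mceusb" | sudo tee -a /etc/modprobe.d/blacklist.conf
sudo update-initramfs -u -k all
# reboot, then if lirc still isn't working:
sudo dpkg-reconfigure lirc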

I used this method as well, and now have the media centre remote working just fine on my system. I'm now just waiting on the arrival of a VGA to component transcoder, which will hopefully allow my PC to connect to my older-style TV that has only component inputs. As an aside, although I have a graphics card with a component-out attachment, it only puts out a 1080i, 60Hz signal. My TV, being made for the Australian, PAL-based market, requires a 1080i, 50Hz input signal. Fingers are crossed that the transcoder will allow it to work - I'll be posting an update when it arrives and I can try it out.

Thursday, November 11, 2010

Getting Ubuntu installed on a laptop

I managed to score an old laptop from work for free - one that was past its use-by date, for business use at least. It's an LG S1 Express Dual, with a Core Duo (not Core 2) processor. Previously, to extend its life a little longer, I had installed 2GB of memory and a faster hard drive - that made a remarkable difference.

Being from around 2006, it had come with Windows XP pre-installed. I wanted to try a newer operating system than that, so initially I put Windows 7 on it. For anyone who says Linux is hard to find drivers for, Windows 7 was an ordeal in itself. First of all, LG has basically stopped making laptops, and no Windows 7 drivers for it were available on their site. Some Vista drivers were available, so I installed some of those. The graphics chip in this machine is an ATI Mobility Radeon X1400; going to ATI/AMD's site to find a current driver just sends you to a page saying it's the laptop maker's responsibility - they don't support it any longer.

Even things like network drivers were hard to find. I remember having to scrounge around on obscure and slightly dodgy driver sites for whatever I could get. In the end, most of it worked. The on-screen display didn't work, some of the special functions didn't work, and I got no joy from the included remote control.

One other problem was the BIOS not recognising hard drives over 137GB/128GiB in size - a bit of a setback, since I'd like to make use of all of the 160GB drive installed in it. First up, I tried dual booting with Linux, using the space at the end of the drive that the BIOS and Windows couldn't see but that was no problem for Linux. Ubuntu booted fine, but Windows refused to boot until I shrank the partitions down to fit under the 137GB limit.

Finally, I decided to just run a complete install of Ubuntu. Taking advantage of the release of version 10.10, I put in the 32-bit install disc (the Core Duo T2400 does not support 64-bit OSes) and nearly everything seemed to work pretty well. Especially networking, which did not work under Win 7.

A bit of tweaking was needed though. Sound wouldn't work. A little digging through the forums revealed the sound driver wasn't being set up correctly for this model, and the following had to be added to the end of the file /etc/modprobe.d/alsa-base.conf:

options snd-hda-intel model=lg

This makes the snd-hda-intel module load with the right model option at boot time, and sound now works.
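If you'd rather append that line from a terminal than open a text editor, something like this does the job:

echo "options snd-hda-intel model=lg" | sudo tee -a /etc/modprobe.d/alsa-base.conf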

One other problem I had was with the suspend functionality. While the system would go into suspend mode correctly, upon resuming the wired networking would refuse to work. I haven't tested wireless, as I do not have a wireless router yet. A little research revealed it was due to this bug, and that by manually unloading and reloading the network module, it would come back to life again. The commands to do that are:

sudo modprobe -r et131x
sudo modprobe et131x


Thanks to the user P4Man at ubuntuforums.org, there's a short script that automates this, as detailed in this post:

Create a script named /etc/pm/sleep.d/11ethernet, with the following contents:

#!/bin/bash
# /etc/pm/sleep.d/11ethernet - unload the network module on suspend, reload it on resume
case $1 in
    suspend)
        rmmod et131x
        ;;
    resume)
        modprobe et131x
        ;;
esac


Change the "et131x" value to whatever your network module is called - et131x is what it is on this LG laptop.
Save the file and make it executable:

sudo chmod 755 /etc/pm/sleep.d/11ethernet

Now, when the laptop goes into suspend mode, the network module gets unloaded, and upon waking up, the module gets reloaded. Networking works!

Boot up time is nice and quick with 10.10 as well - quicker than Windows 7, which itself was no slouch. A far cry from its original state, with a slow old hard drive, Windows XP and a bajillion little programs that started on boot. All in all, it's a nice machine. Now I just have to buy a battery for it that works...

Tuesday, October 26, 2010

Latest Ubuntu conkyForecast update fix

(Updated 27 October, added recommended ln command, see below)
Today's update to conkyForecast, the program that displays the current weather conditions and a four-day forecast on the desktop, came with an unexpected downside: it stopped working. A bit of searching around for others with similar problems prompted me to look at the .xsession-errors file in my home directory, where the message:

/usr/bin/conkyForecast: 3: /usr/bin/python2: not found

kept repeating. Having a look for the file /usr/bin/python2 showed that yes, that file was indeed non-existent. This was puzzling. I opened up the file /usr/bin/conkyForecast to see where it was mentioned. It is only a short file of a couple of lines, but one of them did call python2, which is meant to be a link in the /usr/bin directory pointing to python2.6.

There are two fixes for this. The first: I edited the file /usr/bin/conkyForecast and changed python2 to python. This fixed it - the forecast was displaying on screen again - but further updates to the script would overwrite the change, meaning you'd have to do it again. Therefore, I'd recommend the following method instead.

The other, recommended, way to fix it is to re-create the python2 link in /usr/bin. I just restored it from a backup made a few days ago (yay for backintime!), but you can simply create a link to python2.6 with the ln command, as below:

sudo ln -s /usr/bin/python2.6 /usr/bin/python2

As an update, a little further searching revealed this post on the Ubuntu forums. This is a known issue with Ubuntu 10.10, where there is no link to python2. ConkyForecast was updated to point to python2, in preparation for later versions of Python.

Saturday, October 16, 2010

Upgrade to Maverick Meerkat - version 10.10 worked!

I am writing this from my newly-upgraded Ubuntu 10.10 system! This is the second time I have upgraded the installation rather than starting afresh. I was a little nervous, to be honest, but backups were made before the upgrade so there was an escape plan.

There were only a few issues that were pretty easily fixed, and I may, just may, have fixed the hanging on boot problem I had with version 10.04.

One problem I had was the hard drive temperature reporting in Conky - it stopped displaying a temperature. This was because the hddtemp command could now only be run as root; Conky doesn't run with those privileges, and I didn't want it to. A quick search revealed a one-line command that allows hddtemp to be run by a regular user:

sudo chmod u+s /usr/sbin/hddtemp

After doing that, the disk temperature re-appeared and all was good.

Fixing the boot screen

This was a problem I had in the previous version, 10.04, as well as in 10.10. It is to do with running the Nvidia proprietary drivers and switching between video modes. Basically, there are some steps to try at this site, which I did. It involves setting a video mode in the boot loader. I initially tried 1680x1050 (about the best the boot-time modes supported, even though the card runs at 1920x1080 once at the desktop) and got a comically large Ubuntu logo during boot. A later change to 1280x1024 yielded a much better result.
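I won't repeat the whole guide - it has a few more steps than this - but the heart of it on a standard GRUB 2 setup is a line like the following in /etc/default/grub (1280x1024 being what worked for me), followed by regenerating the grub configuration:

# add or edit this line in /etc/default/grub
GRUB_GFXMODE=1280x1024

# then regenerate the boot menu
sudo update-grub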

So far, the boot and shutdown screens look much better, and I haven't yet had it hang on me. I am hesitant to call it truly fixed until a few more weeks have passed.

Edit January 7, 2011: If you do these changes and then switch back to the open source driver (nouveau), you have to undo the changes to be able to set the proper resolution for your screen(s).

Monday, October 11, 2010

Adding the new Ubuntu fonts to your existing installation

With the release of version 10.10 comes a new set of fonts, named Ubuntu. If you're not ready to upgrade to the new version just yet, you can install just the new fonts by downloading them here.

Just double-click the downloaded file to run the installer. You'll then have to change the fonts used under System -> Preferences -> Appearance -> Font: set the Application, Document and Desktop fonts to "Ubuntu", and the Window title font to "Ubuntu Bold".

If you're using Google Chrome, you can install this extension to use the new fonts. You will probably have to change the default fonts under the preferences menu.

Friday, October 8, 2010

My list of tweaks - what I change and add after a fresh install

With the release of version 10.10 in a few days, this post partly serves as a reminder to myself of what I have installed on my machine and how it is set up. It will probably also make me pause and reconsider doing a clean installation, when I look at the amount of configuration I would have ahead of me!

Programs not in Ubuntu repositories:

Crashplan - for backing up user data such as documents, music, photos. Not so much for system data - I use backintime for that.

Dropbox - for syncing some of my data between computers - e.g. between home and work PC.

A whole bunch of Google programs - Google Earth, Chrome, Picasa and Desktop. They probably now know more about me than I do...

conkyforecast - an add-on for Conky that supplies weather forecast info on the desktop, alongside the bunch of other system stats Conky shows that let me keep tabs on what's happening on the PC. One of the programs I consider essential. Working without it is like driving a car with no dashboard.

Photography/Image editing:

Flickr Uploader - I initially downloaded this because I stopped using F-Spot, as it was too damn buggy. When I was trying other programs, none of them had a way to upload photos to my flickr account. This utility is pretty handy, letting you tag photos individually or as a group.

DigiKam - I thought for a while before installing this, as it is a KDE program, and choosing to install it also pulls in a bunch of the KDE framework. If you are interested in keeping a lean system, maybe this isn't for you. However, I was looking for some sort of photo management and editing software, and Bibble and Adobe Lightroom are the main contenders. Bibble is a paid app, as is Lightroom, which has the added downside of being Mac and Windows only. I thought, what the hell, digiKam is free, so I'll try this one first. So far, it has worked pretty well. It isn't quite as polished as the other two, but my wallet says it is just fine.

Music:

Rubyripper - an open-source equivalent of Exact Audio Copy for Windows.

Other:

Bleachbit - does a similar job to what CCleaner does in Windows.

Virtualbox - for running virtual machines, testing other operating system installs. Good to tinker and learn things.

Deluge - a BitTorrent client. I prefer it to Transmission (which is installed by default).

hdparm - a later version, which includes a copy of the wiper script that performs garbage collection on solid state drives, marking sectors as deleted. The new version of Ubuntu should include this by default (as well as the 2.6.35 kernel, which supports the TRIM command).

Tweaks:

Since I have a solid state hard drive in my system (and I highly recommend them for anyone looking to upgrade their PC) there are a few SSD-specific tweaks you can do to get the best out of them. This site has an excellent guide. I didn't do all of them, just enabled the noop disk scheduler and mounted filesystems with the noatime flag.
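As a rough sketch of what those two tweaks look like (the device name and UUID below are placeholders, not taken from my system):

# /etc/fstab - add noatime to the SSD's mount options
UUID=xxxx  /  ext4  defaults,noatime,errors=remount-ro  0  1

# switch the I/O scheduler for the SSD (sda here) until the next reboot
echo noop | sudo tee /sys/block/sda/queue/scheduler

# or make it permanent via the kernel command line in /etc/default/grub (run sudo update-grub after)
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash elevator=noop"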

There are a number of things to be done with regards to MythTV, one of which is setting up LIRC for the remote control to work. One thing I noticed with a bit of early tinkering with a later kernel was that the remote ceased to work with that kernel - that's a concern I have with moving to the new version.

Enabling the medibuntu repositories to allow MP3 playback, DVDs and a few other restricted formats. Version 10.10 apparently has an option to enable all of these at install time, which is welcome.

That's just a few of the things I've changed; no doubt I'll think of many more, especially if I do a reinstall from scratch. I'll let you know how I go: whether I do an upgrade (quick, convenient, chance of hosing the system), a fresh install (time consuming with all the re-configuring and reinstalling, but will be a nice fresh system), or stay with 10.04 (quickest of all, but missing out on new toys). I might hold out for a little while, but I'm sure I'll succumb to the allure of new and shiny...

Tuesday, September 28, 2010

Don't forget to update Grub on new kernel updates

Today there was a new kernel update - to version 2.6.32-25. I ran the update manager, the updates installed fine, and I let it reboot. Upon restarting, I went to a terminal and checked which kernel was running: still the previous one. For some reason, after a kernel update, my system does not update the Grub menu to load the new version. Whether this is by design or just a quirk of my installation, I don't know. But let this be a heads-up for other users: the security fixes in the new kernel won't be of any use if you are still running the old one!
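A quick way to compare the running kernel against what's installed:

uname -r              # the kernel currently running
ls /boot/vmlinuz-*    # the kernels actually installed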

Fortunately, it is easily fixed by entering "sudo update-grub" at a command prompt. It will look through your system and add the new kernel to the startup menu. After a while, when the new kernel appears to be working OK, you can uninstall the previous version to save some space.

Monday, September 27, 2010

Fix for Gnome Window List not updating

Over the last couple of months, I had been running the bottom Gnome taskbar panel with the "Expand" option disabled. This meant that when no windows were open, the taskbar shrank in size, centred along the bottom edge of the screen. As more programs were opened, it would expand to fit, eventually stretching across the screen. I did this to give a little more screen real estate to the Conky system monitor display down the right hand edge of the screen.

I ran it like that for a while, but started noticing the window list was acting a bit strangely. When I switched tabs in a web browser, the window title down at the bottom of the screen would not update: it was stuck on a previous page. At first I thought it was a Google Chrome issue, but the same thing happened in Firefox. I started working things out - it would update if I opened or closed another program. The extra window title being added to or removed from the taskbar would update all the others as it expanded or shrank. But this was a workaround - something was wrong.

With the new release of Ubuntu mere weeks away now, I didn't look too hard into fixes. Hopefully the new version would rectify it. Today, though, a thought just struck me - what if I turned on the "Expand" option again? I did it, and it all seems to be working fine. I can switch from tab to tab in the web browsers and the window list updates instantly.

So really, it isn't a fix, more a bug in Gnome's window list application. If you haven't purposely gone into the options to disable expand, you won't be affected - it's enabled by default in Ubuntu. Hopefully it will save some head scratching in the future.

Now my only ongoing Ubuntu issue is the occasional hang at boot or shutdown, on the Ubuntu splash screen. I still haven't found a fix for that, but I'll let you all know if I find something.

Sunday, August 29, 2010

New graphics card helps with Compiz effects and MythTV playback

Previously in my Linux box I was running an Nvidia 8600GT graphics card, along with the Nvidia proprietary driver. This driver has the VDPAU extensions, which enable some of the video decoding to be offloaded to the graphics card, giving smoother playback. Since this works on GeForce 8-series and later cards, it worked on mine.

As it turns out, it didn't work terribly well. When I enabled visual effects (via the System -> Preferences -> Appearance -> Visual effects option), all the standard Compiz effects like wobbly windows and transparency worked, but when viewing TV, either live or recorded in MythTV, playback was slightly choppy, with some tearing of the picture. Since I valued smooth video over wobbly windows, I disabled the video effects.

Looking around on some forums, I noticed people saying that their playback was OK even with Compiz enabled. I finally bit the bullet and got a quicker video card (a couple of recent game purchases had nothing to do with it, honest...). I was concerned about the power usage of a quicker card, so I kept an eye out for power consumption figures. Eventually I came across the excellent list of specs at TechARP, where the Nvidia GTS250 was listed. It showed a fairly high maximum usage of 145 watts, but also some good performance figures compared with the 8600GT I had. A few reviews showed that a new revision of the board had recently come out - a "green" version. With slightly lower GPU and memory clocks, it also needed only one additional power connector, compared with the two on previous versions, and its power usage was reduced.

I managed to find an Inno3D 1GB GTS250 Green at a pretty good price, plugged it into the PC and fired it up, with one eye on the watt meter the PC is plugged into. To my surprise, power usage was about the same as, if not less than, with the 8600GT. The shrink in process technology really made a difference. The only downside is the fan noise - it's easily the loudest component in the PC now, and of course louder than the previous card, which was a passively-cooled model from Gigabyte.

Best of all, I booted into linux, enabled the Compiz visual effects, and fired up MythTV to watch some video. It played back beautifully. I think that the previous card was only marginal in performance, and with its memory being fairly slow DDR2, didn't have quite enough grunt to run MythTV with Compiz. As mentioned earlier, other people have reported success with a 7600GT and a 9500m GS, and another person had success running an 8800GT, with two monitors at 1920x1080.

I believe when they were released, the 8-series cards drew some flak for using more power but not having much more performance than the previous 7-series. Maybe there is some truth to that. This new card has power to spare, with the side benefit of good gaming performance. For a HTPC, I'd look into a different, quieter cooler for it - something like an Arctic Cooling Accelero Twin Turbo Pro.

I should add that I was tempted to buy an ATI card - the 5 series cards are quite power efficient with better performance than what Nvidia has to offer, and if I was running a straight Windows system I'd have no hesitation in picking one up. Unfortunately, ATI's drivers for linux are not as well developed, particularly their equivalent to the VDPAU functionality, so I was scared off buying one for that reason.

Saturday, July 24, 2010

Something I just learned about in Gnome

Just today I learned about a feature in the Gnome desktop environment that is the Ubuntu default. I absent-mindedly middle-clicked the title bar of a window and found it sent the window to the back, behind any other windows. It's the opposite of left-clicking the title bar, which brings the window to the front - the part everyone knows about.

You learn something every day.

Thursday, July 1, 2010

Should you use the XFS file system?

One of the good things, and also one of the confusing things about Linux, is the amount of choice you have. Even down to what sort of file system you want to use for storing your data. By default, the Ubuntu installer will use ext4, a good, general purpose file system that will suit most people's needs just fine.

There are other options though, depending on your usage patterns. If you have a lot of large files stored, such as lots of video or movies, maybe even if you record TV with MythTV like I do, a good alternative is XFS. Its design means it is particularly good at handling large files, such as High Definition TV recording, giving good performance in reading and writing them.

You can specify a partition to be in XFS format at the time of installation, or later on by using GParted. Once it is created, there are some further tweaks you can do to make its performance better, and also to maintain its performance over time.

1. Add the following options to the fstab entry for the partition (located in /etc/fstab):

UUID=xxxx /var/lib/mythtv xfs defaults,noatime,allocsize=512m,logbufs=8 0 2

noatime stops the file system recording a timestamp each time a file is accessed. Normally this information is not needed, and the option reduces the number of writes done to disk, speeding things up.

allocsize=512m lets XFS set aside 512 megabytes of space at a time when it writes. This reduces the chance of the file being split up into many small chunks, or fragmenting. Especially good for recording TV.

logbufs=8 refers to the log buffers held in RAM. Values can be from 2 to 8 buffers, which are 32K in size. Since most PCs have RAM to spare these days, there is no problem maxing this out.

Sometimes a bit of periodic maintenance is needed with XFS, and this can make people used to NTFS on Windows feel a little more at home: you sometimes have to defragment the file system, and there is a utility that can do it. It isn't installed by default - you have to install the xfsdump package to get the following commands.

To check the fragmentation level of a drive, for example located at /dev/sda6:

sudo xfs_db -c frag -r /dev/sda6

The result will look something like so:

actual 51270, ideal 174, fragmentation factor 99.66%

That is an actual result I got from the first time I installed these utilities, previously having no knowledge of XFS maintenance. Pretty nasty. Basically, the 174 files on the partition were spread over 51270 separate pieces. To defragment, run the following command:

sudo xfs_fsr -v /dev/sda6

Let it run for a while; the -v option makes it show its progress. After it finishes, try checking the fragmentation level again:

sudo xfs_db -c frag -r /dev/sda6

actual 176, ideal 174, fragmentation factor 1.14%

Much better! I have the xfs_fsr command scheduled to run daily with a cron job, at a time when it is unlikely to be recording, to keep things in good shape.
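For example, an entry in root's crontab (sudo crontab -e) along these lines would run it at 4:30 each morning and stop it after an hour - the time, duration and device here are just illustrative:

30 4 * * * /usr/sbin/xfs_fsr -t 3600 /dev/sda6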

More in depth information on XFS can be found at the MythTV wiki. Some more detailed performance optimisation info can be found here and here.

Tuesday, June 29, 2010

Where to go for help with Ubuntu

OK. So you've downloaded the latest version of Ubuntu, or even scored yourself an installer CD. Something doesn't go quite right with the install - where do you go next?

You may be able to still use Ubuntu as a live CD, meaning you could use it to access the internet. Or, if you were thinking of trying a dual-boot installation, you could still get on to the net from your existing install.

The first place I look is the Ubuntu forums. They are quite popular, with loads of visitors. Whatever problem you're having, someone is bound to have had something similar. These things are invaluable. As a matter of fact, even though I've visited them for years, and found heaps of solutions, I'm still yet to make a post myself. I just haven't needed to. The same thing happened with the Gentoo Linux forums, back when I used to run that.

As a matter of fact, if you can't find an answer in the Ubuntu forums, try some different ones - like the Debian User Forums. Since Ubuntu is based on Debian, there is a lot of similarity.

Another community I've been a long-time member of is Overclockers Australia. Their forums also provide a wealth of information - the Other Operating Systems forum in particular comes in quite handy.

Back on the Ubuntu site, you could try the Official Ubuntu Documentation. Sometimes though, it can be a bit lacking in detail, or it can be a little dated. This is why I suggest visiting the forums first.

Then there's always Google. Which could be how you found this page...

Another option is to download the free ebook, Getting Started with Ubuntu 10.04, by the Ubuntu Manual Project. Lots of info there in setting it up.

Monday, June 28, 2010

Ubuntu 10.04 progress update... update

Well, I should have known I'd jinx myself when I mentioned in my last post that the strange hanging-at-boot problem had solved itself. Just this morning, the day after I posted, the thing hung at the boot splash screen. A quick control-alt-delete got it rebooting, and it all started fine after that. Looking in the log file viewer revealed nothing out of the ordinary; checking the Ubuntu forums turned up nearly nothing, apart from the fact that it happens to other people too.

I say nearly nothing, because I did learn one thing. If you hit escape while in the boot splash screen, the pretty graphics go away and the boot messages come up. I now want the machine to have problems at boot, so I can try this and see if anything interesting appears. Yeah, reverse psychology. I want you to fail at boot, all the time!

(Originally posted June 28, 2010 on my other blog)

Ubuntu 10.04 progress update, and a bit about Crashplan

Well it has been a couple of months now, using Ubuntu 10.04 Lucid Lynx. Overall, I have been pretty happy with it. Boot times are very quick, especially with it installed on the Intel X25-V solid state drive. The interface is quite polished and looks good. MythTV is working nicely as well. I did have a spell a little while back where it would hang during boot or shutdown, requiring a restart. Either that, or there would be a quite long delay in the boot process. I tried to work out what was causing the holdup by installing bootchart (available in the repositories) and looking at the results.

Wouldn't you know it, but after I installed it and rebooted, the system worked fine. Everything has been back to normal. Don't know what was wrong or what fixed it. Oh well, I'll take it.

I have a sneaking suspicion that it might be due to doing an upgrade install rather than doing a fresh install - I'll wait and see if it keeps behaving itself.

Backing up the various systems around the house is something I have been looking into lately, too. I have purchased a new external hard drive to replace my existing one, which is getting a bit long in the tooth. Following on from discussions on the Overclockers Australia forums, I have been trialling Crashplan, a backup service that allows you to back up to local folders or attached drives, other computers on a network, other people's PCs, or their own online backup service. It is more concerned with backing up data rather than complete system images, so that's what I use it for.

It has enabled me to get around some of Windows 7 Home Premium's limitations about backing up to a network. The setup I have now for my wife's Windows 7 PC is that the standard windows backup runs, sending its data to a small external drive attached to it. Supplementing that is Crashplan, that backs up documents, photos, music and the like to the new external drive attached to my PC, giving a bit of redundancy.

My own PC has a regular backup scheduled via backintime to get most of the system data, and Crashplan to back up the music, photos, documents and the like.

On all machines, these are scheduled to run daily with the exception of the windows system backup that is set for weekly running.

I haven't yet decided whether to take the plunge and purchase the Pro version of Crashplan, or to enable the online backup component. I'm just waiting to see how the software behaves itself - I have overcome one particularly nasty teething problem.

Sometimes when my Linux machine boots, if the external drive hasn't mounted in time, Crashplan would decide to re-create the backup directory on the system partition and try to back up ~140GB of data to it. I've fortunately managed to catch it in the act and stop it before anything nasty happened, but it made me hesitant to make the purchase. I am not the only one who has had the problem, as this support forum thread describes.

I tried a couple of fixes. One was to change the program setting to only allow it to run between certain times, so the machine would be on and the drive mounted before it starts. This worked well until a few days later, when I came home late from work and booted the PC - the backup to the system partition happened again. I then tried editing the startup scripts, adding a "sleep 20" command to delay the program from starting until the drive was mounted. I thought this was the solution, until one day I turned the PC on and walked away to do something else. I came back, logged in, and realised the drive wasn't being mounted until after I logged in - and Crashplan had started as soon as the 20-second delay ran out, which turned out to be before my login.

The final solution, and I think this has nailed it, was to add an entry for the external hard drive in the /etc/fstab file. This mounts the drive straight after the other, internal drives, and well before the Crashplan engine starts. Since the external drive is a semi-permanent attachment to the PC, it works for me.
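The entry itself is nothing special - a sketch, assuming an ext3-formatted drive mounted at /media/backup, with the UUID as a placeholder:

UUID=xxxx  /media/backup  ext3  defaults  0  2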

I may yet start using the online backup - the only issue now is the matter of uploading nearly 200GB total over a 512 kilobit upload link, with uploads counted towards my data allowance. This would have to be spread out over a couple of months of uploading at a throttled speed.

Overall, I'm pretty happy with Crashplan so far. It supports Windows, OSX and Linux, which is rare. The clients all find and connect to each other with a minimum of hassle as well. It would be good having some peace of mind that all our photos and documents would survive if the house burnt down or something.

(Originally posted June 26, 2010 on my other blog)

Most painless Ubuntu upgrade ever

Wow. That's all I can say. Today I decided to upgrade to Ubuntu 10.04, but rather than doing a clean install like I have done with previous versions, I thought I'd have a go at the upgrade. To prepare, I copied the /boot, root and /home partitions of the existing 9.10 install to some free space on another hard drive. Just as a precaution in case it all went pear-shaped. I ran the update manager, chose the upgrade option, and away it went - downloading all the new packages, applying the updates. Only a couple of dialog boxes popped up asking about Grub, and what it should do with it.

The process finished without incident, I rebooted, and surprisingly, it started, allowed me to log in, and it got to the desktop! All previous programs and settings there, all programs updated. I am pleasantly surprised - even MythTV updated without a hitch, for the first time. I'm sure there will be the odd hiccup along the way that I'll discover in time, but so far so good.

Oh yes, regarding Windows 7 - I've removed it from my machine. Partly because I was barely using it, apart from watching full-screen Flash videos in decent quality, partly because I have obtained an old laptop from work, and installed 7 on that.

(Originally posted on May 2, 2010 on my other blog)

New keyboard! And more on Windows 7

This post is basically inspired solely by the fact that I have bought a new keyboard and need an excuse to do a lot of typing on it. After finally having had enough of interference, non-responsiveness and battery changes, I ditched my former wireless keyboard and mouse combo and bought a couple of new ones - a Logitech M500 corded mouse and an Illuminated keyboard. It has a nice, laptop-style key feel with just a little more travel, with the added bonus of keys that light up for use in a dark room.

I love it. Ubuntu picked it up without a problem, all keys working just fine. Windows detected it as well, but after installing drivers for it, insisted on a reboot. Some things never change. I didn't bother installing the setpoint software that is included - no need for it in my opinion.

As a progress update on my dual-boot exploits, there have been good points and bad about windows.

The good:
  • Sleep actually works. It worked even better when I downloaded a patch that stopped it crashing on resume when the hard drive didn't wake up in time. In Ubuntu, I only ever got to a blank screen with a blinking cursor when I tried the suspend feature.


  • Full-screen flash videos work with hardware acceleration. Only 5-10% CPU utilisation, compared with almost maxing-out a core under linux. This is due to differing stages of development of Adobe's flash player.


  • General polish and feel of the desktop. You can tell a lot of work has gone into this. The help functionality is excellent as well - far better than Ubuntu's vague documentation.


  • Backup works great - nice and straightforward, asking to also create a startup disc for system recovery. In Ubuntu, I am using backintime. While it is great for making backups, restoring from them is not so straightforward. Must look into an image based program.


  • Homegroups work great - it found the other Windows 7 PC on the network with no problems and is effortlessly sharing files between them. I can even use the printer connected to the other machine - a Canon that has stuff-all driver support under linux.


The bad:
  • While sleep works, the simpler task of powering off the screen after a set period seems more difficult for it. It doesn't always do it. Then again, a number of XP systems at my work have problems with that as well - and they are factory Dell and HP boxen.


  • The media centre application isn't quite up there with MythTV, one of linux's killer apps, in my opinion. Guide data is only available from the networks' broadcast guide, and I can't find a way to get it to set up two tuners. Actually, finding any documentation on it is somewhat difficult.


  • All the rebooting needed. It's still there, and it gets old pretty quick.


I think I'll be dual-booting for a while longer, especially since MythTV will not be replaced any time soon. It has been interesting though, and I am learning the interface and all. I have noticed a lot of fixes since the Release Candidate that I was running last year on the other PC.

(Originally posted on January 29, 2010 on my other blog)

My journey to the Dark Side is complete

Well, not complete. But I did buy the Windows 7 three-user family pack for $209. I'll be installing it on the wife's PC today. I also put it on as a dual boot on my Ubuntu system, for a laugh. I managed to get it to activate, using method #2 on this page, from Paul Thurrott's site. And after Windows thoughtlessly overwrote the MBR on my drive, I even got my grub 2 menu back, following this excellent guide on the Ubuntu forums.

One other thing I had to do on the Ubuntu side of things was to set the system clock to local time, not UTC. I edited the file /etc/default/rcS and changed the value UTC=yes to UTC=no. They seem to be playing OK together now. It's just a shame that Windows can't see the data on my ext4/ext3/XFS partitions. I don't know if I can be bothered putting my data on an NTFS partition yet, so it can be seen by both systems. I'll see.
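For reference, that change is a one-liner from a terminal if you prefer - it just swaps the value in /etc/default/rcS:

sudo sed -i 's/UTC=yes/UTC=no/' /etc/default/rcS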

(Originally posted January 7, 2010 on my other blog)

The Dark Side is looming...

Well. I'm having a crisis of faith. Looking at going back to the dark side. Windows 7 Home Premium, 3 user pack, for ~$210 depending on where you shop. With the Release Candidate on the wife's PC due to expire in the next few months, I have to either put XP back on it or move to Windows 7. The 3 user pack is a good deal - normally it's about $160 for a single copy - but the offer is going to be very short-lived. It only ran for a couple of months even in the USA. So there's a sense of urgency to this.

My wife couldn't care less what OS is running (as long as it's Windows), so in her view there'd be no problem with putting XP back on. Realistically, I just need to get a decent backup program for it, because the Win7 RC one is pretty good, while the standard XP one is rubbish by comparison. The only issue with the RC's backup is that while it is scheduled to run every Saturday at 3am, it seems to insist that the next scheduled time is December 30, 1899. Back in the days of steam-powered computers. I assume it is fixed in the release version of Windows 7.

Over the last few days and weeks, running Linux has started to wear me down a bit. It is such a battle sometimes, to do stuff that would be simple in Windows or even OSX. Yesterday I wanted to install Google SketchUp - no native Linux client, so I installed it through Wine. I go to run it, and it crashes straight away. Then it's off trawling forums and web searching for fixes, workarounds, all of that. Then there's Flash player. Still buggy, still sound lagging behind so speech is out of sync. There's always more work, a lot of the time. F-Spot is a piece of crap, crashing all the time. Picasa is a half-arsed effort - just a Windows program running under Wine, really. It just doesn't look right.

But still, there are aspects of Linux that I love. MythTV is the killer app, in my opinion. There is nothing in the proprietary software world that is so capable and unrestricted, with nothing locked down by media companies. I would miss that the most if I went back to Windows. Being able to scroll anywhere without first clicking in the window is something I wish Windows had, when I'm using it at work.

Anyway, I'll keep thinking about it, and pick it up on Boxing Day if I decide to. Otherwise, I could put that money towards hardware. I need a new screen, which I'll get regardless, but that $200 could go towards a solid state hard drive. Mmmm, fast.

(Originally posted December 25, 2009 on my other blog)

Desktop OS in 64k - GEOS

Over the last weekend I had a sudden attack of nostalgia for the Commodore 64. I have fond memories of the thing, having had one as my first computer. I remembered how there was an operating system available for it that could show a graphical desktop, with icons and all, a WYSIWYG word processor, even a spreadsheet, in only 64 kilobytes of memory. It was amazingly tightly coded, and such a difference from today's software that needs tens of megabytes to show a pissy icon in a system tray, for instance.

This operating system was GEOS, standing for Graphical Environment Operating System. An excellent article about it is here. I felt like having a bit of a tinker with it and reliving some old memories. One problem, though: I no longer had a C64 to load it on. However, emulators came to the rescue. For Linux, I installed VICE, which is in the Ubuntu repositories. To make it work though, you have to download the version from viceteam.org, which contains the ROM files that are not included in the Ubuntu version. There are two ways of putting the ROM files in.

First, the way I tried, was to unzip the version 1.22 file from viceteam and copy the ROM files into the installation directories (in Ubuntu, that is /usr/lib/vice/). The ROM files are the ones without file extensions that live in the data directory in the zip file. They are named 'kernal', 'basic', 'chargen', and so on, and sit in different subdirectories for the different Commodore models the program emulates. Once VICE is installed and the ROM files are in the right spots, you can start the C64 emulator by typing 'x64' at the command prompt. If all goes well, you will see the C64 screen come up in a window.
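Roughly what that copy step looks like from a terminal, assuming the viceteam zip was extracted to ~/vice-1.22 (exact paths may differ between versions, and this copies the other data files alongside the ROMs, which does no harm):

sudo apt-get install vice
# copy the ROM images (kernal, basic, chargen, etc.) over the installed data files
sudo cp -r ~/vice-1.22/data/* /usr/lib/vice/
x64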

The second way, which I did not try, is to follow the instructions here for compiling the latest version of VICE.

Once you have the emulator running, you can download a copy of GEOS. Put the *.D64 files in a directory somewhere, and in VICE choose the File -> Attach a disk image -> Unit #8 menu. Point it to the GEOS64.D64 file, and it should start loading. You can then marvel at the mid-1980s computing experience. It is similar to the first Mac desktop, but was available for a fraction of the price.

There are other productive things you can do with the Commodore 64 emulator, too...

(Originally posted November 20, 2007 on my other blog)

The little PC that could - resurrecting an old PC as a file server

One of the little PC related projects I recently undertook was to do something about a backup strategy. I wanted something I didn't have to think about much; something that would perform backups basically unattended. That pretty much ruled out burning stuff to DVDs - too tedious. I did that previously and ended up backing things up about twice a year. The other issue is that I have two PCs to back up - my main one running Ubuntu Linux, and Mrs Bort's machine, which dual boots Windows XP and Gentoo Linux. The second machine also has no DVD drive. The answer came down to three options:

1. A tape drive. Expensive, and I'd also need a SCSI adapter card to put in the machine, plus a bunch of tapes. While capacity would probably be OK, I'd have to get the drive up and running in Linux, and if something were to happen to the machine with the tape drive (fire, for instance) I'd have no way to get the data back without buying a new drive. I never really considered it as an option.
2. An external hard drive. I already have an external box, but I think the power connector is a bit flaky - last time I used it the disk would spin up, down, then up again. Wouldn't really trust it. I'd have to buy a new disk, something big enough to hold what's on two 250GB drives (one in each PC). I then thought, well, if I have to buy a new disk, I could put it in...
3. A Linux file server, built from an old PC I had lying around. This is the approach I took, described below.

For sentimental reasons mainly, I still had the case that my first PC came in, a beige Arrow minitower. It was bought from the now defunct Pacific Microlab computers in 1992, with a 386DX-25 chip and a whopping 2 megabytes of RAM. A couple of upgrades later, it ended up with a Gigabyte GA-586TX2 motherboard and a Pentium 233MMX. This is what I decided to base the new file server around. Previously it had a spell as a firewall and router, sharing a dialup connection running the excellent IPCop. When I moved to cable internet and a hardware firewall that could do the same job using only about 7 watts, it was retired.

It is a bit poky nowadays, but with a Voodoo2 card in it I used to have hours of fun playing Quake, Need for Speed 3 and Powerslide on it, with quite good framerates. I figured if it could do that, it could probably handle life as a bit bucket, receiving files across a network.

The first task was to buy a hard drive. Since there's about 500GB worth of possible storage across both PCs (not that it would get filled - most is just duplicated as another form of backup), I splurged on a 500GB drive. So a Seagate IDE 500GB drive with 16MB cache was purchased. Since 500GB is way beyond the 8GB the onboard disk controller could read, and the drive's ATA100 interface is far faster than the onboard controller's ATA33 maximum, I used an add-in IDE card capable of ATA133 speeds. Fortunately I happened to have one of those lying around. I also bought a 100 megabit network card to replace the 10 megabit one in it. Since everyone wants wireless networks these days, a wired network card was picked up for under ten dollars - bargain! And with better speed, more reliability and fewer security hassles than wireless as well!

The next task was finding an operating system that would detect the IDE card and be able to boot from it. I had initially installed Damn Small Linux on it for a laugh when it had a 40GB drive (an IBM Deskstar that hadn't died, actually) and intended to use that or a fresh install of Debian. Unfortunately, that used an older version of the Linux kernel which didn't yet have support for the IDE card. I considered Ubuntu Server, but thought it might be a bit heavy on resources - it was going to run in only 64MB, after all.

In my search for a Linux distro that was light on resources and had a recent kernel, I thought, what about Gentoo? Since it is a source-based distribution, any updates have to be compiled, but I could afford to just leave the thing running overnight to let them finish. Plus, I already had a few years' experience running it on the other PC.

So I downloaded an install CD, printed out the current installation guide, put it in the CD drive of the new (old) box and... it booted! And it detected the IDE card. The hard drive was detected at its full 500GB, so there were no hardware issues. A few hours later, I was into a working Gentoo install. Since there was no graphical desktop environment installed, it felt nice and quick.

One cool thing you can do if you have a few machines is install a program called distcc, which uses the faster machines to offload the compiling from the old, slow one. This would be great to get running, so I tried setting it up - first on the Core 2 Duo machine, but despite following a guide I had no luck there. Next was the Athlon XP machine. Sure, it isn't that quick compared to the E6300, but it is lightning compared to a Pentium 233. After quite a few hours of editing config files, environment variables and swearing, I think it is at a stage where the little P233 hands compile work to the Athlon, saving a heap of time.

So now I have around 450-ish gigabytes of storage, with a backup scheduled on my Ubuntu box (a full backup weekly, incremental backups daily) going to the backup server. Since I also now have two Gentoo boxen, the server is set up to grab the latest list of updates (emerge --sync) every night, and the Athlon box now points to that for its update list.

Just a few things need to be taken care of now - the little fan on the Pentium is pretty loud. I'd like to use a larger, slower fan to cool the CPU and also direct a bit of breeze onto the hard drive. Getting the disk to power down when it is not needed would be good too, in the interests of the drive's lifespan. Once that is sorted I'll be getting an extension cord and a long network cable, then putting the thing in the cupboard. As long as it doesn't get too hot in there, that should also keep the noise down a bit.

Hopefully it will chug along for another few years!



(Originally posted April 8, 2007 on my other blog)

Fixing segmentation faults in Ubuntu's apt-get

I recently had an issue with my backup server, which runs Xubuntu, in that it would refuse to update any programs installed on it. Whenever I tried to run apt-get update or apt-get upgrade, it would just return a segmentation fault. Actually, the exact thing it would return was "Segmentation faultsts... 0%", as if the error had overwritten whatever was meant to be there.

I did a bit of a search and found a fix that worked for me, and was pretty simple. There are two files that apt creates which had somehow become corrupted. On my system they are called pkgcache.bin and srcpkgcache.bin, and they live in /var/cache/apt. I tried moving them to another directory and re-running apt-get update, and what do you know, it worked! I later deleted them, as there was no need for the old copies.
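If you'd rather move them out of the way first rather than delete them outright, the commands are roughly (the destination directory is arbitrary):

mkdir ~/apt-cache-backup
sudo mv /var/cache/apt/*.bin ~/apt-cache-backup/
sudo apt-get update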

For those who don't want to move the files before deleting them (I recommend that you at least copy them somewhere else, just in case), the command is:

sudo rm -i /var/cache/apt/*.bin

The command should ask you about removing two files - it did in my case. I put the "-i" switch in the rm command so that it prompts before deleting anything.

(Originally posted September 12, 2009 on my other blog)

Introduction

Hi all, my name is Steve Taylor. I've been using various distributions of the Linux operating system for around ten years now. From some tentative first steps with Red Hat on an old, spare machine (I was too scared to put it on my main box in case I wiped everything accidentally), to running a dual boot system with Windows and Gentoo Linux, to the present day where I mainly use Ubuntu. I have a more general blog at bort.blogsome.com, which contains a bit of everything. This blog is going to concentrate mainly on Linux tips, mostly to do with Ubuntu, and general PC related topics. Some of the old articles from there will be copied over here, just to kick things off.