Using Polar M460 on GNU/Linux

In short: Use Garmin if you can afford it.

The problem with Polar devices is that they are not only proprietary themselves, you also can’t operate them without additional proprietary software. If you don’t like the default Polar M460 sport profiles and screens (and no, you won’t like them), you must find someone with a Windows (or Apple) computer. You must install the Polar software on that computer and use it twice: first to register the device at Polar and upgrade its firmware, and then, after you prepare your setup in the Polar Web application, to change the device settings.

The good news is that after that you won’t need Windows anymore, unless you want to change device profiles and screens or upgrade the firmware again. But you’ll still want to retrieve data from your device regularly. You can do that using the Windows application, using the proprietary Polar Android application, or (fortunately!) even on GNU/Linux.

Thanks to the authors of bipolar and v800_downloader it’s possible to download and convert data from some Polar devices. It’s also the only way to get the data without sharing it with Polar. To download and convert data from a Polar M460 you need a slightly modified version of v800_downloader (v800_downloader is unfortunately abandoned, so the changes can’t be integrated into the original version). It works for me and I can get the complete recorded data: time, GPS position, GPS altitude, barometric altitude, temperature, distance, speed, cadence, heart rate, heart rate variability, and some summary data. There are occasional problems, such as stored sessions not being listed or downloaded data being incomplete; reconnecting and/or restarting v800_downloader helps in such cases.

To use online and other features such as syncing data to Polar Web, mobile notifications, Strava segments, or route upload, you still need the proprietary Polar Android or Windows applications (so no luck if you have a free software OS without Google Play on your mobile phone). All you can do with a Polar M460 without additional proprietary software is use its built-in display functions and download the logged data. But that’s already quite a lot, especially for those who don’t race, don’t want to upload their private data anywhere and don’t need proprietary fitness analysis functions.

Why do I think that using Garmin devices instead of Polar or other vendors’ devices might be a better idea? As far as I know, Garmin devices still expose logged data via the USB mass storage protocol and it’s possible to set up most of their functionality without external software. So there is no trouble with basic usage: Garmin at least lets you use the core device functions. You also get navigation functions and ANT+ support with most devices.

Is there still any reason to use the Polar M460 then? Well, the Polar M460 is a nice device equipped with all the common sensors, either internal or external. Until recently, only very expensive Garmin devices supported Bluetooth (which matters if you want to use devices such as the Polar OH1). You can buy a Polar M460 for a good price today; Garmin is much more expensive, and since Garmin devices are still proprietary (although with less need for additional proprietary software) you waste your money on a proprietary device. But the price difference may be erased by the new Garmin Edge 130, which looks interesting (I don’t know whether it works with GNU/Linux though).

BTW, the Polar OH1 has very similar problems. You need a Windows computer to initialize the device and to upgrade its firmware. If you are interested in using it as an independent device, there is still some chance to download the logged heart rate data from it.

I think I want that keyboard (and mouse)

The KeyMouse keyboard looks interesting. It seems to be better than any of the other ergonomic keyboards I have looked at, as it provides all of the following:

  • Everything seems to be reasonably reachable (maybe after remapping some keys).
  • It’s curved.
  • The left and right parts of the keyboard are completely separated, allowing free positioning of the hands.
  • Wireless operation.
  • It has a mouse of course (even better: two mice!), so there’s no need to leave the keyboard to use one.

There may be some drawbacks:

  • Stability of the keyboard(s) when typing.
  • Mouse jitter when typing.
  • Not possible to use on one’s lap.

KeyMouse is expensive, but it’s in the price range of other (mouseless) ergonomic keyboards, and it’s still cheaper than a good chair or a good display.

New hard drive

In recent years, adding a new hard drive to a desktop computer running GNU/Linux used to be very easy. The drive was just connected to the computer, partitioned, formatted, and it worked well. It seems it may not be that easy anymore.

I bought a new Western Digital Caviar Green hard drive some time ago. I did some googling about using green hard drives on GNU/Linux and found two surprising facts.

First, new Western Digital hard drives use larger sectors, and when partitioning the drive it’s important to align partitions properly, otherwise the performance of the drive will be poor. None of the tools I tried (fdisk, cfdisk, parted) was able to do the right thing. Maybe it’s because I’ve got an older kernel on the computer, but connecting the drive through a USB box to a laptop with a recent Linux kernel didn’t help either. Should I synchronize installing new hard drives with operating system upgrades next time? Fortunately I found a tip on how to fix the partitioning manually using fdisk.
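
The alignment rule behind that tip is easy to check by hand: a partition start is aligned when its byte offset is a multiple of the drive’s physical sector size. A minimal shell sketch (the sector numbers are the classic fdisk defaults, not values read from my drive; a recent parted can also verify this with ‘parted /dev/sdX align-check optimal 1’):

```shell
#!/bin/sh
# Check whether a partition's starting sector is aligned to the drive's
# physical sector size. Start sectors are counted in 512-byte logical
# sectors; WD "Advanced Format" drives use 4096-byte physical sectors.
is_aligned() {
    start_sector=$1      # partition start, in 512-byte units
    physical_bytes=$2    # physical sector size, e.g. 4096
    [ $(( (start_sector * 512) % physical_bytes )) -eq 0 ]
}

# The traditional fdisk default of sector 63 is misaligned on 4K drives,
# while the modern default of 2048 (a 1 MiB offset) is fine:
is_aligned 63 4096   && echo "63: aligned"   || echo "63: misaligned"
is_aligned 2048 4096 && echo "2048: aligned" || echo "2048: misaligned"
```

So when fixing partitions manually in fdisk, it’s enough to make every partition start at a sector number divisible by 8.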

Second, the Caviar Green drives park their heads after just a few seconds of inactivity. It’s not much of a problem except for the drive’s lifetime. Indeed, according to my S.M.A.R.T. reports, the expected lifetime of my new hard drive isn’t going to exceed the warranty period because of the frequent head parking. Well, it’s only a counter after all, and it mostly motivates one to perform backups regularly. The strange thing is that nobody seems to know the reason for such an aggressive parking policy in this kind of drive.
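
The counter in question can be watched with smartmontools. A sketch of the lifetime arithmetic (the cycle rate is a hypothetical figure one would derive from two smartctl readings taken some time apart, and 300,000 is the commonly quoted rated load cycle count for these drives, not a value from my drive’s datasheet; the parking timeout itself can reportedly be changed with idle3ctl from idle3-tools):

```shell
#!/bin/sh
# Read the raw load-cycle counter (needs root; drive assumed at /dev/sda):
#   smartctl -A /dev/sda | grep Load_Cycle_Count
#
# With two readings taken some time apart you can estimate how long the
# drive stays within its rated number of load cycles:
rated_cycles=300000     # commonly quoted rating, not from my datasheet
cycles_per_day=400      # hypothetical rate derived from two readings
echo "rated cycles exhausted in ~$(( rated_cycles / cycles_per_day )) days"
```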

Are AMD processors of any worth?

I looked at a price list of PC CPUs after some time and wondered what AMD Phenom is. So I looked at the AMD website. To keep the story short, I’ll limit my experience to a single FAQ entry labeled What is the AMD Phenom(TM) processor?, which well represents the overall information provided there.

The first part of the answer says:

AMD Phenom(TM) processors represent the next generation of AMD’s award winning multi-core Direct Connect Architecture with AMD64 technology enabling greater memory throughput, lower latency and ultra-fast connections to system resources including graphics processors and accelerators.

Well, so they say the new family of processors provides better performance than its predecessors. I wouldn’t expect the opposite, so nothing new to me.

The next paragraph:

Featuring true quad-core technology, AMD Phenom(TM) processors are designed to deliver unprecedented megatasking performance and highly tuneable performance platforms to meet the demanding needs of technologically savvy enthusiasts.

This paragraph is interesting because it contains the only bit of information in the whole FAQ answer, i.e. that these processors are quad-core processors. But I could read that already in the price list. As for “unprecedented megatasking performance” and “highly tuneable performance platforms”, I couldn’t find anything indicating that these describe any real features, so I suspect they are just marketing idle talk.

And finally:

AMD Phenom(TM) processors are designed for phenomenal performance and optimum energy efficiency for a growing list of demanding applications, including digital content creation, high-definition video editing, multi-threaded gaming and creative design. AMD Phenom(TM) processors are targeted toward mainstream users who crave more performance and productivity.

I see, these processors are designed so that one can work with a computer. What a surprise!

So I still don’t know what the Phenom thing is about. But I now know that either AMD are idiots or they have nothing great to say about this family of processors and try to hide that fact in meaningless blurbs. In either case I’d hesitate to buy AMD processors.

Microsoft user

I’ve been a user of Microsoft products for some time now. No, I didn’t install their operating system on my workstation of course. But I started to use their Natural Ergonomic Desktop 7000 wireless keyboard + mouse set.

When looking for an ergonomic wireless keyboard several months ago, I found that the Natural 7000 set was the only choice. Except for Microsoft, only Logitech offered split wireless keyboards, but they didn’t manage to understand where the backslash key belongs, so they were not an option for me.

Surprisingly, this new Microsoft keyboard is based on the standard keyboard layout, with the addition of some more or less useful keys. It fits my hands very well and typing on it is more comfortable than on my old Chicony ergonomic keyboard. Its adjustable tilt looks like a good idea to me; I like it. It was just necessary to get used to the keyboard, as its shape is somewhat different from my previous keyboard’s. I also had to get rid of my bad typing habits to use the keyboard well. I can’t judge the real usefulness of the extra keys because they don’t work with the latest Linux VServer kernel and I have to wait until VServer gets updated for the latest Linux changes. My only complaint about the keys so far is that the multimedia keys look very cheaply built and feel like they could fall apart any time. But Microsoft offers a three-year warranty for the set, so we’ll see.

The mouse feels very good too. I think the side-holding approach is a very good idea; it became a relief for my hand after some time of use. The mouse is nice to handle, it’s just too sensitive for my taste (is there any way to change that in X?). Some people complain the mouse is too heavy, but for me it’s completely fine. The side buttons are easy to reach for me (unlike for most people who reviewed the mouse on the internet). There is one problem with the mouse: the left and right scroll wheel movement and especially clicking the scroll wheel (the middle button) are very tough. I often have to press the middle button more than once before it actually clicks. I don’t know whether this is a feature or a problem of my particular mouse.
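
To answer my own parenthetical question: pointer acceleration can be lowered in X with xset. A one-line fragment for the X session startup file (a sketch; 1/2 and 4 are values to experiment with, not something I have actually tuned for this mouse):

```shell
# Halve the pointer acceleration, applied once the pointer moves more
# than 4 pixels at a time; "xset m default" restores the server default.
xset m 1/2 4
```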

It’s necessary to note that my hands and fingers are longer than average and that people with smaller hands may find both the keyboard and the mouse uncomfortable.

I can’t say much about the operating range of the keyboard and mouse as I use them very near the receiver. I tried to type something and handle the mouse at several places about 5 meters away from the receiver, including one thick wall in between, and it seemed to work well as long as there were no special obstacles in the way, such as a computer case or my body.

As for the power source, I put some old NiMH accumulators into both the keyboard and the mouse. They last for about 1–2 months in the keyboard and about twice as long in the mouse (note I use the keyboard much more than the mouse). Although both devices are equipped with low battery warning lights, these turned out to be useless. The first time the accumulators got exhausted, the keyboard suddenly stopped working in the middle of typing; the same happened later to the mouse. The next time, the keyboard light just blinked shortly three times and the keyboard stopped working immediately after that. Perhaps it would work better with alkaline batteries, I don’t know.

Overall I’m pretty satisfied with the set despite its minor annoyances. It’s really ergonomic and well usable and this is what matters for me.

Scanning films with flatbed and film scanners

Film scanners are often claimed to be superior to flatbed scanners for scanning 35 mm films. The harder thing is to find actual facts supporting such claims. Actually, it’s possible to find samples suggesting there is no significant difference between the scanners. And even Minolta could offer only one relevant argument for film scanner superiority on their site: better optics.

The fact is that I was sometimes dissatisfied with my cheap flatbed Epson Perfection 2480 Photo scanner. I can now compare its output with that of a dedicated film scanner (Nikon LS-40 / Coolscan IV). Indeed, there are significant differences in the results.

As for image quality I could observe the following:

  • Every film defect (scratches, dust, debris) is clearly visible in Nikon scans. Without Digital ICE the Nikon scanner would be almost unusable. The Epson is much better in this area and is thus the only option for scanning my old b&w negatives.
  • The true resolution of the Nikon scanner is clearly superior to the Epson’s, even though their nominal resolutions are almost the same (2900 versus 2400 ppi).
  • Nikon is much less prone to grain aliasing.
  • Nikon output can be used without further processing; sometimes a small amount of USM improves the image. Epson output is very soft and typically requires strong USM followed by noise reduction and additional USM to get good-looking results (still somewhat inferior to unprocessed Nikon output).
  • Epson suffers from irregular annoying stripes in monotone image areas, in the direction of scanning. This is one of the worst and completely unavoidable problems of the scanner.
  • Nikon has shallow depth of field and the scan is often sharp in the center (the default focus point) and unsharp near the film field borders. Special care is needed to reduce the effect. Epson is much better in this area.
  • The Epson’s poor film holders make the scans prone to terrible reddish artifacts near both ends of film strips.
  • There seems to be no relevant difference in the dynamic range capabilities of the scanners when scanning negatives.
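
For scale, the nominal resolutions translate into these pixel dimensions for a full 36 × 24 mm frame (a quick shell sketch using integer arithmetic only; mm are converted to inches via the factor 25.4, scaled by 10 to stay integral):

```shell
#!/bin/sh
# Pixel size of a full 35 mm frame (36 x 24 mm) at a given scan
# resolution in ppi.
frame_px() {
    ppi=$1
    echo "$(( 36 * ppi * 10 / 254 ))x$(( 24 * ppi * 10 / 254 ))"
}
frame_px 2900   # Nikon LS-40: prints 4110x2740
frame_px 2400   # Epson 2480:  prints 3401x2267
```

So on paper the difference is around 20% in each dimension; the gap in true resolving power is considerably larger.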

To summarize: while one can often get similar results from both scanners, there are situations where only the film scanner is able to produce good results. IMHO it’s really worth considering an investment in a dedicated film scanner instead of a cheap flatbed. On the other hand, the flatbed scanner may be superior when scanning imperfect films where Digital ICE can’t be used.

Besides image quality, convenience may also matter:

  • The Epson is much faster; I’d say I can scan a roll of film with it twice as fast as with the Nikon.
  • The Nikon can load film strips itself (some flatbeds can do that too). On the other hand, it’s sometimes difficult to force it to move the film field to the desired position so that it can be scanned whole, without cutting out any part of the film field area. I don’t know whether this is a problem of the scanner or of something else (driver? user?), but it’s annoying.

And finally, which low-end film scanner to buy? The cheap film scanners such as Plustek or Reflecta don’t seem to provide quality comparable to standard middle-range film scanners. A used Nikon LS-40 / Coolscan IV seems to offer a very nice quality/price ratio for an advanced amateur. Older Nikons are SCSI devices, i.e. quite inconvenient to use with contemporary personal computers. The Nikon LS-50 / Coolscan V is one of the rare middle-range film scanner models still in production. Minolta Dual scanners are cheap, but they don’t have Digital ICE (which makes them very inconvenient to use), they seem to be more prone to grain aliasing, and they are infamous for banding problems. If I understand the technology right, Nikon scanners are superior in their LED light source: it’s very reliable and there is no color interpolation (each pixel is scanned in all color channels separately). The Minolta Elite 5400 scanners look very nice, but they are more expensive and the II model is known to be prone to defects. I don’t care about Canon scanners as they are completely unsupported in SANE. As for SANE support, AFAIK only the Nikon LS-30 and LS-40 and the Minolta Dual II and III are reported to be fully supported.

HTH, although it’s all mostly a personal opinion of course.

Proprietary drivers and Linux

ATI graphics cards suck and I can’t recommend buying them. I’ve spent a significant amount of time trying to get their proprietary drivers to run on Linux, and my conclusion is that the free X.Org drivers are a dozen times better than those from ATI, even though 3D acceleration and TV-out don’t work with them on my ATI card. Effectively, my new ATI card is a 3D-incapable device without TV output.

I don’t believe NVidia is much better – I once had a particularly bad experience with an NVidia graphics card. Intel cards are well supported, but does Intel make anything other than onboard chips? We clearly lack real competition in the hardware market.

I really can’t understand the ignorance of hardware vendors. They are incapable of producing stable, well-working drivers for Linux. Well, why don’t they provide specifications for their devices then and let people make good drivers for free? What’s so secret about accessing 3D graphics acceleration or making a printer print a borderless photo?!

Will Linux and other non-proprietary operating systems still be usable without essential modern hardware features? I doubt it. But what’s the solution? The current market is turning into an unbreakable oligopoly in many areas, so the natural market mechanism doesn’t work. Open hardware would be the best solution, but it seems this is not something that could become widely accessible in the foreseeable future. Apparently there is not much else to do now than continue the reverse engineering battle. 🙁

AMD64 dual core as a power saving solution

One of the interesting features of the AMD64 X2 architecture is that its declared power consumption is no higher than that of single-core AMD64 processors running at similar frequencies. So if you perform a significant amount of tasks that can be parallelized (e.g. running a build daemon), you can choose between two different advantages of the dual core system: higher performance or reduced power consumption.

The latter should be achievable by running the CPU at half its standard frequency (using the Cool’n’Quiet technology). This way you may get a high performance system that can easily be dynamically turned into a system with about half the CPU power consumption and the performance of a single core system running at the full CPU frequency. Cool. It seems it makes perfect sense to run ‘nice make -j3’ on a dual core system.
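
On cpufreq-enabled kernels the frequency switch is one sysfs write per core, and the power claim can be sanity-checked with the usual dynamic-power model P ∝ f·V². A sketch (the frequencies and voltages are illustrative round numbers, not datasheet values):

```shell
#!/bin/sh
# Pin all cores to the lowest Cool'n'Quiet state (needs root):
#   for g in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
#       echo powersave > "$g"
#   done
#
# Sanity check of the power claim with P ~ f * V^2 (frequency in MHz,
# voltage in mV, result in arbitrary units):
power() {
    echo $(( $1 * $2 * $2 / 1000000 ))
}
full=$(power 2000 1350)     # one core, full frequency
half=$(power 1000 1100)     # one core, half frequency, lower voltage
echo "one core at full speed: $full, two cores at half speed: $(( 2 * half ))"
```

With these made-up numbers the two half-speed cores together draw less dynamic power than one full-speed core, which is the whole point of the trade-off.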

AMD64 performance

Having finally got an AMD64 system into my hands, I’m positively surprised by its performance. When I switched from a Duron 650 MHz to a Celeron 2.6 GHz a few years ago, the performance increased much less than I expected. This applied to a Sempron 2400+ (~1.6 GHz?, Socket A) as well; it ran only slightly faster than the Celeron. The increase in performance was significantly lower than the increase in the (AMD) CPU frequency. After that experience, and having read some articles about current processors and 64-bit systems, I didn’t expect too much from an upgrade to an Athlon64 X2 3800+ (2.0 GHz), even though the AMD64 users around me reported great performance with their systems.

Surprisingly, according to my first tests, GNU/Linux runs about 2–3 times faster on the Athlon64 than on the Celeron (measuring single processes). The overall user experience confirms the timings, e.g. Firefox no longer feels like a terrible CPU hog. I can think of the following reasons for the great performance:

  • PC CPU+bus technology has finally improved enough to deliver much better performance at the same CPU clock frequency.
  • The 64-bit mode is faster than the 32-bit mode on AMD.
  • 32-bit system binaries are compiled for a general i386 architecture, while the 64-bit system is optimized directly for the target architecture in Debian.

Overall, the Athlon64 at 2.0 GHz seems to be very roughly about 5 times faster than the 6-year-old Duron at 650 MHz, which means that performance increased more than the CPU frequency. Cool.

As for dual core performance, it seems to serve its purpose. Of course, don’t expect it to speed up single tasks. At best you can try to run ‘make -j3’ on ‘make’-based build systems. If you’re lucky, your compilation time can decrease to almost half the standard compilation time, but you may face two problems: 1. not everything can be built in parallel (e.g. the speed improvement of an Emacs compilation is only about 10%, since the Lisp part of the compilation is not parallelized); 2. conflicts can occur in parallel compilation (as happened to me at least twice – in Emacs and in Festival), forcing you to restart the compilation manually. Well, I don’t use Gentoo nor run a Debian builder machine, so compilation time tuning is not that important to me.
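
A small refinement over a hard-coded -j3: derive the job count from the actual number of cores (nproc is from GNU coreutils; the conventional +1 keeps a job ready while another waits on I/O). The build line itself is commented out here since it needs a Makefile to chew on:

```shell
#!/bin/sh
# Pick a parallelism level from the CPU count: one job per core plus one.
jobs=$(( $(nproc) + 1 ))
echo "would run: nice make -j$jobs"
# nice make -j"$jobs"
```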

The real benefit of the dual core architecture on a mixed desktop/server system is that background tasks run without disturbing your interactive work. You no longer need fear to run big rsync tasks (such as backup) or multimedia processing on background while doing unrelated interactive work. (This is something to be appreciated with the current state of software when one should buy a strong machine just to use a modern web browser comfortably.) So far it works well, but I must wait until the machine gets into its full service before making my final statement about saving time, patience and coffee.