Planet ALUG

January 09, 2018

Jonathan McDowell

How Virgin Media lost me as a supporter

For a long time I’ve been a supporter of Virgin Media (from a broadband perspective, though their triple play TV/Phone/Broadband offering has seemed decent too). I know they have a bad reputation amongst some people, but I’ve always found their engineers to be capable, their service in general reliable, and they can manage much faster speeds than any UK ADSL/VDSL service at cheaper prices. I’ve used their services everywhere I’ve lived that they were available, starting back in 2001 when I lived in Norwich. The customer support experience with my most recent move has been so bad that I am no longer of the opinion it is a good idea to use their service.

Part of me wonders if the customer support has got worse recently, or if I’ve just been lucky. We had a problem about 6 months ago which was clearly a loss of signal on the line (the modem failed to see anything, and I could pinpoint exactly when it happened as I have collectd monitoring things). Support were insistent they could do a reset and fix things, then said my problem was the modem and I needed a new one (I was on an original v1 hub and the v3 was the current model). I was extremely dubious but they insisted. It didn’t help, and we ended up with an engineer visit. The engineer was immediately able to say they’d been disconnecting noisy lines that should have been unused at the time my signal went down, confirmed my line had been unhooked at the cabinet and then, once it was obvious the line was noisy and would have caused problems if hooked back up, patched me into the adjacent connection next door. Great service from the engineer, but support should have been aware of work in the area and able to figure out that might have been the problem, rather than leaving me with a 4-day outage and numerous phone calls when the “resets” didn’t fix things.

Anyway. I moved house recently, and got keys before moving out of the old place, so decided to be organised and get broadband set up before moving in - there was no existing BT or Virgin line in the new place, so I knew it might take a bit longer than usual. Also it would have been useful to have a connection while getting things sorted out, so I could work while waiting in for workmen. As stated at the start I’ve been pro Virgin in the past; I had their service at the old place, and there was a CableTel (the Belfast cable company NTL acquired) access hatch at the property border, so it was clear it had had service before. So on October 31st I placed an order on their website and was able to select an installation date of November 11th (earlier dates were available but this was a Saturday and more convenient).

This all seemed fine; Virgin contacted me to let me know there was some external work that needed to be done but told me it would happen in time. This ended up scheduled for November 9th, when I happened to be present. The engineers turned up, had a look around and then told me there was an issue with access to their equipment - they needed to do a new cable pull to the house and although the ducting was all there the access hatch for the other end was blocked by some construction work happening across the road. I’d had a call about this saying they’d be granted access from November 16th, so the November 11th install date was pushed out to November 25th. Unfortunate, but understandable. The engineers also told me that what would happen is the external team would get a cable hooked up to a box on the exterior of the house ready for the internal install, and that I didn’t need to be around for that.

November 25th arrived. There was no external box, so I was dubious things were actually going to go ahead, but I figured there was a chance the external + internal teams would turn up together and get it sorted. No such luck. The guy who was supposed to do the internal setup turned up, noticed the lack of an external box and informed me he couldn’t do anything without that. As I’d expected. I waited a few days to hear from Virgin and heard nothing, so I rang them and was told the installation had moved to December 6th and the external bit would be done before that - I can’t remember the exact date quoted but I rang a couple of times before the 6th and was told it would happen that day “for sure” each time.

December 5th arrives and I get an email telling me the installation has moved to December 21st. This is after the planned move date and dangerously close to Christmas - I’m aware that in the event of any more delays I’m unlikely to get service until the New Year. Lo and behold on December 7th I’m informed my install is on hold and someone will contact me within 5 working days to give me an update.

Suffice to say I do not get called. I ring towards the end of the following week and am told they are continuing to have trouble carrying out work on the access hatch. So I email the housing company doing the work across the road, asking if Virgin have actually been in touch and when the building contractors plan to give them the access they require. I get a polite response saying Virgin have been on-site but did not ask for anything to be moved or make it clear they were trying to connect a customer. And providing an email address for the appropriate person in the construction company to arrange access.

I contact Virgin to ask about this on December 20th. There’s no update but this time I manage to get someone who actually seems to want to help, rather than just telling me it’s being done today or soon. I get email confirmation that the matter is being escalated to the Area Field Manager (I’d been told this by phone on December 16th as well but obviously nothing had happened), and provide the contact details for the construction company.

And then I wait. I’m aware things wind down over the Christmas period, so I’m not expecting an install before the New Year, but I did think I might at least get a call or email with an update. Nothing. My wife rang last week to finally cancel our old connection (it’s been handy to still have my backup host online and be able to go and do updates in the old place). They were aware of the fact we were trying to get a new connection and that there had been issues, but had no update, and then proceeded to charge a disconnection fee, even though Virgin state there is no disconnection fee if you move and continue with Virgin Media.

So last week I rang and cancelled the order. I got the usual story of difficulty with access and was asked to give them 48 hours to get back to me. I said no, that the customer service so far had been appalling, and to cancel anyway. Which I’m informed has now been done.

Let’s be clear on what I have issue with here. While the long delay is annoying I don’t hold Virgin entirely responsible - there is construction work going on and things slow down over Christmas (though the order was placed long enough beforehand that this really shouldn’t have impacted things). The problem is the customer service and complete lack of any indication Virgin are managing this process well internally - the fact the interior install team turned up when the exterior work hadn’t been completed is shocking! If Virgin had told me at the start (or once they’d done the first actual physical visit to the site and seen the situation) that there was going to be a delay and then been able to provide a firm date, I’d have been much more accepting. Instead, the numerous reschedules, an inability to call back and provide updates when promised and the empty assurances that exterior work will be carried out on certain dates all leave me lacking any faith in what Virgin tell me. Equally, the fact they have charged a disconnection fee when their terms state they wouldn’t is ridiculous (a complaint has been raised but at the time of writing the complaints team has, surprise, surprise, not been in contact). If they’re so poor when I’m trying to sign up as a new customer, why should I have any faith in their ability to provide decent support when I actually have their service?

It’s also useful to contrast my Virgin experience with 2 others. Firstly, EE who I used for 4G MiFi access. Worked out of the box, and when I rang to cancel (because I no longer need it) were quick and efficient about processing the cancellation and understood that I’d been pleased with the service but no longer needed it, so didn’t do a hard retention sell.

Secondly, I’ve ended up with FTTC over a BT Openreach line from a local Gamma reseller, MCL Services. I placed this order on December 8th, after Virgin had put my install on hold. At the point of order I had an install date of December 19th, but within 3 hours it was delayed until January 3rd. At this point I thought I was going to have similar issues, so decided to leave both orders open to see which managed to complete first. I double-checked with MCL on January 2nd that there’d been no updates, and they confirmed it was all still on and very unlikely to change. And, sure enough, on the morning of January 3rd a BT engineer turned up after having called to give a rough ETA. Did a look around, saw the job was bigger than expected and then, instead of fobbing me off, got the job done. Which involved needing a colleague to turn up to help, running a new cable from a pole around the corner to an adjacent property and then along the wall, and installing the master socket exactly where suited me best. In miserable weather.

What did these have in common that Virgin does not? First, communication. EE were able to sort out my cancellation easily, at a time that suited me (after 7pm, when I’d got home from work and put dinner on). MCL provided all the installation details for my FTTC after I’d ordered, and let me know about the change in date as soon as BT had informed them (it helps I can email them and actually get someone who can help, rather than having to phone and wait on hold for someone who can’t). BT turned up and discovered problems and worked out how to solve them, rather than abandoning the work - while I’ve had nothing but good experiences with Virgin’s engineers up to this point there’s something wrong if they can’t sort out access to their network in 2 months. What if I’d been an existing customer with broken service?

This is a longer post than normal, and no one probably cares, but I like to think that someone in Virgin might read it and understand where my frustrations throughout this process have come from. And perhaps improve things, though I suspect that’s expecting too much and the loss of a single customer doesn’t really mean a lot to them.

January 09, 2018 08:39 AM

January 02, 2018

Steve Engledow (stilvoid)


New year, new diff. Here's last year.

New Year's Resolutions

by Steve Engledow at January 02, 2018 10:09 AM

January 01, 2018

Jonathan McDowell

Twisted Networking with an EE MiFi

Life’s been busy for the past few months, so excuse the lack of posts. One reason for this is that I’ve moved house. Virgin were supposed to install a cable modem on November 11th, but at the time of writing they have my install on hold (more on that in the future). As a result, when we actually moved in mid-December there was no broadband available. I’d noticed Currys were doing a deal on an EE 4GEE WiFi Mini - £4.99 for the device and then £12.50/month for 20GB on a 30 day rolling contract. Seemed like a good stopgap measure even if it wasn’t going to be enough for streaming video. I was pleasantly surprised to find it supported IPv6 out of the box - all clients get a globally routed IPv6 address (though it’s firewalled so you can’t connect back in; I guess this makes sense but it would be nice to be able to allow things through). EE are also making use of DNS64 + NAT64, falling back to standard CGNAT when the devices don’t support that.

All well and good, but part of the problem in the new place is a general lack of mobile reception in some rooms (foil backed internal insulation doesn’t help). So the MiFi is at the top of the house, where it gets a couple of bars of 4G reception and sits directly above the living room and bedroom. Coverage in those rooms is fine, but the kitchen is at the back of the house through a couple of solid brick walls and the WiFi ends up extremely weak there. Additionally my Honor 7 struggles to get a 3 signal in that room (my aging Nexus 7, also on 3, does fine, so it seems more likely to be the Honor 7 at fault here). I’ve been busy with various other bits over the Christmas period, but with broadband hopefully arriving in the new year I decided it was time to sort out my UniFi to handle coverage in the kitchen.

The long term plan is cabling around the house, but that turned out to be harder than expected (chipboard flooring and existing cabling not being in conduit ruled out the easy options, so there needs to be an external run from the top to the bottom). There is a meter/boiler room which is reasonably central and thus a logical place for cable termination and an access point to live. So I mounted the UniFi there, on the wall closest to the kitchen. Now I needed to get it connected to the MiFi, which was still upstairs. Luckily I have a couple of PowerLine adaptors I was using at the old place, so those provided a network link between the locations. The only remaining problem was that the 4GEE doesn’t have ethernet. What it does have is USB, and it presents as a USB RNDIS network interface. I had a spare DGN3500 lying around, so I upgraded it to the latest LEDE, installed kmod-usb-net-rndis and usb-modeswitch and then had a usb0 network device. I bridged this with eth0.1 - I want clients to talk to the 4GEE DHCP server so they can roam between the 2 APs, and I want the IPv6 configuration to work on both APs as well. I did have to change the IP on the DGN3500 as well - it defaulted to the same address the 4GEE uses. Switching it to a different static address ensures I can still get to it when the 4GEE isn’t active and prevents conflicts.
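On LEDE this sort of bridge is normally expressed in /etc/config/network. A minimal sketch of what the stanza might look like (the interface names match the post; the addresses are placeholders, not the ones actually in use):

```
config interface 'lan'
	option type 'bridge'
	# bridge the wired LAN VLAN with the RNDIS device from the MiFi
	option ifname 'eth0.1 usb0'
	# static management address so the router stays reachable even
	# when the 4GEE is off - placeholder values, not the real ones
	option proto 'static'
	option ipaddr ''
	option netmask ''
```

The corresponding section of /etc/config/dhcp would typically set option ignore '1' for the lan interface, so that dnsmasq stays out of the way and clients take their leases (and the IPv6/DNS64 configuration) from the 4GEE directly.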

The whole thing ends up looking like the following (I fought Inkscape + Dia for a bit, but ASCII art turned out to be the easiest option):

/----------\       +-------+       +--------------+            +------------+
| Internet |--LTE--| EE 4G |--USB--|   DGN3500    |--Ethernet--| TL-PA9020P |
|          |       | MiFi  |       | LEDE 17.01.4 |            | PowerLine  |
\----------/       +-------+       +--------------+            +------------+
                       |                                              |
                      WiFi                                            |
                       |                                              |
                  +---------+                                         |
                  | Clients |                                     Ring Main
                  +---------+                                         |
                       |                                              |
                      WiFi                                            |
                       |                                              |
                  +--------+            +----------+            +------------+
                  | UniFi  |--Ethernet--|   PoE    |--Ethernet--| TL-PA9020P |
                  | AC Pro |            | Injector |            | PowerLine  |
                  +--------+            +----------+            +------------+

It feels a bit overly twisted for use with just a 4G connection, but various bits will be reusable when broadband finally arrives.

January 01, 2018 02:36 PM

December 31, 2017

Chris Lamb

Free software activities in December 2017

Here is my monthly update covering what I have been doing in the free software world in December 2017 (previous month):

Reproducible builds

Whilst anyone can inspect the source code of free software for malicious flaws, most software is distributed pre-compiled to end users.

The motivation behind the Reproducible Builds effort is to allow verification that no flaws have been introduced — either maliciously or accidentally — during this compilation process by promising identical results are always generated from a given source, thus allowing multiple third-parties to come to a consensus on whether a build was compromised.
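The core check is simple even if the infrastructure around it isn’t: independent rebuilds can only reach consensus if their artifacts are bit-for-bit identical, which a cryptographic digest makes cheap to compare. A toy sketch of the idea (not any project’s actual tooling):

```python
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 hex digest of an artifact's contents."""
    return hashlib.sha256(data).hexdigest()

# Two parties build the same source independently; if the build is
# reproducible, the artifacts they produce are bit-for-bit identical
# and therefore hash identically.
build_a = b"\x7fELF...first independent build..."
build_b = b"\x7fELF...first independent build..."
tampered = b"\x7fELF...build with an injected flaw..."

print(digest(build_a) == digest(build_b))   # True: builds agree
print(digest(build_a) == digest(tampered))  # False: divergence detected
```

Any disagreement between rebuilders is then a signal that either the build is not yet reproducible or one of the environments was compromised.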

I have generously been awarded a grant from the Core Infrastructure Initiative to fund my work in this area.

This month I:

I also made the following changes to our tooling:


diffoscope is our in-depth and content-aware diff utility that can locate and diagnose reproducibility issues.

  • Support Android ROM boot.img introspection. (#884557)
  • Handle case where a file to be "fuzzy" matched does not contain enough entropy despite being over 512 bytes. (#882981)
  • Ensure the cleanup of symlink placeholders is idempotent. [...]


trydiffoscope is a web-based version of the diffoscope in-depth and content-aware diff utility. Continued thanks to Bytemark for sponsoring the hardware.

  • Parse dpkg-parsechangelog instead of hardcoding the version. [...]
  • Flake8 the main file. [...]

My experiment into how to process, store and distribute .buildinfo files after the Debian archive software has processed them:

  • Don't HTTP 500 if no request body. [...]
  • Catch TypeError: decode() argument 1 must be string, not None tracebacks. [...]


My activities as the current Debian Project Leader will be covered in my Bits from the DPL email to the debian-devel-announce mailing list.

Patches contributed

  • bitseq: Add missing Build-Depends on python-numpy for documentation generation. (#884677)
  • dh-golang: Avoid "uninitialized value" warnings. (#885696)
  • marsshooter: Avoid source-includes-file-in-files-excluded Lintian override. (#885732)
  • gtranslator: Do not ship .pyo and .pyc files. (#884714)
  • media-player-info: Bugs field does not refer to Debian infrastructure. (#885703)
  • pydoctor: Add a Homepage field to debian/control. (#884255)

Debian LTS

This month I have been paid to work 14 hours on Debian Long Term Support (LTS). In that time I did the following:

  • "Frontdesk" duties, triaging CVEs, etc.
  • Updating old notes in data/dla-needed.txt.
  • Issued DLA 1204-1 for the evince PDF viewer to fix an arbitrary command injection vulnerability where a specially-crafted embedded DVI filename could be exploited to run commands as the current user when "printing" to PDF.
  • Issued DLA 1209-1 to fix a vulnerability in sensible-browser (a utility to start the most suitable web browser based on one's environment or configuration) where remote attackers could conduct argument-injection attacks via specially-crafted URLs.
  • Issued DLA 1210-1 for kildclient, a client for “MUD” multiplayer real-time virtual world games, to remedy a command-injection vulnerability.


  • python-django (2:2.0-1) — Release the new upstream stable release to the experimental suite.
  • redis:
    • 5:4.0.5-1 — New upstream release & use "metapackage" over "meta-package" in debian/control.
    • 5:4.0.6-1 — New upstream bugfix release.
    • 5:4.0.6-2 — Replace redis-sentinel’s main dependency with redis-tools from redis-server, moving the creation/deletion of the redis user and the associated data & log directories to redis-tools (#884321), and add stub manpages for redis-sentinel, redis-check-aof & redis-check-rdb.
    • 5:4.0.6-1~bpo9+1 — Upload to the stretch-backports repository.
  • redisearch:
    • 1.0.1-1 — New upstream release.
    • 1.0.2-1 — New upstream release, ensure the .so file is hardened (upstream patch), update upstream’s .gitignore so our changes under debian/ are visible without -f (upstream patch), and override no-upstream-changelog in all binary packages.
  • installation-birthday (6) — Bump Standards-Version to 4.1.2 and replace Priority: extra with Priority: optional.

Finally, I also made the following miscellaneous uploads:

  • cpio (2.12+dfsg-6), NMU-ing a new 2.12 upstream version to the "unstable" suite.
  • wolfssl (3.12.2+dfsg-1 & 3.13.0+dfsg-1) — Sponsoring new upstream versions.

Debian bugs filed

FTP Team

As a Debian FTP assistant I ACCEPTed 106 packages: aodh, autosuspend, binutils, btrfs-compsize, budgie-extras, caja-seahorse, condor, cross-toolchain-base-ports, dde-calendar, deepin-calculator, deepin-shortcut-viewer, dewalls, dh-dlang, django-mailman3, flask-gravatar, flask-mail, flask-migrate, flask-paranoid, flask-peewee, gcc-5-cross-ports, getmail, gitea, gitlab, golang-github-go-kit-kit, golang-github-knqyf263-go-deb-version, golang-github-knqyf263-go-rpm-version, golang-github-mwitkow-go-conntrack, golang-github-parnurzeal-gorequest, golang-github-prometheus-tsdb, haskell-unicode-transforms, haskell-unliftio-core, htslib, hyperkitty, libcbor, libcdio, libcidr, libcloudproviders, libepubgen, libgaminggear, libgitlab-api-v4-perl, libgoocanvas2-perl, libical, libical3, libixion, libjaxp1.3-java, liblog-any-adapter-tap-perl, liborcus, libosmo-netif, libt3config, libtirpc, linux-show-player, mailman-hyperkitty, mailman-suite, mailmanclient, muchsync, node-browser-stdout, node-crc32, node-deflate-js, node-get-func-name, node-ip-regex, node-json-parse-better-errors, node-katex, node-locate-path, node-uglifyjs-webpack-plugin, nq, nvidia-cuda-toolkit, openstack-meta-packages, osmo-ggsn, osmo-hlr, osmo-libasn1c, osmo-mgw, osmo-pcu, patman, peewee, postorius, pyasn1, pymediainfo, pyprind, pysmi, python-colour, python-defaults, python-django-channels, python-django-x509, python-ldap, python-quamash, python-ratelimiter, python-rebulk, python-trezor, python3-defaults, python3-stdlib-extensions, python3.6, python3.7, qscintilla2, range-v3, rawkit, remmina, reprotest, ruby-gettext-i18n-rails-js, ruby-webpack-rails, sacjava, sphinxcontrib-pecanwsme, unicode-cldr-core, wolfssl, writerperfect, xrdp & yoshimi.

I additionally filed 4 RC bugs against packages that had incomplete debian/copyright files: libtirpc, python-ldap, python-trezor & sphinxcontrib-pecanwsme.

December 31, 2017 04:17 PM

December 30, 2017

Chris Lamb

Favourite books of 2017

Whilst I managed to read just over fifty books in 2017 (down from sixty in 2016) here are ten of my favourites, in no particular order.

Disappointments this year included Doug Stanhope's This Is Not Fame, a barely coherent collection of bar stories that felt especially weak after Digging Up Mother, but I might still listen to the audiobook as I would enjoy his extemporisation on a phone book. Ready Player One left me feeling contemptuous, as did Charles Stross' The Atrocity Archives.

The worst book I finished this year was Adam Mitzner's Dead Certain, beating Dan Brown's Origin, a poor Barcelona tourist guide at best.

Year of Wonders

Geraldine Brooks

Teased by Hilary Mantel's BBC Reith Lecture appearances and not content with her short story collection, I looked to others for my fill of historical fiction whilst awaiting the final chapter in the Wolf Hall trilogy.

This book, Year of Wonders, subtitled A Novel of the Plague, is written from the point of view of Anna Frith, recounting what she and her Derbyshire village experience when they nobly quarantine themselves in order to prevent the disease from spreading further.

I found it initially difficult to get to grips with the artificially aged vocabulary — and I hate to be "that guy" — but do persist until the chapter where Anna takes over the village apothecary.

The Second World Wars

Victor Davis Hanson

If the pluralisation of "Wars" is an affectation, it certainly is an accurate one: whilst we might consider the Second World War to be a unified conflict today, Hanson reasonably points out that this is a post hoc simplification of different conflicts from the late-1910s through 1945.

Unlike most books that attempt to cover the entirety of the war, this book is organised by topic instead of chronology. For example, there are two or three adjacent chapters comparing and contrasting naval strategy before moving onto land armies, contrasting and comparing Germany’s eastern and western fronts, etc. This approach leads to a readable and surprisingly gripping book despite its lengthy 720 pages.

Particular attention is given to the interplay between the various armed services and how this tended to lead to overall strategic victory. This, as well as the economics of materiel and simple rates-of-replacement, combined with the irrationality and caprice of the Axis, would be a fair summary of the author’s general thesis — this is no Churchill, Hitler & The Unnecessary War.

Hanson is not afraid to ask “what if” questions, but only where they provide meaningful explanation or deeper rationale rather than an indulgent flight of fancy. His answers to such questions are invariably that much the same outcome would have come about.

Whilst the author is a US citizen, he does not spare his homeland from criticism, but where Hanson’s background as a classical-era historian lets him down is in contrived comparisons to the Peloponnesian War and other ancient conflicts. His Napoleonic references do not feel as forced, especially given Hitler’s own obsessions. Recommended.

Everybody Lies

Seth Stephens-Davidowitz

Vying for the role as the Freakonomics for the "Big Data" generation, Everybody Lies is essentially a compendium of counter-arguments, refuting commonly-held beliefs about the internet and society in general based on large-scale observations. For example:

Google searches reflecting anxiety—such as "anxiety symptoms" or "anxiety help"—tend to be higher in places with lower levels of education, lower median incomes and where a larger portion of the population lives in rural areas. There are higher search rates for anxiety in rural, upstate New York than in New York City.


On weekends with a popular violent movie when millions of Americans were exposed to images of men killing other men, crime dropped. Significantly.

Some methodological anecdotes are included: a correlation was once noticed between teens being adopted and the use of drugs and skipping school. Subsequent research found this correlation was explained entirely by the 20% of the self-reported adoptees not actually being adopted...

Although replete with the kind of factoids that force you to announce them out loud to anyone “lucky” enough to be in the same room as you, Everybody Lies is let down by a chronic lack of structure — a final conclusion that is so self-aware of its limitations that it readily and repeatedly admits to them is still a weak conclusion.

The Bobiverse Trilogy

Dennis Taylor

I'm really not a "science fiction" person, at least not in the sense of reading books catalogued as such, with all their indulgent meta-references and stereotypical cover art.

However, I was really taken by the conceit and execution of the Bobiverse trilogy: Robert "Bob" Johansson perishes in an automobile accident the day after agreeing to have his head cryogenically frozen upon death. 117 years later he finds that he has been installed in a computer as an artificial intelligence. He subsequently clones himself multiple times resulting in the chapters being written from various "Bob's" locations, timelines and perspectives around the galaxy.

One particular thing I liked about the books was their complete disregard for a film tie-in; Ready Player One was almost cynically written with this in mind, but the Bobiverse cheerfully handicaps itself by including Homer Simpson and other unlicensable characters.

Whilst the opening world-building book is the most immediately rewarding, the series kicks into gear after this — as the various “Bobs” unfold with differing interests (exploration, warfare, pure science, anthropology, etc.), an engrossing tapestry is woven together with a generous helping of humour and, funnily enough, believability.

Homo Deus: A Brief History of Tomorrow

Yuval Noah Harari

After a number of strong recommendations I finally read Sapiens, this book's prequel.

I was gripped, especially given its revisionist insight into various stages of Man. The idea that wheat domesticated us (and not the other way around) and how adoption of this crop led to truncated and unhealthier lifespans particularly intrigued me: we have an innate bias towards chronocentrism, so to be reminded that progress isn't a linear progression from "bad" to "better" is always useful.

The sequel, Homo Deus, continues this trend by discussing the future potential of our species. I was surprised just how humorous the book was in places. For example, here is Harari on the anthropocentric nature of religion:

You could never convince a monkey to give you a banana by promising him limitless bananas after death in monkey heaven.

Or even:

You can't settle the Greek debt crisis by inviting Greek politicians and German bankers to a fist fight or an orgy.

The chapters on AI and the inexpensive remarks about the impact of social media did not score many points with me, but I certainly preferred the latter book in that the author takes more risks with his own opinion, so it’s less dry and more thought-provoking, even if one disagrees.

La Belle Sauvage: The Book of Dust Volume One

Philip Pullman

I have extremely fond memories of reading (and re-reading, etc.) the author’s His Dark Materials trilogy as a teenager, despite being started on the second book by a “supply” English teacher.

La Belle Sauvage is a prequel to this original trilogy and the first of another trio. Ms Lyra Belacqua is present as a baby but the protagonist here is Malcolm Polstead who is very much part of the Oxford "town" rather than "gown".

Alas, Pullman didn’t make a study of Star Wars and thus relies a little too much on the existing canon, wary to add new, original features. This results in an excess of Magisterium and Mrs Coulter (a superior Dolores Umbridge, by the way), and the protagonist is a little too redolent of Will...

There is also a very out-of-character chapter where the magical rules of the novel temporarily multiply, resulting in a confusion that was almost certainly not the author’s intention. You’ll spot it when you get to it, which you should.

(I also enjoyed the slender Lyra's Oxford, essentially a short story set just a few years after The Amber Spyglass.)

Open: An Autobiography

Andre Agassi

Sporting personalities certainly exist, but they are rarely revealed by their “authors”, so upon friends’ enquiries as to what I was reading I frequently caught myself qualifying my response with «It’s a sports autobiography, but...».

It's naturally difficult to know what we can credit to Agassi or his (truly excellent) ghostwriter but this book is a real pleasure to read. This is no lost Nabokov or Proust, but the level of wordsmithing went beyond supererogatory. For example:

For a man with so many fleeting identities, it's shocking, and symbolic, that my initials are A. K. A.


I understand that there's a tax on everything in America. Now, I discover that this is the tax on success in sports: fifteen seconds of time for every fan.

Like all good books that revolve around a subject, readers do not need to know or have any real interest in the topic at hand, so even non-tennis fans will find this an engrossing read. Dark themes abound — Agassi is deeply haunted by his father, a topic I wish he went into more, but perhaps he has not done the "work" himself yet.

The Complete Short Stories

Roald Dahl

I distinctly remember reading Roald Dahl's The Wonderful Story of Henry Sugar and Six More collection of short stories as a child, some characters still etched in my mind; the 'od carrier and fingersmith of The Hitchhiker or the protagonist polishing his silver Trove in The Mildenhall Treasure.

Instead of re-reading this collection I embarked on reading his complete short stories, curious whether the rest of his œuvre was at the same level. After reading two entire volumes, I can say it mostly is — Dahl’s typical humour and descriptive style are present throughout, with only a few show-off sentences such as:

"There's a trick that nearly every writer uses of inserting at least one long obscure word into each story. This makes the reader think that the man is very wise and clever. I have a whole stack of long words stored away just for this purpose." "Where?" "In the 'word-memory' section," he said, epexegetically.

There were perhaps too many of his early, mostly-factual war tales that were lacking an interesting conceit, and I might recommend the Henry Sugar collection for the uninitiated, but I would still heartily recommend either of these two volumes, starting with the second.

Watching the English

Kate Fox

Written by a social anthropologist, this book dissects "English" behaviour for the layman, providing an insight into British humour, rites of passage, and dress and language codes, amongst other things.

A must-read for anyone who is in — or considering... — a relationship with an Englishman, it is also a curious read for the native Brit: a kind of horoscope for folks, like me, who believe they are above them.

It's not perfect: Fox tediously repeats that her "rules" or patterns are not rules in the strict sense of being observed by 100% of the population; there will always be people who do not observe them, as well as others whose defiance of a so-called "rule" only reinforces the concept. Most likely this reiteration is to sidestep wearisome criticisms, but it becomes ponderous and patronising over time.

Her general conclusions (that the English are repressed, risk-averse and, above all, hypocrites) invariably oversimplify, but taken as a series of vignettes rather than a scientifically accurate and coherent whole, the book is worth your investment.

(Ensure you locate the "revised" edition — it not only contains more content, it also proffers valuable counter-arguments to rebuttals Fox has received since the original publication.)

What Does This Button Do?

Bruce Dickinson

In this entertaining autobiography we are thankfully spared a litany of Iron Maiden gigs, successes and reproaches of the inevitable bust-ups and are instead treated to an introspective insight into just another "everyman" who could very easily be your regular drinking buddy if it weren't for a need to fulfill a relentless inner drive for... well, just about anything.

The frontman's antics as a schoolboy stand out, as do his later sojourns into Olympic fencing and commercial aviation. These latter exploits sound bizarre out of context, but despite their non-sequitur nature they make a perfect foil (hah!) to the heavy metal.

A big follower of Maiden in my teens, I fell off the wagon as I didn't care for their newer albums, so I was blindsided by Dickinson's sobering cancer diagnosis in the closing chapters. Furthermore, whilst Bruce's book fails George Orwell's test that an autobiography is only to be trusted when it reveals something disgraceful, it is tour de force enough to distract from any concerns about integrity.

(I have it on excellent authority that the audiobook, which is narrated by the author, is definitely worth one's time.)

December 30, 2017 06:56 PM

December 26, 2017

Mick Morgan

Merry Christmas 2017

I’m a couple of days late this year. I normally post on Christmas Eve, trivia’s birthday, but hey, I’ve been busy (it goes with the territory at this time of year if you are a grandparent). This year I thought I would depart from my usual topic(s) and post a couple of pictures marking the occasion. So here you go.

Last year my lady gave me a rather interesting christmas present – a Mr Potato Head, but home made.

mr potato head

Not content to leave the joke alone, this year she went slightly upmarket and gave me a Mr Pineapple Head.

mr pineapple head

I’m sure she loves me really. In fact I know that she does. She made the toadstool cake below for our daughter’s boys, and hey, she really does love those boys.


Merry Christmas to all my readers, wherever you are (and oddly enough, a lot of you appear to be in China).

by Mick at December 26, 2017 05:47 PM

October 24, 2017

Daniel Silverstone (Kinnison)

Introducing 석진 the car

For many years now, I have been driving a diesel-based VW Passat Estate. It has served me very well and been as reliable as I might have hoped, given how awful I am with cars. Sadly, Gunther was reaching the point where it was going to cost more per year to maintain than the car was worth, and I had also become more and more irked by not having a car from the future.

I spent many months doing spreadsheets, trying to convince myself I could somehow afford a Tesla of some variety. Sadly I never quite managed it. As such I set my sights on the more viable BEVs such as the Nissan Leaf. For a while, nothing I saw was something I wanted. I am quite unusual it seems, in that I don't want a car which is a "Look at me, I'm driving an electric car" fashion statement. I felt like I'd never get something which looked like a normal car, but happened to be a BEV.

Then along came the Hyundai Ioniq. Hybrid, Plug-in Hybrid, and BEV all looking basically the same, and not in-your-face-special. I began to covet. Eventually I caved and arranged a test drive of an Ioniq plug-in hybrid, because the BEV was basically on a nine-month lead time. I enjoyed the drive but was instantly very sad, because I didn't want a plug-in hybrid, I wanted a BEV. Despondent, I left the dealership and went home.

I went online and found a small number of second-hand Ioniq BEVs but as I scrolled through the list, none seemed to be of the right trim level. Then, just as I was ready to give up hope, I saw a new listing, no photo, of the right thing. One snag, it was 200 miles away. No matter, I rang the place, confirmed it was available, and agreed to sleep on the decision.

The following morning, I hadn't decided to not buy, so I called them up, put down a deposit to hold the car until I could test drive it, and then began the long and awkward process of working out how I would charge the car given I have no off-street parking so I can't charge at home. (Yeah yeah, you'd think I'd have checked that first, but no I'm just not that sensible). Over the week I convinced myself I could do it, I ordered RFID cards for various schemes, signed up with a number of services, and then, on Friday last week, I drove down to a hotel near the dealership and had a fitful night's sleep.

I rocked up to the dealership exactly as they opened for business, shook the hand of the very helpful salesman who had gone through the purchase process with me over the phone during the week, and got to see the car. Instant want coursed through me as I sat in it and decided "Yes, this somehow feels right".

I took the car for about a 45 minute test drive just to see how it felt relative to the plug-in hybrid I'd driven the week before and it was like night and day. The BEV felt so much better to drive. I was hooked. Back to the dealership and we began the paperwork. Emptying Gunther of all of the bits and bobs scattered throughout his nooks and crannies took a while and gave me a chance to say goodbye to a car which, on reflection, had actually been a pleasure to own, even when its expensive things went wrong, more than once. But once I'd closed the Passat for the last time, and handed the keys over, it was quite a bittersweet moment as the salesman drove off in what still felt like my car, despite (by this point) it not being so.

Sitting in the Ioniq though, I headed off for the 200 mile journey back home. With about 90% charge left after the test drive, I had two stops planned at rapid chargers and I headed toward my first.

Unfortunately disaster struck: the rapid (50 kW) charger refused to initialise, and I ended up with my car on the slower (7 kW) charger to get enough juice into it to drive on to the next rapid-charger-equipped service station. When I got the message that my maximum charge period (45 minutes) had elapsed, I headed back to the car to discover I couldn't persuade the cable to unlock from the car. Much hassle later, an AA man came and together we learned that it takes two to tango: one to pull the emergency release in the boot, the other to then unplug the cable.

Armed with this knowledge, I headed on my way to a rapid charger I'd found on the map which wasn't run by the same company. Vainly hoping that this would work better, I plugged the car in, set the charger going, and headed into the adjacent shop for a rest break. I went back to the car about 20 minutes later to see the charger wasn't running. Horror of horrors. I imagined maybe some nasty little oik had pressed 'stop' so I started the charger up again, and sat in the car to read my book. After about 3 minutes, the charge stopped. Turns out that charger was slightly iffy and couldn't cope with the charge current and kept emergency-stopping as a result. The lovely lady I spoke to about it directed me to a nearby (12 miles or so, easily done with the charge I had) charger in the grounds of a gorgeous chateau hotel. That one worked perfectly and I filled up. I drove on to my second planned stop and that charge went perfectly too. In fact, every charge since has gone flawlessly. So perhaps my baptism of failed charges has hardened me to the problems with owning a BEV.

I've spent the past few days trying different charge points around Manchester enjoying my free charge capability, and trying different names for the car before I finally settled on 석진 which is a reasonable Korean boy's name (since the car is Korean I even got to set that as the bluetooth ID) and it's roughly pronounced sock/gin which are two wonderful things in life.

I'm currently sat in a pub, eating a burger, enjoying myself while 석진 suckles on the teat of "free" electricity to make up for the fact that I've developed a non-trivial habit of leaving Audi drivers in the dust at traffic lights.

Further updates may happen as I play with Android Auto and other toys in an attempt to eventually be able to ask the car to please "freeze my buttocks" (a feature it has in the form of air-conditioned seats.)

by Daniel Silverstone at October 24, 2017 07:44 PM

October 14, 2017

Mick Morgan

multilingual chat

I use email fairly extensively for my public communication but I use XMPP (with suitable end-to-end encryption) for my private, personal communication. And I use my own XMPP server to facilitate this. But as I have mentioned in previous posts my family and many of my friends insist on using proprietary variants of this open standard (facebook, whatsapp etc. ad nauseam). I was thus amused to note that I am not alone in having difficulty in keeping track of “which of my contacts use which chat systems“.

XKCD cartoon about multiple chat systems

(My thanks, as ever, to Randall Munroe over at XKCD.)

I must find a client which can handle all of my messaging systems. Better yet, I’d like one which worked, and seamlessly synchronised, across my mobile devices and my linux desktop. Even better again, such a client should offer simple (i.e. easy to use) e-to-e crypto and use an open server platform which I can manage myself.

Proprietary systems suck.

by Mick at October 14, 2017 03:29 PM

October 04, 2017

Daniel Silverstone (Kinnison)

F/LOSS (in)activity, September 2017

In the interests of keeping myself "honest" regarding F/LOSS activity, here's a report; sadly, it's not a very good one.

Unfortunately, September was a poor month for me in terms of motivation and energy for F/LOSS work. I did some amount of Gitano work, merging a patch from Richard Ipsum for help text of the config command. I also submitted another patch to the STM32F103xx Rust repository, though it wasn't a particularly big thing. Otherwise I've been relatively quiet on the Rust/USB stuff and have otherwise kept away from projects.

Sometimes one needs to take a step away from things in order to recuperate and care for oneself rather than the various demands on one's time. This is something I had been feeling I needed for a while, and with a lack of motivation toward the start of the month I gave myself permission to take a short break.

Next weekend is the next Gitano developer day and I hope to pick up my activity again then, so I should have more to report for October.

by Daniel Silverstone at October 04, 2017 12:53 PM

July 10, 2017

Steve Engledow (stilvoid)

An evening of linux on the desktop

Last time, I wrote about trying a few desktop environments to see what's out there, keep things fresh, and keep me from complacency. Well, as with desktop environments, so with text editors. I decided briefly that I would try a few of the more recent code editors that are around these days. Lured in by their pleasing, modern visuals and their promises of a smooth, integrated experience, I'd been meaning to give these a go for a while. Needless to say, as a long-time vim user, I just found myself frustrated that I wasn't able to get things done as efficiently in any of those editors as I could in vim ;) I tried installing vim keybindings in Atom but it just wasn't the same, as only a very limited set of the functionality was there. As for the integrated environment, when you have tmux running by default, everything's integrated anyway.

And, as with editors, so once again with desktop environments. I've decided to retract my previous hasty promise and no longer to bother with trying any other environments; i3 is more than fine :)

However, I did spend some time this evening making things a bit prettier so here are some delicious configs for posterity:



I've switched back to xterm from urxvt because, er... dunno.

Anyway, I set some nice colours for terminals and some magic stuff that makes man pages all colourful :)

XTerm*faceName: xft:Hack:regular:size=12
*termName: xterm-256color

! Colourful man pages
*VT100.colorBDMode:     true
*VT100.colorBD:         cyan
*VT100.colorULMode:     true
*VT100.colorUL:         darkcyan
*VT100.colorITMode:     true
*VT100.colorIT:         yellow
*VT100.veryBoldColors:  518

! terminal colours

!black darkgray
*color0:    #2B2D2E
*color8:    #808080
!darkred red
*color1:    #FF0044
*color9:    #F92672
!darkgreen green
*color2:    #82B414
*color10:   #A6E22E
!darkyellow yellow
*color3:    #FD971F
*color11:   #E6DB74
!darkblue blue
*color4:    #266C98
*color12:   #7070F0
!darkmagenta magenta
*color5:    #AC0CB1
*color13:   #D63AE1
!darkcyan cyan
*color6:    #AE81FF
*color14:   #66D9EF
!gray white
*color7:    #CCCCCC
*color15:   #F8F8F2


Nothing exciting here except for discovering a few options I hadn't previously known about:

" Show a marker at the 80th column to encourage nice code
set colorcolumn=80
highlight ColorColumn ctermbg=darkblue

" Scroll the text when we're 3 lines from the top or bottom
set so=3

" Use browser-style incremental search
set incsearch

" Override the default background colour in xoria256 to match the terminal background
highlight Normal ctermbg=black

" I like this theme
colorscheme xoria256


I made a few colour tweaks to my i3 config so I get colours that match my new Xresources. One day, I might see if it's easy enough to have them both read colour definitions from the same place so I don't have to define things twice.
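For what it's worth, here's a sketch of one way to share them (the file name and palette format are my invention, not my actual setup): keep the colours in one file and generate the Xresources lines from it, doing the same on the i3 side.

```shell
# Generate Xresources colour lines from a single palette file, so the same
# file can also be templated into the i3 config. Palette format (made up):
# one "name value" pair per line, e.g. "color0 #2B2D2E".
palette_to_xresources() {
    while read -r name value; do
        printf '*%s: %s\n' "$name" "$value"
    done < "$1"
}
```

Run it with something like `palette_to_xresources ~/.config/colours > ~/.Xresources.colours` and `xrdb -merge` the result. Alternatively, newer i3 versions have a `set_from_resource` directive that reads values straight out of the X resource database, which might remove the need for generation on the i3 side entirely.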

The result

Here's what it looks like:

My new desktop

by Steve Engledow ( at July 10, 2017 10:14 PM

March 01, 2017

Brett Parker (iDunno)

Using the Mythic Beasts IPv4 -> IPv6 Proxy for Websites on a v6 only Pi and getting the right REMOTE_ADDR

So, more because I was intrigued than anything else, I've got a pi3 from Mythic Beasts; they're supplied with IPv6-only connectivity, and the file storage is NFS over a private v4 network. The proxy will happily redirect http or https requests to the Pi, but (without turning on the Proxy Protocol) this results in your logs showing the remote addresses of the proxy servers, which is not entirely useful.

I've cheated a bit, because turning on the Proxy Protocol for the addresses is currently not exposed to customers (it's on the list!); to do it without access to Mythic's backends, use your own domain name (I've also got mapped to this Pi).

So, first step first, we get our RPi and we make sure that we can log in to it via ssh (I'm nearly always on a v6 connection anyway, so this was a simple case of sshing to the v6 address of the Pi). I then installed haproxy and apache2 on the Pi and went about configuring them. For apache2 I changed it to listen on localhost only, on ports 8080 and 4443; I hadn't at this point enabled the ssl module so, really, the change for 4443 didn't kick in. Here's my /etc/apache2/ports.conf file:

# If you just change the port or add more ports here, you will likely also
# have to change the VirtualHost statement in
# /etc/apache2/sites-enabled/000-default.conf

Listen [::1]:8080

<IfModule ssl_module>
       Listen [::1]:4443
</IfModule>

<IfModule mod_gnutls.c>
       Listen [::1]:4443
</IfModule>

# vim: syntax=apache ts=4 sw=4 sts=4 sr noet

I then edited /etc/apache2/sites-available/000-default.conf to change the VirtualHost line to [::1]:8080.

So, with that in place, now we deploy haproxy in front of it; the basic /etc/haproxy/haproxy.cfg config is:

global
       log /dev/log    local0
       log /dev/log    local1 notice
       chroot /var/lib/haproxy
       stats socket /run/haproxy/admin.sock mode 660 level admin
       stats timeout 30s
       user haproxy
       group haproxy

       # Default SSL material locations
       ca-base /etc/ssl/certs
       crt-base /etc/ssl/private

       # Default ciphers to use on SSL-enabled listening sockets.
       # For more information, see ciphers(1SSL). This list is from:
       ssl-default-bind-options no-sslv3

defaults
       log     global
       mode    http
       option  httplog
       option  dontlognull
       timeout connect 5000
       timeout client  50000
       timeout server  50000
       errorfile 400 /etc/haproxy/errors/400.http
       errorfile 403 /etc/haproxy/errors/403.http
       errorfile 408 /etc/haproxy/errors/408.http
       errorfile 500 /etc/haproxy/errors/500.http
       errorfile 502 /etc/haproxy/errors/502.http
       errorfile 503 /etc/haproxy/errors/503.http
       errorfile 504 /etc/haproxy/errors/504.http

frontend any_http
        option httplog
        option forwardfor

        acl is_from_proxy src 2a00:1098:0:82:1000:3b:1:1 2a00:1098:0:80:1000:3b:1:1
        tcp-request connection expect-proxy layer4 if is_from_proxy

        bind :::80
        default_backend any_http

backend any_http
        server apache2 ::1:8080

Obviously after that you then do:

systemctl restart apache2
systemctl restart haproxy

Now you have a Proxy Protocol'd setup from the proxy servers, and you can still talk directly to the Pi over IPv6. You're not yet logging the right remote IPs, but we're a step closer. Next, enable mod_remoteip in apache2:

a2enmod remoteip

And add a file, /etc/apache2/conf-available/remoteip-logformats.conf containing:

LogFormat "%v:%p %a %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" remoteip_vhost_combined

And edit the /etc/apache2/sites-available/000-default.conf to change the CustomLog line to use remoteip_vhost_combined rather than combined as the LogFormat and add the relevant RemoteIP settings:

RemoteIPHeader X-Forwarded-For
RemoteIPTrustedProxy ::1

CustomLog ${APACHE_LOG_DIR}/access.log remoteip_vhost_combined

Now, enable the config and restart apache2:

a2enconf remoteip-logformats
systemctl restart apache2

Now you'll get the right remote ip in the logs (cool, huh!), and, better still, the environment that gets pushed through to cgi scripts/php/whatever is now also correct.
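To convince yourself that the corrected address really reaches the CGI environment, a throwaway test script does the job (this is my own illustration, not part of the original setup; you'd enable CGI with `a2enmod cgid` and drop the script in `/usr/lib/cgi-bin/` with the execute bit set):

```shell
#!/bin/sh
# Minimal CGI script: echo back the remote address Apache derived.
# With mod_remoteip configured as above, REMOTE_ADDR should be the real
# client address rather than ::1 or a proxy address.
cgi_response() {
    printf 'Content-Type: text/plain\r\n\r\n'
    printf 'REMOTE_ADDR=%s\n' "${REMOTE_ADDR:-unset}"
}

cgi_response
```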

So, you can now happily visit http://www.<your-pi-name>, e.g.

Next up, you'll want something like dehydrated - I grabbed the packaged version from Debian's jessie-backports repository - so that you can make yourself some nice shiny SSL certificates (why wouldn't you, after all!). Once you've got dehydrated installed you'll probably want to tweak it a bit; I have some magic extra files that I use. I also suggest getting the dehydrated-apache2 package, which just makes it all much easier too.









case $action in
    deploy_cert)
        cat "$privkey" "$fullchain" > /etc/ssl/private/srwpi.pem
        chmod 640 /etc/ssl/private/srwpi.pem
        ;;
esac

/etc/dehydrated/hooks/srwpi has the execute bit set (chmod +x /etc/dehydrated/hooks/srwpi), and is really only there so that the certificate can be used easily in haproxy.

And finally the file /etc/dehydrated/domains.txt:

Obviously, use your own pi name in there, or better yet, one of your own domain names that you've mapped to the proxies.

Run dehydrated in cron mode (it's noisy, but meh...):

dehydrated -c

That should then have generated you some shiny certificates (hopefully). For now, I'll just tell you how to do it through the /etc/apache2/sites-available/default-ssl.conf file: edit that file and change the SSLCertificateFile and SSLCertificateKeyFile to point to the certificate and key files under /var/lib/dehydrated/certs/, do the edit for the CustomLog as you did for the other default site, and change the VirtualHost to be [::1]:4443 and enable the site:

a2ensite default-ssl
a2enmod ssl

And restart apache2:

systemctl restart apache2

Now time to add some bits to haproxy.cfg, usefully this is only a tiny tiny bit of extra config:

frontend any_https
        option httplog
        option forwardfor

        acl is_from_proxy src 2a00:1098:0:82:1000:3b:1:1 2a00:1098:0:80:1000:3b:1:1
        tcp-request connection expect-proxy layer4 if is_from_proxy

        bind :::443 ssl crt /etc/ssl/private/srwpi.pem

        default_backend any_https

backend any_https
        server apache2 ::1:4443 ssl ca-file /etc/ssl/certs/ca-certificates.crt

Restart haproxy:

systemctl restart haproxy

And we're all done! REMOTE_ADDR will appear as the correct remote address in the logs, and in the environment.

by Brett Parker ( at March 01, 2017 06:35 PM

Ooooooh! Shiny!

Yay! So, it's a year and a bit on from the last post (eeep!), and we get the news of the Psion Gemini - I wants one, that looks nice and shiny and just the right size to not be inconvenient to lug around all the time, and far better for ssh usage than the onscreen keyboard on my phone!

by Brett Parker ( at March 01, 2017 03:12 PM

October 18, 2016

MJ Ray

Rinse and repeat

Forgive me, reader, for I have sinned. It has been over a year since my last blog post. Life got busy. Paid work. Another round of challenges managing my chronic illness. Cycle campaigning. Fun bike rides. Friends. Family. Travels. Other social media to stroke. I’m still reading some of the planets where this blog post should appear and commenting on some, so I’ve not felt completely cut off, but I am surprised how many people don’t allow comments on their blogs any more (or make it too difficult for me with reCaptcha and the like).

The main motive for this post is to test some minor upgrades, though. Hi everyone. How’s it going with you? I’ll probably keep posting short updates in the future.

Go in peace to love and serve the web. 🙂

by mjr at October 18, 2016 04:28 AM

June 11, 2015

MJ Ray

Mick Morgan’s “why pay twice?” asks why the government hires civilians to monitor social media instead of just giving GCHQ the keywords. Us cripples aren’t allowed to comment there (physical ability test) so I reply here:

It’s pretty obvious that they have probably done both, isn’t it?

This way, they’re verifying each other. Politicians probably trust neither civilians or spies completely and that makes it worth paying twice for this.

Unlike lots of things that they seem to want not to pay for at all…

by mjr at June 11, 2015 03:49 AM

March 09, 2015

Ben Francis

Pinned Apps – An App Model for the Web

(re-posted from a page I created on the Mozilla wiki on 17th December 2014)

Problem Statement

The per-OS app store model has resulted in a market where a small number of OS companies have a large amount of control, limiting choice for users and app developers. In order to get things done on mobile devices users are restricted to using apps from a single app store which have to be downloaded and installed on a compatible device in order to be useful.

Design Concept

Concept Overview

The idea of pinned apps is to turn the apps model on its head by making apps something you discover simply by searching and browsing the web. Web apps do not have to be installed in order to be useful, “pinning” is an optional step where the user can choose to split an app off from the rest of the web to persist it on their device and use it separately from the browser.


”If you think of the current app store experience as consumers going to a grocery store to buy packaged goods off a shelf, the web is more like a hunter-gatherer exploring a forest and discovering new tools and supplies along their journey.”

App Discovery

A Web App Manifest linked from a web page says “I am part of a web app you can use separately from the browser”. Users can discover web apps simply by searching or browsing the web, and use them instantly without needing to install them first.


”App discovery could be less like shopping, and more like discovering a new piece of inventory while exploring a new level in a computer game.”

App Pinning

If the user finds a web app useful they can choose to split it off from the rest of the web to persist it on their device and use it separately from the browser. Pinned apps can provide a more app-like experience for that part of the web with no browser chrome and get their own icon on the homescreen.


”For the user pinning apps becomes like collecting pin badges for all their favourite apps, rather than cluttering their device with apps from an app store that they tried once but turned out not to be useful.”

Deep Linking

Once a pinned app is registered as managing its own part of the web (defined by URL scope), any time the user navigates to a URL within that scope, it will open in the app. This allows deep linking to a particular page inside an app and seamlessly linking from one app to another.


”The browser is like a catch-all app for pages which don’t belong to a particular pinned app.”

Going Offline

Pinning an app could download its contents to the device to make it work offline, by registering a Service Worker for the app’s URL scope.


”Pinned apps take pinned tabs to the next level by actually persisting an app on the device. An app pin is like an anchor point to tether a collection of web pages to a device.”

Multiple Pages

A web app is a collection of web pages dedicated to a particular task. You should be able to have multiple pages of the app open at the same time. Each app could be represented in the task manager as a collection of sheets, pinned together by the app.


”Exploding apps out into multiple sheets could really differentiate the Firefox OS user experience from all other mobile app platforms which are limited to one window per app.”

Travel Guide

Even in a world without app stores there would still be a need for a curated collection of content. The Marketplace could become less of a grocery store, and more of a crowdsourced travel guide for the web.


”If a user discovers an app which isn’t yet included in the guide, they could be given the opportunity to submit it. The guide could be curated by the community with descriptions, ratings and tags.”

3 Questions


What value (the importance, worth or usefulness of something) does your idea deliver?

The pinned apps concept makes web apps instantly useful by making “installation” optional. It frees users from being tied to a single app store and gives them more choice and control. It makes apps searchable and discoverable like the rest of the web and gives developers the freedom of where to host their apps and how to monetise them. It allows Mozilla to grow a catalogue of apps so large and diverse that no walled garden can compete, by leveraging its user base to discover the apps and its community to curate them.

What technological advantage will your idea deliver and why is this important?

Pinned apps would be implemented with emerging web standards like Web App Manifests and Service Workers which add new layers of functionality to the web to make it a compelling platform for mobile apps. Not just for Firefox OS, but for any user agent which implements the standards.

Why would someone invest time or pay money for this idea?

Users would benefit from a unique new web experience whilst also freeing themselves from vendor lock-in. App developers can reduce their development costs by creating one searchable and discoverable web app for multiple platforms. For Mozilla, pinned apps could leverage the unique properties of the web to differentiate Firefox OS in a way that is difficult for incumbents to follow.

UI Mockups

App Search


Pin App


Pin Page


Multiple Pages


App Directory



Web App Manifest

A manifest is linked from a web page with a link relation:

  <link rel="manifest" href="/manifest.json">

A manifest can specify an app name, icon, display mode and orientation:

   "name": "GMail",
   "icons": {...},
   "display": "standalone",
   "orientation": "portrait",

There is a proposal for a manifest to be able to specify an app scope:

   "scope": "/"

Service Worker

There is also a proposal to be able to reference a Service Worker from within the manifest:

   "service_worker": {
     "src": "app.js",
     "scope": "/"
   }

A Service Worker has an install method which can populate a cache with a web app’s resources when it is registered:

 this.addEventListener('install', function(event) {
    caches.create('v1').then(function(cache) {
      return cache.add(/* … */);
    }, function(error) {
      console.error('error populating cache ' + error);
    });
 });

So that the app can then respond to requests for resources when offline:

 this.addEventListener('fetch', function(event) {
    caches.match(event.request).catch(function() {
      return event.default();
    });
 });

by tola at March 09, 2015 03:54 PM

December 11, 2014

Ben Francis

The Times They Are A Changin’ (Open Web Remix)

In the run up to the “Mozlandia” work week in Portland, and in reflection of the last three years of the Firefox OS project, for a bit of fun I’ve reworked a Bob Dylan song to celebrate our incredible journey so far.

Here’s a video featuring some of my memories from the last three years, with Siobhan (my fiancée) and me singing the song at you! There are even lyrics so you can sing along ;)

“Keep on rockin’ the free web” — Potch

by tola at December 11, 2014 11:26 AM

July 10, 2014

James Taylor


Is it annoying or not that everyone says SSL Certs and SSL when they really mean TLS?

Does anyone actually mean SSL? Have there been any accidents through people confusing the two?

July 10, 2014 02:09 PM

Cloud Computing Deployments … Revisited.

So it's been a few years since I've posted, because it's been so much hard work, and we've been pushing really hard on some projects which I just can't talk about - annoyingly. Anyways, on March 20th, 2011 I talked about Continual Integration and Continual Deployment and the Cloud, and discussed two main methods - having what we now call 'Gold Standards' vs continually updating.

The interesting thing is that as we've grown as a company, and as we've become more 'Enterprise', we've brought in more systems administrators and begun to really separate the deployments from the development. The other thing is that we have separated our services out into multiple vertical strands, which have different roles. This means we have slightly different processes for banking or payment based modules than we do for marketing modules. We're able to segregate operational and content data from personally identifiable information - PII has much higher regulation on who can access it (and auditing of who does).

Several other key things had to change: for instance, things like the SSL keys of the servers shouldn't be kept in the development repo. Now, of course not, I hear you yell, but it's a very blurry line. For instance, should the Django configuration be kept in the repo? Well, yes, because that defines the modules and things like URLs. Should the nginx config be kept in the repo? Well, oh… if you keep *that* in then you would keep your SSL certs in…

So the answer becomes having lots of repos: one repo per application (Django-wise), and one repo per deployment containing configurations. And then you start looking at build tools to bring a particular server, or cluster of servers, up and running.
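As a sketch of that split (the repository names here are invented purely for illustration):

```text
app-accounts/       one repo per Django application: code, modules, URLs
app-payments/
deploy-banking/     one repo per deployment: nginx config, SSL certs/keys,
deploy-marketing/   environment settings - kept out of the app repos entirely
```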

The process (for our more secure, audited services) is looking like a tool to bring an AMI up, get everything installed and configured, and then take a snapshot; then a second tool takes that AMI (and all the others needed) and builds the VPC inside of AWS. It’s a step away from the continual deployment strategy, but it is mostly automated.

July 10, 2014 02:09 PM

June 12, 2014

Paul Tansom

Beginning irc

After some discussion last night at PHP Hants about the fact that irc is a great facilitator of support / discussion, but largely ignored because there is rarely enough information for a new user to get going, I decided it might be worth putting together a howto-type post, so here goes…

What is irc?

First of all, what on earth is it? I’m tempted to describe it as Twitter done right years before Twitter even existed, but I’m a geek and I’ve been using irc for years. It has a long heritage, but unlike the ubiquitous email it hasn’t made the transition into mainstream use. In terms of usage it has similarities to things like Twitter and Instant Messaging. Let’s take a quick look at this.

Twitter allows you to broadcast messages; they get published, and anyone who is subscribed to your feed can read what you say. Everything is pretty instant, and if somebody is watching the screen at the right time they can respond straight away. Instant Messaging, on the other hand, is more of a direct conversation with a single person, or sometimes a group of people, but it too is pretty instantaneous – assuming, of course, that there’s someone reading what you’ve said. Both of these technologies are pretty familiar to many. If you go to the appropriate website you are given the opportunity to sign up and either use a web based client or download one.

It is much the same for irc in terms of usage, although conversations are grouped into channels which generally focus on a particular topic rather than being generally broadcast (Twitter) or more specifically directed (Instant Messaging). The downside is that in most cases you don’t get a web page with clear instructions of how to sign up, download a client and find where the best place is to join the conversation.

Getting started

There are two things you need to get going with irc, a client and somewhere to connect to. Let’s put that into a more familiar context.

The client is what you use to connect with; this can be an application – so as an example Outlook or Thunderbird would be a mail client, or IE, Firefox, Chrome or Safari are examples of clients for web pages – or it can be a web page that does the same thing – so if you go to the Twitter website and log in, you are using the web page as your Twitter client. Somewhere to connect to can be compared to a web address, or, if you’ve got close enough to the configuration of your email to see the details, your mail server address.

Let’s start with the ‘somewhere to connect to‘ bit. Freenode is one of the most popular irc servers, so let’s take a look. First we’ll see what we can find out from their website.


There’s a lot of very daunting information there for somebody new to irc, so ignore most of it and follow the Webchat link on the left.


That’s all very well and good, but what do we put in there? I guess the screenshot above gives a clue, but if you actually visit the page the entry boxes will be blank. Well, first off there’s the Nickname; this can be pretty much anything you like, no need to register it – stick to the basics of letters, numbers and some simple punctuation (if you want to), keep it short, and so long as nobody else is already using it you should be fine; if it doesn’t work, try another.

Channels is the awkward one: how do you know what channels there are? If you’re lucky you’re looking into this because you’ve been told there’s a channel there, and hopefully you’ve been given the channel name. For now let’s just use the PHP Hants channel, so that would be #phph in the Channels box.

Now all you need to do is type in the captcha, ignore the tick boxes and click Connect, and you are on the irc channel and ready to chat. Down the right you’ll see a list of who else is there, and in the main window there will be a bit of introductory information (e.g. the topic for the channel) and, depending on how busy it is, anything from nothing to a fast scrolling screen of text.


If you’ve mistyped, there’s a chance you’ll end up in a channel specially created for you because it didn’t exist; don’t worry, just quit and try again (I’ll explain that process shortly).

For now all you really need to worry about is typing in text and posting it; this is as simple as typing it into the entry box at the bottom of the page and pressing return. Be polite, be patient and you’ll be fine. There are plenty of commands that you can use to do things, but for now the only one you need to worry about is the one to leave, which is:

/quit

Type it in the entry box, press return and you’ve disconnected from the server. The next thing to look into is using a client program, since this is far more flexible, but I’ll save that for another post.
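Incidentally, there’s no magic behind any of this: an irc client just sends lines of text over a socket. A rough sketch of the first few protocol lines a client sends after connecting (the nickname and channel here are just the examples from above, and the USER line follows the common modern form):

```python
def registration_lines(nick: str, channel: str) -> list[str]:
    """Build the raw irc protocol lines a client sends to pick a
    nickname and join a channel."""
    return [
        f"NICK {nick}",                # choose a nickname
        f"USER {nick} 0 * :{nick}",    # username, mode, unused, real name
        f"JOIN {channel}",             # join the channel, e.g. #phph
    ]

for line in registration_lines("newbie42", "#phph"):
    print(line)
```

A real client then keeps reading lines from the server (replying to PING with PONG) and sends your chat as PRIVMSG lines.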

by Paul Tansom at June 12, 2014 04:27 PM

May 06, 2014

Richard Lewis

Refocusing Ph.D

Actual progress on this Ph.D revision has been quite slow. My current efforts are on improving the focus of the thesis. One of the criticisms the examiners made (somewhat obliquely) was that it wasn't very clear exactly what my subject was: musicology? music information retrieval? computational musicology? And the reason for this was that I had failed to make that clear to myself. It was only at the writing-up stage, when I was trying to put together a coherent argument, that I decided to try and make it a story about music information retrieval (MIR). I tried to argue that MIR's existing evaluation work (which was largely modelled on information retrieval evaluation from the text world) only took into account the music information needs of recreational users of MIR systems, and that there was very little in the way of studying the music information-seeking behaviour of "serious" users. However, the examiners didn't even accept that information retrieval was an important problem for musicology, never mind that there was work to be done in examining the music information needs of music scholarship.

So I'm using this as an excuse to shift the focus away from MIR a little and towards something more like computational musicology and music informatics. I'm putting together a case study of a computational musicology toolkit called music21. Doing this allows me to focus in more detail on a smaller and more distinct community of users (rather than attempting to study musicologists in general, which was another problematic feature of the thesis); it makes it much clearer what kind of music research can be addressed using the technology (all of MIR is either far too diverse or far too generic, depending on how you want to spin it); and it also allows me to work with the actual Purcell Plus project materials using the toolkit.

May 06, 2014 11:16 PM

March 27, 2014

Richard Lewis

Taking notes in Haskell

The other day we had a meeting at work with a former colleague (now at QMUL) to discuss general project progress. The topics covered included the somewhat complicated workflow that we're using for doing optical music recognition (OMR) on early printed music sources. It includes mensural notation-specific OMR software called Aruspix. Aruspix itself is fairly accurate in its output, but the reason why our workflow is non-trivial is that the sources we're working with are partbooks; that is, each part (or voice) of a multi-part texture is written on its own part of the page, or even on a different page. This is very different to modern score notation, in which each part is written in vertical alignment. In these sources, we don't even know where separate pieces begin and end, and they can actually begin in the middle of a line. The aim is to go from the double-page scans ("openings") to distinct pieces with their complete and correctly aligned parts.

Anyway, our colleague from QMUL was very interested in this little part of the project and suggested that we spend the afternoon, after the style of good software engineering, formalising the workflow. So that's what we did. During the course of the conversation diagrams were drawn on the whiteboard. However (and this was really the point of this post), I made notes in Haskell. It occurred to me a few minutes into the conversation that laying out some types, and the operations over those types that comprise our workflow, is pretty much exactly the kind of formal specification we needed.

Here's what I typed:

{-# LANGUAGE InstanceSigs #-} -- needed for the signatures inside instances

module MusicalDocuments where

import Data.Maybe

-- A document comprises some number of openings (double page spreads)
data Document = Document [Opening]

-- An opening comprises one or two pages (usually two)
data Opening = Opening (Page, Maybe Page)

-- A page comprises multiple systems
data Page = Page [System]

-- Each part is the line for a particular voice
data Voice = Superius | Discantus | Tenor | Contratenor | Bassus

-- A part comprises a list of musical symbols, but it may span multiple systems
-- (including partial systems)
data Part = Part [MusicalSymbol]

-- A piece comprises some number of sections
data Piece = Piece [Section]

-- A system is a collection of staves
data System = System [Staff]

-- A staff is a list of atomic graphical symbols
data Staff = Staff [Glyph]

-- A section is a collection of parts
data Section = Section [Part]

-- These are the atomic components, MusicalSymbols are semantic and Glyphs are
-- syntactic (i.e. just image elements)
data MusicalSymbol = MusicalSymbol
data Glyph = Glyph

-- If this were real, Image would abstract over some kind of binary format
data Image = Image

-- One of the important properties we need in order to be able to construct pieces
-- from the scanned components is to be able to say when objects of some of the
-- types are strictly contiguous, i.e. this staff immediately follows that staff
class Contiguous a where
  immediatelyFollows :: a -> a -> Bool
  immediatelyPrecedes :: a -> a -> Bool
  immediatelyPrecedes a b = b `immediatelyFollows` a

instance Contiguous Staff where
  immediatelyFollows :: Staff -> Staff -> Bool
  immediatelyFollows = undefined

-- Another interesting property of this data set is that there are a number of
-- duplicate scans of openings, but nothing in the metadata that indicates this,
-- so our workflow needs to recognise duplicates
instance Eq Opening where
  (==) :: Opening -> Opening -> Bool
  (==) a b = undefined

-- Maybe it would also be useful to have equality for staves too?
instance Eq Staff where
  (==) :: Staff -> Staff -> Bool
  (==) a b = undefined

-- The following functions actually represent the workflow

collate :: [Document]
collate = undefined

scan :: Document -> [Image]
scan = undefined

split :: Image -> Opening
split = undefined

paginate :: Opening -> [Page]
paginate = undefined

omr :: Page -> [System]
omr = undefined

segment :: System -> [Staff]
segment = undefined

tokenize :: Staff -> [Glyph]
tokenize = undefined

recogniseMusicalSymbol :: Glyph -> Maybe MusicalSymbol
recogniseMusicalSymbol = undefined

part :: [Glyph] -> Maybe Part
part gs =
  if null symbols then Nothing else Just $ Part symbols
  where symbols = mapMaybe recogniseMusicalSymbol gs

alignable :: Part -> Part -> Bool
alignable = undefined

piece :: [Part] -> Maybe Piece
piece = undefined

I then added the comments and implemented the part function later on. Looking at it now, I keep wondering whether the types of the functions really make sense; especially where a return type is a type that's just a label for a list or pair.

I haven't written much Haskell code before, and given that I've only implemented one function here, I still haven't written much Haskell code. But it seemed to be a nice way to formalise this procedure. Any criticisms (or function implementations!) welcome.

March 27, 2014 11:13 PM

February 06, 2014

Adam Bower (quinophex)

I finally managed to beat my nemesis!

I purchased this book (Linked, by Barabasi) on the 24th of December 2002, and made 6 or 7 aborted attempts at reading it to completion; each time life suddenly got busy and took over, so I put the book down and didn't pick it up again until things were less hectic some time later, then started again.

Anyhow, I finally beat the book a few nights ago; my comprehension of it was pretty low, but at least it is done. Just shows I need to read lots more, given how little went in.


February 06, 2014 10:40 PM

February 01, 2014

Adam Bower (quinophex)

Why buying a Mio Cyclo 305 HC cycling computer was actually a great idea.

I finally made it back out onto the bike today for the first time since September last year. I'd spent some time ill in October and November, which meant I had to stop exercising; as a result I've gained loads of weight over the winter and, it turns out, also become very unfit, which can be verified by looking at the Strava ride from today:

Anyhow, a nice thing about this ride is that I can record it on Strava and get this data about how unfit I have become. This is because last year I bought a Mio Cyclo 305 HC cycle computer from Halfords, reduced to £144.50 (using a British Cycling discount). I was originally going to get a Garmin 500, but Amazon put the price up from £149.99 to £199.99 the day I was going to buy it.

I knew when I got the Mio that it had a few issues surrounding usability and features, but it was cheap enough at under £150 that I figured even if I didn't get on with it I'd at least have a cadence sensor and heart rate monitor, so I could just buy a Garmin 510 once they'd sorted out its firmware bugs and the price had come down a bit – which is still my longer-term intention.

So a couple of weeks ago I plugged my Mio into a Windows VM when I was testing USB support and carried out a check for new firmware. I was rather surprised to see a new firmware update and a new set of map data available for download. I installed it thinking I wasn't going to get any new features from it, as Mio had released some new models, but it turns out that the new firmware actually enables a feature (amongst other things: they also tidied up the UI and sorted a few other bugs) that makes the device massively more useful, as it now also creates files in .fit format which can be uploaded directly to Strava.

This is massively useful for me because, although the Mio always worked in Linux (the device is essentially just a USB mass storage device), you previously had the intermediate step of using a tool to convert the files from the Mio-centric GPX format to something Strava would recognise. Now I can just browse to the folder and upload the file directly, which is very handy.

All in, it turns out that buying the Mio – despite reviews and forums full of doom and gloom – means I can wait even longer before considering a replacement Garmin.


February 01, 2014 02:11 PM

January 01, 2014

John Woodard

A year in Prog!

It's New Year's Day 2014 and I'm reflecting on the music of the past year.

Album-wise there were several okay...ish releases in the world of Progressive Rock. Steven Wilson's The Raven That Refused To Sing was not the absolute masterpiece some have eulogised, but a solid effort, though it did contain some filler. Motorpsycho entertained with Still Life With Eggplant – not as good as their previous album, but again a solid effort. Magenta as ever didn't disappoint with The 27 Club; wishing Tina Booth a swift recovery from her ill health.

The three stand-out albums, in no particular order, for me were Edison's Children's Final Breath Before November, which almost made it as album of the year, and Big Big Train with English Electric Full Power, which combined last year's Part One and this year's Part Two with some extra goodies to make the whole greater than the sum of the parts. Also, Adrian Jones of Nine Stones Close fame pulled one out of the bag with his side project Jet Black Sea, which was very different and a challenging listen – hard going at first but surprisingly very good. This man is one superb guitarist, especially if you like emotion wrung out of the instrument like David Gilmour or Steve Rothery.

The moniker of Album of the Year this year goes to Fish for the incredible Feast of Consequences. A real return to form and his best work since Raingods With Zippos. The packaging of the deluxe edition, with a splendid book featuring the wonderful artwork of Mark Wilkinson, was superb. A real treat, with a very thought-provoking suite about the First World War that really hammered home the saying "Lest we forget". A fine piece that needs to be heard every November 11th.

Gig-wise, again Fish at the Junction in Cambridge was great. His voice may not be what it was in 1985, but he is the consummate performer, very at home on the stage. As a raconteur between songs he is every bit as entertaining as he is singing the songs themselves.

The March Marillion Convention in Port Zélande, Holland, where they performed their masterpiece Brave, was very special, as every performance of that incredible album is. The Marillion Conventions are always special, but Brave made this one even more special than it would normally be.

Gig of the year goes again to Marillion, at Aylesbury Friars in November. I had waited thirty years and forty-odd shows to see them perform Garden Party segued into Market Square Heroes, and that glorious night it came to pass. I am now one very happy Progger – or should that be Proggie? Never mind: Viva Progressive Rock!

by BigJohn (aka hexpek) ( at January 01, 2014 07:56 PM

December 01, 2013

Paul Tansom

Scratch in a network environment

I have been running a Code Club at my local Primary School for a while now, and thought it was about time I put details of a few tweaks I’ve made to the default Scratch install to make things easier. So here goes:

With the default install of Scratch (on Windows) projects are saved to the C: drive. For a network environment, with pupils' work stored on a network drive so they always have access whichever machine they sit at, this isn't exactly helpful. It also isn't ideal that they can explore the C: drive in spite of profile restrictions (although it isn't the end of the world, as there is little they can do from Scratch).


After a bit of time with Google I found the answer, and since it didn’t immediately leap out at me when I was searching I thought I’d post it here (perhaps my Google Fu was weak that day). It is actually quite simple, especially for the average Code Club volunteer I should imagine; just edit the scratch.ini file. This is, as would be expected, located in:

C:\Program Files\Scratch\Scratch.ini

Initially the file contains just a few standard settings – pretty standard stuff, but unfortunately no comments to indicate what else you can do with it. As it happens you can add the following two lines (the drive letter here is just an example):

Home=U:
VisibleDrives=U:

They do exactly what it says on the tin. If you click on the Home button in a file dialogue box then you only get the drive(s) specified. You can also put a full path in if you want to put the home directory further down the directory structure.


The VisibleDrives option restricts what you can see if you click on the Computer button in a file dialogue box. If you want to allow more visible drives then separate them with a comma.


You can do the same with a Mac (for the home drive), just use the appropriate directory format (i.e. no drive letter and the opposite direction slash).

There is more that you can do, so take a look at the Scratch documentation. For example, if you use a * in the directory path it is replaced by the name of the currently logged-on user.

Depending on your network environment it may be handy for your Code Club to put the extra resources on a shared network drive and open up an extra drive in the VisibleDrives. One I haven't tried yet is the proxy setting, which I hope will allow me to upload projects to the Scratch website. It goes something like:

ProxyServer=[server name or IP address]
ProxyPort=[port number]
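Putting those options together, a complete set of additions to scratch.ini for a network setup might look something like this (the drive letters, server address and port are placeholders to adapt to your own network):

```ini
Home=U:
VisibleDrives=U:,S:
ProxyServer=192.168.0.1
ProxyPort=8080
```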

by Paul Tansom at December 01, 2013 07:00 PM

February 22, 2013

Joe Button

Sampler plugin for the baremetal LV2 host

I threw together a simpler sampler plugin for kicks. Like the other plugins it sounds fairly underwhelming. Next challenge will probably be to try plugging in some real LV2 plugins.

February 22, 2013 11:22 PM

February 21, 2013

Joe Button

Baremetal MIDI machine now talks to hardware MIDI devices

The Baremetal MIDI file player was cool, but not quite as cool as a real instrument.

I wired up a MIDI In port along the lines of This one here, messed with the code a bit and voila (and potentially viola), I can play LV2 instrument plugins using a MIDI keyboard:

When I say "LV2 instrument plugins", I should clarify that I'm only using the LV2 plugin C API, not the whole .ttl text file shebangle. I hope to get around to that at some point, but it will be a while before you can directly plug LV2s into this and expect them to just work.

February 21, 2013 04:05 PM

January 16, 2013

John Woodard

LinuxMint 14 Add Printer Issue

 LinuxMint 14 Add Printer Issue


I wanted to print from my LinuxMint 14 (Cinnamon) PC via a shared Windows printer on my network. Problem is, it isn't found by the printers dialog in system settings. I thought I'd done all the normal things to get samba to play nice, like rearranging the name resolve order in /etc/samba/smb.conf to a more sane bcast host lmhosts wins (having host and wins first in the order, neither of which I'm using, cocks things up somewhat). Every time I tried to search for the printer in the system settings dialog it told me "FirewallD is not running. Network printer detection needs services mdns, ipp, ipp-client and samba-client enabled on firewall." Much scratching of the head there, then, because as far as I can tell there ain't no daemon by that name available!
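For anyone wanting the same tweak, the setting lives in the [global] section of /etc/samba/smb.conf; a sketch (adjust the order to taste):

```ini
[global]
    ; try broadcast and local files before the WINS/DNS-style lookups
    name resolve order = bcast host lmhosts wins
```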

It turns out, thanks to /pseudomorph, that this has been a bug since LinuxMint 12 (based on Ubuntu 11.10). It's due to that particular daemon (Windows people: daemon pretty much = service) being Fedora-specific, and it should have no place in a Debian/Ubuntu-based distribution. Bugs of this nature really should be ironed out sooner.

Anyway, the simple fix is to use the more traditional approach: the older printer dialog, accessed by entering system-config-printer at the command line. That works just fine – so why ship a new (over a year old) printer config dialog that is inherently broken, I ask myself.

The CUPS web interface also works, apparently: point your favourite browser at http://localhost:631/, which should be there as long as CUPS is installed – and it is in LinuxMint by default.

So come on Minty people get your bug squashing boots on and stamp on this one please.


Bug #871985 only affects Gnome3, so as long as it's not affecting Unity that will be okay, Canonical, will it!

by BigJohn (aka hexpek) ( at January 16, 2013 12:39 AM

August 20, 2012

David Reynolds

On Music

Lately (well, I say lately; I think it's been the same for a few years now) I have been finding that it is very rare for an album to come along that affects me the way music I heard 10 years ago seems to. That is not to say that I have not heard any music that I like in that time; it just doesn't seem to mean as much as music that has been in my life for years. What I am trying to work out is whether that is a reflection on the state of music, on how I experience music, or just on me.


Buying music was always quite an experience. I would spend weeks, months and sometimes longer saving up to buy some new music. Whether I knew exactly what I wanted or just wanted “something else by this artist”, I would spend some time browsing the racks, weighing up what was the best value for my money. In the days before the internet, if you wanted to research an artist’s back catalogue, you were generally out of luck unless you had access to books about the artists. This led to the thrill of finding a hidden gem in the racks that you didn’t know existed or had only heard rumours about. The anticipation of listening to the new music would build even more because I would have to wait until I had travelled home before I could listen to my new purchases.

Nowadays, with the dizzying amount of music constantly pumped into our ears through the internet, radio, advertising and the plethora of styles and genres, it is difficult to sift through and find artists and music that really speak to you. Luckily, there are websites available to catalogue releases by artists so you are able to do thorough research and even preview your music before you purchase it. Of course the distribution methods have changed massively too. No longer do I have to wait until I can make it to a brick and mortar store to hand over my cash. I can now not only buy physical musical releases on CD or Vinyl online and have it delivered to my door, I can also buy digital music through iTunes, Amazon or Bandcamp or even stream the music straight to my ears through services like Spotify or Rdio. Whilst these online sales avenues are great for artists to be able to sell directly to their fans, I feel that some of the magic has been removed from the purchasing of music for me.


Listening to the music used to be an even greater event than purchasing it. After having spent the time saving up for the purchase, then the time carefully choosing the music to buy and getting it home, I would sit myself down and listen to the music. I would immerse myself totally in the music and only listen to it (I might read the liner notes if I hadn’t exhausted them on the way home). It is difficult to imagine doing one thing for 45+ minutes without the constant interruptions from smartphones, tablet computers, games consoles and televisions these days. I can’t remember the last time I listened to music on good speakers or headphones (generally I listen on crappy computer speakers or to compressed audio on my iPhone through crappy headphones) without reading Twitter, replying to emails or reading copious amounts of information about the artists on Wikipedia. This all serves to distract from the actual enjoyment of just listening to the music.


The actual act of writing this blog post has called into sharp focus the main reason why music doesn’t seem to affect me as much as it used to - because I don’t experience it in the same way. My life has changed, I have more responsibilities and less time to just listen, which makes the convenience and speed of buying digital music online much more appealing. You would think that this ‘instant music’ should be instantly satisfying, but for some reason it doesn’t seem to work that way.

What changed?

I wonder if I am the only one experiencing this? My tastes in music have definitely changed a lot over the last few years, but I still find it hard to find music that I want to listen to again and again. I’m hoping I’m not alone in this; alternatively, I’m hoping someone might read this and recommend some awesome music to me and cure this weird musical apathy I appear to be suffering from.

August 20, 2012 03:33 PM


June 25, 2012

Elisabeth Fosbrooke-Brown (sfr)

Black redstarts

It's difficult to use the terrace for a couple of weeks, because the black redstart family is in their summer residence at the top of a column under the roof. The chicks grow very fast, and the parents have to feed them frequently; when anyone goes out on the terrace they stop the feeding process and click shrill warnings to the chicks to stay still. I worry that if we disturb them too often or for too long the chicks will starve.

Black redstarts are called rougequeue noir (black red-tail) in French, but here they are known as rossignol des murailles (nightingale of the outside walls). Pretty!

The camera needs replacing, so there are no photos of Musatelier's rossignols des murailles, but you can see what they look like on

by sunflowerinrain ( at June 25, 2012 08:02 AM

June 16, 2012

Elisabeth Fosbrooke-Brown (sfr)

Roundabout at Mirambeau

Roundabouts are taken seriously here in France. Not so much as traffic measures (though it has been known for people to be cautioned by the local gendarmes for not signalling when leaving a roundabout, and quite rightly too), but as places to ornament.

A couple of years ago the roundabout at the edge of Mirambeau had a make-over which included an ironwork arch and a carrelet (fishing hut on stilts). Now it has a miniature vineyard as well, and roses and other plants for which this area is known.

Need a passenger to take photo!

by sunflowerinrain ( at June 16, 2012 12:06 PM

September 04, 2006

Ashley Howes

Some new photos

Take a look at some new photos my father and I have taken. We are experimenting with our new digital SLR with a variety of lenses.

by Ashley ( at September 04, 2006 10:42 AM

August 30, 2006

Ashley Howes

A Collection of Comments

This is a bit of fun. A collection of comments found in code. This is from The Daily WTF.

by Ashley ( at August 30, 2006 01:13 AM