Planet ALUG

August 31, 2016

Chris Lamb

Free software activities in August 2016

Here is my monthly update covering what I have been doing in the free software world (previously):



Reproducible builds


Whilst anyone can inspect the source code of free software for malicious flaws, most Linux distributions provide binary (or "compiled") packages to end users.

The motivation behind the Reproducible Builds effort is to allow verification that no flaws have been introduced — either maliciously or accidentally — during this compilation process, by promising that identical binary packages are always generated from a given source.


Diffoscope

diffoscope is our "diff on steroids" that will not only recursively unpack archives but will transform binary formats into human-readable forms in order to compare them:

  • Added a command-line interface to the try.diffoscope.org web service.
  • Added a JSON comparator.
  • In the HTML output, highlight lines when hovering to make it easier to visually track.
  • Ensure that we pass str types to our Difference class, otherwise we can't be sure we can render them later.
  • Testsuite improvements:
    • Generate test coverage reports.
    • Add tests for Haskell and GitIndex comparators.
    • Completely refactored all of the comparator tests, extracting out commonly-used routines.
    • Confirm rendering of text and HTML presenters when checking non-existing files.
    • Dropped a squashfs test as it was simply too unreliable and/or has too many requirements to satisfy.
  • A large number of miscellaneous cleanups, including:
    • Reworking the comparator setup/preference internals by dynamically importing classes via a single list.
    • Splitting exceptions out into a dedicated diffoscope.exc module.
    • Tidying the PROVIDERS dict in diffoscope/__init__.py.
    • Using html.escape over xml.sax.saxutils.escape, cgi.escape, etc.
    • Removing the hard-coding of manual page target names in debian/rules.
    • Specifying all string format arguments as logging function parameters rather than using interpolation.
    • Tidying imports, correcting indentation levels and dropping unnecessary whitespace.

disorderfs

disorderfs is our FUSE filesystem that deliberately introduces nondeterminism in system calls such as readdir(3).

  • Added a testsuite to prevent regressions. (f124965)
  • Added a --sort-dirents=yes|no option for forcing deterministic ordering. (2aae325)
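As a rough usage sketch (assuming the standard disorderfs ROOTDIR MOUNTPOINT invocation; the paths here are illustrative):

 $ mkdir real mountpoint
 $ disorderfs --sort-dirents=yes real/ mountpoint/
 $ ls -U mountpoint/     # -U lists entries in the order the filesystem returns them
 $ fusermount -u mountpoint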

Other

  • Improved strip-nondeterminism, our tool to remove specific nondeterministic information after a build:
    • Match more styles of Java .properties files.
    • Remove the hyphen from "non-determinism" and "non-deterministic" throughout the package for consistency.
  • Improvements to our testing infrastructure:
    • Improve the top-level navigation so that we can always get back to the "home" of a package.
    • Give expandable elements cursor: pointer CSS styling to highlight they are clickable.
    • Drop various trailing underlined whitespaces after links.
    • Explicitly log that build was successful or not.
    • Various code-quality improvements, including preferring str.format over concatenation.
  • Miscellaneous updates to our filter-packages internal tool:
    • Add --random=N and --url options.
    • Add support for --show=comments.
    • Correct ordering so that --show-version runs after --filter-ftbfs.
    • Rename --show-ftbfs to --filter-ftbfs and --show-version to --show=version.
  • Created a proof-of-concept reproducible-utils package to contain commonly-used snippets aimed at developers wishing to make their packages reproducible.


I also submitted 92 patches to fix specific reproducibility issues in advi, amora-server, apt-cacher-ng, ara, argyll, audiotools, bam, bedtools, binutils-m68hc1x, botan1.10, broccoli, congress, cookiecutter, dacs, dapl, dateutils, ddd, dicom3tools, dispcalgui, dnssec-trigger, echoping, eekboek, emacspeak, eyed3, fdroidserver, flashrom, fntsample, forkstat, gkrellm, gkrellm, gnunet-gtk, handbrake, hardinfo, ircd-irc2, ircd-ircu, jack-audio-connection-kit, jpy, kxmlgui, libbson, libdc0, libdevel-cover-perl, libfm, libpam-ldap, libquvi, librep, lilyterm, mozvoikko, mp4h, mp4v2, myghty, n2n, nagios-nrpe, nikwi, nmh, nsnake, openhackware, pd-pdstring, phpab, phpdox, phpldapadmin, pixelmed-codec, pleiades, pybit, pygtksourceview, pyicu, python-attrs, python-gflags, quvi, radare2, rc, rest2web, roaraudio, rt-extension-customfieldsonupdate, ruby-compass, ruby-pg, sheepdog, tf5, ttf-tiresias, ttf-tiresias, tuxpaint, tuxpaint-config, twitter-bootstrap3, udpcast, uhub, valknut, varnish, vips, vit, wims, winswitch, wmweather+ & xshisen.


Debian GNU/Linux


Debian LTS


This month I have been paid to work 15 hours on Debian Long Term Support (LTS). In that time I did the following:

  • "Frontdesk" duties, triaging CVEs, etc.
  • Authored the patch & issued DLA 596-1 for extplorer, a web-based file manager, fixing an archive traversal exploit.
  • Issued DLA 598-1 for suckless-tools, fixing a segmentation fault in the slock screen locking tool.
  • Issued DLA 599-1 for cracklib2, a pro-active password checker library, fixing a stack-based buffer overflow when parsing large GECOS fields.
  • Improved the find-work internal tool, adding optional colour highlighting and migrating it to Python 3.
  • Wrote an lts-missing-uploads tool to find mistakes where there was no corresponding package in the archive after an announcement.
  • Added optional colour highlighting to the lts-cve-triage tool.

Uploads

  • redis 2:3.2.3-1 — New upstream release, move to the DEP-5 debian/copyright format, ensure that we are running as root in LSB initscripts and add a README.Source regarding our local copies of redis.conf and sentinel.conf.
  • python-django:
    • 1:1.10-1 — New upstream release.
    • 1:1.10-2 — Fix test failures due to mishandled upstream translation updates.

  • gunicorn:
    • 19.6.0-2 — Reload logrotate in the postrotate action to avoid processes writing to the old files and move to DEP-5 debian/copyright format.
    • 19.6.0-3 — Drop our /usr/sbin/gunicorn{,3}-debian and related Debian-specific machinery to be more like upstream.
    • 19.6.0-4 — Drop "template" systemd .service files and point towards examples and documentation instead.

  • adminer:
    • 4.2.5-1 — Take over package maintenance, completely overhauling the packaging with a new upstream version, moving to virtual-mysql-server to support MariaDB, updating the package names of dependencies and fixing the outdated Apache configuration.
    • 4.2.5-2 — Correct the php5 package names.




FTP Team

As a Debian FTP assistant I ACCEPTed 90 packages: android-platform-external-jsilver, android-platform-frameworks-data-binding, camlpdf, consolation, dfwinreg, diffoscope, django-restricted-resource, django-testproject, django-testscenarios, gitlab-ci-multi-runner, gnome-shell-extension-taskbar, golang-github-flynn-archive-go-shlex, golang-github-jamesclonk-vultr, golang-github-weppos-dnsimple-go, golang-golang-x-time, google-android-ndk-installer, haskell-expiring-cache-map, haskell-hclip, haskell-hdbc-session, haskell-microlens-ghc, haskell-names-th, haskell-persistable-record, haskell-should-not-typecheck, haskell-soap, haskell-soap-tls, haskell-th-reify-compat, haskell-with-location, haskell-wreq, kbtin, libclipboard-perl, libgtk3-simplelist-perl, libjs-jquery-selectize.js, liblemon, libplack-middleware-header-perl, libreoffice, libreswan, libtest-deep-json-perl, libtest-timer-perl, linux, linux-signed, live-tasks, llvm-toolchain-3.8, llvm-toolchain-snapshot, lua-luv, lua-torch-image, lua-torch-nn, magic-wormhole, mini-buildd, ncbi-vdb, node-ast-util, node-es6-module-transpiler, node-es6-promise, node-inline-source-map, node-number-is-nan, node-object-assign, nvidia-graphics-drivers, openhft-chronicle-bytes, openhft-chronicle-core, openhft-chronicle-network, openhft-chronicle-threads, openhft-chronicle-wire, pycodestyle, python-aptly, python-atomicwrites, python-click-log, python-django-casclient, python-git-os-job, python-hypothesis, python-nosehtmloutput, python-overpy, python-parsel, python-prov, python-py, python-schema, python-tackerclient, python-tornado, pyvo, r-cran-cairo, r-cran-mi, r-cran-rcppgsl, r-cran-sem, ruby-curses, ruby-fog-rackspace, ruby-mixlib-archive, ruby-tzinfo-data, salt-formula-swift, scapy3k, self-destructing-cookies, trollius-redis & websploit.

August 31, 2016 09:48 PM

August 14, 2016

Chris Lamb

try.diffoscope.org CLI client

One criminally-unknown new UNIX tool is diffoscope, a diff "on steroids" that will not only recursively unpack archives but will transform binary formats into human-readable forms in order to compare them instead of simply showing the raw difference in hexadecimal.

In an attempt to remedy its underuse, in December 2015 I created the try.diffoscope.org service so that I—and hopefully others—could use diffoscope without necessarily installing the multitude of third-party tools that using it can require. It also enables trivial sharing of the HTML reports in bugs or on IRC.

To make this even easier, I've now introduced a command-line client to the web service:

 $ apt-get install trydiffoscope
 [..]
 Setting up trydiffoscope (57) ...

 $ trydiffoscope /etc/hosts.allow /etc/hosts.deny
 --- a/hosts.allow
 +++ b/hosts.deny
│ @@ -1,10 +1,17 @@
│ -# /etc/hosts.allow: list of hosts that are allowed to access the system.
│ -#                   See the manual pages hosts_access(5) and hosts_options(5).
│ +# /etc/hosts.deny: list of hosts that are _not_ allowed to access the system.
│ +#                  See the manual pages hosts_access(5) and hosts_options(5).

You can also install it from PyPI with:

$ pip install trydiffoscope

Mirroring the original diffoscope command, you can save the output locally as an even more readable HTML report by appending "--html output.html".
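For example, reusing the two files from above:

 $ trydiffoscope /etc/hosts.allow /etc/hosts.deny --html output.html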

In addition, if you specify the --webbrowser (or -w) argument:

$ trydiffoscope -w /etc/hosts.allow /etc/hosts.deny
https://try.diffoscope.org/gaauupyapzkb

... this will automatically open your default browser to view the results.

August 14, 2016 06:43 PM

July 13, 2016

Mick Morgan

show me yours

As Theresa May moves from the Home Office to Number 10, it is perhaps timely to reflect on public attitudes to surveillance as evidenced in Liberty’s campaign film “Show me yours” in April of this year. In the film (shown below), comedian Olivia Lee pursues members of the public with the intention of taking details from their mobile phones of all their recent communications or browsing activity. The reactions of the people approached speak for themselves. Unfortunately, Liberty research suggests that 75% of adults in the UK had never heard of the impending legislation laid out in the Investigatory Powers Bill.

by Mick at July 13, 2016 04:30 PM

July 03, 2016

Jonathan McDowell

Confirming all use of an SSH agent

For a long time I’ve wanted an ssh-agent setup that would ask me before every use, so I could slightly more comfortably forward authentication over SSH without worrying that my session might get hijacked somewhere at the remote end (I often find myself wanting to pull authenticated git repos on remote hosts). I’m at DebConf this week, which is an ideal time to dig further into these things, so I did so today. As is often the case it turns out this is already possible, if you know how.

I began with a setup that was using GNOME Keyring to manage my SSH keys. This isn’t quite what I want (eventually I want to get to the point that I can sometimes forward a GPG agent to remote hosts for signing purposes as well), so I set about setting up gpg-agent. I used Chris’ excellent guide to GnuPG/SSH Agent setup as a starting point and ended up doing the following:

$ echo use-agent >> ~/.gnupg/options
$ echo enable-ssh-support >> ~/.gnupg/gpg-agent.conf
$ sudo sed -i.bak "s/^use-ssh-agent/# use-ssh-agent/" /etc/X11/Xsession.options
$ sudo rm /etc/xdg/autostart/gnome-keyring-ssh.desktop

The first 2 commands set up my local agent, and told it to do SSH agent foo. The next stopped X from firing up ssh-agent, and the final one prevents GNOME Keyring from being configured to be the SSH agent, without having to remove libpam-gnome-keyring as Chris did. After the above I logged out of and into X again, and could see ~/.gnupg/S.gpg-agent.ssh getting created and env | grep SSH showing SSH_AUTH_SOCK pointing to it (if GNOME Keyring is still handling things it ends up pointing to something like /run/user/1000/keyring/ssh).

[Update: Luca Capello emailed to point out this was a bad approach; there’s thankfully no need to do the last 2 commands that require root. #767341 removed the need to edit Xsession.options and you can prevent GNOME Keyring starting on a per user basis with:

(cat /etc/xdg/autostart/gnome-keyring-ssh.desktop ;
 echo 'X-GNOME-Autostart-enabled=false') > \
 ~/.config/autostart/gnome-keyring-ssh.desktop

]

After this it turned out all I needed to do was ssh-add -c <ssh keyfile>. The -c says “confirm use” and results in the confirm flag being appended to the end of ~/.gnupg/sshcontrol (so if you’ve already done the ssh-add you can go and add the confirm if that’s the behaviour you’d like).
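For example (the key path here is just illustrative):

$ ssh-add -c ~/.ssh/id_rsa
$ grep confirm ~/.gnupg/sshcontrol

The grep is simply a quick way to check that the key’s entry in sshcontrol now carries the confirm flag.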

Simple when you know how, but I’ve had conversations with several people in the past who wanted the same thing and hadn’t figured out how, so hopefully this is helpful to others.

July 03, 2016 03:55 PM

June 27, 2016

Jonathan McDowell

Hire me!

It’s rare to be in a position to be able to publicly announce you’re looking for a new job, but as the opportunity is currently available to me I feel I should take advantage of it. That’s especially true given the fact I’ll be at DebConf 16 next week and hope to be able to talk to various people who might be hiring (and will, of course, be attending the job fair).

I’m coming to the end of my Masters in Legal Science and although it’s been fascinating I’ve made the decision that I want to return to the world of tech. I like building things too much it seems. There are various people I’ve already reached out to, and more that are on my list to contact, but I figure making it more widely known that I’m in the market can’t hurt with finding the right fit.

I’m on LinkedIn and OpenHUB, which should give a bit more info on my previous experience and skill set. I know I’m light on details here, so feel free to email me to talk about what I might be able to specifically bring to your organisation.

June 27, 2016 10:21 PM

June 24, 2016

Steve Engledow (stilvoid)

Brugger Off

I'm putting this here and then I'm going to try not to say anything else on the subject for a while.

I'm disappointed and upset by the result of the referendum. Not because we're (probably) leaving the EU. Us leaving may be the beginning of the fall of the EU and I can't tell one way or another how that will affect anyone in the world.

I'm hurt and ashamed because it's a measure of the sentiments of the people who live in the UK. 52% of you are leaning in a direction that I want no part of and don't want my son to be surrounded by as he grows up. I grew up in the tail end of Thatcher's Britain and the UK today has the same oppressive feeling that you can sense when you watch the Young Ones.

I have some very good friends who voted out and they are good people so I'm certainly not tarring everyone with the racist brush but I've seen much fear and hate generally and I'm just saddened that this country is following the international trend and moving to the far right.

It's not an exaggeration to say that I'm pretty damn scared of the future with the US possibly about to vote in a right wing leadership too.

Don't tell me "it'll be alright" because it's not the fact of the decision that has me upset; it's what it tells me about the country I love. Or used to love. I don't know.

by Steve Engledow (steve@offend.me.uk) at June 24, 2016 12:32 PM

May 30, 2016

Wayne Stallwood (DrJeep)

UPS for Octopi or Octoprint

So it only took one mid-print power cut to realise I need a UPS for my 3D printer.

It's even worse for a machine like mine with an E3D all-metal head, as it requires active cooling to prevent damage to the head mount or a right mess of molten filament inside the heatbreak.

See below for instructions on setting up an APC UPS so that it can send a command to octopi to abort the print and start cooling the head before the batteries in the UPS are exhausted.

I used an APC BackUPS Pro 550, which seems to be about the minimum spec I can get away with. On my printer this gives me approximately 5 minutes of print time without power, or 40 minutes with the printer powered but idle. Other UPSes would work, but APC is the only type tested with these instructions.

Test this thoroughly and make sure you have enough runtime to cool the head before the batteries are exhausted; the only way to do this properly is to set up a test print and pull the power.

Once you have installed the power leads to and from the UPS and got the printer powered through it (not forgetting that the RPi, or whatever you have running OctoPrint, also needs power... mine is powered via the printer PSU), you need to install apcupsd. It's in the default repo for Raspbian, so just install it with apt.

sudo apt-get install apcupsd

Now we need to tweak apcupsd's configuration a bit

Edit the apcupsd configuration as follows; you can find it at /etc/apcupsd/apcupsd.conf, just use your favourite editor.

Find and change the following lines

UPSCABLE smart

UPSTYPE usb

DEVICE (this should be blank)

BATTERYLEVEL 50

MINUTES 5

You might need to tweak BATTERYLEVEL and MINUTES for your printer and UPS. These are the percentage of battery left and the minutes of runtime remaining at which the shutdown will trigger, whichever happens first.

Remember this is minutes of runtime as calculated whilst the printer is still running. Once the print is stopped the runtime will be longer as the heaters will be off, so setting 5 minutes here would, in my case, give me 20 minutes of runtime for the hot-end to cool once the print has aborted.

Plug the USB cable from the UPS into a spare port on the Rpi

Now activate the service by editing /etc/default/apcupsd and changing the following line

ISCONFIGURED=yes

Now start the service; it will start by itself on the next boot.

sudo service apcupsd start

If all is well, typing apcaccess at the prompt should get you some stats from the UPS: battery level etc.

If that's all good then apcupsd is configured; now for the script that aborts your print.

First go into the OctoPrint settings in the web interface, make sure API access is turned on, and record the API key carefully.

Back on the rpi go to the home directory

cd ~

Now download my custom shutdown script with wget

wget http://www.digimatic.co.uk/media/doshutdown
sudo cp doshutdown /etc/apcupsd
cd /etc/apcupsd

Set the permissions so the script can run

chmod 755 doshutdown

Don't be tempted to rename the file, leave it as this name

Now edit the script and change the API_KEY variable at the top to the API key you got from your copy of OctoPrint earlier.

That should be it. The script does three things when the power fails and the battery goes below one of the trigger points:

Prints a warning on the printer's LCD screen

Records the current printer status and print file position to a file in /home/pi, so that maybe you can work out how to slice the remainder of the model and save the print

Aborts the print
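For reference, cancelling a job through OctoPrint's REST API (which is presumably what the script uses that key for) looks something like this when done by hand; swap in your own host name and API key:

curl -X POST http://octopi.local/api/job \
  -H "X-Api-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"command": "cancel"}'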

This hasn't had a massive amount of testing and there are a few bugs. If you have a really big layer going on when the power goes, you might not have enough power to make it to the end, as OctoPrint only aborts at specific points in the print. The same applies if you are in the first stages and are heating the bed: OctoPrint will wait until the bed is up to temperature before running the next command (the abort).

The sleep at the end of the script stops the RPi from shutting down. We need to wait here and make sure the printer has taken the abort command before killing the Pi, and since that is an unknown amount of time I leave it running by sleeping indefinitely.

If I get time I will make a proper octoprint plugin for all this

May 30, 2016 08:13 PM

May 21, 2016

Steve Engledow (stilvoid)

Eurodivision

I'm going to a Eurovision party tonight because I'm not the only person of impeccable taste who was away last week :)

I really don't know what it is about Eurovision that makes for such a fun evening but I've had a fantastic Eurovision party every year since I was at uni.

For the next 5 weeks, I'm at home alone as my wife and child are staying with family in Turkey. In order to make sure I won't be bored, I appear to have overfilled my calendar and now I find myself worrying I won't have a moment to myself. Ah well, busy is better than leaving myself open to the temptation of sitting in front of the telly for evenings on end.

I've ordered a Raspberry Pi 3 with the intention of setting it up as a retro gaming machine. I want something that can live permanently attached to my telly so that I can just pick up a controller and have a 10 minute blast on Sonic or Mario at the drop of a hat. I tried doing this before with my original Pi but it was just too slow.


In other news, I posted this on Facebook a while ago and decided it might as well live here too:

I'll be voting that we stay in thanks very much. I know the EU is far from perfect but I hate the idea of slumping backward into a world of tribes. Hating the other guy because he’s on the other side of a fence or believes in a particular magical sky man is ridiculous and childish and exactly the kind of thing we in the west deride and see as the cause of conflicts in the east.

I’m proud of my country. And like any prized possession, I want to show it off to everyone. I want free movement so that I can visit (and maybe one day live and work in) some of the wonderful places that other people are proud of.

I'm married to a foreigner; I frequently meet, work with, and have many friends who are foreign; I love travelling and being the foreigner. I’d love to be in a world where this post doesn’t make any sense because “foreign” and “country” don’t mean anything any more. It’s one planet, guys.

Try this one weird trick to help you realise why I think your ideas about borders are daft: You want tighter border control in the UK... Why the UK? Why not Great Britain? Make the Irish need visas to get in. Why not individual countries? Who wouldn’t enjoy a nice driving break while you queue for passport control at the Welsh border? In fact, why stop there; we could do this regionally! The great wall of East Anglia? County? District? City? Neighbourhood? Street? Why do you draw the line where you draw it?

If you must have a border, draw it around the planet for now. I wouldn’t mind working as a passport officer aboard the ISS.

Be excellent to each other and party on dudes.

by Steve Engledow (steve@offend.me.uk) at May 21, 2016 02:14 PM

May 02, 2016

Mick Morgan

raid performance

I have recently been building a new NAS box (of which, possibly, more later). In fact the build is really a rebuild because I initially built the server about three years ago in order to consolidate a bunch of services I was running on assorted separate servers into one place. That first build was a RAID 1 array of two 2 TB disks (to give me a mirrored setup with a total of 2 TB store). At the time that was sufficient to hold all my important data (backed up both to other networked devices and to standalone USB disks for safety). But I have just upgraded my main desktop machine to a nice shiny new core i7 Skylake box with 16 GB of DDR4 and a 3 TB disk. That disk is already two thirds full (my old machine had a rather full 2 TB disk). This meant that my NAS backup storage requirements exceeded the capacity of my RAID 1 setup. Adding disks wouldn’t help of course because all that would do is add mirror capability rather than capacity. So I decided to upgrade the NAS and bought a bunch of new 2 TB disks with the intention of setting up a RAID 5 array of 4 disks, thus giving a total storage capacity of 6 TB (8 TB minus 2 TB for parity). Furthermore I initially looked at using FreeNAS rather than my usual debian or ubuntu server with software RAID simply because it looked interesting and, with plugins, could probably meet most of my requirements. But I could not get the software to install properly and after three abortive attempts I gave up and decided that I didn’t really like freeBSD anyway….

So I opted to go back to mdadm on linux – at least I know that works. Better still I would be able to retain all my old setup from the old RAID 1 system without having to worry about finding plugins to handle my media streaming requirements, or owncloud installation, for example.

My previous build was on debian (which is by far my preferred server OS) but ubuntu server has recently been released in an LTS version at 16.04 and I thought it might be fun to try that instead. So I did. (For any readers who have not tried installing linux on a RAID system there are plenty of sites offering advice, but the official ubuntu pages are pretty good). During the build I hit what I initially thought was a snag because the installation seemed to get stuck at around the 83% level when it was apparently installing the linux kernel image and headers. Indeed I confess that on the first such installation I pulled the plug after about three hours of no apparent activity because I was beginning to think that there might be something wrong with my hardware (the earlier FreeNAS failures worried me). My on-line searches for assistance were initially not particularly helpful since none of the huge number of sites advising on software RAID installation bothered to mention that an initial RAID 5 build (or rebuild) using large capacity disks takes a very long time because of the need to calculate the parity data. Incidentally, it is this parity data and its layout that gives RAID 5 its write performance penalty.
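For reference, the equivalent manual setup with mdadm is only a couple of commands (a sketch; the device names here are purely illustrative):

sudo mdadm --create /dev/md0 --level=5 --raid-devices=4 /dev/sdb /dev/sdc /dev/sdd /dev/sde
cat /proc/mdstat

The second command lets you watch the initial parity build slowly creep towards completion.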

One useful outcome of my research about RAID 5 build times (which in my case eventually took just over 6 hours) was my discovery of the wintelguy’s site providing an on-line calculator (and much more besides) for RAID performance and capacity. There is even a very useful page allowing you to compare two separate configurations side by side – thoroughly recommended. More worrying, and thought provoking, is the reclaime.com calculator for RAID failure. That site suggests that the probability of successfully rebuilding a RAID 5 array of 4 * 2 TB disks after a failure is only 52.8%.

That is why you need to keep backups…….

by Mick at May 02, 2016 03:46 PM

April 02, 2016

Wayne Stallwood (DrJeep)

Simple USB2 Host Switch

Initially created for the BigBox 3D printer to allow use of both the Internal Raspberry Pi running Octoprint and the rear mounted USB port for diagnostic access. The Rumba has only one USB port and can only be attached to one of these at a time.

However this circuit will work in any other scenario where you want to be able to switch between USB Hosts.

Plug a Host PC or other host device into port X1 and the device you want to control into Port X3, and everything should work as normal.

Plug an additional powered Host PC or other host device into Port X2, and the host plugged into Port X1 should be disconnected in preference to this device, which should now be connected to the device plugged into port X3.

Please note that in many cases, particularly with devices that are bus powered like memory sticks, the device will not function if there is no powered host PC plugged into port X1.

April 02, 2016 07:38 PM

February 29, 2016

Daniel Silverstone (Kinnison)

Kicad hacking - Intra-sheet links and ERC

This is a bit of an odd posting since it's about something I've done but is also here to help me explain why I did it and thus perhaps encourage some discussion around the topic within the Kicad community...

Recently (as you will know if you follow this blog anywhere it is syndicated) I have started playing with Kicad for the development of some hardware projects I've had a desire for. In addition, some of you may be aware that I used to work for a hardware/software consultancy called Simtec, and there I got to play for a while with an EDA tool called Mentor Designview. Mentor was an expensive, slow, clunky, old-school EDA tool, but I grew to understand and like the workflow.

I spent time looking at gEDA and Eagle when I wanted to get back into hardware hacking for my own ends, but I didn't really click with either. On the other hand, a mere 10 minutes with Kicad and I knew I had found the tool I wanted to work with long-term.

I designed the beer'o'meter project (a flow meter for the pub we are somehow intimately involved with) and then started on my first personal surface-mount project -- SamDAC which is a DAC designed to work with our HiFi in our study at home.

As I worked on the SamDAC project, I realised that I was missing a very particular thing from Mentor, something which I had low-level been annoyed by while looking at other EDA tools -- Kicad lacks a mechanism to mark a wire as being linked to somewhere else on the same sheet. Almost all of the EDA tools I've looked at seem to lack this nicety, and honestly I miss it greatly, so I figured it was time to see if I could successfully hack on Kicad.

Kicad is written in C++, and it has been mumble mumble years since I last did any C++, either for personal hacking or professionally, so it took a little while for that part of my brain to kick back in enough for me to grok the codebase. Kicad is not a small project, taking around ten minutes to build on my not-inconsiderable computer. And while it beavered away building, I spent time looking around the source code, particularly the schematic editor eeschema.

To skip ahead a bit, after a couple of days of hacking around, I had a proof-of-concept for the intra-sheet links which I had been missing from my days with Mentor, and some ERC (electrical rules checking) to go alongside that to help produce schematics without unwanted "sharp corners".

In total, I added an intra-sheet link schematic element (along with its toolbar button and menu item) and the associated ERC checks.

I forked the Kicad mirror on Github and pushed my own branch with this work to my Kicad fork.

All of this is meant to allow schematic capture engineers to more clearly state their intentions regarding what they are drawing. The intra-sheet link could be thought of like a no-connect element, except instead of saying "this explicitly goes nowhere" we're saying "this explicitly goes somewhere else on this sheet, you can go look for it".

Obviously, people who dislike (or simply don't want to use) such intra-sheet link elements can just disable that ERC tickybox and not be bothered by them in the least (well except for the toolbar button and menu item I suppose).

Whether this work gets accepted into Kicad, or festers and dies on the vine, it was good fun developing it and I'd like to illustrate how it could help you, and why I wrote it in the first place:

A contrived story

Note: while this story is meant to be taken seriously, it is somewhat contrived and the examples are likely electrical madness, but please just think about the purpose of the checks etc.

To help to illustrate the feature and why it exists, I'd like to tell you a somewhat contrived story about Fred. Fred is a schematic capture engineer and his main job is to review schematics generated by his colleagues. Fred and his colleagues work with Kicad (hurrah) but of late they've been having a few issues with being able to cleanly review schematics.

Fred's colleagues are not the neatest of engineers. In particular they tend to be quite lazy when it comes to running busses, which are not (for example) address and data busses, around their designs and they tend to simply have wires which end in mid-space and pick up somewhere else on the sheet. All this is perfectly reasonable of course, and Kicad handles it with aplomb. Sadly it seems quite error prone for Fred's workplace.

As an example, Fred's colleague Ben has been designing the power supply for a particular board. As with most power supplies, plenty of capacitors are needed to stabilise the regulators and smooth the output. In the example below, the intent is that all of the capacitors are on the FOO net.

Contrived problem example 1

Sadly there's a missing junction and/or slightly misplaced label in the upper section which means that C2 and C3 simply don't join to the FOO net. This could easily be missed, but the ERC can't spot it at all since there's more than one thing on each net, so the pins of the capacitors are connected to something.

Fred is very sad that this kind of problem can sometimes escape notice by the schematic designer Ben, Fred himself, and the layout engineer, resulting in boards which simply do not work. Fred takes it upon himself to request that the strict wiring checks ERC is made mandatory for all designs, and that the design engineers be required to use intra-sheet link symbols when they have signals which wander off to other parts of the sheet like FOO does in the example. Without any further schematic changes, strict wiring checks enabled gives the following points of ERC concern for Ben to think about:

Contrived problem example 2

As you can see, the ERC is pointing at the wire ends and the warnings are simply that the wires are dangling and that this is not acceptable. This warning is very like the pin-not-connected warnings which can be silenced with an explicit no-connect schematic element. Ben, being a well behaved and gentle soul, obeys the design edicts from Fred and seeks out the intra-sheet link symbols, clearing off the ERC markers and then adding intra-sheet links to his design:

Contrived problem example 3

This silences the dangling end ERC check, which is good, however it results in another ERC warning:

Contrived problem example 4

This time, the warning for Ben to deal with is that the intra-sheet links are pointless. Each exists without a companion to link to because of the net name hiccough in the top section. It takes Ben a moment to realise that the mistake which has been made is that a junction is missing in the top section. He adds the junction and bingo the ERC is clean once more:

Contrived problem example 5

Now, this might not seem like much gain for so much effort, but Ben can now be more confident that the FOO net is properly linked across his design and Fred can know, when he looks at the top part of the design, that Ben intended for the FOO net to go somewhere else on the sheet and he can look for it.

Why do this at all?

Okay, dropping out of our story now, let's discuss why these ERC checks are worthwhile and why the intra-sheet link schematic element is needed.

Note: This bit is here to remind me of why I did the work, and to hopefully explain a little more about why I think it's worth adding to Kicad...

Designers are (one assumes) human beings. As humans we (and I count myself here too) are prone to mistakes. Sadly mistakes are often subtle and could easily be thought of as deliberate if the right thought processes are not followed carefully when reviewing. Anyone who has ever done code review, proofread a document, or performed any such activity, will be quite familiar with the problems which can be introduced by a syntactically and semantically valid construct which simply turns out to be wrong in the greater context.

When drawing designs, I often end up with bits of wire sticking out of schematic sections which are not yet complete. Sadly if I sleep between design sessions, I often lose track of whether such a dangling wire is meant to be attached to more stuff, or is simply left because the net is picked up elsewhere on the sheet. With intra-sheet link elements available, if I had intended the latter, I'd have just dropped such an element on the end of the wire before I stopped for the day.

Also, when drawing designs, I sometimes forget to label a wire, especially if it has just passed through a filter or current-limiting resistor or similar. As such, even with intra-sheet link elements to show me when I mean for a net to go bimbling off across the sheet, I can sometimes end up with unnamed nets whose capacitors end up not used for anything useful. This is where the ERC comes in.

By having the ERC complain if a wire dangles -- the design engineer won't forget to add links (or check more explicitly if the wire is meant to be attached to something else). By having junctions which don't actually link anything warned about, the engineer can't just slap a junction blob down on the end of a wire to silence that warning, since that doesn't mean anything to a reviewer later down the line. By having the ERC warn if a net has exactly one intra-sheet link attached to it, missing net names and errors such as that shown in my contrived example above can be spotted and corrected.

Ultimately this entire piece of work is about ensuring that the intent of the design engineer is captured clearly in the schematic. If the design engineer meant to leave that wire dangling because it's joining to another bit of wire elsewhere on the sheet, they can put the intra-sheet links in to show this. The associated ERC checks are there purely to ensure that the validation of this intent is not bypassed accidentally, or deliberately, in order to make the use of this more worthwhile and to increase the usefulness of the ERC on designs where signals jump around on sheets where wiring them up directly would just create a mess.

by Daniel Silverstone at February 29, 2016 11:19 AM

January 31, 2016

Daniel Silverstone (Kinnison)

Building an Oscilloscope

I recently ordered some PCBs from Elecrow for the Vic's beer-measurement system I've been designing with Rob. While on the site, I noticed that they have a single-channel digital oscilloscope kit based on an STM32. This is a JYE Tech DSO138 which arrives as a PCB whose surface-mount stuff has been fitted, along with a whole bunch of pin-through components for you to solder up the scope yourself. There's a non-trivial number of kinds of components, so first you should prep by splitting them all up and double-checking them all.

Preparing the components

Once you've done that, the instructions start you off fitting a whole bunch of resistors...

Step 1, fitting resistors

Then some diodes, RF chokes, and the 8MHz crystal for the STM32.

Step 2, fitting diodes, a crystal, and chokes

The single most-difficult bit for me to solder was the USB socket. Fine pitch leads, coupled with high-thermal-density socket.

Step 3, the USB socket

There is a veritable mountain of ceramic capacitors to fit...

Step 4, all the ceramic capacitors

And then buttons, inductors, trimming capacitors and much more...

Step 5, buttons, inductors, trimming capacitors, etc

The switches were the next hardest things to solder, after the USB socket...

Step 6, Switches, connectors, etc

Finally you have to solder a test loop and close some jumpers before you power-test the board.

Step 7, Test loop and jumper soldering

The last bit of soldering is to solder pins to the LCD panel board...

Step 8, LCD panel

Before you finally have a working oscilloscope

Working oscilloscope!

I followed the included instructions to trim the scope using the test point and the trimming capacitors, before having a break to write this up for you all. I'd say that it was a fun day because I enjoyed getting a lot of soldering practice (before I have to solder up the beer'o'meter for the pub) and at the end of it I got a working oscilloscope. For 40 USD, I'd recommend this to anyone who fancies a go.

by Daniel Silverstone at January 31, 2016 05:51 PM

June 11, 2015

MJ Ray

Mick Morgan: here’s why pay twice?

http://baldric.net/2015/06/05/why-pay-twice/ asks why the government hires civilians to monitor social media instead of just giving GCHQ the keywords. Us cripples aren’t allowed to comment there (physical ability test) so I reply here:

It’s pretty obvious that they have probably done both, isn’t it?

This way, they’re verifying each other. Politicians probably trust neither civilians nor spies completely and that makes it worth paying twice for this.

Unlike lots of things that they seem to want not to pay for at all…

by mjr at June 11, 2015 03:49 AM

May 14, 2015

MJ Ray

Recorrecting Past Mistakes: Window Borders and Edges

A while ago, I switched from tritium to herbstluftwm. In general, it’s been a good move, benefitting from active development and greater stability, even if I do slightly mourn the move from python scripting to a shell client.

One thing that was annoying me was that throwing the pointer into an edge didn’t find anything clickable. Window borders may be pretty, but they’re a pretty poor choice as the thing that you can locate most easily, the thing that is on the screen edge.

It finally annoyed me enough to find the culprit. The .config/herbstluftwm/autostart file said “hc pad 0 26” (to keep enough space for the panel at the top edge) and changing that to “hc pad 0 -8 -7 26 -7” and reconfiguring the panel to be on the bottom (where fewer windows have useful controls) means that throwing the pointer at the top or the sides now usually finds something useful like a scrollbar or a menu.
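Concretely, the relevant line in ~/.config/herbstluftwm/autostart changed along these lines (hc is the usual herbstclient shorthand from the stock autostart; the arguments appear to be monitor, then top/right/bottom/left padding):

# before: reserve 26 pixels at the top edge for the panel
hc pad 0 26
# after: panel at the bottom, slight negative padding on the other edges
hc pad 0 -8 -7 26 -7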

I wonder if this is a useful enough improvement that I should report it as an enhancement bug.

by mjr at May 14, 2015 04:58 AM

March 09, 2015

Ben Francis

Pinned Apps – An App Model for the Web

(re-posted from a page I created on the Mozilla wiki on 17th December 2014)

Problem Statement

The per-OS app store model has resulted in a market where a small number of OS companies have a large amount of control, limiting choice for users and app developers. In order to get things done on mobile devices users are restricted to using apps from a single app store which have to be downloaded and installed on a compatible device in order to be useful.

Design Concept

Concept Overview

The idea of pinned apps is to turn the apps model on its head by making apps something you discover simply by searching and browsing the web. Web apps do not have to be installed in order to be useful, “pinning” is an optional step where the user can choose to split an app off from the rest of the web to persist it on their device and use it separately from the browser.

[Image: Pinned apps overview]

”If you think of the current app store experience as consumers going to a grocery store to buy packaged goods off a shelf, the web is more like a hunter-gatherer exploring a forest and discovering new tools and supplies along their journey.”

App Discovery

A Web App Manifest linked from a web page says “I am part of a web app you can use separately from the browser”. Users can discover web apps simply by searching or browsing the web, and use them instantly without needing to install them first.

[Image: Pinned apps discovery]

”App discovery could be less like shopping, and more like discovering a new piece of inventory while exploring a new level in a computer game.”

App Pinning

If the user finds a web app useful they can choose to split it off from the rest of the web to persist it on their device and use it separately from the browser. Pinned apps can provide a more app-like experience for that part of the web with no browser chrome and get their own icon on the homescreen.

[Image: Pinned apps pinning]

”For the user pinning apps becomes like collecting pin badges for all their favourite apps, rather than cluttering their device with apps from an app store that they tried once but turned out not to be useful.”

Deep Linking

Once a pinned app is registered as managing its own part of the web (defined by URL scope), any time the user navigates to a URL within that scope, it will open in the app. This allows deep linking to a particular page inside an app and seamlessly linking from one app to another.

[Image: Pinned apps deep linking]

”The browser is like a catch-all app for pages which don’t belong to a particular pinned app.”

Going Offline

Pinning an app could download its contents to the device to make it work offline, by registering a Service Worker for the app’s URL scope.

[Image: Pinned apps offline]

”Pinned apps take pinned tabs to the next level by actually persisting an app on the device. An app pin is like an anchor point to tether a collection of web pages to a device.”

Multiple Pages

A web app is a collection of web pages dedicated to a particular task. You should be able to have multiple pages of the app open at the same time. Each app could be represented in the task manager as a collection of sheets, pinned together by the app.

[Image: Pinned app pages]

”Exploding apps out into multiple sheets could really differentiate the Firefox OS user experience from all other mobile app platforms which are limited to one window per app.”

Travel Guide

Even in a world without app stores there would still be a need for a curated collection of content. The Marketplace could become less of a grocery store, and more of a crowdsourced travel guide for the web.

[Image: Pinned apps travel guide]

”If a user discovers an app which isn’t yet included in the guide, they could be given the opportunity to submit it. The guide could be curated by the community with descriptions, ratings and tags.”

3 Questions

[Image: Pinned apps]

What value (the importance, worth or usefulness of something) does your idea deliver?

The pinned apps concept makes web apps instantly useful by making “installation” optional. It frees users from being tied to a single app store and gives them more choice and control. It makes apps searchable and discoverable like the rest of the web and gives developers the freedom of where to host their apps and how to monetise them. It allows Mozilla to grow a catalogue of apps so large and diverse that no walled garden can compete, by leveraging its user base to discover the apps and its community to curate them.

What technological advantage will your idea deliver and why is this important?

Pinned apps would be implemented with emerging web standards like Web App Manifests and Service Workers which add new layers of functionality to the web to make it a compelling platform for mobile apps. Not just for Firefox OS, but for any user agent which implements the standards.

Why would someone invest time or pay money for this idea?

Users would benefit from a unique new web experience whilst also freeing themselves from vendor lock-in. App developers can reduce their development costs by creating one searchable and discoverable web app for multiple platforms. For Mozilla, pinned apps could leverage the unique properties of the web to differentiate Firefox OS in a way that is difficult for incumbents to follow.

UI Mockups

App Search

[Image: App search mockup]

Pin App

[Image: Pin app mockup]

Pin Page

[Image: Pin page mockup]

Multiple Pages

[Image: Multiple pages mockup]

App Directory

[Image: App directory mockup]

Implementation

Web App Manifest

A manifest is linked from a web page with a link relation:

  <link rel="manifest" href="/manifest.json">

A manifest can specify an app name, icon, display mode and orientation:

 {
   "name": "GMail",
   "icons": {...},
   "display": "standalone",
   "orientation": "portrait",
   ...
 }

There is a proposal for a manifest to be able to specify an app scope:

 {
   ...
   "scope": "/"
   ...
 }

Service Worker

There is also a proposal to be able to reference a Service Worker from within the manifest:

 {
   ...
   "service_worker": {
     "src": "app.js",
     "scope": "/"
   },
   ...
 }

A Service Worker has an install event which can be used to populate a cache with a web app’s resources when it is registered:

 this.addEventListener('install', function(event) {
  event.waitUntil(
    caches.create('v1').then(function(cache) {
     return cache.add(
        '/index.html',
        '/style.css',
        '/script.js',
        '/favicon.ico'
      );
    }, function(error) {
        console.error('error populating cache ' + error);
    })
  );
 });

So that the app can then respond to requests for resources when offline:

 this.addEventListener('fetch', function(event) {
  event.respondWith(
    caches.match(event.request).catch(function() {
      return event.default();
    })
  );
 });

by tola at March 09, 2015 03:54 PM

December 11, 2014

Ben Francis

The Times They Are A Changin’ (Open Web Remix)

In the run up to the “Mozlandia” work week in Portland, and in reflection of the last three years of the Firefox OS project, for a bit of fun I’ve reworked a Bob Dylan song to celebrate our incredible journey so far.

Here’s a video featuring some of my memories from the last three years, with Siobhan (my fiancée) and me singing the song at you! There are even lyrics so you can sing along ;)

“Keep on rockin’ the free web” — Potch

by tola at December 11, 2014 11:26 AM

July 10, 2014

James Taylor

SSL / TLS

Is it annoying or not that everyone says SSL Certs and SSL when they really mean TLS?

Does anyone actually mean SSL? Have there been any accidents through people confusing the two?


July 10, 2014 02:09 PM

Cloud Computing Deployments … Revisited.

So it’s been a few years since I’ve posted, because it’s been so much hard work, and we’ve been pushing really hard on some projects which I just can’t talk about – annoyingly. Anyway, on March 20th, 2011, I talked about Continual Integration and Continual Deployment and the Cloud and discussed two main methods – having what we now call ‘Gold Standards’ vs continually updating.

The interesting thing is that as we’ve grown as a company, and as we’ve become more ‘Enterprise’, we’ve brought in more systems administrators and begun to really separate the deployments from the development. The other thing is we have separated our services out into multiple vertical strands, which have different roles. This means we have slightly different processes for Banking or Payment-based modules than we do for marketing modules. We’re able to segregate operational and content data from personally identifiable information – PII having much higher regulation on who can access it (and auditing of who does).

Several other key things had to change: for instance, things like SSL keys of the servers shouldn’t be kept in the development repo. Now, of course not, I hear you yell, but it’s a very blurry line. For instance, should the Django configuration be kept in the repo? Well, yes, because that defines the modules and things like URLs. Should the nginx config be kept in the repo? Well, oh... if you keep *that* in then you would keep your SSL certs in…

So the answer becomes having lots of repos: one repo per application (Django-wise), and one repo per deployment containing configurations. And then you start looking at build tools to bring a particular server, or cluster of servers, up and running.

The process (for our more secure, audited services) is looking like a tool to bring an AMI up, get everything installed and configured, and then take a snapshot, and then a second tool that takes that AMI (and all the others needed) and builds the VPC inside of AWS. It’s a step away from the continual deployment strategy, but it is mostly automated.


July 10, 2014 02:09 PM

June 28, 2014

Brett Parker (iDunno)

Sony Entertainment Networks Insanity

So, I have a SEN account (it's part of the PSN), I have 2 videos with SEN, I have a broken PS3 so I can no longer deactivate video (you can only do that from the console itself, yes, really)... and the response from SEN has been abysmal, specifically:

As we take the security of SEN accounts very seriously, we are unable to provide support on this matter by e-mail as we will need you to answer some security questions before we can investigate this further. We need you to phone us in order to verify your account details because we're not allowed to verify details via e-mail.

I mean, seriously, they're going to verify my details over the phone better than over e-mail how exactly? All the contact details are tied to my e-mail account, I have logged in to their control panel and renamed the broken PS3 to "Broken PS3", I have given them the serial number of the PS3, and yet they insist that I need to call them, because apparently they're fucking stupid. I'm damned glad that I only ever got 2 videos from SEN, both of which I own on DVD now anyways, this kind of idiotic tie in to a system is badly wrong.

So, you phone the number... and now you get stuck with hold music for ever... oh, yeah, great customer service here guys. I mean, seriously, WTF.

OK - 10 minutes on the phone, and still being told "One of our advisors will be with you shortly". I get the feeling that I'll just be writing off the 2 videos that I no longer have access to.

I'm damned glad that I didn't decide to buy more content from that - at least you can reset the games entitlement once every six months without jumping through all these hoops (you have to reactivate each console that you still want to use, but hey).

by Brett Parker (iDunno@sommitrealweird.co.uk) at June 28, 2014 03:54 PM

June 12, 2014

Paul Tansom

Beginning irc

After some discussion last night at PHP Hants about the fact that irc is a great facilitator of support / discussion, but largely ignored because there is rarely enough information for a new user to get going, I decided it may be worth putting together a howto-type post, so here goes…

What is irc?

First of all, what on earth is it? I’m tempted to describe it as Twitter done right years before Twitter even existed, but I’m a geek and I’ve been using irc for years. It has a long heritage, but unlike the ubiquitous email it hasn’t made the transition into mainstream use. In terms of usage it has similarities to things like Twitter and Instant Messaging. Let’s take a quick look at this.

Twitter allows you to broadcast messages, they get published and anyone who is subscribed to your feed can read what you say. Everything is pretty instant, and if somebody is watching the screen at the right time they can respond straight away. Instant Messaging on the other hand, is more of a direct conversation with a single person, or sometimes a group of people, but it too is pretty instantaneous – assuming, of course, that there’s someone reading what you’ve said. Both of these technologies are pretty familiar to many. If you go to the appropriate website you are given the opportunity to sign up and either use a web based client or download one.

It is much the same for irc in terms of usage, although conversations are grouped into channels which generally focus on a particular topic rather than being generally broadcast (Twitter) or more specifically directed (Instant Messaging). The downside is that in most cases you don’t get a web page with clear instructions of how to sign up, download a client and find where the best place is to join the conversation.

Getting started

There are two things you need to get going with irc, a client and somewhere to connect to. Let’s put that into a more familiar context.

The client is what you use to connect with; this can be an application – so as an example Outlook or Thunderbird would be a mail client, or IE, Firefox, Chrome or Safari are examples of clients for web pages – or it can be a web page that does the same thing – so if you go to twitter.com and login you are using the web page as your Twitter client. Somewhere to connect to can be compared to a web address, or if you’ve got close enough to the configuration of your email to see the details, your mail server address.

Let’s start with the ‘somewhere to connect to‘ bit. Freenode is one of the most popular irc servers, so let’s take a look. First we’ll see what we can find out from their website, http://freenode.net/.

[Screenshot: the freenode website]

There’s a lot of very daunting information there for somebody new to irc, so ignore most of it and follow the Webchat link on the left.

[Screenshot: the freenode webchat connection form]

That’s all very well and good, but what do we put in there? I guess the screenshot above gives a clue, but if you actually visit the page the entry boxes will be blank. Well first off there’s the Nickname, this can be pretty much anything you like, no need to register it – stick to the basics of letters, numbers and some simple punctuation (if you want to), keep it short and so long as nobody else is already using it you should be fine; if it doesn’t work try another. Channels is the awkward one, how do you know what channels there are? If you’re lucky you’re looking into this because you’ve been told there’s a channel there and hopefully you’ve been given the channel name. For now let’s just use the PHP Hants channel, so that would be #phph in the Channels box. Now all you need to do is type in the captcha, ignore the tick boxes and click Connect and you are on the irc channel and ready to chat. Down the right you’ll see a list of who else is there, and in the main window there will be a bit of introductory information (e.g. topic for the channel) and depending on how busy it is anything from nothing to a fast scrolling screen of text.

phph

If you’ve mistyped there’s a chance you’ll end up in a channel specially created for you because it didn’t exist; don’t worry, just quit and try again (I’ll explain that process shortly).

For now all you really need to worry about is typing in text and posting it, which is as simple as typing it into the entry box at the bottom of the page and pressing return. Be polite, be patient and you’ll be fine. There are plenty of commands that you can use to do things, but for now the only one you need to worry about is the one to leave, this is:

/quit

Type it in the entry box, press return and you’ve disconnected from the server. The next thing to look into is using a client program since this is far more flexible, but I’ll save that for another post.
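In the meantime it’s worth knowing that the same entry box, in the webchat or a client, also accepts a handful of commands that start with a forward slash; the exact set varies a little between clients, but joining the PHP Hants channel and then leaving again would look something like:

/join #phph
/quit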

by Paul Tansom at June 12, 2014 04:27 PM

May 06, 2014

Richard Lewis

Refocusing Ph.D

Actual progress on this Ph.D revision has been quite slow. My current efforts are on improving the focus of the thesis. One of the criticisms the examiners made (somewhat obliquely) was that it wasn't very clear exactly what my subject was: musicology? music information retrieval? computational musicology? And the reason for this was that I failed to make that clear to myself. It was only at the writing-up stage, when I was trying to put together a coherent argument, that I decided to try and make it a story about music information retrieval (MIR). I tried to argue that MIR's existing evaluation work (which was largely modelled on information retrieval evaluation from the text world) only took into account the music information needs of recreational users of MIR systems, and that there was very little in the way of studying the music information seeking behaviour of "serious" users. However, the examiners didn't even accept that information retrieval was an important problem for musicology, never mind that there was work to be done in examining the music information needs of music scholarship.

So I'm using this as an excuse to shift the focus away from MIR a little and towards something more like computational musicology and music informatics. I'm putting together a case study of a computational musicology toolkit called music21. Doing this allows me to focus in more detail on a smaller and more distinct community of users (rather than attempting to study musicologists in general, which was another problematic feature of the thesis), it makes it much clearer what kind of music research can be addressed using the technology (all of MIR is either far too diverse or far too generic, depending on how you want to spin it), and it also allows me to work with the actual Purcell Plus project materials using the toolkit.

May 06, 2014 11:16 PM

March 27, 2014

Richard Lewis

Taking notes in Haskell

The other day we had a meeting at work with a former colleague (now at QMUL) to discuss general project progress. The topics covered included the somewhat complicated workflow that we're using for doing optical music recognition (OMR) on early printed music sources. It includes mensural-notation-specific OMR software called Aruspix. Aruspix itself is fairly accurate in its output, but the reason why our workflow is non-trivial is that the sources we're working with are partbooks; that is, each part (or voice) of a multi-part texture is written on its own part of the page, or even on a different page. This is very different to modern score notation in which each part is written in vertical alignment. In these sources, we don't even know where separate pieces begin and end, and they can actually begin in the middle of a line. The aim is to go from the double page scans ("openings") to distinct pieces with their complete and correctly aligned parts.

Anyway, our colleague from QMUL was very interested in this little part of the project and suggested that we spend the afternoon, after the style of good software engineering, formalising the workflow. So that's what we did. During the course of the conversation diagrams were drawn on the whiteboard. However (and this was really the point of this post) I made notes in Haskell. It occurred to me a few minutes into the conversation that laying out some types and the operations over those types that comprise our workflow is pretty much exactly the kind of formal specification we needed.

Here's what I typed:

{-# LANGUAGE InstanceSigs #-} -- needed for the explicit method signatures in the instances below
module MusicalDocuments where

import Data.Maybe

-- A document comprises some number of openings (double page spreads)
data Document = Document [Opening]

-- An opening comprises one or two pages (usually two)
data Opening = Opening (Page, Maybe Page)

-- A page comprises multiple systems
data Page = Page [System]

-- Each part is the line for a particular voice
data Voice = Superius | Discantus | Tenor | Contratenor | Bassus

-- A part comprises a list of musical symbols, but it may span multiple systems
-- (including partial systems)
data Part = Part [MusicalSymbol]

-- A piece comprises some number of sections
data Piece = Piece [Section]

-- A system is a collection of staves
data System = System [Staff]

-- A staff is a list of atomic graphical symbols
data Staff = Staff [Glyph]

-- A section is a collection of parts
data Section = Section [Part]

-- These are the atomic components, MusicalSymbols are semantic and Glyphs are
-- syntactic (i.e. just image elements)
data MusicalSymbol = MusicalSymbol
data Glyph = Glyph

-- If this were real, Image would abstract over some kind of binary format
data Image = Image

-- One of the important properties we need in order to be able to construct pieces
-- from the scanned components is to be able to say when objects of some of the
-- types are strictly contiguous, i.e. this staff immediately follows that staff
class Contiguous a where
  immediatelyFollows :: a -> a -> Bool
  immediatelyPrecedes :: a -> a -> Bool
  immediatelyPrecedes a b = b `immediatelyFollows` a

instance Contiguous Staff where
  immediatelyFollows :: Staff -> Staff -> Bool
  immediatelyFollows = undefined

-- Another interesting property of this data set is that there are a number of
-- duplicate scans of openings, but nothing in the metadata that indicates this,
-- so our workflow needs to recognise duplicates
instance Eq Opening where
  (==) :: Opening -> Opening -> Bool
  (==) a b = undefined

-- Maybe it would also be useful to have equality for staves too?
instance Eq Staff where
  (==) :: Staff -> Staff -> Bool
  (==) a b = undefined

-- The following functions actually represent the workflow

collate :: [Document]
collate = undefined

scan :: Document -> [Image]
scan = undefined

split :: Image -> Opening
split = undefined

paginate :: Opening -> [Page]
paginate = undefined

omr :: Page -> [System]
omr = undefined

segment :: System -> [Staff]
segment = undefined

tokenize :: Staff -> [Glyph]
tokenize = undefined

recogniseMusicalSymbol :: Glyph -> Maybe MusicalSymbol
recogniseMusicalSymbol = undefined

part :: [Glyph] -> Maybe Part
part gs =
  if null symbols then Nothing else Just $ Part symbols
  where symbols = mapMaybe recogniseMusicalSymbol gs

alignable :: Part -> Part -> Bool
alignable = undefined

piece :: [Part] -> Maybe Piece
piece = undefined

I then added the comments and implemented the part function later on. Looking at it now, I keep wondering whether the types of the functions really make sense, especially where a return type is a type that's just a label for a list or pair.
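Two directions I might explore for that (just sketches of the design space, not anything the project has settled on) are dropping the wrapper in favour of a plain type synonym, or keeping the wrapper as a newtype with a named field so that the label at least documents what it wraps:

-- Alternative 1: no wrapper at all, just a synonym
type Openings = [Opening]

-- Alternative 2: a newtype with a record field, which documents the contents
-- and gives an unwrapping function for free (primed name only to avoid
-- clashing with the Document type above)
newtype Document' = Document' { documentOpenings :: [Opening] }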

I haven't written much Haskell code before, and given that I've only implemented one function here, I still haven't written much Haskell code. But it seemed to be a nice way to formalise this procedure. Any criticisms (or function implementations!) welcome.
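In that spirit, here is a minimal sketch of one possible piece implementation, under the assumption (mine, not necessarily the right reading of the workflow) that a piece is simply a group of mutually alignable parts gathered into a single Section; it needs import Data.List (tails) adding to the module above:

-- A piece exists only if every part is alignable with every other part;
-- if so, collect them all into one Section.
piece :: [Part] -> Maybe Piece
piece [] = Nothing
piece ps
  | allAlignable = Just (Piece [Section ps])
  | otherwise    = Nothing
  where
    allAlignable = and [alignable p q | (p:qs) <- tails ps, q <- qs]

Whether a whole piece really is just one Section is, of course, exactly the kind of question the types leave open.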

March 27, 2014 11:13 PM

February 06, 2014

Adam Bower (quinophex)

I finally managed to beat my nemesis!

I purchased this book http://www.amazon.co.uk/dp/0738206679 (Linked, by Barabasi) on the 24th of December 2002. Since then I had made 6 or 7 aborted attempts at reading it to completion: each time life suddenly got busy and just took over, which meant that I put the book down and didn't pick it up again until things were less hectic some time later, and then I started again.

Anyhow, I finally beat the book a few nights ago. My comprehension of it was pretty low, but at least it is done. Just shows I need to read lots more, given how little went in.





February 06, 2014 10:40 PM

February 01, 2014

Adam Bower (quinophex)

Why buying a Mio Cyclo 305 HC cycling computer was actually a great idea.

I finally made it back out onto the bike today for the first time since September last year. I'd spent some time ill in October and November, which meant I had to stop exercising, and as a result I've gained loads of weight over the winter and, it turns out, have also become very unfit, which can be verified by looking at the Strava ride from today: http://www.strava.com/activities/110354158

Anyhow, a nice thing about this ride is that I can record it on Strava and get this data about how unfit I have become. This is because last year I bought a Mio Cyclo 305 HC cycle computer http://eu.mio.com/en_gb/mio-cyclo-305-hc.htm from Halfords, reduced to £144.50 (using a British Cycling discount). I was originally going to get a Garmin 500, but Amazon put the price up from £149.99 to £199.99 the day I was going to buy it.

I knew when I got the Mio that it had a few issues surrounding usability and features, but it was cheap enough at under £150 that I figured even if I didn't get on with it I'd at least have a cadence sensor and heart rate monitor, and I could just buy a Garmin 510 when they sorted out the firmware bugs with that and the price came down a bit - which is still my longer term intention.

So it turns out a couple of weeks ago I plugged my Mio into a Windows VM when I was testing USB support and carried out a check for new firmware. I was rather surprised to see a new firmware update and a new set of map data available for download. I installed it thinking I wasn't going to get any new features from it, as Mio had released some new models, but it turns out that the new firmware actually enables a feature (amongst other things - they also tidied up the UI, sorted a few other bugs and added some other features) that makes the device massively more useful, as it now also creates files in .fit format which can be uploaded directly to Strava.

This is massively useful for me because, although the Mio always worked in Linux (the device is essentially just a USB mass storage device), you previously had to do an intermediate step of using https://github.com/rhyas/GPXConverter to convert the files from the Mio-centric GPX format to something Strava would recognise. Now I can just browse to the folder and upload the file directly, which is very handy.

All in all it turns out that buying the Mio - despite the reviews and forums being full of doom and gloom - means I can wait even longer before considering replacing it with a Garmin.


February 01, 2014 02:11 PM

January 04, 2014

Brett Parker (iDunno)

Wow, I do believe Fasthosts have outdone themselves...

So, got a beep this morning from our work monitoring system. One of our customers' domain names is hosted with livedns.co.uk (which, as far as I can tell, is part of the Fasthosts franchise)... It appears that Fasthosts have managed to entirely break their DNS:

brettp@laptop:~$ host www.fasthosts.com
;; connection timed out; no servers could be reached
brettp@laptop:~$ whois fasthosts.com | grep -i "Name Server"
   Name Server: NS1.FASTHOSTS.NET.UK
   Name Server: NS2.FASTHOSTS.NET.UK
Name Server: NS1.FASTHOSTS.NET.UK
Name Server: NS2.FASTHOSTS.NET.UK
brettp@laptop:~$ whois fasthosts.net.uk | grep -A 2 "Name servers:"
    Name servers:
        ns1.fasthosts.net.uk      213.171.192.252
        ns2.fasthosts.net.uk      213.171.193.248
brettp@laptop:~$  host -t ns fasthosts.net.uk 213.171.192.252
;; connection timed out; no servers could be reached
brettp@laptop:~$ host -t ns fasthosts.net.uk 213.171.193.248
;; connection timed out; no servers could be reached
brettp@laptop:~$

So, that's Fasthosts' core nameservers not responding, good start! They also provide livedns.co.uk, so let's have a look at that:

brettp@laptop:~$ whois livedns.co.uk | grep -A 3 "Name servers:"
    Name servers:
        ns1.livedns.co.uk         213.171.192.250
        ns2.livedns.co.uk         213.171.193.250
        ns3.livedns.co.uk         213.171.192.254
brettp@laptop:~$ host -t ns ns1.livedns.co.uk 213.171.192.250
;; connection timed out; no servers could be reached
brettp@laptop:~$ host -t ns ns1.livedns.co.uk 213.171.193.250
;; connection timed out; no servers could be reached
brettp@laptop:~$ host -t ns ns1.livedns.co.uk 213.171.192.254
;; connection timed out; no servers could be reached

So, erm, apparently that's all their DNS servers "Not entirely functioning correctly"! That's quite impressive!
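(To rule out a local connectivity problem, the same sort of query pointed at a public resolver for an unrelated name - something along the lines of the below, using Google's 8.8.8.8 - should come straight back:

host -t ns google.com 8.8.8.8

If that answers promptly, the finger points squarely at their servers.)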

by Brett Parker (iDunno@sommitrealweird.co.uk) at January 04, 2014 10:24 AM

January 01, 2014

John Woodard

A year in Prog!


It's New Year's Day 2014 and I'm reflecting on the music of past year.

Album-wise there were several okay...ish releases in the world of Progressive Rock. Steven Wilson's The Raven That Refused To Sing was not the absolute masterpiece some have eulogised - a solid effort, though it did contain some filler. Motorpsycho entertained with Still Life With Eggplant - not as good as their previous album, but again a solid effort. Magenta as ever didn't disappoint with The 27 Club; wishing Tina Booth a swift recovery from her ill health.

The three stand-out albums for me, in no particular order, were Edison's Children's Final Breath Before November, which almost made it as album of the year, and Big Big Train with English Electric Full Power, which combined last year's Part One and this year's Part Two with some extra goodies to make the whole greater than the sum of the parts. Also, Adrian Jones of Nine Stones Close fame pulled one out of the bag with his side project Jet Black Sea, which was very different and a challenging listen - hard going at first, but surprisingly very good. This man is one superb guitarist, especially if you like emotion wrung out of the instrument like David Gilmour or Steve Rothery.

The moniker of Album of the Year this year goes to Fish for the incredible Feast of Consequences. A real return to form and his best work since Raingods With Zippos. The packaging of the deluxe edition, with a splendid book featuring the wonderful artwork of Mark Wilkinson, was superb. A real treat, with a very thought provoking suite about the First World War that really hammered home the saying "Lest we forget". A fine piece that needs to be heard every November 11th.


Gig-wise, again Fish at the Junction in Cambridge was great. His voice may not be what it was in 1985, but he is the consummate performer, very much at home on the stage. As a raconteur between songs he is every bit as entertaining as he is singing the songs themselves.

The March Marillion Convention in Port Zealand, Holland, where they performed their masterpiece Brave, was very special, as every performance of that incredible album is. The Marillion Conventions are always special, but Brave made this one even more special than it would normally be.
Gig of the year goes again to Marillion at Aylesbury Friars in November. I had waited thirty years and forty-odd shows to see them perform Garden Party segued into Market Square Heroes, and that glorious night it came to pass. I am now one very happy Progger - or should that be Proggie? Never mind: Viva Progressive Rock!

by BigJohn (aka hexpek) (noreply@blogger.com) at January 01, 2014 07:56 PM

December 01, 2013

Paul Tansom

Scratch in a network environment

I have been running a Code Club at my local Primary School for a while now, and thought it was about time I put details of a few tweaks I’ve made to the default Scratch install to make things easier. So here goes:

With the default install of Scratch (on Windows) projects are saved to the C: drive. For a network environment, with pupils’ work stored on a network drive so they always have access whichever machine they sit at, this isn’t exactly helpful. It also isn’t ideal that they can explore the C: drive in spite of profile restrictions (although it isn’t the end of the world, as there is little they can do from Scratch).

save-orig

After a bit of time with Google I found the answer, and since it didn’t immediately leap out at me when I was searching I thought I’d post it here (perhaps my Google Fu was weak that day). It is actually quite simple, especially for the average Code Club volunteer I should imagine; just edit the scratch.ini file. This is, as would be expected, located in:

C:\Program Files\Scratch\Scratch.ini

Initially it looks like this:

ini-orig

Pretty standard stuff, but unfortunately no comments to indicate what else you can do with it. As it happens you can add the following two lines (for example):

Home=U:
VisibleDrives=U:

To get this:

ini-new

They do exactly what it says on the tin. If you click on the Home button in a file dialogue box then you only get the drive(s) specified. You can also put a full path in if you want to put the home directory further down the directory structure.

save-new1

The VisibleDrives option restricts what you can see if you click on the Computer button in a file dialogue box. If you want to allow more visible drives then separate them with a comma.

save-new2

You can do the same with a Mac (for the home drive), just use the appropriate directory format (i.e. no drive letter and the opposite direction slash).

There is more that you can do, so take a look at the Scratch documentation here. For example if you use a * in the directory path it is replaced by the name of the currently logged on user.
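So, as an illustration (this is a made-up path rather than one I’ve tested), giving each pupil their own folder on the network drive would look something like:

Home=U:\Scratch Projects\*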

Depending on your network environment it may be handy for your Code Club to put the extra resources on a shared network drive and open up an extra drive in the VisibleDrives. One I haven’t tried yet is the proxy setting, which I hope will allow me to upload projects to the Scratch website. It goes something like:

ProxyServer=[server name or IP address]
ProxyPort=[port number]

by Paul Tansom at December 01, 2013 07:00 PM

February 22, 2013

Joe Button

Sampler plugin for the baremetal LV2 host

I threw together a simpler sampler plugin for kicks. Like the other plugins it sounds fairly underwhelming. Next challenge will probably be to try plugging in some real LV2 plugins.

February 22, 2013 11:22 PM

February 21, 2013

Joe Button

Baremetal MIDI machine now talks to hardware MIDI devices

The Baremetal MIDI file player was cool, but not quite as cool as a real instrument.

I wired up a MIDI In port along the lines of This one here, messed with the code a bit and voila (and potentially viola), I can play LV2 instrument plugins using a MIDI keyboard:

When I say "LV2 synth plugins", I should clarify that I'm only using the LV2 plugin C API, not the whole .ttl text file shebangle. I hope to get around to that at some point but it will be a while before you can directly plug LV2s into this and expect them to just work.

February 21, 2013 04:05 PM

January 16, 2013

John Woodard

LinuxMint 14 Add Printer Issue



I wanted to print from my LinuxMint 14 (Cinnamon) PC via a shared Windows printer on my network. Problem is, it isn’t found by the printers dialog in system settings. I thought I’d done all the normal things to get samba to play nice, like rearranging the name resolve order in /etc/samba/smb.conf to a more sane bcast host lmhosts wins (having host and wins, neither of which I’m using, first in the order cocks things up somewhat) – the relevant line is shown below. Every time I tried to search for the printer in the system settings dialog it told me “FirewallD is not running. Network printer detection needs services mdns, ipp, ipp-client and samba-client enabled on firewall.” So much scratching of the head there then, because as far as I can tell there ain’t no daemon by that name available!
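For reference, the line in question lives in the [global] section of /etc/samba/smb.conf and, with the re-ordering mentioned above, looks something like this:

[global]
    name resolve order = bcast host lmhosts wins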

It turns out, thanks to /pseudomorph, that this has been a bug since LinuxMint 12 (based on Ubuntu 11.10). It’s due to that particular daemon (for Windows people, daemon pretty much = service) being Fedora-specific; it should have no place in a Debian/Ubuntu based distribution. Bugs of this nature really should be ironed out sooner.

Anyway, the simple fix is to use the more traditional approach: the older printer dialog, which is accessed by running system-config-printer at the command line. That works just fine, so why the new (over a year old) printer config dialog that is inherently broken, I ask myself.

The CUPS web interface also works, apparently: point your favourite browser at http://localhost:631/, which should be there as long as CUPS is installed – which it is in LinuxMint by default.

So come on Minty people get your bug squashing boots on and stamp on this one please.

Update

Bug #871985 only affects Gnome 3, so as long as it’s not affecting Unity that will be okay, will it, Canonical!

by BigJohn (aka hexpek) (noreply@blogger.com) at January 16, 2013 12:39 AM

August 20, 2012

David Reynolds

On Music

Lately (well, I say lately, I think it’s been the same for a few years now) I have been finding that it is very rare that an album comes along that affects me in the way that music I heard 10 years ago seems to. That is not to say that I have not heard any music that I like in that time; it just doesn’t seem to mean as much as music that has been in my life for years. What I am trying to work out is if that is a reflection on the state of music, of how I experience music, or just me.

Buying

Buying music was always quite an experience. I would spend weeks, months and sometimes longer saving up to buy some new music. Whether I knew exactly what I wanted or just wanted “something else by this artist” I would spend some time browsing the racks weighing up what was the best value for my money. In the days before the internet, if you wanted to research an artist’s back catalogue, you were generally out of luck unless you had access to books about the artists. This led to the thrill of finding a hidden gem in the racks that you didn’t know existed or had only heard rumours about. The anticipation of listening to the new music would build even more because I would have to wait until I had travelled home before I could listen to my new purchases.

Nowadays, with the dizzying amount of music constantly pumped into our ears through the internet, radio, advertising and the plethora of styles and genres, it is difficult to sift through and find artists and music that really speak to you. Luckily, there are websites available to catalogue releases by artists so you are able to do thorough research and even preview your music before you purchase it. Of course the distribution methods have changed massively too. No longer do I have to wait until I can make it to a brick and mortar store to hand over my cash. I can now not only buy physical musical releases on CD or Vinyl online and have it delivered to my door, I can also buy digital music through iTunes, Amazon or Bandcamp or even stream the music straight to my ears through services like Spotify or Rdio. Whilst these online sales avenues are great for artists to be able to sell directly to their fans, I feel that some of the magic has been removed from the purchasing of music for me.

Listening

Listening to the music used to be an even greater event than purchasing it. After having spent the time saving up for the purchase, then the time carefully choosing the music to buy and getting it home, I would then sit myself down and listen to the music. I would immerse myself totally in the music and only listen to it (I might read the liner notes if I hadn’t exhausted them on the way home). It is difficult to imagine doing one thing for 45+ minutes without the constant interruptions from smartphones, tablet computers, games consoles and televisions these days. I can’t remember the last time I listened to music on good speakers or headphones (generally I listen on crappy computer speakers or to compressed audio on my iPhone through crappy headphones) without reading Twitter, replying to emails or reading copious amounts of information about the artists on Wikipedia. This all serves to distract from the actual enjoyment of just listening to the music.

Experience

The actual act of writing this blog post has called into sharp focus the main reason why music doesn’t seem to affect me nowadays as much as it used to - because I don’t experience it in the same way. My life has changed, I have more responsibilities and less time to just listen, which makes the convenience and speed of buying digital music online much more appealing. You would think that this ‘instant music’ should be instantly satisfying but for some reason it doesn’t seem to work that way.

What changed?

I wonder if I am the only one experiencing this? My tastes in music have definitely changed a lot over the last few years, but I still find it hard to find music that I want to listen to again and again. I’m hoping I’m not alone in this; alternatively, I’m hoping someone might read this and recommend some awesome music to me and cure this weird musical apathy I appear to be suffering from.

August 20, 2012 03:33 PM


June 25, 2012

Elisabeth Fosbrooke-Brown (sfr)

Black redstarts

It's difficult to use the terrace for a couple of weeks, because the black redstart family is in their summer residence at the top of a column under the roof. The chicks grow very fast, and the parents have to feed them frequently; when anyone goes out on the terrace they stop the feeding process and click shrill warnings to the chicks to stay still. I worry that if we disturb them too often or for too long the chicks will starve.

Black redstarts are called rougequeue noir (black red-tail) in French, but here they are known as rossignol des murailles (nightingale of the outside walls). Pretty!

The camera needs replacing, so there are no photos of Musatelier's rossignols des murailles, but you can see what they look like on http://fr.wikipedia.org/wiki/Rougequeue_noir.

by sunflowerinrain (noreply@blogger.com) at June 25, 2012 08:02 AM

June 16, 2012

Elisabeth Fosbrooke-Brown (sfr)

Roundabout at Mirambeau

Roundabouts are taken seriously here in France. Not so much as traffic measures (though it has been known for people to be cautioned by the local gendarmes for not signalling when leaving a roundabout, and quite rightly too), but as places to ornament.

A couple of years ago the roundabout at the edge of Mirambeau had a make-over which included an ironwork arch and a carrelet (fishing hut on stilts). Now it has a miniature vineyard as well, and roses and other plants for which this area is known.

Need a passenger to take photo!

by sunflowerinrain (noreply@blogger.com) at June 16, 2012 12:06 PM

September 04, 2006

Ashley Howes

Some new photos

Take a look at some new photos my father and I have taken. We are experimenting with our new digital SLR with a variety of lenses.

by Ashley (noreply@blogger.com) at September 04, 2006 10:42 AM

August 30, 2006

Ashley Howes

A Collection of Comments

This is a bit of fun. A collection of comments found in code. This is from The Daily WTF.

by Ashley (noreply@blogger.com) at August 30, 2006 01:13 AM