Planet ALUG

January 17, 2020

Jonathan McDowell

A beginner tries PCB assembly

I wrote last year about my experience with making my first PCB using JLCPCB. I’ve now got 5 of the boards in production around my house, and another couple assembled on my desk for testing. I also did a much simpler board to mount a GPS module on my MapleBoard - basically just a suitable DIP connector and mount point for the GPS module. At that point I ended up having to pay for shipping; not being in a hurry I went for the cheapest option, which meant the total process took 2 weeks from order until it arrived. Still not bad for under $8!

Just before Christmas I discovered that JLCPCB had expanded their SMT assembly option beyond the Chinese market, and were offering discount coupons (though even without those they had much, much lower assembly/setup fees than anywhere else I’d seen). Despite JLCPCB being part of LCSC, the parts library can be a bit limited (partly, it seems, because they offer nothing complex to assemble, such as connectors), with a set of “basic” components that attract no setup fee and then “extended” options which have a $3 setup fee (because they’re not permanently loaded, AIUI).

To test out the service I decided to revise my IoT board. First, I’ve used a few of them for 12V LED strip control, which has meant the 3.3V LDO is working harder than ideal, so I wanted to switch (ha ha) to a buck converter. I worked back from the JLCPCB basic parts list and chose an MP2451, which has a handy data sheet with an example implementation. I also upgraded the ESP module to an ESP32-WROOM - I’ve had some issues achieving flicker-free PWM on the ESP8266, and the ESP32 has hardware PWM. I also have some applications the Bluetooth would be useful for. Once again I turned to KiCad to draw the schematic and lay out the board. I kept the same form factor for ease, as I knew I could get a case for it. The more complex circuitry was a bit harder to lay out in the same space, and the assembly service has the limitation of being single sided, which complicates things further, but the fact the soldering was being done for me meant I could drop to 0603 parts.

All in all I ended up with 17 parts for the board assembly, with the ESP32 module and power connectors being my responsibility (JLCPCB only stock the bare ESP32 chip, and I did not feel like trying to design a PCB antenna). I managed to get everything except the inductor from the basic library, which kept costs down. Total cost for 10 boards, parts, assembly, shipping + customs fees was just under $29, which is amazing value to me. What’s even better is that the DFM (design for manufacturing) checks they ran caught that I’d placed the MP2451 the wrong way round, and they automatically rotated it 180° for me. Phew!

The order was placed in the middle of December and arrived just before New Year - again, about 2 weeks total time end to end. Very impressive. Soldering the ESP32 module on was more fiddly than the ESP-07, but it all worked first time with both 5V + 12V power supplies, so I’m very pleased with the results.

ESP32 IoT PCB

Being able to do cheap PCB assembly is a game changer for me. There are various things I feel confident enough to design for my own use that I’d never be able to solder up myself; and for these prices it’s well worth a try. I find myself currently looking at some of the basic STM32 offerings (many of them in JLCPCB’s basic component range) and pondering building a slightly more advanced dev board around one. I’m sure my PCB design will cause those I know in the industry to shudder, but don’t worry, I’ve no plans to do this other than for my own amusement!

January 17, 2020 07:34 PM

January 14, 2020

Mick Morgan

do not ask me for guest posts or links

For the past four years or so I have been receiving increasingly frequent requests for either guest posts, or links to external sites (or sometimes both). The requests have increased in number ever since I started posting about my use of OpenVPN. Many of these requests want me to point to their commercial VPN site. The requests all look something like this:

Hi.

My name is Foo. I represent Bar. I found your blog on google and read your article on “X”. I think your readers will like our discussion about “X” on our site. Would you be willing to host a guest post by us, or one of our affiliates, promoting the use of “Y”? It would also be really good if you could link to our site from your article.

We are really flexible, so we could totally negotiate about special deals.

Now, the least irritating of these requests tend to come to the correct email address (which shows they have read the “about” page) rather than “postmaster@baldric.net” or some other speculative address, and they are also directly relevant to the article in question (which shows they have actually read that too). But unfortunately, a depressingly large number of requests point to an article “X” which has nothing whatsoever to do with their site (which may be a commercial site of, at best, tangential relevance to anything I write about). The worst type of request merely asks me to point to some external resource from some random post on trivia.

I very, very, very rarely respond to any such requests. And I never, ever respond to persistent, repeated requests from the same source.

One particularly laughable request came in about three years ago. It asked me to point to an on-line password generator/checker (not a smart thing to do). I tried it with an XKCD-style password like “soldieravailablecrossmagnet” and got the stupid response:

“Weak Password

It would take a computer about 507 quintillion years to crack your password.”

Weak password eh?
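For the record, a quick back-of-the-envelope check shows just how “weak” such a passphrase really is. Assuming each of the four words is drawn uniformly from a 2,048-word list and an attacker managing 1,000 guesses per second (the same assumptions as the original XKCD comic), bc does the sums:

    # Four words from a 2,048-word list: 2048^4 = 2^44 possible passphrases.
    echo '2048^4' | bc
    # 17592186044416

    # Years needed to exhaust the whole space at 1,000 guesses per second.
    echo '2048^4 / (1000 * 60 * 60 * 24 * 365)' | bc
    # 557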

It should be obvious, but in case it isn’t I’ll spell it out here (and in an addition to my “about” page).

This is a personal blog. It is avowedly and intentionally non-commercial in nature. I pay for this blog from my own resources simply because I want to. I do not seek, nor will I accept, any sponsored content or linkages of any kind. Any external resources I point to are there simply because I have personally found those resources interesting or useful. So please do not ask me to point to your site. Please do not ask me for sponsored content. Please do not ask me for guest posts. If you do, it simply proves that you have not done your research properly – so you will be ignored.

Regards

Mick

by Mick at January 14, 2020 04:42 PM

retiring the slugs

I first started using Linksys NSLU2s (aka “slugs”) in early 2008. Back then I considered them quite useful and I even ran webservers and local apt-caches on them. But realistically they are (and even then, were) a tad underpowered. Worse, since Debian on the XScale-IXP42x hasn’t been updated for several years, the slugs are probably vulnerable to several exploits. The latest version of Debian available for the slugs is probably that which I have running (“uname -a” shows “Linux slug 3.2.0-6-ixp4xx #1 Debian 3.2.102-1 armv5tel”).

The advent of the Raspberry Pi (astonishingly eight years ago now) brought a much more powerful and flexible device into the hands of the masses – and it didn’t need complex re-flashing procedures to get a general purpose Linux installation running on it. Over the Christmas period last year I added two more Pis (Pi 4s this time) to my network and finally got around to retiring my slugs (well, actually I still have one running, but I will get around to replacing that too soon).

On replacing the slugs I noticed that the 1TB disk I bought as additional storage for my main slug had been running almost non-stop (apart from the occasional reboot) since March 2009. I think that is a remarkably good lifetime for a consumer grade hard disk. Certainly I have had internal disks fail after far less use. I have even had supposedly more robust, and certainly way more expensive, disks fail on high end Sun workstations and servers in my professional life.

So if you are in the market for new consumer grade disks, I think I can safely recommend Toshiba.

Oh, and Happy New Year by the way.

by Mick at January 14, 2020 01:42 PM

December 31, 2019

Chris Lamb

Favourite books of 2019

I managed to read 74 books in 2019 (up from 53 in 2018 and 50 in 2017) but here follow ten of my favourites this year, in no particular order.

Disappointments included The Seven Deaths of Evelyn Hardcastle (2018), which started strong but failed to end with a bang; all of the narrative potential energy tightly coiled in the exposition was lazily wasted in a literary æther like the "whimper" in the imagined world of T. S. Eliot. In an adjacent category, whilst I really enjoyed A Year in Provence (1989) last year, Toujours Provence (1991) did not outdo its predecessor but was still well worth the dégustation. I was less surprised to be let down by Jon Ronson's earliest available book, The Men Who Stare At Goats (2004), especially after I had watched the similarly off-key film of the same name, but it was at least intellectually satisfying to contrast the larval author of this work with the butterfly he is today; still, I couldn't recommend the experience to others who aren't fans of him now.

The worst book that I finished this year was Black Nowhere (2019), a painful attempt at a cyberthriller based on the story of the Silk Road marketplace. At many points I seriously pondered whether I was an unwitting participant in a form of distributed performance art or simply reading an ironic takedown of inexpensive modern literature.

As a slight aside, choosing which tomes to write about below was an interesting process but likely not for the reasons you might think; I found it difficult to write publicly anything interesting about some books that remain memorable to this day without essentially inviting silent censure or, worse still, the receipt of tedious correspondence, due to their topics of contemporary politics or other vortexes of irrationality, assumed suspicion and outright hostility. (Given Orwell's maxim that "the only test of artistic merit is survival," I find this somewhat of a disservice to my integrity, let alone to the dear reader.)


In the Woods (2007), The Likeness (2008) & Faithful Place (2010)

Tana French

I always feel a certain smug pleasure attached to spotting those gaudy "Now a major TV series!" labels appearing upon novels I have already digested. The stickers do not merely adhere to the books themselves, but in a wider sense stick to me too, as if my own refined taste had been given approval and blessing of its correctness. Not unlike as if my favourite local restaurant had somehow been granted a Michelin star, the only problem then becomes the concomitant difficulty in artfully phrasing that one knew about it all along...

But the first thing that should probably be said about the books that comprise the Dublin Murder Squad ("Now a major TV series!") is the underlying scaffolding of the series: whilst the opening novel details Irish detectives Rob Ryan and Cassie Maddox investigating a murder, it is told from the first-person perspective of the former. However, the following book not only recounts an entirely different Gardaí investigation, it is told from the point of view of the latter, Cassie, instead. At once we can see how different (or not) the characters really are, and how narrow (or not) their interpretations of events are, but moreover we get to enjoy replaying previous interactions between the two, both implicitly in our minds and even sometimes explicitly on the page. This fount of interest continues in the third of the series, which is told from the viewpoint of yet another character introduced in the second book, and so forth.

I feel I could write a fair amount about these novels, but in the interest of brevity I will limit my encomium to the observation that the setting of Ireland never becomes a character itself, which is curiously refreshing now, as most series feel the need to adopt this trope, which overshot cliché some time ago. Authors, by all means set your conceits in well-trodden locations but please refrain from boasting or namedropping your knowledge at seemingly every opportunity (the best/worst example being Ben Aaronovitch's Rivers of London series or, by referencing street and pub names just a few too many times for comfort, Irvine Welsh's Edinburgh). Viewers of the BBC Spooks series will likely know what I mean too - it isn't that the intelligence officers couldn't meet in the purview of St Paul's or under the watchful London Eye, but the unlikelihood that all such clandestine conventicles would happen with the soft focus of yet another postcard-worthy landmark in the background forces at least this particular ex-Londoner out of the plot somewhat.

Anyway, highly recommended. I believe I have three more in this series, all firmly on my 2020 list.


The Ministry of Truth (2019)

Dorian Lynskey

It should hopefully come as no surprise to anyone that I would read this "biography" of George Orwell's Nineteen Eighty-Four (NB. not "1984"...) after a number of Orwell-themed travel posts this year (Marrakesh, Hampstead, Paris, Southwold, Ipswich, etc.).

Timed to coincide with the 70th anniversary of the book's publication, Lynskey celebrates with an in-depth view into the book's literary background in the dystopian fiction of the preceding generation, including Yevgeny Zamyatin's 1921 We and H. G. Wells's output more generally. It is a bête noire of mine that the concepts in the original book are taken too literally by most (as if pointing out the lack of overt telescreens somehow discredits the work or — equally superficial in analysis — as if it has been "proven right" by the prevalence of the FAANGs throughout our culture) but Lynskey does no such thing; he avoids this stubbornly sophomoric and narrow view of Nineteen... and does not neglect the wider, more delicate and more interesting topics such as the slippage between deeds, intentions, thoughts, veracity and language.

Thorough and extremely comprehensive, this biography remains a wonderfully easy read and is recommended to all interested in one of the most influential novels of the 20th century; furthermore it should not be considered the exclusive domain of lovers of trivial Orwellania, notwithstanding that such folks will undoubtedly find something charming in Lynskey's research in any case: who knew that the original opening paragraph of the novel was quite so weak? Or that a misprint resulted in an ambiguous ending...? This book shouldn't just make you want to read the novel again, it will likely pique your interest into delving deeper into Orwell’s writing for yourself. And if you don't, Big Brother is...


City of the Dead (2011) & The Bohemian Highway (2013)

Sara Gran

Imagine a Fleabag with more sass, more drug abuse, and — absent the first person narrative — thankfully hold the oft-distracting antics with the fourth wall. Throw in the perceptive insight of Sherlock and finish with the wistful and mystical notes of a Haruki Murakami novel and you've got Claire DeWitt, our plucky protagonist.

In post-Katrina New Orleans, where we lay our scene, this troubled private detective has been tasked with looking for a local prosecutor who has been missing since the hurricane. Surprisingly engrossing and trenchant, my only quibble with the naked, fast-paced and honest writing of City of the Dead is that the ends of chapters are far too easily signposted, as the tone of the prose changes in a reliable manner, disturbing the unpredictability of the rest of the text.

The second work I include here (The Bohemian Highway) is almost on par with the first, with yet more of Claire's trenchant observations about herself and society (e.g. "If you hate yourself enough, you’ll start to hate anyone who reminds you of you", etc.). However, it was quite the disappointment to read the third in this series, The Infinite Blacktop (2018), which had almost all the aforementioned ingredients but somehow fell far, far short of the target. Anyway, if someone has not optioned the rights for an eight-part television series of the first two novels, I would be willing to go at least, say, 90:10 in with you.


Never Split the Difference (2016)

Chris Voss

I was introduced to Chris Voss earlier in the year via an episode of The Tim Ferriss Show (and if that wasn't enough of an eyebrow-raising introduction, he was just on an episode of Lance Armstrong's own podcast...) but regardless of its Marmite-esque route into my world I could not help but be taken hostage by this former FBI negotiator's approach to Negotiating as if Your Life Depended On It, as its subtitle hyperbolically claims.

My initial interest in picking up this how-to-negotiate volume lay much deeper than its prima facie goal of improving my woefully-lacking skills; I was instead intellectually curious about the socio-anthropology, keen to learn more about various facets of human connection and communication in general. However, the book mixes its "pop psych" with remarkably simple and highly practical tips for all levels of negotiation. Many of these arresting ideas, at least in the Voss school, are highly counter-intuitive, yet he argues for them all persuasively, generally preferring well-reasoned argument over relying on the langue de bois of the "amygdala" and other such concepts borrowed superficially from contemporary psychology that will likely be rendered the phrenology of the early 21st century anyway.

Whilst the book's folksy tone and exhale-inducing approach to pedagogy will put many off (I thought I left academia and its "worksheets" a long time ago…) it certainly passes the primary test of any book of negotiation: it convinced me.


The Way Inn (2014) & Plume (2019)

Will Wiles

I really enjoyed this author's take on modern British culture but I am unsure if I could really communicate exactly why. However, I am certain that I couldn't explain what his position really is beyond using misleading terms such as "surreal" or "existential", because despite these labels implying an inchoate and nebulous work I also found it simultaneously sharp and cuttingly incisive.

Outlining the satirical and absurd plot of The Way Inn would do little to communicate the true colour palette of the volume (our self-absorbed protagonist attends corporate conferences on the topic, of course, of conferences themselves) but in both of these books Wiles ruthlessly avoids all of the tired takedowns of contemporary culture, somehow finding new ways to critique our superficial and ersatz times.

The second of Wiles' that I read this year, Plume, was much darker and even sinister in feel but remains peppered with enough microscopic observations on quotidian life ("the cloying chemical reek of off-brand energy drinks is a familiar part of the rush-hour bouquet"...) that somehow made it more, and not less, harrowing in tone. You probably need to have lived in the UK to get the most out of this, but I would certainly recommend it.


Chasing the Scream (2015)

Johann Hari

It is commonplace enough to find RT ≠ endorsement in a Twitter biography these days, but given that Hari's book documents a Search For The Truth About Addiction I am penning this review with more than a soupçon of trepidation. That is, just as it would be premature to assume that someone who has chosen to read something implicitly agrees with its contents, it would be a similar error to infer that the reader is looking for the same answers. This is all to say that I am not outing myself as an addict here, but then again, this is precisely what an addict would say...

All throat clearing aside, I got much from reading Johann Hari's book which, I think, deliberately does not attempt to break new ground in any of the large area it surveys, preferring to offer a holistic view of the war on drugs [prohibition] through a series of long vignettes and stories about others, seen through the lens of Hari himself on his own personal journey.

Well-written and without longueur, Hari is careful not to step too close to the third rail of the medication—mediation debate as the most effective form of treatment. This leads to some equivocation at points, but Hari's narrative-based approach generally lands as more honest than many similar contemporary works that cede no part of the complex terrain to anything but their preferred panacea; all deliciously ironic given his resignation from the Independent newspaper in 2011. Thus acting as a check against the self-assured tones of How to Change Your Mind (2018) and similar, Chasing the Scream can be highly recommended quite generally but especially for readers in this topic area.


The Sellout (2016)

Paul Beatty

"I couldn't put it down…" is the go-to cliché for literature so I found it amusing to catch myself in quite-literally this state at times. Winner of the 2016 Man Booker Prize, the first third of this were perhaps the most engrossing and compulsive reading experience I've had since I started "seriously" reading.

This book opens in medias res within the Supreme Court of the United States, where the narrator lights a spliff under the table. As the book unfolds, it is revealed that his presence there was humbly requested by the Court due to his attempt to reinstate black slavery and segregation in his local Los Angeles neighbourhood. That said, outlining the plot would be misleading here, as it is far more the ad-hoc references, allusions and social commentary that hang from it that make this such an engrossing work.

The trenchant, deep and unreserved satire might perhaps be merely enough for an interesting book, but where it got really fascinating to me (in a rather inside-baseball manner) is how the latter pages of the book somehow don't live up to the first 100. That appears like a straight-up criticism, but this flaw is actually part of the book's appeal to me — what actually changed in these latter parts? It's not overuse of the idiom or style, and neither is it that it strays too far from the original tone or direction, but I cannot put my finger on why, which has meant the book sticks to this day in my mind. I can almost, just almost, imagine a devilish author such as Paul deliberately crippling his output for such an effect…

Now, one cannot unreservedly recommend this book. The subject matter itself, compounded by being dealt with in such a flippant manner, will be impenetrable to many and deeply offensive to others, but if you can see your way past that then you'll be sure to get something—whatever that may be—from this work.


Diary of a Somebody (2019)

Brian Bilston

Written under the nom de plume of the "unofficial poet laureate of Twitter", Diary of a Somebody stars Brian Bilston, an insufferable and ineffectual loser who decides to write a poem every day for a year. A cross between the cringeworthiness of Alan Partridge and the wit and wordplay of Spike Milligan, the eponymous protagonist documents his life after being "decruited" from his job.

Halfway through this book I came to the realisation that I was technically reading a book of poetry for fun, but far from being Yeats, Auden or The Iliad, "Brian" tends to pen verse along the lines of:

No, it's not Tennyson, and the "plot" ties itself up a little too neatly at the end, but I smiled out loud too many times whilst reading this book not to include it here.


Stories of Your Life and Others (2014) & Exhalation (2019)

Ted Chiang

This compilation has been enjoying a renaissance in recent years due to the success of the film Arrival (2016), which is based on the fourth and titular entry in this amazing collection. Don't infer too much from that, however, as whilst this is prima facie just another set of sci-fi tales, it is science fiction in the way that Children of Men is, rather than Babylon 5.

A well-balanced mixture of worlds is evoked throughout, with a combination of tales that variously mix the Aristotelian concepts of spectacle (opsis), theme (dianoia), character (ethos) and dialogue (lexis); perhaps best expressed practically in that some stories were extremely striking at the time — one even leading me to rebuff an advance at a bar — and a number were not as remarkable at the time yet continue to occupy my idle thoughts.

The opening tale, which reworks the Tower of Babel into a construction project, probably remains my overall favourite, but the Dark Materials-esque world summoned in Seventy-Two Letters continues to haunt my mind and the lips of anyone else who has happened to come across it, perhaps becoming the quite-literal story of my life for a brief period. Indeed it could be said that, gifted as a paperback, whilst the whole collection followed me around a number of locales, it continues to follow me — figuratively speaking, that is — to this day.

Highly recommended to all readers, but for those who enjoy discussing books with others it would more than repay any investment.


Operation Mincemeat (2010)

Ben MacIntyre

In retrospect it is almost obvious that the true story of a fictitious corpse whose invented love letters, theatre life and other miscellania were stuffed into the pockets of a calculatingly creased Captain's uniform would make such a captivating tale. Apparently drowned and planted into the sea off Huelva in 1943, this particular horse was not exactly from Troy but was rather a Welsh vagrant called Glyndwr who washed up — or is that washed out? — on the Andalusian shoreline along with information on a feigned invasion of Sicily in an attempt to deceive the Wehrmacht. To summarise it so, however, would be to grossly undersell Ben MacIntyre's ability to not get in the way of telling the story, as well as the larger picture about the bizarre men who concocted the scheme and the bizarre world they lived in.

In such a Bond-like plot, where even Ian Fleming (himself a genuine British naval officer) makes an appearance, it seems prudent to regularly recall that truth can be stranger than fiction, but the book does fall foul of the usual sin of single-issue WW2 books in overestimating its subject's importance in the larger context of the conflict. (Indeed, as a diversionary challenge to the reader of this review I solicit suggestions for any invention, breakthrough or meeting that has not been identified as "changing the course of World War II". Victor Davis Hanson rather handsomely argues in his 2017 The Second World Wars that it is best approached as multiple wars, anyway…)

Likely to be enjoyed even by those not typically accustomed to reading non-fiction history, this is a genuinely riveting account and well worth reading.

December 31, 2019 06:42 PM

Jonathan McDowell

Free Software Activities for 2019

As a reader of Planet Debian I see a bunch of updates at the start of each month about what people are up to in terms of their Free Software activities. I’m not generally active enough in the Free Software world to justify a monthly report, and this year in particular I’ve had a bunch of other life stuff going on, but I figured it might be interesting to produce a list of stuff I did over the course of 2019. I’m pleased to note it’s longer than I expected.

Conferences

I’m not a big conference attendee; I’ve never worked somewhere that paid travel/accommodation for Free Software conferences so I end up covering these costs myself. That generally means I go to local things and DebConf. This year was no exception to that; I attended BelFOSS, an annual free software conference held in Belfast, as well as DebConf19 in Curitiba, Brazil. (FOSDEM was at an inconvenient time this year for me, or I’d have made it to that as well.)

Debian

Most of my contributions to Free software happen within Debian.

As part of the Data Protection Team I responded to various minor requests for advice from within the project.

The Debian Keyring was possibly my largest single point of contribution. We’re in a roughly 3-month rotation of who handles the keyring updates, and I handled 2019.03.24, 2019.06.25, 2019.08.23, 2019.09.24 + 2019.12.23.

For Debian New Members I handled a single applicant, Marcio de Souza Oliveira, as an application manager. I had various minor conversations throughout the year as part of front desk.

I managed to get binutils-xtensa-lx106 + gcc-xtensa-lx106 packages (1 + 1) for cross-building ESP8266 firmware uploaded in time for the buster release, as well as several updates throughout the year (2, 3 + 2, 3, 4). There was a hitch over some disagreements on the package naming, but it conforms with the generally accepted terms used for this toolchain.

Last year I ended up fixing an RC bug in ghdl, so this year having been the last person to touch the package I did a couple of minor uploads (0.35+git20181129+dfsg-3, 0.35+git20181129+dfsg-4). I’m no longer writing any VHDL as part of my job so my direct interest in this package is limited, but I’ll continue to try and fix the easy things when I have time.

Although I requested the removal from Debian of l2tpns (#929610), the package I originally uploaded libcli for, I still vaguely maintain libcli itself, which saw a couple of upstream-driven uploads (1.10.0-1, 1.10.2-1).

OpenOCD is coming up to 3 years since its last stable release, but I did a couple (0.10.0-5, 0.10.0-6) of minor uploads this year. I’ve promised various people I’ll do a snapshot upload and I’ll try to get that into experimental at some point. libjaylink, a dependency, also saw a couple of minor uploads (0.1.0-2, 0.1.0-3).

I pushed an updated version of libtorrent into experimental (0.13.8-1), as a pre-requisite for getting rtorrent updated. Once that had passed through NEW I uploaded 0.13.8-2 and then rtorrent 0.9.8-1.

The sigrok project produced a number of updates, sigrok-firmware-fx2lafw 0.1.7-1, libsigrok 0.5.2-1 + libsigrokdecode 0.5.3-1.

sdcc was the only package I did sponsored uploads of this year - (3.8.0+dfsg-2, 3.8.0+dfsg-3). I don’t have time to take over maintainership of this package fully, but sigrok-firmware-fx2lafw depends on it to build so I upload for Gudjon and try to help him out a bit.

Personal projects

In terms of personal projects I finally pushed my ESP8266 Clock to the outside world (and wrote it up). I started learning Go and as part of that wrote gomijia, a tool to passively listen for Bluetooth LE broadcasts from Xiaomi Mijia devices and transmit them over MQTT. I continued to work on onak, my OpenPGP key server, adding support for the experimental v5 key format and dkg’s abuse-resistant keystore proposal, and finally merged in support for signature verification. It’s due a release, but the documentation really needs to be improved before I’d be happy to do that.

picolibc

Back when picolibc was newlib-nano I had a conversation with Keith Packard about getting the ESP8266 newlib port (largely by Max Filippov based on the Tensilica work) included. Much time has passed since then, but I finally got time to port this over and test it this month. I’m hopeful the picolibc-xtensa-lx106-elf package will appear in Debian at some point in the next few months.

Snort

As part of my work at Titan IC I did some work on Snort3, largely on improving its support for hardware offload accelerators (ignore the fact my listed commits were all last year, Cisco generally do a bunch of squashed updates to the tree so the original author doesn’t always show).

Software in the Public Interest

While I haven’t sat on the board of SPI since 2015 I’m still the primary maintainer of the membership website (with Martin Michlmayr as the other active contributor). The main work carried out this year was fixing up some issues seen with the upgrade from Stretch to Buster.

Talks

I talked about my home automation, including my use of Home Assistant, at NIDC 2019, and again at DebConf with more emphasis on the various aspects of Debian that I’ve used throughout the process. I had a couple of other sessions at DebConf with the Data Protection and Keyring teams. I did a brief introduction to Reproducible Builds for BLUG in October.

Random

I had a one-liner accepted to systemd to make my laptop keyboard work out of the box. I fixed up Xilinx XRT to be able to build .debs for Debian (rather than just Ubuntu), have C-friendly header files and clean up some GCC 8.3 warnings. I submitted a fix to Home Assistant to accept 202 as a successful REST notification response. And I had a conversation on IRC which resulted in a tmux patch to force detach (literally, I asked how to do this thing and I think Colin had whipped up a patch before the conversation was even over).

December 31, 2019 06:20 PM

Chris Lamb

Free software activities in December 2019

Software Freedom Conservancy (the fiscal sponsor for the Reproducible Builds project) have announced their fundraising season with a huge pledge to match donations from a number of illustrious individuals. If you have ever considered joining as a supporter, now would be the time to do so.


Whilst it was a busy month away from the keyboard for me, here is my update covering what I have been doing in the free software world during December 2019 (previous month):


Reproducible builds

Whilst anyone can inspect the source code of free software for malicious flaws, almost all software is distributed pre-compiled to end users. The motivation behind the Reproducible Builds effort is to ensure no flaws have been introduced during this compilation process by promising identical results are always generated from a given source, thus allowing multiple third parties to come to a consensus on whether a build was compromised.

The initiative is proud to be a member project of the Software Freedom Conservancy, a not-for-profit 501(c)(3) charity focused on ethical technology and user freedom.

Conservancy acts as a corporate umbrella allowing projects to operate as non-profit initiatives without managing their own corporate structure. If you like the work of the Conservancy or the Reproducible Builds project, please consider becoming an official supporter.


I made the following changes to diffoscope, our in-depth and content-aware diff utility that can locate and diagnose reproducibility issues:


I also:


Debian

Debian LTS

This month I have worked 16½ hours on Debian Long Term Support (LTS) and 12 hours on its sister Extended LTS project.

You can find out more about the project via the following video:


Uploads


FTP Team

As a Debian FTP assistant I ACCEPTed eight packages: fluidsynth, golang-github-bmatcuk-doublestar, golang-github-pearkes-cloudflare, librandomx, meep, meep-mpi-default, meep-openmpi & node-webassemblyjs. I additionally filed two RC bugs, against fluidsynth & meep, for potentially-incomplete debian/copyright files.

December 31, 2019 01:50 PM

November 27, 2019

Daniel Silverstone (Kinnison)

Rust 2020

As in recent years, there was a call for posts about Rust in 2020. I've been sitting on my response for quite a few weeks because every time I try to write this, I think of other things I want to say, or a new "theme" I want to propose for Rust in 2020.


First, some history about myself in the Rust community -- I started learning Rust a few years ago, had a go at Advent Of Code 2016 in it, completed that, wrote a testing tool in Rust, and then promptly did nothing with Rust except read blogs and reddit for a year. Then I went and did Advent of Code 2017 in Rust and reminded myself of why I enjoyed the language. This time I decided I'd do more with Rust and started playing around. I wrote bits and bobs through 2018, but still failed to do anything major until once again, December rolled around and I did Advent of Code 2018 in Rust, learning an awful lot about procedural macros and other useful things. This was the time of the 2018 Rust edition which also spurred me on to do more useful stuff.

While I was enjoying the fun coding through December, I sat down and decided how I wanted 2019 to play out. I was very aware that I was failing a number of open source communities I was part of, and so I resolved (a) to be a net positive influence on any community I remained active in, and (b) to specifically work to be part of the wider Rust community because I wanted to give back to a community which had been welcoming to me and given me so much joy in coding. I went looking for ways to contribute, did a small patch to the panic wording as per an open issue, and then looked at the developer tools surrounding Rust. I noted that rustup needed some TLC and so I ended up filing my first PR on rustup, I then filed others, and on the 18th of December 2018, my PR to update rustup to the 2018 edition was merged and thus began my time as a rustup contributor.

Fast-forward to mid-January 2019 and the Rustup working group is established and I was invited to be a member of that. Zoom on to mid-May and I'm asked to take on the mantle of leading the working group, and from there I ended up also helping on rustdoc and various other bits and bobs. I've ended up quite involved in an exciting and interesting community which has always made me feel welcome.


With that potted history given, I hope that you can appreciate where I'm coming from when I say that the one word which keeps coming up when I think about what the Rust community needs to look toward next, the "buzzword" for Rust 2020 that I'd propose, is "Inclusivity". Interestingly I don't mean in terms of being inclusive and welcoming to LGBT people, or women, or any number of other "minority" or "marginal" groups which online communities often find themselves needing to work to include; because Rust already is exceedingly welcoming to such groups -- I've never had so much as a misplaced blink when I say "my husband" in a Rust setting, and that has been so nice. What I mean is that in order to be more widely used, the Rust community needs to look toward ensuring that it can be included into other things.

I'll list the major topic areas I'm thinking of, from most-likely to least-likely to get somewhere concrete in the coming year. At least from my perspective. I'd love to be proven wrong about some of the later items…

Getting Rust

As a Debian developer I have long enjoyed the stability and reliability of the Debian operating system, the software supplied by it, and the trust I feel I can place in it as a software distribution point. Recently Debian started to include Rust in releases, and I feel that is incredibly important. However the Rust community expects things like rustup to "just work", and that means that if someone is using a Debian-provided toolchain and asks how to get rls working, the instruction rustup component add rls simply won't work properly for them, as sketched below. This gets even more interesting when you look at NixOS, or any Arch Linux derivative, because they distribute rustup but it cannot update itself, so other behaviours the community expects don't work, despite rustup being there.
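To illustrate the mismatch concretely (a sketch; the apt package names are those found in Debian buster):

    # The Debian-blessed way to get a Rust toolchain is via apt...
    sudo apt install rustc cargo

    # ...but the standard community advice then fails at the first hurdle,
    # because rustup was never involved:
    rustup component add rls
    # bash: rustup: command not found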

Getting the Rust toolchain in the first place is a critical part of the flow of getting someone into Rust, and making that possible in a way which fits into our target user's worldview and meets their expectations for acquisition of tooling is critical to reducing the onboarding friction. So my first point of inclusivity is, amusingly, on me to spearhead and work out how to make it happen; though I'd welcome anyone wanting to make suggestions on how to make it work nicely.

Trust ergonomics

One thing which holds a lot of people back from Rust is being able to acquire it in a manner they already feel they can trust. This follows on from the point above because trust has to start somewhere, and people already trust the operating system they're running, at least to some extent. When told that the recommended method for acquiring Rust toolchains starts with a curl | bash, people have been known to recoil in horror. I think that the Rust project needs to start to come up with ways to improve the level of trust people can put in the tooling they acquire. Some of this has begun already, with a proposal to sign the crates index which is still in progress. However we need to extend that to the Rust toolchains, and to the installer (rustup) itself too, in order to be more usable to more people.

This kind of thing is something I am actively interested in, and I hope to be able to announce something in the new year at some point. Those of you who follow the rustup issues will have seen me discussing OpenPGP signatures on toolchains, and that will certainly form part of an infrastructure which people outside of the project can build trust upon. Maybe one day we'll get Debian to sign something which Debian users can use to trust a rustup that they downloaded via the curl | bash method, which we know to be one of the most accepted "current" approaches to getting non-distro-packaged software.
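As a rough sketch of the sort of flow I mean (the release version below is purely illustrative, and this assumes you have already obtained and verified the Rust release signing key through some channel you trust), the standalone toolchain tarballs already ship with detached OpenPGP signatures:

    # Fetch a toolchain tarball and its detached signature.
    curl -O https://static.rust-lang.org/dist/rust-1.39.0-x86_64-unknown-linux-gnu.tar.gz
    curl -O https://static.rust-lang.org/dist/rust-1.39.0-x86_64-unknown-linux-gnu.tar.gz.asc

    # Check the tarball against the signature before unpacking anything.
    gpg --verify rust-1.39.0-x86_64-unknown-linux-gnu.tar.gz.asc \
        rust-1.39.0-x86_64-unknown-linux-gnu.tar.gz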

Reliable, deduplicated, dependencies

One criticism which is levelled at the crates ecosystem an awful lot in my earshot, and which I doubt anyone would really argue with, is that it feels very immature -- the preponderance of 0.x version numbers for "primary" crates is something that has been worked on, but is still a huge problem. In part this is because the community is so confident with the semantic versioning we all take care over, but also because there's an amount of release anxiety manifesting. What's worse than this, though, is the way that we end up with sets of incompatible dependency chains on many of these "primary" crates. It is not uncommon to end up with multiple versions of rand, syn, quote, or others in your dependency tree. Not all of that is the fault of the authors of those libraries either, but rather because other crates are not yet up-to-date with changes in them. For example, syn is already in its 1.0 series and yet, via various pathways, rustup ends up depending on 0.15 as well.
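You can see this in any non-trivial project for yourself; a minimal sketch, assuming the cargo-tree plugin is installed (the tree subcommand is not yet built into Cargo itself at the time of writing):

    # Install the plugin once, then list every crate that appears at
    # more than one version in the dependency graph.
    cargo install cargo-tree
    cargo tree --duplicates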

In order to be more includible into distributions such as Debian, it's critical that the Rust community as a whole looks to address this kind of thing. It seems odd to say that "gardening" your library's dependencies can be a way to lead to being more includible into other things, to enhance "inclusivity" as I'm choosing to define it for this post, but it is indeed one pathway.

An end-goal of this is that a majority of tools ought to be buildable with a unified singular set of library crate versions if they are going to be usefully included into a distribution.

Making Windows feel Tier One

The Windows platform is considered Tier One by the Rust project. This means that we care, as a project, that Rust and the developer tools all work nicely on the platform. Yet it's abundantly clear that the majority of the tools' developers use either a Linux or MacOS X system. As such, concerns that Windows brings due to its non-POSIX nature are often either considered unimportant or simply ignored. I am very guilty of this myself. I am responsible for rustup, which is quite possibly the most UNIXy part of the Rust ecosystem, and yet is pretty much mandatory for every Windows user.

Another example of this is the rust-docs component, which contains tens of thousands of small files and causes Windows-based anti-virus systems significant indigestion. We ended up adding a "minimal" profile to rustup so that Windows users could skip installing the docs, but that's not great as a long-term solution.

As Rust tries to get included into more companies, this friction is going to be more and more important to resolve. A huge number of companies expect their developers to use Windows desktops, even when they're working on Linux hosted software, and making sure that's first-class will be really important as the number of users who have to use Rust (rather than those who choose to) increases.

I'm sure there are things we can do here, but I'm really going to need help to find ways to make them happen. I'm not a Windows person, but I'm very open to finding ways to improve matters on that platform.

Shared libraries

This is something I'm not sure that we can do anything particularly concrete on in the coming year, but is part of Rust's story which really needs a lot of thought. It's incredibly hard to provide a shared object which can be versioned at all similarly to how SONAME versioning currently works, due to the complexity of the types, and the instability of Rust's ABI. However at least some thought needs to be put into how we might begin to resolve this as it's yet another potential blocker to being included into distributions because it makes the security story for Rust programs so different from other languages which can and do make use of more traditional shared objects.


In summary, for me, for 2020, Rust's already very inclusive approach to its community needs to turn outward and look for ways to increase the chances that it can be included into other projects such as Linux distributions. I see this as increasing the inclusivity of the project by including into our worldview the particular needs of these other projects and communities and ensuring that by treating them as first-class consumers of Rust, we can become first-class members of their projects and communities as well.

by Daniel Silverstone at November 27, 2019 03:53 PM

November 13, 2019

Steve Engledow (stilvoid)

Maur - A minimal AUR helper

This post is about the Arch User Repository. If you’re not an Arch user, probably just move along ;)

There are lots of AUR helpers in existence already but, in the best traditions of open source, none of them work exactly how I want an AUR helper to work, so I created a new one.

Here it is: https://github.com/stilvoid/maur

maur (pronounced like “more”) is tiny. At the time of writing, it’s 49 lines of bash. It also has very few features.

Here is the list of features:

The “help” when installing a package is this, and nothing more:

If you think maur needs more features, use a different AUR helper.

If you find bugs, please submit an issue or, even better, a pull request.

Example usage

Searching the AUR

If you want to search for a package in the AUR, you can grep for it ;)

maur | grep maur

Installing a package

If you want to install a package, for example yay:

maur yay

Upgrading a package

Upgrading a package is the same as installing one. This will upgrade maur:

maur maur

by Steve Engledow at November 13, 2019 12:00 AM

October 20, 2019

Daniel Silverstone (Kinnison)

A quarter in review - Nearly there, 2020 in sight

The 2019 plan - Third-quarter review

At the start of the year I blogged about my plans for 2019. For those who don't want to go back to read that post, in summary they are:

  1. Continue to lose weight and get fit. I'd like to reach 80kg during the year if I can
  2. Begin a couch to 5k and give it my very best
  3. Focus my software work on finishing projects I have already started
  4. Where I join in other projects be a net benefit
  5. Give back to the @rustlang community because I've gained so much from them already
  6. Be better at tidying up
  7. Save up lots of money for renovations
  8. Go on a proper holiday

At the point that I posted that, I promised myself to do quarterly reviews, and so here is the third of those. The first can be found here, and the second here.

1. Weight loss

So when I wrote in July, I was around 83kg. I am very sad to report that I did not manage to drop another 5kg; I'm usually around 81.5kg at the moment, though I do peak up above 84kg and dip as far down as 80.9kg. The past three months have been an exercise in incredible frustration because I'd appear to be making progress only to have it vanish in one day.

I've continued my running, though, and cycling, and I've finally gone back to the gym and asked for a resistance routine to complement this, so here's hoping.

Yet again, I continue to give myself a solid "B" for this, though if I were generous, given everything else, I might consider a "B+".

2. Fitness (was Couch to 5k)

When I wrote in July, I was pleased to say that I was sub-28 minutes on my parkrun. I'm now consistently sub-27:30, and have personal bests of 26:18 and 26:23 at two of my local parkruns.

I have started running with my colleagues once a week, and that run is a bit longer (5.8 to 7km depending on our mood) and while I've only been out with them a couple of times so far, I've enjoyed running in a small group. With the weather getting colder I've bought myself some longer sleeved tops and bottoms to hopefully allow me to run through the winter. My "Fitness age" is now in the mid 40s, rather than high 60s, so I'm also approaching a point where I'm as fit as I am old, which is nice.

So far, so good, I'm continuing with giving myself an "A+"

3. Finishing projects

This is a much more difficult point for me this year, sadly. I continued to do some work on NetSurf this quarter. We had another amazing long weekend where we worked on a whole bunch of NS stuff, and I've even managed to give up some of my other spare time to resolve bugs, though they tend to be quite hard and I'm quite slow. I'm very pleased with how I've done with that.

Lars and I continue to work on our testing project, now called Subplot. Though, frankly, Lars does almost all of the work on this.

I did accidentally start another project (remsync) after buying a reMarkable tablet. So that bumps my score down a half point.

So over-all, this one drops to "C-", from the "C" earlier in the year - still (barely) satisfactory but could do a lot better.

4. Be a net benefit

My efforts for Debian continue to be restricted, though I hope it continues to just about be a net benefit to the project. My efforts with the Lua community have not extended again, so pretty much the same.

I remain invested in Rust stuff, and have managed (just about) to avoid starting in on any other projects, so things are fairly much the same as before. I lead the Rust installer working group and we recently released a huge update to rustup which adds a bunch of desired stuff.

While the effects of my Rust work affect both this and the next section, I am very pleased with how I did and have upgraded myself to an "A-" for this.

5. Give back to the Rust community

I have worked very hard on my Rustup work, and I have also started to review documentation and help updates for the Rust compiler itself. I've become involved in the Sequoia project, at least peripherally, and have attended a developer retreat with them which was both relaxing and productive.

I feel like the effort I'm putting into Rust is being recognised in ways I did not expect nor hope for, but that's very positive and has meant I've engaged even more with the community and feel like I'm making a valuable contribution.

I still hang around on the #wg-rustup Discord channel and other channels on that server, helping where I can, and I've been trying to teach my colleagues about Rust so that they might also contribute to the community.

So initially an 'A', I dropped to an 'A-' last time, but I feel like I've put enough effort in to give myself 'A+' this time.

6. Be better at tidying up

I've managed to do a bit more tidying, but honestly this is still pretty bad. I managed to clean up some stuff, but then it slips back into mess. The habit forming is still not happening. At this point I think I really need to grab the bull by the horns and focus on this one, so it'll be better in the next report I promise.

I'm upgrading to an 'E' because I am making some difference, just not enough.

7. Save up money for renovations

We spent those savings on our renovations, but I do continue to manage to put some away. As you can see in the next section though, I've been spending money on myself too.

I think I get to keep an 'A' here, but only just.

8. Go on a proper holiday

I spent a week with the Sequoia-PGP team in Croatia which was amazing. I have a long weekend planned with them in Barcelona for Rustfest too. Some people would say that those aren't real holidays, but I relaxed, did stuff I enjoyed, tried new things, and even went on a Zip-line in Croatia, so I'm counting it as a win.

While I've not managed to take a holiday with Rob, he's been off to do things independently, so I'm upgrading us to a 'B' here.

Summary

Last quarter I had a B+, A+, C, B, A-, F, A, C+, which, ignoring the F, was better than earlier in the year, though still not great.

This quarter I have a B+, A+, C-, A-, A+, E, A, B. The F has gone which is nice, and I suppose I could therefore call that a fair A- average, or perhaps C+ if I count the E.

by Daniel Silverstone at October 20, 2019 03:00 PM

April 06, 2019

Richard Lewis

e-Research on Texts and Images

I went to a colloquium on e-Research on Texts and Images at the British Academy yesterday; very, very swanky. Lunch was served on triangular plates, triangular! Big chandeliers, paintings, grand staircase. Well worth investigating for post-doc fellowships one day.

There were also some good papers. Just one or two things that really stuck out for me. There seems to be quite a lot of interest in e-research now around formalising, encoding, and analysing scholarly process. The motivation seems to be that, in order to design software tools to aid scholarship, it's necessary to identify what scholarly processes are engaged in and how they may be re-figured in software manifestations. This is the same direction that my research has been taking, and relates closely to the study of tacit knowledge in which Purcell Plus is engaged.

Ségolène Tarte presented a very useful diagram in her talk explaining why this line of investigation is important. It showed a continuum of activity which started with "signal" and ended with "meaning". Running along one side of this continuum were the scholarly activities and conceptions that occur as raw primary sources are interpreted, and along the other were the computational processes which may aid these human activities. Her particular version of this continuum described the interpretation of images of Roman writing tablets, so the kinds of activities described included identification of marks, characters, and words, and boundary and shape detection in images. She described some of the common aspects of this process, including: oscillation of activity and understanding; dealing with noise; phase congruency; and identifying features (a term which has become burdened with assumed meaning but which should also be considered at its most general sometimes). But I'm sure the idea extends to other humanities disciplines and other kinds of "signal" or primary sources.

Similarly, Melissa Terras talked about her work on knowledge elicitation from expert papyrologists. This included various techniques (drawn from social science and clinical psychology) such as talk-aloud protocols and concept sorting. She was able to show nice graphs of how an expert's understanding of a particular source switches between different levels continuously during the process of working with it. It's this cyclical, dynamic process of coming to understand an artifact which we're attempting to capture and encode with a view to potentially providing decision support tools whose design is informed by this encoded procedure.

A few other odd notes I made. David DeRoure talked about the importance of social science methods in e-Humanities. Amongst other things, he also made an interesting point that it's probably a better investment to teach scholars and researchers about understanding data (representation, manipulation, management) than it is to buy lots of expensive and powerful hardware. Annamaria Carusi said lots of interesting things which I'm annoyed with myself for not having written down properly. (There was something about warning of the non-neutrality of abstractions; interpretation as arriving at a hypothesis, and how this potentially aligns humanistic work with scientific method; and how use of technologies can make some things very easy, but at the expense of making other things very hard.)

April 06, 2019 09:04 PM

New baby, new house, new job

A great deal of time has passed since I last wrote a blog post. During that time my partner and I have had a baby (who's now 20 months old) and bought a house, I've started a new job, finished that new job, and started another new job.

The first new job was working for an open source consultancy firm called credativ which is based in Rugby but which, at the time I started, had recently opened a London office. Broadly, they consult on open source software for business. In practice most of the work is using OpenERP, an open source enterprise resource planning (ERP) system written in Python. I was very critical of OpenERP when I started, but I guess this was partly because my unfamiliarity with it led to me often feeling like a n00b programmer again and this was quite frustrating. By the time I finished at credativ I'd learned to understand how to deal with this quite large software system and I now have a better understanding of its real deficiencies: code quality in the core system is generally quite poor, although it has a decent test suite and is consequently functionally fairly sound, the code is scrappy and often quite poorly designed; the documentation is lacking and not very organised; its authors, I find, don't have a sense of what developers who are new to the framework actually need to know. I also found that, during the course of my employment, it took a long time to gain experience of the system from a user's perspective (because I had to spend time doing development work with it); I think earlier user experience would have helped me to understand it sooner. Apart from those things, it seems like a fairly good ERP. Although one other thing I learned working with it (and with business clients in general) is the importance of domain knowledge: OpenERP is about business applications (accounting, customer relations, sales, manufacture) and, it turns out, I don't know anything about any of these things. That makes trying to understand software designed to solve those problems doubly hard. (In all my previous programming experience, I've been working in domains that are much more familiar.)

As well as OpenERP, I've also learned quite a lot about the IT services industry and about having a proper job in general. Really, this was the first proper job I've ever had; I've earned money for years, but always in slightly off-the-beaten-track ways. I've found that team working skills (that great CV cliché) are actually not one of my strong points; I had to learn to ask for help with things, and to share responsibilities with my colleagues. I've learned a lot about customers. It's a very different environment where a lot of your work is reactive; I've previously been used to long projects where the direction is largely self-determined. A lot of the work was making small changes requested by customers. In such cases it's so important to push them to articulate as clearly as possible what they are actually trying to achieve; too often customers will describe a requirement at the wrong level of detail, that is, they'll describe a technical-level change. What's much better is if you can get them to describe the business process they are trying to implement, so you can be sure the technical change they want is appropriate, or specify something better. I've learned quite a bit about managing my time and being productive. We undertook a lot of fixed-price work, where we were required to estimate the cost of the work beforehand. This involves really knowing how long things take, which is quite a skill. We also needed to be able to account for all our working time in order to manage costs and stick within budgets for projects. So I learned some more org-mode tricks for managing effort estimates and for keeping more detailed time logs.

My new new job is working back at Goldsmiths again, with mostly the same colleagues. We're working on an AHRC-funded project called Transforming Musicology. We have partners at Queen Mary, the Centre for e-Research at Oxford, Oxford Music Faculty, and the Lancaster Institute for Contemporary Arts. The broad aim of the project can be understood as the practical follow-on from Purcell Plus: how does the current culture of pervasive networked computing affect what it means to study music and how music gets studied? We're looking for evidence of people using computers to do things which we would understand as musicology, even though they may not. We're also looking at how computers can be integrated into the traditional discipline. And we're working on extending some existing tools for music and sound analysis, and developing frameworks for making music resources available on the Semantic Web. My role is as project manager. I started work at the beginning of October so we've done four days so far. It's mainly been setting up infrastructure (website, wiki, mailing list) and trying to get a good high-level picture of how the two years should progress.

I've also moved my blog from livejournal to here which I manage using Ikiwiki. Livejournal is great; I just liked the idea of publishing my blog using Ikiwiki, writing it in Emacs, and managing it using git. Let's see if I stick to it...

April 06, 2019 09:04 PM

February 12, 2019

Steve Engledow (stilvoid)

Using Git with AWS CodeCommit Across Multiple AWS Accounts

(Cross-posted from the AWS DevOps blog)

I use AWS CodeCommit to host all of my private Git repositories. My repositories are split across several AWS accounts for different purposes: personal projects, internal projects at work, and customer projects.

The CodeCommit documentation shows you how to configure and clone a repository from one place, but in this blog post I want to share how I manage my Git configuration across multiple AWS accounts.

Background

First, I have profiles configured for each of my AWS environments. I connect to some of them using IAM user credentials and others by using cross-account roles.

I intentionally do not have any credentials associated with the default profile. That way I must always be sure I have selected a profile before I run any AWS CLI commands.

Here’s an anonymized copy of my ~/.aws/config file:

[profile personal]
region = eu-west-1
aws_access_key_id = ABCDEFGHIJKLMNOPQRST
aws_secret_access_key = uvwxyz0123456789abcdefghijklmnopqrstuvwx

[profile work]
region = us-east-1
aws_access_key_id = ABCDEFGHIJKLMNOPQRST
aws_secret_access_key = uvwxyz0123456789abcdefghijklmnopqrstuvwx

[profile customer]
region = eu-west-2
source_profile = work
role_arn = arn:aws:iam::123456789012:role/CrossAccountPowerUser

If I am doing some work in one of those accounts, I run export AWS_PROFILE=work and use the AWS CLI as normal.
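
For example, a quick session might look like this (aws s3 ls is just an arbitrary read-only command to confirm which account you're talking to):

export AWS_PROFILE=work
aws s3 ls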

The problem

I use the Git credential helper so that the Git client works seamlessly with CodeCommit. However, because I use different profiles for different repositories, my use case is a little more complex than the average.

In general, to use the credential helper, all you need to do is place the following options into your ~/.gitconfig file, like this:

[credential]
    helper = !aws codecommit credential-helper $@
    UseHttpPath = true

I could make this work across accounts by setting the appropriate value for AWS_PROFILE before I use Git in a repository, but there is a much neater way to deal with this situation using a feature released in Git version 2.13, conditional includes.

A solution

First, I separate my work into different folders. My ~/code/ directory looks like this:

code
    personal
        repo1
        repo2
    work
        repo3
        repo4
    customer
        repo5
        repo6

Using this layout, each folder that is directly underneath the code folder has different requirements in terms of configuration for use with CodeCommit.

Solving this has two parts: first, I create a .gitconfig file in each of the three folder locations. The .gitconfig files contain any customization (specifically, configuration for the credential helper) that I want in place while I work on projects in those folders.

For example:

[user]
    # Use a custom email address
    email = sengledo@amazon.co.uk

[credential]
    # Note the use of the --profile switch
    helper = !aws --profile work codecommit credential-helper $@
    UseHttpPath = true

I also make sure to specify the AWS CLI profile to use in the .gitconfig file, which means that, when I am working in the folder, I don’t need to set AWS_PROFILE before I run git push, etc.

Secondly, to make use of these folder-level .gitconfig files, I need to reference them in my global Git configuration at ~/.gitconfig.

This is done through the includeIf section. For example:

[includeIf "gitdir:~/code/personal/"]
    path = ~/code/personal/.gitconfig

This example specifies that if I am working with a Git repository that is located anywhere under ~/code/personal/, Git should load additional configuration from ~/code/personal/.gitconfig. That additional file specifies the appropriate credential helper invocation with the corresponding AWS CLI profile selected as detailed earlier.

The contents of the new file are treated as if they are inserted into the main .gitconfig file at the location of the includeIf section. This means that the included configuration will only override any configuration specified earlier in the config.
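
For completeness, given the folder layout above, the full set of conditional includes in my global ~/.gitconfig would look something like this (the same pattern repeated; adjust the paths to your own layout):

[includeIf "gitdir:~/code/personal/"]
    path = ~/code/personal/.gitconfig

[includeIf "gitdir:~/code/work/"]
    path = ~/code/work/.gitconfig

[includeIf "gitdir:~/code/customer/"]
    path = ~/code/customer/.gitconfig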

by Steve Engledow at February 12, 2019 12:00 AM

June 07, 2018

Brett Parker (iDunno)

The Psion Gemini

So, I backed the Gemini and received my shiny new device just a few months after they said that it'd ship - not bad for an indiegogo project! Out of the box, I flashed it, using the non-approved linux flashing tool available at that time, and failed to back up the parts that, err, I really didn't want blatted... So within hours I had a new phone that I, err, couldn't make calls on, which was marginally annoying. And the tech preview of Debian wasn't really worth it, as it was fairly much unusable (which was marginally upsetting, but hey). After a few more hours / days of playing around I got the IMEI number back into the Gemini and put the stock android image back on. At this point I didn't have working bluetooth or wifi, which was a bit of a pain too; it turns out the mac addresses for those are also stored in the nvram (doh!). That's now mostly working through a bit of collaboration with another Gemini owner - my Gemini currently uses the mac addresses from his device... which I'll need to fix in the next month or so, else we'll probably have a mac address collision.

Overall, it's not a bad machine. The keyboard isn't quite as good as I was hoping for, and the phone functionality is not bad once you're on a call, but not great until you're on a call; I certainly wouldn't use it to replace the Samsung Galaxy S7 Edge that I currently use as my full-time phone. It is, however, really rather useful as a sysadmin tool when you don't want to be lugging a full laptop around with you: the keyboard is better than using the on-screen keyboard on the phone, the ssh client is "good enough" to get to what I need, and the terminal font isn't bad. I look forward to seeing where it goes; I'm happy to have been an early backer, as I don't think I'd pay the current retail price for one.

by Brett Parker (iDunno@sommitrealweird.co.uk) at June 07, 2018 01:04 PM

February 21, 2018

MJ Ray

How hard can typing æ, ø and å be?

Petter Reinholdtsen: How hard can æ, ø and å be? comments on the rubbish state of till printers and their mishandling of foreign characters.

Last week, I was trying to type an email, on a tablet, in Dutch. The tablet was running something close to Android and I was using a Bluetooth keyboard, which seemed to be configured correctly for my location in England.

Dutch doesn’t even have many accents. I wanted an e acute (é). If you use the on screen keyboard, this is actually pretty easy, just press and hold e and slide to choose the accented one… but holding e on a Bluetooth keyboard? eeeeeeeeeee!

Some guides suggest Alt and e, then e. Apparently that works, but not on keyboards set to Great British… because, I guess, we don’t want any of that foreign muck since the Brexit vote, or something(!)

Even once I'd figured out that madness and switched the keyboard back to international (which also enables Alt plus i, u, n and so on to do other accents), I couldn't find grave, check, breve or several other accents. I managed to send the emails in Dutch, but I'd struggle with various other languages.

Have I missed a trick or what are the Android developers thinking? Why isn’t there a Compose key by default? Is there any way to get one?

by mjr at February 21, 2018 04:14 PM

March 01, 2017

Brett Parker (iDunno)

Using the Mythic Beasts IPv4 -> IPv6 Proxy for Websites on a v6 only Pi and getting the right REMOTE_ADDR

So, more because I was intrigued than anything else, I've got a pi3 from Mythic Beasts; they're supplied with IPv6-only connectivity, and the file storage is NFS over a private v4 network. The proxy will happily redirect requests to either http or https to the Pi, but (without turning on the Proxy Protocol) this results in the remote addresses in your logs being those of the proxy servers, which is not entirely useful.

I've cheated a bit, because turning on the Proxy Protocol for the hostedpi.com addresses is currently not exposed to customers (it's on the list!); to do it without access to Mythic's backends, use your own domain name (I've also got https://pi3.sommitrealweird.co.uk/ mapped to this Pi).

So, first step first: we get our RPi and make sure that we can log in to it via ssh (I'm nearly always on a v6 connection anyway, so this was a simple case of sshing to the v6 address of the Pi). I then installed haproxy and apache2 on the Pi and went about configuring them. With apache2, I changed it to listen on localhost only, on ports 8080 and 4443; I hadn't at this point enabled the ssl module so, really, the change for 4443 didn't kick in. Here's my /etc/apache2/ports.conf file:

# If you just change the port or add more ports here, you will likely also
# have to change the VirtualHost statement in
# /etc/apache2/sites-enabled/000-default.conf

Listen [::1]:8080

<IfModule ssl_module>
       Listen [::1]:4443
</IfModule>

<IfModule mod_gnutls.c>
       Listen [::1]:4443
</IfModule>

# vim: syntax=apache ts=4 sw=4 sts=4 sr noet

I then edited /etc/apache2/sites-available/000-default.conf to change the VirtualHost line to [::1]:8080.

So, with that in place, now we deploy haproxy in front of it; the basic /etc/haproxy/haproxy.cfg config is:

global
       log /dev/log    local0
       log /dev/log    local1 notice
       chroot /var/lib/haproxy
       stats socket /run/haproxy/admin.sock mode 660 level admin
       stats timeout 30s
       user haproxy
       group haproxy
       daemon

       # Default SSL material locations
       ca-base /etc/ssl/certs
       crt-base /etc/ssl/private

       # Default ciphers to use on SSL-enabled listening sockets.
       # For more information, see ciphers(1SSL). This list is from:
       #  https://hynek.me/articles/hardening-your-web-servers-ssl-ciphers/
       ssl-default-bind-ciphers ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+3DES:!aNULL:!MD5:!DSS
       ssl-default-bind-options no-sslv3

defaults
       log     global
       mode    http
       option  httplog
       option  dontlognull
        timeout connect 5000
        timeout client  50000
        timeout server  50000
       errorfile 400 /etc/haproxy/errors/400.http
       errorfile 403 /etc/haproxy/errors/403.http
       errorfile 408 /etc/haproxy/errors/408.http
       errorfile 500 /etc/haproxy/errors/500.http
       errorfile 502 /etc/haproxy/errors/502.http
       errorfile 503 /etc/haproxy/errors/503.http
       errorfile 504 /etc/haproxy/errors/504.http

frontend any_http
        option httplog
        option forwardfor

        acl is_from_proxy src 2a00:1098:0:82:1000:3b:1:1 2a00:1098:0:80:1000:3b:1:1
        tcp-request connection expect-proxy layer4 if is_from_proxy

        bind :::80
        default_backend any_http

backend any_http
        server apache2 ::1:8080

Obviously after that you then do:

systemctl restart apache2
systemctl restart haproxy

Now you have a proxy-protocol'd setup from the proxy servers, and you can still talk directly to the Pi over IPv6. You're not yet logging the right remote IPs, but we're a step closer. Next, enable mod_remoteip in apache2:

a2enmod remoteip

And add a file, /etc/apache2/conf-available/remoteip-logformats.conf containing:

LogFormat "%v:%p %a %l %u %t \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" remoteip_vhost_combined

And edit the /etc/apache2/sites-available/000-default.conf to change the CustomLog line to use remoteip_vhost_combined rather than combined as the LogFormat and add the relevant RemoteIP settings:

RemoteIPHeader X-Forwarded-For
RemoteIPTrustedProxy ::1

CustomLog ${APACHE_LOG_DIR}/access.log remoteip_vhost_combined

Now, enable the config and restart apache2:

a2enconf remoteip-logformats
systemctl restart apache2

Now you'll get the right remote ip in the logs (cool, huh!), and, better still, the environment that gets pushed through to cgi scripts/php/whatever is now also correct.

So, you can now happily visit http://www.<your-pi-name>.hostedpi.com/, e.g. http://www.srwpi.hostedpi.com/.
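
A quick way to convince yourself it's working (assuming the standard Debian Apache log location) is to watch the access log on the Pi while fetching a page from another machine - the client's address, rather than the proxy's, should show up:

# on the Pi
tail -f /var/log/apache2/access.log

# from a machine elsewhere (-4 forces the request in over the v4 proxy)
curl -4 http://www.srwpi.hostedpi.com/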

Next up, you'll want something like dehydrated - I grabbed the packaged version from debian's jessie-backports repository - so that you can make yourself some nice shiny SSL certificates (why wouldn't you, after all!). Once you've got dehydrated installed you'll probably want to tweak it a bit; I have some magic extra files that I use. I also suggest getting the dehydrated-apache2 package, which just makes it all much easier too.

/etc/dehydrated/conf.d/mail.sh:

CONTACT_EMAIL="my@email.address"

/etc/dehydrated/conf.d/domainconfig.sh:

DOMAINS_D="/etc/dehydrated/domains.d"

/etc/dehydrated/domains.d/srwpi.hostedpi.com:

HOOK="/etc/dehydrated/hooks/srwpi"

/etc/dehydrated/hooks/srwpi:

#!/bin/sh
action="$1"
domain="$2"

case $action in
  deploy_cert)
    privkey="$3"
    cert="$4"
    fullchain="$5"
    chain="$6"
    cat "$privkey" "$fullchain" > /etc/ssl/private/srwpi.pem
    chmod 640 /etc/ssl/private/srwpi.pem
    ;;
  *)
    ;;
esac

/etc/dehydrated/hooks/srwpi has the execute bit set (chmod +x /etc/dehydrated/hooks/srwpi), and is really only there so that the certificate can be used easily in haproxy.

And finally the file /etc/dehydrated/domains.txt:

www.srwpi.hostedpi.com srwpi.hostedpi.com

Obviously, use your own pi name in there, or better yet, one of your own domain names that you've mapped to the proxies.

Run dehydrated in cron mode (it's noisy, but meh...):

dehydrated -c
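
To keep the certificates renewed, a root crontab entry along these lines would do (the schedule and path here are just an example; you may also want to reload haproxy from the deploy hook so it picks up the new certificate):

# renew daily; dehydrated only does real work when a renewal is actually due
17 4 * * * /usr/bin/dehydrated -c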

That should then have generated you some shiny certificates (hopefully). For now, I'll just tell you how to do it through the /etc/apache2/sites-available/default-ssl.conf file: edit that file and change the SSLCertificateFile and SSLCertificateKeyFile to point to the /var/lib/dehydrated/certs/www.srwpi.hostedpi.com/fullchain.pem and /var/lib/dehydrated/certs/www.srwpi.hostedpi.com/privkey.pem files, do the edit for the CustomLog as you did for the other default site, change the VirtualHost to be [::1]:4443 (to match the Listen directive in ports.conf and the haproxy backend), and enable the site:

a2ensite default-ssl
a2enmod ssl

And restart apache2:

systemctl restart apache2

Now it's time to add some bits to haproxy.cfg; usefully, this is only a tiny tiny bit of extra config:

frontend any_https
        option httplog
        option forwardfor

        acl is_from_proxy src 2a00:1098:0:82:1000:3b:1:1 2a00:1098:0:80:1000:3b:1:1
        tcp-request connection expect-proxy layer4 if is_from_proxy

        bind :::443 ssl crt /etc/ssl/private/srwpi.pem

        default_backend any_https

backend any_https
        server apache2 ::1:4443 ssl ca-file /etc/ssl/certs/ca-certificates.crt

Restart haproxy:

systemctl restart haproxy

And we're all done! REMOTE_ADDR will appear as the correct remote address in the logs, and in the environment.

by Brett Parker (iDunno@sommitrealweird.co.uk) at March 01, 2017 06:35 PM

October 18, 2016

MJ Ray

Rinse and repeat

Forgive me, reader, for I have sinned. It has been over a year since my last blog post. Life got busy. Paid work. Another round of challenges managing my chronic illness. Cycle campaigning. Fun bike rides. Friends. Family. Travels. Other social media to stroke. I’m still reading some of the planets where this blog post should appear and commenting on some, so I’ve not felt completely cut off, but I am surprised how many people don’t allow comments on their blogs any more (or make it too difficult for me with reCaptcha and the like).

The main motive for this post is to test some minor upgrades, though. Hi everyone. How’s it going with you? I’ll probably keep posting short updates in the future.

Go in peace to love and serve the web. 🙂

by mjr at October 18, 2016 04:28 AM

March 09, 2015

Ben Francis

Pinned Apps – An App Model for the Web

(re-posted from a page I created on the Mozilla wiki on 17th December 2014)

Problem Statement

The per-OS app store model has resulted in a market where a small number of OS companies have a large amount of control, limiting choice for users and app developers. In order to get things done on mobile devices, users are restricted to using apps from a single app store, which have to be downloaded and installed on a compatible device in order to be useful.

Design Concept

Concept Overview

The idea of pinned apps is to turn the apps model on its head by making apps something you discover simply by searching and browsing the web. Web apps do not have to be installed in order to be useful; “pinning” is an optional step where the user can choose to split an app off from the rest of the web to persist it on their device and use it separately from the browser.

[Image: Pinned_apps_overview]

”If you think of the current app store experience as consumers going to a grocery store to buy packaged goods off a shelf, the web is more like a hunter-gatherer exploring a forest and discovering new tools and supplies along their journey.”

App Discovery

A Web App Manifest linked from a web page says “I am part of a web app you can use separately from the browser”. Users can discover web apps simply by searching or browsing the web, and use them instantly without needing to install them first.

[Image: Pinned_apps_discovery]

”App discovery could be less like shopping, and more like discovering a new piece of inventory while exploring a new level in a computer game.”

App Pinning

If the user finds a web app useful they can choose to split it off from the rest of the web to persist it on their device and use it separately from the browser. Pinned apps can provide a more app-like experience for that part of the web with no browser chrome and get their own icon on the homescreen.

[Image: Pinned_apps_pinning]

”For the user pinning apps becomes like collecting pin badges for all their favourite apps, rather than cluttering their device with apps from an app store that they tried once but turned out not to be useful.”

Deep Linking

Once a pinned app is registered as managing its own part of the web (defined by URL scope), any time the user navigates to a URL within that scope, it will open in the app. This allows deep linking to a particular page inside an app and seamlessly linking from one app to another.

[Image: Pinned_apps_linking]

”The browser is like a catch-all app for pages which don’t belong to a particular pinned app.”

Going Offline

Pinning an app could download its contents to the device to make it work offline, by registering a Service Worker for the app’s URL scope.

[Image: Pinned_apps_offline]

”Pinned apps take pinned tabs to the next level by actually persisting an app on the device. An app pin is like an anchor point to tether a collection of web pages to a device.”

Multiple Pages

A web app is a collection of web pages dedicated to a particular task. You should be able to have multiple pages of the app open at the same time. Each app could be represented in the task manager as a collection of sheets, pinned together by the app.

[Image: Pinned_app_pages]

”Exploding apps out into multiple sheets could really differentiate the Firefox OS user experience from all other mobile app platforms which are limited to one window per app.”

Travel Guide

Even in a world without app stores there would still be a need for a curated collection of content. The Marketplace could become less of a grocery store, and more of a crowdsourced travel guide for the web.

[Image: Pinned_apps_guide]

”If a user discovers an app which isn’t yet included in the guide, they could be given the opportunity to submit it. The guide could be curated by the community with descriptions, ratings and tags.”

3 Questions

[Image: Pinned_apps_pinned]

What value (the importance, worth or usefulness of something) does your idea deliver?

The pinned apps concept makes web apps instantly useful by making “installation” optional. It frees users from being tied to a single app store and gives them more choice and control. It makes apps searchable and discoverable like the rest of the web and gives developers the freedom of where to host their apps and how to monetise them. It allows Mozilla to grow a catalogue of apps so large and diverse that no walled garden can compete, by leveraging its user base to discover the apps and its community to curate them.

What technological advantage will your idea deliver and why is this important?

Pinned apps would be implemented with emerging web standards like Web App Manifests and Service Workers which add new layers of functionality to the web to make it a compelling platform for mobile apps. Not just for Firefox OS, but for any user agent which implements the standards.

Why would someone invest time or pay money for this idea?

Users would benefit from a unique new web experience whilst also freeing themselves from vendor lock-in. App developers can reduce their development costs by creating one searchable and discoverable web app for multiple platforms. For Mozilla, pinned apps could leverage the unique properties of the web to differentiate Firefox OS in a way that is difficult for incumbents to follow.

UI Mockups

App Search

[Image: Pinned_apps_search]

Pin App

[Image: Pin_app]

Pin Page

[Image: Pin_page]

Multiple Pages

[Image: Multiple_pages]

App Directory

[Image: App_directory]

Implementation

Web App Manifest

A manifest is linked from a web page with a link relation:

  <link rel="manifest" href="/manifest.json">

A manifest can specify an app name, icon, display mode and orientation:

 {
   "name": "GMail",
   "icons": {...},
   "display": "standalone",
   "orientation": "portrait",
   ...
 }

There is a proposal for a manifest to be able to specify an app scope:

 {
   ...
   "scope": "/"
   ...
 }

Service Worker

There is also a proposal to be able to reference a Service Worker from within the manifest:

 {
   ...
   "service_worker": {
     "src": "app.js",
     "scope": "/"
   }
   ...
 }

A Service Worker has an install method which can populate a cache with a web app’s resources when it is registered:

 this.addEventListener('install', function(event) {
  event.waitUntil(
    // open (or create) the cache and populate it with the app's resources
    // (caches.open() and cache.addAll() as per the shipped Cache API)
    caches.open('v1').then(function(cache) {
      return cache.addAll([
        '/index.html',
        '/style.css',
        '/script.js',
        '/favicon.ico'
      ]);
    }).catch(function(error) {
      console.error('error populating cache ' + error);
    })
  );
 });

So that the app can then respond to requests for resources when offline:

 this.addEventListener('fetch', function(event) {
  event.respondWith(
    // serve from the cache, falling back to the network for anything uncached
    caches.match(event.request).then(function(response) {
      return response || fetch(event.request);
    })
  );
 });

by tola at March 09, 2015 03:54 PM

December 11, 2014

Ben Francis

The Times They Are A Changin’ (Open Web Remix)

In the run up to the “Mozlandia” work week in Portland, and in reflection of the last three years of the Firefox OS project, for a bit of fun I’ve reworked a Bob Dylan song to celebrate our incredible journey so far.

Here’s a video featuring some of my memories from the last three years, with Siobhan (my fiancée) and me singing the song at you! There are even lyrics so you can sing along 😉

“Keep on rockin’ the free web” — Potch

by tola at December 11, 2014 11:26 AM

July 10, 2014

James Taylor

SSL / TLS

Is it annoying or not that everyone says SSL Certs and SSL when they really mean TLS?

Does anyone actually mean SSL? Have there been any accidents through people confusing the two?


July 10, 2014 02:09 PM

Cloud Computing Deployments … Revisited.

So it’s been a few years since I’ve posted, because it’s been so much hard work, and we’ve been pushing really hard on some projects which I just can’t talk about – annoyingly. Anyway, on March 20th, 2011 I talked about Continual Integration and Continual Deployment and the Cloud, and discussed two main methods – having what we now call ‘Gold Standards’ vs continually updating.

The interesting thing is that as we’ve grown as a company, and as we’ve become more ‘Enterprise’, we’ve brought in more systems administrators and begun to really separate the deployments from the development. The other thing is we have separated our services out into multiple vertical strands, which have different roles. This means we have slightly different processes for Banking or Payment based modules than we do for marketing modules. We’re able to segregate operational data and content from personally identifiable information – PII having much higher regulation on who can access it (and auditing of who does).

Several other key things had to change: for instance, things like SSL keys of the servers shouldn’t be kept in the development repo. Now, of course not, I hear you yell, but it’s a very blurry line. For instance, should the Django configuration be kept in the repo? Well, yes, because that defines the modules and things like URLs. Should the nginx config be kept in the repo? Well, oh – if you keep *that* in, then you would keep your SSL certs in…

So the answer becomes having lots of repos: one repo per application (Django-wise), and one repo per deployment, containing configurations. And then you start looking at build tools to bring a particular server or cluster of servers up and running.
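
As a sketch, the split ends up looking something like this (the repo names are hypothetical):

billing-app/          one Django application per repo
marketing-app/        another application repo
deploy-banking/       per-deployment repo holding configuration
deploy-marketing/     (nginx config, certs and the like live here, not with the code)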

The process (for our more secure, audited services) is looking like a tool to bring an AMI up, get everything installed and configured, and then take a snapshot, and then a second tool that takes that AMI (and all the others needed) and builds the VPC inside of AWS. It’s a step away from the continual deployment strategy, but it is mostly automated.


July 10, 2014 02:09 PM

June 12, 2014

Paul Tansom

Beginning irc

After some discussion last night at PHP Hants about the fact that irc is a great facilitator of support / discussion, but largely ignored because there is rarely enough information for a new user to get going, I decided it may be worth putting together a howto-type post, so here goes…

What is irc?

First of all, what on earth is it? I’m tempted to describe it as Twitter done right years before Twitter even existed, but I’m a geek and I’ve been using irc for years. It has a long heritage, but unlike the ubiquitous email it hasn’t made the transition into mainstream use. In terms of usage it has similarities to things like Twitter and Instant Messaging. Let’s take a quick look at this.

Twitter allows you to broadcast messages; they get published and anyone who is subscribed to your feed can read what you say. Everything is pretty instant, and if somebody is watching the screen at the right time they can respond straight away. Instant Messaging, on the other hand, is more of a direct conversation with a single person, or sometimes a group of people, but it too is pretty instantaneous – assuming, of course, that there’s someone reading what you’ve said. Both of these technologies are pretty familiar to many. If you go to the appropriate website you are given the opportunity to sign up and either use a web-based client or download one.

It is much the same for irc in terms of usage, although conversations are grouped into channels which generally focus on a particular topic rather than being generally broadcast (Twitter) or more specifically directed (Instant Messaging). The downside is that in most cases you don’t get a web page with clear instructions of how to sign up, download a client and find where the best place is to join the conversation.

Getting started

There are two things you need to get going with irc, a client and somewhere to connect to. Let’s put that into a more familiar context.

The client is what you use to connect with; this can be an application – so as an example Outlook or Thunderbird would be a mail client, or IE, Firefox, Chrome or Safari are examples of clients for web pages – or it can be a web page that does the same thing – so if you go to twitter.com and login you are using the web page as your Twitter client. Somewhere to connect to can be compared to a web address, or if you’ve got close enough to the configuration of your email to see the details, your mail server address.

Let’s start with the ‘somewhere to connect to‘ bit. Freenode is one of the most popular irc servers, so let’s take a look. First we’ll see what we can find out from their website, http://freenode.net/.

[Screenshot: freenode]

There’s a lot of very daunting information there for somebody new to irc, so ignore most of it and follow the Webchat link on the left.

[Screenshot: webchat]

That’s all very well and good, but what do we put in there? I guess the screenshot above gives a clue, but if you actually visit the page the entry boxes will be blank. Well first off there’s the Nickname, this can be pretty much anything you like, no need to register it – stick to the basics of letters, numbers and some simple punctuation (if you want to), keep it short and so long as nobody else is already using it you should be fine; if it doesn’t work try another. Channels is the awkward one, how do you know what channels there are? If you’re lucky you’re looking into this because you’ve been told there’s a channel there and hopefully you’ve been given the channel name. For now let’s just use the PHP Hants channel, so that would be #phph in the Channels box. Now all you need to do is type in the captcha, ignore the tick boxes and click Connect and you are on the irc channel and ready to chat. Down the right you’ll see a list of who else is there, and in the main window there will be a bit of introductory information (e.g. topic for the channel) and depending on how busy it is anything from nothing to a fast scrolling screen of text.

[Screenshot: phph]

If you’ve mistyped there’s a chance you’ll end up in a channel specially created for you because it didn’t exist; don’t worry, just quit and try again (I’ll explain that process shortly).

For now all you really need to worry about is typing in text and posting it; this is as simple as typing it into the entry box at the bottom of the page and pressing return. Be polite, be patient and you’ll be fine. There are plenty of commands that you can use to do things, but for now the only one you need to worry about is the one to leave, which is:

/quit

Type it in the entry box, press return and you’ve disconnected from the server. The next thing to look into is using a client program since this is far more flexible, but I’ll save that for another post.
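
In the meantime, a few other common commands are worth knowing; they’re typed into the same entry box and work in the webchat and pretty much any client (the channel and nicknames here are just examples):

/join #phph          join a channel
/nick mynewname      change your nickname
/msg somenick hello  send a private message to somenick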

The post Beginning irc appeared first on Linuxlore.

by Paul Tansom at June 12, 2014 04:27 PM

February 06, 2014

Adam Bower (quinophex)

I finally managed to beat my nemesis!

I purchased this book http://www.amazon.co.uk/dp/0738206679 (Linked, by Barabasi) on the 24th of December 2002, and had since made 6 or 7 aborted attempts at reading it to completion; each time life suddenly got busy and just took over, which meant that I put the book down and didn't pick it up again until things were less hectic some time later, and then started again.

Anyhow, I finally beat the book a few nights ago; my comprehension of it was pretty low, but at least it is done. Just shows I need to read lots more, given how little went in.





February 06, 2014 10:40 PM

February 01, 2014

Adam Bower (quinophex)

Why buying a Mio Cyclo 305 HC cycling computer was actually a great idea.

I finally made it back out onto the bike today for the first time since September last year. I'd spent some time ill in October and November, which meant I had to stop exercising; as a result I've gained loads of weight over the winter and, it turns out, become very unfit, which can be verified by looking at the Strava ride from today: http://www.strava.com/activities/110354158

Anyhow, a nice thing about this ride is that I can record it on Strava and get this data about how unfit I have become. This is because last year I bought a Mio Cyclo 305 HC cycle computer http://eu.mio.com/en_gb/mio-cyclo-305-hc.htm from Halfords, reduced to £144.50 (using a British Cycling discount). I was originally going to get a Garmin 500 but Amazon put the price up from £149.99 the day I was going to buy it to £199.99.

I knew when I got the Mio that it had a few issues surrounding usability and features, but it was cheap enough at under £150 that I figured even if I didn't get on with it I'd at least have a cadence sensor and heart rate monitor, so I could just buy a Garmin 510 when they sorted out the firmware bugs with that and the price came down a bit - which is still my longer-term intention.

So it turns out a couple of weeks ago I plugged my Mio into a Windows VM when I was testing USB support and carried out a check for new firmware. I was rather surprised to see a new firmware update and new set of map data available for download. I installed it thinking I wasn't going to get any new features from it, as Mio had released some new models, but it turns out the new firmware actually enables a feature that makes the device massively more useful (amongst other things they also tidied up the UI and sorted a few other bugs): it now also creates files in .fit format, which can be uploaded directly to Strava.

This is massively useful for me because, although the Mio always worked in Linux (the device is essentially just a USB mass storage device), you previously had to do an intermediate step of using https://github.com/rhyas/GPXConverter to convert the files from the Mio-centric GPX format to something Strava would recognise. Now I can just browse to the folder and upload the file directly, which is very handy.

All in all, it turns out that buying a Mio - which, reading reviews and forums, was surrounded by doom and gloom - means I can wait even longer before considering replacement with a Garmin.


February 01, 2014 02:11 PM

January 01, 2014

John Woodard

A year in Prog!


It's New Year's Day 2014 and I'm reflecting on the music of the past year.

Album-wise there were several okay...ish releases in the world of Progressive Rock. Steven Wilson's The Raven That Refused To Sing was not the absolute masterpiece some have eulogised - a solid effort, though it did contain some filler. Motorpsycho entertained with Still Life With Eggplant - not as good as their previous album, but again a solid effort. Magenta as ever didn't disappoint with The 27 Club; wishing Tina Booth a swift recovery from her ill health.

The three stand-out albums for me, in no particular order, were Edison's Children's Final Breath Before November, which almost made it as album of the year, and Big Big Train with English Electric Full Power, which combined last year's Part One and this year's Part Two with some extra goodies to make the whole greater than the sum of the parts. Also, Adrian Jones of Nine Stones Close fame pulled one out of the bag with his side project Jet Black Sea, which was very different and a challenging listen - hard going at first but surprisingly very good. This man is one superb guitarist, especially if you like emotion wrung out of the instrument like David Gilmour or Steve Rothery.

The moniker of Album of the Year this year goes to Fish for the incredible Feast of Consequences. A real return to form and his best work since Raingods With Zippos. The packaging of the deluxe edition, with a splendid book featuring the wonderful artwork of Mark Wilkinson, was superb. A real treat, with a very thought-provoking suite about the First World War that really hammered home the saying "Lest we forget". A fine piece that needs to be heard every November 11th.


Gig-wise, again Fish at the Junction in Cambridge was great. His voice may not be what it was in 1985 but he is the consummate performer, very at home on the stage. He is every bit as entertaining as a raconteur between songs as he is singing the songs themselves.

The March Marillion Convention in Port Zélande, Holland, where they performed their masterpiece Brave, was very special, as every performance of that incredible album is. The Marillion Conventions are always special, but Brave made this one even more special than it would normally be.
Gig of the year goes again to Marillion, at Aylesbury Friars in November. I had waited thirty years and forty-odd shows to see them perform Garden Party segued into Market Square Heroes, and that glorious night it came to pass. I am now one very happy Progger - or should that be Proggie? Never mind: Viva Progressive Rock!

by BigJohn (aka hexpek) (noreply@blogger.com) at January 01, 2014 07:56 PM

December 01, 2013

Paul Tansom

Scratch in a network environment

I have been running a Code Club at my local Primary School for a while now, and thought it was about time I put details of a few tweaks I’ve made to the default Scratch install to make things easier. So here goes:

With the default install of Scratch (on Windows), projects are saved to the C: drive. For a network environment, with pupils' work stored on a network drive so they always have access whichever machine they sit at, this isn't exactly helpful. It also isn't ideal that they can explore the C: drive in spite of profile restrictions (although it isn't the end of the world as there is little they can do from Scratch).

[Screenshot: save-orig]

After a bit of time with Google I found the answer, and since it didn’t immediately leap out at me when I was searching I thought I’d post it here (perhaps my Google Fu was weak that day). It is actually quite simple, especially for the average Code Club volunteer I should imagine; just edit the scratch.ini file. This is, as would be expected, located in:

C:\Program Files\Scratch\Scratch.ini

Initially it looks like this:

[Screenshot: ini-orig]

Pretty standard stuff, but unfortunately no comments to indicate what else you can do with it. As it happens you can add the following two lines (for example):

Home=U:
VisibleDrives=U:

To get this:

[Screenshot: ini-new]

They do exactly what it says on the tin. If you click on the Home button in a file dialogue box then you only get the drive(s) specified. You can also put a full path in if you want to put the home directory further down the directory structure.

[Screenshot: save-new1]

The VisibleDrives option restricts what you can see if you click on the Computer button in a file dialogue box. If you want to allow more visible drives then separate them with a comma.

[Screenshot: save-new2]

You can do the same with a Mac (for the home drive), just use the appropriate directory format (i.e. no drive letter and the opposite direction slash).

There is more that you can do, so take a look at the Scratch documentation here. For example if you use a * in the directory path it is replaced by the name of the currently logged on user.

Depending on your network environment it may be handy for your Code Club to put the extra resources on a shared network drive and open up an extra drive in the VisibleDrives. One I haven’t tried yet it is the proxy setting, which I hope will allow me to upload projects to the Scratch website. It goes something like:

ProxyServer=[server name or IP address]
ProxyPort=[port number]
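
Putting all of that together, a scratch.ini for a typical network setup might end up looking something like this (the drive letters, server address and port are, of course, examples to adapt):

Home=U:
VisibleDrives=U:,S:
ProxyServer=192.168.0.10
ProxyPort=8080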

The post Scratch in a network environment appeared first on Linuxlore.

by Paul Tansom at December 01, 2013 07:00 PM

January 16, 2013

John Woodard

LinuxMint 14 Add Printer Issue


I wanted to print from my LinuxMint 14 (Cinnamon) PC via a shared Windows printer on my network. Problem is, it isn’t found by the printers dialog in system settings. I thought I’d done all the normal things to get samba to play nice, like rearranging the name resolve order in /etc/samba/smb.conf to a more sane “bcast host lmhosts wins” - having host and wins (neither of which I’m using) first in the order cocks things up somewhat. Every time I tried to search for the printer in the system settings dialog it told me “FirewallD is not running. Network printer detection needs services mdns, ipp, ipp-client and samba-client enabled on firewall.” So much scratching of the head there then, because as far as I can tell there ain’t no daemon by that name available!

It turns out, thanks to /pseudomorph, that this has been a bug since LinuxMint 12 (based on Ubuntu 11.10). It’s due to that particular daemon (Windows people: daemon pretty much = service) being Fedora-specific; it should have no place in a Debian/Ubuntu-based distribution. Bugs of this nature really should be ironed out sooner.

Anyway, the simple fix is to use the more traditional approach: the older printer dialog, which is accessed by running system-config-printer at the command line. That works just fine, so why ship the new (over a year old) printer config dialog that is inherently broken, I ask myself.

The CUPS web interface apparently also works: visit http://localhost:631/ in your favourite browser. It should be there as long as CUPS is installed, which it is in LinuxMint by default.

So come on Minty people get your bug squashing boots on and stamp on this one please.

Update

Bug #871985 only affects GNOME 3, so as long as it’s not affecting Unity that will be okay, Canonical, will it!

by BigJohn (aka hexpek) (noreply@blogger.com) at January 16, 2013 12:39 AM

August 20, 2012

David Reynolds

On Music

Lately (well, I say lately; I think it’s been the same for a few years now) I have been finding that it is very rare that an album comes along that affects me in the way that music I heard 10 years ago seems to. That is not to say that I have not heard any music that I like in that time; it just doesn’t seem to mean as much as music that has been in my life for years. What I am trying to work out is if that is a reflection on the state of music, of how I experience music, or just me.

Buying

Buying music was always quite an experience. I would spend weeks, months and sometimes longer saving up to buy some new music. Whether I knew exactly what I wanted or just wanted “something else by this artist”, I would spend some time browsing the racks weighing up what was the best value for my money. In the days before the internet, if you wanted to research an artist’s back catalogue, you were generally out of luck unless you had access to books about the artists. This led to the thrill of finding a hidden gem in the racks that you didn’t know existed or had only heard rumours about. The anticipation of listening to the new music would build even more because I would have to wait until I had travelled home before I could listen to my new purchases.

Nowadays, with the dizzying amount of music constantly pumped into our ears through the internet, radio, advertising and the plethora of styles and genres, it is difficult to sift through and find artists and music that really speak to you. Luckily, there are websites available to catalogue releases by artists so you are able to do thorough research and even preview your music before you purchase it. Of course the distribution methods have changed massively too. No longer do I have to wait until I can make it to a brick and mortar store to hand over my cash. I can now not only buy physical musical releases on CD or Vinyl online and have it delivered to my door, I can also buy digital music through iTunes, Amazon or Bandcamp or even stream the music straight to my ears through services like Spotify or Rdio. Whilst these online sales avenues are great for artists to be able to sell directly to their fans, I feel that some of the magic has been removed from the purchasing of music for me.

Listening

Listening to the music used to be an even greater event than purchasing it. After having spent the time saving up for the purchase, then the time carefully choosing the music to buy and getting it home, I would then sit myself down and listen to the music. I would immerse myself totally in the music and only listen to it (I might read the liner notes if I hadn’t exhausted them on the way home). It is difficult to imagine doing one thing for 45+ minutes without the constant interruptions from smartphones, tablet computers, games consoles and televisions these days. I can’t remember the last time I listened to music on good speakers or headphones (generally I listen on crappy computer speakers or to compressed audio on my iPhone through crappy headphones) without reading Twitter, replying to emails or reading copious amounts of information about the artists on Wikipedia. This all serves to distract from the actual enjoyment of just listening to the music.

Experience

The actual act of writing this blog post has called into sharp focus the main reason why music doesn’t seem to affect me nowadays as much as it used to - because I don’t experience it in the same way. My life has changed, I have more responsibilities and less time to just listen, which makes the convenience and speed of buying digital music online much more appealing. You would think that this ‘instant music’ should be instantly satisfying but for some reason it doesn’t seem to work that way.

What changed?

I wonder if I am the only one experiencing this? My tastes in music have definitely changed a lot over the last few years, but I still find it hard to find music that I want to listen to again and again. I’m hoping I’m not alone in this; alternatively, I’m hoping someone might read this and recommend some awesome music to me and cure this weird musical apathy I appear to be suffering from.

August 20, 2012 03:33 PM

June 25, 2012

Elisabeth Fosbrooke-Brown (sfr)

Black redstarts

It's difficult to use the terrace for a couple of weeks, because the black redstart family is in their summer residence at the top of a column under the roof. The chicks grow very fast, and the parents have to feed them frequently; when anyone goes out on the terrace they stop the feeding process and click shrill warnings to the chicks to stay still. I worry that if we disturb them too often or for too long the chicks will starve.

Black redstarts are called rougequeue noir (black red-tail) in French, but here they are known as rossignol des murailles (nightingale of the outside walls). Pretty!

The camera needs replacing, so there are no photos of Musatelier's rossignols des murailles, but you can see what they look like on http://fr.wikipedia.org/wiki/Rougequeue_noir.

by sunflowerinrain (noreply@blogger.com) at June 25, 2012 08:02 AM

June 16, 2012

Elisabeth Fosbrooke-Brown (sfr)

Roundabout at Mirambeau

Roundabouts are taken seriously here in France. Not so much as traffic measures (though it has been known for people to be cautioned by the local gendarmes for not signalling when leaving a roundabout, and quite rightly too), but as places to ornament.

A couple of years ago the roundabout at the edge of Mirambeau had a make-over which included an ironwork arch and a carrelet (fishing hut on stilts). Now it has a miniature vineyard as well, and roses and other plants for which this area is known.

Need a passenger to take photo!

by sunflowerinrain (noreply@blogger.com) at June 16, 2012 12:06 PM

September 04, 2006

Ashley Howes

Some new photos

Take a look at some new photos my father and I have taken. We are experimenting with our new digital SLR with a variety of lenses.

by Ashley (noreply@blogger.com) at September 04, 2006 10:42 AM

August 30, 2006

Ashley Howes

A Collection of Comments

This is a bit of fun: a collection of comments found in code, from The Daily WTF.

by Ashley (noreply@blogger.com) at August 30, 2006 01:13 AM