Posts for April 2014

2014-04-06: control-archive 1.6.1

control-archive is the software that maintains the archive of Usenet control messages and the "canonical" list of newsgroups. There's nothing particularly exciting in this release, but there were some accumulated metadata changes, and it had been nearly a year since the previous release.

This is one of my few packages that uses a three-part version number, since I figured there would be metadata-only releases and wanted a way to designate those. And then, of course, I almost always made code changes. So this is only the second metadata-only release (which is part of why I did a release now, to be honest).

At some point, I still want to rewrite the underlying signature validation code and then redo all the code in this package to match my current coding style and be quite a bit cleaner. But the hours to do projects like that aren't particularly forthcoming at the moment.

You can get the latest release from the control-archive distribution page.

2014-04-11: Accumulated haul

Wow, it's been a while since I've done this. In part because I've not had much time for reading books (which doesn't prevent me from buying them).

Jared Bernstein & Dean Baker — Getting Back to Full Employment (non-fiction)
James Coughtrey — Six Seconds of Moonlight (sff)
Philip J. Davis & Reuben Hersh — The Mathematical Experience (non-fiction)
Debra Dunbar — A Demon Bound (sff)
Andy Duncan & Ellen Klages — Wakulla Springs (sff)
Dave Eggers & Jordan Bass — The Best of McSweeney's (mainstream)
Siri Hustvedt — The Blazing World (mainstream)
Jacqueline Koyanagi — Ascension (sff)
Ann Leckie — Ancillary Justice (sff)
Adam Lee — Dark Heart (sff)
Seanan McGuire — One Salt Sea (sff)
Seanan McGuire — Ashes of Honor (sff)
Seanan McGuire — Chimes at Midnight (sff)
Seanan McGuire — Midnight Blue-Light Special (sff)
Seanan McGuire — Indexing (sff)
Naomi Mitchison — Travel Light (sff)
Helaine Olen — Pound Foolish (non-fiction)
Richard Powers — Orfeo (mainstream)
Veronica Schanoes — Burning Girls (sff)
Karl Schroeder — Lockstep (sff)
Charles Stross — The Bloodline Feud (sff)
Charles Stross — The Traders' War (sff)
Charles Stross — The Revolution Trade (sff)
Matthew Thomas — We Are Not Ourselves (mainstream)
Kevin Underhill — The Emergency Sasquatch Ordinance (non-fiction)
Jo Walton — What Makes This Book So Great (non-fiction)

So, yeah. A lot of stuff.

I went ahead and bought nearly all of the novels Seanan McGuire had out that I'd not read yet after realizing that I'm going to eventually read all of them and there's no reason not to just own them. I also bought all of the Stross reissues of the Merchant Princes series, even though I had some of the books individually, since I think it will make it more likely I'll read the whole series this way.

I have so much stuff that I want to read, but I've not really been in the mood for fiction. I'm trying to destress enough to get back in the mood, but in the meantime have mostly been reading non-fiction or really light fluff (as you'll see from my upcoming reviews). Of that long list, Ancillary Justice is getting a lot of press and looks interesting, and Lockstep is a new Schroeder novel. 'Nuff said.

Kevin Underhill is the author of Lowering the Bar, which you should read if you haven't, since it's hilarious. I'm obviously looking forward to that.

The relatively obscure mainstream novels here are more Powell's Indiespensable books. I will probably cancel that subscription soon, at least for a while, since I'm just building up a backlog, but it's part of my general effort to read more mainstream fiction. (I was a bit disappointed since there were several months with only one book, but the current month finally came with two books again.)

Now I just need to buckle down and read. And play video games. And do other things that are fun rather than spending all my time trying to destress from work and zoning in front of the TV.

2014-04-23: On learning, growth, and trust

Here are two separate ideas about programming, Internet security, Internet architecture, and free software. Both of them are fundamental to everything those of us who work on free software are doing.

  1. Writing secure and reliable code is a highly complex and demanding task. It's something that one has to learn, like any other skilled profession. It's not something we're very good at teaching via any mechanism other than apprenticeship and experimentation. The field is changing quickly; if you took ten years off from writing security-critical code, you would expect to have to learn multiple new tools (static analysis, testing techniques, security features), possibly new programming languages, and not infrequently new ways of thinking about threat models and vulnerabilities.

  2. Nearly every computer user trusts tens of thousands of other programmers and millions of lines of code with their day-to-day computer security and reliability, without auditing that code themselves. Even if you have the skill required to audit the code (and very, very few people have the skill to audit all of the code that they use), you do not have the time. Therefore, our computer security is built on trust. We're trusting other programmers not to be malicious, which is obvious, but we're also trusting other programmers to be highly skilled, careful, cautious (but still fast and productive, since we quickly abandon software that isn't "actively developed"), and constantly adopting new techniques and threat models.

I think both of those principles are very widely understood and widely acknowledged. And, as we all know, both of those principles have failure modes, and those failures mean that our computers are nowhere near as secure as we would like them to be.

There has been quite a lot of technical discussion about both of these principles in recent days and months, ranging from better code analysis and testing through the flaws in particular programming languages to serious questions about the trust model we use for verifying the code that we're running. I'm going to switch gears away from that discussion for a moment to talk about a social aspect.

When a piece of code that one is using has a security vulnerability, it is not unusual to treat that as a trust failure. In other words, technical flaws are very quickly escalated to social flaws. This is less likely among practitioners, since we're all painfully aware of how many bugs we have in our own code, but it's still not unheard of, particularly if the authors of the code do not immediately and publicly show a socially-acceptable level of humility and contrition (in the middle of what is often a horrifically stressful experience).

I think this reaction comes largely from fear. Anyone who has given it a moment's thought is painfully aware of just how much code they are running on trust, and just how many ways it could be compromised, accidentally or maliciously. And how frequently that code is compromised. The less control one has over a situation, the more terrifying it is, and the cold reality is that we have very little control over our computer security. There is a natural tendency, when afraid, to look for targets for that fear.

Now, go back and think about the first principle.

Anyone who has ever worked on or near free software is painfully aware that we have far more good ideas about how to improve computing and security than we have skilled resources to execute on those ideas. My own backlog of things that I've already thought about, know would be good ideas, and simply have to implement is probably longer than my remaining lifespan. I suspect that's the case for nearly all of us.

In other words, we have a severe shortage of programmers who care and who have skill. We desperately need more skilled programmers who can write secure and reliable code. We may (and I think do) also need better tools, techniques, languages, protocols, and all the other machinery that we, as technical people, spend most of our time thinking about. But none of that changes the fact that we need more skilled people. In fact, it makes that need more acute: in addition to skilled people to write the code we use, we need skilled people to write the tools.

Skilled people are not born. They're made. And in professions where training techniques are still in their infancy and where we don't have a good formal grasp on which techniques work, those skilled people are made primarily through apprenticeship, experimentation, and learning from failure.

Worse, people who were skilled do not remain skilled without continually participating in that learning process. See the above point about a fast-changing field with evolving best practices. It's not enough to know how to write secure code to the best known practices today, or even enough to retrofit all of your existing code to current knowledge (which is often so large an effort as to be practically impossible). You have to constantly, continually learn more, for which there is no reliable formal training.

We have to try, fail, try again, and fail better.

But failure that leads to a security vulnerability is treated as a loss of trust. We trusted that person to write secure code that we could use. They failed. Now we can't trust them. Based on the trust model of security, we should revoke their ability to try again and instead rely on people who have not failed, since that will make us more secure.

Except now we just broke the learning process. And there's no such thing as a programmer who can stop learning. So what does that do to our resource pool?

It's sadly ironic, but I believe the free software community writ large has a very serious collaboration problem: we do not tolerate each other's learning processes. This leads to a wide variety of social failures around hostile communities and the failures of meritocracy that other people have talked about at much greater length. But even if you set that all aside, it threatens our security. We need secure code, a lot of it, a lot more than we have right now. To get that code, we need people who can write it. We need to grow, encourage, and support those people and enable their learning processes.

Code is written by people. If we rip people apart when they write bad, insecure code, we don't get better, secure code. We get fewer people writing security code. We get far fewer people writing security code in public, since some of the people who haven't been ripped apart will look at that experience and say, "No way am I going near that problem area. It's too scary. I don't want to end up like those programmers."

Fewer people writing security code means fewer people learning how to write better security code.

Fewer people capable of writing good, secure code is not a solution to any of our problems.

If we do not tolerate, support, and encourage the learning process required to become a skilled programmer, or maintain one's skill in programming, we are destroying our future as a community.

When you find code that is broken and badly written, you have found a problem that should be reported, analyzed, and corrected. You have also found a programmer who is about to have one of two very different types of experiences. Either they are about to learn how to become a better programmer, or they are about to be publicly shamed, humiliated, and treated as untrustworthy. Which branch they take is partly up to them, but it's also heavily influenced by how all of us react socially to the discovery of bad code.

One of those branches leads to more good, secure code being written in the future. The other does not.

Last modified and spun 2017-02-20