Please see the disclaimer.

Assumed Audience: Hackers, programmers, and anyone in the tech industry. But especially EU-level politicians who might vote on the CRA.

Epistemic Status: Only somewhat confident, but absolutely confident that the ideas herein have some chance of improving the status quo.

Discuss on Hacker News and Reddit.

Please send this post to EU politicians.

Introduction

Programmers are terrified. And rightfully so.

There is a monster, a colossus that may sweep into the shining City of Open Source and trample it all with less care than Godzilla.

This beast is the child of desire for better software, but it has grown into a caricature with the opposite effect.

This leviathan freak is Europe’s Cyber Resilience Act (CRA).

There’s a lot of bad (and you should read about it), but to speedrun the parts I care about:

  • All software (“digital products”) must meet “essential cybersecurity requirements.”
  • “Critical” software must be certified.

Okay, doesn’t sound too bad, right? We all want companies to improve…

<whisper>It includes Open Source software that takes donations.</whisper>

Son of another rippin’ sandstorm! That’s a disaster!

The Solution

So what if I told you there is a solution that would not only save Open Source, but fund it and improve cybersecurity? Would you believe me?

If you don’t, well, you can quit reading now and pretend ignorance. And don’t let the boot hit you on the way out.

But if you are at least curious, you probably think there is a catch.

I won’t bury it; the catch is this: we must own the responsibility society is trying to give us and act like it!

Accepting Liability

Oh, you don’t know what taking responsibility means?

<mutter>Kids these days.</mutter>

The meaning of responsibility is accepting liability if you get paid.

That’s it. That is all.

If you make it and sell it, you better make it right.

In both senses of “making it right.”

Of course, if we’re going to do this, we need some system, and the CRA obviously ain’t it.

People talk about the CRA being “vague” and “broad,” but those terms are nigh nonsensical.

Let me lay out why the CRA is bad: it creates liability between parties that may have no business relationship!

You could take FOSS that I wrote, and if I happen to accept donations, I’m now liable to you even though you paid me nothing.

“But Gavin, we do the same thing for products.”

Not really.

Yes, a physical product does still need to meet minimum standards. But if you get a physical thing for free from someone else (free couch on the curb, anyone?), the giver no longer has the thing, so at most one person at a time has any claim on the manufacturer.

Software is different. If I give you some software from someone else, that does not deprive me of that software, so the author is now liable for us both.

Liability for software can grow without bound.

In addition, software is “pure thought-stuff”, so it is not limited by physical constraints, for the most part. This means that software can become more complex than the most complex physical objects we could build.

Example: you make a physically accurate physics engine. And then add a dragon to it.

Anyway, software is different, and we need a different legal framework.

Fault Flow Matches Money Flow

So let’s use fault flow, er, liability protection, to fix the CRA.

With the CRA as it is, you are basically liable to society as a whole.

That cannot work; that is unsustainable, maybe even for the largest companies!

Even as someone who wishes for proper liability in the industry, that thought is petrifying!

So the most important change we need to make to the CRA is that fault should only flow upstream when money does.

If there is no monetary business relationship, the software author should have zero liability.

Barring negligence and malice.

No poisoning free cookies, Mallory.

This one simple change saves Open Source. Period. No questions asked.

Why? Because Open Source projects don’t have business relationships. At least, the ones we care about don’t.

Most importantly, our Nebraska rando will be saved from lawsuits!

One little detail that we must not forget: since fault flow matches money flow, that means that whenever money does flow, whoever receives it must accept liability.

This is the part that gives this proposal teeth, and this is why it needs to be done in law like the CRA; if contracts could still disclaim liability, they would, so we need to make it illegal to do so.
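As a thought experiment, the rule can be sketched in a few lines of Python. This is only a toy model of the proposal, and every name in it (companies, payments) is hypothetical:

```python
# Toy model of "fault flows upstream only when money does."
# Each pair records that a payer paid an author; all names here are
# made-up illustrations, not real business relationships.
payments = {
    ("MegaCorp", "LibFoundation"),  # MegaCorp paid the foundation
}

def is_liable(author: str, user: str) -> bool:
    """An author is liable to a user only if that user paid them.
    (Negligence and malice would be carve-outs in the real law.)"""
    return (user, author) in payments

# A paying customer can hold the author liable...
assert is_liable("LibFoundation", "MegaCorp")
# ...but a free rider who paid nothing has no claim at all.
assert not is_liable("LibFoundation", "RandomDownloader")
```

The whole proposal hinges on that one membership test: no payment edge, no liability edge.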

Money in Open Source

“But Gavin, a lot of FOSS projects I care about take money, so that requirement will hurt them.”

Au contraire! It gives them a built-in business model!

Let’s take Blender as an example. They get donations from some of the biggest companies.

Well, those companies (Intel, Dell, Nvidia, and AMD) pay to ensure Blender works on their hardware, so the liability the Blender Foundation carries is that Blender will work on that hardware.

😐

Um, yes, that’s what they are already paying for. Blender would have no problem if this proposal happened. In fact, they might get more money as more studios that use Blender now must pay or accept liability.

And that is the built-in business model: actual FOSS maintainers would accept the fault flowing back to them and get paid for it.

Starting a career in FOSS would become almost comically simple: start a business or non-profit around some software, work on it on the side until it grows enough revenue to support you full-time, and then quit your day job in favor of your business.

No fiddling with business models, no trying to “sell” your software (for money) or tug on heartstrings to drum up donations.

You make a project, you get programmers to use it for free (“ooh, shiny new thing!”), and you have a business as soon as they start using your code in employment.

Like stealing a bone from a puppy.

Of course, people do have to use your software, but I’m not the greatest salesman, and I got a project into Mac OS X. How hard could it be?

Famous last words.

“But Gavin, I would still have to found a business or a non-profit!”

Well, yeah, receiving money is work.

But after having spun up an LLC with the ability to receive money, I can say that it’s not too hard.

And if there are lawyers out there who use Open Source, you can pay it back by guiding maintainers in setting up these orgs pro bono. Just sayin'.

At that point, you have an actual, real-life entity that businesses can send money to. Businesses love sending money to entities rather than people.

For bonus points, you can use something like GnuCash to create “official” invoices; businesses will snarf that like my wife snarfs sushi!

You get all this for a one-time payment of a week or two of your spare time. And bits and pieces here and there.

Then if your project is good enough, and companies start using it, they’ll come to you. Easiest sales ever.

And if nothing happens, well, you didn’t even quit your job, so no biggie.

But in general, I would bet my entire lifetime earnings that:

  1. More money would flow into Open Source,
  2. More people would be able to make a living on their FOSS projects.

Another example: OpenSSL was sorely lacking funds at the time of Heartbleed. And they got a measly $9000 after.

Wut.

Surely if this proposal became law, OpenSSL would be flooded with cash.

Oh, that would be a good day. And I believe that story would not just happen, it would be common.

Our Nebraska rando would actually get money!

“But I just want to work on my project as a hobby!”

Ah, yes, I feel you. That’s still a good choice.

And this will actually make it easier.

Just don’t set anything up. And if anyone files a bug report or a feature request that you just don’t want to deal with, simply remind them, “This is my hobby. Go away.”

They’ll flee to the projects that do accept money, because those projects will exist, and you’ll have the peace you want: a good hobby and no burnout from entitled users!

Win-win!

Better Software

It gets better; with this proposal, software would magically get better.

I get that you’re worried about your favorite FOSS project that receives donations.

You may ask, “Would they really be forced to accept liability if they take money?”

Yep! And that’s a good thing!

FOSS projects would have to hit the brakes on features and stabilize everything; “it’s not 1.0 yet” and “move fast and break things” are not valid excuses in the eyes of the law.

And that applies to not-so-free software, too!

This shift would incentivize the “boring” work that makes software great. If accepting money means putting your neck on the line, I’m sure most code “ninjas” in FOSS and companies would slacken speed and magically find the time to make an exceptional test suite.

In other words, by requiring liability, software would get better, almost like all of the “reasons” companies and FOSS projects gave were excuses all along.

And I’m sure you’d like that, even for your favorite FOSS project.

Closing Loopholes

But besides the work that money would bring (<s>ugh, how awful!</s>), there are a few more problems.

First, companies might try to claim that they are not distributing software if they have a web app or something similar.

Personally, I think that if the software is available for public use, it should count, and those users deserve the same protections. If it’s a web app, an arcade machine, OEM software, or even an IoT toaster, it counts.

Second, companies like Google may try to claim that since their services are “free,” they shouldn’t be subject to the law.

Naw, bruh, we’ll just redefine data and ads as equivalent to money in the case of digital services and products. If a consumer gives up data, even unknowingly, or is served ads, they are a user deserving of liability protection. Simple as.

Third, how shall we fix the boundless liability problem caused by easily distributing software?

This one is more complicated and requires politicians to play along.

The other thing the law should do is define two types of liability: direct liability and redistribution liability.

And the law must only allow those two types; contracts should not override them.

Direct liability is the liability you think it is: I make software, I give you software, I take payment, I accept liability for your use of the software. I am directly liable to you and you only.

But what if my software is a library, and you want to redistribute it as part of your own program? Should I be liable to your users?

Of course not.

Unless I see as much green and gold as a Kansas cornfield. If you pay up, I would give you redistribution liability protection.

At that point, yes, I would be liable to your users for bugs in my library.

“But Gavin, how do we know if it was a bug in the library?”

Wherever the fix for the problem lands. If the fix is in my library, I’m liable. In your app? You’re liable. In both? We’re both liable.

“But perhaps the docs were just not complete?”

Then I am liable. If I didn’t document my library enough for you to use it correctly, that’s on me because complete documentation is something a professional should create.

Yes, I went there. Write complete docs for your projects, people!

Now, say I had a library, and you had a library that used mine. Say you got redistribution liability from me and distributed your library to a company that built an app using your library. Then that company distributes the app to the public.

Obviously, I am liable to that third-party company because you redistributed directly to them. But am I liable to that company’s users?

The answer should be no.

Redistribution liability must be only one level. This is key to preventing boundless liability.
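The one-level rule can also be sketched as a toy model. In this Python sketch, every name is hypothetical; each edge records that the first party granted redistribution liability to the second, and an author’s exposure stops exactly one hop past their direct buyers:

```python
# Edges point downstream: (seller, buyer) pairs where money flowed and
# redistribution liability was granted. All names are hypothetical.
redistribution = {
    ("LibAuthor", "WrapperAuthor"),   # I sold my library to you
    ("WrapperAuthor", "AppCompany"),  # you sold your wrapper to the company
    ("AppCompany", "EndUser"),        # the company ships its app to users
}

def liable_to(author: str) -> set:
    """Everyone an author is liable to: direct buyers, plus parties
    exactly one hop downstream (the one-level redistribution rule)."""
    direct = {b for a, b in redistribution if a == author}
    one_hop = {c for b in direct for a, c in redistribution if a == b}
    return direct | one_hop

# I am liable to you, and to the company you redistributed to...
assert liable_to("LibAuthor") == {"WrapperAuthor", "AppCompany"}
# ...but not to that company's end users; for that, the company must
# get redistribution liability from me directly.
assert "EndUser" not in liable_to("LibAuthor")
```

Cutting the set off after one hop is exactly what keeps the liability graph from growing without bound.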

But it also will have two good side effects.

In our example, that third-party company must figure out what their transitive dependencies are because they need redistribution liability from all of them.

So they get redistribution liability from you, and then they get redistribution liability from me.

We both get paid! Hooray! And I get paid twice as much because my library was more foundational!

This would partially solve Daniel Stenberg’s Open Source Pyramid Problem since foundational software used everywhere would naturally accumulate more funds instead of the reverse.

That’s the first good side effect.

The second is like unto it: companies would actually figure out what dependencies they have! Complete Software Bills of Materials would be the rule, not the exception. And people would have a reason to cut down excessive dependency trees.
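Here is a minimal sketch, with made-up package names, of the bookkeeping this would force: computing the transitive closure of a dependency map, which is the full list of upstreams a company would need redistribution liability from.

```python
# Hypothetical direct-dependency map; real tools would read this from
# lockfiles or an SBOM rather than a hand-written dict.
deps = {
    "my-app": ["web-framework", "json-lib"],
    "web-framework": ["http-lib", "json-lib"],
    "http-lib": ["tls-lib"],
    "json-lib": [],
    "tls-lib": [],
}

def transitive_deps(pkg: str) -> set:
    """Walk the dependency graph depth-first, collecting every package
    reachable from `pkg` (excluding `pkg` itself)."""
    seen = set()
    stack = list(deps.get(pkg, []))
    while stack:
        d = stack.pop()
        if d not in seen:
            seen.add(d)
            stack.extend(deps.get(d, []))
    return seen

# Everyone my-app would need redistribution liability from:
assert transitive_deps("my-app") == {
    "web-framework", "json-lib", "http-lib", "tls-lib"
}
```

Note that `tls-lib` appears even though nothing in `my-app` imports it directly; that is precisely the dependency most companies cannot name today.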

Should We Just Kill the CRA?

“Why can’t we just not have liability like it is now?”

Society doesn’t care. People with power don’t care.

Alas, high-class turn to grass your sorry working class mass enmasse if you make morass that cause impasse and harass the brass in first-class.

And besides, is our current system working?

No, it’s not. Our software is terrible. Disclaiming liability is not working.

In fact, my personal opinion is that Microsoft’s worst sin was not using its monopoly power to push Windows or IE or Office; it was using its monopoly power to normalize disclaiming liability even when you pay.

Absolutely ludicrous.

Now, I know what you’re thinking; you think responsibility is a dirty four-letter word.

When it’s shortened to r12y, anyway.

But if you think that, you’re a fool. And I’mma show you why.

First, even the US Federal Government is trying to do something. It’s obvious that society thinks the tech industry has jumped the shark, and it’s not a baby shark either; it’s a Meg.

Techies, and tech industrialists, we are responsible for this. It’s our fault.

We have to own that responsibility.

Because society will compel us to.

Forced Responsibility

If you retort that society won’t compel us to own responsibility because we have no legal responsibility, you are a fool!

I mean, yes, with laws as they are, we don’t, but the CRA shows that society seems stupid bent on giving us that legal responsibility.

Why is society going that direction? Well…

We rule the world.

The world doesn’t know this yet. We don’t quite know it yet.

Other people believe that they rule the world, but they write the rules down, and they hand them to us. And then we write the rules that go into the machines that execute everything that happens on this planet nowadays.

No law can be enacted without software; no law can be enforced without software. No government can act without software.

We rule the world.

Uncle Bob Martin, “The Future of Programming”

Society has learned that we have the power to rule the world, and they want that power back.

Yes, that’s right: this is a power struggle, the kind of clash with the highest stakes and the vilest tactics.

Oh, and on one side is all of society, including governmental nation-state actors with mammoth power who intend to keep it.

We, as an industry, might be “powerful,” but when society decides to use actual force on us, we won’t stand a chance.

You and I could go to jail for the code we write.

Uncle Bob Martin, Voxxed CERN 2019 Keynote

Oh, you think it won’t happen? Well, let’s expand Uncle Bob’s above quote about the Volkswagen emissions scandal:

…The CEO of Volkswagen North America…said, and I quote, “Well, it was just a couple of software developers who did it for whatever reason.”

Now, it was a couple of software developers who did it…so those guys, whoever they were, they put their fingers on the keyboard, and they typed the cheating code.

They’re in jail, and they deserve to be in jail.

You and I could go to jail for the code we write.

Uncle Bob Martin, Voxxed CERN 2019 Keynote

That’s right: it has already happened!

So yeah, you’re a fool if you think that can’t happen to you!

And I agree: those software developers deserve to be in jail.

Catastrophic Rage

But there’s one tiny detail about the Volkswagen emissions scandal that may have escaped your notice: nobody died.

What if people did die? What if bad software caused a catastrophe?

Of course, it will have to be a catastrophe, not a slow walk of death.

Yes, we are killing people everyday. Think about that and feel ashamed.

We’ve been lucky so far that events like the Boeing 737 MAX scandal, which did kill people, have not stirred up rage.

Because three years before that scandal, Uncle Bob was prophetic:

And when [a catastrophe] happens (and it will happen, it has to happen; it’s just a matter of time), when this happens, the politicians of the world will rise up, as they should, in righteous indignation, which they should have, and they will point their fingers right at us, and they will ask us the question, “How could you have let this happen?”

They won’t point at our managers because our managers will say, “Oh it was some software guys who did it for whatever reason.”

They will point at us, and they will ask us this question, and we’d better have an answer for them because if our answer is, “Well, my boss made me do that,” that is not going to fly!

Uncle Bob Martin, “The Future of Programming”

So we are running on borrowed time. Almost 5 years of borrowed time.

A catastrophe will happen, and if the fury is furious enough, society might load us all onto a 737 MAX with an incompetent pilot and let it run out of gas midair. Over a spittin’ volcano.

And if society decides we deserve that, I’ll gladly be the one to fly that plane because I think I will agree.

Yes, I include myself in this. I am not blameless.

Techies, and tech industrialists, we are not ready for the coming catastrophe and the pure, pristine rage that will follow.

Digital Chains

But even if we are all morqued, we will be spared the worst; the worst will be yet to come.

We are building digital chains right now, and they imprison everyone who ain’t in the elite club.

For if we are morqued, our loved ones will survive to live in the world we created.

During the power struggle, power centers will only gather more power. And then they’ll use that power to stamp on the faces of your loved ones. Forever. Everywhere.

Yes, that includes you tech industrialists! You think governments are going to let you act like quasi-governments? Nah, they’ll oppress you too, or morque you if you fight back.

Even if you are in a different country.

A Warning to Tech Industrialists

And tech industrialists, while I have your attention, I have another warning for you.

I know you are loath to give up your laurels, built up by cheating common people with buggy software.

But these changes are coming regardless because society can only tolerate so much. So if you want to limit the liability you have, you need to accept some liability.

How much? The same amount as physical objects at least.

Of course, since software is more complex, it will be more work to get the equivalent reliability. And since software can do so much, we should hold it to a higher standard.

If you want a seat at the table to decide, you have to accept that liability upfront.

No more pushing untested software on consumers, no more “we’ll fix it in an update.”

And if you want the least liability possible, this proposal is your best shot.

And Google, you can go to company Hades if you dare to stop the “data and ads as money” tack!

Our Best Chance

So yeah, liability will happen; it’s just a matter of when and how.

Our best chance to survive as an industry, and not meet Pele with a supersonic swan dive, is to get society on our side.

Even better if we could win by gaining the favor of non-state actors; we could shake off those digital chains.

And how to do that?

Well, if we stand up right now, with a plan that will work for society and us, they might accept it.

But we need to do it right now! Every day without a plan in place is a day where the snowball could smash us on its way down to our shining City of Open Source.

How to Be Professional

“Okay, Gavin, you got a plan?”

Boy, do I have a plan!

It’s simple; we just need to emulate the best professionals: professional engineers.

“What makes them the best?”

They actually can suffer consequences. And do.

“Gavin, I don’t hear of many professional engineers suffering consequences.”

That’s because the consequences made them get their snot together, so they actually know how to do their jobs!

“Wait, are you saying that we don’t know how to do our jobs?”

THAT’S EXACTLY WHAT I AM SAYING, YOU DUMB, DELUDED, DIMWITTED DOLT!

Ahem.

Professional Standard of Care

What I mean is that professional engineers have strict standard practices they must know and adhere to.

But if they do know and adhere to them, that is good enough.

For example, as far as I know, the engineers in the Tacoma Narrows Bridge Collapse never faced consequences. Why? Because the failure was the result of a physical phenomenon that they were not expected to worry about.

Which has since changed, so if a bridge were to collapse due to the same issue now, there would be consequences.

But there are consequences for engineering negligence, such as the legal consequences of the Hyatt Regency walkway collapse.

In fact, I highlight that collapse specifically because of the legal changes around responsibility, which those responsible tried to deflect.

Though Jack D. Gillum, the Engineer of Record for the Hyatt Regency, supposedly regretted what happened, and did lose his license, if he was not willing to directly compensate victims, I believe he didn’t regret it enough.

But the key is that they do have to adhere to those practices, and they must do so strictly.

<whine>“Building a set of best practices sounds hard.”</whine>

Bullnanny! I made a checklist that would be a good start. And I’m just one guy with a gutsy grudge.

With a professional standard of care, all you need to do is follow it responsibly and keep it up-to-date because every professional should constantly update best practices as knowledge grows.

Do that, and society will be satisfied, even if an accident happens.

And if they’re not, well, they just wanted an excuse to bring back volcano sacrifices anyway.

So taking on responsibility is not as bad as it sounds; just make sure you have an up-to-date checklist that you follow more religiously than politicians lie.

Random plug: please read The Checklist Manifesto.

Professional Code of Ethics

But having just an up-to-date standard of care is not enough.

We need to have a standard of ethics too. With teeth.

And yes, I made one already as well.

“Bored!”

No shim, Sherlock! Ethics are boring. But necessary.

Remember the Volkswagen story? The CEO was all too happy to throw those programmers under the bus. Our managers will as well.

Yes, I’ve had a manager throw me under the bus before.

If we professionalize, we have to have a standard of ethics to prevent that.

Here’s how it works: CEO or PHB tells you, “Do this shady thing.”

You say “No, it’s against the Software Engineer Code of Ethics.”

PHB says, “Do it or you’re fired.”

You say, “You need a professional software engineer on this project; good luck keeping one if you fire them for following the Code of Ethics. They’ll want to keep their licenses.”

That’s why it has to have teeth; if engineers could lose their license for violating it, that’s a powerful incentive. You could always find another job; you can’t easily get a license back.

And if the engineers do play along, they’ll deserve the blame because they had an out and didn’t take it.

Professionalizing protects us more than it injures us; we can handle liability, but that programmer pulverizing bus is a problem.

Also, there will be a nice side effect: we will be in control of project management, not product owners or managers.

In engineering projects, the “manager” or “director” is an engineer, the Engineer of Record. We could figure out how to run our projects best for us. No stupid “sprints,” no useless meetings, no performative standups.

Unless you love them.

People Certification

Of course, if there are professional Engineers of Record, we need a way to certify them.

Well, obviously, we need to train them and test them. Duh.

Okay, that’s dodging discussion. How should it be done?

Let me tell you a story.

I raised these ideas publicly at a conference once, and a woman castigated me for gatekeeping, saying that just because my parents paid for my college education doesn’t mean that others can.

I paid for my own college education, by the way.

I was green, so I couldn’t think fast enough to respond, but now I know.

Since university doesn’t prepare you for a programming job anyway, we can forego a college requirement.

Instead, we should require an apprenticeship.

It makes sense; programming is a craft, and it is best learned on the job.

Such jobs should have no gatekeeping requirements, but they should be intensive, focused on output and mentorship, and quite long.

Also, not every programmer needs to be certified; only the best should be, and only the best should be apprentices. An Engineer of Record should be able to direct hundreds of uncertified programmers, and not one of them should need certification if the Engineer of Record does his job right.

So yes, we need training, but we should avoid putting any obstacle in the way that requires anything but humility and ability to learn.

Any such obstacle will harm us and society, and society won’t be happy.

Software Certification

I have been skirting one subject: when exactly a software project needs a certified Engineer of Record.

The CRA already has an answer: anything classified as Class I or Class II products.

A fairly good list of both is here.

Those are good lists, but I don’t think the details matter.

On this point, we could let the politicians debate and decide, with some advice. Whatever society decides is important enough to need certification is important enough, simply by being part of the list that a majority cares about.

Though I would personally add any software that takes part in handling the software supply chain, such as compilers, interpreters, package managers, build systems, version control systems, etc.

Oh, and compiler authors, I think 00UB should be illegal, and I will work to make it so. You have been warned.

But if you’re unnerved that the list will be so large that the industry will come to a standstill for lack of certified Engineers, the best way to convince society that something isn’t important enough is to treat all software as critical and develop it right, even if you’re not certified.

If every programmer treated their software with the care society deserves, then society might minister mercy on some things.

Bootstrapping the Plan

Alrighty then, we have a plan for a system, but how do we start the engine? After all, certified Software Engineers of Record don’t exist (except in a few countries), so how do we get some?

First, let’s create a professional organization. Then let’s have it nail the Standard of Care and the Code of Ethics.

And then, that organization could solicit nominations from everywhere about who could be considered the “root” professional engineers, programmers so good, so ethical, and so careful, that they deserve the title without any training.

Once the nominations are in, members could vote yes or no on each nomination. Every person with a supermajority is given the title by fiat.

And then, companies should offer to pay one or more to take on apprentices while preparing existing software for certification.

Because yep! There be stuff that need cert pronto!

Once the root Engineers are chosen, they should be given a period of time to certify existing software. Maybe one or two years.

After that point, both the Engineers and the software are bootstrapped, and society quietly transitions to a superior status quo.

Conclusion

If you don’t believe by now that accepting liability is the best thing that can happen to the tech industry, I’m a crude communicator.

But if I have convinced you, your journey to professionalism starts now.

Keep a checklist. Refer to it often. Hit a minimum standard. Evangelize professionalism. Participate politically. Refuse to break your personal code of ethics. And stand by your work or admit your mistakes.

Our industry, our world, our shining City of Open Source, and the freedom of our loved ones depend on all of us, including you now that you know.

So change yourself first. If enough of us do that, the rest will follow.

Oh, and do send this to the EU politicians. They need a plan to replace the CRA, so let’s give it to them.


Thank you to Loup Vaillant for reading a draft of this post; his “very short comment” was worth an ounce of gold for every word.