The Church of Trust
2014/04/10 13:27:47 CEST

There is a tendency among some people to believe that their systems are absolutely secure and dependable. This is not solely a sales motto aimed at customers; they actually do believe it.

Let's go through some of their arguments.

Applying cryptographic signatures to software (e.g. signed packages) can ensure that there was no tampering between the emitter (e.g. a repository) and the receiver (e.g. one's desktop), and that the origin of the software is known. Every other property that one may infer from the digital signature is in fact due to the emitter binding additional commitments to it.
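To make this concrete, here is a minimal sketch in Python, using the third-party cryptography library; the key pair and package contents are purely hypothetical. It illustrates exactly what a successful verification proves: that the bytes were not altered and that they were signed by the holder of a given private key, and nothing more.

    # A sketch of signature verification, assuming the 'cryptography'
    # library; keys and package contents are hypothetical.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # The emitter signs the package bytes with its private key.
    package = b"contents of some-package-1.0.tar.gz"
    emitter_key = Ed25519PrivateKey.generate()
    signature = emitter_key.sign(package)

    # The receiver verifies with the emitter's public key.
    public_key = emitter_key.public_key()
    try:
        public_key.verify(signature, package)
        # Proven: integrity in transit, and possession of the private key.
        # Not proven: anything about how the code was written or audited.
        print("signature valid: integrity and origin established")
    except InvalidSignature:
        print("signature invalid: tampering or wrong key")

Note that the receiver must already hold an authentic copy of the emitter's public key; how that key is distributed, and what the emitter does before signing, are precisely the commitments that lie outside the mathematics.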

For example, they may claim to follow a given release engineering process, coding practice or audit process. This goes beyond merely making sure that the systems used for signing the software are not compromised. One may or may not trust the emitter's claims, and may want to have those claims checked.

People blindly relying on cryptographic signatures often forget that they put their trust in other people and organisations. The questions they should be asking themselves are: who can I entrust with the security of my system? How can I make sure I can trust them? What are they actually providing? What can really be expected? Where does my responsibility lie?

If you ask these people to describe what happens behind the scenes, they often do not know. But you should, of course, do as they do.

The same is often said of open-source software, but the fact that many eyes can see the code is no proof that many eyes did see the code, let alone that anyone actively audited it.

In the closed-source world, quality standards and processes apply, contracts are signed, and contractors have an interest in checking that the other side is meeting its commitments; chances are that many vendors stick to those standards and are good at what they do. How many? Because of the seal of secrecy, outsiders often do not know. Open source can shine in this area, but for that to be true, the will to audit, but also to admit mistakes, must be present. Often, one or both are missing.

In certain ideological circles, this often comes with a comment along the lines of: "people producing closed-source software are after profitability and quality is not a priority for them, while open source is made by volunteers, therefore they care more about quality".

There is of course no tangible link between the nature of the code and its quality; it is a matter of people, organisation, priorities and design choices. It can happen that features that look interesting on the surface have security flaws or are designed in such a way that they are not reliable. Choices have to be made. More often than not, people claiming they can have everything from many worlds - i.e. with different goals - have not carefully considered the question. It seems out of touch, for example, to brag about security while at the same time using blobs to operate some 3D accelerator or RAID hardware, and refusing to admit that a third party is trusted not to introduce a backdoor into the system.

Chances are there is a fundamental difference in philosophy and behaviour between someone who says "my systems are as secure and dependable as they can be" and someone who says "my systems are absolutely secure and dependable".

I think we should call the people in the second group the Church of Trust, in a twisted reference to the chain of trust. Their attitude can be quite harmful.

Tags security