In every field of engineering, there is a grace period during which the engineers doing the heroic work of getting a complex, highly valuable new technology to work can escape liability for poor performance, failures, or damages caused by what they build. That grace erodes as the technology becomes commonplace. Eventually, usually through a combination of litigation, legislation, regulation, and evolving insurance requirements, liability and responsibility for failure start being pinned to the engineers who designed and built the failed system.
Software is still in its infancy compared to other engineering disciplines. However, given the destruction wrought by the ever-increasing stream of cybersecurity catastrophes, and with computing now embedded into nearly every aspect of our digital and physical lives, software engineering’s baby days are coming to an end; a reckoning is inevitable.
Just like every other engineering discipline before it, software development will be “professionalized” by force. Design standards, and the means to test compliance with them, will become increasingly common. As in other industries, the purpose of those standards will be twofold: to ensure good engineering practice, and to pinpoint blame when practice falls short. Software developers are going to lose the ability to claim they didn’t know about, or are not responsible for, the harm their designs failed to prevent. Moreover, security failure will be the driving force of professionalization because it’s the failure mode that makes the news. You don’t see daily headlines about some feature or function being less elegant than it could be.
Current Software Development Starts in the Wrong Place
In general, software developers don’t think about data security unless they must, and when they do, it’s not uncommon for them to push back. Data security is generally perceived as hard, although that’s no longer the case. Without an easy way to ensure it’s done correctly, there’s a natural and very understandable tendency on the part of developers to avoid data security. As a result, it’s often absent, reduced, or removed altogether from app specifications. However, whether they realize it or not, developers are the de facto deciders of how data will be secured: either they do it, or they hope somebody else does.
Part of the problem is that agile development, which we are proponents of, usually starts at the wrong place. It’s usually focused solely on features, functions, and later, bug fixes. Basic questions that should be asked and answered at the beginning of a build aren’t addressed. Questions like, “Is the data confidential? Do I need to ensure it’s protected in motion and at rest? Who can or cannot access the data? Who can share data, and who can they share it with? How long can it be shared? Where can it be stored?” should be resolved at the beginning, and compliance with the answers maintained as the software evolves.
Starting software development at the feature level without also nailing down the data security requirements is perhaps the largest contributor to data security failures.
Encryption Doesn’t Have to Be Hard
Encryption for data security used to be notoriously difficult: it required advanced knowledge of cryptography, and implementation was finicky to say the least. That’s no longer the case. Absio has abstracted all that complexity into a few simple methods in multi-language SDKs that work in any architecture, with a portable server app for cross-platform sync and backup.
So, there are no more excuses…