I received this question a couple of weeks ago, and I believe it’s valuable enough to share my thoughts on the subject here as well.
Having been a university lecturer myself, I truly believe there is much more we could be doing. That doesn’t mean we need to push a lot of new knowledge on students; it’s enough if we share the principles with them.
Educators need to make it their goal to try to teach software engineers anything about security because 1% is better than none. Instructors should strive to make lessons relevant and engaging while making it as simple as possible. You want to show software developers that application security isn’t really as hard as it’s portrayed. If we lower the entrance bar and we help them understand where they can find high quality knowledge, then they’ll be more eager to learn about the subject.
The common problem we see among software engineers coming from a variety of backgrounds isn’t the complexity of security, but that they simply haven’t been made aware of the need for application security.
We certainly don’t want to overwhelm software developers with loads of new knowledge because they already have a lot to learn in their own specialisations. A good start would be to outline the basic security principles. These are fundamental principles that would – hopefully – change their naive mindset and prepare their software to face the real, dangerous world.
Here are some of those key security principles: (paraphrased for brevity)
- Minimize attack surface area
Every feature that is added to an application adds a certain amount of risk to the overall application. The aim of secure development is to reduce the overall risk by reducing the attack surface area, so think twice before you write that next feature or piece of functionality and expand your code. More code = more places where mistakes could have been made.
- Establish secure defaults
There are many ways to deliver an “out of the box” experience for users. However, by default, the experience should be secure, and it should be up to the user to reduce their security – if they wish to and if it’s allowed per design.
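As a small sketch of what a secure-defaults design can look like in code (the `AccountSettings` class and its fields are hypothetical, invented for this example): every security-relevant option starts in its safest state, and relaxing one requires an explicit, design-approved opt-out.

```python
from dataclasses import dataclass

# Hypothetical settings object: every security feature is ON by default;
# the user must explicitly opt out, and only if the design permits it.
@dataclass
class AccountSettings:
    require_mfa: bool = True            # secure by default
    session_timeout_minutes: int = 15   # short sessions by default

    def disable_mfa(self, allowed_by_design: bool) -> None:
        # Reducing security is the user's choice, gated by the design.
        if not allowed_by_design:
            raise PermissionError("disabling MFA is not permitted by design")
        self.require_mfa = False

settings = AccountSettings()  # the "out of the box" experience is secure
```

The key property is that doing nothing leaves the user in the safest configuration; insecurity takes deliberate effort.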
- Principle of least privilege
The principle of least privilege recommends that accounts have the least amount of privilege(s) required to perform their business processes. This encompasses user rights, resource permissions such as CPU limits, memory, network, and file system permissions.
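A minimal capability-style illustration in Python (the `Database` and `ReadOnlyView` classes are made up for this sketch): instead of handing reporting code the full database handle, give it a wrapper that exposes only the read operation it needs.

```python
# Sketch of least privilege: report code receives only the narrow
# capability it needs, so it cannot write even by accident.
class Database:
    def __init__(self):
        self._rows = {"alice": 100}

    def read(self, key):
        return self._rows.get(key)

    def write(self, key, value):
        self._rows[key] = value


class ReadOnlyView:
    """Exposes only the read operation; writing is simply not available."""
    def __init__(self, db):
        self._db = db

    def read(self, key):
        return self._db.read(key)


db = Database()
report_view = ReadOnlyView(db)  # reporting code gets the narrow capability
```

The same idea scales up to OS users, database accounts and container resource limits: grant the minimum, not the convenient maximum.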
- Principle of Defense in-depth
The principle of defense in-depth suggests that where one control would be reasonable, more controls that approach risks in different fashions are better. Controls, when used in-depth, can make severe vulnerabilities extraordinarily difficult to exploit and thus unlikely to occur.
With secure coding, this may take the form of tier-based validation, centralized auditing controls, and requiring users to be logged in on all pages.
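A tiny sketch of tier-based validation (the username field and its format rule are hypothetical): both the web tier and the data tier check the input independently, so a bug in one layer doesn’t expose the other.

```python
import re

# Hypothetical two-tier validation: the web tier checks the username
# format, and the data tier independently re-checks it before building
# a query, rather than trusting the layer above.
USERNAME_RE = re.compile(r"^[a-z0-9_]{3,20}$")

def web_tier_validate(username: str) -> str:
    if not USERNAME_RE.match(username):
        raise ValueError("invalid username")
    return username

def data_tier_lookup(username: str):
    # Defense in depth: never assume an upstream layer already validated.
    if not USERNAME_RE.match(username):
        raise ValueError("invalid username reached the data tier")
    # A parameterized query is a third, independent control here.
    return ("SELECT * FROM users WHERE name = ?", (username,))
```

Each control alone could fail; stacked, an attacker has to defeat all of them at once.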
- Fail securely
Applications regularly fail to process transactions for many reasons. How they fail can determine if an application is secure or not. So when an application fails or throws an exception, it should default to the lowest privileges and accesses possible.
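Sketched in Python (the `lookup_role` callable stands in for a real role lookup, e.g. a database query), failing securely means an authorization check answers “no” whenever the check itself breaks:

```python
# Sketch: an authorization check that defaults to "deny" whenever the
# check itself fails, rather than accidentally granting access.
def is_admin(user_id: str, lookup_role) -> bool:
    try:
        return lookup_role(user_id) == "admin"
    except Exception:
        # The lookup failed (DB down, timeout, bad data): fall back to
        # the lowest privilege instead of guessing.
        return False
```

The insecure variant is the mirror image: an exception handler that returns `True`, or skips the check entirely, silently turns every outage into an open door.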
- Don’t trust services
Many organizations utilize the processing capabilities of third-party partners, who will more than likely have security policies and postures different from yours. It is unlikely that you can influence or control any external third party, whether they are home users, major suppliers or partners.
Therefore, implicit trust of externally run systems is not warranted. All external systems should be treated in a similar fashion.
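A small sketch of that distrust in practice: the response shape and the `points` field below are hypothetical, but the habit is the point – validate the type and range of anything an external service returns before using it.

```python
import json

# Sketch: treat a partner API's response body as untrusted input and
# validate it the same way you would validate user input.
def parse_loyalty_points(raw_body: str) -> int:
    data = json.loads(raw_body)
    points = data.get("points")
    # bool is a subclass of int in Python, so reject it explicitly.
    if not isinstance(points, int) or isinstance(points, bool):
        raise ValueError("untrusted service returned a non-integer")
    if not 0 <= points <= 1_000_000:
        raise ValueError("untrusted service returned an implausible value")
    return points
```

Treating the partner like any other untrusted boundary means a compromised or buggy partner degrades into a handled error, not a corrupted state.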
- Separation of duties
The key to fraud control is the separation of duties. For example, someone who requests a computer cannot also sign for it, nor should they directly receive the computer. This prevents the user from requesting many computers and claiming they never arrived.
Certain roles have different levels of trust than normal users. In particular, administrators are different from normal users. In general, administrators should not be users of the application.
- Avoid relying exclusively on security by obscurity
Security through obscurity is a weak security control and nearly always fails when it is the only control. This is not to say that keeping secrets is a bad idea, it simply means that the security of key systems should not be reliant upon keeping details hidden. The security should rely upon many other factors, including reasonable password policies, defense in depth, business transaction limits, solid network architecture, and fraud and audit controls.
- Keep security simple
Attack surface area and simplicity go hand in hand. Certain software engineering fads prefer overly complex approaches to what would otherwise be relatively straightforward and simple code.
Developers should avoid double negatives and complex architectures when a simpler approach would be faster and easier to understand.
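A contrived Python illustration of why double negatives hurt readability (both functions are hypothetical and behave identically; only one can be understood at a glance):

```python
class User:
    def __init__(self, blocked: bool):
        self.blocked = blocked

def may_post_confusing(user: User) -> bool:
    # Double negative: the reader must mentally invert twice.
    return not user.blocked != False  # noqa: E712

def may_post_simple(user: User) -> bool:
    # Same behavior, stated directly.
    return not user.blocked
```

Code that reviewers can parse instantly is code whose security properties they can actually verify.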
- Fix security issues correctly
Once a security issue has been identified, it is important to develop a test for it and to understand the root cause of the issue. When design patterns are used, it is likely that the security issue is widespread amongst all code bases, so developing the right fix without introducing regressions is essential.
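One way to pin a fix down, sketched with a hypothetical `render_comment` helper: write a regression test that encodes the root cause, so the same class of bug can’t silently return after a refactor.

```python
import html

# Sketch: after fixing an XSS bug in a (hypothetical) comment renderer,
# a regression test captures the root cause of the issue.
def render_comment(text: str) -> str:
    # The fix: escape user-supplied text before embedding it in HTML.
    return "<p>" + html.escape(text) + "</p>"

def test_comment_is_escaped():
    out = render_comment("<script>alert(1)</script>")
    assert "<script>" not in out
    assert "&lt;script&gt;" in out
```

If the vulnerable pattern was copied into other code paths via a shared design pattern, the same test can be parameterized across each of them.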
It would make a huge difference if we educated software engineers and made them aware of those risks. It’s important because even without going deep into details and technical specifics, we can shift their mindset one bit at a time.
We teach software engineers — and for a good reason — how to be good citizens and how to build, which after many years compounds and creates a builder mindset. Security assurance, on the other hand, is mostly about breaking and finding holes in a “perfect” creation. Builders generally don’t think about the bad stuff, because they’re always striving for something better, prettier and more functional. Sharing the basics with them and showing what can possibly go wrong doesn’t cost much, yet is a great starter.
And then if we want to go deeper, which we should — we should teach them about the security issues classified in the OWASP TOP 10, and train them to perform basic security testing. This is to embrace the breaker mindset and to widen their skillset and horizons.
Although OWASP produces educational content mostly for web and mobile applications, it can also enrich the mindset of desktop app developers and other people working in software development. However, we can’t just throw those resources at everyone. To be effective, we must provide relevant training, so students don’t feel like they’re pushed to learn something they won’t ever get to apply in their careers.
Once we’re done with face-to-face training, we should provide them with resources they can use on their own. Pointing them to books, websites and courses they can use to become more security-savvy is really helpful and decreases the discomfort that comes with entering a new field of study.
As for Web developers, we should point them to content such as:
- OWASP TOP 10 – A list of most critical web application security risks
- OWASP Application Security Verification Standard – A quick checklist that can be used for ad hoc purposes to verify compliance with security standards
- OWASP Testing Guide – Extensive guide into web application security testing
- OWASP Code Review Guide – Detailed book on how to perform whitebox code reviews to identify security bugs
Depending on how much time and resources we can and are allowed to spend on it, we should go beyond that and provide students with access to a variety of other resources. Think of books explaining the security concepts behind the components they’ll use on a daily basis, whether as users or while integrating them into their software.
Let “The Tangled Web” be an example of an excellent choice for better understanding web browsers and frontend security.
Whatever you decide to choose, make sure your actions inspire them to continue their education. In my experience, the best way to learn is by applying theory to real-life problems and putting freshly acquired knowledge into practice. Human beings learn by making and breaking, and every single mistake can be converted into a positive learning experience, thus making them better developers.
If we don’t focus on addressing the issue at its core (formal education), we’ll have a hard time keeping the current pace of innovation, because we’re constantly being derailed by problems we try to fix with an ineffective, myopic mindset. So keep exploring, keep testing and keep moving this field forward.
Good luck, ’cause you’ll need it. We all do!