Security experts have long said that internet-connected systems and software need security controls and features built in by design, in the same manner they’re built into physical infrastructure. The National Institute of Standards and Technology agrees and has issued guidance to help software engineers build secure products.
Titled “Systems Security Engineering: Considerations for a Multidisciplinary Approach in the Engineering of Trustworthy Secure Systems,” the guideline emphasizes incorporating “well-defined engineering-based security design principles at every level, from the physical to the virtual,” NIST Fellow Ron Ross wrote on the Taking Measure blog. A holistic approach does more than make systems penetration-resistant: it also yields systems that, even after a compromise, can contain the damage and remain resilient enough to keep supporting critical missions and business functions.
NIST’s guidance uses the international standard ISO/IEC/IEEE 15288 for systems and software engineering as a framework, and it maps out “every security activity that would help the engineers make a more trustworthy system” for each of the 30-plus processes defined by the standard. The activities cover the entire system lifecycle, from the initial business or mission analysis to requirements definition to the design and architecture phases, and they’re applicable for new, upgraded, or repurposed systems.
“We have a high degree of confidence our bridges and airplanes are safe and structurally sound. We trust those technologies because we know that they were designed and built by applying the basic laws of physics, principles of mathematics, and concepts of engineering,” Ross wrote. Similarly, applying fundamental principles in mathematics, computer science, and systems/software engineering can give us the same level of confidence in our software and hardware.
Taking a holistic approach
A holistic approach requires coordinating across different specialties, such as information, software and hardware assurance, physical security, antitamper protection, communications security, and cryptography. It also demands addressing multiple focus areas, such as privacy, verification, penetration resistance, architecture, performance, validation, and vulnerability.
The guidance addresses these dependencies and subspecialties by grouping the processes in the system lifecycle into four families:
- Agreement Processes: Tasks related to acquiring products and services, as well as providing services as a supplier.
- Organizational Project-Enabling Processes: Lifecycle model management, infrastructure management, portfolio management, human resource management, quality management, and knowledge management.
- Technical Management Processes: Project planning, project assessment and control, decision management, risk management, configuration management, information management, and quality assurance.
- Technical Processes: All the activities related to business or mission analysis, defining stakeholder needs and requirements, defining system requirements, defining the architecture, defining the design, system analysis, implementation, integration, verification, transition, validation, operations, maintenance, and disposal.
The processes outlined in the publication do not prescribe a mandatory set of activities and do not explicitly map to specific stages in the lifecycle, NIST warned. Engineers should rely on their experience and their understanding of the organization’s objectives to tailor the processes to meet stakeholders’ requirements for a trustworthy system.
The publication also does not attempt to formally define systems security engineering; instead, it offers something for everyone involved in the process, from business stakeholders to developers, administrators, and security analysts.
Calling on engineers
When civil engineers build a bridge, they have to consider the weight of vehicles and people crossing the bridge, the stress caused by wind and other natural elements, and the materials used to build the bridge itself. Buildings have to meet specific structural and fire codes to make sure they are safe and will not collapse. Similarly, software engineers need to build systems with security controls already included in the design and not added afterward as a separate component.
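As a small illustration of that principle (a hypothetical sketch, not an example from the NIST publication): a database query that treats user input strictly as data by design neutralizes injection attempts that a bolted-on input filter might miss.

```python
import sqlite3

# Set up an in-memory database for the demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Security bolted on afterward: build the query by string
# concatenation and hope some later filter catches malicious input.
# Here the payload rewrites the WHERE clause and matches every row.
unsafe = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Security built into the design: a parameterized query treats the
# input as data, never as SQL, so the payload matches nothing.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # [('alice',)]  -- injection succeeded
print(safe)    # []            -- injection neutralized
```

The safe variant needs no separate sanitization layer; the control is inherent in how the query is constructed, which is the design-first posture the guidance advocates.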
If bridges were routinely collapsing, scientists and engineers would be immediately on the scene to figure out what went wrong and identify how to fix it for future projects. Currently, instead of asking engineers and scientists to perform root-cause failure analysis to find and fix the problem, cybersecurity focuses on add-ons. Changing how technology is designed and built—by strengthening underlying systems and system components, and developing with well-defined security requirements—would help reduce the number of known, unknown, and adversary-created vulnerabilities, Ross said.
NIST’s approach echoes what Dan Kaminsky, chief scientist and co-founder of White Ops, said in his keynote speech at the Black Hat security conference earlier this year. Kaminsky called for an “NIH [National Institutes of Health] for Cyber” to study the security challenges and come up with engineering solutions addressing them. While Kaminsky invoked a different federal agency, his message was the same as NIST’s: cybersecurity needs to be treated as an engineering discipline, with tools and principles that can be used to build secure systems.
“We didn’t stop our cities from burning by making fire illegal or heal the ill by making sickness a crime. We actually studied the problems and learned to deliver safety,” Kaminsky said in his speech. “If we want to make security better, give people environments that are easy to work with and still secure.”
Addressing the IoT problem
While NIST focused the language on systems and software, the guidance provides welcome direction for internet-of-things devices, most of which hit the market with few or no security controls.
NIST’s authority extends only to government agencies and contractors, so the guidance is not binding on engineers working in the private sector. Even so, these recommendations can raise market expectations for which security features a product must include to be acceptable.
This NIST publication is the culmination of nearly four years of work, Ross said. The final draft was originally expected in December, but the release date was moved up after a crippling distributed denial-of-service attack against Dyn temporarily cut off access to large parts of the internet. The attack also revived discussions on whether the government should try to regulate the security of IoT, especially since there are currently no consequences for manufacturers selling subpar devices to consumers.
Regulation would be difficult, as many of the embedded devices aren’t manufactured in the United States. “While I’m not taking a certain level of regulation off the board, the United States can’t regulate the world,” Rep. Greg Walden (R-Ore.), chairman of the Subcommittee on Communications and Technology, said during a recent congressional hearing on IoT security.
Building trustworthy systems
The rapid pace of technological innovation, the dramatic growth in consumer demand for new technology, and the boom in IoT have made it difficult to understand, let alone protect, the global information technology infrastructure. There are too many areas to cover—software, firmware, hardware components—and cyberhygiene efforts, such as patching, asset management, and vulnerability scanning, are not enough.
“Our fundamental cybersecurity problem can be summed up in three words—too much complexity,” Ross wrote. “Creating more trustworthy, secure systems requires a holistic view of the problems, the application of concepts, principles, and best practices of science and engineering to solve those problems, and the leadership and will to do the right thing—even when such actions may not be popular.”