Proactive Resilience: The Future of Cybersecurity
By William Jackson
June 16, 2017
Today’s state of the art in cybersecurity is operational resilience – an organization’s ability to continue its mission despite disruptions to its IT enterprise. Summer Fowler, technical director of Carnegie Mellon University’s CERT Division, proposes moving beyond this to proactive resilience – what she calls “prosilience.”
It is not enough to remain operational during an attack, Fowler argues. She believes the next step is to anticipate attacks and prepare for those strikes before they hit.
“Prosilience is resilience with consciousness of environment, self-awareness and the capacity to evolve,” Fowler wrote on the Insider Threat Blog, a product of Carnegie Mellon’s Software Engineering Institute. “It is not about being able to operate through disruption; it is about anticipating disruption and adapting before it even occurs.”
Disruptions, whether malicious or merely unexpected, can flare up in an instant to take down servers. Take the recent incident involving the Federal Communications Commission (FCC): After comedian John Oliver told his viewers to register their disapproval after the agency moved to roll back so-called net neutrality rules, FCC’s servers were overwhelmed and its website crashed. The agency called it a denial of service attack.
A prosilient architecture might have anticipated that threat and reconfigured itself to remain operational during the surge in traffic.
Prosilience aims to leverage emerging capabilities such as artificial intelligence, machine learning and self-healing to help networks adapt in near real time. “This is something I don’t think we will be ready for, for several years yet,” Fowler told GovTechWorks in an interview. “It’s very recent.”
Full prosilience might be as much as a decade away, but some commercial cybersecurity offerings are moving in that direction. Area 1 Security is a young cybersecurity company founded by former National Security Agency (NSA) employees that is developing technology to scan “everything interesting about the Internet.”
“That’s ambitious,” said Phil Syme, chief technology officer at Area 1 Security in Redwood City, Calif., and a former member of NSA’s engineering organization. But even incomplete information about criminal sites and malicious activity could “really put a dent in the problem” of security breaches by alerting customers to what is coming their way.
Moving Beyond Resilience
Resilience is an extension of risk management, which requires that organizations accept and prepare for risks that cannot be eliminated. A properly prepared organization should be able to continue operations with minimal disruption in the face of a security incident. Carnegie Mellon CERT’s Resilience Management Model lays out best practices for effectively managing security, business continuity and information technology operations.
Prosilience takes that concept one step further, driving organizations to become “smarter about resilience activity” and anticipate, rather than simply respond to events, Fowler said. By leveraging the distributed sensing capability of the Internet of Things, practitioners would be able to accurately spot trends and predict threats. Machine learning technologies would enable networks to respond within milliseconds, reconfiguring themselves if needed to repulse attacks and isolate threats.
At Carnegie Mellon, government, industry and academic experts are teaming up to develop the prosilience concept, beginning by establishing metrics to measure how well security budgets are used in order to develop standardized measures for return on investment. Is the budget performing according to plan? Is the plan correct for the organization? What will it take to achieve the agility needed for prosilience? “All of these roll into budget,” Fowler said.
Building out such models will take time, she added, explaining that simply establishing metrics could take up to five years.
Once an efficiency baseline can be established, developers can design and test a prosilient architecture to leverage those baseline capabilities. Fowler said a workable architecture is probably five to 10 years away.
For government agencies, achieving prosilience poses particular challenges. Many legacy systems still in use today lack the adaptability such an environment demands. Modernization is a necessary first step to making prosilience a reality.
While Carnegie Mellon develops that formal prosilience architecture, operators in the trenches are working on their own proactive resilience efforts. A critical element is old-fashioned human learning, according to Dan VanBelleghem, cybersecurity program director with General Dynamics Information Technology.
“You can’t start practicing breach response after the fact,” VanBelleghem said. “You need to exercise your cyber teams on a monthly or quarterly basis to prepare, presenting them with relevant threat-based scenarios and developing playbooks so they learn how to respond when different threats arise.
“You can’t figure this out when you’re in a crisis – that’s the worst time to try to learn what to do,” he said. “It’s the same approach the Defense Department takes with its cyber teams.”
Still, the sheer volume of threat data and the speed with which attacks can mount mean humans alone cannot keep up. Machine learning, therefore, is critical to identifying and predicting threats. Area 1’s wide-scale Internet crawling identifies many sites engaged in such malicious activity as credential harvesting or hosting exploit kits. The company works with small hosting services that may not have their own Security Operations Centers (SOCs) to locate and prevent compromises.
“Traditionally you do a take-down” of compromised servers, Syme said. But when that happens, the bad guys just move to a different server. Area 1 takes a different approach: It first monitors activity to understand how the compromise works, then blocks it in such a way as to avoid tipping off the perpetrators.
Machine learning enhances that capability. Area 1 integrates with its customers’ edge equipment to automate responses, creating a powerful force multiplier, Syme said. Provided the base information is strong, it can be highly effective.
“Automation is not free,” he added. “It’s expensive and quite difficult.”
Automated tools must be customized for each enterprise; the programming is only as good as the quality and accuracy of the information it builds upon.
While developing a definitive approach to prosilience may be a long and slow process, Fowler said, that doesn’t mean government organizations or private institutions should sit back and wait.
“We always want to start where we are,” Fowler said, even if that is not where we want to be. “We can’t sit on our heels.”