Degrading Security Diminishes Privacy

Privacy has been getting a lot of attention lately, and with good reason: breaches of personal information keep multiplying, smart devices are in more hands than ever, and more personal and associated data is being collected than at any time before. The 2014 Sony hack appears to have been the tipping point that motivated US President Barack Obama to propose the Personal Data Notification & Protection Act and the Student Digital Privacy Act on 12 January this year. This new interest in better protecting personal information was encouraging. It would improve the personal privacy of US residents, and it would also signal to the rest of the world that the US is moving beyond a patchwork of privacy laws, and its reputation as a country with “inadequate” privacy protections, toward protecting personal information across all industries rather than the chosen few covered today.

However, on 16 January, the White House released a statement expressing its support for an announced UK goal to outlaw encrypted messages and other communications unless the government is given a backdoor to decrypt them. These two messages from 12 and 16 January are in direct conflict. You cannot achieve privacy without strong information security, and you cannot have strong information security when tools have backdoors built into them. This is an Information Security 101 lesson that has been taught for decades, but it seems to have been lost (or never learned) by leaders making decisions that affect everyone’s privacy and information security. It is also something that all information security professionals need to make sure their own organizations’ leaders understand.

Here are five important and compelling facts that government and other types of organizational leaders need to know:

  1. Backdoors can often be exploited accidentally, resulting in great harm. Backdoors in technologies are nothing new. Hearken back to 1988, when the Morris worm became the first widespread Internet attack. It spread quickly and destructively by exploiting backdoors purposefully built into technologies, weaknesses that were then a set of secrets known only to a small, technically proficient group of Internet users. A single error resulted in a large-scale attack that disabled many systems and brought the nascent Internet to a virtual standstill. If leaders still want to build backdoors into technology systems, those lessons clearly were not learned. As a systems engineer at a large multinational organization at the beginning of my career, I saw fellow programmers, at the urging of their managers, build in backdoors and hardcoded passwords that “only they knew” (and, of course, anyone else examining the code). Many of those shortcuts caused significant outages and program errors once the systems went into production. (A minimal sketch of this anti-pattern appears after this list.)
  2. Backdoors will not remain a secret. Backdoors will be discovered and used by the very adversaries and crooks they were established to catch in the first place. This is not a new truth. History demonstrates that so-called “secret” technology backdoors are dangerous, put the associated systems at risk, and lead to breaches and security incidents. And the more people who know about a backdoor, the more dangerous it becomes. Consider another real-life example: in 2013, security vendor Barracuda was found to have an undocumented backdoor in its security appliances that allowed high levels of access from Internet address ranges assigned to Barracuda. Once it was publicized, as secrets inevitably are when humans are entrusted with them, the backdoor became a serious liability, and Barracuda’s customers made clear they did not want it. There is no such thing as a “secret” backdoor if even one human knows about it.
  3. Backdoors created to fight crime will be used to commit crime. Proficient adversaries who look for vulnerabilities in security technologies know how to exploit the weaknesses they find, and it has happened many times before. One example of attackers using backdoors placed into systems occurred at Greece’s largest commercial cellular network operator. Switches installed in the network came with built-in wiretapping features created specifically for authorized law enforcement agencies. A still-unidentified attacker installed software that used these embedded wiretapping features to secretly, and of course illegally, intercept the calls of many cell phones, including those belonging to the Prime Minister of Greece, roughly a hundred high-ranking Greek dignitaries, and a US Embassy employee in Greece. Crooks are also willing to pay large sums for the details of such built-in backdoors and other unpublished vulnerabilities: an undisclosed vulnerability in widely used commercial software reportedly sells for US $160,000, on average, on the black market.
  4. Backdoors and other types of weakened security create opportunities for malicious insiders and the authorized but unaware. Humans are the weakest link in information security, and trusted insiders present the greatest threat to systems and information. Edward Snowden has become the poster child for the high risks and consequences of trusted insiders who break their promises to keep the secrets entrusted to them. Even though he began releasing his pilfered data in June 2013, he continues to trickle out large chunks of it, such as the recently published F-35 documents describing data that Chinese hackers had reportedly already obtained.
  5. Backdoors in technology hurt business success and thwart technology advances. If weakened security in commercial products and services is the result of a national policy (as opposed to other causes, such as human error or corporate interests), that weakened security harms the nation economically. After privacy breaches have affected hundreds of millions of individuals, consumers justifiably want products and services from companies they believe build secure technology and do not include backdoors, regardless of who demanded them. Policies mandating backdoors could significantly hurt competitiveness in the information technology sector. Forrester Research Inc. estimates, for example, that recent allegations about US data surveillance activities may reduce US technology sales overseas by as much as US $180 billion, or 25 percent of information technology services, by 2016. Requiring backdoors in the security products used by consumers throughout the world would damage the economy even further.
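
To make the anti-pattern from item 1 concrete, here is a minimal, hypothetical sketch (not taken from any real product; the credential and function names are invented) of the kind of hardcoded “maintenance” login I saw programmers build in. Anyone who reads the source, or extracts strings from the shipped binary, gets the same access the developer reserved for themselves.

```python
import hmac
import hashlib

# Hypothetical illustration only: a hardcoded "maintenance" credential
# baked into the application. Anyone who can read this source (or run
# `strings` on the shipped artifact) now holds a permanent skeleton key.
BACKDOOR_USER = "maint"
BACKDOOR_PASSWORD = "letmein-2015"   # the secret "only the developer knows"

def check_login(username: str, password: str, user_db: dict) -> bool:
    """Return True if the login should be accepted."""
    # The backdoor: bypasses the real user database entirely.
    if username == BACKDOOR_USER and password == BACKDOOR_PASSWORD:
        return True

    # The legitimate path: compare against a stored salted hash.
    record = user_db.get(username)
    if record is None:
        return False
    salt, stored_hash = record
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_hash)
```

Removing the first `if` block, and routing any genuine support access through the same audited credential store as every other account, eliminates that single point of failure.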

Governments and other entities that aim to create backdoors in security technologies, in a well-intended but fatally flawed attempt to improve security, could ultimately compromise information security and privacy and make systems and data more vulnerable. Data Privacy Day on 28 January is an opportune time to point out to government leaders, business leaders and individuals that privacy protections are weakened whenever information security protections are weakened. There are ways to ensure a country’s security without making security tools, such as encryption, weak and vulnerable to exploitation.

Rebecca Herold, CISA, CISM, CIPM, CIPP/IT, CIPP/US, CISSP, FLMI
CEO of Rebecca Herold & Associates and partner of Compliance Helper
Member of ISACA’s Privacy Task Force

[ISACA]

What Government Can (And Can’t) Do About Cybersecurity

In his 2015 State of the Union address, President Obama introduced a number of interesting, if not terribly novel, proposals. Here are six that will have minimal impact.

People are calling 2014 the “Year of the Breach.” President Obama even focused on “cybersecurity” during his 2015 State of the Union address. I’m thrilled that security seems to have finally broken into the public consciousness. It’s a complex problem that requires an international effort, cooperation between public and private sectors, and careful consideration of the best path forward.

The mess we’re in
I’ve written before about the staggering complexity of application security in the modern enterprise. So it’s not too surprising that the level of insecurity has grown over the past 20 years due to automation’s breakneck speed. The infographic below gives a sense of just how large and complex our codebases are. But as with other extremely complex issues, such as healthcare, climate change and education, government intervention is a delicate matter that may do more harm than good.

(Infographic: interactive word cloud of codebase sizes, by David McCandless.)

The commercial sector produces the vast majority of the world’s software. But this market is failing to encourage the development of secure code. Why? Because for the most part, software is a black box. When you buy a car, you can have a mechanic check it out. But software is so complex that it can take months or years to determine whether it’s secure or not. Software is a “market for lemons” where nobody can get paid a fair market price for secure code. So our software ends up stunningly insecure.

I’m not trying to blame the victim here. Malicious attackers are the cause of breaches and we should do what we can to catch them. But given the inherent anonymity of the Internet, the “attribution problem” means that hackers are going to be part of our world for a very long time. This means we’re going to have to do more to protect ourselves.

Proposed government interventions
Let’s quickly review a few of the ideas President Obama proposed.

  1. Establish Federal breach notification legislation to unify the complex patchwork of state laws. This is a great idea in principle, although there will certainly be arguments about the details. For example, the 30-day limit is too long for consumers whose credit card number was stolen, yet too short for companies to ensure their systems are clean. I’d like to see this legislation expanded to cover all breaches, not just those that involve a privacy leak. If you’ve been hacked, even if no privacy breach occurred, your customers have a right to know the details.
  2. Expand information sharing with DHS through the ISACs. President Obama said, “we are making sure our government integrates intelligence to combat cyber threats, just as we have done to combat terrorism.” I’m not convinced that the techniques used to combat terrorists will work on hackers. While information sharing is important, it must be done carefully to protect victims of data breaches from further violation.
  3. Allow prosecutors to pursue anyone related to a hacking incident under Federal RICO statute. Given the difficulty of accurately identifying suspects, gathering evidence, and proving relationships in cyberspace, this approach seems ripe for abuse. There’s nothing wrong with aggressively pursuing cyber criminals, but we can’t forget about due process. How easy would it be to frame someone as a hacker if all that is required is a loose association? What if I friend the wrong person on Facebook or LinkedIn?
  4. Radical expansion of the CFAA. The Computer Fraud and Abuse Act is the federal anti-hacking law. Obama’s proposal expands the definition of unauthorized to include any time a user accesses information “for a purpose that the accesser knows is not authorized by the computer owner.” Basically if you know you’re hacking, then you’re guilty of a felony. This subjective standard does nothing to clarify what behavior is allowed under the statute and will lead to messy court cases and bad law.
  5. Even more CFAA expansion. Further, the proposal criminalizes your security tools if you know they could be used for illegal purposes. Another subjective standard, but even if we got past that, it would still be wrongheaded. To use the language of the Betamax decision, these tools have “substantial non-infringing use.” Disarming our limited supply of security researchers is nothing short of insanity.
  6. Allow government backdoor access to secure messaging applications like WhatsApp and Snapchat. British Prime Minister David Cameron and President Obama have called for mandatory backdoors so that intelligence agencies can scan for possible terrorist activity. The desire for this type of backdoor goes back to the Clipper chip, a notoriously flawed idea to escrow encryption keys with the government. Remember that attackers can still use “super-encryption,” encrypting their messages themselves before handing them to any app, to defeat any backdoor scheme (a brief sketch follows this list). That means we all have to suffer Big Brother with very little benefit in terms of reducing terror.
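
To illustrate why a mandated backdoor in the messaging layer buys so little, here is a minimal sketch of “super-encryption,” assuming the widely used third-party Python `cryptography` package; the `send_via_app` function is a hypothetical stand-in for any messaging app. The sender encrypts the payload with a key only the recipients hold, so even a transport with full government access sees nothing but ciphertext.

```python
# Minimal sketch of "super-encryption" using the third-party `cryptography`
# package (pip install cryptography). The messaging app, backdoored or not,
# only ever handles ciphertext.
from cryptography.fernet import Fernet

# Key shared out of band between sender and recipient (never given to the app).
shared_key = Fernet.generate_key()
cipher = Fernet(shared_key)

def send_via_app(recipient: str, payload: bytes) -> bytes:
    """Hypothetical stand-in for any messaging app's send function."""
    # A mandated backdoor here would expose `payload` -- but `payload`
    # is already ciphertext, so there is nothing useful to read.
    print(f"app transmits {len(payload)} opaque bytes to {recipient}")
    return payload

# Sender encrypts before the app ever sees the message.
ciphertext = cipher.encrypt(b"meet at noon")
delivered = send_via_app("alice", ciphertext)

# Recipient decrypts locally with the shared key.
print(cipher.decrypt(delivered))  # b'meet at noon'
```

Because the extra layer costs a determined attacker almost nothing, the backdoor mainly exposes law-abiding users.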

How to really fix the software market
What strikes me about all these proposals is that they are not very likely to have a substantial effect on the software market. They are all reactive, attempting to target the bad guys rather than focusing on enhancing our own defenses. I think we are capable of producing radically more secure software than we do today. But we’re going to have to raise the bar for developers everywhere. The good news is that we don’t have to resort to making developers liable for vulnerabilities or other tricks.

We need to ensure that software buyers have the same information as sellers about what they are buying. We should start with minimally disruptive interventions, such as requiring organizations to disclose how their software was designed, built, and tested, along with information about the people, processes, and tools used. Imagine a “Security Facts” label for software, the equivalent of a “Nutrition Facts” label or a “Material Safety Data Sheet” (a rough sketch of what such a label might contain appears below). Studies of labeling regimes have shown that even if consumers don’t use the labels at all, they still have a significant effect on the companies producing the products.
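
As a purely hypothetical illustration (the article does not define a format, and every field name below is invented), a machine-readable “Security Facts” disclosure might look something like this:

```python
from dataclasses import dataclass, field, asdict
import json

# Hypothetical "Security Facts" disclosure -- every field name here is
# invented for illustration; no standard schema is implied.
@dataclass
class SecurityFacts:
    product: str
    version: str
    threat_model_documented: bool          # was a threat model produced?
    security_trained_developers_pct: int   # share of devs with security training
    static_analysis_tools: list[str] = field(default_factory=list)
    dependency_count: int = 0
    known_open_vulnerabilities: int = 0
    last_penetration_test: str = "never"   # ISO date or "never"

label = SecurityFacts(
    product="ExampleCRM",
    version="4.2.0",
    threat_model_documented=True,
    security_trained_developers_pct=60,
    static_analysis_tools=["linter-x", "scanner-y"],
    dependency_count=148,
    known_open_vulnerabilities=3,
    last_penetration_test="2014-11-02",
)

# Publish alongside the product, the way a nutrition label sits on a package.
print(json.dumps(asdict(label), indent=2))
```

Even a coarse disclosure like this would let buyers compare vendors, which is exactly the information asymmetry the “market for lemons” describes.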

One thing’s for sure. Cybersecurity is on the government’s agenda for 2015.

A pioneer in application security, Jeff Williams has more than 20 years of experience in software development and security. Jeff co-founded and is the CTO of Aspect Security, an application security consulting firm that provides verification, programmatic and training services. He is also founder and CTO of Contrast Security, which offers a revolutionary application security technology that accurately identifies vulnerabilities at portfolio scale without requiring experts. From 2004 to 2012, Jeff served as the Global Chairman of the OWASP Foundation and created many open-source standards, tools, libraries, and guidelines – including the OWASP Top Ten.

[DarkReading]

How Well Do You Know Your Zero Days and APTs?

It’s time to take the Zero Day & APT Challenge, where knowledge of the worst threats out there could win you great prizes.

Get on in here and show us what you got! (And remember… Palo Alto Networks Traps would have prevented all of these, even without prior knowledge.)

[Palo Alto Networks Blog]

World Leaders Focus on Cybersecurity, But Survey Shows 86% See A Global Skills Shortage

In Washington tonight, US President Barack Obama will propose legislative action to focus on cybersecurity during his State of the Union address. In Davos, 2,500 world leaders from government, industry and civil society are gathering today for the World Economic Forum (WEF) to discuss what WEF Chairman Klaus Schwab describes as “The New Context.” Front and center on the agenda are cybersecurity, risk and the Internet of Things.

Large-scale data breaches have brought this issue to the forefront and showcase that even well-protected, mature organizations face difficulties keeping data secure. And with cyberattacks rising exponentially, it’s no surprise that organizations are aggressively trying to hire those with the skills to prevent them.

There is one problem, however: the severe shortage of skilled cybersecurity professionals. According to the ISACA 2015 Global Cybersecurity Status Report, 86 percent of respondents believe there is a shortage of skilled cybersecurity professionals, and 92 percent of those whose organizations plan to hire cybersecurity professionals in 2015 say it will be difficult to find skilled candidates. The survey, conducted 13-15 January 2015, polled more than 3,400 ISACA members in 129 countries. It found that close to half (46 percent) expect their organization to face a cyberattack in 2015, and that 83 percent believe cyberattacks are one of the top three threats facing organizations today.

ISACA, which assisted the National Institute of Standards and Technology (NIST) in the development of the US Cybersecurity Framework, has launched its Cybersecurity Nexus (CSX) program. CSX is a global resource for enterprises and professionals that helps identify, develop and train the cybersecurity workforce, while also raising the awareness of cybersecurity throughout the organization. CSX has extensive resources to address the cybersecurity skills gap through training, mentoring, performance-based credentials and applied research. CSX also now offers a Cybersecurity Legislation Watch center, which features the new CSX Special Report.

Cybersecurity is everyone’s business. It is absolutely essential that we accelerate the pace of creating a cyber-aware and cyber-trained society. And we need to do it together. As Benjamin Franklin is said to have remarked, “We must all hang together, or, most assuredly, we shall all hang separately.” At ISACA, we are doing our part by providing the knowledge and tools to assure trust in a digital world.

Matt Loeb, CAE
Chief Executive Officer, ISACA

[ISACA]
