Securing an Evolving Cloud Environment

The chief information officer (CIO) of a large utility provider had decided to move email, file shares, video sharing and the company’s internal web site to the cloud and needed to know the security requirements for this project within two weeks. The organization already had security requirements in place for traditional third-party vendors; however, these requirements were not a good fit for the cloud services the company was looking to adopt.

The director of security at the utility provider approached SecureState, a management consulting firm specializing in information security, with the problem.

Unlike traditional third-party solutions, where the vendor is responsible for all, or most, of the security controls, in the cloud there are often cases where the client is responsible for managing and maintaining key security controls. For example, if a company were hosting a homegrown application at a Platform as a Service (PaaS) provider, the client would generally be responsible for the security of the application itself (figure 1). The cloud provider of the PaaS would be responsible for securing the platform and infrastructure supporting the application. However, if a company selected a Software as a Service (SaaS) application, the cloud provider would generally be responsible for all layers of the stack, and the client would have very little responsibility or control over the security of the application (figure 2).


With that in mind, when moving to the cloud it is critical to clearly outline who is responsible for each component and to establish requirements that give the organization its desired level of security while remaining flexible enough to fit the different service models available from cloud providers.

For this utility provider, the move of these initial four services was part of a larger effort to eventually migrate all corporate IT services to the cloud, so in addition to quickly developing requirements for the applications listed previously, the director of security also needed a way to rapidly assess and categorize future cloud service providers to determine what minimum set of controls should be applied. This system also needed to be flexible enough to support new technology developments as cloud solutions mature. Further, a system would need to be put in place to track and monitor compliance of these key business partners to the required controls.

Building a Framework

To assist with this, SecureState created a program to review, approve and manage these cloud providers. The program was built around a custom cloud security framework (CSF) that the team developed. This framework comprised numerous components, including:

  • Data classification and cloud service provider categorization guidelines
  • A control set
  • Vendor questionnaires mapped back to the control set
  • Federated identity management standards

To create this framework, the team met with stakeholders to gather business, technical and security requirements. The framework leveraged the utility company’s existing security policies, procedures and standards while adding requirements specific to cloud computing environments.

The controls in the framework were broken down by the classification of the data processed and/or stored by the provider (public, internal, confidential and regulatory). Each level added another layer of controls that needed to be present in the environment. To ensure that the controls were properly applied to various cloud models and use cases, a lookup table was created to show who is commonly responsible for managing each of the controls in the framework, depending on what type of cloud service model (e.g., SaaS, IaaS, PaaS) is being used.
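The responsibility lookup table described above might be sketched as a simple mapping. The control names and responsibility assignments below are illustrative placeholders, not the utility's actual framework; they simply mirror the PaaS/SaaS split discussed earlier.

```python
# Hypothetical sketch of the responsibility lookup table described above.
# Control names and assignments are illustrative placeholders only.

RESPONSIBILITY = {
    # (control, cloud service model) -> party commonly managing the control
    ("physical_security",    "SaaS"): "provider",
    ("physical_security",    "PaaS"): "provider",
    ("physical_security",    "IaaS"): "provider",
    ("os_patching",          "SaaS"): "provider",
    ("os_patching",          "PaaS"): "provider",
    ("os_patching",          "IaaS"): "client",
    ("application_security", "SaaS"): "provider",
    ("application_security", "PaaS"): "client",
    ("application_security", "IaaS"): "client",
}

def responsible_party(control: str, model: str) -> str:
    """Look up who commonly manages a given control under a service model."""
    return RESPONSIBILITY[(control, model)]

# Example: in a PaaS model the client typically owns application security.
print(responsible_party("application_security", "PaaS"))  # -> client
```

A table like this lets the same control set be reused across service models, with only the owning party changing.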

Special attention was given to the regulatory requirements related to the data that would be stored and processed by the cloud providers, as the utility company needed to comply with several different regulations:

  • North American Electric Reliability Corporation (NERC) Critical Infrastructure Protection (CIP) standards because the utility provides power generation and transmission
  • Payment Card Industry Data Security Standard (PCI DSS) because the utility processes credit cards
  • US Health Insurance Portability and Accountability Act (HIPAA) because the utility self-insures its employees for health benefits

Requiring all cloud service providers to meet these regulatory requirements would be onerous, if not impossible. Therefore, appropriate regulatory controls would be applied only in environments that required them.

For example, portions of the utility’s employee health insurance process, specifically those related to the corporate file share, would migrate to the cloud. Because of this, additional steps needed to be taken to ensure that the provider of the file-sharing service could meet the related HIPAA requirements.

Once the framework was completed, the team met with executives at the organization to review the CSF. During this meeting, SecureState conveyed the importance of the framework to the business and outlined how the organization should align to it. Once executive management buy-in was obtained, the framework was adopted for use by all lines of business moving services to the cloud, not just IT. This provided the company with a unified approach to managing the security of cloud services, thus ensuring all corporate data moved to the cloud were appropriately secured.

Managing the Security of Cloud Services

The director of security also needed to develop processes to prioritize, review and track which cloud services were approved for use, as well as a program to manage and track what data were being stored and/or processed by these cloud services. Without a robust program in place, the security department would quickly lose control of where sensitive data were stored and which vendors had been approved or denied.

The SecureState team created an online portal where lines of business inside the utility can enter requests to have potential cloud service providers (CSPs) reviewed. Once a provider is entered for review, a questionnaire is generated based on the type of cloud service used and the data stored and/or processed by that provider. This questionnaire is then sent to the point of contact at the cloud service provider to gather information on what security controls are present in its environment. Once the questionnaire is complete, SecureState works with the CSP and the client to map the cloud service into the CSF. To ensure the lines of responsibility are clearly defined, each requirement in the CSF is assigned to either the CSP or the client. Depending on the categorization of the data being stored or processed by the provider, additional testing or interviews outside of the questionnaire may be required to determine which controls are present and to verify that they are properly implemented. A similar process is also followed to ensure that the controls the organization is responsible for implementing internally are present and properly implemented for each new cloud service entering the environment.
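The questionnaire-generation step could be sketched roughly as follows, assuming each classification level simply layers additional controls on top of the previous one, as described for the CSF. All control names here are invented for illustration.

```python
# Cumulative control layers per data classification, as described for the CSF.
# Control names are invented placeholders, not the utility's real control set.

LEVELS = ["public", "internal", "confidential", "regulatory"]

CONTROLS_BY_LEVEL = {
    "public":       ["basic_access_control"],
    "internal":     ["user_authentication", "activity_logging"],
    "confidential": ["encryption_at_rest", "encryption_in_transit"],
    "regulatory":   ["audit_trail_retention", "breach_notification"],
}

def controls_for(classification: str) -> list:
    """Return the cumulative control set for a data classification."""
    required = []
    for level in LEVELS[: LEVELS.index(classification) + 1]:
        required += CONTROLS_BY_LEVEL[level]
    return required

def build_questionnaire(classification: str) -> list:
    """Generate one question per required control, mapped back to the CSF."""
    return ["Is the '%s' control implemented, and who manages it?" % c
            for c in controls_for(classification)]

# A provider handling confidential data answers for three layers of controls.
print(len(build_questionnaire("confidential")))  # -> 5
```

Keying the questionnaire to the data classification keeps reviews of low-risk providers short while still forcing full coverage for regulated data.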

During this review process, the risk posed by the proposed solution is enumerated and areas where the solution does not meet the CSF are outlined. Using this information, the utility’s security group can determine whether the new solution poses an acceptable level of risk, should be rejected or requires additional controls.

This portal also provides an inventory of which approved cloud applications/providers are currently being used in the environment and any exceptions associated with each provider. Additional reminders are set up to reassess each CSP annually, at a minimum. The depth of the reassessment is determined by the type of data processed or stored by the provider and any control exceptions granted.

Lessons Learned

Since implementing the CSF, the utility has applied it to four initial cloud services and a handful of subsequent providers. While applying the framework, a number of lessons were learned:

  1. Getting in front of the providers before the contract is signed to gain their full support. The utility had a large challenge applying the CSF to the initial set of vendors, as the contracts with these vendors had already been signed by the time security was brought in to review them. Because of this, the team had little leverage to get the vendors to make changes to their environments to meet the utility’s security requirements.
  2. Ensuring the use and completion of the utility’s questionnaire. Many of the providers preferred to provide third-party audit reports such as Service Organization Control 2 (SOC2) reports or self-assessments such as the Cloud Security Alliance (CSA) Consensus Assessments Initiative Questionnaire (CAIQ) instead of completing the utility company’s questionnaire. In these cases, the team would map the results back to the framework manually. Unfortunately, in most cases the information provided in the SOC2 report or CAIQ did not contain enough detail and further interviews and assessments were required to fill in the gaps. These processes ended up taking longer than initially planned. As a result, it was determined that this process would go more smoothly if the questionnaire was completed first. Thus, the team focused on streamlining the questionnaire and warned the project team that if the vendor did not complete it, the time required to review the vendor would lengthen, possibly impacting the project timeline. With this concern in mind, often the line of business could pressure the prospective providers to complete the questionnaire.
  3. Prioritizing provider assessments based on services provided. Follow-up interviews and assessments took longer than initially planned, and a method to prioritize service providers had to be developed to ensure high-priority service providers were assessed first. In some cases, lower-priority providers that housed only public data received minimal follow-up interviews and assessments. This was done to ensure that providers could be reviewed and approved quickly with the resources available.
  4. Educating the line of business on the cloud provider review process and following that process. Large projects that went through the company’s central procurement, or project management, office were easily flagged for provider review. However, many smaller projects that were initiated by the lines of business were small enough that they did not require involvement from these groups. Therefore, the security team did not hear about some smaller projects until they were fully implemented and, in some cases, had been in operation for a few months. To address this, the security department now makes a concerted effort to reach out to all lines of business to educate them on the process while working to quickly review new providers so this review is not a bottleneck in the process.

Conclusion

By pulling together the right team, the utility was able not only to address its initial problem of providing security requirements for the first group of vendors, but also to develop a solution to manage future cloud vendors. This solution allowed the utility to quickly and easily review future providers and provided a program to manage them, thus ensuring corporate information stored in-house or in the cloud is protected equally.

The best way to start this process in any organization is to inventory the existing cloud services already in use. Many organizations have already started to leverage cloud services, often without audit, IT or security’s knowledge. By generating an inventory of which service providers are currently being used and what data are being stored or processed there, the organization can get a handle on what corporate data may be underprotected in these environments and use this information as leverage to start its own internal project to create a CSF for the environment.

Matthew Neely is the director of strategic initiatives at SecureState (www.securestate.com). Neely uses his technical knowledge to lead the Research & Innovation team to develop threat intelligence tools and methodologies for the challenging problems of the information security industry. Previously, he served as SecureState’s vice president of consulting and manager of the Attack & Defense Team. With more than 10 years of experience in the area of penetration testing and incident response, Neely brings the ability to think like an attacker to every engagement. He uses this skill to find creative ways to bypass controls and gain access to sensitive information. Prior to working at SecureState, Neely worked in the security department at a top-10 bank where he focused on penetration testing, assessing new technology and incident response.

[Source: ISACA]

The Cybersecurity Canon: Secrets and Lies

For the past decade, I have held the notion that the security industry needs a Cybersecurity Canon: a list of must-read books where the content is timeless, genuinely represents an aspect of the community that is true and precise and that, if not read, leaves a hole in a cybersecurity professional’s education.

If you’d like to hear more about my Cybersecurity Canon idea, take a look at the presentations I made at this year’s RSA Conference and at Ignite 2014. As always, I love a good argument, so feel free to let me know what you think.

The Cybersecurity Canon: Secrets and Lies: Digital Security in a Networked World (2000) by Bruce Schneier

Secrets and Lies: Digital Security in a Networked World is the perfect book to hand to new bosses or new employees coming in the door who have not been exposed to cybersecurity in their past lives*. It is also the perfect book for seasoned security practitioners who want an overview of the key issues facing our community today. Schneier wrote it more than a decade ago, but he talks about a variety of ideas so ahead of their time that they are still relevant today. Concepts he touches on include:

  • The idea that “security is a process, not a product.” With that one line, Schneier captures the essence of what our cybersecurity community should be about.
  • No matter how advanced security technology becomes, people are still the weakest link in the security chain.
  • The cyber-adversary as something more than just a hacker.
  • Making the Internet more secure by strengthening confidentiality, integrity, and availability (CIA), as well as improving Internet privacy and anonymity.
  • Challenging the idea that security practitioners must choose between security and privacy.
  • Holding software vendors accountable for security risks in their code.
  • The need for a Bitcoin-like capability long before Bitcoin became popular.

The content within Secrets and Lies is a good introduction to the cybersecurity community, and Schneier tells the story well.

The Story

Secrets and Lies demonstrates Schneier’s evolution as an early thought leader in the cybersecurity community and outlines some key concepts that are still valid today.

Security Is a Process

In the preface, Schneier freely admits to thinking in his earlier life that cryptology would solve all of our Internet security problems. In Secrets and Lies, however, he is forced to acknowledge upfront that technology by itself does not even come close to solving these problems. You do not get security out of a box. You get security by applying people, process, and technology to a problem set, and the more complex we make things, the more likely it is that we are going to screw up the process.

People Are the Weakest Link

The weak link in all of this is the people. You can have the best tools on the planet configured to defend your enterprise, but if you do not have qualified people to maintain them and to understand what the tools are telling you, you have probably wasted your money. This goes hand in hand with the user community, too. It doesn’t matter that you spent a gazillion dollars on Internet security this year if the least-security-savvy people on your staff take their laptops home and unwittingly install malcode on their machines.

Risk

When it comes to business risk, cybersecurity isn’t its own category separate from more traditional risks. What I have noticed in my career is that many security practitioners and senior-level company leaders treat “cyber risk” as a thing unto itself and throw the responsibility for it over to the “IT guys” or the “security dorks.” In my mind, this is one of our community’s great failures. It is up to all of us to convey that essential idea to senior leadership in our organizations.

Software Liability

Every new piece of software deployed has the potential to expose the enterprise to additional threats in the form of new vulnerabilities, and vendors have no liability for this. In other industries, if a vendor were to produce a defective product that causes monetary damage to a company, that company would most likely sue that vendor with a high probability of success in court. It is not like that in the commercial software business or even in the open-source movement. Vendors will patch their systems for sure, but they accept no responsibility for, let’s say, hackers stealing 400 million credit cards from a major retail chain. Schneier is aghast that the user community has let vendors get away with this stance.

Adversary Motivations

Secrets and Lies was the first time that I had seen an author characterize the adversary as a person or a group with motives and aspirations.

“Adversaries have varying objectives: raw damage, financial gain, information, and so on. This is important. The objectives of an industrial spy are different from the objectives of an organized-crime syndicate, and the countermeasures that stop the former might not even faze the latter. Understanding the objectives of likely attackers is the first step toward figuring out what countermeasures are going to be effective.”

This was a revelation to me. At this point in my career, I just thought “hackers” were trying to steal my stuff. This is Schneier’s first cut of a complete adversary list:

  • Hackers
  • Lone Criminals
  • Malicious Insiders
  • Industrial Espionage Actors
  • Press
  • Organized Criminals
  • Police
  • Terrorists
  • National Intelligence Organizations
  • Info Warriors

In my work, I have found it useful to refine Schneier’s list of people into the following adversary motivations:

  • Cyber Crime
  • Cyber Espionage
  • Cyber Warfare
  • Cyber Hacktivism
  • Cyber Terrorism
  • Cyber Mischief

The bottom line is that these adversaries have a purpose, and it helps network defenders if they understand what kind of adversaries are likely to attack the defender’s assets.

Things Stay the Same

Sadly, even though Schneier published Secrets and Lies in 2000, all of these things are still true, and there is no real solution in sight. Many organizations still think that installing the latest shiny security toy to hit the market will make their networks more secure. They don’t stop to think that they might be better off if they just made sure that the toys they already have installed on their network worked correctly.

People are still the weak link both in the security operations center (SOC) and in the general user community. As I have written elsewhere, talented SOC people are hard to come by, and many organizations still spend resources on robust employee-training programs, but the results are mixed at best.

CISOs are still struggling to convey the security risk message to the C-Suite. Most of us came up through the technical ranks and think colorful bar charts about the numbers of systems that have been patched are pretty cool. The CEO couldn’t care less about those charts and instead wants to know what the charts mean in terms of material risk to the business.

Finally, software vendors still have no liability when it comes to deploying faulty software that results in monetary loss to a customer. This just seems to be something we have all accepted, that it is much better to build a working piece of code first and then worry how to secure it later. I know entrepreneurs prefer this method because the alternative slows the economic engine down if developers spend time adding security features to a new product that drives no immediate revenue opportunities. But this is the great embarrassment to the computer science field: we have not eradicated bugs like buffer overflows in modern code. How is it possible that we can send people to the moon but we cannot eliminate buffer overflows in code development? Don’t get me wrong; the industry has made great strides in developing tools and techniques in these areas—just look at the Building Security in Maturity Model (BSIMM) project to see for yourself. But the fact that, as a cybersecurity community, we have not made it mandatory to use these techniques is one of the reasons we are still often considered a “field of study.”

What We Need

In the end, Schneier makes the case for things that the cybersecurity community needs in order to make the Internet more secure. Long before the acronym became a staple on Certified Information Systems Security Professional (CISSP) exams, he advocated the need to strengthen confidentiality, integrity, and availability (CIA). He does not call it CIA in the book, but he talks at length about the concepts. He was prescient in his emphasis on the need for Internet privacy and Internet anonymity and was one of the first thought leaders to start asking the question about security versus privacy in terms of government surveillance. He also anticipated the need for a Bitcoin-like capability long before Bitcoin became popular.

The Tech

Unfortunately, when you begin to write a technology book about the current state of the art surrounding cybersecurity, much of what you write about is already outdated as you go to press. As I was rereading Schneier’s book, I chuckled to myself when he referenced his blindingly fast Pentium III machines running Windows NT. The world has indeed changed since 2000.

Schneier wrote Secrets and Lies at the time when the industry had just accepted that a stateful inspection firewall was not sufficient to secure the enterprise.

“Today’s firewalls have to deal with multimedia traffic, downloadable programs, Java Applets, and all sorts of weird things. A Firewall has to make decisions with only partial information: It might have to decide whether or not to let a packet through before seeing all the packets in transmission.”

Besides firewalls, he describes other controls that the cybersecurity community has decided are necessary to secure the perimeter, such as demilitarized zones (DMZs), virtual private networks (VPNs), application gateways, intrusion detection systems, honeypots, vulnerability scanners, and email security. Since the book’s publication, security vendors have added even more tools to this conga line, tools like URL filters, Domain Name System (DNS) monitoring, sandboxing technology, security information and event management (SIEM) systems, and protocol capture and analysis tools.

As of May 2014, the cybersecurity community is mounting a bit of a backlash against the vendor community’s conga line strategy. Practitioners simply can’t manage it all. The best and most recent example of this is the Target data breach. Like many of us, the Target security team installed the conga line of security products and even had a dedicated SOC to monitor them. According to published reports, the controls dutifully alerted the SOC that a breach was in progress, but there was apparently so much noise in the system (and perhaps Target’s process was not as efficient as it could have been) that nobody in the organization reacted to the breach until it was too late. It’s a perfect example of why many organizations are looking for simpler solutions rather than continuing to add new tools to the security stack.

Cryptology

According to Schneier, underlying everything is cryptology. As you would expect from a cryptologist, Schneier believes that his field of study is the linchpin of the entire idea of Internet security.

“Cryptography is pretty amazing. On one level, it’s a bunch of complicated mathematics. On another level, cryptography is a core technology of cyberspace. In order to understand security in cyberspace, you need to understand cryptography. You don’t have to understand the math, but you have to understand its ramifications. You need to know what cryptography can do, and more importantly, what cryptography cannot do.”

I agree. (Note: The difference between the terms cryptography, cryptanalysis, cryptology, and cryptologist is left as an exercise for the reader.) I would say that the cybersecurity community has failed in this regard. While it is true that cryptography is the underlying technology that makes it possible to secure the Internet, it is still too complicated for the general user to leverage. In light of the Edward Snowden revelations—that we not only have to worry about foreign governments spying on our electronic transmissions, but we also have to worry about our own government doing it—the fact that most people do not know how to encrypt their own email messages as a matter of course is a testament to our industry’s failure.

Kill Chain

Schneier makes a distinction between computer and network security: the conga line of security tools that makes up the security stack at the network perimeter is not the same as the set of tools you need to secure the endpoint. While this is still true today, the cybersecurity community has merged these two ideas since Schneier’s book was published.

The thought is that it does not make sense to consider network and endpoint security separately; it makes more sense to think of everything as a system, as we do at Palo Alto Networks. As organizations develop indicators of compromise at both the network and endpoint layers, essentially the Kill Chain model, the cybersecurity community can develop advanced adversary profiles about the attacker’s campaign plan.

In conclusion, the ideas Schneier examines in Secrets and Lies were years ahead of their time. They show the cybersecurity industry just how far we have come and how far we still have to go. Because of this, Secrets and Lies is a candidate for the Cybersecurity Canon, and you should have read it by now.

*Full disclosure: The first civilian job I took after I retired from the US Army was with the company that Bruce Schneier founded called Counterpane, so I may be a little biased. 

[Source: ]

CVE-2014-1776: How Easy It Is To Attack These Days

This post originally appeared on Cyvera.com.

Just about a week ago, everyone was alarmed by a new zero-day vulnerability affecting Internet Explorer 6 through 11. The vulnerability was used in attacks in the wild, which targeted IE 8 to IE 11. The impact was so severe that Microsoft hurried to issue an out-of-band patch. Today, I would like to show how relatively easy it is to attack these days, when you can just reuse code.

We will compare the attack that used CVE-2014-0322 (then, an IE zero-day) to the current attacks utilizing CVE-2014-1776. We will show an almost exact match between the two templates for the attacks, indicating that either the same group was behind the two campaigns, or that the ease of acquiring used exploit code (even from public sources) allows different groups to quickly reuse and adapt the same code to the next vulnerability.

Overview

Both attacks utilize use-after-free vulnerabilities in IE, and leverage Flash Player in order to easily bypass DEP and ASLR. In both cases the scheme is the same:

  • Load a Flash SWF file.
  • Spray the heap with ActionScript uint vector objects with 0x3FE elements, for a total of 0x1000 bytes (i.e., 1 page) of memory for each vector object (including the vector’s management information, which should be inaccessible directly from the ActionScript code).
  • Spray the heap with references to a Sound object, to be used later as the first trigger to the shellcode.
  • Call a JavaScript function in the HTML page and set a timer to invoke another AS function.
  • Trigger a UAF vulnerability using the JavaScript code, while spraying the heap in order to ensure that the used block is controlled.
  • Use the bug to change an AS vector’s size (which is inaccessible directly from AS).
  • Back in the timed function in the SWF, use the modified vector to change an adjacent vector’s size to encompass all virtual memory, effectively achieving full memory disclosure and writing abilities (where current permissions allow that).
  • Find a module in memory and reach NTDLL by hopping through modules’ import address tables.
  • Find a stack pivoting gadget in NTDLL as well as the address of ZwProtectVirtualMemory.
  • Overwrite the virtual function table pointer of the Sound object to initiate code execution by calling the Sound object’s (replaced) toString function. Then, use a few ROP gadgets to pivot the stack, change the permissions on the shellcode to RWX, and execute the shellcode.
  • Restore normal operation.
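The page-sized spray in the second step can be sanity-checked with a little arithmetic. Treating the vector's management information as an 8-byte header is an assumption on my part; it is simply what makes the 0x3FE element count come out to exactly one page.

```python
ELEMENT_SIZE = 4      # an ActionScript uint is 4 bytes
N_ELEMENTS = 0x3FE    # element count used by both exploits
PAGE = 0x1000         # one 4 KB page

payload = N_ELEMENTS * ELEMENT_SIZE   # 0xFF8 bytes of vector elements
header = PAGE - payload               # room left for management information

print(hex(payload), header)  # -> 0xff8 8
assert payload + header == PAGE       # each sprayed vector spans one full page
```

Sizing each vector to exactly one page makes the heap layout predictable, which is what lets the later out-of-bounds write land reliably on an adjacent vector's length field.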

There are, however, some improvements in this CVE-2014-1776 attack over the CVE-2014-0322 attack. For example, while all the hard work is crammed into one big function in the CVE-2014-0322 attack, the authors of the CVE-2014-1776 attack strove for cleaner code and broke the huge pile of code into many smaller functions, which constitute basic primitives for the larger goal. In fact, now it is even easier to reuse this code for the next exploit…

The Flash Spray

In both cases vectors of uints are sprayed (with 0x3FE elements in each vector), as well as references to a Sound object. The values of the uints in the sprayed vectors are constructed so as to fit the address the attacker had chosen and the vulnerability (and the browser, if applicable).

[Screenshots of the CVE-2014-1776 and CVE-2014-0322 spray code are not reproduced here.]

The UAF Triggering

In both attacks, the JS code that triggers the vulnerability is called from the ActionScript code, using the external interface. The AS code then registers a function to be invoked at a later time and searches for the artifacts of the triggered vulnerability. Although the code performing the actual UAF is almost the same, there are some differences in behavior here:

  • In the current attack, the JS function gets a parameter, which holds JavaScript code that is crucial for the vulnerability to arise. In contrast, in the previous attack, the entire JavaScript code was present in the HTML.
  • In the current attack, the JS code sent to the external function lies encrypted (using RC4) in the SWF file, and is decrypted only prior to sending it to the external interface. Other parts in the SWF (relating to the shellcode) are also encrypted. Consequently, if you only have the HTML file, you cannot reproduce the zero-day (and vice versa). In contrast, the previous attack had no encrypted elements at all.
  • In yet another effort to make sure the zero-day is not compromised even if one file falls into the wrong hands, in the current attack the HTML file was split into two files, the second one containing the JS code used for the heap-spray.
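RC4, mentioned above, is a standard stream cipher; the compact reference implementation below (not the attackers' actual code) illustrates why the scheme only hides the payload from casual inspection: the key must ship alongside the ciphertext in the SWF, and the same function both encrypts and decrypts.

```python
def rc4(key: bytes, data: bytes) -> bytes:
    """Textbook RC4: key-scheduling algorithm (KSA) followed by the PRGA
    keystream XORed over the data. Encrypt and decrypt are the same call."""
    S = list(range(256))
    j = 0
    for i in range(256):                     # KSA: permute S under the key
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    out = bytearray()
    i = j = 0
    for byte in data:                        # PRGA: XOR keystream over data
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        out.append(byte ^ S[(S[i] + S[j]) % 256])
    return bytes(out)

ciphertext = rc4(b"Key", b"Plaintext")
print(ciphertext.hex())                      # -> bbf316e8d940af0ad3
print(rc4(b"Key", ciphertext))               # -> b'Plaintext'
```

Because decryption needs nothing beyond the SWF itself, the encryption here is compartmentalization (keeping the zero-day unreproducible from the HTML alone), not real confidentiality.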

The heap-sprays, though, are very much alike.

[Screenshots of the CVE-2014-1776 and CVE-2014-0322 heap-spray code are not reproduced here.]

Memory Ownage

In both cases this is pretty easy – look for the modified length of the vector (that is what the IE vulnerability was used for), use the modified vector to modify the adjacent vector’s length to span all memory, and use the second modified vector for memory read and write operations.
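A toy model of that two-stage corruption, using a flat byte buffer in place of the Flash heap, looks like this. The layout, offsets, and sizes are invented for illustration; only the mechanism (corrupt one length, then use the inflated bounds to overwrite the neighbor's length) mirrors the attacks.

```python
import struct

# Two adjacent 'vectors' in a flat buffer: [lenA][A elements][lenB][B elements]
mem = bytearray(0x40)
struct.pack_into("<I", mem, 0x00, 4)   # lenA = 4 elements
struct.pack_into("<I", mem, 0x14, 4)   # lenB = 4 elements (0x14 = 4 + 4 * 4)

def vec_write(base, length, index, value):
    """Bounds-checked element write, as the ActionScript VM would enforce."""
    assert index < length
    struct.pack_into("<I", mem, base + 4 + 4 * index, value)

# Step 1: the use-after-free corrupts lenA out of band (simulated here).
struct.pack_into("<I", mem, 0x00, 0xFFFF)
lenA = struct.unpack_from("<I", mem, 0x00)[0]

# Step 2: A's inflated bounds let a 'legal' write at index 4 land on lenB,
# growing the adjacent vector to span (in the real exploit) all of memory.
vec_write(0x00, lenA, 4, 0x3FFFFFFF)
lenB = struct.unpack_from("<I", mem, 0x14)[0]
print(hex(lenB))  # -> 0x3fffffff
```

Every subsequent read and write in the exploit then goes through the second vector's now-unbounded index, which is why no further use of the IE bug is needed.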

[Screenshots of the CVE-2014-1776 and CVE-2014-0322 memory read/write code are not reproduced here.]

Looking for Modules and Functions

This is pretty straightforward – find a module in memory, go backwards to find its base, parse the PE header and look for a function imported from KERNEL32.DLL, repeat the same process to go from KERNEL32.DLL to NTDLL.DLL, and then parse its import table looking for the needed functions. However, the code for the recent attack has one improvement over the older code: The new code uses the Sound object’s vftable to get a function pointer which points inside the Flash OCX, while the older code scans the memory and tries to find an executable image by brute-forcing.
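The base-finding walk described above can be sketched against a synthetic module image. Only the 'MZ'/'PE\0\0' signatures and the e_lfanew field at offset 0x3C come from the PE format; the addresses, page layout, and read helper are invented for illustration.

```python
import struct

PAGE = 0x1000

def build_fake_module() -> bytes:
    """Synthetic minimal PE-like image: 'MZ', e_lfanew at 0x3C, 'PE\0\0'."""
    img = bytearray(PAGE)
    img[0:2] = b"MZ"
    struct.pack_into("<I", img, 0x3C, 0x80)  # e_lfanew -> PE header offset
    img[0x80:0x84] = b"PE\0\0"
    return bytes(img)

def find_module_base(read, addr: int) -> int:
    """Walk backwards page by page from an address inside the module
    until the 'MZ' DOS signature marks the module base."""
    addr &= ~(PAGE - 1)
    while read(addr, 2) != b"MZ":
        addr -= PAGE
    return addr

def pe_header_offset(read, base: int) -> int:
    """Parse e_lfanew from the DOS header and verify the 'PE\0\0' signature."""
    e_lfanew = struct.unpack_from("<I", read(base, 0x40), 0x3C)[0]
    assert read(base + e_lfanew, 4) == b"PE\0\0"
    return e_lfanew

# Simulated memory: one module mapped at 0x10000000 followed by data pages.
module_base = 0x10000000
memory = {module_base: build_fake_module(),
          module_base + PAGE: bytes(PAGE),
          module_base + 2 * PAGE: bytes(PAGE)}

def read(addr: int, n: int) -> bytes:
    page = addr & ~(PAGE - 1)
    return memory[page][addr - page:addr - page + n]

leaked_pointer = module_base + 2 * PAGE + 0x123  # e.g., a leaked vftable entry
base = find_module_base(read, leaked_pointer)
print(hex(base), hex(pe_header_offset(read, base)))  # -> 0x10000000 0x80
```

From the PE header, real exploit code goes on to parse the import tables and hop from module to module until it reaches NTDLL, exactly as the text describes.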

[Side-by-side code excerpts: CVE-2014-1776 vs. CVE-2014-0322 (screenshots not reproduced)]

Running the Shellcode

In both cases, the Sound object's vftable pointer is overwritten to point to a pre-crafted area in memory. The Sound object's toString method is then called (#28 in the table), which runs a stack-pivot gadget chained to a gadget that calls ZwVirtualProtect on the shellcode, which immediately follows. The shellcode begins by saving information and restoring the overwritten values.

[Side-by-side code excerpts: CVE-2014-1776 vs. CVE-2014-0322 (screenshots not reproduced)]

Summary

We have shown a very high correlation between the exploit code used in the CVE-2014-1776 attack, and the exploit code used in the CVE-2014-0322 attack. Clearly, the same code base was used. Whether this is indicative of the same actor or not, we cannot tell, since all code was freely available on the net when the recent attack commenced.

Looking at the entire SWF file in both cases, it can be seen that some mistakes were made, and some code was copied without actually being used or understood. Nevertheless, the high correlation between the two exploits shows how easy it has become to reuse proven code from past exploits when preparing the next attack. This means organizations need to stay protected, as sophisticated attacks can easily be copied by teams that lack the knowledge to construct such an attack on their own.

All endpoints on which Cyvera TRAPS was installed were (and are) protected from the CVE-2014-1776 attack: TRAPS stops this in-the-wild exploitation attempt at several different points. Since TRAPS does not rely on signatures or behaviors but on breaking the attacker’s core techniques, TRAPS stops even zero-day attacks (including this one) without any need for updates. Of course, TRAPS users were also protected from the CVE-2014-0322 attack.

[Source: Palo Alto Networks Research Center]

Best Practices for Defending Against APTs

Advanced persistent threats (APTs) have changed the world of enterprise security and how networks and organizations are attacked. In a new Palo Alto Networks eBook, Cybersecurity for Dummies, we explore:

  • The cybersecurity landscape and why traditional security solutions fail
  • What next-generation security brings to the fight
  • Ten best practices for controlling APTs

Head to our Cybersecurity for Dummies landing page and request your free copy today!


Highlights from the NIST Privacy Engineering Workshop

In April, I presented at and attended the NIST Privacy Engineering Workshop on behalf of ISACA.

Throughout two days of sessions, attendees explored the Fair Information Practice Principles, privacy/technology research efforts, and the need to address privacy risks—to consider privacy from the planning stage of projects and close the longstanding communications gap between legal and engineering areas.

We joined breakout sessions to discuss the frameworks engineers use, explore privacy case studies, and determine ways in which engineering methods can address privacy risks. On day two of the event we focused on drone use, which prompted some lively, thought-provoking discussions.

My takeaways from the workshop:

  • Huge gaps in communication between the engineering areas and legal/policy areas need to be closed. Each group needs to listen to the other when it comes to privacy discussions. Each side has much to learn from the experiences of the other.
  • Privacy engineering is much more than a policy issue and much more than just getting software or systems to meet existing legal requirements for data protection. Because those laws and regulations were created reactively, they will always lag behind a significant number of new and emerging privacy risks. Engineers will be key in mitigating those privacy risks through an effective privacy-engineering framework and a catalog of vetted, reasonable privacy use cases.
  • Engineers already have frameworks they have used for many years to build software and systems. Instead of trying to get them to use something completely different, efforts should be made to establish privacy standards that are integrated within these established frameworks, written in language appropriate for engineers.
  • Privacy engineering is not just for large organizations. There are many small and mid-size organizations that create software and systems; they must also know how to engineer privacy into their products. Often there is an even greater need for such organizations to practice privacy engineering for all the software and systems they create.

Naomi Lefkovitz, NIST senior privacy policy advisor who presided over the two-day event, indicated that NIST plans to produce a report based on the information, recommendations and comments collected during the workshop. NIST will host further workshops to refine what will likely become the privacy portion of the Cybersecurity Framework.

I found this workshop beneficial: an important first step toward identifying actionable privacy standards to include within the Cybersecurity Framework, which engineers will be able to effectively utilize within their current frameworks to build in the (currently missing) controls needed to protect privacy.

Rebecca Herold, CISM, CISA, CISSP, CIPP/US, CIPP/IT, CIPM, FLMI
CEO, The Privacy Professor®

[Source: ISACA]
