Author Archives: AdminDCS


Up and to the Right, NetApp Moves Up in 2017 Gartner Magic Quadrant for Solid State Arrays

Category : NetApp

Gartner recently released its July 2017 Magic Quadrant for Solid State Arrays (SSA). NetApp has improved its position, moving up and to the right in the Leaders Quadrant. This past year, customers drove NetApp market share gains to new heights. Gartner's May 2017 Market Share Analysis: SSDs and Solid-State Arrays, Worldwide, 2016 shows NetApp share growth at more than triple that of the overall market in 2016, moving NetApp into the #2 market share position:

Source: Gartner (May 2017). *Market Share Analysis reports provide qualitative insight, essentially the "why" behind the figures.

Not a bad year for recognition of the value that NetApp is providing to customers in this arena.  Now let’s look ahead.  What can customers expect from NetApp and how might leadership criteria evolve in the future?

Innovating beyond media and the array

NetApp will continue to lead in providing flash-media-based solutions that drive peak performance and efficiency for legacy and emerging applications and that power the build-out of cloud-like, next-generation data centers. Customers should expect NetApp to draw on strong innovation and deep supplier relationships to deliver timely, nondisruptive upgrades to the next media wave. For example, NVMe-over-Fabrics (NVMe-oF) and storage class memory (SCM) technologies bring order-of-magnitude improvements in throughput, latency, and efficiency. Learn more about our efforts and innovation here.

While media-based innovation is exciting, we believe customers expect strategic vendors to think beyond media and the array. Digital transformation is high on the C-suite priority stack, with most organizations seeking to use data to optimize operations, enable new customer touch points, and drive new revenue streams. SSA vendors who can help customers achieve these larger goals by better managing their data across on-premises and public cloud environments will provide significantly more value as strategic partners. That is where NetApp will continue to stand out. The NetApp SSA portfolio integrates with a full ecosystem of hybrid cloud data services enabled by the NetApp Data Fabric.

The leadership criteria of the future

NetApp's vision for data management is a data fabric that seamlessly connects a customer's increasingly distributed, diverse, and dynamic data environment. NetApp Data Fabric is an architecture that provides data management capabilities and services spanning a choice of endpoints and ecosystems connected across premises and the cloud to accelerate digital transformation. The Data Fabric delivers consistent and integrated data management services for data visibility and insights, data access and control, and data protection and security. Here are some of the capabilities that are integrated with our SSA portfolio today:

  • Pay-as-you-go backup and disaster recovery with public cloud
  • Automated policy-based tiering of cold data to public cloud
  • Automated data sync with public cloud to leverage cloud analytics
  • Performance, capacity and cost monitoring across premises and public cloud

Solving data management challenges to achieve digital transformation

NetApp will continue to lead when it comes to developing cutting-edge technology and capabilities in our SSA offerings. However, it will take much more to be a strategic vendor in the digital era. Going forward, I predict that integrated hybrid cloud data services and their broader ability to accelerate a customer's digital transformation agenda will increasingly differentiate the leaders in all infrastructure segments. Ask your SSA vendor how they plan to meet the emerging criteria of a strategic infrastructure vendor with integrated hybrid cloud data services.

Gartner Disclaimer:
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Gartner Market Share Analysis: SSDs and Solid-State Arrays, Worldwide, 2016, Joseph Unsworth and John Monroe, 2 May 2017

Source: https://newsroom.netapp.com/blogs/up-and-to-the-right-netapp-moves-up-in-2017-gartner-magic-quadrant-for-solid-state-arrays/

Author: Brett Roscoe



Kryptowire Integrates with MobileIron to Provide Automated Mobile Application Security and Compliance Monitoring to Enterprise Clients

Category : Mobile Iron

The partnership will enable CIOs and IT security professionals to leverage the integrated power of MobileIron’s policy and configuration engine with Kryptowire’s continuous mobile application vetting and proactive remediation to enforce ironclad security for their mobile enterprise.


“We welcome Kryptowire to our growing marketplace of mobile application security partners,” said John Morgan, Vice President of Product and Ecosystem, MobileIron. “With MobileIron and Kryptowire, CIOs can be confident that employee devices and applications are compliant with the highest Federal and internationally-recognized mobile security standards.”

A single vulnerability on an employee-owned or provisioned smart device may result in irreparable damage to an organization. Compliance becomes a complex challenge when thousands of employees use hundreds of internally-developed and third-party applications. For the mobile enterprise, manual security and privacy monitoring is error-prone and costly.

Kryptowire meets this challenge by providing automated mobile security and privacy testing for employees’ mobile applications against the highest internationally-recognized standards used for classified and national security systems. Now this military-grade technology is available and integrated with the MobileIron platform.

Meet Kryptowire at Black Hat USA in Las Vegas on July 26, and get a free security scan for one app of your choice.

Kryptowire proactively monitors mobile and IoT vulnerabilities in applications, regardless of who developed them, on employee-owned or provisioned devices. Key features include:

  • Automated analysis of users’ Android and iOS mobile apps without requiring access to the source code.
  • Continuous monitoring of the security of all enterprise mobile apps and devices against the highest internationally-recognized software assurance standards published by the National Institute of Standards and Technology (NIST) and the National Information Assurance Partnership (NIAP).
  • Leveraging the latest mobile threat intelligence to test mobile apps on mobile devices for enterprise employees.
  • Pass/fail evidence down to the line of code to assure transparent and high-confidence results.
  • Proactive remediation that includes whitelisting or blacklisting applications, notifying the end user, or even removing non-compliant assets to protect enterprise resources and data.
  • Enforcement of compliance with enterprise-wide privacy and security policies.

“Discovering and proactively monitoring mobile app vulnerabilities is a top priority for today’s mobile enterprise,” said Angelos Stavrou, the CEO of Kryptowire. “Kryptowire is honored to offer its military-grade, automated mobile security solution to MobileIron customers and to uphold the same high standard of service that has long been trusted by U.S. civilian and military agencies.”

 

Source:  http://www.businesswire.com/news/home/20170720005844/en

 



Finding the Value of ‘Intangibles’ in Business

Category : Palo Alto

We modeled the Cybersecurity Canon after the Baseball and Rock & Roll Halls of Fame, except for cybersecurity books. We have more than 25 books on the initial candidate list, but we are soliciting help from the cybersecurity community to grow that number much further. Please write a review and nominate your favorite.

The Cybersecurity Canon is a real thing for our community. We have designed it so that you can directly participate in the process. Please do so!

Book review by Canon Committee Member, Rick Howard: “How to Measure Anything: Finding the Value of ‘Intangibles’ in Business” (2011), by Douglas W. Hubbard.

Executive Summary

Douglas Hubbard’s “How to Measure Anything: Finding the Value of ‘Intangibles’ in Business” is an excellent candidate for the Cybersecurity Canon Hall of Fame. He describes how it is possible to collect data to support risk decisions for even the hardest kinds of questions. He says that network defenders do not need 100 percent accuracy in our models to help support these risk decisions; we can strive simply to reduce our uncertainty about ranges of possibilities. He writes that this particular view of probability is called Bayesian, and that it was out of favor within the statistical community until just recently, when it became obvious that it worked for a certain set of really hard problems. He describes a few simple math tricks that all network defenders can use to make predictions about risk decisions for our organizations. He even demonstrates how easy it is for network defenders to run our own Monte Carlo simulations using nothing more than a spreadsheet. Because of all of that, “How to Measure Anything: Finding the Value of ‘Intangibles’ in Business” is indeed a Cybersecurity Canon Hall of Fame candidate, and you should have read it by now.

Introduction

The Cybersecurity Canon project is a “curated list of must-read books for all cybersecurity practitioners – be they from industry, government or academia — where the content is timeless, genuinely represents an aspect of the community that is true and precise, reflects the highest quality and, if not read, will leave a hole in the cybersecurity professional’s education that will make the practitioner incomplete.” [1]

This year, the Canon review committee inducted this book into the Canon Hall of Fame: “How to Measure Anything in Cybersecurity Risk,” by Douglas W. Hubbard and Richard Seiersen. [2] [3]

According to Canon Committee member Steve Winterfeld, “How to Measure Anything in Cybersecurity Risk” is an extension of Hubbard’s successful first book, “How to Measure Anything: Finding the Value of ‘Intangibles’ in Business.” “It lays out why statistical models beat expertise every time. It is a book anyone who is responsible for measuring risk, developing metrics, or determining return on investment should read. It provides a strong foundation in qualitative analytics with practical application guidance.” [4]

I personally believe that precision risk assessment is a key, and currently missing, element in the CISO’s bag of tricks. As a community, network defenders, in general, are not good at transforming technical risk into business risk for the senior leadership team. For my entire career, I have gotten away with listing the 100+ security weaknesses within my purview and giving them a red, yellow, or green label to mean bad, kind-of-bad, or not bad. If any of my bosses had bothered to ask me why I gave one weakness a red label vs. a green label, I would have said something like: “25 years of experience…blah, blah, blah…trust me…blah, blah, blah…can I have the money, please?”

I believe the network defender’s inability to translate technical risk into business risk with precision is the reason that the CISO is not considered at the same level as other senior C-suite executives, such as the CEO, CFO, CTO, and CMO. Most of those leaders have no idea what the CISO is talking about. For years, network defenders have blamed these senior leaders for not being smart enough to understand the significance of the security weaknesses we bring to them. But I assert that it is the other way around. The network defenders have not been smart enough to convey the technical risks to business leaders in a way they might understand.

This CISO inability is the reason that the Canon Committee inducted “How to Measure Anything in Cybersecurity Risk,” and another precision risk book called “Measuring and Managing Information Risk: A FAIR Approach” into the Canon Hall of Fame. [5][4][3][6][7]. These books are the places to start if you want to educate yourself on this new way of thinking about risk to the business.

For me though, this is not an easy subject. I slogged my way through both of these books because basic statistical models completely baffle me. I took stats courses in college and grad school but squeaked through them by the skin of my teeth. All I remember about stats is that it was hard. When I read these two books, I think I understood only about three-quarters of what I was reading, not because they were written badly, but because I struggled with the material. I decided to get back to basics and read Hubbard’s original book that Winterfeld referenced in his review, “How to Measure Anything: Finding the Value of ‘Intangibles’ in Business,” to see if it was also Canon-worthy.

The Network Defender’s Misunderstanding of Metrics, Risk Reduction and Probabilities

Throughout the book, Hubbard emphasizes that seemingly dense and complicated risk questions are not as hard to measure as you might think. Drawing on twentieth-century scholars like Edward Lee Thorndike and Paul Meehl, he reasons through Clarification Chains:

If it matters at all, it is detectable/observable.
If it is detectable, it can be detected as an amount (or range of possible amounts).
If it can be detected as a range of possible amounts, it can be measured. [8]

As a network defender, whenever I think about capturing metrics that will inform how well my security program is doing, my head begins to hurt. Oh, there are many things that we could collect – like outside IP addresses hitting my infrastructure, security control logs, employee network behavior, time to detect malicious behavior, time to eradicate malicious behavior, how many people must react to new detections, etc. – but it is difficult to see how that collection of potential badness demonstrates that I am reducing material risk to my business with precision. Most network defenders in the past, including me, have simply thrown our hands up in surrender. We seem to say to ourselves that if we can’t know something with 100 percent accuracy, or if there are countless intangible variables with many veracity problems, then it is impossible to make any kind of accurate prediction about the success or failure of our programs.

Hubbard makes the point that we are not looking for 100 percent accuracy. What we are really looking for is a reduction in uncertainty. He says that the concept of measurement is not the elimination of uncertainty but the abatement of it. If we can collect a metric that helps us reduce that uncertainty, even if it is just by a little bit, then we have improved our situation from not knowing anything to knowing something. He says that you can learn something from measuring with very small random samples of a very large population. You can measure the size of a mostly unseen population. You can measure even when you have many, sometimes unknown, variables. You can measure the risk of rare events. Finally, Hubbard says that you can measure the value of subjective preferences, like art or free time, or of life in general.

According to Hubbard, “We quantify this initial uncertainty and the change in uncertainty from observations by using probabilities.” [8] These probabilities refer to our uncertainty state about a specific question. The math trick that we all need to understand is allowing for ranges of possibilities within which we are 90 percent sure the true value lies.

For example, we may be trying to reduce the number of humans who have to respond to a cyberattack. In this fictitious example, last year the Incident Response team handled 100 incidents with three people each – a total of 300 people. We think that installing a next-generation firewall will reduce that number. We don’t know exactly how many, but some. We start here to bracket the question.

Do we think that installing the firewall will eliminate the need for all humans to respond? Absolutely not. What about reducing the number to three incidents with three people for a total of nine? Maybe. What about reducing the number to 10 incidents with three people for a total of 30? That might be possible. That is our lower limit.

Let’s go to the high side. Do you think that installing the firewall will have zero impact on reducing the number? No. What about 90 attacks with three people for a total of 270? Maybe. What about 85 attacks with three people for a total of 255? That seems reasonable. That is our upper limit.

By doing this bracketing we can say that we are 90 percent sure that installing the next-generation firewall will reduce the number of humans who have to respond to cyber incidents from 300 to between 30 and 255. Astute network defenders will point out that this range is pretty wide. How is that helpful? Hubbard says that first, you now know this, where before you didn’t know anything. Second, this is the start. You can now collect other metrics, perhaps, that might help you reduce the gap.
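If it helps to see the bracketing written down, here is a tiny Python sketch of the same exercise. It is purely my restatement of the numbers above, with the incident counts as the knobs you would adjust during the absurdity test:

```python
PEOPLE_PER_INCIDENT = 3

def responders(incidents: int) -> int:
    """Total people who must respond across a year's worth of incidents."""
    return incidents * PEOPLE_PER_INCIDENT

baseline = responders(100)  # last year: 100 incidents -> 300 people
lower = responders(10)      # "might be possible"      -> 30 people
upper = responders(85)      # "seems reasonable"       -> 255 people

print(f"90% confident the firewall cuts responders from {baseline} "
      f"to between {lower} and {upper}")
```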

The History of Scientific Measurement Evolution

This particular view of probabilities, the idea that there is a range of outcomes that you can be 90 percent sure about, is the Bayesian interpretation of probabilities. Interestingly, this view of statistics had been out of favor almost since its inception, when Thomas Bayes penned the original formula back in the 1740s. The naysayers were the Frequentists, whose theory holds that the probability of an event can only be determined by how many times it has happened in the past. To them, modern science requires both objectivity and precise answers. According to Hubbard:

“The term ‘statistics’ was introduced by the philosopher, economist, and legal expert Gottfried Achenwall in 1749. He derived the word from the Latin statisticum, meaning ‘pertaining to the state.’ Statistics was literally the quantitative study of the state.” [8]

In the Frequentist view, the Bayesian philosophy requires a measure of “belief and approximations. It is subjectivity run amok, ignorance coined into science.” [7] But the real world has problems where the data is scant, and leaders worry about potential events that have never happened before. Bayesians were able to provide real answers to these kinds of problems, like the defeat of the Enigma encryption machine in World War II and the finding of a lost, sunken nuclear submarine, a search that was the basis for the movie “The Hunt for Red October.” But it wasn’t until the early 1990s that the theory became commonly accepted. [7]

Hubbard walks the reader through this historical research about the current state of scientific measurement. He explains how Paul Meehl, in the mid-1900s, demonstrated time and again that statistical models outperformed human experts. He describes the birth of information theory with Claude Shannon in the late 1940s and credits Stanley Smith Stevens, around the same time, with crystallizing different scales of measurement, from sets to ordinals to ratios and intervals. He reports how Amos Tversky and Daniel Kahneman, through their research in the 1960s and 1970s, demonstrated that we can improve our measurements around subjective probabilities.

In the end, Hubbard defines “measurement” as this:

  • Measurement: A quantitatively expressed reduction of uncertainty based on one or more observations. [8]

Simple Math Tricks

Hubbard explains two math tricks that at first seem too good to be true but, when used by Bayesian proponents, greatly simplify measurement-taking for difficult problems (a quick simulation of the first appears after the list):

  • The Power of Small Samples: The Rule of Five: There is a 93.75% chance that the median of a population is between the smallest and largest values in any random sample of five from that population. [8]
  • The Single Sample Majority Rule (i.e., The Urn of Mystery Rule): Given maximum uncertainty about a population proportion – such that you believe the proportion could be anything between 0% and 100% with all values being equally likely – there is a 75% chance that a single randomly selected sample is from the majority of the population. [8]
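The Rule of Five is easy to sanity-check empirically. Below is a short Python simulation, my own illustration rather than anything from the book, that draws five-item samples from a population and counts how often the true median lands between the sample's smallest and largest values. The 93.75% figure comes from the fact that the only misses are all five draws landing above the median or all five landing below, each with probability (1/2)^5, so 1 - 2/32 = 0.9375.

```python
import random

# Empirical check of Hubbard's "Rule of Five": the population median
# falls between the smallest and largest values of a random sample of
# five about 93.75% of the time.
population = list(range(10_001))  # any population works; its median is 5000
median = 5_000

trials, hits = 100_000, 0
for _ in range(trials):
    sample = random.sample(population, 5)
    if min(sample) <= median <= max(sample):
        hits += 1

print(f"observed: {hits / trials:.4f}   theory: 0.9375")
```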

I admit that the math behind these rules escapes me. But I don’t have to understand the math to use the tools. It reminds me of a moving scene from one of my favorite movies: “Lincoln.” President Lincoln, played brilliantly by Daniel Day-Lewis, discusses his reasoning for keeping the southern agents – who want to discuss peace before the 13th Amendment is passed – away from Washington.

“Euclid’s first common notion is this: Things which are equal to the same thing are equal to each other. That’s a rule of mathematical reasoning. It’s true because it works. Has done and always will do.” [9]

The bottom line is that “statistically significant” does not mean a large number of samples. Hubbard says that statistical significance has a precise mathematical meaning that most lay people do not understand and many scientists get wrong most of the time. For the purposes of risk reduction, stick to the idea of a 90 percent confidence interval regarding potential outcomes. The Power of Small Samples and the Single Sample Majority Rule are rules of mathematical reasoning that all network defenders should keep handy in their utility belts as they measure risk in their organizations.

Simple Measurement Best Practices and Definitions

As I said before, most network defenders think that measuring risk in terms of cybersecurity is too hard. Hubbard explains four rules of thumb that every practitioner should consider before giving up:

  • It’s been measured before.
  • You have far more data than you think.
  • You need far less data than you think.
  • Useful, new observations are more accessible than you think. [8]

He then defines “uncertainty” and “risk” through the lens of possibilities and probabilities:

Uncertainty:

The lack of complete certainty, that is, the existence of more than one possibility.

Measurement of Uncertainty:

A set of probabilities assigned to a set of possibilities.

Risk:

A state of uncertainty where some of the possibilities involve a loss, catastrophe, or other undesirable outcome.

Measurement of Risk:

A set of possibilities each with quantified probabilities and quantified losses.  [8]

In the network defender world, we tend to define risk in terms of threats, vulnerabilities, and consequences. [10] Hubbard’s relatively new take gives us a much more precise way to think about these terms.

Monte Carlo Simulations

According to Hubbard, the invention of the computer made it possible for scientists to run thousands of experimental trials based on probabilities for inputs. These trials are called Monte Carlo simulations. In the 1930s, Enrico Fermi used the method to calculate neutron diffusion by hand, with human mathematicians calculating the probabilities. In the 1940s, Stanislaw Ulam, John von Neumann, and Nicholas Metropolis realized that the computer could automate the Monte Carlo method and help them design the atomic and hydrogen bombs. Today, everybody who has access to a spreadsheet can run their own Monte Carlo simulations.

For example, take my previous example of trying to reduce the number of humans who have to respond to a cyberattack. We said that, during the previous year, 300 people responded to a cyberattack. We said that we were 90 percent certain that the installation of a next-generation firewall would result in a reduction in the number of humans who have to respond to an incident to between 30 and 255.

We can refine that number even more by simulating hundreds or even thousands of scenarios inside a spreadsheet. I did this myself by setting up 100 scenarios in which I randomly picked a number between 0 and 300. I calculated the mean to be 131 and the standard deviation to be 64. Remember that the standard deviation is nothing more than a measure of spread from the mean. [11][12][13] The 68-95-99.7 rule says that 68 percent of the recorded values will fall within one standard deviation of the mean, 95 percent within two standard deviations, and 99.7 percent within three. [8] With our original estimate, we said there was a 90 percent chance that the number is between 30 and 255. After running the Monte Carlo simulation, we can say that there is a 68 percent chance that the number is between 67 and 195.
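If a spreadsheet is not handy, the same experiment takes only a few lines of Python. This is a sketch of the procedure described above rather than the author's actual spreadsheet, and because the draws are random, each run's mean and standard deviation will come out somewhat different from his:

```python
import random
import statistics

# 100 scenarios, each a uniform random guess between 0 and 300 for the
# number of people who must respond to incidents after the firewall.
scenarios = [random.uniform(0, 300) for _ in range(100)]

mean = statistics.mean(scenarios)
stdev = statistics.stdev(scenarios)

# By the 68-95-99.7 rule, roughly 68 percent of values fall within one
# standard deviation of the mean.
print(f"mean: {mean:.0f}  stdev: {stdev:.0f}")
print(f"68% interval: {mean - stdev:.0f} to {mean + stdev:.0f}")
```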

How about that? Even a statistical luddite like me can run his own Monte Carlo simulation.

Conclusion

After reading Hubbard’s second book in the series, “How to Measure Anything in Cybersecurity Risk,” I decided to go back to the original to see if I could understand with a bit more clarity exactly how the statistical models worked and to determine if the original was Canon-worthy too. I learned that there was probably a way to collect data to support risk decisions for even the hardest kinds of questions. I learned that we network defenders do not need 100 percent accuracy in our models to help support these risk decisions; we can strive simply to reduce our uncertainty about ranges of possibilities. I learned that this particular view of probability is called Bayesian, and that it was out of favor within the statistical community until just recently, when it became obvious that it worked for a certain set of really hard problems. I learned that there are a few simple math tricks that we can all use to make predictions about these really hard problems that will help us make risk decisions for our organizations. And I even learned how to build my own Monte Carlo simulations to support those efforts. Because of all of that, “How to Measure Anything: Finding the Value of ‘Intangibles’ in Business” is indeed Canon-worthy, and you should have read it by now.

Source: https://researchcenter.paloaltonetworks.com/2017/07/cybersecurity-canon-measure-anything-finding-value-intangibles-business/

Author: Rick Howard



Secure Access Solutions for Mobile, Cloud and Internet of Things – Latest Release

Category : Pulse Secure

Embrace the latest cloud, mobile and IoT technologies with Secure Access. Learn how Pulse Secure’s latest features and capabilities make it simple to securely roll out new end-user services to support the latest IT transformation without compromising security compliance or taxing your IT team.

Last year we delivered over 250 new product features. Learn about the latest features in:

  • Connect Secure
  • Policy Secure
  • Pulse Client

All are now available in the Pulse Access Suite, which makes planning, purchasing, and deploying a snap. We’ve assembled our product owners to tell you what’s new, so be sure to join and drill down with the experts.

Listen NOW!

Presenters:

Phil Montgomery – Vice President of Marketing 
With 20+ years in enterprise solutions, Phil leads Corporate Marketing, as well as Product and Solutions Marketing, at Pulse Secure. Prior to joining Pulse Secure, he served in executive product management roles at Identiv, Inc., VMware, and Citrix Systems. He is a graduate of the University of Southern Queensland with a Bachelor of Business degree in operations management and end-user computing.

Prashant Batra – Director of Product Management 
Building mobile and cloud products for the past 10 years, Prashant is responsible for Pulse Secure’s SaaS offerings for management, mobile, and cloud. Previously, he held product management and engineering roles at Citrix and Conexant. He has a Master’s in Embedded Systems Design.

Ganesh Nakhawa – Senior Product Manager for Pulse Policy Secure
With over 16 years of security and networking experience, Ganesh has held various product management, product marketing, and engineering roles at companies such as MOCANA, Bradford Networks, AFL, Nortel Networks, and Cabletron. Ganesh has an M.S. in Telecommunication from Boston University and an M.B.A. from Babson College.

Listen NOW!



Gigamon provides visibility, intelligence and security information for Cisco and other key platform users

Category : Gigamon

“Gigamon is a company that is in the business of visibility,” says Andy Huckridge, Director of Service Provider Solutions and SME at Gigamon. “The truth is in the packets.”

In this podcast, we learn how Gigamon is delivering visibility in a diverse set of models and platforms, including cloud-based subscriber models.

We hear about Gigamon products directed at both the enterprise and the service provider. Huckridge also gives us a look at GigaSECURE, the industry’s first Security Delivery Platform. Security was a major topic among attendees visiting the Gigamon booth.

The GigaVUE-HC3 visibility node enables comprehensive traffic and security intelligence at scale to see more, secure more, and expand your security and monitoring infrastructure.

Gigamon provides active visibility into physical and virtual network traffic, enabling stronger security and superior performance. The Gigamon Visibility Fabric™ and GigaSECURE®, the industry’s first Security Delivery Platform, deliver advanced intelligence so that security, network, and application performance management solutions in enterprise, government, and service provider networks operate more efficiently and effectively.

PODCAST

Source:  https://telecomreseller.com/2017/07/06/podcast-gigamon-provides-visibility-intelligence-and-security-information-for-cisco-and-other-key-platform-users/

 



New GDPR-Focused Media Hub Launched By IDG/CIO and Hewlett Packard Enterprise

Category : HP Security

Do you have questions regarding the pending enforcement of the European Union’s General Data Protection Regulation (GDPR) and its impact on your business? If so, look no further: GDPR & Beyond launched this week. GDPR & Beyond is a new online media hub developed for information governance and security professionals looking to understand more about GDPR and how it is going to impact a company’s collection, maintenance, and protection of its customers’ data.

GDPR’s reach is extensive in that it applies not only to EU companies but also to multinational organizations that collect the personal data of EU citizens. GDPR mandates tighter and deeper governance, data security and data privacy to ensure the adequate protection of the fundamental rights and freedoms of EU citizens with regard to their personal data.

The website, sponsored by Hewlett Packard Enterprise (HPE), features insightful articles, interviews and videos from an experienced and knowledgeable editorial team at IDG/CIO Magazine, with key inputs for selected content from HPE subject matter experts including David Kemp – specialist business consultant, Tim Grieveson – chief cybersecurity strategist, and Sudeep Venkatesh – global head of pre-sales for HPE Data Security.

Below is a sample of the questions addressed by the interactive content on the website:

  • How can I find the information and personal data that will fall under these regulations?
  • How can I cost effectively respond to legal matters requiring information under my management?
  • How can I protect, store and securely back up personal data?
  • What types of data protection technologies can help to secure data without breaking business processes?
  • How can I identify information for disposition in accordance with the “right to be forgotten?”
  • Can I report a breach within the timeline required by the EU data protection regulations?
  • How can I reduce my overall risk profile?

GDPR & Beyond aims to foster discussion and idea exchange around the topics of how IT and the lines of business must collaborate to drive GDPR compliance by the May 25, 2018 effective date. Included in the content will be an assortment of educational, thought-leading and opinion-based articles that discuss how organizations’ efforts to comply enable them to become more efficient in their use of data and their ability to mitigate risk.

More content will continue to be posted to the GDPR & Beyond site, adding to the highly valuable articles already available.

Visit GDPR & Beyond today to learn more about how to prepare for GDPR.

Source: https://www.voltage.com/gdpr/new-gdpr-focused-media-hub-launched-idgcio-hewlett-packard-enterprise/

Author: Lori Hall



Uncover Sensitive Data with the Classifier Tool

Category : Imperva

Understanding what sensitive data resides in your enterprise database is a critical step in securing your data. Imperva offers Classifier, a free data classification tool that allows you to quickly uncover sensitive data in your database.

Classifier contains over 250 search rules for popular enterprise databases such as Oracle, Microsoft SQL Server, SAP Sybase, IBM DB2 and MySQL, and supports multiple platforms, including Windows, Mac and Linux. Once you download and install Classifier, you can start discovering sensitive data in your database, such as credit card numbers, person IDs (ID-type elements associated with a person, such as user name, user ID, and employee ID), access codes and more. The tool also jumpstarts you on your road to compliance with the General Data Protection Regulation (GDPR) as well as data security. This post will walk you through the steps of using the tool.
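To make the idea of a search rule concrete before we dive in, here is a toy sketch of metadata-based classification. It is purely illustrative, my own simplification rather than Imperva's actual rule engine: column names are matched against per-category patterns, which is the general shape of a metadata scan.

```python
import re

# Illustrative only -- not Imperva's actual rule engine. A metadata
# scan can be approximated by matching column names against
# per-category patterns.
RULES = {
    "Credit Card": re.compile(r"(card|cc)_?(num|number)", re.I),
    "Person ID":   re.compile(r"(user|employee)_?(name|id)", re.I),
    "Access Code": re.compile(r"(password|passwd|access_?code)", re.I),
}

def classify(columns):
    """Return {category: [matching column names]} for a list of columns."""
    hits = {}
    for col in columns:
        for category, pattern in RULES.items():
            if pattern.search(col):
                hits.setdefault(category, []).append(col)
    return hits

print(classify(["employee_id", "cc_number", "passwd", "comments"]))
```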

First, you need to meet the prerequisites listed in the Classifier User Guide. Then you can begin your scan, view the results, and evaluate corrective action options. Let’s get started.

Running a Scan

Running a Classifier scan is a simple, four-step process.

  1. Open Classifier.
  2. Select your database type from the drop-down list. (Options include Oracle, Microsoft SQL Server, SAP Sybase, IBM DB2, and MySQL.)
  3. Enter details for the selected database, as follows (see Figure 1):
    • Host/IP
    • Port (or use default Port)
    • Schema – a collection of database objects (e.g., tables) associated with one particular database user name
    • User Name
    • Password
    • Database Name / Instance / SID

NOTE: Microsoft SQL Server supports Windows Authentication, which is enabled by default. To disable it and manually enter a User Name and Password, click the Authentication button next to the User Name field, then enter the appropriate User Name and Password (see Figure 2).

  4. Click Go to start the scan. The scan will run without the database experiencing any downtime or performance degradation.


Figure 1: Set scan parameters in Classifier


Figure 2: Disable Windows Authentication

Review the Results

The results of the scan are presented on an easy-to-read dashboard (see Figure 3).


Figure 3: Classifier executive summary dashboard [click to enlarge]

The dashboard is organized into three panes:

Top Pane — Displays an executive summary of the sensitive data contained within your database as well as an indication of the amount of sensitive data present.

  • Number of sensitive data categories detected
  • Total amount of sensitive data found
  • Time to complete the scan

Middle Pane — Displays summary statistics that include:

  • Ratio of sensitive/non-sensitive database columns
  • Data Classification Results — Different categories of sensitive data found, such as personal identification number, mailing address, access codes, etc.
  • Ratio of each sensitive data category

Bottom Pane — Displays Classification Details, organized into a sortable table with the following columns (see Figure 4 for a larger view):

  • Category — Displays type of sensitive data
  • Table Count
  • Column Count
  • Row Count

In the example above, there are a total of 30 columns of sensitive data, which account for 11% of the scanned database. Among the sensitive data found, 7% are access codes, 20% free text, 10% person IDs, and 49% person names. When you look at the classification details, you can find the actual counts under each category.

To better understand which schema, tables, and columns are contained within each category, you can click on a category row under the Classification Details section to expand the content. You can drill down into details of a specific category, including row counts associated with each schema, table and column identified by the scan (see Figure 4).


Figure 4: Category detail example showing a total of 2 tables that contain 17 rows of person ID data [click to enlarge].

Next Steps

Now that you’ve identified what sensitive data resides in your database, you can take appropriate actions, such as data monitoring or data masking, to further secure your data. It’s easy to use Classifier to quickly uncover sensitive data that may be at risk within your organization. While this free tool searches database metadata, our enterprise data security products provide additional capabilities, such as database content searching, reporting and export functionality.

Source: https://www.imperva.com/blog/2017/07/uncover-sensitive-data-with-the-classifier-tool/?utm_source=linkedIn&utm_medium=organic&utm_campaign=2017_Q3_classifier

Author: Sara Pan



McAfee June Threats Report

Category : McAfee

Malware evasion techniques and trends

Malware developers began experimenting with ways to evade security products in the 1980s, when a piece of malware defended itself by partially encrypting its own code, making the content unreadable by security analysts. Today, there are hundreds, if not thousands, of anti-security, anti-sandbox, and anti-analyst evasion techniques employed by malware authors. In this Key Topic, we examine some of the most powerful evasion techniques, the robust dark market for off-the-shelf evasion technology, how several contemporary malware families leverage evasion techniques, and what to expect in the future, including machine-learning evasion and hardware-based evasion.

Hiding in plain sight: The concealed threat of steganography

Steganography has been around for centuries. From the ancient Greeks to modern cyberattackers, people have hidden secret messages in seemingly benign objects. In the digital world, those messages are most often concealed in images, audio tracks, video clips, or text files. Attackers use digital steganography to pass information past security systems without detection. In this Key Topic, we explore the very interesting field of digital steganography. We cover its history, common methods used to hide information, its use in popular malware, and how it is morphing into network steganography. We conclude by providing policies and procedures to protect against this form of attack.
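For a flavor of how image-based hiding works, here is a toy sketch of least-significant-bit (LSB) steganography, one of the most common digital methods. It is my own illustration, not code from the report: each of eight pixel values gives up its lowest bit to carry one bit of a secret byte, a change far too small to see.

```python
# Toy LSB steganography: hide one byte in the lowest bits of 8 pixels.
def hide_byte(pixels, byte):
    """Embed one byte, LSB-first, into the low bits of eight pixel values."""
    bits = [(byte >> i) & 1 for i in range(8)]
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def recover_byte(pixels):
    """Read the byte back out of the eight low bits."""
    return sum((p & 1) << i for i, p in enumerate(pixels[:8]))

cover = [200, 201, 198, 197, 203, 202, 199, 200]  # hypothetical pixel values
stego = hide_byte(cover, ord("A"))
assert chr(recover_byte(stego)) == "A"
print(cover, "->", stego)
```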

The growing danger of Fareit, the password stealer

People, businesses, and governments increasingly depend on systems and devices that are protected only by passwords. Often, these passwords are weak or easily stolen, creating an attractive target for cybercriminals. We dissect Fareit, the most famous password-stealing malware. In this Key Topic, we cover its origin in 2011 and how it has changed since then; its typical infection vectors; its architecture, inner workings, and stealing behavior; how it evades detection; and its role in the Democratic National Committee breach before the 2016 U.S. presidential election. We also offer practical advice on avoiding infection by Fareit and other password stealers.

McAfee Labs Threats Report June 2017



NSS Labs NGFW Test: Is your firewall a top performer?

Category : Forcepoint

NSS Labs leads the industry in third-party assessments of Next Generation Firewalls (NGFWs), releasing the results of its stringent testing each year. And this year NSS expanded its testing, so the results may not be what you expect.

Join Thomas Skybakmoen, Distinguished Research Director for NSS Labs, as he shares the latest results and what distinguishes the top performers in this exclusive webcast.

And see where your current NGFW solution ranks for:

  • Live and continuous threat protection
  • Low total cost of ownership
  • Ease of deployment and management

Learn why Forcepoint NGFW products have earned so much acclaim across the industry, and why NSS Labs recommends the Forcepoint NGFW for every company’s short list.

Register for the webcast now.



Government Moves to the Cloud – FireEye Government Email Threat Prevention Receives FedRAMP Authorization

Category : FireEye

Given recent high-profile incidents, cyber security has quickly risen to the top of the priority list for many organizations, including governments. The U.S. government has increased its annual cyber security budget by 35 percent, going from $14 billion budgeted in 2016 to $19 billion in 2017, and reports indicate that governments around the globe are expected to double down on cyber protection this year.

As with many organizations these days, government information technology and security is migrating to the cloud. This is largely driven by the federal CIO’s “Cloud First” mandate, which specifies that agencies consider cloud services first in procurements when they meet security, reliability and cost requirements. Cloud-delivered products can significantly reduce cost and complexity compared to on-premises products, so it’s no surprise that cloud services are coming out on top.

Email Migration to the Cloud

As government and public education entities migrate to Office 365, Google Mail or other solutions for their primary email management service, they’re also looking for email security that delivers advanced threat protection, and this requires a service that is FedRAMP authorized.

FireEye Government Email Threat Prevention (ETP), an email security service focused on advanced threat protection, was granted an Authority to Operate (ATO) by the U.S. Department of the Interior (DOI) on April 26, 2017 and achieved a FedRAMP Authorization on July 5, 2017.

Government ETP enables government entities to save time and money as they add email security for advanced threat protection. Governments can now confirm that the DOI authorization package meets their security requirements and issue their own ATOs in parallel with the procurement process, accelerating their migration to using Government ETP.

FireEye Government ETP is available today as a subscription, ensuring customers continue to benefit from intelligence-led feature updates at no additional cost. For existing FireEye EX appliance customers, upgrade programs are available.

Learn more about FireEye Government ETP, and check out our podcast to hear FireEye CTO Grady Summers speak with FireEye Global Government CTO Tony Cole and Risk Management Lead Stacey Ziegler about how FireEye will support the government as it moves to the cloud.

Source: https://www.fireeye.com/blog/products-and-services/2017/07/government-etp-receives-fedramp-authorization.html

Author: Elizabeth Flammini

