Monthly Archives: December 2017


10 Good Reasons to Choose NetApp for Machine Learning

Category : NetApp

Artificial intelligence (AI) can help your team get greater insight from enterprise data and enhance digital services to increase customer engagement. But it’s such a new field that the right infrastructure choices for AI and machine learning (ML) aren’t always clear.

Whether you do AI work on-premises or in the cloud, as you ramp up processes and move them into production, bottlenecks inevitably occur. Lack of I/O performance stalls your AI pipeline. Moving, copying, and managing rapidly growing data sets eats up valuable staff time. The methods that worked during proof of concept become impractical if not impossible at scale.

This is where NetApp can help. NetApp Data Fabric solutions and services accelerate and simplify your AI/ML efforts—from the edge of your network, to the core of your data center, to the cloud. Here are ten reasons to partner with NetApp for your AI/ML needs.

Source: https://blog.netapp.com/infographic-10-good-reasons-to-choose-netapp-for-machine-learning/

Author: Matt Watts

 



Palo Alto Networks Now a Six-Time Gartner Magic Quadrant Leader!

Category : Palo Alto

Gartner’s 2017 Magic Quadrant for Enterprise Network Firewalls has been released, and Palo Alto Networks is proud to be positioned in the Leaders quadrant for the sixth consecutive year. I invite you to read the 2017 Magic Quadrant for Enterprise Network Firewalls report.

Gartner’s Magic Quadrant provides a graphical competitive positioning of technology providers in markets where growth is high and provider differentiation is distinct. Leaders execute well against their stated visions and are well-positioned for tomorrow. Gartner researchers continue to highlight both our ability to execute and the completeness of our vision. You can find more details in the report.

More than 39,500 customers in more than 150 countries have chosen Palo Alto Networks to realize the benefits of a truly next-generation security platform, safeguard critical assets, and prevent known and unknown threats. To protect our customers and stay ahead of sophisticated cyberattackers, we maintain a steadfast commitment to innovation. We recently introduced several more disruptive capabilities:

  • Application Framework: With a SaaS-based consumption model, Palo Alto Networks Application Framework allows customers to use new apps to solve the most challenging security use cases with the best technology available, without the cost and operational burden of deploying new infrastructure.
  • GlobalProtect cloud service: GlobalProtect cloud service eases your next-generation firewall and GlobalProtect deployment by leveraging cloud-based security infrastructure operated by Palo Alto Networks.
  • Logging Service: Palo Alto Networks Logging Service is a cloud-based offering for context-rich, enhanced network logs generated by our security offerings, including those of our next-generation firewalls and GlobalProtect cloud service.

Source: https://researchcenter.paloaltonetworks.com/2017/07/palo-alto-networks-now-six-time-gartner-magic-quadrant-leader/




Secure Access in the Virtual Landscape

Category : Pulse Secure

Whether supporting “workstyle innovation” initiatives in Japan or meeting higher-order compliance requirements in the financial sector, our mission to deliver Secure Access solutions for people, devices, things, and services is improving the productivity of enterprises all over the world at an increasing rate. Central to our Secure Access mission is the core belief that “Security must be about ACCESS and NOT (just) about CONTROL”.

Our design goal priorities:

  1. Simpler and more integrated user experiences
  2. Designing for “inherently” mobile users
  3. Supporting hybrid deployments

To date, the majority of our Secure Access deployments have been on industry-standard appliances, known for high performance, ease of maintenance, and superior ROI. I have the good fortune of meeting customers across the world to understand their needs and determine how Pulse Secure and partner solutions are ideally suited to deliver their success. Routinely, the following challenges and considerations emerge:

  • Access sprawl – different technologies and applications with custom access solutions, leading to security holes and administration overhead
  • Too many clients – mobile, network access, remote access, etc. – leading to management challenges
  • Complexity that leads to higher IT costs and lower administrator productivity
  • The question of how best to leverage cloud/hybrid deployments in a secure manner

Our exclusive focus on Secure Access and our more than 800 talented Pulsers have been, and remain, dedicated to addressing each of the above – and emerging – customer needs. We have made rapid progress since our inception, and our latest improvements are aimed at giving our customers more CHOICES by supporting virtual editions and cloud delivery options.

Our recently announced software releases present customers the opportunity to enable Secure Access for their changing data center, allowing them to:

  • Procure and deploy virtual editions of our industry-leading Secure Access solutions. We support a broad, and ever-growing, range of virtual environments for the data center
  • Optionally deploy in various cloud environments. Customers and partners can procure from various cloud marketplaces, including Bring Your Own License (BYOL) configurations
  • Flexibly procure virtual editions as subscriptions. This gives them the option of choosing between CapEx and OpEx deployments

We have ensured complete feature parity between our virtual, cloud, and physical editions – an important consideration for deployment flexibility. Customers can mix and match physical, virtual, and cloud deployments as their Hybrid IT needs dictate, and leverage common clients, licensing servers, and management platforms to integrate their access solutions and enhance the productivity of their employees from any device and any location, all while enjoying flexible deployment and pricing options. Our Customer Success and Support offerings are all available with these new virtual and cloud editions.

Pulse Secure remains relentlessly focused on customer success. As we accelerate the Secure Access journey, supporting cloud and virtual deployment models and other emerging technologies, our core belief will always be the basis of our Secure Access solutions: security must be about ACCESS and NOT just CONTROL.

Source: https://blog.pulsesecure.net/secure-access-virtual-landscape/

Author: Sudhakar Ramakrishna



Don’t Settle for “GOOD ENOUGH” Mobility

Category : Mobile Iron

Modern enterprises are rapidly shifting core business processes away from legacy technologies and standardizing on mobile devices and cloud services. As a result, these organizations are quickly outgrowing basic MDM capabilities and apps like email and calendar. Building a secure mobile and cloud architecture now requires a comprehensive approach to EMM to protect business apps and data running on any device, network, or cloud service.

The good news is, organizations don’t have to settle for “good enough” mobile management solutions that don’t scale to support rapidly changing mobile requirements. MobileIron is a recognized leader in mobile and cloud security, and our comprehensive platform helps customers improve security, enable a more productive user experience, and scale to meet future mobile business requirements. In addition to being the enterprise choice for secure mobility, we rank in the top five in all categories of the Gartner Critical Capabilities for High-Security Mobility Management.



Generating Compliance History Reports

Category : McAfee

When you’re managing a large environment with thousands of endpoints, assuring consistency can be a huge challenge. Imagine that you want every endpoint to be upgraded to a specific software version, for example. In many cases, you’re forced to rely on manual tracking, where errors and omissions are commonplace. And, if you want to demonstrate how you’re progressing towards that goal over time, you’re looking at a large manual effort to track which systems have been updated and when.

In my previous blogs, I talked about sometimes-overlooked features in McAfee ePolicy Orchestrator (ePO) that can make managing your endpoint environment a whole lot simpler. Now, I’m going to cover one more: using ePO to show compliance history over time.

Tracking Compliance

Out of the box, you can use ePO to see the percentage of your systems that comply with given criteria, such as a specific McAfee Endpoint Security (ENS) software version. You may already be using that feature. But what you might not realize is that, in addition to showing a snapshot of systems that do and don’t meet those criteria right now, you can also track compliance over time. Effectively, you can use ePO to set a starting point for your migration project, and then generate reports showing your day-to-day progress towards the project goal.

For example, say you want to migrate all endpoints to McAfee ENS 10.5 by the end of this quarter. And imagine that, right now, 50 percent of your endpoints are running that software version. By next week, 60 percent of endpoints may be in compliance. The following week, you may be up to 75 percent. With ePO compliance history reporting, you can generate hard numbers to track your progress towards 100 percent compliance for that migration.

Software migrations are just one example of when compliance reporting comes in handy. You could use the same reporting to track endpoint systems that have a specific set of McAfee endpoint tools or components installed. Or, you could use it to help enforce a rule that no system should be using antivirus definitions older than 10 days. If you have any compliance goal for the McAfee products and tools on your endpoints, and you can express it as a Boolean query, you can generate a graph showing your progress towards that goal and export it to an Excel spreadsheet.
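To make the Boolean-query idea concrete, here is a minimal Python sketch of the same logic outside of ePO: any compliance goal you can phrase as a true/false test over endpoint properties reduces to a percent-compliant figure. The endpoint fields and sample values are hypothetical, purely for illustration; this is not ePO code.

```python
# Illustrative sketch only, not the ePO API: the idea behind a Boolean
# compliance query is that any true/false test over endpoint properties
# yields a percent-compliant number you can track.

from dataclasses import dataclass

@dataclass
class Endpoint:                # hypothetical fields for the example
    name: str
    ens_version: str           # e.g. "10.5.0"
    av_dat_age_days: int       # age of antivirus definitions, in days

def is_compliant(ep: Endpoint) -> bool:
    """Criteria: running ENS 10.5 AND definitions no older than 10 days."""
    return ep.ens_version.startswith("10.5") and ep.av_dat_age_days <= 10

endpoints = [
    Endpoint("host-01", "10.5.0", 2),
    Endpoint("host-02", "10.2.3", 1),
    Endpoint("host-03", "10.5.0", 14),
]

compliant = sum(is_compliant(ep) for ep in endpoints)
print(f"{100.0 * compliant / len(endpoints):.1f}% compliant")  # 33.3% compliant
```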

Creating the Report

Generating a compliance history report in ePO involves three basic steps: creating a Boolean managed system query, creating a server task, and creating a compliance history query.

The first step, a Boolean managed system query, creates a pie chart to show which systems are compliant with your criteria and which are not. ePO features a wizard to take you through the process. To get started, click “Create new managed system query” in the Queries & Reports section of the main ePO dashboard. Select Boolean Pie Chart as the chart type, and click the “Configure Criteria” button. The properties listed here configure which attributes the query will check for compliance. So in our software migration example, if you want to see which systems are running ENS 10.5, you would add that as a compliance attribute. ePO will then show all systems that are not running software version 10.5 as non-compliant for the purposes of this query.

Using the same tool, you can also label the Boolean pie chart with your compliance criteria. And you can configure the Filters tool to exclude any systems that you don’t need to be in compliance for the purposes of your query. (So in the software migration example, you could decide that servers are out of scope for this update and exclude them from your query.)

Finally, save the Boolean Managed System Query. I’d recommend including “Compliance” in the query name for easier reference later.

Configuring Server Tasks and Compliance Queries

The next step is to create a new server task. Go to Server Tasks in ePO and click “Create Server Task.” For simplicity’s sake, you may want to include “Compliance” in the server task’s name. For the Action field, select “Run Query.” In the Query field, select the Boolean Managed System Query you created in the previous step. In the Sub-Actions field, select “Generate Compliance Event.” Then, set a schedule to run the server task once per day, or as often as you’d like to track. Remember: the goal here is not simply to see a snapshot of how many systems are in compliance, but to be able to track your progress towards full compliance over time. So you will want this server task to run on an ongoing basis.

For the final step, you create a new compliance history query. Go back to Queries & Reports in ePO and click “Create Compliance History Query.” For the chart type, select “Single-Line Chart.” Select “Day” for the Time Unit (unless you’ve chosen a different time interval for your server task to run). For the Line Values field, select “Average of,” and in the second field, select “Percent Compliant.” Save the chart. Then, in the filter section, add a filter for “Server Task Used to Generate Compliance Event” and assign it the Server Task that you just created.
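To picture what the single-line chart plots, here is a small Python sketch of the same aggregation: one compliance event per scheduled run, with “Average of Percent Compliant” computed per time unit. The dates and percentages below are invented for illustration; this is not ePO code.

```python
# Conceptual sketch of the compliance history chart: average the
# percent-compliant value of all compliance events in each time unit.
# The event data below is made up for illustration.

from collections import defaultdict
from datetime import date

# (run date, percent compliant), as if generated daily by the server task
events = [
    (date(2017, 12, 1), 50.0),
    (date(2017, 12, 2), 54.0),
    (date(2017, 12, 2), 56.0),   # a second run on the same day
    (date(2017, 12, 3), 61.0),
]

# "Average of Percent Compliant" per day, as in the single-line chart
per_day = defaultdict(list)
for day, pct in events:
    per_day[day].append(pct)

for day in sorted(per_day):
    vals = per_day[day]
    print(day, f"{sum(vals) / len(vals):.1f}%")
```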

View Progress Over Time

Illustrating compliance history over time can be extremely useful for anyone undertaking a large-scale software migration, or seeking to ensure that all systems’ McAfee components are configured consistently. But it can also be helpful for illustrating the progress of a given project to others.

If an executive wants to know how a software migration is progressing, for example, and you show them a point-in-time snapshot showing 70 percent compliance, they may want to know why 30 percent of systems are still running older software. With ePO compliance history reporting, you could demonstrate that just two weeks ago, 60 percent of systems were non-compliant, and you’ve cut that figure in half. It’s just one more way that ePO can make large-scale endpoint management easier.

 

Source: https://securingtomorrow.mcafee.com/business/generating-compliance-history-reports/#sf175360588

Author: Ted Pan

 



DevOps in the Cloud: How Data Masking Helps Speed Development, Securely

Category : Imperva

Many articles have discussed the benefits of DevOps in the cloud. For example, the centralization of cloud computing provides DevOps automation with a standard platform for testing and development; the tight integration between DevOps tools and cloud platforms lowers the cost associated with on-prem DevOps automation technology; and cloud-based DevOps reduces the need to manually account for resources, since usage is tracked by data, application, and so on. With all these benefits, cloud-based DevOps provides more flexibility and scalability to organizations, allowing software developers to produce better applications and bring them to market faster.

However, moving the entire application testing, development, and production process to the cloud may cause security issues. In this post, we discuss the security issues associated with a fast-moving, cloud-based DevOps environment and ways to mitigate those issues without impacting speed to market.

Protect Data from Breaches

If the recent Uber data breach taught us anything, it’s that protection around production data disappears as soon as you make a copy of that data. In the case of the Uber breach, the hackers worked their way in via the software engineering side of the house. Software engineers then became compromised users as their login credentials were stolen, giving hackers access to an archive of sensitive rider and driver data (a copy of production data).

Get the Realistic Data You Need, When You Need It

As a developer, you may get frustrated with security restrictions placed around using production data for testing and development. But if you think about it for a moment, a data breach could cost you and the security folks your jobs when the finger of guilt points your way. Nonetheless, while it is important to protect sensitive data from breach, it is also critical for companies to deliver software to market faster while maintaining high quality, especially when competitors are adopting cloud to increase the pace of software development. As a developer, your mission is to deliver quality code on time, and to do so, you need realistic data to put your code through its paces. And yet it can be time-consuming to get approvals from the security team and wait for DBAs to extract data from production databases.

Data Masking Removes Sensitive Information

The good news is there’s technology available to balance the needs of both sides. Data masking has proven to be the best practice for removing sensitive information while maintaining data utility. Data masking (or pseudonymization) has been referenced by Gartner (account required) and other industry analysts as a required element of data protection. This technology replaces sensitive data (access to which should be limited to a need-to-know basis) with fictional but realistic values to support DevOps in the cloud without putting sensitive data at risk. The masked data maintains referential integrity and is statistically and operationally accurate. For example, let’s say a data record shows that Terry Thompson is 52 years old and that his social security number (SSN) is 123-00-4567. After the data is masked, that record may then become John Smith whose SSN is 321-98-7654. The masked data retains the exact format of the original (real) data, maintaining the data richness that allows developers to do their jobs.

(Figure: Data masking replaces original data with fictitious, realistic data.)
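As a rough illustration of the format-preserving idea in the example above, here is a minimal Python sketch. It is not Imperva’s implementation; the name lists and random substitutions are stand-ins for a production masking engine.

```python
# Minimal sketch of format-preserving masking: swap in fictitious values
# while keeping the original format so downstream code still works.
# Illustrative only, not Imperva's implementation.

import random
import re

FIRST_NAMES = ["John", "Maria", "Wei", "Aisha"]
LAST_NAMES = ["Smith", "Garcia", "Chen", "Khan"]

def mask_name(_original: str) -> str:
    # Replace the real name with a fictional but realistic one
    return f"{random.choice(FIRST_NAMES)} {random.choice(LAST_NAMES)}"

def mask_ssn(original: str) -> str:
    # Replace every digit; dashes and overall length stay intact
    return re.sub(r"\d", lambda _m: str(random.randint(0, 9)), original)

record = {"name": "Terry Thompson", "age": 52, "ssn": "123-00-4567"}
masked = {"name": mask_name(record["name"]), "age": record["age"],
          "ssn": mask_ssn(record["ssn"])}
print(masked)  # e.g. {'name': 'John Smith', 'age': 52, 'ssn': '321-98-7654'}
```

A production engine would typically also make substitutions deterministic (for example, via keyed hashing) so the same input always maps to the same masked value, which is what preserves referential integrity across tables.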

Security and Productivity Go Hand in Hand

With data masking, companies don’t have to choose between security and productivity, which tends to be one of the most common dilemmas. Data masking ensures the data being used is anonymized and always protected—regardless of how it is used, by whom, and how often it is copied. It’s the key for developers to embrace all the benefits associated with the cloud. Masking sensitive information in the cloud gives developers peace of mind when producing better applications and allows them to bring those apps to market faster without getting a red light from the security team. Better still, the finger of guilt can’t point in your direction in the event a hacker breaks in, because you never had the data to begin with.

Watch our whiteboard video session to learn more about data masking and how it works.

Source: https://www.imperva.com/blog/2017/12/devops-in-the-cloud-how-data-masking-helps-speed-development-securely/?utm_source=linkedIn&utm_medium=organic-social&utm_content=devops-data-masking&utm_campaign=2017-Q4-linkedin-awareness

Author: Sara Pan



Gemalto eSIM technology enables Always Connected experience for new Microsoft Surface Pro with LTE Advanced

Category : Gemalto

Advanced integration of eSIM into Windows 10 delivers an enhanced user experience

Gemalto, the world leader in digital security, is supplying the eSIM (embedded SIM) solution for Microsoft’s Surface Pro with LTE Advanced, the most connected laptop in its class[1], which will begin shipping to business customers in December 2017. Gemalto’s partnership with Microsoft enabled Surface to become the first fully integrated embedded SIM PC in the Windows ecosystem.

Gemalto’s advanced technology supports seamless activation of mobile subscriptions for users of the innovative Surface Pro with LTE Advanced. This smooth experience leverages Gemalto’s remote subscription management solution in conjunction with Windows 10. Surface customers expect their products to deliver advanced technology, and with Gemalto’s eSIM solution, all possible connectivity options are available out of the box, including the purchase of cellular data from the device itself.

 

Compliant with the GSMA Remote SIM Provisioning specifications, Gemalto’s eSIM solution is fully integrated with Windows 10. This integration gives the Gemalto solution a complete servicing model, so that patching and lifecycle management features remain available as the technology and standards evolve over time. This capability extends the value promise of Surface, as new experiences and capabilities will be available to today’s purchasers of the Surface Pro with LTE Advanced.

“The Surface Pro has redefined the laptop category,” said Paul Bischof, Director, Devices Program Management at Microsoft. “Gemalto’s eSIM solution is helping us to materialize our vision of an uncompromised customer experience.”

“Adoption of eSIM technology is growing rapidly. Mobile operators recognize the potential of seamless connectivity and increased convenience as a way of expanding their customer reach to additional devices,” said Frédéric Vasnier, executive vice president, Mobile Service and IoT, Gemalto. “We are at the beginning of a significant technology transformation and the Surface Pro with LTE Advanced represents the start.”

DISCLAIMERS:

  1. Comparison of supported bands and modem speed for Surface Pro with LTE Advanced vs. 12″ and 13″ LTE-enabled laptops and 2-in-1 computers. Service availability and performance subject to service provider’s network. Contact your service provider for details, compatibility, pricing and activation. See all specs and frequencies at surface.com.

Always Connected: Service availability and performance subject to service provider’s network. Contact your service provider for details, compatibility, pricing, and activation. See all specs and frequencies at surface.com.

Source: https://www.gemalto.com/press/Pages/Gemalto-eSIM-technology-enables-Always-Connected-experience-for-new-Microsoft-Surface-Pro-with-LTE-Advanced.aspx



Gigamon Introduces the First Scalable SSL Decryption Solution for 100Gb Networks

Category : Gigamon

Reduces Costs and Time-to-Threat Detection via Architectural Approach that Enables Traffic to be Decrypted Once and Sent to Multiple Security Tools for Inspection

Gigamon Inc., the leader in traffic visibility solutions for cybersecurity and monitoring applications, today announced the industry’s first visibility solution to support SSL/TLS decryption for high speed 100Gb and 40Gb networks. Part of the GigaSECURE Security Delivery Platform, the solution empowers companies to decrypt and re-encrypt their data once and inspect it with multiple best-of-breed security tools. This helps to expose hidden threats in SSL/TLS sessions, reduce security tool overload, and extend the value and return-on-investment (ROI) of existing security tools.
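Conceptually, the decrypt-once approach can be sketched in a few lines of Python: a single termination point decrypts a session, fans the plaintext out to any number of inspection tools, and re-encrypts once for delivery. The inspector functions and stub crypto callables below are placeholders for illustration, not Gigamon’s implementation.

```python
# Sketch of "decrypt once, inspect many times": one decryption point feeds
# plaintext to several security tools, then re-encrypts once for delivery.
# Inspectors and crypto stubs are placeholders, not Gigamon's implementation.

from typing import Callable, List

Inspector = Callable[[bytes], None]

def ids_inspector(payload: bytes) -> None:
    print(f"IDS scanned {len(payload)} bytes")

def dlp_inspector(payload: bytes) -> None:
    print(f"DLP scanned {len(payload)} bytes")

def process(ciphertext: bytes,
            decrypt: Callable[[bytes], bytes],
            encrypt: Callable[[bytes], bytes],
            inspectors: List[Inspector]) -> bytes:
    plaintext = decrypt(ciphertext)   # decrypt once...
    for inspect in inspectors:        # ...inspect many times
        inspect(plaintext)
    return encrypt(plaintext)         # re-encrypt once for delivery

# Stub crypto for the sketch; a real deployment terminates TLS here
out = process(b"...tls bytes...", lambda b: b, lambda b: b,
              [ids_inspector, dlp_inspector])
```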

With the volume of data flowing through corporate networks having increased significantly in recent years, companies have upgraded to higher-speed networks running at 40Gb and 100Gb. Meanwhile, there has been a dramatic rise in the volume of data running on these high-speed networks that is encrypted, driven by the increased use of SaaS applications such as Microsoft Office 365 and Dropbox. Gartner estimates that, through 2019, more than 80 percent of enterprises’ web traffic will be encrypted[i].

“Traditional network security architectures are ineffective at supporting the explosive growth in high speed traffic and, more importantly, at identifying and stopping malware and data exfiltration that use encryption,” said Ananda Rajagopal, vice president of products for Gigamon. “Many security and monitoring tools become overloaded in 100Gb network environments, so it’s clear a new approach is needed. Our new solution enables enterprises to stop the sprawl by redeploying security tools from the edge of their network to the core, where it’s easier to spot lateral attacks and more quickly identify threats.”

Malware leverages SSL/TLS encryption to hide and avoid inspection. A 2017 Trustwave report[ii] estimates that 36 percent of malware samples analyzed used some form of encryption. In 40Gb and 100Gb networks, decrypting, exposing, and identifying hidden threats in encrypted traffic is increasingly challenging, since most security and monitoring tools do not support such speeds. In addition, a tool-by-tool approach is complex, costly, and inefficient. Research from NSS Labs[iii] indicates a performance degradation of up to 80 percent when security tools decrypt traffic and perform their specific security function.

“By utilizing Check Point’s Infinity architecture, which manages Next-Generation Threat Prevention gateways worldwide, Gigamon provides world-class performance and a resilient security architecture, enabling inline SSL protection for our largest customer deployments,” said Jason Min, head of business and corporate development, Check Point Software. “Our partnership with Gigamon delivers optimal performance and advanced threat prevention which is critical for enterprises in this era of veiled cyber threats.”

“It’s great to see the ‘decrypt once, inspect many times’ architectural approach that Gigamon is taking to inline SSL decryption. It’s an efficient approach that will help our customers and solution provider community take advantage of whichever security solutions best suit their business need,” said Matt Rochford, vice president of the cybersecurity group in Arrow Electronics’ enterprise computing solutions business.

The expansion of the GigaSECURE Security Delivery Platform is a continuation of the Gigamon security strategy, which debuted in 2015 and was extended with metadata and public cloud visibility last year. This year the company announced its inline SSL/TLS decryption solution and introduced the Defender Lifecycle Model. When implemented, the Defender Lifecycle Model empowers cybersecurity professionals to use continuous network visibility to control and automate tasks between best-of-breed security tools across the continuum of prevention, detection, prediction, and containment. Recently the company announced the extension of its public cloud offerings and new applications for Splunk and Phantom in support of the Defender Lifecycle Model. Gigamon continues to build on its vision with the expansion of its security offerings for both public cloud and on-premises infrastructure.

GigaSECURE, a Security Delivery Platform

This solution includes:

  • GigaVUE® visibility nodes, such as the GigaVUE-HC2 or GigaVUE-HC3.
  • GigaSMART® module corresponding to the selected visibility node.
  • An inline bypass module to provide resiliency in 10, 40 or 100Gb networks.
  • Ability to activate desired security modules including SSL/TLS Decryption, Application Session Filtering, and NetFlow/Metadata Generation.

Resources

  • Blog post: Stop the Sprawl, Security at the Speed of the Network
  • Feature brief: SSL/TLS Decryption
  • Web page: SSL/TLS Decryption

Source: https://www.gigamon.com/company/news-and-events/newsroom/100gb-ssl-decryption.html?utm_content=buffer622e4&utm_medium=social&utm_source=linkedin.com&utm_campaign=buffer



The Future of COBOL applications

Category : HP Security

Digital is driving faster change across every aspect of IT and business. But what does this mean for the future of COBOL applications? We asked and you answered.

Join us on Thursday, December 14th to see the Future of COBOL applications as we unveil and discuss highlights from the 2017 COBOL market survey.

Our COBOL experts, Ed Airey and Scot Nielsen, will provide their market insights into the latest trends, technologies and practices influencing change and innovation for COBOL systems.

You’ll also have a chance to pose your questions to our panel of experts. During this webinar, you’ll…

• Understand how COBOL is connected to core business strategy

• Discover how the latest technologies are inspiring COBOL innovation

• View the top priorities for application development and modernization

• Learn how your peers are responding to digital transformation across their COBOL systems

• Start planning your future application roadmap

• Get your free copy of the 2017 COBOL survey results

Register today for a first look at the next generation of COBOL applications and see how your COBOL systems can take full advantage of this new digital opportunity.



Preventing Attacks Launched Deep Within the Network

Category : Cyber-Ark

Attacks that exploit Kerberos, a Windows authentication protocol, have been behind some of the biggest breaches in recent history. These attacks are troublesome for many reasons, including a total loss of control over the domain controller. Threat actors have uncovered a number of vulnerabilities within the Kerberos protocol, and when successful, they’re able to elevate unprivileged domain accounts to domain administrator privileges. The intent of the attacker is to leverage Kerberos tickets to appear as a legitimate, fully authorized user when authenticating to various systems within the network.

These attacks are extremely difficult to detect, and even more difficult to prevent. Other solutions on the market can detect Kerberos attacks, but they come with limited functionality, agent-based performance issues, and well-documented bypass techniques, calling into question their value and effectiveness. CyberArk Privileged Threat Analytics is the only solution able to detect, alert on, prevent, and remediate a variety of Kerberos-style attacks (Golden Ticket, Overpass-the-Hash, DCSync and PAC [MS14-068] attacks).

Attackers will get inside. It’s what they do. Far too many organizations continue to focus on defending solely against perimeter attacks without considering the impact and devastation of an attack launched from deep within the network. Moreover, while vaulting credentials is certainly a best practice, privileged credentials are often not required for the attacker to succeed in this type of attack, so organizations will undoubtedly benefit from the analytics capabilities CyberArk can provide. This type of attack needs to be prioritized and top of mind for every security operations team.

In this demo, we walk through an example of how CyberArk Privileged Threat Analytics is able to not only detect, but also automatically stop an attack, preventing further damage to a domain controller. This scenario presents a situation where an attacker gains access to a compromised machine and utilizes a post-exploitation tool to move laterally to a domain controller. The attacker then uses a hash stolen from a logged-in user on the compromised machine, performs an Overpass-the-Hash attack, and gains access to the domain controller. Watch the video below to see how CyberArk detects this activity and breaks the attack chain before irreparable damage is done.
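For a sense of what detection can look like at the log level, here is a hedged Python sketch of one widely documented heuristic: flagging Kerberos ticket events (Windows Security event IDs 4768 and 4769) that use RC4 encryption in a domain where AES is the norm, a common indicator of Overpass-the-Hash and forged-ticket activity. This is illustrative only, not CyberArk’s detection logic, and the event records are made up.

```python
# Hedged sketch of one well-known Kerberos-abuse heuristic: tickets
# requested with RC4 (0x17) encryption in an AES-hardened domain are
# suspicious. NOT CyberArk's detection logic; sample events are invented.

SUSPECT_ENC_TYPES = {"0x17", "0x18"}  # RC4-HMAC variants

# As if exported from Windows Security logs (event IDs 4768/4769)
events = [
    {"event_id": 4769, "account": "alice", "service": "dc01$", "enc": "0x12"},
    {"event_id": 4769, "account": "bob",   "service": "dc01$", "enc": "0x17"},
]

for ev in events:
    if ev["event_id"] in (4768, 4769) and ev["enc"] in SUSPECT_ENC_TYPES:
        print(f"ALERT: RC4 Kerberos ticket for {ev['account']} -> "
              f"{ev['service']} (enc type {ev['enc']}); investigate for "
              "pass-the-hash or forged-ticket activity")
```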

Request a live demo to see Privileged Threat Analytics in action or download the Data Sheet for more information.

Source: https://www.cyberark.com/blog/preventing-attacks-launched-deep-within-network/

Author: Corey OConnor

