Monthly Archives: January 2018


Riverbed SD-WAN and Zscaler Cloud Security

Category : Riverbed

Accelerate your digital transformation by streamlining networking and cloud security for remote business locations

Ask any enterprise IT professional today about the changes in the IT landscape and invariably the answer is some variation of, “too complex, less predictable and insecure.” A distributed workforce and applications in the cloud are challenging the traditional concept of networking and on-premises perimeter security.

Riverbed enables organizations to modernize their networks and apps with industry-leading SD-WAN, app acceleration and performance management. Today, we announced a joint solution with Zscaler, the leader in cloud-based security solutions, that enables enterprises to incorporate more Internet broadband-based transport in remote business locations without compromising the security of their distributed network. This cloud-first architecture combines the simplicity and agility of SteelConnect SD-WAN with the power and scalability of Zscaler cloud security. SteelConnect simplifies the deployment and management of networks with powerful automation and centralized orchestration of business-aligned policies. It provides unified connectivity across the entire network fabric—LAN/WLAN, data center, WAN and the cloud—and ensures consistently high levels of app performance.

In addition to the simplicity of a cloud-based management interface with single-click connectivity to the cloud, SteelConnect now automates the discovery of and connection to the nearest Zscaler data center and dynamically chooses the optimal point of presence (POP) to use for each site in the SteelConnect network. No longer do you have to manually configure tunnels to multiple POPs or make changes by hand as new nodes are deployed. Agility achieved!

Now, a single click enables integration of Zscaler into the app-defined policy and orchestration engine of SteelConnect. SteelConnect can then identify and steer application and user traffic based on pre-defined policies and network health.

In addition to the value that SteelConnect SD-WAN and Zscaler cloud security offer independently, the joint solution provides:

  • Increased agility: pre-populated Zscaler Enforcement Nodes (ZENs) in SteelConnect Manager; dynamic discovery of new ZENs; automated secure connection to the optimal ZEN based on latency
  • Robustness: Automatic failover from primary to secondary ZEN

Consider a global engineering services firm with hundreds of locations for its distributed yet highly localized business operations. It’s imperative that their business model be consistently replicated everywhere, and that employees, both mobile and remote, can collaborate with ease. They have moved some of their workloads to the cloud and have some apps that are delivered via the cloud. They want to lower the TCO of their network and deploy local internet breakouts at more branch locations. Legacy solutions for networking and security at those locations no longer make sense in a cloud-paced world; they’re too rigid, error-prone and costly to maintain.

The joint Riverbed-Zscaler solution is a great fit that will address their pain points and help IT achieve the stated objectives.

Through SteelConnect, we have taken the concepts of SD-WAN and extended those to wired and wireless LANs and cloud infrastructure networks. With the acquisition of Xirrus earlier this year, we took that a step further by adding a robust enterprise-grade Wi-Fi solution, delivering an unmatched SD-WAN offering that provides unified connectivity and policy-based orchestration spanning the entire network—WLAN/LAN, WAN, data center and the cloud. The integration with Zscaler adds another dimension to SteelConnect—ensuring advanced protection for networks and end users.

Source: https://www.riverbed.com/blogs/riverbed-sdwan-zscaler-cloud-security.html

Author: MILIND BHISE



Taking a Message-Based Approach to Logging

Category : Rapid7

When you think about it, a log entry is really nothing more than a message that describes an event. As such, taking a message-based approach to logging by utilizing messaging technologies makes sense. Messaging creates the loose coupling that allows a logging system to be adaptable to the needs at hand and extensible over time.

Understanding a Standard Logging Architecture

Typically, logging is implemented in an application using a logger. A logger is an object or component that sends log entry data to a target destination. Binding a logger to one or many targets is done via logger configuration. The target can be a file, database, logging service, or, depending on the logger technology, all of the above. Log4J, the most popular logger for Java programming, allows you to configure multiple log appenders on the logger, with each appender dedicated to a particular target. The same is true of Winston, a popular logger for Node.js.
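
For example, here is a minimal Winston sketch (assuming the winston npm package) that binds one logger to two targets, the console and a file:

import winston from "winston";

// One logger, two targets: each transport plays the role
// that an appender plays in Log4J.
const logger = winston.createLogger({
  level: "info",
  format: winston.format.json(),
  transports: [
    new winston.transports.Console(),
    new winston.transports.File({ filename: "app.log" }),
  ],
});

// The application only ever talks to the logger; the logger
// forwards the entry to every configured target.
logger.info("Saving data", { name: "Bob", status: "cool" });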

Figure 1: Most loggers allow an application to send log entries to multiple targets.

The important thing to understand in terms of application-based logging is that the application submits a log entry to a logger, and the logger forwards it on to one or more targets. How each target handles the log entry is its own concern.

Loggers are a reliable, well-known way to do logging. However, there is a drawback. Imagine that you have to add an additional target to a logger, for example a target that sends the log entry on to a mobile phone via SMS.

Adding the new target means having to deploy the appender as well as update and redeploy the logger configuration file, at the very least. You might have to take the entire application offline to deploy the new logging functionality.

Is there a way to avoid the problem? Yes, use a message-based approach to logging.

Taking a Message-Based Approach to Logging

As mentioned at the beginning of this article, a log entry is basically a message. In the case of a target-based logger such as Log4J, the application sends the log entry to a logger, and the logger has the wherewithal to format and send that log entry on to various targets. However, this scenario is limited in that the logger needs to know at the outset the various targets in play. Adding targets after the logger is deployed is difficult.

By taking a message-based approach, you can avoid having to fiddle with an application’s logger to add new targets. In fact, the notion of “target” goes away altogether and is replaced by the concept of a subscriber.

Figure 2 shows a simple message-based logging architecture using the pub-sub pattern.

Figure 2: Sending a log message to a central exchange allows consumers to connect to a log stream on demand.

In a message-based approach, the logger emits a log entry as a message to a single target, a Message Broker. A Message Broker publishes an entry point, usually an HTTP URL that accepts messages as a POST request. Conceptually that endpoint is an exchange, a place where messages are collected and then distributed to interested parties. The term used by Amazon Simple Notification Service for an exchange is a topic; RabbitMQ, another popular messaging technology, uses the term exchange.

The benefit of using an exchange is that one or more subscribers can bind to it and receive a copy of any message sent. This is the basis of the publish-subscribe pattern, or pub-sub for short. Using a pub-sub architecture means that subscribers that come online later can bind to the exchange (a.k.a. the publisher) and get messages that they can use for their own purposes.
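
As a rough sketch of this pattern, the following uses amqplib, a popular Node.js client for RabbitMQ; the broker URL and the fanout exchange name "logs" are illustrative assumptions:

import amqp from "amqplib";

// Emit one log entry as a message to a fanout exchange.
// Every queue bound to the exchange receives a copy.
async function publishLogEntry(entry: object): Promise<void> {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();
  await ch.assertExchange("logs", "fanout", { durable: false });
  ch.publish("logs", "", Buffer.from(JSON.stringify(entry)));
  await ch.close();
  await conn.close();
}

publishLogEntry({ type: "log_event", level: "INFO", message: "Saving data" })
  .catch(console.error);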

Thus, going back to the late-binding SMS scenario described earlier, all that is required to accept log entries and forward them on as SMS messages is to create a subscriber that knows how to convert log messages into an SMS format and send them on. After the subscriber is created, it gets bound to the publisher. Once bound, the subscriber receives a copy of every message sent to the publisher. Most publisher technologies can keep re-sending a message to a subscriber a set number of times until it is accepted; undeliverable messages get noted by the publisher.
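
A matching subscriber might look like the sketch below, again using amqplib; sendAsSms is a hypothetical helper standing in for a call to an SMS gateway:

import amqp from "amqplib";

declare function sendAsSms(text: string): Promise<void>; // hypothetical SMS gateway call

// Bind a fresh queue to the exchange and forward each log
// message on as an SMS. Unbinding the queue later decommissions
// the subscriber without touching the publisher.
async function runSmsSubscriber(): Promise<void> {
  const conn = await amqp.connect("amqp://localhost");
  const ch = await conn.createChannel();
  await ch.assertExchange("logs", "fanout", { durable: false });
  const q = await ch.assertQueue("", { exclusive: true });
  await ch.bindQueue(q.queue, "logs", "");
  await ch.consume(q.queue, async (msg) => {
    if (msg === null) return;
    const entry = JSON.parse(msg.content.toString());
    await sendAsSms(`[${entry.level}] ${entry.message}`);
    ch.ack(msg);
  });
}

runSmsSubscriber().catch(console.error);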

Should it be determined at a later time that the SMS subscriber is no longer needed, decommissioning the subscriber requires nothing more than disconnecting it from the publisher. No code in the publisher needs to be adjusted. Logging continues from the application unimpeded.

The nice thing about a message-driven logging architecture is that it provides a great deal of flexibility. You can have a subscriber that forwards messages on to a standard logging service such as InsightOps, to a database service such as AWS DynamoDB, or to file storage on Azure. Your application needs to know nothing about the eventual targets. All the application knows is that it’s sending log entries to a single target, the publisher.

Does Message-Based Logging Make Sense for Your Company?

Implementing a message-based approach to logging is not for everyone. First, if your company does not already have a messaging architecture in place, it needs to have the expertise and capacity to implement one. You can use a cloud service such as AWS SNS/SQS, Azure Service Bus, or Google Cloud Pub/Sub. Alternatively, you can take a server-based approach using industry-standard messaging products such as RabbitMQ or Apache Kafka. Regardless of the approach you take, your company will need to devote attention and resources to implementing and supporting a messaging framework.

Second, all of the messages sent to the publisher need to adhere to a conventional format. As the name implies, core to implementing a message-based architecture is understanding that all emitted log data takes the form of a message. Any message can end up anywhere. Thus, you don’t want to be sending around messages that require a lot of predefined knowledge to decipher. You need a message format that is conventional, self-describing, and extensible.

Listing 1 below shows an example of a message that is not self-describing.

"My Cool App","ip: "10.1.241.116","log_event","INFO","Saving data"," {"name": \"Bob\", \"status\": \"cool\"}"

The semantics of the message shown above are unknown. You need an outside reference to decipher it. As a result, writing the parsing algorithm to make sense of the message becomes a time-consuming undertaking, specific to that given message format. Rather than using a custom message format, it’s better to use a conventional, self-describing format such as JSON, XML, or YAML. Self-describing messages are easy to understand and easy to parse. Using a conventional, self-describing format avoids the hazard of needing “tribal knowledge” to figure out the semantics of a log entry when designing a subscriber, particularly when the subscriber consuming the message binds to the publisher months after the publisher has been put in play.

Listing 2 shows a message that is self-describing. It’s in JSON format and thus can be understood clearly by convention.

{
    "source": "My Cool App",
    "ip: "10.1.241.116",
    "time": "Fri Feb 28 07:25:25 UTC 2016",
    "type": "log_event",
    "body": {
        "level": "INFO",
        "message": "Saving data",
        "data": {"name": "Bob", "status": "cool"}
    }
}
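
Because the format is self-describing, a subscriber can parse and route such an entry in a few lines; this sketch simply mirrors the field names from Listing 2:

// Parse a self-describing log message and route on its "type" field.
interface LogMessage {
  source: string;
  ip: string;
  time: string;
  type: string;
  body: { level: string; message: string; data?: unknown };
}

function handleMessage(raw: string): void {
  const msg: LogMessage = JSON.parse(raw);
  if (msg.type === "log_event") {
    console.log(`${msg.time} [${msg.body.level}] ${msg.source}: ${msg.body.message}`);
  }
}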

Organizations that have a history of working with message-based architectures understand the value of constructing messages using conventional formats. You want an environment in which any message can be consumed at any time, by any subscriber, in a manner that is standardized and accurate. Companies are not born with such capabilities; they need to develop them. If your company has the appetite for such an undertaking, you’ll do well. If not, it might be best to stick with the current approach to logging.

Putting It All Together

Adopting a message-based approach to logging has definite benefits. You can provide log information to any interested consumer at any time during an application’s lifetime. Also, when a given consumer no longer has need of the log data, it can easily disconnect from the log message stream.

However, using a message-based approach to logging requires that enterprises have the appetite and wherewithal to support a messaging architecture. And having a messaging architecture in place is not enough on its own: once a messaging architecture is adopted, the messages that travel from publisher to consumer should adhere to a conventional format that is easy to parse and process.

Message-based logging provides a great deal of flexibility. Such flexibility is particularly useful in dynamic organizations that need to adapt quickly to continually changing logging demands. Implementing a message-based approach to logging might require a company to make some fundamental changes in the way its systems do logging. However, the long-term benefits that messaging brings make the effort well worth the investment.

Source: https://blog.rapid7.com/2018/01/16/taking-a-message-based-approach-to-logging/

Author: Robert Reselman



Running End-of-Life SA, IC, or MAG Appliances? Here is What You Should Know

Category : Pulse Secure

Appliances have natural life spans. And if your SA, IC, or MAG appliances have reached end of life, it is time to bid them farewell and let them RIP. Upgrading appliances prevents security breaches, gives you access to new features that deliver quantifiable value to your daily business operations, and comes with leading-edge components such as memory, processors, hard disks, and network interface cards. New hardware can handle more users and manage traffic faster and more reliably than ever, so you can do more with less.

Even so, saying the final “good-bye” can be tough. We know that. That’s why Pulse Secure is always here to help you secure the future today so you can deliver and scale new IT services tomorrow.

Here’s what you should know about burying your end-of-life SA, IC, or MAG appliances and replacing them with new PSA5000 and PSA7000 appliances from Pulse Secure:

  • You gain the ability to provide secure access to SaaS applications from Microsoft Office 365, Box, Concur, and many others.
  • You can deploy and support BYOD in a simple and straightforward manner via a mobile device container.
  • You will be assured that only authorized users with compliant devices can access applications and services in the cloud or data center, thereby preventing data leakage.
  • You can integrate your existing identity stores such as Active Directory, as well as leading providers like Ping and Okta.
  • You can empower your users through single sign-on (SSO) with certificate authentication – eliminating frustrating password requirements.
  • You can know what is on your network and enforce security with a unified policy across wired and wireless connections, personal and corporate devices, and remote and local access.

So, take a look at your end-of-life appliances – and take action to secure your future. Saying “RIP” today will set you up to celebrate serious ROI tomorrow.

Source: https://blog.pulsesecure.net/running-end-life-sa-ic-mag-appliances-heres-know/
Author: Mike Dybowski


AutoFocus 2.0.3 Release Documentation is Now Live!

Category : Palo Alto

Release Highlights

This release of AutoFocus introduces a variety of improvements related to the AutoFocus Reports and Dashboard, as well as support for the newly available WildFire Windows 10 analysis environment.


Key enhancements and features include:

  • Customizable reports that can be referenced and managed from your AutoFocus dashboard.
  • Visualization enhancements for widgets that provide multiple data charting options.
  • Exportable Dashboard Reports.
  • Support for samples analyzed using the WildFire Windows 10 analysis environment.
  • Ability to synchronize with the latest versions of default Palo Alto Networks reports.

For more details on the new enhancements, take a look at the AutoFocus New Feature Guide.

New and Updated Documentation

As always, you can find our content on our Technical Documentation site.

Source: https://researchcenter.paloaltonetworks.com/2018/01/tech-docs-autofocus-2-0-3-release-documentation-now-live/



Unleash the Power of Your Hybrid Cloud

Category : NetApp

Discover how you can transform your data into a strategic asset.

Read this white paper to learn how NetApp can help you thrive in a hybrid cloud world.

  • Develop a data-centric culture
  • Control and secure your data
  • Accelerate innovation to drive business growth

Download the white paper: Master Your Hybrid Cloud

Learn how our solutions enable you to gain insight into your hybrid cloud, protect and secure your data wherever it lives, and achieve new levels of agility with DevOps and cloud analytics.



2018 Gartner Magic Quadrant for Intrusion Detection and Prevention Systems (IDPS)

Category : McAfee

For the 11th year in a row, Gartner has named McAfee as a Magic Quadrant Leader for Intrusion Detection and Prevention Systems (IDPS).

The Gartner Magic Quadrant for IDPS is an excellent research tool for enterprise security buyers to review and assess which vendors best meet their solution needs and also learn about recent industry developments.

Download your complimentary copy of the report today to gain insight into the criteria you need to consider for your IDPS.

*Gartner Magic Quadrant for Intrusion Detection and Prevention Systems, Craig Lawson, Claudio Neiva, 10 January 2018. From 2014-17, McAfee was included as Intel Security (McAfee). Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings or other designation. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.

Source: https://www.mcafee.com/us/solutions/lp/gartner-idps.html#sf179349359



Cloud Migration: Technical and Business Considerations

Category : Uncategorized

If you’re like many businesses, you’re moving applications into public and private cloud infrastructures. You’ve seen how the cloud’s agility, resiliency, and scalability drive business growth. Fortunately, rolling out new apps in the cloud is easy when you have containers, microservices, and DevOps supporting your efforts. But what’s not always as easy to figure out is application security—especially if you’re in the midst of migration and need to keep apps secure both on-premises and in the cloud.

Make no mistake: your apps will be attacked. According to the 2017 Verizon Data Breach Investigations Report, web app attacks are by far the number one cause of data breaches—with denial of service attacks the most common of these security incidents.

The good news? You can secure your apps as easily as you can roll them out when you have a flexible, scalable security solution in place.

In this article, we’ll discuss what you need to take into consideration to securely migrate apps to the cloud, and how Imperva FlexProtect can keep your applications secure wherever they live.

Security Model in the Public Cloud

Leading cloud vendors introduced a shared responsibility model for security in the cloud. Amazon states that AWS has “responsibility for security of the cloud,” while customers have “responsibility for security in the cloud.” Microsoft Azure, Google Cloud and other vendors also adopted this model. What does it mean for you? Cloud vendors provide the tools and services to secure the infrastructure (such as networking and compute machines), while you are responsible for things like network traffic protection and application security.

For example, cloud vendors help restrict access to the compute instances (AWS EC2/Azure VM/Google CE) on which the web server is deployed (by using security groups/firewalls and other methods); they also keep web traffic away from restricted ports by exposing only the needed HTTP or HTTPS listeners on the public endpoints (usually the load balancer).
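
As a concrete illustration of the customer’s side of that split, the sketch below uses the AWS SDK for JavaScript (aws-sdk v2) to allow only HTTPS into an existing security group; the group ID is a placeholder and error handling is omitted:

import AWS from "aws-sdk";

// Customer responsibility: open only HTTPS (port 443) on the
// security group protecting the web tier; all other ports stay closed.
const ec2 = new AWS.EC2({ region: "us-east-1" });

ec2.authorizeSecurityGroupIngress({
  GroupId: "sg-0123456789abcdef0", // placeholder security group ID
  IpPermissions: [
    {
      IpProtocol: "tcp",
      FromPort: 443,
      ToPort: 443,
      IpRanges: [{ CidrIp: "0.0.0.0/0", Description: "public HTTPS" }],
    },
  ],
}).promise().catch(console.error);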

But public cloud vendors do not provide the necessary tools to fully protect against application attacks such as the OWASP Top 10 risks or automated attacks. It’s your responsibility to establish security measures that allow only authorized web traffic to enter your cloud-based data center—just as with a physical data center. Securing web traffic in physical data centers is typically done by a web application firewall (WAF) and fortunately, a WAF can be deployed in the public cloud as well.

Choose Flexible Application Security for the Cloud

When choosing solutions to mitigate different web application threats, it’s important to make sure that they offer the flexibility to choose the tools you need. The first mitigation layer is usually common to all attackers: it denies access from badly-reputed sources (“malicious IPs”) and blocks requests based on predefined signatures. This layer is useful against generic types of attacks, like a botnet looking for known vulnerabilities. The more targeted the attack is, though, the more fine-grained the tools required to mitigate it—and the higher the level of control your security team needs. When an attacker tries to launch an attack tailored to a specific web service, you need customizable tools to block it.
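
Conceptually, that first generic layer boils down to something like the following sketch; the denylist entries and signatures are illustrative stand-ins for real reputation feeds and WAF rule sets:

// First mitigation layer: IP reputation plus predefined signatures.
const maliciousIps = new Set(["203.0.113.7", "198.51.100.23"]); // illustrative feed
const attackSignatures = [/union\s+select/i, /<script>/i];      // illustrative rules

function passesGenericLayer(sourceIp: string, payload: string): boolean {
  if (maliciousIps.has(sourceIp)) return false;             // badly-reputed source
  if (attackSignatures.some((sig) => sig.test(payload))) {
    return false;                                           // known attack signature
  }
  return true; // targeted attacks still require custom, fine-grained rules
}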

An ideal solution would offer both generic and customizable tools with the flexibility to be deployed within the private network and public cloud while giving your security administrator full control, including deployment topology and security configuration. An application security solution that is deployed in the public cloud should support several key attributes:

Burst capacity: Automatically spawn new security instances which then register with the existing cluster of gateways.

Multi-cloud security: A security solution should support all the major public cloud infrastructures (AWS, Azure or Google Cloud Platform) and your own data center so you can secure applications wherever they live—now and in the future.

DevOps ready: Security solutions should employ machine learning to automatically understand application behavior.

Automation: Dynamic cloud environments require automation to launch, scale, tune policies and handle maintenance operations.

High availability: Business continuity demands that your security solution be highly available.

Centralized management for hybrid deployment: A security solution should have a centralized management solution that can control hybrid deployments in both the physical data center and in the cloud.

Pricing for Applications

Applications are moving to a more automated architecture and they’re being developed and rolled out faster than ever. If any of the following apply to you, then you need a flexible licensing solution for security:

  • Moving to a microservices architecture
  • Planning to use serverless computing such as AWS Lambda
  • Deploying containers instead of traditional virtual machines
  • Have a dedicated application DevOps team in your organization
  • Concerned about your API security
  • Moving your applications from on-premises to public cloud infrastructure like AWS, Azure or Google Cloud Platform
  • Need to keep certain applications on-premises and need security for both cloud and on-premises

Imperva FlexProtect offers a single subscription with the flexibility to mix and match application security tools so you can secure applications wherever they live. FlexProtect security tools protect in-the-cloud and on-premises application portfolios, and keep your applications safe while you navigate the uncertainties of moving to a virtual, cloud-centric architecture.

Imperva application security solutions are available in a flexible, hybrid model that combines cloud-based services with virtual appliances to deliver application security and DDoS defense for the cloud. With FlexProtect, you can choose the mix of Imperva security solutions that best fits your environment.

Summary

Your organization needs a simple and flexible solution to facilitate a smooth transition from on-premises to the cloud. Imperva offers a solution that scales with your business while allowing you to choose tools based on your application security requirements. With FlexProtect, Imperva removes the dilemma associated with cloud migration planning and future-proofs application security investments.

Source: https://www.imperva.com/blog/2018/01/cloud-migration-technical-and-business-considerations/?utm_source=linkedin&utm_medium=organic-social&utm_content=cloud-migration-considerations&utm_campaign=2018-Q1-linkedin-awareness

Author: Ajay Uggirala



Micro Focus tackles compliance requirements across GDPR, MiFID II and more with Digital Safe 10

Category : HP Security

Latest cloud archiving solution advances information risk management; unleashes value of data

Micro Focus announced Digital Safe 10 to enable customers to mitigate information-borne risk stemming from the surge in regulatory, government and privacy mandates, including the General Data Protection Regulation (GDPR). With Digital Safe 10, customers can refine and extend their information archiving strategies to tackle the complexity of compliance and unleash the untapped potential of archived data to gain greater business insight.

Organisations of all sizes and industries are subject to ever-expanding data compliance requirements that they must be able to respond to quickly. This is complicated by the exponential growth of regulated employee-generated content (email, voice, text, instant message, file, social media, and mobile).

“The magnitude of compliance demands is astounding, and as GDPR and MiFID II take effect in 2018, it will not be any easier for organisations to meet them,” said Joe Garber, vice president, information management and governance at Micro Focus. “Digital Safe 10 further reduces the compliance management burden, allowing teams to more easily address requirements and determine what valuable insights the business can glean to reduce information risk and further drive the top line.”

Digital Safe 10 builds upon and enhances the foundational features that customers have relied on for the past 20 years to manage legal and compliance risk: high levels of security, virtually unlimited processing capacity, market-leading employee supervision, massive scalability, compliant storage, and market-defining architecture. Digital Safe 10 enhancements include:

• GDPR, MiFID II (and more) support: Single pane of glass approach to manage multi-format archived data required for comprehensive compliance reporting and data mining.
• Enhanced data capture: Open APIs to capture and understand structured and unstructured data in hundreds of file formats, including mobile and social media, which is then consolidated in a unified object store.
• Enhanced user interface: A single, modern and efficient user interface that enables high performance early-case assessment (ECA).
• Live capture verification and validation: Tracking, reporting and reconciliation of eCommunications data, providing evidence of data integrity – from ingestion to deletion.
• Integrated reporting framework: Reporting capabilities on data processes with export to all standard file formats and delivery methods.
• Integrated, advanced analytics powered by Micro Focus: More than 400 analytical functions powered by the proven analytics of Micro Focus’ Vertica and IDOL, plus new forms of data enrichment, such as sentiment analysis, message clustering, and machine learning, that enable organisations to quickly and efficiently drive meaningful insights through fewer queries. The release also brings comprehensive information archiving, more efficient lexical-based supervision, and built-in security that draws on the strengths of Micro Focus’ Voltage and Atalla technologies.

“Digital Safe 10 is a true reflection of Micro Focus bringing together the power of its portfolio to yield innovative customer-centred solutions that can help customers address SEC, Dodd-Frank, HIPAA, FTC, FDA, MiFID II, GDPR, and many more regulatory requirements in a single solution,” said Garber.

Today Micro Focus also announced the release of Retain 4.3, its unified archiving solution targeted at medium-sized businesses, and at satellite offices of global enterprises that require an on-premises solution to supplement Digital Safe in countries with strict data sovereignty requirements. The latest Retain solution incorporates carrier-level archiving that enables organisations to capture and store text messages (SMS and MMS) delivered on the carrier network for compliance purposes. The data is secured and archived in a central unified repository that includes multi-platform email and social media.

Source: https://www.totaltele.com/498858/Micro-Focus-tackles-compliance-requirements-across-GDPR-MiFID-II-and-more-with-Digital-Safe-10

 



Why We Need to Think Differently about IoT Security

Category : Gigamon

Breach fatigue is a real issue today. As individual consumers and IT professionals, we risk getting de-sensitized to breach alerts and notifications given just how widespread they have become. While this is understandable, we cannot simply let our guard down or accept the current state – especially as I believe the volume and scale of today’s breaches and their associated risks will perhaps pale in comparison to what’s to come in the internet of things (IoT) world.

It is one thing to deal with loss of information, data and privacy, as has been happening in the world of digital data. As serious as that is, the IoT world is the world of connected “things” that we rely on daily – the brakes in your car, the IV pumps alongside each hospital bed, the furnace in your home, the water filtration system that supplies water to your community – but also take for granted simply because they work without us having to worry about them. We rarely stop to think about what would happen if … and yet, with everything coming online, the real question is not if, but when. Therein lies the big challenge ahead of us.

Again, breaches and cyberattacks in the digital world are attacks on data and information. By contrast, cyberattacks in the IoT world are attacks on flesh, blood and steel – attacks that can be life-threatening. For example, ransomware that locks out access to your data takes on a whole different risk and urgency level when it is threatening to pollute your water filtration system. Compounding this is the fact that we live in a world where everything is now becoming connected, perhaps even to the point of getting ludicrous. From connected forks to connected diapers, everything is now coming online. This poses a serious challenge and an extremely difficult problem in terms of containing the cyber risk. The reasons are the following:

  1. The manufacturers of these connected “things” in many cases are not thinking about the security of these connected things and often lack the expertise to do this well. In fact, in many cases, the components and modules used for connectivity are simply leveraged from other industries, thereby propagating the risk carried by those components from one industry to another. Worse still, manufacturers may not be willing to bear the cost of adding in security since the focus of many of these “connected things” is on their functionality, not on the ability to securely connect them.
  2. Consumers of those very products are not asking or willing in many cases to pay for the additional security. Worse still, they do not know how to evaluate the security posture of these connected things or what questions to ask. This is another big problem not just at the individual consumer level, but also at the enterprise level. As an example, in the healthcare space, when making purchasing decisions on drug infusion pumps, hospitals tend to make the decision on functionality, price and certain regulatory requirements. Rarely does the information security (InfoSec) team get involved to evaluate their security posture. It is a completely different buying trajectory. In the past, when these products did not have a communication interface, that may have been fine. However, today with almost all equipment in hospitals – and in many other industries – getting a communications interface, this creates major security challenges.
  3. Software developers for connected devices come from diverse backgrounds and geographies. There is little standardization or consensus on incorporating secure coding practices into the heart of any software development or engineering course across the globe. In fact, any coursework on security tends to be a separate module that is often optional across courses and curricula. Consequently, many developers globally today have no notion of how to build secure applications. The result is a continual proliferation of software that has been written with little to no regard for its exploitability and is seeping into the world of connected things.

These are all significant and vexing challenges, with neither simple fixes nor a common understanding or agreement on the problem space itself. I won’t claim to have a solution to all of them either, but in a subsequent blog, I will outline some thoughts on how one could begin to approach this. In the meantime, I think the risk and rhetoric around cyber breaches associated with the world of connected things could take on an entirely new dimension.

To learn more now about how a Security Delivery Platform can optimize your security posture, download the complete Security Inside Out e-book. Stay safe.

Source: https://blog.gigamon.com/2017/10/15/need-think-differently-iot-security/?utm_content=bufferd8099&utm_medium=social&utm_source=linkedin.com&utm_campaign=buffer

Author: Shehzad Merchant



Five predictions for the IoT in 2018

Category : Gemalto

2017 was another big year for the IoT. Consumers continued buying connected devices in their droves, culminating in voice assistants becoming a must-have Christmas gift around the world. The first Narrow Band IoT (NB-IoT) services and products were officially launched, and it’s even claimed that there are now more connected IoT devices than there are smartphones and PCs.

But what does 2018 have in store for the IoT? Here are five key trends we predict will define the year ahead.

1. More connected devices, more connectivity options

2017 saw the number of devices, and the ways in which they connect, expand. That trend will continue in 2018. Notably, new Low Power Wide Area Network (LPWAN) technologies such as NB-IoT, Sigfox and LoRaWAN will be developed further. These will enable longer battery life in devices that need to be in the field for several years, and reliable connectivity across large distances.

We expect these widespread deployments to start driving real social impact – such as environmental monitoring in the fight against climate change. Home automation services will continue to push into new fields, bringing greater convenience and simplicity to our everyday lives, and a new era of smart manufacturing will emerge. This sector is set for a milestone year after a wave of pilot projects and experimentation in 2017, with a recent IDC report estimating that it will spend $189 billion in 2018 alone.

We’ll see more firms getting started on their own ‘IoT journey’ and requiring guidance on connectivity options, storage, remote monitoring and so on.

2. More edge computing with real-time analytics

With more IoT deployments, especially in the industrial field where devices may be spread over a large geographic area, we’ll see an increase in the use of real-time data at the edge (i.e. on the connected device). As well as providing an efficient way to reduce the cost of data transfer and storage, edge computing allows immediate analysis of data and the ability to make faster, better-informed decisions.

A recent IDC report predicted that 40% of IoT data will be stored and analyzed at the edge of the network by 2019.

3. An increase in Artificial Intelligence (AI) and machine learning

The efforts made so far in machine learning will continue to develop, helping IoT deployments move from rule-based maintenance to deeper predictive maintenance and increasing efficiency. We can also expect better detection of potential attacks before they’re able to cause huge damage.

IoT providers will also be able to offer new matching services based on deeper observations of customer profiles, as machines learn from how end-users interact with a service or product. AI will be particularly important in large scale deployments of hundreds or thousands of IoT devices, where networking and data collection would otherwise become quite difficult.

While there’s been lots of discussion around AI taking over from humans in the workplace, we believe these fears are largely unfounded. The technology will create new AI-related jobs, or simply allow people to focus on entirely new, more creative tasks.

4. More regulations and standards for security

The need for robust security in the IoT will grow. More attacks will happen, leading to greater awareness of IoT security and the potential damage that could be caused by an attack. A survey we ran last year found that the majority of organizations and consumers believe there is a need for IoT security regulations, and want government involvement in setting those standards.

Partnerships with external IoT experts will become more common as companies try to boost security. That means working with ‘white hat’ hackers, IoT security experts or bug bounty programs to test existing infrastructures, discover vulnerabilities and make improvements.

5. Growth of comprehensive IoT platforms

Manufacturers in particular will be looking for IoT platforms that can provide a complete technology stack in one place. The platforms that come out on top will be the ones that can offer everything a manufacturer or service provider needs – connectivity, support for all connectivity protocols, security, scalability, remote fleet monitoring, and large, secure data storage with links to major cloud connectors (AWS, IBM, Microsoft, etc.).

Comprehensive IoT platforms like these will enable IoT service providers to develop their offering more quickly and easily, while ensuring security.

There will doubtlessly be more major developments and unexpected surprises waiting for us over the next 12 months – but whatever happens, it’s sure to be an exciting year.

Source: https://blog.gemalto.com/iot/2018/01/16/five-predictions-iot-2018/

Author: Sophie Bessin-Py

