
Net-Ctrl Blog

Not All Next-Generation Firewalls Are Created Equal

August 9th, 2016

As cybersecurity threats increase in sophistication, the security solutions used to defend against these threats must also evolve. Developers no longer adhere to standard port/protocol/application mapping; applications are capable of operating on non-standard ports, as well as port hopping; and users are able to force applications to run over non-standard ports, rendering first-generation firewalls ineffective in today’s threat environment. Enter the “next-generation firewall” (NGFW), the next stage of firewall and intrusion prevention systems (IPS) technology.

A common understanding of an NGFW is a network platform that combines the traditional firewall functionalities with IPS and application control. However, merely bundling traditional firewalls with IPS and application control does not result in an NGFW. A true NGFW emphasizes native integration, classifies traffic based on applications rather than ports, performs a deep inspection of traffic and blocks attacks before a network can be infiltrated. Here is a list of key features of a true NGFW to better inform your next purchase decision.

Identify and control applications and functions on all ports, all the time

An NGFW should identify traffic on all ports at all times and classify each application, while monitoring for changes that may indicate an unpermitted function is being used. For example, using Citrix GoToMeeting for desktop sharing may be permitted, but allowing an external user to take control is not.
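
To make the idea concrete, here is a minimal, hypothetical sketch of port-independent classification: the application is inferred from the first payload bytes rather than from the destination port. The signature table and flow handling are illustrative assumptions, not any vendor's actual App-ID engine.

```python
# Hypothetical sketch: classify a flow by payload signature, not port.
# The signature table below is illustrative, not a production ruleset.

SIGNATURES = {
    b"\x16\x03": "tls",                    # TLS handshake record header
    b"SSH-2.0": "ssh",                     # SSH version banner
    b"GET ": "http",                       # plain HTTP request line
    b"BitTorrent protocol": "bittorrent",  # BT handshake string
}

def classify_flow(first_payload: bytes) -> str:
    """Return an application label based on payload content alone."""
    for magic, app in SIGNATURES.items():
        if first_payload.startswith(magic):
            return app
    return "unknown"  # unknown traffic gets its own handling (see below)

# SSH tunnelled over port 443 is still identified as SSH,
# because the port number never enters the decision:
print(classify_flow(b"SSH-2.0-OpenSSH_7.2"))  # -> "ssh"
```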

Identify users regardless of device or IP address

Knowing who is using which applications on the network, and who is transferring files that may contain threats, strengthens an organization’s security policies and reduces incident response times. An NGFW must get user identity from multiple sources – such as VPN solutions, WLAN controllers and directory servers – and allow policies that safely enable applications based on users, or groups of users, in outbound or inbound directions.
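
As a rough illustration of that requirement, the sketch below merges IP-to-user mappings from several sources so a rule can name users instead of addresses. The source names, precedence order and example rule are all assumptions for illustration.

```python
# Hypothetical sketch: resolve an IP address to a user identity by
# merging several mapping sources, then evaluate a user-based rule.
from typing import Optional

# Listed least- to most-authoritative, so later sources win on conflict.
SOURCES = [
    ("wlan_controller", {"10.0.1.15": "jsmith"}),
    ("vpn_gateway",     {"10.0.2.20": "akhan"}),
    ("directory_logs",  {"10.0.1.15": "j.smith@corp.example"}),
]

def resolve_user(ip: str) -> Optional[str]:
    user = None
    for _source, mapping in SOURCES:
        user = mapping.get(ip, user)
    return user

FINANCE_GROUP = {"j.smith@corp.example"}

def policy_allows(ip: str, app: str) -> bool:
    """Example user-based rule: only the finance group may use 'sap'."""
    user = resolve_user(ip)
    if app == "sap":
        return user in FINANCE_GROUP
    return True

print(resolve_user("10.0.1.15"))          # directory log wins over WLAN mapping
print(policy_allows("10.0.2.20", "sap"))  # False: akhan is not in finance
```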

Identify and control security evasion tactics

There are two different classes of applications that evade security policies: applications that are designed to evade security, like external proxies and non-VPN-related encrypted tunnels (e.g., CGIProxy), and those that can be adapted to achieve the same goal, such as remote server/desktop management tools (e.g., TeamViewer). An NGFW must have specific techniques to identify and control all applications regardless of port, protocol, encryption or other evasive tactics, and buyers should know how often the firewall’s application intelligence is updated and maintained.

Decrypt and inspect SSL and control SSH

An NGFW should be able to recognize and decrypt SSL and SSH on any port, inbound or outbound; have policy control over decryption; and offer the necessary hardware and software elements to perform SSL decryption simultaneously across tens of thousands of SSL connections with predictable performance.
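
The decryption side of this is essentially a per-flow policy decision. Below is a minimal sketch assuming a simple rule schema of our own invention (category carve-outs for regulated traffic); it is not any vendor's actual decryption policy model.

```python
# Hypothetical sketch: decide per flow whether to decrypt, based on the
# identified application and an (assumed) URL-category carve-out list.

NO_DECRYPT_CATEGORIES = {"health", "financial-services"}  # privacy carve-outs

def should_decrypt(app: str, category: str) -> bool:
    if category in NO_DECRYPT_CATEGORIES:
        return False                # leave regulated traffic encrypted
    return app in {"ssl", "ssh"}    # inspect these regardless of port

print(should_decrypt("ssl", "social-networking"))  # True
print(should_decrypt("ssl", "health"))             # False: policy carve-out
```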

Systematically manage unknown traffic

Unknown traffic represents significant risks and is highly correlated to threats that move along the network. An NGFW must classify and manage all traffic on all ports in one location and quickly analyze the traffic, known and unknown, to determine if it’s an internal/custom application, a commercial application without a signature, or a threat.
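
As a toy illustration of that triage, the sketch below sorts an unknown flow into the three buckets the paragraph names. The internal-subnet heuristic and the placeholder signature store are assumptions for illustration.

```python
# Hypothetical sketch: triage unknown traffic into internal/custom app,
# unsigned commercial app, or potential threat.
import ipaddress

INTERNAL_NETS = [ipaddress.ip_network("10.0.0.0/8")]
KNOWN_COMMERCIAL = {"2f1a9b"}  # placeholder store of known-app payload hashes

def triage(src: str, dst: str, payload_hash: str) -> str:
    s, d = ipaddress.ip_address(src), ipaddress.ip_address(dst)
    if any(s in net and d in net for net in INTERNAL_NETS):
        return "internal/custom application"  # candidate for a custom signature
    if payload_hash in KNOWN_COMMERCIAL:
        return "commercial application without a signature"
    return "potential threat: send for sandbox analysis"

print(triage("10.0.1.5", "10.0.9.9", "ab12cd"))     # internal/custom application
print(triage("10.0.1.5", "203.0.113.7", "ab12cd"))  # potential threat: ...
```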

Protect the network against known and unknown threats in all applications and on all ports

Applications enable businesses, but they also act as a cyberthreat vector, supporting technologies that are frequent targets for exploits. An NGFW must first identify the application, determine the functions that should be permitted or blocked, and protect the organization from known and unknown threats, exploits, viruses/malware or spyware. This must be done automatically, with near-real-time updates, to protect from newly discovered threats globally.

Deliver consistent policy control over all traffic, regardless of user location or device type

An NGFW should provide consistent visibility and control over traffic, regardless of where the user is and what device is being used, without introducing performance latency for the user, additional work for the administrator, or significant cost for the organization.

Simplify network security

To reduce the load on already overstretched security processes and people, an NGFW must enable easy translation of your business policy into your security rules. This will allow policies that directly support business initiatives.

Perform computationally intensive tasks without impacting performance

An increase in security features often means significantly lower throughput and performance. An NGFW should deliver visibility and control including content scanning, which is computationally intensive, in high-throughput networks with little tolerance for latency.

Deliver the same firewall functions in both a hardware and virtualized form factor

Virtualization and cloud computing environments introduce new security challenges, including inconsistent functionality, disparate management and a lack of integration points. An NGFW must provide flexibility and in-depth integration with virtual data centers in private and public cloud environments to streamline the creation of application-centric policies.

To learn more about what features an NGFW must have to safely enable applications and organizations, read the 10 Things Your Next Firewall Must Do white paper.

POSTED BY: Eila Shargh on August 8, 2016 on the Palo Alto Networks Research Portal

Palo Alto Networks Raises the Bar for Endpoint Security

August 8th, 2016

Palo Alto Networks®, the next-generation security company, announced new functionality, including significant machine learning capabilities for real-time unknown malware prevention, to its Traps™ advanced endpoint protection offering. These updates further strengthen the malware and exploit prevention capabilities of Traps and alleviate the need for legacy antivirus products to protect endpoints, such as laptops, servers and VDI instances.

Many organisations deploy a number of security products and software agents on their endpoint systems, including one or more traditional antivirus products. Nevertheless, cyber breaches continue to increase in frequency, variety and sophistication. Traditional antivirus products struggle to keep pace and invariably fail to prevent these attacks on endpoints.

An alternative to legacy antivirus point products, Traps uniquely combines the most effective, purpose-built malware and exploit detection methods to prevent known and unknown threats before they can successfully compromise an endpoint. By focusing on detecting and blocking the techniques at the core of these attacks, Traps can prevent sophisticated, targeted and never-before-seen attacks.

As a component of the Palo Alto Networks Next-Generation Security Platform, a natively integrated and automated platform designed to safely enable applications and prevent cyber breaches, Traps both shares with and receives threat intelligence information from the Palo Alto Networks WildFire™ cloud-based malware analysis environment. Threat intelligence information is passed to WildFire by each component of the security platform, and Traps uses this information to block threats on the endpoint no matter where they originated.

The new functionality announced today, which includes static analysis via machine learning and trusted publisher capabilities, will allow Traps to detect and immediately prevent malware that has never been seen.

Quotes

“The sophistication and frequency of cyberattacks are growing too quickly for legacy antivirus tools that rely on malware signatures to keep pace. The Palo Alto Networks Traps offering takes an innovative approach to endpoint security, keeping endpoints more secure despite a growing landscape of cyberthreats and reducing the resources required by IT teams to track and install security patches.”

Rob Westervelt, research manager, Security Products, IDC

“Antivirus point products give organisations a false sense of security, because while they technically make users compliant with regulatory and corporate governance requirements, they do not protect against today’s advanced cyberthreats. To do that, organisations must adopt a cybersecurity platform that prevents malware from infiltrating the enterprise at any point, including the endpoint, even if it has never been seen before.”

Lee Klarich, executive vice president, Product Management, Palo Alto Networks

The latest version of Traps, version 3.4, will be available by the end of August on the Palo Alto Networks Support Portal and will include the following updates:

• Static analysis via machine learning examines hundreds of characteristics of a file to determine if it is malware. Threat intelligence available through the Palo Alto Networks WildFire subscription is used to train a machine learning model to recognise malware, especially previously unknown variants, with unmatched effectiveness and accuracy. This new functionality allows Traps to rapidly determine if a file should be allowed to run even before receiving a verdict from WildFire (a toy sketch of this general technique follows this list).
  • Trusted publisher identification allows organisations to automatically and immediately identify new executable files published by trusted and reputable software publishers. These executable files are allowed to run, cutting down on unnecessary analysis and allowing them to execute without delay or impact to the user.
  • Quarantine of malicious executables immediately removes malicious files and prevents further propagation or execution attempts of the files.
  • Grayware classification allows enterprises to identify non-malicious, but otherwise undesirable, software and prevent it from running in their environment.
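
To show the general shape of the static-analysis-via-machine-learning technique (not Palo Alto Networks' actual model or feature set), here is a toy sketch: a classifier trained on static file characteristics returns a fast local verdict before any cloud analysis completes. The features, training data and threshold are all illustrative assumptions. Requires scikit-learn.

```python
# Toy sketch of static analysis via machine learning: train on static
# file characteristics, then issue a fast local verdict for new files.
from sklearn.ensemble import RandomForestClassifier

# Each row: [file_size_kb, entropy, num_imports, is_signed]
X = [
    [120, 4.1, 85, 1],   # benign examples
    [300, 4.8, 200, 1],
    [90,  7.6, 3,  0],   # malicious examples (packed: high entropy, few imports)
    [45,  7.9, 1,  0],
]
y = [0, 0, 1, 1]  # 0 = benign, 1 = malware

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def local_verdict(features, threshold=0.5):
    """Fast local verdict from static features alone, before any
    cloud (sandbox) verdict comes back."""
    p_malware = model.predict_proba([features])[0][1]
    return "block" if p_malware >= threshold else "allow"

print(local_verdict([60, 7.8, 2, 0]))  # -> "block"
```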

Learn More

  • Read the Traps 3.4 blog post
  • Register for the upcoming webinar, Protect Yourself From Antivirus
  • Read the white paper, Protect Yourself From Antivirus
  • Palo Alto Networks Traps Advanced Endpoint Protection
  • Palo Alto Networks WildFire Cloud-Based Malware Analysis Environment
  • Palo Alto Networks Next-Generation Security Platform
  • View the original article at Palo Alto Networks.

    New Structure at Net-Ctrl

    July 28th, 2016

Net-Ctrl has been going through a period of change over the last few months. As many of you will be aware, the business was previously co-owned by Tony Pullon and Lee Georgio. Tony has now left the business, and we wish him a great summer break and thank him for his years of dedication.

The departure has brought change to Net-Ctrl; below is our revised senior management team structure:

  • Lee Georgio – Director
  • Lesley Cook – Finance and Office Manager
  • Mark Power – Major Accounts Sales Manager
  • Josh Moore – Marketing and Internal Sales Manager
  • Carol Gorman – Renewals and Major Accounts Sales Manager
In addition to those listed above, we have our internal and external sales and technical teams.

    We’re all very excited about what the future holds for Net-Ctrl, and we are all dedicated more than ever to delivering great solutions and services to our customers.

    Following the changes, if you’re unsure who your account manager is please contact josh.moore@net-ctrl.com.

    We wish you all a great Summer.

    No Brexcuses: GDPR preparations must continue

    July 28th, 2016

Whether or not you voted for Brexit, and whether or not you believe it’s a done deal, there’s one thing post-referendum that surely isn’t up for debate. For British companies wanting to trade with Europe, the bureaucracy of Brussels isn’t going away. And that particularly applies to data protection. Some business people may well have heaved a sigh of relief on June 24th at the thought that the GDPR (General Data Protection Regulation), the tough new European data protection regulation that was adopted in April 2016 and comes into force in May 2018, would no longer apply in the UK. That idea was based on the premise that the important thing is where the data is stored.

Unfortunately, that’s not true under GDPR. What matters is whether the data concerns EU citizens, irrespective of where it is stored. Current UK data protection legislation comes from the Data Protection Act 1998, based on the 1995 Data Protection Directive. That will be superseded in Europe by GDPR less than two years from now. In other words, even if Article 50 were notified right now, GDPR would come into force before the two-year post-notification period ran out. Because GDPR is a Regulation and not a Directive, it does not require enabling national legislation to become law. That means it will apply in the United Kingdom whether we like it or not. Even once Brexit is fully negotiated and implemented, the chances are that the UK will either have to comply with GDPR or implement data protection legislation of its own that the EU deems adequate (i.e. the same or very similar) if it wishes to keep trading with the European Union. The same is likely to apply to the Network and Information Security Directive, which must be implemented in national law by May 2018.

So, if UK businesses have any ambition to continue selling to European customers, viewing Brexit as an opportunity to side-step data protection obligations is a serious mistake. Despite the short-term disruption, the GDPR is likely to have a positive impact on the data security industry: it will accelerate the modernisation of Europe’s data security practices and enforce consistency of approach between EU member states. Nonetheless, it will require European businesses of all sizes, including those in the UK, to take a very close look at their security. From both commercial and practical perspectives, preparations must continue. Regardless of what you make of either Brexit or the GDPR, businesses in the UK have no choice but to keep pace with the regulation.

    Original post by Swivel Secure. View original post.

    Gemalto and Ponemon Institute Study: Cloud data security still a challenge for many companies

    July 28th, 2016
• Half of all cloud services and corporate data stored in the cloud are not controlled by IT departments
• Only a third of sensitive data stored in cloud-based applications is encrypted
• More than half of companies do not have a proactive approach for compliance with privacy and security regulations for data in cloud environments
• Simple measures by IT organizations can protect both corporate data and “shadow IT” needs

Despite the continued importance of cloud computing resources to organizations, companies are not adopting appropriate governance and security measures to protect sensitive data in the cloud. These are just a few of the findings from a Ponemon Institute study titled “The 2016 Global Cloud Data Security Study,” commissioned by Gemalto (Euronext NL0000400653 GTO), the world leader in digital security. The study surveyed more than 3,400 IT and IT security practitioners worldwide to gain a better understanding of key trends in data governance and security practices for cloud-based services.

    [On July 28 at 10 a.m. EDT, Gemalto and the Ponemon Institute will host a webinar to discuss the full results of the study. Click on the following link to register: https://www.brighttalk.com/webcast/2037/216247].

According to 73 percent of respondents, cloud-based services and platforms are considered important to their organization’s operations, and 81 percent said they will be more so over the next two years. In fact, 36 percent of respondents said their companies’ total IT and data processing needs were met using cloud resources today, and they expected this to increase to 45 percent over the next two years.

    Although cloud-based resources are becoming more important to companies’ IT operations and business strategies, 54 percent of respondents did not agree their companies have a proactive approach to managing security and complying with privacy and data protection regulations in cloud environments. This is despite the fact that 65 percent of respondents said their organizations are committed to protecting confidential or sensitive information in the cloud. Furthermore, 56 percent did not agree their organization is careful about sharing sensitive information in the cloud with third parties such as business partners, contractors and vendors.

    “Cloud security continues to be a challenge for companies, especially in dealing with the complexity of privacy and data protection regulations,” said Dr. Larry Ponemon, chairman and founder, Ponemon Institute. “To ensure compliance, it is important for companies to consider deploying such technologies as encryption, tokenization or other cryptographic solutions to secure sensitive data transferred and stored in the cloud.”

“Organizations have embraced the cloud with its benefits of cost and flexibility but they are still struggling with maintaining control of their data and compliance in virtual environments,” said Jason Hart, Vice President and Chief Technology Officer for Data Protection at Gemalto. “It’s quite obvious security measures are not keeping pace because the cloud challenges traditional approaches of protecting data when it was just stored on the network. It is an issue that can only be solved with a data-centric approach in which IT organizations can uniformly protect customer and corporate information across the dozens of cloud-based services their employees and internal departments rely on every day.”

    Key Findings

    Cloud security is stormy because of shadow IT
    According to respondents, nearly half (49 percent) of cloud services are deployed by departments other than corporate IT, and an average of 47 percent of corporate data stored in cloud environments is not managed or controlled by the IT department. However, confidence in knowing all cloud computing services in use is increasing. Fifty-four percent of respondents are confident that the IT organization knows all cloud computing applications, platform or infrastructure services in use – a nine percent increase from 2014.

    Conventional security practices do not apply in the cloud
    In 2014, 60 percent of respondents felt it was more difficult to protect confidential or sensitive information when using cloud services. This year, 54 percent said the same. Difficulty in controlling or restricting end-user access increased from 48 percent in 2014 to 53 percent of respondents in 2016. The other major challenges that make security difficult include the inability to apply conventional information security in cloud environments (70 percent of respondents) and the inability to directly inspect cloud providers for security compliance (69 percent of respondents).

    More customer information is being stored in the cloud and is considered the data most at risk
    According to the survey, customer information, emails, consumer data, employee records and payment information are the types of data most often stored in the cloud. Since 2014, the storage of customer information in the cloud has increased the most, from 53 percent in 2014 to 62 percent of respondents saying their company was doing this today. Fifty-three percent also considered customer information the data most at risk in the cloud.

    Security departments left in the dark when it comes to buying cloud services
Only 21 percent of respondents said members of the security team are involved in the decision-making process about using certain cloud applications or platforms. The majority of respondents (64 percent) also said their organizations do not have a policy that requires use of security safeguards, such as encryption, as a condition to using certain cloud computing applications.

    Encryption is important but not yet pervasive in the cloud
    Seventy-two percent of respondents said the ability to encrypt or tokenize sensitive or confidential data is important, with 86 percent saying it will become more important over the next two years, up from 79 percent in 2014. While the importance of encryption is growing, it is not yet widely deployed in the cloud. For example, for SaaS, the most popular type of cloud-based service, only 34 percent of respondents say their organization encrypts or tokenizes sensitive or confidential data directly within cloud-based applications.
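
A minimal sketch of what encrypting sensitive data before it reaches a cloud application can look like from the client side, assuming the `cryptography` package (pip install cryptography) and an illustrative record layout: sensitive fields are encrypted under a key the organization keeps, so the SaaS provider only ever sees ciphertext.

```python
# Minimal sketch of client-side (data-centric) encryption before upload.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice: held in the org's own KMS/HSM
f = Fernet(key)

record = {"customer": "Alice Example", "card_last4": "4242"}

# Encrypt before upload: the cloud provider only ever sees ciphertext.
protected = {k: f.encrypt(v.encode()) for k, v in record.items()}
# ... upload `protected` to the SaaS application ...

# Only holders of the key can recover the plaintext.
restored = {k: f.decrypt(v).decode() for k, v in protected.items()}
assert restored == record
```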

    Many companies still rely on passwords to secure user access to cloud services
Sixty-seven percent of respondents said the management of user identities is more difficult in the cloud than on-premises. However, organizations are not adopting measures that are easy to implement and could increase cloud security. About half (45 percent) of companies are not using multi-factor authentication to secure employee and third-party access to applications and data in the cloud, which means many companies are still relying on just user names and passwords to validate identities. This puts more data at risk because 58 percent of respondents say their organizations have third-party users accessing their data and information in the cloud.
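
For comparison, here is what the missing second factor amounts to in practice: a minimal RFC 6238 TOTP check using only the Python standard library. The shared secret is an example value; a real deployment would provision one secret per user and tolerate some clock drift.

```python
# Minimal RFC 6238 TOTP sketch: the time-based codes generated by
# common authenticator apps, verified server-side alongside a password.
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32)
    counter = struct.pack(">Q", int(time.time()) // period)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify_login(password_ok: bool, submitted_code: str, secret: str) -> bool:
    # A password alone is never sufficient: the second factor must match too.
    return password_ok and hmac.compare_digest(submitted_code, totp(secret))

SECRET = "JBSWY3DPEHPK3PXP"  # example base32 secret for illustration
print(totp(SECRET))          # current 6-digit code
```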

    Recommendations for Data Security in the Cloud

    The new realities of Cloud IT mean that IT organizations need to set comprehensive policies for data governance and compliance, create guidelines for the sourcing of cloud services, and establish rules for what data can and cannot be stored in the cloud.

IT organizations can accomplish their mission to protect corporate data while also being an enabler of their “shadow IT” by implementing data security measures such as encryption that allow them to protect data in the cloud in a centralized fashion as their internal organizations source cloud-based services as needed.

As companies store more data in the cloud and utilize more cloud-based services, IT organizations need to place greater emphasis on stronger user access controls with multi-factor authentication. This is even more important for companies that give third parties and vendors access to their data in the cloud.

    About the Survey
    The survey was conducted by the Ponemon Institute on behalf of Gemalto and surveyed 3,476 IT and IT security practitioners in the United States, Brazil, United Kingdom, Germany, France, Russian Federation, India, Japan and Australia who are familiar and involved in their company’s use of both public and private cloud resources. Industries represented among the respondents include Financial Services, Retail, Technology & Software, Public Sector, Healthcare and Pharmaceutical, Utilities & Energy, Education, Transportation, Communications, Media & Entertainment, and Hospitality.

    Related Resources

• Report: Gemalto 2016 Global Cloud Data Security Study
  • Infographic: Gemalto Cloud Data Security Infographic
  • Web Site: Gemalto 2016 Global Cloud Data Security Study Findings
  • Video: Gemalto Cloud Security Solutions Overview
• Web Site: You Can’t Secure the Cloud with Old School Technology
  • Article taken from Gemalto.com. View original.

    Cybersecurity Education Efforts Yielding Results

    July 21st, 2016

    Cybersecurity education efforts are yielding results, with 61 percent of respondents to a survey conducted by Palo Alto Networks saying they would speak with IT before introducing new devices onto a corporate network or adding business applications and tools onto unsecured devices.

With more than 25 percent of identified attacks in enterprises expected to involve IoT by 2020 [1], and many IoT devices expected to enter the workplace, this survey finding represents a significant step in the right direction and demonstrates that employees’ knowledge and understanding of their role in cybersecurity is improving.

However, the contrasting finding from this survey of business managers – who typically have the salary and the tendency to be early adopters of new technology – is that 39 percent would fly under IT’s radar. This leaves a large margin for risk.

    Further still, of the group that doesn’t go to IT, one in every eight would “not tell anyone” about bringing a new device into an organisation or installing corporate tools, such as email, onto unsecured devices.

    Attitude Impacts Adherence

    The survey found that adherence to cybersecurity policies, such as those around the introduction of a new device, is largely guided by personal attitudes and views toward technology. Of those who have circumvented their company’s cybersecurity policy in the past, the prevailing reason for doing so was that they wanted to use a more efficient tool or service, or one that was considered to be the best in the market. Companies need to enable, not limit, employee choices, using technology and education to manage risk.

    Temporary Employees Require Full-time Supervision

    Contractors were the group most often seen to be bypassing company guidelines on cybersecurity, with 16 percent of respondents saying they had seen a temporary employee circumvent policies.

    Quote

    “BYOD is now a mature concept, but many still struggle to manage the blurry lines between personal and business data access by personal devices. Many organisations have deployed solutions to manage devices, but the anxiety comes from their broad connectivity, especially as the boundaries between business-driven cloud services and personal ones become less clear, which creates unknown bridges between business networks and the Internet at large. Modern state-of-the-art security must be able to prevent any device communication becoming the point of a breach and minimise risk for an organisation.”

    Greg Day, VP and regional CSO, EMEA at Palo Alto Networks

    Recommendations

    • Organisations should continue with employee education efforts to ensure that those on the front line of defences have the skills they need to identify threats.
    • Security professionals should closely monitor the activity of non-permanent employees or contractors and ensure they receive the same policy information as full-time staff.
    • Organisations should integrate up-to-date security solutions that fit with new technology trends in order to eliminate the weaknesses exposed in an evolving computing environment.
    • Businesses should look at how they identify and enable the safe use of trusted or sanctioned cloud services and applications and manage the use of those that are untrusted and unsanctioned.

    Download “Preventing the Blocks to Cybersecurity in Business” at: https://www.paloaltonetworks.com/resources/research/preventing-blocks-to-cybersecurity-in-business

    Research Methodology

    The survey was conducted online among 765 business decision-makers in companies with 1,000+ employees in the U.K., Germany, France, the Netherlands and Belgium by Redshift Research in October 2015.

    [1] Gartner-Press Release, “Gartner Says Worldwide IoT Security Spending to Reach $348 Million in 2016”, April 25, 2016, http://www.gartner.com/newsroom/id/3291817

    Possible MOBOTIX Shipment Delay

    July 21st, 2016

MOBOTIX will be shutting down their operations from Monday, August 8 to Friday, August 26, 2016.

    All items which are in stock will be dispatched immediately once orders are confirmed.

Orders for larger quantities or special models may take longer. We recommend placing your order as soon as possible, and we will do everything in our power to assist you with your project.

    For more information please contact Mark Power on 01473 281 211 or at mark.power@net-ctrl.com.

    Why User-Based Controls Are Critical to Your Breach Prevention Strategy

    July 20th, 2016

    POSTED BY: Navneet Singh on Palo Alto Networks Blog.

    Employees, customers and partners connect to different repositories of information within your network, as well as to the internet, to perform various aspects of their jobs. These people and their many devices represent your network’s users. It’s important to your organisation’s risk posture that you’re able to identify who they are — beyond IP address — and the inherent risks they bring with them based on the particular device they’re using, especially when security policies have been circumvented or new threats have been introduced to the organisation.

    Here are two high-profile, real-world breaches that you can learn from. The key takeaway here is that, to make the most of your next-generation firewall investment, it is critical to implement user-based controls.

    Example 1: Data Breach at a Large U.S. Retailer

    This data breach started with the attackers stealing a third-party vendor’s login credentials. This allowed them to gain access to the third-party vendor environment and exploit a Windows vulnerability. Since the vendor had the privileges to access the corporate network, the attackers gained access, too. The attackers were then able to install memory-scraping malware on more than 7,500 self-checkout POS terminals. This malware was able to grab 56 million credit and debit card numbers. The malware was also able to capture 53 million email addresses.

    The SANS Institute Reading Room for InfoSec has published a report on the breach. The report mentions several ways in which the breach could have been prevented. One of the most important is to have the right access controls in place. Quoting from the report:

  • An identity and access management solution should be used to manage the identities and access of all internal and external employees (third-party vendors).
  • Each external employee should have their own account, so that there is accountability for anything performed on their behalf.
• Account review procedures should also be in place, specifically for third-party vendor accounts. Auditing of these third-party vendors is critical. This will allow the detection of abnormal behavior (a minimal sketch of such a review follows this list).
  • Having all of these controls in place for managing and monitoring the third-party vendor accounts will detect any misuse of third-party vendor credentials.
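
A minimal sketch of that account-review idea, with invented account names and baselines: each third-party account gets a baseline of what it normally touches, and anything outside the baseline is flagged for review.

```python
# Hypothetical sketch: flag third-party vendor account activity that
# falls outside its expected baseline. Names and baselines are invented.

VENDOR_BASELINES = {
    "hvac-vendor-01": {"hvac-mgmt-server"},   # should only touch HVAC systems
    "pos-vendor-02":  {"pos-update-server"},
}

def review_events(events):
    """events: iterable of (account, host_accessed) tuples."""
    alerts = []
    for account, host in events:
        allowed = VENDOR_BASELINES.get(account)
        if allowed is None or host not in allowed:
            alerts.append(f"REVIEW: {account} accessed {host}")
    return alerts

logs = [("hvac-vendor-01", "hvac-mgmt-server"),
        ("hvac-vendor-01", "pos-terminal-0042")]  # abnormal: outside baseline
print(review_events(logs))
```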

Example 2: Data Breach at a Large U.S. Banking and Financial Services Company

    This data breach started with the attackers infecting the personal computer of an employee. The malware stole the employee’s login credentials. When the employee used VPN to connect to the corporate network, the attackers were able to gain access to more than 90 corporate servers. The attackers stole private information for 76 million households and 7 million small businesses.

    The SANS Institute Reading Room for InfoSec’s report on this breach mentions the need to manage user privileges as one of the key ways to minimize the risk of a breach or minimise damage in case of a breach. Quoting from the report:

• Least privilege simply means to give someone the least amount of access to perform his or her job. If least privilege access control were applied, these organisations would have reduced the amount of stolen data by 86 percent.
  • Anonymous access must be disabled because many Windows vulnerabilities are caused by null user sessions. A null user session is essentially a Server Message Block (SMB) session with blank username and password.

What This Means for You as the Security Practitioner

    Want to make sure your organisation does not end up in the headlines for the wrong reasons, like a massive data breach? You’d do well to implement user-based controls and restrict user access to least privilege, as the SANS Institute reports recommend. Employ the right user access mechanisms not only on the endpoints and on the applications that they access but also on your next-generation firewall.

    Call to Action

    If you own a Palo Alto Networks® Next-Generation Firewall, refer to the following resources to enable User-ID™, and increase your organization’s breach defenses:

  • User-ID documentation
  • Best practice internet gateway security policy
  • User-ID tech tips

The Theory of Wi-Fi Evolution and IEEE 802.11 Selection

    July 14th, 2016

    By: Sundar Sankaran, Chief Wireless Architect

September 2015 marked the 25th anniversary of IEEE 802.11, commonly referred to as Wi-Fi. Over these 25 years, Wi-Fi has ascended from a technology that enabled computers to wirelessly transfer data at 2 Mbps to a spot in Maslow’s pyramid as the most basic human need.

IEEE 802.11 got here, as Lewis Carroll suggested, by running twice as fast. The standard has continuously advanced itself by introducing amendments, such as 802.11n, 802.11ac and 802.11ax. These amendments support higher data rates to meet ever-increasing application demands through the adoption of higher-order modulation schemes such as 64-, 256-, and 1024-QAM, by supporting channel bonding up to 160 MHz and by employing MIMO techniques to transmit multiple streams to a single client. In addition to increasing the peak data rate, efforts have been made to improve the spectral efficiency, which characterizes how well the system uses the available spectrum (how many bits of data can be pumped per second in 1 Hz of spectrum). Multi-user techniques such as MU-MIMO and OFDMA have been introduced in 802.11ac and 802.11ax to improve spectral efficiency and network capacity.
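
As a back-of-the-envelope check on how these ingredients combine, the sketch below computes peak PHY rates from stream count, subcarrier count, modulation order, coding rate and symbol duration; the parameter values are the standard 160 MHz figures for 802.11ac and 802.11ax.

```python
# Back-of-the-envelope peak PHY data rate:
#   rate = streams * data_subcarriers * bits_per_symbol * coding_rate / symbol_time
# Subcarrier counts and symbol durations are the standard 160 MHz values
# for 802.11ac (VHT) and 802.11ax (HE).

def peak_rate_mbps(streams, subcarriers, bits_per_symbol, coding, symbol_us):
    # bits per OFDM symbol across all streams, divided by symbol time in us -> Mbps
    return streams * subcarriers * bits_per_symbol * coding / symbol_us

# 802.11ac: 8 streams, 468 data subcarriers, 256-QAM (8 bits),
# rate-5/6 coding, 3.6 us symbol (short guard interval)
print(round(peak_rate_mbps(8, 468, 8, 5 / 6, 3.6)))     # ~6933 Mbps

# 802.11ax: 8 streams, 1960 data subcarriers, 1024-QAM (10 bits),
# rate-5/6 coding, 13.6 us symbol (0.8 us guard interval)
print(round(peak_rate_mbps(8, 1960, 10, 5 / 6, 13.6)))  # ~9608 Mbps
```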

    The following table summarizes the key ingredients of various IEEE 802.11 amendments ranging from 802.11b to the recently ratified 802.11ac to the upcoming 802.11ax. As evident from this table, peak PHY data rate supported by IEEE 802.11 has gone up by a factor of 5000, and spectral efficiency has improved by a factor of 625. Enhancements like this have enabled Wi-Fi to become one of the basic needs of life on par with water, air and fire.

[Table: WLAN protocols – key ingredients and peak rates of IEEE 802.11 amendments from 802.11b to 802.11ax]

    View the original article by The Ruckus Room.

    Conventional AV Systems Can Actually Harm You

    June 15th, 2016

    POSTED BY: Steve Gerrard on June 13, 2016 8:00 AM

There’s barely a day that goes by when I’m not reading another batch of stories about how unsuitable conventional endpoint AV security is for dealing with modern malware, APTs, zero-day threats and so forth.

So replete are these tales of woe that it’s almost possible to switch off from the basic fact that in this uber-connected, cloud-enabled, everything-as-a-service, internet-of-thingamajigs world, most conventional endpoint AV systems are impotent and probably do more harm than good. I write almost, but not quite, because every now and again the occasional story jumps off the screen and gives you that all-important wake-up call.

One such story, which came to light a few weeks ago, centered around an Adverse Event Report published by the U.S. Food and Drug Administration (FDA), wherein a patient (not named) undergoing a cardiac catheterisation procedure at a US hospital (also not named) had to be sedated mid-operation for five minutes while the procedure was suspended following the system crash of a vital piece of monitoring equipment.

    The system in question monitors, measures and records patient data during cardiac catheterization procedures. The system is made up of a patient data module, used to capture the patient’s vitals, and a hemo monitor PC to display them. The two elements are connected via a serial interface.

During this particular procedure, the monitor PC lost communication with the patient data module, resulting in a black screen on the monitor and the patient having to be sedated while the system was rebooted. As the FDA report describes, the cause of this blackout was attributed to the installed conventional AV software, which at a critical point in the procedure initiated a scan of the system.

Although the system could be rebooted and the patient fortunately survived, it got me thinking about the real-life harm a conventional AV could do to me. Quoting from the Manufacturer’s Narrative in the FDA Report: “Our experience has shown that improper configuration of anti-virus software can have adverse effects including downtime and clinically unusable performance.” So, although I may be sensationalizing the FDA’s paragraph a little, I’m not feeling that confident after reading the manufacturer’s narrative. Let’s face it: the team performing a standard cardiac catheterisation procedure is not likely to include an IT security engineer who can be called upon at a moment’s notice.

Could this scenario have been avoided with an Advanced Endpoint Protection system? The answer is probably yes. Traps, our advanced endpoint protection product, is not a conventional AV system — indeed, it’s a paradigm shift from “the way things used to be done.” Traps secures endpoints by preventing known and unknown malware and exploits from executing, focusing on blocking the few core techniques used by attackers rather than application-specific characteristics. Furthermore, it does this with a lightweight, nonintrusive agent that definitely does not rely on system scanning.

    View the original post on the Palo Alto Network Research Centre Website.

Learn more

• Watch a demo
• Ultimate Test Drive
• Traps

    Net-Ctrl Blog - mobile

    Not All Next-Generation Firewalls Are Created Equal

    August 9th, 2016

    As cybersecurity threats increase in sophistication, the security solutions used to defend against these threats must also evolve. Developers no longer adhere to standard port/protocol/application mapping; applications are capable of operating on non-standard ports, as well as port hopping; and users are able to force applications to run over non-standard ports, rendering first-generation firewalls ineffective in today’s threat environment. Enter the “next-generation firewall” (NGFW), the next stage of firewall and intrusion prevention systems (IPS) technology.

    A common understanding of an NGFW is a network platform that combines the traditional firewall functionalities with IPS and application control. However, merely bundling traditional firewalls with IPS and application control does not result in an NGFW. A true NGFW emphasizes native integration, classifies traffic based on applications rather than ports, performs a deep inspection of traffic and blocks attacks before a network can be infiltrated. Here is a list of key features of a true NGFW to better inform your next purchase decision.

    Identify and control applications and functions on all ports, all the time

    An NGFW should identify traffic on all ports at all times, and classify each application, while monitoring for changes that may indicate when an unpermitted function is being used. For example, using Citrix GoToMeeting for desktop sharing is permitted but allowing an external user to take control is not.

    Identify users regardless of device or IP address

    Knowing who is using which applications on the network, and who is transferring files that may contain threats, strengthens an organization’s security policies and reduces incident response times. An NGFW must get user identity from multiple sources – such as VPN solutions, WLAN controllers and directory servers – and allow policies that safely enable applications based on users, or groups of users, in outbound or inbound directions.

    Identify and control security evasion tactics

    There are two different classes of applications that evade security policies: applications that are designed to evade security, like external proxies and non-VPN-related encrypted tunnels (e.g., CGIProxy), and those that can be adapted to achieve the same goal such as remote server/desktop management tools (e.g., TeamViewer). An NGFW must have specific techniques that identify and control all applications, regardless of port, protocol, encryption or other evasive tactics and know how often that firewall’s application intelligence is updated and maintained.

    Decrypt and inspect SSL and control SSH

    An NGFW should be able to recognize and decrypt SSL and SSH on any port, inbound or outbound; have policy control over decryption; and offer the necessary hardware and software elements to perform SSL decryption simultaneously across tens of thousands of SSL connections with predictable performance.

    Systematically manage unknown traffic

    Unknown traffic represents significant risks and is highly correlated to threats that move along the network. An NGFW must classify and manage all traffic on all ports in one location and quickly analyze the traffic, known and unknown, to determine if it’s an internal/custom application, a commercial application without a signature, or a threat.

    Protect the network against known and unknown threats in all applications and on all ports

    Applications enable businesses, but they also act as a cyberthreat vector, supporting technologies that are frequent targets for exploits. An NGFW must first identify the application, determine the functions that should be permitted or blocked, and protect the organization from known and unknown threats, exploits, viruses/malware or spyware. This must be done automatically with near-real time updates to protect from newly discovered threats globally.

    Deliver consistent policy control over all traffic, regardless of user location or device type

    An NGFW should provide consistent visibility and control over traffic, regardless of where the user is and what device is being used, without introducing performance latency for the user, additional work for the administrator, or significant cost for the organization.

    Simplify network security

    To simplify and effectively manage already overloaded security processes and people, an NGFW must enable easy translation of your business policy to your security rules. This will allow policies that directly support business initiatives.

    Perform computationally intensive tasks without impacting performance

    An increase in security features often means significantly lower throughput and performance. An NGFW should deliver visibility and control including content scanning, which is computationally intensive, in high-throughput networks with little tolerance for latency.

    Deliver the same firewall functions in both a hardware and virtualized form factor

    Virtualization and cloud computing environments introduce new security challenges, including inconsistent functionality, disparate management and a lack of integration points. An NGFW must provide flexibility and in-depth integration with virtual data centers in private and public cloud environments to streamline the creation of application-centric policies.

    To learn more about what features a NGFW must have to safely enable applications and organizations, read the 10 Things Your Next Firewall Must Do white paper.

    POSTED BY: Eila Shargh on August 8, 2016 on Palo Alto Network Research Portal

    Palo Alto Networks Raises the Bar for Endpoint Security

    August 8th, 2016

    Palo Alto Networks®, the next-generation security company, announced new functionality, including significant machine learning capabilities for real-time unknown malware prevention, to its Traps™ advanced endpoint protection offering. These updates further strengthen the malware and exploit prevention capabilities of Traps and alleviate the need for legacy antivirus products to protect endpoints, such as laptops, servers and VDI instances.

    Many organisations deploy a number of security products and software agents on their endpoint systems, including one or more traditional antivirus products. Nevertheless, cyber breaches continue to increase in frequency, variety and sophistication. Traditional antivirus products struggle to keep pace and invariably fail to prevent these attacks on endpoints.

    An alternative to legacy antivirus point products, Traps uniquely combines the most effective, purpose-built malware and exploit detection methods to prevent known and unknown threats before they can successfully compromise an endpoint. By focusing on detecting and blocking the techniques at the core of these attacks, Traps can prevent sophisticated, targeted and never-before-seen attacks.

    As a component of the Palo Alto Networks Next-Generation Security Platform, a natively integrated and automated platform designed to safely enable applications and prevent cyber breaches, Traps both shares with and receives threat intelligence information from the Palo Alto Networks WildFire™ cloud-based malware analysis environment. Threat intelligence information is passed to WildFire by each component of the security platform, and Traps uses this information to block threats on the endpoint no matter where they originated.

    The new functionality announced today, which includes static analysis via machine learning and trusted publisher capabilities, will allow Traps to detect and immediately prevent malware that has never been seen.

    Quotes

    “The sophistication and frequency of cyberattacks are growing too quickly for legacy antivirus tools that rely on malware signatures to keep pace. The Palo Alto Networks Traps offering takes an innovative approach to endpoint security, keeping endpoints more secure despite a growing landscape of cyberthreats and reducing the resources required by IT teams to track and install security patches.”

    Rob Westervelt, research manager, Security Products, IDC

    “Antivirus point products give organisations a false sense of security, because while they technically make users compliant with regulatory and corporate governance requirements, they do not protect against today’s advanced cyberthreats. To do that, organisations must adopt a cybersecurity platform that prevents malware from infiltrating the enterprise at any point, including the endpoint, even if it has never been seen before.”

    Lee Klarich, executive vice president, Product Management, Palo Alto Networks

    The latest version of Traps, version 3.4, will be available by the end of August on the Palo Alto Networks Support Portal and will include the following updates:

  • Static analysis via machine learning examines hundreds of characteristics of a file to determine if it is malware. Threat intelligence available through the Palo Alto Networks WildFire subscription is used to train a machine learning model to recognise malware, especially previously unknown variants, with unmatched effectiveness and accuracy. This new functionality allows Traps to rapidly determine if a file should be allowed to run even before receiving a verdict from WildFire.
  • Trusted publisher identification allows organisations to automatically and immediately identify new executable files published by trusted and reputable software publishers. These executable files are allowed to run, cutting down on unnecessary analysis and allowing them to execute without delay or impact to the user.
  • Quarantine of malicious executables immediately removes malicious files and prevents further propagation or execution attempts of the files.
  • Grayware classification allows enterprises to identify non-malicious, but otherwise undesirable, software and prevent it from running in their environment.
  • Learn More

  • Read the Traps 3.4 blog post
  • Register for the upcoming webinar, Protect Yourself From Antivirus
  • Read the white paper, Protect Yourself From Antivirus
  • Palo Alto Networks Traps Advanced Endpoint Protection
  • Palo Alto Networks WildFire Cloud-Based Malware Analysis Environment
  • Palo Alto Networks Next-Generation Security Platform
  • View the original article at Palo Alto Networks.

    New Structure at Net-Ctrl

    July 28th, 2016

    Net-Ctrl has been going through a period of change over the last few months. As many of you will be aware the business was previously co-owned by Tony Pullon and Lee Georgio. Tony has now left the business, and we wish him a great Summer break and thank him for his years of dedication.

    The departure has bought change to Net-Ctrl, below is our revised senior management team structure:

  • Lee Georgio – Director
  • Lesley Cook – Finance and Office Manager
  • Mark Power – Major Accounts Sales Manager
  • Josh Moore – Marketing and Internal Sales Manager
  • Carol Gorman – Renewals and Major Accounts Sales Manager
  • In addition to those listed above we have our internal and external sales and technical teams.

    We’re all very excited about what the future holds for Net-Ctrl, and we are all dedicated more than ever to delivering great solutions and services to our customers.

    Following the changes, if you’re unsure who your account manager is please contact josh.moore@net-ctrl.com.

    We wish you all a great Summer.

    No Brexcuses: GDPR preparations must continue

    July 28th, 2016

    Whether or not you voted for Brexit, whether or not you believe it’s a done deal, there’s one thing post-referendum that surely isn’t up for debate. For British companies wanting to trade with Europe, the bureaucracy of Brussels isn’t going away. And that particularly applies to data protection. Some business people may well have heaved a sigh of relief on June 24th at the thought that GDPR (General Data Protection Regulation) the tough new European data protection regulation that was adopted in April 2016 and comes into force in May 2018 would no longer apply in the UK. That idea was based on the premise that the important thing is where the data is stored.

    Unfortunately, that’s not true under GDPR. What matters is whether the data concerns EU citizens, irrespective of where it is stored. Current UK data protection legislation comes from the Data Protection Act 1998, based on the 1995 Data Protection Directive. That will be superseded in Europe by GDPR less than two years from now. In other words, even if Article 50 were notified right now, GDPR would come into force before the Article 50 two-year post notification period runs out. Because GDPR is a Regulation and not a Directive, it does not require enabling national legislation to become law. That means it will apply in the United Kingdom, whether we like it or not. Even once Brexit is fully negotiated and implemented the chances are that the UK will either have to comply with GDPR or implement data protection legislation of its own that the EU deems adequate (i.e. the same or very similar) if it wishes to keep trading with the European Union. This is likely to be equally applicable to the Network and Information Security Directive which has until May 2018 to be implemented in national law.

    So, if UK businesses have any ambition to continue selling to European customers, viewing Brexit as an opportunity to side-step data protection obligations is a serious mistake. Despite the GDPR’s short term disruption, the regulation is likely to have a positive impact on data security industry. It will accelerate the modernisation of Europe’s data security practices and enforce consistency of approach between EU member states. Nonetheless, it will require European business of all sizes to take a very close look at their security, including those in the UK. From both commercial and practical perspectives, preparations must continue. Regardless of what you make of either Brexit or the GDPR, businesses in the UK have no choice but to keep pace with the regulation.

    Original post by Swivel Secure. View original post.

    Gemalto and Ponemon Institute Study: Cloud data security still a challenge for many companies

    July 28th, 2016
  • Half of all cloud services and corporate data stored in cloud not controlled by IT departments
  • Only a third of sensitive data stored in cloud-based applications is encrypted
  • More than half of companies do not have a proactive approach for compliance with privacy and security regulations for data in cloud environments
  • Simple measures by IT organizations provide protection for both corporate data and “shadow IT” needs.
  • Despite the continued importance of cloud computing resources to organizations, companies are not adopting appropriate governance and security measures to protect sensitive data in the cloud. These are just a few findings a Ponemon Institute study titled “The 2016 Global Cloud Data Security Study,” commissioned by Gemalto (Euronext NL0000400653 GTO), the world leader in digital security. The study surveyed more than 3,400 IT and IT security practitioners worldwide to gain a better understanding of key trends in data governance and security practices for cloud-based services.

    [On July 28 at 10 a.m. EDT, Gemalto and the Ponemon Institute will host a webinar to discuss the full results of the study. Click on the following link to register: https://www.brighttalk.com/webcast/2037/216247].

    According to 73 percent of respondents, cloud-based services and platforms are considered important to their organization’s operations and 81 percent said they will be more so over the next two years. In fact, thirty-six percent of respondents said their companies’ total IT and data processing needs were met using cloud resources today and that they expected this to increase to forty-five percent over the next two years.

    Although cloud-based resources are becoming more important to companies’ IT operations and business strategies, 54 percent of respondents did not agree their companies have a proactive approach to managing security and complying with privacy and data protection regulations in cloud environments. This is despite the fact that 65 percent of respondents said their organizations are committed to protecting confidential or sensitive information in the cloud. Furthermore, 56 percent did not agree their organization is careful about sharing sensitive information in the cloud with third parties such as business partners, contractors and vendors.

    “Cloud security continues to be a challenge for companies, especially in dealing with the complexity of privacy and data protection regulations,” said Dr. Larry Ponemon, chairman and founder, Ponemon Institute. “To ensure compliance, it is important for companies to consider deploying such technologies as encryption, tokenization or other cryptographic solutions to secure sensitive data transferred and stored in the cloud.”

    “Organizations have embraced the cloud with its benefits of cost and flexibility but they are still struggling with maintaining control of their data and compliance in virtual environments,” said Jason Hart, Vice President and Chief Technology Officer for Data Protection at Gemalto. “It’s quite obvious security measures are not keeping pace because the cloud challenges traditional approaches of protecting data when it was just stored on the network. It is an issue that can only be solved with a data-centric approach in which IT organizations can uniformly protect customer and corporate information across the dozens of cloud-based services their employees and internal departments rely every day.”

    Key Findings

    Cloud security is stormy because of shadow IT
    According to respondents, nearly half (49 percent) of cloud services are deployed by departments other than corporate IT, and an average of 47 percent of corporate data stored in cloud environments is not managed or controlled by the IT department. However, confidence in knowing all cloud computing services in use is increasing. Fifty-four percent of respondents are confident that the IT organization knows all cloud computing applications, platform or infrastructure services in use – a nine percent increase from 2014.

    Conventional security practices do not apply in the cloud
    In 2014, 60 percent of respondents felt it was more difficult to protect confidential or sensitive information when using cloud services. This year, 54 percent said the same. Difficulty in controlling or restricting end-user access increased from 48 percent in 2014 to 53 percent of respondents in 2016. The other major challenges that make security difficult include the inability to apply conventional information security in cloud environments (70 percent of respondents) and the inability to directly inspect cloud providers for security compliance (69 percent of respondents).

    More customer information is being stored in the cloud and is considered the data most at risk
    According to the survey, customer information, emails, consumer data, employee records and payment information are the types of data most often stored in the cloud. Since 2014, the storage of customer information in the cloud has increased the most, from 53 percent in 2014 to 62 percent of respondents saying their company was doing this today. Fifty-three percent also considered customer information the data most at risk in the cloud.

    Security departments left in the dark when it comes to buying cloud services
    Only 21 percent of respondents said members of the security team are involved in the decision-making process about using certain cloud applications or platforms. The majority of respondents (64 percent) also said their organizations do not have a policy that requires the use of security safeguards, such as encryption, as a condition of using certain cloud computing applications.

    Encryption is important but not yet pervasive in the cloud
    Seventy-two percent of respondents said the ability to encrypt or tokenize sensitive or confidential data is important, with 86 percent saying it will become more important over the next two years, up from 79 percent in 2014. While the importance of encryption is growing, it is not yet widely deployed in the cloud. For example, for SaaS, the most popular type of cloud-based service, only 34 percent of respondents say their organization encrypts or tokenizes sensitive or confidential data directly within cloud-based applications.
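
    To make the encryption recommendation concrete, here is a minimal Python sketch of client-side field encryption, so that a cloud application only ever stores ciphertext. It uses the Fernet recipe from the open-source cryptography library; the record layout and the put_record upload call are hypothetical placeholders rather than any particular SaaS API, and a real deployment would fetch the key from a key management service instead of generating it in code.

        # Minimal sketch: encrypt sensitive fields before they leave for the cloud.
        # Requires: pip install cryptography. put_record() is a hypothetical
        # stand-in for whatever cloud/SaaS upload call is actually used.
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()      # in production: retrieved from a KMS/HSM
        fernet = Fernet(key)

        def put_record(record: dict) -> None:
            print("uploading:", record)  # placeholder for the real upload

        def store_customer_record(record: dict) -> None:
            protected = dict(record)
            # Encrypt only the sensitive field; other metadata stays searchable.
            protected["card_number"] = fernet.encrypt(
                record["card_number"].encode()).decode()
            put_record(protected)

        store_customer_record({"customer_id": "c-1001",
                               "card_number": "4111111111111111"})

    Tokenization follows the same shape, except the sensitive value is swapped for a reference token and the real value is kept in a separate vault.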

    Many companies still rely on passwords to secure user access to cloud services
    Sixty-seven percent of respondents said the management of user identities is more difficult in the cloud than on-premises. However, organizations are not adopting measures that are easy to implement and could increase cloud security. About half (45 percent) of companies are not using multi-factor authentication to secure employee and third-party access to applications and data in the cloud, which means many companies are still relying on just user names and passwords to validate identities. This puts more data at risk because 58 percent of respondents say their organizations have third-party users accessing their data and information in the cloud.
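
    As an illustration of the multi-factor point, the sketch below layers a time-based one-time password (TOTP) check on top of a password check using the open-source pyotp library. The user store and verify_password are hypothetical placeholders; the point is simply that a stolen password alone no longer validates an identity.

        # Minimal sketch: require a TOTP code as a second factor at login.
        # Requires: pip install pyotp. USERS and verify_password() are
        # hypothetical placeholders for a real identity store.
        import pyotp

        USERS = {"alice": {"totp_secret": pyotp.random_base32()}}

        def verify_password(username: str, password: str) -> bool:
            return True  # placeholder for a real credential check

        def login(username: str, password: str, otp_code: str) -> bool:
            if not verify_password(username, password):
                return False                      # factor 1: something you know
            totp = pyotp.TOTP(USERS[username]["totp_secret"])
            return totp.verify(otp_code)          # factor 2: something you have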

    Recommendations for Data Security in the Cloud

  • The new realities of Cloud IT mean that IT organizations need to set comprehensive policies for data governance and compliance, create guidelines for the sourcing of cloud services, and establish rules for what data can and cannot be stored in the cloud.
  • IT organizations can protect corporate data while still enabling “shadow IT” by implementing data security measures, such as encryption, that let them protect data in the cloud in a centralized fashion even as internal departments source cloud-based services as needed.
  • As companies store more data in the cloud and use more cloud-based services, IT organizations need to place greater emphasis on stronger user access controls with multi-factor authentication. This is even more important for companies that give third parties and vendors access to their data in the cloud.

    About the Survey
    The survey was conducted by the Ponemon Institute on behalf of Gemalto and surveyed 3,476 IT and IT security practitioners in the United States, Brazil, United Kingdom, Germany, France, Russian Federation, India, Japan and Australia who are familiar with and involved in their company’s use of both public and private cloud resources. Industries represented among the respondents include Financial Services, Retail, Technology & Software, Public Sector, Healthcare and Pharmaceutical, Utilities & Energy, Education, Transportation, Communications, Media & Entertainment, and Hospitality.

    Related Resources

  • Report: Gemalto 2016 Global Cloud Data Security Study
  • Infographic: Gemalto Cloud Data Security Infographic
  • Web Site: Gemalto 2016 Global Cloud Data Security Study Findings
  • Video: Gemalto Cloud Security Solutions Overview
  • Web Site: You Can’t Secure the Cloud with Old School Technology
  • Article taken from Gemalto.com. View original.

    Cybersecurity Education Efforts Yielding Results

    July 21st, 2016

    Cybersecurity education efforts are yielding results, with 61 percent of respondents to a survey conducted by Palo Alto Networks saying they would speak with IT before introducing new devices onto a corporate network or adding business applications and tools onto unsecured devices.

    With more than 25 percent of identified attacks in enterprises expected to involve IoT by 2020 [1] – and many of those devices expected to enter the workplace – this survey finding represents a significant step in the right direction and demonstrates that employees’ knowledge and understanding of their role in cybersecurity are improving.

    However, the contrasting finding from this survey of business managers – who typically have both the means and the inclination to be early adopters of new technology – is that 39 percent would fly under IT’s radar, leaving a large margin for risk.

    Further still, of the group that doesn’t go to IT, one in every eight would “not tell anyone” about bringing a new device into an organisation or installing corporate tools, such as email, onto unsecured devices.

    Attitude Impacts Adherence

    The survey found that adherence to cybersecurity policies, such as those around the introduction of a new device, is largely guided by personal attitudes and views toward technology. Of those who have circumvented their company’s cybersecurity policy in the past, the prevailing reason for doing so was that they wanted to use a more efficient tool or service, or one that was considered to be the best in the market. Companies need to enable, not limit, employee choices, using technology and education to manage risk.

    Temporary Employees Require Full-time Supervision

    Contractors were the group most often seen to be bypassing company guidelines on cybersecurity, with 16 percent of respondents saying they had seen a temporary employee circumvent policies.

    Quote

    “BYOD is now a mature concept, but many still struggle to manage the blurry lines between personal and business data access by personal devices. Many organisations have deployed solutions to manage devices, but the anxiety comes from their broad connectivity, especially as the boundaries between business-driven cloud services and personal ones become less clear, which creates unknown bridges between business networks and the Internet at large. Modern state-of-the-art security must be able to prevent any device communication becoming the point of a breach and minimise risk for an organisation.”

    Greg Day, VP and regional CSO, EMEA at Palo Alto Networks

    Recommendations

    • Organisations should continue with employee education efforts to ensure that those on the front line of defences have the skills they need to identify threats.
    • Security professionals should closely monitor the activity of non-permanent employees or contractors and ensure they receive the same policy information as full-time staff.
    • Organisations should integrate up-to-date security solutions that fit with new technology trends in order to eliminate the weaknesses exposed in an evolving computing environment.
    • Businesses should look at how they identify and enable the safe use of trusted or sanctioned cloud services and applications and manage the use of those that are untrusted and unsanctioned.

    Download “Preventing the Blocks to Cybersecurity in Business” at: https://www.paloaltonetworks.com/resources/research/preventing-blocks-to-cybersecurity-in-business

    Research Methodology

    The survey was conducted online among 765 business decision-makers in companies with 1,000+ employees in the U.K., Germany, France, the Netherlands and Belgium by Redshift Research in October 2015.

    [1] Gartner Press Release, “Gartner Says Worldwide IoT Security Spending to Reach $348 Million in 2016”, April 25, 2016, http://www.gartner.com/newsroom/id/3291817

    Possible MOBOTIX Shipment Delay

    July 21st, 2016

    MOBOTIX will be shutting down its operations from Monday, August 8 to Friday, August 26, 2016.

    All items which are in stock will be dispatched immediately once orders are confirmed.

    Orders for larger quantities or special models may take longer. We recommend placing your order as soon as possible, and we will do everything in our power to assist you with your project.

    For more information please contact Mark Power on 01473 281 211 or at mark.power@net-ctrl.com.

    Why User-Based Controls Are Critical to Your Breach Prevention Strategy

    July 20th, 2016

    POSTED BY: Navneet Singh on Palo Alto Networks Blog.

    Employees, customers and partners connect to different repositories of information within your network, as well as to the internet, to perform various aspects of their jobs. These people and their many devices represent your network’s users. It’s important to your organisation’s risk posture that you’re able to identify who they are — beyond IP address — and the inherent risks they bring with them based on the particular device they’re using, especially when security policies have been circumvented or new threats have been introduced to the organisation.
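
    As a rough sketch of what identity beyond IP address can mean in practice, the Python fragment below maintains an IP-to-user mapping fed by identity sources (a VPN login event, in this toy case) and evaluates policy against the user rather than the address. Every feed and group name here is invented for illustration; a commercial firewall does this mapping far more robustly and from many more sources.

        # Minimal sketch: map IPs to users from identity sources, then apply
        # a user-based policy. All data here is hypothetical illustration.
        ip_to_user = {}   # continuously refreshed from VPN/WLAN/directory feeds

        def on_vpn_login(ip: str, username: str) -> None:
            ip_to_user[ip] = username        # e.g. parsed from VPN server events

        def allowed(ip: str, application: str, app_grants: dict) -> bool:
            user = ip_to_user.get(ip)
            if user is None:
                return False                 # unknown identity: default deny
            return application in app_grants.get(user, set())

        on_vpn_login("10.0.5.23", "alice")
        grants = {"alice": {"citrix-gotomeeting", "ms-exchange"}}
        print(allowed("10.0.5.23", "teamviewer", grants))   # False: not permitted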

    Here are two high-profile, real-world breaches that you can learn from. The key takeaway here is that, to make the most of your next-generation firewall investment, it is critical to implement user-based controls.

    Example 1: Data Breach at a Large U.S. Retailer

    This data breach started with the attackers stealing a third-party vendor’s login credentials. This allowed them to gain access to the third-party vendor environment and exploit a Windows vulnerability. Since the vendor had the privileges to access the corporate network, the attackers gained access, too. The attackers were then able to install memory-scraping malware on more than 7,500 self-checkout POS terminals. The malware grabbed 56 million credit and debit card numbers and also captured 53 million email addresses.

    The SANS Institute Reading Room for InfoSec has published a report on the breach. The report mentions several ways in which the breach could have been prevented. One of the most important is to have the right access controls in place. Quoting from the report:

  • An identity and access management solution should be used to manage the identities and access of all internal and external employees (third-party vendors).
  • Each external employee should have their own account, so that there is accountability for anything performed on their behalf.
  • Account review procedures should also be in place, specifically for third-party vendor accounts. Auditing of these third-party vendors is critical. This will allow the detection of abnormal behavior.
  • Having all of these controls in place for managing and monitoring third-party vendor accounts will detect any misuse of third-party vendor credentials. [A minimal monitoring sketch follows this list.]
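
    Below is a minimal sketch of what such automated review could look like: scanning authentication events and flagging vendor-account logins outside an expected window. The event format, account names and the 08:00-18:00 window are invented for illustration.

        # Minimal sketch: flag third-party vendor logins outside business hours.
        # The log format and account names are hypothetical illustrations.
        from datetime import datetime

        VENDOR_ACCOUNTS = {"hvac-vendor", "pos-maintenance"}

        def suspicious(events):
            for event in events:
                when = datetime.fromisoformat(event["timestamp"])
                if event["user"] in VENDOR_ACCOUNTS and not 8 <= when.hour < 18:
                    yield event              # outside 08:00-18:00: review it

        events = [
            {"user": "hvac-vendor", "timestamp": "2016-07-20T02:14:00"},
            {"user": "jsmith", "timestamp": "2016-07-20T09:30:00"},
        ]
        for e in suspicious(events):
            print("review:", e)              # flags the 02:14 vendor login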

    Example 2: Data Breach at a Large U.S. Banking and Financial Services Company

    This data breach started with the attackers infecting the personal computer of an employee. The malware stole the employee’s login credentials. When the employee used VPN to connect to the corporate network, the attackers were able to gain access to more than 90 corporate servers. The attackers stole private information for 76 million households and 7 million small businesses.

    The SANS Institute Reading Room for InfoSec’s report on this breach mentions the need to manage user privileges as one of the key ways to minimize the risk of a breach or minimise damage in case of a breach. Quoting from the report:

  • Least privilege simply means giving someone the least amount of access needed to perform his or her job. If least-privilege access control were applied, these organisations would have reduced the amount of stolen data by 86 percent. [A minimal least-privilege sketch follows this list.]
  • Anonymous access must be disabled because many Windows vulnerabilities are caused by null user sessions. A null user session is essentially a Server Message Block (SMB) session with blank username and password.
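
    Picking up the report’s least-privilege point, here is a minimal default-deny sketch: a user may perform an action only if one of their roles explicitly grants it. The roles and grants are hypothetical illustrations, not any product’s policy model.

        # Minimal sketch of least privilege: default deny, explicit grants only.
        # Roles and grants below are hypothetical illustrations.
        ROLE_GRANTS = {
            "pos-maintenance": {("pos-terminal", "read-logs")},
            "dba": {("customer-db", "read"), ("customer-db", "write")},
        }

        def permitted(user_roles, resource, action):
            return any((resource, action) in ROLE_GRANTS.get(role, set())
                       for role in user_roles)

        print(permitted({"pos-maintenance"}, "customer-db", "read"))   # False
        print(permitted({"dba"}, "customer-db", "read"))               # True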

    What This Means for You as the Security Practitioner

    Want to make sure your organisation does not end up in the headlines for the wrong reasons, like a massive data breach? You’d do well to implement user-based controls and restrict user access to least privilege, as the SANS Institute reports recommend. Employ the right user access mechanisms not only on the endpoints and on the applications that they access but also on your next-generation firewall.

    Call to Action

    If you own a Palo Alto Networks® Next-Generation Firewall, refer to the following resources to enable User-ID™, and increase your organization’s breach defenses:

  • User-ID documentation
  • Best practice internet gateway security policy
  • User-ID tech tips

    The Theory of Wi-Fi Evolution and IEEE 802.11 Selection

    July 14th, 2016

    By: Sundar Sankaran, Chief Wireless Architect

    September 2015 marked the 25th anniversary of IEEE 802.11, commonly referred to as Wi-Fi. Over these 25 years, Wi-Fi has ascended from a technology that enabled computers to wirelessly transfer data at 2 Mbps to winning a spot in Maslow’s pyramid as the most basic human need.

    IEEE 802.11 got here, as Lewis Carroll suggested, by running twice as fast. The standard has continuously advanced itself by introducing amendments, such as 802.11n, 802.11ac and 802.11ax. These amendments support higher data rates to meet ever-increasing application demands through the adoption of higher-order modulation schemes such as 64-, 256-, and 1024-QAM, by supporting channel bonding up to 160 MHz and by employing MIMO techniques to transmit multiple streams to a single client. In addition to increasing the peak data rate, efforts have been made to improve the spectral efficiency, which characterizes how well the system uses the available spectrum (how many bits of data can be pumped per second in 1 Hz of spectrum). Multi-user techniques such as MU-MIMO and OFDMA have been introduced in 802.11ac and 802.11ax to improve spectral efficiency and network capacity.
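
    As a back-of-the-envelope check on how those ingredients combine, the short calculation below reproduces the commonly cited peak rate for one 802.11ax configuration: data subcarriers x bits per symbol x coding rate x spatial streams, divided by the OFDM symbol duration. The constants are the usual ones for a 160 MHz channel with a 0.8 microsecond guard interval.

        # Back-of-the-envelope peak PHY rate for 802.11ax at 160 MHz.
        data_subcarriers = 1960       # usable data tones in a 160 MHz channel
        bits_per_symbol = 10          # 1024-QAM carries 10 bits per tone
        coding_rate = 5 / 6           # highest-rate MCS
        spatial_streams = 8
        symbol_duration = 13.6e-6     # 12.8 us symbol + 0.8 us guard interval

        rate = (data_subcarriers * bits_per_symbol * coding_rate
                * spatial_streams) / symbol_duration
        print(f"{rate / 1e9:.2f} Gbps")   # ~9.61 Gbps, ~4800x the original 2 Mbps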

    The following table summarizes the key ingredients of various IEEE 802.11 amendments ranging from 802.11b to the recently ratified 802.11ac to the upcoming 802.11ax. As evident from this table, the peak PHY data rate supported by IEEE 802.11 has gone up by a factor of 5000, and spectral efficiency has improved by a factor of 625. Enhancements like these have enabled Wi-Fi to become one of the basic needs of life, on par with water, air and fire.

    [Table: WLAN protocols – key ingredients of the IEEE 802.11 amendments from 802.11b through 802.11ax; see the original article for the full comparison.]

    View the original article by The Ruckus Room.

    Conventional AV Systems Can Actually Harm You

    June 15th, 2016

    POSTED BY: Steve Gerrard on June 13, 2016 8:00 AM

    Barely a day goes by when I’m not reading another batch of stories about how unsuitable conventional endpoint AV security is for dealing with modern malware, APTs, zero-day threats and so forth.

    So replete are these tales of woe that it’s almost possible to switch off from the basic fact that in this uber-connected, cloud-enabled, everything-as-a-service, internet-of-thingamajigs world, most conventional endpoint AV systems are impotent and probably do more harm than good. I write almost, but not quite, because every now and again the occasional story jumps off the screen and gives you that all important wake-up call.

    One such story, which came to light a few weeks ago, centered on an Adverse Event Report published by the U.S. Food and Drug Administration (FDA), in which an unnamed patient undergoing a cardiac catheterisation procedure at an unnamed US hospital had to be sedated mid-operation for five minutes while the procedure was suspended following the crash of a vital piece of monitoring equipment.

    The system in question monitors, measures and records patient data during cardiac catheterization procedures. The system is made up of a patient data module, used to capture the patient’s vitals, and a hemo monitor PC to display them. The two elements are connected via a serial interface.

    During this particular procedure the monitor PC lost communication with the patient data module, resulting in a black screen on the monitor and the patient having to be sedated while the system was rebooted. As the FDA report describes, the cause of this blackout was attributed to the installed conventional AV software, which initiated a scan of the system at a critical point in the procedure.

    Although the system could be rebooted and the patient fortunately survived, it got me thinking about the real-life harm conventional AV could do to me. Quoting from the Manufacturer’s Narrative in the FDA report: “Our experience has shown that improper configuration of anti-virus software can have adverse effects including downtime and clinically unusable performance.” So, although I may be sensationalizing the FDA’s paragraph a little, I’m not feeling that confident after reading the manufacturer’s narrative. Let’s face it: the team performing a standard cardiac catheterisation procedure is not likely to include an IT security engineer who can be called upon at a moment’s notice.

    Could this scenario have been avoided with an Advanced Endpoint Protection system? The answer is probably yes. Traps, our advanced endpoint protection product, is not a conventional AV system – indeed, it’s a paradigm shift from “the way things used to be done.” Traps secures endpoints by preventing known and unknown malware and exploits from executing, focusing on blocking the few core techniques used by attackers rather than application-specific characteristics. Furthermore, it does this with a lightweight, non-intrusive agent that does not rely on system scanning.

    View the original post on the Palo Alto Networks Research Center website.

    Learn more

  • Watch a demo
  • Ultimate Test Drive
  • Traps
