Archive for the ‘latest news’ Category

Four key trends driving sustainable use of biometrics

Monday, June 17th, 2019

Today biometrics has become an essential element in integrating security and trust in the global digital economy, as it can reinforce solutions such as digital identity management, border control, fraud management and real-time event management.

On 6th June 2019, I took part in a discussion on the importance of the responsible and ethical use of biometrics as part of the Biometrics Institute’s “Great Debate” event, hosted by Microsoft in Brussels. So, I wanted to share my views on what we consider to be the key trends driving sustainable use of biometrics.

Biometric data is personal, sensitive and intimate. The volume of biometric data is increasing, and so the number of attacks on this sensitive data is also growing. The good news is that there has been an increased focus on security, including encrypting stored data, securing the communications that carry biometric exchanges and executing biometric matching in a safe environment. There are also alternatives that handle less data and make that data less sensitive. Let’s explore them below:

1. Reducing data usability

Biometrics involves four key steps in order to work successfully and securely: enrollment/capture of data, storage of data either in the cloud or locally on a device, matching of the biometrics (when another capture of the face/finger is compared with the initial picture) and identity verification (on the device and in the cloud).

Most of these steps are usually done in the cloud, but we advocate for more operations being performed on the device itself. This is what we call pre-treatment on the edge: only a mathematical representation, which is not as sensitive as the raw biometric data, is transferred to the cloud. This, of course, still requires both the device and the cloud to implement security measures.
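To make the idea of pre-treatment on the edge more concrete, here is a minimal sketch in Python. It is purely illustrative and uses a stand-in extract_template function rather than a real biometric model; the point is simply that only a compact mathematical template, never the raw capture, would leave the device.

```python
# Minimal sketch of "pre-treatment on the edge": the device converts a raw
# biometric capture into a fixed-length mathematical representation (a
# template), and only that representation -- never the raw image -- is sent
# to the cloud for matching. The template function here is a stand-in; a real
# system would run a trained face/fingerprint model on the device.

import hashlib
import math
from typing import List

def extract_template(raw_capture: bytes, dims: int = 128) -> List[float]:
    """Hypothetical feature extractor: maps a raw capture to a unit vector."""
    digest = hashlib.sha256(raw_capture).digest()
    vec = [(digest[i % len(digest)] - 128) / 128.0 for i in range(dims)]
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def match_score(template_a: List[float], template_b: List[float]) -> float:
    """Cosine similarity between two templates (1.0 = identical)."""
    return sum(a * b for a, b in zip(template_a, template_b))

# On the device: capture and reduce to a template before anything leaves it.
enrolled = extract_template(b"raw fingerprint scan at enrollment")
probe = extract_template(b"raw fingerprint scan at login")

# Only the templates would be transmitted; matching can then run on-device
# or in the cloud against the stored template.
print(f"match score: {match_score(enrolled, probe):.3f}")
```

Note that in this toy version the template is derived from a hash, so two different captures of the same finger would not actually match; a real model maps captures of the same person to nearby vectors.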

2. Machine learning and frugality

Machine learning is a fantastic enabler of biometric verification because it provides great accuracy. However, to work precisely it requires massive data collection: before you have a well-performing machine learning model, you need millions of training pairs. This is why we advocate adopting new machine learning techniques, known as frugal learning, that require 7,000 times less data to train the model.

3. New modalities for ethical use

We are all familiar with biometrics such as fingerprint, face, voice and iris. However, there are alternatives that use a smaller amount of sensitive data and are equally secure. These include:

  • Digital behavioral biometrics. This includes the way you type on your keyboard or move your PC mouse. Behavioral biometrics can vary over time and context (a simple keystroke-timing example is sketched after this list).
  • Ephemeral biometrics. These include the clothes and accessories you wear or the color of your hair. This data is also less intrusive.
  • Monitoring events on people. This means looking for incidents instead of identifying individuals, to support crowd management.
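As a hypothetical illustration of the first modality, the sketch below derives simple keystroke-timing features (dwell and flight times) from typing events; real behavioral biometric systems compare profiles like these rather than the typed text itself.

```python
# Illustrative sketch of one digital behavioral biometric: keystroke dynamics.
# From a list of (key, press_time, release_time) events we derive dwell times
# (how long each key is held) and flight times (the gap between releasing one
# key and pressing the next). The timing profile, not the typed text, is what
# a behavioral system would compare against a stored profile.
from statistics import mean

def keystroke_features(events):
    """events: list of (key, press_ms, release_ms) tuples in typing order."""
    dwell = [release - press for _, press, release in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return {
        "mean_dwell_ms": mean(dwell),
        "mean_flight_ms": mean(flight) if flight else 0.0,
    }

# Example: the word "cat" typed with a slightly uneven rhythm.
sample = [("c", 0, 95), ("a", 180, 260), ("t", 410, 505)]
print(keystroke_features(sample))
```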

4. Ensuring that algorithms used are fair and efficient

It is important that machine learning algorithms used to manage biometric matching and verification do not embed bias. This means that they should provide the same level of performance and accuracy from an ethnic and gender point of view, no matter who is using them.

There are three complementary qualities to look for to ensure performance and accuracy of biometrics:

  • Data representability: Using training data that is representative of your target population, to make sure that algorithms are trained for the right context and people
  • Data quality: Ensuring data is correctly balanced and labelled
  • Solution audit and test: Ensuring that those assessing the solution are able to understand it, along with its rationale, processes and decisions (a simple per-group accuracy check is sketched below).
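As a minimal sketch of such a check, the Python below computes matcher accuracy per demographic group on a hand-made evaluation set and flags any gap above an arbitrary threshold. Real bias audits use far richer metrics (for example false match and false non-match rates per group), but the principle is the same.

```python
# Minimal sketch of a fairness check: compare a matcher's accuracy across
# demographic groups in the evaluation data. The data and threshold here are
# made up purely for illustration.
from collections import defaultdict

def accuracy_by_group(results):
    """results: list of (group, predicted_match, true_match) tuples."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for group, predicted, actual in results:
        total[group] += 1
        correct[group] += int(predicted == actual)
    return {g: correct[g] / total[g] for g in total}

evaluation = [
    ("group_a", True, True), ("group_a", False, False), ("group_a", True, False),
    ("group_b", True, True), ("group_b", False, False), ("group_b", True, True),
]

per_group = accuracy_by_group(evaluation)
print(per_group)

# A simple fairness gate: flag the model if any group trails the best group
# by more than, say, five percentage points.
gap = max(per_group.values()) - min(per_group.values())
print("bias review needed" if gap > 0.05 else "within tolerance")
```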

Since biometrics are increasingly used for security and authentication, we need to make sure that we use them responsibly. I hope these four tips give you an idea of the steps we can take to ensure this.

To view the original post, visit Gemalto.com.

Wi-Fi at 20: Bridging the performance gap towards ten-gigabit speeds

Monday, June 17th, 2019

The Wi-Fi Alliance recently interviewed Intel’s Doron Tal (General Manager, Wireless Infrastructure Group, Connected Home Division) about the past, present, and future of Wi-Fi. As we’ve previously discussed in The Ruckus Room, 2019 marks the 20th anniversary of the popular and ever-evolving wireless standard.

According to Tal, the average home today has approximately 10-20 devices, a number that Intel expects to increase to 30-50 devices over the next year or so.

“Those devices are connecting over Wi-Fi and need fast, responsive and reliable connections to ensure the best experiences,” he explains. “Whether you are streaming HD video or creating and editing content or immersed in an online experience like gaming and virtual reality (VR), Wi-Fi is really important.”

The emergence of Wi-Fi 6

Wi-Fi 6 (802.11ax), says Tal, is a significant step forward to deliver home connectivity that is faster, more responsive and more reliable.

“With Wi-Fi 6, you’re now able to control the traffic from the access point (AP) to the client in a very managed and provisioned manner that can actually be monetized in new ways,” he states. “We see a clear trend on the infrastructure side that deployments are shifting from a single AP to a multi-node architecture with different types of extenders.”

In the future, says Tal, the market will see reliable, smart and seamless Wi-Fi that supports immersive 3D video and augmented reality in very high definition, as well as new use cases in broadcasting, IoT, sensing and machine learning.

“The key to realizing the highly impactful Wi-Fi of the future, as these new and more diverse device types get introduced to the network, will be a lot of focus on making these networks self-organizing and self-healing so that they can be optimized for different experiences,” he adds. 

Commenting on the above, Ruckus’ Jeanette Lee, Sr. Director, Product Solutions and Technical Marketing, Ruckus Networks at CommScope, tells us that Wi-Fi 6 is well on its way to bridging the performance gap towards ten-gigabit speeds.

“Wi-Fi 6 delivers faster network performance, connects more devices simultaneously and effectively transitions Wi-Fi from a best-effort endeavor to a deterministic wireless technology,” she explains. “Designed for high-density connectivity, Wi-Fi 6 offers up to a four-fold capacity increase over its Wi-Fi 5 (802.11ac) predecessor. This further solidifies Wi-Fi’s position as the de-facto medium for internet connectivity.”

The advancements of Wi-Fi 6, says Lee, will benefit a wide range of consumer use cases, although they are particularly important for dense environments in which large numbers of users and devices are connecting to the network. Some specific scenarios that will benefit from the new Wi-Fi 6 standard include large public venues (LPVs) such as stadiums, convention centers and transportation hubs.

“Stadiums and convention centers offer high-speed Wi-Fi to improve the fan experience, increase customer interaction and create value-added services such as showing instant replays on smartphones and tablets or allowing attendees to order food from their seats,” she states. “However, stadiums and convention centers with tens of thousands of users simultaneously connecting to Wi-Fi pose definite scale and density challenges. The Wi-Fi 6 advancements around OFDMA, 1024 QAM, OBSS coloring, as well as faster PHY rates, will make it easier for LPV owners to create new business opportunities by offering enhanced services for guests.” 

In addition, says Lee, public transportation hubs are increasingly offering high-speed public Wi-Fi to passengers waiting for trains, buses, taxis and ride-sharing services.  

“Like stadiums, transportation hubs have high densities of people attempting to connect to the networks simultaneously. However, these hubs face the unique challenge posed by transient devices that are not connecting to the Wi-Fi network but are still sending management traffic that congests it. OFDMA and BSS coloring, both of which are part of the new Wi-Fi 6 standard, provide the tools to manage and mitigate these challenges,” she concludes.

To view the original press release, visit The Ruckus Room.

Why you need 90 watts of PoE power

Tuesday, June 11th, 2019

Power-over-Ethernet (PoE) eliminates the need for an additional power source and a second set of cables to each device. The very first PoE standard was ratified by the IEEE in 2003. The nascent standard delivered up to 15 watts of power for devices such as VoIP phones, Wi-Fi access points (APs) and IP cameras. In 2009, the IEEE ratified a new standard for PoE+ that delivered up to 30 watts at the switch.


Two new standards in one – 60 watts and 90 watts

The unceasing demand for power in the enterprise has only increased over the past decade. As such, PoE is now standard for enterprise networks that support wireless access points (APs), VoIP phones and other devices. However, it should be noted that a new generation of power-hungry APs, video displays, pan-tilt-zoom cameras and many other devices require more than 30 watts. In recent years, individual vendors responded to this demand by creating protocols such as UPoE (60 watts) and PoH (95 watts). These protocols effectively formed the basis of the IEEE’s most recent 802.3bt standard. Ratified in 2018, the new standard defines two levels of PoE power: 60 watts (Type 3) and 90 watts (Type 4).

Do I need more than 30 watts of PoE?

Devices that consumed more than 30 watts of PoE were hitting the market even before the 802.3bt standard was officially ratified. The most common were new generations of Wi-Fi 5 (802.11ac) and Wi-Fi 6 (802.11ax) access points. Although most of these APs operate at 30 watts, some require more power to drive their 4, 8 or 12 wireless radios at full power – and to power devices connected via their USB ports. Put simply, more than 30 watts is needed to take full advantage of certain Wi-Fi 5 and Wi-Fi 6 access points. In many cases, 40-45 watts is enough for optimal AP performance. Additional PoE-powered devices that can benefit from more than 30 watts include HD/4K video displays, pan-tilt-zoom cameras, POS systems and smart LED lighting. Of course, this list is only growing.
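For a rough sense of what these power levels mean at the switch, here is a back-of-the-envelope sketch. The 740 W budget is a hypothetical figure for a 48-port switch and the per-port wattages are nominal class maximums; real planning should use vendor datasheets and account for cable losses.

```python
# Back-of-the-envelope PoE budgeting sketch: given a switch's total PoE budget
# and the per-port draw of each device class, how many devices can one switch
# power at full load? Figures are illustrative only.
POE_CLASS_WATTS = {
    "802.3af (Type 1)": 15.4,
    "802.3at / PoE+ (Type 2)": 30.0,
    "802.3bt Type 3": 60.0,
    "802.3bt Type 4": 90.0,
}

def max_devices(switch_budget_watts: float, per_device_watts: float) -> int:
    """Number of devices of a given class the PoE budget can fully power."""
    return int(switch_budget_watts // per_device_watts)

# Hypothetical 48-port switch with a 740 W PoE budget.
budget = 740.0
for name, watts in POE_CLASS_WATTS.items():
    print(f"{name:<26} {watts:>5.1f} W per port -> {max_devices(budget, watts)} devices")
```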

Isn’t 60 watts more than enough?

As noted above, there are several devices that can take advantage of more than 60 watts of PoE at the switch. Much like Wi-Fi 6 APs operating with only 30 watts, many devices are designed to operate with less than optimal power – but only deliver their full capabilities when maximum power is available. One such example is smart LED lighting. Another is a wide range of IoT devices for office and building automation. We expect this trend to continue in the future for next-generation IoT sensors, access points, and video, as well as AR/VR infrastructure. 

If you build it, they will come

The recently ratified IEEE 802.3bt standard is serving as a catalyst for the design of high-powered devices and switches capable of delivering 60-90 watts. Ultimately, we expect the industry to clamor for even higher PoE levels. It should be noted that switch PoE capabilities are an important consideration when purchasing and future-proofing network infrastructure. Currently, the useful life of a switch is typically 5-7 years, although, in some deployment scenarios, the life-span can stretch up to 10 years. This means customers will have to determine how capable their switch purchases are of supporting both current and future PoE requirements.

Buyer Beware

Network vendors have united around the single 802.3bt standard described above. Most vendors advertise their compatibility with this standard, but only deliver the lower power level (60W).  Ruckus Networks (now part of CommScope via acquisition) is one of the few that has implemented the 802.3bt standard to the full 90 watts. While it is often challenging to accurately predict future requirements, having more power available now will significantly increase the odds of being ready for a new generation of energy-hungry devices.

View the original post at The Ruckus Room.

One Year After GDPR: Significant rise in Data Breach reporting from European Businesses

Friday, June 7th, 2019

It’s been one year since the European Union (EU) began enforcing the General Data Protection Regulation (GDPR)¹, legislation designed to protect the personal data of EU citizens and lay down specific rules and guidelines on how their data is collected, stored, processed and deleted by various entities. GDPR requires that organizations disclose to national Data Protection Authorities (DPAs) any breach of security leading to “the accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to, personal data transmitted, stored or otherwise processed” not later than 72 hours after having become aware of it.

Penalties for organizations failing to comply with the regulation’s new notification requirements include fines of up to €10 million, or up to 2% of the total worldwide annual turnover of the preceding financial year, whichever is higher. Many studies at the time showed that companies would not be ready by 25th May 2018, which led many privacy professionals to assume the worst when hypothesizing about what could happen once the new European legislation came into effect.
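As a quick illustration of that fine ceiling, the sketch below computes the “greater of €10 million or 2% of worldwide annual turnover” rule for two made-up turnover figures.

```python
# Sketch of the fine ceiling for breach-notification failures described above:
# the greater of EUR 10 million or 2% of worldwide annual turnover for the
# preceding financial year. Turnover figures are invented for illustration.
def max_notification_fine(annual_turnover_eur: float) -> float:
    return max(10_000_000.0, 0.02 * annual_turnover_eur)

for turnover in (50_000_000.0, 2_000_000_000.0):
    fine = max_notification_fine(turnover)
    print(f"turnover EUR {turnover:,.0f} -> fine ceiling EUR {fine:,.0f}")
```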

Rise in the number of data breaches

The European Data Protection Board (EDPB)², the EU body in charge of the application of GDPR, still hasn’t developed any official standards to clarify how independent EU DPAs should publicly report specific statistics about GDPR, and this currently makes collecting and analyzing data on GDPR compliance somewhat challenging. A number of European DPAs have voluntarily confirmed in recent months that the new regulation has led to a significant rise in reported data breaches, clearly demonstrating the impact GDPR has had on raising awareness among the general public as well as organizations regarding their rights and obligations under EU data protection law.

So far, the most reliable data regarding the number of data breaches seems to come from some of the DPAs themselves, as well as from the overview reports³ published by the European Commission on the implementation of GDPR. From that data we can deduce that EU DPAs have received more than 95,000 complaints from EU citizens since May 2018, alongside nearly 65,000 data breach notifications.

The law firm DLA Piper analyzed data breach reports⁴ filed in 23 of the 28 EU member states since GDPR came into full force, and at the end of January 2019 the European Commission reported that EU data protection regulators had collectively received 41,502 data breach notifications⁵.

“The Netherlands, Germany and the United Kingdom came top of the table with the largest number of data breaches notified to supervisory authorities, with approximately 15,400, 12,600 and 10,600 breaches notified respectively,” DLA Piper says in its report, adding that the Netherlands recorded the most data breach reports per capita, followed by Ireland and Denmark. “The United Kingdom, Germany and France rank tenth, eleventh and twenty-first respectively, while Greece, Italy and Romania have reported the fewest breaches per capita,” the report says.

Under GDPR, non-EU organizations with headquarters established in Europe can take advantage of the “one-stop shop” mechanism. With numerous high-profile U.S. technology leaders like Facebook, Microsoft, Twitter and Google choosing to base their European headquarters in Ireland, it will be very interesting to study the yearly data breach report from Ireland’s DPA when it comes out.

With the EU elections approaching in a few weeks, it will be very thought-provoking to analyze how the safeguards imposed by EU DPAs and GDPR on the use of political data during elections will affect political parties, and how they will influence both the collection of personal data related to political opinions and the targeting of political messaging during the election period.

In any case, we must be prudent with the current data: we are still in a transitional year, and with most EU DPAs reporting a median data breach investigation time of 12 to 15 months (or even more), many cases currently under investigation concern incidents that happened under older data protection laws.

GDPR Penalties

Germany currently leads in the number of fines, with German organizations receiving 64 of the GDPR fines imposed so far. These include the country’s two largest fines to date: €80,000 against an organization that published health data on the internet, and €20,000 against a chat platform for failing to hash stored passwords. “So far 91 reported fines have been imposed under the new GDPR regime,” DLA Piper reports, “but not all of the fines imposed relate to personal data breaches.”

The largest fine to date is the €50 million penalty imposed on Google by France’s data protection authority. That fine did not relate to a data breach, but to Google’s processing of personal data without proper authorization from its users. The remaining fines, from countries such as Austria and Cyprus, were comparatively low in value.

Looking into the future

The objective of GDPR was to bring uniformity to data protection laws across EU member states and control how organizations should store personal data and how they must respond in the event of a data breach, emphasizing the importance of creating trust that allows the digital economy to grow inside the European community.

As GDPR reaches its first birthday in a few days, it is clear that the regulation is still young and both regulators and companies are still figuring out its impact and importance. Data Protection Authorities across the EU will soon be publishing annual reports, which should give us a wider and better picture of the level of compliance.

Transparency is a necessity that will help the EU further increase awareness of GDPR. And let’s not forget that the rest of the world, especially countries that are close partners of the EU such as the United States, is observing closely in order to better understand the effects, strengths and weaknesses of the regulation.

References

  1. General Data Protection Regulation (GDPR)
    https://ec.europa.eu/info/law/law-topic/data-protection/data-protection-eu_en
  2. European Data Protection Board (EDPB)
    https://ec.europa.eu/info/law/law-topic/data-protection/reform/rules-business-and-organisations/enforcement-and-sanctions/enforcement/what-european-data-protection-board-edpb_en
  3. First overview on the implementation of the GDPR and the roles and means of the national supervisory authorities.
    http://www.europarl.europa.eu/meetdocs/2014_2019/plmrep/COMMITTEES/LIBE/DV/2019/02-25/9_EDPB_report_EN.pdf
  4. DLA Piper GDPR Data Breach Survey
    https://www.dlapiper.com/~/media/files/insights/publications/2019/02/dla-piper-gdpr-data-breach-survey-february-2019.pdf
  5. GDPR in numbers Infographic
    https://ec.europa.eu/commission/sites/beta-political/files/190125_gdpr_infographics_v4.pdf

View the original post at Gemalto.com.

GDPR One Year Anniversary: What We’ve Learned So Far

Friday, June 7th, 2019

On May 25, the European Union celebrated the first anniversary of the enforcement of the General Data Protection Regulation (GDPR), the most important change in data privacy regulation in the last decade, designed to restructure the way in which personal data is handled across every sector (public or private) and every industry. Now that one year has passed since the GDPR came into effect, many questions have arisen: How are companies managing the adoption of the new, stricter data protection regulation? Do companies know exactly what is required of them to achieve compliance? Are European citizens aware of their new rights? How are Data Protection Authorities (DPAs) handling the enforcement of violations and issuing fines for non-compliance? How has GDPR affected other global data protection regulations?

GDPR’s story so far

On May 22, the European Commission published an infographic on compliance with and enforcement of the GDPR from May 2018 to May 2019. The infographic reveals some very interesting statistics, including:

  • 67% of Europeans have heard of the GDPR
  • 57% of Europeans know that there is a public authority in their country responsible for protecting their personal data rights
  • 20% know which public authority is responsible
  • 144,376 is the total number of queries and complaints to DPAs
  • The types of activities for which the most complaints have been made so far are telemarketing, promotional e-mails and video surveillance/CCTV
  • 89,271 is the number of data breach notifications
  • DPAs have 446 open cross-border cases
  • 25 EU Member States have adopted the required national legislation, but three are still in the process of doing so (Greece, Slovenia and Portugal)

The European Commission also issued a press release about the first year of GDPR enforcement, with Andrus Ansip, Vice-President for the Digital Single Market, and Věra Jourová, Commissioner for Justice, Consumers and Gender Equality, stating that “These game-changing rules have not only made Europe fit for the digital age, they have also become a global reference point,” and that “People are becoming more aware – and this is a very encouraging sign. New figures show that nearly six in ten people know that there is a data protection authority in their country.”

The statement also makes a bold declaration about the EU’s work on regulating future technologies, stating that “The new law has become Europe’s regulatory floor that shapes our response in many other areas. From artificial intelligence, development of 5G networks to integrity of our elections, strong data protection rules help to develop our policies and technologies based on people’s trust.”

The future of data protection

A year ago, there was a lot of fear and there were many doom scenarios across the business world, mostly because of uncertainty about the requirements and obligations of GDPR. Some of that fear has lessened as companies have slowly started to decode and better understand the requirements of the new regulation, and one of the primary benefits of GDPR enforcement has been an overall higher awareness of data privacy issues and the adoption of best practices.

Privacy and consent are obviously still big priorities for many organizations, and as we are in a transitional year it is almost certain that DPAs will continue to issue increasing fines and penalties. As a result, stakeholders must be educated on the nature of the personal data that their organization handles, and on what needs to be done to comply with the regulation.

Stricter enforcement is around the corner

During the first year of the GDPR being in effect, DPAs in all EU Member States were very tolerant when it came to breaches of compliance, and they provided great help to many organizations in becoming compliant. Even so, heavy fines have been handed out in a number of cases. Stakeholders should also be aware that there are other GDPR penalties besides fines, including the suspension of data processing. As we head into the second year of GDPR, DPAs will probably become less forgiving of compliance violations, and organizations should expect an increase in sanctions and fines going forward.

GDPR influences data privacy discussions worldwide

A number of European countries that aren’t subject to EU legislation have adopted compliance regulations almost identical to the GDPR, including Norway, Switzerland, Iceland, Liechtenstein and the UK (in its preparation for a no-deal Brexit). Likewise, some countries in Asia and Africa that have close relationships with Europe are redesigning their data privacy regulations, including South Korea and India. Other privacy legislation, such as the California Consumer Privacy Act (CCPA) and the upcoming LGPD (General Law of Data Protection) in Brazil, also appears to be heavily influenced by GDPR in the rights it gives data subjects, in data breach detection/prevention and in accountability.

What lies ahead

The enforcement of GDPR kicked off a huge global shift in data privacy, fueling privacy-focused political movements that demand more rights for data subjects, heavier penalties for companies and government regulation of rapidly advancing new technologies. Also, thousands of GDPR actions are currently under investigation, and organizations should expect EU regulators to continue to pursue violations of the regulation.

GDPR has also sparked regulatory discussions in other countries, including the United States. Presently, the data privacy standards being discussed at a global level are not uniform, and organizations could find that they must comply with different privacy legal frameworks and face conflicts in legislation (especially in the case of multinational or multiregional organizations). In addition, the evolution of technology will certainly challenge even the best-prepared organizations and hugely increase their cyber risk.

To learn more about how your company can achieve and maintain GDPR compliance, contact Net-Ctrl for more information.

View the original post at Gemalto.com.

Breaking up with old network paradigms

Friday, June 7th, 2019

We’ve all seen it.

You walk into a switch closet and look around. Dust lies thick upon ancient switches—enough to dim LED lights in some cases. A herd of dust bunnies has taken up residence near the fan vents. Nobody has seen the power outlets in years, buried under grime and cobwebs.

The soft hum of equipment fills the air as the tech next to you beams. “It may be old, but it works!”

And if it doesn’t?

“At least we know all of the bugs and have workarounds,” says our new IT friend.

Of course, they do. But what if they absolutely must upgrade? Then what?

Anyone who has ever worked in IT knows the drill. You don’t upgrade firmware unless absolutely necessary. And you never, ever reboot. Why?

“It might not come back,” whispers the network engineer with a shiver. “You never know what will happen if you reboot or upgrade equipment.”

It almost sounds like superstition, but it’s plain practical sense and the reason so many IT departments put off upgrades—hardware and software—as long as possible. The purpose of IT is to deliver network and computing services and anything that takes that down, or puts it at risk, is a Big Problem.

Nobody Wants Downtime: Unless It’s in Hawaii

IT is about maximizing services and minimizing risk of downtime. Why take a chance with new and exciting bugs that will keep you late at the office or working through an extended maintenance period over the weekend?

Which brings us back to those packs of dust bunnies roaming wild and free through switching closets everywhere.

Eventually, time runs out and you need to upgrade for a new capability required by the business. We get that and we understand you because many of us have lived in that world. That’s why here at Ruckus we’ve spent so much time building products that minimize risk without sacrificing new capabilities:

  • Technologies like campus fabric and RF innovations in access points designed to optimize network performance and increase reliability
  • Switches with plenty of PoE budget for any application need
  • The most scalable network controller architecture: in its virtualized form, SmartZone can be upgraded almost without limit, with no rip-and-replace
  • In-service software upgrades (ISSU) for switch stacks and multi-image AP deployments that allow in-place upgrades without slowing down the network

How to Make a 1Gbps Connection Go Faster than 1Gbps

An age-old problem. You’ve got switches in the closets with 1Gbps uplinks, but you need something faster.

“Here we go,” sighs my IT tech guide. “I spent all this time building the network and now we’re going to change just for a little speed boost.”

What if you could make a 1 Gbps uplink port go faster? With the ICX 7150 switch Ruckus did just that. Buy it with 1 Gbps uplinks today and, when you’re ready, a simple command and license will upgrade to 10 Gbps uplinks. No new hardware.

My network engineer pauses to consider, eyes narrowed. “No hardware swap? What about a reboot? I bet there’s a reboot.”

Nope. No new hardware. No new software. No reboots.

This is why when we ask our customers what they like about their Ruckus network the answer is unanimous: “It just works.”

And we like that just fine.

View the original post at the Ruckus Room.

MOBOTIX Spring Splash: Innovations Ready for the Market

Friday, June 7th, 2019

After the Innovation Summits in Athens and Langmeil and the company’s attendance at ISC West in Las Vegas, where MOBOTIX presented its latest research and R&D successes and trends to customers and partners, “Spring Splash” marks the market launch of the latest innovations. MOBOTIX is demonstrating its innovative strength to both its partners and customers.

“We want to exploit the potential of MOBOTIX technology and DNA and focus on quality from Germany and cybersecurity,” said MOBOTIX CEO Thomas Lausten at the various launches. MOBOTIX also sees good opportunities and growth potential for video surveillance outside the traditional security sector: “We focus our research and development activities on continuously opening up new markets for MOBOTIX – whether in production monitoring, customer behavior in retail stores, in the health care sector or in logistics – in close cooperation with our technology partners,” explains Lausten.

At its headquarters in Langmeil, Germany, MOBOTIX works on regularly optimizing and expanding its range of products and solutions in order to develop market-driven, future-oriented innovations and to ensure the greatest possible cybersecurity for them. Together with the R&D teams of technology partners such as Konica Minolta, new solutions for vertical markets have been developed and brought to market readiness:

MxManagementCenter (MxMC): ONVIF-compliant and Scalable

The Spring Splash event sees the launch of version 2.1 of the MxManagementCenter, which is ONVIF-compliant like all MOBOTIX IoT and MOVE camera models. This means it complies with the worldwide open standard for IP-based security products. With its new Smart Data license, MOBOTIX offers its customers a simple and scalable total solution for video-supported search across a variety of applications, regardless of the industry. The MOBOTIX Smart Data solution enables almost any multi-layered data source, such as cash register or car license plate detection systems, to be combined with the video data from MOBOTIX IoT cameras. All data can be securely encrypted and transmitted in real time, and the results can be analyzed on site or via an Internet connection from any MxMC workstation worldwide. “Our Smart Data solution is a valuable tool, especially for our solution and technology partners, for integrating their technological developments into our MxManagementCenter,” continues Lausten.

MxThinClient: Everything on One Monitor

A firmware release for the MxThinClient will go online, enabling an IP video interface that displays live images from all MOBOTIX IoT and MOVE camera models and Door Stations on a monitor or TV set. Both the stability and the access security of the MOBOTIX system are increased thanks to the option of displaying camera images directly, without the need for operating software or even a PC workstation.

MxBell 2.1: More Than Just Improved Usability

MxBell 2.1, the MOBOTIX app, is the mobile remote station for MOBOTIX IP video Door Stations and IoT cameras. The app’s interface underwent a facelift to improve user-friendliness and now sends push notifications for all Door Station and camera events. Especially valuable for the user is the visitor and event documentation, which makes it possible to track events quickly and easily at any time and from anywhere.

Mx-V5.2.3.x: Extreme Data Economy

The system release Mx-V5.2.3.30 for all Mx6 x16/x26 cameras reduces bandwidth and storage requirements by 25% thanks to 3D noise reduction, among other things. In addition to data security, data economy is indispensable in the world of the IoT and is therefore of utmost importance for MOBOTIX in the interests of its customers.

View the original post at mobotix.com.

FINANCIAL CYBER THREATS: 10 CASES OF INSIDER BANK ATTACKS

Thursday, May 23rd, 2019

It is reported that at least 60% of cyber-attacks on financial institutions are attributed to privileged users, third-party partners, or malicious employees. These incidents sometimes stem from employee negligence, and sometimes from employees with malicious intentions committing deliberate sabotage. The threats have become hard to control because these threat actors normally use authorized credentials and are treated as safe when accessing the organizational network. Banks and other financial institutions are among the top targets, and such attacks have led to the loss of billions of customer records over the past few years. According to the 2018 Cost of Insider Threats: Global Organizations report, “a malicious insider threat can cost an organization $2.8M per year, or an average of $604,092 per incident”.


Verizon’s breakdown found that 77% of internal breaches were carried out by employees, 11% by external actors only, 3% by partners, and 8% involved some kind of internal-external collusion that makes them hard to categorize. Verizon’s annual Data Breach Investigations Report (DBIR) states that since 2010, internal actors have accounted for almost one in five successful breaches.

A Gartner study on criminal insider threats found that 62% of insiders with malicious intent are categorized as people looking for supplemental income. It is important to note that seniority had little to no effect in this category: just 14% of persistently malicious insiders were in a leadership role, and approximately one-third had access to sensitive data.

This post looks into the aftermath of insider threats across different banking institutions around the world. Please take note that the content and any of the opinions expressed are solely my own, and do not express the views or opinions of my employer.

JP Morgan Chase

The now-former banker at JP Morgan Chase, Peter Persaud, reportedly sold personal identifying information (PII) and other account information, including the personal identification numbers (PIN) of bank customers. Persaud was first exposed in 2014 when he sold account information to a confidential informant for a sum of $2,500. Later, Persaud reportedly offered four additional accounts for approximately $180,000. Court documents showed that Persaud told the undercover officer that he needed to “take it easy”, otherwise the bank may realize he had accessed all of the bank accounts that “got hit”.

“Persaud abused his position by victimizing unsuspecting customers, and will now pay the penalty for his fraudulent conduct,” -Richard Donoghue, United States Attorney for the Eastern District of New York

JP Morgan Chase II

Another former JP Morgan Chase investment advisor, Michael Oppenheim, was accused in a civil complaint of stealing more than $20M from the bank’s clients between 2011 and 2015. Oppenheim claimed to have invested their money in low-risk municipal bonds and sent doctored account statements reportedly showing earned profits on those investments. Throughout the years, Oppenheim took steps to conceal his fraud. For instance, when a customer asked for a statement reflecting his municipal bond holdings, he created false account statements. Additionally, there were times Oppenheim copied the customers’ details onto an account statement reflecting the holdings of another customer, then provided the fabricated statement to convince the customer that he had purchased the municipal bonds as promised. In another instance, Oppenheim transferred money from one customer to another in order to replenish the funds he had previously stolen.

“We allege that Oppenheim promised his customers that he would invest their money in safe and secure investments, but he seized their funds and aggressively played the stock market in his own accounts,” said Amelia A. Cottrell, Associate Director of the SEC’s New York Regional Office.

JP Morgan Chase III

In a different case of an insider at JP Morgan Chase, it was reported that for over two years bankers could access and issue ATM cards for 15 accounts belonging to elderly and deceased bank clients. Dion Allison was accused of stealing $400,000 from the accounts by searching for customers with high, stagnant balances and Social Security deposits. With the help of two of the banker’s friends, the funds were withdrawn using the issued ATM cards at machines around NYC.

“Since I was 16, I worked in the financial field, I did internships and everything, now my reputation is tarnished because of this,” – Jonathan Francis, an ex-banker who was wrongfully implicated in this case.

Morgan Stanley

In 2015, Morgan Stanley, one of the largest financial service companies in the world, was forced to pay a $1M penalty for failing to protect its customers’ records. This came after data from approximately 730,000 customer accounts was compromised. The incident came to light through a post published on Pastebin, where six million account records of Morgan Stanley clients were being offered. In the following weeks, a new post was shared on a website pointing to the Speedcoin platform. It featured a teaser of real records from 900 different accounts and provided a link for people interested in purchasing more. This activity was traced to Galen Marsh, an individual who was employed in the private wealth management division of Morgan Stanley. Marsh was originally a Customer Service Associate and then became a Financial Advisor in the Manhattan office, where he provided financial and investment services to particular private wealth management clients.

It was reported that Marsh conducted approximately 6,000 unauthorized searches in the computer systems over about three years, and thereby obtained confidential client information, including names, addresses, telephone numbers, account numbers, fixed-income investment information, and account values, covering approximately 730,000 client accounts. Marsh uploaded the confidential client information to a personal server at his home. Ironically enough, the investigators confirmed that Marsh’s home server was hacked, the very same server that Marsh had used to exfiltrate customer data from Morgan Stanley.

“It is probable that the client data was extracted from Mr. Marsh’s home as a result of outside hackers. In fact, based upon conversations with representatives of Morgan Stanley, we learned that hackers emanating from Russia were suspected of posting the information and offering to sell it online.” – Sentencing Memorandum

‘The London Whale’

The ‘London Whale’ scandal resulted in over $6 billion of trading losses at JPMorgan Chase. The claims included wire fraud, falsification of books and records, false filings with the Securities and Exchange Commission, and conspiracy to commit all of those crimes. The individuals’ intent remains unclear, and the charges against the two former derivatives traders were dropped. The Department of Justice stated that it “no longer believes that it can rely on the testimony” of Bruno Iksil.

“The top U.S. securities regulator on Friday dropped its civil lawsuit accusing two former JPMorgan Chase & Co (JPM.N) traders of trying to hide some of the bank’s $6.2 billion of losses tied to the 2012 ‘London Whale’ scandal”.

Wells Fargo

Wells Fargo reported insider fraud by employees who created almost 2M accounts for clients without their knowledge or consent. Wells Fargo’s clients took notice when they started receiving charges for fees they did not anticipate, together with credit or debit cards that they did not expect. Initially, the blame was placed on individual Wells Fargo branch workers and managers. The blame later shifted up the chain, to the practice of opening many accounts for clients through cross-selling. This insider fraud was engineered by particular managers of the bank in collaboration with other bank employees. By opening these accounts, Wells Fargo employees were able to access credits illegally. The fraud led to the CFPB fining the bank an estimated $100M, and nearly $3 billion in total when counting the remainder of the losses and fines. The illegal activity has also exposed the bank to other civil and criminal lawsuits, as well as costing it the trust of its customers.

“The widespread illegal practice of secretly opening unauthorized deposit and credit card accounts.” – Consumer Financial Protection Bureau.

Bangladesh Bank

In 2016, Bangladesh Bank suffered a massive cyber attack in which more than $81M disappeared without a trace. The attack, originally targeting $951M, was conducted through a series of transactions that were halted while $850M was still to be transferred through the SWIFT network. Thirty transactions amounting to $850M were blocked by the Federal Reserve Bank of New York after suspicions arose over a spelling mistake made by the perpetrators of the crime. Nearly $101M was transferred out of Bangladesh Bank’s account at the New York Fed under fake names, most of it to Philippines-based Rizal Commercial Banking Corp, where it later disappeared into the casino industry. Only the $20M of that $101M that was traced to Sri Lanka was successfully recovered, from the Shalika Foundation’s bank account. It is also important to mention that the Philippines’ Anti-Money Laundering Council has accused seven bank officials of money laundering in a complaint filed with the country’s Justice Department. It is worth noting that there is no definitive published evidence that these breaches were caused by insiders.

“The malware was customized for Bangladesh Bank’s systems, Alam said, adding someone must have provided the hackers with technical details about the central bank’s computer network”. – Bangladesh police deputy inspector general, Mohammad Shah Alam.

It was also reported and published by several reliable sources that the cybercriminal Lazarus Group was linked to the bank attacks in the Philippines and Bangladesh.

“We’re pretty sure it was the work of Lazarus group.” and “We don’t do attribution, we publish only the facts.” – Vitaly Kamluk, researcher at Kaspersky Lab.

Punjab National Bank

Punjab National Bank in India parted with almost $43M after Gokulnath Shetty, a bank employee, made unauthorized use of a vulnerable password for the SWIFT interbank transaction system. The fraudulent activity was carried out to release funds in a highly complex transactional chain schemed up by Nirav Modi. It was reported that the bank officials issued a series of fraudulent “Letters of Undertaking” and sent them to overseas banks, then to a group of Indian jewelry companies.

A Letter Of Undertaking, or LOU, is a document issued by a bank to a person or a firm. This LOU is generally used for international transactions and is issued by keeping in mind the credit history of the party concerned. The party can then avail Buyer’s Credit against this LOU from a foreign bank.

SunTrust Bank

In February 2018, SunTrust Bank became aware of an attempted data breach by a now-former employee who downloaded client information, triggering an internal investigation that led to its discovery. It was reported that the compromised data on 1.5M clients included names, addresses, phone numbers, and account balances. However, the stolen data did not include information such as Social Security numbers, account numbers, PINs, or passwords. To combat the increased risk of identity theft and fraud, SunTrust offered its clients services such as credit monitoring, dark web monitoring, identity “restoration assistance”, and $1M of identity theft insurance. In addition, the bank strengthened its existing security protocols, including ongoing monitoring of accounts, a FICO score program, alerts, tools, and zero-liability fraud protection.

Later, Morgan & Morgan filed a proposed class-action lawsuit in which they sought damages for the theft of the plaintiffs’ personal and financial information, as well as imminent and impending injury as a result of identity theft and potential fraud, improper disclosure of personally identifiable information, inadequate notification of the data breach, and loss of privacy.

“The lawsuit, which we filed on behalf of our clients and the 1.5 million consumers affected by the data breach, seeks to hold SunTrust accountable from its acknowledged failure to keep safe the information entrusted to it” – Morgan & Morgan lawyer John Yanchunis.

Bank of America

It was reported that Bank of America lost at least $10M as a result of an insider who sold the data of “about 300” customers to cyber-criminals.

“[It] involved a now-former associate who provided customer information to people outside the bank, who then used the information to commit fraud against our customers,” Bank of America spokeswoman Colleen Haggerty said in an email message.

Conclusion – Do the right thing.

Insider threats are a major problem within the banking industry and occur in countries all around the world. Both funds and data are at risk, and with over three-quarters of breaches committed by employees, it is clear that financial institutions need visibility into what is happening on the network, and the ability to hunt for threats and determine attribution in a timely manner.
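As one hypothetical illustration of that visibility, the sketch below flags employees whose volume of customer-record lookups is far above the team median, the kind of pattern seen in the Morgan Stanley case of thousands of unauthorized searches. The counts and threshold are invented; a real insider-threat program would correlate many signals, not just lookup volume.

```python
# Toy sketch of one insider-threat signal: flag employees whose number of
# customer-record lookups is many times the team median. Counts and the
# multiplier are illustrative only.
from statistics import median

def flag_outliers(lookups_per_employee, multiple=5.0):
    """Flag anyone whose lookup count exceeds `multiple` times the team median."""
    baseline = median(lookups_per_employee.values())
    return {emp: count for emp, count in lookups_per_employee.items()
            if count > multiple * baseline}

# Monthly lookup counts per employee; one employee is far above the norm.
access_log_summary = {"emp01": 120, "emp02": 95, "emp03": 140,
                      "emp04": 110, "emp05": 130, "emp06": 6000}
print(flag_outliers(access_log_summary))
```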

You can view the original post on SentinelOne’s Website here.

Ruckus takes on the competition with the R730 Wi-Fi 6 AP

Thursday, May 23rd, 2019

The Ruckus Technical Marketing Engineering team recently pitted the company’s flagship R730 Wi-Fi 6 (802.11ax) access point against competing Wi-Fi 6 APs from vendor 1 and vendor 2. All tests were performed in a classroom with 60 MacBook Pro clients (802.11ac) connected to the 5 GHz radio of each AP. It should also be noted that one of the competing vendors’ APs had its second software-defined radio turned off for these tests.


The AP Contenders

All APs were powered using multi-gigabit ports on an ICX 7650 switch. WPA2-PSK encryption was enabled for the tests with an encrypted SSID. The Ixia Chariot testing software was used, with endpoint software installed on each client device. TCP frame size was set at a standard 1460 bytes.

The results? The R730 beat the competition by as much as 33% in downlink tests and 25% to 33% for uplink. These tests are with 802.11ac (Wi-Fi 5) devices, so results are expected to be even better when Wi-Fi 6 clients begin shipping in volume this year.


TCP-DL/UP Results

This test is useful when looking at performance with the maximum frame size. However, not all traffic uses large frame sizes. In fact, most applications don’t. What happens when we use smaller frame sizes?

The next benchmark test clearly illustrates Ruckus’ advantage with small packets.


This test is notable because smaller packet sizes (65 bytes in this test) create higher CPU utilization on the AP. The reason the vendor 1 and vendor 2 AP results flatline (or worse) is that they hit 100% CPU usage during the tests. Thanks to a superior CPU architecture, the R730 yields twice the throughput of the next best competitor. In real-world terms, this means the R730 delivers superior performance and user experience in deployments where small packets make up a significant share of the application traffic mix, such as a large-scale VoIP deployment.
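The relationship between frame size and throughput can be sketched with simple arithmetic: if an AP’s CPU caps out at a roughly fixed packets-per-second rate, achievable throughput scales with frame size. The 200,000 pps ceiling below is an invented number used only to illustrate the shape of the curve, not a measured figure for any of the tested APs.

```python
# Rough sketch of why small frames stress the AP's CPU: per-packet processing
# overhead is roughly fixed, so at a given packets-per-second ceiling the
# achievable throughput grows with frame size. The pps ceiling is hypothetical.
def throughput_mbps(frame_bytes: int, pps_ceiling: float) -> float:
    return frame_bytes * 8 * pps_ceiling / 1_000_000

PPS_CEILING = 200_000  # hypothetical packets-per-second limit of an AP's CPU
for frame in (65, 256, 512, 900, 1460):
    print(f"{frame:>5}-byte frames -> ~{throughput_mbps(frame, PPS_CEILING):7.1f} Mbps")
```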


A similar trend is seen with packet sizes of 256 bytes, 512 bytes, and 900 bytes. The Ruckus R730 consistently outperformed vendor 2 by a considerable margin (200-400% higher throughput).

Summary

Although Wi-Fi 6 clients are just starting to hit the market, IT organizations looking to upgrade will still reap significant benefits with the Ruckus R730, even for their Wi-Fi 5 devices. In fact, they may see an improvement due to the R730’s superior architecture. These performance improvements will not only help existing applications, but also position the AP to handle new technologies such as the greater demands of CPU-intensive WPA3 encryption.

You can visit the original blog on Ruckus’ website here.

Wi-Fi: $2 trillion and more than 13 billion devices

Thursday, May 23rd, 2019

Kevin Robinson, VP of marketing at the Wi-Fi Alliance, recently noted that Wi-Fi has contributed approximately $2 trillion to the world’s economy – with more than 13 billion Wi-Fi devices in active use worldwide. According to Robinson, Wi-Fi is the primary medium for global Internet traffic, as more than 80% of traffic on the average smartphone is transferred via Wi-Fi.


Today’s Wi-Fi: A victim of its own success

“While Wi-Fi has been incredibly successful, its success has brought a number of challenges,” he states.  “[Because] Wi-Fi [is] being used so broadly in different device types for different data applications, we see a very broad mix of data traversing Wi-Fi networks, which can ultimately lead to inefficiencies in how Wi-Fi is using a wireless medium.”

As Robinson explains, Wi-Fi is being utilized in ultra-dense deployments to provide coverage for stadiums and transportation hubs where the unscheduled, contention-based access paradigm of traditional Wi-Fi technologies can be problematic.

“We’re seeing Wi-Fi networks increasingly used to deliver connectivity between buildings in either enterprise or maybe city-wide deployments – and there are challenges that go along with that as well,” he adds.

Wi-Fi 6: Bridging the performance gap towards ten gigabit speeds

Wi-Fi 6 (802.11ax), says Robinson, can help address the above-mentioned issues and limitations. 

“The benefits of Wi-Fi 6 become more pronounced as you add more devices to the network. This is important for dense deployments in the enterprise, university campuses, as well as residential areas,” he elaborates. “Wi-Fi 6 delivers diverse capabilities. Because of the ubiquity of Wi-Fi, it is the primary connectivity means in everything from AR and VR headsets to IoT devices… Wi-Fi 6 [also] delivers a more deterministic experience, meaning a more consistent user experience, regardless of the environment.”

Indeed, as we’ve previously discussed on the Ruckus Room, the Institute of Electrical and Electronics Engineers (IEEE) has ratified five major iterations of the 802.11 Wi-Fi protocol, culminating with Wi-Fi 5 (802.11ac) in 2013. However, despite a significant increase in speed, many organizations still find themselves limited by the Wi-Fi 5 standard, particularly in high-density venues such as stadiums, convention centers, transportation hubs, and auditoriums. To meet the challenges of high-density deployments, the IEEE recently introduced the Wi-Fi 6 standard.

From our perspective, Wi-Fi 6 will successfully bridge the performance gap towards ten-gigabit speeds. It delivers faster network performance, connects more devices simultaneously and effectively transitions Wi-Fi from a best-effort endeavor to a deterministic wireless technology, further solidifying its position as the de-facto medium for internet connectivity. Deployed in dense environments, Wi-Fi 6 supports higher service-level agreements (SLAs) to more concurrently connected users and devices with more diverse usage profiles. This is made possible by a range of features that optimize spectral efficiency, increase throughput and reduce power consumption. These include Multi-User Multiple Input Multiple Output (MU-MIMO), Target Wake Time (TWT), Orthogonal Frequency-Division Multiple Access (OFDMA), BSS Coloring and 1024-QAM.

You can visit the original blog on Ruckus’ website here.