To Secure Banking Data, Map Out Its Journey In Advance

When it comes to stealing data, thieves often look to the finance industry for rich pickings. Banks must think differently about how they manage their data to reduce the risk of it falling into the wrong hands. By analysing a data record’s journey through the organization, financial institutions can govern and protect it along the way. In this article, we will explore how that works and where to begin as you rethink data governance.

Banks can’t afford to ignore data security in the light of mounting cyberthreats. In April, Global Banking and Finance Review reported on Verizon’s 2017 Data Breach Investigations Report. The financial services sector was the number one global target for data compromise, accounting for a full quarter (24 percent) of all breaches.

Hacked banks have littered the headlines for years. In 2015, JP Morgan and other banks were victims of a concerted cybercrime campaign that harvested the customer data of over 100 million people. The following year, criminals hit Qatar National Bank and leaked 1.4GB of sensitive data online, including passwords, PINs and payment card data for hundreds of thousands of customer accounts. These organizations all made the same understandable mistake: they didn’t map out their data’s journey properly.

Data is one of a bank’s most valuable assets, but if not tightly controlled, it can quickly become a liability. By separating a data record’s journey throughout a bank into distinct phases with individual characteristics, financial institutions can make use of its value while minimizing its risk. In its work across finance and other sectors, HANDD has identified five states in the data journey.

Creation

All records begin their journey when the bank creates them. This can happen via online channels, on the telephone, or in many cases, at the branch when new customers open accounts.

The records in scope go beyond sensitive customer data. Banks should also include documents created internally in their analysis, ranging from risk models through to operating reports.

Each of these data assets will have its own level of sensitivity and its own processing requirements outlined by financial regulators. Banks must keep some data records for longer than others, for example. Understand these characteristics and bind them to each individual document or record for future use. This is a key requirement if banks are to manage data consistently and at scale.

Data classification tools solve this problem with metadata. They tag each record with specific labels describing these characteristics, and downstream applications can ‘read’ those tags to decide how they should treat the associated record.
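To make the idea concrete, here is a minimal sketch of one possible form it could take: a record carries classification tags as metadata, and a downstream application reads those tags to decide how to handle it. The field names, labels and policy rules are illustrative assumptions, not features of any particular classification product.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative sketch: classification tags travel with the record as metadata,
# and downstream applications read them to decide how to treat the record.
# Field names and rules here are assumptions, not any vendor's schema.

@dataclass
class RecordMetadata:
    sensitivity: str        # e.g. "public", "internal", "confidential"
    retention_years: int    # how long regulators require the record to be kept
    contains_pii: bool      # whether it holds personally identifiable information

@dataclass
class DataRecord:
    record_id: str
    created: date
    meta: RecordMetadata

def handling_policy(record: DataRecord) -> dict:
    """A downstream application 'reads' the tags to decide how to treat the record."""
    return {
        "encrypt_at_rest": record.meta.sensitivity != "public",
        "restrict_sharing": record.meta.contains_pii,
        "retain_until": record.created + timedelta(days=365 * record.meta.retention_years),
    }

# Example: a new customer record tagged at the point of creation.
customer = DataRecord("CUST-0001", date.today(),
                      RecordMetadata(sensitivity="confidential",
                                     retention_years=7, contains_pii=True))
print(handling_policy(customer))
```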

Banks can train employees to choose the right labels and tag new records as a matter of course, so that no new records enter the organization without applications knowing what to do with them.

Go beyond tagging new records, though. All banks will have an ocean of existing data that they should also classify to avoid it becoming a liability. Use data discovery tools to find these records and classify them. In many cases, these tools can tag existing data based on predefined rules, leaving administrators to enlist employees’ help with ad hoc records that don’t fit the existing categories.
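As a rough illustration of that rule-based discovery, the sketch below scans existing documents against predefined patterns and queues anything unmatched for an employee to tag by hand. The patterns, labels and file names are made-up examples.

```python
import re
from typing import Optional

# Illustrative sketch of rule-based discovery: predefined patterns classify
# existing documents automatically; anything unmatched is queued for a person
# to tag. Patterns, labels and file names are made-up examples.

RULES = [
    (re.compile(r"\b\d{4}[ -]?\d{4}[ -]?\d{4}[ -]?\d{4}\b"), "payment-card-data"),
    (re.compile(r"\b[A-Z]{2}\d{2}[A-Z0-9]{10,30}\b"), "account-identifier"),
    (re.compile(r"risk model|stress test", re.IGNORECASE), "internal-risk-document"),
]

def classify(text: str) -> Optional[str]:
    """Return the first matching label, or None if no predefined rule applies."""
    for pattern, label in RULES:
        if pattern.search(text):
            return label
    return None

documents = {
    "statement_q1.txt": "Card 4111 1111 1111 1111 charged on 3 May",
    "lunch_rota.txt": "Tuesday: sandwiches in meeting room 2",
}

needs_review = []
for name, content in documents.items():
    label = classify(content)
    if label:
        print(f"{name}: tagged as {label}")
    else:
        needs_review.append(name)   # ad hoc records go to employees for manual tagging

print("For manual review:", needs_review)
```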

Classifying data up front will help to solve one of the biggest problems facing UK companies: a lack of differentiation in data security investment.

HANDD surveyed 304 IT professionals and found that 41.4 percent of them allocated the same level of security resources and expenditure to all company data, regardless of its importance. In theory, this means they are spending as much time and money securing internal memos about lunch breaks as they are securing customer records. These companies are not putting their security budgets to work effectively. Data classification can change that.

Storage

The next stage in the data journey highlights the importance of good classification. Without properly classifying data, a bank won’t know how to store it optimally. Should it encrypt the data on high-speed solid-state internal storage, or leave it in plain text on a slower hard drive?
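The sketch below shows one way a classification label could drive that decision automatically. The tier names and the mapping itself are illustrative assumptions, not a recommendation for any specific storage product.

```python
# Illustrative sketch: the classification label drives the storage decision.
# Tier names and the mapping are assumptions, not a product recommendation.

STORAGE_POLICY = {
    "confidential": {"encrypt": True,  "tier": "ssd-encrypted"},
    "internal":     {"encrypt": True,  "tier": "standard-disk"},
    "public":       {"encrypt": False, "tier": "archive-hdd"},
}

def storage_plan(sensitivity: str) -> dict:
    # Unclassified records default to the strictest treatment.
    return STORAGE_POLICY.get(sensitivity, STORAGE_POLICY["confidential"])

print(storage_plan("confidential"))   # {'encrypt': True, 'tier': 'ssd-encrypted'}
print(storage_plan("public"))         # {'encrypt': False, 'tier': 'archive-hdd'}
```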

Access

Proper data classification makes the next stage in the data journey easier: access. Not everyone should have access to all data. Tying information about data to its record makes it possible to decide automatically whether someone should see it, and whether they can edit it.

Identity and access management (IAM) systems have a big part to play at this stage. Financial institutions can use them to securely authenticate employees using enhanced mechanisms such as two-factor authentication.

2FA uses something you know, such as a password, along with something you have, such as a hardware token. Biometric systems take it a step further, using ‘something you are’, such as your fingerprint, making it still harder for one person to impersonate another.
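For illustration, the following sketch shows how the ‘something you have’ factor can work under the hood: a time-based one-time password (TOTP, RFC 6238) of the kind produced by a hardware token or authenticator app. The shared secret below is an example value only.

```python
import base64
import hashlib
import hmac
import struct
import time
from typing import Optional

# Illustrative sketch of the 'something you have' factor: a time-based
# one-time password (TOTP, RFC 6238) as generated by a hardware token or
# phone app. The shared secret below is an example value only.

def totp(secret_b32: str, period: int = 30, digits: int = 6,
         at: Optional[float] = None) -> str:
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if at is None else at) // period)
    msg = struct.pack(">Q", counter)                        # 8-byte big-endian time counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                              # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

secret = "JBSWY3DPEHPK3PXP"   # example base32 secret shared at enrolment
print("expected code:", totp(secret))
# The IAM system compares the code the user types against this value,
# in addition to checking their password ('something you know').
```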

Many banks will have an existing directory management system such as Active Directory. This will contain employee access credentials along with information about their roles and responsibilities at the bank.

When a user accesses a data asset, IAM systems can combine this information with record metadata to provide them with only the privileges that they need, and no more.
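A simplified sketch of that decision might look like the following, where a role drawn from the directory is combined with the record’s sensitivity tag to grant read or edit rights and nothing more. The role names, levels and rules are illustrative assumptions, not a description of any specific IAM product.

```python
# Illustrative sketch: a role drawn from the directory is combined with the
# record's sensitivity tag to grant only the privileges needed. Role names,
# levels and rules are assumptions, not any IAM product's model.

ROLE_CLEARANCE = {"teller": 1, "analyst": 2, "compliance_officer": 3}
SENSITIVITY_LEVEL = {"public": 0, "internal": 1, "confidential": 3}

def permissions(role: str, sensitivity: str) -> set:
    clearance = ROLE_CLEARANCE.get(role, 0)
    required = SENSITIVITY_LEVEL.get(sensitivity, 3)   # unknown data treated as most sensitive
    perms = set()
    if clearance >= required:
        perms.add("read")
    if clearance > required:                           # editing demands headroom above bare read access
        perms.add("edit")
    return perms

print(permissions("teller", "internal"))               # {'read'}
print(permissions("compliance_officer", "internal"))   # {'read', 'edit'}
print(permissions("teller", "confidential"))           # set()
```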

Sharing

The next stage in a data record’s journey is sharing. Left unmanaged, data has a way of leaving an organization with potentially disastrous consequences, as users share it inappropriately via channels ranging from email to social media, or plain old printed paper.

Rights management and data leak prevention systems can govern how users share records, and with whom. Combine these tools with user training to prevent inappropriate sharing.
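As a rough sketch, a data leak prevention check of this kind could combine the record’s tags with the chosen channel and recipient before allowing a share. The channel names, bank domain and rules below are illustrative examples only.

```python
# Illustrative sketch of a data leak prevention check run before a record is
# shared. Channel names, the bank's domain and the rules are made-up examples.

APPROVED_CHANNELS = {"secure-file-transfer", "internal-email"}

def may_share(sensitivity: str, contains_pii: bool,
              channel: str, recipient_domain: str) -> bool:
    if channel not in APPROVED_CHANNELS:
        return False           # social media, webmail, print queues etc. are blocked outright
    if contains_pii and not recipient_domain.endswith("examplebank.com"):
        return False           # customer data must not leave the organization
    if sensitivity == "confidential" and channel != "secure-file-transfer":
        return False
    return True

print(may_share("confidential", True, "secure-file-transfer", "examplebank.com"))  # True
print(may_share("confidential", True, "internal-email", "gmail.com"))              # False
```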

Without this level of visibility and control, banks face expensive mistakes. Earlier this year, Scottrade admitted that the personal data of 20,000 customers was openly available online after a third-party vendor accidentally uploaded it to a server in plain text. When it came to managing that sensitive data, no one was at the wheel.

Disposal

Given the financial sector’s heavy regulation, banks should not overlook what happens to a data record when it reaches the end of its life. Eventually, a bank must dispose of the record either because it is no longer useful, or because regulators mandate it.

Map a record’s metadata against a disposal policy. Some records may need erasing. Others may still have value for historical data analysis, but might need to be stripped of personally identifying information and used in aggregated big data analytics. Create these policies so that you can dispose of or archive your data records responsibly.
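One possible shape for such a policy is sketched below, mapping a record’s retention metadata to a retain, erase or anonymise-and-archive action. The action names and thresholds are illustrative assumptions, not regulatory guidance.

```python
from datetime import date, timedelta
from typing import Optional

# Illustrative sketch: a record's retention metadata is mapped to a disposal
# action once its mandated retention period ends. The action names and
# thresholds are assumptions, not regulatory guidance.

def disposal_action(created: date, retention_years: int, contains_pii: bool,
                    useful_for_analytics: bool, today: Optional[date] = None) -> str:
    today = today or date.today()
    if today < created + timedelta(days=365 * retention_years):
        return "retain"                                   # still inside its retention period
    if useful_for_analytics:
        # Strip personally identifying information before archiving for analytics.
        return "anonymise-and-archive" if contains_pii else "archive"
    return "erase"

print(disposal_action(date(2009, 1, 1), retention_years=7,
                      contains_pii=True, useful_for_analytics=True))
# -> 'anonymise-and-archive'
```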

All sectors must deal with these data governance challenges, but the financial sector has historically been especially data-intensive, meaning that the stakes are higher in this innovative and fast-moving industry. By thinking about data governance systematically, you can avoid your financial institution becoming the next unfortunate headline.