Standard Operating Procedures Ensure Data Security


When people think about information security, they often think about blocking malware, preventing system attacks and encrypting data. While traditional security measures are certainly a must, it’s important for organizations to also consider their data in the context of their business.

There are many operational best practices that can be applied to protect data as it gets used. Here are just a few to strive for as you deliver the best business workflow experience possible:

  • Define who has access to what data, from where, and when.
  • Define what people can do with the data when they access it.
  • Control documents in, around and outside of the organization.
  • Make sure things are done in the correct order and correct manner.
  • Monitor and report on who had access to what and when, and what they did with and to the data and the documents.
  • Ensure processes and supporting workflows adhere to internal policies and to industry and regulatory standards such as HIPAA, SOX and ISO.
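The first two items in the list amount to an access policy: who, what data, what actions, from where, and when. A minimal sketch of such a policy as data plus a check function might look like the following; the dataset, role and network names here are hypothetical, not any particular product's API.

```python
from datetime import time

# Illustrative access policy: who has access to which data, what they can
# do with it, from where, and when. All names are hypothetical.
POLICY = {
    "finance-reports": {
        "roles": {"finance", "executive"},      # who has access
        "actions": {"read", "annotate"},        # what they can do with it
        "networks": {"corp-lan", "vpn"},        # from where
        "hours": (time(7, 0), time(19, 0)),     # and when
    },
}

def is_allowed(dataset, role, action, network, at):
    """Grant access only when every dimension of the policy is satisfied."""
    rule = POLICY.get(dataset)
    if rule is None:
        return False  # default-deny: no rule means no access
    start, end = rule["hours"]
    return (role in rule["roles"]
            and action in rule["actions"]
            and network in rule["networks"]
            and start <= at <= end)
```

The default-deny branch matters: any dataset without an explicit rule is inaccessible, which is the safer failure mode for the monitoring and reporting items later in the list.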

It’s also important to look at data management from the security CIA triad perspective:

  • Confidentiality: Information is not made available or disclosed to anyone or anything without proper authorization.
  • Integrity: Information is authentic, complete and accurate throughout its entire lifecycle.
  • Availability: Information, both stored and in transit, is there when needed by the people, systems and processes requesting it.

Things get really interesting when you apply these best practices and perspectives to some real-world scenarios as presented below.

Scenario 1: M&A Deals

Consider the scenario where the business development team is working with a partner on a project, and the team decides to take the relationship to the next level. They must communicate and collaborate with inside counsel, outside counsel, technical teams, business teams, the board, and financial support teams. They also need to set up a deal room and then move content in-and-out of that room.

They then need to control access to the content while also auditing what gets put in, moved, read, edited, signed, and removed. The team may also need to control the timeframe for the deal to be completed: one week to review the materials; two weeks to comment on the materials; and one week to sign the agreements.

When the deal is done, the deal room must be decommissioned, content removed, permissions disabled, and the environment wiped clean. A record of what and why things happened must also be recorded: who approved what—captured in logs and in the final documents and PDFs.

When you don’t have a process in place, especially for specialized cases where the players and content change rapidly and unexpectedly and content flows in and out of the company, information management becomes extremely complicated. And when things get complicated, people tend to move outside the systems and processes defined for use by the IT department.

The goal for M&A teams should be to optimize the process to make it easy to build the information management workflows required to get the deal done—while also securing and protecting the information.

Mature companies that transact more than one M&A deal should consider defining the workflow process as a template. Provision a clean M&A room exactly as it is supposed to be, do it once, and save it as a template. The team can then modify the room once the deltas are known: how long will it be active, and who should be invited? Select the template, spin it up…and the clean room is created.
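The template idea can be sketched in a few lines, assuming a simple provisioning system; the phase names, durations and fields below are illustrative, not a real deal-room API.

```python
from copy import deepcopy
from datetime import date, timedelta

# Illustrative deal-room template: build the clean room once, then stamp
# out a copy per deal and adjust only the deltas (duration, invitees).
DEAL_ROOM_TEMPLATE = {
    "phases": [("review", 7), ("comment", 14), ("sign", 7)],  # days each
    "participants": [],
    "audit_log": [],  # every put, move, read, edit, sign and remove lands here
}

def provision_deal_room(start, invitees):
    """Create a clean room from the template with per-deal deltas applied."""
    room = deepcopy(DEAL_ROOM_TEMPLATE)   # the template itself stays clean
    room["participants"] = list(invitees)
    room["schedule"], deadline = [], start
    for phase, days in room["phases"]:
        deadline += timedelta(days=days)
        room["schedule"].append((phase, deadline))
    return room
```

The deep copy is the point of the pattern: each deal gets a fresh, fully provisioned room, and the master template is never mutated.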

Scenario 2: Employee Lifecycle Management

Next, let’s look at the process of managing employees. Most organizations have well-defined policies for onboarding new employees: they are given access to data available through systems and applications. Beyond this, the processes required to govern data access throughout the lifecycle of an employee often fall flat.

We can begin with the moments immediately following the onboarding event, as employees often ask for additional access to the systems, applications and data they require to perform their work. The first few requests may get a good look from the IT and/or InfoSec teams as part of the overall onboarding picture. But requests that come days and weeks into an employee’s tenure may fall outside the formal scrutiny required to evaluate that access in connection with the access granted on the first day of employment.

Things get more complicated as employees move around the organization: new roles, new responsibilities, promotions, demotions, new locations, new organizational structures, mergers and acquisitions events, and more. How can an organization accurately control their employees’ access over time?

Finally, there are both planned and unplanned off-boarding processes. Employees take medical and personal leave, permanently leave the company on good terms, quit on not-so-good terms, and get let go. Sometimes they leave the premises immediately; other times they stay on to complete a project. In short, the off-boarding process can be a bear.

When an employee leaves the company, it is extremely difficult to ensure all the right things happen at the right time, by the right people, and in the right order. Sometimes it’s important to keep certain staff members unaware of an employee’s departure until it is time for those staff members to perform their tasks that are part of the off-boarding process—not too soon, not too late.

Also, when the company is in the throes of wrapping up an employee’s departure, people often forget to do things:

  • Did you collect the parking pass from the employee and give them a validated parking sticker so they can leave the parking structure?
  • Did you collect their laptop?
  • Did you block access to Dropbox before the employee was let go, so they couldn’t first upload all of the data on their company-provisioned computer?
  • Did you shut down access to all systems—even those the employee can access through remote access?
  • Did you close down their mailbox and re-route their emails to their manager?

A number of these things are standard and can be automated. Workflows should be created as templates and applied to the employee lifecycle management process as employees apply to the company, join the company, change their role, and leave the company.

At each application of the template to an employee lifecycle event, notifications should be created and sent to the security team to inform them of the event. This way they can respond with system actions such as updating firewall rules, as well as monitoring, blocking or alerting on access to certain systems by the affected employee.

The InfoSec team can also be alerted to the fact that the employee gave notice so they can keep an eye on this person…and block large data uploads from the employee’s laptop to Dropbox, for example.
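This event-driven pattern can be sketched as follows; the event names and checklist items are assumptions drawn from the scenario above, not any specific workflow product.

```python
# Hypothetical lifecycle-event dispatcher: each event notifies the security
# team and returns the ordered checklist to execute, so nothing is forgotten.
OFFBOARDING_TASKS = [
    "collect parking pass and issue validated exit sticker",
    "collect laptop",
    "block cloud-storage uploads (e.g. Dropbox)",
    "disable all system access, including remote access",
    "close mailbox and re-route email to manager",
]

def handle_event(event, employee, notify):
    """Alert the security team, then return the tasks for this event."""
    notify(f"security-team: {event} for {employee}")
    if event == "gave-notice":
        return ["monitor large outbound data transfers"]
    if event == "offboard":
        return list(OFFBOARDING_TASKS)  # fixed order, applied from template
    return []
```

Because the checklist lives in the template rather than in anyone’s head, the "right things, right time, right order" requirement is enforced by the workflow itself.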

Minimizing Errors with Workflows

The scenarios presented above focus primarily on access and availability. They don’t directly touch on the issues revolving around data integrity.

Data integrity means data meets the following criteria:

  • Available in the form in which it is expected: a number is actually a number, not a string.
  • Current, including all of the latest authorized changes; all authorized transactions must be accounted for.
  • Accurate with all unauthorized changes flagged; processing of invalid data must be handled properly.
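As a sketch, the three criteria above might translate into checks like these for an incoming transaction record; the field names are illustrative assumptions.

```python
# Minimal integrity checks mirroring the three criteria above:
# form, authorized changes, and flagging of invalid data.
def validate_record(record, authorized_txn_ids):
    """Return (ok, errors); invalid data is flagged, never silently used."""
    errors = []
    # Form: a number must actually be a number, not a string.
    if not isinstance(record.get("amount"), (int, float)):
        errors.append("amount is not numeric")
    # Authorization: only authorized transactions are accounted for.
    if record.get("txn_id") not in authorized_txn_ids:
        errors.append("unauthorized transaction")
    # Accuracy: the caller handles flagged records explicitly.
    return (not errors, errors)
```

The key design choice is returning the error list rather than raising immediately: flagged records can then be routed to a review step instead of being dropped or, worse, processed anyway.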

As much as possible, systems should automatically pull data from systems of record whenever and wherever they should. All too often, users are forced to copy data from an application on the left screen and paste it into another application on the right. Or worse, they are forced to re-key the information from the left screen into the right one.

During this manual process, you introduce the possibility of error—or even fraud.

Consider a bank manager working with a client to complete a loan application, such as a car loan. The client would sit down with the bank manager to apply, and the bank manager would then have to jump back and forth between multiple applications and spreadsheets, saving the data to a database and finishing the application in a Web browser. With all of the switching, copying, pasting, and re-typing, the bank manager could easily capture the wrong interest rate, loan terms, credit score or customer details.

Alternatively, the bank could implement a workflow that leverages the systems and databases to grab the right information from the right sources, sources that are trusted and have integrity. From within this same workflow, the system could automatically generate a legally binding contract and present it to the client to collect a digital signature. The now-signed document could be automatically saved back to the system of record, where it could be kept for seven years for compliance purposes.
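A hedged sketch of that workflow, with the systems of record passed in as stand-ins; every name and method here is hypothetical, chosen only to show the shape of the pattern.

```python
# Hypothetical loan-origination workflow: pull data from trusted systems of
# record (no copy/paste or re-keying), generate the contract, collect a
# signature, and archive the result for the seven-year retention period.
def originate_loan(client_id, crm, credit_bureau, rates, sign, archive):
    data = {
        "customer": crm.lookup(client_id),            # trusted source
        "credit_score": credit_bureau.score(client_id),
        "rate": rates.current("auto-loan"),
    }
    contract = (f"Loan contract for {data['customer']} "
                f"at {data['rate']}% (score {data['credit_score']})")
    signed = sign(contract)                   # digital-signature step
    archive.store(signed, retention_years=7)  # back to the system of record
    return signed
```

Because every value is fetched programmatically, the wrong-rate and wrong-score errors from the manual version simply have no place to enter.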

Of course, there are always exceptions to the rule, but workflows can be updated to handle these as well. For example, additional managerial approvals could be required based on the size of the loan, or the contract could be custom-created based on the client’s risk profile or location. And nine times out of ten, you don’t want a document deleted without approval and proof of the deletion event, such as an archived PDF copy of the deleted document.

Identifying and Thwarting Intentional and Unintentional Data Handling Errors

Information security is about strengthening the weakest link in the chain. You need to understand where your data comes from, where it lands, where it goes within the organization, and how it is used. As employees are hired, transfer to new roles and leave the company, maintaining data security grows exponentially more complex. Essentially, you need to understand what happens to your data, and where.

Your organization can better ensure data confidentiality, availability and integrity by deploying operational best practices and workflows that enable your InfoSec team to monitor employee activity and to always know how data is handled throughout each employee’s lifecycle. Only then can you identify and thwart intentional and unintentional data handling errors that might put your company at serious risk.

Ryan Duguid
Vice President of Products


Ryan Duguid is vice president of products at Nintex, where he is responsible for setting product and platform vision, driving continuous innovation, and delivering technology to help everyday people solve their process problems.
