Thursday, December 17, 2015

Consensus at last - but what does the EU General Data Protection Regulation mean for you?

Discussions over the EU General Data Protection Regulation (GDPR) have rumbled on since 2012. Consequently, it's understandable that this week's breaking news about a final agreement over the legislation already seems like old news. However, while it may have been almost three years since the need for change was acknowledged, the regulation as it stands today is vastly different to that under which organisations currently operate.

As a result, there is an inevitable widespread need for an update to policy, procedure and technology. With the regulation on track to be formally adopted in January 2016 and enforced a short two years later, organisations need to evaluate, implement and adopt processes and technology now, so they don’t fall foul later.

Two points to watch out for

Across the board, two of the most significant changes to be introduced are mandatory reporting of data breaches that are 'likely to harm individuals' within 72 hours and hefty fines of up to 4% of global turnover for non-compliance (the ICO's current maximum of £500,000 will pale in comparison for many large organisations).

Mandatory notification is expected to result in a rise in the number of data breaches being reported - not because more breaches are happening but because fewer can be swept under the carpet. Consequently, organisations will be forced to open themselves up to scrutiny, with regulatory bodies looking at how the sensitive data they handle is protected throughout its lifecycle. Any shortcomings will be exposed and will count against them.

As we recently examined, TalkTalk's data breach from October 2015 is estimated to cost them £35m in one-off costs alone. We need only add 4% of their global turnover to that and we can see why the EU GDPR will be keeping CFOs awake at night!

The good news is that now there's clarity, there can be action. Boards across Europe need to immediately start planning and implementing the right processes, training and technologies to protect the entire lifecycle of their data so they're prepared for when the regulation is enforced. We can see from previous breaches that it is the small slip-ups, caused by human error, that have been the most common and largely the most damning. As a result, security policies need to be matched with user training and education, and underpinned by smart, intuitive technology. Getting a head start on this now can only pay dividends in the future.

Friday, December 11, 2015

There’s no job security when your job is security

“There’s no job security when your job is security”. That’s the kind of line that would be enough for any CSO, CIO or even CEO to start penning their resignation letter.

The reality is obviously somewhat different. However, if the last 12-18 months have taught us anything, it is that no one is exempt from a high-profile data breach – breaches so severe that jobs can be lost and reputations so badly damaged that businesses are put at risk.

Finally, it seems, the penny has dropped. Organisations and services including the likes of TalkTalk, Facebook, Gmail and Twitter now accept that no set of security measures is completely impervious to a breach.

As a result, they are starting to assess two things.

The cost of a data breach

Research carried out by IBM and the Ponemon Institute earlier this year found that on average, the global total cost of a data breach increased from $3.52m to $3.79m within the last year. The average cost paid for each lost or stolen record with sensitive data rose as well, to $154, from $145 in 2014. In the case of TalkTalk, it is estimated their breach could cost as much as £35m.

Of course, a monetary value also tells us nothing about the inconvenience and emotional cost of a breach to the real victims of PII loss – you and me. Consumers are now much more aware both of the risks of a breach and their rights if the worst happens. For example, research by Deloitte warns that three-quarters of customers would reconsider using a company in the event of a breach.

What to do when the inevitable happens

Almost as damaging as the breach itself is a company that appears to have no grip on exactly what happened or how bad the breach was. Again, take TalkTalk as an example. Their high-profile breach and the media circus that followed were made worse by their own confusion about what had happened and the lack of communication to their already worried customers. In fact, it was more than 24 hours before customers were even notified there had been a breach. What then followed was confusion about what data had been stolen, the number of accounts affected and whether the stolen data had been encrypted in the first place. TalkTalk's CEO continues to cling onto her job and claims to currently have the support of the founder and the board. However, one has to question how long this will be the case, particularly once the true implications of the breach are felt through lost revenue and loss of customer support.

The risk to data extends further than just a cyber-attack

Organisations need to consider the complete lifecycle of the data they own and manage, so that they understand where the vulnerabilities lie. The threat could, of course, be an external cyber-attack orchestrated by a third party intent on accessing and profiting from sensitive data. However, it could also be an inexperienced employee sending highly sensitive information in a clear-text email to the wrong recipient, as highlighted by the recent email breach at the North Carolina DHHS.

As research shows, often the biggest risk to any business is human error.

So what does a CSO, CTO or CEO make of this? In time I think we will reflect on these high-profile breaches and realise that they signalled a gear change in data security. At an exec / board-level, suddenly focus and – more importantly – budget are being allocated to better understand all aspects of data security across a business. No longer will complacency rule, because everyone knows that in all likelihood they will at some point be forced to answer the question:

“You had one job: Secure the data. What happened?”

If this results in greater information assurance and more rigorously tested security measures and processes, then it has to be a positive for our data and for our confidence as consumers in the market.

Wednesday, November 25, 2015

Underpinning Public Sector reform with smart and secure communication

The major spending reviews of the last eight years have put the Public Sector under unprecedented pressure to preserve high levels of citizen service and support, whilst battling reductions in budget and staff resources. This situation is not set to change any time soon, with George Osborne today announcing further spending cuts for this parliament.

Alongside this, the Public Sector has also been challenged with transforming the way it delivers services, moving away from a traditional ‘vending machine’ approach towards one based on insight, intelligence and early intervention.

So, how can public sector workers face these challenges and, as seems to be their common rhetoric, do more with less?

Delivering the new vision for public services

Let’s take a (simplified) example of a child suffering from recurrent chest infections, probably linked to damp living conditions. In a traditional, fragmented system it is harder for a GP or health worker to make a meaningful difference for the child: the correct medical intervention is to tackle the infection, but this does not resolve the underlying cause.

The new vision for public services calls for a more coherent and integrated approach to service delivery that tries to get to the root of the problem. Integrated networks of organisations and individuals able to work together seamlessly are key to this.

However as many areas are finding, this is easier said than done. What’s clear is that these new place-based models cannot work if the information flows needed to support them are stagnant and fractured.

This, therefore, puts the need for intuitive and easy-to-use communication solutions at the heart of public service delivery, bringing together professionals from across health, education, blue light, local authorities, and increasingly third and private sector organisations. What’s more, information security must be built into this from the start, meaning citizens’ personal data – be that name, contact information, health details, etc – can only be accessed by approved individuals.

As the public sector is increasingly asked to tailor service delivery to meet individuals’ unique needs, it will inevitably require secure communication solutions that can support this level of flexibility, while also providing sophisticated information security, and truly delivering cost and efficiency benefits.

Tuesday, July 7, 2015

How can schools share sensitive pupil data securely?

Schools are expected to process and share increasing amounts of information about pupils – from exam results being sent to governing bodies, to information about ethnicity, special educational needs and medical conditions being shared with approved organisations such as local authorities, and health and social care providers. This is necessary to ensure that not only are curriculum standards being met but that schools are providing holistic care for the pupils in their charge.

Yet schools need to be aware of the types of data they are sharing – and how to do this securely.

Most of the information shared is personal data, as it includes names, gender and dates of birth. However, sensitive personal data, including ethnicity, physical and mental health, sexuality, and criminal records, can also be shared with these external organisations. Therefore, it is essential that schools ensure the correct technical steps are being taken in order to protect this information as it leaves their institutions.

Sharing pupils’ data outside of schools 

Despite the sensitive nature of this data, a concerning number of schools continue to use insecure mechanisms for sharing this information. Data exchanged via plaintext emails, fax and even post could not only compromise children’s privacy but also expose institutions to fines of up to £500,000 from the Information Commissioner’s Office (ICO) if a data breach takes place.

To resolve this issue, in ‘Inspecting e-safety’, Ofsted has declared that it is inadequate practice to send pupils’ information without using encryption technology to protect this highly sensitive data. In addition, according to the ICO, email and file encryption solutions certified via CESG’s CPA scheme are best suited to meet the appropriate security levels required by schools, as well as help them remain DPA compliant.

Top tips for secure pupil data exchange  

In order to protect pupils’ data that is shared over the internet, schools need not only to implement risk management policies and procedures (such as staff training and / or the shredding of all confidential paper waste) but also to ensure appropriate technical measures (such as encryption software) are put in place, including:

Email and file encryption

Mechanisms for secure electronic transfer vary widely; however, it is important that they offer robust encryption, sophisticated functionality and ease of use – all without affecting existing work processes and infrastructure. In practice, this looks like:

  • Capabilities that integrate seamlessly with existing email clients, such as Microsoft Outlook, so school staff don’t need to log into separate systems to send emails and files securely
  • Ability to provide real-time access control over encrypted emails and file attachments, as well as time-based access restrictions, to reduce the impact of sending information in error and / or third parties mishandling data  
  • Embracing cloud technology securely. As an increasing number of schools move their systems to the cloud via Office 365 Education and Google Apps, they need to provide the highest level of assurance around who can access this data both in transit and when stored in users’ mailboxes 
  • Easy for recipients to use. Uptake of any encryption solution depends on recipients being able to not only understand its necessity but actually be able to intuitively use it

Secure web form 

While email and file encryption are useful when schools need to share sensitive information externally, a secure web form provides an alternative mechanism for securing pupils’ data flowing back into the network. This is particularly important when parents need to provide information, including passport scans, when pupils join a new school. Some of the key advantages of a secure web form include:

  • Security: Providing third parties with an encrypted solution to use means that pupils’ sensitive data is always awarded the correct level of information assurance
  • Simplification and improved efficiency: Web forms provide a single point of contact for numerous third parties and can be integrated internally to populate existing systems and workflows, reducing the admin time schools need to spend simply processing incoming data
  • Cost-effectiveness: A secure web form can replace the need to send personal data and any other information by post or courier

Thursday, June 18, 2015

Bank of England bans ‘autocomplete’ – but is this really the best way forward?

We’ve all done it. Hit ‘Send’ and suddenly realised you cc’d in Dave from Marketing instead of Dave from HR, felt that immediate sickening feeling and realised that, at best, you’ve made yourself look a bit foolish. At worst – and likely what we all haven’t done – you’ve managed to send highly confidential information about Britain’s potential exit from the EU (or ‘Brexit’) to a Guardian journalist.

Unfortunately, that’s what happened to the Bank of England’s Head of Press last month. Not only did the email include details about research into the financial implications of Brexit, termed ‘Project Bookend’, it ironically also included instructions on how to fend off enquiries about this top-secret activity.

In an arguably knee-jerk reaction, the BoE have since announced the disabling of ‘autocomplete’ functionality for their email platform – meaning employees will need to type email addresses out in full every time they send an email.

But is this really the right course of action to take?

In some ways, it is encouraging to see the BoE taking information security seriously. Data protection is relevant for all organisations – whether you’re handling traditionally recognised ‘personally identifiable information’ or, as in this case, commercially sensitive data and intellectual property.

However, it is likely that turning off autocomplete is going to meet with a lot of frustration amongst BoE employees. Not only will it be a time-consuming process for staff to laboriously type every single address for every single email sent, just imagine the bounce rate (and therefore repeated processes) for typos! Plus, this solution won’t actually provide any control over the email addresses BoE employees type in.

Frustratingly for them, the technology exists that would allow the BoE to have the best of both worlds – business convenience and data protection. It is mystifying why they haven’t instead implemented smart technology that could control who confidential information is sent to, and accessed by, and what they can do with it. Data protection doesn’t need to take us back into the Dark Ages of Technology – organisations just need to be aware of what information security solutions are already available. 

Thursday, May 28, 2015

The future of encryption depends on giving customers choice, flexibility and control

Whether a government department or multinational enterprise, our customers face the same challenge: How do you share sensitive electronic information internally and externally while maintaining compliance, security and control? For organizations working in highly regulated sectors, including government, financial and insurance services, and private healthcare (where the sensitivity of data is of particular importance), this challenge sits high on the agenda.

A broad platform

Egress Software Technologies provides hosted and on-premises encryption services, designed to secure all forms of electronic information and delivered to customers in the public and private sectors using a single platform: Egress Switch. The award-winning Switch portfolio of products includes Secure Email, Secure File Transfer, Secure Web Form, and the latest online collaboration offering, Secure Workspace.

Increasingly we need to be able to offer our customers a truly flexible and scalable, cloud-based, on-premises or hybrid solution that caters to a variety of complex data-sharing and infrastructure needs.

A scalable platform

We have utilized the Azure-hosted platform to help customers encrypt data and files using a combination of services designed to seamlessly integrate with on-premises Exchange or hosted Office 365 environments, enabling end users to share confidential information securely using desktop, mobile, or web-based applications. Azure lets us quickly and easily deliver our award-winning encryption services so that end users can benefit from enhanced information security when sharing sensitive data.

Tuesday, April 21, 2015

Encryption 101: The Vigenère cipher

The Vigenère cipher (as it is currently known) was created by Blaise de Vigenère in 1585. However, it is worth mentioning that the cipher has undergone many reinventions over time and its original method is actually believed to have been created by Giovan Battista Bellaso, who first mentioned it in his book ‘La cifra del. Sig. Giovan Battista Bellaso’ in 1553.

A solution to frequency analysis

As you might already know, particularly if you’ve read any of the previous entries in the Encryption 101 series, most of the ciphers we’ve looked at up until now were vulnerable to the cryptanalysis method known as ‘letter frequency analysis’.

The Vigenère cipher, however, is a polyalphabetic substitution cipher and offers some defence against letter frequency analysis. In essence, while the functions of this cipher are very similar to that of the monoalphabetic substitution ciphers that we’ve looked at before, rather than using a single alphabet when encrypting information, we make use of multiple alphabets – 26 of them to be precise!

Vigenère square

Creating the square is fairly simple. On the top line, write out the alphabet going from A to Z. On the next line, move every letter one space to the left, wrapping any overflow round to the end of the row. Repeat this for the remaining rows until you have the square shown below.
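If you'd rather generate the square than write it out by hand, the construction described above is only a couple of lines of Python (a quick sketch; the function name is ours, nothing standard is assumed):

```python
import string

ALPHABET = string.ascii_uppercase

def vigenere_square():
    """Build the 26x26 square: each row shifts the alphabet one
    place further left, wrapping the overflow to the end of the row."""
    return [ALPHABET[i:] + ALPHABET[:i] for i in range(26)]

for row in vigenere_square():
    print(' '.join(row))
```

Row 0 is the plain alphabet, row 1 starts at B, row 3 (the 'D' row used by our key below) starts at D, and so on.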


In order to encrypt the message, first of all a key has to be agreed upon. In this example, we’ll be using ‘DRAX’. Next, the key is repeated until it matches the length of the message being encrypted – for example:

Key: DRAX 

Plaintext: Nothing goes over my head. My reflexes are too fast, I would catch it.


Ciphertext: Qfteleg drvs lyvr jb yexg. Dy ohwlbavs xuv tlr wapw, Z wlxcd zdkce lk.

To encrypt, we now take the first letter of the plaintext and pair it up with the first letter of the key string. One is placed along the top row and one is placed in the first column. The letter at the point where they intersect will be the first letter of the ciphertext, in this case: 


To decrypt a piece of ciphertext, we follow much the same method used to encrypt the message. We place the first letter of the key in either the top row or the first column. Then, we follow the line along until we hit the first letter of the ciphertext: ‘Q’. The letter at the top of the column or row where the intersection occurs is the first letter of the recovered plaintext.
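The table lookups above are equivalent to simple modular arithmetic on letter positions (adding the key letter's position to encrypt, subtracting it to decrypt). The sketch below implements both directions that way, preserving case and passing punctuation through unchanged, which matches the worked example; the function name is ours:

```python
import string

ALPHABET = string.ascii_uppercase

def vigenere(text, key, decrypt=False):
    """Encrypt (or decrypt) text with a repeating Vigenere key.

    The key only advances on letters, so spaces and punctuation
    pass through unchanged and do not consume a key letter.
    """
    out = []
    key_index = 0
    for ch in text:
        if ch.upper() in ALPHABET:
            shift = ALPHABET.index(key[key_index % len(key)].upper())
            if decrypt:
                shift = -shift
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
            key_index += 1
        else:
            out.append(ch)
    return ''.join(out)

message = "Nothing goes over my head. My reflexes are too fast, I would catch it."
secret = vigenere(message, "DRAX")
print(secret)                                   # the ciphertext from the example
print(vigenere(secret, "DRAX", decrypt=True))   # round-trips to the plaintext
```

Running this reproduces the ciphertext shown above, and decrypting with the same key recovers the original message.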


By using multiple different alphabets, we’re now slowly gaining some defence against letter frequency analysis since the letter ‘N’ won’t always encrypt to the letter ‘Q’. However, that’s not to say the cipher is bulletproof.

The main weakness of this cipher comes down to the length of the key used. Since we used a four-letter key in our example, we had to repeat the key multiple times to ensure it matched the length of our message (54 letters). This use of a repeating key will inevitably result in some patterns occurring in the resultant ciphertext, and from these patterns the likely length of the key can eventually be deduced.

One such method that can be used when trying to deduce the length of the key is known as the ‘Index of coincidence’. Like letter frequency analysis, it is focused on looking at ‘normal’ patterns that occur in texts and how the ciphertext deviates from these patterns.
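The index of coincidence is simply the probability that two letters drawn at random from a text are the same: English prose scores roughly 0.066, while uniformly random letters score about 0.038. A minimal sketch of the calculation (the function name is ours):

```python
from collections import Counter

def index_of_coincidence(text):
    """Probability that two randomly chosen letters in the text match."""
    letters = [c for c in text.upper() if c.isalpha()]
    n = len(letters)
    counts = Counter(letters)
    return sum(f * (f - 1) for f in counts.values()) / (n * (n - 1))

# A candidate key length is tested by slicing the ciphertext into that
# many columns: the correct length pushes each column's score back up
# towards English-like values, because each column is single-alphabet.
```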

Once we know the key length, the ciphertext can be rearranged into a series of columns, one for each letter of the key. We then know that every letter in a given column was encrypted with the same key letter – in other words, with a Caesar cipher. With this information, our old friend frequency analysis can be used to help reconstruct the key.
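Once the key length is known, slicing the ciphertext into those columns is a one-liner; each resulting column can then be attacked as an ordinary Caesar cipher. A small sketch (the helper name is ours):

```python
def columns(ciphertext, key_length):
    """Group every key_length-th letter together.

    Column i contains every letter that was encrypted with key
    letter i, i.e. a single Caesar shift per column.
    """
    letters = [c for c in ciphertext.upper() if c.isalpha()]
    return [''.join(letters[i::key_length]) for i in range(key_length)]

print(columns("QFTELEGDRVS", 4))
```

Frequency analysis is then run on each column independently to recover that column's shift, and hence that letter of the key.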

Even with this weakness, however, the use of multiple alphabets proved an effective method of protecting information for over two centuries, earning the cipher the name ‘le chiffre indéchiffrable’, or in English: ‘the unbreakable cipher’. It was finally defeated in 1854 by the English cryptographer Charles Babbage – who required a mix of cunning, intuition and cryptographic genius to finally break the unbreakable cipher.

A glimpse of perfection

The Vigenère cipher also gave us perhaps our first glimpse of ‘perfect’ cryptography – that is to say, information that is ‘theoretically secure’. If we had a 250-character message encrypted with a 250-character random key, then there would be no clues as to what the plaintext is or what key was used. Perhaps even more incredible, this 250-character ciphertext can be decrypted into any 250-character plaintext message by choosing the appropriate 250-character key – so how do we know which message was the real message?!
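We can make that last point concrete: given any ciphertext, we can always construct a key that ‘decrypts’ it to any same-length plaintext we like, so the ciphertext alone tells an attacker nothing about which message is real. A small sketch, uppercase letters only (the function name is ours):

```python
import string

ALPHABET = string.ascii_uppercase

def key_for(ciphertext, desired_plaintext):
    """Derive the key that would decrypt ciphertext to the chosen plaintext.

    Each key letter is just the positional difference between the
    ciphertext letter and the plaintext letter, modulo 26.
    """
    return ''.join(
        ALPHABET[(ALPHABET.index(c) - ALPHABET.index(p)) % 26]
        for c, p in zip(ciphertext, desired_plaintext)
    )

# The same ciphertext 'decrypts' to two completely different messages:
print(key_for("QFTELEG", "NOTHING"))  # recovers the repeating key DRAX...
print(key_for("QFTELEG", "MEETHIM"))  # an equally valid key for another message
```

With a truly random, message-length key every candidate plaintext is equally plausible, which is exactly the one-time pad property discussed below.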

This ‘theoretically secure’ idea will be explored more when we take a look at the one-time pad cipher, which saw use in World War II and in an early version of the ‘Red Phone’ that connected the White House to the Kremlin.

Friday, March 20, 2015

Securing shared services

It comes as no surprise that two of the biggest drivers towards multi-agency working in the Public Sector are cost and efficiency. In efforts to lower spend, increasing numbers of public bodies and organisations are working together to meet combined goals and doing so using digital solutions.

Yet a major stumbling block exists in the form of ‘closed community’ accredited networks and supported mail systems – such as the PSN, GCSX and CJSM. Although individually they can facilitate secure communication between organisations of the same type or function – who therefore sit within the same umbrella networks and systems – they often fail to do so between different government organisations, as well as with private and third sector partners.

This issue of information security and assurance risks undermining multi-agency working and, ultimately, the evolution of service delivery.

You can only collaborate with confidence if you can share information securely

The challenge: How can organisations create secure environments to work together outside of trusted networks?

Firstly, you need to find a suitable solution that will meet the project’s aims while also securely bridging the divide between existing government supported systems and the organisations unable to access them. Procuring a suitable COTS solution will offer both cost and efficiency savings, and catalogues such as G-Cloud can often help to narrow down the search, with offerings already approved for use within the Public Sector.

Secondly, the solution must offer the appropriate levels of information security and data protection. Again, government initiatives such as CESG’s Commercial Product Assurance (CPA) and Pan Government Accreditation (PGA) can aid the search for suitable solutions. These provide assurance that the solution has been independently certified by the UK National Technical Authority for Information Assurance, is fit for purpose, and is capable of protecting your organisation and the data you share from external threats. PGA in particular is offered to manage combined risks and provide end-to-end assurance when different Public Sector organisations work together to deliver shared services.

Finally, the solution must be simple to use. If the aim of multi-agency working is to improve efficiency, then the solution must not take more time to use than old ways of working. Moreover, a recent Freedom of Information (FOI) request to the ICO revealed that 93% of data breaches were caused by human error. Solutions have to make data protection accessible to all while also offering comprehensive protection and control to mitigate the risk of a data breach.

Ultimately, information security should not, and does not need to, hinder the delivery of effective and efficient multi-agency projects. In fact, by sharing data securely, public sector organisations can enhance their services to provide citizens with greater levels of information assurance and thus increased confidence in the services being delivered.