Archive for the 'Data Protection' Category

Court holds that failure to comply with data protection laws can be a material breach of contract

In a recent case, the Court of Session held that a company was in material breach of contract as a result of its failure to comply with data protection laws. The case also provides further guidance on when the courts will treat an aspirational pre-contractual sales statement as a misrepresentation.

The case involved a company called Soccer Savings (Scotland) Ltd (SSSL). In 2010, SSSL entered into a contract with the Scottish Building Society (SBS) to run an affinity savings scheme targeted at football fans. Basically, it allowed fans to open a savings account carrying their football club’s branding.

The scheme wasn’t very successful and SBS terminated the contract in June 2011. SSSL challenged the grounds of termination but accepted the termination as a repudiation of contract and sued for damages. The case came to proof before Lord Hodge.

The defence
When SBS terminated the agreement it relied on pre-contractual misrepresentation and material breach of contract. At proof before Lord Hodge, SBS departed from some of the allegations on record and restricted their defence to misrepresentation and three separate contractual breaches.

Misrepresentation
Lord Hodge found that statements of aspiration or optimism about what was achievable did not amount to an undertaking or warranty. SBS had the clear impression that the proposed venture was likely to succeed but:

It is clear that the venture failed very badly. But that does not make the statements of aspiration by the promoters of SSSL into misrepresentations of fact. Other things may have been said that strengthened [SBS’ Chief Executive] Mr Kay’s conviction that he had been given representations on which he had relied to recommend the deal to his board, but absent evidence of specific statements of fact, I am satisfied that the defence of misrepresentation fails.

So SBS were left with the three breaches of contract to justify their termination of the contract.

Breach of contract
The first breach relied upon was SSSL’s failure to obtain a signed written agreement with a football club by the stipulated contractual deadline of 1 July 2010, thus delaying promotion of the venture.

In an earlier decision, Lord Hodge had already held that this was a breach of contract, but he now held that, although it was a breach, it was not a material breach. It did not go “to the heart of the contract” and did not contribute to the eventual failure of the scheme. Accordingly, it could not be used to justify termination.

SBS also argued that SSSL had breached regulations 3 and 5 of the Consumer Protection from Unfair Trading Regulations 2008 by issuing letters on football club notepaper. Lord Hodge disagreed. The clubs had agreed to the issuing of the letters and had signed them. There was no breach.

Breach of data protection laws
And so to the final alleged breach – a failure to comply with data protection rules.

The data protection clause obliged SSSL to use reasonable endeavours to comply with the statutory rules and to take appropriate measures against unauthorised or unlawful processing of personal data.

SSSL had used the database of a related company (Soccer Savings Ltd, or SSL) to send out letters in its own name and in the name of two clubs to account holders in a similar scheme which another building society, the Dunfermline Building Society (DBS), already ran with SSL. The deal with SBS came about after the value of deposits under the SSL/DBS scheme fell significantly when DBS encountered difficulties and was put into special administration in 2009. DBS was subsequently taken over by the Nationwide Building Society (NBS).

Lord Hodge found that SSSL was a data controller under the Data Protection Act, but was not registered as a data controller with the Information Commissioner when it processed data. It had committed an offence. In addition it did not have the necessary consent from the account holders to use their data to promote the new scheme:

While a failure to register may not of itself have been a material breach of contract, I am satisfied that SSSL’s use of the data obtained by SSL under the soccer saver scheme was. SSL did not have the consent of the data subjects (i) to make their data available to the football clubs with which it contracted or (ii) to use their data to promote SBS. Yet SSL had contracted with the football clubs to give them access to the names and addresses of account holders. And SSSL’s directors procured SSL to use the data for the latter purpose. It used the football clubs’ unauthorised possession of the soccer saver data in an attempt to circumvent the restrictions on SSL’s activities in its contract with DBS.

What takes the breaches to the heart of the contract is that SSSL was offering SBS a business proposal, a major component of which involved achieving the transfer of account holders from DBS to SBS. SSSL proposed to use SSL’s data to market SBS’s products and to obtain the transfer of accounts from DBS by targeted marketing. That is what it sought to do in SSL’s letter to the Rangers account holders [one of the clubs involved]. But that provoked NBS correctly to assert both a breach of contract by SSL and also breach of the data protection legislation. NBS carried out the threat in its letter of 10 November 2010 and complained to the Data Commissioner.

I conclude that an important component of SSSL’s performance of its obligations under the contract involved it in the breach of the statutory data protection rules and that that illegality materially impaired that performance. That amounted to a material breach of contract.

The result was that SSSL had indeed been in material breach of contract and so SBS had been entitled to terminate the contract – even if, perhaps, their reasons for doing so were originally quite different.

Ownership of customers
More importantly, however, the case emphasises the importance of ensuring that ownership of customers under affinity arrangements is clearly defined, and the importance of thinking up front about the privacy consents that may be required from customers.

Had the original privacy notices issued to customers clearly stated that SSL and its related companies could use customer details for marketing purposes, then many of these issues could have been avoided. However, I suspect that the course of events that subsequently unfolded was not in anyone’s contemplation when the original deal was conceived.

Martin Sloan

With assistance from Douglas MacGregor, PSL in Brodies’ Dispute Resolution and Litigation department

European guidance on mobile apps and privacy

The Article 29 Working Party (the “A29WP”), a grouping of representatives from the various European data protection regulators, recently issued an opinion on apps on smart devices.

There are two constants with the A29WP’s opinions:

  • Firstly, although often presented as such, they are not an authoritative statement of the law. They simply set out the collective (sometimes aspirational) interpretation of the European Data Protection Directive.
  • Secondly, the opinions set out a far stricter interpretation of the Directive than that usually taken by the UK’s Information Commissioner’s Office (ICO). This reflects the fact that the ICO usually takes a more business-friendly, pragmatic approach to interpreting the law than some of its European counterparts.

That said, the latest opinion provides some useful guidance for app developers, and builds on previous guidance from California’s attorney general and the GSMA, which I summarised in this blog post last year.

The guidance also follows on from the so-called Cookie Law, which (contrary to popular opinion) also applies to mobile apps.

Why do mobile apps raise privacy concerns?
As I noted in that blogpost, there are a number of reasons for the current privacy deficiencies with mobile apps:

  • The market is immature, with many apps developed by individuals or small companies not familiar with privacy laws, but whose products have become hugely popular.
  • The distribution model is fragmented and apps frequently incorporate third party services (for example, mapping providers) into their functionality. SDKs and OS developer rules impose strict controls on developers, yet they don’t provide the necessary tools to ensure that developers adopt privacy by design.
  • The mobile app market has developed at the same time as a vast expansion in the data created by devices, such as geolocation data.
  • Many app developers are located outside the EU and are therefore unfamiliar with European privacy rules, despite the fact that they are selling their apps to users in the EU.

A29WP’s recommendations
The opinion sets out a number of requirements for app developers. These include:

  • App developers must understand their obligations as data controllers when they process data from and about users.
  • Freely given, specific and informed consent must be sought before an app is installed.
  • Granular consent must be obtained for each specific category of data that the app will access.
  • The user must be provided with well-defined details of the purposes for which data will be processed before the app is installed. General purposes such as “product innovation” or “market research” are, in the A29WP’s opinion, not sufficient.
  • The purposes for which data is processed must not be changed without obtaining new consent from the user.
  • Users must be provided with a readable, understandable and easily accessible privacy policy.
  • Users must be able to revoke their consent, uninstall the app and, where appropriate, have their data deleted.
  • Apps must incorporate data minimisation and privacy by design/default.

Part of the problem with these requirements is that some of them are impossible to achieve in practice as they are dependent upon the design of the app store and OS ecosystem. For example, the way in which most smart device operating systems install apps means that there is no opportunity in the app purchase system to notify users about data use and obtain consent. This could be set out in the app licence terms of use, but given the low profile given to such licence terms in the app store purchase process, this wouldn’t meet the A29WP’s own recommendations on obtaining consent.

This is presumably why the opinion also sets out a number of requirements on app stores and OS and device manufacturers, even though there appears to be little basis in law for such requirements (neither party is a data controller in relation to data primarily processed by the app/the app developer).

These requirements, for example, oblige app stores to check that app developers have incorporated appropriate consent mechanisms, and obligations on OS manufacturers to build additional controls into their OS APIs to facilitate consent to access data on the device.

The practical approach
In my view, given these technical limitations, it is more pragmatic to recommend that app developers design apps so that the privacy policy is displayed, and consent obtained, when the app is first opened, and that no data is captured until this takes place. This way, app developers can be sure that they do not inadvertently collect data without consent.
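As a rough sketch of that approach (the storage interface and function names below are hypothetical, not any platform’s actual API), the idea is to gate every data-collecting code path behind a consent flag that can only be set after the privacy policy has been shown on first launch:

```typescript
// Hypothetical key-value store; a real app would use the platform's
// preference storage (AsyncStorage, SharedPreferences, NSUserDefaults...).
interface KeyValueStore {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
}

const CONSENT_KEY = "privacy_consent_v1";

// Stub standing in for a real consent screen (hypothetical).
async function showPrivacyPolicyAndAskForConsent(): Promise<boolean> {
  // In a real app: display the privacy policy with accept/decline buttons.
  return true;
}

// Stub standing in for analytics/location start-up (hypothetical).
function startDataCollection(): void {}

async function onAppLaunch(store: KeyValueStore): Promise<void> {
  let consent = await store.get(CONSENT_KEY);
  if (consent === null) {
    // First launch: no data may be captured until the user has seen
    // the privacy policy and made a choice.
    const granted = await showPrivacyPolicyAndAskForConsent();
    consent = granted ? "granted" : "refused";
    await store.set(CONSENT_KEY, consent);
  }
  if (consent === "granted") {
    startDataCollection(); // only ever reached with recorded consent
  }
}
```

The design point is simply that consent is checked before, not after, any tracking code is initialised.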

The opinion also skims over one of the other big issues with mobile apps – the use of third party services. In many cases, I suspect that app developers simply aren’t aware of which party is responsible for data protection compliance. Where third party services are utilised (for example, mapping or geolocation), there will often be multiple data controllers. However, the app developer is the party that controls the primary interface with those third parties and therefore needs to flag the terms on which such third parties will use the data collected.

Given the opacity of the policies provided by many third party service providers (and the lack of clear guidance from regulators when the revised cookie law came into force), working this out is often difficult.

You can read the A29WP’s opinion in full by following this link (PDF). If you are an app developer and would like to discuss how your app collects data, and what you can do to ensure that it complies with EU data protection law, please get in touch.

Martin Sloan

Kitchen design company fined £90,000 for unsolicited marketing calls

As someone who received a number of cold calls from fitted kitchen company DM Design, I’m pleased to see that the Information Commissioner’s Office (ICO) has taken action against the company for a breach of the Privacy and Electronic Communications Regulations (PECR).

The fine is the first monetary penalty to be issued by the ICO in relation to live marketing calls. The ICO’s power to issue monetary penalties under the PECR came into effect in 2012, but to date the power has been little used. The first fine to be issued under the PECR, in November last year, was however fairly high – a £440,000 fine issued to Tetrus Communications after it sent millions of spam text messages to promote compensation claims for personal injury and payment protection misselling.

Both fines serve as a timely reminder to organisations involved in telemarketing – whether by telephone, email, or SMS – to ensure that their processes comply with the law.

The law on unsolicited telemarketing
Under the PECR, organisations must not make unsolicited calls for direct marketing purposes where:

  • the subscriber (recipient of the call) has previously notified the caller that it does not wish to receive such calls; or
  • the telephone number in question is registered with the Telephone Preference Service (TPS).

To check whether a number is registered with the TPS, organisations can pay a fee to the TPS to receive a regular report of numbers that have opted out of receiving direct marketing. In practice, this means that any organisation wishing to make unsolicited marketing calls must subscribe to the TPS’s service and regularly screen its calling lists against the list of numbers registered with the TPS.
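In software terms, that screening step is simple set membership. A minimal sketch follows – the report format (one number per line) and the function name are my assumptions, not the TPS’s actual specification:

```typescript
// Screen a calling list against a TPS opt-out report before any calls are made.
function screenCallingList(callingList: string[], tpsReport: string): string[] {
  const optedOut = new Set(
    tpsReport
      .split("\n")
      .map((line) => line.trim())
      .filter((line) => line.length > 0)
  );
  // Keep only numbers that are NOT registered with the TPS. Numbers whose
  // subscribers have objected directly must also be suppressed upstream.
  return callingList.filter((number) => !optedOut.has(number));
}
```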

The PECR also sets out rules applying to marketing by text (SMS) and email. In summary, an organisation cannot send unsolicited direct marketing emails or text messages (or faxes) to consumers unless (a sketch of this test follows below):

  • the individual provided their details to the organisation as part of a previous transaction (and the marketing is for similar products and services from that organisation); and
  • the individual was given the opportunity to opt out of receiving marketing when the information was collected, and any permitted marketing gives the individual an easy way to opt out of future marketing.
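This “soft opt-in” is a conjunction of conditions, which makes it easy to express as a simple check. The sketch below uses field names of my own invention and simplifies the legal test:

```typescript
interface MarketingRecord {
  detailsFromPreviousTransaction: boolean; // details provided during a previous transaction
  marketingSimilarProducts: boolean;       // this campaign is for similar products/services
  optOutOfferedAtCollection: boolean;      // opt-out was offered when details were collected
  hasOptedOut: boolean;                    // the individual has since opted out
}

// Every limb must be satisfied before an unsolicited marketing email or
// SMS may be sent; each message must also carry its own easy opt-out.
function canSendUnsolicitedMarketing(record: MarketingRecord): boolean {
  return (
    record.detailsFromPreviousTransaction &&
    record.marketingSimilarProducts &&
    record.optOutOfferedAtCollection &&
    !record.hasOptedOut
  );
}
```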

So-called “silent calls” (where an automated system dials numbers but when the recipient answers there is no one on the other end) are dealt with by the telecoms regulator, Ofcom. Ofcom now has powers to fine organisations making silent calls up to £2m.

What did DM Design do wrong?
In this case, it appears that DM Design consistently failed to check whether the people it was phoning had opted out of receiving marketing calls, and (in at least one case) refused to remove the individual’s details from their system when asked to do so.

Over an 18-month period, the TPS received nearly 2,000 complaints in relation to unsolicited marketing calls from DM Design. According to the TPS’s records, 12 months into the complaint period, DM Design did pay for one month’s subscription to the TPS mailing list and downloaded it once, but did not download the list at any other time during the period of the complaints.

Reporting silent calls and spam telemarketing
If you receive silent calls or unwanted telemarketing, and are registered with the TPS, then you should report the call, email or SMS to the ICO or Ofcom (see links below). Having done this with unsolicited communications from a number of organisations (including DM Design), I'm pleased to see that the ICO is finally taking enforcement action.

Of course, in order for the ICO to investigate, it will need details about the party that sent the message. I find that if you connect through to the call centre, the operative will usually be more than happy to tell you who they work for and where they are calling from – before realising why you are asking!

You can access the ICO's unwanted text and calls reporting tool by following this link.

You can report silent calls to Ofcom.

Martin Sloan

Niall Mclean blogs on Brodies PublicLawBlog about a recent ICO monetary penalty notice issued following the loss of sensitive personal data by the Nursing and Midwifery Council.

Brodies PublicLawBlog

Last month, the Information Commissioner’s Office (ICO) fined the Nursing and Midwifery Council (NMC) £150,000 after the loss of three unencrypted DVDs which contained sensitive personal data. The DVDs related to a nurse’s misconduct hearing and contained evidence from two vulnerable children. You can read the ICO’s Monetary Penalty Notice here. A recent post on our TechBlog discussed the methodology the ICO uses for calculating penalties, and the NMC’s breach falls into the “very serious” category.

The NMC has expressed disappointment at the decision, which it says was down to “an isolated human error”. The fine is a pointed reminder to regulatory bodies of the importance of keeping hearing information confidential and secure – particularly where it is held electronically, in which case it should be encrypted.

Niall Mclean


Information Commissioner publishes guidance on Bring Your Own Device

The UK’s Information Commissioner’s Office (ICO) has today published new guidance for employers on the use of personal (employee-owned) devices for work purposes.

Bring Your Own Device (or BYOD) is a hot topic for many organisations. Many employees are seeking to use their own smartphone or tablet for work purposes. If properly implemented, a BYOD scheme can actually reduce the information security risks by making it easier for employees to access corporate data on their own device, thereby discouraging them from trying to find workarounds (such as emailing confidential information to a personal email address, or using a personal email address to carry out work business).

However, there are risks.

In November, Computer Weekly reported that the number of BYOD devices in use was set to double by 2014. However, Gartner predicts that through 2014 employee owned devices will be compromised by malware at more than double the rate of corporate owned devices.

A survey by the ICO, published alongside the new guidance, reveals that some 47% of those polled have used a personal device (whether a smartphone, tablet or laptop) for work purposes. However, only 27% of respondents said that their organisation had provided guidance on the use of personal devices for work purposes.

BYOD policy
This is worrying, as it opens up the employer and employee to a number of risks.

For example, if the employer turns a blind eye to BYOD (which would otherwise breach its information security policy), it will find itself in a very difficult position in the event of a data loss incident. Not just with the ICO and any potential fine for a breach of the Data Protection Act, but also in terms of the ability of the employer to take disciplinary action against the employee.

A lack of a BYOD policy means that the employer has no cogent BYOD strategy setting out what is and isn’t acceptable: for example, the sorts of devices that are considered to have appropriate levels of security, password security, the employee’s responsibilities, and what happens if the device is lost or stolen.

The policy should also cover other issues such as who is responsible for voice and data costs, insurance, and what happens if the employee is unable to carry out his duties because the device has been lost or stolen.

The ICO’s guidance
The ICO’s guidance emphasises the importance of developing a BYOD policy and contains the following key recommendations (sketched as a sample device policy after the list):

  • Be clear with staff about which types of personal data may be processed on personal devices and which may not.
  • Use a strong password to secure your devices.
  • Enable encryption to store data on the device securely.
  • Ensure that access to the device is locked or data automatically deleted if an incorrect password is input too many times.
  • Use public cloud-based sharing and public backup services, which you have not fully assessed, with extreme caution, if at all.
  • Register devices with a remote locate and wipe facility (mobile device management) to maintain confidentiality of the data in the event of a loss or theft.

The guidance also reminds organisations in the public sector that information held by employees on a personal device may be subject to disclosure under freedom of information legislation.

More information
To read our top tips for BYOD, follow this link.

To read the ICO’s new guidance, follow this link.

Brodies can help you develop a BYOD policy which suits your organisation. To discuss how we can assist please contact me or your usual Brodies contact.

Martin Sloan

Information Commissioner reveals methodology for calculating monetary penalty notices

Last month, the Information Commissioner’s Office (ICO) successfully defended the first appeal against a monetary penalty notice issued by the ICO for a breach of the Data Protection Act.

The appeal was brought by Central London Community Healthcare NHS Trust, which had been fined £90,000 for repeatedly faxing a list of palliative care in-patients to the wrong fax number.

The most interesting aspect of the appeal is that as part of the ICO’s defence of its decision, the Tribunal was presented with information on the ICO’s internal methodology for calculating monetary penalties.

The ICO’s methodology
The process comprises three stages.

Firstly, a decision is made as to whether or not to issue a monetary penalty.

Secondly, the case is placed in one of three bands, depending upon the seriousness of the contravention:

  • Serious – in which case the fine will be between £40,000 and £100,000
  • Very serious – in which case the fine will be between £100,000 and £250,000
  • Most serious – in which case the fine will be between £250,000 and £500,000

Finally, the ICO selects the midpoint of the applicable band (so, for a “very serious” breach, £175,000) and then assesses the aggravating factors to see if the fine should be higher and the mitigating factors to see if it should be lower. The aggravating and mitigating factors create an overall weighting, which is then applied to the fine.
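In arithmetical terms the method is easy to sketch. The code below is a toy model only: the bands come from the Tribunal papers, but expressing the net weighting as a fraction of the band’s half-width is my own assumption, as the ICO has not published an exact formula:

```typescript
type Band = "serious" | "very serious" | "most serious";

const BANDS: Record<Band, { min: number; max: number }> = {
  "serious": { min: 40_000, max: 100_000 },
  "very serious": { min: 100_000, max: 250_000 },
  "most serious": { min: 250_000, max: 500_000 },
};

// Start from the band midpoint, then move up for aggravating factors
// or down for mitigating ones. netWeighting is assumed to lie in [-1, 1].
function indicativePenalty(band: Band, netWeighting: number): number {
  const { min, max } = BANDS[band];
  const midpoint = (min + max) / 2;
  const fine = midpoint + netWeighting * ((max - min) / 2);
  return Math.min(max, Math.max(min, fine)); // stay within the band
}

// e.g. indicativePenalty("very serious", 0) === 175_000 (the midpoint)
```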

Applying this methodology to the Central London Community Healthcare NHS Trust decision, we can see that the ICO viewed this breach as being a “serious” breach with a number of aggravating circumstances (it was towards the top of the £40,000 to £100,000 banding for a “serious” breach).

Interestingly, in its decision the Tribunal queried whether the breach in this case should actually have been classified as a “very serious” breach, given the nature of the breach, the information involved and the fact that the Trust was also in breach of the well established Caldicott Principles.

Early payment discount
In its decision, the Tribunal also upheld the ICO’s decision to permit an early payment discount only if the organisation does not appeal.

Whilst the Tribunal’s decision is not binding on subsequent tribunal hearings, the guidance does provide organisations faced with a notice of intention to impose a monetary penalty with more information on how the ICO has calculated the proposed fine. This should in turn help organisations to ensure that any challenge to the size of a monetary penalty is made by reference to the ICO’s own methodology.

Martin Sloan

ICO revisits approach to cookie law consent – what does this mean for other organisations?

Last month, the Information Commissioner’s Office (ICO) announced that it was going to change the way that it sought to obtain consent from users to the use of cookies on its website, as required under laws that came into force in May 2011 (known as the cookies law). Those changes were implemented on Friday.

What’s changed?
Firstly, the ICO’s website now sets certain non-essential cookies automatically upon arrival. This is a big change from the old approach and marks a shift from prior, explicit, consent to implied consent.

After moving to prior, explicit, consent, recorded traffic to the ICO’s website dropped by 90% as a consequence of users failing to accept cookies (including a Google Analytics cookie used to analyse traffic). Reinstating implied consent means that those figures will go shooting back up, giving the ICO a much better idea of how people use its website. According to the ICO’s news release, this was one of the main drivers behind the change to its cookie consent policy.

Secondly, the ICO has updated its banner notification. The old one looked like this:
Screenshot of ICO website in 2012

The new one looks like this. The banner has now moved to the bottom of the screen (but not the bottom of the page) and is a bit more subtle (no contrasting text colour or box shading to make it stand out):
Screenshot of the ICO website. 4 February 2013

The banner message has been amended to make it clear that the website has “placed” cookies (as opposed to “will place”), and provides a pointer to allow users to change settings. Notably, the banner will remain until the user clicks “don’t show this message again” or moves to another page.

Surprisingly, the banner message still says that cookies are used to “make this website better”. Given the ICO’s otherwise very strict adherence to the cookie law rules, I’ve always thought that this was a very ambiguous basis upon which to obtain user consent – better for whom? The user? The ICO?

Thirdly, the ICO has shifted information on the use of cookies to a new standalone cookies page.

Finally, on that page (but not on the banner itself) is an option for users to delete non-essential cookies and not set them again:
Screenshot of cookies opt out button on ICO website
This allows users who do not want cookies to reject them, notwithstanding that they were automatically placed on arrival at the website. Unsurprisingly, this cookie control tool relies upon a cookie to remember the user’s setting.
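The mechanics are worth spelling out: because HTTP is stateless, the only way to remember that a user refused cookies is to set one more cookie recording that refusal. A browser-side sketch (the cookie names and the list of analytics cookies are illustrative, not the ICO’s actual implementation):

```typescript
const OPT_OUT_COOKIE = "cookies_declined"; // hypothetical name

function hasOptedOut(): boolean {
  return document.cookie
    .split("; ")
    .some((entry) => entry === `${OPT_OUT_COOKIE}=1`);
}

function optOutOfCookies(): void {
  // Expire the non-essential cookies already set...
  for (const name of ["_ga", "_gid"]) { // illustrative analytics cookies
    document.cookie = `${name}=; expires=Thu, 01 Jan 1970 00:00:00 GMT; path=/`;
  }
  // ...then set the single cookie needed to remember the refusal.
  document.cookie = `${OPT_OUT_COOKIE}=1; max-age=31536000; path=/`;
}

function maybeStartAnalytics(): void {
  if (hasOptedOut()) return; // respect the stored preference
  // otherwise load the analytics script and let it set its cookies
}
```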

What does this mean for other organisations?
Whilst the ICO argues that its revised approach is consistent with its own guidance, other organisations will take some comfort from the ICO’s new approach to cookie consent:

  • The ICO is of the view that knowledge about cookies amongst internet users is much greater than it was 8 months ago.
  • Explicit consent is therefore no longer considered necessary by the ICO for low risk, but non-essential, cookies.
  • Setting cookies on arrival, based upon implied consent, can be appropriate depending on the potential intrusiveness of the cookie. Pre-setting an analytics cookie is one thing; doing the same with a behavioural advertising cookie is quite another.
  • Banners or other methods used to notify users about the use of cookies may not need to be as prominent (or design-intrusive) as perhaps previously thought.
  • Using a cookie to identify a user that has opted out of other cookies is considered by the ICO to be an appropriate approach, provided users are notified about this.
  • Pointing users to third party websites for further information on third party cookies (such as those used for embedded YouTube clips on the ICO’s website) remains the ICO’s method of dealing with third party cookies.

If you would like to discuss how your website or mobile app deals with cookie law, or would like to understand the implications of the ICO’s revised approach for how you currently handle cookies, please visit our cookies page or get in touch.

Martin Sloan

Christine O’Neill blogs on our PublicLaw blog about the Information Tribunal’s first hearing in Scotland, following Scottish Borders Council’s appeal against its monetary penalty under the Data Protection Act issued as a result of a data breach by a contractor working on behalf of the Council.

Brodies PublicLawBlog

Interesting story carried by the BBC today suggests that (I think for the first time) the Information Tribunal (more properly the First-tier Tribunal – Information Rights) is going to sit in Scotland to hear an appeal from a decision of the UK Information Commissioner. As has been widely reported, Scottish Borders Council was fined £250,000 by the ICO in relation to the discovery of pensions records in a supermarket car park.

SBC is appealing against the level of the fine and, it appears, the Tribunal has determined that it should hold an oral hearing in March in Edinburgh or in the Borders. A rare chance to see the Tribunal at work north of the border.

Christine O'Neill


Kim Dotcom and Mega: Legal FAQs

You’re probably familiar with Kim Dotcom, the German-Finnish internet entrepreneur who currently resides in New Zealand, and is being pursued by the US Department of Justice regarding accusations of a “Megaupload” business empire built on rampant infringement of US copyright laws and the Digital Millennium Copyright Act. 

Much of what is currently being written about Mr Dotcom simply churns trite facts without actually offering much in the way of explanation.  I thought a blog which answered some of the main questions would be helpful.

How does the US have jurisdiction over Megaupload?
Why would Megaupload Limited, with its registered office in Hong Kong, be subject to US copyright laws and to the Digital Millennium Copyright Act?  The answer is that Megaupload deliberately carried out business in the US and with US residents.  The site leased more than 1,000 servers in North America (525 were at Carpathia Hosting, which received $13 million from Megaupload).   

Wired provides great analysis here, but the general principle is that individuals and companies can’t gain the benefits of doing business in a jurisdiction without complying with its laws and being subject to its enforcement efforts – assuming that the jurisdiction can get its hands on you in “terrifying real life”. Which brings us to extradition!

Will Dotcom be extradited?
Under New Zealand’s Extradition Act, any request for extradition from New Zealand must relate to an “extraditable offence” which is defined as an offence that:

  • Carries a maximum penalty of not less than one year’s imprisonment in the requesting country; and
  • Involves conduct that would be regarded as criminal had it occurred in New Zealand, and would have carried a similar penalty

Unfortunately for Kim Dotcom, breach of copyright is just as illegal in New Zealand as it is in the US. 

Part 3 of the Extradition Act also provides a mechanism by which the requirements to provide evidence establishing a prima facie case in support of the extradition request can be replaced by the simpler “record of the case” procedure. This mechanism is available to select countries, including the US.  (A guide to New Zealand extradition prepared by the New Zealand Ministry of Foreign Affairs and Trade can be read here.)

Nevertheless the US is struggling to extradite Dotcom and is also struggling to make its case against Megaupload and the “conspirators” (Dotcom and various associates).  Dotcom actually received an apology from the Prime Minister of New Zealand for illegal surveillance.  A helpful timeline of the various legal twists and turns can be read here.

What’s the new service that he’s offering?
Kim Dotcom has launched a new service, Mega, which he says is distinct from Megaupload, and which he also insists is legal.

Mega is offering all users 50GB of free cloud storage, making it a potentially compelling competitor to the likes of Dropbox (2GB free) and SkyDrive (7GB free) — if you’re not worried about the service getting shut down like its predecessor.

Mega offers client-side encryption, meaning that (arguably) even Mega doesn’t know what is in the files that clients upload. The only way a client file can be decrypted is if the client makes both the encrypted file and the private encryption key publicly available. This would presumably breach Mega’s acceptable use terms, and Mega also has in place a take down process similar to those offered by other content sharing websites (such as YouTube), which is required under US law in order for the website operator to qualify for “safe harbor” protection from copyright infringement claims.
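To illustrate what client-side encryption means in practice, here is a minimal Node.js sketch using AES-256-GCM. The key is generated and used entirely on the client, so a storage provider holding only the ciphertext cannot read the file. This is a generic illustration, not Mega’s actual scheme, which has its own key-derivation and sharing design:

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

// Encrypt a file locally; only `encrypted` (plus iv and tag) is uploaded.
// Whoever holds `key` can decrypt - the storage provider cannot.
function encryptLocally(plain: Buffer) {
  const key = randomBytes(32); // never leaves the client
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const encrypted = Buffer.concat([cipher.update(plain), cipher.final()]);
  return { encrypted, key, iv, tag: cipher.getAuthTag() };
}

function decryptLocally(encrypted: Buffer, key: Buffer, iv: Buffer, tag: Buffer): Buffer {
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag);
  return Buffer.concat([decipher.update(encrypted), decipher.final()]);
}
```

Publishing the key alongside the file would defeat the scheme entirely, which is why the take down process still matters: once decryptable, shared content behaves like any other upload.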

Of course, the predecessor site Megaupload had a take down process as well, so this leads us to the next obvious question.

Is Mega legal?
Dotcom still insists that Megaupload was legal, despite the US Department of Justice’s claims that Megaupload’s overall operating model was geared towards criminal intent, because:

  • the vast majority of users did not have any significant long term private storage capability;
  • continued storage was dependent upon regular downloads of files occurring;
  • files that were infrequently accessed were usually rapidly removed, whereas popular downloaded files were retained;
  • only a small portion of users paid for storage subscriptions, meaning that the business was dependent on advertising revenue, and displaying adverts to downloaders;
  • an incentive programme was adopted encouraging the upload of “popular” files in return for payments to successful uploaders; and
  • (potentially most damning of all) there was a comprehensive take down process in use for child pornography and terrorist propaganda, but this same take down process was not deployed to remove infringing content.

Initial impressions would suggest that Mega does not share these strategies. Certainly Dotcom would have to be incredibly foolish not to apply the take down process this time around. In fact, it’s perhaps a credit to Dotcom’s slick advertising/media persona, and Mega’s attractive user interface, that initial bloggers thought Mega would “dismantle copyright forever”.

As Jonathan Bailey succinctly puts it (in by far the best analysis of Mega which I have read):

where Megaupload provided incentives and tools that encouraged users to upload (often illegal) files for mass download, Mega does not and in fact has a structure and service that puts barriers up against mass downloading of files, legal or otherwise.

What is certain is that we can expect plenty of fun and games over the next few months. 

When Mega launched this week as “The Privacy Company”, its claims of super-security were bound to come under the highest levels of scrutiny (some cloud providers definitely perform better than others in the security stakes – see my colleague Leigh’s analysis). Yesterday the story was that Mega’s encryption was substandard; today the story (which is emerging as I write) appears to be some form of encryption prize – Kim Dotcom himself has just tweeted:

We welcome the ongoing #Mega security debate & will offer a cash prize encryption challenge soon. Let’s see what you got ;-)

Who knows what tomorrow will bring?

John McGonagle

Santa’s “Naughty List” and data protection compliance

Back in December 2010 Martin offered some wonderful advice to Santa Claus regarding his data processing obligations, and provided some further thoughts yesterday on how the proposed draft data protection regulation might affect Santa’s data processing activities.

With under a week to go until Christmas Day I thought it would be good to offer Santa some further advice about that Naughty List that his mince spies have spent all year compiling.

Complying with the First Data Protection Principle
Santa’s Naughty List contains lots of personal data about misbehaving children, and the First Data Protection Principle of the Data Protection Act 1998 (the “DPA”) provides that personal data shall be processed “fairly and lawfully”. In particular, personal data should not be processed unless at least one of the conditions in Schedule 2 is met. (And further, if the personal data involved is “sensitive” – for example, concerning the “commission or alleged commission of any offence” (!) – then at least one of the conditions in Schedule 3 also has to be met.)

Santa has to tread carefully here (not easy after gorging on so much sherry and mince pies!) because the Information Commissioner has provided clear guidance that enticing children to divulge personal data with the prospect of a prize (or similar inducement) is likely to breach the requirements of the Data Protection Act.

For children of a certain age (11 or under), Santa should ensure that parental/guardian consent for any disclosure of personal data has been obtained.  (This is potentially a good result for Santa, as a naughty child would have been unlikely to consent to Santa processing his/her data and therefore limiting his/her prospects of presents.)

But before Santa heads down the chimney, he also has to comply with Paragraph 2 of Part II of Schedule 1 to the DPA, which provides that for the purposes of the First Data Protection Principle, personal data isn’t processed fairly unless the data subject is provided with:

  • the identity of the data controller;
  • if he has nominated a representative for the purposes of the DPA, the identity of that representative;
  • the purpose or purposes for which data are intended to be processed; and
  • any further information which is necessary, having regard to the specific circumstances in which the data are or are to be processed.

Complying with the Fourth and Fifth Data Protection Principles

Having dealt with the First Data Protection Principle, we then arrive at a pair of subordinate clauses.

The Fifth Data Protection Principle requires that data is not kept for longer than is necessary.  It’s virtually impossible to provide an easy answer as to how long is truly “necessary”, but Santa should consider:

  • the current and future value of the information;
  • the costs, risks and liabilities associated with retaining the information; and
  • the ease or difficulty of making sure it remains accurate and up to date.

Ensuring the data is accurate and up to date is actually the Fourth Data Protection Principle. In order to comply with this principle, Santa should:

  • take reasonable steps to ensure the accuracy of any personal data he obtains;
  • ensure that the source of any personal data is clear;
  • carefully consider any challenges to the accuracy of information; and
  • consider whether it is necessary to update the information.

Keeping the Naughty List up to date must be a huge undertaking, especially when candidates even appear from heavenly sources.

Blacklists
Of course, Santa wouldn’t be the first individual to have compiled a blacklist that potentially breaches the DPA. 

You may remember that in 2009, a secret blacklist of construction industry workers made the headlines.  That blacklist was found by the ICO to have been established and maintained in contravention of a number of the Data Protection Principles described above.

The exact nature of the information held is still coming to light, and the ICO is still trying to deal with the fallout.  The private investigator who compiled the blacklist was fined £5,000 – the maximum fine available at that time for persistent breaches of the DPA. 

It’s likely that if such a blacklist was discovered today it would be deemed to be a deliberate breach of the DPA (or at best risking a breach likely to cause substantial damage or distress), with the result that whoever compiled it could face a monetary penalty of up to £500,000.

This isn’t to say that blacklists are impossible to maintain.  For example, Stockholm football club Djurgården has a “hooligan register” (though everybody on it has to be informed, and their details immediately deleted if they successfully contest their inclusion). 

So, if Santa follows our guidance above then he might keep the Naughty List on the right side of the law.  Not that he’s probably too bothered – if you spend all night out sleighing, then you’re probably more worried about the police than the ICO!
 
Merry Christmas!

John McGonagle

