Thoughts on the Trust, Risk, Information and the Law Conference (#TRILCon)

On 29 April, I attended the Trust, Risk, Information and the Law Conference (TRILCon) in Winchester, hosted by the University of Winchester's Centre for Information Rights. The conference was well organised, with about 60 attendees. The day was structured around plenaries and presentation sessions: the morning had the opening plenary and the first presentation session; the afternoon followed the same pattern with a plenary and a presentation session; and the day ended with a closing plenary. People live-tweeted from the event, and their tweets can be found under the hashtag #TRILcon on Twitter.

The opening plenary was given by Matthew Reed, Chief Executive of the Children's Society, on "The role of trust and information in assessing risk and protecting the vulnerable." He gave an insight into how important information, and the trust of children, are to the Society's work. These issues resonated throughout the day's presentations, as trust is at the heart of concerns about data and surveillance. He spoke at length about child poverty, which helped participants understand how large-scale data collection can build up a better picture of child well-being, which in turn can be analysed for trends and other issues.

Questions to consider

An interesting question to consider from this presentation was how to understand the child as both a data subject and a legal person. We need to consider a child as an individual, a legal person, for data or information purposes, yet still regard them as a child in other contexts. In the context of the Data Protection Act (DPA), the test for a subject access request from a child usually relies upon the age of 12, at which point a data controller needs to consider whether the child can understand their own interests regarding the request. Yet society sets different ages for other legal acts: the age of sexual consent is 16 and the voting age is 18. At the same time, a child is a data subject from birth, even though an adult with parental responsibility will have a large influence on the child's access to data and their existence as a digital individual. Therefore, a child in care has to rely on the organisation or the state to act as their parent for data protection purposes.

The opening plenary helped set the stage for the first set of presentations. The schedule can be found here.

The conference had a number of presentation strands with depth and variety. I attended my own panel: Surveillance, encryption, State secrets & fashion! The first paper was on Spain's transparency laws. It suggested that the political culture's view of transparency shaped the public's understanding of its success and possible constraints. The challenge was whether the public could look beyond the headlines, given that the Spanish government appeared to have greater influence over the Spanish media than the UK government has over the UK media.

My presentation was on Blinding the Leviathan: Encryption, Surveillance and the Digital State of Nature. I argued that surveillance is necessary to fulfil the sovereign's fundamental responsibility and contract with the citizen. The sovereign is created to deliver public safety, and because it has the right to determine peace and war within the state, it must have the means to ensure that it is not threatened, which includes surveillance of the public space. I then suggested that the digital state of nature (DSON), which is similar to the state of nature that Hobbes argued man escaped by creating a sovereign, presents a new challenge. The DSON blurs the clear lines between domestic and foreign, public and private, and friend and enemy. Therefore, the Leviathan's surveillance has to extend into these areas. Yet when individuals use encryption to thwart the state, it blinds the Leviathan and limits its ability to protect them. A blind Leviathan is still strong enough to deliver the benefits people want, their many and expanding rights, but unable to look into any areas that the individual, rather than the state, decides to shield. The result, though, will not be increased freedom and autonomy but the opposite, as the state comes to lack the means to deliver those many and expanding rights.

The next presentation, a collaboration between the University of Winchester and the London College of Fashion, was excellent. The multimedia presentation offered a fashion show to explore the ways in which wearable computing, such as Google Glass and other devices, is changing how we hide from surveillance and the ways in which it enhances surveillance. A number of interesting points and ideas were presented on the ways that data, trust, risk, and information could, and do, intersect with our most intimate experiences.

Questions to consider

What is the relationship between fashion, our identity, and surveillance? If we wear various personae to fit within different contexts, does ubiquitous surveillance, through our lifestyle devices, penetrate those personae to reveal us? Our concern with surveillance may result in an iterative relationship, where technology defeats technology, so that fashion to thwart surveillance is only available to a few in much the same way haute couture is only available to a few.

Afternoon Plenary: Statistics

The afternoon plenary looked at the use of statistics in law, in particular Bayes' Theorem and likelihood ratios, with a presentation by Professor Norman Fenton, "Improving probability and risk assessment in the law." As the presenter explained, the problem with using statistics is not just that the public has difficulty understanding the maths; statistical experts often present the theorems and the inferences incorrectly, which creates problems of its own. As many businesses, such as Amazon, use algorithms and Bayesian probability theory to profile customers based on their interactions and purchase trends, the session was useful. Though focused on the use of statistics in law, it showed a wider application to other fields, such as behavioural advertising and other predictive systems that rely upon big data.
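To make the point concrete, here is a small sketch (my own illustration, not an example from the talk) of how Bayes' Theorem separates the probability of the evidence from the probability of guilt, the confusion behind the well-known "prosecutor's fallacy". All the numbers are assumed purely for illustration.

```python
# Illustrative sketch: Bayes' Theorem in a forensic-match scenario.
# The numbers below are assumed for illustration only.

def posterior(prior, p_evidence_given_guilty, p_evidence_given_innocent):
    """P(guilty | evidence) via Bayes' Theorem."""
    numerator = p_evidence_given_guilty * prior
    denominator = numerator + p_evidence_given_innocent * (1 - prior)
    return numerator / denominator

# Suppose a forensic match occurs by chance in 1 in 10,000 innocent
# people, and the pool of possible suspects is 100,000 (prior = 1/100,000).
prior = 1 / 100_000
p_match_guilty = 1.0          # assume a guilty person always matches
p_match_innocent = 1 / 10_000

p = posterior(prior, p_match_guilty, p_match_innocent)
# The chance of a coincidental match (0.01%) is NOT the probability of
# innocence: the posterior probability of guilt here is only about 9%.
print(f"P(guilty | match) = {p:.3f}")
```

The fallacy lies in quoting the 1-in-10,000 match probability as if it were the probability of innocence; once the prior is taken into account, the evidence is far weaker than it sounds.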

The afternoon presentations

I attended the session on Data linking, statistical disclosure control, Facebook privacy policies and the right to be forgotten.

The Facebook session looked at the problem of privacy statements being limited by what the customer can understand. In a survey of 100 university students (undergraduate and graduate), only 4% had read the Facebook privacy agreement. As a result, it may be difficult to assess how well these statements capture consent that is fully informed, specific, and freely given. Another problem highlighted by the paper was that privacy statements are usually written in English and then translated into a host country's language. A poor translation compounds the difficulty of understanding what is being consented to. The user is then left vulnerable because they will not be aware of, or able to understand, the ways in which the privacy statement explains how their data is going to be used, stored, and potentially sold.

Questions to consider

An interesting question from this paper was the extent to which we take consent for granted in the digital domain. Even with well-designed privacy notices and opt-in or opt-out statements, how well do these capture consent, and could they really capture any future uses? The deeper problem, perhaps at a philosophical level, is how we demonstrate consent to other laws and to the government in general, when we have to make repeated and detailed consents whenever our data is used, yet our other behaviour, such as driving, does not attract the same requirements. We start to see a possible tension between the physical and digital domains.

The next paper, on big data and the right to be forgotten, offered an insight into whether we can be forgotten when large databases link data. Another problem was the tension between the digital person and the public person, in that a public act may be remembered or forgotten in a way that differs from the way a digital act is remembered or forgotten.

Questions to consider

In the digital age, who remembers determines whether something can be forgotten. The "official record" may be expunged, but the individual can now remember as well as the state can. Will the right to be forgotten extend to the private domain, where rival memories are created and maintained? If the concern about linking and big data relies upon data quality, can that quality be assured in the future? A further question is whether linked data can resist or overcome strategies to muddle the history or paint a counter-narrative. In that sense, the session on statistics will help us determine whether the history we are reading, through linked big data, is accurate.


The final plenary: [De]-anonymisation & technology panel

The final plenary brought together a number of speakers on this topic. Of particular interest was the presence of the ICO on the panel, as they had published the code of practice on anonymisation and pseudonymisation. They pointed out that they were the first regulator in the EU to publish such a standard. The panel discussed the problems associated with the process and with making sure such data could not be re-identified by future, as yet unidentified, methods.
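To illustrate the distinction the panel drew on, here is a minimal sketch (my own, not from the panel) of one common pseudonymisation technique: replacing a direct identifier with a keyed hash. The key name and identifier are assumptions for illustration; unlike plain hashing, the secret key prevents an attacker from re-identifying records simply by hashing a dictionary of known names.

```python
# Illustrative sketch of pseudonymisation by keyed hashing (HMAC).
# The key and the example identifiers are assumed for illustration.
import hashlib
import hmac

SECRET_KEY = b"example-key-store-separately-and-rotate"  # assumed key management

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable pseudonym."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

# The same input always maps to the same pseudonym, so records can still
# be linked for analysis without exposing the underlying identity.
assert pseudonymise("jane.doe@example.com") == pseudonymise("jane.doe@example.com")
assert pseudonymise("jane.doe@example.com") != pseudonymise("john.doe@example.com")
```

Note that this only removes the direct identifier: as the panel's discussion implied, rich quasi-identifiers left in the data set may still allow re-identification by methods not anticipated today.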

Questions to consider

Can the tension between useful, meaningful data and personally identifiable data be reconciled? The richer the personal data sets being used, the greater the potential to identify someone. Will the concern over data be mitigated by a natural law of data inertia or decay? Data quality cannot always be assured, so gaps and problems could render its use moot at worst or difficult at best. As the data decays or lacks robust quality, can we be certain that we have re-identified the correct person with any great confidence?

Final thoughts

The conference was a success, and I found the breadth of papers and presentations stimulating. In my session, I had a number of interesting and insightful questions, and all the papers sparked discussions and further ideas. The event was well managed and structured, and I would recommend that anyone involved with information governance attend future events. Having organised similar events myself, I appreciate the amount of work needed to host and run them. The Centre for Information Rights offered an excellent day with a lot of stimulating content and discussion, which is exactly what you want from a conference.



About lawrence serewicz

An American living and working in the UK trying to understand the American idea and explain it to others. The views in this blog are my own for better or worse.
