Overview
This project is part of the EPSRC Consortia for Explorative Research in Security (CeReS), which aims to develop robust, flexible and sustainable approaches to cyber security. One of the key issues in understanding the ethics and norms of cyber behaviour is its impact on individual privacy. In recent years, social psychologists have drawn a core distinction between personal identity (what makes us unique as individuals, compared with other individuals) and social identity (our sense of ourselves as members of a social group, and the meaning that group has for us). Identity is not fixed; it is the outcome of a dynamic process, and people can move from a personal to a social identity (and back again) depending on the context. Understanding the identity process is therefore key to assessing the impact that privacy and security policies have on people’s behaviours, and is essential for delivering systems that can express and analyse users’ privacy requirements and, at runtime, self-adapt and guide users as they move from context to context.
Research Questions
Broadly speaking, this project asks the following two questions and attempts to answer them from both a social psychology and a computing perspective:
- Can privacy be a distributed quality (across ‘the group’)? If so, under what conditions might this be the case?
- Can the group protect the privacy of the individual? If so, how does the group manage the privacy-related behaviour of its members?
The technical challenge is to investigate the privacy dynamics of individuals as they relate to their membership of social, professional or other groups, to develop computational (machine learning) techniques that support such dynamics, and to deliver privacy management capabilities interactively, autonomously and adaptively as individuals’ contexts change.
Expected Outcomes
This project aims to study privacy management by investigating how individuals learn and benefit from their membership of social or functional groups, and how such learning can be automated and incorporated into the mobile and ubiquitous technologies that increasingly pervade society. We will focus on the privacy concerns of individuals arising from their use of pervasive technologies, such as smartphones and cloud services, and we expect to produce the following outcomes:
- an empirical investigation of the privacy behaviour of, and within, groups, in contexts of both collaboration and conflict;
- a software engineering framework and infrastructure for the development of adaptive systems that are able to guide their users in their privacy management;
- machine learning techniques, algorithms and tools for the automated generation and adaptation of personalised, context-aware user privacy policies from sensed data (see the illustrative sketch below).
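To make the last outcome concrete, the following is a minimal, hypothetical sketch of how a context-aware sharing policy might be learned from sensed context data. The feature names, example data and the use of scikit-learn's DecisionTreeClassifier are assumptions made purely for illustration; they are not part of the project's actual design.

```python
# Hypothetical sketch: learning a simple context-aware sharing policy
# from labelled examples of sensed context. Feature names, example data
# and the choice of scikit-learn are illustrative assumptions only.
from sklearn.tree import DecisionTreeClassifier

# Sensed context encoded as [location, audience, time_of_day]
LOCATION = {"home": 0, "work": 1, "public": 2}
AUDIENCE = {"family": 0, "colleagues": 1, "strangers": 2}
TIME = {"day": 0, "evening": 1}

# Past decisions made by the user (or their group): 1 = share, 0 = withhold
observations = [
    (["home", "family", "evening"], 1),
    (["work", "colleagues", "day"], 1),
    (["public", "strangers", "day"], 0),
    (["public", "colleagues", "evening"], 0),
]

X = [[LOCATION[l], AUDIENCE[a], TIME[t]] for (l, a, t), _ in observations]
y = [label for _, label in observations]

# Fit an interpretable model that can later be inspected, re-trained
# or adjusted as the user's context or group norms change.
policy = DecisionTreeClassifier(max_depth=3).fit(X, y)

# Query the learned policy for a new, previously unseen context.
new_context = [[LOCATION["work"], AUDIENCE["strangers"], TIME["day"]]]
print("share" if policy.predict(new_context)[0] == 1 else "withhold")
```

A small, interpretable model of this kind is used here only to illustrate the idea of deriving and adapting a personal privacy policy from observed behaviour; the adaptive, group-aware techniques the project will actually develop are a matter for the research itself.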
The project is a collaboration with the Open University and the University of Exeter. The official website is available here.