Personalized support and services, based on data analytics, have been on the rise over the last decade. The scale and dimensions of the data gathered and shared in smart information environments (SIE) are sometimes hard for users (data subjects) to fathom, leaving them wondering when, why and how data was collected, or information generated. Much of this is due to the combined power of an abundance of data, data analytics methods and machine learning. Machine learning may, for example, be used to support (or automate) repetitive work, warn of potential errors, and detect non-compliant behavior.
Despite the many advantages that such smart information environments offer, there are concerns about the responsible use of the data collected. Specifically, the new European regulation on data protection and privacy, the GDPR, has raised awareness of privacy issues and caused concern for designers and developers of smart information systems. In the age of information and digital technology, the focus of privacy has been on the protection of data directly or indirectly pertaining to a person, i.e. protection of personal information and reduction of risks for data subjects. Emphasis has been on data security, and several methods, frameworks and techniques have been developed for ensuring appropriate data security. However, in the age of big data, machine learning, ubiquitous computing and social networks, such a data-centric view is inadequate, and a more user-centric view of privacy and the protection of user data is required. In fact, with the increasing availability of data, the technology to aggregate it, and the possibility to conduct sophisticated analyses, the need to protect data and informational privacy is more important than ever before. This is also critical to build smart information environments that users can trust.
For a user-centric view, research has shown that there is a mismatch between legitimate concerns about privacy and actual behavior when it comes to sharing personal information (the "Privacy Paradox"). Those building systems must navigate users' needs for personalization against their wish to remain anonymous (the "Personalization Paradox"). It is indeed this paradox that SIE designers and organizations face when designing services to support people in the various arenas of their personal or work lives and to enhance and foster knowledge sharing among people. Many appreciate the personalized recommendations on websites or the personalized messages and notifications received through social media and other online services, but oppose the invasion of their privacy. This in turn requires IT designers and developers of SIE to practice privacy-by-design or privacy-by-architecture within the design of SIE, and calls for anonymization and cryptographic data protection techniques for log files.
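As a minimal sketch of one such data protection technique for log files, the snippet below pseudonymizes user identifiers with a keyed hash (HMAC) before they are written to a log. The key, log format, and function names are illustrative assumptions, not part of the workshop material; a keyed hash (rather than a plain hash) is used so pseudonyms cannot be reversed by brute-forcing common identifiers, as long as the key stays secret.

```python
import hashlib
import hmac

# Illustrative secret key; in practice this would be stored securely
# (e.g. in a key management service), never in source code.
SECRET_KEY = b"replace-with-a-securely-stored-key"

def pseudonymize(user_id: str) -> str:
    """Return a stable pseudonym for a user identifier.

    The same identifier always maps to the same pseudonym, so log
    entries stay linkable for analytics without exposing the identity.
    """
    digest = hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

def log_event(user_id: str, action: str) -> str:
    """Build a log line containing only the pseudonym, never the raw ID."""
    return f"user={pseudonymize(user_id)} action={action}"

# Example: two events by the same user share a pseudonym but reveal no raw ID.
line1 = log_event("alice@example.com", "login")
line2 = log_event("alice@example.com", "download")
```

Note that pseudonymized data still counts as personal data under the GDPR; this reduces exposure in stored logs but does not by itself achieve anonymization.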
The need to discuss issues related to privacy and trust in smart information environments is an important and highly relevant topic. This workshop's main objective is to start a dialogue and bring together a multi-disciplinary group of researchers, industry representatives and practitioners to share their research, ideas, experiences and concerns in the area of organizational and technological privacy and trust in smart information environments. Topics of interest for this workshop include, but are not limited to:
- Privacy and trust by design in SIE
- Privacy and trust in SIE
- System design for privacy awareness
- Privacy and trust in (big) data analytics
- Privacy-preserving data / process mining
- Privacy engineering for (event) logs
- Privacy and trust in machine learning
- Privacy and trust in data aggregation
- Privacy and trust in personalized services
- Privacy and trust at the workplace
- Privacy, trust, and human factors
- Privacy and trust in organizational data collection
- Empirical analysis of GDPR compliance in service repositories
- End-user privacy and trust control/management in SIE
- Techniques for GDPR compliant modeling
- Methods and techniques for privacy and trust management in SIE
Paper submission deadline: 22 April 2019 (extended)
Notification of acceptance: 15 June 2019 (extended)
Early Bird Registration: 20 June 2019
Workshop day: 18 September 2019
All deadlines are Anywhere on Earth ("AoE", UTC-12).
Authors should submit original, unpublished research papers. Papers must not be simultaneously submitted to another journal or conference. There are two types of submissions:
- Full Paper submissions have a maximum of 12 pages.
- Short Paper submissions have a maximum of 6 pages.
Authors should follow the Springer formatting for Lecture Notes in Computer Science as indicated here:
Papers and abstracts should be submitted through the EasyChair website in PDF format:
By submitting a paper, authors implicitly agree that at least one of them will register for the conference and present the paper. Only papers that have been presented by their authors during the conference will be published in the conference proceedings.
The proceedings of the I3E 2019 workshops are planned to be published as a book with the main conference proceedings by Springer-Verlag in its Lecture Notes in Computer Science (LNCS) series.
Journal Special Issue
The best papers were invited to a fast-track of the "Towards Privacy Preservation and Data Protection in Information System Design" special issue of the EMISA Journal (https://www.emisa-journal.org/emisa).
Wednesday, September 18, 2019 - Radisson Blu Royal Garden Hotel (Room D - Austråt)
10:30 - 11:15 Invited talk / Keynote
- Eirik Gulbrandsen (Datatilsynet)
Eirik Gulbrandsen is a senior engineer at the Norwegian Data Protection Authority (Datatilsynet), focusing on the technology aspects of the GDPR. His background is in cybersecurity, cryptography, and systems design. Eirik is also president of the Norwegian chapter of the Cloud Security Alliance, focusing on cloud computing and cloud security. He has worked for more than 20 years in the information security industry, including at global companies such as IBM and CA.
11:15 - 11:30 Break
11:30 - 12:00 Session 1
- Urbano Reviglio.
Towards a Right Not to Be Deceived? An Interdisciplinary Analysis on Personalization in the light of the GDPR
12:00 - 13:00 Lunch
- Organised by I3E
13:00 - 14:00 Session 2
- Alfredo Perez Fernandez and Guttorm Sindre.
Software Assisted Privacy Impact Assessment in Interactive Ubiquitous Computing Systems
- Georgios Lioudakis, Maria Koukovini, Eugenia Papagiannakopoulou, Nikolaos Dellas, Kostas Kalaboukas, Renata M. Carvalho, Marwan Hassani, Lorenzo Bracciale, Giuseppe Bianchi, Adrián Juan-Verdejo, Spiros Alexakis, Francesca Rubina Gaudino, Davide Cascone and Paolo Barracano.
Facilitating GDPR compliance: the H2020 BPR4GDPR approach (short paper)
14:00 - 14:15 Break
14:15 - 15:15 Session 3
- Felix Mannhardt, Sobah Abbas Petersen and Manuel Oliveira.
Designing a Privacy Dashboard for a Smart Manufacturing Environment (short paper)
- Saskia Nuñez von Voigt and Florian Tschorsch.
RRTxFM: Probabilistic Counting for Differentially Private Statistics
15:15 - 15:30 Break
15:30 - 16:30 Session 4
- Discussion / Breakout session
Wil M. P. van der Aalst, RWTH Aachen University
Anne Adams, Open University
Nathalie Baracaldo, IBM Almaden Research Center
Andrea Burattin, Technical University of Denmark
Jörg Cassens, University of Hildesheim
Anders Kofod-Petersen, Alexandra Institute
Renata Medeiros de Carvalho, Eindhoven University of Technology
Mauro Conti, University of Padua
Martin Degeling, Ruhr-Universität Bochum
Judith Michael, RWTH Aachen University
Ali Sunyaev, Karlsruhe Institute of Technology
Manuel Oliveira, SINTEF
Pieter J. Toussaint, Norwegian University of Science and Technology
Hans Torvatn, SINTEF
Matthias Weidlich, Humboldt University of Berlin
Moe Wynn, Queensland University of Technology