User Studies on Trustworthy Collaborative Systems
Inria Associate team USCoast (2013-2015) and USCoast2 (2016-2018)
This project focuses on the human evaluation of methods and algorithms for trustworthy collaborative editing. It brings together the expertise of the SCORE team in distributed collaborative systems with the expertise of the Department of Psychology at Wright State University in user studies.
© Inria / Photo H. Raguet.
The project focuses on two substantive areas, real-time collaborative editing and trust-based collaboration, with one overarching methodological contribution.
- Real-time collaborative editing, including a deeper understanding of real-time requirements for collaborative editing, grounded in a theory of how real-time constraints affect collaborative work. Much of the related work is fundamentally flawed, being based on tasks with varied time constants, idiosyncratic task coupling, and uncontrolled compensatory strategies. The project also focuses on awareness management for coordinating work in the presence of conflict and disruption.
- Trust-based collaboration, where users master and control their data by deciding with whom to share it, without relying on a central authority. We investigate new trust-based access control mechanisms in which access is granted according to user trust values that are dynamic and vary over the course of a collaboration. These mechanisms must be both scalable and usable. The main questions we address are how to compute trust values so that they correctly reflect collaboration experiences between users, and how to ensure that users accept them.
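To make the idea of dynamic trust values concrete, here is a minimal sketch of one possible scheme: trust as an exponential moving average of collaboration outcomes, with access granted above a threshold. This is a hypothetical illustration, not the project's actual algorithm; the class, parameter names, and threshold are all assumptions.

```python
class TrustScore:
    """Hypothetical dynamic trust value updated from collaboration outcomes."""

    def __init__(self, initial=0.5, alpha=0.3):
        self.value = initial   # trust in [0, 1], starting from a neutral prior
        self.alpha = alpha     # weight given to the newest interaction

    def record(self, outcome):
        """Record one interaction: 1.0 for a positive outcome, 0.0 for a negative one.

        An exponential moving average makes recent collaboration experiences
        weigh more than old ones, so trust rises and falls during a collaboration.
        """
        self.value = (1 - self.alpha) * self.value + self.alpha * outcome
        return self.value

    def grants_access(self, threshold=0.6):
        """Trust-based access control: share data only above a trust threshold."""
        return self.value >= threshold

# Example: five positive interactions raise trust above the access threshold.
t = TrustScore()
for _ in range(5):
    t.record(1.0)
print(round(t.value, 3))   # → 0.916
print(t.grants_access())   # → True
```

The averaging weight `alpha` controls how quickly trust reacts to new experiences; a larger value makes a single betrayal more damaging, a smaller one makes trust slower to build and to lose.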
Methodologically, validation requires the expertise of both the computer scientists who designed the systems and social scientists who conceptualize and measure human behaviour in collaborative work. We are developing new methods for the cost-effective evaluation of collaborative work, compensating for otherwise unrealistic sample sizes and costly engineering by using game theory to inspire task analogues and by deploying simulated users alongside human users.
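As an illustration of a game-theoretic task analogue with a simulated user, the sketch below pits a simulated partner playing tit-for-tat against a scripted sequence of moves in an iterated prisoner's dilemma. The strategy, payoff matrix, and function names are hypothetical, chosen only to show how a simulated user could stand in for a human collaborator in large-scale evaluations.

```python
# Hypothetical game-theoretic task analogue: a simulated user plays
# tit-for-tat in an iterated prisoner's dilemma against a (scripted) human.

COOPERATE, DEFECT = "C", "D"
# Standard prisoner's dilemma payoffs: (row player, column player).
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def tit_for_tat(partner_history):
    """Cooperate first, then mirror the partner's previous move."""
    return COOPERATE if not partner_history else partner_history[-1]

def play(human_moves):
    """Run the simulated user against a sequence of human moves.

    Returns the cumulative (simulated user, human) scores.
    """
    sim_score = human_score = 0
    human_history = []
    for human in human_moves:
        sim = tit_for_tat(human_history)
        s, h = PAYOFF[(sim, human)]
        sim_score += s
        human_score += h
        human_history.append(human)
    return sim_score, human_score

# A defection is punished on the next round, then cooperation resumes.
print(play(["C", "C", "D", "C"]))   # → (11, 11)
```

Because such analogues have controlled time constants and payoffs, many simulated partners can be run cheaply, reserving scarce human participants for the conditions that matter.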