Coming to Terms

Interaction Principles for Negotiating Privacy in the Connected Workplace



Abstract


User-centered digital interfaces between an employee, management, and an inherently surveilling ecosystem of workplace applications can help uphold the dignity, earn the trust, and evoke the altruism of an employee via screen-based interactions that reinforce transparency, employee agency, co-determination, and negotiation.


More and more, businesses are looking to their massive stores of data on workplace behavior as an opportunity to better understand their organizations, monitor performance, and optimize their workforces and processes.

This is not a new notion, but the present revival of optimization based on rigorous, quantitative observation brings with it an unprecedented amount and granularity of data. Furthermore, technologies and services are beginning to surface that are capable of trawling massive data stores and building ontologies for the purpose of business strategy.

However, such systems bear cultural connotations of oppression, coercion, and privacy invasion—rightfully so, given the unflinchingly bottom-line-driven climate of US business, the widening class gap, and daily headlines regarding the NSA surveillance program. If workforce surveillance is going to be genuinely successful, one of the primary challenges it must negotiate is an established and expanding distrust among employees.

My thesis project is a digital user interface prototype for a workforce tracking system that affords users the ability to manage their level of participation. The design of this UI supports and manifests principles of transparency, privacy, human autonomy, co-determination, and negotiation. The hypothesis of my project is that a user interface that is transparent and democratic in its interactions regarding workplace surveillance will result in users being more responsible with their personal data as well as more altruistic in volunteering that data for use by their organization.








Paper

Covers primary and secondary research, design principles, prototyping process, discussion, and conclusion.

DOWNLOAD PDF

Prototype

Simulation of software that allows an employee to customize and negotiate their involvement in a workplace surveillance program.

LAUNCH PROTOTYPE

TL;DR

An article that serves as an executive summary of learnings and principles from the thesis, published on DesignMind.

VIEW ARTICLE







Design Principles





Personal Data is an Inalienable Representation—not a Currency


In the past decade, many services predicated on the idea that a user’s data can be exchanged as payment for said service have seen tremendous success and adoption. Businesses that have deployed such service models include tech giants such as Google and Facebook.

However, there has always been consistent unease among those concerned with the implications of such service models. In light of the Edward Snowden leaks regarding government surveillance, as well as huge tech companies making increasing plays to provide the essential infrastructure for societies, this concern has become widespread and is fueling much debate and dialogue. In spite of its effectiveness in driving user adoption, and even some legitimacy as a system that could plausibly uphold the privacy of individuals, treating a user’s private data as a commodity with transferable ownership is gaining a poor reputation.

To counteract this model of personal data processing, the thesis prototype is designed around a principle that shares the spirit of the EU directives that regard human dignity and privacy as fundamental rights. In this vein, the design leverages every opportunity to communicate this belief—that their data is theirs—to the user in order to better establish trust between all parties.

Personal Data is a Democratic Mechanism


In writing about the delicate tension between privacy and transparency in society, Evgeny Morozov points out that the right to complete privacy—to withdraw oneself wholly from the public sphere, disconnected from corresponding responsibilities—poses a danger for society and begins to “undermine the very democratic regime that made the right possible” (Morozov, “The Real Privacy Problem”).

Be it a society or an organization, some degree of transparency at the level of the individual is needed for peers to mutually benefit from knowledge sharing and for government officials or management to form informed strategies for the organization. Morozov ultimately points to the importance of politics—the realm of dialogue and negotiation suited to the messy, human, often confusing nature of societies and organizations—as the medium in which personal information should be pledged or withheld.

Thus, employees should view their personal information not merely as useful to themselves and their own careers but also as something that can be pledged toward strategies and initiatives that they deem beneficial. Furthermore, as opposed to a UI designed with transactional metaphors, the UI of workplace surveillance software should instead seek out dialectical patterns and metaphors appropriate for negotiation, political discourse, and the formation and maintenance of relationships.
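To make this pledge-based framing concrete, below is a minimal sketch in TypeScript of what a non-transactional data model might look like. All names here (Pledge, Initiative, DataStream) are hypothetical illustrations, not part of the actual prototype:

// Hypothetical sketch: data is pledged toward a named initiative,
// not sold or transferred. A pledge stays revocable, mirroring an
// ongoing political relationship rather than a one-time transaction.

type DataStream = "calendar" | "email-metadata" | "badge-location";

interface Initiative {
  id: string;
  title: string;          // e.g. "Reduce meeting overload this quarter"
  statedPurpose: string;  // the rationale the employee is pledging toward
}

interface Pledge {
  employeeId: string;
  initiativeId: string;
  streams: DataStream[];
  revoked: boolean;
}

// Pledging is an affirmative, scoped act by the employee...
function pledge(employeeId: string, initiative: Initiative, streams: DataStream[]): Pledge {
  return { employeeId, initiativeId: initiative.id, streams, revoked: false };
}

// ...and withdrawal is always available, with no penalty encoded in the model.
function revoke(p: Pledge): Pledge {
  return { ...p, revoked: true };
}

The design choice worth noting is that a pledge is scoped to a stated purpose and remains revocable, mirroring an ongoing relationship rather than a one-time transfer of ownership.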

Effective Policies are the Product of Recourse and Negotiation


Adam Greenfield articulates the problem this design principle sets out to mitigate:

[T]he deep structural design of informatic systems—their architecture—has important implications for the degree of freedom people are allowed in using those systems, forever after. Whether consciously or not, values are encoded into a technology, in preference to others that might have been, and then enacted whenever the technology is employed. Such decisions are essentially incontestable... Even if we are eventually able to challenge the terms of the situation—whether by appealing to a human attendant who happens to be standing by, hacking into the system ourselves, complaining to the ACLU, or mounting a class-action lawsuit—the burden of time and energy invested in such activism falls squarely on our own shoulders. (Greenfield, Everyware: The Dawning Age of Ubiquitous Computing, 144, 146)

Systems that embody and become a platform for institutional power structures—such as a system concerned with measuring and communicating how people behave at work—are highly susceptible to the issue Greenfield raises. Therefore, it was of primary importance to continually identify opportunities for the UI of my prototype to afford users the ability to react against any unintended biases inherent to, or acquired by, a system that tracks personnel.
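As a hedged sketch of what such an affordance for recourse might look like in code (ReportedMetric and fileDispute are hypothetical names introduced only for illustration), every surfaced metric could carry a built-in dispute path:

// Hypothetical sketch: every surfaced metric carries a first-class recourse
// path, so contesting the system's output is part of the interface rather
// than an external burden (lawsuit, hack, complaint) on the employee.

interface Dispute {
  employeeId: string;
  reason: string;            // the employee's objection, in their own words
  filedAt: Date;
  status: "open" | "reviewed";
}

interface ReportedMetric {
  id: string;
  description: string;       // e.g. "Average response latency to messages"
  value: number;
  disputes: Dispute[];
}

// Filing a dispute hides nothing silently: the metric remains visible,
// but now carries the objection alongside it for reviewers to see.
function fileDispute(metric: ReportedMetric, employeeId: string, reason: string): ReportedMetric {
  const dispute: Dispute = { employeeId, reason, filedAt: new Date(), status: "open" };
  return { ...metric, disputes: [...metric.disputes, dispute] };
}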

Transparency is an Effective Means to Trustworthiness


During the initial round of interviews with people who work for employers or in fields that might deploy surveillance technology for workplace analytics, one interviewee mentioned that it was really a matter of surfacing what her company wanted to do with the data gathered by such an initiative. When I ran some hypothetical uses by her, she claimed to have no concerns with any of them and would volunteer her data. This was true even of hypothetical uses that had no benefit for her. After further discussion, she realized and communicated that it was the good faith of clearly exposing intent that helped her trust the initiative just as much as, if not more than, fully understanding the intent.

This was echoed when facilitating the usability tests on the prototype. When the business’s intentions regarding the workplace surveillance were not stated immediately and clearly, a few participants mentioned that this concerned them. While it is important to let users know what is intended with workplace surveillance so they have an opportunity to judge whether it is worthwhile, it seems a much deeper trustworthiness is gained simply by being upfront with intentions.

Opt-In is the System Default


Dark Patterns, a website that collects examples of ethically dubious user experience design patterns, defines the dark pattern called ‘misdirection’ as follows: “[t]he attention of the user is focused on one thing in order to distract its attention from another” (“Misdirection”). This pattern is commonly applied to software installation wizards that try to stealthily install additional, generally unwanted software, often by opting the user in by default in the hope that they fail to opt out.

Users are becoming increasingly aware of this pattern and others that are used to opt in their personal information. A 2012 study showed that over half of app users have decided not to install an app when they discovered how much personal information they would be required to share in order to use it. The same study also showed that 30% of app users have uninstalled an app already on their phone upon learning it was collecting personal information that they did not wish to share. (Boyles)

Such interactions instill in users a lack of trust toward the software and its creators. Tactics that attempt to sneakily opt people into generally undesirable circumstances—or, worse yet, the absence of any interaction that allows the user to opt out—can even be viewed as coercive. Coercive maneuvers made by management to surveil the workforce can actually lead to paradoxical results: “These efforts become an escalating battle of wits in which managers devise ever more sophisticated surveillance and employees use their ingenuity to circumvent it” (Sewell 939).

This means that such systems—when it comes to the processing, synthesizing, and reporting of data—should be engineered and designed under the assumption that a complete data “portrait” of the workforce, even at the level of the individual, is unlikely. Besides, the inability of an embedded, data-gathering system to paint a complete picture is not just a result of honoring the active consent of those being surveilled but a deeper, inherent reality: a complete, qualitative picture of human behavior produced by a system is an impossibility (Greenfield 133).
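A minimal sketch, assuming a hypothetical TrackingConsent model (none of these names come from the prototype), of how this principle could be encoded: every channel starts opted out, and the collection layer treats gaps as legitimate rather than exceptional:

// Hypothetical sketch: the system defaults every tracking channel to "off",
// and the reporting layer is written to expect gaps rather than a complete
// portrait of any individual.

interface TrackingConsent {
  channel: string;   // e.g. "calendar", "badge-location"
  optedIn: boolean;  // false unless the employee explicitly opts in
}

function defaultConsent(channels: string[]): TrackingConsent[] {
  // Opted-out is the starting state for every channel, never a buried checkbox.
  return channels.map((channel) => ({ channel, optedIn: false }));
}

function collectable(consents: TrackingConsent[]): string[] {
  // Only explicitly opted-in channels are ever read; an empty result is
  // a legitimate, expected outcome, not an error state.
  return consents.filter((c) => c.optedIn).map((c) => c.channel);
}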

A Third Party Embodies and Communicates Bipartisanship


This principle started out as very much a hypothesis at the beginning of the project. On the one hand, a third party handling the deployment and data processing of a workplace surveillance program could put employees at ease, knowing that their managers or peers would not be the ones viewing and synthesizing their information, which could otherwise open chasms in the power structure within the organization. On the other hand, a third party also represents an outside organization that is not necessarily beholden to the organization and its members, and could seek its own benefit when wielding employee information.

However, most usability test participants preferred the notion of a third party handling and processing the data. It represented neutrality. Thus, it would seem that a third party does indeed help establish trust between employees and the system.
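One way to picture the trust boundary implied by this finding is a sketch in which raw, identifiable events stay with the third-party facilitator and the employer receives only de-identified aggregates. The types and functions below are hypothetical illustrations, not the prototype's actual architecture:

// Hypothetical sketch of the boundary a third party could enforce:
// raw, identifiable events stay with the facilitator; the employer only
// ever receives de-identified, group-level numbers.

interface RawEvent {
  employeeId: string;
  channel: string;
  value: number;
}

interface AggregateReport {
  channel: string;
  mean: number;
  sampleSize: number; // the employer sees group sizes and averages, never names
}

// Run by the third party, never by the employer.
function aggregate(events: RawEvent[], channel: string): AggregateReport {
  const values = events.filter((e) => e.channel === channel).map((e) => e.value);
  const mean = values.reduce((sum, v) => sum + v, 0) / Math.max(values.length, 1);
  return { channel, mean, sampleSize: values.length };
}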

Some People Care about Details—Most Don't


While it is critical, for the sake of communicating trustworthiness, to extend the offer of line-item consent to potential participants in a surveillance system, many users are not interested in the finer details of their consent and participation. In the initial research interviews, it became clear that employees who trusted their employer felt little need to control every aspect of their participation in a workforce analytics study involving surveillance. Therefore, the software should offer an easy path for users who wish to consent quickly and completely.

This principle also has another application. As mentioned in a previous section, there is a tremendous usability issue with the lengthy, complex legal documents that spell out the terms of agreement for software applications and other services. While these documents will remain necessary for legal purposes, the user experience design of workplace surveillance software should find ways to interpret the terms in a user-friendly manner.
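Both applications of this principle can be sketched together in a hypothetical ConsentItem model (the names ConsentItem, grantAll, and toggle are illustrative only): each line item pairs its legal text with a plain-language summary, and a single shortcut serves users who wish to consent quickly and completely:

// Hypothetical sketch: line-item consent remains available for those who
// want it, while one shortcut serves users who trust their employer and
// wish to consent completely. Each term also pairs its legal text with a
// plain-language summary for the consent UI.

interface ConsentItem {
  id: string;
  legalText: string;     // the binding language, kept for legal purposes
  plainSummary: string;  // a user-friendly interpretation shown in the UI
  granted: boolean;
}

// The easy path: consent to everything in one affirmative action.
function grantAll(items: ConsentItem[]): ConsentItem[] {
  return items.map((item) => ({ ...item, granted: true }));
}

// The detailed path: toggle a single line item.
function toggle(items: ConsentItem[], id: string): ConsentItem[] {
  return items.map((item) =>
    item.id === id ? { ...item, granted: !item.granted } : item
  );
}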

Get Out of the Way of Work


This is a simple, obvious principle: any such system should minimize its interruption of an employee’s workday. This concern was mentioned frequently when testing the prototype. Concern about the privacy implications of such systems is rivaled by concern that they will be a nuisance that gets in the way of getting work done.




Discussion



Trustworthiness

Based on the feedback received from participants in user testing, the approach taken has resulted in a user experience that engenders trust. Transparency, clarity, and control were cited by test participants as the reasons for its trustworthiness. The initial prototype had some issues in user testing regarding the clarity of surveillance methods, which resulted in some suspicion or doubt. However, steps have been taken in the revised prototype to increase clarity and mitigate these issues...





Methods

In the process of creating the initial and revised prototypes, I never came across a particularly novel interaction pattern or widget that espoused trustworthiness. If anything, designing the prototype has been an exercise in the fundamentals of interaction and user interface design. The novelty to be found in this project lies more in the application of human-centered design fundamentals to the nascent field of computational social science and to state-of-the-art, deeply connected surveillance in the context of the workplace...





Provocation

The breed of design I have employed for the current iteration is conventional and focused on achieving the goals of all involved parties as directly and amicably as possible. This results in a smooth process that can be executed quickly and clearly. While this approach has yielded several important, worthwhile interaction patterns and design principles, I don’t believe it is sufficient to begin addressing some of the deepest, most difficult issues that are emerging and will continue to crop up with inherently surveilling information products...





Third-Party Facilitation

While designing a business model is not the goal of this thesis, many benefits of such a service being operated by a third party were discovered when designing and testing the prototype. Almost all test participants appreciated the notion of the study being facilitated by an outside organization—even those who had a high degree of trust in their own organization. I suspect the precedent set by the third-party organizations that conduct background checks for employers could be influential in this notion of the neutrality of third parties, given the context of sensitive, personal data in the workplace...





Legislation

The prototype demonstrates the beginnings of a system that is indeed capable of facilitating the communication and negotiation of policies regarding workplace surveillance. Given such systems, it becomes easier to imagine a future in which individuals and organizations successfully co-determine an array of policies with one another...






About



Project

This thesis was completed in the winter of 2014 as part of the requirements of the MFA program at Savannah College of Art & Design, and it now serves as a launchpad for further exploration into topics of workplace systems and employee privacy.

Author

Clint Rule is an associate creative director with expertise in interaction design. As a design consultant, Clint has experienced various "opportunity spaces" in which large organizations seek individuals' personal data—at home, at work, on the road, and in the air. Because this trend will only increase for all designers, he strives to uncover principles and processes that ensure user-centered design addresses our society's growing privacy concerns.