WeWork says this technology will help make the workplace more collaborative, productive, and engaging. It also might give you the creeps.
There is a big question here: do we understand what all these data mean? Much of this analysis yields correlation, not explanation. So even if we don't find this creepy, there is a concern about how the data are used, especially for assessing individual productivity or activity. It could lead to a new iteration of Taylorism and 'scientific management' from the early 20th century: too much focus on observable data, without enough thought about human needs and what really drives productivity.

Beyond this macro concern, questions should be asked about boundaries, accountability, responsibility, and transparency. Some attention to the utilization of assets could help efficiency, but, as several have noted, individual tracking doesn't make sense and could erode privacy and, ironically, productivity. Further, deanonymization could be a problem: anonymization is not always as robust as it is thought to be, and researchers have been able to identify individuals in datasets believed to contain no identifiable records.

Finally, a comparison to the Google/Alphabet Sidewalk Labs project in Toronto's Quayside seems appropriate: a great increase in data collection for the stated purpose of efficiency, but with unclear consequences for social dynamics and privacy. Not everything new should be feared, but the dynamics around social media in the last few years suggest we cannot know all the implications of broad system changes ex ante, so some caution and thoughtfulness seem warranted.
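The re-identification worry above can be made concrete with a minimal sketch (every name and record below is invented for illustration): a dataset that strips names but keeps quasi-identifiers such as ZIP code, birth date, and sex can be joined against a public record, such as a voter roll, to recover identities.

```python
# Minimal linkage-attack sketch. All data below is invented for illustration.

# "Anonymized" dataset: names removed, but quasi-identifiers retained.
anonymized = [
    {"zip": "02138", "dob": "1965-07-31", "sex": "F", "diagnosis": "asthma"},
    {"zip": "02139", "dob": "1971-02-14", "sex": "M", "diagnosis": "flu"},
]

# Public dataset (e.g. a voter roll) that links names to the same attributes.
public = [
    {"name": "A. Example", "zip": "02138", "dob": "1965-07-31", "sex": "F"},
    {"name": "B. Example", "zip": "02139", "dob": "1971-02-14", "sex": "M"},
]

def reidentify(anon_rows, public_rows):
    """Join the two datasets on shared quasi-identifiers (zip, dob, sex)."""
    matches = []
    for a in anon_rows:
        for p in public_rows:
            if (a["zip"], a["dob"], a["sex"]) == (p["zip"], p["dob"], p["sex"]):
                # The "anonymous" sensitive attribute is now tied to a name.
                matches.append({"name": p["name"], "diagnosis": a["diagnosis"]})
    return matches

print(reidentify(anonymized, public))
```

The toy join succeeds whenever a combination of quasi-identifiers is unique, which is exactly the condition re-identification research has repeatedly shown holds for a large share of real populations.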
We are facing a customer conundrum this year across industries: opt for suboptimal service, or give up your data to gain an ever-increasing edge on experience. The only way past this barrier is total transparency and customer control of their own data, with the ability to switch bits on and off as they see fit and to understand how doing so might affect the services we enjoy, whether optimised workplaces, uniquely tailored medicines, perfectly brewed coffee, or the comfiest of sneakers made just for you. The choice has to be ours!
As with every other data-harvesting service out there, whether or not it’s “creepy” is the wrong question. The creepiness depends on how the data is used and what protections are in place to prevent uses that would lead to discrimination against people. The reason we worry about creepiness is that clear standards for what rights and protections people should have around the use of their data simply don’t exist in the US, so everything rests on the practices of the companies. Curious to know what limits GDPR places on this type of data use.
Whatever its opportunities, more surveillance will inevitably lead to worse outcomes for individuals unless we embrace a new technical and ethical paradigm based on clear rights, and controls, over data by people. History shows it is just a matter of time before even the best-intentioned companies, organizations, and governments abuse new forms of power unless kept in check from the start. Otherwise this has unlimited potential for harm.
This is about gathering data to measure and optimize productivity. Worker bee takes on a whole new meaning with analytics data to back it up. Imagine defining a model of how a super-productive worker acts, and then being trained toward and/or measured against it.