“To get people to the point where there’s more openness – that’s a big challenge. But I think we’ll do it. I just think it will take time. The concept that the world will be better if you share more is something that’s pretty foreign to a lot of people, and it runs into all these privacy concerns.” ~Mark Zuckerberg (Source)
A few months ago, as I was reading about medieval privacy in Barbara Tuchman’s A Distant Mirror: The Calamitous 14th Century, I began to see stories pop up in the news about how much personal data tech companies like Google and Facebook collect on their customers. At about the same time, I started a new writing project at work related to the European Union General Data Protection Regulation, a sweeping new law designed to increase consumer privacy protections in European countries. It was a relatively unique situation in which my reading interests overlapped both with what I was seeing in the news and what I was actually doing in my day job.
Naturally I started asking certain questions: how does what we know of medieval privacy compare to privacy today? Does the contrast between then and now teach us anything interesting or valuable about modern privacy concerns? Is what we have today “better”?
An Overly Simple Definition of Privacy
Merriam-Webster defines privacy as “the quality or state of being apart from company or observation.” That’s a good start, but what constitutes “observation,” “company,” and “being apart” when you’re alone in a physical space and yet constantly connected to a digital world that monitors so much of what you do?
With a little help from Wikipedia, let’s expand the definition to say that privacy is about having certain abilities or opportunities to seclude yourself, to be selective in what you share about your identity and your doings, and to be free from unwanted or unsanctioned surveillance by states and corporations. This is by no means an exhaustive definition—the meaning of privacy gets especially complicated when you take consent statements and data processing into account—but I think it’s a reasonable baseline.
Privacy in the Middle Ages
Privacy as defined by Merriam-Webster was utterly lacking in medieval times. In his short book The Middle Ages, historian Morris Bishop notes that very few people had their own bedrooms. Children slept in the same room as their parents, and servants with their masters. And while nobles did have additional rooms, they also had more servants and retainers to occupy them. “Even the English king was known to hold royal court in his bedroom,” he writes, “with his queen sitting on the bed for lack of other retreat. All ate together in the hall. […] Children slept with their parents or with the servants on the floor of the hall.” He wraps up his observations with one of my favorite lines: “Privacy is one of the greatest of modern inventions.”
Barbara Tuchman concludes much the same in A Distant Mirror. Unless you were a religious hermit, privacy was unknown. Even guests usually slept in the same room as the host and hostess. Citing Geoffrey Chaucer’s Reeve’s Tale, in which two Cambridge students have sex with the miller’s wife and daughter, Tuchman wonders whether the absence of privacy actually helped or hindered seduction. (Since I know you’re interested, Tuchman concludes that this remains an “open question,” adding that the “unmarried girl, noble or otherwise” probably had quite extensive knowledge of her peers’ sexual habits.)
It is difficult for us in the 21st century to look back and imagine such a world. Most if not all of your actions would be constantly observed by others, just as the actions of others would disturb your most intimate moments. Your whole family (parents included) wasn’t just hearing you behind closed doors; they were in the same room, sleeping inches away, oftentimes under the same covers. They knew your sounds and smells. If you lived in a castle, you would eat regularly with your neighbors, sweat together as you did your chores, and sleep next to one another if you were servants. If you did anything dramatic, or suffered some small cruelty or shame, the whole village or castle would soon know about it.
Privacy in America Today
In some ways this tiny foray into medieval history suggests that modern privacy is vastly superior. To be sure, the average American has a great deal more physical space, more opportunities for quiet and seclusion, and a reasonable expectation of taking a bath without the whole house watching. The advances we’ve achieved in this area are truly amazing.
And yet thanks to the Internet and modern database technology, more information about us is known and can be retrieved on demand than ever before. As reported in The Guardian, Google knows all the apps you use, the videos you watch, everything you’ve ever searched for and deleted, the emails you’ve sent, and so on. Facebook, not to be outdone in creeper status, knows “every message you’ve ever sent or been sent, every file you’ve ever sent or been sent, all the contacts in your phone, and all the audio messages you’ve ever sent or been sent.” And as professor Ian Bogost reminds us in this recent Atlantic feature, Facebook and Google are actually late to the game, meaning they are just two of the many companies that have been collecting, selling, processing, and correlating data about us on a large scale over the past few decades.
Now we all know this to some degree, and I suspect our eyes glaze over with indifference whenever we recall it. Or we wave our hands and insist that—unlike most people out there—we’re smart about how we control the dissemination of our information. Besides, we don’t do anything illegal. What do we have to hide?
The irony here is twofold. One, we share a lot more than we realize. Even the conscious actions we take to limit our technology use are specifically observed by companies for the express purpose of improving their programs. Two, the issue is not just about what we have to hide, but how our data can be used to monitor and influence us without our knowledge or consent. On these points it’s worth reflecting on a passage from another Guardian article:
If you think you’re a passive user of Facebook, minimising the data you provide to the site or refraining from oversharing details of your life, you have probably underestimated the scope of its reach. Facebook doesn’t just learn from the pictures you post, and the comments you leave: the site learns from which posts you read and which you don’t; it learns from when you stop scrolling down your feed and how long it takes you to restart; it learns from your browsing on other websites that have nothing to do with Facebook itself; and it even learns from the messages you type out then delete before sending (the company published an academic paper on this “self-censorship” back in 2013).
Granted, we’re not talking about visual, real-time observation here; and yes, a lot of the information recorded about us is boring and innocuous—in isolation. It’s when you collect enough of it and put it all together that you can predict behavioral patterns with shocking accuracy, and wield an exceptional amount of influence over people.
That sounds dramatic, but it is not overly so. According to Princeton computer science professor Arvind Narayanan, there are companies in existence right now whose specialty is “combining data about us from different sources to create virtual dossiers and applying data mining to influence us in various ways.” The term “influence” goes undefined, which is not particularly comforting, but at any rate this is precisely why free digital products are not really free. You, the consumer, are as much the product as the product you consume.
This isn’t all bad. We want our products to be continually improved, and the data collected about us is essential to that process. But things get nasty when our data is misused or hacked. Stories about online bullying, Russia’s interference in our election processes, the scandal with Cambridge Analytica, and delinquent celebrity tipping habits come to mind. And such incidents don’t have to be perpetrated by shady corporations or clever programmers to be alarming. As told in a recent Gizmodo feature, it is relatively easy today for a complete stranger who holds a grudge against you to find slivers of your information online and weave them into an incredibly damaging series of falsehoods, with just enough nuggets of truth to fool lots and lots of people.
Modern Versus Medieval Privacy
Okay, so medieval privacy was lacking, and modern privacy, while offering us a whole slew of social and economic benefits, has its own set of problems. Can we compare the two and evaluate which one is better?
No doubt the question is naive for a few obvious reasons. If medieval privacy did not really exist, how can we compare it to modern privacy? Moreover, I’m not a professor with a degree in history or privacy law, and I’ve never personally lived in a situation for more than a few days that’s even remotely analogous to a medieval existence.
But while the privacy issues of today do seem to be of a qualitatively different kind than those of the Middle Ages, I think we can still gain some interesting insights by comparing the absence of a thing to the presence of it. That’s one of the many reasons we study history. Concerning the other objection, well, it’s just too much fun to speculate! I’ll group my observations under four themes that have jumped out at me.
Value and Consent
Although modern privacy is scary in terms of how much data about us is collected and stored, at least we have the chance (a) to derive massive value from it (I’m thinking of things like self-expression on social media, tailored ads, faster processing of administrivia, virtual banking), and (b) to grant our consent to its collection and use. Seen in this light, the risk of having reams of our personal information “out there” in the digital vortex feels like a small price to pay.
Yet it’s important to reflect on what we mean by “value” and “consent.” Is it really true that the value we derive is in every case beneficial to a flourishing liberal society and our formation as individuals? An honest look at the potential effects of social media and smartphones on our spare time and mental health (see here, and here, and here, for example), to say nothing of their impact on our public discourse, is more than enough to cast doubt on that score.
Are you really going to stop using Google? Or quit Facebook? Or stop browsing the web? Or leave your smartphone behind? Or disable location services in your phone’s settings? Maybe some people will, for now, for a time, but then the reality of contemporary life will corral them back into these services. Eventually, it will become impossible. Unless you are independently wealthy, you can’t opt out of the credit services. Even if you never use your credit card, your employer might be giving your data to the agencies that manage it anyway. You can’t forgo the supermarket, or the drug store, or the Target, where every purchase is stored and linked to every other. There is no escaping the machinery of actual life, no matter how many brows get furrowed or tweets get sent about it.
Or as one security researcher said more succinctly: “If you want to be a functioning member of society you have no ability to restrict the amount of data that’s being vacuumed out of you to a meaningful level.”
Extent and Granularity
Ian Bogost makes several other points in his Atlantic feature worth recalling: (1) companies have been trying to profit from the information they collect about customers for a very long time (he notes that the term “business intelligence” dates back to 1865); and (2) the extent and granularity of that information has dramatically increased in the past decade or so, largely because of smartphones and location data. He also observes that this information is now far easier to unify, archive, copy, and back up across servers and databases around the globe, where it can survive outages, whereas before such records tended to be scattered and disparate. In addition, the means for drawing correlations between the data are becoming ever more sophisticated.
By contrast, in the Middle Ages, you could be quite certain that your personal information, if collected at all, was not being recorded in detail and stored in such a resilient manner. Capturing information was costly, and reliable records were scarce—a dual reality which would grind our contemporary nerves to a pulp. How many of us would like to revert to cumbersome, paper-based systems at the DMV?
At the same time, our systems have arguably gone too far. If an independently-owned website published utterly false, damaging information about you somewhere on the internet (as described in the Gizmodo feature), wouldn’t you want it permanently removed? Or what if you wanted to prevent your bank or credit agency from sharing any of your data with third-party marketing companies? Right now in the United States, there is apparently no easy way to do either of these things.
Ease of Discovery
We have more space to ourselves than our medieval ancestors did, and more opportunities to withdraw to quiet spaces where we can think and act with minimum observation. But that doesn’t necessarily make us less discoverable. In fact, as far as social media is concerned, discovery is the point. Which is great so long as the people who discover us are nice. But woe betide the victims of prejudice, hacking, fake news, and online bullying.
Medieval people, who knew nothing of these phenomena, could be reasonably certain that even if some gossip about them were to spread, it would not leak beyond their local circle or be mined by strangers in different time zones. It had to be pretty dramatic if it did. And that’s still true today, with important differences. Medieval gossip had to travel physically (rather than be sent instantaneously over the Internet), was mostly dependent on word of mouth (rather than video or hypertext), and took more work—more memory, conversation, and handwriting—to uncover and transmit (no Google searches or email). You would not have had to worry about hackers or cyberbullies. This isn’t to deny that social stigmas back then were crushing and cruel; it is merely to point out that social stigmas today can be equally crushing, if not more so, thanks to the viral potential of the web.
Community and Isolation
While medieval privacy was basically non-existent, the flip side was that you generally had a thicker sense of community and belonging. Life was shared in a real-time, physical co-presence unmediated by the distractions of television, phones, and computer interfaces. No doubt this made it easier to irritate one another, yet it also facilitated social bonding in ways our screen technologies have yet to match. (This in fact seems to be why professor Nicholas Tampio argues so strenuously for a model of education that is based less on screens and more on lived, embodied experiences.) Barbara Tuchman’s insight on this dimension is powerful to consider: “Beneath the cry of protest much of medieval life was supportive because it was lived collectively in infinite numbers of groups, orders, associations, brotherhoods. Never was man less alone.”
From what I can tell, the opposite is true today: never was man more alone. Without question, we have incredible opportunities to stay connected with our friends and family through social media. Yet those connections are often highly superficial, and there’s a strong tendency to compare ourselves to others—and then perpetuate the comparison cycle by posting pictures of our best moments for others to envy. This in turn can foster loneliness rather than reduce it.
You could argue that technology alleviates loneliness by letting us connect with like-minded people across the globe. That is true, to an extent. And it is a double-edged sword. Alan Jacobs reminds us in How to Think that “Technologies of communication that allow us to overcome the distances of space also allow us to neglect the common humanity we share with the people we now find inhabiting the world” (p. 82). The same technology that unites us can just as easily divide us, a fact which Russia seems keen to exploit. And if we are to believe this recent feature in the Harvard Business Review, there is a loneliness epidemic spreading in the workplace, prompted in part by the quiet, sedentary nature of office jobs and the increased mobility of the workforce. It is telling that these trends are enabled rather than arrested by technology. This does not mean we never laugh and have fun with our coworkers, but that, compared to a few decades ago when average job tenures were longer, we have a harder time feeling like we truly know them, or are known by them.
So, which is better? Modern or medieval privacy? I think the answer is neither—or rather that each is better in some ways and worse in others. You were discoverable in the Middle Ages; but, for better or for worse, not nearly as discoverable as you are today. You might have had some personal information recorded about you, but by modern standards, the extent and granularity of that information was minuscule. You would have had a hard time escaping the observation of your peers; escape is easier today. Transmitting personal information was laborious then, whereas now it is nearly effortless, and we save a lot of time as a result.
Things get very muddy, though, when we add data privacy to the equation. Again: a staggering amount of information about us is collected, and it’s almost impossible to have a meaningful level of control over how it is used to influence us. With social media in particular, we are coaxed into greater transparency, not less. (And I have not even touched on the issue of national security and surveillance by the state, a topic fraught with serious implications for democracies and autocracies alike.) In some ways, perhaps it is just as hard for us to escape from observation today as it was in the Middle Ages.
But what is our alternative? We need to give basic information to businesses and government agencies in order to function as a society, and no one is going to give up social media or credit cards anytime soon. For these reasons, the European Union General Data Protection Regulation seems to me like a step in the right direction, especially as regards the ability to have certain kinds of data well and truly deleted from all of a business’s systems of record. But I haven’t studied the law in depth (I wonder, for instance, how much it stifles the ability of businesses to innovate), and I’m skeptical that regulatory solutions alone can solve such complex problems. At any rate, it will take time to see how effective the law proves to be.
The better question, I think, is this: what principles ought to govern a sort of golden mean between too much collection of our personal data, and just enough to enable us to flourish as humans who depend on one another? What criteria should we apply to determine what is fair use of our data, and what goes too far? I believe such principles exist, and that empathy and trust are among them. My guess is that we need a lot more than technological solutions, and that we’d do well to ponder what human beings are for before we define the proper role that privacy should play in our lives.
In the meantime, as one of my graduate professors once said, there is little we can do right now to achieve holistic privacy. The best we can hope for is to swim below the digital spotlight without rocking the boat enough to draw unwanted attention. Ian Bogost is even more pessimistic: the problem is so complex and out of control that we don’t know who the real villains are. Like the Hydra of Greek mythology, we might chop off one head only for two more to grow in its place. I am not sure we need to be quite so nihilistic as that; I believe as informed citizens we can and should speak up on the issue, and demand better solutions from government and industry (while at the same time trying to learn from what medieval society can teach us about the irreplaceable benefits of steady, face-to-face community). Still, Bogost has a point: we need to be brutally honest about the precariousness of our situation. I am as grateful as the next person for convenient online services and personalized notifications, but it’s crucial to remember that we are not as safe as we think, nor quite so far advanced beyond our medieval predecessors as we want to believe.