
Sunday, March 2, 2014

The Danger of Automated Content

Last week I watched a February 2011 TED Talk by Eli Pariser.

Eli Pariser is the author of The Filter Bubble: What the Internet Is Hiding from You. He joined MoveOn.org in 2001, currently serves as its board president and, most recently, started Upworthy, a content curation site.

The subject of Pariser's TED speech was the potential danger of automated content algorithms - curated search results, news feeds, etc.

According to Pariser, even when you're not logged in, Google search evaluates 57 different signals - things like what type of computer you're using, what browser, and so on - that are used to help curate results.
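To make that concrete, here is a toy sketch in Python of how ambient signals might rerank the same query for two different people. The signal names and weights are entirely invented for illustration; nothing here reflects Google's actual 57 signals.

    # Hypothetical signal-based curation. The signals and weights are
    # invented for illustration; they are not Google's real signals.
    RESULTS = [
        {"title": "Protests erupt abroad", "topic": "news"},
        {"title": "Best hiking trails nearby", "topic": "local"},
        {"title": "New laptop reviews", "topic": "tech"},
    ]

    def score(result, signals):
        """Boost results that match the ambient signals we can observe."""
        boost = 0.0
        if signals.get("device") == "laptop" and result["topic"] == "tech":
            boost += 1.0
        if signals.get("geo_known") and result["topic"] == "local":
            boost += 1.5
        return boost

    def curate(results, signals):
        # Same query, different ordering, depending on who is asking.
        return sorted(results, key=lambda r: score(r, signals), reverse=True)

    print(curate(RESULTS, {"device": "laptop", "geo_known": True}))
    print(curate(RESULTS, {"device": "phone", "geo_known": False}))

Two users typing the identical query see different orderings, and neither is told why.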

"You don't get to decide what gets in. And more importantly, you don't see what gets out," Pariser said.

According to Pariser, prior to the information/internet revolution, editors at media organizations were the "gatekeepers" to content. The spread of the internet, he noted, gave everyone a chance to find content for themselves. For Pariser, now the new gatekeepers are the algorithms that tailor search results, news feeds or other content. And this, he said, is dangerous.
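The mechanism behind that worry is easy to caricature in a few lines. The toy feedback loop below is my own invention, not any platform's real algorithm; it only shows how an engagement-weighted feed can narrow over time, quietly deciding what "gets out."

    import random

    # Toy feedback loop: clicked topics get weighted up, so the feed
    # slowly narrows toward what the user already engages with.
    weights = {"politics": 1.0, "science": 1.0, "sports": 1.0}

    def pick(weights):
        topics, w = zip(*weights.items())
        return random.choices(topics, weights=w, k=1)[0]

    random.seed(0)
    for _ in range(200):
        shown = pick(weights)
        if shown == "sports":       # pretend the user only clicks sports
            weights[shown] *= 1.05  # engagement boosts future exposure

    total = sum(weights.values())
    for topic, w in weights.items():
        print(f"{topic}: {w / total:.0%} of the feed")

After a couple hundred iterations, sports crowds out everything else, even though no editor ever decided the user shouldn't see politics or science.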

At the end of his speech, Pariser called out to Google, Facebook and the other tech companies represented in the audience.

"We really need to [sic] you to make sure that these algorithms have encoded in them a sense of the public life, a sense civic responsibility. We need you to make sure that they are transparent enough that we can see what the rules are that determine what gets through our filters."

After watching his TED Talk, I began to wonder about the ethics of programming. Is this something that is being taught in universities? Are there any guidelines?

The Association for Computing Machinery, a computer science trade organization founded at Columbia University in 1947 that has more than 100,000 members around the globe, has two sets of ethical guidelines - a general code of ethics and professional conduct and a software engineering code of ethics.

The general code of ethics starts off with a list of eight moral imperatives.

The first imperative is that all ACM members must, "Contribute to society and human well-being... An essential aim of computing professionals is to minimize negative consequences of computing systems, including threats to health and safety."

To what extent could omitting content deemed irrelevant to a user's historical interests, but potentially important to that user's safety, be labeled a threat to health and safety?

The second imperative continues along the same path.

All ACM members must, "Avoid harm to others...

"Well-intended actions, including those that accomplish assigned duties, may lead to harm unexpectedly. In such an event the responsible person or persons are obligated to undo or mitigate the negative consequences as much as possible. One way to avoid unintentional harm is to carefully consider potential impacts on all those affected by decisions made during design and implementation."

Again, one could argue that curation fits this imperative - a well-intended filter may lead to harm unexpectedly.

The first principle in the software engineering code of ethics focuses on the public.

"Software engineers shall act consistently with the public interest. In particular, software engineers shall... Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish the quality of life, diminish privacy, or harm the environment. The ultimate effect of the work should be to the public good."

I wonder how one would argue that curating results serves the public good. It certainly serves the personal good - the fulfillment of one's wants and desires - but does it serve the public good? Could this principle be read to mean that, in such a case, the public good should outweigh the personal good? Is the act of curation unethical?

Pariser gave his talk in February 2011. In the three years since, has anything changed?

At least one thing has. In November of that same year, Google added a new search option, known as "verbatim" search, which, according to this blog post, searches for just the terms that have been entered and nothing else.
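As a rough illustration of the distinction, here is a minimal sketch. The documents and synonym table are invented for the example; Google's real query rewriting is far more elaborate and opaque.

    # "Verbatim" lookup vs. loose, expanded matching (toy example).
    DOCS = [
        "cheap flights to Paris",
        "inexpensive airfare deals",
        "Paris travel guide",
    ]

    # Invented synonym table; real engines use far richer models.
    SYNONYMS = {"cheap": {"cheap", "inexpensive"},
                "flights": {"flights", "airfare"}}

    def verbatim_match(query, doc):
        # Every entered term must appear exactly as typed.
        return all(term in doc.split() for term in query.split())

    def expanded_match(query, doc):
        # Any synonym of an entered term counts as a hit.
        words = set(doc.split())
        return all(words & SYNONYMS.get(term, {term}) for term in query.split())

    query = "cheap flights"
    print([d for d in DOCS if verbatim_match(query, d)])  # exact terms only
    print([d for d in DOCS if expanded_match(query, d)])  # looser rewriting

The verbatim pass returns only the first document; the expanded pass also returns the second, because the engine has decided "inexpensive airfare" is what you really meant.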

It's a nice addition, but how many people will actually rerun a search with "verbatim" turned on? And how does Google safeguard against over-curation of standard searches?

I highly value my privacy and haven't had a Facebook account since 2008 or 2009, so unfortunately I can't comment on how heavily curated its feeds are.

In all, Pariser's talk offered an important and insightful comment on the general direction of the internet. While it's hard to gauge how much damage curated results may be doing to our society, the question deserves far more thought and attention.

As computers become ever more deeply integrated into our lives, Pariser's and the ACM's calls for ethical standards will only become more important. As a society, it is our job to take this charge seriously as we move forward in a world increasingly controlled by, and enveloped in, computers.

