Back in June, the results of a controversial study by Facebook came to light. Unbeknownst to its users, Facebook had manipulated news feeds with a positive or negative slant to see how users’ emotional states (as judged by their own posts) would be affected. Although Facebook maintained the experiment was “for science,” many users were shocked, outraged, and troubled as they realized the extent to which Facebook owns their private information and how willing it is to use it.
And in the past couple of weeks, Facebook made another gaffe when a user flagged hundreds of drag queens for not abiding by Facebook’s “Real Name” policy and Facebook suspended those accounts. Unaware that the majority of the flagged users were drag queens, Facebook issued an automated response requesting proof of each user’s “legal name” to continue using the site. The flagging then spread across Facebook to other alias account names. While this is standard Facebook operating procedure – primarily used to prevent cyberbullying, trolling, and other impersonation – in this case it disproportionately affected members of the LGBTQ community. And it was poorly handled.
Protests erupted, with some of the most outspoken voices being Sister Roma and Lil Miss Hot Mess. While the controversy raged on, thousands of disenchanted Facebook users fled the site to join Facebook’s newest rival, Ello, a rough-beta, anti-corporate social networking site.
Facebook has apologized for its actions and made moves to repair the damage, but these back-to-back situations have left Facebook on shaky ground with many users. And they offer some important lessons for developers, designers, entrepreneurs, startups and larger companies building mobile apps and user communities.
The recent Facebook actions have left a bad taste in people’s mouths for several reasons. To start, the results of Facebook’s “research” served only to benefit Facebook. Its findings stated that “emotional contagion” – people altering their emotions and moods based on the presence or absence of other people’s positive (and negative) moods – is transferable through an online medium like Facebook. That conveniently undercuts any concerns about so-called “Facebook depression,” in which constant online social comparison supposedly leaves users feeling sad and lonely.
The Real Name Policy backlash was further fueled by a 2010 comment made by Zuckerberg, who claimed anyone using a Facebook alias showed a “lack of integrity.” As Olivia Grace, a drag performer, explains, the policy affects not just the hundreds of drag queens who spoke up, but anyone whose legal name can pose a risk, such as abuse survivors or political dissidents.
“Victims of abuse, trans people, queer people who are not able to be safely ‘out,’ and performers alike need to be able to socialize, connect, and build communities on social media safely,” said Grace. “By forcing us to use our ‘real’ names, it opens the door to harassment, abuse, and violence.”
These situations have raised many troubling questions. What ethical obligations does Facebook, or any other company, have to their users? Should Facebook have conducted such a controversial experiment, considering that users didn’t have a chance to opt out? How can a company balance conflicting arguments when creating user policies? In both situations, what more could Facebook have done to ensure they were not putting their users at risk?
Facebook’s recent gaffes compel us to take a closer look at the question of moral obligation in technology, and in the sections that follow we’ll explore that question more deeply with regard to A/B testing, the dark alleys of UI design, and the “new” skeuomorphism.
A/B testing is the Holy Grail of software optimization: implement a feature, put it out there, pick two or more variants and test them live. Users are rarely informed about being a part of the test.1 Usually, the purpose of A/B testing is to understand which UI language yields the highest usability, the highest click-through rate, and the least confusion. In sum, A/B testing focuses on optimizing user behavior in order to reach the goals set out by the product.
In a world of continuous public beta testing, where time to market is the most important success metric, A/B testing seems to be the fastest and most cost-effective way to optimize a product.
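As a rough illustration of the mechanics described above (a sketch, not any particular company’s system), a minimal A/B setup might deterministically bucket each user into a variant and then compare click-through rates per variant. The experiment name, variant labels, and function names here are all hypothetical:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants: list[str]) -> str:
    """Hash the user id together with the experiment name into a stable
    bucket, so a given user always sees the same variant across sessions."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

def click_through_rate(clicks: int, impressions: int) -> float:
    """Fraction of impressions that led to a click, per variant."""
    return clicks / impressions if impressions else 0.0

# Hypothetical experiment: which purchase-button treatment gets more clicks?
variants = ["control", "green_button"]
variant = assign_variant("user-42", "checkout_button_test", variants)
# Deterministic bucketing: the same user always lands in the same variant.
assert variant == assign_variant("user-42", "checkout_button_test", variants)
```

Deterministic hashing (rather than random assignment at each visit) matters for exactly the reason discussed above: an interface that silently changes between sessions alters user behavior on its own, confounding whatever the test is trying to measure.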
Do designers and developers who A/B test have a moral obligation to their users? As mentioned before, A/B testing focuses on optimizing behavior. In human psychology, behavior is fueled by a plethora of interconnected factors, and the tool the user interacts with is certainly one of them. If its interface is being changed in real time, especially without letting the user know, behavior alters accordingly. And it is safe to say that altered behavior leads to different emotional responses and changes users’ ability to identify with the tool and the ideas behind it. Therefore, A/B testing indirectly affects user emotions, moods, and reactions.
We can’t judge whether that is good or bad. We can only ask designers and developers to be aware of this connection and to conduct their A/B tests with clarity of purpose: to optimize the user interface toward the most positive, efficient, and productive user experience possible, not to manipulate users into doing what you want them to do. It’s a thin line, and it is up to testers and test subjects to agree on a fair balance.
Dark UI patterns are the taboo of UI design2. Every time we designers are presented with a situation where a Dark UI pattern would make the design work, it’s both a cause for excitement and a reason for a metaphorical glance over our shoulders. Wouldn’t it be awesome to overcome all my challenges in this piece of design with one simple step that locks the user down and shoves the information my boss wants to convey down their throat?
Boy, wouldn’t that be beautiful.
In short, Dark UI patterns are methods that let designers lead users to a place they want them to go (in order to meet a specific goal) without providing an obvious way out.
One of the best examples is Amazon’s shopping cart. Try to find a way to get back to Amazon’s home page from the “Review your order” page:
Hint: clicking the big Amazon.com logo at the top doesn’t do anything.
UI design is a balancing act, a meta-discipline at the intersection of all the conflicting aspirations within the product creation process. Finding the right balance between design (which usually has very idealistic aspirations) and business (which has its feet on the ground and wants to see numbers) quite often leads to friction among designers, developers, marketing, and the CEO. UI design is thus a constant battle between carefully optimizing the design language to elicit a positive user experience and making compromises to meet business goals.
Designers pride themselves on creating solutions that gently nudge the user toward the goal of a particular workflow. It can be as rudimentary as using a contrasting color for a “Purchase” button, or using a filling progress bar to convey that an image is uploading and the user needs to wait. All these tiny details help lead the user through the workflow and help establish the best usage patterns. This leads to what seems to be the Holy Grail of contemporary UI design: a delightful user experience.
But is there a dark side to it? Even if a designer avoids dark patterns (using them obviously puts one’s work in a moral grey area right away), one has to be aware of the moral implications of UI design overall. Nudging appears benign, but only up to a point. After all, everyone remembers Clippy.
I can hear your teeth grinding.
But let’s put the pitchforks away for a moment and answer this: did Microsoft’s designers have evil intentions when they created Clippy? Did they make it deliberately to screw with your day? Microsoft isn’t that evil. This is simply a case of suggestive UI design taken too far. It was intrusive to the point that it altered user behavior and negatively affected users’ emotional responses.
Designers should constantly ask themselves: how is this going to affect a user’s mood? Is it going to sway the user in a particular direction? Am I trying to get them to do what I want, or am I building an environment that lets them be creative and/or efficient in whatever way they see fit? Am I entitled to design things that will change the user’s emotional response?
I think the invisibility design principle is more relevant today than it was when it was originally articulated. Contemporary UI design seems to focus on creating opportunities for delight. It’s built to make a user go “Wow, these animations are awesome,” to make using apps a fun and therefore unforgettable experience. Isn’t that what apps like Path or Paper bank on? There is no doubt that the battle for user attention is ongoing and fiercer than ever. However, is delightful UI the right way to capture that attention? Is delightful, suggestive, fun UI a value in itself, and enough to claim innovation?
Moral implications, as well as the invisibility principle, suggest otherwise. The invisibility principle says that UI design should be merely a means that enables the user to interact with a machine and get the most out of it. In this context, UI design should blend into the background and become invisible as quickly as possible: present value, demonstrate the suggested optimal way to interact, and leave the room. There is no benefit in a UI so laden with ornaments and delight-inducing moments that it becomes a chore to actually use.
With the advent of iOS 7, iOS 8, and Material Design, we’ve entered a new era of skeuomorphism. In the past, ornaments were visual in nature: photo-realistic buttons, lavish textures, page tears, and so on. Today, skeuomorphism occurs in motion: we’ve fallen in love with UIKit Dynamics, and everybody’s making things bounce, pop, and fly, as if UI elements were plastic bits on rubber bands. That’s not inherently negative (any ornament used sparingly adds value), but I would argue that motion skeuomorphism poses a larger risk to usability than visual skeuomorphism does. Motion happens over time, and if the UI locks the user out in an attempt to force them to watch a delightful animation, we end up with far more frustration.
It’s true that an overly skeuomorphic UI can be confusing and force users to work harder at building muscle memory. A button that looks like something else but behaves consistently is something a user can figure out after a few tries; users are conditioned to put up with a lot, let’s be honest. However, if the user knows that pressing a button will trigger a long and useless animation, they are discouraged from pressing it at all. Lavish motion design can keep users from delving into the application at all. In other words: you don’t even get a chance to prove yourself.
Facebook’s blunders have provided a useful catalyst for examining how the sticky issues of ethics, optimization, and emotional manipulation are entwined, and how they are handled through the filter of modern technology. As tech companies grow more powerful and the average user grows more accustomed to surrendering private data, it is an issue that needs to be considered more frequently. Although tech giants generally have well-intentioned external goals, whether usability or user-experience optimization, moral questions are still going to come into play. User interfaces and applications are not just vessels for talent and dedication to detail; they are creations with a purpose: making people’s lives more efficient. Although it seems Facebook needs a reminder, at the end of the day creators are still responsible for not inflicting emotional damage on the very users they are supposed to be helping.
1. That is a challenge in itself, and one psychologists have battled for ages: it is well established that a test subject who knows they are being tested behaves differently, which usually invalidates the results.↩
2. Please note that I’m not using the “UX Design” term purposefully, because I don’t believe such a discipline exists.↩