Doing it right.

The technologies introduced at this year’s WWDC have gotten the development community extremely excited about the potential to extend iOS in previously unavailable ways. To me, one of the most interesting of these advancements is the creation of a Touch ID API for developers. In its current implementation, Touch ID is somewhat limited, and many users report inconsistency issues with the technology. But if the accuracy improves (likely after a year of further development; reports from iOS 8 beta users are anecdotally positive) and developers capitalize on the ability to easily validate user identity, a lot of new uses come to mind. One of the most interesting and potentially useful so far is the 1Password extension created by AgileBits, which lets third-party developers build support into their apps for 1Password to fill login fields and access other secure data using a fingerprint.
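For the curious, that API lives in the LocalAuthentication framework. Here’s a minimal sketch of how an app might check for and request a fingerprint; the reason string and the fallback handling are illustrative assumptions, not anything prescribed by the framework:

```swift
import LocalAuthentication

func authenticateWithTouchID() {
    let context = LAContext()
    var error: NSError?

    // First confirm the device offers Touch ID and has a finger enrolled.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        // Touch ID unavailable: fall back to a passcode or password prompt.
        return
    }

    // Ask the system to verify the user's fingerprint.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your saved logins") { success, evaluationError in
        if success {
            // Identity confirmed; proceed to the protected data.
        } else {
            // The user cancelled or the scan failed; handle gracefully.
        }
    }
}
```

The important point is that the app never sees the fingerprint itself; it only learns whether the check succeeded.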

There’s another interesting part of the expansion of Touch ID, though. If Apple continues to add it to new devices–which it almost certainly will as the increased sapphire crystal production necessary to support the hardware gets up to speed–then the company finds itself in a unique position as the prevalence of users with Apple IDs tied to credit cards grows. Mobile “wallets” have been slow to gain mainstream acceptance for a variety of reasons, primarily limited mobile hardware support and the expensive point-of-sale hardware required of retailers. Apple is known for making big moves and striking strategic partnerships with a lot of cachet that it can tout at its press events and keynotes. A few of those key partnerships could easily drive awareness of mobile payments for retail purchases firmly into the mind of the mainstream user, instead of letting the idea languish on the bleeding edge. Many retailers already have some kind of iPhone/mobile integration in place, but the last mile–the actual payment–is still primarily a manual affair. Programs exist for users to pre-load loyalty cards (e.g. Starbucks), but the trick is going to be transforming this process from one requiring proactive steps into one that happens reactively at the moment of purchase, while appearing seamless to the typical user, who has far less patience than tech fans for exploring these kinds of things.

The iPhone was the breakthrough device that sold the notion and utility of the smartphone to the general public. It’s the most popular camera on Flickr, and presumably among many other segments of the population as well. Apple commands interest in the public consciousness in a way that few other companies can. It has traditionally focused on platform security with iOS, which it leverages as a selling point against other mobile platforms, most notably Android, and it continues to trumpet privacy and security in consumer-focused materials and media. While that’s a great story in and of itself for many of us, it plants the seed of something larger: Apple is secure, iOS is secure, my iPhone is safe, hence I am safe. When security ceases to be something people need to think about, and is instead easy, obvious, and ubiquitous, resistance to new ways of doing things will evaporate. While there are all kinds of phones with some level of this functionality right now, the iPhone is probably the only consumer hardware device positioned to do this effectively any time soon.

The notion of the Apple ID as a payment mechanism for non-iTunes purchases has been tossed around for a while. None of this is news to anyone. Whenever Apple finally decides to announce that you can use your Apple ID for more than just iTunes purchases by simply touching Touch ID in your favorite retail stores, tons of people will claim to have been predicting it for years. Widespread acceptance won’t be far behind. Critics will bemoan the fact that other phones and platforms did it first, but as with Apple’s previous innovations, the key to success wasn’t being first, it was doing it right. It’s the combination of cultural penetration and acceptance, along with a longtime and public focus on security at a critical time in society, that ensures people won’t dismiss it as a gimmick. The utility will become infectious as people see their friends using the technology, and Touch ID will probably become as ubiquitous as the camera in your phone is now.

I think your thumb is about to become your favorite finger.

A user interface is not like a joke.

In the past week or so, I’ve noticed this sentiment, passed around in various tweets by a bunch of people:

“A user interface is like a joke. If you have to explain it, it’s not that good.”

While I understand the point of the comment and agree with its overall intent, summarily declaring this platitude true discredits innovation and the learning process we all go through as human beings willing to try new experiences.

A UI is a tool; it’s a method for interacting with a software application, the same way a hammer is a method for interacting with nails and wood. If you put a hammer in the hand of someone who has literally never seen one before (I know, stay with me here), is that person going to automatically know how to use it to its greatest efficiency with zero instruction? Perhaps he or she can figure out that the heavy end should be swung at something, but it might be important to mention that you don’t want your fingers in the way when you do. How should the nail be held so that it’s inserted at the right angle and binds the wood properly? What happens if you hammer the nail sideways? What the hell are these claws on the back for (or this little round ball thing)? Throughout our lives, as we engage in new experiences, someone or something is so often there to help us understand as we learn. It’s a natural phenomenon.

No matter how simple you think something is or ought to be, human beings will benefit from guidance. There’s a big difference between showing someone how to create a task in an app like Clear and how to navigate the byzantine menu bars of Excel. Both require explanation, but many would argue that one UI is superior to the other. What about the UI for a machine that performs laser surgery on internal organs? Should that be so elegantly designed that it can simply be “figured out” without an instruction manual? Wouldn’t you prefer to know that the person shooting a laser into you didn’t rely solely on his or her own intuition to ensure that the operation is a success?

It’s a good tweet, and it’s a good idea in a lot of ways. But explaining something new to someone isn’t always a bad thing. Yes, there are thousands of horrible UIs to which this sentiment applies; talk to anyone who uses enterprise software on a regular basis. But don’t be afraid to help your users learn how to best use your app. You can still build in plenty of delightful touches that they can uncover through normal use. If you want to ensure success for your users, give them the tools to understand and achieve it.