Even though the physical shape of the smartphone hasn’t changed much since the first iPhone in 2007, user behavior has evolved. Users have shifted, for instance, from entering a six-digit PIN to expecting fingerprint or face recognition for security. Mobile apps are far more physical, tactile tools than most people realize. While a smartphone screen is a flat plane, mobile apps operate in three dimensions, which requires developers to consider how users interact with an app just as much as they consider the tasks the app performs.

Users Are All Thumbs

Smartphones have gotten larger, providing more screen space for apps to use, yet screen real estate is still a valuable commodity. This is especially true as devices begin to lose physical buttons. Larger screens give mobile developers more opportunities, but they also create obstacles for users whose hands haven’t grown to accommodate the change. As far back as 2013, 49% of smartphone users had already adopted the habit of operating their devices with one hand. Since so many users prefer the one-handed approach, the challenge larger devices present is designing apps whose primary actions can be performed from the lower third of the screen.
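
As a rough illustration, here is a minimal SwiftUI sketch of that idea: scrollable content fills the screen while the primary actions stay pinned to the bottom, within one-handed thumb reach. The view and button names are invented for the example, not taken from any particular app.

```swift
import SwiftUI

// A minimal sketch: primary actions pinned to the bottom of the screen,
// inside comfortable one-handed thumb reach. The names here are
// illustrative placeholders, not part of any real app.
struct ThumbFriendlyScreen: View {
    var body: some View {
        ScrollView {
            // Content the user reads lives up top; it only needs scrolling,
            // not precise taps.
            ForEach(0..<20) { index in
                Text("Item \(index)")
                    .frame(maxWidth: .infinity, alignment: .leading)
                    .padding()
            }
        }
        // Pin the most frequently tapped actions to the lower third of the screen.
        .safeAreaInset(edge: .bottom) {
            HStack {
                Button("Refresh") { /* reload content */ }
                Spacer()
                Button("Compose") { /* start a new item */ }
            }
            .padding()
            .background(.thinMaterial)
        }
    }
}
```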

Advancements in Smartphones and Apps

Apple was the first to introduce an all-screen design with gesture-based navigation, debuting it on the iPhone X in 2017. According to the Apple press release, the new iOS was “redesigned to take full advantage of the Super Retina display and replaces the Home button with fast and fluid gestures, allowing customers to naturally and intuitively navigate the iPhone X.” While the phone’s navigation was mostly gesture-based, apps still needed to catch up, and even Android was still having some issues making the transition to a buttonless design this year.

Many apps, such as Gmail, Facebook, and Instagram, have already adopted gestures. For example, the Gmail mobile app allows users to mark emails as read, set reminders, and refresh the inbox, all through swipe and pull gestures that are now familiar across many apps. However, the move towards more gestures and fewer buttons will demand creative, one-handed, or more accurately, one-thumb-enabled actions.
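
To make that concrete, the sketch below shows how a Gmail-style inbox might expose those gestures in SwiftUI: a trailing swipe marks a row as read, and pulling the list down refreshes it. The Email model and loadLatest() function are assumptions invented for the example, not Gmail’s actual code.

```swift
import SwiftUI

// A sketch of Gmail-style gestures: swipe a row to mark it read and pull the
// list down to refresh. `Email`, `markRead`, and `loadLatest` are illustrative
// assumptions, not Gmail's actual API.
struct Email: Identifiable {
    let id = UUID()
    let subject: String
    var isRead = false
}

struct InboxView: View {
    @State private var emails = [
        Email(subject: "Welcome aboard"),
        Email(subject: "Your invoice is ready")
    ]

    var body: some View {
        List {
            ForEach(emails) { email in
                Text(email.subject)
                    .fontWeight(email.isRead ? .regular : .bold)
                    // A trailing swipe exposes the most common action.
                    .swipeActions(edge: .trailing) {
                        Button("Mark Read") { markRead(email) }
                            .tint(.blue)
                    }
            }
        }
        // Pull-to-refresh: a gesture users already know from mail apps.
        .refreshable {
            await loadLatest()
        }
    }

    private func markRead(_ email: Email) {
        if let index = emails.firstIndex(where: { $0.id == email.id }) {
            emails[index].isRead = true
        }
    }

    // Placeholder for a network call; here it just simulates a short delay.
    private func loadLatest() async {
        try? await Task.sleep(nanoseconds: 500_000_000)
    }
}
```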

What Does This Mean for Developers?

Developers now face two new challenges when creating apps. The first is convincing clients that buttons are old news and more cumbersome for users than gestures. This will take some persuasion, because tapping buttons has been the cornerstone of navigation for decades. The key to this challenge is demonstrating the value gestures add to the user experience rather than pitching gestures as the cool new feature. Gestures for the sake of using gestures won’t be an effective selling point.

The second challenge is creating intuitive gestures that users can learn easily and quickly. Users are accustomed to apps that provide a stress-free experience. When things don’t behave the way they’re used to, chances are users will abandon the app altogether. While initial usage tutorials are helpful, you don’t want to bombard users with a list of new actions they need to learn in order to successfully interact with a new app.

Some Elements to Consider

Designers are going to play a much larger role in app development than in previous years. UX/UI designers have always contributed key elements to app creation, but designers with strong development knowledge will become the new norm.

Gestures are great for relieving screen clutter, giving clients more space in which to deliver a great user experience, which means more thought should go into the experience as a whole. Animations will play a key role here. Clunky or stiff movements make users think an app isn’t working correctly or that they’re doing something wrong. When adding a new gesture to an app, consider how it will move and affect the “physical” elements on screen.
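
As a small sketch of that idea, the SwiftUI view below lets a card follow the user’s finger during a drag and spring back into place when released; the spring is what keeps the motion from feeling stiff. The DismissableCard name and dimensions are invented for illustration.

```swift
import SwiftUI

// A sketch of tying motion to a gesture: a card follows the finger while
// dragging and springs back when released. The spring keeps the element
// feeling "physical" rather than stiff. `DismissableCard` is an invented
// example name, not a framework type.
struct DismissableCard: View {
    @State private var dragOffset: CGSize = .zero

    var body: some View {
        RoundedRectangle(cornerRadius: 16)
            .fill(Color.blue)
            .frame(width: 300, height: 180)
            // The card tracks the finger 1:1 so the gesture feels direct.
            .offset(dragOffset)
            .gesture(
                DragGesture()
                    .onChanged { value in
                        dragOffset = value.translation
                    }
                    .onEnded { _ in
                        // A spring (rather than a linear snap) reads as
                        // natural, physical motion.
                        withAnimation(.spring(response: 0.4, dampingFraction: 0.7)) {
                            dragOffset = .zero
                        }
                    }
            )
    }
}
```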

One option developers can consider is giving users the ability to customize gestures. Android users can use Quickify, a third-party app that lets them draw custom gestures, such as a figure eight to call a specific contact or an S-shape to open an app. While these gestures trigger general actions on the device, the same opportunities could be implemented for in-app functions. If there’s one thing users like, it’s customization.
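
Full shape-drawing recognition like Quickify’s is beyond a short example, but a much simpler form of in-app customization is letting users choose which action a standard gesture triggers. The sketch below, with invented names like GestureAction and a hypothetical double-tap mapping, shows one way that might look in SwiftUI.

```swift
import SwiftUI

// A sketch of in-app gesture customization: the user picks which action a
// double tap triggers, and the app looks that choice up at runtime.
// `GestureAction` and the action names are invented purely for illustration.
enum GestureAction: String, CaseIterable, Identifiable {
    case archive = "Archive"
    case markRead = "Mark as Read"
    case compose = "Compose"
    var id: String { rawValue }
}

struct CustomGestureSettings: View {
    // Persist the user's choice so it survives relaunches.
    @AppStorage("doubleTapAction") private var doubleTapAction = GestureAction.markRead.rawValue

    var body: some View {
        Form {
            Picker("Double-tap performs", selection: $doubleTapAction) {
                ForEach(GestureAction.allCases) { action in
                    Text(action.rawValue).tag(action.rawValue)
                }
            }
        }
    }
}

struct MessageView: View {
    @AppStorage("doubleTapAction") private var doubleTapAction = GestureAction.markRead.rawValue

    var body: some View {
        Text("A message")
            .onTapGesture(count: 2) {
                // Dispatch whatever the user mapped to a double tap.
                perform(GestureAction(rawValue: doubleTapAction) ?? .markRead)
            }
    }

    private func perform(_ action: GestureAction) {
        // Placeholder implementation for the sketch.
        print("Performing \(action.rawValue)")
    }
}
```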

What’s Next?

The next big step in mobile app user experience will be mind control, of course. However, that’s a distant future, at least ten years away. The more immediate step for developers is to work with UX designers and quality assurance specialists to create an arsenal of gestures they can present to clients, along with a clear explanation of the value those gestures add to the user experience and a strategy for educating users.
