iOS Suggestions: Spotlight and Springboard

Rather than an in-depth editorial, here is a list of suggested improvements to the usability of Spotlight (search) and Springboard (home screen) in iOS 7:

  • Allow Spotlight to be toggled from within a folder.
  • Allow Spotlight to be toggled from within an app, either via a new system-wide in-app gesture or by tucking it into Notification Center or Control Center. Or …
  • Allow Spotlight to be toggled from within the multitasking window.
  • Make a press of the home button drop you back to the last page of apps you were on (or to page one, if you are already on a different page of apps). Being dropped into a folder or some other sub-view (Newsstand) is the wrong action. Page one (home) should never be more than two taps of the home button away.
  • Decide what Newsstand is (an app? A folder?) and make it show up in Spotlight.
  • Make app and iCloud data searchable in Spotlight. PDFs, metadata, tags, file types, text … just like on the desktop, these things should be found when searching in Spotlight, no matter which app they live in. (A sketch of how an app could opt its content in follows this list.)
  • Give folders a static indicator of their place on the home screen. The iOS 6 style of a split screen with a notch was more intuitive and obvious than the zoom animation alone in iOS 7.
  • Correct the accuracy and/or velocity of apps when scrolling side to side in the multitasking view. Or maybe just slow it down!
  • Make it more obvious how to quit apps from the multitasking view.
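
On the point about searchable app data: Apple did eventually ship an API in this direction (Core Spotlight, introduced in iOS 9), though each app has to opt its content in. A minimal sketch of that opt-in, in Swift, with purely illustrative identifiers and attribute values:

    import CoreSpotlight
    import UniformTypeIdentifiers

    // A sketch: an app opting one of its documents into Spotlight search.
    // The identifiers and attribute values are illustrative, not real data.
    func indexDocumentForSpotlight() {
        let attributes = CSSearchableItemAttributeSet(contentType: .pdf)
        attributes.title = "Quarterly Report"
        attributes.contentDescription = "A PDF stored in this app's iCloud container"
        attributes.keywords = ["pdf", "report", "tags"]

        let item = CSSearchableItem(uniqueIdentifier: "document-42",
                                    domainIdentifier: "documents",
                                    attributeSet: attributes)

        CSSearchableIndex.default().indexSearchableItems([item]) { error in
            if let error = error {
                print("Spotlight indexing failed: \(error.localizedDescription)")
            }
        }
    }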

iOS Suggestion: Photo Selection

When you want to use a photo in iOS, the typical path is to tap a camera or share icon, then pick one of two options: take a new photo or choose from the library. What’s needed is a third standard option: use the last photo added to the camera roll. While some apps have manually coded a solution to this iOS oversight, it really ought to be a system-wide default option. This addition to the OS would let any app make choosing a photo faster and easier.
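
For reference, the per-app workaround looks roughly like the sketch below, written against the modern Photos framework; it assumes photo-library permission has already been granted, and the function name is made up for illustration.

    import Photos
    import UIKit

    // Fetch the most recently added photo in the library and hand back a UIImage.
    // Assumes the app has already been granted photo-library access.
    func fetchLastPhoto(targetSize: CGSize, completion: @escaping (UIImage?) -> Void) {
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        options.fetchLimit = 1

        guard let asset = PHAsset.fetchAssets(with: .image, options: options).firstObject else {
            completion(nil)
            return
        }

        let requestOptions = PHImageRequestOptions()
        requestOptions.deliveryMode = .highQualityFormat

        PHImageManager.default().requestImage(for: asset,
                                              targetSize: targetSize,
                                              contentMode: .aspectFill,
                                              options: requestOptions) { image, _ in
            completion(image)
        }
    }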

iOS 7 Icon Suggestion: Siri and Voice Memos

The icon changes in iOS 7 have not been without controversy. After living with iOS 7 for several months now (since upgrading to the iPhone 5S), a sense of acceptance and familiarity with the new look has set in. For better or for worse, the icons just live amongst each other without much thought from the user. Two icons in particular feel out of place, though; reversed, even. The icons for Voice Memos and Siri would be more effective if switched.

The new icon for Voice Memos in iOS 7 features a linear graph of sound, while Siri went with a microphone. This seems like the wrong use of two great icons. Voice Memos doesn’t just look at your voice, it records the audio. Like a microphone does. And Siri doesn’t record your voice for playback later; it listens to your voice, like an analyzer. Siri even betrays this mix-up, as it uses both the microphone icon and an animated sound wave when listening to your voice.

Hopefully someone at Apple with the power to act will feel the same way, and swap these two icons to their logical places.

iOS Suggestion: Bright Idea

This would be a nifty trick for iOS: adjust the screen brightness based on input from the ambient light sensor, accelerometer and gyroscope.

By using the sensors to detect a finger tapping on the ambient light sensor, the screen brightness could adjust in graduated steps. This would allow for a shortcut-style gesture to quickly control screen brightness.
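
There is no public API for reading the ambient light sensor directly, but the accelerometer half of the idea can be sketched today with Core Motion and UIScreen’s brightness property. The sketch below only detects that a tap happened, not where it landed; knowing the finger is covering the light sensor is exactly where the ambient light reading would come in. The threshold, debounce interval, and step size are illustrative guesses.

    import CoreMotion
    import UIKit

    // A rough sketch: watch for a sharp, tap-like accelerometer spike and
    // step the screen brightness in quarters.
    final class TapBrightnessController {
        private let motion = CMMotionManager()
        private var lastTap = Date.distantPast
        private let tapThreshold = 1.5        // total g-force treated as a "tap"
        private let step: CGFloat = 0.25      // brightness increment per tap

        func start() {
            guard motion.isAccelerometerAvailable else { return }
            motion.accelerometerUpdateInterval = 1.0 / 100.0
            motion.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
                guard let self = self, let a = data?.acceleration else { return }
                let magnitude = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
                // Debounce so one physical tap does not fire several times.
                if magnitude > self.tapThreshold,
                   Date().timeIntervalSince(self.lastTap) > 0.5 {
                    self.lastTap = Date()
                    self.bumpBrightness()
                }
            }
        }

        private func bumpBrightness() {
            // Step up in quarters, wrapping back to dim once fully bright.
            let next = UIScreen.main.brightness + step
            UIScreen.main.brightness = next > 1.0 ? step : next
        }

        func stop() {
            motion.stopAccelerometerUpdates()
        }
    }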

iOS Store Suggestion

Why is it impossible for the App Store (actually, all of Apple’s stores) to remember my place? It’s one of the more frustrating quirks of the store experience.

If I am in a list – “Top Free Apps”, for instance – and tap an app to get a better look, going back a screen to the list dumps me at the top, not where I left off. This is the opposite of how it should work; the list needs to remember my place on the page. When you constantly have to start over, you are much less likely to keep using the App Store app to discover apps. I know I only like to go in when I know exactly what I am looking for (and even then it takes far too long).

iOS Suggestion

Here is a logical failure of iOS that has always bugged me. When exiting an app by pressing the home button, why does it drop you back to an open folder and not the home screen? It seems like there’s potential for a person to be lost when pressing the home button doesn’t take them back home. At best, it’s just annoying.

iOS Keyboard Suggestions

Two pet peeves about the iOS keyboard interface, both relating to selection.

One: the loupe. I still remember reading a rumor, pre-iPhone, describing the on-screen magnifying glass used to place the cursor. It was all but unimaginable until you used it; then having an on-screen loupe made obvious sense. The loupe needs to be engineered to be more intelligent in edge cases, such as lines of text near the top of the screen. Having the loupe go off-screen is no help at all. It wouldn’t hurt, either, if the loupe were a bit bigger or offset more, to see around one’s finger better.

Two: arrow keys. Rather than leaving great apps like Byword to code their own solutions, the iOS keyboard could have arrow keys built in for fine cursor placement. I’ll admit, I don’t know where you could squeeze them in on an iPhone in portrait mode without resorting to some sort of two-finger shift-select gesture or something else equally strange-sounding. Regardless, a solution beyond the loupe for fast and accurate cursor movement would be great.
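
The kind of per-app workaround alluded to above looks roughly like this: a keyboard accessory bar with two arrow buttons that nudge a text view’s insertion point one character at a time. This is a sketch, not any shipping app’s implementation, and the names are made up.

    import UIKit

    // A keyboard accessory bar whose arrow buttons move the insertion point.
    final class ArrowKeyAccessory: NSObject {
        private weak var textView: UITextView?

        init(textView: UITextView) {
            self.textView = textView
            super.init()

            let bar = UIToolbar(frame: CGRect(x: 0, y: 0, width: 320, height: 44))
            bar.items = [
                UIBarButtonItem(title: "←", style: .plain, target: self, action: #selector(moveLeft)),
                UIBarButtonItem(barButtonSystemItem: .flexibleSpace, target: nil, action: nil),
                UIBarButtonItem(title: "→", style: .plain, target: self, action: #selector(moveRight))
            ]
            bar.sizeToFit()
            textView.inputAccessoryView = bar
        }

        @objc private func moveLeft()  { nudge(by: -1) }
        @objc private func moveRight() { nudge(by: 1) }

        // Shift the (collapsed) selection one character, clamped to the text bounds.
        private func nudge(by offset: Int) {
            guard let tv = textView else { return }
            let length = tv.text?.utf16.count ?? 0
            let location = min(max(tv.selectedRange.location + offset, 0), length)
            tv.selectedRange = NSRange(location: location, length: 0)
        }
    }

A view controller would keep a strong reference to the accessory object, since the toolbar buttons do not retain their target.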

A smarter loupe, combined with arrow keys, would make for a much more fluid and uninterrupted typing experience on iOS.

iOS Store Suggestions

Speed and search.

The app stores have always been web-based sites. For all of Safari’s touted speed improvements, however, the interface to Apple’s stores has always been slow – way too slow, especially on iOS. A focus on the customer experience does not mean removing third-party apps that provide a better store experience than yours; it means making those apps irrelevant by making the speed, responsiveness, and search of your store vastly superior.

A search for the term “Camera Noir” on Google or Bing takes a fraction of a second, and it correctly returns the expected result – a direct link to the app in the App Store – as the top hit. Google clocked its results at 0.28 seconds. The App Store took 12 seconds to return results for the same search term. Why is the web providing a superior experience to a native app?

Google also provided instant and relevant suggested searches below the text box. The App Store provides instant, though questionably relevant, suggested search terms. With the full term “Camera Noir” typed in, the suggestions were still seemingly random camera-related terms. An interesting side note: subsequent searches for “Camera Noir” were relatively faster (six seconds) and produced more relevant suggestions (“Camera Noir” for “camera noir”, for example). Search needs to be right the first time, not cached or trained for all those subsequent times you’ll be searching for an app you theoretically already downloaded.

Google is also able to correctly interpret misspellings such as “cmera noire”, whereas the iOS App Store app provides no suggested results – nor any results at all – for that or similar misspellings. Not even the infamous iOS autocorrect can assist here; there is no option below the text box suggesting a correction for seemingly easy-to-guess words.

Without delving into unfamiliar territory, suffice it to say that developers have their own issues with speed and search in the App Store. Released a great new app? It may take hours after release before it shows up in search results, even when the search is for the app’s exact title.

The App Store has been a tremendous success and an industry-changing phenomenon, but there’s still plenty of room for improvement, no matter who you ask. Large philosophical debates can be had over ideas like app discovery and pricing, but the basics of speed and search still have a long way to go.

Parallels Debuts New iOS App

Parallels announces Access: their vision for a better remote desktop experience on iPad.

I am curious whether we could see a new trend – putting that old PC or Mac to use as a sort of home server that handles media streaming, gaming (see the Nvidia Shield Play PC Beta), and remote desktop. With the new AirPlay Display in Mavericks, perhaps Apple should be pursuing a new kind of Mac OS X/iOS integration like what Parallels and Nvidia are doing.

Let’s just all agree not to encourage the use of bad marketing terms (I’m looking at you, ‘applify’).