Starting with iOS 7, Apple began automatically sorting users' iPhone photos based on time taken and location. But a new patent application reveals the company is interested in taking it one step further, and allowing Siri to sort through iOS photos based on voice search and tagging.

The company's interests were revealed in a new patent application published by the U.S. Patent and Trademark Office on Thursday. The filing, entitled "Voice-Based Image Tagging and Searching," describes associating "natural language" text strings with photographs saved on a device, like an iPhone.

Such text strings would be associated with speech input, much like users can access data on their iPhone by speaking in plain conversation to Siri, the device's voice-driven personal assistant. Text strings associated with photos could cover an entity, an activity, or a location, according to the filing.

Apple's application notes that the growing volume of photos users collect on devices like the iPhone makes those libraries increasingly hard to sort through. The company notes that tagging photos with the names of people or places makes it easier for users to find what they are looking for.

"Apple's system would allow users to use their voice to tag and search photos based on locations, people's names and more."

Apple's system would also allow users to tag photos with their voice. In one example provided by Apple, a user tells their device, "This is me at the beach," and the corresponding picture is tagged accordingly.

The proposed invention could even automatically tag corresponding photos, based on the time and location at which they were snapped, to make it easier for users to sort their pictures and not require them to individually tag each picture.
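The tagging flow described above can be sketched in a few lines. This is a hypothetical illustration, not code from Apple's filing: all names are invented, and a set of stop words stands in for the filing's natural-language processing. A spoken phrase is reduced to tag strings and attached to a photo, while auto-tags are derived from the photo's own time and location metadata.

```python
from dataclasses import dataclass, field

@dataclass
class Photo:
    """A photo with capture metadata and natural-language tags (hypothetical model)."""
    filename: str
    year: int
    location: str
    tags: set = field(default_factory=set)

def tag_from_speech(photo, phrase):
    """Associate the meaningful words of a spoken phrase with the photo,
    dropping filler words (a crude stand-in for real language parsing)."""
    stop_words = {"this", "is", "at", "the", "a", "an", "of"}
    for word in phrase.lower().split():
        if word not in stop_words:
            photo.tags.add(word)

def auto_tag(photo):
    """Tag a photo from its own capture metadata, so the user
    need not label every picture individually."""
    photo.tags.add(str(photo.year))
    photo.tags.add(photo.location.lower())

beach_pic = Photo("IMG_0042.jpg", 2013, "Malibu")
tag_from_speech(beach_pic, "This is me at the beach")
auto_tag(beach_pic)
# beach_pic.tags now includes "me", "beach", "2013" and "malibu"
```

In this sketch the voice tags and the automatic metadata tags land in the same set, which is what would let a later search treat "beach" and "2013" interchangeably.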


Apple's system could even recognize faces, buildings or landscapes to tag similar photos. For example, once a user tells Siri that they appear in a photograph, the system could intelligently tag other photos that capture the user's face.

With photos properly tagged, users could then use their voice in a similar manner to search for the pictures they are looking for. In another example, a user asks their device, "Show me photos of me at the beach," and related items are delivered.


Apple began automatically sorting pictures with a revamped Photos application that debuted in iOS 7 this year. Starting at a macro level, photos are presented in tiny thumbnails based on the year they were captured, and users can zoom in to find their photos sorted based on date and location.

Using this automatic date and location data alone, Apple could apply its proposed invention to let Siri sort through pictures. For example, saying "Show me photos from Hawaii taken in 2012" could present users with relevant photos, even without the need for tagging faces.
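The search side could then amount to matching query terms against each photo's tag set. Again a hypothetical sketch, not Apple's method: stop words crudely stand in for Siri's query parsing, and the "Hawaii taken in 2012" example works against automatically generated tags alone.

```python
def search(photos, query):
    """Return photos whose tags contain every meaningful word
    of the spoken query (a crude stand-in for Siri's parsing)."""
    stop_words = {"show", "me", "photos", "from", "taken", "in", "of", "at", "the"}
    terms = {w for w in query.lower().split() if w not in stop_words}
    return [p for p in photos if terms <= p["tags"]]

# A tiny library: one voice-tagged photo, one carrying only auto-generated tags.
library = [
    {"file": "IMG_0042.jpg", "tags": {"beach", "2013", "malibu"}},
    {"file": "IMG_0107.jpg", "tags": {"2012", "hawaii"}},
]
hits = search(library, "Show me photos from Hawaii taken in 2012")
# hits contains only the 2012 Hawaii photo
```

Because the subset test requires every query term to match, narrower queries ("Hawaii 2012") and broader ones ("2012") both behave sensibly without special casing.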

The application, first made public this week, was filed by Apple with the USPTO in March of this year. It's credited to Jan Erik Solem and Thijs Willem Stalenhoef.