Last evening, my 14-year-old commandeered my Mac to hunt down a photo of an elephant walking down the road behind our car. My Mac’s ‘Photos’ app holds nearly 17,000 pictures. Not surprisingly, she didn’t find the elephant. She wasn’t too pleased, and demanded to know why I hadn’t tagged the image.
Trying to think of a shortcut to track down the missing beast, I recalled that the shot was taken on the road to her school, but not the exact date. A quick scroll through the thumbnails wasn’t enough to spot that elephant.
I ended up having to painstakingly scroll through thousands of photos taken over five years. It took nearly half an hour to locate the pachyderm.
It’s a nice shot, considering it was a split-second snap taken by a 10-year-old kid, against the light, and from a moving car. Of course, it would have been infinitely better if she hadn’t beheaded the mahout!
Anyway, the incident made me reflect on ‘photo search.’
I’m aware that ‘photo search’ lags far behind ‘word search.’ After all, a device’s ability to recognise what’s in an image is still a work in progress, with Google Photos currently leading the pack.
At this point, I have a facepalm moment.
I had completely forgotten that I had uploaded my entire library onto Google Photos when they offered unlimited backup with the caveat of a slight reduction in resolution. Google even preserves the EXIF info of the photos. I should’ve searched for the elephant in Google Photos rather than on Photos.
Never mind. I’m curious to know how good Google Photos is at ‘photo search.’ So I open the app on my iPhone and do a quick search for ‘elephant.’ The app offers up eleven photos from my Google Photos library that it believes are elephants. This includes the picture my kid was looking for.
What’s impressive is that Google Photos has used ‘image recognition’ to locate most of these elephant pix. Here is another shot of the same elephant, half hidden and photographed from across the road. As you can see from the photo info, there is no tag or anything else to show it’s an elephant.
The app can also recognise elephants when seen from the back, as well as those that are tagged/named with elephant related terms.
Google Photos did pick up some other pictures for reasons I could only hazard a guess about. Does the black area in this picture resemble the shape of an elephant, with the table leg being the elephant’s trunk or maybe its leg? Can you see it? It’s sure weird trying to get into the head of a machine.
I would say Google Photos is definitely far ahead of Apple’s Photos. The next time I want to find an elusive photo of an animal or object I’ll go straight to Google Photos instead of wasting my time on Photos.
Why not try it now?
I’ll look for ‘dog’ and ‘camel’ and ‘rose’ one at a time in my online library.
As with the elephant, I get a few random images along with what I was looking for in each of my three searches. Not bad at all!
Obviously, Apple’s Photos has a long way to go to catch up with Google Photos.
Photos does have a certain level of searchability via face recognition and location tagging. But neither of them worked with my elephant image. Elephants don’t qualify as people, and their faces don’t make it to my phone’s ‘people’ list. As for the location, the photo was taken near my home so a ‘location search’ threw up nearly half my pictures in Photos.
In such a situation, the choice is either to stick with Google Photos or to tag manually on the iPhone. Unfortunately, manual tagging is not built into iOS. You have to use third-party apps, and I don’t have the patience for that.
What’s needed is something similar to the manual tagging function in my Mac’s Photos app. Basically, I click on a photo on my Mac, open its info (Command-I), and add ‘elephant’ to the keywords section. See below.
The next time I want this photo, all I have to do is search for ‘elephant.’ As you can see, my Mac found it among my images (IMG_2330).
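Under the hood, this kind of keyword search is simple to sketch: a tag index is just an inverted map from keyword to photos. Here is a minimal Python sketch of the idea — the filenames (apart from IMG_2330) and tags are hypothetical stand-ins for a real photo library.

```python
from collections import defaultdict

# Photo -> keywords, as a manual tagger might record them.
# Filenames other than IMG_2330 are made up for illustration.
tags = {
    "IMG_2330.jpg": ["elephant", "road"],
    "IMG_1815.jpg": ["dog"],
    "IMG_2951.jpg": ["rose", "garden"],
}

# Build the inverted index: keyword -> list of photos carrying it.
index = defaultdict(list)
for photo, keywords in tags.items():
    for kw in keywords:
        index[kw].append(photo)

def search(keyword):
    """Return every photo tagged with the given keyword."""
    return index.get(keyword, [])

print(search("elephant"))  # ['IMG_2330.jpg']
print(search("camel"))     # []
```

The lookup is instant no matter how large the library grows, which is exactly why tagging pays off — the labour is all up front, at tagging time.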
Unfortunately, the Photos app on the iPhone does not seem to be able to search on keywords created on my Mac. That’s surprising, as Apple is usually quite painstaking about such details. I sent the photo to my phone and confirmed it has the keyword using an app called Photogene. But a search for ‘elephant’ in Photos showed no such animal in the photo library.
Why doesn’t Apple build manual tagging into iOS?
I can think of two reasons. The first is that more people would learn how much more advanced Google Photos is. The second is that Apple knows people dislike the labour of manual tagging; in other words, it’s not user-friendly. So Apple leaves it to third-party apps in the App Store.
Photos does have a workaround for manual tagging: create an album and group photos in it. But that wouldn’t have worked in this case. This was a random shot, not taggable as ‘Zoo Trip’ or some such album. Besides, after a while, the number of such albums in Photos becomes unwieldy.
Despite all this, I feel manual tagging should be built into iOS until Apple improves Photos’ image recognition ability. Let me explain.
Imagine a library with thousands of unlabelled books, and thousands more being dumped in every year. The more you delay the labelling, the more difficult it will be to tackle the task of labelling the books. Soon it will become impossible to find a book among that unlabelled chaos.
That’s exactly what’s happening in Photos. The more Apple delays tagging, the more difficult it will be to navigate my rapidly growing mountain of untagged pictures and find some elusive picture that I vaguely recall shooting. Which is a pity, as photos are the memory of our era.
On second thoughts, maybe Apple is right. Manual tagging belongs to the age of dinosaurs. Better for Apple to focus its resources on image recognition.
But it might take a while. So I would suggest you upload your library to Google Photos.
You have nothing to lose but your high resolution.