Why We Think Canon’s PHIL AI Is Being Used All Wrong

Canon has a new technology called PHIL AI, but it’s not helping you become a better photographer. 

How many times have you sat there and said you'll worry about a problem later on? We all do it. This is what Canon's PHIL AI reminds me of. My mom used to put off repairing the family car; eventually, it just stopped working. Similarly, my uncle used to put off dealing with health issues, and that landed him in the ER. And on a less serious note, I had a former staffer who used to put off using a critical vision care benefit. We as a society are just trained to procrastinate and worry later. And that's the mentality I feel Canon's new PHIL AI is catering to.

AI in Photography Ratings and Ethics

Canon isn't alone in offering technologies like this. Canon PHIL is essentially an AI photo-rating app. And that's awesome for consumers: it will help them pick the best photo, since they tend to shoot recklessly. From an ethical standpoint, I commend Canon on this. They're currently pitching it to consumers, which is a much better approach than the one another company took.

That company is EyeEm. Years ago, they were a serious social media app. We partnered with them on a few projects. It was clear that Instagram was playing favorites, and EyeEm seemed a lot friendlier. One year, they released an AI algorithm in their app that let people figure out which of their photos were best, based on composition, sharpness, etc. Honestly, it seemed more advanced than Canon's PHIL is. But then they released a magazine that shocked me. It portrayed the company's photo editors as nothing more than brains running algorithms. Indeed, it seemed that EyeEm's AI had learned from these folks and stolen their rating habits. How could you make such a dehumanizing decision? Later, I was informed that EyeEm was shopping the technology around to publications. Specifically, it was being pitched to replace photo editors. And that's where I drew the line.

Thankfully, PHIL isn’t nearly as unethical. And Canon should be praised for this. What’s more, I think they should have gone even further!

How Canon PHIL Works

Canon’s PHIL stands for Photography Intelligence Learning, and it works by doing one of the most dreaded parts of photography: culling. You essentially feed PHIL your photos. It digests them and keeps the ones it thinks are good. The rest are just wasted and flushed away. But unlike you who needs magnesium and Vitamin B, PHIL craves other things. According to Canon’s press release, it seeks out sharpness, noise, emotions, and closed eyes. Indeed, PHIL is focused on the post-production side.
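To make the culling idea concrete: Canon hasn't published how PHIL actually scores images, so the attributes, weights, and threshold below are purely hypothetical, but a culling pass over the criteria the press release names (sharpness, noise, emotion, closed eyes) might be sketched like this:

```python
# Hypothetical sketch of a PHIL-style culling pass. The attribute
# names, weights, and threshold are invented for illustration; Canon
# has not disclosed its actual scoring model.

from dataclasses import dataclass

@dataclass
class Photo:
    name: str
    sharpness: float   # 0.0 (blurry) to 1.0 (tack sharp)
    noise: float       # 0.0 (clean) to 1.0 (very noisy)
    emotion: float     # 0.0 (flat expression) to 1.0 (strong expression)
    eyes_open: bool

def score(photo: Photo) -> float:
    """Combine sharpness, noise, and emotion into one keep/toss score."""
    s = 0.4 * photo.sharpness + 0.3 * (1.0 - photo.noise) + 0.3 * photo.emotion
    if not photo.eyes_open:
        s *= 0.5  # closed eyes are heavily penalized
    return s

def cull(photos, threshold=0.6):
    """Keep only the photos scoring above the threshold; 'flush' the rest."""
    return [p for p in photos if score(p) >= threshold]

shots = [
    Photo("keeper.cr3", sharpness=0.9, noise=0.1, emotion=0.8, eyes_open=True),
    Photo("blink.cr3", sharpness=0.9, noise=0.1, emotion=0.8, eyes_open=False),
]
print([p.name for p in cull(shots)])  # -> ['keeper.cr3']
```

The key idea is that the blink shot is technically identical to the keeper, yet the closed-eye penalty alone is enough to flush it, which matches how PHIL is described as treating closed eyes as a first-class criterion.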

What you feed PHIL will determine a whole lot. Think of stuffing your face with a ton of spinach: you'll get lots of great nutrients. Likewise, if you feed PHIL lots of great images, it will probably rate many of them very positively. But there are also times when you shovel tons of onion rings into your mouth, and in PHIL's case, little good will come of that.

I understand that not many folks think like me. I’ve always been a person to eliminate a problem at the source. However, PHIL isn’t catering to folks like me.

“While we don’t have immediate plans for incorporating the AI utilized in the app into our cameras, we are always looking to help our customers improve their photographic skills to enable them to become better creators,” says Theresa Mattia, Marketing Specialist, Imaging and Technologies Communications Group, Canon U.S.A. Inc. She continued to state that faces are at the center of PHIL’s functionality. 

“Photo Culling works with faces and subjects that are the central part of the photo. If pets are captured as the main subject in a photo, the app will provide the sharpness and noise scores.”

PHIL works in a sort of odd way with faces, in my opinion. The pandemic proved itself to be a perfect time to work on detecting people wearing masks. But PHIL isn’t really working to recognize someone wearing a mask. Most cameras sort of recognize that you have a face. And if PHIL doesn’t register a face, then it’s going to look at other parameters. “If the subject is wearing a mask, the app can still give scores for open/closed eyes and emotion,” explains Theresa. “If the subject is wearing both sunglasses and a mask, it gets a little trickier since you are not showing an image of a full face. Overall, the score may not be as accurate and will depend on the overall image with the mask and sunglasses.”
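Pulling Theresa's description together, the app's behavior reads like a tiered fallback. Here is a hypothetical sketch of that logic; the function name and return format are invented, and this is in no way Canon's code:

```python
# Hypothetical fallback logic based on Canon's description: sharpness
# and noise are always scored (e.g. for pets as the main subject),
# eye scores need visible eyes, and mask-plus-sunglasses makes the
# emotion score less reliable. All names here are invented.

def available_scores(face_detected: bool, mask: bool = False,
                     sunglasses: bool = False):
    """Return which score categories the app can report for a shot."""
    scores = ["sharpness", "noise"]  # always reported, even without a face
    if not face_detected:
        return scores
    if not sunglasses:
        scores.append("open/closed eyes")  # requires visible eyes
    # A masked face still gets an emotion score; mask plus sunglasses
    # makes things "trickier", so flag that score as less reliable.
    scores.append("emotion (low confidence)" if mask and sunglasses else "emotion")
    return scores

print(available_scores(face_detected=False))
# -> ['sharpness', 'noise']
print(available_scores(face_detected=True, mask=True))
# -> ['sharpness', 'noise', 'open/closed eyes', 'emotion']
```

Under this reading, a mask alone costs you nothing, while sunglasses knock out the eye score and the combination degrades the emotion score rather than removing it.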

Prevent the Problem to Begin With

Those of us wanting to stay anonymous might find this great. In practice, though, PHIL seems to be more of a rating of a camera's performance. If it's indeed learning and retaining this info, it would be brilliant to apply it before pressing the shutter. But while Canon clearly labels this "intelligence learning," the app isn't actually doing machine learning on your photos. "The AI engine is not learning or improving as a user continues to use it," Theresa shares with us. "We are always working to update and add additional features to the app."

By all means, Canon's autofocus in its EOS R lineup of cameras is fantastic. I rarely, if ever, get a shot out of focus. So Canon seems to be farming its efficiency out in a different way. That's cool! I can't count all the times a camera said I was in focus when I wasn't. It happens a lot, and technology like this could make Panasonic's, Leica's, Nikon's, and Apple's cameras better too. By keeping it in post-production, I guess Canon isn't giving its secrets away. Still, I think the B2B side of the business could make great money from this. If everyone shot images that were clearly in focus every time, photography standards would rise, and professional and artistic photographers would need to try even harder to stand out.