
Does Your Phone Know if You're Sick?

Just by looking at photographs, “naive observers” in a Swedish study were able to determine – at a rate better than chance – which participants had been infected with an E. coli endotoxin and which had not.

The visual cues that gave away “acutely sick” individuals included pale lips and skin, a more swollen face, droopy mouths, hanging eyelids, redder eyes, less glossy skin, and a general appearance of being tired.

But could facial recognition software do even better? DW spoke with biometrics expert Kevin Bowyer to find out more.

DW: Is the technology there for my smartphone to figure out, ‘Hey, this guy looked alright for the last 3 weeks, but now he’s coming down with some sort of illness?’

Kevin Bowyer: I actually think that that work is in progress right now, and that you will see results in the next year or two. Not just in terms of physical health, but mental health. There are groups that are looking at the time sequence of face imagery – do you appear less alert, more stressed, sort of thing. So [they’re] monitoring your psychological well-being as well as your physical well-being from the time sequence of face images. You might also get the person’s pulse rate, or their respiration rate, from a face video. So you might have additional information beyond just the face … which might be useful for truck drivers or heavy equipment operators. You would pass an alertness test, or “healthiness test,” before you operate certain types of equipment.
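The pulse-from-video idea Bowyer mentions is commonly done with remote photoplethysmography: the average green-channel brightness of the face flickers slightly with each heartbeat, and the dominant frequency of that signal in the plausible heart-rate band gives the pulse. A minimal sketch of the idea, using synthetic data in place of real video frames (the frame rate, signal amplitudes, and 0.7–4 Hz band here are illustrative assumptions, not details from the research groups he refers to):

```python
# Hedged sketch: estimating pulse rate from the mean green-channel
# intensity of a face region over time (remote photoplethysmography).
# The data below is synthetic -- a real pipeline would fill
# `green_means` with per-frame values from face-cropped video.
import numpy as np

fps = 30.0                      # assumed camera frame rate
t = np.arange(0, 10, 1 / fps)   # 10 seconds of samples
heart_hz = 1.2                  # 72 bpm, used only to synthesize data

# Stand-in for the mean green intensity of the face in each frame:
# a faint heartbeat ripple on top of noise.
rng = np.random.default_rng(0)
green_means = (100
               + 0.5 * np.sin(2 * np.pi * heart_hz * t)
               + 0.1 * rng.standard_normal(t.size))

# Detrend, then pick the dominant frequency in the plausible
# heart-rate band (0.7-4 Hz, i.e. roughly 42-240 bpm).
signal = green_means - green_means.mean()
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(signal.size, d=1 / fps)
band = (freqs >= 0.7) & (freqs <= 4.0)
peak_hz = freqs[band][np.argmax(spectrum[band])]
bpm = 60 * peak_hz              # recovers ~72 bpm here
```

Respiration can be estimated the same way by searching a lower band (roughly 0.1–0.5 Hz); real systems also have to handle motion, lighting changes, and face tracking, which this sketch ignores.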

Would all that data allow health authorities to say, “Wow, looks like people in that city/region are getting the flu for some reason”?

I think the data would be useful. But are people signing on to have their longitudinal health condition monitored by the smartphone and reported to somebody? Whether it’s Apple or the Centers for Disease Control and Prevention, I think there’d be some privacy discussions.

But if it’s not my smartphone – if it’s a camera as I walk into my office every morning, or cameras placed in retail shops that aren’t under my control – do you see ethical concerns there?

Let’s say at the shopping mall I’ve got the same cameras that are up now that are telling me that a young Asian female is walking up to the counter. And maybe they display an advertisement that’s appropriate for that demographic. But now it’s saying something about [her] health condition and reporting that to someone. I think people have realized that they’re getting advertisements tailored to them. [But] we’re potentially crossing a line when you begin to report semi-anonymous health information to another party.

Are we heading toward a future where make-up companies sell a product, like foundation, that changes your face and makes it unrecognizable to your own phone or to other cameras?

[Laughs.] Well we certainly talk about make-up confusing face recognition algorithms and making it harder. And you could kind of intentionally make it harder in some ways. So, certainly, you could use foundation or something to cover up the lack of face color. You could probably also, if you wanted the day off, you know, put the right make-up on to make you look like you were sick and take the selfie and say, “Ooh, I can’t come in today, I’m contagious!”

Kevin W. Bowyer is a professor of computer science and engineering at the University of Notre Dame in the US. His research interests include computer vision, pattern recognition, biometrics, data mining, object recognition and medical image analysis.

Published originally:
LadyClick (Jan. 3, 2018)