
A.I. technology used to determine beauty

Today, many cosmetic surgeons would pair photos from Analyze My Face with artificial intelligence to create a 3D rendering of their patient's face that realistically portrays movement and expression. Their tools would then analyze the patient's features and report the percentage increase in attractiveness they could achieve for each recommended surgery.

Cosmetic surgery is big business in the United States and across the world. In the United States, doctors performed roughly 17.7 million procedures last year, and the American Society of Plastic Surgeons estimates patients spent approximately $16.5 billion on self-beautification (this figure does not include reconstructive procedures, which are generally referred to as plastic surgery rather than cosmetic surgery). Many surgeons are interested in any tool with the potential to boost their already lucrative businesses, and increasingly, those tools somehow involve A.I.: intelligent systems that are able to learn, act, and reason for themselves.

Some experts say that handing over assessments of beauty to an algorithm may not be a good idea. “A.I. use in aesthetic evaluations might destroy the cultural diversity of beauty,” wrote plastic surgeon Dr. Jungen Koimizu, in the March 2019 issue of The Plastic and Reconstructive Surgery Journal.

Because many marketing companies use A.I. to target potential clients (in everything from behavior modeling to predictive insights and big data analysis), the technology is sometimes involved even before patients make their first cosmetic surgery appointment. Dr. Heather Levites, a plastic surgery resident at Duke University School of Medicine, has used a tool from a sentiment analysis startup called Cognovi Labs, where her father is COO, to analyze social media posts that mention cosmetic surgery keywords. The tool scanned tweets for keywords such as liposuction and breast augmentation, and analyzed them to understand what prospective clients were interested in — and what they had mixed feelings about. It categorized each post using six emotions: surprise, anger, joy, disgust, fear, and sadness. Then it sorted the data into three metrics: awareness, engagement, and motivation — the higher a tweeter's motivation, the more likely they are to follow through with a procedure. Numerous behavioral economists report that 70% of decision-making is driven by emotion.
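Cognovi Labs' model is proprietary, but the workflow Levites describes (tag each post with one of six emotions, then roll the tags up into awareness, engagement, and motivation) can be sketched in a few lines of Python. In the illustrative sketch below, the keyword-based emotion tagger and the metric formulas are invented stand-ins, not the company's actual method; a real system would use a trained emotion classifier.

```python
from collections import Counter

# Toy keyword lookup standing in for a trained emotion classifier (illustrative only).
EMOTION_CUES = {
    "joy": ["excited", "love", "happy"],
    "anger": ["angry", "furious", "hate"],
    "fear": ["scared", "nervous", "terrified", "terrifying"],
    "disgust": ["gross", "disgusting"],
    "sadness": ["sad", "regret"],
    "surprise": ["wow", "shocked", "didn't expect"],
}

def tag_emotion(post):
    """Return the first emotion whose cue words appear in the post, else None."""
    lowered = post.lower()
    for emotion, cues in EMOTION_CUES.items():
        if any(cue in lowered for cue in cues):
            return emotion
    return None

def score_procedure(posts, procedure):
    """Aggregate posts mentioning a procedure into three illustrative metrics."""
    mentions = [p for p in posts if procedure in p.lower()]
    labels = [tag_emotion(p) for p in mentions]
    emotions = Counter(label for label in labels if label)
    awareness = len(mentions)            # how often people talk about the procedure
    engagement = sum(emotions.values())  # how often they talk about it with feeling
    # Crude "motivation" proxy: positive emotion minus aversive emotion.
    motivation = (emotions["joy"] + emotions["surprise"]
                  - emotions["anger"] - emotions["fear"]
                  - emotions["disgust"] - emotions["sadness"])
    return {"awareness": awareness, "engagement": engagement, "motivation": motivation}

posts = [
    "So excited for my liposuction consult next week!",
    "A nose job sounds terrifying, they break the bone??",
    "Thinking about breast augmentation but nervous about recovery.",
]
for procedure in ["liposuction", "nose job", "breast augmentation"]:
    print(procedure, score_procedure(posts, procedure))
```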

Heather was surprised by the results. Social media users were very familiar with nose jobs, but the Cognovi Labs tool found a strong negative reaction to that surgery. "We have to break nasal bones for the procedure, and that generated anger and frustration," Levites says, noting that she could potentially counter this with educational social media posts. People were less familiar with liposuction, but the fat-sucking operation ranked first for emotional attachment. The analysis helped Heather understand how patients feel about different procedures, and she's refining the parameters for another study now. Eventually, she hopes to provide a tool for surgeons in different regions that enables them to refine their online presence and adapt to patients' desires.

Another popular use of A.I. among plastic surgeons is tools like BioMedX and Crisalix, which show patients 3D models of what they will look like after surgery. But one challenge for this type of software, which uses 3D scanning to model bodies, is accounting for variations in lighting, age, and skin tone.

In Zurich, Endri Dibra, a software engineer who specializes in creating realistic 3D human avatars, says the A.I. software he built to help women envision the results of their breast reconstruction surgery doesn’t work well for people of color (African American skin is prone to keloid scarring, for example, which the software doesn’t depict). That’s because he built the dataset that his technology bases its projections on by partnering with surgeons in Switzerland, where 0.6% of people are African American. Cosmetic surgeons sent their mostly white patients to him for full body scans so that he could use their geometry to train his software to realistically synthesize and render images.

Dibra founded a company last year called Arbrea Labs, and is currently working on augmented reality imaging tools for both women undergoing breast reconstruction surgery and women electing to have breast augmentation surgery. Right now he is only selling the breast augmentation product to doctors in Switzerland. Once he has more diverse patients in his dataset, he says he’ll approach international clients. His transparency about his data’s lack of diversity is rare in the A.I. sector.

Bias in A.I. is a well-documented problem: companies including Amazon and IBM have been found to embed undisclosed gender, beauty, and racial bias in their algorithms. Amazon's recruiting app was biased against female applicants, and a portrait generator app built by IBM and MIT homogenized Asian and African American skin color toward white.

These sorts of biases could be particularly harmful when assessing beauty.

Some surgeons use A.I. tools that assign patients a beauty score (often based on golden triangle principles) before surgery. Rescanning a patient's face after their cosmetic work, for instance, can provide quantitative data on how much prettier they are. Potentially, this could protect surgeons from lawsuits filed by patients who are unhappy with the results.

The same capability could be used to predictively model changes from pre-op to post-op, as a paper published in 2014 noted, to see whether the desired surgery would actually increase beauty. "A quantitative measurement of aesthetic improvements could not only set expectations, but also discourage patients from undergoing procedures that offer marginal improvement," Dr. Jonathan Kanevsky told VentureBeat. If you'll only be 2% prettier, you may reconsider whether surgery is worth your time and energy.
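Commercial beauty-scoring tools keep their formulas private, but the before-and-after arithmetic Kanevsky describes is easy to illustrate. The sketch below assumes a toy score built from two facial proportions compared against the golden ratio, one of the classical proportion rules these tools cite; the landmark names, measurements, and 0-100 scaling are invented for illustration and are not any vendor's actual scoring method.

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # the "golden ratio" (~1.618), a classical proportion rule

def proportion_score(m):
    """Toy beauty score: average deviation of a few facial ratios from PHI, mapped to 0-100."""
    ratios = [
        m["face_length"] / m["face_width"],
        m["lips_to_chin"] / m["nose_to_lips"],
    ]
    avg_deviation = sum(abs(r - PHI) / PHI for r in ratios) / len(ratios)
    return max(0.0, 100.0 * (1 - avg_deviation))

# Hypothetical landmark distances in millimetres, before and after surgery.
pre_op  = {"face_length": 185, "face_width": 130, "lips_to_chin": 42, "nose_to_lips": 30}
post_op = {"face_length": 185, "face_width": 118, "lips_to_chin": 45, "nose_to_lips": 28}

before, after = proportion_score(pre_op), proportion_score(post_op)
print(f"pre-op {before:.1f}, post-op {after:.1f}, "
      f"improvement {100 * (after - before) / before:.1f}%")
```

A real tool would extract dozens of landmarks automatically from a 3D scan, but the percentage figure a patient sees would ultimately boil down to a before-and-after comparison of some such score.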

Measuring pretty has many practical applications — but who decides what pretty is?

Koimizu, who wrote the paper raising concerns about A.I. beauty assessments, is worried that surgeons might refine faces to fit an overwhelmingly white and westernized ideal of attractiveness. The result? “A marginalization of values of beauty in other cultures,” he warned.

“It is impossible to be perfectly free of biases when individuals score attractiveness,” added Koimizu, noting that most A.I. datasets in use are corrupted by biased ethnicity and gender ratios.

Attractiveness is not the only A.I. measurement that raises questions. An October 2019 report in The Plastic and Reconstructive Surgery Journal assessed whether machine learning algorithms could determine if gender-confirming facial feminization surgery was successful. Using four public neural networks, the doctors tasked the A.I. with evaluating whether, post-surgery, the trans women were correctly gender typed as women. Pre-op, the patients were misgendered 47% of the time, but post-op they were correctly identified 98% of the time. For trans folk, an objective evaluation that correctly identifies their gender might help them feel confident in their skin — but defining what a woman or man "is" could be as fraught as deciding what "pretty" is.
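The study's protocol is simple to express in code: run each photo through a gender classifier and count how often the predicted label matches the patient's gender. The harness below is a minimal sketch; the stub classifier, file names, and patient counts are hypothetical stand-ins for the four public neural networks and the study's actual dataset.

```python
from typing import Callable, List

def typed_as_women_rate(photos: List[str], classify: Callable[[str], str]) -> float:
    """Fraction of photos that the classifier labels as 'woman'."""
    if not photos:
        return 0.0
    return sum(1 for path in photos if classify(path) == "woman") / len(photos)

# Stub standing in for a real face-analysis network (hypothetical logic, for demonstration).
def stub_classifier(photo_path: str) -> str:
    return "woman" if "post_op" in photo_path else "man"

pre_op_photos  = [f"patient_{i:02d}_pre_op.jpg" for i in range(20)]
post_op_photos = [f"patient_{i:02d}_post_op.jpg" for i in range(20)]

print("pre-op typed as women: ", typed_as_women_rate(pre_op_photos, stub_classifier))
print("post-op typed as women:", typed_as_women_rate(post_op_photos, stub_classifier))
```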

You can’t put these sorts of tools back in the box. A.I. in cosmetic surgery is inextricably linked to plastic surgery, which has used A.I. to great effect. For example, surgeons from Harvard Medical School, Massachusetts Eye and Ear Infirmary, the Royal Australasian College of Surgeons, and other research institutions tasked A.I. with assessing the post-op outcomes of cranial surgery on patients with facial paralysis; in particular, they wanted to know whether patients’ post-op smiles conveyed genuine emotion. That’s a useful assessment. In Italy, surgeons are using A.I. in wound care: their algorithm detects damaged skin with a 94% accuracy rate, leading to tailored treatment plans.

Some uses of A.I. in plastic and cosmetic surgery obviously fall into the good category. But deciding who and what is beautiful — and then operating on the algorithms’ advice — is creepy. For now, at least, surgeons are using A.I. as a guideline, not as a God. As long as we all stay aware of that, we might be OK.

Source: OneZero
