This is so, so stupid, for a myriad of reasons.
What’s more, certain groups of adults are more likely to bear the brunt when these supposedly “convenient” estimation methods fail. Nonbinary and trans people are likely to be misclassified by facial age estimation technologies and often do not have access to IDs reflecting their gender and name.
People with disabilities that affect their physical appearance often face misclassification too, since facial estimation technologies struggle with faces that fall outside their training data, and they may also face barriers to obtaining IDs like driver’s licenses.
People of color are routinely misidentified by facial recognition and estimation technology (something Yoti’s own white paper acknowledges, reporting higher error rates for people with darker skin tones), and as a result they may distrust both facial scanning systems and prompts to upload more invasive documentation.
Finally, people from a range of socioeconomic contexts, particularly low-income people and some immigrant communities, may lack the required documentation and IDs even if they want to supply them; in fact, millions of Americans lack government-issued ID.
This creates a troubling pattern: those who don’t fit algorithmic “norms” must either surrender more personal data to access the same services, or give up altogether on online services that help people access information, seek employment opportunities, and speak freely.
Not only is it a massive invasion of privacy, it doesn't fucking work correctly anyway.