Bias in Amazon’s facial recognition tech? Here’s what’s upsetting AI experts

Facial recognition technology was already seeping into everyday life, from your photos on Facebook to police scans of mugshots, when Joy Buolamwini noticed a serious glitch: some of the software couldn’t detect dark-skinned faces like hers.

That revelation prompted the Massachusetts Institute of Technology researcher to launch a project that’s having an outsize influence on the debate over how artificial intelligence should be deployed in the real world.

Her tests of software created by brand-name tech firms such as Amazon uncovered much higher error rates in classifying the gender of darker-skinned women than of lighter-skinned men.
