Last Thursday, Microsoft announced a set of ethical principles for the use of its facial recognition technology, which would not only bar unlawful discriminatory use of the technology but also encourage customers to be transparent when deploying facial recognition services.
In a blog post on Thursday, Microsoft President Brad Smith emphasized the need for legislation to regulate facial recognition technology, while also advising tech companies to adopt a self-regulatory approach.
Given the importance of the matter, Microsoft expects governments to pass laws regulating facial recognition technology in 2019; as Smith put it, the facial recognition genie is just emerging from the bottle.
Face identification features have become the norm in consumer products: Apple offers facial recognition on its iPhones, and Google and Amazon are among the other big names providing facial recognition services.
But the broader use of such technology remains an open question. In May, citing the software's potential to unfairly target people of color and immigrants, U.S. civil liberties groups called on Amazon to stop offering facial identification services to governments.
To bar unlawful discriminatory use of facial recognition technology, Microsoft stated in its blog post that it plans to document and communicate the technology's capabilities and limitations.
Lawful surveillance, accountability, fairness, and notice and consent are the other principles Microsoft has defined for its self-regulation of the technology.
Contrary to the American Civil Liberties Union's demand, Microsoft avoided making any promise not to sell facial recognition technology to law enforcement agencies.
Microsoft said it intends to formally launch these principles, along with a supporting framework, by the end of March 2019, but it did not provide implementation details.