If you were told today that letting a customer service agent analyse your facial reactions would help better protect your privacy, would that even remotely sound true? That is more or less what EnableX claims its new face recognition technology will help companies achieve. In September, the startup introduced FaceX – an artificial intelligence-based facial recognition tool targeted at businesses. The goal is similar to practically any facial recognition technology, and the tech itself isn’t drastically new. What raises concern around the use and sale of such technologies in India, however, is their severe privacy implications.
Speaking to News18, Pankaj Gupta, founder and CEO of EnableX, a subsidiary of the Singapore-headquartered startup VCloudX, is certain that contrary to popular opinion around the commercial use of face recognition and its implications for user privacy, his technology will do less harm than what is already happening to your personal data. To back this up, he argues that reading your face does not necessarily mean a greater violation of your privacy.
“Our face AI technology is implemented at the browser end, and not on the server. So, no personal data is relayed to or stored on data servers. Businesses will not get to store a person’s private identifier to link and store a person’s data, and it is wiped out as soon as a customer logs out. So, if the same user logs in to the same website a second time, his face will not be recognised and quantified into sellable data,” says Gupta.
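EnableX has not published the code behind FaceX, so what follows is only a minimal sketch of what such “browser end” expression analysis could look like, written with the open-source face-api.js library rather than EnableX’s actual API. The model path and video element are assumptions; the point is simply that detection and expression scoring can run entirely in the browser, with nothing uploaded to a server.

```typescript
// Illustrative sketch only: EnableX's FaceX code is not public, so this uses the
// open-source face-api.js library to show the general idea of browser-end analysis.
// The '/models' path and the video element passed in are assumptions.
import * as faceapi from 'face-api.js';

async function analyseReactionsLocally(video: HTMLVideoElement): Promise<void> {
  // Model weights are fetched once and then run entirely in the browser.
  await faceapi.nets.tinyFaceDetector.loadFromUri('/models');
  await faceapi.nets.faceExpressionNet.loadFromUri('/models');

  const result = await faceapi
    .detectSingleFace(video, new faceapi.TinyFaceDetectorOptions())
    .withFaceExpressions();

  if (result) {
    // Only expression scores (happy, neutral, surprised, etc.) are produced here.
    // No face image or identifying template is sent anywhere in this sketch.
    console.log(result.expressions);
  }
}
```

In a setup like this, the vendor would only ever see the expression scores the page chooses to report, which is the kind of separation Gupta is describing.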
Gupta stands firm that the technology is only going to map your reactions – and not your very unique and identifiable face – to, say, products on an e-commerce portal. “Think of it this way,” he adds, “This is not a cookie-based technology that would track you and your usage. When you are browsing, and your camera is on, then even if you don’t log in, the technology can analyse your reactions to recommend better products. This is aimed at converting more casual visitors into sales. Our AI tech will not, at any point, identify you or track you across the internet. It will not recognise you and start recommending products again based on your last visit. It may map user reactions to a particular product, to show how popular or liked it is, but those reactions will never be linked to identifiable people.”
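The aggregation Gupta describes could, in principle, look something like the hypothetical sketch below: reaction scores are tallied per product, with no user identifier attached. The names and data structure here are assumptions made for illustration, not EnableX’s implementation.

```typescript
// Hypothetical illustration of per-product reaction aggregation without identity.
// Names and structure are assumptions; EnableX has not disclosed its design.
type ReactionScores = { [expression: string]: number };

const productReactions = new Map<string, ReactionScores>();

function recordReaction(productId: string, expressions: ReactionScores): void {
  const totals = productReactions.get(productId) ?? {};
  for (const [expression, score] of Object.entries(expressions)) {
    totals[expression] = (totals[expression] ?? 0) + score;
  }
  // Only the product ID and summed scores are kept; nothing ties a reaction
  // back to an individual visitor or a stored face template.
  productReactions.set(productId, totals);
}

// Example: a visitor's anonymous reaction to one product listing.
recordReaction('product-123', { happy: 0.8, neutral: 0.2 });
```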
To understand whether this technology can indeed go bad, and where it ranks against global practices, News18 spoke to Mishi Choudhary, technology lawyer and founder of the Software Freedom Law Centre in India. Even for such technologies, she says, the real concern is the fraud that ensues from mishandling and mismanagement of the generated data. “Amazon is already experimenting with such a technology at its Amazon Go store in Seattle, USA – facial recognition technology, combined with online analytics and previous purchases, to tailor the experience for shoppers and allow them to find what they are looking for faster. The questions of privacy, consent, and function creep, i.e. the data collected for one purpose being used for another, are central to the debate,” says Choudhary.
It is in response to this that Choudhary says a viable explainable AI model is necessary, something that is still some distance away for India. She says, “What we must worry about is the security of the data collected and the frauds that ensue. This is why the concept of explainable AI (XAI) has developed. XAI is relevant even if there are no legal rights or regulatory requirements, because the user experience is improved by helping end users trust that the AI is making good decisions.”
The key concern, then, is how the technology may be used by EnableX’s vendors. As Gupta confirms, EnableX only provides companies with an implementable API, which they can add on top of the communication stack the startup already sells, in order to improve their services. As a result, all implementation and custom configuration, as well as data privacy policies, will lie with the vendor and not with EnableX. On its own, EnableX’s face recognition technology is not doing anything drastically new. However, it is offering Indian (and global) customers across sectors such as hiring, human resources, customer service and online retail a way to map the live reactions of all visitors to their products.
Gupta says that, like any other similar technology, this too is based on deep learning and neural networks. However, he could not fully explain how the algorithm is trained if no data is recorded or relayed to the company’s servers, beyond maintaining that this is a “browser end” technology. The face recognition stack is not native to EnableX either: Gupta confirms it was sourced from a “partner” that built it as a research project and now sells it through providers such as EnableX. He did not reveal the creator’s name.
As Choudhary explains, such a technology being implemented in the market does not necessarily have to be ‘bad’. “Personalising user experiences has huge advantages in healthcare and education. But respecting security and privacy can go hand in hand with innovation, too. To enable innovations such as these, light-handed and easy-to-understand laws with strict enforcement in terms of damages (and not jail time) are the only way to go forward.” It is this, she believes, that can instil accountability in commercial organisations, such as EnableX’s potential vendors, that may wish to use this technology.
“The issue to highlight is the confusion of our courts to mix subject matters, such as the use of facial recognition to find missing children and disoriented adults, or to support and accelerate investigations – with commercial usage. There is a tendency to assume that if a technology has positive usage, all uses should be allowed without any guardrails,” Choudhary says, summing up the obstacle to establishing an accountable and streamlined legal framework in India right now.
Services such as EnableX’s face recognition AI have a clear benefit: they can help small e-commerce firms run a lean operation by weeding out unpopular products, saving on inventory and related costs. Customers, however, must exercise strict discretion about how such technologies affect them, until there are enough failsafes to ensure that key information such as their face data is not misused.