Associate Professor Carsten Rudolph, from Monash University, has criticised the Federal Government's Identity-matching Services Bill.
In an interview with Innovation Intelligence, Rudolph discussed how inaccuracies in facial recognition technology can cause disruptions, and the potential for the database to be misused or stolen.
The bill introduces five components of facial recognition for the Commonwealth, states and territories. Under the bill, law enforcement and other agencies would be able to identify and verify identities based on a collective database of faces.
Rudolph explains that when the data is used for identification, the matching is not 100% accurate, and while the margin of error might seem small, it would still affect a great number of people.
“If 100,000 people bought tickets to the AFL, and the event was using facial recognition software to verify identities before entering, 300 people could be denied entry.”
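Rudolph's example implies a false-rejection rate of roughly 0.3 per cent. A back-of-the-envelope sketch of that arithmetic, with the rate and crowd size treated purely as illustrative assumptions:

```python
# Back-of-the-envelope estimate of wrongful denials at a face-verification gate.
# The 0.3% false-rejection rate is an assumption implied by Rudolph's figures,
# not a measured property of any particular system.

def expected_false_rejections(crowd_size: int, false_rejection_rate: float) -> float:
    """Expected number of legitimate ticket holders turned away."""
    return crowd_size * false_rejection_rate

# 100,000 AFL attendees at a ~0.3% error rate -> about 300 people denied entry.
print(expected_false_rejections(100_000, 0.003))  # 300.0
```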
There is also a concern that facial recognition will misrecognise minorities, as the system will mostly be trained using white faces.
“The potential for false recognition is greater; the training data for these minorities will be small, which means the data will be worse,” explains Rudolph.
With the high rates of incarceration of Indigenous Australians, there is a greater potential for the technology to misrecognise Indigenous people as criminals.
Rudolph says it is possible, though difficult, to work around these systems’ problems with misrecognising minorities.
“There is research around generating similar data from the data that you do have. You can use it to train particular variants of the system.”
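The approach Rudolph alludes to is often called data augmentation: synthesising extra training examples from the images you already hold. A minimal sketch of the idea, assuming the torchvision library and a hypothetical image path; it is not drawn from any system named in the bill:

```python
# Minimal data-augmentation sketch: create perturbed variants of existing
# face images so an under-represented group is better covered in training.
# File paths and parameter values are illustrative assumptions only.
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),                 # mirror the face
    transforms.RandomRotation(degrees=10),                  # small pose changes
    transforms.ColorJitter(brightness=0.2, contrast=0.2),   # lighting changes
])

def make_variants(image_path, n=5):
    """Return n randomly perturbed copies of one face image."""
    original = Image.open(image_path).convert("RGB")
    return [augment(original) for _ in range(n)]

# Hypothetical usage: expand a sparse set of images before retraining a model.
# variants = make_variants("faces/example_face.jpg", n=10)
```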
He also has concerns over the IDSS database being misused or stolen.
“Just by creating that database there is the potential it can be misused, or stolen. It is not clear what the data should be used for and what it should not be used for. I think as a country or as a society we need to think about what services we want to have and what this will be used for in the future.
“Of course, there are always the issues around the security of the database itself. We have a lot of good security controls and mechanisms in place, still systems get hacked and databases get stolen.”
Currently, the Federal Government’s Identity-matching Services Bill limits use to law enforcement, security, protective security, community safety, road safety and identity verification, and will not permit live facial recognition. However, Rudolph says there is reason to fear this may change in future.
As he explains, when a crime occurs, people will naturally want to run CCTV footage through facial recognition software, and while the Government promises it will not be used in real time, many people are sceptical.
“There is research showing that people will change their behaviour when they are tracked and always supervised by CCTV. They try not to stick out. They change their clothing to be quite bland and not stick out. They go to areas where there are no cameras, and this change of behaviour is something that we don’t want to see in our society.”