
Big Brother or big benefit?

The reality of facial recognition
PHOTO: The watchful gaze of CCTV cameras. (Pexels)

A camera can tell you’re a criminal, just by looking at your face. It could be a dystopian nightmare...or the end of terrorism. Jennifer Luu reports.

From Orwell to Jason Bourne, generations of pop culture have drilled a deep fear into the public consciousness: the fear of surveillance. In the era of WikiLeaks and the Snowden revelations, privacy is an increasingly controversial topic.

 

According to the Australian Institute of Criminology, 46 per cent of local councils in New South Wales have CCTV cameras installed in public spaces. These systems have proven successful in detecting crime.

But Toowoomba Regional Council in Queensland has taken it a step further. The council recently completed a four-week trial of software called iOmniscient in its library. The technology, installed in surveillance cameras, tracked the movements of library patrons.

 

“It’s probably an advanced people counter to a degree...You can actually then see from where within the library that people are spending the most amount of time or where we need more resources in an area,” said Councillor Geoff McDonald, chair of the Environment and Community Committee.

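In plain terms, the "advanced people counter" Cr McDonald describes amounts to aggregating anonymous detection events by zone and time. The short sketch below is purely illustrative, using made-up visitor records and zone names rather than anything from iOmniscient or the Toowoomba trial, to show how dwell time per area of a library could be tallied.

    # Illustrative only: toy aggregation of dwell time per library zone.
    # Visitor records and zone names are invented for demonstration.
    from collections import defaultdict

    # Each record: (anonymous visitor id, zone, seconds observed in that zone)
    detections = [
        ("visitor_1", "children's section", 420),
        ("visitor_1", "study desks", 1800),
        ("visitor_2", "study desks", 2400),
        ("visitor_3", "magazines", 300),
    ]

    dwell_time = defaultdict(int)   # total seconds spent per zone
    visitors = defaultdict(set)     # distinct visitors seen per zone

    for visitor, zone, seconds in detections:
        dwell_time[zone] += seconds
        visitors[zone].add(visitor)

    # Rank zones by total dwell time: "where people spend the most time"
    for zone, seconds in sorted(dwell_time.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{zone}: {len(visitors[zone])} visitors, {seconds / 60:.0f} minutes in total")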

But iOmniscient’s capabilities don’t stop there. The program can detect criminals and suspicious behaviour in real time, merely by observing the faces and behaviour of passersby.

 

iOmniscient CEO Rustom Kanga said: “The technology is based on what we call artificial intelligence technology, which attempts to emulate how humans think...it understands what is happening in the environment and it learns from mistakes.”

People have “gone and killed themselves, taken their own life, because they’ve been flagged in one of these systems.”

- Professor Katina Michael, Australian Privacy Foundation

Dr Kanga is adamant that facial recognition is the answer to crime and even terrorism; it could “prevent incidents like the Brussels attack”.

PHOTO: Mourners lay flowers for the dead after the Brussels attack in 2016. (The New Yorker)

 

“There are things that humans can do very well...however, for repetitive activity and activity involving large masses of data, a computer is much more effective,” said Dr Kanga.

 

“If you had to recognise ten thousand unknown people in a crowded football stadium, a human being would be useless at that, but a computer can still do it.”

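Dr Kanga's stadium example rests on what is, in general terms, the standard approach to large-scale face recognition: each detected face is reduced to a numeric "embedding", and a computer compares thousands of embeddings against a watchlist far faster than any human observer could. The sketch below uses random vectors and an arbitrary similarity threshold purely for illustration; it is an assumption about how such matching typically works, not a description of iOmniscient's system.

    # Illustrative only: comparing face "embeddings" against a watchlist.
    # The vectors and threshold are invented for demonstration.
    import numpy as np

    rng = np.random.default_rng(0)
    watchlist = rng.normal(size=(100, 128))    # 100 enrolled faces, 128-d embeddings
    crowd = rng.normal(size=(10_000, 128))     # 10,000 faces seen in a stadium

    # Normalise so that a dot product equals cosine similarity
    watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)
    crowd /= np.linalg.norm(crowd, axis=1, keepdims=True)

    similarity = crowd @ watchlist.T           # every crowd face vs every watchlist face
    best_match = similarity.max(axis=1)        # closest watchlist face for each person

    THRESHOLD = 0.5                            # arbitrary cut-off for declaring a "match"
    flagged = np.where(best_match > THRESHOLD)[0]
    print(f"{len(flagged)} of {len(crowd)} faces flagged for human review")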
 

Founded in 2001, iOmniscient has since completed projects in 46 countries across 30 different industries. However, the company has faced criticism over privacy concerns.

 

Professor Katina Michael, a board member of the Australian Privacy Foundation, believes that facial recognition is inherently biased. She fears there’s a fine line between what is considered normal and abnormal behaviour or appearance.

 

“Are they singling out people of a racial minority, people who are different? For instance, what do you do with people who are disabled? With people who are mentally ill who are queuing? And they haven’t done anything wrong...but they may be singled out because they just look different.”

 

The impact on wrongly identified suspects can be severe. Professor Michael recalls that people have “gone and killed themselves, taken their own life, because they’ve been flagged in one of these systems.”

 

“People don’t know, during the Boston Marathon somebody who was identified prematurely as being one of the people who detonated one of the bombs actually took his own life because he felt like he was being chased by the authorities and yet he was completely innocent.”

 

Dr Kanga argues that humans are also capable of racial profiling.

 

“Just because you have a camera there and a human behind it doing the racial profiling, is no different from having a camera there and you have a computer doing the racial profiling,” he said.

I think we're still a very long way away from how this is portrayed in movies like 'Jason Bourne'.

- Jon Lawrence, Electronic Frontiers Australia

Despite these concerns, the technology could soon become a staple in Queensland. The data from the Toowoomba trial is currently awaiting analysis by the South-East Queensland Council of Mayors, which ordered the trial as part of its “smart region initiative”.

 

“There are a wide range of use cases we could use this for including security, people counting, event identification, access control, etc.,” said executive director Scott Smith.

 

Councillor McDonald stresses the importance of the technology for the region's residents.


“Ultimately, the number one thing for any local government, I think, the number one responsibility, is to make sure people are safe in a community. So, that is why we have our CCTVs,” he said.

PHOTO: A demonstration of iOmniscient's people-counting technology. (iOmniscient)

 

Despite their high levels of surveillance, Sydney councils aren’t as enthusiastic about facial recognition. City of Sydney media representative Keeley Irvin confirmed “the City does not currently use facial recognition technology in our CCTV program”, while the NSW Ombudsman has “not received any complaints about the use of facial recognition technology”, according to executive officer Selena Choo.

 

Some critics are sceptical of iOmniscient’s capabilities, including digital rights non-profit organisation Electronic Frontiers Australia.

“I suspect that if they had any meaningful predictive capabilities, they'd have much bigger prospects than Toowoomba City Council,” said executive officer Jon Lawrence.

 

“Public CCTV is both very reluctantly funded and...live access to it is only ever provided to the local police station...I think we're still a very long way away from how this is portrayed in movies like 'Jason Bourne'.”

 

Professor Michael disagrees. She thinks that when it comes to surveillance, life imitates art.

 

“We are going towards a ‘Minority Report’ future...I think we’re sleepwalking towards that type of society.”

 

She said privacy laws are struggling to keep up with advances in surveillance, and called for a risk assessment of behavioural biometrics technology.


“Technology is pushing the boundary, it is almost that the laws are lagging far behind...it is the law not even having any idea what to do with this kind of behavioural surveillance technology.”


“There is no such thing as ‘omniscient’, there is no such thing as an all-seeing eye in real time, because context is always missing from situations and events.”

 

Dr Kanga believes that the technology is not to blame.

 

“Any technology can be used for good or for evil. It’s up to the users to use the technology as they wish to... and it is up to society to use all technology for the good of society.”

 Data Source: AIC

About the Journalist

 

Jennifer Luu is a producer for 2SER 107.3, an avid journalist and grammar pedant. She has had previous roles at Studio 10 and the City Hub newspaper. When she's not on the scent of a good story, Jennifer can be found despairing over the state of world politics and befriending cats. 
