CU Colorado Springs students secretly photographed for government-backed facial-recognition research

Terrance Boult’s project captured images of more than 1,700 people walking in public in 2012-2013

Professor Terrance Boult’s surreptitious surveillance photos were taken from the building on the upper right and captured images of more than 1,700 students, faculty members and other passers-by walking on the path near the West Lawn, the large grassy area on the left, on the University of Colorado’s Colorado Springs campus in 2012-2013. (Google Maps)

By Elizabeth Hernandez, The Denver Post

A professor at the University of Colorado’s Colorado Springs campus led a project that secretly snapped photos of more than 1,700 students, faculty members and others walking in public more than six years ago in an effort to enhance facial-recognition technology.

The photographs were posted online as a dataset that could be publicly downloaded from 2016 until this past April.

While professor Terrance Boult and CU officials defended the project and its efforts to protect student privacy, a University of Denver law professor questioned whether this is an example of technological advancement crossing ethical boundaries.

“It’s yet another area where we’re seeing privacy intrusions that disturb us,” said Bernard Chao, who teaches the intersection of law and technology at DU and previously practiced law in Silicon Valley for almost 20 years.

The CU Colorado Springs project, first reported last week by the Colorado Springs Independent, began in 2012 with funding from a variety of U.S. intelligence and military operations, including the Office of Naval Research, Special Operations Command and the Office of the Director of National Intelligence. It was not clear how much funding the project received from government agencies.

Boult’s research originally was intended to analyze facial-recognition algorithms to determine whether they were up to snuff for use by the U.S. Navy. But it turned out the technology wasn’t as efficient as the Navy wanted.

“It was solved if you wanted to match two passport photos where the person is facing forward in good light, but not if you wanted to recognize someone 100 meters away,” Boult said.

Boult and his team did more advanced research to try to improve the facial-recognition technology.

“The study is trying to make facial recognition better, especially at long range or surveillance applications,” Boult said. “We wanted to collect a dataset of people acting naturally in public because that’s the way people are trying to use facial recognition.”

Facial-recognition technology is increasingly widespread, used for everything from enabling Facebook to tag people in pictures to helping government agencies check passports and visas.

To conduct the study, Boult set up a long-range surveillance camera in an office window about 150 meters away from the West Lawn of the Colorado Springs campus, a public area where passers-by would not have a reasonable expectation of privacy.

The camera surreptitiously photographed people walking in the area of the West Lawn on certain days during the spring semesters of 2012 and 2013.

The candid shots caught students as they looked down at their phones, blurred in motion or walked out of frame altogether.

More than 16,000 images were taken, producing 1,732 unique identities. To protect student privacy, Boult said, he waited five years to release the dataset publicly. That way, people were unable to look at the pictures and figure out a student’s whereabouts in case of a domestic violence concern or a clandestine military placement, he said.

Jared Verner, a CU Colorado Springs spokesman, said the university is committed to academic freedom and the ability for faculty to study and research a variety of topics while also taking student privacy seriously.

“The research protocol was analyzed by the UCCS Institutional Review Board, which assures the protection of the rights and welfare of human subjects in research,” Verner wrote in a statement. “No personal information was collected or distributed in this specific study. The photographs were collected in public areas and made available to researchers after five years when most students would have graduated.”

DU’s Chao noted that if the study was approved by the university’s institutional review board, CU Colorado Springs determined that there was not substantial concern about individuals being harmed. Still, Chao called the project “surprising.”

“There’s creeping concern that maybe he has all this data and all these photos, and what other use could be used for that?” Chao said.

The dataset of photos was taken off the internet on April 15, but not because of privacy concerns, Boult said. The CU Colorado Springs dataset was cited in an April article in the Financial Times titled “Who’s using your face? The ugly truth about facial recognition.”

“They gave out more information in the article than we had intended,” Boult said.

The information published included the date and time the photos were taken, which Boult said thwarted the intended purpose of trying to randomize the photos in the dataset. Boult said he’s considering releasing another version of the data publicly that would fix this.

If a student objected to being unknowingly involved in the study, Boult said he would try to make amends.

“If somebody wants to come and sit in my lab and go through the thousands of photos and say, ‘That one is me,’ we can gladly remove them from the dataset,” Boult said.

But Boult argues the students’ faces are being used for the greater good, saying he balanced the privacy of students with the need to improve facial-recognition systems.

“As long as the systems are bad, their potential misuse is consistent,” Boult said. “If police use them and they match the wrong person, that’s not good. Our job as researchers is to balance the privacy needs with the research value this provides society, and we went above and beyond what was required.”

Chao countered by saying Boult’s reasoning assumes what government or federal agencies are doing is something society wants them to do when that might not be the case.

“He may be helping them do something that’s not right in the first place,” Chao said. “I’m not sure I want to be in a state where every place I walk, my picture is being taken and automatically uploaded into facial-recognition software. I actually know I would not like that. I think the response is, ‘Maybe we just shouldn’t be doing this, period.’”