The study gave 53 digital forensics examiners from eight countries the same computer hard drive to analyse. Photograph: Getty Images/iStockphoto

Digital forensics experts prone to bias, study shows

Participants found more or less evidence on hard drive depending on what contextual information they had

Devices such as phones, laptops and flash drives are becoming increasingly central to police investigations, but the reliability of digital forensics experts’ evidence has been called into question.

A study found that experts tended to find more or less evidence on a suspect’s computer hard drive to implicate or exonerate them depending on the contextual information about the investigation that they were given.

Even those presented with the same information often reached different conclusions about the evidence.

Such biases are known to be a problem in other forensic disciplines, including fingerprint analysis, but this is the first time the effect has been demonstrated in digital forensics.

“I cannot overemphasise the importance of forensic scientists understanding the potential for unintentional bias, and of ensuring they take measures to minimise the risks,” said Dr Gillian Tully, a professor of practice for forensic science policy and regulation at King’s College London and former UK forensic science regulator.

Digital evidence now features in around 90% of criminal cases. Digital examiners working in police and private laboratories use specialised software and other techniques to secure, retrieve and analyse data from suspects’ communications, photos and other digital interactions that could shed light on their activities.

However, the field’s rapid growth means it has not been subjected to the same scientific scrutiny as other forensic techniques. “It has been described as the wild west because it wasn’t developed systematically and scientifically before it went into the criminal justice system,” said Dr Itiel Dror, an expert in cognitive bias at University College London who carried out the study.

Ian Walden, a professor of information and communications law at Queen Mary, University of London, said there was a tendency to believe the machine. “This study shows that we need to be careful about electronic evidence,” Walden said. “Not only should we not always trust the machine, we can’t always trust the person that interprets the machine.”

Dror and Nina Sunde at the University of Oslo, Norway, gave 53 digital forensics examiners from eight countries including the UK the same computer hard drive to analyse. Some of the examiners were provided with only basic contextual information about the case, while others were told the suspect had confessed to the crime, had a strong motive for committing it or that the police believed she had been framed.

The study, soon to be published in Forensic Science International: Digital Investigation, found that the examiners who had been led to believe the suspect might be innocent documented the fewest traces of evidence in the files, while those who knew of a potential motive identified the most traces.

It also found low levels of consistency between examiners who were given the same contextual information, in terms of the observations, interpretations and conclusions they drew from the files.

“Digital forensics examiners need to acknowledge that there’s a problem and take measures to ensure they’re not exposed to irrelevant, biased information,” said Dror. “They also need to be transparent to the courts about the limitations and the weaknesses, acknowledging that different examiners may look into the same evidence and draw different conclusions.”

In her final report before stepping down as forensic science regulator earlier this year, Tully called for improved compliance with quality standards for digital forensic labs, many of which have not been accredited, and greater scrutiny of scientific evidence in court.

Dr David Gresty, a senior lecturer in computer forensics at the University of Greenwich, said: “We have every reason to believe that an expert acting in good faith, but through a mistake of interpretation, could easily mislead a courtroom. Without the defence instructing another expert to review the evidence it is entirely possible this could go unnoticed, and realistically it is likely there are undetected miscarriages of justice where cases have relied heavily on digital evidence.”

A report published by the Police Foundation in January recommended that training in digital forensics be provided for everyone working within the criminal justice system, including judges, prosecutors and defence barristers, to help reduce misinterpretation and improve understanding of the limits of what the techniques can achieve.

The Police Foundation’s director, Rick Muir, said: “There may always be an element of subjectivity in this, but we could try to reduce the room for error through effective training and the use of common standards across digital forensics work. Most examination is done in-house, so I think there’s a real onus on the police to make sure that consistent standards are applied.

“If it loses credibility, that’s a massive problem because almost any criminal case these days will have some kind of digital evidence. You could have people who were wrongly convicted or people who are guilty going free, and there’s a wider issue of undermining public confidence in digital forensics if you don’t get this right.”

A spokesperson for the National Police Chiefs’ Council said: “Digital forensics is a growing and important area of policing which is becoming increasingly more prominent as the world changes. This report is from a very small sample size and is not representative of the operational environment police in this country work in. We are always looking at how technology can add to our digital forensic capabilities and a national programme is already working on this.”
