Her team has modified a commercially available Canon camera, converting the infrared focusing sensor built into its viewfinder into a biometric sensor that captures an image of the photographer's iris at the instant a photo is shot. The iris image is converted to digital data and stored invisibly in the image file, along with the time, date, and other watermark data.
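The general idea behind such a system can be sketched in a few lines: bind the pixels, the biometric, and the capture time together with a keyed hash, so that altering any one of them afterward invalidates the record. This is a toy illustration of that principle, not the camera's actual mechanism; every name and the key-handling scheme here are invented.

```python
import hashlib
import hmac
import json

# Hypothetical per-camera secret; a real system would keep this in
# tamper-resistant hardware inside the camera body, never in software.
CAMERA_KEY = b"example-camera-secret"

def sign_capture(image_bytes: bytes, iris_bytes: bytes, timestamp: str) -> dict:
    """Bind pixels, iris scan, and capture time together with an HMAC.
    Changing any of the three afterward invalidates the tag."""
    record = {
        "iris_sha256": hashlib.sha256(iris_bytes).hexdigest(),
        "timestamp": timestamp,
    }
    payload = image_bytes + json.dumps(record, sort_keys=True).encode()
    record["tag"] = hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_capture(image_bytes: bytes, record: dict) -> bool:
    """Recompute the tag from the claimed pixels and metadata and
    compare it against the stored one in constant time."""
    claimed = {k: v for k, v in record.items() if k != "tag"}
    payload = image_bytes + json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(CAMERA_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["tag"])
```

Flipping even one byte of the image after signing makes `verify_capture` return `False`, which is what lets a court treat an intact record as evidence the photo left the camera unaltered.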
The application for a police photographer is obvious: If challenged in court, the image, camera, and shoot are all verifiable, the entire system secure. Unfortunately, the world of justice is the Dark Ages to academia's Renaissance. The FBI has a special Digital Evidence Section and is funding authentication research, but the federal rules of evidence require only that a digital image be verified by the photographer or someone else at the scene; they demand nothing like a secure photography system, and there has been little effort to change them.
"Most criminal courts are technically illiterate," says Grant Fredericks, a forensic-video analyst with forensic-systems maker Avid Technology. "They don't have the tools and experience to deal with advanced technology." Lawyers are just beginning to grasp the technology and its ramifications, but the bench is especially ignorant.
"Trial judges have not been adequately apprised of the risks and technology," says New York Law School's Sherwin. "I can recount one example where in order to test an animation that was being offered in evidence, the judge asked the attorney to print it out. What we really have is a generation gap in the knowledge base. Courts are going to have to learn about these risks themselves and find ways to address them."
One bright spot is that for now, at least, we only have to worry about still images. Fredericks says that to modify video convincingly remains an incredibly painstaking business. "When you're dealing with videotape, you're dealing with 30 frames per second, and a frame is two individual pictures. The forger would have to make 60 image corrections for each second. It's an almost impossible task." There's no Photoshop for movies, and even video altered with high-end equipment, such as commercials employing reanimated dead actors, isn't especially believable.
Digital-forensics experts say they're in an evolutionary race not unlike the battle between spammers and anti-spammers -- you can create all the filters you want, but determined spammers will figure out how to get through. Then it's time to create new filters. Farid expects the same of forgers. With enough resources and determination, a forger will break a watermark, reverse-engineer a RAW file, and create a seamless fake that eludes the software. The trick, Farid says, is continuing to raise the bar high enough that most forgers are daunted.
The near future of detection technology is more of the same, only (knock wood) better: more-secure photographer-verification systems, more tightly calibrated algorithms, more-robust watermarks. The future, though, promises something more innovative: digital ballistics. Just as bullets can be traced to the gun that fired them, digital photos might reveal the camera that made them. No light sensor is flawless; all have tiny imperfections that can be read in the image data. Study those glitches enough, and you recognize patterns -- patterns that can be detected with software.
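The fingerprinting idea described above can be illustrated in miniature: average the noise residuals of many photos from one camera so the scene content cancels out and the sensor's fixed pattern remains, then correlate a questioned photo's residual against that fingerprint. This is a toy sketch of the concept only; real forensic systems use far more sophisticated denoisers than the simple box blur standing in here, and all names are invented.

```python
import numpy as np

def noise_residual(img, k=3):
    """Image minus a box-blurred copy of itself; the sensor's
    fixed-pattern noise survives in this high-frequency residual."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    smooth = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            smooth += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    smooth /= k * k
    return img.astype(float) - smooth

def camera_fingerprint(images):
    """Average the residuals of many photos from one camera: random
    scene detail averages toward zero, the sensor pattern remains."""
    return np.mean([noise_residual(im) for im in images], axis=0)

def match_score(img, fingerprint):
    """Normalized correlation between a photo's residual and a camera's
    fingerprint; a clearly positive score suggests the same sensor."""
    r = noise_residual(img)
    r = r - r.mean()
    f = fingerprint - fingerprint.mean()
    denom = np.linalg.norm(r) * np.linalg.norm(f) + 1e-12
    return float((r * f).sum() / denom)
```

A photo from the fingerprinted camera scores well above one from a different sensor, which is the "ballistics" match the text describes.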
Still, no matter what technologies are in place, it's likely that top-quality fakes will always elude the system. Poor-quality ones, too. The big fish learn how to avoid the net; the smallest ones slip through it. Low-resolution fakes are more detectable by Farid's latest algorithm, which analyzes the direction of light falling on the scene, but if a photo is compressed enough, forget about it. It becomes a mighty small fish.
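The lighting-direction check mentioned above rests on a simple observation: on a matte (Lambertian) surface, brightness varies with the angle between the surface and the light, so a light direction can be fit from intensities at points whose surface orientation is known, and two objects in one photo that disagree about where the light is coming from betray a composite. The sketch below is a heavily simplified, hypothetical reduction of that idea; it assumes the surface normals are already known and ignores shadowing and clamping.

```python
import numpy as np

def estimate_light(normals, intensities):
    """Least-squares fit of the Lambertian model I ~ N . L + ambient,
    given surface normals (n x 3) and observed intensities (n,).
    Returns the fitted light direction as a unit vector."""
    A = np.hstack([normals, np.ones((len(normals), 1))])  # last column: ambient term
    sol, *_ = np.linalg.lstsq(A, intensities, rcond=None)
    L = sol[:-1]
    return L / np.linalg.norm(L)
```

Running the fit separately on two objects in the same photo and comparing the two recovered directions is the consistency check: in a genuine photo they should nearly coincide.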
Which brings us back to Joey Boudreaux, the Marine who found himself denounced by his local paper, the New Orleans Times-Picayune, for having embarrassed "himself, the Marine Corps and, unfortunately, his home state." The Marines conducted two investigations last year, both of which were inconclusive. Even experts with the Naval Criminal Investigative Service couldn't find evidence to support or refute claims of manipulation.
Boudreaux has taken the incident in stride. "My first reaction, I thought it was funny," he said in a telephone interview. "I didn't have a second reaction until they called and said, 'You're getting investigated.' " He insists that he never gave the Iraqi boy a sign with any words but "Welcome Marines," but he has no way to prove it. Neither he nor anyone he knows still possesses a version of the image the way he says he created it, and no amount of Internet searching has turned it up. All that exists are the low-quality clones on the Web.
Farid's software can't assess Boudreaux's claim because the existing images are too compressed for his algorithms. And even Farid's trained eye can't tell whether either of the two existing images -- the "good" sign or the "bad" one -- is real or whether, as Boudreaux claims, both are fakes. An unsatisfactory conclusion, but a fitting one. Today's authentication technology is such that even after scrutiny by software and expert eyes, all you may have on your side is your word. You'd better hope it's good enough.