
The first criminal conviction based on fingerprint evidence took place in 1892, in Argentina, a decade before fingerprints were used to secure convictions in the UK and France. It wasn’t until 1912 that the U.S. had its first fingerprint-based conviction.


So why Argentina? Juan Vucetich, an immigrant from Croatia, joined Argentina’s provincial police Office of Identification and set up that country’s fingerprint files in 1891. A student of Sir Francis Galton’s classification system, Vucetich called fingerprinting dactyloscopy (dactiloscopia in Spanish). Galton was a eugenicist and thought, incorrectly, that fingerprints could trace heredity. Vucetich, by contrast, saw fingerprints as an identification system for individuals, specifically those populations in presumed need of control, like criminals, prostitutes, and, ironically, immigrants.

As historian Julia Rodriguez puts it, the genius of Vucetich’s system was “the efficiency with which his classification system could be married to an emerging bureaucratic archive of individual fingerprints.”

One of the policemen Vucetich trained was Eduardo Alvarez. Alvarez was called to help in the aftermath of a sensational crime in the village of Necochea in June 1892. Two children had been stabbed to death in their beds. Their mother’s throat had been sliced. The mother, Francisca (sometimes Francesca) Rojas, survived and accused her neighbor.

This neighbor turned out to have a very good alibi, however, and the case was stalled until Alvarez came on the scene. He found a single bloody fingerprint on a door and brought that section of the door back to Vucetich in Buenos Aires. It happened to match the prints taken from Francisca Rojas. Rojas confessed when confronted with the evidence. She had slit her own throat to cover up her crime.

An open and shut case, as they say. “The spark that Vucetich started and that took flame in Argentina flickered and wavered in other contexts, but it eventually joined with parallel efforts to become a universal practice,” writes Rodriguez.

Fingerprints, after all, don’t change over a lifetime, are very hard to manipulate, and are close to unique. (In the late nineteenth century, Galton estimated that the chances of two people sharing identical fingerprints were 1 in 64 billion.) Nonetheless, it took some time for conservative legal systems to accept the process. Today, fingerprinting is standard operating procedure in criminal cases.

However, latent fingerprint evidence isn’t as infallible as crime entertainment might suggest. Latent fingerprint examiners, after all, are human beings, too, subject to the same biases as everyone else. Their job is to identify similarities and differences between latent prints, which may be partial or unclear, and prints on file (which presumably were made under ideal conditions), and then to explain those findings to juries. In other words, they are supposed to guide the jury to a conclusion.

This method has had many critics over the years. An American Association for the Advancement of Science study published last year found that “fingerprint source identity lacks scientific basis for legal certainty.” This followed earlier studies by the National Research Council, the National Institute of Standards and Technology, and the President’s Council of Advisors on Science and Technology, which all cast doubt on the notion that fingerprinting is 100% accurate. The AAAS report noted that the innocent have been convicted on the basis of incorrect fingerprint matches. The proof, it turns out, may not always be in the print.


JSTOR is a digital library for scholars, researchers, and students. JSTOR Daily readers can access the original research behind our articles for free on JSTOR.

The American Historical Review, Vol. 109, No. 2 (April 2004), pp. 387-416
Oxford University Press on behalf of the American Historical Association