A Primer on Forensic Science

Aug. 13, 2012, by Liz Porter

Skulls on the beach of Punuk Island, Alaska

The science

Many experts start their forensic science timeline in 1810, when a German scientist applied a chemical test for a particular ink dye to a document. Three years later, the Spaniard Mathieu Orfila published his Toxicologie Générale, the work for which he is usually regarded as the father of modern toxicology.

In 1835, Londoner Henry Goddard, a member of the Bow Street Runners, the unofficial police force set up by the writer and magistrate Henry Fielding, initiated the use of bullet comparison when investigating a burglary. He spotted a flaw in a bullet lodged in a bed’s headboard, matching it both to other bullets in the suspect’s gun and to the mould from which the bullets had been made. This enabled him to solve the crime. The butler did it, then invented a story about a masked intruder to cover up his own “inside job.”

In the following year, English chemist James Marsh, inventor of a method to measure small amounts of arsenic ingested or absorbed by a human body, used the technique in the trial of a man accused of poisoning his grandfather. But the jury, confused by the complexity of Marsh’s testimony, acquitted the man. As a result, the scientist improved and simplified his method into a technique that was easier to explain to lay people, devising a test for arsenic in dead bodies which became known as the Marsh Test.

In 1850, French physician Marcel Bergeret was able to exonerate a couple accused of killing a baby whose mummified remains had been unearthed while they were renovating their apartment. Workmen removing brickwork behind the mantelpiece had discovered the remains. Bergeret carried out an autopsy on the body, finding moths and flesh-fly larvae on it. Using his knowledge of the succession of insects that visit dead bodies (flesh flies colonize a corpse soon after death, while moths arrive only once it has dried out), he calculated that the moths had grown from eggs laid in 1849, and that the flies must therefore have laid their eggs on the newly dead body in 1848, when it was walled in, before the current occupants had moved into the apartment. Suspicion then fell on previous occupants, specifically a young woman who had appeared at one point to be pregnant but had never been seen with a baby. She was arrested and tried for murder but was acquitted because the cause of her baby’s death could not be established.

The title of “father of modern forensic science” tends to be given to French police scientist Edmond Locard, who set up the world’s first police laboratory in Lyons in 1910. His exchange principle – that every contact leaves a trace, set out in his L’Enquête Criminelle et les Méthodes Scientifiques (Criminal Enquiry and Scientific Methods) – remains the basic precept of 21st-century crime scene investigation and trace evidence collection. “There is no such thing as a clean contact between two objects,” he declared. “When two bodies or objects come in contact they mutually contaminate each other with minute fragments of material.”

Locard also identified “the microscopic debris that covers all our clothing and bodies [as] . . . the mute witness, sure and faithful of all our movements and all encounters.”  On that point, he acknowledged the intellectual debt he owed to Scottish physician Arthur Conan Doyle, creator of the fictional detective Sherlock Holmes, and to Austrian law professor Hans Gross. Locard believed that Gross was the first person to discuss the use of dust in crime investigation. In a book published in 1891, Gross describes the clothes of a suspect being placed in a paper bag, which was then beaten with a stick. Under the microscope, the released dust was revealed to comprise fragments of sawdust and glue, confirming that its wearer was a cabinetmaker.

Similarly, in the story “The Adventure of Shoscombe Old Place,” Holmes uses a microscope to examine dust from a cap found beside the dead body of a policeman, concluding that it contains threads and fragments of glue: evidence that implicated as a potential murder suspect a picture-frame manufacturer who worked with glue.


Blood groups

The dawn of the 20th century brought a revolutionary forensic discovery: human blood groups. University of Vienna immunologist Karl Landsteiner identified four major blood types, which he designated A, B, AB and O. His system was based on antibodies in the blood plasma and antigens (compounds that stimulate the production of antibodies) on the surface of red blood cells. Particular types of blood occur with varying frequency in different populations. Among Australian Caucasians, for example, 46 per cent are in the O group, 38 per cent are A, 13 per cent are B, and 3 per cent are AB. In the Caucasian populations of the United States and the United Kingdom, the ratios are slightly different, and some ethnic populations have no members with AB or B type blood. In 1902, Landsteiner, together with Max Richter of the Vienna University Institute of Forensic Medicine, suggested that the “ABO typing” of crime scene blood stains, both wet and dry, could be used to help identify suspects.

By the late 1930s, scientists knew that about 78 per cent of the population are “secretors”: their ABO blood and other blood grouping substances can be detected in bodily fluids other than blood, such as saliva or semen. This means that a crime scene semen sample or a swab of saliva from a cigarette might yield the blood type of a suspect. A test for “secretor” status was later developed. Called the Lewis test, it examined red cell antigens known as “Lewis antigens” and indicated whether an ABO group reading would also show up in an individual’s semen or saliva.

Over the ensuing decades, successive blood grouping schemes were developed, allowing a combination of systems, each based on a different protein or enzyme marker, to be used in the hunt for the origin of a particular blood stain. In the early 1970s, the Scotland Yard laboratory introduced a system based on the enzyme phosphoglucomutase (PGM), which was soon adopted around the world. By 1980, scientists were routinely using more than 10 other systems at once. By combining several test results, they could narrow down the search for a suspect dramatically. The key was the stable frequency of each blood type in a population, combined with some simple arithmetic.

Type A blood left by a killer with a cut hand could belong to 38 out of any 100 Australians. If that blood was also PGM type 1+ (a type possessed by 19 per cent of the population), then the killer’s combination of types would be found in only about seven per cent of the population, or roughly 70 out of every 1000 people. A series of up to 10 further tests might tell investigators that they were looking at a blood profile found in only one in every 1000 Australians.
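
That arithmetic is simply the multiplication of frequencies, on the assumption that the marker systems are independent of one another. Here is a minimal Python sketch of the calculation, using the illustrative Australian figures quoted above:

```python
# A minimal sketch of combining independent blood-marker frequencies.
# The figures are the illustrative ones quoted in the text; real casework
# frequencies vary by population and by marker system.

marker_frequencies = {
    "ABO type A": 0.38,   # 38 per cent of Australians
    "PGM type 1+": 0.19,  # 19 per cent of the population
}

# If the markers are inherited independently, the chance that a random
# person carries all of them is the product of the individual frequencies.
combined = 1.0
for freq in marker_frequencies.values():
    combined *= freq

print(f"Combined frequency: {combined:.3f}")        # 0.072, about 7 per cent
print(f"Roughly {combined * 1000:.0f} people in every 1000")
```

Each additional independent marker multiplies down the frequency of the combined profile, which is how stacking 10 or more systems could shrink the pool of possible sources to one person in a thousand.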


Fingerprints

Fingerprinting was also introduced at the start of the 20th century. The notion that fingerprints could be used to identify individuals in criminal cases was first suggested in 1877 by a U.S. microscopist, Thomas Taylor, but his ideas were not pursued. More attention was attracted by an article called “On the Skin-Furrows of the Hand” by Tokyo-based missionary Dr. Henry Faulds, published in the scientific journal Nature in 1880. India-based British civil servant William James Herschel replied with an article in the same journal a month later, explaining that he had been using fingerprints for identification purposes on contracts for more than 20 years.

In 1901, the assistant commissioner of Scotland Yard set up a fingerprint bureau to record the prints of criminals. By 1903, prisoners in New York State were being fingerprinted, and, in 1911, a New York Police Department detective presented the first fingerprint evidence to be admitted in a U.S. court: a finger mark on a shop window introduced by the prosecutor of a burglary case.

The FBI had 100 million fingerprint cards in its files by 1946, and an estimated 200 million by 1971. Individual police departments also had their own files – and, as difficult as it is to imagine in the post-CSI world, police fingerprint experts solved crimes manually, comparing prints taken at crime scenes with the fingerprint cards of convicted criminals already on file.

In 1963, Arnold Sauro, a Los Angeles Police Department fingerprint expert, was called to a Hollywood apartment where the savagely beaten body of restaurant waitress Thora Rose had been found lying on a blood-soaked bed. The assailant, who had robbed and attempted to rape his victim, had left prints on window slats as he broke in, and had touched the windowsill, the kitchen sink and the door jambs. Dusting the prints with aluminum-based powder, pressing tape over them, then lifting it and pressing the resulting fingerprint pattern onto cards, Sauro managed to obtain 36 clear prints. They were checked against nearly 30,000 sets on file at the LAPD and the California Department of Justice, but there was no match. Clearly, Thora Rose’s killer had no prior criminal record in California.

Meanwhile, the detectives investigating the case, working on the theory that the man was a serial rapist who attacked women in their homes, checked out 3,000 potential suspects, including delivery men, gardeners and hairdressers – all with no success.

Sauro was determined to solve the case. As soon as he arrived at work each day, the Thora Rose file was the first he looked at. Over the next three years, he continued to check its crime scene prints against the thousands of new sets that came in. There was never a match.

No investigator could have been more persistent than Sauro, who kept tabs on the case after he was transferred, and regularly phoned in from Las Vegas, where he had retired in 1978. But it took the advent of computerized records, and then the scanning of old case cards, for the murder to be solved.

The first semi-automated fingerprint system was set up by the Royal Canadian Mounted Police in 1973, while the FBI set up its automated card-scanning system in 1975. The first U.K. system, computerizing a database of people convicted of breaking and entering, was established in the late 1970s. In 1984, the California Department of Justice set up its automated fingerprint system, a monstrous logistics exercise: as well as new prints being recorded and scanned in, tens of thousands of old print cards had to be entered. The system worked almost too well, bringing up more hits than investigators could process and more cases than the District Attorney’s Office had prosecutors for.

By 1990, the Californian system held more than a million prints, and it could check new unidentified prints against the whole database in 45 minutes – a task that would have taken years to do manually. That year, the head of the Los Angeles Police Department Latent Print section, Wendell Clements, author of the book The Study of Latent Fingerprints: A Science (1987), tested the system by running a random sample of 50 print cards from unsolved crimes. One of them was the Thora Rose case: It had remained open, as murder cases always do, but its print cards had been filed away because investigators were too busy working on current cases and more recent cold cases.

The case was the only one of the test batch to produce a match. The system brought up the fingerprints of Minneapolis-based executive Vernon Robinson, then 45. At the time of the murder, he had been an 18-year-old U.S. Navy recruit with no criminal record. His fingerprints had first been taken in 1968 when, two years after leaving the Navy, he had become an alcoholic and serial petty criminal. He had served three years in San Quentin for robbery and assault, but had then gone to college and had never reoffended. Arnold Sauro returned to the case to appear at Robinson’s murder trial, in 1993, testifying that the cards bearing the crime scene prints were the ones he’d collected. Robinson denied any involvement in the 1963 murder but was found guilty and sentenced to life imprisonment.

Through the late 20th and early 21st century, techniques for raising latent fingerprints (prints invisible to the naked eye) from difficult surfaces improved. Meanwhile, the recognition that palm prints make up between 20 and 30 per cent of the prints lifted from crime scenes, including those on knife handles, guns and steering wheels, led to a focus on palms. Databases of palm prints, first created in 1994, were steadily adopted by police forces around the world through the first decade of the 21st century.

The Australian national fingerprint system, commissioned in 2001, included palm prints. In the United States, state-wide palm print databases were established in 2004 in Connecticut, Rhode Island and California, allowing unidentified latent palm prints to be searched against known offenders in other states. In the U.K., forensic officers began collecting palm prints in 2004, and a palm print database was added to the national fingerprint system in 2006.


Hair analysis

One of the earliest uses of hair evidence in a criminal investigation was recorded in Paris in 1847, when the bloodied body of the Duchess of Praslin was found in the mansion she shared with her husband on Paris’s Rue Saint-Honoré. Her husband’s pistol was found near her body, with strands of her hair on its bloodied handle. The Duke initially claimed that he had been woken by his wife’s screams and had brought the pistol to her room to defend her. He swallowed a fatal dose of poison before he could be arrested, and a hurried bedside “trial” was held, with the hair evidence presented as proof of his guilt, along with the scratches on his arms and the blood-soaked dressing gown and blood-stained knife and sword found in his own apartment.

German physician Rudolph Virchow carried out the world’s first large-scale hair study in 1876, when he began a survey of the hair and eye color of six million German schoolchildren. The first detailed comparative microscopic hair study, Le poil de l’homme et des animaux (The hair of man and animals) was published in 1910 by the professor of forensic medicine at the Sorbonne, Victor Balthazard, and his associate Marcelle Lambert. Balthazard was also an expert witness in the first reported case involving microscopic hair evidence. In 1909, he examined hair found beneath the fingernails of murder victim Germaine Bichon, first judging it to be the hair of a woman and later matching it to the hair of suspect Rosella Rousseau, who later confessed.

By the mid 1930s the comparison microscope was already established as the basic tool for forensic hair analysis. Consisting of two transmitted light microscopes linked with an optical bridge, it allowed the examiner to compare two hairs side by side. Seen in cross-section, every hair comprises three parts: the outer cuticle, the cortex, and the central medulla. Through the microscope, hair color is judged by the detail and arrangement of particles of pigment in the cortex – patterns best seen at magnification of 400 times life size. At this level other detailed features can also be seen in the cortex.

Microscopic hair examination can categorize hair into three different racial groups, judging it as coming from people with European, Asian or African ancestry. But, where possible, 21st-century hair analysis involves the extraction of DNA.

Hair analysis is also used to investigate an individual’s past or present drug use, with toxicologists regularly testing hair to detect the 31 most commonly abused drugs – from opiates, amphetamines, cocaine, cannabis and methadone to prescription drugs such as the benzodiazepines, which include anti-anxiety and sleeping pills.

In recent years U.K. police have begun to use stable isotope ratio analysis (SIRA) of human hair to tell them where its owners have lived or travelled. The technique is based on the fact that, although all water molecules are composed of two atoms of hydrogen and one of oxygen, different water sources contain subtly varying concentrations of hydrogen and oxygen isotopes (forms of the same chemical element with a different atomic weight).

This means that water from one area has a subtly different “isotope signature” from water sampled in another. The traces of that water signature, ingested in food and drink, are found in every cell in the body, and can be detected in hair, fingernails and bones. Measurements are carried out with an isotope ratio mass spectrometer – a $400,000 apparatus that enables scientists to measure the ratios of particular isotopes in different samples of material. Stable isotope profiling, as it is also known, has been hailed as the next big thing in forensic science after DNA.
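
By convention, such measurements are reported in “delta” notation: the per-mil (parts per thousand) deviation of a sample’s isotope ratio from an agreed reference standard, such as Vienna Standard Mean Ocean Water (VSMOW) for hydrogen. The short Python sketch below illustrates the calculation; the VSMOW ratio is a published value, but the hair-sample ratio is invented for the example.

```python
# Illustrative only: isotope abundances are reported as a "delta" value,
# the per-mil deviation of a sample's isotope ratio from a standard.

VSMOW_2H_1H = 155.76e-6  # hydrogen-2/hydrogen-1 ratio of the VSMOW standard

def delta_per_mil(sample_ratio: float, standard_ratio: float) -> float:
    """Per-mil deviation of a sample's isotope ratio from a standard."""
    return (sample_ratio / standard_ratio - 1.0) * 1000.0

# A made-up hair sample containing slightly less heavy hydrogen than
# the standard, as might be seen in hair grown in a temperate climate.
sample_2H_1H = 143.0e-6
print(f"delta-2H = {delta_per_mil(sample_2H_1H, VSMOW_2H_1H):.1f} per mil")
# Prints roughly -81.9 per mil; comparing such values against regional
# drinking-water maps is what lets analysts estimate where hair was grown.
```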

In late 2007, five years after the brutal murder of seamstress Heather Barnett in the U.K. city of Bournemouth, stable isotope profiling produced an intriguing clue. University of Reading archaeology lecturer Dr. Stuart Black analyzed cut hair that had been found in the victim’s hand. His work on the nine-centimeter strands revealed that the hair’s owner lived in Britain, but had visited either eastern Spain or southern France for up to six days, 11 weeks before the hair was cut. The person had also visited Tampa, Florida, for eight days, two and a half weeks before the hair was cut. At the time of going to print, police had yet to trace the hair’s owner. An Italian national was charged with the murder in late 2010, but pleaded not guilty. He was due to face trial in 2011.


Ballistics

British detective Henry Goddard famously solved an 1835 burglary by the use of bullet comparison. As noted earlier, he observed a flaw in a bullet found at the crime scene and then matched it to other bullets in the suspect’s gun and to the mould from which the bullets had been made.

Modern forensic ballistics is based on expert knowledge of the relationship between bullet and gun. Every firearm leaves a ballistic “signature” on the bullets and cartridge cases fired from it. These microscopic markings, unique to each gun, are made by the gun’s barrel, firing pin, firing chamber, extractor and ejector.

Forensic firearms experts can compare the “signature” of a bullet or cartridge recovered from a crime scene with other “signatures” that are already on police files.
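
Conceptually, such a comparison amounts to scoring how well two digitised striation patterns line up. The toy Python sketch below illustrates that idea with invented profile numbers; it is not the algorithm of any real system, which depends on far more sophisticated imaging and scoring:

```python
# A toy illustration of scoring ballistic "signatures": each profile is an
# invented list of striation depths, and similarity is measured with a
# Pearson correlation. Real systems work from 2D/3D imagery, not toy lists.

import math

def correlation(a: list[float], b: list[float]) -> float:
    """Pearson correlation between two equal-length striation profiles."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    norm_a = math.sqrt(sum((x - mean_a) ** 2 for x in a))
    norm_b = math.sqrt(sum((y - mean_b) ** 2 for y in b))
    return cov / (norm_a * norm_b)

crime_scene = [0.2, 0.9, 0.4, 0.7, 0.1, 0.8]   # profile from a recovered casing
on_file = {
    "gun A": [0.3, 0.8, 0.5, 0.7, 0.2, 0.9],   # similar markings
    "gun B": [0.9, 0.1, 0.8, 0.2, 0.7, 0.3],   # very different markings
}

# Rank the guns on file by how closely their signatures match the evidence;
# in practice, the top-ranked candidates would then go to a human examiner
# for confirmation under a comparison microscope.
for gun, profile in sorted(on_file.items(),
                           key=lambda kv: correlation(crime_scene, kv[1]),
                           reverse=True):
    print(f"{gun}: {correlation(crime_scene, profile):+.2f}")
```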

The first scientist to study the striations on bullets and their relationship to specific guns was Alexandre Lacassagne, professor of forensic medicine at the University of Lyons, who published his research results in 1889. In 1912 his compatriot Victor Balthazard, also noted for his hair analysis work, devised a method to match bullets to the guns that fired them, taking photographs of test-fired bullets and ejected cartridge casings, then enlarging and comparing them.

Aided by the 1925 invention of the comparison microscope, comparative bullet analysis became a routine part of late 1920s police work. In 1929 pioneering U.S. forensic ballistics expert Calvin H. Goddard used the device to investigate the guns used in Chicago’s infamous St. Valentine’s Day massacre. On 14 February 1929, seven gangsters, all known associates of the North Side Irish gang’s George “Bugs” Moran, had been mown down by men wearing Chicago police uniforms. But were the shooters actually police officers?

Goddard examined the 70 empty .45 caliber cartridge casings left at the scene, identifying them as having been fired from Thompson submachine guns. Manufacturer’s markings revealed where and when the bullets had been made. Studying two distinctly different ejector marks on the cartridge cases, the expert was able to say that two weapons had fired them, with one having fired 50 cartridges and the other 20. Goddard then test-fired all eight of the Thompson guns owned by Chicago Police. There was no match on the casing marks to those found at the crime scene. A later raid on the home of an associate of Al Capone, leader of the South Side Italian gang, uncovered guns which, when tested, produced markings matching those on the St. Valentine’s Day crime scene cartridges.

This complex and time-consuming work could be done only because Goddard had “suspect” guns to test-fire and compare with cartridges found at the crime scene. He also had only a limited number of guns to test.

By the 1990s, in any average year, the Chicago Police Department was seizing between 10,000 and 15,000 crime guns. By then comparison microscopes had evolved dramatically; specialist bullet comparison devices could offer clear magnifications of up to 1,500 times life size. But the work was still impossibly slow.

The next development in ballistics was a giant leap: a new system that could compare the signatures of tens of thousands of guns in less than an hour. Often described as “DNA for weapons,” the Integrated Ballistics Identification System (IBIS) is now so basic to modern law enforcement that the scriptwriters on “CSI” use it as a verb. “Did you IBIS it?” the show’s fictional forensic specialists ask when discussing any stray bullet or shell found at a crime scene.


Editor’s Note:  “A Primer on Forensics” is taken from the Introduction to Liz Porter’s book, Cold Case Files: Past crimes solved by new forensic science, which is available for Kindle in the United States on amazon.com

http://www.amazon.com/Cold-Case-Files-ebook/dp/B004ZBTHMC. Hard copies are available at www.panmacmillan.com.au
