Polygraph technology first appeared in 1902 and became commercially available in 1908. Dr. James MacKenzie introduced it to the medical community as a device for diagnosing heart conditions.

Over in the United States, Dr. William Moulton Marston invented his Systolic Blood Pressure Test in 1915, providing the inspiration for John A. Larson’s “Sphyggy,” which would become the first polygraph used in law enforcement cases.

Leonard Keeler picked up the mantle from Larson, introducing his “Emotograph” in 1925. Keeler would become “The Father of the Modern Polygraph,” working with Associated Research of Chicago to produce a series of instruments until the 1970s.

However, the introduction of software systems in the late 1980s and early 1990s changed the industry forever. This post looks at the development and evolution of polygraph technology from the 1990s to the 2020s.


The Keeler Polygraph – A History of the Evolution of the Lie Detector Machine

Leonard Keeler took John A. Larson’s polygraph designs and refined them to make the instrument easier to set up and manage. He removed inefficiencies from Larson’s machine, such as its need to scribe results on smoked paper, which then required shellacking for storage.

Keeler’s innovations saw his polygraph designs adopted by the CIA and the DOD, becoming the gold standard of polygraph technology at the time. Along the way, other companies, such as Stoelting, began developing analog polygraph systems of their own and gained a foothold in the market.


The Introduction of Polygraph Software

While Keeler was a true innovator, he died in 1949 at the age of 45, long before computers entered everyday life. During the late 1970s and 1980s, people like Bill Gates and Steve Jobs introduced the world to the personal computer, sparking the digital revolution.


Stoelting and CPS

By the late 1980s, companies were already working on software-based polygraph systems. The age of the algorithm was upon us, and deception detection was forever changed. In 1988, Scientific Assessment Technologies developed the “CPS” system in collaboration with the Stoelting company, building on research conducted by John Kircher and David Raskin at the University of Utah’s psychology laboratory.

Much of this work grew out of the Computer Assisted Polygraph System (CAPS) developed in the 1980s. CAPS was built on data gathered in laboratory environments using simulated crime scenarios, whereas the newer CPS relied on data provided by US Secret Service criminal investigations.

The CPS algorithm applies standard multivariate linear discriminant function analysis to its data sets, using the results to estimate the probability of deception or truthfulness in the examinee.

Modern versions of CPS were released in the 2010s, with the most recent update to the software relying on three features to calculate polygraph scores: increases in cardiograph readings, skin conductance amplitude, and combined upper and lower respiration measurements.
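To make that concrete, here is a minimal sketch of how a discriminant-analysis scorer of this general kind could be built, assuming one numeric value per feature for each examinee. It uses scikit-learn’s LinearDiscriminantAnalysis; the feature values, labels, and resulting probability are entirely hypothetical and do not reproduce the actual CPS algorithm or its training data.

```python
# Illustrative only: a linear discriminant classifier over three physiological
# features, in the spirit of the CPS approach described above. All numbers
# below are made up; they are not real polygraph case data.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Each row: [cardio increase, skin conductance amplitude, combined respiration]
X_train = np.array([
    [0.82, 0.91, 0.35],
    [0.75, 0.88, 0.40],
    [0.69, 0.80, 0.44],
    [0.30, 0.22, 0.78],
    [0.25, 0.18, 0.81],
    [0.33, 0.27, 0.74],
])
y_train = np.array([1, 1, 1, 0, 0, 0])  # 1 = deceptive, 0 = truthful

model = LinearDiscriminantAnalysis()
model.fit(X_train, y_train)

# Score a new examinee's charted responses (hypothetical values)
exam_features = np.array([[0.70, 0.85, 0.42]])
p_deception = model.predict_proba(exam_features)[0, 1]
print(f"Estimated probability of deception: {p_deception:.2f}")
```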

The new CPS algorithms build analytically on the original Utah numerical scoring, which is similar to the Seven-Position Numerical Analysis Scale. The Department of Defense Polygraph Institute (DoDPI) currently teaches this manual scoring system.


Axciton Systems and PolyScore

While Stoelting was working on CPS, Axciton Systems collaborated with the Johns Hopkins University Applied Physics Laboratory to produce its “PolyScore” algorithm. PolyScore is widely regarded as the first-to-market innovation in the software space, with its original system released in the early 1990s.

Bruce White of Axciton Systems began his research into developing PolyScore in 1988, supplying the Johns Hopkins team with data sets collected on Axciton instruments. PolyScore takes digitized signals as inputs and outputs a probability of deception based on logistic regression or neural network models.

The PolyScore algorithm accumulates the digitized signals from blood pressure, galvanic skin response, and upper respiratory activity and transforms them into more fundamental signals, isolating the portions that carry information relevant to detecting deception.

PolyScore’s developers selected the algorithm’s features from these signals based on their empirical performance rather than on psychophysiological assumptions. Today, the latest versions of PolyScore are in service on Axciton and Lafayette polygraph instruments, and the companies base the algorithm’s ongoing development on case data received from the DoDPI.
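As a rough illustration of that kind of pipeline, the sketch below turns three simulated digitized channels into simple reaction features and feeds them to a logistic-regression classifier. The window boundaries, feature definition, and training values are invented for illustration; they are not the actual PolyScore features, signal processing, or data.

```python
# Illustrative only: digitized channels -> reaction features -> logistic
# regression probability of deception. All signals and numbers are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

def reaction_feature(signal, pre, post):
    """Relative change in mean level after a question vs. the pre-question baseline."""
    baseline = np.mean(signal[pre])
    response = np.mean(signal[post])
    return (response - baseline) / (abs(baseline) + 1e-9)

# Simulated samples around a single question (hypothetical channels)
rng = np.random.default_rng(0)
cardio = rng.normal(1.0, 0.05, 600); cardio[300:] += 0.20  # blood pressure
gsr = rng.normal(0.5, 0.02, 600); gsr[300:] += 0.10        # galvanic skin response
resp = rng.normal(0.8, 0.03, 600); resp[300:] -= 0.05      # upper respiration

pre, post = slice(0, 300), slice(300, 600)
features = [reaction_feature(s, pre, post) for s in (cardio, gsr, resp)]

# Train on previously scored exams (values invented for illustration)
X_train = np.array([
    [0.20, 0.18, -0.06],
    [0.15, 0.22, -0.08],
    [0.02, 0.01, 0.00],
    [0.01, 0.03, 0.01],
])
y_train = np.array([1, 1, 0, 0])  # 1 = deceptive, 0 = truthful

clf = LogisticRegression().fit(X_train, y_train)
print("Estimated probability of deception:", round(clf.predict_proba([features])[0, 1], 2))
```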

In contrast to CPS, PolyScore doesn’t attempt to recreate the manual scoring processes used by examiners. However, beyond a heuristic level, neither algorithm draws on more fundamental research into the psychophysiological processes underlying the recorded signals.


The Shift to Polygraph Software in the 1990s

The success of PolyScore and CPS opened the floodgates to the software era. Keeler’s devices, which had previously dominated the market, were now obsolete. We imagine that Larson and Keeler would be shocked to see their original technology evolve into the algorithm-driven systems of the 1990s.

Many companies started jumping on the algorithmic bandwagon after discovering Axciton and Stoelting’s success with PolyScore and CPS. Two other companies would emerge over the coming decade, with Lafayette Instruments becoming another huge player in the software space.

The new computerized polygraph software did away with the need for analog systems. There was no more hassle with ink pens clogging during tests or storing reams of chart paper. By the mid-to-late 90s, software systems had overtaken analog machines in law enforcement and national security operations, and with the coming era of laptop technology, the analog device would soon be done away with altogether.

The 1990s were still the early era of personal computing. The decade saw the introduction of Microsoft’s Windows operating system and Intel’s Pentium processor; before that, computers were bulky and ran on systems offering a limited user experience.

The internet wasn’t in widespread use during the early 1990s, but by the end of the decade the technology had taken a huge leap forward and global adoption had begun. Polygraph software continued its progression from the early CPS and PolyScore systems into the era of the 2000s.


Advances in Computer Technology During the 2000s

The 2000s saw the internet spread into every avenue of society, becoming one of the most important technological advancements in human history. With tech companies realizing the importance of computers and the internet in the new millennium, billions of dollars started pouring into R&D.

Companies like Limestone Technologies, founded in 2003, made the polygraph software space even more competitive, and with more competition comes more innovation. The laptop also underwent a huge change in the 2000s, with models becoming slimmer, lighter, and more portable.

The increase in processing power also meant these machines could run polygraph software. Soon, polygraph firms were tossing out any old analog systems they had and making the switch to software. Over this period, the big four polygraph software developers emerged: Stoelting, Lafayette Instruments, Limestone Technologies, and Axciton Systems.

Lafayette Instruments would eventually acquire Limestone Technologies in 2023, increasing its market share and footprint to become the leader in the space, with Stoelting following close behind. As the 2000s progressed into the 2010s, computerized technology continued to advance by leaps and bounds, moving into highly sophisticated systems.

The US Department of Defense introduced the Psychological Detection Device-1 (PDD-1), the first computerized polygraph system used in US military operations. National security agencies like the DOD, NSA, FBI, and CIA were already well advanced in using polygraph technology to question candidates, employees, operatives, and state enemies. However, the introduction of polygraph software increased its accuracy, and the new systems were adopted throughout the national security complex.

The National Institute of Justice (NIJ) released its first draft of standards for using computerized polygraph systems in 2005. NIJ Standard-0110.01 set out rules for the technical specs of the hardware and software used in these systems, the procedures for conducting polygraph exams, and the training of polygraph examiners.


Modern Polygraph Software Vs. the 1990s

One of the biggest changes in polygraph tech from the 1990s to the 2020s was the improvement in accuracy that algorithms brought to the industry. Previously, the “Keeler” and “Reid” polygraphs, the gold standard in polygraphy, were considered only 60% to 70% accurate at detecting deception.

Due to the poor results provided by these systems, courts and lawmakers introduced rules and legislation from the 1920s through to the late 1980s to protect the population from polygraph exploitation. For instance, the “Employee Polygraph Protection Act of 1988” (EPPA) sought to protect employees from being polygraphed by employers when applying for jobs or while on the job.

However, the introduction of software-based systems dramatically increased polygraph accuracy. According to the American Polygraph Association (APA), the benchmark for accuracy rose from an average of 60% to 70% to between 87% and 97%.

Despite the huge improvements in testing accuracy with computerized polygraph systems, the old laws didn’t change to accommodate these updated results. The EPPA remains in place, as do many of the old rules, such as Rule 702 of the Federal Rules of Evidence, which is used to prevent the admission of polygraph results as evidence in court cases.

It’s doubtful that these laws will change, and there are no plans to alter them to accommodate the updates to polygraph technology made in the last two decades. However, many private companies continue relying on computerized polygraph exams to assist them with hiring and firing.

Some industries remain exempt from the EPPA, and some states allow for the admission of polygraph results, depending on the merits of the case and the parties involved in the legal proceedings.


The Beginning of AI Polygraph Systems

While the introduction of computerized polygraph technology and algorithms changed the industry, an even bigger innovation is waiting in the wings. The mid-2010s saw huge advancements in artificial intelligence (AI). AI allowed developers to create advanced systems across many tech industries, and those innovations also spread to polygraph science.

The US and the EU were the first to develop AI-based polygraph systems for use in their immigration control practices. “AVATAR” (Automated Virtual Agent for Truth Assessments in Real-Time) is the first example of such a project. AVATAR is an AI-based system built by the Israeli company “Nemesysco.”

Nemesysco used data from its Layered Voice Analysis (LVA) systems, built in the late 1990s, to train the system. AVATAR was first introduced in 2002 and continued to develop, later adding facial recognition to enhance its accuracy. The US Department of Homeland Security and US Border Patrol launched a pilot test of the AVATAR system in 2012 to screen travelers arriving in the United States. The system can detect whether the test subject is using countermeasures while being questioned, such as curling the toes or clenching the thigh muscles.

The AVATAR pilot showed promise at detecting deception, with accuracy rates of between 60% and 75%. Developers suggest that, with the right training, AVATAR could be up to 85% accurate at detecting deception in examinees.

The EU also jumped on the AI bandwagon, using the technology to produce its version of AVATAR, known as “iBorderCtrl.” This project was an initiative from the European Union’s Horizon 2020 research and innovation program and underwent pilot testing in European countries between 2017 and 2019.

The main companies involved with the development of the iBorderCtrl system are European Dynamics Belgium SA and European Dynamics Luxembourg SA.

The system analyzes the facial micro-expressions of travelers while asking them questions such as, “Do you have anything in your suitcase that might be illegal to bring into the country?” It then follows up with, “If you were to open your suitcase now, would there be anything in there that would concern customs?”

Like AVATAR, iBorderCtrl identifies deception with similar accuracy. According to its developers, it could reach an accuracy rate of up to 85% with the right training.

Both AVATAR and iBorderCtrl can potentially assist with better security screening and reduce the resources allocated to the human screening of suspected terrorists, smugglers, or individuals presenting a threat to national security.

These are still the early days of AI technology, and with the advancements seen in the early 2020s, it’s only a matter of time before AI-based systems replace the current generation of polygraph software.