California Scientific
1000 SW Powell Ct
Oak Grove, MO 64075
Sales@CalSci.com
916-225-5119

BrainMaker Neural Network Software

Neural Network Red-Flags Police Officers With Potential For Misconduct

The Chicago Police Department has used BrainMaker to forecast which officers on the force are potential candidates for misconduct. The Department's Internal Affairs Division used neural networks to study 200 officers who had been terminated for disciplinary reasons and developed a database of the characteristics, behaviors, and demographics found among those 200 officers.

BrainMaker then compared current Department officers against the pattern gleaned from the 200-member control group and produced a list of officers who, by matching the pattern or sharing its questionable characteristics to some degree, were deemed to be "at risk."

This particular application has been highly controversial, drawing criticism from several quarters, the most vocal being Chicago's Fraternal Order of Police. William Nolan, the Order's president, has invoked Orwell, saying the Department's program seems like "Big Brother." Scientific American, Playboy, New Scientist, and Law Enforcement News have all published articles on the ethical implications of the Chicago P.D.'s program, with mixed reviews.

The C.P.D. Internal Affairs Division, however, was pleased with the results. After BrainMaker studied the records of the 12,500 current officers (records that included such information as age, education, sex, race, number of traffic accidents, reports of lost weapons or badges, marital status, performance reports, and frequency of sick leave), the neural network produced a list of 91 at-risk men and women. Of those 91 people, nearly half were found to be already enrolled in a counseling program established by the personnel department to help officers guilty of misconduct. The I.A.D. now intends to make the neural network a supplement to the counseling program because, as Deputy Superintendent Raymond Risley said, the sheer size of the Chicago police force makes it "pretty much impossible for all at-risk individuals to be identified [by supervisors]."
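For readers curious what such a screening pass looks like in practice, the following is a minimal sketch under stated assumptions, not BrainMaker itself (whose internals are proprietary): it uses a generic feedforward classifier from scikit-learn, entirely synthetic data, and feature names, a network size, and a flagging threshold chosen purely for illustration.

    # Minimal sketch of a "train on known cases, score the roster" workflow.
    # This is NOT BrainMaker; it uses scikit-learn's generic feedforward
    # network, and every record here is a synthetic stand-in.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    features = ["age", "education_years", "traffic_accidents",
                "lost_equipment_reports", "sick_leave_days"]

    # Hypothetical training records: officers whose disciplinary outcome is known.
    X_train = rng.normal(size=(400, len(features)))
    y_train = (X_train[:, 2] + X_train[:, 4] + rng.normal(size=400) > 1.0).astype(int)

    scaler = StandardScaler().fit(X_train)
    net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    net.fit(scaler.transform(X_train), y_train)

    # Score the current roster and flag the highest-risk matches for human review.
    X_roster = rng.normal(size=(12500, len(features)))
    risk = net.predict_proba(scaler.transform(X_roster))[:, 1]
    flagged = np.flatnonzero(risk > 0.95)
    print(f"{flagged.size} officers flagged for review")

As in Chicago, the flagged list would then go to human reviewers rather than trigger any automatic action.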

Terry Heckart, a graduate student at Ohio's Bowling Green State University, recommended neural networks to Chicago's Internal Affairs Division. Heckart told Division officials that the software could be effective for two reasons: first, as the number of variables in the application increases, the reliability of the output increases; and second, neural networks can deal with missing data. That, says Risley, "was really the key to solidifying our interest."
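Since the ability to cope with missing data is cited as the deciding factor, here is a minimal sketch of one common way to get that behavior: impute the gaps and append "was this field missing" indicator columns so the network can learn from the incompleteness itself. How BrainMaker actually handled gaps is not described here; the pipeline and data below are hypothetical.

    # Minimal sketch: tolerating missing fields via imputation plus indicator
    # columns. An illustration only, not BrainMaker's documented mechanism.
    import numpy as np
    from sklearn.impute import SimpleImputer
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 5))
    X[rng.random(X.shape) < 0.15] = np.nan        # knock out about 15% of the entries
    y = rng.integers(0, 2, size=300)

    model = make_pipeline(
        SimpleImputer(strategy="mean", add_indicator=True),   # fill gaps, flag them
        MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=1),
    )
    model.fit(X, y)                 # trains despite the incomplete records
    print(model.predict(X[:5]))     # scoring also tolerates missing fields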

"We're very pleased with the outcome," Risley says, "We consider it much more efficient and capable of identifying at-risk personnel sooner than command officers might be able to do. The old method just can't compete with it."

The Chicago Police Department stresses that the program utilizing BrainMaker has no punitive ramifications. Risley notes that "it's not disciplinary . . . it's an opportunity for an officer who is moving in the wrong direction to rehabilitate himself . . . if an officer refuses to participate, nothing happens to him."

Despite the ethical discussion raging over whether a neural network should be used to monitor human beings, the program cannot be accused of being subjective or personally biased, as "manned" programs often are. Clearly, the software holds no personal grudges and seeks only to dispassionately identify patterns and characteristics that could spell trouble. The alternative system, being human-based, cannot avoid some degree of subjectivity and bias. It is worth noting that the Fraternal Order of Police "vehemently opposed" the Department's old system for that very reason.

To counterbalance the inherent "dispassion" of the neural network, the Department closely examined the net's findings to ensure that officers who are clear anomalies, and thus don't warrant being on the list, are removed from consideration. This combination of objective technology and subjective humanity does not necessarily spell perfection, but it does signify a promising move in that direction.

Currently, we are told, the Chicago Police Department does not use BrainMaker to forecast problems with officers. The program was apparently terminated due to its controversial nature.