Cops Bust Man Who Said Wind Blew Cocaine Into His Car

A Florida man was arrested after he tried to claim that the wind blew a coke baggie into his car, according to a report.
Joseph Zak, 37, was busted trying to throw something away when he was pulled over in Fort Pierce last month for failing to pause at a stop sign, the Smoking Gun reported.
Police searched his car and found a crack pipe in the center console as well as a clear baggie with white residue, which later tested positive for crack cocaine.
When questioned about the baggie, Zak said that it didn’t belong to him and that the “wind must have placed it there,” according to an affidavit obtained by the outlet.
Zak was arrested on drug paraphernalia charges and brought to the St. Lucie County Jail, where he was released on bond.
He’s set to appear in court Dec. 3 for his arraignment, the outlet reported.







Google Maps Will Tell You Where Police Are Hiding On Roadways

In a new effort to compete with the traffic app Waze, Google Maps is adding new features, including the ability to see where officers are catching speeders.

Users around the world will be able to report in the app where police officers are hiding, and those reports will then be shown to other users along the route.

The update will also add an option to identify things like construction, lane closures, disabled vehicles, and objects in the road, which may be slowing down traffic.

An early version of the reporting feature is already available for Android, but updates just started rolling out to Apple users.


To watch the story by WSYX/WTTE click Source.

New Jersey Woman Faces 10 Years in Prison for Deadly Texting While Driving Case

CBS News recently covered the case of a New Jersey woman who is facing up to a decade in prison after being convicted in a groundbreaking case. She was texting while driving and killed a pedestrian, in a state that now considers that offense just as serious as drunk driving.

Surveillance video shows the moments before Alexandra Mansonet’s black Mercedes plowed into the back of a red Toyota Corolla. The impact was so hard that it propelled the Corolla into 39-year-old Yuwen Wang, who was in the crosswalk. Five days later, Wang died in the hospital.

During Mansonet’s trial, prosecutors claimed she was texting about dinner plans. But Mansonet told jurors she had looked down for a moment to adjust the defroster.

“The car was right in front of me, so I um, I hit the car,” she said.

“[There was] no evidence in our accident investigation that showed that there was evasive action taken, or any skid marks that would show that she braked, so the first time she realized that she had struck something is when the actual collision occurred,” said prosecutor Chris Gramiccioni.

Last Friday, Mansonet was found guilty of vehicular homicide. It’s believed to be the first time a 2012 New Jersey law that treats a texting driver as harshly as a drunken driver was tested in court.

Forty-eight states and Washington, D.C., now ban text messaging for all drivers. Fourteen percent of distracted-driving crashes in 2017 were linked to cellphone use.

Mansonet is now awaiting sentencing. Prosecutions in distracted-driving cases are rare, but this one may serve as a wake-up call for drivers nationwide.


You Got a Brain Scan at the Hospital. Someday a Computer May Use It to Identify You.

In a disturbing experiment, imaging and facial recognition technologies were used to match research subjects to their M.R.I. scans.

Thousands of people have received brain scans, as well as cognitive and genetic tests, while participating in research studies. Though the data may be widely distributed among scientists, most participants assume their privacy is protected because researchers remove their names and other identifying information from their records.

But could a curious family member identify one of them just from a brain scan? Could a company mining medical records to sell targeted ads do so, or someone who wants to embarrass a study participant?

The answer is yes, investigators at the Mayo Clinic reported on Wednesday.

A magnetic resonance imaging scan includes the entire head, including the subject’s face. And while the countenance is blurry, imaging technology has advanced to the point that the face can be reconstructed from the scan.

Under some circumstances, that face can be matched to an individual with facial recognition software.

In a letter published in the New England Journal of Medicine, researchers at the Mayo Clinic showed that the required steps are not complex. But privacy experts questioned whether the process could be replicated on a much larger scale with today’s technology.

The subjects were 84 healthy participants in a long-term study of about 2,000 residents of Olmsted County, Minn. Participants get brain scans to look for signs of Alzheimer’s disease, as well as cognitive, blood and genetic tests.

Over the years, the study has accumulated over 6,000 M.R.I. scans. (Participants are not told the results of their tests.)

After the participants agreed to the experiment, a team led by Christopher Schwarz, a computer scientist at the Mayo Clinic, photographed their faces and, separately, used a computer program to reconstruct faces from the M.R.I.’s.

Then the team turned to facial recognition software to see if the participants could be correctly matched. The program correctly identified 70 of the subjects. Only one correct match would be expected by chance, Dr. Schwarz said.

Admittedly, he added, this was a fairly simple test. The facial recognition software only had to search through photos of 84 people, not thousands or millions.
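The “one correct match by chance” baseline can be checked with a quick simulation (a hypothetical sketch for illustration, not the Mayo team’s code): randomly pairing photos with scans is a random permutation, and a random permutation of any size leaves on average exactly one item in its original position.

```python
import random

def expected_chance_matches(n_subjects, n_trials=20_000, seed=0):
    """Estimate how many photo-to-scan matches random guessing yields.

    Each trial assigns every reconstructed face a randomly chosen photo
    (a random permutation) and counts the correct pairings, i.e. the
    fixed points of the permutation.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        guesses = list(range(n_subjects))
        rng.shuffle(guesses)
        total += sum(1 for i, g in enumerate(guesses) if i == g)
    return total / n_trials

# With 84 subjects, random matching averages about one correct hit,
# far below the software's 70.
print(expected_chance_matches(84))
```

The average comes out very close to 1.0 regardless of the number of subjects, which is why 70 correct matches out of 84 is so far above chance.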

But the fact that this was a straightforward test is “beside the point,” said Aaron Roth, a computer scientist and privacy expert at the University of Pennsylvania.

“It is clear that eventually this will be a worrying attack” on stored medical data, he said.

The more likely abuse may be even easier than the method tested by the Mayo researchers, Dr. Roth said. Imagine that a bad actor already knew that a particular person was a study subject, and perhaps had some information regarding age and gender.

Under those circumstances, it should be far less difficult to find that person’s M.R.I. than to start with the scan and discover the subject’s identity. The task is “unfortunately reasonably straightforward,” Dr. Schwarz said.

The privacy threat is real, said Dr. Michael Weiner of the University of California, San Francisco.

Dr. Weiner directs a national study called the Alzheimer’s Disease Neuroimaging Initiative, which has enrolled 2,400 healthy people in an effort to find signs of dementia before a person shows symptoms.

With the publication of the research by the Mayo Clinic, he said, the initiative’s administrators will send letters to participating research centers informing them of the potential for privacy breaches.

The data in the study are stripped of identifying information, like participants’ names and Social Security numbers, but their M.R.I. scans do include faces. The only privacy protection for subjects so far has been the fact that researchers who want to access data from the study have to sign agreements saying that they will not try to identify participants.

“There have been millions of image downloads,” said Dr. Arthur Toga of the University of Southern California, whose group sends out M.R.I. scans and other data to researchers who request them from A.D.N.I. About 6,300 investigators have received study data, he said.

Dr. Weiner is himself a participant in that study, and his brain scans are included in the research data.

“My genetics are there,” he said. “All my tests are there. I bet there are a lot of images of me on the internet. You could match me to an A.D.N.I. subject code and look at all of my data.”

“The question is, what can we do now?”

The obvious way to fix the problem would be to remove faces from M.R.I. scans stored in databases. That process, though, blurs the image of the brain.

Also, fixing images in that way would not help protect the privacy of millions of subjects whose brain scans are already stored by A.D.N.I., the Mayo study and other large research projects.

Dr. Schwarz said his group is working on another solution, but declined to say what it is. Yves-Alexandre de Montjoye, a privacy researcher at Imperial College London, questioned whether an easy fix is even possible.

“If it doesn’t exist, that raises a lot of questions about how M.R.I. data is used,” he said. The Mayo group’s letter, he added, “is a good warning.”


Man Convicted of Murder Argues His Life Sentence Was Fulfilled When He Briefly Died

DES MOINES, Iowa – A man convicted of murder was rushed from the Iowa State Penitentiary to a hospital in 2015 where his heart was restarted five times.

He claims his life sentence was fulfilled by his short-lived death, and he has overstayed his prison time by four years.

Benjamin Schreiber, found guilty of first-degree murder in 1997 and sentenced to life behind bars without the possibility of parole, was hospitalized in March 2015 after large kidney stones caused him to develop septic poisoning, according to court records.

By the time he arrived at the hospital, he was unconscious, records show.

Though Schreiber signed a “do not resuscitate” agreement years earlier, medical staff called his brother in Texas who told them, “If he is in pain, you may give him something to ease the pain, but otherwise you are to let him pass,” according to court records.

Doctors proceeded to save his life by administering resuscitation fluids through an IV. Then he underwent surgery to fix the damage done by the kidney stones.

Schreiber filed for post-conviction relief in April 2018, claiming that because he momentarily died at the hospital, he fulfilled his life sentence and should be freed immediately.

He was sentenced to life without parole “but not to life plus one day,” Schreiber argued in court, records show.

The district court denied Schreiber’s request, writing that it found his claim “unpersuasive and without merit.”

The Iowa Court of Appeals affirmed the district court’s decision Wednesday, agreeing that Schreiber’s sentence isn’t up until a medical examiner declares he is deceased.

The district court did not address Schreiber’s additional claim that his due process rights were violated when the doctors failed to follow his “do not resuscitate” request, court records show. The court of appeals said in its ruling that it could not address the matter either as a lower court had not made any judgment on it.

Schreiber’s attorney could not immediately be reached for comment Thursday.


AG Inventory Finds Greene County Failed to Test 312 Sexual Assault Exam Kits

New reports show that Missouri law enforcement and other agencies hold almost 6,200 sexual assault exam kits that have never been tested, including 312 in Greene County, and 1,700 kits that are not associated with any police report. The recent inventory, conducted by the Attorney General’s office using $700,000 in federal grant money, found 312 untested sexual assault exam kits in Greene County: 234 in the possession of the Springfield Police Department, 30 held by the Greene County Sheriff’s Office, 17 at Cox South, 12 held by Republic Police, and 8 held by Willard Police.

“The important thing to do is to get the profiles done, uploaded into CODIS and respond to hits,” said former Jasper County Judge M. Keithly Williams. “One of the things we have done is to make sure there is an investigation to follow up on CODIS hits.”

The inventory consumed about $700,000 of the federal grant, and the creation of a tracking system is the next project the grant will support. Attorney General Schmitt projects that the grant should provide enough money to test about 1,250 of the kits in inventory.

We expect an increase in the number of arrests and charges for sexual assaults in the near future in Greene County and Southwest Missouri, due to the testing of old sexual assault kits. It is never too early to get an attorney on your side to deal with law enforcement or prosecutors for you. If you or a loved one is suspected of, arrested for, or charged with a sex crime, it is critical to hire an attorney as early as possible to fight on your behalf.


Suspect with Name Tattooed on Neck Busted for Providing False Identity to Police

Matthew C. Bushman

He really stuck his neck out on this one.

A man under investigation for forgery was busted for giving a fake name to police — even though his real name is clearly tattooed across his neck, authorities said.

Cops in the city of Mattoon were speaking to forgery suspect Matthew C. Bushman, 36, of Mansfield — who has “Matty B” tattooed in black ink across his throat — when he tried to pass himself off as someone else on Oct. 8, police said.

He also gave an incorrect date of birth to throw cops off.

Bushman, cops said, was trying to dodge an arrest on an active warrant in Peoria County.

He was arrested on Oct. 11 and booked into the Coles County Jail on charges of forgery and obstructing justice, according to police and jail records.


Cops’ Tunnel Vision Can Yield Horrific Investigative Outcomes

Thanks goes out to our friends at Rosenblum Schwartz & Fry for this recent blog post.

The goal for any criminal defendant is obviously to be found not guilty of a crime.

Recently released study findings suggest that suspects should be nearly as concerned with never drawing investigators’ suspicion in the first place.

Here’s why, say university researchers who have scrutinized scores of wrongful conviction cases: Cops who zero in on an individual often tend to view evidence in a way that supports that individual’s guilt via so-called “confirmation bias.” They ignore evidence that offers up other possibilities.

Texas State University criminologists say that such “tunnel vision” is on display in many cases where innocent individuals are wrongly convicted. It spawns dire and sometimes unfathomably sad consequences.

Researchers Kim Rossmo and Joycelyn Pollock say that cops’ narrow focus needs to be consciously and systematically guarded against. It has led to draconian outcomes for legions of persons wrongly spotlighted in probes and thereafter pursued by law enforcers with “strong incentives to quickly identify the perpetrators of highly publicized crimes.”

Such crimes are often violent, which results in exceptionally lengthy prison terms for the wrongly convicted. The Innocence Project, an advocacy group, has presented exculpatory evidence in hundreds of cases in recent years, freeing innocent parties.

Rossmo and Pollock say that investigators suffering from tunnel vision too often “pursue a single-minded course of action.” Doing so ignores or minimizes evidence pointing toward innocence rather than guilt.

The researchers say that for biased investigators embarking on interrogations, “the purpose is to get an admission.”

To read the dissertation “The Effect of Confirmation Bias in Criminal Investigative Decision Making” by Wayne A. Wallace at Walden University, click here.


Illinois police seek Walter White look-alike on probation violation

If you are a Breaking Bad fan who subscribes to Netflix, chances are you have already seen the movie El Camino. But have you seen this Walter White look-alike whom police are searching for over a probation violation related to possession of methamphetamine?

Police in Illinois are looking for a man who bears a startling resemblance to “Breaking Bad” protagonist Walter White — and is wanted for a probation violation related to methamphetamine possession, according to a local report.

The Galesburg Police Department in Illinois regularly posts mugshots of wanted people to its Facebook page, but the post for Sept. 3, which featured 50-year-old Todd Barrick Jr., got extra attention. Barrick’s mugshot shows him sporting glasses and a goatee, similar to the character played by Bryan Cranston on the AMC series. Barrick is also the same age as the Walter White character when the series begins.

The Galesburg Police Department told KWQC-TV that Barrick’s probation violation was related to meth possession. The department did not immediately respond to Fox News’ request for comment.


“Breaking Bad” ran for five seasons between 2008 and 2013. The show told the story of White, a high school chemistry teacher who is diagnosed with lung cancer and turns to manufacturing meth to ensure his family’s financial security after he dies.

“Heisenberg lives!” one Facebook user commented under the police department’s post, referring to an alias used by White on the show.

“This new ‘Breaking Bad’ movie looks like it sucks!” another user wrote.

It was not immediately known if Barrick is currently in custody.



Facial Recognition Technology Threatens to End All Individual Privacy

Apple has been using facial recognition software as a security option on the iPhone X since 2017. Amazon has given its facial recognition system to police departments to try out. Microsoft claims it has resisted requests to sell its products to police and has called for government regulation. Axon, the largest maker of body cameras in the United States, has taken out patents for facial recognition applications. New applications are announced daily.

Recently, the Department of Homeland Security announced its plan to use facial recognition software on 97 percent of departing air passengers by 2023. President Trump signed an executive order speeding up the use of facial recognition identification for “100 percent of all international passengers” in the top airports by 2021. Source.

Privacy advocates have raised serious civil rights concerns around facial recognition software. Currently, there are no laws regulating the use of facial recognition technology in the United States.

Crime Fighting

After 9/11, law enforcement had only grainy airport surveillance images of the hijackers from the morning of the attack. It took authorities weeks to identify them. Unable to close the matter on its own, the FBI released 19 photos, with possible names and aliases, seeking help from the public.

Today, law enforcement could have identified the hijackers within three minutes using facial recognition software. Source.

A six-month test of facial recognition software in a robbery investigation unit found that it cut the time to identify a suspect from an image from 30 days to three minutes.

In New York City, detectives have requested 7,024 facial recognition searches, which resulted in 1,851 possible matches and 998 arrests.
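Simple arithmetic on those figures (computed here purely for illustration) shows that only about a quarter of the searches produced even a possible match, and roughly one in seven led to an arrest:

```python
searches, matches, arrests = 7024, 1851, 998

# Share of searches that returned at least a possible match,
# and share that ultimately led to an arrest.
print(f"match rate:  {matches / searches:.1%}")
print(f"arrest rate: {arrests / searches:.1%}")
```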

Assault on Privacy Rights

Facial recognition is the perfect tool for oppression. It enables abusive and corrosive activities and has a disproportionate impact on people of color and other minorities. It harms due process by shifting the ideal from “presumed innocent” to “people have not been found guilty of crime, yet.” It increases harassment and violence, denies fundamental rights and opportunities by enabling the tracking of one’s movements, habits, relationships, interests, and thoughts, and prevents the average citizen from walking the streets in obscurity. It also amplifies the money-making market for facial recognition surveillance.

Professors Woodrow Hartzog and Evan Selinger state that “facial recognition technology is the most uniquely dangerous surveillance mechanism ever invented. Surveillance conducted with facial recognition systems is intrinsically oppressive. The mere existence of facial recognition systems, often invisible, harms civil liberties, because people will act differently if they suspect they are being surveilled.” Source.

Clare Garvie, an associate with Georgetown Law’s Center on Privacy and Technology, sounds the alarm on the use of facial recognition by law enforcement: “What happens if a system like this gets it wrong? A mistake by a video-based surveillance system may mean an innocent person is followed, investigated, and maybe even arrested and charged for a crime he or she didn’t commit. A mistake by a face-scanning surveillance system on a body camera could be lethal. An officer, alerted to a potential threat to public safety or to himself, must, in an instant, decide whether to draw his weapon. A false alert places an innocent person in those crosshairs.” Source.

In order to find a criminal, everyone has to be scanned. That data has to go somewhere. The companies that provide this technology are not obligated to let you know they are collecting and storing it, and possibly selling it to third parties.

We will monitor law enforcement in Southwest Missouri for use of facial recognition surveillance. If lawmakers do not take action on facial recognition surveillance, the privacy battle over its proper use and scope will play out in the streets instead of the courts.