In 2014, Tashfeen Malik's family in Pakistan started to worry about her Facebook posts.
It was the religious extremism that concerned them, a family member told the Los Angeles Times. A year later, Malik and her husband, Syed Rizwan Farook, declared their allegiance to ISIS on social media. Then they killed 14 people in San Bernardino using semiautomatic rifles.
In the aftermath, 26 senators asked the Department of Homeland Security to conduct social media background checks when reviewing visa applications — something President Donald Trump approved last year.
Later, in the comments under a YouTube video, someone using the screen name "nikolas cruz" said he wanted to "shoot people" with his AR-15, according to CNN. He posed on Instagram with guns and knives, and wrote racist and xenophobic slurs. He was even reported to the FBI for writing, "Im going to be a professional school shooter" on YouTube.
Last month, using a legally purchased AR-15, Nikolas Cruz murdered 17 people at Marjory Stoneman Douglas High School in Parkland, Florida. He's only 19 years old.
No politicians called for social media background checks in the wake of the Parkland shooting.
In fact, Republican lawmakers actively held back progress on gun control. A week after the massacre, survivors of the shooting crowded the Florida state Capitol. Their presence didn't sway the Republican-controlled legislature, which voted down a motion to even consider an assault weapons ban.
If they’re willing to do it to stop terrorism, why don’t politicians want the federal government to comb through the social media feeds of gun buyers in an effort to predict — and stop — mass shootings?
The most obvious reasons are political. In the United States, almost everything is permitted in the fight against Islamic terrorism. Almost nothing is permitted in the fight against domestic gun violence. Other barriers exist, too.
"Using large-scale social media analytics for gun control would be very hard to do," said Jeff Asher, a crime analyst who formerly worked for the CIA and city of New Orleans.
The problem with broad social media sweeps, he noted, is that people don't have to tell the truth online. And even if they did, law enforcement would have to know what to look for -- what distinguishes a truly dangerous individual? Add in the thorny legal and ethical considerations, and the task of singling out shooters seems even more unlikely.
Prospective gun buyers, like visa applicants, could be asked to list their social media handles. But not everyone in America uses social media. According to the Pew Research Center, 35 percent of men (and fatal shootings are mostly committed by men) don't use social media at all.
Even if everyone used Twitter or Facebook, the sheer number of posts would be difficult to process. The Edward Snowden leak revealed that the NSA complained internally about having too much data to act on. The FBI would likely encounter the same problem, seeing as 25 million background checks were conducted last year.
After the San Bernardino shooting, two policy experts from the University of California at San Diego explained in the Washington Post why screening people via social media feeds is so difficult. Basically, words posted online don't often translate into action. As they put it:
[H]ate speech is commonplace, and political terrorism is exceedingly rare. Every time a keyword is used that does not lead to terrorist actions, it’s part of a vast amount of noise obscuring a miniscule signal ...
But the problem of ubiquitous false-positives remains. Thousands of angry people use the Internet to proudly declare their support for domestic terrorism every single day. Since everyone understands that most of it is just cheap talk, it is protected speech.
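To see why that signal-to-noise problem is so punishing, consider a rough back-of-the-envelope sketch. Every number in it is hypothetical, chosen only to illustrate the arithmetic of screening millions of posts for something vanishingly rare; it is not based on any real screening system.

```python
# Back-of-the-envelope sketch of the base-rate problem described above.
# Every number here is hypothetical, chosen only to illustrate the arithmetic.

posts_scanned = 10_000_000     # hypothetical volume of posts screened
true_threats = 10              # hypothetical number of genuinely dangerous posters
sensitivity = 0.99             # assume the screen catches 99% of real threats
false_positive_rate = 0.01     # assume it wrongly flags just 1% of harmless posts

true_flags = true_threats * sensitivity
false_flags = (posts_scanned - true_threats) * false_positive_rate
precision = true_flags / (true_flags + false_flags)

print(f"Total posts flagged: {true_flags + false_flags:,.0f}")
print(f"Chance a flagged post is a real threat: {precision:.4%}")
# Roughly 100,000 flags, of which about 10 point at real threats (~0.01%).
```

Even with generously assumed accuracy, nearly every flag points at someone harmless.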
Vague threats of violence aren't enough to lose First Amendment protection, the Supreme Court has ruled. You've got to be specific, calling for "imminent lawless action."
So, an Instagram post of a guy holding a gun with the caption, "I'm going to kill John Doe tomorrow at 10 a.m." might get flagged. Or it might not. They could be friends joking around. Maybe the guy is holding a BB gun. But what if someone ends up dead?
This is all complicated. Americans should rightly be wary of federal agents knocking down their doors over tweets. And yet, with the hindsight of tragedy, it's natural to look at a menacing comment and say, "Why didn't we stop this guy?"
Facebook and Google can ban you, but the government has a higher bar to clear. Law enforcement can investigate specific violent threats on social media. They can't deny you a gun because you've said angry, stupid things in general.
Instead of casting a wide net on social media, Asher said, police departments could use data they already collect -- such as arrest records and incident reports -- to analyze and visualize the social relationships of criminal suspects.
It's called social network analysis. At the core of the idea is the assumption that violence isn't random. Relationships can help predict where violence will spread. (The U.K. government even has a "how-to" guide for law enforcement officials.)
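For a sense of what that looks like in practice, here is a minimal sketch using the open-source networkx library. The incident records are invented for illustration; a real analysis would run on the arrest records and incident reports departments already hold.

```python
# Minimal sketch of social network analysis over police incident data.
# The records below are invented for illustration; a real analysis would
# use the arrest and incident reports departments already collect.
import networkx as nx

# Each tuple links two people named together in the same incident report.
incidents = [
    ("person_a", "person_b"),
    ("person_a", "person_c"),
    ("person_b", "person_c"),
    ("person_c", "person_d"),
    ("person_d", "person_e"),
]

graph = nx.Graph()
graph.add_edges_from(incidents)

# Betweenness centrality scores people who sit on the paths connecting
# otherwise separate clusters -- the links violence could travel along.
centrality = nx.betweenness_centrality(graph)
for person, score in sorted(centrality.items(), key=lambda item: -item[1]):
    print(f"{person}: {score:.2f}")
```

The point of a score like betweenness centrality is to surface the people who bridge otherwise disconnected groups, which is where, on this theory, retaliatory violence tends to travel.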
Of course, as anyone who's seen Minority Report can tell you, that carries its own ethical questions.
Algorithms have biases because the humans who design them are biased, and the statistics they rely on come from biased systems. ProPublica examined a computer program meant to predict repeat offenders and found that it wrongly flagged black defendants as future criminals twice as often as white defendants.
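To make that disparity concrete, here is a small illustrative sketch of the metric at issue: the false positive rate, computed separately for each group. The records are fabricated solely to mirror the kind of gap ProPublica reported, not drawn from their data.

```python
# Illustrative sketch of the metric behind the ProPublica finding: the
# false positive rate, computed separately for each group. All records
# below are fabricated to mirror that kind of gap, not real data.
from collections import defaultdict

# (group, predicted_high_risk, actually_reoffended)
records = [
    ("group_1", True,  False),
    ("group_1", True,  False),
    ("group_1", False, False),
    ("group_1", True,  True),
    ("group_2", True,  False),
    ("group_2", False, False),
    ("group_2", False, False),
    ("group_2", True,  True),
]

wrongly_flagged = defaultdict(int)
did_not_reoffend = defaultdict(int)
for group, predicted_high_risk, reoffended in records:
    if not reoffended:
        did_not_reoffend[group] += 1
        if predicted_high_risk:
            wrongly_flagged[group] += 1

for group, total in did_not_reoffend.items():
    print(f"{group} false positive rate: {wrongly_flagged[group] / total:.0%}")
# In this made-up data, group_1 is wrongly flagged twice as often as group_2.
```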
Still, you could focus on providing aid to those at risk, rather than policing them -- much like Columbia professor Desmond Patton is trying to do in Chicago. He's combining algorithmic analysis of social media posts with community members' knowledge of their neighborhoods to identify troubled individuals and, ideally, reach them before they do something they might regret.
Even algorithms with good intentions are fraught, though. In her excellent new book Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor, Virginia Eubanks points out how systems ostensibly designed to efficiently administer government aid can punish working-class people.
Imagine a system that readily flags people in poor neighborhoods filled with gun violence. What happens to that data? Does it factor into whether someone qualifies for government aid?
So, yeah, using algorithms for gun control purposes is tricky.
Right now, however, there are things that are much easier from a technical standpoint to accomplish. Ban assault weapons and high-capacity magazines. Close loopholes that allow guns to be sold without background checks at gun shows and on the internet. Hire more people to process background checks. Don't let domestic abusers have guns. Make mental health services affordable and readily available, instead of blaming "mentally disturbed" shooters after trying to sabotage the Affordable Care Act.
In other words, the number one thing needed to protect Americans from guns isn't social media surveillance. It's for Republican lawmakers -- who take the most NRA cash and consistently vote against gun control measures -- to grow a spine.