Sci-Tech


The NSA’s voice-recognition system raises hard questions for Echo and Google Home

23rd Jan 2018

Suppose you’re looking for a single person, somewhere in the world. (We’ll call him Waldo.) You know who he is, nearly everything about him, but you don’t know where he’s hiding. How do you find him?

The scale is just too great for anything but a computerized scan. The first chance is facial recognition — scan his face against cameras at airports or photos on social media — although you’ll be counting on Waldo walking past a friendly camera and giving it a good view. But his voice could be even better: How long could Waldo go without making a phone call on public lines? And even if he’s careful about phone calls, the world is full of microphones — how long before he gets picked up in the background while his friend talks to her Echo?

As it turns out, the NSA had roughly the same idea. In an Intercept piece on Friday, reporter Ava Kofman detailed the secret history of the NSA’s speaker recognition systems, dating back as far as 2004. One of the programs was a system known as Voice RT, which was able to match speakers to a given voiceprint (essentially solving the Waldo problem), along with generating basic transcriptions. According to classified documents, the system was deployed in 2009 to track the Pakistani army’s chief of staff, although officials expressed concern that there were too few voice clips to build a viable model. The same systems scanned voice traffic to more than 100 Iranian delegates’ phones when President Mahmoud Ahmadinejad visited New York City in 2007.

We’ve seen voice recognition systems like this before — most recently with the Coast Guard — but there’s never been one as far-reaching as the Voice RT, and it raises difficult new questions about voice recordings. The NSA has always had broad access to US phone infrastructure, something driven home by the early Snowden documents, but the last few years have seen an explosion of voice assistants like the Amazon Echo and Google Home, each of which floods more voice audio into the cloud where it could be vulnerable to NSA interception. Is home assistant data a target for the NSA’s voice scanning program? And if so, are Google and Amazon doing enough to protect users?

In previous cases, law enforcement has chiefly been interested in obtaining specific incriminating data picked up by a home assistant. In the Bentonville murder case last year, police sought recordings or transcripts from a specific Echo, hoping the device might have triggered accidentally during a pivotal moment. If that tactic worked consistently, it might be a privacy concern for Echo and Google Home owners — but it almost never does. Devices like the Echo and Google Home only retain data after hearing their wake word (“Okay Google” or “Alexa”), which means all police would get is a list of intentional commands. Security researchers have been trying to break past that wake-word safeguard for years, but so far, they can’t do it without an in-person firmware hack, at which point you might as well just install your own microphone.
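
To make that wake-word safeguard concrete, here is a minimal sketch of the gating logic, written in Python purely for illustration. It is not Amazon's or Google's actual firmware; the buffer length, wake-word list and class name are assumptions. The point is simply that ambient audio stays in a short on-device buffer and only leaves the device once an on-device keyword spotter reports the wake word.

```python
# Illustrative sketch only (not Amazon's or Google's real firmware):
# audio is held in a short rolling buffer on the device and only uploaded
# once a wake word is detected, so the cloud never sees ambient speech.
from collections import deque

BUFFER_SECONDS = 2                      # hypothetical rolling buffer length
WAKE_WORDS = {"alexa", "okay google"}   # assumed wake words for the example

class WakeWordGate:
    def __init__(self, sample_rate=16000):
        self.buffer = deque(maxlen=BUFFER_SECONDS * sample_rate)
        self.listening = False

    def on_audio_frame(self, samples, local_keyword_hint):
        """samples: raw PCM values; local_keyword_hint: output of an on-device keyword spotter."""
        self.buffer.extend(samples)
        if not self.listening and local_keyword_hint in WAKE_WORDS:
            self.listening = True        # a command session begins
            return list(self.buffer)     # only now does audio leave the device
        if self.listening:
            return list(samples)         # stream the active command
        return None                      # ambient audio is dropped locally
        # (session reset after the command ends is omitted for brevity)
```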

But the NSA’s tool would be after a person’s voice instead of any particular words, which would make the wake-word safeguard much less of an issue. If you can get all the voice commands sent back to Google or Amazon servers, you’re guaranteed a full profile of the device owner’s voice, and you might even get an errant houseguest in the background. And because speech-to-text algorithms are still relatively new, both Google and Amazon keep audio files in the cloud as a way to catalog transcription errors. It’s a lot of data, and The Intercept is right to think that it would make a tempting target for the NSA.
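
The Intercept piece does not describe Voice RT's internals, but speaker-recognition systems in general reduce each utterance to a fixed-length embedding and compare it against an enrolled voiceprint. The sketch below shows the usual cosine-similarity test; the 256-dimensional embeddings, the threshold and the function names are illustrative assumptions, not anything taken from the NSA documents.

```python
# General speaker-recognition technique (not the NSA's actual Voice RT pipeline):
# each utterance becomes a fixed-length embedding, and a cosine-similarity
# threshold decides whether two clips share a speaker.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def matches_voiceprint(utterance_embedding: np.ndarray,
                       enrolled_voiceprint: np.ndarray,
                       threshold: float = 0.75) -> bool:
    # The threshold is illustrative; real systems tune it against
    # false-accept and false-reject rates on held-out data.
    return cosine_similarity(utterance_embedding, enrolled_voiceprint) >= threshold

# Usage with stand-in 256-dimensional embeddings (in practice these come from
# a trained speaker-encoder network):
waldo = np.random.rand(256)
unknown_caller = waldo + np.random.normal(0, 0.05, 256)   # a very similar voice
print(matches_voiceprint(unknown_caller, waldo))
```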

When police try to collect recordings from a voice assistant, they have to play by roughly the same warrant rules as your email or Dropbox files — but the NSA might have a way to get around the warrant too. Collecting the data would still require a court order (in the NSA’s case, one approved by the FISA court), but the data wouldn’t necessarily need to be collected. In theory, the NSA could appeal to platforms to scan their own archives, arguing they would be helping to locate a dangerous terrorist. It would be similar to the scans companies already run for child abuse, terrorism or copyright-protected material on their networks, all of which are largely voluntary. If companies complied, the issue could be kept out of conventional courts entirely.

 

 

source/read more: https://www.cnet.com/news/facebook-redefines-time-with-open-source-flicks/

 

Creating clouds to stop global warming could wreak havoc

23rd Jan 2018

To counteract global warming, humans may someday consider spraying sulfur dioxide into the atmosphere to form clouds — and artificially cool the Earth.

The idea behind the process, known as geoengineering, is to keep global warming under control — with the ideal solution still being a reduction in the emissions of greenhouse gases.

However, suddenly stopping that spraying would have a “devastating” global impact on animals and plants, potentially even leading to extinction, according to the first study on the potential biological impacts of climate intervention.

“Rapid warming after stopping geoengineering would be a huge threat to the natural environment and biodiversity,” said study co-author Alan Robock of Rutgers University. “If geoengineering ever stopped abruptly, it would be devastating, so you would have to be sure that it could be stopped gradually, and it is easy to think of scenarios that would prevent that.”

Rapid warming would force animals to move. But even if they could move fast enough, they might not be able to find places with enough food to survive, the study said.

“Plants, of course, can’t move reasonably at all. Some animals can move and some can’t,” Robock said.

If stratospheric climate geoengineering is deployed but not sustained, its impacts on species and communities could be far worse than the damage averted.

While animals could adapt to the cooling effects of the spraying, once it stopped the warming would return too quickly for them to keep up.

Researchers in the study used computer models to simulate what would happen if geoengineering led to climate cooling and then what would happen if the geoengineering stopped suddenly.
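
The study relied on full climate models, but the "termination shock" it describes can be illustrated with a toy zero-dimensional calculation: temperature relaxes toward an equilibrium set by greenhouse forcing minus any aerosol cooling, so switching the aerosol term off makes the suppressed warming return within a few years. Every number in the sketch below is invented for illustration and bears no relation to the study's actual simulations.

```python
# Toy illustration of "termination shock" (nothing like the full climate
# models used in the study): when the aerosol-cooling term is switched off,
# the lost warming returns within years rather than decades.
import numpy as np

YEARS = 80
greenhouse_warming = np.linspace(1.0, 3.0, YEARS)   # assumed warming trend, deg C
aerosol_cooling = np.zeros(YEARS)
aerosol_cooling[20:50] = 1.5                         # geoengineering active in years 20-49
tau = 5.0                                            # assumed response time, years

temp = np.zeros(YEARS)
for year in range(1, YEARS):
    target = greenhouse_warming[year] - aerosol_cooling[year]
    temp[year] = temp[year - 1] + (target - temp[year - 1]) / tau

# The year-on-year warming rate jumps sharply right after year 50, when spraying stops.
print(np.round(np.diff(temp)[48:55], 2))
```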

Starting geoengineering then suddenly stopping it isn’t necessarily far-fetched.

“Imagine large droughts or floods around the world that could be blamed on geoengineering, and demands that it stop. Can we ever risk that?” Robock said.

The idea behind this type of geoengineering would be to create a sulfuric acid cloud in the upper atmosphere that’s similar to what volcanic eruptions produce, Robock said. The clouds, formed after airplanes spray sulfur dioxide, would reflect solar radiation and thereby cool the planet.

Geoengineering takes its cue from the natural experiment that made the only dent in global warming’s rise in the last few decades — the 1991 eruption of Mount Pinatubo in the Philippines, which blasted more than 15 million tons of sulfur dioxide 21 miles high, straight into the stratosphere.

Those sulfur particles stayed suspended in the stratosphere worldwide, where the haze they created scattered and reflected sunlight away from the Earth, cooling global atmospheric temperatures by roughly 0.7 to 0.9 degrees Fahrenheit in 1992 and 1993 before finally washing out, according to estimates from NASA’s Goddard Institute for Space Studies.

But the airplanes spraying the sulfur dioxide would have to continuously fly into the upper atmosphere to maintain the cloud because it would last only about a year if spraying stopped, Robock said. The airplane-spraying technology may be developed within a decade or two, he added.

 

source/read more:

Experts sound alarm as biometric data from driver’s licences added to government database

 

15th Jan 2018

Your face is becoming the latest weapon in the world of digital surveillance, and the humble driver’s licence looms as a game-changer in tracking individuals through both the real and virtual world.

Experts warn your biometric data may already be vulnerable to misuse by criminals and terrorists, as the proliferation of mobile cameras combined with social media and ubiquitous CCTV feeds mean we’re caught on screen more than ever before.

Driver’s licences will be added to the Commonwealth Government’s already vast biometric databases after it struck an agreement with the states and territories, handing authorities access to an unprecedented level of information about citizens.

A system known as “the interoperability Hub” is already in place in Australia, allowing agencies to take an image from CCTV and other media and run it against a national database of passport pictures of Australian citizens — a process known as “The Capability”.

But soon driver’s licences will be added to the system, allowing both government and private entities to access your photo, age and address.

It is a $21 million system being sold as a way to tackle terrorism and make commercial services more secure.

But experts warn people now risk losing control of their biometric identity entirely as commercial interests, governments and organised crime gangs all move to capture more personal metadata for their own gain.

Driver’s licences change the biometric game

Technology and legal expert Professor Katina Michael said about 50 per cent of the population already had some kind of visual biometric stored in a nationally accessible database, but the inclusion of driver’s licences would see the proportion of Australians scooped up in the net swell to about 80 per cent.

She said one of the biggest risks of the collection of biometric data was not deliberate misuse by the AFP, ASIO or another government agency, but rather vulnerabilities in the way biometrics work.

“It’s not like a one-on-one match, where you put (in) an individual’s face and say: ‘they’re a suspect’,” Professor Michael said.

“But rather what you get returned is a number of possibilities … you might get back 15, or 20, or 30, or 50 matches.

“So you might have 50 innocent people being suspects, rather than the person that you’re trying to catch.”
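
What Professor Michael is describing is a one-to-many search. The sketch below shows how such a search typically works; it is not the actual “Capability” system, and the embeddings, threshold and gallery are invented for illustration. Every gallery face above a similarity threshold is returned, so a single probe image can surface dozens of look-alikes.

```python
# Illustrative one-to-many face search (not the real "Capability" system):
# a probe embedding is compared against a gallery, and everything above a
# similarity threshold comes back as a candidate match.
import numpy as np

def top_candidates(probe: np.ndarray, gallery: dict, threshold: float = 0.6, k: int = 50):
    """gallery: {person_id: embedding}. Returns up to k (id, score) pairs above the threshold."""
    scores = []
    for person_id, emb in gallery.items():
        score = float(np.dot(probe, emb) / (np.linalg.norm(probe) * np.linalg.norm(emb)))
        if score >= threshold:
            scores.append((person_id, score))
    return sorted(scores, key=lambda s: s[1], reverse=True)[:k]

# With a large enough gallery, many innocent people clear the threshold.
rng = np.random.default_rng(0)
gallery = {f"licence_{i}": rng.normal(size=128) for i in range(10000)}
probe = rng.normal(size=128)
print(len(top_candidates(probe, gallery, threshold=0.2)))
```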

Professor Michael said this meant that while over time a person’s name might be cleared, their data could remain in a database linked to a criminal investigation.

“And then I’m thinking, what happens to their level of innocence as time goes on, because they accidentally look like a minority group?” she said.

She said real criminals and terrorists would opt out of the system, choosing not to have passports and driver’s licences in a bid to escape the net.

“Of course, if you’ve done nothing wrong, the old adage says you’re fine. But increasingly, we don’t know if we’re fine,” she said.

The rise of ‘uberveillance’

Professor Michael said modern surveillance methods employed by law enforcement were not just limited to CCTV — they now incorporated vast amounts of metadata and social media, leading to a concept known as “uberveillance” in which people were constantly monitored.

“What we have now are digital footprints that we all leave behind,” she said.

“Phone call records, internet searches, credit cards and even the data on your electronic train or bus ticket can be used to track your movements and activity.

“It brings together all these various touchpoints, telecommunications records, travel data via tokens, facial recognition on federal databases, your tax file number … that’s accessible depending on the level of crime and social media.

“You’ve got this very rich almost cradle-to-grave kind of data set that’s following you.”

Photo: Even transport cards like the NSW Opal card can reveal your personal data.

Organised criminals want your identity

Stephen Wilson runs Lockstep Consulting, a Sydney-based firm which researches and tracks trends in biometrics in the corporate and government spheres, and advises clients on best-practice.

He said at the moment very secure biometric systems took quite a long time to process images accurately.

Problems arose when consumer convenience, such as being able to unlock a phone or access a bank account with a quick face or fingerprint scan, trumped security.

“No police force, no public service, no business is ever perfect, there is always going to be corrupt people,” Mr Wilson said.

“The more exposure we have to electronic databases, the more exposure we have to biometric matching, it’s only a matter of time before these bad actors succumb to temptation or they succumb to corruption and they wind up using these systems inappropriately.”

Your biometric twin is out there

Photo: New technology can more easily track people’s faces in crowds.

Mr Wilson said biometrics were creeping into consumer services like bank accounts and online betting facilities, with customers asked to send a picture of their licence and a “selfie” that will be run through an identity matching service.

“The real risk is that bad actors will take people’s photos, ask for a match, and get back a series of matches of people that are kind of like your biometric twin,” he said.

“We’ve all got doppelgangers, we’ve all got people in public that look just like us.

“If you’re trying to perpetrate a crime, if you’re organised crime, and you’re trying for example to produce a fake driver’s licence, it’s absolute gold for you to be able to come up with a list of photos that look like ‘Steve Wilson’.”

Technology companies like Apple and Samsung have championed the use of biometrics such as fingerprints, and this has taken a step further with facial recognition becoming more common thanks to the release of the iPhone X.

Photo: Apple’s iPhone X has championed facial recognition technology.

However, Mr Wilson said a key difference was that information stayed on the phone, while banking and other commercial interests trying to use your biometrics to confirm your identity could be storing it on a server anywhere.

“Do you really want your photo, which is a pretty precious resource, sent off to a company perhaps on the other side of the world just so you can get a quick bank account or quick betting service set up?” he asked.

What will happen next?

An annual industry survey conducted by the Biometrics Institute, known as the Industry Trend Tracker, has nominated facial recognition as the biometric trend most likely to increase over the next few years.

Respondents believed privacy and data protection concerns were the biggest constraint on the market, followed by poor knowledge of decision makers, misinformation about biometrics and opposition from privacy advocates.

The Australian Law Reform Commission says biometric systems are increasingly being used or contemplated by organisations, including in methadone programs, taxi booking services, ATMs, online banking and access to buildings.

Professor Michael said governments needed to be very cautious about how they applied this rich new source of data in the future.

She said governments were building these agreements between themselves and corporations in a bid to stamp out fraud, but that goal was not always achieved and the potential for mistakes was vast.

“What we have is this matching against datasets, trying to find the needle in the haystack,” she said.

“Often what happens is we don’t find the needle.”

A statement from the Department of Home Affairs said the Australian Government was exploring making the Face Verification Service available to the private sector, but that no such access had begun at this point.

It said arrangements for private sector access would be informed by an independent privacy impact assessment, and that those using the service would need to demonstrate a lawful basis to do so under the Privacy Act and show they had gained consent to use a person’s image.

Spy agency ASIO wants powers to hack into personal computers

15th Jan 2018 (ORIGINAL 2013)

SPY agency ASIO wants to hack into Australians’ personal computers and commandeer their smartphones to transmit viruses to terrorists.

The Attorney-General’s Department is pushing for new powers for the Australian Security Intelligence Organisation to hijack the computers of suspected terrorists.

But privacy groups are attacking the “police state” plan as “extraordinarily broad and intrusive”.

A spokesman for the Attorney-General’s Department said it was proposing that ASIO be authorised to “use a third party computer for the specific purpose of gaining access to a target computer”.

“The purpose of this power is to allow ASIO to access the computer of suspected terrorists and other security interests,” he told News Limited.

“(It would be used) in extremely limited circumstances and only when explicitly approved by the Attorney-General through a warrant.

“Importantly, the warrant would not authorise ASIO to obtain intelligence material from the third party computer.”

The Attorney-General’s Department refused to explain yesterday how third-party computers would be used, “as this may divulge operationally sensitive information and methods used by ASIO in sensitive national security investigations.”

But cyber specialist Andrew Pam, a board member of the Electronic Frontiers lobby group, predicted ASIO could copy the tactics of criminal hackers to seize control of target computers.

Australians’ personal computers might be used to send a malicious email with a virus attached, or to load “malware” onto a website frequently visited by the target.

“This stuff goes on already in the commercial and criminal world, and security agencies could be using the same techniques to commandeer people’s computers and use them to monitor a target,” Mr Pam said.

“Once you get control of a computer and connect to their network you can do whatever you want.”

The ASIO Act now bans spies from doing anything that “adds, deletes or alters data or interferes with, interrupts or obstructs the lawful use of the target computer by other persons”.

But ASIO wants the ban lifted, so Attorney-General Nicola Roxon can issue a warrant for spies to secretly intercept third-party computers to disrupt their target.

The departmental spokesman said the federal government had made ”no decisions” about whether to grant ASIO the new power.

The government would first consider advice from the federal Parliamentary Joint Committee on Intelligence and Security, which is reviewing national security legislation.

Victoria’s acting Privacy Commissioner, Dr Anthony Bendall, has told the committee that ASIO’s proposed new powers are “characteristic of a police state”.

“To access a third party’s computer, which has no connection with the target, is extraordinarily broad and intrusive,” his submission states.

But the Attorney-General’s Department insists that ASIO will not examine the content of third-party computers.

“The use of the third party computer is essentially like using a third party premises to gain access to the premises to be searched, where direct access is not possible,” it states in response to questions from the committee.

“It involves no power to search or conduct surveillance on the third party.”

The department said technological advances had made it “increasingly difficult” for ASIO to execute search warrants directly on target computers, “particularly where a person of interest is security conscious”.

Australian Council for Civil Liberties president Terry O’Gorman yesterday said ASIO should have to seek a warrant from an independent judge, rather than a politician.

He warned that ASIO might be able to spy on individuals – including journalists protecting a whistleblower – by tapping into their computers.

“I’m concerned they will access all sorts of information on a computer that has nothing to do with terrorism,” he said.


source/read more: https://www.perthnow.com.au/news/australia/spy-agency-asio-wants-powers-to-hack-into-personal-computers-ng-12a17b2577859c1d00fba2dc78721854

A Crypto Website Changes Its Data, and $100 Billion in Market Value Vanishes

9th Jan 2018

Prices for some of the most popular cryptocurrencies dropped sharply Monday. One apparent reason: an adjustment from a popular website on its digital-currency price quotes.

A website called coinmarketcap.com on Monday removed data from some South Korean exchanges from its price quotes for a range of virtual currencies including bitcoin, Ethereum and Ripple’s XRP. The move followed a South Korean government crackdown on cryptocurrencies.

The move by coinmarketcap caused some amount of chaos when prices across the board suddenly plunged. In mid-Monday trading, XRP had fallen 26% over the past 24 hours, Bitcoin Cash was down 18%, and litecoin was down 12%. Of the top 40 cryptocurrencies, 31 were down, including bitcoin and Ethereum.

A representative of the website confirmed the moves in an email to The Wall Street Journal, citing “extreme price discrepancy” among South Korean exchanges. The company added in a tweet Monday afternoon that it “excluded some Korean exchanges in price calculations due to the extreme divergence in prices from the rest of the world and limited arbitrage opportunity.”
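
The article does not spell out coinmarketcap's formula, but price aggregators of this kind generally publish a volume-weighted average across exchanges, which is enough to explain the sudden drop: removing exchanges that quote at a premium mechanically lowers the headline number even if nobody trades. The figures in the sketch below are hypothetical and chosen only to show the effect.

```python
# Rough sketch of a volume-weighted average price (VWAP) of the kind an
# aggregator publishes (coinmarketcap's exact methodology isn't given here):
# dropping exchanges that quote above the global price lowers the headline number.
def vwap(quotes):
    """quotes: list of (price_usd, volume_usd) pairs, one per exchange."""
    total_volume = sum(volume for _, volume in quotes)
    return sum(price * volume for price, volume in quotes) / total_volume

# Hypothetical XRP quotes: two global exchanges plus one Korean exchange at a premium.
all_exchanges = [(2.40, 800e6), (2.45, 600e6), (3.60, 500e6)]   # illustrative numbers
without_korea = all_exchanges[:2]

print(round(vwap(all_exchanges), 2))    # ~2.73 with the premium exchange included
print(round(vwap(without_korea), 2))    # ~2.42 once that exchange is excluded
```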

Coinmarketcap has become one of the most popular destinations for price quotes as the sector surged last year. According to Amazon’s web-ranking service, coinmarketcap is currently the 154th most popular website in the world, in the same ballpark as Chinese retail giant Alibaba.com.

The website’s rejiggered prices led to a flip in market-value rankings on the site. Ethereum, with a $109 billion total market valuation, moved into second place, the spot previously occupied by XRP, which fell to third place with a $97 billion market value. Bitcoin remained number one, with a $255 billion market value.

More than $100 billion of the sector’s total market value was erased over the last 24 hours, according to the site. On Sunday, it stood at $835 billion. On Monday, it fell to as low as $683 billion, and lately was at $722 billion.

Before Monday, for example, prices for XRP on coinmarketcap were quoted as high as $3.84 on Jan. 4. About 25% of XRP’s trading volume came from the Seoul-based online exchange Bithumb, according to coinmarketcap.

 

 

source/read more : https://www.wsj.com/articles/a-crypto-website-changes-its-data-and-100-billion-in-market-value-vanishes-1515443100

U.K. Threatens Facebook, Google with Higher Taxes Unless They Hand Over User Data

2nd Jan 2018

Purportedly in a bid to fight online radicalization, the UK government has issued tech companies like Facebook and Google an ultimatum to hand over user data or face higher taxes.

VentureBeat reports that the UK government is considering imposing new taxes on companies like Facebook and Google if they refuse to share collected user data with the government and make further efforts to combat extremist content on their platforms. Ben Wallace, the UK Minister of State for Security, derided the tech firms for selling user information but refusing to share it with state governments.

“If they continue to be less than co-operative, we should look at things like tax as a way of incentivizing them or compensating for their inaction,” said Wallace in an interview with the Sunday Times newspaper. “We should stop pretending that because they sit on beanbags in T-shirts they are not ruthless profiteers,” he said. “They will ruthlessly sell our details to loans and soft-porn companies but not give it to our democratically elected government.”

Wallace did not go into detail on the taxes that may be imposed on the companies, but the Sunday Times reports that the taxes would act similarly to the windfall tax imposed on privatized utilities by former Prime Minister Tony Blair.

Simon Milner, Facebook’s current Policy Director, challenged Wallace’s claims in a statement: “Mr. Wallace is wrong to say that we put profit before safety, especially in the fight against terrorism. We’ve invested millions of pounds in people and technology to identify and remove terrorist content.”

YouTube also stated that they were constantly working to fight extremist content on their platform, with a spokesperson saying, “Over the course of 2017 we have made significant progress through investing in machine learning technology, recruiting more reviewers, building partnerships with experts and collaboration with other companies.”


source/read more:

Who Regulates Bitcoin Trading? No U.S. Agency Has Jurisdiction

26th Dec 2017

Investors are frantically trying to learn everything they can about bitcoin—and so are regulators.

Furious trading in cryptocurrencies is testing many in the Trump administration who are eager to embrace financial innovation, after nearly a decade of tighter clamps on risk-taking put in place after the 2008 financial crisis.

The Commodity Futures Trading Commission, the agency with closest oversight of bitcoin trading, began the year by launching an in-house lab to encourage advances in blockchain, the technology that underpins digital currencies. Yet the regulator recently sounded an alarm on bitcoin itself, noting most exchanges are completely unregulated while the cryptocurrency is prone to wild price swings and potential flash crashes.

The CFTC has labeled bitcoin a commodity, but as with other commodities, the agency mostly lacks jurisdiction over the primary market: It regulates corn futures contracts but not the buying and selling of corn itself, for instance. As a result, bitcoin exchanges don’t have to tell participants how they operate, such as whether they offer preferential access to certain traders.

“A lot of people, retail traders in particular, have gotten used to securities laws and commodities laws protecting them,” said Kipp Rogers, a former proprietary trader who is now a blogger and researcher. “And so they just don’t even know what to be on guard for.”

The Securities and Exchange Commission faces challenges similar to the CFTC’s. The SEC’s new chairman, Jay Clayton, wants retail investors to have better access to high-return investments, which bitcoin and many of its digital relatives provide.

But the SEC is cracking down on initial coin offerings, a fundraising vehicle that piggybacks off investors’ lust for bitcoin and in some cases violates core investor-protection laws.

“There is clearly demand for bitcoin investments, but the regulators are struggling with how to balance new tools for capital formation with the need for investor protections,” said Michael Liftik, a partner at Quinn Emanuel Urquhart & Sullivan LLP and previously a top aide to former SEC Chairman Mary Jo White.

 

source/read more: https://www.wsj.com/articles/who-regulates-bitcoin-trading-no-u-s-agency-has-jurisdiction-1514116800

Edward Snowden unveils phone app, Haven, to spy on spies

26th Dec 2017

The former National Security Agency contractor who exposed U.S. government surveillance programs by disclosing classified material in 2013 has a new job: app developer.

Edward Snowden in a video message Friday unveiled a new phone app he helped create, called Haven, that aims to protect laptops from physical tampering.

Snowden says it is an open-source tool designed for human rights activists and other people at risk, and that it uses an Android phone’s sensors to detect changes in a room.
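
Haven itself is an open-source Android app from the Guardian Project, but the underlying idea (watch a sensor stream and flag anything that departs from a recent baseline) can be sketched in a few lines. The class, window size and tolerance below are illustrative assumptions, not Haven's implementation.

```python
# Generic sketch of the idea described above (not the actual Guardian Project
# code): poll a device's sensors and flag the room as disturbed when a reading
# departs from a recent baseline.
import statistics

class RoomMonitor:
    def __init__(self, window=50, tolerance=3.0):
        self.window = window          # number of recent readings kept as the baseline
        self.tolerance = tolerance    # how many standard deviations count as "a change"
        self.history = []

    def observe(self, reading: float) -> bool:
        """reading: any scalar sensor value (accelerometer magnitude, sound level, light)."""
        triggered = False
        if len(self.history) >= self.window:
            mean = statistics.mean(self.history)
            stdev = statistics.pstdev(self.history) or 1e-9
            if abs(reading - mean) > self.tolerance * stdev:
                triggered = True      # e.g. take a photo, start recording, send an alert
        self.history.append(reading)
        self.history = self.history[-self.window:]
        return triggered
```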

The software was developed with the Freedom of the Press Foundation and the Guardian Project. It has been greeted with mixed social media reactions, with some people celebrating its security capabilities and others saying they don’t trust Snowden.

Snowden has lived in Russia since 2013, when the country gave him asylum, resisting U.S. pressure to extradite him.


source/read more: https://apnews.com/7b8aacd0d929493bb4fea9ca57ea90d3/Edward-Snowden-unveils-phone-app,-Haven,-to-spy-on-spies

‘Minority Report’ Artificial Intelligence machine can identify 2 BILLION people in seconds

12th Dec 2017

Yitu Technology has made an AI algorithm that can connect to millions of surveillance cameras and instantly recognise people. The company – based in Shanghai, China – developed Dragonfly Eye to scan through millions of photographs that have been logged in the country’s national database.

This means it has a collection of 1.8 billion photos on file, including visitors to the country and those taken at ports and airports.

It may also have access to the photos of every one of Hong Kong’s identity card holders, although Yitu has refused to confirm this.

The cutting-edge technology is now being used to track down criminals, with the early stages of use showing it has been hugely successful.

 

 

source: https://www.dailystar.co.uk/news/latest-news/666375/minority-report-artificial-intelligence-machine-dragonfly-yitu-technology

When emoji are the weapon of choice

6th Dec 2017

IT MAY seem vague and harmless but a growing number of people have been thrown in jail for their use of the emoji language.


NEARLY everyone is familiar with emoji, those popular icons that appear in text messages, emails and social media platforms. Emoji are often used as lighthearted adjuncts to text, or to soften the blow of a message.

Emoji can be viewed as overly simplistic in some contexts. For example, government officials were questioned when Foreign Minister Julie Bishop conducted an interview using just emoji, and described Russian President Vladimir Putin using an angry face.

A 2017 study found that use of emoji in work emails reduced perceptions of competence.

But emoji can be taken very seriously in the context of the law. The use of emoji has challenged lawyers, judges, and lawmakers in several countries. In a legal context, emoji are increasingly recognised not as a joke or ornament, but as a legitimate form of literacy.

Making a criminal threat via emoji

Perhaps the most troubling use of emoji has come through their use in interpersonal messages where it is unclear whether they modify or amplify a prima facie criminal threat.

In New Zealand, a judge considered the role of emoji in a Facebook message sent by a man to his ex-partner. The man wrote, “you’re going to get it” followed by an aeroplane emoji.

Concluding that the message and emoji generally conveyed that the defendant was “coming to get” his ex-partner, the judge sentenced the accused to eight months’ jail on a charge of stalking.

In 2016, a court in France convicted a young man of threatening his ex-girlfriend through a text message sent by mobile phone. The court found that the inclusion of a gun emoji meant that the message amounted to a “death threat in the form of an image”. The court sentenced the defendant to six months’ imprisonment and imposed a €1,000 fine.

The issue has also arisen in several cases in the US. In Virginia in 2015 a high school student was charged with computer harassment and threatening school staff. She had posted several messages to her Instagram account, combining text with emoji (a gun, a knife and a bomb).

The student claimed that she had never intended to make a threat and that the posts had been a joke.

In the same year, a 17-year-old in New York was charged with making a terrorist threat on his Facebook page after posting a policeman emoji, and three guns pointing towards it.

The prosecutor alleged that the message constituted a clear threat to police due to several factors:

• identification of a class of victim (police)

• repeated use of the gun emoji

• placement of the emoji weapons close to the emoji of the officer’s head

• the fact that other violent messages had been posted by the student earlier the same evening.

However, a grand jury failed to indict the defendant, at least in part due to concerns about whether the post really demonstrated criminal intent.

Liability in other threat cases has been more readily established. A high school student was convicted of making a criminal threat after she sent a series of tweets including a variety of emoji weapons.

Her claim that the tweets were meant to be a joke was unsuccessful.

Perhaps the high point of emoji liability occurred in a case in Spartanburg County, South Carolina. The defendants, who had previously physically attacked the victim, sent him a message comprising only emoji: a fist, followed by a pointed finger, followed by an ambulance.
The message seemed pretty clear…

 

 

source/read more: http://www.news.com.au/technology/online/social/how-the-law-responds-when-emoji-are-the-weapon-of-choice/news-story/04d9362f6bd5c27c61d6b5e1ab432798