The Danger of Fake Patches

We talk a lot about threats to data security on this blog, and personal experience has probably acquainted you with everything from Trojan horses to phishing.

Here’s a particularly sneaky threat that’s becoming more and more common: fake patches.

Part of what makes them a problem is that, unlike those spam e-mails from people and companies you don’t know, fake patches can look like perfectly reasonable notices from software vendors or programs you’d expect to receive patches from, like Adobe or Google Chrome. The fake updates display the company’s logo, so they seem real enough. A few years ago, in fact, hackers sent out a fake version of Java Update 11 that contained malware.

How well-equipped you are depends, not surprisingly, on the security measures you have in place. Keeping the auto-update feature on is good practice, provided your software is designed to identify incoming patches and make sure they’re genuine. Even then, it’s possible for malware to use a fraudulent certificate to get around an auto-update program.
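What “making sure they’re genuine” means in practice is worth a quick illustration. Below is a minimal Python sketch, under the assumption that the vendor publishes a SHA-256 checksum for each patch over a trusted channel; the function name, file name, and checksum shown are hypothetical placeholders. Real updaters generally go further and verify a cryptographic signature rather than a bare hash, which is exactly why a fraudulent certificate is such an effective way around them.

    import hashlib
    import hmac

    def patch_is_genuine(patch_path: str, expected_sha256: str) -> bool:
        """Return True only if the downloaded patch matches the checksum
        the vendor published (fetched separately, e.g., over HTTPS)."""
        digest = hashlib.sha256()
        with open(patch_path, "rb") as f:
            # Hash the file in chunks so large installers don't exhaust memory.
            for chunk in iter(lambda: f.read(8192), b""):
                digest.update(chunk)
        # hmac.compare_digest avoids timing differences in the comparison.
        return hmac.compare_digest(digest.hexdigest(), expected_sha256.lower())

    # Hypothetical usage (file name and checksum are placeholders):
    #   if not patch_is_genuine("update-installer.exe", "9f86d081..."):
    #       raise SystemExit("Checksum mismatch: do not install this update.")

The design point is that the checksum must come from somewhere the attacker can’t also control, and that is precisely the weakness a forged certificate exploits.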

There are a number of things you can do to minimize risk. Cutting down on shadow IT and unsanctioned software on corporate machines gives hackers fewer openings for delivering fake patches. A robust antimalware service is another step.

But at the end of the day, just being smart and cautious goes a long way. Fake patches often look suspicious in the same ways spam e-mails do: they might have misspellings, or they just don’t look like the software updates you’re accustomed to seeing. They might even ask you to pay for the download.

Little things like avoiding pop-ups and regularly scanning and cleaning your computer help, too. And, as always, talk with the IT department and back up your files. Good communication and safely stored backups will ensure a small problem doesn’t become a big one.

FTC: Big data and IoT spawn new data concerns

The ongoing collision of big data and the internet of things raises whole new concerns about maintaining the security, privacy, and fairness of personal data, says Julie Brill, a member of the Federal Trade Commission.

Brill spoke earlier this month at the Cyber Security and Privacy Summit hosted by Washington State Gov. Jay Inslee.

“The data from connected devices will be deeply personal, and big data analytics will make the data more readily actionable,” said Brill. “Some of these devices will handle deeply sensitive information about our health, our homes, and our families. Some will be linked to our financial accounts, and some to our email accounts.”

However, she added that people won’t change much.

“We as individuals will remain roughly the same. We will not suddenly become capable of keeping track of dozens or hundreds of streams of our data, peering into the depths of algorithmic decision-making engines, or spotting security flaws in the countless devices and pieces of software that will surround us,” she warned.

Faced with uncertainty about which devices are safe and whether they are getting a fair shake in the big data world, Brill continued, “consumers could use some help.”

Major inroads possible into our lives

This rapidly evolving environment raises issues that have yet to be resolved. Brill divided the issues into the three areas of security, privacy, and fairness:

1. Security

“Because these connected devices are linked to the physical world, device security also is a top concern,” she said. To wit:

No armor. Fully 90% of connected devices collect personal information, and 70% transmit that data without encryption.

No expertise or recognition. Traditional goods manufacturers may not have the expertise, or even realize they need such expertise, to secure their new devices.

Cheap as dirt. Many connected devices will be inexpensive and essentially disposable.

Just because the plug fits … Security vulnerabilities may be hidden deep in the code that runs an app or device and may not become apparent until the device is connected to an environment it wasn’t designed for.

“All of these factors point to the need to take an all-hands-on-deck approach to data security, with security researchers playing an important role in bringing security flaws to light,” Brill said.

2. Privacy

“Consumers want to know—and should be able to easily find out—what information companies are collecting, where they’re sending it, and how they’re using it,” said Brill. She said that information plays an important part in consumers’ decisions about whether to use digital products and services in the first place.

However, obstacles have emerged:

Didn’t know they were watching. Many companies, including data brokers, ad networks, and analytics firms, operate in the background with consumer data.

Devices give no clues. Many connected devices do not have a user interface to present information to consumers about data collection.

Queries not answered. Questions have arisen about who should receive disclosures about data collection and use practices, how consumers or innocent bystanders would know when a device is recording images or audio, and how the collected data will be secured.

Brill said that manufacturers of connected devices should recognize that providing transparency will require some creative thinking.

“Visual and auditory cues, and immersive apps and websites should be employed to describe to consumers, in a meaningful and relatively simple way, the nature of the information being collected … and provide consumers with choices,” Brill said.

3. Fairness

Certain data brokers assemble individual profiles of consumers from various sources, which are then used for marketing.

On such firms specifically, Brill said that “while this kind of information can be used for relatively benign purposes, or even in ways that will enhance financial inclusion, this kind of information has also been used to harm vulnerable consumers.”

Again, pairing big data with the internet of things in this area creates new concerns:

Credit scores used beyond the credit world. Scores such as credit scores now figure into more than mortgage decisions; they can affect whether a prospective employer extends a job offer or whether an insurance company charges higher premiums on auto or homeowners coverage.

Scores grown outside the regulatory zone. Many different types of scores have proliferated into eligibility determinations covered by the Fair Credit Reporting Act, yet they haven’t been subject to the same kind of scrutiny that Congress and federal agencies have brought to bear on traditional credit scores.

It all happens in a black box. Scoring algorithms and other forms of big data analytics rely on statistical models and data system designs that few on the outside understand in detail.

“This suggests that testing the effects of big data analytics may be a promising way to go,” Brill said, adding that “companies using scoring models should themselves do more to determine whether their own data analytics result in unfair, unethical, or discriminatory effects on consumers.”

In summary, she said: “For now, the rapid changes in big data analytics and the internet of things have made it difficult to meet some of these expectations in practice. The key point, however, is that these are the enduring expectations of consumers, rather than relics of a simpler world.”