Archive for May, 2017

Why the NSA Makes Us More Vulnerable to Cyberattacks: The Lessons of WannaCry – Foreign Affairs

There is plenty of blame to go around for the WannaCry ransomware that spread throughout the Internet earlier this month, disrupting work at hospitals, factories, businesses, and universities. First, there are the writers of the malicious software, which blocks victims' access to their computers until they pay a fee. Then there are the users who didn't install the Windows security patch that would have prevented an attack. A small portion of the blame falls on Microsoft, which wrote the insecure code in the first place. One could certainly condemn the Shadow Brokers, a group of hackers with links to Russia who stole and published the National Security Agency attack tools that included the exploit code used in the ransomware. But before all of this, there was the NSA, which found the vulnerability years ago and decided to exploit it rather than disclose it.

All software contains bugs or errors in the code. Some of these bugs have security implications, granting an attacker unauthorized access to or control of a computer. These vulnerabilities are rampant in the software we all use. A piece of software as large and complex as Microsoft Windows will contain hundreds of them, maybe more. These vulnerabilities have obvious criminal uses that can be neutralized if patched. Modern software is patched all the time: either on a fixed schedule, such as once a month with Microsoft, or whenever required, as with the Chrome browser.

When the U.S. government discovers a vulnerability in a piece of software, however, it decides between two competing equities. It can keep it secret and use it offensively, to gather foreign intelligence, help execute search warrants, or deliver malware. Or it can alert the software vendor and see that the vulnerability is patched, protecting the country (and, for that matter, the world) from similar attacks by foreign governments and cybercriminals. It's an either-or choice. As former U.S. Assistant Attorney General Jack Goldsmith has said, "Every offensive weapon is a (potential) chink in our defense, and vice versa."

See the article here:
Why the NSA Makes Us More Vulnerable to Cyberattacks: The Lessons of WannaCry - Foreign Affairs

NSA in Unprecedented Hunt for KremlinGate Evidence – Observer

In my last column, I broke the news that Admiral Mike Rogers, the director of the National Security Agency, reportedly explained to his workforce last week that he had declined to assist President Donald Trump in his efforts to undermine the FBI and its counterintelligence investigation of the White House. As Rogers is said to have explained to agency personnel, "There is no question that we have evidence of election involvement and questionable contacts with the Russians."

On this basis, Admiral Rogers confirmed the existence of highly classified signals intelligence which establishes some sort of collusion between Team Trump and the Kremlin during the 2016 election campaign. However, now that the Justice Department has appointed Robert Mueller special counsel charged with running the Russia investigation, NSA is apparently pulling out all the stops to track down any additional evidence which might be relevant to the expanded inquiry into KremlinGate.

Specifically, last week NSA is believed to have sent out an unprecedented order to the Directorate of Operations, the agency's largest unit. The DO, as insiders term it, manages all of NSA's SIGINT assets worldwide, making it the most important spy operation on earth. The email sent to every person assigned to the DO came from the Office of General Counsel, the NSA's in-house lawyers, and it was something seldom seen at the agency: a preservation order.

Such an order would have charged every DO official, from junior analysts to senior managers, with finding any references to individuals involved in KremlinGate, especially high-ranking Americans, and preserving those records for Federal investigators. This would include intercepted phone calls and any transcripts of them, emails, online chats, faxes: anything the agency might have picked up last year.

At the request of NSA officials, I will not name the specific individuals that DO personnel have been told to be on the lookout for in SIGINT intercepts, but one could fairly surmise that the list includes virtually all key members of Team Trump.

There are several possible ways such individuals can come up in raw SIGINT. First, they might be the people talking or chatting: in other words, first-person intercept. If NSA has a relevant top-secret warrant issued by the Foreign Intelligence Surveillance Court, such intelligence collection is perfectly legal, as well as highly classified.

Second, someone might recount a conversation with one of the individuals the DO is interested in. This seems to be the scenario behind the recent sensational story about how Jared Kushner, the president's son-in-law, all-purpose adviser, and former publisher of Observer, is said to have asked Sergei Kislyak, Russia's ambassador in Washington, to use his embassy's secure communications to talk to Moscow. NSA reportedly intercepted a conversation between Kislyak and the Kremlin in which the ambassador relayed Kushner's request to his bosses back home.

Third is a kind of intercept which NSA terms "reflections," meaning that none of the individuals on the DO's list are involved, but one or more of them are being discussed by third parties. For instance, this could be a conversation between foreign officials about Team Trump and its mounting Russia problems. If the people discussing it are VIPs, their opinions may have intelligence value for policymakers in Washington, even if their conversation may shed no new investigative light on KremlinGate.

An example: if NSA intercepts a conversation in which senior diplomats from Middle Eastern countries are chatting about Trump's relationship with Moscow, that could be an important reflection of how their countries' leaderships view the situation in Washington. If one of those Middle Eastern countries is a close ally (or avowed enemy) of the United States, their views of KremlinGate would be of interest to high-ranking American officials, even if the conversation is based on no more than press reports and office gossip.

The DO is divided into offices which focus on a specific country or region (e.g. China, the Middle East) or on a defined problem set (e.g. counterterrorism, counterintelligence). Months ago, the DO's Russia shop is said to have received a preservation order from the agency's lawyers; no surprise, given what that office does. Now such an order has reportedly been passed to the whole DO, including offices which have nothing to do with Russia. This demonstrates the agency's serious intent to provide investigators with any evidence which may shed additional light on KremlinGate.

That said, NSA may have another motive in issuing this DO-wide order. That motive is the Intelligence Community's venerable tradition of self-preservation, what spy veterans term CYA. As Trump's Russia problems have heated up, his fans and media allies have made increasingly serious accusations of malfeasance by NSA and other spy agencies under President Obama. Some of these wild charges have been ludicrous, merely lies created by Kremlin disinformation outlets, then parroted by right-wing media in America.

That media has lavished particular attention on the issue of SIGINT "unmasking," meaning the process of how NSA responds to high-level requests to reveal the name of any American who appears in an intelligence report (normally those names are redacted; for an explanation of how this complex issue really works, see this). Although there's no evidence of any systematic abuse of unmasking by President Obama, this hasn't halted the increasingly shrill accusations.

The Kremlin has tried to smear NSA for years, and that clandestine campaign got a big boost with the defection of Edward Snowden to Moscow almost four years ago. As I've explained, discrediting NSA and its global intelligence partnerships played a key role in Russia's interference in our election last year. In order to counter pervasive lies about the agency and its mission, the reported preservation order includes collecting all customer requests for unmaskings, plus records of which agency analysts accessed the information and when, exactly: in other words, complete data trails of all incidents of SIGINT unmasking in 2016.

An undertaking of this size and scope has never happened in NSA's 65-year history. Although preservation orders have been issued previously, never has the entire DO been told to search all its databases for SIGINT on named individuals, then preserve anything that's discovered. KremlinGate is a unique event in our nation's history, with accusations of nefarious meddling by hostile intelligence agencies in our democracy, and it's bringing about unprecedented developments in our spy agencies too.

Given the complex nature of SIGINT (such a DO preservation order will require thousands of analysts to reexamine at least hundreds of thousands of intercepted communications), it seems likely that some relevant information will be uncovered. Although the public may not learn of new evidence anytime soon, we can rest assured that anything pertinent to the KremlinGate inquiry will be shared with the FBI and Bob Mueller's investigators without delay.

John Schindler is a security expert and former National Security Agency analyst and counterintelligence officer. A specialist in espionage and terrorism, he's also been a Navy officer and a War College professor. He's published four books and is on Twitter at @20committee.

See the original post here:
NSA in Unprecedented Hunt for KremlinGate Evidence - Observer

NSA leakers begin sign-ups for monthly leak subscription service – The Hill

A group of suspected National Security Agency (NSA) leakers known as the ShadowBrokers on Tuesday announced more details of their monthly subscription service to provide remaining documents from their NSA cache.

The group has been releasing files that appear to have been pilfered from the NSA in 2013 since last summer, most notably releasing a suite of Windows hacking tools that were subsequently used in the WannaCry ransomware that induced a global panic earlier this month.

The ShadowBrokers on Tuesday posted instructions on how to join a "Wine of the Month" club for new NSA leaks. In the post, the group said interested parties should send 100 ZCash coins (a digital currency akin to bitcoin) to sign up for the service. Enrollment will begin June 1 and end June 30.

ZCash coins are currently worth over $230 apiece, making the sign-up cost more than $23,000. Only available since October, ZCash is a relatively new entry into the digital currency market, something the ShadowBrokers address in a Q-and-A section of their post:

"Q: Is Zcash safe and reliable?

"[Explitive] no! If you caring about loosing $20k+ Euro then not being for you. Monthly dump is being for high rollers, hackers, security companies, OEMs, and governments. Playing 'the game' is involving risks."

The post is written in the ShadowBrokers' trademark inconsistent broken English, widely believed to be an attempt to conceal the group's identity.

Little is known about the ShadowBrokers, including whether they are a group or individual, and hackers or NSA insiders leaking files. They first appeared in August trying to auction the complete set of tools, releasing an initial leak purportedly to drum up interest in the sale.

They returned in April to leak Windows tools in what the Brokers said was a protest of President Trump abandoning his hard-right positions for a more centrist view.

In an apparent attempt to capitalize on the notoriety of WannaCry, the ShadowBrokers announced their monthly leaking service just after WannaCry garnered international headlines.

The leaked documents appear to be at least in part genuine NSA documents. One of the hacking tools released by the group contained an identification code mentioned in a previously unreleased Edward Snowden file.

The ShadowBrokers claim they will not announce the contents of the monthly leaks in advance.

"Q: What is going to be in the next dump?" ask the Brokers in the Monday post.

"TheShadowBrokers is not deciding yet. Something of value to someone. See theshadowbrokers previous posts... Peoples is seeing what happenings when theshadowbrokers is showing theshadowbrokers first. This is being wrong question. Question to be asking 'Can my organization afford not to be first to get access to theshadowbrokers dumps?'"

Link:
NSA leakers begin sign-ups for monthly leak subscription service - The Hill

Hackers, the NSA, Microsoft or Shadow Brokers: Who’s really to blame for WannaCry? – Healthcare IT News (blog)

I try not to jump on bandwagons, but with so much coverage of the worldwide WannaCry mess, I do have a few things to say that you may not have seen elsewhere.

There's been plenty of media coverage, so I'll just give a high-level overview of what happened. Like many other nations, the US National Security Agency (NSA) studies computer flaws and develops ways to attack them. The Shadow Brokers are a hacker group who started leaking some of these NSA-developed attacks in the second half of 2016. The April 2017 edition of their leaks included the code that enabled the WannaCry attack.

The attack that started on Thursday, May 11 consisted of two parts. One would encrypt files so that the owner could not access them (commonly called "CryptoWare"). The other part could get remote access to any vulnerable computer. This was a very powerful combination, and this is the first time we've seen this kind of auto-spreading cryptoware. Once infected, the victim sees a screen that directs them to pay a ransom in Bitcoin, so the whole attack is considered ransomware.

Now, Microsoft did release a patch in March to fix some of these problems, in particular, the remote access part. So no problem, right? Desktops and laptops are usually easier to patch, and you should always have your home systems set to automatically update. But servers need more testing to assure that applications continue to work as expected.

Patching was a critical part of the fix, but there was definitely more to it, including things like new anti-virus signatures, whitelisting, intrusion prevention signatures and firewall rules.
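To make the firewall piece concrete: WannaCry's worm component spread by attacking Windows file sharing (SMB) on TCP port 445, so one widely recommended stopgap was blocking that port at network boundaries. Here is a minimal sketch for a Linux gateway; the rule placement is an assumption for illustration, not a complete policy:

```sh
# Drop SMB traffic (TCP 445), the service WannaCry's spreading mechanism
# exploited. A blanket drop like this would also break legitimate Windows
# file sharing, so a real deployment would scope it to untrusted interfaces.
iptables -A INPUT   -p tcp --dport 445 -j DROP
iptables -A FORWARD -p tcp --dport 445 -j DROP
```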

One reason healthcare orgs seemed to be hit hard, including the British National Health Service (NHS), is that healthcare often has many older systems and applications. These can be hard to update. With NHS, most of their desktop systems were running Windows XP!

But now ... who is actually to blame?

In his podcast, episode 21, Graham Cluley asked this question. In particular, the question was: which of Microsoft, the NSA, or the Shadow Brokers is to blame? You can listen to hear what they said.

Here's my opinion. There's one group to blame and then an additional area of concern.

The "blame" goes to those who carried out the exploit!

Yes, they took advantage of existing vulnerabilities and companies that didn't protect their systems, but that's beside the point. I mean, just because a bank has a welcome mat at the door doesn't mean you're allowed to rob it.

Now, what about these software vulnerabilities? Why do we keep having these problems? Someone has to write these in the first place. I'm not picking on software developers. I've been one. It's a hard job. Code is very complex. This is something we call secure software engineering and it's not easy, but there are tools available to help us.

For additional reading, security expert Marcus Ranum was talking about this whole complexity issue 12 or more years ago.

And, not to pick on Microsoft, but why do we give everyone a computer with a general-purpose operating system when most people only do specific things like email or word processing, which can both be accomplished inside a browser? The more we can simplify, the better off we'll be.

And then there's the corresponding monoculture problem we were discussing 15-20 years ago: If everyone's computer is basically the same, then one problem can take them all out (think potato blight). And that seems to happen pretty regularly (to computers, not crops). The answer here is special-purpose computing, and the more scaled down the better. I'm a big fan of "thin" systems like Chromebooks.

What can you do to protect yourself?

At home and at work: be careful with links and attachments.

At work: follow security policies and work with your IT team on all technology needs (if it uses electricity, it could pose a security issue).

At home: use the default for Windows updates; use Secunia PSI; if you use Windows, turn on all the default protections; and, of course, back up your data.

There are many good tools available for home backups, including Carbonite and CrashPlan, or even Microsoft OneDrive or Google Drive. And two good choices for encrypting data are BitLocker and VeraCrypt.

You need to take these kinds of steps now because we are going to see more new and bigger attacks coming in the future.

This post originally appeared on Barry Caplin's Security and Coffee blog.

Excerpt from:
Hackers, the NSA, Microsoft or Shadow Brokers: Who's really to blame for WannaCry? - Healthcare IT News (blog)

Software simplified – Nature.com


In 2015, geneticist Guy Reeves was trying to configure a free software system called Galaxy to get his bioinformatics projects off the ground. After a day or two of frustration, he asked members of his IT department for help. They installed Docker, a technology for simulating computational environments, which enabled him to use a special version of Galaxy that came packaged with everything he needed, called a container. A slight tweak to the Galaxy settings, and he was done before lunch.

Reeves, at the Max Planck Institute for Evolutionary Biology in Plön, Germany, is one of many scientists adopting containers. As science becomes ever more data intensive, more software is being written to extract knowledge from those data. But few researchers have the time and computational know-how to make full use of it. Containers, packages of software code and the computational environment to run it, can close that gap. They help researchers to use a wider array of software, accelerate experiments and promote reproducibility.

Containers are essentially lightweight, configurable virtual machines: simulated versions of an operating system and its hardware that allow software developers to share their computational environments. Researchers use them to distribute complicated scientific software systems, thereby allowing others to execute the software under the same conditions that its original developers used. In doing so, containers can remove one source of variability in computational biology. But whereas virtual machines are relatively resource-intensive and inflexible, containers are compact and configurable, says C. Titus Brown, a bioinformatician at the University of California, Davis. Although configuring the underlying containerization software can be tricky, containers can be modified to add or remove tools according to the user's need, a flexibility that has boosted their popularity, he says. "I liked the idea of having something that works out of the box," says Reeves.

Lab-built tools rarely come ready to run. They often take the form of scripts or programming source code, which must be processed and configured. Much of the software requires additional tools and libraries, which the user may not have installed. Even if users can get the software to work, differences in computational environments, such as the installed versions of the tools it depends on, can subtly alter performance, affecting reproducibility. Containers reduce that complexity by packaging the key elements of the computational environment needed to run the desired software, including settings and add-ons, into a lightweight, virtual box. They don't alter the resources required to run it: if a tool needs a lot of memory, then so too will its container. But they make the software much easier to use, and the results easier to reproduce.

Depending on the software used (Docker, Singularity and rkt are popular), containers can run on Windows, Mac OS X, Linux or in the cloud. They can package anything from a single process to a complex environment such as Galaxy. These tools can interact with each other, sharing data or building pipelines, for instance. Because each application resides in its own box, even tools that would ordinarily conflict with each other can run harmoniously.

Docker uses executable packages, called images, which include the tool to be contained as well as the developer's computational environment. To create a Docker image, a developer creates a configuration file with instructions on how to download and build all the required tools inside it. He or she then 'runs' the file to create an executable package. All the user then needs to do is retrieve the package and run it. Other tools can also generate images. The ReproZip program, for example, assembles Docker-compatible packages by watching as software tools run and tracing the input files and software libraries that the tool requires.
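To make that cycle concrete, here is a minimal, hypothetical sketch of the build-and-run workflow; the image name (mylab/analyze) and the script it packages (analyze.py) are invented for illustration:

```sh
# 1. A configuration file (a "Dockerfile") lists the base operating system,
#    the dependencies to install, and the tool to copy in.
cat > Dockerfile <<'EOF'
FROM ubuntu:16.04
RUN apt-get update && apt-get install -y python python-numpy
COPY analyze.py /opt/analyze.py
ENTRYPOINT ["python", "/opt/analyze.py"]
EOF

# 2. 'Running' the file builds an executable image.
docker build -t mylab/analyze:1.0 .

# 3. Anyone who retrieves the image executes the tool in the same
#    environment, here with a local data directory mounted inside.
docker run --rm -v "$PWD/data:/data" mylab/analyze:1.0 /data/input.csv
```

Because the operating system, libraries and tool versions are all frozen inside the image, two users running these commands see the same computational environment.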

Deborah Bard, a computer scientist at the National Energy Research Scientific Computing Center in Berkeley, California, helps researchers to install their software on the lab's supercomputer. She recalls spending three or four days installing a complex software pipeline for telescope simulation and analysis. Using containers cut this time down to hours. "You can spend your time doing science instead of figuring out compiler versions," she says.

For Nicola Mulder, a bioinformatician at the University of Cape Town in South Africa, containers help her to synchronize a cross-border bioinformatics network she runs in Africa, called H3ABioNet. Not all African institutions have access to the same computational resources, she explains, and Internet connectivity can be patchy. Containers allow researchers with limited resources to access tools that they otherwise might not be able to use.

They also allow researchers with sensitive genomic data to collaborate and compare findings without actually sharing the underlying data, Mulder says. And, if researchers at one site obtain different results from their colleagues at another, the standardization the containers provide could eliminate one of the reasons why.

Although computer scientists have multiple options for container platforms, Docker, which is an open-source project launched in 2013, is perhaps the most popular among scientists. It has a large registry of prebuilt containers and an active online community that competitors have yet to match. But many administrators of high-performance computing systems preclude Docker use because it requires high-level administrative access privileges to run. This type of access may allow users to copy or damage anything on the system. An add-on to the fee-based enterprise edition allows users to sidestep that requirement, but it is not available with the free, community edition. They can, however, use a different containerization tool such as Shifter, which doesn't require full privileges, or root access, but still supports Docker images.
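In practice, the Shifter workflow stays close to Docker's. A rough sketch, assuming a Shifter-enabled cluster along the lines of NERSC's (the image is illustrative):

```sh
# Import a public Docker image through Shifter's image gateway,
# then run it as an ordinary, unprivileged user.
shifterimg pull docker:ubuntu:16.04
shifter --image=docker:ubuntu:16.04 cat /etc/os-release
```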

The requirement for root access is the biggest obstacle to widespread adoption of Docker, Brown explains. Many academics run bioinformatics tools on high-performance computing clusters administered by their home institutions or the government. "Of course, they don't have administrative privileges on most of those systems," he says. Brown spends about US$50,000 annually for cloud computing time on Amazon Web Services, but he says this represents just one-third of his computing work; the rest is carried out on a cluster at Michigan State University, where he lacks root-level access. As a result, Brown creates Docker containers of his tools for distribution, but can rarely use them himself.

Researchers can access Docker images either from the platform's own hosting service, Docker Hub, or from registries of containers such as BioContainers and Dockstore, which allow the sharing of tools vetted by other scientists. Brian O'Connor at the University of California, Santa Cruz, who was the technical lead for the Dockstore registry, recommends that scientists look through container registries to find a tool that works for their project instead of trying to reinvent something that already exists.
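Reusing a vetted container can then be a one-line affair. A sketch using a BLAST image that BioContainers publishes on Docker Hub (treat the exact tag as illustrative):

```sh
# Fetch a community-maintained image instead of building one from scratch,
# then run the packaged tool directly.
docker pull biocontainers/blast:v2.2.31_cv2
docker run --rm biocontainers/blast:v2.2.31_cv2 blastp -version
```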

But actually getting the underlying Docker software to run properly can be challenging, says Simon Adar, chief executive of Code Ocean in New York, an online service that aims to simplify the process. "It's too technical; it was designed for developers to deploy complex systems." The service, launched in February, creates what Adar calls "compute capsules," which comprise code, data, results and the Docker container itself. Researchers upload their code and data, and then either execute it in a web browser or share it with others, no installation required. Adar likens the process to sharing a YouTube video. The company even offers a widget that enables users to embed executable code in web pages.

Shakuntala Baichoo, a computer scientist at the University of Mauritius in Moka, learned about containers at a communal programming event, called a hackathon, organized by H3ABioNet. Previously, she spent hours helping collaborators install her tools. In making the tools easier to install, she says, containers not only free up her time, but they might also encourage scientists to test them and provide feedback.

At CERN, the particle-physics laboratory near Geneva, Switzerland, scientists use containers to accelerate the publication process, says physicist Kyle Cranmer at New York University, who works on CERN's ATLAS project, which searches for new elementary particles. When physicists run follow-up studies, they have to dig up code snippets and spend hours redoing old analyses; with containers, they can package ready-to-use data-analysis workflows, simplifying and shortening the process.

Cranmer says that although much of the debate around reproducibility has focused on data and code, computing environments themselves also play a big part. "It's really essential," he says. One study of an anatomical analysis tool's performance in different computing environments, for example, found that the choice of operating system produced a small but measurable effect (E. H. B. M. Gronenschild et al. PLoS ONE 7, e38234; 2012).

But containers are only as good as the tools they encapsulate, says Lorena Barba, a mechanical and aerospace engineer at George Washington University, Washington DC. "If researchers start stuffing their bad code into a container and pass it on, we are foredoomed to failure." And, says Brown, without pressure from funding agencies and journals, containers are unlikely to make researchers suddenly embrace computational reproducibility.

Indeed, few researchers are using containers, says Victoria Stodden, a statistician at the University of Illinois at Urbana-Champaign who studies computational reproducibility. In part that's because of a lack of need or awareness, but it is also because they might not have the computer skills needed to get going.

Behind the scenes, however, that could be changing. Companies such as Google and Microsoft already run some software in containers, says Jonas Almeida, a bioinformatician at Stony Brook University, New York. Large-scale bioinformatics projects may not be far behind. The cloud-based version of Galaxy will eventually run inside containers by default, says Enis Afgan, a computer scientist at Johns Hopkins University in Baltimore, Maryland, who works on Galaxy.

In 5–10 years, Almeida predicts, scientists will no longer have to worry about downloading and configuring software; tools will simply be containerized. "It's inevitable," he says.

The rest is here:
Software simplified - Nature.com