
Four in five firms say they avoid running sensitive data in the public cloud – Cloud Tech

As the true definition of hybrid cloud continues to blur, eight out of 10 enterprises in a survey from Stratoscale say they avoid running sensitive data in the public cloud.

The results appear in the company's Hybrid Cloud Survey, which was conducted in June and features more than 600 responses from organisations of varying sizes.

According to the research, almost four in five (77%) respondents define the value of the hybrid cloud in one of two ways, depending on how far advanced their adoption is. In enterprises with a hybrid cloud adoption level below 20%, hybrid is most frequently defined as the ability to move workloads between private and public cloud; once adoption exceeds 20%, the concept shifts to one of different workloads belonging in different public and private environments.

More than 80% of enterprises polled said they had either moderate or high levels of concern around public cloud lock-in, with the smallest companies (those with fewer than 100 employees) the most concerned. More than half of enterprises have also moved less than one fifth of their workloads to the public cloud; smaller firms are the furthest into their journey, while large companies have more users who start but progress at a much slower pace.

"The transformation to a digital business by implementing cloud services and platforms is no longer much of an option; it's an imperative for the continued survival of any enterprise," said Ariel Maislos, Stratoscale CEO, in a statement. "The findings from our survey confirm what we're hearing from our customers: although many have started their journey to the public cloud, the vast majority of companies are still running mission-critical workloads and sensitive data in private solutions, primarily for security reasons."

"It's clear the hybrid cloud model represents the near and long-term future for most enterprises, regardless of size," Maislos added.

The lack of consensus around the definition of hybrid cloud is reminiscent of a study conducted by Clutch at the start of this year regarding DevOps, where not even a quarter of respondents could agree on a definitive meaning for the term.

Read more: What is the point of hybrid cloud - or, is it time to re-evaluate hybrid?


Amazon Miss Sparks Debate: Will Video Pay Off Like Cloud … – Investor’s Business Daily

Amazon.com's (AMZN) surprise second-quarter earnings miss has analysts debating the margin impact of its far-flung investments, and while cloud computing seems a good bet to pay off, internet video may be raising some eyebrows.

The difference: Amazon Web Services is by far the biggest provider of cloud computing services, leading Microsoft (MSFT), Google-parent Alphabet (GOOGL) and others.

In web video, Amazon aims to catch up with Netflix (NFLX), Google's YouTube, Time Warner's (TWX) HBO and others. Amazon has stepped up investments in original content and has expanded video streaming services overseas.

Shares in the e-commerce giant were down 2.5% to 1,020.04 on the stock market today after Amazon's revenue topped views, but profit fell far short of estimates. That's off intraday lows of 1,001 and above Amazon's recent buy point of 1,016.60.

Michael Olson, analyst at Piper Jaffray, maintains an overweight rating on Amazon stock.

"An irony in the Amazon margin story is that we don't believe investors would disagree with any of the initiatives to which Amazon is committing significant capex or opex dollars," he said in a note to clients.

At Stifel, analyst Scott Devitt isn't sure about that.

"Amazon is spending aggressively on everything under the sun," Devitt said in a report. "It may all work but it is clear that investment levels will be heightened in the near term. The most recent investment cycle created a buying opportunity once certain investments received reduced allocations based on limited success, China and mobile phone. We see a lot of positive in this cycle but note two areas that could eventually be viewed as problematic by investors: video and various new retail initiatives such as grocery."

The e-commerce giant views internet video as a recruitment tool for its customer-loyalty program, Amazon Prime. The subscription service, which costs $99 annually, has 80 million U.S. members. Prime encourages shoppers to buy more goods, with free delivery and other perks.

IBD'S TAKE: Amazon, an IBD Leaderboard stock, has retreated from a buy range after reporting earnings. It had cleared a late-stage flat base with a 1,016.60 buy point. Its IBD Composite rating has dipped to 97 out of a possible 99.

"Amazon strongly believes that its video content offerings efficiently enable conversion of customers into paid Prime subscribers and lead to greater spend per subscriber," said Mark Mahaney, a RBC Capital analyst in a report.

Aside from cloud computing and online video, Amazon's other big investments include retail grocery, fulfillment-center buildouts, engineering staff and salespeople, artificial intelligence and overseas markets like India. "The fulfillment and AWS waves may dissipate, but the video investment seems early," said Michael Graham, a Canaccord Genuity analyst, in a report.

RELATED:

Amazon Second-Quarter Earnings Have A Big Miss On Bottom Line

Is The Holy Grail Of Digital Payments Just One Click Away?

Here's One More Thing Amazon Is Killing


RMU offers new course in Amazon cloud computing with eyes on … – Pittsburgh Post-Gazette


Robert Morris University will begin offering Amazon Web Services training, which could prepare its IT students for a career in cloud computing.



The central role of the server in open networking – Cloud Tech

Open networking is a hot topic these days. When we read about open networking products and initiatives, the emphasis is on network switches more often than not. But server-based networking has also proceeded along an increasingly open path, and in many ways it set the stage for the opening of switch technology.

Network switches, such as top-of-rack (TOR) switches, have traditionally been closed: they come from specific vendors with proprietary software. Networking in commercial off-the-shelf (COTS) servers, by contrast, has been open for several years, thanks to the proliferation of Linux server operating systems (OSs) and networking technologies like Open vSwitch (OVS). The networking industry wants the switch world to follow the servers' successful path; hence the birth and popularity of the term "open networking."

Switches have traditionally been closed: the network operating systems and protocols that run on them have been proprietary, could not be disaggregated from the hardware and were not open source. At first, switches were really closed because the switch ASIC, the software and the switch box were all from a single vendor and were proprietary. Then, switches got disaggregated a bit when switch vendors adopted switch ASICs from merchant silicon vendors like Broadcom. Next came OpenFlow and OpenFlow-based SDN controllers like Floodlight, which proposed that the switch control plane protocols be removed from the switch and placed in an open source controller. This in some ways disaggregated the OS from the switch box.

Subsequently, switch operating systems like Cumulus Linux came into the market. These are disaggregated because they can install and run on merchant switch ASIC-based switch boxes from multiple vendors like Quanta and Dell. But such disaggregated switch OSes are not necessarily open source.

More recently, open source switch operating systems like SONiC and Open Network Linux have been in the news. The open source controller ecosystem has further evolved as well, focusing on feature completeness and carrier-grade reliability (e.g. OpenDaylight and ONOS).

All in all, significant action and news in the realm of open networking have been related to switches, geared toward helping the industry manage the switch supply chain more effectively and deploy efficiently, similar to the COTS server model.

Figure 1: Switch disaggregation follows server model

What seems to get overlooked in these discussions about open networking is the all-important precursor to this movement: open networking on servers. Most important is how open networking on servers (or server-based open networking) has evolved and enabled open networking on switches.

Over the last several years, TOR switches have become simpler because data centre traffic patterns have changed and networking infrastructure efficiency requirements have increased. When using leaf (TOR) and spine switches, the imperative has shifted to moving east-west traffic most efficiently, which requires more bandwidth, more ports and lower latency. As a result, the feature requirements in hardware and software in leaf and spine switches have been reduced to a simpler set. This has made open networking in switches easier to implement and deploy.

However, the smarts of networking did not disappear; they just moved to the server, where they are implemented using the virtual switch (preferably an open one such as OVS) and other Linux networking features like iptables. Many new networking features related to network security and load balancing have been added to OVS.
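To make this concrete, here is a minimal sketch of server-side switch programming, assuming the standard ovs-ofctl command-line tool is installed and that a bridge named br0 (a hypothetical name for this example) already exists. It simply pushes a drop rule into the virtual switch's flow table, which is the kind of security feature described above as now living on the server rather than on the TOR switch; it illustrates the idea and is not taken from the article.

```python
#!/usr/bin/env python3
"""Minimal sketch: adding a security rule to an Open vSwitch bridge on the host.

Assumptions: the ovs-ofctl tool is installed, a bridge named "br0"
(hypothetical) exists, and the script runs with sufficient privileges.
"""
import subprocess

BRIDGE = "br0"  # hypothetical bridge name for this example


def add_drop_rule(dst_port: int) -> None:
    """Install an OpenFlow rule that drops TCP traffic destined for dst_port."""
    flow = f"priority=100,tcp,tp_dst={dst_port},actions=drop"
    subprocess.run(["ovs-ofctl", "add-flow", BRIDGE, flow], check=True)


def dump_flows() -> str:
    """Return the bridge's current flow table so the new rule can be inspected."""
    result = subprocess.run(["ovs-ofctl", "dump-flows", BRIDGE],
                            check=True, capture_output=True, text=True)
    return result.stdout


if __name__ == "__main__":
    add_drop_rule(23)   # e.g. block legacy telnet at the virtual switch
    print(dump_flows())
```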

OpenStack, as an open source and centralized cloud orchestration tool, has rapidly come to prominence, with more than 60% of OpenStack networking deployed today using OVS (with OpenStack Neutron). Server-based open networking has evolved relatively quietly compared to open networking in switches, but it has made major contributions toward bringing deployment efficiencies and flexibility.

Today, in many high-growth cloud, SDN and NFV applications, server-based open networking is running into server sprawl and related TCO challenges. As networking bandwidths increase and the number of VMs on each server proliferates, OVS processing takes up an increasingly large share of CPU cycles, limiting the cycles available for applications and VMs.

Data centre operators cannot economically scale their server-based networking using traditional software-based virtual switches. So implementing server-based networking in x86 architectures and software is a double whammy: it increases costs as too many CPU cores are consumed, and it lowers performance as applications are starved for resources.

Offloading network processing to networking hardware is an option that has worked well in the past. However, software-defined and open source networking is evolving at a rapid pace; such innovation stops the moment data centre operators look to inflexible networking hardware for performance and scale.

Figure 2: Networking smarts moving to servers

The solution to this challenge is to offload OVS processing to a SmartNIC. A SmartNIC handles I/O functions and incorporates a programmable network processor that can run OVS and other software. With a SmartNIC handling OVS processing, performance is boosted by up to 5X, and the data centre operator frees as many as 11 CPU cores from network-related processing, enabling greater VM scalability and lower costs. Because it is programmable, a SmartNIC can evolve rapidly with new features, preserving the pace of innovation.
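As a rough sketch of how that offload is switched on from the server side (assuming the ovs-vsctl tool is present; whether flows actually move to hardware depends on the NIC and its driver, and the 5X and 11-core figures above are the article's claims, not something this snippet measures):

```python
#!/usr/bin/env python3
"""Sketch: asking OVS to offload datapath flows to capable hardware such as a
SmartNIC. Assumes ovs-vsctl is installed; the OVS daemon typically needs a
restart before the setting takes effect, and offload only engages on NICs
whose drivers support it."""
import subprocess


def enable_hw_offload() -> None:
    # other_config:hw-offload=true is the OVS knob for hardware offload.
    subprocess.run(
        ["ovs-vsctl", "set", "Open_vSwitch", ".", "other_config:hw-offload=true"],
        check=True,
    )


def current_setting() -> str:
    # Read the setting back; returns empty output if it was never set.
    result = subprocess.run(
        ["ovs-vsctl", "get", "Open_vSwitch", ".", "other_config:hw-offload"],
        capture_output=True, text=True,
    )
    return result.stdout.strip() or "(not set)"


if __name__ == "__main__":
    enable_hw_offload()
    print("hw-offload =", current_setting())
```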

Although server-based networking by itself can cause server sprawl, SmartNICs are making the case for efficient and flexible open networking from the COTS server side.

Figure 3: A SmartNIC offloads networking from servers


Federal Cloud Computing – TechTarget

The following is an excerpt from Federal Cloud Computing by author Matthew Metheny and published by Syngress. This section from chapter three explores open source software in the federal government.

Open source software (OSS) and cloud computing are distinctly different concepts that have independently grown in use, both in the public and private sectors, but have each faced adoption challenges by federal agencies. Both OSS and cloud computing individually offer potential benefits for federal agencies to improve their efficiency, agility, and innovation, by enabling them to be more responsive to new or changing requirements in their missions and business operations. OSS improves the way the federal government develops and also distributes software and provides an opportunity to reduce costs through the reuse of existing source code, whereas cloud computing improves the utilization of resources and enables a faster service delivery.

In this chapter, issues faced by OSS in the federal government will be discussed, along with their relationship to the federal government's adoption of cloud computing technologies. However, this chapter does not set out to differentiate OSS from proprietary software; rather, it focuses on highlighting the importance of the federal government's experience with OSS in the adoption of cloud computing.

Over the years, the private sector has encouraged the federal government to consider OSS by making the case that it offers an acceptable alternative to proprietary commercial off-the-shelf (COTS) software. Regardless of the potential cost-saving benefits of OSS, federal agencies have historically approached it with cautious interest, and there are other potential issues beyond cost in transitioning from existing proprietary software. These issues include a limited in-house skillset for OSS developers within the federal workforce, a lack of knowledge regarding procurement or licensing, and the misinterpretation of acquisition and security policies and guidance. Although some of these challenges and concerns have limited or slowed a broader-scale adoption of OSS, federal agencies have become more familiar with OSS and, with the marketplace's expansion of available products and services, have come to consider it a viable alternative to enterprise-wide COTS software. This renewed shift toward OSS is also being driven by initiatives such as 18F and the US Digital Service, and by the publication of guidance such as the Digital Services Playbook, which urges federal agencies to "consider using open source, cloud based, and commodity solutions across the technology stack".

Interoperability, portability, and security standards have already been identified as critical barriers for cloud adoption within the federal government. OSS facilitates overcoming standards obstacles through the development and implementation of open standards. OSS communities support standards development through the "shared" development and industry implementation of open standards. In some instances, the federal government's experience with standards development has enabled the acceptance and use of open standards-based, open source technologies and platforms.

The federal government's use of OSS has its beginnings in the 1990s. During this period, OSS was used primarily within the research and scientific community, where collaboration and information sharing were a cultural norm. However, it was not until 2000 that federal agencies began to seriously consider the use of OSS as a model for accelerating innovation within the federal government. As illustrated in Fig. 3.1, the federal government has produced a series of OSS-related studies, policies, and guidelines that have formed the basis for the policy framework that has guided the adoption of OSS. This framework tackles critical issues that have inhibited the federal government from attaining the full benefits offered by OSS. Although gaps still exist in specific guidelines relating to the evaluation, contribution, and sharing of OSS, the policy framework serves as a foundation for guiding federal agencies in the use of OSS. In this section, we will explore the policy framework with the objective of describing how it has led to the broader use of OSS across the federal government and, more importantly, how it has enabled the federal government's adoption of cloud computing by overcoming the challenges with acquisition and security that will be discussed in detail in the next section.

The President's Information Technology Advisory Committee (PITAC), which examined OSS, was given the goal of:

The PITAC published a report concluding that the use of the open source development model (also known as the Bazaar model) was a viable strategy for producing high-quality software through a mixture of public, private, and academic partnerships. In addition, as presented in Table 3.1, the report also highlighted several advantages and challenges. Some of these key issues have been at the forefront of the federal government's adoption of OSS.

Over the years since the PITAC report, the federal government has gained significant experience in both sponsoring and contributing to OSS projects. For example, one of the most widely recognized contributions by the federal government specifically related to security is the Security Enhanced Linux (SELinux) project. The SELinux project focused on improving the Linux kernel through the development of a reference implementation of the Flask security architecture for flexible mandatory access control (MAC). In 2000, the National Security Agency (NSA) made SELinux available to the Linux community under the terms of the GNU's Not Unix (GNU) General Public License (GPL).

In 2001, the MITRE Corporation published a report for the US Department of Defense (DoD) that built a business case for the DoD's use of OSS. The business case discussed both the benefits and risks of considering OSS. In MITRE's conclusion, OSS offered significant benefits to the federal government, such as improved interoperability, increased support for open standards and quality, lower costs, and agility through reduced development time. In addition, MITRE highlighted issues and risks, recommending that any consideration of OSS be carefully reviewed.

Shortly after the MITRE report, the federal government began to establish specific policies and guidance to help clarify issues around OSS. The DoD Chief Information Officer (CIO) published the Department's first official DoD-wide memorandum to reiterate existing policy and to provide clarifying guidance on the acquisition, development, and the use of OSS within the DoD community. Soon after the DoD policy, the Office of Management and Budget (OMB) established a memorandum to provide government-wide policy regarding acquisition and licensing issues.

After 2003, multiple misconceptions persisted, specifically within the DoD, regarding the use of OSS. Therefore, in 2007, the US Department of the Navy (DON) CIO released a memorandum that clarified the classification of OSS and directed the Department to identify areas where OSS could be used within the DON's IT portfolio. This was followed by another DoD-wide memorandum in 2009, which provided DoD-wide guidance and clarified the use and development of OSS, including explaining the potential advantages to the DoD of reducing development time for new software, anticipating threats, and responding to continual changes in requirements.

In 2009, OMB released the Open Government Directive, which required federal agencies to develop and publish an Open Government Plan on their websites. The Open Government Plan provided a description of how federal agencies would improve transparency and integrate public participation and collaboration. As an example response to the directive's support for openness, the National Aeronautics and Space Administration (NASA), in furtherance of its Open Government Plan, released the "open.NASA" site, which was built completely using OSS such as the LAMP stack and the WordPress content management system (CMS).

On May 23, 2012, the White House released the Digital Government Strategy, which complemented other initiatives and established principles for transforming the federal government. More specifically, the strategy outlined the need for a "Shared Platform" approach, in which the federal government would need to leverage "sharing" of resources, such as the "use of open source technologies that enable more sharing of data and make content more accessible".

The Second Open Government Action Plan established an action to develop an OSS policy to improve federal agencies' access to custom software in order to "fuel innovation, lower costs, and benefit the public". In August 2016, the White House published the Federal Source Code Policy, which is consistent with the "Shared Platform" approach in the Digital Government Strategy, by requiring federal agencies to make custom code available as OSS. Further, the policy requires agencies to make "custom-developed code available for Government-wide reuse and make their code inventories discoverable at https://www.code.gov ('Code.gov')".

This section discussed key milestones that have impacted the federal government's cultural acceptance of OSS, as well as the current policy framework that has been developed through a series of policies and guidelines to support federal agencies in the adoption of OSS and the establishment of processes and policies to encourage and support the development of OSS. The remainder of this chapter will examine the key issues that have impacted OSS adoption and briefly examine the role of OSS in the adoption of cloud computing within the federal government.

About the author:

Matthew Metheny, PMP, CISSP, CAP, CISA, CSSLP, CRISC, CCSK, is an information security executive and professional with twenty years of experience in the areas of finance management, information technology, information security, risk management, compliance programs, security operations and capabilities, secure software development, security assessment and auditing, security architectures, information security policies/processes, incident response and forensics, and application security and penetration testing. He currently is the Chief Information Security Officer and Director of Cyber Security Operations at the Court Services and Offender Supervision Agency (CSOSA), and is responsible for managing CSOSA's enterprise-wide information security and risk management program, and cyber security operations.


Facebook’s Sheryl Sandberg: WhatsApp metadata informs governments about terrorist activity in spite of encryption – CNBC

"The goal for governments is to get as much information as possible. And so when there are message services like WhatsApp that are encrypted, the message itself is encrypted but the metadata is not, meaning that you send me a message, we don't know what that message says but we know you contacted me," she said.

"If people move off those encrypted services to go to encrypted services in countries that won't share the metadata, the government actually has less information, not more. And so as technology evolves these are complicated conversations, we are in close communication working through the issues all around the world."

Sandberg recently met U.K. Home Secretary Amber Rudd and told "Desert Island Discs" that Facebook and the U.K. government are "very aligned in our goals".

"We want to make sure all of us do our part to stop terrorism and so our Facebook policies are very clear. There's absolutely no place for terrorism, hate, calls for violence of any kind. Our goal is to not just pull it off Facebook but to use artificial intelligence and technology to get it before it's even uploaded.

"We are working in collaboration with the other tech companies now, so if a video by a terrorist is uploaded to any of our platforms, we are able to fingerprint it for all the others so that they can't move from platform to platform."


Commissioners need to rethink encryption – LancasterOnline

Note: The following letter was sent Friday to Lancaster County Commissioners Dennis Stuckey, Craig Lehman and Josh Parsons.

I strongly urge you to reconsider the decision to encrypt police department radio transmissions before this change takes place in November.

First, the health and safety of both our Lancaster County community and the law enforcement officials who protect it are paramount.

Second, essential to the well-being of our county must be a government system that values public accessibility, transparency and accountability.

These two truths must find a way to co-exist.

Certainly, a healthy democracy and an informed citizenry here do not depend solely on public and news media access to Lancaster County police radio broadcasts. Both are, however, seriously diminished when the public's right to know is further eroded, something that is becoming alarmingly common in our commonwealth and across this country.

Our newspaper has long relied on police communication to provide the public with emergency information. "I consider a scanner as essential to my job as a wrench to a plumber," a longtime television journalist in Oklahoma wrote to me last Sunday. He reached out in support of LNP's July 5 editorial opposing encryption.

Think snowstorms. Vehicular accidents. Road closings. Gas leaks. Homicides. Violent protests.

"Radio access enables news outlets to work hand-in-hand with first responders to keep the public away from dangerous situations," Melissa Melewsky, media law counsel for the Pennsylvania NewsMedia Association, noted in a recent LNP article. "Total encryption addresses a problem that doesn't exist where the media is concerned."

West Hempfield Township Police Chief Mark Pugliese I, who chairs the county chiefs' Police Advisory Board to Lancaster County-Wide Communications and represents the county Chiefs of Police Association on this issue, appears to agree.

Referring to events worldwide and expressing concern for police safety, he told you it's not unusual for officers today to be ambushed. But he also acknowledged that "we're not getting that so much in Lancaster County."

Additionally, the chief spoke about incidents here where the public or the media interfered with investigations, in some cases by getting to crime scenes more quickly than police.

When pressed by an LNP reporter, Chief Pugliese could not cite a single situation in Lancaster County where the media interfered at a crime scene.

The chief says he is not anti-media.

Nor am I anti-law enforcement.

When the earth rumbles or a gun fires, citizens rely on police and other first responders to courageously address the emergency. They expect us in the news media to tell them what is happening. Shutting off access to information feeds distrust and anxiety; it fuels the spread of misinformation by social media commenters unbound by the journalistic standards of citing sources and confirming details.

Chief Pugliese said that the removal of public and media access to police broadcasts will make it incumbent on police to improve the lines of communication.

Experience suggests to me that will not happen; I don't see that as law enforcement's primary role, and I don't see that law enforcement does either. Access to timely and accurate information that serves the public interest will suffer as a result.

Like law enforcement, we in the news media must be allowed to do the work we are trained to do. It is incumbent upon us to get it right and to be held accountable if we dont.

While all three of you are and must be concerned about police safety, Commissioner Lehman has said that blocking police communication might give officers a false sense of security and further isolate them from the community. He's suggested a compromise of encrypting public transmissions, but allowing access to the news media.

It is certainly a better option.

I was at home July 2 and only yards away from the horrific Manor Township gas explosion that killed one man and injured others as it leveled a house, severely damaged neighboring homes and, in seconds, rattled the psyche of an entire community.

Frightened neighbors ran outside their homes, erroneously speculating about the cause of the blast. I called the newsroom and was accurately informed that it was a gas explosion. Then I walked to the scene to join my newspaper colleagues in probing more deeply as we talked with witnesses, questioned officials and provided real-time information that a county wanted and needed in that moment.

Fire and ambulance dispatches, the ones that guided us that day, are not part of the planned encryption here. At least not yet. As Chief Pugliese noted, the scrambling of police communication, and that of fire and ambulance, is becoming the national norm.

I don't think that's the way to go. I do believe a compromise can be struck, one that will allow law enforcement to do its work, and enable those of us in the news media to do ours.

We both exist, after all, to serve our Lancaster County community to the very best of our abilities.

Barbara Hough Roda is executive editor of LNP and LancasterOnline. Email: broda@LNPnews.com; phone, 717-481-7335; Twitter, @BarbRodaLNP.


Ex-NSA chief Chris Inglis backs government’s encryption push against Apple, Facebook – The Australian Financial Review

The deputy director of the United States' National Security Agency (NSA) during the Edward Snowden leaks has backed the Australian government's push to force tech giants to assist in revealing the content of some encrypted messages, saying the likes of Facebook and Apple could do more to help track terrorists and criminals.

Speaking to The Australian Financial Review ahead of a trip to Australia this week, Chris Inglis, who was the NSA's highest-ranking civilian from 2006 to 2014, says the government's plan to enact law enforcement powers to crack open encryption by the end of the year is an appropriate attempt to strike a balance between protecting privacy and protecting citizens from terrorism.

He says the government's plan will not require the providers of apps such as WhatsApp, Wickr, Telegram Messenger and iMessage to create new so-called back doors into devices and apps, but will simply involve them doing more to open up their systems on request.

"When citizens look to their government they expect them to protect their privacy and also to keep them safe, this is not an either/or proposition. When I hear your Prime Minister and your Attorney-General speaking about this, I don't see them favouring one of these over the other," Inglis says.

"There has been scaremonger comments on these topics, but I haven't heard your government asking for new back doors, they are merely saying that, if there is a capability already there, they would like to use it under the rule of law, which has always been a legitimate government pursuit."

Tech giants such as Facebook and Apple have already asserted they provide as much assistance as they can to law enforcement agencies, both in Australia and globally, and say they are powerless to break the encryption on individual messages.

Prime Minister Malcolm Turnbull raised eyebrows around the world with a comment suggesting the laws of Australia trump the laws of mathematics, which led to Edward Snowden tweeting that such remarks create a "civilizational risk".

Apple chief executive Tim Cook wrote an open letter to customers last year after the company refused to build a system to help the FBI unlock the iPhone of one of the San Bernardino attackers, who together killed 14 people.

He said the US government's request to break encryption would require its engineers to weaken the devices for everyone else around the world.

"The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe," Cook wrote.

In July, special adviser to the Prime Minister on cyber security Alastair MacGibbon said he couldn't understand why these companies "viscerally rail against helping protect their customers", and Inglis says he believes that the likes of Apple are balancing their commercial concerns in markets in all corners of the globe against the option of being as open as possible with different governments.

"Many of these systems already have what I would describe as an appropriate, well-known back door, whether it's a patching mechanism, or it's a software update mechanism those are back doors," he says.

"Most users have every confidence in the world that those work very appropriately and that only the vendor who services their software is able to replace the software, update the software and change the function of that phone in every way, shape, or form."

Other experts, such as Firstwave Cloud Technology's Simon Ryan, have also suggested that it is entirely possible, at least for Facebook, to reveal the contents of private messages.

Inglis is heading to Australia in his role as chair of the strategic advisory board of US-based behaviour analytics cyber security firm Securonix, which is poised to officially open its operations Down Under this week.

His time in office at the NSA ended a year after its former IT contractor Edward Snowden plunged it into crisis by leaking thousands of documents that laid bare the methods and extent of the agency's surveillance programs.

Securonix provides technology that, it says, detects malicious behaviour within an organisation or network in real time, and would theoretically stop the kind of exfiltration of private data accomplished by Snowden.

While saying that he still sits more closely to the black-and-white view that Snowden committed an act of betrayal, Inglis says he now has some empathy with Snowden's purported intention to expose what he believed to be egregious behaviour by the government.

However, he says Snowden's credentials as a principled whistleblower are called into doubt by the fact that he did nothing to raise concerns in less harmful ways prior to leaking information.

"I would feel more sympathetic about him in 2013 if he had exercised one iota of having raised a hand, lodged a concern, kind of thrown a brick through somebody's window with an anonymous note to us, but he did none of those things," Inglis says.

"With allegations like these, you an obligation to actually be factually correct in what you allege is going on, and he was not I think that if you believe in your cause, you should be willing to stand and speak about that in the presence of your peers, and here he is in Moscow, so none of that speaks well of either of his motivation and certainly not of his means."

Inglis was portrayed in the 2016 Oliver Stone movie Snowden, which followed events leading up to the leak, and which he says provided an "egregious misappropriation of the facts" regarding the attitudes at the NSA and of Snowden's importance within it.

In the movie a character in Inglis' role is seen sending Snowden off to head a mission in Hawaii to solve a problem related to China, yet Inglis says the two never met in person, and Snowden was too far removed from the action to be remotely considered for such work.

"I have to imagine that the reason it was portrayed that way was not to make it more interesting, but rather to impress upon the audience that Edward Snowden was somebody that travelled in circles where he would have direct knowledge of the strategies, the means and the conspiracies that are practised by an NSA, and of course he was nowhere near in those places," he says.

"He was an important enough worker that he was hired to do what he did, but he was working at the edge, and many of the things that he saw, he didn't fully understand the context of, and he therefore misdescribed."

Inglis says the sense of shock that permeated the NSA following the leaks had passed by the time he left the agency. He says that he and others within the NSA were comfortable that they were doing the right thing, with noble intentions, and believed they made the scandal worse by mismanaging their external communications before Snowden leaked.

He says the agency should have explained why it had surveillance plans in place and proactively addressed concerns about a lack of controls and restraint.

"If I could go back in time I would address the fact that the government and NSA were not transparent enough the noble purpose and controls were not as well understood as what Snowden was talking about, which was capability, and a capability that you might enjoy never tells the whole story," Inglis says.

"Most of his allegations were taken as revelations and they were not. His allegations were just that. They were facetious and vilified us."

Moving into the present, Inglis says he understands people outside the US viewing its present administration with a sense of worry. However, he believes that the checks and balances in place would not allow an unpredictable president to become a national security risk.

The Trump presidency has been dogged by suggestions that his team has been too close to Moscow since the election campaign, but Inglis says there are enough protections in place that would prevent the President from exceeding his remit.

"If I was still at the NSA, I would have to appreciate the President has a role, and that role within the United States system is that he is not the sole and ultimate authority on how the nation proceeds," he says.

"You have to actually let this play out, because it's still true that the conflict of ideas is one of our best ideas. I'm confident at the end of the day that our system is going to work its way through what looks like some pretty chaotic controversies at a distance, and frankly, most days, close in, feels that way as well.

"There is a genuine battle of ideas taking place as to what is the proper role of government, and the views are extreme. It looks a bit worrisome, both close in and at a distance, but the system has lived through periods where it was equally chaotic before and we worked our way through it. If you believe in the foundations of this particular form of government, as I do, you have to believe that we'll figure it out, that we'll work our way through."


Oak Ridge licenses its quantum encryption method – FCW.com


A Qubitekk prototype will incorporate ORNL's single-photon source approach, thereby bringing the device closer to generating pairs of quantum light particles in a controlled, deterministic manner that is useful for quantum encryption. (Photo by Qubitekk)

Oak Ridge National Laboratory has licensed a method its researchers developed to keep encrypted machine-to-machine data from being intercepted.

San Diego-based quantum technology company Qubitekk has signed a non-exclusive license for the lab's method of "down-conversion" of photons, which produces random, unpredictable pairs of the particles to confound the interception of data, the lab said in a July 25 statement.

"Current encryption techniques rely on complex mathematical algorithms to code information that is decipherable only to the recipient who knows the encryption key," according to the statement. "Scientists, including a team at the Department of Energys ORNL, are leveraging the quantum properties of photons to enable novel cryptographic technologies that can better protect critical network infrastructures."

According to lab officials, the technique harnesses quantum physics to expose, in real-time, the presence of bad actors who might be trying to intercept secret keys to encryption algorithms used by the energy sector.
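The toy simulation below illustrates the underlying principle rather than ORNL's actual design or Qubitekk's hardware: in an intercept-and-resend attack on a BB84-style key exchange, an eavesdropper who guesses the measurement basis wrong disturbs the transmitted bits, so the error rate the two endpoints see when they compare a sample of their key jumps from roughly zero to the textbook 25 percent. It is a simplified statistical model only, with no quantum hardware involved.

```python
#!/usr/bin/env python3
"""Toy BB84-style model of eavesdropper detection (statistical sketch only)."""
import random

N = 20000  # number of transmitted qubits


def sifted_error_rate(eavesdrop: bool) -> float:
    errors = kept = 0
    for _ in range(N):
        bit = random.randint(0, 1)
        alice_basis = random.choice("+x")
        forwarded = bit
        # An eavesdropper measuring in the wrong basis randomises the bit
        # she resends, which is what later shows up as errors.
        if eavesdrop and random.choice("+x") != alice_basis:
            forwarded = random.randint(0, 1)
        bob_basis = random.choice("+x")
        if bob_basis != alice_basis:
            continue  # mismatched bases are discarded during sifting
        kept += 1
        if forwarded != bit:
            errors += 1
    return errors / kept


print(f"error rate, quiet channel:       {sifted_error_rate(False):.3f}")  # ~0.000
print(f"error rate, intercepted channel: {sifted_error_rate(True):.3f}")   # ~0.250
```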

Qubitekk President and CTO Duncan Earl said in the ORNL statement that his company plans to enhance its existing single-photon quantum information prototype by integrating the lab's design. Earl is a former ORNL researcher who worked with the lab's Cyber Warfare group and Quantum Information Sciences team.

The company's work could lead to a tenfold increase in quantum encryption rates and the ability to maintain high data transmission speeds over longer distances, he added.

Earl said the firm plans to conduct field trials with its customers, which include California utility companies.

About the Author

Mark Rockwell is a staff writer at FCW.

Before joining FCW, Rockwell was Washington correspondent for Government Security News, where he covered all aspects of homeland security from IT to detection dogs and border security. Over the last 25 years in Washington as a reporter, editor and correspondent, he has covered an increasingly wide array of high-tech issues for publications like Communications Week, Internet Week, Fiber Optics News, tele.com magazine and Wireless Week.

Rockwell received a Jesse H. Neal Award for his work covering telecommunications issues, and is a graduate of James Madison University.

Contact him at mrockwell@fcw.com or follow him on Twitter at @MRockwell4.


iStorage diskAshur2 1TB PIN-protected encrypted external hard drive [Review] – BetaNews

It's hard -- for me at least -- to get too excited about hard drives. They get bigger, they get faster, and that's about it. But the iStorage diskAshur2 is a little different. This is a 1TB USB 3.1 external hard drive with a twist.

It offers hardware-level AES-XTS 256-bit encryption -- so no software is needed -- secured with PIN authentication. As you can see from the photo, there's a PIN pad built into the drive for easy locking and unlocking, and it's compatible with Windows, macOS and Linux ("it will work on any device with a USB port!"). We've already looked at the diskAshur Pro 2, but this diskAshur2 drive is nearly 20 percent cheaper.

The primary difference between the Pro drive and this one is the certification of the encryption that's used. While the diskAshur Pro 2 is "designed to be certified to" FIPS 140-2 Level 3, NCSC CPA, Common Criteria and NLNCSA, in the case of the diskAshur2 it's the lesser, older FIPS PUB 197 validation that's in place. In both instances, however, there's AES-XTS 256-bit hardware encryption protecting data, which should be more than enough for most circumstances.

FIPS 140-2 Level 3 means that the diskAshur Pro 2's circuit board has a tamper-proof design, but there are still physical protection measures in place with the diskAshur2 for added peace of mind. The protection comes from the built in EDGE (Enhanced Dual Generating Encryption) Technology which protects from "external tamper, bypass laser attacks and fault injections and incorporates active-shield violation technology." There's also security against unauthorized firmware updates, and the onboard processor "reacts to all forms of automated hacking attempts by entering the deadlock frozen state where the device can only restart through a 'Power On' reset procedure."

In short, it's secure. But what's it like to use?

In a word, great. But you're probably looking for a little more detail than that...

The iStorage diskAshur2 is designed with travelling in mind. It's pretty light at 216g, measures a pocketable 124 x 84 x 19 mm and comes with a hand carry case (the 3TB, 4TB and 5TB models are slightly heavier and larger at 325g and 124 x 84 x 27mm). There's a (short) built-in USB 3.1 cable so you don't have to remember to carry one around with you, and the drive is available in a choice of four colors -- Fiery Red, Phantom Black, Racing Green and Ocean Blue. It's IP56 rated for water and dust resistance.

What's great about the drive is the incredible ease of use. Encryption usually means having to fiddle around with software, but that's not the case here; everything is built into the drive. The drive is, by default, encrypted. Plug it in, and it remains inaccessible -- and invisible to the computer -- until you enter the necessary PIN and hit the unlock button. From this point, you can manually lock the drive at any time. You can also unplug the drive and it will be automatically locked, or auto-locking will kick in after a predetermined period of inactivity. The lack of software means that it's easy to take the drive from one computer to another, regardless of the operating system it is using.

This video from iStorage gives a good introduction to the device range:

Unlocking the drive is incredibly fast -- much faster than if computer-based software was involved. In terms of performance, this is a 5,400 RPM drive offering read speeds of up to 148 MBps and write speeds of up to 140 MBps -- far from earth-shattering, but this is a drive that focuses on security, not performance.

As with the diskAshur Pro 2, brute force protection means that the drive will delete its encryption key (rendering data completely inaccessible) after fifteen consecutive incorrect PINs are entered. You can create a PIN of up to 15 digits, so it should be fairly easy to create a non-guessable PIN. For those who need it, there is also the option of using a Self-Destruct PIN to wipe out the encryption key so data cannot be accessed under any circumstances. For peace of mind, there is a two-year warranty covering the device.
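The drive enforces that logic in its own hardware, but the behaviour is straightforward to model. The sketch below is a conceptual stand-in only (the class, PIN values and key handling are invented for the example): after fifteen consecutive wrong PINs, or a single entry of the self-destruct PIN, the data-encryption key is gone and the ciphertext on the platters is unrecoverable.

```python
#!/usr/bin/env python3
"""Conceptual model of the diskAshur2's brute-force and self-destruct logic.
The real device does this in hardware; nothing here touches an actual drive."""
import secrets

MAX_ATTEMPTS = 15


class PinLockedDrive:
    def __init__(self, user_pin: str, self_destruct_pin: str):
        self.user_pin = user_pin
        self.self_destruct_pin = self_destruct_pin
        self.dek = secrets.token_bytes(32)  # stand-in for the AES-XTS 256-bit key
        self.failed_attempts = 0

    def _wipe_key(self) -> None:
        self.dek = None  # without the key, the encrypted data is unrecoverable

    def unlock(self, pin: str) -> bool:
        if self.dek is None:
            return False                 # key already destroyed
        if pin == self.self_destruct_pin:
            self._wipe_key()             # deliberate, immediate wipe
            return False
        if pin == self.user_pin:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= MAX_ATTEMPTS:
            self._wipe_key()             # brute-force protection kicks in
        return False


drive = PinLockedDrive(user_pin="73945162", self_destruct_pin="00000000")
for _ in range(MAX_ATTEMPTS):
    drive.unlock("12345678")             # fifteen wrong guesses
print("key survives brute force:", drive.dek is not None)  # False
```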

For the vast majority of people, AES-XTS 256-bit hardware encryption and conforming to FIPS PUB 197 should be more than enough. If the relatively high price of the diskAshur Pro 2 was off-putting to you, the diskAshur2 gives you a way to get very much the same product at a pleasingly lower price.

You can find out more and buy a drive direct from iStorage. The 1TB model is priced at £219 ($262).
