
‘I recall it with deep despair’: North Sea marks five years since Norway helicopter crash – News for the Oil and Gas Sector – Energy Voice

The North Sea today marks five years since a tragic crash in Norway killed 13 people and changed the future of offshore helicopter transportation.

Workers dubbed the Super Puma helicopter model, once dominant in the sector, the "flying coffin" after the crash off the island of Turoy in 2016.

It was the last straw for the sector, which had endured a spate of Super Puma crashes since 2009 that claimed 33 lives. The model has not been in service in the North Sea for the last five years.

Jake Molloy, regional organiser of the RMT union, said: "For a lot of the guys it would instil terror to get into that particular model of aircraft and I think a lot of guys breathe easy at the thought that we're not using them."

"I recall it, like I've recalled so many down the years, with deep despair that life can so tragically be lost. It really is an event that I wouldn't want anyone to experience. I've lost good friends and colleagues through the years in events like this.

"It sits in the forefront of your mind all the time. Even sitting in the garden, as I am now, seeing them flying overhead, those thoughts come back to you. They never go away."

On April 29, 2016, a CHC-operated Super Puma went down while carrying oil workers from the Gullfaks B platform to Bergen Airport.

Iain Stuart, 41, from Laurencekirk, was among those killed in the crash off Turoy in Norway, which occurred after the main rotor detached from the helicopter.

In the last seconds of its journey the chopper fell 2,000 feet, with witnesses describing an explosion in the sky.

The rotor broke off due to a fatigue fracture in a second-stage planet gear in the main rotor gearbox.

Investigators later said it was probable that the failure was caused by tiny pieces of debris wearing away at the component. The system installed for detecting the particles was inadequate, they added.

Manufacturer Airbus said it has always expressed deep regret for the accident off Turoy and in recent times has reached settlements with families of the victims, while fully appreciating that such arrangements cannot possibly atone for the loss of their loved ones.

A spokesman said: "All of us were shocked and saddened by this event and we continue to extend our sincere and profound sympathies to the families of the bereaved."

Despite the Super Puma crashes, though, many pilots still back the aircraft, which Airbus continues to sell widely in industries such as law enforcement and search and rescue.

Mr Molloy, of RMT, said that, for whatever reason, the North Sea appears to have been its Achilles' heel.

Along with Norwegian colleagues, trade unions in the UK plan to maintain a position that the Super Puma cannot fly again in the industry.

"I think you'd find a considerable pushback from the offshore workforce for that ever to be suggested in any case," he said.

"Certainly this generation won't be climbing into a Super Puma anytime soon."

The victims were Iain Stuart, 41, Behnam Ahmadi, 54, Arild Fossedal, 43, Ole Magnar Kvamme, 60, Odd Geir Tury, 54, Otto Mikal Vasstveit, 54, Kjetil Wathne, 51, Michele Vimercati, 44, Tommas Helland, 50, Espen Samuelsen, 35, Lyder Martin Telle, 57, Olav Bastiansen, 57, and Silje Ye Rim Veivg Krogsther, 32.

See the original post:
'I recall it with deep despair': North Sea marks five years since Norway helicopter crash - News for the Oil and Gas Sector - Energy Voice


Single-Use Plastics Found at the Deepest Points of the Ocean – Technology Networks

Plastic is "missing" in the oceanSingle-use plastics are, as the name suggests, manufactured for one use before being disposed of. Examples include the plastic cutlery that accompanies your takeout meal, straws, wrappers or general food packaging. Of the 300 million tons of plastic we produce yearly, 50% is attributed to single-use plastics.

An estimated 117 to 320 million tons of plastics are currently present in the ocean. However, the amount of plastic debris that has been quantified on the ocean surface is less than one percent of this total figure. Where is the rest of the plastic?

Scientists acknowledge that a large amount of plastic in the ocean is effectively missing. It "goes off the radar", moving from the ocean surface and shallow waters to the deeper ocean. Some of this plastic will likely be degraded by one of three pathways: mechanical degradation, photodegradation or biological degradation. However, research suggests that it can take years, perhaps even centuries, for different plastics to undergo complete degradation by such methods, if they can occur at all at the deepest points of the ocean.

"Many scientists believe floating plastics eventually sink into the deeper ocean, yet we havent reached a consensus regarding the amount of plastic debris that has accumulated on the deep seafloor and how these debris were transported there," explains Ryota Nakajima, biological oceanographer at the Japan Agency for Marine Science and Technology (JAMSTEC).

Nakajima is one of the scientists behind a new body of work published in Marine Pollution Bulletin that presents the first study of microplastics on the abyssal seafloor beneath the Kuroshio Extension (KE) and the KE recirculation gyre.

"Why Japanese deep-sea? Because most of the plastic is coming from the Asian continent. A previous study (Jambeck et al. 2015) estimated the top 20 contributors of mismanaged plastic waste available to enter the ocean in 2010.2 Nations on the Asian continent represented 12 of the top 20 countries, with China way out ahead of the pack [] Japan is located in the area where a large amount of plastic waste is transported from these massive waste producers via ocean currents such as Kuroshio, so the seas around Japan could be a hotspot of plastic debris," Nakajima says.

Nakajima and colleagues conducted video observations and physical sampling of plastic debris from multiple deep-sea floor sites beneath the KE and its recirculation gyre, at a depth of approximately 5,700-5,800 m, using a human-occupied submersible, the Shinkai 6500.

The researchers found that the majority of the debris was single-use plastics, such as plastic bags and food packaging. The mean number of items per km² was approximately 4,561, the highest on record for an abyssal plain.

The specific items collected by the team were varied, including clothes, a toothpaste tube and steak packaging dated September 1984, meaning it was 35 years old at the time of collection. The packaging can be seen as visually intact in the image below. "This is an interesting example of the persistence of plastic debris in the marine environment. UV radiation and thermal oxidation are the primary factors to degrade plastics, but these factors are completely missing in the deep-sea environment. Plastic debris on the deep seafloor will most likely persist for at least a century," Nakajima explains.

Chicken steak packaging discovered by the research team. Credit: Nakajima.

The overall findings were, as Nakajima and team had hypothesized, that the deep-sea floor beneath the KE and its recirculation gyre acts as a significant reservoir of plastic debris.

The researchers believe that the journey to find missing plastics in the ocean has barely begun. Next summer, the team will target the deep-sea floor beneath the other recirculation gyre of the Kuroshio Current, named the Kuroshio recirculation gyre, which is located to the south of the Japanese archipelago.

Ryota Nakajima was speaking with Molly Campbell, Science Writer for Technology Networks.

References:

1. Nakajima R, Tsuchiya M, Yabuki A, et al. Massive occurrence of benthic plastic debris at the abyssal seafloor beneath the Kuroshio Extension, the North West Pacific. Marine Pollution Bulletin. 2021:112188. doi:10.1016/j.marpolbul.2021.112188.

2. Jambeck JR, Geyer R, Wilcox C, et al. Plastic waste inputs from land into the ocean. Science. 2015;347(6223):768. doi:10.1126/science.1260352.

Visit link:
Single-Use Plastics Found at the Deepest Points of the Ocean - Technology Networks


Global Ai In Healthcare Market Top 10 Key players in 2021 |DeepMind Technologies Limited, IBM Corporation, Nuance Communications Inc, Microsoft,…

The most recent Ai In Healthcare Market report employs a multidisciplinary approach to comprehend the evolution of this vertical over the forecast period 2021-2027. Furthermore, the Ai In Healthcare report is presented in an easy-to-understand format that allows enterprises to understand existing and future market opportunities in order to develop better business plans.

This report provides an all-inclusive overview of the Global Ai In Healthcare Market. The report's estimates are the result of extensive secondary research, primary interviews, and in-house expert reviews. These market estimates have been influenced by research into the effects of various social factors.

Major industry Players:

DeepMind Technologies Limited, IBM Corporation, Nuance Communications Inc, Microsoft, NVIDIA Corporation

Get Sample PDF (including COVID-19 Impact Analysis) @ Click here.

Along with the Ai In Healthcare market overview, which encompasses market dynamics, the chapter includes a Porter's Five Forces analysis, which explains the five forces shaping the worldwide Ai In Healthcare business: buyers' bargaining power, suppliers' bargaining power, the threat of new entrants, the threat of substitutes, and competitive rivalry. It also describes the various participants in the market ecosystem, such as system integrators, intermediaries, and end-users, and focuses on the worldwide Ai In Healthcare market's business environment.

Ai In Healthcare industry -By Application:

Ai In Healthcare industry By Product:

The global Ai In Healthcare market has been segmented geographically, and regions such as North America, Latin America, the Middle East, Asia-Pacific, Africa, and Europe have been examined to give a deeper understanding of different market perspectives. Key industry players have been profiled in order to learn more about current global business conditions and the primary growth objectives for the market. Furthermore, the report provides a concise elaboration of the standard operating procedures and methodologies that are driving the global Ai In Healthcare market's growth.

Request a discount (20%) @ Click here.

The current COVID-19 pandemic is having a major effect on the global Ai In Healthcare industry. This study offers the most up-to-date information on the Ai In Healthcare market, taking into account the numerous consequences of COVID-19-related business disruptions and halts.

Before you buy, make an inquiry @ Click here.

NOTE: Please let us know your preferences, and we will provide you with a fully customized Segmented Research Report.

About Us:

Infinity Business Insights is a market research company that offers market and business research intelligence all around the world. We specialize in offering services across various industry verticals to help clients recognize their highest-value opportunities, address their most critical challenges, and transform their businesses. We meet the particular and niche demands of the industry while balancing quality against delivery time, and we track crucial developments at both the domestic and global levels. The products and services provided by Infinity Business Insights cover vital technological, scientific, and economic developments in industrial, pharmaceutical, and high technology companies.

Contact Us:

Amit J

Sales Coordinator

+1-518-300-3575

See more here:
Global Ai In Healthcare Market Top 10 Key players in 2021 |DeepMind Technologies Limited, IBM Corporation, Nuance Communications Inc, Microsoft,...


Has pharma missed the boat? – PharmaTimes

A year ago, amid deep concern over COVID-19 and frantic work to develop vaccines, some of pharma's heaviest hitters saw the possibility of reputational enhancement.

Novartis AG chief executive (CEO) Vas Narasimhan dubbed it a "remarkable, perhaps once-in-a-generation opportunity" back in April during the company's first quarter earnings call. Eli Lilly CEO David Ricks, meanwhile, closely echoed this sentiment, stating the sector had a "once in a generation opportunity" to reset its reputation.

Indeed, the world watched news of vaccine development with bated breath. In the public mind, pharma companies came to be associated with hard-working scientists seeking to cure the world.

Yet it seems like that moment may already be passing, and the opportunity for rehabilitation in the public mind could disappear unless pharma companies take stock.

Earlier this year I warned that the industry may have put all of its eggs into the COVID-19 basket. A dive into our latest analysis of how the pharmaceutical sector is viewed through the crucial lens of its Environmental, Social and Corporate Governance (ESG) actions bears this out, suggesting that the sector may already have missed the boat.

Fading glory

Our insights last year pointed to an improvement in stakeholder perceptions for the sector in 2020, as many warmed to pharma companies thanks to the halo effect of the vaccines. And while recent polling of a sample of the general public by RepTrak indicates that this trend has continued, our latest intelligence finds that this improvement has been short lived.

The headline figure from our deep dive into millions of alternative data points, including media, regulator, investor, government, public and NGO sources from December 1, 2020, to February 28 this year, shows that average ESG perceptions of pharma have since nosedived.

Now, out of a possible range of -100% to +100%, the sector's average ESG rating is -20%, compared to +10% in the previous quarter.

What's going on? Last June, The Economist compared COVID-19 vaccine development to the way mass production of penicillin during the Second World War revitalised the industry's reputation.

Positive headlines have since dwindled. Media attention has shifted from stories of firms collaborating for the benefit of humanity onto more complex territory. High-profile issues like vaccine nationalism, anti-vaxxers and pharma companies falling foul of international politics amid the debate over possible side effects have come to the forefront.

The focus on vaccine manufacturers AstraZeneca and Pfizer, meanwhile, has led to increased scrutiny of the companies' histories of alleged corruption and ethical breaches.

The pharmaceutical sector is, of course, far bigger than its work on COVID-19. With the initial shock, if not the consequences, of the pandemic having passed, we may expect this trend to continue. Scandals that may not have received attention last year will once again garner column inches and retweets, even as the pandemic remains a global issue.

Dragging the industry down

What are the crucial ESG factors dragging the industry down? Business ethics is the single most concerning issue for the industry right now, with a score of -13%. Major lawsuits against Roche and Biogen have opened a Pandora's box of bribery and corruption allegations, garnering major pick-up on social media in the process.

Biogen, which had to pay $22 million to resolve allegations that it illegally paid insurance co-payments for thousands of patients in order to collect Medicare revenue, languishes at the bottom of our list of 20 companies, with a -67% rating. This represents a drop of 80 points compared to the previous quarter.

Roche, the best-perceived pharma firm in the previous quarter with +64%, is now seventh with a rating of -3%, having had to pay out $12.5 million for false claims through Humana's Medicare Advantage programme.

However, unless more scandal emerges, the firm's longer-term progress on key ESG issues such as supply chain sustainability means it is well placed to bounce back. Other firms would do well to take note.

A second issue afflicting the industrys ESG rating is affordability and pricing, with 12 firms scoring negatively on this issue.

While AstraZeneca offered to sell its COVID-19 vaccine at cost price until the pandemic is over, Moderna's and Pfizer's recent announcements that their versions are likely to contribute at least $15 billion and $18.4 billion respectively to sales this year will have gone down far better with shareholders than the wider public.

This, no doubt, has contributed to Pfizer's fall of 14 places to 19th position in our ESG rankings of 20 firms, with a score of -65% compared to +14% in the previous quarter. While AstraZeneca's ESG score has also dropped, it has only moved down one position, to 13th. There is a good chance that its cost-price decision will be remembered positively in years to come.

Negative perceptions of affordability and pricing extend beyond issues relating to COVID-19, of course. Price hikes by AbbVie, Bristol Myers Squibb, GSK, Pfizer, Sanofi and Teva in late January have contributed towards the narrative shift to more negative territory.

Hot on the heels of this spike in negative news, meanwhile, came the refusal of nine companies (Amgen, AZ, Eli Lilly, Merck, Novartis and Novo Nordisk among them) to sell therapies at the price legally required by section 340B of the US Public Health Service Act.

Not all bad

It is easy to focus purely on the negatives amid pharma's broader shift in ESG ratings, but the traffic is not all one-way.

The sector has received the most positive ESG-related coverage when it comes to access to medicines, most likely due to the publication of the Access to Medicines Index. This annual report has shone a light on a range of successful initiatives and companies that have developed compelling access plans for low- and middle-income countries.

Praise for Takeda's "Access First" approach, Patient Assistance Programs and consideration of end-to-end patient access to treatment has played a key role in its ESG ascent from +11% the previous quarter to +56%, giving it top spot on our index.

Also moving up the ranks is Bayer, which came in second place with +40%, an increase from +8% the previous quarter. The firm's ascent is linked with drug safety, an area that also garnered overall positive ESG perceptions, with Bayer playing an important role in a new alliance with CureVac for the development of CVnCoV, a COVID-19 vaccine.

A new challenge

The vast challenges of 2020 offered pharmaceutical companies the chance to re-forge their images in the court of public perception.

For instance, Pfizer paid for National Geographic to follow the development of a vaccine, while Johnson & Johnson launched a social media series on the hunt for a vaccine, drawing a vast, global audience.

Our ESG findings, however, indicate that pharmaceutical firms expecting a long-term image polish born of vaccine development were sorely misguided. Perceptions of an industry coming together to help the common cause have faded as the reality and complexities of the vaccine rollout kick in.

The narrative is likely to shift towards the global disparities in vaccine acquisition, and attention is shifting once again to the norms of pharma coverage that preceded COVID-19.

Despite all this, the opportunity described by the likes of Vas Narasimhan has not completely vanished. Utilising the opportunity, however, requires far more than some well-thought-out content and a few savvy marketing gestures.

It demands that firms fully react to growing ESG expectations, building ESG practice into their wider models whether or not they are helping vaccinate the world against COVID-19. It is only by making these changes, and taking the public along with them as they do so, that any fundamental, long-lasting shift in perception is achievable.

Siera Torontow is the Managing Director of Healthcare and Consumer Practice at alva

Original post:
Has pharma missed the boat? - PharmaTimes


BlanQuil weighted blankets: Products and brand review – Medical News Today

We include products we think are useful for our readers. If you buy through links on this page, we may earn a small commission. Here's our process.

BlanQuil sells a range of mattresses, bedding, and sleep accessories. Its products include a variety of weighted blankets, including cooling, travel, and kid-friendly options.

Some people report that the extra pressure of a weighted blanket provides benefits, such as reduced anxiety and improved sleep.

In this article, we discuss the BlanQuil brand in more detail, including its reputation and the weighted blankets it sells. We also look at the science behind weighted blankets and discuss whether they could offer health benefits.

BlanQuil is a company selling sleep accessories, such as weighted blankets, mattresses, and bedding.

BlanQuil manufactures its products in China, and they are available from the BlanQuil website or in stores across the United States. The company also ships to Canada.

Positive online reviews for BlanQuil weighted blankets frequently mention:

Negative online reviews note:

BlanQuil claims that its weighted blankets apply gentle pressure evenly across the body to help promote relaxation and reduce stress. For people who use them for sleeping, they should allow a deeper, more restful sleep.

BlanQuil states that its weighted blankets use high-density, eco-friendly glass microbeads to provide the weight. They also have ties to connect the blanket to a duvet to prevent slippage.

BlanQuil advises people to choose a blanket weight that is 8-15% of their body weight to feel as though the blanket hugs them. However, a person can select whichever blanket they think will feel most comfortable for them.
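As a quick worked example of that guideline, the snippet below converts a body weight into the suggested 8-15% range. This is a simple calculation of our own for illustration, not a BlanQuil sizing tool.

```python
def suggested_blanket_weight(body_weight_lb: float) -> tuple[float, float]:
    """Return the low and high ends of the 8-15% body-weight guideline."""
    return body_weight_lb * 0.08, body_weight_lb * 0.15

# Example: a 160 lb adult falls in roughly the 13-24 lb blanket range.
low, high = suggested_blanket_weight(160)
print(f"Suggested blanket weight: {low:.0f}-{high:.0f} lb")
```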

Learn about how heavy a weighted blanket should be here.

BlanQuil weighted blanket options include:

Similar products that a person can consider include:

This blanket may suit people who want to try a weighted blanket for the first time and would prefer not to spend too much money doing so.

Features include:

People may want to try the Huggaroo Pouch as an alternative option to a weighted blanket for children.

The Huggaroo Pouch is a fitted, weighted sheet that stretches over the bed to provide even pressure without the extra weight or heat of a weighted blanket.

Learn about more of the best weighted blankets here.

Weighted blankets may help reduce anxiety and increase relaxation.

A 2015 pilot study looked at the effects of a 30-lb weighted blanket on 30 adults receiving inpatient care for mental health conditions. Measurements from the State-Trait Anxiety Inventory-10 (STAI-10) and a self-rating system showed that 60% of the participants experienced a significant reduction in anxiety when using a weighted blanket.

Learn more about weighted blankets and anxiety here.

A 2015 study involving 31 adults with chronic insomnia suggests that weighted blankets may help with this condition.

The participants maintained their usual sleep environment for a week. They then used a Somna AB weighted blanket for 2 weeks before returning to their usual sleep environment for the final week.

Objective and subjective measures showed improved sleep quality with the use of the weighted blanket. Objective improvements were higher among those who reported a positive experience of using the weighted blanket and also used sleep medication.

However, it is important to note that Somna AB supported the study with a grant and that there was no control group or placebo blanket.

Learn more about the potential health benefits and risks of weighted blankets here.

BlanQuil offers a range of weighted blankets in different weights and sizes, including travel and child-friendly options.

Some research suggests that weighted blankets may help with anxiety and insomnia, although researchers still need to conduct further studies to confirm these benefits.

Weighted blankets may not be suitable for everyone. Anyone with existing health conditions, particularly circulatory, breathing, or temperature-regulating issues, should check with a doctor before trying one of these products.

A doctor can also advise whether older adults and young children are safe to use a weighted blanket.

Excerpt from:
BlanQuil weighted blankets: Products and brand review - Medical News Today


A New Book Explores the Connections Between Music, Physics, and Neuroscience – Columbia University

Q. Can a music lover appreciate the book without having a deep knowledge of math, physics, or neuroscience?

A. That's the goal. Once you mention math, art lovers and musicians glaze over: not that they aren't interested, but they are confident that they won't be able to follow the discussion. Untrue! This book goes into the nitty-gritty of how math and biology underlie music, yet you don't need math skills beyond 5th-grade multiplication and division to understand the content.

In my class related to the book at Columbia, students range from undergrads to medical students to professors in other fields. Each student creates a project based on themes in the book, ranging from building new musical instruments, to creating new sounds, to writing deep learning algorithms that differentiate phrasing by famous piano virtuosos.

Q. What came first for you, neuroscience or music?

A. In junior high school, I fell in love with plants, in part from Euell Gibbons' books, and spent my time in forests with field guides. In college, I studied plant breeding, and thought I would be a contemporary Norman Borlaug and develop better agriculture for the world. After moving to New York City, I made a living for a year as a gigging musician. I applied to grad school in biology at Columbia, but there was only one plant laboratory, Alberto Mancinelli's. We each had to take a neuroscience course run by Martin Chalfie and Darcy Kelley. I had not known the field existed until then.

Q. How does your work as a professor and lab directorat Columbia intersect with your life as a musician and composer?

A. These fields are starting to intersect. A talented grad student in my lab, Adrien Stanley, found that a sound associated with another sensory input elicits a specific and enormous change in a specific synaptic connection deep in the brain. His finding provides an entry to discovering how language and music are learned and coupled to meaning. This is important for normal learning and diseases of auditory processing, particularly autism. We are conducting this research with computational scientists, geneticists, and with our own skills as neurophysiologists. I don't think I would know how to start asking these questions if I hadn't taught my students about the auditory pathway.

Q. What music have you listened to during the pandemic?

A. There's been wonderful music made during this period. David First is producing outstanding pandemic recordings, including drone music on The Consummation of Right and Wrong, which may not seem appealing until you listen, and great pop songs related to Black Lives Matter with New Party Systems. The Iranian-L.A. musician Sussan Deyhim is doing startling new work. There's the new record Tyabala, from L'École Fula Flute, children in Conakry, Guinea, coached by New York musician Sylvain Leroux. This has been a good time to listen to gospel choirs, which I've been discovering and rediscovering: Trey McLaughlin and the Sounds of Zamar, Kirk Franklin, Bob Telson's Gospel at Colonus.

Q. Any book recommendations?

A. I may have been intellectually transformed by explorer-entomologist Mark Moffett's new book, The Human Swarm. If he is right, some of our most despicable behavior is biologically built-in, just as it is for ants. He writes that pettiness, status-seeking, backstabbing, and nationalism are innate, and that the better we understand this, the better we can deal with issues that will always show up in human society.

Q. What are you teaching now? How have you been able to help your students cope with online learning?

A. For lab research, we had to coordinate schedules so that only one person is in a room at a time. During the several weeks we couldn't do any experiments, all students worked on review papers about the history of their research. This forces them to learn their roots, produces useful articles for the rest of the field, and, for grad students, doubles as the first chapters of their dissertations.

Q. You're hosting a dinner party. Which three academics or scholars, dead or alive, would you invite and why?

A. If the dinner party ought to be in a language I can nearly speak, I'll invite Jonathan Swift, a bona fide academic as Dean of St. Patrick's in Dublin, and his troubled spiritual descendent, George Orwell, likewise a dyed-in-the-wool academic who taught college.

Orwell was well aware that his own bleary-eyed One World utopian ideals ran contrary to Swift's dour Anglican view of humanity's venality, and yet Swift was his single greatest influence. Due to the unfair one-way direction of time, Swift hasn't had a chance to hear Orwell out. For a priest, Swift seems to have been talented at partying, and the music will be provided by his close friend, Turlough O'Carolan, the blind harpist and sort of national composer of Ireland. If available, he'll bring the English folk musician Eliza Carthy to sing Swift's lyrics. I'm afraid that the menu will be Guinness and chips in curry sauce wrapped in newspaper.

Check out Books to learn more about publications by Columbia professors.

The rest is here:
A New Book Explores the Connections Between Music, Physics, and Neuroscience - Columbia University


The next big thing in cloud computing? Shh It’s confidential – Help Net Security

The business-driven explosion of demand for cloud-based services has made the need to provide highly secure cloud computing more urgent. Many businesses that work with sensitive data view the transition to the cloud with trepidation, which is not entirely without good reason.

For some time, the public cloud has actually been able to offer more protection than traditional on-site environments. Dedicated expert teams ensure that cloud servers, for example, maintain an optimal security posture against external threats.

But that level of security comes at a price. Those same extended teams increase insider exposure to private data, which leads to a higher risk of an insider data breach and can complicate compliance efforts.

Recent developments in data security technology, in chips, software, and the cloud infrastructure, are changing that. New security capabilities transform the public cloud into a trusted, data-secure environment by effectively locking out data access by insiders and external attackers.

This eliminates the last security roadblock to full cloud migration for even the most sensitive data and applications. Leveraging this confidential cloud, organizations for the first time can now exclusively own their data, workloads, and applications, wherever they work.

Even some of the most security-conscious organizations in the world are now seeing the confidential cloud as the safest option for the storage, processing, and management of their data. The attraction to the confidential cloud is based on the promise of exclusive data control and hardware-grade minimization of data risk.

Over the last year, there's been a great deal of talk about confidential computing, including secure enclaves or TEEs (Trusted Execution Environments). These are now available in servers built on technologies such as AWS Nitro Enclaves, Intel SGX (Software Guard Extensions), and AMD SEV (Secure Encrypted Virtualization).

The confidential cloud employs these technologies to establish a secure and impenetrable cryptographic perimeter that seamlessly extends from a hardware root of trust to protect data in use, at rest, and in motion.

Unlike the traditional layered security approaches that place barriers between data and bad actors or standalone encryption for storage or communication, the confidential cloud delivers strong data protection that is inseparable from the data itself. This in turn eliminates the need for traditional perimeter security layers, while putting data owners in exclusive control wherever their data is stored, transmitted, or used.
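To make the idea of protection that travels with the data more concrete, here is a minimal Python sketch of an attestation-gated key-release flow. It is illustrative only: verify_attestation_document and fetch_wrapped_key are hypothetical stand-ins for whatever attestation service and key broker a real confidential-cloud deployment would use, and only the Fernet encryption call (from the cryptography package) is a real library API.

```python
# Conceptual sketch only: the data key is released to a workload only after its
# enclave produces a valid attestation document, so data stays encrypted
# everywhere except inside the verified trusted execution environment.
from cryptography.fernet import Fernet  # real library; the helpers below are hypothetical


def verify_attestation_document(attestation_doc: bytes) -> bool:
    """Hypothetical: check the enclave's signed measurements against an expected policy."""
    raise NotImplementedError("stands in for a TEE attestation service")


def fetch_wrapped_key(attestation_doc: bytes) -> bytes:
    """Hypothetical: a key broker releases the data key only to attested workloads."""
    raise NotImplementedError("stands in for a key-management / key-broker API")


def process_inside_enclave(ciphertext: bytes, attestation_doc: bytes) -> bytes:
    if not verify_attestation_document(attestation_doc):
        raise PermissionError("workload is not running in an approved enclave")
    key = fetch_wrapped_key(attestation_doc)     # never visible to cloud operators
    plaintext = Fernet(key).decrypt(ciphertext)  # decrypted only inside the enclave
    return plaintext
```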

The resulting confidential cloud is similar in concept to network micro-segmentation and resource virtualization. But instead of isolating and controlling only network communications, the confidential cloud extends data encryption and resource isolation across all of the fundamental elements of IT, compute, storage, and communications.

The confidential cloud brings together everything needed to confidentially run any workload in a trusted environment isolated from CloudOps insiders, malicious software, or would-be attackers.

This also means workloads remain secure even in the event a server is physically compromised. Even an attacker with root access to a server would be effectively prevented from seeing or gaining access to data and applications, affording a level of security traditional micro-segmentation can't provide today.

A strong argument can already be made that reputable major cloud providers deliver both the resources and focus needed to secure a vast majority of internal IT infrastructure. But data-open clouds bring the risk of greater data exposure to insiders, as well as the inability to lock down a trusted environment under the total control of the CISO.

Data exposure has manifested itself in some of the most publicized breaches to date. Capital One became the poster child for insider data exposure in the cloud when its data was breached by a former AWS employee.

Implementing a confidential cloud eliminates the potential for cloud insiders to have exposure to data, closing the data attack surface that is otherwise left exposed at the cloud provider. Data controls extend wherever data might otherwise be exposed, including in storage, over the network, and in multiple clouds.

OEM software and SaaS vendors are already building confidential clouds today to protect their applications. Redis recently announced a secure version of their high-performance software to run over multiple secure computing environments, credibly creating what may be the world's most secure commercial database.

Azure confidential computing has partnered with confidential cloud vendors to enable the secure formation and execution of any workload over existing infrastructure without any modification of the underlying application. Similarly transparent multi-cloud Kubernetes support isn't far behind.

Taking advantage of confidential computing previously required code modifications to run applications. This is because initial confidential computing technologies focused on protecting memory. Applications had to be modified to run selected sensitive code in protected memory segments. The need to rewrite and recompile applications was a heavy lift for most companies, and isn't even possible in the case of legacy or off-the-shelf packages.

A new lift-and-shift implementation path enables enterprises to create, test, and deploy sensitive data workloads within a protected confidential cloud without modifying or recompiling the application. Nearly all cloud providers, including Amazon, Azure, and Google, offer confidential cloud-enabling infrastructure today.

Confidential cloud software allows applications and even whole environments to work within a confidential cloud formation with no modification. Added layers of software abstraction and virtualization have the advantage of making the confidential cloud itself agnostic to the numerous proprietary enclave technologies and versions developed by Intel, AMD, Amazon, and ARM.

A new generation of security vendors has simplified the process to implement private test and demo environments for prospective customers of the public cloud. This speeds the process to both enclave private applications and generate full-blown confidential cloud infrastructure.

Data security is the last barrier to migrating applications to the cloud and consolidating IT resources. The resolution of cloud security flaws was a great step toward migrating all but the most sensitive applications and data. Eliminating data vulnerability opens a broad new opportunity for businesses to deploy a new and intrinsically secure hosted IT infrastructure built upon the confidential cloud.

Excerpt from:
The next big thing in cloud computing? Shh It's confidential - Help Net Security


Washington State Law Creates a Pathway to the Cloud – Government Technology

Almost 10 years after the state constructed a $255 million data center in Olympia, Wash., in 2011, newly signed legislation will allow agencies to switch to the cloud as early as July.

According to House Bill 1274, one of the reasons for the switch is that the state's current IT infrastructure has insufficient capacity to handle increased demand due to the pandemic. The bill's sponsor, state Rep. David Hackney, D-11, said the legislation would set up a framework for moving the state's current information technology infrastructure to the cloud.

"The data center currently uses legacy servers," Hackney said. "If they break down, have to be repaired, or need to be replaced, it can be very expensive."

Another problem these servers present is a lack of scalability, limiting opportunities to expand.

"If we wanted to expand right now, we'd need more servers," Hackney said. "By switching to the cloud, it would not only provide more opportunities to expand, but it would also be more secure and cost-efficient."

In fact, it could potentially save the state $150 million over five years, according to Hackney. The catch, however, is that such a move would require shutting down the data center and solely utilizing the cloud to achieve this.

"The concern in shutting down the data center is that it would lead to job loss," Hackney said.

However, the bill stipulates that it would create a new cloud transition task force to oversee the migration process and provide job training for legacy data center workers rather than outsourcing to an outside company. As for maintaining the cloud-based system, a third party will oversee and manage it.

"Washington Technology Solutions (WaTech) will likely pick the provider," Hackney said. "The idea is that WaTech will be in charge of this process."

"WaTech did identify this as a key recommendation in our cloud assessment report, which we are working to implement," a WaTech spokesperson said. "The state Legislature is still in session, and there may be additional changes before the session adjourns."

Derek Puckett, WaTech's legislative affairs director, expanded on the issue, saying the cloud assessment report has identified key recommendations such as implementing a cloud center of excellence and working with cloud data brokers.

However, Puckett said, identifying what this process will look like needs to happen first. State agencies will decide whether to switch to the cloud or continue storing data in the state data center.

"The switch is not going to happen overnight," Puckett said. "Not all agencies and systems are going to be cloud-ready."

However, he said, it gives state agencies the opportunity to do so if it's right for them.

The bill was signed by Gov. Jay Inslee earlier this month and will take effect July 25.

Katya Maruri is a staff writer for Government Technology. She has a bachelor's degree in journalism and a master's degree in global strategic communications from Florida International University, and more than five years of experience in the print and digital news industry.

More here:
Washington State Law Creates a Pathway to the Cloud - Government Technology


CISA experiments with cloud log aggregation to ID threats – FCW.com


The Cybersecurity and Infrastructure Security Agency has pilot programs underway with multiple departments and agencies to experiment with aggregating cloud logs to a warehouse which in turn will feed the agency's data analysis efforts.

CISA wants to "see if it's possible to send their logs to our aggregation point and make sense of them as a community together," Brian Gattoni, CISA's chief technology officer, said on Wednesday at an event hosted by FCW. "We've run pilots through the [Continuous Diagnostics and Mitigation] program team, through our capacity building team to look at end point visibility capabilities to see if that closes the visibility gap for us."
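As a rough illustration of what sending cloud logs to a shared aggregation point can look like in practice, the sketch below pulls recent AWS CloudTrail management events with boto3 and copies them to a central bucket. The bucket name, and the notion that CISA's pilots work exactly this way, are assumptions for illustration only; the boto3 calls themselves are standard.

```python
# Illustrative only: pull recent CloudTrail management events and forward them
# to a central aggregation bucket where they can be analysed alongside other logs.
import json
from datetime import datetime, timezone

import boto3

cloudtrail = boto3.client("cloudtrail")
s3 = boto3.client("s3")

AGGREGATION_BUCKET = "example-central-log-warehouse"  # hypothetical bucket name


def forward_recent_events(max_results: int = 50) -> None:
    events = cloudtrail.lookup_events(MaxResults=max_results)["Events"]
    payload = [json.loads(e["CloudTrailEvent"]) for e in events]
    key = f"cloudtrail/{datetime.now(timezone.utc).isoformat()}.json"
    s3.put_object(Bucket=AGGREGATION_BUCKET, Key=key, Body=json.dumps(payload).encode())


if __name__ == "__main__":
    forward_recent_events()
```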

So far what the agency has learned, Gattoni said, is that "technology is rarely the barrier. There's a lot of policy and legal and contractual and then just rote business process things to work out to make the best use of features in technology that are available to us."

Network visibility is a hot topic among government officials and lawmakers in the wake of the intrusions involving SolarWinds and Microsoft Exchange servers. CISA officials in public settings have made clear the government's current programs were not designed to monitor the vectors that Russian intelligence agents exploited during their espionage campaign.

At the same time, top intelligence chiefs such as Gen. Paul Nakasone, the head of the National Security Agency and U.S. Cyber Command, have warned foreign operatives are exploiting the fact the U.S. intelligence community is unable to freely surveil domestic infrastructure without a warrant.

Nakasone has also signaled he will not make any request for new authorities to monitor domestic networks, despite several lawmakers inviting him to do so.

This has prompted CISA to begin seeking out new capabilities that give the cybersecurity watchdog a clearer picture on individual end points in agency networks.

"For this reason, CISA is urgently moving our detective capabilities from that perimeter layer into agency networks to focus on these end points, the servers and workstations where we're seeing adversary activity today," Eric Goldstein, a top CISA official told House lawmakers at a March hearing.

Gattoni said during his panel discussion that some cloud providers already have the infrastructure built into their service that would aid CISA in gathering the security information it wants to aggregate, but he also said the federal government can't depend on that always being the case.

"There's a lot of slips between the cup and the lip when it comes to data access rights for third party services, so we at CISA have got to explore the use of our programs like [CDM] as way to establish visibility and also look at possibly building out our own capabilities to close any visibility gaps that may still persist," he said.

About the Author

Justin Katz covers cybersecurity for FCW. Previously he covered the Navy and Marine Corps for Inside Defense, focusing on weapons, vehicle acquisition and congressional oversight of the Pentagon. Prior to reporting for Inside Defense, Katz covered community news in the Baltimore and Washington D.C. areas. Connect with him on Twitter at @JustinSKatz.

See the article here:
CISA experiments with cloud log aggregation to ID threats - FCW.com


Contain yourselves: Scality object storage gets cloud-native Artesca cousin – Blocks and Files

Scality has popped the lid on ARTESCA, its new cloud-native object storage, co-designed with HPE, that is available alongside its existing RING object storage product.

Artesca configurations start with a single Linux server and then scale out, whereas the RING product requires a minimum of three servers. The Kubernetes-orchestrated ARTESCA container software runs on x86 on-premises servers with HPE having an exclusive licence to sell them for six months.

A statement from Randy Kerns, senior strategist and analyst at the Evaluator Group, said: Scality has figured out a way to include all the right attributes for cloud-native applications in ARTESCA: lightweight and fast object storage with enterprise-grade capabilities.

Scality chief product officer Paul Speciale told us: We believe object storage is emerging as primary storage for Kubernetes workloads, with no need for file and block access.

ARTESCA uses the S3 interface, and storage provisioning for stateful containers is done through its API. There is no POSIX access. ARTESCA has a global namespace that spans multiple clouds and can replicate its object data to S3-supporting targets and Scality's RING storage. Speciale said Scality is working on an S3-to-tape interface, with tape-held data included in the ARTESCA namespace.
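Because ARTESCA exposes a standard S3 interface, existing S3 tooling should be able to talk to it unchanged. The endpoint URL, bucket name and credentials below are placeholders rather than anything documented by Scality; the boto3 calls are standard S3 operations.

```python
# Minimal example of talking to any S3-compatible object store, such as an ARTESCA endpoint.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://artesca.example.internal",  # placeholder endpoint
    aws_access_key_id="EXAMPLE_KEY",                  # placeholder credentials
    aws_secret_access_key="EXAMPLE_SECRET",
)

s3.create_bucket(Bucket="sensor-data")
s3.put_object(Bucket="sensor-data", Key="edge/device-42/reading.json", Body=b'{"temp": 21.4}')
obj = s3.get_object(Bucket="sensor-data", Key="edge/device-42/reading.json")
print(obj["Body"].read())
```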

The software can integrate with Veeam, Splunk, Vertica and WekaIO via S3, provisioning data services to them. Existing RING or cloud data can be imported into ARTESCAs namespace.

The software features multi-tenancy, and its management GUI supports multiple ARTESCA instances, both on-premises and in multiple public clouds (AWS, Azure, GCP).

ARTESCA has built-in metadata search and workflows across private and public clouds.

Scality says it has high performance with ultra-low latency and tens of GB/s of throughput per server, although actual performance numbers are still being generated in the HPE lab and in actual deployments. We can expect them to be available in a couple of months.

The product has dual-layer erasure coding, local and distributed, to protect against drive and server failure. If a disk fails, the server has enough information to self-heal the data locally, with no time-sapping network IO needed. If a full server fails, the distributed codes can self-heal the data to the remaining servers in the cluster, which work in parallel to accelerate the recovery process. Lecat said this scheme makes high-capacity disk drive object storage reliable.
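The toy sketch below illustrates the general two-layer idea with simple XOR parity: a local parity chunk lets a server repair a lost disk on its own, while cross-server parity lets the cluster rebuild a lost server's chunks. It is a conceptual illustration only, not Scality's actual erasure-coding scheme, which uses more sophisticated codes.

```python
# Toy illustration of dual-layer parity: local parity repairs a failed disk without
# network traffic; distributed parity repairs a failed server across the cluster.
from functools import reduce


def xor_parity(chunks: list[bytes]) -> bytes:
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), chunks)


# Layer 1 (local): each server keeps a parity chunk over its own disks.
server_a = [b"AAAA", b"BBBB"]
server_b = [b"CCCC", b"DDDD"]
local_parity_a = xor_parity(server_a)

# Layer 2 (distributed): a third server keeps parity across servers, chunk by chunk.
cross_parity = [xor_parity([a, b]) for a, b in zip(server_a, server_b)]

# Disk failure: server A loses one chunk and self-heals it locally, with no network IO.
assert xor_parity([server_a[0], local_parity_a]) == server_a[1]

# Server failure: all of server A is lost; each chunk is rebuilt from the surviving
# server and the distributed parity, with the work spread across the cluster.
rebuilt_a = [xor_parity([b, p]) for b, p in zip(server_b, cross_parity)]
assert rebuilt_a == server_a
```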

ARTESCA has been developed to support many Kubernetes distributions. It should run with VMware's Tanzu system and with HPE Ezmeral, although Lecat adds that both need to be validated.

Target application areas include cloud-native IoT edge deployments, AI and machine learning and big data analytics. There is an initial supportive ecosystem including CTERA, Splunk, Veeam and Veeams Kasten business, Vertica and WekaIO.

There are six ARTESCA configurations available from HPE, suitable for core and edge data centre locations and including Apollo and ProLiant servers in all-flash and hybrid flash/disk versions.

Chris Powers, HPE VP and GM for collaborative platforms and big data, said in a statement: Combined with a new portfolio of six purposefully configured HPE systems, ARTESCA software empowers customers with an application-centric, developer-friendly, and cloud-first platform with which to store, access, and manage data for their cloud-native apps no matter where it lives in the data centre, at the edge, and in public cloud.

ARTESCA is available through HPE only for the first six months, with one-, three- and five-year subscriptions starting at $3,800 per year, which includes 24/7 enterprise support. HPE is also making ARTESCA available as a GreenLake service.

Scality is following MinIO in producing cloud-native object storage. Speciale said: "MinIO is very popular but doesn't have all the enterprise features needed." Being lightweight, ARTESCA fits in with edge deployment needs, and Speciale hopes that this will help propel it to enterprise popularity.

Speciale said that Scality's RING software has a 10-year roadmap and is not going away. He also said ARTESCA will support the coming COSI (Container Object Storage Interface); CSI is focused on file and block storage.

We can envisage all object storage providers converting their code to become cloud-native at some point in the future. ARTESCA, and MinIO, will surely have a heck of a lot more competition in the future.

More:
Contain yourselves: Scality object storage gets cloud-native Artesca cousin Blocks and Files - Blocks and Files
