
The Launching Ceremony for XnMatrix Wrapped Up, the Next Generation of Cloud Computing Eco-System Sets Sail – PRNewswire

HONG KONG, Sept. 2, 2020 /PRNewswire/ -- On Aug. 27, the Launching Ceremony for XnMatrix - the next-generation cloud computing platform and IPFS distributed storage eco-system - was held in Haikou under the guidance of the Hainan Provincial Industry and Information Technology Department, organized by Hainan Anmai Cloud Network Technology Co. Ltd. and co-organized by the Hainan Free Trade Port Blockchain Pilot Area.

Expert Analyses and Thoughtful Insights

The ceremony was opened by Mi Jia, COO of organizer Hainan Anmai Network Technology Co. Ltd., who gave the keynote speech on Computing Civilization and Society Motivation.

XnMatrix-Digital Civilization Strategy Unveiled

The highlight of the conference was the speech delivered by Wu Wenjie, Chairman of XnMatrix, which unveiled the next-generation cloud computing platform XnMatrix, its digital civilization strategy and its product launch.

"Computing is the energy, storage the soil. Algorithm is the laws of life to the digital society which, based on math, blockchain, smart contract, will become the rule of the society and create a digital civilized time that is more effective and orderly," Wenjie said. "The digital civilization strategy of XnMatrix is based on the blockchain system with privacy computing and automatic contract and verificationas its essentialsso as to build a world-leading decentralized cloud computing platform whose mission is to create the underlying infrastructure of the digital civilization world that can help people to embrace the data privacy and human-machine interaction secure challenge caused in the time of smart machine. XnMatrix, via the four underlying frameworks which include decentralized network, contract, infrastructure service and application service, has carried out four standardized products in use, including cloud pay, cloud GPU, cloud storage as well as IPFS cloud host. Besides, industry solutions like IPFS technology, computing power store, digital asset bank has also been put in use."

Three Labs: XnMatrix Driven by Technology

The three technology sources of XnMatrix - Glacier Lab, X Lab, and the Oxford Digital Asset Research Institute - were all presented at the ceremony. The achievements of these three labs are rapidly driving improvements to the XnMatrix platform.

Agreement on the Launching of the Value Standard for Next Generation of Cloud Computing

Ma Siyuan, Board Secretary of XnMatrix, delivered a speech distilling the key ideas of the conference and endorsing the four value propositions for the decentralized cloud computing industry. These four propositions are the consensus reached by the experts, industry leaders, technical talents and eco-system representatives present at the conference, who will examine the industry system on the basis of the four propositions from the perspectives of government, industry, academia, research and application. The conference presented the value of the next generation of cloud computing platforms from multiple dimensions, jointly witnessing a comprehensive definition of the next-generation cloud computing value standard and lending support to the digital eco-system so that it can develop steadily over the long term.

CONTACT: XnMatrix, xnmatrixs.com, [emailprotected]

SOURCE XnMatrix

Read more here:
The Launching Ceremony for XnMatrix Wrapped Up, the Next Generation of Cloud Computing Eco-System Sets Sail - PRNewswire


Media And Entertainment Storage TAM To Exceed $16B By 2025 – Forbes


Media and entertainment storage drivers and demand are discussed in the latest report from Coughlin Associates on Digital Storage in Media and Entertainment. The 251-page report offers in-depth analysis of the role of digital storage in all aspects of professional media and entertainment.

Projections out to 2025 for digital storage demand in content capture, post-production, content distribution and content archiving are provided in 62 tables and 129 figures.

The sixteenth annual report includes results from a 2020 survey of Coughlin Associates, Digital Production Buzz, HPA and SMPTE members on their digital storage needs in these target segments (comparing the results to similar 2009, 2010 and 2012-2019 surveys). These surveys were used to refine the current report analysis from previous editions and track industry trends. The pie chart below, from the report, shows the breakdown of media for long-term archives for the survey participants.

Breakdown of media used for M&E archiving from Survey

As a result of changes in the economics of storage devices, higher-performance solid-state storage will play a bigger role in the future. The cloud, and hybrid storage that includes the cloud, has assumed a new importance for many workflows during the Covid-19 pandemic. When the pandemic passes, use of cloud storage will continue to grow in the media and entertainment storage market.

Some highlights from the report:

The Covid-19 pandemic will have a big impact on content creation in 2020 and likely into 2021, except for broadcast acquisition. The figure below shows the impact on storage for M&E content acquisition in 2020 and 2021.

Growth in new digital storage for content acquisition

Spending for digital cinema in 2020 and 2021 will also be impacted by the pandemic

Creation, distribution and conversion of video content is a huge demand driver for storage device and systems manufacturers

As image resolution increases and as stereoscopic VR video becomes more common, storage requirements explode

The development of 4K TV and other high-resolution venues in the home and on mobile devices will drive the demand for digital content, especially as enabled by high-efficiency HEVC (H.265) and VVC (H.266) compression and by even more advanced compression standards that enable 8K and higher-resolution and higher-frame-rate workflows

HDD areal density increases have slowed, but flash memory capacity growth has increased and its price has declined. This, plus the growth in higher-resolution and higher-frame-rate content, is causing more applications to use flash memory

Activity to create capture and display devices for 8K X 4K content is occurring with planned implementation in common media systems in this decade

Active archiving will drive increased use of HDD storage for archiving applications, supplementing tape for long term archives

Optical storage developments for higher-capacity write-once Blu-ray optical cartridges will create higher-capacity discs, which may help slow the decline in optical disc archiving

Flash memory dominates cameras and is finding wider use in post-production and content distribution systems

From 2019 to 2025 entertainment and media digital storage TAM (without archiving and preservation) will increase by about 1.8 X from $7.3B to $13.3 B

The growth in storage capacities will result in a total media and entertainment storage revenue growth of about 1.6 X between 2019 and 2025 (from $10.3 B to $16.5B)

Overall annual storage capacity demand for non-archival applications is expected to increase over the period from 2019 to 2025 by 5.0X from 24.3 EB to 122.4 EB

Between 2019 and 2025 we expect about a 3.0 X increase in the required digital storage capacity used in the entertainment industry and about a 3.4 X increase in storage capacity shipped per year (from 70.8 EB to 241 EB)

In 2019 content distribution is estimated at 31% of total storage revenue, followed by archiving and preservation at 29%, post-production at 22% and content acquisition at 18%.

In 2025 the projected revenue distribution is 33% content distribution, 25% post production, 23% content acquisition and 19% archiving and preservation.

By 2025 we expect about 56% of archived content to be in near-line and object storage, up from 48% in 2019

In 2019 we estimate that 74.7% of the total storage media capacity shipped for all the digital entertainment content segments was in HDDs, with digital tape at 19.0%, optical discs at 2.7% and flash at 3.5%

By 2025 tape's share of shipped capacity is projected to fall to 13.0%, HDDs' share of shipped capacity rises to 76.4%, optical disc capacity drops to about 0.5% and flash's capacity share reaches 10.1%

Media revenue is expected to increase about 1.2X from 2019 to 2025 ($1.8B to $2.2B).

The single biggest application (by storage capacity) for digital storage in the next several years, as well as one of the most challenging, is the digital conversion of film, video tape and other analog formats and their long-term digital preservation

Over 116 exabytes of new digital storage will be used for digital archiving and content conversion and preservation by 2025

Storage in remote clouds is playing an important role in enabling collaborative workflows, content distribution and in archiving

Overall cloud storage capacity for media and entertainment is expected to grow over 13X between 2019 and 2025 (2.2 EB to 29.0 EB)

Overall object storage capacity for media and entertainment is expected to grow about 3.7 X between 2019 and 2025 (14.3 EB to 52.7 EB)

Cloud storage revenue will be about $3.7 B by 2025

By our estimates, professional media and entertainment storage capacity represents about 5.8% of total shipped storage capacity in 2019.

Professional media and entertainment consumed about 28% of all tape capacity shipments, 4.9% of all hard disk drive shipments and 2.3% of all flash memory shipments in 2019. We estimate that media and entertainment spending was about 9% of total storage revenue in 2019.

As a result of the Covid-19 epidemic, M&E content acquisition storage will suffer losses in 2020 and 2021, except for broadcast. This will also impact post-production storage. Cloud storage will assume a new importance for remote work. The latest M&E storage report projects out to 2025.

See the rest here:
Media And Entertainment Storage TAM To Exceed $16B By 2025 - Forbes


Why not open our own Container Registry, muses GitHub as it gives orgs a hand at resource-sharing DEVCLASS – DevClass

With Docker container images being the second most popular ecosystem in GitHub Packages, GitHub has decided to give them their own designated registry.

GitHub Container Registry is now available as a public beta. Microsoft's 2018 acquisition seems to be aiming the new offering mainly at organisations, since major features like fine-grained access control and promoting a standard way of doing things aren't really crafted for single users.

This doesn't mean, however, that the latter won't be able to profit from the registry, since it also answers the call for anonymous image access, allowing anonymous pulls of public images the same way the platform lets anyone download public repo content. For packages, users had to at least sign in to be able to get content.

Since container registries can be found on most cloud platforms and from various providers these days (Google, Red Hat, and GitLab, just to name a few), the appeal of GitHub's offering surely boils down to its proximity to the popular repository management service and the associated easy integration into existing workflows. Compared to predecessor GitHub Packages, Container Registry also shines with the decoupling of registry and repository permissions, which opens it up for more usage scenarios than competing offers.

Through this, teams can keep their code private while sharing images publicly or within their organisation, for example. Administrators are also able to grant access permissions separate from those at the organisation or repo level, giving individual users and teams either read, write, or admin privileges for images.

Out of the box, a container image's visibility is set to private. This can be changed via the package settings, though admins have to enable package creation before anything along those lines can be done. Users have to be aware, though, that once the decision to go public has been made, the image can't be made private again.

According to the documentation, storage and bandwidth are free during the beta phase. Afterwards, it is meant to use the rates of GitHub Packages, implying no cost for public images. Those using the registry for private packages, meanwhile, can do so for free until they've hit a limit which depends on the product they're using (Free/Pro/Team/Enterprise Cloud). Once that's exhausted, costs currently amount to $0.25 USD per GB of storage and $0.50 USD per GB of data transfer.

GitHub Actions to include the new registry in existing workflows are already available, as is a migration guide for those making use of the Packages Docker registry. To make the switch, authentication against the base URL ghcr.io (which doesn't seem to be available at the time of writing) is necessary and packages have to be republished, which will require a bit of manual work.
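As a rough illustration of what the anonymous access described above looks like at the API level, here is a minimal Python sketch. It assumes ghcr.io follows the standard Docker Registry v2 token flow, as most registries do, and the image name octo-org/hello-app is purely hypothetical.

```python
# Minimal sketch: anonymously fetch a public image manifest from ghcr.io.
# Assumes GHCR implements the standard Docker Registry v2 token handshake;
# "octo-org/hello-app" is a hypothetical public image used for illustration.
import requests

IMAGE = "octo-org/hello-app"  # hypothetical OWNER/IMAGE
TAG = "latest"

# 1. Ask the registry's token service for a pull-scoped token
#    (no credentials needed for a public image).
token_resp = requests.get(
    "https://ghcr.io/token",
    params={"service": "ghcr.io", "scope": f"repository:{IMAGE}:pull"},
)
token = token_resp.json()["token"]

# 2. Fetch the image manifest with the bearer token.
manifest_resp = requests.get(
    f"https://ghcr.io/v2/{IMAGE}/manifests/{TAG}",
    headers={
        "Authorization": f"Bearer {token}",
        "Accept": "application/vnd.docker.distribution.manifest.v2+json",
    },
)
manifest_resp.raise_for_status()

manifest = manifest_resp.json()
print("media type:", manifest.get("mediaType"))
print("layers or sub-manifests:", len(manifest.get("layers", manifest.get("manifests", []))))
```

The everyday equivalent is simply docker pull ghcr.io/OWNER/IMAGE:TAG once the image has been made public.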

Docker Engine and Docker Desktop are meant to work seamlessly with the new registry. Docker, of course, has its own way of sharing container images, Docker Hub, but says that portability and choice have always been core to the Docker experience, which is presumably why it is so excited about GitHub's move.

GitHub doesn't seem to plan to stick to Docker, though, and has plans to support more open standards for cloud-native development, including Helm 3 charts for Kubernetes applications and using Container Registry for universal storage.

The last item on the list specifically might pique the interest of some Microsoft sceptics, since the GitHub parent deals in cloud storage, amongst other things. It also counts Docker registry Azure Container Registry among its portfolio, so it remains to be seen if a connection to any of that will be revealed at some point.

See the original post:
Why not open our own Container Registry, muses GitHub as it gives orgs a hand at resource-sharing DEVCLASS - DevClass


Data breach exposes tens of thousands of NSW drivers licences online – ABC News

Transport for NSW is yet to alert up to tens of thousands of people whose full driver's licence details were mistakenly left exposed in open cloud storage.

The cache was discovered last week by Ukrainian security consultant Bob Diachenko who stumbled upon the directory while investigating another data breach.

The storage folder, which he said was easily discoverable, contained back-and-front scans of NSW licences alongside tolling notices hosted on Amazon's cloud service.

The total number of images inside the directory was 108,535, or about 54,000 licences.

The documents revealed names, photos, dates of birth and addresses of drivers, which Mr Diachenko labelled a "dangerous exposure".

He said it wasn't clear how long the files had been accessible online, but given how unprotected it was, it probably had been viewed by "malicious actors" who could have made a copy of the files already.

"A malicious actor can impersonate somebody and apply for credit, or do something on behalf of that person," he said.

"For example, you take one licence and connect the dots with one owner of this licence, with his or her emails exposed in another data breach and you've got more information on that person," he said.

He said personal information like this would also commonly be traded through online black markets once it made its way into the hands of a criminal.

A spokeswoman for Transport for NSW said the collection of files was not related to any government system.

"Transport for NSW does not retain, nor collect tolling data in the manner described," she said.

"Transport for NSW is however working with Cyber Security NSW to investigate the alleged data issue relating to an Amazon Web Services S3 bucket containing personal information including driver licences."

An Amazon Web Services S3 bucket is a cloud storage container hosted on Amazon's platform; this one had been left open to the public.

The office of the NSW Privacy Commissioner, which is delegated to monitor data breaches within State Government departments, said the data appeared to be linked to an unnamed private business.

"The NSW Privacy Commissioner is aware of the breach and has received a preliminary briefing on the breach from Cyber Security NSW," a spokeswoman said.

"The Privacy Commissioner understands that a commercial business, unconnected to the NSW Government, was responsible for the breach.

"The breach is not associated with a NSW Government agency or any NSW Government system or process."

The Australian Cyber Security Centre has also been alerted and it is understood it contacted Amazon, who ensured the cache was taken offline within hours after it was alerted.

The Transport for NSW spokeswoman said drivers can request a new licence in cases where they believe they have been impacted by identity fraud.

Leading cyber expert and founder of data breach tool Have I Been Pwned, Troy Hunt, said this was an unusual and uncommon kind of breach and it might be too little, too late.

Mr Hunt said even if Transport NSW was not culpable, it had a responsibility to disclose the potentially "high risk" leak to protect its customers.

"I think there should have been a notice," Mr Hunt said.

"I would be pushing for a disclosure on this, because it's something that's quite important."

Even if the licence details, such as the card number, weren't used directly, there was "powerful information" on there, and it was enough to commit identity theft.

One example he provided would be using it for "social engineering", such as creating a fake Facebook account to solicit relatives for money.

He was concerned about the toll notices having emails and passwords, which are almost always compromised eventually because regular users have poor security hygiene.

Once a malicious actor had a person's email and password, he said, their "ability to go on and do damage is massive".

Read the original:
Data breach exposes tens of thousands of NSW drivers licences online - ABC News


Sharing responsibility: Why we need to work together to keep the cloud secure – ComputerWeekly.com

In recent months, especially following the forced online shift through lockdown, there has been a rush across industries to adopt more cloud technology. Education is no exception, and tools such as Zoom, Teams and Microsoft 365 have shot to the top of must-have lists for educators.

Even for institutions that have already started a digital transformation journey, there will likely be conversations happening around how systems can be streamlined and security measures regulated, as more work is being completed remotely.

Security is also particularly high on the digital radar as instances of malware and ransomware attacks are becoming more prevalent, and can compromise sensitive data. A recent ransomware attack on US-based software-as-a-service (SaaS) provider Blackbaud shows how student and staff data within universities can fall prey to cyber criminals.

True cyber security relies on a secure digital infrastructure, and as some of the biggest cyber security suppliers on the planet, cloud providers are aware of the implications. Security has always been a top priority for cloud providers, and they are continually investing in making sure their infrastructure is highly secure. The use of public cloud by high-profile clients such as the military and government organisations also means the level of security across the board is of the highest standard.

Most of the major cloud providers, such as Amazon Web Services (AWS), Microsoft and Google, use a shared responsibility model, which means there are various steps a user needs to take to ensure that they are using cloud services in a secure way - it doesn't all fall on the provider.

A shared responsibility model means that all players assume some level of security responsibility. For example, this is often a two-way split between cloud provider and customer, or a three-way split between cloud provider, customer and managed service provider. However, many players are involved in the process, and each will have a role to play in ensuring the security of data.

A good example is the evolution of AWS's digital storage facility, S3. It used to be very easy for a user to create an S3 storage location, called a bucket, that was automatically open to the public and viewable by anyone with access to the URL. If a bucket was compromised, the blame would have been directed at the user for not switching the settings to private.

For this reason, AWS has updated the service so the standard configuration is private, and the user has to manually make the bucket publicly viewable, should they choose to. AWS is, technically, not responsible for how a user configures a bucket, but has altered the infrastructure to make it easier for the user to select the most secure options.
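For admins who want to go a step further than relying on the defaults, the public-access lockdown can also be applied explicitly in code. The snippet below is a minimal sketch using boto3, the AWS SDK for Python; the bucket name is hypothetical, and a real deployment would typically apply this through infrastructure-as-code or account-level settings.

```python
# Minimal sketch: explicitly block all public access on an S3 bucket with boto3.
# "example-coursework-bucket" is a hypothetical bucket name.
import boto3

s3 = boto3.client("s3")

s3.put_public_access_block(
    Bucket="example-coursework-bucket",
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,        # reject new public ACLs
        "IgnorePublicAcls": True,       # ignore any existing public ACLs
        "BlockPublicPolicy": True,      # reject bucket policies that allow public access
        "RestrictPublicBuckets": True,  # restrict access even if a public policy exists
    },
)

# Verify the setting took effect.
config = s3.get_public_access_block(Bucket="example-coursework-bucket")
print(config["PublicAccessBlockConfiguration"])
```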

There are various levels of security that come into play with a shared responsibility model. The physical security of a datacentre is the responsibility of the cloud provider, as they own the physical estate.

This means the cloud provider is responsible for regulating who is allowed to enter the centre and so on. The provider also carries responsibility for the security of the underlying infrastructure. This includes ensuring that security features are built into the cloud platform but does not stretch to elements such as password strength or enabling multifactor authentication (MFA), which are the responsibility of the user.

Users are also responsible for how they deploy cloud applications, and whether they encrypt any data stored on them. These elements use the security tools built by the cloud provider, but the configuration and use of these tools is the responsibility of the user.

For example, if a lecturer uses Microsoft Teams to gather and store coursework from students, there is an implicit trust between the student and lecturer that the lecturer will store the work in a secure way. But there is also an onus on the student to upload it safely - for example, not making the document public or sending it over an insecure connection. The responsibility of Microsoft, the cloud provider of Teams, is to make sure that, in building and hosting the application, these security measures are possible and easy to implement.

With an increased move to cloud in the education sector, understanding the dividing lines within this model is essential. Moving services to the cloud doesn't mean abdicating all responsibility - the security of cloud platforms and the data within them is a collaborative effort.

To hear more about cyber security and get involved in the discussion, sign up for Jisc's security conference, running online from 3-5 November 2020. Entry is free; book your place here.

Read more:
Sharing responsibility: Why we need to work together to keep the cloud secure - ComputerWeekly.com


10 Key Takeaways From NetApp CEO George Kurian: Cloud, Coronavirus And Growth – CRN: Technology news for channel partners and solution providers

NetApp Building Its Cloud Business, Thriving Despite COVID-19

NetApp on Wednesday reported a surprisingly successful first quarter of fiscal 2021. And it was a surprise, both because NetApp's last few fiscal quarters have suffered from falling sales and because the IT industry, along with the rest of the economy, has taken a big hit from the COVID-19 coronavirus pandemic.

However, NetApp saw total revenue for its first fiscal 2021 quarter rise 5 percent to $1.3 billion, revenue for its all-flash array business increase 34 percent on an annual net revenue basis, and annual recurring revenue grow 192 percent.

NetApp CEO George Kurian used his prepared remarks and analyst questions during the company's quarterly financial conference call to talk about the growth of its cloud and all-flash storage business, its competitive environment (including a look at Dell's new PowerStore), the impact of the pandemic, and more, as a way to provide insight into both what happened during the quarter and what to expect going forward.

"With the ongoing pandemic, the near future remains uncertain for many companies, and no one knows when we will return to a more normal and predictable environment," he said. "Despite the uncertainties, one thing is clear: Data is growing in scale and importance. We help the world's leading organizations solve the challenge of managing their most critical data."

For a look at what is impacting NetApp and the IT industry, turn the page.

Excerpt from:
10 Key Takeaways From NetApp CEO George Kurian: Cloud, Coronavirus And Growth - CRN: Technology news for channel partners and solution providers


How to Prepare for the Next Time the Cloud Goes Down – Gizmodo

Internet access is pretty essential to get anything done these days, whether it's chatting with working-from-home colleagues in Slack, binge-watching the latest hit Netflix show, or writing up reports in Google Docs. Most of the apps we rely on run from the cloud, and it's all too easy to just assume the cloud will always be there. However, that's not quite true.

Cloud outages happen on a pretty regular basis, and while it's rare for multiple web platforms to be knocked out all at once, that happens too. And even if the cloud is working, your connection to it might not be. With that in mind, here's a quick guide to help you get ready for a cloud outage - just in case.

Keep your email close

Email is still a must-have for most of us when it comes to getting through the work day, and this medium relies on the cloud. While sending and receiving emails obviously isn't going to be possible if your email provider of choice goes down, you can at least make sure that you've got copies of your emails so you can still do some inbox sorting and draft some new messages.

Gmail is good at this. You can enable offline Gmail in Chrome by clicking Settings (the cog icon on the right) then See all settings, Offline, and Enable offline mail (you can choose how many days of email get cached). Gmail for Android and iOS actually syncs email for offline access automatically, though you might not realize it - head to the settings for your Gmail address from the main app menu, then use the Sync Gmail and Days of emails to sync options (Android) or the Sync settings option (iOS) to manage this.


Have you heard of desktop clients? Trust us, they were big in the '90s. Even if you spend most of your time managing your email inside a web browser, it's still worth keeping your messages in sync with a desktop program as well, just in case - both Windows and macOS have basic, built-in Mail apps that will sync your messages locally, or you can use something like Thunderbird.

Adding new email accounts is typically just a question of entering your login credentials - the email program you've chosen will take care of the rest. You might need to enable third-party access from inside your email service on the web first, and maybe generate a specific password if you're using two-factor authentication. If your email is synced locally, you'll at least be able to browse through it and refer back to it while you're waiting for the cloud servers to get back on their feet.

Most cloud storage services now make at least some effort to free up space on your computer by keeping certain files exclusively on the web and only downloading local copies when you actually need them. That's great for freeing up gigabytes of space on your hard drive, not so great when your cloud storage provider starts having problems.

At the very least make sure your important files are always being synced locally as well as being stored in the cloud. With Dropbox, for example, you can do this by opening up the Dropbox file browser interface from the notification area or system tray, right-clicking on a file or folder, and choosing Smart Sync then Local. Files and folders that are fully synced have a solid green tick next to them.

In the case of OneDrive on Windows, if you right-click on the OneDrive entry in File Explorer then choose Settings and open the Settings tab, you'll see a Save space and download files as you use them option - turn this off to store all your files locally. You can also right-click on individual files and folders inside File Explorer and then choose Always keep on this device to make sure important data is always kept locally.

If you're a macOS user, iCloud will start moving older, lesser-used files online - but only if you start running out of room on your hard drive. Open up the Apple menu then choose System Preferences, Apple ID and iCloud, then untick Optimize Mac Storage to stop this from happening. Alternatively, just make sure there's always plenty of room left on your local drive so iCloud doesn't attempt any housekeeping.

After following the cloud storage tips above, make sure you've got your work accessible offline wherever possible. This might mean keeping copies of important files on an external disk drive or a NAS drive, for example - we've written before about how useful NAS drives can be, because they act as your own personal cloud on your home network.

If you're a Google Docs, Sheets, and Slides user, you can get files created in these apps to cache locally in Chrome, just in case something happens to the cloud servers (or your internet connection). From the main Google Drive interface, click the cog icon (top right), then Settings and General: Tick the box labeled Create, open and edit your recent Google Docs, Sheets and Slides files on this device while offline and the sync begins.

Saving to the cloud is of course very useful for backing up your work, sharing it with other people, and collaborating on documents, but taking a few moments to save a local copy could save you a lot of trouble if the cloud suddenly becomes unavailable. The most recent versions of the Microsoft Office apps will encourage you to save to OneDrive for syncing purposes, so make sure your local files are actually stored, or save them in a separate folder as well.

Part of surviving a cloud outage or a network fail is just a little bit of planning. You should make sure everyone on your team knows what they need to switch to if Slack or Google Drive or iCloud should collapse, otherwise you'll spend the first hour or so of any downtime trying to work out what your alternatives are.

We're not built to be ultra-productive workhorses every minute of every day, and there are times when you're going to want to kick back and enjoy some music or a movie or two - which can be tricky in the midst of a Spotify or Netflix outage. Aside from digging out DVDs, your best bet here is to make sure your favorite films and tunes are synced and ready for offline viewing or listening.

You'll find the feature in just about every music-streaming app, provided you're a paying subscriber - it's the little blue cloud download icons in the Apple Music desktop app, or the Download toggle switches at the top of every playlist in the Spotify desktop app, for example. It may seem pointless to sync playlists for offline listening on a computer, but it won't seem so pointless when these services go down. Offline syncing is available inside the mobile apps, too, of course.

The offline playback option is less common in video streaming apps on the desktop, so you're probably going to have to rely on phones and tablets to sync movies and shows for watching if streaming isn't available - it's a good idea to have at least a few hours of entertainment available offline, just in case. Remember that Chromebooks can install the Android versions of apps like Netflix and YouTube, complete with download options, as well as access the web apps.

Your best option here is actually the TV app on macOS, which is still iTunes on Windows for the time being: If this is where all your media is stored, you can click the download button (the cloud and arrow symbol) next to any show episode or film to store it locally. This works for both digital content you've bought from Apple and anything on Apple TV+.

Just take a break for a bit. You won't miss much, we promise.

Excerpt from:
How to Prepare for the Next Time the Cloud Goes Down - Gizmodo


Responding to Cloud Misconfigurations with Security Automation and Common-Sense Tips – Security Boulevard

Few things can boil the blood of a security professional quite like the unforced error. It is a common term used in tennis, referencing a mistake attributed to a player's own failure rather than the skill or effort of their opponent.

In cybersecurity, the unforced error is better known as the misconfiguration. This occurs when security settings, typically involving a server or web application, are set up improperly or left insecure.

This leaves the system vulnerable to attack and furthers the path of least resistance for the bad guys. Considering the increasing sophistication of cyber threats and the ever-expanding attack surface available to your foes, you need not be an infosec veteran to know that your adversaries require no additional help accomplishing their goals.

Webinar on demand: Automate Adversarial Testing and Response Simulations Against AWS Misconfigurations

Security misconfigurations rank No. 6 on OWASP's Top 10 Web Application Security Risks list and are commonly a result of insecure default configurations, incomplete or ad hoc configurations, open cloud storage, misconfigured HTTP headers, and verbose error messages containing sensitive information.

The misconfiguration risk is only rising, especially amid the growth in public cloud computing adoption, whose benefits have become especially clear during the COVID-19 crisis and the subsequent work-from-home binge. Cloud demand has risen across Amazon Web Services (AWS), which controls roughly half the market, as well as Microsoft Azure and Google Cloud Platform (GCP), through the rapid adoption of online collaboration tools and other cloud resources.

A recent survey by Check Point determined that misconfigurations are the top threat to cloud security, with three-quarters of respondents saying they are very or extremely concerned about cloud security and 68% naming misconfigurations as their biggest cloud worry. Their concerns are not unfounded.

Cloud misconfigurations were responsible for potentially exposing an estimated 33.4 billion records in 2018 and 2019, victimizing high-profile organizations and costing organizations some $5 trillion. Considering many misconfigurations go unreported, the figures are likely significantly larger. And not only are misconfigurations obvious harbingers of data exposure, they also can present the ideal foothold to launch a more complex (and potentially more devastating) attack on an organization.

This is by no means an exhaustive list, but can serve as a reliable encapsulation of agreed-upon advice among experts:

At the end of the day, the stats do not lie. Misconfigurations are inevitably going to happen, so the key will be limiting their time of exposure and reducing mean time to detect and respond (MTTD/MTTR). This can be accomplished with the help of automated remediation in concert with security orchestration, automation and response (SOAR).
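As a concrete, if simplified, illustration of what automated remediation can look like in practice, the sketch below scans an AWS account for S3 buckets that lack a public access block and applies one, the kind of playbook step a SOAR platform would orchestrate. It uses boto3 and is only a hedged example, not the Siemplify or CloudGuard Dome9 implementation; a real playbook would add enrichment, approval gates, alerting and audit logging.

```python
# Simplified remediation sketch: find S3 buckets without a public access block
# and lock them down. Illustrative only; a production SOAR playbook would wrap
# a step like this with enrichment, approvals and audit logging.
import boto3
from botocore.exceptions import ClientError

LOCKDOWN = {
    "BlockPublicAcls": True,
    "IgnorePublicAcls": True,
    "BlockPublicPolicy": True,
    "RestrictPublicBuckets": True,
}

def remediate_public_buckets() -> list[str]:
    """Return the names of buckets that were remediated."""
    s3 = boto3.client("s3")
    fixed = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            s3.get_public_access_block(Bucket=name)
        except ClientError as err:
            # A bucket with no configuration at all counts as misconfigured here.
            if err.response["Error"]["Code"] != "NoSuchPublicAccessBlockConfiguration":
                raise
            s3.put_public_access_block(
                Bucket=name, PublicAccessBlockConfiguration=LOCKDOWN
            )
            fixed.append(name)
    return fixed

if __name__ == "__main__":
    print("Remediated buckets:", remediate_public_buckets())
```

Running a check like this on a schedule, and feeding its output into the alerting pipeline, is one straightforward way to shrink the window of exposure and the MTTD/MTTR numbers mentioned above.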

For example, Check Point CloudGuard Dome9 users gain visibility, control, and compliance across all cloud assets to manage cloud security posture and detect and remediate misconfigurations from a single source of network authority. Meanwhile, the Siemplify SOAR platform integrates with CloudGuard Dome9 to enable enrichment of alerts by integrating data from other Check Point tools, such as ThreatCloud and data from third-party tools such as Azure Active Directory. This integration allows analysts to investigate alerts from CloudGuard Dome9 and implement playbooks that automate remediation from a single console, saving your team time and effort.

Learn more about remote security operations and how Siemplify can help with A Technical Guide to Remote Security Operations, or begin test driving the SOAR platform today through a free trial or by downloading the Siemplify Community Edition.

Dan Kaplan is director of content at Siemplify.

The post Responding to Cloud Misconfigurations with Security Automation and Common-Sense Tips appeared first on Siemplify.

Recent Articles By Author

*** This is a Security Bloggers Network syndicated blog from Siemplify authored by Dan Kaplan. Read the original post at: https://www.siemplify.co/blog/responding-to-cloud-misconfigurations-with-security-automation-and-common-sense-tips/

Read the original post:
Responding to Cloud Misconfigurations with Security Automation and Common-Sense Tips - Security Boulevard


Quantum Physics May Upend Our Macroscopic Reality In The Universe – Forbes

If a tree falls in the forest and someone is there to hear it, does it make a sound? Perhaps not.

Once again, quantum physics is calling our concept of reality into question.

If you are familiar with quantum physics, you know that on very tiny scales, the Universe is very weird. Particles act like particles and waves at the same time. An electron may be in one location, and then suddenly in another location, without ever passing through a point between those two spots. A single particle can even interact with itself.

But on the macroscopic scale, things are more normal. At least, we think. But perhaps quantum physics also affects us, as macroscopic observers. And recent research published in Nature Physics says for even macroscopic observers, quantum physics may call our reality into question.

As macroscopic observers, we can say three things about reality.

If we observe something, we believe it really did happen.

Let's compare these with reality on a quantum level.

These two realities are very different. If our normal, macroscopic world started acting in a quantum way, the world would be a very different place.

But perhaps our world is not as clear-cut as we thought.

Let's try to mess with our macroscopic reality a bit.

To do this, we can do a thought experiment, where the observer of the particle is also observed.

The experiment, known as Wigner's friend, goes like this. You have a scientist, let's call him Charlie, who is sealed inside a lab. He makes an observation of a particle as either red or blue. His friend, Alice, waits outside. From Alice's perspective, she doesn't know whether Charlie measured the particle as red or blue. According to her, until she opens up that lab door and asks Charlie what he saw, the particle is both red and blue at the same time. This is similar to the outcome we see in the Schrödinger's cat experiment, where a cat in a box is both alive and dead until observed.

Like Schrödinger's cat, from Alice's perspective, Charlie would have measured his particle as both red and blue.

Eugene Wigner, the physicist who came up with this thought experiment, thought this was absurd. Charlie has a consciousness - he can't be in two states at once (one where he observed the particle as red and one where he observed the particle as blue). Thus, Wigner claimed, human consciousness causes all of this uncertainty to collapse.

This makes sense to us in a macroscopic world. But what's so special about human consciousness? And why are observers so special - or are they?

This is where Wigner left off. But another version, first proposed by Časlav Brukner, was recently extended by a group of scientists at the Centre for Quantum Dynamics at Griffith University and the Department of Physics and Center for Quantum Frontiers of Research & Technology at the National Cheng Kung University.

In their version, there are two observers locked in their labs on opposite sides of the planet, Charlie and Debbie. They both observe entangled particles, say, as red or blue. Remember, if Charlie observes his particle as blue, Debbie's entangled particle must also be blue. This causes Charlie and Debbie to now be entangled with one another. Charlie and Debbie, in turn, have two observers, Alice and Bob.
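For readers who want to see that correlation written down, a schematic way to express the shared pair (in standard bra-ket notation, with red and blue standing in for the two outcomes, and not necessarily the exact state used in the paper) is:

```latex
|\psi\rangle_{CD} \;=\; \frac{1}{\sqrt{2}}\Big( |\text{blue}\rangle_C\, |\text{blue}\rangle_D \;+\; |\text{red}\rangle_C\, |\text{red}\rangle_D \Big)
```

In a state like this, neither particle has a definite colour on its own, yet a blue outcome for Charlie guarantees a blue outcome for Debbie, which is the correlation the extended Wigner's-friend argument builds on.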

Alice and Bob then each flip a coin. If it's heads, they open the door to the lab of Charlie and Debbie and ask for the result of their experiment. If tails, they do another measurement that will come out positive if Charlie and Debbie are entangled with their particles.

No information inside the lab should leak out at all, except if Alice and Bob open the door and ask their friends about the result of their experiment. Even Charlie and Debbie, after the experiment, can't remember the result.

At this point, let's go back to our tenets of reality and see how they relate to this experiment. Charlie and Debbie really see the particle as red or blue, and this reflects some sort of objective reality. In addition, this reality should not be dependent on the choice that Alice and Bob make when they flip their coin.

After thousands of runs, the researchers found that the correlations between what Alice and Bob see and whether Charlie or Debbie measure their particle as blue or red exceed the amount expected if our three tenets of reality hold up.

What this means is that something strange is happening when consciousness interacts with quantum physics. Either our idea of quantum physics needs to be revised, or we don't have a full grasp on reality.

If this experiment holds for human observers, our reality may not be objectively true.

"For one, the correlations we discovered cannot be explained just by saying that physical properties don't exist until they are measured," says Dr. Eric Cavalcanti, one of the authors on the paper. "Now the absolute reality of measurement outcomes themselves is called into question."

Now, there are some limitations to this experiment. For one, Alice, Bob, Charlie, and Debbie weren't real people. However, if the same results aren't obtained with real people in some future version of this experiment, that means conscious observers really are special. If we do get the same results, then one of our tenets of reality must not be true. Reality for one person may not be the reality seen by another person.

In any case, this work sets the stage for how quantum physics and consciousness can come together to help us understand the true nature of reality.

Read this article:

Quantum Physics May Upend Our Macroscopic Reality In The Universe - Forbes


If you flew your spaceship through a wormhole, could you make it out alive? Maybe… – SYFY WIRE

Can you already hear Morgan Freeman's sonorous voice, as if this were another episode of Through the Wormhole?

Astrophysicists have figured out a way to traverse a (hypothetical) wormhole that defies the usual thinking that wormholes (if they exist) would either take longer to get through than the rest of space or be microscopic. These wormholes just have to warp the rules of physics, which is totally fine, since they would exist in the realm of quantum physics. Freaky things could happen when you go quantum. If wormholes do exist, some of them might be large enough for a spacecraft to not only fit through, but get from this part of the universe to wherever else in the universe in one piece.

"Larger wormholes are possible with aspecial type of dark sector,a type of matter that interactsonly gravitationally with our own matter. The usual dark matter is an example.However, the one we assumed involves a dark sector that consists of an extradimensional geometry,"Princeton astrophysicist Juan Maldacena and grad student Alexey Milekhin told SYFY WIRE.Theyrecently performed a new study that reads like a scientific dissection of what exactly happened to John Crichtons spaceship when it zoomed through a wormhole in Farscape.

"This type of larger wormhole isbased on therealization that a five-dimensional spacetime could be describing physics at lowerenergies than the ones we usually explore, but that it would have escaped detection because it couples with our matter only through gravity," Maldacena and Milekhinsaid."In fact, its physics issimilar to adding many strongly interacting massless fields to the known physics,and for this reason it can give rise to the required negative energy."

While the existence of wormholes has never been proven, you could defend theories that they exist deep in the quantum realm. The problem is, even if they do exist, they are thought to be infinitesimal. Hypothetical wormholes would also take so long to get across that you'd basically be a space fossil by the time you got to the other end. Maldacena and Milekhin have found a theoretical way for a wormhole that could get you across the universe in seconds and manage not to crush your spacecraft. At least it would seem like seconds to you. To everyone else on Earth, it could be ten thousand years. Scary thought.

"Usually whenpeople discuss wormholes, they have in mind 'short'wormholes: the ones forwhich the travel time would be almost instantaneous even for a distant observer.We think that such wormholes are inconsistent with the basic principles of relativity," the scientists said. "The ones we considered are 'long': for a distant observed the path alongnormal space-time is shorter than through the wormhole.There is a time-dilation factor because the extreme gravity makes travel time very short for the traveller. For an outsider, the time it takes is much longer, so we have consistency with the principles of relativity, which forbid travel faster than the speed of light."

For traversable wormholes to exist, the vacuum of space would have to be cold and flat to actually allow for what they theorize. Space is already cold. Just pretend that it's flat for the sake of imagining Maldacena and Milekhin's brainchild of a wormhole.

"These wormholes are big, the gravitational forces will be rather small. So, if they were in empty flat space,they would not be hazardous. We chose their size to be big enough so that theywould be safe from large gravitational forces," they said.

Negative energy would also have to exist in a traversable wormhole. Classical physics forbids such a thing from being a reality. In quantum physics, the concept of this exotic energy is explained by Stephen Hawking as the absence of energy when two pieces of matter are closer together as opposed to being far apart, because energy needs to be burned to separate them while gravitational force struggles to pull them back together. Fermions, which include subatomic particles such as electrons, protons, and neutrons (except that here they would need to be massless), would enter one end and travel in circles. They would come out exactly where they went in, which suggests that the modification of energy in the vacuum can make it negative.

"Early theorized wormholes were not traversable; an observer going through a wormhole encounters a singularity before reaching the toher side, which is related ot the fact that positive energy tends to attract matter and light," the scientists said."This is whyspacetime shrinks at the singularity of a black hole. Negative energy prevents this. The main problem is that the particular type of negative energy that is needed is not possible in classical physics, and in quantum physics it is only possible in some limited amounts and for special circumstances.

Say you make it to a gaping wormhole ready to take you...nobody knows where. What would it feel like to travel through it? Probably not unlike Space Mountain, if you ask Maldacena and Milekhin. In their study, they described these wormholes as "the ultimate roller coaster."

The only thing a spaceship pilot would need to do, unlike Farscape's Crichton, who totally lost control, is get the ship in sync with the tidal forces of the wormhole so as to be in the right position to take off. These are the forces that push and pull an object toward or away from another object depending on differences in gravitational strength, and that gravity would power the spaceship through. This is why it would basically end up flying itself. But there are still obstacles.

"The problem is that every object which enters the wormhole will be acceleratedto very high energies," the scientists said."It means that a wormhole must be kept extremely cleanto be safe for human travel. In particular, even the pervasive cosmic microwaveradiation, which has very low energy, would be boosted to high energies andbecome dangerous for the wormhole traveler."

So maybe this will never happen. Wormholes may never actually be proven to exist. Even if they don't, it's wild to think about the way quantum physics could even allow for a wormhole that you could coast right through.

Continued here:

If you flew your spaceship through a wormhole, could you make it out alive? Maybe... - SYFY WIRE
