
Safeguarding Cloud-Based Data & Mitigating the Cyber Risks Associated with a Remote Workforce – JD Supra

[author: Stephen O'Maley]

INTRODUCTION

Efficiency, scalability, speed, increased cost savings, and advanced security for highly sensitive data remain in high demand by users of eDiscovery services. To meet that demand, cloud technology promises several of those benefits.

However, the advanced security of the data depends on how an eDiscovery service provider implements, maintains, and manages sensitive client information.

This issue has become more significant as the majority of the workforce has dispersed, often working from unsecured home environments, which in turn has driven increased usage of cloud services. That greater cloud usage has opened the door to riskier data storage scenarios that might not be fully apparent to users of eDiscovery services. Furthermore, the firms providing these services may not be knowledgeable about all of the risks inherent to their activities and processes.

Because the industry has moved toward commoditization over customization, the workforce within some eDiscovery providers consists largely of junior staff who are expected to follow strict protocols and procedures while in the office. While those protocols may have been proven and vetted in the office environment to meet minimum security standards, most employees are not likely to be as mindful of the security risks inherent to working at home.

This paper examines the inherent risks surrounding the protection of client electronic data on cloud-based platforms that have arisen with the proliferation of the at-home work setting. It also explains why it's important for users of eDiscovery services to scrutinize the technical capabilities, practices, and experience of the professionals that will be handling their data to ensure proper precautions are in place.

THE CLOUD: A SOLUTION THAT INTRODUCES ADDITIONAL RISKS

Many eDiscovery providers have recently migrated hosted client data from private data centers to public or private cloud environments. As hosted data volumes increased, so did the complexities involved in scaling the physical resources required to maintain private hosting environments in a way that met the speed, efficiency, redundancy, and security requirements of clients. Consequently, eDiscovery providers began reexamining the risks and costs associated with their hosted portfolios, and many of them turned to the cloud as a solution. But this also introduced other issues that may not have been fully reconciled to date and may have been exacerbated by the pandemic.

Security

It is not uncommon for an organization's most sensitive data to be found on eDiscovery platforms. That data often includes privileged communications, business strategy decisions, trade secret information, potentially embarrassing personal communications, and other confidential communications from its employees, leadership, and legal counsel. Cloud hosting services that are run by eDiscovery providers have a range of security capabilities that are often unexamined by the eDiscovery user.

Due to the increasing sophistication of state and non-state cyber hackers, there is continued and mounting risk of infiltration by hostile actors. This was illustrated in the 2020 SolarWinds attack on the U.S. government. In that scenario, a trusted technology service firm tasked with maintaining the computing environment within several of the world's most secure data centers provided the doorway for hackers to access the country's most sensitive data.

Then there are the inherent risks with at-home working environments that have increased due to the COVID-19 pandemic. With the advancement and continued adoption of IoT (Internet of Things) devices and the expansion of high-bandwidth internet services for residential consumers, there exist multiple pathways for trusted home-based Wi-Fi connected services in the form of smart devices (smart speakers, thermostats, alarm systems, TVs, etc.) to become compromised in an environment that isn't usually monitored for malicious network activity. This is compounded when employees of eDiscovery providers lack experience or knowledge around network security risks.

Reliability

Cloud services offer the promise of unparalleled reliability with limited downtime for the document review operations of eDiscovery users. Although there may be regularly scheduled maintenance windows, emergency outages do happen occasionally. Consider Google's outage in December of 2020. Disaster-related outages to users of eDiscovery services hosted in the cloud can have severe impacts on a client's ability to meet court-mandated and other production timelines.

Data protection and privacy concerns

Cloud hosting solutions can and often do provide data storage local to regional jurisdictions that require identification and redaction of personally identifiable information (PII) before transferring that information to another country (such as the United States). This offers eDiscovery providers the promise of locally available data storage in the region imposing the privacy regulations.
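
To make the redaction step concrete, here is a minimal sketch of scrubbing PII from text before it leaves a jurisdiction. The regex patterns (emails and US-style SSNs) are simplified illustrations, not the exhaustive detection a production eDiscovery pipeline would use:

```python
import re

# Simplified patterns for illustration; real PII detection is far broader.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_pii(text: str) -> str:
    """Replace recognized PII with labeled placeholders before export."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label.upper()}]", text)
    return text

print(redact_pii("Contact jane.doe@example.com, SSN 123-45-6789."))
# -> Contact [REDACTED-EMAIL], SSN [REDACTED-SSN].
```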

However, given the multitude of regions throughout the globe with data privacy regulations, a user of eDiscovery services should not assume that their data is being hosted in accordance with local regulations. In general, users of eDiscovery services should confirm with their providers where the physical servers that will be housing the protected data are located.

Additionally, with the majority of the staff of eDiscovery providers working from home due to the pandemic, it is important to ask how the provider is addressing global data privacy regulations in that setting.

Global context

Cybercrime is projected to have cost the global economy nearly $1 trillion in 2020. Furthermore, hacking and infiltrations into government and business entities are increasingly viewed as the best way for adverse nations and other bad actors to have the greatest impact on their targets. This is all intensified by the global pandemic, when at-home working environments and increased use of social engineering in generally insecure environments present added risks to the security of data under management.

HOW TO ENSURE YOUR DATA IS SECURE

What are some of the ways that users of cloud-based eDiscovery services can verify that their data is being safeguarded?

Cloud security

One important step to take is to ask if the cloud-based eDiscovery solution has been certified to various security standards. While this isn't a guarantee that your data is not exposed, it does provide some level of comfort that security protocols are tested on a regular basis by an impartial third party. Some certifications that are relevant here include SOC 2 Type 2, ISO 27001, ISO 27017, and ISO 27018, as well as certifications that indicate the hosting provider is mindful of data privacy regulations and HIPAA requirements.

It's important to differentiate certifications that are attributed to the cloud operator as opposed to the data hosting service provider. For example, AWS, Google, and Microsoft Azure have a number of sophisticated data security certifications associated with their upstream operation of the cloud environment.

However, it's important to note that an eDiscovery platform running within that cloud environment employs its own security protocols to allow reviewers to access documents and, as a result, does not inherit all of the security controls that exist on the base-layer cloud offering. Make sure you know what security protocols and certifications your application of choice can directly lay claim to.

Work from home security considerations

Working from home presents additional considerations. Many eDiscovery providers will point to employee handbooks and corporate policy documents as an initial answer, but in this unprecedented time, it is unlikely that those guidelines anticipated a scenario in which the majority of the workforce was working from disparate, nonsecure locations outside the office.

Depending upon the technical environment available at the eDiscovery provider, measures can be taken to come close to the network restrictions in place in the office. No solution will be 100 percent risk free, but there are best practices that can be implemented to mitigate major risks. For example, the provider can take a centralized security approach through the use of a VPN (virtual private network) connection to the office environment that restricts access to non-essential networks and prevents employees from using non-work-issued computers.
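
One small piece of that centralized approach can even be automated on the endpoint. Below is a minimal sketch that refuses to proceed unless the machine's traffic is actually egressing through the corporate VPN; the VPN address range is a made-up example (a documentation TEST-NET block), and the public echo service is just one way to learn the egress IP:

```python
import ipaddress
import urllib.request

# Hypothetical address range assigned by the corporate VPN gateway
# (203.0.113.0/24 is a documentation/TEST-NET block, used as a stand-in).
CORPORATE_VPN_NET = ipaddress.ip_network("203.0.113.0/24")

def egress_ip():
    """Ask a public echo service which IP address our traffic exits from."""
    with urllib.request.urlopen("https://api.ipify.org") as resp:
        return ipaddress.ip_address(resp.read().decode().strip())

if egress_ip() in CORPORATE_VPN_NET:
    print("Traffic is leaving via the corporate VPN; proceed.")
else:
    raise SystemExit("Not on the VPN: refusing to open client data.")
```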

It's also crucial to be aware of the different levels of security restrictions appropriate for employees focused on different aspects of the eDiscovery process. For instance, someone performing document review likely requires less access to sensitive client data than the project manager in charge of organizing the review. It's necessary to understand what at-home procedures your provider is using and how that affects the safety and exposure of your data.

CONCLUSION

Notwithstanding the issues that have arisen, cloud-based eDiscovery solutions provide users numerous advantages in tackling the unprecedented challenges being faced in the post-COVID world. At the same time, it's equally important for users to know and understand what protections providers are enacting to safeguard their data. Cloud storage solutions address issues faced by aging technical infrastructure, can greatly bolster cybersecurity, and give eDiscovery providers the flexibility to operate in a global setting. The added risks posed by work-from-home environments due to the pandemic mean that buyers of these services should closely monitor the whereabouts, protection, and technical environments employed by the firms working with their sensitive data.


StudioDoc: And Back to the Edge – MarketScale

It wasn't so long ago that cloud computing was the trending topic, as the growth of the wireless industry and new technology implementations led to rapid migration from on-premises data centers to cloud servers. According to a Gartner Forecast study, over a third of organizations consider cloud computing one of their top three investment priorities.

However, cloud computing isn't without its limitations. It requires significant bandwidth and struggles with latency, making it challenging to use with sophisticated industrial IoT applications, which require real-time computation.

The next big solution in the computing space is the edge.

When you think of the word "edge," thoughts of the end of an object or place immediately come to mind, like the blade of a knife or the cliff of a mountain. In this application, however, the end actually means the start.

While cloud computing continues to serve its purpose of providing a computer's system resources on demand, performing computations remotely and relaying the results back involves a great deal of network traversal.

That's where edge computing comes in, bringing the power of the cloud closer to the customer. Edge computing can be used solo or in addition to the cloud, and it provides customers with boosted network performance, increased reliability, and often decreased costs, thereby solving bandwidth and latency issues.

IT professionals have realized that moving their computational power closer to the edge enables the kind of real-time delivery that is necessary for many industries such as oil and gas, retail, and manufacturing. But going with cloud, edge, or both will depend on a customer's needs.

Like any alternative solution or add-on, cloud and edge computing have pros and cons.

For instance, when using wireless connections, cloud computing may require cellular data, which can get costly, and this is where edge can help. Senior Director of Engineering for Digi International, Ken Bednasz, explained, "The cost of memory is going down; the amount of processing power is increasing tenfold."

With edge computing growing significantly over the last five years, customers can now look at their needs and determine which option makes the most sense to help optimize their workflows and processes. Factors like data size, time sensitivity, innovation, extra security/privacy, and redundancy are all needs that may push a customer to choose or add on edge.
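
As a toy illustration of how those factors might be weighed, here is a hypothetical scoring helper; the factor names and weights are invented for the example, not drawn from any vendor's methodology:

```python
# Invented weights mirroring the factors discussed above; tune to taste.
FACTORS = {
    "large_data_volumes": 2,   # heavy raw data favors processing at the source
    "time_sensitive": 3,       # hard latency budgets favor edge
    "extra_privacy_needs": 2,  # regulated data favors on-premises edge
    "needs_redundancy": 1,     # local fallback if the WAN link drops
}

def suggest_platform(workload: dict) -> str:
    """Return a rough edge-vs-cloud suggestion from boolean workload traits."""
    score = sum(weight for name, weight in FACTORS.items() if workload.get(name))
    return "edge (or edge + cloud)" if score >= 3 else "cloud"

print(suggest_platform({"time_sensitive": True, "large_data_volumes": True}))
# -> edge (or edge + cloud)
```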

While some may waffle between the choices, many customers will easily be able to choose based on their compliance and regulatory requirements. Jessie LaCome, principal technical marketing engineer at Dell, explained, "You've got heavily regulated environments that will never see the cloud." This is because the cloud stores data out of customers' immediate reach, while the edge or more traditional data centers are seen as more secure because of their proximity.

Conversely, those with data-intensive applications requiring a lot of storage will want to opt for, at minimum, utilizing the cloud. This could look like supply chain management systems or CRMs where customers or vendors need to interact with a company's applications. However, CEO of Hivecell, Jeffrey Ricker, and LaCome both warned it could add up quickly due to things like vendor contract lock-ins, data storage, data sent, and time. "The only thing that's constant in a cloud billing right now is that it goes up every month," Ricker said.

Clearly, each computing model holds its own set of perks, and edge is making data storage even more accessible. However, adding a new platform to the mix isn't always that easy.

Because the edge is relatively new, there will be some expected growing pains. For instance, most systems were built to work with either the cloud or edge, leaving customers with a lack of interoperability. "It tends to cost a lot of money and take a lot of time to actually reconfigure these applications," said Vice President of Harbor Research, Harry Pascarella.

Understandably, this raises concerns about whether applications can easily talk to both computing options, leaving customers wondering how reliably the platforms will work together. Pascarella advised that knowing the application requirements before adopting a new solution is of utmost importance.

Thankfully, as more customers demand more solutions, interoperability will inevitably catch up to fully optimize these applications: more devices will become connected and new processes will unfold to help decrease workloads and increase automation.

Much like the cloud before it, edge computing is expected to continue growing at a rapid pace. According to a study from MarketsandMarkets, the edge computing market is expected to grow from $36.5 billion in 2021 to $87.3 billion by 2026, at a compound annual growth rate (CAGR) of 19.0% during the forecast period.
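
As a quick sanity check of that forecast, compounding $36.5 billion at 19.0% for the five years from 2021 to 2026 lands almost exactly on the quoted figure:

```python
# Sanity-check the forecast: $36.5B growing at a 19.0% CAGR for five years.
start, cagr, years = 36.5, 0.19, 5  # 2021 -> 2026
projected = start * (1 + cagr) ** years
print(f"${projected:.1f}B")  # -> $87.1B, in line with the ~$87.3B forecast
```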

For companies that want to use their IoT data to increase customer satisfaction and efficiency, deploying the edge at scale is necessary to properly process that information. "The ratio of raw IoT data to business-relevant data starts at about 400 to one. That means there's at least 400 bytes of raw data for every byte of meaningful data that you should be pushing to the cloud," Ricker explained. However, experts want industries who rely on or are looking for edge solutions to recognize that it isn't a replacement for the cloud, merely a branch of the same tree.

For instance, when machine learning models are configured properly, having the platform filter out the relevant data at the source and then push those figures to the cloud is where organizations are winning today in utilizing both computing solutions. Jason Shepherd, VP of Ecosystem at ZEDEDA, said, "The edge is not going to completely replace the cloud by any means… The edge is the last cloud to build. It's extending the footprint to be more distributed."
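
To see what filtering at the source looks like in miniature, here is a hedged sketch in the spirit of the 400-to-1 ratio above: a window of raw sensor samples is reduced to one small, business-relevant record before anything is pushed to the cloud. The field names and window size are illustrative only:

```python
import json
import statistics

def summarize_at_edge(raw_readings: list[float]) -> bytes:
    """Collapse a window of raw sensor samples into one business-relevant record."""
    summary = {
        "mean": round(statistics.mean(raw_readings), 2),
        "max": max(raw_readings),
        "samples": len(raw_readings),
    }
    return json.dumps(summary).encode()

raw = [20.1, 20.3, 20.2, 35.9, 20.4] * 80  # 400 raw samples from one window
payload = summarize_at_edge(raw)
print(f"pushing {len(payload)} bytes to the cloud instead of {len(raw)} readings")
```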

Ultimately, companies already utilizing the edge are seeing major cost savings, increased security, and quicker action.

The challenge for IT professionals moving forward will be to evaluate the strengths and weaknesses of the edge and the cloud and make an informed decision that best meets their particular needs. One thing is clear, though: we're only on the precipice of growth for the edge.


Kwikset Halo smart-lock security flaw fixed: here's what you need to do – Tom's Guide

The Kwikset Halo smart lock had a flaw in its Android companion app that could let another app on the phone capture login credentials to Kwikset's servers, then use that information to gain control of the smart lock.

This flaw was found by researchers at Bitdefender, who notified Kwikset of it on Nov. 9, 2021. Kwikset fixed the flaw with an Android app update on Dec. 16, 2021.

If you're a Kwikset Halo smart-lock owner or user, make sure your Android app is updated to version 1.2.11. Kwikset's iOS app did not seem to be vulnerable to this flaw, Bitdefender researchers told Tom's Guide.
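
If you want to verify the installed version yourself rather than trust the app store listing, one hedged approach (assuming USB debugging and the Android platform tools) is to read the app's versionName via adb; the package name below is a guess for illustration, so check the one on your own device:

```python
import subprocess

PACKAGE = "com.kwikset.halo"  # hypothetical package name; check your device

def installed_version(package: str) -> str:
    """Read an app's versionName from a USB-debugging-enabled Android device."""
    out = subprocess.check_output(
        ["adb", "shell", "dumpsys", "package", package], text=True
    )
    for line in out.splitlines():
        if "versionName=" in line:
            return line.split("versionName=", 1)[1].strip()
    raise LookupError(f"{package} not found")

def as_tuple(version: str) -> tuple:
    """Compare versions numerically, so that '1.2.9' < '1.2.11'."""
    return tuple(int(part) for part in version.split("."))

print("patched" if as_tuple(installed_version(PACKAGE)) >= (1, 2, 11)
      else "update needed")
```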

The flaw had to do with accessing Kwikset's cloud servers on Amazon Web Services, a Bitdefender report released today (April 6) explained. The credentials to access the servers could be read by other apps installed on the same Android device, the Bitdefender researchers found by using the Drozer app-security-checking tool.

The process wasn't that easy. The malicious app would have to create pointer links that tricked the Kwikset app into exporting the AWS credentials from a protected file into an unprotected file, where the malicious app could then read them.
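
As a generic illustration of this class of link-redirection flaw (not Kwikset's actual code paths, which Bitdefender has not published in detail), the sketch below shows how a privileged writer that blindly follows a pre-planted symlink ends up "exporting" a secret to a location the attacker can read. It assumes a POSIX filesystem:

```python
import os
import tempfile

# A privileged "exporter" blindly follows a path the attacker pre-planted as a
# symlink, so the secret lands in a file the attacker can read. POSIX-only demo.
workdir = tempfile.mkdtemp()
protected_export = os.path.join(workdir, "export.json")  # where the app writes
attacker_file = os.path.join(workdir, "stolen.json")     # where the attacker reads

os.symlink(attacker_file, protected_export)  # attacker plants the redirect first

with open(protected_export, "w") as f:       # victim app "exports" its secret
    f.write('{"aws_credentials": "..."}')

print(open(attacker_file).read())            # attacker reads the secret
```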

Of course, the malicious app would have to be installed by the user on the phone in the first place, but that is not so difficult when hundreds of harmless-seeming but actually malicious Android apps are found in the Google Play app store every year.

The good news is that the Kwikset Halo Android app was otherwise pretty sound. The lock itself, which is on our list of the best smart locks, had no security flaws that the Bitdefender team could find, and neither did the communications between the lock and the paired smartphone.

The Bitdefender team was not able to use a "man in the middle" attack on the lock, crack the lock's encryption, tamper with firmware updates, or steal the Kwikset-account user password, thanks to two-factor authentication being enabled by default.

"The connection can't be intercepted with a man-in-the-middle attack, as the smart lock verifies the validity of the server certificate," Bitdefender researchers said in their paper. "An attacker can't impersonate the camera to the server as they lack knowledge of the client certificate stored on the device's memory."


It’s graphic Fungible composes GPU-as-a-service offering Blocks and Files – Blocks and Files

Fungible has announced GPU-Connect with its DPU dynamically composing GPU resources across an Ethernet network, enabling cloud and managed service providers to offer GPUs-as-a-service.

The startup has specially designed data processing unit (DPU) chip hardware to deal with east-west traffic across a datacenter network, handling low-level all-flash storage and network processing to offload host server CPUs and enable them to process increased application workloads. Fungible DPU cards fit into servers and storage arrays, and now a GPU enclosure, and are connected across a TruFabric networking link, with Composer software dynamically composing server, storage, networking and GPU resources for workloads. The aim is to avoid having stranded resource capacity in fixed physical servers, storage and GPU systems, and to increase component utilization.

Toby Owen, Fungible's VP of product, is quoted in Fungible's announcement: "Fungible GPU-Connect empowers service providers to combine all their GPUs into one central resource pool serving all their clients. Service providers can now onboard new data-analysis-intensive clients quickly without adding expensive servers and GPUs. By leveraging FGC, datacenters can benefit from the collective computing power of all their GPUs and substantially lower their TCO with the reduction of GPU resources, cooling and physical footprint needed."

The Fungible DPU creates a secure, virtual PCIe connection between the GPU and the server, managed by hardware, that is transparent to the server and to applications. No special software or drivers are needed, and Fungible GPU-Connect (FGC) can be retrofitted into existing environments. Fungible research indicates that GPUs typically sit idle while the hosts accessing them digest their results, with average GPU utilization per user around 15 per cent.
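
That utilization figure is easy to measure yourself on NVIDIA hardware. Here is a small sketch that polls per-GPU utilization through the standard nvidia-smi query interface, which is one way to spot the idle GPUs that pooling schemes like FGC aim to reclaim:

```python
import subprocess

def gpu_utilization() -> list[int]:
    """Poll per-GPU utilization (%) via nvidia-smi; requires NVIDIA drivers."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return [int(line) for line in out.splitlines() if line.strip()]

# e.g. [12, 3, 95, 0] -- mostly idle GPUs are prime candidates for pooling
print(gpu_utilization())
```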

FGC includes an FX-108 GPU card chassis, an FC200 accelerator card for the host server, and Fungible's Composer software.

The 4U chassis can house up to eight double- or single-width, PCIe 3 or 4, 16-lane connected Nvidia GPU cards (A2, A10, A16, A30, A40, A100-40G, and A100-80G), plus an optional NVLink Bridge. It has 100GbitE connectivity and supports Top-of-Rack-only or Spine-Leaf network topologies. There can be up to four FC200 cards.

The Composer software runs on physical or virtual x86 host servers.

The system's performance compared to directly connected GPUs is near identical, or even identical within margin-of-error terms, in a ResNet-50 benchmark.

Fungible is the sole DPU supplier combining its accelerator chips and cards with composability across its own network. Intel's IPU is not a composability offering. Neither is Nvidia's BlueField product line nor AMD's Pensando chips.

A person close to Fungible said, regarding Pensando: "The main difference [is] that we have built products based on our DPUs instead of just trying to shove them into servers. Two different approaches… We are more focused on the composable infrastructure and the datacenter as a whole."

The two main composable system startups, Liqid and GigaIO, both use physical PCIe as their connectivity medium. Fungible provides a virtual PCIe facility across Ethernet.

The intent is that tier-2 MSPs and CSPs will use FGC as a base for their own GPUaaS offerings, a hitherto untapped market. Some large or specialized enterprises may use FGC as well. We have been told that Fungible is getting a lot more traction now with partners and customers, and that the FGC platform is exciting a lot of interest. Let's see if Fungible has composed the right product set for the market.


Iveda Solutions Products Included in Taiwan Building Project – MarketWatch

By Chris Wack

Iveda Solutions Inc. said Wednesday that its products have been included in the Taoyuan Aerotropolis Project in Taiwan.

The company said the project is a large national infrastructure plan, consisting of 10 major construction projects beginning this year with completion expected in 2028.

The project will expand the Taoyuan International Airport in Taiwan, as well as trade and industrial zones. The project will prioritize six industries: biotechnology, cloud computing industry, aviation, international logistics, smart automobile and green energy.

Iveda said included in the project are IvedaXpress IP cameras with its cloud management platform, IvedaAI intelligent video search platform for face recognition, object search and thermal detection, and weather and IR sensors for electric fences.

The IvedaXpress cloud management platform that Iveda built at a Chunghwa Telecom data center will generate recurring cloud hosting revenue for five years. The company said it estimates $5 million to $7 million in revenue from the multiyear project.

On Tuesday, the company closed its previously announced underwritten public offering of 1.9 million common shares, and accompanying warrants to buy 1.9 million shares, at $4.25 a share of common stock and accompanying warrant.

Iveda shares were up 23% to $3.76 in premarket trading.

Write to Chris Wack at chris.wack@wsj.com


Two Edge Servers from Inspur Information Win 2022 Red Dot Awards – Business Wire

SAN JOSE, Calif.--(BUSINESS WIRE)--Inspur Information, a leading IT infrastructure solutions provider, was honored to have two products in its Edge microserver portfolio win recognition for excellence from Red Dot Award: Product Design 2022. EIS800 and Swing P101 were both winners, lauded for their clean and compact industrial design, modularity and ease of use.

The Red Dot Design Award is an internationally recognized seal of quality and good design with a history that spans more than 60 years. 48 experts from around the world serve as the Red Dot Jury, which meets yearly to identify the best new entries in Product Design, Brands & Communication Design and Design Concept. Inspur Information competed in the original and most competitive category, Product Design. The Red Dot Jury follows the motto "In search of good design and innovation," and assesses each product individually and comprehensively to identify the entries with the most outstanding design quality.

An adaptable and minimalist package for Edge computing

EIS800 is an intelligent, portable and easy-to-deploy Edge microserver for the digital era. Its specifically designed extensible modules and interfaces achieve high customizability, which allows for rapid deployment across a wide variety of scenarios for intelligent Edge computing. This adaptability, combined with its wide variety of wireless communication protocols including Zigbee, 4G/5G, Wi-Fi/Bluetooth, and GPS, makes it an ideal candidate for nearly any environment or situation. Its structural design is both simple and compact. The housing is manufactured with rugged die-cast and anodic oxidation techniques to provide heat dissipation along with water and dust resistance. The IP65 protection rating and operating temperature range of -40°C to 70°C ensure normal operation in a wide range of harsh edge environments.

An easily modifiable Edge solution

Swing is a highly integrated edge server that improves the computing efficiency of edge AI inferencing scenarios. It can provide more compact and efficient AI computing power for AI teaching assistants in universities, AI software algorithm development, intelligent medical scenarios such as medical image recognition and disease screening, and other applications. Two expansion card slots can be quickly customized with various GPU, ASIC and FPGA accelerator cards. This customization allows optimal functionality in various setup scenarios to quickly complete targeted application inference and computing architecture designs. It features a metallic finish from its anodized surface. The grille-like design is extremely minimalistic, rendering a sleek look that is excellent at heat dissipation. All of these features work together to enhance product development in a clean and simple package.

"We are thrilled to have EIS800 and Swing P101 be recognized for their superior product design by Red Dot," said Park Sun, General Manager of Edge Computing, Inspur Information. "We are excited to introduce the value of these products to customers. The acceleration of digital transformation has created huge amounts of real-time data that needs to be collected and processed in edge environments. An increasing amount of AI processing workloads require more flexible and more distributed solutions. EIS800 and Swing make that possible."

About Inspur Information

Inspur Information is a leading provider of data center infrastructure, cloud computing, and AI solutions. It is the world's 2nd-largest server manufacturer. Through engineering and innovation, Inspur Information delivers cutting-edge computing hardware design and extensive product offerings to address important technology sectors such as open computing, cloud data center, AI, and deep learning. Performance-optimized and purpose-built, our world-class solutions empower customers to tackle specific workloads and real-world challenges. To learn more, visit https://www.inspursystems.com.


How to Unblock Twitter in Russia in 2022 [Avoid the Russian Ban] – Cloudwards

Twitter is a trusted online news and social media platform. As such, its ban in Russia is part of the government's coordinated efforts to stifle the free flow of information and step up the spread of propaganda about the Ukrainian invasion through state-run media outlets. In this guide, we'll show you how to unblock Twitter in Russia to open the door to accurate and independent information.

As a prerequisite, you'll need a virtual private network (VPN) to successfully bypass Twitter geo-restrictions in Russia. However, if you haven't signed up for a VPN yet, fret not. We'll reveal tried-and-tested VPNs that bypass the Twitter ban with ease, so keep reading. If you're just looking for a quick and easy answer, then check out ExpressVPN.

No law in Russia outright bans the use of VPNs within the country, but it's illegal to use VPNs to access blocked content. That said, the Putin regime has blocked popular VPN services for failing to cooperate with its censorship efforts.

ExpressVPN is the best VPN for Russia and the best service for unblocking social media in this restrictive country. For these reasons, we'll use it in our guide to demonstrate how to get around the Twitter ban. Unblocking Twitter in Russia with a VPN is as easy as installing the VPN app, connecting to a server outside Russia and logging in to Twitter as usual.
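
Once connected, a quick scripted check can confirm that Twitter actually answers through the tunnel. This is a minimal sketch using only Python's standard library; the URL and timeout are ordinary assumptions, not anything VPN-specific:

```python
import urllib.request

def reachable(url: str = "https://twitter.com", timeout: float = 5.0) -> bool:
    """Return True if the site answers through the current (VPN) connection."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status < 400
    except OSError:
        return False

print("Twitter reachable" if reachable() else "Still blocked: try another server")
```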

Access to reliable and independent information is a fundamental human right, and no country should bar its citizens from enjoying this right. However, in some restrictive countries such as Russia, that right isn't a guarantee.

The government wants to be on top of the content Russian citizens consume, especially during its war with Ukraine. To achieve that, it has intensified efforts to block social media providers and international news media. The ban is geared toward isolating Russians from each other and the rest of the world, leaving them with no choice but to rely on state-affiliated news outlets for information.

However, while the move to block Twitter works in the government's favor, it's a stumbling block for ordinary Russian citizens, marketers, advertisers and brands in Russia that rely on Twitter.

The Russian authorities began throttling Twitter on Feb. 26, 2022. The government restricted the social media network on leading Russian telecommunication companies, including Beeline, MegaFon, Rostelecom and MTS.

The throttling made the site slow, making it nearly impossible for Twitter users to send tweets. Twitter confirmed the restricted access and said it was looking for the ideal workaround before the situation morphed into a full-scale restriction.

The Twitter restriction came as Russia passed a new draconian law in an attempt to further crack down on protests. Under the new law, independent journalists and citizens caught spreading fake information that contradicts the Russian government's narrative on the Ukrainian war risk a prison sentence of up to 15 years.

Twitter kept its promise to find a way around the ban by embracing the dark web with a Tor service to outfox the Russian authorities. We can't guarantee the effectiveness of the new Twitter feature, but as far as we know, the Tor browser or network isn't as effective as a VPN in these situations.

Keep in mind that bypassing the Twitter ban in Russia isn't only about accessing the social media platform. Besides finding your way around the ban, you have to cover your digital trail just in case the Russian authorities decide to pursue you.

Both Tor and a VPN hide your real IP address and location and can get you into blocked social media networks. However, the Tor network hides who you are but isn't as effective when it comes to safeguarding your digital privacy. By contrast, a VPN emphasizes privacy: It hides your identity, your true location and what you do online. Read our VPN vs proxy vs Tor guide to learn more.

A VPN provides digital security and privacy, allowing you to overcome attempts by the Russian government to strangle the free flow of information on Twitter.

ExpressVPN is the best VPN for Russian users to access any social media platform.

ExpressVPN is the best VPN service out there, and its excellent security and privacy make it a great option to access Twitter in Russia. It has over 3,000 servers in 94 countries, including Russian neighbors such as Norway, Poland and Finland, that you can connect to for optimal performance. Each connection utilizes the virtually unbreakable AES-256 encryption to keep your internet traffic away from prying eyes.

Besides that, there's a kill switch that ensures every bit of your online traffic passes through the encrypted tunnel. The DNS leak protection prevents IP leaks that could tip off the Russian authorities about your real location, whereas the TrustedServer technology ensures the VPN servers wipe user data with every reboot.

ExpressVPN abides by its strict no-logs policy, meaning it would have no data to share if coerced or subpoenaed by the Russian authorities. Read our exhaustive ExpressVPN review for more details. There's also a 30-day money-back guarantee, meaning you can try it risk-free.

NordVPN's Onion over VPN feature lets you route your internet traffic through the Tor network and a VPN for an extra layer of security.

NordVPN matches ExpressVPN's performance in many aspects, except speed. It has over 5,400 servers spread across 60 countries, including some Russian neighbors: Finland, Poland, Latvia and Norway. Moreover, it uses the AES-256 encryption standard and comes with a kill switch and DNS leak protection.

What sets NordVPN apart from the other two VPNs on our list is its suite of advanced security features such as double-hop servers. As the name suggests, the servers route your internet traffic through two servers, adding an extra layer of protection. These specialty servers come in handy if you want to share sensitive information without the fear of government eavesdropping.

In addition, NordVPN has obfuscated servers that let you use a VPN in heavily restrictive environments. It also comes with a strict no-logs policy, and although it has suffered a security breach before, it's still a trustworthy provider. Read our comprehensive NordVPN review for more. It's also a bit cheaper than our top pick, which makes it a better option for those on a budget.

The NoSpy servers are quite adept at bypassing censorship and surveillance in restrictive environments.

CyberGhost is another reputable VPN to unblock Twitter in Russia. Like our first two picks, it has all the basic features, including AES-256 encryption, a kill switch and DNS leak protection. In addition, it has a fleet of over 7,700 servers in 91 countries, including Russia. However, you don't need a Russian IP address to access Twitter in Russia.

What distinguishes it from other VPN services is the NoSpy servers. These are anti-surveillance servers built for use in restrictive countries to ward off third-party meddling and monitoring. Moreover, CyberGhost plays by its no-logs policy and has no history of a security breach. Finally, CyberGhost is one of the most affordable VPNs out there, as you can read in our full CyberGhost review.

We don't vouch for free VPNs because most of them are unreliable. Remember: When it comes to unblocking Twitter in Russia, you need a reliable VPN provider with top-notch security and privacy. In most cases, free VPNs lack the robust VPN features (servers, protocols and encryption) needed to bypass network restrictions.

That said, not all free VPNs are unreliable. For example, our best free VPNs, Windscribe and ProtonVPN, guarantee excellent security and may be an option if you want to access Twitter in Russia. Sadly, they come with usage limits.

With Windscribe, you get 10GB per month, which might be enough if you only want to tweet and read tweets. You also get access to 11 server locations out of 25. ProtonVPN, on the other hand, gives you unlimited free data, but only lets you use servers in three countries.

Peace talks between Russia and Ukraine are ongoing, and we hope the two nations will find a truce in the coming days. As it stands, the ground is becoming hostile to social media platforms. Meta, Facebook's parent company, had its Facebook and Instagram platforms banned for allegedly committing extremist activities.

If the recent censorship spree is anything to go by, then we can confidently say the Russian government isn't going to lift the ban on Twitter anytime soon. For this reason, you have to arm yourself with the best VPN to overcome censorship. We recommend getting started with ExpressVPN, thanks to its excellent security and privacy. NordVPN and CyberGhost are cheaper alternatives.

Have you used a VPN to access Twitter in Russia? Which VPN service did you use? Are you satisfied with the performance of that VPN? We'd like to hear about it in the comment section. As always, thanks for reading.



Amazon, Google or Microsoft? Boeing chooses all of the above for cloud computing services – GeekWire

Boeing engineers huddle over a computer to work on an aircraft design. (Boeing Photo / Bob Ferguson)

The billion-dollar competition to provide Boeing with cloud computing services is finished, and the winner is a three-way split. Amazon Web Services, Google Cloud and Microsoft are all getting a share of the business, Boeing announced today.

In a LinkedIn post, Susan Doniz, Boeing's chief information officer and senior VP for information technology and data analytics, called it a "multi-cloud partnership."

"This represents a significant investment in the digital tools that will empower Boeing's next 100 years," she wrote. "These partnerships strengthen our ability to test a system or an aircraft hundreds of times using digital twin technology before it is deployed."

Doniz said that becoming more cloud-centric will provide Boeing with global scalability and elasticity without having to predict, procure, maintain and pay for on-premises servers.

Financial details relating to the multi-cloud partnership were not disclosed.

Historically, most of Boeing's applications have been hosted and maintained through on-site servers that are managed by Boeing or external partners. You could argue Boeing's extensive intranet blazed a trail for today's cloud computing services.

"Marketing and performing computer services involves a whole new way of doing business," Boeing President T.A. Wilson declared in 1970 when Boeing Computer Services was formed.

In recent years, Boeing has been transitioning from its own aging computer infrastructure to cloud providers. For example, in 2016 the company chose Microsoft Azure to handle a significant proportion of its data analytics applications for commercial aviation.

At the time, that was considered a notable win for Microsoft, but Boeing also has maintained relationships with AWS, Google and other cloud providers.

Some had expected Boeing to pick a primary provider as a result of the just-concluded bidding process. Last year, The Information quoted its sources as saying that the deal could be worth at least $1 billion over the course of several years, and that Andy Jassy, who is now Amazon's CEO, saw it as a must-win for AWS.

But if Boeing is favoring one member of the cloud troika above the others, it's being so careful not to tip its hand publicly that today's announcement consistently lists the three companies in alphabetical order. (If you happen to know who the big winner is, send us a tip.)

Update for 4 p.m. PT April 6: In an interview with Insider, Amazon Web Services' senior vice president of sales and marketing, Matt Garman, discussed Boeing's decision on parceling out the contracts for cloud computing services and said just about what you'd expect an executive in his position to say.

"They're announcing that they're going to have a couple of different partnerships," Garman said. "I think the vast majority of that will land with AWS."

Cloud services aren't the only connection between Amazon and Boeing: In its news release about the Boeing cloud deal, Amazon notes that it has more than 110 Boeing aircraft in its Amazon Air delivery fleet.

The other two cloud titans are also talking: Microsoft noted that it's been working with Boeing for more than two decades, and that today's deal will deepen the relationship. Meanwhile, Google emphasized its efforts to match 100% of the energy powering its cloud workloads with renewable energy, making it the cleanest cloud in the industry.


The key is the cloud: How to keep up with the speed of innovation – CIO

At DISH Network, cloud-adoption strategies vary by when the parts of its business started, from those born in the cloud to legacy sectors deploying cloud on an opportunistic basis. But one thing is clear to Atilla Tinic, the company's EVP and CIO: "I do think the key is the cloud." He added: "The strategy around cloud is not ROI on a case-by-case basis. It's a must if a company wants to stay relevant."

Tinic is among the speakers at CIO's Future of Cloud Summit, taking place virtually April 12-13. Focusing on speed, scale and software innovation, the event will gather technology executives to discuss both strategy and concrete implementation tactics.

The program begins April 12 and will cover aspects of cloud innovation and agility, with Jessica Groopman, founding partner at Kaleido Insights, kicking off the day with a look at three macrotrends reshaping cloud and software innovation. She will also field questions in a live discussion.

Throughout the day, CIOs from leading companies will dive into aspects of their strategy. Christopher Marsh-Bourdon, head of hybrid environments at Wells Fargo Bank N.A., will offer insights on designing hybrid cloud environments for security and flexibility. Shamim Mohammad, EVP and chief information and technology officer at CarMax, will present a case study on how the cloud enables the company's signature Instant Offer feature. Addressing how to maximize the value of every dollar spent in the cloud will be Jennifer Hays, SVP of engineering efficiency and assurance at Fidelity Investments, along with FinOps Foundation Executive Director J.R. Storment.

Michael Riecica, director of security strategy and risk in Rockwell Automation's Chief Information Security Office, will drill into the security aspects of cloud strategy. And hear how the U.S. Federal Reserve System leverages cloud smart strategies from System CIO Ghada Ijam.

James Cham, a partner at Bloomberg Beta, will offer a venture fund perspective on changes to watch in software development, deriving value from big data, and a view into where AI fits in. Cham will also lead a live discussion on how cloud and other technology investments can be used as catalysts for building business value.

Another opportunity for interaction with peers and experts will take place in a workshop on cloud as a business strategy platform led by Kevin L. Jackson, CEO of GlobalNet and the host of Digital Transformers.

Hear from world-class analysts such as Dion Hinchcliffe, vice president and principal analyst at Constellation Research, who will preview what cloud will look like in five years and advise CIOs on how to address the challenges of fast change, while successfully dealing with talent scarcity and technical debt. IDC's Dave McCarthy, research vice president of cloud infrastructure services, will advise on how to navigate a future of digital infrastructure focused on cloud. He will cover application modernization, the best approach for multi-cloud deployments and where to invest in automation. McCarthy will follow up the presentation with a live discussion Wednesday on cloud trends.

Wednesday will focus on shifting into a cloud native future, starting with a tutorial on how to gain and maintain a competitive advantage from champion racecar driver Julia Landauer. Later, she will answer questions about habits and mindsets that drive success.

Priceline Chief Technology Officer Marty Brodbeck will share how the online travel agency sped up its cloud native software production. Meanwhile, Expedia Vice President of Development and Runtime Platform Robert Duffy will discuss how to become a results-driven cloud native organization with Cloud Native Computing Foundation Chief Technology Officer Chris Aniszczyk.

Looking to integrate cloud native apps into a seamless operational platform? Professional Case Management CIO Charlie Billings will share his organization's experience. In another session, Joseph Sieczkowski, CIO for architecture and engineering at BNY Mellon, will discuss cultivating an agile and dynamic operating model.

Finally, in a glimpse at what's to come, learn the hottest cloud-native software development trends from InfoWorld senior writer Serdar Yegulalp and Group Editor for UK B2B Scott Carey. In addition, Yegulalp will present a non-technical introduction to Kubernetes and best practices for managing container-based applications at scale.

Throughout the summit, sponsors including Cloudera, Freshworks and others will share innovative solutions for building your cloud strategy.

Check out the full summit agenda here. The event is free to attend for qualified attendees. Don't miss out; register today.

Pictured above (left to right): Ghada Ijam, System CIO, Federal Reserve System; Atilla Tinic, EVP, Chief Information Officer, DISH Network; racecar driver Julia Landauer.


Cloud or Mainframe? The Answer is Both – IBM Newsroom

Cloud or Mainframe? The Answer is Both

By John Granger | Senior Vice President of IBM Consulting

April 06, 2022

To respond to the ongoing pressures of the global pandemic, businesses around the world have turbo-charged their digital transformations. Everywhere you look, companies face an acute need for speed to market, flexibility, nimbleness and, of course, ongoing innovation.

These priorities are why companies are looking to take advantage of cloud computing. But it is not straightforward; it's not just the hop to public cloud. Clients have issues of security and data gravity, of complex systems that are expensive to migrate. Strategically, they have concerns about optionality, about lock in, about discovering that their cloud providers have just become their competitors. These realities explain why so few clients have made a wholesale move to cloud.

The unique needs each company faces in their business transformation journey require a diverse mix of applications and environments, including traditional data centers, edge computing and SaaS. What is the role of the mainframe in today's IT infrastructure?

According to a recent IBM study*, the vast majority (a whopping 71%) of IT executives surveyed from major corporations across seven industries say critical mainframe-based applications not only have a place in their IT platforms today but are central to their business strategy. And in three years, the percentage of organizations leveraging mainframe assets in a hybrid cloud environment is expected to increase by more than two-fold. Four of five executives say their organizations need to rapidly transform to keep up with competition, which includes modernizing mainframe-based apps and adopting a more open approach to cloud migration.

A hybrid cloud approach that includes and integrates mainframe computing can drive up to five times the value of a public cloud platform alone, and the main sources of value fall in five categories: increased business acceleration, developer productivity, infrastructure efficiency, risk and compliance management, and long-term flexibility. With the billions of dollars our clients have invested in business-critical mainframe applications like financial management, customer data and transaction processing over the years, this strategy holds true for IBM's global consulting practice. Our clients' primary goal is to modernize those existing investments and minimize risk while delivering hybrid cloud innovation.

Digital transformation is not an either-or process. We guide our clients on the application modernization journey with these key recommendations:

First, adopt an iterative approach. Many enterprises are experiencing firsthand the complexity of their IT estates. Continuing to add to the existing vertical cloud silos is undercutting their flexibility by making processes related to development, operations, and security even more fragmented than before, and cloud fragmentation makes it virtually impossible to achieve the standardization and scale that cloud promises to deliver. Therefore, part of your plan to integrate new and existing environments must factor in your industry and workload attributes to co-create a business case and road map designed to meet your strategic goals. Adopt an incremental and adaptive approach to modernization rather than a "big bang." Leverage techniques such as a coexistence architecture to gradually make the transition to the integrated hybrid architecture.

Then, assess your portfolio and build your roadmap. To understand your desired future state, first assess your current state. Examine the capabilities that define the role of the mainframe in your enterprise today and how those capabilities tie into the greater hybrid cloud technology ecosystem. In addition, take stock of your existing talent and resources and determine any potential gaps. For IBM's consulting business, the partnership and role that IBM Systems plays is fundamental for the simple reason that solutions such as the new IBM z16 perform many of the critical functions underpinning a truly open and secure hybrid cloud environment. These functions include accessing troves of unstructured on-premises data across a hybrid cloud platform, scaling and automating data-driven insights with AI, and being agile enough to process critical apps and data in real-time, all while assessing security risks. Storing data across multiple clouds and moving it between partners and third parties can leave companies more vulnerable to security issues such as data breaches. Assessing infrastructure solutions that support the ability to protect data even when it leaves your platform is crucial.

Finally, leverage multiple modernization strategies and enable easy access to existing mainframe applications and data by using APIs. This means providing a common developer experience by integrating open-source tools and a streamlined process for agility, in addition to developing cloud native applications on the mainframe and containerizing those applications.
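
As a sketch of what "easy access by using APIs" can look like from the distributed side, here is a minimal Python client calling a REST facade (such as one exposed through z/OS Connect) in front of a mainframe transaction. The endpoint URL, payload shape and field names are hypothetical placeholders:

```python
import json
import urllib.request

# Hypothetical REST facade (e.g., exposed via z/OS Connect) in front of a
# mainframe transaction; the URL and payload fields are illustrative only.
ENDPOINT = "https://api.example.com/accounts/balance"

def mainframe_balance(account_id: str) -> dict:
    """Invoke a mainframe transaction through its REST API facade."""
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps({"accountId": account_id}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

print(mainframe_balance("0042-1701"))
```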

IT executives expect significant usage increases in both mainframe (35%) and cloud-based applications (44%) over the next two years.

Consider how you can extract more value from both your mainframe and cloud investments. Blending mainframe power, reliability and security into the cloud landscape is essential to achieve the enterprise-wide agility and capability required to keep pace with ever-changing business needs.

* Study Methodology: Oxford Economics and the IBM Institute for Business Value surveyed 200 IT executives (for example, CIOs, Chief Enterprise architects) in North America.
