
Chicago on a Revenue Roll From Cloud and Netflix Taxes (1) – Bloomberg Tax

Chicago's bold decision in 2016 to tax two features of the digital economy is reaping significant rewards, while the city's more conventional tax programs languished during the pandemic.

Revenue from Chicago's one-of-a-kind tax on cloud computing is about four times what it was five years ago, bringing in nearly $120 million last year, according to figures provided to Bloomberg Tax by the city's Office of Budget and Management. The city's tax on streaming entertainment has more than tripled over that period as consumers ramped up their subscriptions to services such as Netflix for video, PlayStation for gaming, and Spotify for music. The city collected more than $30 million in taxes on streaming services for the current year.

The two taxes are small slivers of the $1.5 billion in local taxes projected for the city during FY 2022, but they are a rare bright spot across Chicago's menu of business, hotel, restaurant, recreation, phone, and utility taxes, said Laurence Msall, president of the Civic Federation, a tax and fiscal policy think tank.

The so-called cloud tax and the Netflix tax were particularly effective revenue sources for Chicago during the pandemic, when collections linked to lodging, transportation, and conventions took a nosedive. The city deserves credit, Msall said, for developing innovative tax regimes addressing digitally delivered services despite statutory limitations.

Chicago's unique approach using a lease transaction tax and amusement tax has allowed it to tap into streams of revenue that wouldn't be accessible to a municipality when you consider the sales tax laws of Illinois, Msall said.

The Chicago Department of Finance startled many six years ago when it reinterpreted two long-standing tax programs. Under Ruling #12, the city expanded the Personal Property Lease Transaction Tax, previously imposed on automobiles, business equipment, and other leased items, to nonpossessory computer leases. The tax extends to a wide range of computing models, including platform as a service (PaaS), infrastructure as a service (IaaS), and software as a service (SaaS).

The new tax, at 5.25%, took effect in 2016. The rate ramped to 7.25% in 2020 and 9% in 2021 as the city searched for additional tax dollars.
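The rate schedule above can be sketched as a simple lookup. This is illustrative only: the rates and effective years are taken from the article, but a real Chicago lease-tax computation involves exemptions and sourcing rules not modeled here.

```python
# Rate schedule from the article: 5.25% at launch (2016),
# 7.25% from 2020, 9% from 2021.
LEASE_TAX_RATES = {2016: 0.0525, 2020: 0.0725, 2021: 0.09}

def lease_tax(charge: float, year: int) -> float:
    """Return the tax due on a cloud-service charge for a given year,
    using the most recent rate in effect."""
    applicable = [y for y in LEASE_TAX_RATES if y <= year]
    if not applicable:
        return 0.0  # the tax did not exist before 2016
    rate = LEASE_TAX_RATES[max(applicable)]
    return round(charge * rate, 2)

print(lease_tax(1000.00, 2016))  # 52.5
print(lease_tax(1000.00, 2020))  # 72.5
print(lease_tax(1000.00, 2021))  # 90.0
```

On a $1,000 monthly cloud bill, the rate increases alone nearly doubled the tax due between 2016 and 2021.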

At the same time, the department issued Amusement Tax Ruling #5. While the city had for many years applied the 9% tax to tickets for recreational activities and theatrical performances, the ruling extended the levy to amusements that are delivered electronically.

While a handful of states and localities around the country have extended their sales taxes to cloud computing and streaming, Illinois law doesn't allow that application of sales taxes. Chicago used its substantial home rule authority to find another way through lease and amusement taxes. That makes it the only large municipality with this approach.

Creativity is paying dividends, according to data provided by Chicago's Office of Budget and Management.

The Netflix tax netted $9.4 million in 2017, its first full year of service. Revenue climbed to $28.8 million in 2020. Collections for the year ending June 30, 2021 totaled $31.2 million. In total, 47 businesses are registered to collect and remit taxes on electronically delivered amusements.

The cloud tax netted $30.5 million in 2017, its first full year of service. Revenue growth has galloped since then at annual rates of more than 40%, bringing the city $73.7 million in 2020 and $117.2 million for 2021.

Rose Tibayan, a spokeswoman for the city's Office of Budget and Management, attributed the rapid growth to the two rate increases and climbing rates of compliance. In total, 1,098 businesses are remitting taxes on leases of cloud products.

Compliance with both taxes has accelerated due in part to Chicago's active enforcement efforts and a recent statement on economic nexus.

In January the Finance Department issued a bulletin saying it expects out-of-state companies doing business in Chicago and meeting new economic presence, or nexus, standards to fully comply with the two taxes on a prospective basis, beginning July 1. The bulletin included a legal analysis of the tax programs in the context of the U.S. Supreme Court's 2018 ruling in South Dakota v. Wayfair, which permitted states to impose tax collection obligations on remote retailers based on economic activity rather than physical presence.

The guidance also outlined a safe harbor for small businesses selling entertainment and computing services in Chicago. Businesses with less than $100,000 in revenue from Chicago customers during the previous year aren't expected to comply.
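The safe-harbor test described above reduces to a threshold check. The sketch below is a hypothetical simplification; the parameter names are invented and the bulletin's actual rules for measuring Chicago revenue are more detailed.

```python
# Safe-harbor threshold from the bulletin: $100,000 in revenue from
# Chicago customers in the prior year.
SAFE_HARBOR_THRESHOLD = 100_000

def must_comply(prior_year_chicago_revenue: float,
                has_physical_presence: bool) -> bool:
    """A vendor with physical presence in the city complies regardless;
    a remote vendor complies only at or above the economic-nexus
    threshold."""
    if has_physical_presence:
        return True
    return prior_year_chicago_revenue >= SAFE_HARBOR_THRESHOLD

print(must_comply(50_000, False))   # False: under the safe harbor
print(must_comply(250_000, False))  # True: economic nexus applies
```

A remote streaming vendor with $50,000 in prior-year Chicago revenue would fall inside the safe harbor; one with $250,000 would be expected to register and collect prospectively.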

It's too early to forecast the full impact of the guidance, but the overall response has been positive and some businesses have, indeed, become more willing to come into compliance after reviewing the bulletin, Tibayan said in an email.

Even as collections and compliance ramp higher, both tax programs remain controversial. The Netflix tax has been dogged by litigation since it was first imposed.

In 2015 a libertarian legal advocacy group known as the Liberty Justice Center challenged the tax on constitutional grounds on behalf of Chicago users of Netflix, Hulu, and Spotify. The Entertainment Software Association filed a similar challenge in 2017, but later dropped the litigation. Chicago survived the Liberty Justice Center's challenge, winning trial court and appeals court rulings. In March 2020 the Illinois Supreme Court declined to review the appeals court ruling, which found nothing improper in Chicago's program.

Despite this precedent, tech giant Apple Inc. is pushing hard in a separate challenge to the Netflix tax in Cook County Circuit Court. The next hearing in the case is tentatively slated for December on Chicago's petition for dismissal.

Sony Interactive Entertainment LLC, the gaming division of Sony Corp., objected to the tax on streaming services, claiming it lacked nexus with the city. The company ultimately surrendered and began collecting the tax in 2018. Sony also paid the city $1.2 million under the terms of a confidential settlement.

With respect to the cloud tax, Chicago has been particularly aggressive with its audits, initially focusing on vendors and later targeting consumers of cloud services, said David Hughes, a state and local tax partner with HMB Legal Counsel.

This is not a function of the fact that we happen to be a Chicago law firm and we're seeing so many Chicago lease transaction tax audits, Hughes said. I think the city is just being really aggressive in enforcing this. We've seen it in other jurisdictions, but not to this degree.

Despite the economic nexus bulletin, significant questions remain about Chicago's position on taxes and penalties for prior years. Those issues are frequently debated during audits and the city's views are being applied inconsistently, said Samantha Breslow, a state and local tax associate at HMB.

There is some question about whether a vendor with no physical presence in the city has or had an obligation to collect prior to July of this year and if they did, how far back does that collection obligation go, she said.

Chicago's deputy corporation counsel for tax, Wes Hanscom, said the safe harbor in the bulletin, waiving compliance for remote vendors with less than $100,000 in revenue from Chicago customers, resolved many of the concerns voiced by taxpayers in recent years.

With a safe harbor, we were basically saying, If you are under that threshold, you don't have to worry about it and we don't have to worry about it, Hanscom said.


Creatio Partners with Whale Cloud to Accelerate Digital Transformation for Telcos – PRNewswire

BOSTON, Nov. 5, 2021 /PRNewswire/ -- Creatio, a global vendor of one platform to automate industry workflows and CRM with no-code, today announced its strategic partnership with Whale Cloud (formerly ZTEsoft), a leading technology company providing software solutions and services for telecommunications and multiple other industries, to accelerate digital transformation for telco organizations worldwide by offering customers a superior experience across the entire customer journey. Whale Cloud is a China-based company that provides cloud, analytics, and AI-based software solutions to telecom operators, industrial enterprises, and government sectors.

The new alliance will leverage Whale Cloud's 20+ years of experience in the market, its expertise in enabling business and operational innovation for customers in 80+ countries, and Creatio's award-winning offering. The new partners believe that the combination of their strengths will empower organizations to increase the bottom line and grow substantially through rapid workflow automation, operational excellence, and streamlined customer relationship management.

"Our core focus is to enable organizations globally with the enterprise-ready no-code platform that allows both IT and business users to automate workflows and create apps in a blink of an eye. With this partnership we are aiming to combinecompelling digital experience and expertise from both sides to provide customers worldwide with cutting-edge solutions and top-notch services," said Alex Donchuk, Senior Vice President, Global Channels at Creatio.

Whale Cloud aims to fast-track the digital transformation of the telecom industry and extend the benefits of this transformation across industries and marketplaces, helping service providers, enterprises, and governments create massive value in the digital economy.

"Two is better than oneby cooperating with Creatio, we are able to expand our solution portfolio and serve our customers better than ever. Seamless experience, reduced friction, and increased revenue can be achieved by this strategiccollaboration," said Steven Cho, Chief MarketingOfficer at Whale Cloud international.

About Creatio

Creatio is a global vendor of one platform to automate industry workflows and CRM with no-code and a maximum degree of freedom. Millions of workflows are launched on our platform daily in 100 countries by thousands of clients. Genuine care for our clients and partners is a defining part of Creatio's DNA.

About Whale Cloud

Whale Cloud Technology Co., Ltd ("Whale Cloud") is a leading technology company specializing in telecom software development and delivery, cloud computing, big data analytics, AI-enabled service operations, IoT, smart city solutions, and other professional services including planning and consulting. Founded in 2003, the company provides services to various market segments including telecom operators, governments, and enterprises around the world. Formerly known as ZTEsoft, Whale Cloud received investment from China's largest technology and eCommerce giant in 2018. At present, Whale Cloud's business scope extends from telecom markets to vertical industries. It has built its core competitiveness in communications software, operation services, cloud computing, big data analytics, AI, and Internet architecture. For more information, please visit https://online.iwhalecloud.com/

Media Contact: Vera Mayuk, +1 617-765-7997, [emailprotected]

SOURCE Creatio


Riding the wave of edge computing growth – ITProPortal

According to the IMARC research company, the global edge computing market produced robust growth during 2015-2020. So, what's driving this growth and where is it taking organizations in their IT strategies?

Edge computing is not a new phenomenon, but more organizations are exploring the concept because of how it can solve network computing challenges that are themselves growing. The key here is how edge computing reduces the number of processes running on the cloud by moving them to local devices. These can be a user's computer, an IoT device, or an edge server.

By reducing the volume of long-distance communication between the client and the cloud server, there is a noticeable reduction in latency and congestion and an improvement in process efficiency. This translates into significant business outcomes such as lower bandwidth use, associated costs, and server resource demands. At the same time, edge computing creates space for more real-time data analysis on edge devices, with applications in AI, automation, and machine learning.

Over the same period that edge computing has become better known, companies across all markets have been undergoing digital transformation programs that also focus on process efficiency, among other things. This is where edge computing plays a crucial role: it is a significant enabling force, breaking down data center walls and pushing cloud capabilities outward to the edge. In this respect, edge needs to be treated as part of the hybrid cloud infrastructure.

There are some other factors that are making edge computing more mainstream. After GDPR came into force in 2018, data sovereignty and security measures also became major focus points. Managed edge services help companies maintain regulatory compliance while pursuing better customer experiences. Edge computing can be used in a number of industries to support GDPR compliance, since personal data is processed locally.

The momentum for edge computing is not slowing down. IMARC forecasts the global edge computing market will grow at a CAGR of 30 percent from now until 2026. There are several factors for this.

One is of course how the pandemic has shaken up the world of work so that increasingly more people are working from home and more students are learning remotely. According to research from the ONS, 85 percent of homeworkers would like to continue to use a hybrid working pattern going forward.

In effect, this means edge computing aligns with how workers will work from home or remote offices and switch between these and central offices. The role for edge computing here is to provide the intelligence and always-on security that help companies seamlessly transition from traditional workplaces to a more mixed and fluid arrangement between office and remote settings.

Edge can be deployed on premises and privately, and is highly scalable at low cost, which makes it suitable for adoption at production plants and healthcare facilities. Within the healthcare sector, edge computing allows patient data to stay close to the source by restricting the movement of data, reducing the risk of privacy breaches. This can also be personalized, so that only specific pieces of data are shared with third parties, and each device can have device-specific security protocols, further protecting private patient data. Moreover, security breaches of edge computing devices can be isolated and contained, meaning the entire network will not be threatened.

Equally, there is huge potential for the technology to be further leveraged in augmented reality and virtual reality applications, industrial automation, and telecommunications. Using edge computing in those ways could enhance user experience by limiting lag times, leading to faster responses.

There is an increasing number of use cases where edge computing is critical because of its requirements for low-latency processing. And don't forget, edge computing was developed thanks to the exponential growth of IoT devices, which generate enormous amounts of data during the course of their operations. There is a positive correlation between the IoT and edge computing markets. According to research from McKinsey, companies will continue to invest in IoT technology, and these investments are projected to grow at 13.6 percent per year until 2022. As such, with the continuing growth in IoT investments, it is predicted that investments in edge computing will also continue to rise.

Edge computing has particularly grown over the last year as it provides the ability to optimize and extend the capability of cloud computing by bringing computation and data storage closer to the devices where it's being gathered. As such, this allows for greater speed, lower latency, and better security for the end-user. This is more secure for both the business and their customers; it gives them the reassurance of business continuity - a key technology to futureproof the new workplace environment.

5G and IoT adoption will continue to expand, which means an influx of data needing to be processed in centralized cloud computing and storage solutions. Normally, businesses would struggle with latency, bandwidth, and security issues. However, cloud adoption becomes a viable option thanks to edge computing. This brings computation and data storage closer to the IoT device, rather than relying on a central location that can be thousands of miles away. In this way, data will become more secure and not suffer from network latency concerns that can affect an application's performance. As such, edge computing can optimize IoT applications, in particular ones that require real-time actions. The convergence of 5G, IoT and edge computing creates fast networks with heightened security. As the pandemic will continue to have an impact, there will always be a need for innovation and digitization.

Furthermore, companies can become more time- and cost-efficient by having processing done locally, which minimizes the amount of data that needs to be handled in a centralized or cloud-based location. This lowers operational costs for businesses, as edge computing devices can usually be deployed at a lower cost and reduce the amount of storage needed. Also, with data closer to the end-user, efficiency improves.

Edge computing is here to stay and will continue to evolve. As organizations review their options, channel partners can play a pivotal role in providing and supporting solutions that aggregate the endpoint devices, software and cloud resources needed to maximize the benefits of greater adoption of edge computing.

Stephen Nolan, Senior Vice President, Tech Data EMEA


Analysing cloud providers’ infrastructure management the bank perspective – Finextra

Speaking with two banks with almost opposite roles and histories, we try to understand how cloud plays a role in their overarching technology strategies, and the challenges that cloud migration has thrown in their path.

Gordon Mackechnie, chief technology officer at Deutsche Bank, explains that the bank has effectively three partners for cloud use: Microsoft is used extensively on the end-user side, Google is used as the strategic public cloud partner and for the bulk of the bank's migration, and finally, for databases that will remain on premises in the private cloud, the bank signed an agreement with Oracle to migrate the bulk of its Oracle Database estate to the Oracle Exadata Cloud@Customer.

We are absolutely multi-cloud, but we're not multi-cloud for the same purpose, Mackechnie explains. We want to take best of breed in each instance, and so we will have multiple cloud providers that we don't use simultaneously for the same type of tasks. We think this is important.

Each solution has its strengths in different areas and more than delivers on them.

The app-only Minna Bank, which officially began operations in May, boasts an in-house core banking system built by Zerobank Design Factory and Accenture, and is the first bank in Japan to run its core banking system on a public cloud, Google Cloud. Not only is the core banking system running Minna's retail operations; it will also be made available to third parties who wish to offer discrete embedded finance offerings or to run comprehensive branded banking services.

CIO of Japan's Minna Bank, Masaaki Miyamoto, also lists a swathe of other cloud providers being leveraged by the bank, including Google Cloud, Azure, AWS, Oracle Fusion, DataDog, Salesforce, and PagerDuty.

Understanding the context for cloud migration

Explaining that there has been a range of factors driving the financial services industry's acceptance of cloud adoption, Mackechnie highlights that the investments being made, particularly by large cloud providers, are significant.

Looking at the alternative of building large-scale shared platforms internally, the economics just don't make sense.

This type of project is typically more expensive and more difficult to do than initially expected. Second, he adds that financial institutions can't compete with the level of investment that Google or Microsoft or Amazon are making to their cloud platforms.

Importantly, these projects aren't points of commercial differentiation for incumbents. It's not as if we're going to build a shared service platform internally that's going to give us a benefit as an organisation. This removes the incentive to own anything or to self-develop.

He furthers that it's important to understand the history of the ecosystem of suppliers in which banks have typically existed. Banks never built their own database software, but relied on software and infrastructure providers to provide these services.

In many cases we already have a complex supply chain and providers that we use to kind of build the applications that we run the bank on. The cloud is just an evolution of that, rather than a complete revolution.

Getting cloud strategy right from the outset

Now that the industry recognises the importance of cloud use, banks are looking to expedite their migration strategies in ways that will provide both scale and security quickly.

Straight to the point, Mackechnie states that the key mindset that banks should adopt is to go after this migration plan with strong intent.

There's no point in doing this if you're just going to play around at the edges.

He recommends examining the more difficult problems up front, because while it is possible to shift certain smaller, more discrete services and operations onto the cloud, in order to get the true benefit that cloud can provide, tinkering around the edges isn't going to deliver the progress you desire.

He qualifies this by stating that it is essential to identify the areas where real value will be delivered by shifting to the cloud, and to lead with these.

If you see the cloud as effectively an infrastructure or an infrastructure cost play, it doesn't necessarily make clear the added value potential to business processes. We're focusing on areas where we see incremental value which can only be achieved in the cloud.

Managing the cloud provider relationship

Minna Bank, while leveraging the services of numerous providers, tends to have a closer relationship with its core-banking cloud provider Google Cloud.

We work very closely and have lots of support from Google Cloud, we're in constant contact with their support members, and the Google Cloud team knows how our system works too.

The Minna Bank team also schedules a monthly meeting with Google Cloud to share any error reports or new technologies available through the Google Cloud services which could assist their offering.

Aside from protocol which would involve close assistance with Google Cloud should any critical errors arise, Minna Bank only contacts its other cloud providers when need be.

Miyamoto adds that given its BaaS project currently under construction, it will need API connections to be able to deliver the offering at scale and with the ability to cater to high volumes.

Clients using our APIs don't always notify us ahead of launching a marketing campaign, so we don't always know whether our servers are prepared for high volumes of new customers. We need to make sure our servers can withstand the load. To scale up our servers using cloud, Google Cloud is the only provider who can make this possible.

When it comes to what banks should look out for in robust cloud infrastructure management strategies, Miyamoto explains that one factor (among many) is the ability to monitor services efficiently.

It's very challenging to monitor all cloud services in one place. Usually core banking services reside in a data centre, so that when a problem occurs, you just need to visit the data centre to understand what is going on. But on a cloud server, there isn't anyone who can contact you to tell you there is a problem. It's really important to know what has happened in real time, so that people can actually resolve issues quickly.
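The single-pane monitoring problem Miyamoto describes can be sketched as a simple status aggregator. The service names and statuses below are invented for illustration; a real setup would poll each provider's own monitoring or status API rather than a hand-built dictionary.

```python
# Minimal sketch: collapse health reports from several cloud services
# into one list of alerts, so a problem surfaces in a single place.
from datetime import datetime, timezone

def summarize(statuses: dict[str, str]) -> list[str]:
    """Return a timestamped alert line for any service not reporting 'ok'."""
    now = datetime.now(timezone.utc).isoformat(timespec="seconds")
    return [f"{now} ALERT {name}: {state}"
            for name, state in statuses.items() if state != "ok"]

# Hypothetical poll results from three providers
polled = {"google-cloud": "ok", "aws": "degraded", "datadog": "ok"}
for line in summarize(polled):
    print(line)  # one alert line for the degraded service
```

In practice this is what managed observability tools do at scale; the point is simply that cloud-hosted services demand an active aggregation layer, where an on-premises data centre let operators walk over and look.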

What risks come with poor cloud infrastructure management?

First and foremost, Mackechnie argues, safety and security of data is the critical consideration that needs to be addressed in any cloud execution process. With anything that is still relatively new and developing, we must be careful about ensuring that at each stage, we manage the risks.

In banking these risks typically manifest themselves as security, stability, and operational resilience risks. He furthers that in the same way cloud providers invest significantly in the functionality of their platforms, they also invest heavily in operational resilience and security because, much like banks, security presents something of an existential threat for cloud providers, who must be able to demonstrate and maintain this high level of security to operate in the highly regulated financial space.

That isn't to say that it can't be done safely, but I think we have seen instances of people having problems in the past, maybe because of a lack of experience and understanding, maybe a lack of configuration, so we must be careful we don't make those mistakes. We have to be very careful to manage those risks effectively as we adopt new solution types like the cloud.

Resolving inherent challenges of cloud use

Miyamoto explains that a key challenge faced by Minna Bank is tied to management of cloud servicing protocol. He isn't concerned in a material sense about outages or cloud services going down per se, as Minna Bank has designed its systems to continue functioning even if one of its cloud services goes down.

It builds this reliability by separating its servers: Minna Bank holds its data on Google Cloud servers located on both the east and west sides of Japan. This guarantees availability so that business can continue in the unlikely case one of these centres goes down.

However, a challenge it is yet to entirely resolve is managing maintenance periods.

As a cloud infrastructure, the most important thing remains the account-end administration management, security, and operation management. However, because cloud services have to stop their servers for periodic maintenance, there is naturally a period of downtime. When that happens, our services also have to stop from anywhere between a few seconds to a few minutes.

To manage this, Minna Bank tries to control when these maintenance downtimes will occur and prepare our services accordingly. We not only have to negotiate with team members in our office and with the cloud providers, we also have to give notification to our customers, and all these updates cost a lot of money.

Ideally, Miyamoto explains, this would be solved by automating every process. Even then, as cloud providers onboard more and more services, Minna Bank has to spend more to continue operating and to manage the costs incurred through maintenance.

Are regulators on the right path for effective cloud use?

Insofar as operational resilience, Mackechnie believes that cloud definitely has a part to play, and Deutsche Bank is managing these resilience requirements as they pertain to cloud use.

We retain full accountability for that resilience, and we are working with the cloud providers to make sure that it's happening in a way that would meet our regulatory obligations.

He explains that the cloud providers are very focused on these regulations too, and that they recognise that if they wish to be material players in supporting financial services, they must be able to meet regulatory expectations.

Regulators are also on a journey, continuously evolving their approach to cloud use, but it's challenging because a cloud provider is effectively a combination of things [] To some extent cloud providers are a hardware provider, to some extent they're an open source service provider, and to some extent they're a software provider.

Mackechnie adds that while the regulator historically looked at these providers separately, with different levels of oversight depending on the service, they are facing a new challenge today: As you start to conflate these things you think, how do those different regulatory approaches come together in a way that is sensible and effective for regulating an activity that's taking place in the public cloud?

According to Miyamoto, Japanese regulators have not ruled out the use of public cloud in banking systems. Rather, they encourage banks to evaluate cloud vendors as potential partners and promote the use of the cloud in line with the specific needs of their business and systems. Banks understand that without cloud use, they won't be able to compete with emerging companies simply by maintaining systems that have been entrenched for years."

How quickly should banks finalise their cloud migration plans?

While there is momentum behind shifting to cloud, Mackechnie argues that despite the increasing prevalence of digital players like Minna Bank which are entirely cloud native, there is not yet a pressure for incumbents to finalise their cloud migration.

If you're starting a bank from scratch today, would you build it all natively in the cloud? I'm sure you would. However, you've got to recognise the sheer scale of 50 years' worth of infrastructure, learning, and business logic that has been built in the systems of the larger incumbents.

There's probably a tipping point somewhere down the line, where the cost of carrying a hybrid model becomes a bit painful, but we're a long way from that yet.

He notes that there are still material reasons to maintain on-premises infrastructure at reasonable scale, which will continue for the foreseeable future. These include security, data elements, regulatory elements, online transaction processing, and certain low-latency functional elements (tied to trading activity) that would be very difficult to move in any way.

On top of this sits the inhibitor of the substantial investment required to migrate properly. To get the benefit you have to re-architect properly; therefore you're picking and choosing where to make those investments so they have the most tangible impact.

I don't think there is pressure to finalise, rather, I think right now there's pressure to get it right. The risk profile of this is that we have to be absolutely certain that we're taking the right steps and we're taking those steps in a safe and secure way. So I think that's more the pressure on that than there is to actually finalise.


MemVerge takes Big Memory to the cloud – TechTarget

MemVerge is taking its flagship memory product, Big Memory Machine, to the cloud, giving users the ability to efficiently move data-intensive applications.

Founded in 2017, MemVerge is an in-memory computing vendor. Its main product is Big Memory Machine, a software layer that enables users to create a shared pool of memory across multiple servers, said Eric Burgener, a senior analyst at IDC. He likened it to how VMware virtualizes compute, networking and storage.

The product uses Intel Optane persistent memory modules -- initially released around the time of MemVerge's founding -- under its covers to provide users with some flexibility to create a pool that might be terabytes of memory, bypassing the normal capacity limitations in traditional RAM, he said.

In the cloud, users can access features that enable them to use temporary instances without losing data and cloud bursting without interruptions. MemVerge's Big Memory achieves this through AppCapsules, a way of containerizing the application through integration with Kubernetes and its existing ZeroIO Snapshots. AppCapsules provide portability from on premises to cloud and between cloud environments.

Big Memory Cloud is interoperable with the major public cloud providers -- AWS, Microsoft Azure and Google Cloud Platform -- as it sits on top of the cloud and requires no integration.

Companies are increasingly moving to hybrid and multi-cloud environments, and applications need to be built to take advantage of the cloud's agility, flexibility and scalability, according to Charles Fan, CEO and co-founder of MemVerge.

For some applications, moving from an on-premises to a cloud environment has been seamless. For others, including applications that require a server to retain data and to scale easily -- such as those that perform artificial intelligence or video rendering tasks -- the story has been more complicated.

Data services like snapshots only work on persistent storage, which would be of no use to workloads running in memory, according to Burgener. If there is a process or server failure, recovery is time intensive or users need to start over.

ZeroIO Snapshots use persistent memory as opposed to storage; in other words, zero I/Os are sent to storage. Users can refer to the recent snapshot of memory, potentially saving hours on recovery.

"ZeroIO is a memory snapshot [that] captur[es] a running application state and encapsulates it," Fan said.

These snapshots are placed into AppCapsules, Fan said. From there, they can be loaded, replicated, recovered and transported across clouds.

"To move an application from one cloud to another, like Google to Azure, users have to understand all of the storage pieces related to the application," Burgener said. "AppCapsules make this easier by figuring out the underlying persistent storage needed and moving it all to a new environment."

The technology may sound similar to that of vMotion, a VMware product that enables live migration of running VMs from one server to another. But, Fan clarified, Big Memory Cloud enables users to move a point-in-time snapshot of applications. After the move, a user can use the AppCapsule to restart the application where it stopped before it was migrated.
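The restart-where-it-stopped idea can be sketched in miniature. The following is a hypothetical, file-backed analogy -- the class, function, and file names are invented, and this is not MemVerge's API: ZeroIO snapshots persist memory pages directly at the memory layer, whereas this toy version simply serializes a job's in-memory state so it can be reloaded and resumed elsewhere.

```python
import os
import pickle
import tempfile

class InMemoryJob:
    """Toy stand-in for a long-running, memory-resident workload."""
    def __init__(self):
        self.step = 0
        self.accumulator = 0

    def advance(self):
        self.step += 1
        self.accumulator += self.step

def snapshot(job, path):
    # Capture the job's full running state. A real memory snapshot
    # persists pages directly rather than serializing objects.
    with open(path, "wb") as f:
        pickle.dump(job.__dict__, f)

def restore(path):
    job = InMemoryJob()
    with open(path, "rb") as f:
        job.__dict__.update(pickle.load(f))
    return job

capsule = os.path.join(tempfile.mkdtemp(), "job.capsule")
job = InMemoryJob()
for _ in range(5):
    job.advance()
snapshot(job, capsule)

# Simulate a crash or a migration to another environment: rebuild the
# job from the capsule and continue exactly where it stopped.
recovered = restore(capsule)
recovered.advance()
print(recovered.step, recovered.accumulator)  # 6 21
```

The point of the sketch is only the workflow: capture state at a point in time, move the artifact anywhere, and resume without redoing completed work.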

Big Memory Cloud's AppCapsules enable customers to take advantage of spot instances. Cloud providers offer spare compute capacity -- "spot" instances -- at deep discounts, up to 90 percent, until they need that capacity back.

The portability of AppCapsules means customers can use spot instances and if the instance is interrupted -- called back by the service provider -- the ZeroIO Snapshot ensures the data is easily recoverable.
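The spot-instance pattern described above can be sketched as a checkpoint loop: do work on discounted capacity, checkpoint periodically, and when the provider reclaims the instance, resume a fresh attempt from the last checkpoint. This is an illustrative simulation only; the reclaim probability, step counts, and state layout are all invented, not measurements of any provider or of MemVerge's product.

```python
import random

CHECKPOINT_EVERY = 10  # steps of work between checkpoints

def run_on_spot(total_steps, state=None, reclaim_prob=0.05):
    """Run a job on a simulated spot instance that may be reclaimed.

    Returns (finished, state); on interruption, state is the last
    checkpoint, so a retry loses at most CHECKPOINT_EVERY steps.
    """
    state = dict(state) if state else {"step": 0}
    checkpoint = dict(state)
    while state["step"] < total_steps:
        if random.random() < reclaim_prob:  # provider calls the instance back
            return False, checkpoint
        state["step"] += 1
        if state["step"] % CHECKPOINT_EVERY == 0:
            checkpoint = dict(state)  # persist progress
    return True, state

random.seed(7)  # deterministic for the example
state, attempts, finished = None, 0, False
while not finished:
    attempts += 1
    finished, state = run_on_spot(100, state)
print(f"completed 100 steps across {attempts} spot attempt(s)")
```

Without the checkpoint, every interruption would restart the job from step zero, which is why in-memory workloads have historically been a poor fit for spot capacity.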

Portability also enables applications to use cloud bursting more efficiently, something in-memory apps can't normally do without long service interruptions. As performance needs outgrow the on-premises capabilities, apps extend or "burst" into the public cloud to use its resources, according to MemVerge.

Finally, AppCapsules together with ZeroIO provide added multi-cloud mobility, helping customers work where it makes the most sense while avoiding vendor lock-in, the memory storage vendor stated.

Big Memory Spot Cloud use cases are expected to be available in the next few months, with the cloud mobility and cloud bursting services becoming available next year.

MemVerge is not alone in its push to help customers move to the cloud, but its approach is unique in the market.

VMware unveiled its own software-defined memory tiering, also built on Intel Optane persistent memory, as Project Capitola last month. Because both products do tiering, it could go head to head with Big Memory Cloud, though it is still unclear how Project Capitola will fit into the cloud storage market; it won't be available to customers until next year.

If it does push into the same space as Big Memory Cloud, IDC's Burgener said, the competition might heat up.

"When VMware decides it's important enough for them to get into it, that is a critical inflection point for the technology," he said.

See more here:
MemVerge takes Big Memory to the cloud - TechTarget

Read More..

At 2021 PASBA Fall Management Conference in Nashville, Infinitely Virtual Affirms Value of Managed IT No Matter the Host – PRUnderground

On the eve of the 2021 PASBA Fall Management Conference -- set for Nov. 8-11 at the DoubleTree by Hilton Nashville Downtown -- the CEO of leading cloud pioneer Infinitely Virtual is extolling the virtues of "no host" Managed IT.

"Two years ago, in a bid to dramatically simplify the complex workload confronting managed services providers and IT professionals, we rolled out IV Managed ITSM, a distinctive new take on the MSP model," said Adam Stern, Infinitely Virtual founder and CEO. "Now, as remote work/remote access has become more of a necessity than an option, the introduction seems prescient, and our IV Managed IT may be more appropriate for accounting firms than ever. PASBA's Fall Management Conference provides an ideal opportunity for organizations to get acquainted with this compelling way to have a better experience with Managed IT, whether hosted by AWS or Azure."

Offering comprehensive, premium remote monitoring and management, IV Managed IT is built around a modern, intuitive platform that enables Infinitely Virtual to easily look after customer devices, across all environments, from any location in the world. Given that IV is an MS Direct provider, IV's Managed IT solution puts the focus on people and business strategy as well as on hardware, delivering support from the desktop to the server and everything in between.

"With IV Managed IT, we're placing our distinctive stamp on IT management services, in part by answering the question: who monitors your local equipment? Our Managed IT suite recognizes that, for small and mid-size accounting firms, the cloud isn't pure play. While much of the compute environment may be off-premises, not all of it is, can be or even should be. Managed IT enables your IT resources to live seamlessly in both worlds."

IV Managed IT rests on six pillars: focusing resources on the core business; easily scaling with growth; aligning technology with strategic goals; remote management; IT help desk; and on-site IT management. Managed IT services are tailored to meet any accounting firm's needs, from handling employee endpoints to fully overseeing complex IT infrastructure. "Ultimately, we believe businesses should dedicate their resources to what they do best, not worrying about IT," Stern said. "IV Managed IT is designed to grow with every business."

PASBA represents Certified Public Accountants, Public Accountants, and Enrolled Agents who provide accounting services to small businesses throughout the United States. Members of the Association have built a nationwide network of accountants to benefit small business clients across the country. Using the collective resources of this network, Association members offer their clients a level of service and expertise that individual practices are unable to rival.

For more information, visit http://www.infinitelyvirtual.com.

About Infinitely Virtual

The World's Most Advanced Hosting Environment

Infinitely Virtual is a leading provider of high-quality and affordable Cloud Server technology, capable of delivering services to any type of business via terminal servers, SharePoint servers and SQL servers, all based on Cloud Servers. Ranked #28 on the Talkin' Cloud 100 roster of premier hosting providers, Infinitely Virtual has earned the highest rating of Enterprise-Ready in Skyhigh Networks' CloudTrust Program for four of its offerings: Cloud Server Hosting, InfiniteVault, InfiniteProtect and Virtual Terminal Server. The company recently took the #1 spot in HostReview's ranking of VPS hosting providers. Founder and CEO Adam Stern is a member of the Forbes Technology Council. Infinitely Virtual was established as a subsidiary of Altay Corporation, and through this partnership, the company provides customers with expert 24/7 technical support. More information about Infinitely Virtual can be found at: http://www.infinitelyvirtual.com, @iv_cloudhosting, or call 866-257-8455.

The rest is here:
At 2021 PASBA Fall Management Conference in Nashville, Infinitely Virtual Affirms Value of Managed IT No Matter the Host - PRUnderground

Read More..

Huawei Selling x86 Server Business Due To US Sanctions: Report – CRN

China-based technology conglomerate Huawei Technologies is looking to sell its x86 server business to a consortium of buyers due to the company not being able to secure server processors because of U.S. sanctions, according to a Bloomberg report.

Due to the sanctions the U.S. government implemented against Huawei in recent years, the Chinese IT giant is in advanced talks to sell its server business to a group of buyers for likely hundreds of millions of dollars, according to the report by Bloomberg.

The issue mainly stems from U.S. sanctions that made Huawei unable to buy x86 chips from Intel, Bloomberg said, citing people familiar with the matter.

[Related: Pure Storage CRO, Former VMware Exec Jumps Ship To Nutanix]

Huawei is looking to sell to a consortium that includes at least one government-backed buyer, with several possible buyers from the Chinese government and the private sector engaging with the company over the past few months, said Bloomberg. Huawei did not respond to a request for comment by press time.

Huawei has dropped off IDCs worldwide server market-share leader list in recent quarters.

The company failed to crack the top five global market-share revenue leaders list in IDC's second-quarter 2021 Worldwide Quarterly Server Tracker, meaning Huawei's server business generated less than $1 billion in server sales on a global basis, while also shipping fewer than 150,000 server units worldwide during the quarter, according to data from IDC.

Huawei has hit tough financial times since the U.S. banned American companies from supplying certain components to the company.

Huawei recently reported third fiscal quarter revenue of $21.2 billion, a 38 percent drop in sales year over year -- the company's fourth straight quarter of declining sales.

In addition, Huawei needed to sell its Honor smartphone business to the Shenzhen government last year after the U.S. banned American companies from supplying certain components, such as 5G chips, to Huawei.

It is key to note that Huawei's x86 server line was not a core business for the company, as Huawei has developed its own servers for cloud computing powered by Arm-based processors.

Two potential buyers of Huaweis x86 server business are China-based Henan Information Industry Investment Co., which is state-owned and has partnered with Huaweis server group, as well as Chinese asset management company Huaqin Technology, according to Bloomberg.

Looking at the worldwide server market landscape, Dell Technologies and Hewlett Packard Enterprise -- alongside HPE's China-based partner H3C -- have been the two dominant leaders on a global basis for the past several years.

Worldwide server market revenue declined 2.5 percent year over year to $23.6 billion during the second quarter of 2021, while 3.2 million servers were shipped, representing flat growth.

Here is the original post:
Huawei Selling x86 Server Business Due To US Sanctions: Report - CRN

Read More..

Wiwynn Successfully Implemented Open System Firmware on Its OCP Yosemite V3 Server – PRNewswire

TAIPEI, Nov. 4, 2021 /PRNewswire/ -- Wiwynn (TWSE: 6669), an innovative cloud IT infrastructure provider for data centers, today announces that its Open Compute Project Yosemite V3 (OCP YV3) based server has completed the implementation of Open System Firmware (OSF) and obtained the OCP Accepted recognition. With the help from OCP partners, including Facebook, Intel and 9elements, Wiwynn has made an important milestone for the open community to complete the first OCP product contribution that includes not only hardware design but also OSF.

OSF is a formal OCP project with the goal to move the control of firmware to the system owner. It allows the system firmware (also known as BIOS) to be modified and shared openly. Starting in March 2021, "OCP Accepted" badge for servers requires that server systems support OSF. The openness of OSF will lower the entry barrier of OCP system adoption and accelerate product development. The synergies with other open source firmware communities, such as LinuxBoot and coreboot, will enroll more talents to join and make the ecosystem more open and complete.

Wiwynn's YV3, a single-socket server based on the 3rd Gen Intel Xeon Scalable processor (code name: Cooper Lake), is the first product contribution that meets the new requirement for OCP Accepted recognition. 9elements will support the OSF code base maintenance, including rebasing to the tip of the open source components and making releases periodically.

"We have been devoted to the OCP community with more than 32 contributions and seen OSF become an important piece for modern server designs," said Steven Lu, Wiwynn's Senior Vice President of Product Development. "Thanks to Facebook, Intel and 9elements, we are able to move steps further to make OSF part of Wiwynn's YV3 server and obtained OCP Accepted recognition. We are looking forward to duplicating the successful model to more products to come."

"We are excited to see the ecosystem embrace Intel platforms to build the open system firmware," said Anjaneya "Reddy" Chagam, Intel's Cloud Architect. "It's our pleasure to work with Wiwynn, Facebook and 9elementsto realize a successful OSF practice and accelerate the community development."

"This is a disruptive milestone for the OSF Community," said Christian Walter, 9element's Executive Director of Firmware Development. "We are excited to work together with Wiwynn supporting the OSF code base for the Wiwynn YV3. This is the perfect showcase of what can be accomplished working together on open systems, and we hope this will pave the way for more products to come."

"As one of the very first OCP Solution Providers, Wiwynn has shown its continuous commitment to the open community. The OCP Accepted Wiwynn YV3 is a phenomenal milestone that includes system firmware in the OCP product contribution for the first time. We are thrilled to see the great progress of OSF through the close collaboration of Wiwynn and its partners. We also look forward to more OSF projects to be inspired and thrive," said Rajeev Sharma, OCP's Director of Software and Technologies

For more OCP OSF details, please refer to the recent OCP blog post.

Wiwynn will also showcase its YV3 at the upcoming OCP Global Summit 2021 at booth #C2. There will also be two in-depth engineering workshops on OSF and software management for OCP DeltaLake and Yosemite V3. Please follow Wiwynn's OCP event page and stay tuned.

About Wiwynn

Wiwynn is an innovative cloud IT infrastructure provider of high-quality computing and storage products, plus rack solutions for leading data centers. We aggressively invest in next generation technologies for workload optimization and best TCO (Total Cost of Ownership). As an OCP (Open Compute Project) solution provider and platinum member, Wiwynn actively participates in advanced computing and storage system designs while constantly implementing the benefits of OCP into traditional data centers.

For more information, please visit the Wiwynn website or contact [emailprotected]. Follow Wiwynn on Facebook and LinkedIn for the latest news and market trends.

SOURCE Wiwynn

Go here to see the original:
Wiwynn Successfully Implemented Open System Firmware on Its OCP Yosemite V3 Server - PRNewswire

Read More..

LA County Assessor’s Office Looks to Oracle Cloud to Improve Operations – PRNewswire

AUSTIN, Texas, Nov. 4, 2021 /PRNewswire/ -- The Los Angeles County Office of the Assessor has successfully migrated its Assessor operations from a paper-based, legacy mainframe environment to Oracle Cloud Infrastructure (OCI). By moving to Oracle Cloud, the Office is able to speed up data processing, reduce risk, and improve the user experience. Using a series of OCI services, including Oracle Autonomous Data Warehouse, Oracle Analytics Cloud, Oracle Exadata Cloud Service, and Oracle Database Cloud Service, LA County is seeing dramatic improvements in performance and achieving significant cost savings by eliminating its on-premises infrastructure.

As the largest local assessment agency in the country, the Los Angeles County Office of the Assessor reviews more than 400,000 property documents and completes 500,000 physical property appraisals each year. This work was previously conducted with a paper-based process using 40-year-old mainframe technology that required manuals to interpret its code. With such a high volume of assessments, compounded by California's complex requirements, the Assessor's Office realized this model wasn't sustainable.

The Office began work on a five-phase Assessor Modernization Project (AMP) to develop an in-house custom application with Oracle Consulting. After three successful phases, the AMP application became the go-to production system for the Assessor. During the fourth phase of the project in February 2021, Oracle Consulting and the Assessor's Office extended AMP functionality and moved the application from on-premises to OCI with no disruptions, while eliminating 80 servers.

"The decision to move AMP to Oracle Cloud Infrastructure midway through the project had huge benefits, including cost savings, better performance, flexibility, and resource efficiencies," said Kevin Lechner, CIO, Los Angeles County Office of the Assessor. "We could now focus our attention on application development and increased productivity rather than infrastructure requirements and maintenance."

On OCI, data processing jobs that once took up to eight hours are now processed in four; built-in Disaster Recovery is helping mitigate risk; and end users are seeing faster page loads than ever before. Soon, the Los Angeles County Office of the Assessor plans to open access to AMP for other counties in the state.

"The Los Angeles County Office of the Assessor's modernization project is one of the most forward-thinking government agency endeavors we've seen in recent years," said Jeff Kane, group vice president, Oracle Consulting, North America. "We're looking forward to seeing additional performance enhancements and cost savings they'll realize on OCI services, freeing up their IT staff to innovate with new features and functionality on the platform."

"One of my top goals coming into this Office was to make sure that we provide public service that is both effective and cost-efficient," said Jeff Prang, Los Angeles County Assessor. "This milestone of the project with Oracle Cloud Infrastructure is our biggest success to date."

About Oracle

Oracle offers integrated suites of applications plus secure, autonomous infrastructure in the Oracle Cloud. For more information about Oracle (NYSE: ORCL), please visit us at oracle.com.

Trademarks

Oracle, Java, and MySQL are registered trademarks of Oracle Corporation.

SOURCE Oracle

http://www.oracle.com

Continue reading here:
LA County Assessor's Office Looks to Oracle Cloud to Improve Operations - PRNewswire

Read More..

Intel Xe-HPG GPU: Aiming The Gaming & Visual Cloud-Based Segment With a Focus on AI & Superior Performance – Wccftech

Last week, Raja Koduri took to Twitter to explain Intel's decision to end its focus on Intel Xe-HP and fold that technology into the newer Intel Xe-HPG GPUs, a more focused ecosystem backed by a five-year growth plan covering AI, gaming and visual cloud-based technology, and higher optimization and performance. This was surprising to a large audience, given how much time Intel had spent discussing Xe-HP publicly.

Intel's Xe-HP GPU was originally announced in mid-2020, advertised as a "multi-tile graphics series designed for data centers, with the main purpose as media super-computer accelerators." Intel even went as far as creating three separate offerings with between "1 to 4 tiles." The Xe-HP parts will now be used for Intel's own in-house cloud servers.


Koduri went on to further explain that Xe-HP GPU would help develop the "ecosystem for Ponte Vecchio's architecture." As far as product lines go, the Xe-HPG lines will replace Xe-HP, adding "AI interference and visual cloud" technology, which was the original intent of Xe-HP.

It is speculated that, given Intel Xe-HPG's focus on media analytics, processing, immersion, and cloud technology such as cloud gaming and cloud graphics, Intel is beginning to compete with GeForce NOW, Google Stadia, and Amazon Luna.

What is interesting to note, however, is the use of media conversion servers by video content providers. Intel was one of the largest hardware providers for the most recent Olympic Games, held in Tokyo, Japan. Intel's Xeon servers were the primary hardware used to stream 8K 60fps HDR video, which was delivered via the cloud to televisions everywhere. Even though a large group of consumers was unable to view the signal at its full resolution, the availability of this technology is proof of how far what is possible has advanced.

Lastly, Koduri also said Intel Xe-HPG is planned to support high-end applications such as 3ds Max, extending Xe-HPG into the high-end workstation segment. This implies that, at some point, the Xe-HPG microarchitecture could address up to three different markets, much like competitors NVIDIA and AMD do with their Ampere and RDNA 2 architectures, respectively.

Read more from the original source:
Intel Xe-HPG GPU: Aiming The Gaming & Visual Cloud-Based Segment With a Focus on AI & Superior Performance - Wccftech

Read More..