The Register calls for aid, and Microsoft’s Rohan Kumar will answer… our questions about SQL Edge and Azure Synapse – The Register

Build There was SQL Edge and Azure Synapse news at Microsoft's reimagined Build gathering this week, so The Register had a chat with corporate vice president Rohan Kumar about the company's database ambitions.

Having lurked in limited preview for a while, Azure SQL Edge has been pushed out to a broader audience as the platform creeps toward general availability. Running on x64 or ARM, with Windows or Linux doing OS duty, Azure SQL Edge is all about shunting the smarts of its bigger SQL Server brother to edge devices for more local processing and storage rather than maintain a constant, and potentially laggardly and expensive, connection to the mothership.

Kumar cited AI training as an example use case for the technology. "[We] see a lot of customers using our big data analytics in the cloud to train and run their machine learning models," he said, "and once they come up with a model that they believe is meeting the requirements, then they mass deploy onto the SQL Edge devices."

Those devices can then run in environments that are online, occasionally connected or fully offline.

"A lot of decisions can then be made very close to where the data originates," he added.

Kumar told us that the content of the wider public preview was pretty much what would go on to hit general availability, although the gang still had a way to go in order to reduce the current 500MB footprint down to 300MB. The limited capacity of Edge devices, such as the popular Raspberry Pi, makes the bloat loss essential.

"This is not the SQL Server of 15 or 20 years ago," laughed Kumar. "It runs everywhere." Indeed, the first SQL Server this writer used was version 4.21 and it ran happily enough under NT with a massive 64 megabytes of memory. Still, it is heartening to see the footprint being reduced while other products continue to pile on the pounds.

Also hitting public preview, and some way from the grungy edge shenanigans of SQL Edge, was Azure Synapse Link, a "cloud native implementation of hybrid transactional analytical processing (HTAP)", according to Microsoft.

Kumar had previously shown off Azure Synapse, a revved-up and rebranded version of Azure SQL Data Warehouse (or "evolution", as Microsoft would have it), and its analytical capabilities at last year's Ignite event.

Synapse also gained some new toys in Public Preview at Build (something Kumar described as a "milestone"). The analytics service added new Synapse SQL features and, inevitably, built-in Power BI authoring. The widening of the preview also means that more users can check out if the team's performance and ease of use claims cut the mustard.

However, getting data from operational systems into analytical services has traditionally been a bit of a pain. The new Link functionality allows customers to hit that real-time transactional data in Azure without adding a burden to operational systems.

There are, of course, gotchas.

It is undoubtedly neat if you're in the Microsoft ecosystem, but right now Azure Synapse Link only supports Azure CosmosDB in the preview. Other Azure databases such as PostgreSQL, MySQL or even Azure SQL are still in the "coming soon" bucket, which is a little disappointing.

Kumar assured us that the gang were "actively working on both SQL and Postgres", but that "there is a certain set of priorities that every team works through" and CosmosDB came out ahead.

A frequent complaint levelled at Microsoft is that much of the new stuff depends on the Azure cloud while many customers still prefer to keep their data close at hand. Azure Stack is an oft-touted solution to the problem, but Kumar pointed to Microsoft's crack at multi-cloud, Azure Arc, for an indication of where Synapse might go next.

"That could be the next step in the journey," he said, "if a customer is willing to maintain connectivity with Azure, where we're able to get a certain amount of telemetry, then progressively we can add more and more capabilities."

The cloud-versus-on-premises argument continues to rage concerning SQL Server 2019. Could that be the last major on-premises release of the former flagship? "No, absolutely not," stated Kumar. However, he added: "Here's the thing: if you look at the kind of investments we are making in SQL Server, even on-prem, it's essentially not just to support the customers, but to prepare them for the transition to the cloud."

That transition continues to rumble on as Microsoft attempts to persuade those still on the elderly 2008 and 2008 R2 platforms to move to something more modern (faced with the IT curse that those boxes simply work, and most admins know that if something works, it is best not to go fiddling with it).

And the on-premises versus cloud balance? At present it stands at around half and half. There are an awful lot of SQL Servers still alive and well in data centres around the world.

The rest is here:
The Register calls for aid, and Microsoft's Rohan Kumar will answer... our questions about SQL Edge and Azure Synapse - The Register

Read More..

What are the different types of cloud load balancing? – TechTarget

Load balancing is the process of distributing network traffic across two or more instances of a workload. IT teams use load balancing to ensure each instance performs at peak efficiency, without any one instance becoming overburdened or failing due to excess network traffic.

Traditionally, a load balancer exists in a local data center as a dedicated physical network appliance. However, load balancing is more frequently performed by an application installed on a server and offered as a network service. Public cloud providers use the service paradigm and provide software-based load balancers as a distinct feature.

Once a load balancer is implemented, it acts as a network front end and often uses a single IP address to receive all network traffic intended for the target workload. The load balancer can evenly distribute the network traffic to each available workload instance, or it can throttle traffic to send specific percentages of traffic to each instance.
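To make that distribution step concrete, here is a minimal Python sketch of the kind of weighted selection a load balancer performs behind its single front-end IP. The backend addresses and traffic percentages are invented for the example.

```python
import random
from collections import Counter

# Hypothetical backend instances and the share of traffic each should receive.
BACKENDS = {
    "10.0.1.10": 0.50,  # 50% of requests
    "10.0.1.11": 0.30,  # 30% of requests
    "10.0.1.12": 0.20,  # 20% of requests
}

def pick_backend() -> str:
    """Choose a backend instance according to the configured traffic shares."""
    instances = list(BACKENDS)
    weights = list(BACKENDS.values())
    return random.choices(instances, weights=weights, k=1)[0]

# Every request arrives at the load balancer's single front-end IP, which then
# forwards it to whichever instance pick_backend() returns.
print(Counter(pick_backend() for _ in range(10_000)))
```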

With a load balancer, the target workloads can be in different physical places. Cloud load balancing provides similar benefits that enable users to distribute network traffic across multiple instances within the same region or across multiple regions or availability zones.

Load balancing is defined by the layer at which the network traffic is handled, based on the traditional seven-layer Open Systems Interconnection (OSI) network model. Each layer corresponds to specific traffic types. Cloud load balancing is most commonly performed at Layer 4 (transport or connection layer) or Layer 7 (application layer).

For example, AWS' Network Load Balancer service operates at Layer 4 to direct data from transport layer protocols, including Transmission Control Protocol (TCP), User Datagram Protocol (UDP) and Transport Layer Security (TLS). Google Cloud Platform (GCP) refers to this as TCP/UDP Load Balancing, while Microsoft calls its Layer 4 service Azure Load Balancer. Since traffic is handled at a lower level of the network stack, Layer 4 load balancing provides the best performance. Cloud load-balancing services can handle millions of network requests per second and ensure low latencies. They are, therefore, great options for erratic or unpredictable network traffic patterns.

At the top of the network stack, Layer 7 handles more complex traffic, such as HTTP and HTTPS requests. Each of the major cloud providers has its own feature or service for this, such as AWS' Application Load Balancer, Azure Application Gateway and Google Cloud HTTP(S) Load Balancing.

Since this traffic is much higher up the network stack, IT teams can implement more advanced options, such as content- or request-based routing decisions. This type of cloud load balancing works well with modern application instances and architectures, including microservices and container-based workloads.
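As a rough illustration of the content- and request-based routing that becomes possible at Layer 7, the sketch below inspects the URL path of a request, something only visible at the application layer, and picks a backend pool. The paths and pool names are hypothetical, not any provider's configuration.

```python
# Illustrative Layer 7 routing table: rules are checked in order and the first
# matching path prefix wins. Pool names and prefixes are made up for the example.
ROUTING_RULES = [
    ("/api/",    "api-pool"),      # microservice / container backends
    ("/images/", "static-pool"),   # static content origin
    ("/",        "default-pool"),  # catch-all rule
]

def route(path: str) -> str:
    """Return the backend pool that should handle an HTTP request path."""
    for prefix, pool in ROUTING_RULES:
        if path.startswith(prefix):
            return pool
    return "default-pool"

print(route("/api/orders/42"))    # -> api-pool
print(route("/images/logo.png"))  # -> static-pool
print(route("/checkout"))         # -> default-pool
```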

The choice of a cloud load balancer should extend beyond traffic types alone. Cloud providers also differentiate load-balancing services based on scope and framework. For example, GCP suggests global load-balancing services when workloads are distributed across multiple regions, while regional load-balancing services are a good fit when all workloads are in the same region. Similarly, GCP suggests external load balancers when traffic is coming into the workloads from the internet and internal load balancers when traffic is intended for use within GCP.

Be sure to consider the broader suite of features and capabilities available with cloud load-balancing services. In particular, features can include support for a single front-end IP address, support for automatic workload scaling, and integration with other cloud services, such as monitoring and alerting.

Continue reading here:
What are the different types of cloud load balancing? - TechTarget

Read More..

Masayoshi Son says AWS and Microsoft will buy more chipsets from the SoftBank Vision Fund-backed Arm, and not – Business Insider India

Out of the 88 companies that the SoftBank Vision Fund 1 has invested in, Son assumes that as many as 15 companies could go bankrupt, while 60 others could post average performance. That leaves 13 companies, which Son believes could end up being successful. British chipmaker Arm Holdings could be one among those 13 companies. In fact, Son considers Arm to be one of the important assets of his fund. And why won't he? Arm's shipments have been growing exponentially.

Amazon Web Services, the current market leader in public cloud services, has announced the Graviton2 chipset based on Arm's designs for its cloud infrastructure. According to SoftBank, Graviton2 is believed to be 65% faster than Intel's Xeon chipsets, and could help save up to 40% of the cost.

Arm is reportedly looking at an IPO in the next 5 years. However, that may make it difficult for SoftBank, which needs at least $3 billion every year in just equity dividends. SoftBank reported its first annual loss in 15 years, posting a net loss of $894 million in FY2019. During the same period last year, it posted a net profit of $19.6 billion.

SoftBank acquired Arm in 2016 for $32 billion. In 2018, it sold 25% of the stake for $8 billion. In its latest earnings, the Japanese group revealed that its stake in Arm is valued at $25 billion, which implies an appreciation of $1 billion in the chipmaker's value.

Arm shipped 100 billion chipsets in the first 27 years; the next 60 billion came in 3 years

Established in 1991, Arm has shipped more than 160 billion chipsets to date. The first 100 billion shipments took 27 years, while the next 60 billion came in just 3 years.

More:
Masayoshi Son says AWS and Microsoft will buy more chipsets from the SoftBank Vision Fund-backed Arm, and not - Business Insider India

Read More..

How data centers will become automated and self-reliant – TechHQ

In the thick of the pandemic, TechHQ covered the story of the unsung heroes of the tech world: data center workers.

Stringent lockdown measures have impacted the daily workflow of various businesses, and only key essential workers are given the green light to offices and other facilities.

Ambiguity arises when contract data center workers are not given the same pass for movement. Yet these operators and contractors are the front liners when it comes to maintaining and keeping data centers running.

In this light, the significant role of data centers is highlighted more than ever, and some trends are taking shape due to the unique challenges brought on by the pandemic.

To understand the impact of COVID-19 and remote working on the evolution of data centers, TechHQ interviewed Lenovo DCG's APAC Director for Software Defined Infrastructure, Kumara Raghavan.

In the past, data center administrators would have to schedule downtime during the weekends and be on standby to do upgrades and updates when the power users were using the mainstream applications. Raghavan added that those were the norms by and large, especially if you have a quality hyperconverged infrastructure and software combination.

In today's climate, recurring themes Raghavan noted were automation and self-managing tech in data centers that minimize the reliance on human workers being physically present all the time.

"We have administrators who do their firmware updates while doing their shopping; it's a one-click upgrade.

"That's just an example of the various levels of automation that has crept into the data centers," he said.

Besides that, companies with hybrid cloud are able to preset the provisioning for servers and applications when a spike occurs. Citing this as a result of automation and scripting, Raghavan shared that this is the reality for companies with a high degree of resilience due to the increasing level of automation baked into their data centers.
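The kind of preset, spike-driven provisioning Raghavan describes can be pictured with a short sketch like the one below. The watermarks are arbitrary, and the monitoring and orchestration hooks are hypothetical placeholders rather than any vendor's actual tooling.

```python
# Hypothetical spike-driven provisioning loop. get_cpu_utilization(),
# provision_instances() and retire_instances() stand in for whatever monitoring
# and orchestration APIs a given hybrid-cloud environment exposes.
HIGH_WATERMARK = 0.80  # add capacity above 80% average utilization
LOW_WATERMARK = 0.30   # shrink back below 30%
STEP = 2               # instances added or removed per adjustment

def autoscale(get_cpu_utilization, provision_instances, retire_instances):
    utilization = get_cpu_utilization()   # e.g. a rolling five-minute average
    if utilization > HIGH_WATERMARK:
        provision_instances(STEP)         # the preset response to a spike
    elif utilization < LOW_WATERMARK:
        retire_instances(STEP)            # release spare capacity when load drops
```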

Furthermore, Raghavan shared that companies are focused on balancing the workload of data servers by diversifying their traffic through multiple data centers across specific geographical regions. In light of the pandemic, another consideration for companies would be to meet the demand from a rising number of employees working from home.

These may be some of the general trends observed, but the decisions and evolving role of data center management depend on a company's available resources, budgets, purposes, and essentially, experience with the cloud and data centers.

Raghavan explained there are two kinds of behaviors that are driving the way companies manage their data centers in the current environment.

Companies that are still at the early stages of digital transformation are most likely to pivot as they face the drastic surge in work-from-home capacity requirements.

The heightened demand includes the need for extra bandwidth, more compute power or storage, but without any change to existing budgets. Hence, companies have to rethink how best to cater to these growing needs without completely blowing out their cash reserves.

"Once they have planned their own transformation, I think it's easy for them to move capacity, and that is one of the facilities that come from the principle of the hybrid cloud," Raghavan shared.

As for companies that didn't have the full capacity or were inadequately prepared for the upheaval of the pandemic, opening up capability on public clouds seems a viable solution. Raghavan noted these companies want an immediate fix to their problems and are looking to the public cloud.

Despite the spike of interest in public cloud as a temporary relief, Raghavan said: "We don't see this as a permanent move."

"I think the underlying concerns of why everything has not shifted to public cloud, contrary to some expectations, haven't really gone away," he said.

Some of the considerations impeding mass migration to public cloud relate to the control of data and costs. In the short term, a monthly bill, as opposed to a single capital expenditure, may be a more feasible option for companies, but in the long run the cost proves to be quite expensive.

Raghavan also pointed to latency in the public cloud as a factor in the trend not taking off, especially in today's environment, where there is so much demand on the network.

"In sum, we see some temporary increase in the public cloud utilization, but in reality, customers are now thinking through accelerating their digital transformation [...] and we see that they have better management of the workflow."

In essence, the management of data centers is noted to be influenced by emerging technologies such as the cloud, edge, and artificial intelligence (AI).

Raghavan shared that the AI tools related to common management will gain prominence.

Besides that, edge devices have the potential to help companies monitor and manage different zones of data centers. For instance, companies can change the settings of data center cooling based on the volume of workload, potentially saving huge amounts of cost spent on cooling and driving efficiency. "The key is to try and reduce the dependency on humans when it comes to intervening, and that happens within typical servers," said Raghavan.

Adding to the mix, self-managing technologies and predictive technologies will be major players in the space. Cutting-edge technologies will be able to monitor a range of drive parameters. For instance, algorithms can track the performance of disks and predict, with a fair amount of accuracy, when they will malfunction. By doing so, data center administrators are alerted to possible failures and are able to act in a swift manner, minimizing disruption.
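A minimal sketch of that sort of predictive check might look like the following. The health attributes and thresholds are illustrative assumptions, not any vendor's real failure model.

```python
from statistics import mean

def likely_to_fail(samples: list) -> bool:
    """samples: recent health readings for one drive, oldest first, each a dict
    with 'reallocated_sectors' and 'read_latency_ms' keys (illustrative names)."""
    reallocated = [s["reallocated_sectors"] for s in samples]
    latency_ms = [s["read_latency_ms"] for s in samples]

    growing_defects = reallocated[-1] - reallocated[0] > 10           # defects accumulating
    slowing_reads = mean(latency_ms[-5:]) > 2 * mean(latency_ms[:5])  # reads markedly slower
    return growing_defects or slowing_reads

# An administrator would be alerted for any drive where likely_to_fail() is True,
# so it can be swapped out before it disrupts the workload.
```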

In sum, the utilization of AI tools in data centers may not yet be scaled to the level discussed, but it does indicate the direction data center management is heading.

View original post here:
How data centers will become automated and self-reliant - TechHQ

Read More..

Chinese IPOs hang in the balance as Senate and Nasdaq change rules – Data Economy

Empowering India's Enterprise Cloud Adoption, the service provides customers an easy, scalable and secure way to connect to multiple Cloud platforms

GPX India Pvt. Ltd., a global data center and interconnection leader providing next-generation, carrier-neutral, cloud-agnostic services, announces the launch of GPX Open Cloud Exchange at its Mumbai data center campus. GPX is the first data center and interconnection provider in India to offer an Open Cloud Exchange service which enables direct, private and secure connection to multiple Cloud Service Providers (CSPs) hosted inside the same GPX data center campus where the GPX Open Cloud Exchange is hosted.

GPX Cloud Exchange strengthens GPX's market-leading and high-performance Interconnection Ecosystem that exists in the GPX Mumbai data center campus. Utilizing the GPX Open Cloud Exchange service, Enterprises can seamlessly connect to multiple Cloud providers via a single port, accelerating their Cloud adoption and establishing enterprise edge nodes to optimize their hybrid-cloud and multi-cloud network strategies. With burgeoning Digital Transformation underway in India, the nation's Cloud market is expected to grow threefold to $7.1 billion by 2022, according to a recent Nasscom report.

"Building upon GPX's theme of service neutrality, we are looking forward to making our customers' journey to the Cloud seamless. Currently, we have 8 CSPs present in GPX's Mumbai Data Center campus, three of which offer private direct connection services. The GPX Interconnection Ecosystem consists of 12 Carriers, 130+ ISPs, 4 IXPs, 8 CSPs, 9 CDNs and leading global content providers with geographic proximity to subsea cables offering the richest interconnection platform in India," said Nick Tanzi, President and CEO of GPX Global Systems, Inc.

As part of GPX Cloud Solutions, GPX has been offering the GPX Direct Cloud Connect service to enterprise customers for over four years. This service provides direct connection services to GPX's three CSP partners: AWS Direct Connect, Google Cloud Dedicated Interconnect and Oracle Cloud FastConnect, all of whom have an edge node hosted inside the GPX Mumbai data center campus.

Through the GPX Open Cloud Exchange service offering, GPX will provide an easy and scalable way for enterprises to off-load their IT workloads to these CSPs, and to shift loads with high flexibility, with scalable interconnection capacity from 100 Mbps to 100 Gbps. GPX will be partnering with additional CSPs with direct connection capabilities to increase options for customers. In addition, as part of GPX Cloud Value-Added Services, GPX offers a Cloud Data Upload Service that enables customers to upload their data to AWS in a cost-effective and efficient manner.

"We have been using GPX's Direct Cloud Connect service for almost two years to connect to Amazon Web Services. GPX has taken the complexity out of Cloud connectivity and made our journey to the Cloud easy and efficient. We like GPX's pricing model of a combined charge for port and Cross Connect, with the option to upgrade rapidly. GPX team is very process oriented and supportive," said Bhisham Sharma, Manager, Network & Security, Jubilant Life Sciences.

"Our innovative Open Cloud Exchange service will empower the Indian enterprises to leverage multiple Cloud platforms in an easy and cost-efficient way. GPX is the only data center and interconnection provider to offer an Open Cloud Exchange service connected to multiple CSPs hosted within GPX Mumbai data center campus, therefore offering highly reliable and scalable service," said Manoj Paul, Managing Director, GPX India Pvt. Ltd.

About GPX

Incorporated in August 2002, GPX develops and operates next generation, private, carrier-neutral data centers and interconnection platforms in fast-growing commercial markets at cable landing stations in the African and South Asia regions. GPX's data centers are thriving carrier-neutral and connectivity-rich Internet Ecosystems, home to the largest carriers, content providers, cloud service providers, content distribution networks, Internet companies and enterprise edge nodes. It launched its Indian data center, Mumbai 1, in 2012 to provide Tier-IV Colocation and Interconnection Services. GPX's second data center in Mumbai was launched in 2018, which further expands this ecosystem backed by state-of-art infrastructure and connects to Mumbai 1 via GPX's Data Center Interconnect (DCI) service. Visit the website here.

Read more from the original source:
Chinese IPOs hang in the balance as Senate and Nasdaq change rules - Data Economy

Read More..

Portworx upbeat on container storage revenues Blocks and Files – Blocks and Files

Portworx, the California container storage startup, today issued a so-called momentum release, boasting of customer and revenue growth.

As usual with US startups, the company does not mention actual figures, but by any reckoning it is a small fish in the data storage world: annual revenues, according to this possibly out-of-date estimate, are $14m. However, Portworx's bullishness is an indicator that the container storage market could be shaping up into serious money.

Portworx today said it had more than 145 customers, including 54 Global 2000 or government accounts. It reports 136 per cent growth in revenue year over year in Q1 2020, and 92 per cent revenue growth from Q4 2019. Thirteen sales were over $100,000, up from five sales over $100,000 in Q1 2019.

A 2019 Portworx survey showed 87 per cent of respondents said that they used container technologies, compared with 80 per cent in 2018 and 55 per cent in 2017.

Almost ninety per cent of enterprises are already running container technology and more than half the containers they run are stateful, according to the 451 Research survey: Voice of the Enterprise DevOps, Security, AI/ML, and Cloud Native 2020. And those customers need storage.

Portworx emerged from stealth in 2015 and has bagged $55.5m funding over three rounds. Its software runs on commodity servers and aggregates their storage into a virtual SAN providing scale-out block storage. It provides storage from this pool for containers, at container granularity, and with a global namespace. File and object storage are on its roadmap.

Portworx's pitch is that the storage supplied to containers through an orchestration layer like Kubernetes should be containerised itself and also enterprise class, with features like security, data protection, backup and recovery, disaster recovery, SLA management, and compliance.

It says traditional enterprise storage, with those features, is suited to virtual server environments but not cloud-native ones, even if they have Kubernetes CSI plug-ins. Storage provision for containers has to be supplied at the speed and scale of container instantiation, deployment and removal. Portworx claims that only cloud-native storage, its cloud-native storage, can meet this need, not legacy SANs.
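For context on what container-granular provisioning usually looks like, the sketch below builds a generic CSI-backed StorageClass and a PersistentVolumeClaim of the kind a Kubernetes workload would request. The provisioner name, class name and size are hypothetical, and this illustrates the general CSI flow rather than Portworx's specific configuration.

```python
import yaml  # PyYAML, used here only to print the manifests

# A storage class delegates volume creation to a CSI driver; the driver name and
# parameters below are hypothetical placeholders.
storage_class = {
    "apiVersion": "storage.k8s.io/v1",
    "kind": "StorageClass",
    "metadata": {"name": "fast-replicated"},
    "provisioner": "example.csi.vendor.com",
    "parameters": {"replicas": "3"},
}

# Each workload asks for storage at its own granularity via a claim like this,
# so volumes are created and bound at the same pace as the containers themselves.
claim = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "orders-db-data"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],
        "storageClassName": "fast-replicated",
        "resources": {"requests": {"storage": "10Gi"}},
    },
}

print(yaml.dump_all([storage_class, claim]))
```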

Read more here:
Portworx upbeat on container storage revenues Blocks and Files - Blocks and Files

Read More..

New study Global Managed Servers Market 2019 | Growth Opportunities, Investment Feasibility, Market Share And Forecast 2025 – Cole of Duty

Global Managed Servers Market Research Report offers complete knowledge, forecast and statistical analysis on past, present and forecast industry situations. The risks and growth opportunities associated with Managed Servers market are highlighted in this study.

The Managed Servers study will drive investment decisions and strategic business plans for a successful and sustainable business. The market growth in terms of CAGR value is presented from 2019-2025. The high-level data pertaining to Managed Servers market trends, supply-demand statistics, production volume and market demand is evaluated. Also, the cost structures, the latest Industry plans and policies and management strategies are explained.

FREE Sample Report Copy Here: https://www.globalmarketers.biz/report/technology-and-media/global-managed-servers-market-report-2019,-competitive-landscape,-trends-and-opportunities/136852#request_sample

The Outlook Of Global Managed Servers Market:

Sungard Availability Services, Capgemini, Hivelocity Ventures, Atos, iPage, LeaseWeb, Albatross Cloud, XLHost, Hostway, Viglan Solutions, Easyspace, Hetzner, Tata Consultancy Services, Infosys, IBM

The Global Managed Servers Market data is represented in graphical format to ease the understanding. This report also lists the Managed Servers driving factors, growth and development opportunities and restraints. Additionally, the Global Managed Servers Market Report provides complete study on product types, Managed Servers applications, research regions and other sub-segments.

The company profile covers the end-user applications, sales channel analysis, competitive landscape view, and expansion plans. The industry plans & policies, value analysis, downstream consumers and Managed Servers market dynamics are presented. The sales value, industry share, growth opportunities and threats to the development are explained. The contribution of worldwide players to the Global Managed Servers Market and its impact on forecast development is analyzed in this study. The global position of Global Managed Servers Industry players, their profit margin, volume analysis, and market dynamics are studied.

Types Of Global Managed Servers Market:

Cloud-Based, On-Premise

Applications Of Global Managed Servers Market:

BFSI, IT & Telecommunication, Education, Government, Retail, Manufacturing, Consumer Goods, Energy & Utility, Others

Fill Out Inquiry Form For More Details: https://www.globalmarketers.biz/report/technology-and-media/global-managed-servers-market-report-2019,-competitive-landscape,-trends-and-opportunities/136852#inquiry_before_buying

Implemented Data Sources And Research Methodology:

The Global Managed Servers Market details are obtained via primary and secondary research techniques. The data is gathered from vendors, service providers, Global Managed Servers industry experts and third-party data providers. Also, various distributors, service providers and suppliers are interviewed in this study. Besides, Managed Servers Report also states the competitive scenario, SWOT analysis and market size.

The supply-demand side of Global Managed Servers Industry is analyzed by the data gathered from paid primary interviews and through secondary sources. The secondary research techniques involve the Managed Servers data gathered from company reports, consumer surveys, Government databases, economic and demographic data sources. Also, product sources like sales data, custom group data and case studies are analyzed.

Enquire Here For Customization:https://www.globalmarketers.biz/inquiry/customization/ 136852

There Are 8 Sections In Managed Servers Report As Follows:

Section 1: Objectives, Definition, Scope, Global Managed Servers Market Overview, Market Size Estimation, Concentration Ratio and Growth Rate from 2014-2025;

Section 2: Global Managed Servers Industry Segmentation by Type, Application and Research Region;

Section 3: Top Regions of Global Managed Servers Industry (North America, Europe, Asia-Pacific, Middle East & Africa, South America) with the Production Value and Growth Rate;

Section 4: The Changing Global Managed Servers Market Dynamics, Growth Drivers, Limitations, Industry Plans & Policies, and Growth Opportunities are Explained.

Section 5: Industry Chain Analysis, Manufacturing Base, Cost Structures, Production Process, Marketing Channels, and Downstream Buyers.

Section 6: The Top Managed Servers Players, Market Share, Competition, Market Size and Regional Presence is Specified.

Section 7: Forecast Market Trends, Consumption, Value, Production Forecast and Growth Estimates are Analyzed

Section 8: Lastly, Vital Conclusions, Research Techniques, and Data Sources are Listed.

Thanks for reading. We also provide a report based on custom requirements from our clients.

Request for more detailed information (TOC and Sample): https://www.globalmarketers.biz/report/technology-and-media/global-managed-servers-market-report-2019,-competitive-landscape,-trends-and-opportunities/136852#table_of_contents

Visit link:
New study Global Managed Servers Market 2019 | Growth Opportunities, Investment Feasibility, Market Share And Forecast 2025 - Cole of Duty

Read More..

The TD Show Episode 4 – Tim Just’s Top 10 TD Tips – uschess.org

The TD Show

This week's The TD Show topic will be Tim Just's Top 10 TD Tips and will air at 9pm Eastern/6pm Pacific on Thursday, May 21 on the US Chess Twitch channel at twitch.tv/uschess.

The show will be hosted by NTD Chris Bird and this week's guest of course will be US Chess Rulebook Editor and NTD Tim Just. Tim will be providing a list of his top 10 general tips for Tournament Directors to hopefully make you a better TD and make the experience of participating in one of your events much better for everyone.

For folks tuning in live, Twitch will provide some interaction between the show and the audience, allowing you to ask questions in real-time, and we'll also finish each episode with some light-hearted fun in the form of trivia based on the topic discussed. However, if you cannot tune in live, each episode will be archived in the TD Videos playlist at the US Chess YouTube Channel.

Replay last week's episode here:

Link:
The TD Show Episode 4 - Tim Just's Top 10 TD Tips - uschess.org

Read More..

How do I join the Stevenage COVID Chess Challenge? – The Comet

PUBLISHED: 11:26 20 May 2020 | UPDATED: 11:26 20 May 2020

Jacob Savill

The COVID Chess Challenge is open to all ages and abilities. Picture: Stevenage Rotary Grange

Rotary Stevenage Grange has created an online chess challenge to help keep Stevenage residents socially active during the COVID-19 lockdown.

The COVID Chess Challenge will be launching later this month, and anyone is free to join, of all ages and abilities.

Club President Ian Begg said: "Chess offers a great contribution to mental wellbeing. It enables people, young and old, to maintain and forge new friendships during the lockdown. Undertaken safely online, chess has no boundaries of age, race, or ability."

The challenge takes place on a website called Lichess. It is free, and requires only a quick sign up before joining the Stevenage Grange Chess Club.

The club's first tournament will be called The Bob Fowler Challenge, in memory of the former mayor and renowned chess player who sadly passed away with COVID-19 last month.

If you would like to join, go to the club's private Facebook group Grange Chess Club, or contact Rotarian James Corrigan at james@sgrc.org.uk.

Read more:
How do I join the Stevenage COVID Chess Challenge? - The Comet

Read More..

Why Have Cryptocurrency Payments Failed to Take Off So Far? – Cointelegraph

Paying with crypto has long been at the center of the discussions of why cryptocurrencies exist and why they are useful.

But despite promising growth and excitement during crypto's bullish phases, payments with crypto still remain a fringe niche at best. Cointelegraph interviewed both merchants and industry leaders to find out why.

As a general rule, crypto payments are used where they make sense. This remains the case for darknet markets, which according to a January 2020 Chainalysis report continue posting new volume highs.

Source: chainalysis.com

Despite their tiny share of the overall crypto activity, marketplaces selling primarily illegal goods simply cannot use traditional payment mechanisms. Nevertheless, these markets pale in comparison to the traditional cash-based drug trade, whose volume is estimated at approximately $400 billion yearly.

In legal settings, Crypto.com's CEO Kris Marszalek told Cointelegraph what kinds of products see meaningful usage of crypto:

"It's still mostly crypto stuff. So we've got Travala, which is the travel merchant that accepts crypto. Ledger.com [...] when we launched on day one we were doing similar volume to Mastercard."

Marszalek cited figures from leading crypto payment providers BitPay and Coinbase Commerce, which report yearly volumes of $1 billion and $200 million, respectively.

"The numbers are very small," Marszalek said bluntly.

Indeed, compared to Visa's figure of $2 trillion for a single quarter in 2018, crypto payments have a long way to go.

Marszalek identified a series of issues that are preventing crypto payments adoption, with lack of trust one of them:

"For the vast majority of the merchants out there, just like for the vast majority of retail banking users out there, crypto is still something unknown, something they still didn't learn to trust."

Peko Wan, the chief ecosystem officer of crypto point of sale provider Pundi X, told Cointelegraph a similar story:

"For the mainstream, the general perception toward crypto [is that cryptos] are complicated to use or risky to own."

This attitude is reflected by a U.K.-based business owner operating a recreational plane simulator, whom Cointelegraph interviewed. Despite adding the crypto payment option, they said that no one has ever paid using crypto. They further said they were wary of all cryptos, as there are so many scams out there.

Even among crypto enthusiasts, payments are a low priority use case. This is best exemplified by the issuance of WBTC for Ethereum decentralized finance, which is now more than double the size of the entire Lightning Network.

Marszalek believes that part of it is the chicken-and-egg problem, which limits the number of merchants accepting crypto:

"Because if you only have 50 million people in crypto globally, merchants have very little incentive to deploy this, unless they are in a business that is covering a similar demographic as crypto."

One of the biggest problems of crypto payments is the volatility of even the most established assets. Marszalek believes that most people only know about crypto's price swings, which is "not really conducive to merchant adoption," he added.

Furthermore, the premise of many crypto payment providers is that merchants can completely avoid exposure to cryptos volatility.

Marszalek believes that stablecoins are "super powerful" for e-commerce transactions, citing their speed and cost, and sees Crypto.com eventually creating its own stablecoin as part of its vision of a complete ecosystem.

Claudio Barros, the Portugal-based owner of DBR Electronica and one of the merchants using Pundi X's solutions, believes that stablecoins would be a great addition to the ecosystem:

"Any improvement in stability of coins will be a benefit; we need a range from pegged coins to super volatile coins to cater for different needs."

Crypto is competing both with established e-money systems like WeChat in China, and novel technologies like Calibra. Marszalek believes that it is better than either of those, both due to better performance and better privacy.

Marszalek, who is based in Hong Kong, personally witnessed how the cashless transition in China left him unable to pay in a Beijing restaurant, as Hong Kong WeChat does not work in mainland China. Either way, WeChat's extreme level of surveillance makes him feel uncomfortable.

Wan also pointed to developing countries, noting:

"For the past two years, we also observed that in the countries where the local currency has decreased over time [people] are more aware of crypto or interested in having cryptos."

For Crypto.com, payments are just at "the beginning of the beginning," Marszalek said. But he strongly believes that it is the company's most important product, which will "take our overall platform to a hundred million users in five years."

For crypto in general, the same statements could likely be made as well.

Read the original post:
Why Have Cryptocurrency Payments Failed to Take Off So Far? - Cointelegraph

Read More..