
Bitcoin ranked on top 100 greatest designs of all time – Decrypt

Bitcoin is one of the best-designed products of modern times, according to multinational business magazine Fortune.

Fortune teamed up with the Institute of Design at the Illinois Institute of Technology to create the survey, a replica of one it made in 1959.

Fortune ranks Bitcoin on the same list as major commercial products. Image: Fortune

As a result, numerous educators, influencers, freelance designers and corporate design teams placed Bitcoin in the same weight group as Apple's iPhone, the Macintosh and Google's search engine. Although not quite as highly: they ranked Bitcoin 90th.

"Bitcoin wasn't invented, it was designed so that a wide range of stakeholders (developers, investors, businesses, miners, individuals) all had incentives that reinforced adoption of a new digital currency, without any central issuer or governing authority," said David Kelley, the founder of global design company IDEO.

He noted that, just over 10 years after its release, Bitcoin's market capitalization is nearing $200 billion (most likely, the list was finalized before the recent market-wide liquidation) and the currency is used by millions of people around the world.

"I don't think any product in the history of the world has bootstrapped quite so effectively," Kelley added.

So, at least Bitcoin's not dead yet.


Visit link:
Bitcoin ranked on top 100 greatest designs of all time - Decrypt


FYI: You can trick image-recog AI into, say, mixing up cats and dogs by abusing scaling code to poison training data – The Register

Boffins in Germany have devised a technique to subvert neural network frameworks so they misidentify images without any telltale signs of tampering.

Erwin Quiring, David Klein, Daniel Arp, Martin Johns, and Konrad Rieck, computer scientists at TU Braunschweig, describe their attack in a pair of papers slated for presentation at technical conferences in May and in August this year, events that may or may not take place given the COVID-19 global health crisis.

The papers, titled "Adversarial Preprocessing: Understanding and Preventing Image-Scaling Attacks in Machine Learning" [PDF] and "Backdooring and Poisoning Neural Networks with Image-Scaling Attacks" [PDF], explore how the preprocessing phase involved in machine learning presents an opportunity to fiddle with neural network training in a way that isn't easily detected. The idea being: secretly poison the training data so that the software later makes bad decisions and predictions.

This example image, provided by the academics, of a cat has been modified so that when downscaled by an AI framework for training, it turns into a dog, thus muddying the training dataset

There have been numerous research projects that have demonstrated that neural networks can be manipulated to return incorrect results, but the researchers say such interventions can be spotted at training or test time through auditing.

"Our findings show that an adversary can significantly conceal image manipulations of current backdoor attacks and clean-label attacks without an impact on their overall attack success rate," explained Quiring and Rieck in the Backdooring paper. "Moreover, we demonstrate that defenses designed to detect image scaling attacks fail in the poisoning scenario."

Their key insight is that the algorithms used by AI frameworks for image scaling (a common preprocessing step to resize images in a dataset so they all have the same dimensions) do not treat every pixel equally. Instead, these algorithms, specifically those in the imaging libraries of Caffe's OpenCV, TensorFlow's tf.image, and PyTorch's Pillow, consider only a third of the pixels to compute scaling.

"This imbalanced influence of the source pixels provides a perfect ground for image-scaling attacks," the academics explained. "The adversary only needs to modify those pixels with high weights to control the scaling and can leave the rest of the image untouched."

On their explanatory website, the eggheads show how they were able to modify a source image of a cat, without any visible sign of alteration, to make TensorFlow's nearest scaling algorithm output a dog.
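The mechanics can be sketched in a few lines of NumPy. This is an illustrative toy, not the researchers' code: it implements plain nearest-neighbour downscaling and shows that overwriting only the pixels the scaler samples (under 2 percent of the image) completely determines the downscaled result, while the full-size image stays almost untouched:

```python
import numpy as np

def nearest_downscale(img, out_h, out_w):
    # Nearest-neighbour scaling: each output pixel copies exactly one
    # source pixel, so most source pixels never influence the result.
    in_h, in_w = img.shape
    rows = np.arange(out_h) * in_h // out_h
    cols = np.arange(out_w) * in_w // out_w
    return img[np.ix_(rows, cols)]

# A 64x64 source image, all black (a stand-in for the innocuous "cat").
src = np.zeros((64, 64), dtype=np.uint8)

# The attacker overwrites only the 64 pixels the scaler will sample
# (64 of 4096, about 1.6 percent) with "dog" values of 255.
rows = np.arange(8) * 64 // 8
cols = np.arange(8) * 64 // 8
src[np.ix_(rows, cols)] = 255

small = nearest_downscale(src, 8, 8)
print(small.min())          # 255: the downscaled image is entirely "dog"
print((src == 255).mean())  # ~0.016: ~98% of the full-size image is untouched
```

In the real attack the untouched pixels carry the innocuous image and the sampled pixels encode the target class; the papers show the same pixel imbalance exists in the scaling routines of OpenCV, tf.image, and Pillow.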

This sort of poisoning attack during the training of machine learning systems can result in unexpected output and incorrect classifier labels. Adversarial examples can have a similar effect, the researchers say, but each of those works against only one machine learning model.

Image scaling attacks "are model-independent and do not depend on knowledge of the learning model, features or training data," the researchers explained. "The attacks are effective even if neural networks were robust against adversarial examples, as the downscaling can create a perfect image of the target class."

The attack has implications for facial recognition systems in that it could allow a person to be identified as someone else. It could also be used to meddle with machine learning classifiers such that a neural network in a self-driving car could be made to see an arbitrary object as something else, like a stop sign.

To mitigate the risk of such attacks, the boffins say the area scaling capability implemented in many scaling libraries can help, as can Pillow's scaling algorithms (so long as it's not Pillow's nearest scaling scheme). They also discuss a defense technique that involves image reconstruction.

The researchers plan to publish their code and data set on May 1, 2020. They say their work shows the need for more robust defenses against image-scaling attacks, and they observe that other types of data that get scaled, like audio and video, may be vulnerable to similar manipulation in the context of machine learning.


Follow this link:
FYI: You can trick image-recog AI into, say, mixing up cats and dogs by abusing scaling code to poison training data - The Register


Quantum computing is right around the corner, but cooling is a problem. What are the options? – Diginomica


Why would you be thinking about quantum computing? Yes, it may be two years or more before quantum computing is widely available, but quite a few organizations are already pressing ahead. I'll get into those use cases, but first, let's start with the basics.

Classical computers require built-in fans and other ways to dissipate heat, and quantum computers are no different. Instead of working with bits of information that can be either 0 or 1, as in a classical machine, a quantum computer relies on "qubits," which can be in both states simultaneously (a condition called superposition) thanks to the quirks of quantum mechanics. Those qubits must be shielded from all external noise, since the slightest interference will destroy the superposition, resulting in calculation errors. Well-isolated qubits heat up quickly, so keeping them cool is a challenge.
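The superposition idea can be made concrete with basic linear algebra. A minimal sketch in standard textbook notation, not tied to any particular hardware: a qubit is a unit vector over the |0> and |1> basis states, and measurement probabilities are the squared amplitudes:

```python
import numpy as np

# Basis states |0> and |1> as vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# An equal superposition: the qubit is "both 0 and 1" until measured.
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement collapses it to 0 or 1 with these probabilities.
probs = np.abs(psi) ** 2
print(probs)  # a 50/50 split between the two outcomes
```

The "slightest interference" problem is that noise nudges those amplitudes before the computation finishes, which is why the hardware goes to such lengths to isolate and cool the qubits.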

The current operating temperature of quantum computers is 0.015 kelvin, roughly -273C or -460F. That is the only way to slow down the movement of atoms enough for a qubit to hold a value.

There have been some creative solutions proposed for this problem, such as the "nanofridge," which builds a circuit with an energy gap dividing two channels: a superconducting fast lane, where electrons can zip along with zero resistance, and a slow resistive (non-superconducting) lane. Only electrons with sufficient energy to jump across that gap can get to the superconductor highway; the rest are stuck in the slow lane. This has a cooling effect.

Just one problem, though: the timeline. The inventor, Mikko Möttönen, is confident enough in its eventual success that he has applied for a patent for the device. However, "Maybe in 10 to 15 years, this might be commercially useful," he said. "It's going to take some time, but I'm pretty sure we'll get there."

Ten to fifteen years? It may be two years or more before quantum computing is widely available, but quite a few organizations are already pressing ahead across a range of sectors.

An excellent, detailed report on the quantum computing ecosystem is The Next Decade in Quantum Computing — and How to Play.

But the cooling problem must get sorted. It may be diamonds that finally solve some of the commercial and operational/cost issues in quantum computing: synthetic, also known as lab-grown diamonds.

The first synthetic diamond was grown by GE in 1954. It was an ugly little brown thing. By the '70s, GE and others were growing up to 1-carat off-color diamonds for industrial use. By the '90s, a company called Gemesis (renamed Pure Grown Diamonds) successfully created one-carat flawless diamonds graded Type IIa, meaning chemically near-perfect. Today designer diamonds come in all sizes and colors, with boron added to make them blue or nitrogen to make them yellow.

Diamonds have unique properties. They have high thermal conductivity (the thermal conductivity of a pure diamond is the highest of any known solid, and unlike silicon they won't melt under load). They are also an excellent electrical insulator. A diamond can host an impurity called an N-V (nitrogen-vacancy) center, where a carbon atom is replaced by a nitrogen atom next to a gap in the lattice; an unpaired electron circles that gap and can be excited or polarized by a laser. When excited, the electron gives off a single photon, leaving it in a reduced energy state. Somehow, and I admit I don't completely understand this, the particle is placed into a quantum superposition. In quantum-speak, that means it can be two things, two values, two places at once, where it has both spin up and spin down. That is the essence of quantum computing: the creation of a "qubit," something that can be both 0 and 1 at the same time.

If that isn't weird enough, there is the issue of entanglement. A microwave pulse can be directed at a pair of qubits, placing them both in the same state. Beyond that, you can "entangle" them so that they are always in the same state: if you change the state of one of them, the other also changes, even if great distances separate them, a phenomenon Einstein dubbed "spooky action at a distance." Entangled photons don't need bulky equipment to keep them in their quantum state, and they can transmit quantum information across long distances.
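The correlation described above can be written down directly. A hedged sketch of the textbook two-qubit Bell state (again, notation only, not any vendor's API): the joint state assigns amplitude only to the outcomes 00 and 11, so measuring one qubit fixes the other:

```python
import numpy as np

# Two-qubit state vector over the basis |00>, |01>, |10>, |11>.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)  # amplitude only on |00> and |11>

probs = np.abs(bell) ** 2
# Outcomes 01 and 10 have zero probability: the qubits always agree,
# no matter how far apart they are when measured.
print(probs)
```

Note that this state cannot be written as one single-qubit vector times another, which is exactly what "entangled" means mathematically.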

At least in theory, adding entangled qubits explodes a quantum computer's computing power. In telecommunications, for example, entangled photons that span the traditional telecommunications spectrum have enormous potential for multi-channel quantum communication.

News Flash: Physicists have just demonstrated a 3-particle entanglement. This increases the capacity of quantum computing geometrically.

The cooling of qubits is the stumbling block. Diamonds seem to offer a solution, one that could carry quantum computing into the mainstream. The impurities in synthetic diamonds can be manipulated, the state of a qubit can be held at room temperature, unlike other potential quantum computing systems, and N-V-center qubits (described above) are long-lived. There are still many issues to unravel to make quantum computers feasible, but today, unless you have a refrigerator at home that can operate near absolute zero, hang on to that laptop.

But don't diamonds in computers sound expensive, flagrant, excessive? It raises the question: what is anything worth? Synthetic diamonds for jewelry are not as expensive as mined gems, but the price one pays at retail is burdened by the effect of monopoly and by the many intermediaries: distributors, jewelry companies, and retailers.

A recent book explored the value of fine things and explains that their perceived value has only a psychological basis. In the 1930s, De Beers, which had a monopoly on the world diamond market and too much supply for the weak demand, realized that diamonds were being sold only to the very rich while everyone else was buying cars and appliances, and engaged the N. W. Ayer advertising agency. Together they created a market for diamond engagement rings and introduced the idea that a man should spend at least three months' salary on a diamond for his betrothed.

And in a classic selling of an idea, not a brand, they used earworm taglines like "A Diamond Is Forever." These four iconic words have appeared in every single De Beers advertisement since 1948, and AdAge named it the #1 slogan of the century in 1999. Incidentally, diamonds aren't forever. That diamond on your finger is slowly evaporating.

The worldwide outrage over the blood-diamond scandal is increasing the supply of, and demand for, synthetic diamonds in fine jewelry. If quantum computers take off and a diamond-based architecture becomes a standard, it will spawn a synthetic diamond production boom, increasing supply and drastically lowering cost, making the approach feasible.

Many thanks to my daughter, Aja Raden, an author, jeweler, and behavioral economist, for her insights about the diamond trade.

See the original post here:
Quantum computing is right around the corner, but cooling is a problem. What are the options? - Diginomica


Quantum Computing: Will It Actually Produce Jobs? – Dice Insights

If you're interested in tech, you've likely heard about the race to develop quantum computers. These systems compute via qubits, which exist not only as ones and zeros (as you find in traditional processors) but also in an in-between state known as superposition.

For tasks such as cryptography, qubits and superposition would allow a quantum computer to analyze every potential solution simultaneously, making such systems much faster than conventional computers. Microsoft, Google, IBM, and other firms are all throwing tons of resources into quantum-computing research, hoping for a breakthrough that will make them a leader in this nascent industry.

Questions abound about quantum computing, including whether these systems will actually produce the answers that companies really need. For those in the tech industry, there's a related interest in whether quantum computing will actually produce jobs at scale.

"The large tech companies and research laboratories who are leading the charge on R&D in the pure quantum computing hardware space are looking for people with advanced degrees in key STEM fields like physics, math and engineering," said John Prisco, President & CEO of Quantum Xchange, which markets a quantum-safe key distribution that supposedly will bridge the gap between traditional encryption solutions and quantum computing-driven security. "This is in large part because there are few programs today that actually offer degrees or specializations in quantum technology."

When Prisco was in graduate school, he added, "There were four of us in the electrical engineering program with the kind of physics training this field calls for. More recently, I've seen universities like MIT and Columbia investing in offering this training to current students, but it's going to take awhile to produce experts."

There's every chance that increased demand for quantum-skilled technologists could drive even more universities to spin up the right kind of training and education programs. The National Institute of Standards and Technology (NIST) is evaluating post-quantum cryptography that would replace existing methods, including public-key RSA encryption methods. Time is of the essence when it comes to governments and companies coming up with these post-quantum algorithms; the next evolutions in cryptography will render the current generation pretty much obsolete.

Combine that quest with the current shortage of trained cybersecurity professionals, and you start to see where the talent and education crunch will hit over the next several years. "While hackers weaponizing quantum computers themselves is still a far-off proposal, the threat of harvesting attacks, where nefarious actors steal encrypted data now to decrypt later once quantum computers are available, is already here," Prisco said, pointing at China's 2015 hack of the U.S. Office of Personnel Management, which saw the theft of 21 million government employee records.

"Though that stolen data was encrypted and there is no evidence it has been misused to date, the Chinese government is likely sitting on that trove, waiting for the day they have a quantum computer powerful enough to crack public key encryption," he said. "Organizations that store sensitive data with a long shelf-life need to start preparing now. There is no time to waste."

But what will make a good quantum technologist?


Herman Collins, CEO of StrategicQC, a recruiting agency for the quantum-computing ecosystem, believes that sourcing quantum-related talent at this stage comes down to credentials. "Because advanced quantum expertise is rare, the biggest sign that a candidate is qualified is whether they have a degree in one of the fields of study that relates to quantum computing," he said. "I would say that degrees, particularly advanced degrees, such as quantum physics obviously, physics theory, math or computer science are a good start. A focus on machine learning or artificial intelligence would be excellent as part of an augmented dynamic quantum skill set."

Although Google, IBM, and the U.S. government have infinite amounts of money to throw at talent, smaller companies are occasionally posting jobs for quantum-computing talent. Collins thinks that, despite the relative lack of resources, these small companies have at least a few advantages when it comes to attracting the right kind of very highly specialized talent.

"Smaller firms and startups can often speak about the ability to do interesting work that will impact generations to come and perhaps some equity participation," he said. "Likewise, some applicants may be interested in working with smaller firms to build quantum-related technology from the ground up. Others might prefer a more close-knit team environment that smaller firms may offer."

Some 20 percent of the quantum-related positions, Collins continued, are in marketing, sales, management, tech support, and operations. Even if you haven't spent years studying quantum computing, in other words, you can still potentially land a job at a quantum-computing firm, doing all the things necessary to ensure that the overall tech stack keeps operating.

"It is equally important for companies in industries where quantum can have impactful results in the nearer term to begin to recruit and staff quantum expertise now," Collins said. "Companies competing in financial services, aerospace, defense, healthcare, telecommunications, energy, transportation, agriculture and others should recognize the vital importance of looking very closely at quantum and adding some skilled in-house capability."

Given the amount of money and research-hours already invested in quantum computing, as well as some recent (and somewhat controversial) breakthroughs, there's every chance the tech industry could see an uptick in demand for jobs related to quantum computing. Even for those who don't plan on specializing in this esoteric field, there may be opportunities to contribute.

More here:
Quantum Computing: Will It Actually Produce Jobs? - Dice Insights


The Four Main Reasons Your Cloud Spending Is Out of Control – Cloud Wars

Editor's Note: We're delighted to feature a guest author on Cloud Wars today. Steve Schechter, an IT Director based in Hong Kong, shares the second in his 2-part series on taming runaway cloud costs. You can read Part 1 here.

How much are you spending on cloud every month?

Is it within the budget you set?

Are you able to understand the invoices you receive from your cloud providers? And reliably predict what youll spend in the future?

It's easy enough to answer the first question. But for many companies today, the answer to the others is probably "no."

In my previous article, "Taming Runaway Cloud Expenses: How I Cut Cloud-Hosting Costs by 32%," I shared the specific steps I took to cut one client's cloud-hosting bill by almost one-third. But every company is different, and what worked for one company may not work for another. To help you keep your cloud bill under control, it's important to know why things get out of hand in the first place.

Too many companies today are unaware that public cloud requires a different style of governance than for traditional infrastructure. The lack of proper cloud governance leads to results that are the stuff of nightmares.

More specifically, industry analysts estimate that of the $40 billion spent on public cloud IaaS in 2019, 30% or more is wasted. Gartner estimates that public cloud bills are often two to three times higher than expected and that as many as 80% of companies surveyed report that they consistently go over budget on their IaaS spend.

The situation is bad. Yet the concept of cloud governance isn't receiving the attention it deserves. Plenty of attention is given to cloud architecture, cloud migration, and cloud security, but cloud governance is rarely mentioned at all. The last cloud conference I attended had dozens of break-out sessions covering popular topics like security, hybrid cloud and multi-cloud, and just one poorly attended 45-minute talk on governance.

The main reason that cloud spending gets out of control is the management of resources and approvals. It's a matter of capital expense (CapEx) versus operating expense (OpEx).

Think about how you manage your traditional data-center resources. Buying or leasing almost anything for your data center or colocation service represents a commitment of thousands of dollars, often many, many thousands of dollars. These are generally viewed as capital expenses, and they usually require several layers of approvals: a formal request, management and finance sign-off, then procurement, before anything gets deployed.

But in the brave new world of public cloud, a compute resource costs just pennies per hour. Pay as you go, right? Just a few dollars here, a few pennies there. So it's not a capital expense, it's an operating expense (OpEx), and the workflow often shrinks to a single engineer clicking "create," with no approvals at all.

However, along with that compute resource they've also added some disk space, snapshots and backups, perhaps replication for disaster-recovery purposes, perhaps several servers with load balancing, a new VNet or subnets, new rules for a web application firewall, and so on. Before you know it, this person has created cloud resources that will cost thousands of dollars per year, all without obtaining any approvals or sign-offs and almost certainly without notifying the CTO or CFO about this new financial commitment.
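The arithmetic behind "thousands of dollars per year" is easy to reproduce. The hourly rates below are hypothetical placeholders, not any provider's actual pricing, but the shape of the calculation is the point: small per-hour charges, billed every hour of the year, compound quickly:

```python
# Hypothetical hourly rates for the resources one engineer spins up.
resources = {
    "vm (4 vCPU)": 0.19,
    "disk + snapshots + backups": 0.03,
    "load balancer": 0.025,
    "dr replication": 0.06,
}

hours_per_year = 24 * 365  # the meter runs around the clock
annual = sum(rate * hours_per_year for rate in resources.values())
print(f"${annual:,.0f} per year")  # pennies per hour become thousands per year
```

With these made-up rates the "few pennies there" comes to roughly $2,700 a year, which is exactly the kind of commitment a CapEx process would have flagged.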

Cloud governance is a recent paradigm that recognizes that traditional infrastructure governance doesn't work for public cloud. It's a framework that can cover cost optimization, roles and responsibilities, resiliency, and possibly security and compliance.

This framework generally strikes a balance between fiscal responsibility and agile innovation, and it should recognize that decisions and approvals may no longer be centralized.

To be effective, cloud governance is best managed using a cloud management platform. While cloud vendors offer increasingly sophisticated tools in this area, third-party solutions are much further ahead in terms of features and, as you might expect, work equally well whether you're using a single cloud vendor or all of them.

The most important function of such a platform is its ability to provide automated alerts (and in some cases automated remediations) when policy violations occur, as they invariably will.

Another area where you will frequently encounter waste has to do with the sizing of servers. When you buy a physical server, you expect it to last for at least five years. You may know how big that server needs to be today, but do you know how big it will need to be three years from now? Five years? Probably not. So you buy the largest server you can afford, cross your fingers and hope for the best.

To some extent this mindset has carried over to the cloud as well. If no one is watching over this properly, DevOps is probably spinning up huge virtual servers with far more vCPU and vRAM than is needed. That completely overlooks one of the most fundamental tenets of cloud: cloud is elastic. Spin up what you know you need today; it's an almost trivial matter to migrate to something larger (or smaller) in the future.

People still don't think cloud. They don't understand the on-demand nature of cloud. As a result, a lot of savings opportunities are missed.

Here's the simplest example. Your developers all go home by 7 PM or midnight or whenever. They don't turn up until 10 AM the next morning. They go home on weekends and holidays. While they're not working, their cloud servers aren't working either. The servers are just sitting there, not gathering dust but gathering charges on your bill.

In the good old days, an idle server sitting in your data center only cost you a few pennies extra each month for electricity. But in the brave new world of cloud, idle servers keep billing, and it's easy to power them down and grab those savings.
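A quick back-of-the-envelope, using a hypothetical on-demand rate rather than any real price list, shows how much a simple start/stop schedule recovers for servers that are only needed during working hours:

```python
hourly_rate = 0.19      # hypothetical on-demand price, not a real quote
hours_in_week = 24 * 7  # 168: what an always-on server bills for
active_hours = 50       # roughly a working week of actual use

always_on = hourly_rate * hours_in_week * 52
scheduled = hourly_rate * active_hours * 52
saved_pct = 100 * (1 - scheduled / always_on)
print(f"always-on ${always_on:,.0f}/yr vs scheduled ${scheduled:,.0f}/yr "
      f"({saved_pct:.0f}% saved)")
```

Even for a single modest dev box, shutting down outside working hours cuts the bill by roughly 70 percent, and every cloud provider exposes schedulers or automation hooks to do this without anyone touching a console.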

I can remember sitting in a client's meeting room with their Cloud Ops team. The client had just published their annual report, a 50-page four-color glossy booklet that not only reported on the previous year's results but also laid out the company's vision for the future. In a nice touch, the report had lots of photos of the company's staff hard at work.

"How many people in this room have read the company's latest annual report?" I asked. I looked at the 10 other people sitting around the table, and not a single hand went up. Not one person there had bothered to look at the report, because their perception was that it had nothing to do with their daily jobs.

The best system administrators and DevOps people can often operate with blinders on. They know the tasks they have to do and are intensely focused on them, sometimes to the exclusion of everything else. But, and this is important, this is not their fault.

You might be the greatest manager in the world, but if all you're doing is assigning tasks, checking progress and putting out status reports, you've only done half your job.

It's the manager's job to make people care, to communicate the passion of the company's mission, and to make every person on the team understand not just the vision but also their role in achieving it.

A big piece of this communication is making teams understand the budget impact of their actions, and how saving money not only benefits the company but will directly benefit them as well. After all, more profits mean higher bonuses and possibly even a greater headcount to share the workload.

Optimizing cloud costs is everyone's job. Communicating that consistently to the team, and making everyone understand their role and its importance, makes it easier for you to do your job, because now everyone around you will make it a priority to contribute to the cost-savings effort.

See the original post here:
The Four Main Reasons Your Cloud Spending Is Out of Control - Cloud Wars


Everyone needs the cloud now more than ever. So understand AWS, Azure and more for under $25. – The Next Web

TLDR: The Complete 2020 Cloud Certification Training Bundle will get you ready to manage a company's cloud-based network from anywhere with maximum efficiency.

We've all quickly adjusted to the fact that working from home is going to be the new normal for a while. But if you step back for a minute, you'll see that's a move workers in most industries have been pushing for for quite some time.

And now that platforms like AWS, Azure and others have helped transport many workplace systems from a physical office to a decentralized, easily accessible perch in the cloud, it's never been easier for employees to do the vast majority of their daily work duties right from home.

Adapting to the cloud isn't a matter of if, it's when, so IT pros need to understand how to do their jobs in the new environment with instruction like The Complete 2020 Cloud Certification Training Bundle. Right now, it's over 90 percent off, just $24, from TNW Deals with promo code FLASHSAVE40.

Over 12 courses, students get a close look at each of the most popular cloud platforms today, examining how each operates to learn how they can best help your companys tech processes run more smoothly.

It's no surprise the Amazon Web Services (AWS) universe dominates this package: as the cloud market's leading provider, it's the subject of nine courses digging into the vast assortment of features and options available in its thriving cloud services empire.

The courses here can help even newcomers get familiar with the terminology and processes of managing a system on the AWS platform, and start using some of its most popular tools to improve your company's performance and protection.

Students get a complete introduction to proper data storage procedures using Amazon S3, domain hosting with Route 53, and the correct steps for making sure your network keeps growing responsibly with EC2. And once you've handled the training, there's also a course to get you ready to pass the AWS Solution Architect certification exam and earn your stripes as a credentialed AWS pro.

Meanwhile, other courses offer plenty of opportunity to get to know another huge player in the cloud these days, Microsoft's Azure; how to monitor cloud system updates and changes using Git; and methods to run much of your system automatically with the help of Ansible.

The course package routinely sells for $3,800, but the entire cloud learning collection is on sale now for $2 per course, just $24 while this deal lasts with promo code: FLASHSAVE40.

Prices are subject to change.


See more here:
Everyone needs the cloud now more than ever. So understand AWS, Azure and more for under $25. - The Next Web


Zoom warns investors they may become a victim of their own success as costs spiral – MSPoweruser

The Zoom video conferencing solution has rapidly gained popularity and prominence due to the new Work from Home directives, but it appears all that success is coming at a steep price.

In a new securities filing, Zoom has warned investors that their rapid growth is forcing them to invest in building out the service faster than they had planned.

"We expect our cost of revenue to increase for the foreseeable future, both in absolute dollars and as a percentage of total revenue, as we expand our data centre capacity and third party cloud hosting due to increased usage stemming from the recent outbreak of the COVID-19 virus," Zoom writes in the filing.

Zoom's cloud infrastructure is likely powered by Amazon's AWS, meaning their costs per user are fixed, even while most of their new customers are using their free plans.

"While we have seen increased usage of our service globally, there are no assurances that we will also experience an increase in paying customers or that new or existing users will continue to utilize our services at the same levels after the outbreak has tempered," Zoom said in the filing.

Zoom also warns that there is a price for not investing more in response to the increase in demand.

"Any unfavorable publicity or perception of our platform, including any delays or interruptions in service due to capacity constraints stemming from increased usage due to the recent outbreak of the COVID-19 virus, or of the providers of communication and collaboration technologies generally, could adversely affect our reputation and our ability to attract and retain hosts," Zoom writes.

Poor reliability could force users back to more mainstream solutions such as Microsoft Teams, which runs its own backend. We have already heard how Microsoft is positioning themselves directly against Zoom, and any weakness on their part will likely rapidly be exploited.

For their part, Zoom has not yet clamped down on free users, and has in fact removed the 40-minute limit on video calls in China and for educational institutions in several countries.

For now, CEO Eric Yuan has said at the moment he is just focused on providing the best product to customers and helping those impacted by the pandemic.

Via Business Insider.

Originally posted here:
Zoom warns investors they may become a victim of their own success as costs spiral - MSPoweruser

Read More..

Leostream to the Remote Access Rescue – Associated Press

Press release content from Newswire. The AP news staff was not involved in its creation.


WALTHAM, Mass. - March 19, 2020 - ( Newswire.com )

Dewpoint, an Information Technology Services and Solutions Provider ( http://www.dewpoint.com ), leverages the Leostream Platform for their cloud-based virtual desktop infrastructure solution, enabling their customers to save money and improve efficiency by leveraging the cloud as a hosting platform for VDI.

Last week, a call center customer using Dewpoint's Next Gen VDI offering was seeking a solution to allow their support and sales staff to work from home in the event their building closed due to COVID-19.

Dewpoint turned to Leostream, specifically the Leostream Gateway and HTML5 viewer that was already included in the customer's Leostream Platform.

The Dewpoint team started working on getting the security in place, and in just four hours the customer's staff were able to remotely log into desktops both as support and sales. That evening the customer had a handful of employees from their sales and support teams successfully test their connections from home, and now the call center is operating remotely.

With Leostream, Dewpoint's customer is continuing to support their customers from the safety and convenience of their employees' homes.

"'Leostream to the rescue' was the literal subject of an email I received the other day," said Karen Gondoly, CEO of Leostream. "Leostream has always made it easy for our customers to provide remote access for a roaming workforce. While the circumstances now are far from ideal, we're proud that we're doing our part to help organizations keep their workforce productive and safe."

Ron Cox, Senior Systems Engineer at Dewpoint, said, "Leostream helped our customer's call center work from home with the virus outbreak. Right now [our customer] is operating their call center sales and support 100% using the Leostream Gateway external HTML5 viewer."

Leostream is a vendor-agnostic platform providing a comprehensive and scalable solution for organizations to securely deliver and manage virtual desktops, remote sessions, and applications hosted on-premises, in a private cloud, public cloud, or hybrid cloud environment. Learn more at https://leostream.com.

Media Contact: Karen Gondoly, Phone: 781-890-2019, Email: sales@leostream.com


Originally posted here:
Leostream to the Remote Access Rescue - Associated Press

Read More..

AMD and Intel have a formidable new foe but you'll never guess who it is – TechRadar

An unexpected rival has emerged that could give Intel and AMD a run for their money, at least in the very lucrative server and cloud computing market.

Amazon's new Graviton2 CPU has been tested extensively by Andrei Frumusanu from our sister website Anandtech, and the results show this new kid on the block outstrips the incumbents when it comes to performance per dollar.

Graviton2 was tested against two other cloud computing resources offered by Amazon Web Services: the m5a (AMD EPYC 7571) and m5n (Intel Xeon Platinum 8259CL Cascade Lake). Andrei found it could offer savings of up to 54%, which he says represents "a massive shakeup for the AWS and EC2 ecosystem."

So, how did Amazon achieve these results? The chip comes from Annapurna Labs and packs 64 A76 ARM cores - similar to what you can find in a smartphone - with 33MB cache and a high clock speed. Amazon is Annapurna Labs' only customer, which means the processor is extremely fine-tuned for AWS workloads.

According to Andrei, unless you're tied to the x86 platform, you'd be stupid not to switch over to Graviton2 instances once they become more widely available for everything from VPN (AWS VPN) to web hosting (Amazon Lightsail).
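Performance per dollar is simply a benchmark score divided by an instance's hourly price. The sketch below shows the shape of such a comparison; the scores and prices are made-up placeholders, not Anandtech's measured figures.

```python
# Illustrative performance-per-dollar comparison across instance types.
# Benchmark scores and on-demand hourly prices below are hypothetical
# placeholders, not measured results from the article.
instances = {
    # name: (relative benchmark score, on-demand $/hour)
    "m6g (Graviton2)":   (0.95, 0.62),
    "m5a (EPYC 7571)":   (0.80, 0.69),
    "m5n (Xeon 8259CL)": (1.00, 0.95),
}

# Higher is better: how much benchmark performance each dollar buys.
perf_per_dollar = {
    name: score / price for name, (score, price) in instances.items()
}

baseline = perf_per_dollar["m5n (Xeon 8259CL)"]
for name, ppd in sorted(perf_per_dollar.items(), key=lambda kv: -kv[1]):
    print(f"{name:20s} perf/$ = {ppd:.2f} ({ppd / baseline - 1:+.0%} vs m5n)")
```

The point of the metric is that a chip can lose on raw performance (0.95 vs 1.00 here) yet win decisively once price is factored in, which is the pattern the review describes.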

For now, expect AMD's EPYC 2 processors to put up a bit of a fight - at least until Graviton3 lands.

More:
AMD and Intel have a formidable new foe but you'll never guess who it is - TechRadar

Read More..

Business Process-as-a-Service (BPaaS) Market To Reach USD 120.70 Billion By 2026 Growing at a CAGR of 12.6% – Weather News Point

The Business Process-as-a-Service market was worth USD 46.38 billion in 2018, and the BPaaS market is forecast to reach USD 120.70 billion by 2026, growing at a CAGR of 12.6%.
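As a quick sanity check of the headline figures, the implied compound annual growth rate from the 2018 base to the 2026 forecast can be computed directly:

```python
# Verify the report's CAGR from its endpoint figures:
# CAGR = (end / start) ** (1 / years) - 1
start, end, years = 46.38, 120.70, 8  # USD billion, 2018 -> 2026

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")
```

This works out to roughly 12.7%, consistent with the 12.6% the report states (the small gap is rounding in the endpoint figures).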

Business Process-as-a-Service (BPaaS) refers to web-delivered or cloud-hosted models offered by business process outsourcing (BPO) companies that serve multiple tenants across most industry verticals. The delivered functions may derive from any of the cloud-based Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS), and Infrastructure-as-a-Service (IaaS) models.

The global business process-as-a-service (BPaaS) market is growing predominantly because industries across the board are gradually adopting cloud-based, on-demand BPaaS models. BPaaS has, in turn, made traditional outsourcing models more flexible, clean, accessible, and affordable by lowering up-front investment costs. The operations & finance management sub-segment of the business application segment has the highest usage rate and is also projected to achieve the highest growth rate. Mobile business process management (mBPM) solutions are expected to facilitate and broadly drive the market.

Key participants include Microsoft Corporation, Tata Consultancy Service, HCL Technologies, Genpact, Accenture plc, Wipro, IBM Corporation, Oracle, Concentrix, and Infosys BPM.

Request free sample of this research report at: https://www.reportsanddata.com/sample-enquiry-form/2592

APAC is forecast to achieve the fastest growth, of about 17.8%, over the period 2019–2026, due to increasing adoption of cloud-based back-end business and financial processes serviced from BPOs by both SMEs and large enterprises across all industry verticals in countries like China, Singapore, and India. North America held the highest market share, of about 37.8%, in 2018, owing to its high rate of adoption and innovation across influential business process verticals, especially among large conglomerates.

For the purpose of this report, Reports and Data have segmented the global business process-as-a-service (BPaaS) market on the basis of the business application, component, organization size, end-use industries, and region:

Business Application Outlook (Revenue: USD Billion; 2016-2026)

Component Outlook (Revenue: USD Billion; 2016-2026)

Organization Size Outlook (Revenue: USD Billion; 2016-2026)

End-Use Industries Outlook (Revenue: USD Billion; 2016-2026)

Regional Outlook (Revenue: USD Billion; 2016-2026)

Buy Your Exclusive Copy Now: https://www.reportsanddata.com/checkout-form/2592

Further key findings from the report suggest

To identify the key trends in the industry, click on the link below: https://www.reportsanddata.com/report-detail/business-process-as-a-service-bpaas-market

Here is the original post:
Business Process-as-a-Service (BPaaS) Market To Reach USD 120.70 Billion By 2026 Growing at a CAGR of 12.6% - Weather News Point

Read More..