
Technology saves the day as Kenyan firms send staff to work from home – The East African

By PAULINE KAIRU

The Covid-19 coronavirus pandemic has drastically changed life as we know it.

In Kenya, where the number of confirmed cases had reached 15 by the time of going to press, the government asked employers to allow staff to work from home.

In addition, all learning institutions have been shut down as students go online and to the radio for lessons and homework.

Remote interactions and e-commerce have emerged as valuable contingent options amid the pandemic, which is projected to cause one of the biggest economic recessions in recent times.

Microsoft announced that it is offering its Microsoft Teams collaboration platform, for free, to companies to make remote working possible.

Corporate vice president for Microsoft 365 Jared Spataro said, "At Microsoft, our top concern is the well-being of our employees and supporting our customers in dealing with business impact during these challenging times. By making Teams available, we hope that we can support public health and safety by making remote work even easier."

In Kenya, Safaricom and Nation Media Group said they had acquired and were providing laptops, dongles and tech tools to their employees.

Safaricom said over 95 per cent of its workforce has been asked to work from home.

"We will be engaging collaboration tech tools such as Microsoft Teams, WebEx, Yammer and Cisco Jabber to enable teams that would ordinarily be required to work from a certain location to work remotely," the company's chief human resources officer, Paul Kasimu, said.

Safaricom also announced on Tuesday that it was doubling its internet speeds for home fibre packages at no extra cost.

Nation Media Group, under the "Safe Nation" mantra, last week rolled out its business continuity plan for remote working protocols.

NMG systems administrator Sicily Rugendo said, "We are connecting our teams to VPN, a vital tool for internet security, especially when you are working from home, which will allow secure remote access to corporate resources."

"We are enabling e-mail connectivity to different software systems, and giving them interactive tools like Skype for Business, to enable people to hold meetings remotely," Ms Rugendo added.

The Kenya Association of Manufacturers has launched an online directory for locally manufactured goods to help Kenyans shop online and have products delivered to their homes or shops.

"We are doing this to forestall disruptions in the market," said the association's chief executive Phyllis Wakiaga.

The Ministry of Education assured parents and pupils of continuation of learning following the closure of schools.

The Kenya Institute of Curriculum Development said it would deliver the curriculum through YouTube, the Kenya Education Cloud, radio and television starting Monday.

The ministry will broadcast programmes daily, from Monday to Friday, through Radio Taifa and English Service.


Security Software in Telecom Market is Growing Rapidly Due to Increasing Internet Penetration – Press Release – Digital Journal

"Security Software in Telecom Market"

Security Software in Telecom Market Research Report- Forecast till 2025

Market Highlights

Due to the massive expansion of LTE networks, customers have been experiencing seamless connectivity across the globe. This has given telecom companies a significant opportunity to expand their networks and penetrate urban, rural, and remote areas. Furthermore, a perpetual increase in internet usage, growing demand for data services, and rising broadband adoption are some of the critical drivers for the telecom industry. The development of networks has also resulted in a higher number of cyberattacks on the telecom industry. The global security software in telecom market was valued at USD 3,599.4 million in 2018 and is estimated to expand at a CAGR of 11.9% to reach USD 8,923.5 million by 2025.
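As a sanity check on the figures above, the standard CAGR formula can be applied to the quoted endpoints. This back-of-the-envelope sketch is ours, not the report's; the mismatch it reveals suggests the stated 11.9% rate likely compounds from a different base year than 2018:

```python
# Figures are taken from the report quoted above; the implied-CAGR
# arithmetic is our own cross-check.

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate implied by two values `years` apart."""
    return (end_value / start_value) ** (1 / years) - 1

def project(start_value: float, rate: float, years: int) -> float:
    """Value after compounding `rate` annually for `years` years."""
    return start_value * (1 + rate) ** years

start_2018 = 3599.4   # USD million, 2018 valuation
end_2025 = 8923.5     # USD million, 2025 forecast

implied = cagr(start_2018, end_2025, years=7)
print(f"Implied CAGR 2018-2025: {implied:.1%}")   # ~13.8%, not 11.9%
print(f"2025 value at the stated 11.9% CAGR: {project(start_2018, 0.119, 7):.1f}")
# ~7907.5, below the report's 8,923.5 figure
```

Compounding the 11.9% rate over eight years instead of seven lands much closer to the forecast figure, which is why a different base year is the likely explanation.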

Telecom operators have often experienced attacks while signaling, metering, switching, and configuring the network. The increasing number of cyberattacks has motivated security providers to develop sustainable solutions for enterprises; hence, the increase in data breach incidents has led to demand for establishing secure networks. Inadequate infrastructure and the expense of security solutions implementing IPv6, the most recent version of the Internet Protocol, are among the factors hindering the expansion of the security software in telecom market.

Get a Free Sample @ https://www.marketresearchfuture.com/sample_request/6961

Segmentation:

By component, the security software in the telecom market is classified into solutions and services.

By solution, the security software in the telecom market is segmented into identity and access management (IAM), risk and compliance management, encryption, data loss prevention (DLP), unified threat management, security information and event management (SIEM), distributed denial of service mitigation, firewall, and others.

By services, the market is categorized into professional services (risk assessment, design and implementation, support and maintenance, and others) and managed services.

By security, the market is segmented into network security, endpoint security, application security, cloud security, and others.

By deployment, the market is segmented into the cloud and on-premise.

By end-user, the market is segmented into large enterprises, small and medium enterprises (SMEs), and government.

Regional Analysis

The global security software in the telecom market has been segmented into Asia-Pacific, North America, Europe, the Middle East & Africa, and South America.

North America is estimated to dominate the security software in the telecom market, and it is anticipated to retain the lead during the forecast period. The advent of new technology, and adaptation to it, has led to considerable growth in the market. The rise in the deployment of IoT and the prevalence of internet-enabled solutions and cloud services have increased the market's exposure. Canada and the US are facing a higher number of cyberattacks in the telecom sector, which is one of the most significant drivers of demand for security software.

Asia-Pacific is estimated to be the fastest-growing telecom market during the forecast period. Fast digitization, along with developments in IoT and cloud computing, has created a higher risk of cyberattacks and security breaches, resulting in higher demand for security software in the telecom market and encouraging service providers to develop efficient security solutions for telecom operators. Countries like India, Japan, and China are growing at a faster pace and hence contribute to the expansion of the market in the APAC region. Moreover, increasing cybercrime and strict government rules and regulations are expected to further increase demand, thereby growing the market in the region.

Key Players

The prominent players in the global security software in telecom market are Symantec Corporation (US), IBM Corporation (US), Palo Alto Networks (US), Dell Inc (US), McAfee (US), and Trend Micro Inc (Japan), which together contributed about 40% of the market share in 2018.

Other players, such as Check Point (US), Splunk (US), Amazon Web Services (US), Imperva (US), Qualys (US), F-Secure (Finland), HP Enterprise Development LP (US), FireEye (US), Oracle Corporation (US), Forcepoint (US), Fortinet (US), Microsoft Corporation (US), Proofpoint (US), F5 Networks (US), CyberArk (Israel), Sophos (UK), and Juniper Networks (US), have also played a key role and secured positions in the global security software in telecom market.

Browse Complete Report @ https://www.marketresearchfuture.com/reports/security-software-telecom-market-6961

Global Security Software in Telecom Market Research Report: Information by Component [Solution (Identity and Access Management (IAM), Risk and Compliance Management, Encryption, Data Loss Prevention (DLP), Unified Threat Management, Security Information and Event Management (SIEM), Distributed Denial of Service Mitigation (DDoS) and Firewall) and Services (Managed Services and Professional Services)], Deployment Mode (Cloud and On-Premise), Security Type (Network Security, Endpoint Security, Application Security and Cloud Security) and Region [North America (the US, Canada, Mexico), Europe (Germany, the UK, France, Italy, Spain and Rest of Europe), Asia-Pacific (China, Japan, India and the Rest of Asia-Pacific), the Middle East & Africa and South America] - Forecast till 2025

Media Contact
Company Name: Market Research Future
Contact Person: Abhishek Sawant
Email: Send Email
Phone: +1 646 845 9312
Address: Market Research Future, Office No. 528, Amanora Chambers, Magarpatta Road, Hadapsar
City: Pune
State: Maharashtra
Country: India
Website: https://www.marketresearchfuture.com/reports/security-software-telecom-market-6961#summary


In Industrial Realm, Trustworthy Software Ensures – IoT World Today

Trustworthy software requires significant initial planning and a long-term perspective.

While many corporations struggle to win the trust of an ever more cynical public, the stakes are higher for industrial organizations that must rely on various types of software.

Problematic software can cause operational downtime, intellectual property loss and, in some cases, life-threatening consequences.

There has been a recent uptick in interest in trustworthy software concerning the Internet of Things (IoT) and software quality in general. The fate of the digital economy depends on individuals and organizations trusting computing technology. But trust is less sturdy than it has been in the past, as the National Institute of Standards and Technology concluded in 2016.

In recent years, various organizations have made trustworthy software central to their mission. Founded in 2016, the U.K.-based not-for-profit Trustworthy Software Foundation drives best practices in software development. Late last year, the Linux Foundation launched Project Alvarium, an initiative exploring mechanisms to support trust in heterogeneous systems, including IoT deployments, and between diverse stakeholders. The Industrial Internet Consortium advocates the concept of trustworthiness in industrial IoT.

Outcomes to Avoid

A string of events serves as a warning of the risks of relying on untrustworthy industrial software, according to Bob Martin, co-chair of the Software Trustworthiness Task Group at the Industrial Internet Consortium, who co-authored the organization's white paper "Software Trustworthiness Best Practices."

In 2004, for instance, a software glitch caused air traffic control infrastructure and its backup system to shut down in Southern California, according to the L.A. Times. The error resulted in the diversion of 800 commercial airline flights after radio and radar equipment failed for more than three hours.

Other similarly themed stories include a computer-controlled radiation therapy machine that caused several deaths in the 1980s and a power outage in Tempe, Arizona, in 2007 that resulted from a misconfiguration by a vendor engineer.

"Real systems have been deployed in the industrial IoT space with the kinds of errors you don't want to have on your résumé," Martin said.

The explosion of connectivity and new applications in industrial IoT settings has increased the number of professionals creating and procuring software for critical processes. "People who are new to building systems with software or trying to make software resilient may not have run across these events in their education," Martin said.

The variety of systems and operating environments involved with industrial IoT devices poses another challenge as it opens up the possibility of security- or safety-related risks, said Johannes Bauer, principal security adviser, identity management and security at UL. It also complicates the process of looking for faults in the various processing elements and code involved in a single project.

Creating a Common Trust Language

In the industrial realm, trustworthiness has several facets, including safety, security, privacy, reliability and resilience. Trustworthy software can withstand environmental disturbances, human error, system faults and cyberattacks, according to the Industrial Internet Consortium.

Deploying software that can be trusted requires a comprehensive approach that spans the entire software lifecycle, according to Simon Rix, product strategist at Irdeto. "You have to incorporate security early, and you have to work out how to automate it," said Rix, who also co-wrote the IIC white paper.

Fostering conversation between those stakeholders can be challenging, however. "How do you get the businesspeople to speak in a way that the technical people can understand, and how do you keep the technological people from rushing off on their mission to design a product quickly?" Rix asked.

"The key is to address the whole life cycle, all the different software development methodologies, and to make sure you bring in the stakeholders of the business as well as the operators," Martin said. "There's a need for a translation key or Rosetta Stone for the different parties to be able to talk about what they care about, where others around the table can see their perspectives as well."

Frameworks Provide a Starting Point

A growing number of frameworks distill the subject of trust among various stakeholders, but instilling trust in software remains a complex proposition. "The use of the word trust has so much variability that it's almost a useless concept, except it does let us have a dialogue," Martin said.

Putting controls in place to optimize the security and safety of industrial software is a vital first step, but cybersecurity processes need to be continually audited. "The concern I have is you can screw anything up," said Chester Wisniewski, principal research scientist at Sophos. "For example, I can use [the Advanced Encryption Standard], but I can misuse it far more ways than I can use it correctly." Wisniewski draws a parallel from retail: "A lot of stores that have chip readers for credit cards have a piece of cardboard with a sign that says, 'Please swipe,'" he said. "Having chip readers doesn't mean your credit card processing is secure if you don't actually use [technology designed to limit fraud]."

Another pitfall is to focus on deploying secure software initially without considering that it will become obsolete. "We differentiate between end of support and end of use. Just because the original creator may not support the software doesn't mean that it turns into a pillar of salt, that it is unusable," Martin said.

Ironically, the topic of end-of-life software also underscores the importance of focusing on security considerations from the beginning. "If the software is critical to you, then put it in your contract to get rights to the source," Martin advised.

Ultimately, understanding how software works in the real world requires long-term focus. It isn't magical. It reacts, interacts and sometimes needs to be replaced.


How safe is your brand in the hands of a remote workforce? – Bizcommunity.com

Many employees today already have laptops, high-speed internet connectivity and access to networks via the cloud to perform their daily tasks remotely. However, are they equipped to deliver consistent brand experiences that customers have come to expect when dealing with the organisation?

Having invested significantly in their brands for years, companies need to put the best interests of their employees and customers at heart, but not to the detriment of their brands. As such, employees should be equipped with tools that help them meet customers' needs seamlessly and deliver consistent brand experiences in every email and document sent to clients, wherever they are working from.

There are several measures that companies should put in place to secure their brand and deliver a consistent experience in all customer and employee engagements, whether working remotely or not.

Further, the body of emails should be on-brand, using the same font and colour across the company. It is also recommended to have pre-developed and pre-approved content available and easily accessible for employees to insert into emails while working remotely. This requires minimal input and keeps the brand integrity in every communication.

Employees should have access to the latest company letterheads, templates, documents and presentations that are required for client communication. If documents are updated while employees are working remotely, the latest versions should be easy to access without the need for a Virtual Private Network (VPN), and employees should feel comfortable that they are sending their customers the most up-to-date information at all times.

When employees are away from the office, it is critical they are kept up to date on all important company news and information throughout the day to prevent them from becoming disconnected and uncoordinated. An employee communication tool should be used to broadcast information to employees throughout the day and keep them informed about company news.

It would also be valuable to share updates on topical issues such as the latest coronavirus stats regularly via the broadcast tool to minimise the amount of time employees would otherwise spend looking for the information themselves.

To avoid financial and brand damage, companies need to incorporate layered security to help prevent customers and employees from falling victim to email scams, particularly while working with a remote workforce. Centrally managed, tamperproof email signatures are also a first step in helping to prevent fraudulent emails from being sent on behalf of a company. Built-in email verification would also benefit the company and email recipients and give them added peace of mind that emails are authentic.
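For context, "built-in email verification" in practice typically rests on open standards such as SPF, DKIM and DMARC rather than any single vendor's tooling. The article does not name a specific product, so purely as an illustration, here is a minimal parser for a DMARC policy record of the kind a receiving mail server fetches from DNS (the example domain and addresses are hypothetical):

```python
# Illustrative sketch only: parse a DMARC TXT record such as
# "v=DMARC1; p=reject; ..." into its tag/value pairs. A real receiver
# would fetch this record from DNS at _dmarc.<domain> before applying it.

def parse_dmarc(record: str) -> dict:
    """Split a DMARC record into a {tag: value} dictionary."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

record = "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com; pct=100"
policy = parse_dmarc(record)
print(policy["p"])   # "reject": receivers should refuse mail that fails authentication
```

A "p=reject" policy is what ultimately stops fraudulent mail sent in a company's name, which is the layered-security outcome the paragraph above describes.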

However, more than this, companies need to have segmentation of risk built into their email branding solution to safeguard customer and company information at all times, particularly when employees are working remotely. This is key to preventing security breaches.

As such the customer experience has to be nurtured at this time and employees need to be empowered to continue to deliver on-brand experiences wherever they may be working from.


The Well-matched Combo of Quantum Computing and Machine Learning – Analytics Insight

The pace of improvement in quantum computing mirrors the fast advances made in AI and machine learning. It is natural to ask whether quantum technologies could boost learning algorithms: this field of inquiry is called quantum-enhanced machine learning.

Quantum computers are devices that operate on principles from quantum physics. The computers we currently use are built from transistors, and their information is stored as binary 0s and 1s. Quantum computers are built from subatomic systems called quantum bits, qubits for short, which can be in numerous states simultaneously. The principal advantage of quantum computers is that they can perform exceptionally complex tasks at enormous speed; in this way, they can take care of problems that are not presently feasible.
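The "numerous states simultaneously" idea can be made concrete with a toy state-vector simulation. This is only an illustration of the arithmetic behind a single qubit, not how real quantum hardware is built or programmed:

```python
import math

# A classical bit is 0 or 1. A qubit's state is a pair of complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1; measuring it yields 0 with
# probability |a|^2 and 1 with probability |b|^2.

def hadamard(state):
    """Apply the Hadamard gate, which puts a basis state into equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Measurement probabilities for outcomes 0 and 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1.0, 0.0)                      # the qubit starts in state |0>
plus = hadamard(zero)                  # now "0 and 1 at the same time"
print(probabilities(plus))             # ≈ (0.5, 0.5): either outcome equally likely
print(probabilities(hadamard(plus)))   # ≈ (1.0, 0.0): a second H undoes the first
```

The second print line shows interference, the property that makes qubits more than random coin flips: amplitudes can cancel as well as add.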

The most significant advantage of quantum computers is the speed at which they can solve complex problems. While they're lightning fast at what they do, they don't provide the ability to solve problems from undecidable or NP-hard problem classes. There is a set of problems that quantum computing will be able to solve; however, it is not applicable to all computing problems.

Ordinarily, the problems that quantum computers are good at solving involve number or data crunching with a huge number of inputs, for example, complex optimisation problems and communication systems analysis problems: calculations that would normally take supercomputers days, years, even billions of years to brute-force.

The application routinely mentioned as one that quantum computers will be able to break quickly is strong RSA encryption. A recent report by the Microsoft Quantum Team suggests this could well be the case, estimating that it'd be feasible with around a 2,330-qubit quantum computer.
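The reason RSA is the canonical target is that the private key falls out of factoring the public modulus, which is exactly the step Shor's quantum algorithm accelerates. A toy sketch with deliberately tiny primes makes the point; nothing here is quantum, and real keys use moduli of 2048 bits or more, far beyond classical brute force:

```python
# Toy RSA with tiny primes. At this size even brute-force factoring
# "breaks" the key instantly; Shor's algorithm would do the factoring
# step efficiently at real key sizes.

def egcd(a, b):
    """Extended Euclid: returns (g, x, y) with a*x + b*y = g = gcd(a, b)."""
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def modinv(a, m):
    """Modular inverse of a modulo m (requires gcd(a, m) == 1)."""
    g, x, _ = egcd(a, m)
    assert g == 1
    return x % m

p, q = 61, 53                  # secret primes (absurdly small for real use)
n, e = p * q, 17               # public key: modulus 3233, exponent 17
d = modinv(e, (p - 1) * (q - 1))   # private exponent, known only to the key owner

cipher = pow(42, e, n)         # encrypt the message 42
assert pow(cipher, d, n) == 42 # the owner decrypts with d

# An attacker who factors n recovers an equivalent private key:
p2 = next(k for k in range(2, n) if n % k == 0)
d2 = modinv(e, (p2 - 1) * (n // p2 - 1))
assert pow(cipher, d2, n) == 42
print("message recovered by factoring n =", n)
```

The security of RSA thus rests entirely on factoring staying hard, which is why qubit-count estimates like the 2,330 figure above are watched so closely.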

Optimisation applications leading the pack makes sense, since they are at present largely tackled using brute force and raw computing power. If quantum computers can rapidly survey all the potential solutions, an ideal solution can become obvious all the more rapidly. Optimisation stands apart because it is significantly more natural and simpler to get a hold on.

The community of people who can make use of optimisation and robust optimisation is a whole lot bigger than the machine learning community, where the fit between the technology and the requirements is technical and relevant mainly to researchers. What's more, there is a much smaller pool of statisticians in the world than there is of developers.

Specifically, the complexity of incorporating quantum computing into the machine learning workflow presents an impediment. For machine learning professionals and analysts, it is fairly easy to figure out how to program the system; fitting that into a machine learning workflow is more challenging, since machine learning programs are becoming very complex. However, teams have published a good deal of research on how to incorporate it into a training workflow in a way that makes sense.

Undoubtedly, ML experts at present need someone else to handle the quantum computing part: machine learning experts are looking for someone else to do the legwork of building the systems, scaling them up and demonstrating that the approach can fit.

In any case, the intersection of these two fields goes much further than that, and it's not simply AI applications that can benefit. There is a meeting area where quantum computers perform machine learning algorithms and traditional machine learning strategies are used to assess the quantum computers. This area of research is developing at such speed that it has produced a whole new field called Quantum Machine Learning.

This interdisciplinary field is incredibly new, however. Recent work has created quantum algorithms that could go about as the building blocks of machine learning programs, yet the hardware and programming difficulties are as yet significant and the development of fully functional quantum computers is still far off.

The future of AI, sped along by quantum computing, looks bright, with real-time human-like behaviour an almost inevitable outcome. Quantum computing will be capable of taking care of complex AI problems and obtaining multiple solutions to complex problems simultaneously. This will result in artificial intelligence performing complex tasks in more human-like ways. Likewise, robots that can make optimised decisions in real time in practical circumstances will be conceivable once we can use quantum computers based on Artificial Intelligence.

How far away is this future? Considering that just a handful of the world's top organizations and universities are currently building (genuinely immense) quantum computers that do not yet have the processing power required, having a multitude of robots mimicking humans running about is presumably a fair way off, which may comfort some people and disappoint others. Building only one, however? Perhaps not so far away.

Quantum computing and machine learning are extremely well matched: the features the technology offers and the requirements of the field are very close. For machine learning, it provides capabilities that are important for what you have to do, that are difficult to reproduce with a traditional computer, and that you get natively from the quantum computer. Those features cannot be accidental; it will simply take time for people to find the right techniques for integrating them, and then for the technology to embed into that space productively.


Research by University of Chicago PhD Student and EPiQC Wins IBM Q Best Paper – HPCwire

March 23, 2020: A new approach for using a quantum computer to realize a near-term "killer app" for the technology received first prize in the 2019 IBM Q Best Paper Award competition, the company announced. The paper, "Minimizing State Preparations in Variational Quantum Eigensolver (VQE) by Partitioning into Commuting Families," was authored by UChicago CS graduate student Pranav Gokhale and fellow researchers from the Enabling Practical-Scale Quantum Computing (EPiQC) team.

The interdisciplinary team of researchers from UChicago, University of California, Berkeley, Princeton University and Argonne National Laboratory won the $2,500 first-place award for Best Paper. Their research examined how the VQE quantum algorithm could improve the ability of current and near-term quantum computers to solve highly complex problems, such as finding the ground state energy of a molecule, an important and computationally difficult chemical calculation the authors refer to as a killer app for quantum computing.

Quantum computers are expected to perform complex calculations in chemistry, cryptography and other fields that are prohibitively slow or even impossible for classical computers. A significant gap remains, however, between the capabilities of todays quantum computers and the algorithms proposed by computational theorists.

"VQE can perform some pretty complicated chemical simulations in just 1,000 or even 10,000 operations, which is good," Gokhale says. "The downside is that VQE requires millions, even tens of millions, of measurements, which is what our research seeks to correct by exploring the possibility of doing multiple measurements simultaneously."
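The simultaneous-measurement idea rests on the fact that commuting observables can be measured from the same state preparation. Here is a simplified sketch using the common qubit-wise commutation test and a greedy packer; the paper's partitioning into general commuting families is stronger than this, and the example Hamiltonian terms are made up for illustration:

```python
# Group Pauli-string measurement terms into families that can be
# measured in one shot. Two Pauli strings like "XYI" qubit-wise
# commute if, at every qubit, the operators are equal or one is I.

def qubitwise_commute(a: str, b: str) -> bool:
    """True if the two Pauli strings can share a measurement basis."""
    return all(x == y or x == "I" or y == "I" for x, y in zip(a, b))

def greedy_group(terms):
    """Greedily pack terms into simultaneously measurable families."""
    groups = []
    for term in terms:
        for group in groups:
            if all(qubitwise_commute(term, other) for other in group):
                group.append(term)
                break
        else:
            groups.append([term])
    return groups

hamiltonian_terms = ["XXI", "XIX", "IYY", "ZZZ", "IZZ", "XII"]
groups = greedy_group(hamiltonian_terms)
print(groups)   # [['XXI', 'XIX', 'XII'], ['IYY'], ['ZZZ', 'IZZ']]
```

Six terms collapse into three measurement settings here; at the scale of molecular Hamiltonians with thousands of terms, this kind of reduction is what turns tens of millions of measurements into something tractable.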

Gokhale explains the research in this video.

With their approach, the authors reduced the computational cost of running the VQE algorithm by 7-12 times. When they validated the approach on one of IBM's cloud-service 20-qubit quantum computers, they also found lower error compared with traditional methods of solving the problem. The authors have shared their Python and Qiskit code for generating circuits for simultaneous measurement, and have already received numerous citations in the months since the paper was published.

For more on the research and the IBM Q Best Paper Award, see the IBM Research Blog. Additional authors on the paper include Professor Fred Chong and PhD student Yongshan Ding of UChicago CS, Kaiwen Gui and Martin Suchara of the Pritzker School of Molecular Engineering at UChicago, Olivia Angiuli of the University of California, Berkeley, and Teague Tomesh and Margaret Martonosi of Princeton University.

About The University of Chicago

The University of Chicago is an urban research university that has driven new ways of thinking since 1890. Our commitment to free and open inquiry draws inspired scholars to our global campuses, where ideas are born that challenge and change the world. We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

Source: The University of Chicago


Picking up the quantum technology baton – The Hindu

In her Budget 2020 speech, Finance Minister Nirmala Sitharaman made a welcome announcement for Indian science: over the next five years she proposed spending ₹8,000 crore (about $1.2 billion) on a National Mission on Quantum Technologies and Applications. This promises to catapult India into the midst of the second quantum revolution, a major scientific effort being pursued by the United States, Europe, China and others. In this article we describe the scientific seeds of this mission, the promise of quantum technology and some critical constraints on its success, constraints that can be lifted with some imagination on the part of Indian scientific institutions and, crucially, some strategic support from Indian industry and philanthropy.

Quantum mechanics was developed in the early 20th century to describe nature in the small, at the scale of atoms and elementary particles. For over a century it has provided the foundations of our understanding of the physical world, including the interaction of light and matter, and led to ubiquitous inventions such as lasers and semiconductor transistors. Despite a century of research, the quantum world still remains mysterious and far removed from our experiences based on everyday life. A second revolution is currently under way with the goal of putting our growing understanding of these mysteries to use by actually controlling nature and harnessing the benefits of the weird and wondrous properties of quantum mechanics. One of the most striking of these is the tremendous computing power of quantum computers, whose actual experimental realisation is one of the great challenges of our times. Google's announcement in October 2019 that it had demonstrated so-called "quantum supremacy" is one of the first steps towards this goal.

Besides computing, exploring the quantum world promises other dramatic applications, including the creation of novel materials, enhanced metrology and secure communication, to name just a few. Some of these are already around the corner. For example, China recently demonstrated secure quantum communication links between terrestrial stations and satellites. And computer scientists are working towards deploying schemes for post-quantum cryptography: clever schemes by which existing computers can keep communication secure even against quantum computers of the future. Beyond these applications, some of the deepest foundational questions in physics and computer science are being driven by quantum information science, including subjects such as quantum gravity and black holes.

Pursuing these challenges will require an unprecedented collaboration between physicists (both experimentalists and theorists), computer scientists, material scientists and engineers. On the experimental front, the challenge lies in harnessing the weird and wonderful properties of quantum superposition and entanglement in a highly controlled manner by building a system composed of carefully designed building blocks called quantum bits or qubits. These qubits tend to be very fragile and lose their quantumness if not controlled properly, and a careful choice of materials, design and engineering is required to get them to work. On the theoretical front lies the challenge of creating the algorithms and applications for quantum computers. These projects will also place new demands on classical control hardware as well as software platforms.

Globally, research in this area is about two decades old, but in India serious experimental work has been under way for only about five years, and in a handful of locations. What are the constraints on Indian progress in this field? So far we have been plagued by a lack of sufficient resources, high-quality manpower, timeliness and flexibility. The new announcement in the Budget would greatly help fix the resource problem, but high-quality manpower is in global demand. In a fast-moving field like this, timeliness is everything: delaying funding by even one year is an enormous hit.

A previous programme, called Quantum Enabled Science and Technology, has only just been fully rolled out, more than two years after the call for proposals. Nevertheless, one has to laud the government's announcement of this new mission on a massive scale, on a par with similar programmes announced recently by the United States and Europe. This is indeed unprecedented, and for the most part it is now up to the government, its partner institutions and the scientific community to work out the details of the mission and roll it out quickly.

But there are some limits that come from how the government must do business with public funds. Here, private funding, via both industry and philanthropy, can play an outsized role even with much smaller amounts. For example, unrestricted funds that can be used to attract and retain high-quality manpower and to build international networks, all at short notice, can and will make an enormous difference to the success of this enterprise. This is the most effective way, as China and Singapore discovered, to catch up scientifically with the international community while quickly creating a vibrant intellectual environment that helps attract top researchers.

Further, connections with Indian industry from the start would also help quantum technologies become commercialised successfully, allowing Indian industry to benefit from the quantum revolution. We must encourage industrial houses and strategic philanthropists to take an interest and reach out to Indian institutions with an existing presence in this emerging field. As two of us can personally attest, the Tata Institute of Fundamental Research (TIFR), home to India's first superconducting quantum computing lab, would be delighted to engage.

R. Vijayaraghavan is Associate Professor of Physics at the Tata Institute of Fundamental Research and leads its experimental quantum computing effort; Shivaji Sondhi is Professor of Physics at Princeton University and has briefed the PM-STIAC on the challenges of quantum science and technology development; Sandip Trivedi, a Theoretical Physicist, is Distinguished Professor and Director of the Tata Institute of Fundamental Research; Umesh Vazirani is Professor of Computer Science and Director, Berkeley Quantum Information and Computation Center and has briefed the PM-STIAC on the challenges of quantum science and technology development

Read the original:
Picking up the quantum technology baton - The Hindu

Honeywell Achieves Breakthrough That Will Enable The World's Most Powerful Quantum Computer #47655 – New Kerala

The company also announced it has made strategic investments in two leading quantum computing software providers and will work together to develop quantum computing algorithms with JPMorgan Chase. Together, these announcements demonstrate significant technological and commercial progress for quantum computing and change the dynamics in the quantum computing industry.

Within the next three months, Honeywell will bring to market the world's most powerful quantum computer in terms of quantum volume, a measure of quantum capability that goes beyond the number of qubits. Quantum volume measures computational ability, indicating the relative complexity of a problem that can be solved by a quantum computer. When released, Honeywell's quantum computer will have a quantum volume of at least 64, twice that of the next alternative in the industry.

In a scientific paper that will be posted to the online repository arXiv later today and is available now on Honeywell's website, Honeywell has demonstrated its quantum charge coupled device (QCCD) architecture, a major technical breakthrough in accelerating quantum capability. The company also announced it is on a trajectory to increase its computer's quantum volume by an order of magnitude each year for the next five years.
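
The stated trajectory can be sketched numerically. The following is a minimal illustration only, assuming the figures quoted in this article (a starting quantum volume of 64 and a tenfold, order-of-magnitude increase each year for five years); the function name `project_quantum_volume` is a hypothetical helper, not anything published by Honeywell:

```python
# Illustrative projection of the quantum-volume trajectory described above:
# start at 64 and multiply by 10 each year for five years.

def project_quantum_volume(start=64, factor=10, years=5):
    """Return the projected quantum volume at the end of each of the next `years` years."""
    return [start * factor ** y for y in range(1, years + 1)]

print(project_quantum_volume())
# [640, 6400, 64000, 640000, 6400000] -- i.e. 6.4 million after five years
```

On this simple compounding assumption, five order-of-magnitude steps from 64 reach 6.4 million; the sketch says nothing about whether the hardware can sustain that pace.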

This breakthrough in quantum volume results from Honeywell's solution having the highest-quality, fully-connected qubits with the lowest error rates.

"Building quantum computers capable of solving deeper, more complex problems is not just a simple matter of increasing the number of qubits," said Paul Smith-Goodson, analyst-in-residence for quantum computing at Moor Insights & Strategy. "Quantum volume is a powerful tool that should be adopted as an interim benchmarking tool by other gate-based quantum computer companies."

Honeywell Chairman and Chief Executive Officer Darius Adamczyk said companies should start now to determine their strategy to leverage or mitigate the many business changes that are likely to result from new quantum computing technology.

"Quantum computing will enable us to tackle complex scientific and business challenges, driving step-change improvements in computational power, operating costs and speed," Adamczyk said. "Materials companies will explore new molecular structures. Transportation companies will optimize logistics. Financial institutions will need faster and more precise software applications. Pharmaceutical companies will accelerate the discovery of new drugs. Honeywell is striving to influence how quantum computing evolves and to create opportunities for our customers to benefit from this powerful new technology."

To accelerate the development of quantum computing and explore practical applications for its customers, Honeywell Ventures, the strategic venture capital arm of Honeywell, has made investments in two leading quantum software and algorithm providers: Cambridge Quantum Computing (CQC) and Zapata Computing. Both Zapata and CQC complement Honeywell's own quantum computing capabilities by bringing a wealth of cross-vertical market algorithm and software expertise. CQC has strong expertise in quantum software, specifically a quantum development platform and enterprise applications in the areas of chemistry, machine learning and augmented cybersecurity. Zapata creates enterprise-grade, quantum-enabled software for a variety of industries and use cases, allowing users to build quantum workflows and execute them freely across a range of quantum and classical devices.

Honeywell also announced that it will collaborate with JPMorgan Chase, a global financial services firm, to develop quantum algorithms using Honeywell's computer.

"Honeywell's unique quantum computer, along with the ecosystem Honeywell has developed around it, will enable us to get closer to tackling major and growing business challenges in the financial services industry," said Dr. Marco Pistoia, managing director and research lead for the Future Lab for Applied Research & Engineering (FLARE) at JPMorgan Chase.

Honeywell first announced its quantum computing capabilities in late 2018, although the company had been working on the technical foundations for its quantum computer for a decade prior to that. In late 2019, Honeywell announced a partnership with Microsoft to provide cloud access to Honeywell's quantum computer through Microsoft Azure Quantum services.

Honeywell's quantum computer uses trapped-ion technology, which leverages numerous, individual, charged atoms (ions) to hold quantum information. Honeywell's system applies electromagnetic fields to hold (trap) each ion so it can be manipulated and encoded using laser pulses.

Honeywell's trapped-ion qubits can be uniformly generated, with errors that are better understood than those of alternative qubit technologies that do not directly use atoms. These high-performance operations require deep experience across multiple disciplines, including atomic physics, optics, cryogenics, lasers, magnetics, ultra-high vacuum, and precision control systems. Honeywell has a decades-long legacy of expertise in these technologies.

Today, Honeywell has a cross-disciplinary team of more than 100 scientists, engineers, and software developers dedicated to advancing quantum volume and addressing real enterprise problems across industries.

Honeywell (www.honeywell.com) is a Fortune 100 technology company that delivers industry-specific solutions that include aerospace products and services; control technologies for buildings and industry; and performance materials globally. Our technologies help aircraft, buildings, manufacturing plants, supply chains, and workers become more connected to make our world smarter, safer, and more sustainable. For more news and information on Honeywell, please visit http://www.honeywell.com/newsroom.

See the rest here:
Honeywell Achieves Breakthrough That Will Enable The World's Most Powerful Quantum Computer #47655 - New Kerala

Here's How To Predict Major Moves In The Price Of Bitcoin – Forbes

Bitcoin has been swinging wildly over recent months, seeing even higher volatility than usual.

The bitcoin price, which has had all of its 2020 gains wiped out by panic sparked by the spreading coronavirus, fell to 10-month lows earlier this month only to rebound sharply, and is now trading at around $6,000 per bitcoin.

Bitcoin and cryptocurrency investors are keenly watching for any signs of further volatility with one analyst pointing to "large increases in exchange inflows" as heralding extreme bitcoin price moves.

The bitcoin price has fallen heavily in the face of a broader coronavirus-related market sell-off, with some warning bitcoin is failing to act as a so-called safe-haven asset.

"Large increases in exchange inflows have proven to be a good indicator of increased volatility, so we recommend keeping an eye on the amount being transferred to exchanges," Philip Gradwell, the chief economist at New York-based bitcoin, crypto, and blockchain research company Chainalysis, wrote in a blog post this week.

Bitcoin and crypto exchanges saw their daily inflows increase by 250% during the second week of March compared to their 2020 average, according to Chainalysis research.

From March 9 to March 16, exchanges around the world received 1.1 million bitcoin per day, 712,000 bitcoin more than average with trading activity increasing as bitcoin flowing into exchanges was sold.
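
As a quick sanity check, the two figures quoted above imply a 2020 baseline inflow. This is an illustrative back-of-the-envelope sketch using only the numbers in this article, not Chainalysis data or code:

```python
# Implied baseline from the Chainalysis figures quoted above:
# 1.1 million BTC received per day, which is 712,000 BTC above the 2020 average.

daily_inflow = 1_100_000   # BTC per day, March 9-16
excess = 712_000           # BTC per day above the 2020 average

implied_average = daily_inflow - excess
print(implied_average)                            # 388000 BTC per day
print(round(daily_inflow / implied_average, 2))   # 2.84 -- nearly triple the average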

Chainalysis found that bitcoin trading was driven primarily by new bitcoin entering exchanges, rather than bitcoin already held on exchanges.

"The majority of excess bitcoin arriving at exchanges has been sold, and the worst of the oversupply appears to be finished for now," Gradwell wrote, adding that due to the "uncertainty around the COVID-19 pandemic, it's hard to predict where the bitcoin market will go next."

"We also expect that professional traders will continue to drive events, as opposed to retail exchange users, simply because they are responsible for much larger volumes," Gradwell wrote.

Bitcoin exchange inflows rose dramatically during the second week of March, just ahead of the bitcoin price taking a huge step downward.

Last month, ahead of bitcoin's coronavirus-related plunge, research found bitcoin's early 2020 rally was being driven by long-awaited institutional investors buying up bitcoin.

At the peak of 2017's epic rally, bitcoin exchange deposits outpaced the bitcoin price, with bitcoin and crypto analytics firm Glassnode recording around 200,000 daily exchange deposits.

Bitcoin exchange deposits have previously increased along with the bitcoin price, with deposits falling back during bear markets. However, average bitcoin exchange deposits dropped sharply over the last six months even as the bitcoin price rose, suggesting the last bitcoin rally wasn't driven by retail investors.

Originally posted here:
Here's How To Predict Major Moves In The Price Of Bitcoin - Forbes

Did BTC Miners Crash Bitcoin Price With 51 Days Before the Halving? – Cointelegraph

Bitcoin (BTC) price has started to show strength in its recovery since the "Black Thursday" selloff this past week, but is this something we can expect to continue? Or is this a dead cat bounce on the way down to lower lows?

In today's analysis I'm looking not only at the charts, but also at the possibility that large Bitcoin miners caused the 50% price drop on March 12, after supporting data emerged last week suggesting that short-term holders sold a whopping 281,000 BTC, resulting in the crash.

Daily crypto market performance. Source: Coin360.com

In an article published by Coinmetrics on March 17, on-chain data supported the view that short-term BTC holders, rather than long-term holders, were most likely responsible for the selling.

The figures quoted include 281,000 BTC that moved after 30 days or less of holding, compared with just 4,131 BTC that hadn't been touched for over a year before being moved.

This data might suggest to some that it was weak hands that FOMO-bought during Bitcoin's 30% price rise at the beginning of 2020. However, one has to consider the possible motives at play for such a large amount of Bitcoin being sold off cheap.

To me, this opens up the very real possibility that the same people responsible for Bitcoin's price rise this year were also responsible for its fall.

As can be seen in the chart below, Bitcoin had been trending in a downward parallel channel since June 2019, a trend that seemed to bottom on Jan. 4, 2020, when the Bitcoin price took off in a new ascending channel.

This new impetus for Bitcoin's price was welcomed, but not questioned. Why did Bitcoin start to rise? Was it the upcoming halving, now just 51 days away? Was it the mining difficulty increase? Was it renewed institutional interest? Or what if it was all of these things combined, but with a twist?

There are 1,800 new Bitcoins mined every day, so between Jan. 4 and March 12 there would have been 122,400 Bitcoin mined. This is about 50% of the amount that was moved by short-term holders, and you don't get more short term than freshly mined BTC.
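
The arithmetic behind these figures can be reproduced directly. This is a small illustrative sketch using the dates and numbers from this article; the resulting share works out to roughly 44%, the article's "about 50%" ballpark:

```python
from datetime import date

# Reproduce the article's arithmetic: new supply mined between
# Jan. 4 and March 12, 2020, at 1,800 BTC per day, compared against
# the 281,000 BTC reportedly sold by short-term holders.

days = (date(2020, 3, 12) - date(2020, 1, 4)).days   # 68 days (2020 is a leap year)
mined = days * 1_800                                  # 122,400 BTC of new supply
share = mined / 281_000                               # fraction of short-term selling

print(days, mined, round(share, 3))   # 68 122400 0.436
```

Note that this only shows the mined supply was of the same order as the short-term selling; it doesn't establish that miners did the selling, which remains the author's theory.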

BTC USD daily chart. Source: TradingView

I'm not going to pretend that I know any of this for a fact; this is just a theory with a lot of supporting data. But before continuing with my analysis, I will throw out a few reasons why it would make sense for larger miners to crash the market.

I believe the above to be plausible reasons, especially if you consider how much hashing power comes out of China, a nation that pretty much had a crystal ball when it comes to the coronavirus outbreak, as the first cases were being reported there back in November 2019.

This almost creates a perfect storm of conditions for executing the black swan event, one that would have allowed miners to simultaneously gain dominance in the market and regain control of the price. After all, if miners have no control over the price, then the halving will have no impact.

BTC USD daily chart. Source: TradingView

When you zoom out on the Bitcoin 1-day chart, it's almost obvious where miners could have stopped selling. The breakout at the beginning of the year just looks like a bump in the road, as we have since resumed the same downward channel we were in for the entire second half of 2019.

We'll never really know whether or not the above scenario was behind the selloff. However, one thing that is definite is that the price bounced off the support of the descending channel at $4,400, as anticipated in last week's analysis, so I am keeping these levels in mind looking forward to the week ahead.

At present, the price is holding above the middle of the channel, which is around $5,800. However, should this level fail to hold, then I expect $4,200 to be tested next week.

Should $5,800 continue to hold, then $7,200 is the key level of resistance for Bitcoin to push past and flip to support to be rid of this descending channel once and for all.

BTC mining difficulty. Source: BTC.com

Since the beginning of 2020, mining difficulty has largely increased. This, in turn, seemingly saw the price rise, and as such it seemed like a valid indicator.

However, next week we are set to see the year's first double-digit adjustment, and unfortunately it's a negative one of -10.54%. Only time will tell whether this will have a negative impact on the price of Bitcoin.

The yearly trend suggests that it might also just be correcting itself after such a dramatic selloff on the day it last increased. Bitcoin has a history of punishing its holders prior to big rewards, and the next chart might help you visualize what could be in store over the coming weeks and months.

BTC Rainbow Chart. Source: Blockchain Centre

Similar to the stock-to-flow ratio chart, only this one, I feel, helps convey an important message, and that message is: keep buying Bitcoin at these levels.

Whilst this chart isn't intended to be financial advice, and like the S2F model has most likely been fitted with hindsight, what it does show is how long we could potentially remain in the blue "fire sale" zone, both ahead of and beyond the halving.

As a Bitcoin hodler and trader, I take great comfort in one thing shown on this chart: there are many more buying opportunities than there are selling opportunities. So should the price of Bitcoin continue to slide next week, try to view it as an opportunity to buy more, rather than reflecting on how much of a rekt pleb you feel right now.

I said it last week and I'll say it again: the CME gap still exists at $9,165. Whilst it feels almost impossible right now, 90% of CME gaps still fill, so there's always a possibility.

However, being more realistic and looking at the week ahead, if Bitcoin can hold $5,800 as support then all eyes are on $7,200 as the key level to break out from.

From here, I would expect the next level of resistance to be present around $8,000 before we can even start to think about $10,000 again.

We can't completely ignore all the global turmoil right now. Thus, if $5,800 fails to hold, I think it's highly probable that we will revisit $4,200.

Falling below $4,200 is not a scenario I believe we need to consider. However, if this level were to break, then $2,760 would be the last level of support I'd be looking at, as this would represent an 80% retracement from last year's high of $13,800. If it fails to bounce there, then I would fully expect Bitcoin to go to sub-$1,000 levels.
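
The retracement figure can be computed directly from the quoted high. This is a minimal illustrative sketch; the helper name `retracement_level` is made up for this example:

```python
# Price level remaining after retracing a given percentage of the move
# from zero to the quoted high of $13,800.

def retracement_level(high, pct):
    """Return the price left after retracing `pct` percent of the rise to `high`."""
    return high * (100 - pct) / 100

print(retracement_level(13_800, 80))   # 2760.0 -- the last support level discussed
```

An 80% retracement leaves 20% of the high, hence $13,800 x 0.20 = $2,760, matching the level in the analysis.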

The views and opinions expressed here are solely those of @officiallykeith and do not necessarily reflect the views of Cointelegraph. Every investment and trading move involves risk. You should conduct your own research when making a decision.

Here is the original post:
Did BTC Miners Crash Bitcoin Price With 51 Days Before the Halving? - Cointelegraph
