
Bitcoin Price Retreat As Bulls Run Out Of Steam At The 50-Day SMA – InvestingCube

Bitcoin's price retreated for the second consecutive trading session but managed to rebound from its daily lows amid an improvement in equity markets. Bitcoin has mirrored moves in the stock markets in recent weeks, with its price driven by risk-on/risk-off sentiment. Bitcoin has been rejected at the 50-day moving average several times in recent days, and bears might attempt a break below the $9,000 mark.

Bank of England governor Andrew Bailey said during a webinar that the BOE is looking into creating a central bank digital currency (CBDC). The central bank is investigating a government-backed digital currency, which would have major implications across the payments system and society. Bailey said that a CBDC could be a real possibility within a few years.

Cryptocurrencies are trading mixed today. Ripple (XRPUSD) is 0.24% higher at $0.1990. Ethereum (ETHUSD) is 0.44% higher at $240.63. Litecoin (LTCUSD) is 0.11% lower at $43.85. Lumen (XLMUSD) is 0.82% higher at $0.09103.

Bitcoin price is 0.12% lower at $9,221, as the recent positive momentum for bitcoin was cancelled at the 50-day moving average, where the price has been rejected several times in recent trading sessions. The technical picture has deteriorated in recent days after the failure to break above the $9,400 mark. If the bitcoin price fails to break above the 50-day moving average soon, it might see a correction down toward the 100-day moving average near $8,800.

On the upside, the initial resistance for Bitcoin will be met at $9,242, the daily top. The next obstacle for Bitcoin stands at $9,395, the 50-day moving average. If the Bitcoin price breaks higher, then the next target stands at $9,650, the high from June 24th.

On the other side, the immediate support for Bitcoin stands at $9,095, the daily low. A break below $9,095 would test the $9,000 psychological support. If the bears continue the selling pressure, then the next target will be met at $8,833, the 100-day moving average.

Don't miss a beat! Follow us on Telegram and Twitter.


Excerpt from:
Bitcoin Price Retreat As Bulls Run Out Of Steam At The 50-Day SMA - InvestingCube


US Dept of Homeland Security Buys Analytics Software From Coinbase | News – Bitcoin News

Coinbase is selling its blockchain analytics software to the U.S. Department of Homeland Security and the U.S. Secret Service. Following criticism from the crypto community, CEO Brian Armstrong defended Coinbase's position.

Public records on the U.S. government's websites reveal that the San Francisco-based crypto exchange Coinbase has signed a contract with the U.S. government for its blockchain analytics software. The records were first spotted by The Block.

The contract, awarded by the U.S. Department of Homeland Security (DHS), was signed on May 9. It went into effect the next day with a tentative end date of May 11, 2024. The obligated amount is currently $49,000 and the potential award amount is $183,750. The contracting agency is the U.S. Secret Service, a federal agency that investigates monetary crimes such as fraud and counterfeiting; it was transferred from the Department of the Treasury to the Department of Homeland Security on March 1, 2003.

Following the news of Coinbase selling its analytics software to the U.S. Secret Service, many people took to Twitter to criticize the company's action, with some urging others to delete Coinbase, saying that the company is bad for bitcoin and crypto.

Coinbase CEO Brian Armstrong quickly defended his company's decision. "Blockchain analytics software is nothing new, has been around a long time. It uses publicly available data to try and track crypto transactions, usually to catch bad actors," he tweeted.

Armstrong proceeded to explain that his company started off by using some of the existing blockchain analytics services out there. "This worked out ok, but the issue with it was that we don't like sharing data with third parties when we can avoid it, and they didn't support all the features/chains we needed. So we realized at some point we would need to bring this capability in house," the CEO described, elaborating:

"It's expensive to build this capability, and we want to recoup costs. There is an existing market for blockchain analytics software, so we sell it to a handful of folks as well. It also helps us build relationships with law enforcement, which is important to growing crypto."

Last month, it was reported that Coinbase wanted to sell its analytics software to two other U.S. government agencies: the Drug Enforcement Administration (DEA) and the Internal Revenue Service (IRS). Meanwhile, the company is reportedly planning an initial public offering (IPO) in the U.S.

What do you think about Coinbase selling its analytics software to the government? Let us know in the comments section below.

Image Credits: Shutterstock, Pixabay, Wiki Commons, U.S. government

Disclaimer: This article is for informational purposes only. It is not a direct offer or solicitation of an offer to buy or sell, or a recommendation or endorsement of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

Read the original here:
US Dept of Homeland Security Buys Analytics Software From Coinbase | News - Bitcoin News


Bug in Bitcoin Wallets Found Using the Replace-By-Fee Feature – PRNewswire

DUBLIN, July 13, 2020 /PRNewswire/ -- ResearchAndMarkets.com published a new article on the bitcoin industry "Bug in Bitcoin Wallets Found Using the Replace-By-Fee Feature"

A team at ZenGo discovered the BigSpender bug affecting major crypto-wallets, including Ledger Live, Edge, BreadWallet and potentially many more. The bug exploits how certain wallets handle the replace-by-fee feature which allows a user to swap an unconfirmed transaction with another transaction that has a higher fee. The RBF feature has become a standard way for users to send bitcoin and was developed as a way to circumvent slow confirmation times by paying more in fees.

Attackers can send funds to a wallet and set the fees low enough to almost guarantee the transaction will not receive a confirmation. The attacker can then use the RBF feature to replace the pending transaction with a transaction to another wallet that they control. For vulnerable wallets, this pending transaction will be reflected as an increase in the account balance, leading some users to believe they have received funds even though they have not. Attackers can also use the BigSpender vulnerability to send multiple fake transactions and reroute them before they are confirmed. This can cause the victim's stated balance and actual funds to become decoupled and could make the wallet unusable. Both Breadwallet and Ledger Live have released fixes to prevent the attacks.
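The balance-accounting flaw can be sketched in a few lines. The following is an illustrative Python model, not ZenGo's published exploit code, and the field names are hypothetical; it contrasts a vulnerable wallet that counts unconfirmed, RBF-replaceable deposits with one that waits for confirmation:

```python
# Illustrative sketch of the BigSpender balance flaw. A vulnerable
# wallet counts unconfirmed, replaceable (RBF-flagged) incoming
# transactions in its displayed balance, so an attacker can make the
# victim see funds that will never confirm.

def displayed_balance(transactions, count_unconfirmed_rbf=True):
    """Sum incoming transactions into the wallet's displayed balance.

    With count_unconfirmed_rbf=True (the vulnerable behavior), pending
    RBF transactions that the sender can still replace are counted; a
    safer wallet only counts them once confirmed.
    """
    total = 0
    for tx in transactions:
        if tx["confirmed"]:
            total += tx["amount"]
        elif count_unconfirmed_rbf and tx["rbf"]:
            total += tx["amount"]  # attacker can still replace this tx
    return total

# Attacker sends 1.0 BTC with a fee too low to confirm, RBF enabled.
pending = [{"amount": 1.0, "confirmed": False, "rbf": True}]

print(displayed_balance(pending))                               # vulnerable wallet shows 1.0
print(displayed_balance(pending, count_unconfirmed_rbf=False))  # safe wallet shows 0
```

The fixes shipped by the affected wallets amount to the second behavior: treating replaceable transactions as unsettled until they are confirmed on-chain.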

To see the full article and a list of related reports on the market, visit "Bug in Bitcoin Wallets Found Using the Replace-By-Fee Feature"

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Research and Markets also offers Custom Research services providing focused, comprehensive and tailored research.

Media Contact:

Research and Markets Laura Wood, Senior Manager [emailprotected]

For E.S.T Office Hours Call +1-917-300-0470 For U.S./CAN Toll Free Call +1-800-526-8630 For GMT Office Hours Call +353-1-416-8900

U.S. Fax: 646-607-1907 Fax (outside U.S.): +353-1-481-1716

SOURCE Research and Markets

http://www.researchandmarkets.com

Here is the original post:
Bug in Bitcoin Wallets Found Using the Replace-By-Fee Feature - PRNewswire


Standard Chartered and Universities Space Research Association join forces on Quantum Computing – PRNewswire

LONDON and MOUNTAIN VIEW, Calif., July 13, 2020 /PRNewswire/ -- Standard Chartered Bank and Universities Space Research Association (USRA) have signed a Collaborative Research Agreement to partner on quantum computing research and developing quantum computing applications.

In finance, the most promising use cases with real-world applications include quantum machine learning models (generating synthetic data and data anonymisation) and discriminative models (building strong classifiers and predictors) with multiple potential uses such as credit scoring and generating trading signals. As quantum computing technology matures, clients should benefit from higher quality services such as faster execution, better risk management and the development of new financial products.

Kahina Van Dyke, Global Head of Digital Channels and Client Data Analytics at Standard Chartered, said: "Similar to other major technological advancements, quantum computing is set to bring widespread benefits as well as disrupt many existing business processes. This is why it's important for companies to future-proof themselves by adopting this new technology from an early stage. The partnership with USRA gives us access to world-class academic researchers and provides us with a unique opportunity to explore a wide range of models and algorithms with the potential to establish quantum advantage for the real-world use cases."

Bernie Seery, Senior VP of Technology at USRA noted that "This partnership with the private sector enables a diversity of research through a competitively selected portfolio of quantum computing research projects involving academic institutions and non-profits, growing an ecosystem for quantum artificial intelligence that has already involved over 150 researchers from more than 40 organizations that produced over 50 peer-reviewed publications over the last seven years."

Alex Manson, Global Head of SC Ventures, Standard Chartered's innovation, fintech investment and ventures arm, stated: "The world is currently in the process of identifying commercial use cases where quantum computer capabilities will surpass classical computers. We have a conviction that some of these use cases will transform the way we manage risks in financial services, for example by simulating portfolios and exponentially speeding up the generation of market data. We will work with USRA to identify such use cases in financial services, with a view to implementing them within our bank, as well as potentially offering this service to other market participants over time."

Mark Johnson, Vice President, Processor Design, Development and Quantum Products at D-Wave said: "Quantum computing research and development are poised to have a profound impact on the industries responsible for solving today's most complex problems. That's why researchers and businesses alike are looking to quantum computing today to start demonstrating tangible value. We're proud to work with USRA and Standard Chartered Bank as they improve global access to quantum systems and undertake essential research and development."

At USRA's Research Institute for Advanced Computer Science, Dr. Davide Venturelli, Associate Director for Quantum Computing, notes that quantum annealing implements a powerful approach to computing, featuring unique advantages with respect to other traditional and novel approaches, and that it should be studied, theoretically and experimentally, to advance the state of the art of computing technologies for the benefit of nearly all disciplines.

Standard Chartered's team, led by Dr. Alexei Kondratyev, Global Head of Data Science and Innovation, and USRA have collaborated in quantum computing research since 2017. An earlier success in investigating the quantum annealing approach to computational problems in portfolio optimisation use cases led to this strategic partnership, where USRA will continue to support fundamental academic research in quantum physics and artificial intelligence and Standard Chartered will focus on future commercial applications.

In 2012, USRA partnered with NASA to found the Quantum Artificial Intelligence Laboratory (QuAIL): the space agency's hub to evaluate the near-term impact of quantum technologies. With QuAIL, the USRA team has investigated the physics, the engineering and the performance of multiple generations of quantum annealing processors built by D-Wave Systems, as well as participating in U.S. government research programs that looked into application of quantum annealing for combinatorial optimization, aviation, earth science and machine learning. NASA Ames Research Center is currently hosting a D-Wave 2000Q annealing system that will be made available for free for research by U.S. Universities, thanks to the support of this partnership.

Standard Chartered and USRA intend to develop this initial collaboration beyond quantum annealing to all unconventional computing systems that could provide an advantage to applications of interest, such as gate-model noisy-intermediate scale quantum (NISQ) processors and Coherent Ising machines.

For more information, contact: Standard Chartered: Group Media Relations Contact: Shaun Gamble, [emailprotected] Tel: +44 2078855934

USRA: PR Contact: Suraiya Farukhi, [emailprotected] Technical Contact: David Bell, [emailprotected]

About USRA

Founded in 1969, under the auspices of the National Academy of Sciences at the request of the U.S. Government, the Universities Space Research Association (USRA) is a nonprofit corporation chartered to advance space-related science, technology and engineering. USRA operates scientific institutes and facilities, and conducts other major research and educational programs, under Federal funding. USRA engages the university community and employs in-house scientific leadership, innovative research and development, and project management expertise. RIACS is a USRA department for research in fundamental and applied information sciences, leading projects on quantum computing funded by NASA, DARPA, the U.S. Air Force and NSF.

More info at: https://riacs.usra.edu/quantum/ and http://www.usra.edu.

About Standard Chartered

We are a leading international banking group, with a presence in 59 of the world's most dynamic markets, and serving clients in a further 85. Our purpose is to drive commerce and prosperity through our unique diversity, and our heritage and values are expressed in our brand promise, Here for good.

Standard Chartered PLC is listed on the London and Hong Kong Stock Exchanges as well as the Bombay and National Stock Exchanges in India.

For more stories and expert opinions please visit Insights at sc.com. Follow Standard Chartered on Twitter, LinkedIn and Facebook.

SOURCE Universities Space Research Association

http://www.usra.edu

Read more:
Standard Chartered and Universities Space Research Association join forces on Quantum Computing - PRNewswire


Standard Chartered teams up with Universities Space Research Association on development of quantum computing apps – FinanceFeeds

In finance, the most promising use cases with real-world applications include quantum machine learning models and discriminative models with various potential uses such as credit scoring and generating trading signals.

Standard Chartered Bank today announces the signing of a Collaborative Research Agreement with Universities Space Research Association (USRA) to partner on quantum computing research and developing quantum computing applications.

In finance, the most promising use cases with real-world applications include quantum machine learning models (generating synthetic data and data anonymisation) and discriminative models (building strong classifiers and predictors) with multiple potential uses such as credit scoring and generating trading signals. As quantum computing technology matures, clients should benefit from higher quality services such as faster execution, better risk management and the development of new financial products.

Alex Manson, Global Head of SC Ventures, Standard Chartered's innovation, fintech investment and ventures arm, explains:

The world is currently in the process of identifying commercial use cases where quantum computer capabilities will surpass classical computers. We have a conviction that some of these use cases will transform the way we manage risks in financial services, for example by simulating portfolios and exponentially speeding up the generation of market data. We will work with USRA to identify such use cases in financial services, with a view to implementing them within our bank, as well as potentially offering this service to other market participants over time.

Standard Chartered's team, led by Dr. Alexei Kondratyev, Global Head of Data Science and Innovation, and USRA have collaborated in quantum computing research since 2017. An earlier success in investigating the quantum annealing approach to computational problems in portfolio optimisation use cases led to this strategic partnership, where USRA will continue to support fundamental academic research in quantum physics and artificial intelligence and Standard Chartered will focus on future commercial applications.

Standard Chartered and USRA intend to develop this initial collaboration beyond quantum annealing to all unconventional computing systems that could provide an advantage to applications of interest, such as gate-model noisy-intermediate scale quantum (NISQ) processors and Coherent Ising machines.

Go here to see the original:
Standard Chartered teams up with Universities Space Research Association on development of quantum computing apps - FinanceFeeds


The crypto-agility mandate, and how to get there – Help Net Security

To achieve long-term data protection in today's fast-changing and uncertain world, companies need the ability to respond quickly to unforeseen events. Threats like quantum computing are becoming more real while cryptographic algorithms are subject to decay or compromise. Without the ability to identify, manage and replace vulnerable keys and certificates quickly and easily, companies are at risk.

So, what do we mean when we talk about crypto-agility? Fundamentally, you will have achieved crypto-agility when your security systems are able to rapidly deploy and update algorithms, cryptographic primitives, and other encryption mechanisms. Going a step further, it means you have achieved complete control over your cryptographic mechanisms, including your public key infrastructure (PKI) and associated processes, and can quickly make whatever changes are needed without intense manual effort.
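As a toy illustration of the principle (not any particular vendor's implementation), crypto-agility at the code level can be as simple as resolving the algorithm through a policy setting rather than hard-coding it, shown here with Python's standard hashlib:

```python
import hashlib

# Hypothetical sketch of the crypto-agility principle: the algorithm is
# referenced through configuration rather than hard-coded, so a
# deprecated primitive can be swapped by changing one setting, with no
# changes to the calling code.

CRYPTO_POLICY = {"hash": "sha256"}  # later upgradable to e.g. "sha3_256"

def policy_hash(data: bytes) -> str:
    """Hash data with whatever algorithm the current policy mandates."""
    return hashlib.new(CRYPTO_POLICY["hash"], data).hexdigest()

print(policy_hash(b"hello"))        # hashed with SHA-256

CRYPTO_POLICY["hash"] = "sha3_256"  # policy update: no code change needed
print(policy_hash(b"hello"))        # same call, now SHA3-256
```

The same indirection applies to signature schemes, key sizes, and TLS configuration: the fewer places an algorithm name is baked in, the faster a compromised primitive can be retired.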

The replacement of manual processes with automated ones is critical to keeping up with accelerating change. As computing power and security technologies continue to evolve at a faster and faster pace, your existing cryptographic infrastructure is destined to become obsolete in a few years unless you can keep it upgraded to the latest technologies. Notably, threats continue to evolve as well.

Moreover, as the world transforms to depend on digital systems more fully, we've embedded cryptography deeply into virtually every communication system in the world. It's no longer possible for cryptography to remain isolated from other critical systems. The vast interdependent nature of modern systems makes it imperative that IT teams have the ability to respond quickly or face the risk of major outages and disruption.

Cryptographic standards like RSA, ECC, and AES that are in broad use today are constantly being updated with more advanced versions. Eventually governing bodies like NIST get in the act and mandate the use of the latest standards, with browser and cloud providers often raising the bar as well. To avoid becoming non-compliant, you must have the ability to quickly upgrade all your systems that rely on deprecated cryptography.

A robust, cryptographically agile infrastructure also brings other long-term benefits and plays a critical role in preventing security breaches. Achieving crypto-agility will make your operations teams more efficient and eliminate unnecessary costs such as consulting fees, temporary staff, fines, or remediation costs.

Such scenarios can unfold when a bad actor gains admin access, for instance, and may or may not have issued certificates. This uncertainty means that certificates from the impacted certificate authority (CA) can no longer be trusted and all certs from that CA must be revoked and re-issued. Without crypto-agility and a clear understanding of your potential exposure, you're looking at a costly all-hands-on-deck response to track and update hundreds or thousands of certs. And, of course, anytime you have humans involved with security response, you're opening yourself to human error and further compromise and outages.

The looming threat of quantum computing (some say we could see 100,000x faster quantum computers as soon as 2025) represents another compelling reason to focus on improving your crypto-agility. While all crypto algorithms are breakable on paper, the incredible computing power required for such a feat does not currently exist. That could change with quantum computers, which one day will be able to break most existing algorithms and hash functions in minutes or hours.

To avoid the doomsday scenario where every system in the world is potentially exposed to compromise, work is already underway toward quantum-safe cryptography. However, given how little we know about quantum computing and the inability to perform real-world testing, it's safe to assume there will be considerable give and take before quantum-safe algorithms are widely available.

In the meantime, your cryptography, certificate management and key distribution systems must be agile enough to adapt to this very real emerging threat. The table below presents a scenario of the time and expense involved with swapping out existing cryptography for quantum-safe cryptography. In this scenario, with incomplete or partial automation, most enterprises would be looking at a 15-month vulnerability period, compared to just six days when a fully automated solution has been put in place.

A comparison of quantum doomsday mitigation scenarios

Crypto-agility is a complex topic at scale, and working towards it requires a multifaceted approach. Changes need to be made to security setups in organizational policy, operating methods, and core technology and processes. Your PKI may need to be upgraded and enhanced to support rapid swaps of cryptography, and software development procedures may need to be revamped so that cryptography is incorporated from the start rather than bolted on top of finished software.

The first step toward true crypto-agility is to understand the extent of your cryptographic exposure. This is accomplished by tracking down every digital certificate deployed across the organization and capturing details including algorithms and their size, the type of hashing/signature, validity period, where it's located and how it can be used.

Once you have a complete inventory, you'll then need to identify the vulnerable certificates by the type of cryptography in use and look for anomalies and potential problems. These can include certificates that use wildcards or IP addresses, certificates located on unauthorized or unintended systems, as well as certificates abandoned on deprecated systems.
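A minimal sketch of that triage step, assuming the inventory has already been collected into simple records (the field names and thresholds here are illustrative, not taken from the article):

```python
from datetime import date

# Hypothetical inventory records; a real scan would pull these fields
# from the certificates themselves via a PKI discovery tool.
inventory = [
    {"cn": "*.shop.example", "sig_hash": "sha1",   "key_bits": 1024, "expires": date(2020, 9, 1)},
    {"cn": "api.example",    "sig_hash": "sha256", "key_bits": 2048, "expires": date(2022, 1, 1)},
]

WEAK_HASHES = {"md5", "sha1"}   # deprecated signature hashes
MIN_RSA_BITS = 2048             # assumed minimum acceptable key size

def flag_vulnerable(certs, today):
    """Return certs using deprecated crypto, weak keys, wildcards, or past expiry."""
    flagged = []
    for c in certs:
        reasons = []
        if c["sig_hash"] in WEAK_HASHES:
            reasons.append("weak signature hash")
        if c["key_bits"] < MIN_RSA_BITS:
            reasons.append("undersized key")
        if c["cn"].startswith("*."):
            reasons.append("wildcard certificate")
        if c["expires"] < today:
            reasons.append("expired")
        if reasons:
            flagged.append((c["cn"], reasons))
    return flagged

for cn, reasons in flag_vulnerable(inventory, today=date(2020, 7, 13)):
    print(cn, "->", ", ".join(reasons))
```

Run against the sample inventory, only the SHA-1 wildcard certificate is flagged; the point is that the policy checks live in one place and can be tightened as standards evolve.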

Finding your certificates and vulnerabilities isn't enough by itself to deliver crypto-agility; you're still looking at the aforementioned 15-month-long process if you need to swap everything out manually.

Here are three pillars of crypto-agility that will put your organization on the right path toward withstanding whatever the future holds:

#1 Automate discovery and reporting. At the push of a button, you should be able to produce a full report of all your cryptographic assets. This will allow you to quickly identify vulnerable cryptography and to report anomalies. There are any number of tools available to help you do this, but ideally certificate reporting should just be incorporated into an automated PKI solution.

#2 Automate PKI operations at scale. The ideal solution here is a fully automated Certificate Management System (CMS) that will manage the entire lifecycle of a certificate from creation to renewal. When the CMS is used to create a certificate, it should have all the data it needs to not only monitor the certificate for expiration but automatically provision a replacement certificate without human intervention.
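The expiry-monitoring piece of such a CMS can be sketched as follows; the 30-day renewal window is an assumption for illustration, not a figure from the article:

```python
from datetime import date, timedelta

RENEWAL_WINDOW = timedelta(days=30)  # assumption: renew 30 days before expiry

def certs_to_renew(certs, today):
    """Select certificates whose expiry falls inside the renewal window,
    so the CMS can provision replacements without human intervention."""
    return [c["cn"] for c in certs if c["expires"] - today <= RENEWAL_WINDOW]

fleet = [
    {"cn": "api.example",  "expires": date(2020, 8, 1)},  # inside the window
    {"cn": "mail.example", "expires": date(2021, 8, 1)},  # not due yet
]
print(certs_to_renew(fleet, today=date(2020, 7, 13)))  # ['api.example']
```

In a real CMS this check runs on a schedule and feeds an issuance pipeline; the manual equivalent is the expiry-tracking spreadsheet that crypto-agility is meant to retire.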

#3 Be nimble. At an organization and management level, your IT organization, from DevOps through to day-to-day operations staff, needs to be ready for threats and change. You should carefully evaluate and rethink all aspects of your PKI to identify areas that may lock you into a particular vendor or technology.

The risk of having a slow-to-respond cryptographic infrastructure is increasing daily, not only as digital transformations increase our dependency on inter-connected systems but as external threats and technology evolve at an increasing pace. Looming above it all is the threat of quantum computing. Put it all together and it's clear that the time to automate your PKI and move toward crypto-agility is at hand.

Read more here:
The crypto-agility mandate, and how to get there - Help Net Security


How American Express is tapping the benefits of hybrid cloud – The Enterprisers Project

If there ever was a moment for IT organizations to accelerate cloud adoption, it's now. Consumers and businesses are relying on cloud more than ever with the recent massive shift to remote working and learning, not to mention the increasingly widespread expectation for "always on" services.

Evan Kotsovinos is no stranger to that reality. As head of global infrastructure for American Express (Amex), one of his responsibilities includes overseeing cloud strategy for the globally integrated payments company, which serves more than 100 million card members around the world. Kotsovinos also manages the firm's technology response to the COVID-19 pandemic. We caught up with him to discuss his perspective on the cloud.

In this interview, Kotsovinos discusses why cloud adoption is all about maximizing business outcomes. He shares a misnomer he still frequently hears from peers about the cloud (hint: it has to do with pricing), and he explains why infrastructure teams and leaders should consider themselves curators.

The Enterprisers Project (TEP): Cloud computing continues to grow, yet many CIOs are still working on their cloud adoption and migration plans. Does cloud adoption still have a long way to go? Is it inevitable for IT organizations?

Kotsovinos: Looking at the economies of scale that cloud providers can reap, and the engineering and innovation capabilities they have, in my mind the cloud is a complete inevitability. The question is not if, but how, and how fast. How will enterprises move to cloud and what cloud model (private, public, hybrid, and what specific flavor of these) will they embrace? And how fast will they move?

TEP: Can you provide a brief overview of how Amex is using the cloud right now?

Kotsovinos: Our approach to cloud employs a hybrid architecture that allows us to build applications once, inside a secure container, and have the flexibility to deploy those applications on private cloud, public cloud or both at the same time. We have been on our cloud journey for a few years now and have seen strong adoption of the platform as well as of cloud-native development principles and practices.

TEP: What makes you a believer in the benefits of cloud?

We believe the economies of scale achievable through our cloud strategy will drive significant advantage for us over time.

Kotsovinos: The cloud offers significant advantages compared with traditional, on-premises data centers. Three of those benefits stand out in my mind. First is the combination of productivity and speed that cloud offers. Today, thanks to cloud, you can tap extremely powerful, out-of-the-box capabilities in areas such as artificial intelligence, machine learning, or data analytics. Before cloud, that might have taken teams of engineers years to build.

Second is resiliency. At Amex, the ability to run the same application on multiple clouds gives us a high degree of resiliency, allowing us to deliver the always-on experience that our card members crave and deserve.

And lastly, economics. We believe the economies of scale achievable through our cloud strategy will drive significant advantage for us over time.

TEP: Can you describe your approach to application development given the prevalence of cloud today?

Kotsovinos: We are committed to cloud-native application development and to evolving with cloud-native as a standard. As cloud technology matures, we will continue to raise the bar on these cloud-native principles and the service we provide to our customers.

Cloud-native applications are architected to be highly available and continue to serve our customers in a multitude of scenarios. They are built to scale out as well as in, accommodating changes in volumes. They are built based on reusable components. They embrace a number of best practices in terms of logging, configuration management, port binding, and dependency mapping, which support portability.
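The configuration and port-binding practices mentioned here echo twelve-factor conventions; a minimal sketch (the variable names are illustrative, not Amex's actual configuration) reads deployment-specific settings from the environment so the same artifact runs unmodified on private or public cloud:

```python
import os

# Sketch of the configuration/port-binding practice: the application
# reads deployment-specific settings from its environment rather than
# from baked-in files, so one container image is portable across clouds.
# PORT and LOG_LEVEL are illustrative variable names.

def load_config(env=os.environ):
    return {
        "port": int(env.get("PORT", "8080")),       # bind port from environment
        "log_level": env.get("LOG_LEVEL", "INFO"),  # sensible default if unset
    }

print(load_config({"PORT": "9000"}))  # {'port': 9000, 'log_level': 'INFO'}
```

The orchestrator (private or public cloud) supplies the environment at deploy time; the application code never changes between targets, which is what makes the build-once, deploy-anywhere model described above possible.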

TEP: Amex is quite advanced in its cloud journey; what are some considerations you are passionate about and believe other CIOs/IT execs should consider as they embark on their cloud journey?

Kotsovinos: As with any new venture, you need to be able to effectively measure results. But it's also critical to understand the bigger picture of what you're solving for. To measure our success in cloud, we focus on overall business outcomes rather than the isolated costs of cloud versus on-premises infrastructure.

In addition, to reap the full benefit of cloud computing, you have to move to the cloud in the right way. Lifting and shifting legacy applications to the cloud provides limited benefit. Moving applications that are cloud-native, fully or partially, has a greater return.

Again, it's all about maximizing business outcomes, not moving to cloud for cloud's sake.

Next, consider how you are going to address your non-cloud-native applications. Distinguish between your technical debt, those applications in your portfolio that will be out of date and need to be refreshed, and those applications that are current but just are not built for the cloud. You'll likely have more success by reviewing your application refresh cycles and prioritizing your portfolio to move gradually to the cloud. Again, it's all about maximizing business outcomes, not moving to cloud for cloud's sake.

Finally, account for bubble costs. As you are migrating applications to the cloud, you are likely running parts of your applications on both cloud and traditional infrastructure, which will temporarily inflate your cost base.

TEP: Is there a pet peeve or misnomer you hear from CIOs or even your peers in infrastructure that you wish you could debunk or clarify?

Kotsovinos: A misnomer I still hear is that the cloud is cheaper, or the cloud is more expensive. Both of these statements oversimplify the problem. The cloud is neither cheaper, nor more expensive. The cloud is a different way of delivering a service and what matters is the total cost of the technology stack you are delivering, not the cloud versus on-premise calculation.

TEP: What are some key talent challenges that organizations should consider as they shift to the cloud?

Kotsovinos: I think a crucial aspect of cloud adoption is investing not only in training and upskilling, but also in evangelizing and helping engineering teams adopt the right mindset and tools. Without that, you may not get very far.


Do not assume that if you deliver it, they will come. In 2020, the role of an infrastructure organization is not just to build and deliver those capabilities, but to make sure they are understood and adopted in the right way. The infrastructure team is responsible for being a trusted partner, a consultant, and an advisor to the software engineering teams. Infrastructure teams and leaders are curators of developer experience above all else.

TEP: The world is changing before our eyes, including technological advancements. What are some critical long-term technology infrastructure considerations we might not yet be thinking about?

Kotsovinos: I would highlight two long-term developments that may fundamentally transform everything we know about computer architecture, and therefore technology infrastructure, over the next several years. First, quantum computing, which promises vast speed improvements for specific classes of problems but is nowhere near general-purpose computing or the level of hardware stability and usability that would make it mainstream. Second, the rise of very large non-volatile memory, which over time could collapse the memory hierarchy (cache, RAM, storage) into one large persistent memory array.


More:
How American Express is tapping the benefits of hybrid cloud - The Enterprisers Project


MIT’s New Diamond-Based Quantum Chip Is the Largest Yet – Interesting Engineering

Researchers at MIT have developed a process to manufacture and integrate "artificial atoms" with photonic circuitry, enabling them to produce the largest quantum chip of its kind.

The atoms, which are created by atomic-scale defects in microscopically thin slices of diamond, allow for the scaling up of quantum chip production.


"The new development marks a turning point in the field of scalable quantum processors," Dirk Englund, an associate professor in MIT's Department of Electrical Engineering and Computer Science, explained in a press release.

Millions of quantum processors will be required for the oncoming, much-hyped advent of quantum computing. This new research shows there is a viable way to scale up processor production, the MIT team says.

The qubits in the newly-developed chip are artificial atoms made from defects in diamond. These can be prodded with visible light and microwaves, making them emit photons that carry quantum information.

This hybrid approach is described by Englund and his colleagues in a study published in Nature. The paper details how the team carefully selected "quantum micro chiplets" that contained multiple diamond-based qubits and integrated them onto an aluminum nitride photonic integrated circuit.

"In the past 20 years of quantum engineering, it has been the ultimate vision to manufacture such artificial qubit systems at volumes comparable to integrated electronics," Englund explained. "Although there has been remarkable progress in this very active area of research, fabrication and materials complications have thus far yielded just two to three emitters per photonic system."

Using their hybrid method, Englund and his team successfully built a 128-qubit system. In doing so, they made history by constructing the largest integrated artificial atom-photonics chip yet.

"It's quite exciting in terms of the technology," Marko Lončar, Tiantsai Lin Professor of Electrical Engineering at Harvard University, who was not involved in the study, told MIT News. "They were able to get stable emitters in a photonic platform while maintaining very nice quantum memories."

The next step for the researchers is to find a way to automate their process. In doing so, they will enable the production of even bigger chips, which will be necessary for modular quantum computers and multichannel quantum repeaters that transport qubits over long distances, the researchers say.


Chicago Quantum Exchange Welcomes Seven New Partners in Tech, Computing and Finance – HPCwire

CHICAGO, July 8, 2020 The Chicago Quantum Exchange, a growing intellectual hub for the research and development of quantum technology, has added to its community seven new corporate partners in computing, technology and finance that are working to bring about and primed to take advantage of the coming quantum revolution.

These new industry partners are Intel, JPMorgan Chase, Microsoft, Quantum Design, Qubitekk, Rigetti Computing, and Zurich Instruments.

Based at the University of Chicago's Pritzker School of Molecular Engineering, the Chicago Quantum Exchange and its corporate partners advance the science and engineering necessary to build and scale quantum technologies and develop practical applications. The results of their work (precision data from quantum sensors, advanced quantum computers and their algorithms, and securely transmitted information) will transform today's leading industries. The addition of these partners brings to 13 the total number of companies in the Chicago Quantum Exchange working with scientists and engineers at universities and the national laboratories in the region.

"These new corporate partners join a robust collaboration of private and public universities, national laboratories, companies, and non-profit organizations. Together, their efforts with federal and state support will enhance the nation's leading center for quantum information and engineering here in Chicago," said University of Chicago Provost Ka Yee C. Lee.

The Chicago Quantum Exchange is anchored by the University of Chicago, the U.S. Department of Energy's Argonne National Laboratory and Fermi National Accelerator Laboratory (both operated for DOE by UChicago), and the University of Illinois at Urbana-Champaign, and includes the University of Wisconsin-Madison and Northwestern University.

"Developing a new technology at nature's smallest scales requires strong partnerships with complementary expertise and significant resources. The Chicago Quantum Exchange enables us to engage leading experts, facilities and industries from around the world to advance quantum science and engineering," said David Awschalom, the Liew Family Professor in Molecular Engineering at the University of Chicago, senior scientist at Argonne, and director of the Chicago Quantum Exchange. "Our collaborations with these companies will be crucial to speed discovery, develop quantum applications and prepare a skilled quantum workforce."

Many of the new industry partners already have ongoing or recent engagements with CQE and its member institutions. In recent collaborative research, spectrally entangled photons from a Qubitekk entangled-photon source were transported and successfully detected after traveling through one section of the Argonne quantum loop.

On another project, UChicago computer scientist Fred Chong and his students worked with both Intel and Rigetti Computing on software and hardware solutions. With Intel's support, Chong's team invented a range of software techniques to more efficiently execute quantum programs on a coming crop of quantum hardware. For example, they developed methods that take advantage of the hierarchical structure of important quantum circuits that are critical to the future of reliable quantum computation.

Chicago Quantum Exchange member institutions engage with corporate partners in a variety of collaborative research efforts, joint workshops to develop new research directions, and opportunities to train future quantum engineers. The CQE has existing partnerships with Boeing; IBM; Applied Materials, Inc.; Cold Quanta; HRL Laboratories, LLC; and Quantum Opus, LLC.

The CQE's newest corporate partnerships will help further research possibilities in areas from quantum communication hardware, to quantum computing systems and controls, to finance and cryptography applications.

Jim Clarke, director of quantum hardware at Intel, looks forward to further collaborations with Chicago Quantum Exchange members.

"Intel remains committed to solving intractable challenges that lie on the path of achieving quantum practicality," said Clarke. "We're focusing our research on new qubit technologies and addressing key bottlenecks in their control and connectivity as quantum systems get larger. Our collaborations with members of the Chicago Quantum Exchange will help us harness our collective areas of expertise to contribute to meaningful advances in these areas."

The Chicago Quantum Exchange's partnership with JPMorgan Chase will enable the use of quantum computing algorithms and software for secure transactions and high-speed trading.

"We are excited about the transformative impact that quantum computing can have on our industry," said Marco Pistoia, managing director, head of applied research and engineering at JPMorgan Chase. "Collaborating with the Chicago Quantum Exchange will help us to be among the first to develop cutting-edge quantum algorithms for financial use cases, and experiment with the power of quantum computers on relevant problems, such as portfolio optimization and option pricing."

Applying quantum science and technology discoveries to areas such as finance, computing and healthcare requires a robust workforce of scientists and engineers. The Chicago Quantum Exchange integrates universities, national laboratories and leading companies to train the next generation of scientists and engineers and to equip those already in the workforce to transition to quantum careers.

"Microsoft is excited to partner with the Chicago Quantum Exchange to accelerate the advancement of quantum computing," said Chetan Nayak, general manager of Microsoft Quantum Hardware. "It is through these academic and industry partnerships that we'll be able to scale innovation and develop a workforce ready to harness the incredible impact of this technology."

Source: Chicago Quantum Exchange


Satoshi Nakamoto Inspiration Gives Advice On Bitcoin's Next Move – Forbes

Listening to educator, inventor and scientist Scott Stornetta on May 30th during a presentation curated by the Government Blockchain Association of UAE provided insight into one of the least understood problems in blockchains. Stornetta was part of the team that created what can be called a proto-blockchain. With three of the nine references in Satoshi Nakamoto's bitcoin paper pointing to their work, Stornetta and co-inventor Stuart Haber's ideas have had an outsize influence on the design of bitcoin and of subsequent blockchains.

Stornetta answered a question about what message he would have if he were ever to meet Satoshi. Stornetta said that he would ask him to fully read the second paper. Here Stornetta is referring to a way to upgrade bitcoin, or any time-stamping mechanism, if the signature algorithm is in danger of being compromised.

That paper deals with two topics, one of which is familiar from bitcoin and other blockchains: the use of Merkle trees as a way of aggregating commitments and referring to just the root of the tree, which, if timestamped and witnessed properly, ensures the immutability of all the leaf documents or transactions. This is a way to refer to many transactions with a single number, and it is the basis of the concept of a block in a blockchain.
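The pair-and-hash reduction described above can be sketched in a few lines of Python. This is an illustrative simplification, not the exact construction from the Haber–Stornetta paper or bitcoin's consensus code (bitcoin, for instance, uses double SHA-256); it borrows bitcoin's convention of duplicating the last node when a level has an odd number of entries.

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Hash every leaf, then repeatedly pair-and-hash until one root remains."""
    if not leaves:
        raise ValueError("cannot build a Merkle tree from zero leaves")
    level = [sha256(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2 == 1:
            level.append(level[-1])  # duplicate the last node on odd-sized levels
        level = [sha256(left + right)
                 for left, right in zip(level[::2], level[1::2])]
    return level[0]
```

Timestamping only the 32-byte root commits to every leaf: altering any single transaction changes the root, which is what lets one number stand in for an arbitrarily large batch.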

The paper's second topic is how to renew the timestamps of documents if the cryptography behind timestamping with signatures is in danger of being broken. The simple prescription is to issue a new timestamp that refers to both the document's hash and the old signature.
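That renewal step can be sketched as follows. This is a hypothetical illustration, not the paper's actual format: `new_sign` stands in for whatever signing function the new (e.g. quantum-resistant) scheme provides, and the record layout is an assumption made for the example.

```python
import hashlib
import json

def renew_timestamp(doc_hash: str, old_record: dict, new_sign) -> dict:
    """Issue a fresh timestamp committing to both the document's hash and the
    old signed record, carrying the old attestation forward under the new
    signature scheme before the old one can be broken."""
    payload = {
        "doc_hash": doc_hash,
        "old_signature": old_record["signature"],
        "generation": old_record.get("generation", 0) + 1,
    }
    message = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = new_sign(message)  # signed under the new scheme
    return payload
```

Because the new record embeds the old signature, a verifier who trusts that the renewal happened before the old scheme was broken can chain back to the original attestation.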

Stornetta and Haber's concern was preserving immutable, non-repudiable references to digital documents along with the time they were entered into a registry. Unlike most blockchains, that is not about value exchange and control of assets. They also observe that a timestamp, if fixed in the chain at a time known to be before the break, can be assumed to be correct.

Many of the cryptographic structures behind blockchain networks are relatively safe from quantum computing because they are based on hash functions, which are believed to be quantum resistant. For bitcoin and other blockchains, this means that Merkle trees and the structure of the chain itself are quite safe, as most of this is based on hashes.

The vulnerability of digital signatures to Shor's algorithm under quantum cryptanalysis is a sword of Damocles hanging over any blockchain that is meant to last forever. Although it is improbable that quantum computing will be able to break the signature algorithm in the short term, there is a possibility that it will in the long run. For decentralized value exchanges whose longevity should be measured in hundreds of years, this consideration is crucial.

The more quantum-vulnerable parts are the signatures: since value transfer is based on signatures, unspent transactions are vulnerable. If quantum computing progresses to a point where the signature algorithm is in imminent jeopardy, it needs to be updated to a quantum-resistant one, and all owners of unspent transactions need to transfer their value to the new scheme before the old one is broken. This requires action from every owner of value to safeguard their assets. The same applies to other forms of asset-ownership assertion that use asymmetric (public-key) cryptography. Some of these considerations can be seen in the plans for Ethereum's Serenity upgrades.

Once quantum computing for factorization becomes a reality, all stranded assets (those whose private keys were lost through negligence or their owners' deaths) will start moving again, into the control of people with enough quantum resources.

Facilities for upgrading systems should be part of the initial architecture of any long-lived system. Many throw-away systems long outlive their initial horizon, so every system should be built as if it will live a long time.
