
Cloud Computing: Current Top Trends and Technologies – Datamation

Register for this live video webinar - Tuesday, January 21, 11 AM PT. Ask a top cloud expert - get your questions answered by an industry leader.

Cloud computing has grown from emerging disrupter to the very foundation of today's enterprise IT, and yet the pace of change in the cloud sector shows no signs of slowing.

Hybrid cloud has given way to multicloud -- or is that just hype? The concept of "cloud native" is now au courant, offering its own myriad challenges. Emerging technologies from microservices to Kubernetes to edge computing are prompting big shifts.

These constant new developments raise the question: what do I need to know to be truly current with cloud in 2020?

To provide insight, I'll speak with a leading cloud expert, Bernard Golden. Golden has held a number of top tech positions; most recently he was Vice President, Cloud Strategy, at Capital One. Wired magazine dubbed him "one of the ten most influential people in Cloud Computing." He's the author of Amazon Web Services for Dummies, a bestselling cloud computing book.

Register for this live video webinar - Tuesday, January 21, 11 AM PT

In this webinar you will learn:

Bernard Golden, top cloud computing expert

James Maguire, Managing Editor, Datamation (moderator)

Register for this live video webinar - Tuesday, January 21, 11 AM PT

Get your cloud questions answered by a leading expert.

Go here to see the original:
Cloud Computing: Current Top Trends and Technologies - Datamation

Read More..

Join us live online today: Find out how to store and manage data in the hybrid-cloud era to boost your business – The Register

Webcast You know the story: your users are creating data faster than ever before.

But ask yourself: is your information being stored effectively? Can your users get hold of the data they need, and use it when and where they want?

If your answer is "no," or even just "not always," today's webcast, brought to you by NetApp, is for you.

Processing terabytes upon terabytes of business data to extract valuable insights and trends can and should take place anywhere: in your data center, computer room, on your desk, on the road, and in the cloud.

Managing your data across many of these different types of systems, scattered over multiple locations and jurisdictions, can be difficult, though. To keep costs and security under control, you need to have a robust, reliable data platform to handle the ingestion and creation of information, cope with the active use of this data, and manage its long-term storage through to its end of life. And this all has to work without constant skilled supervision.

So, what can you do? Join us today at 3pm GMT to find out as Tony Lock of Freeform Dynamics and Adrian Cooper of NetApp discuss what options are available to build a robust unified data platform that operates across hybrid cloud environments.

If you need to improve how you manage your data, or your users could do with some help to get the most out of all the data you hold, please join this webcast, and get involved.

Click right here to sign up now.


Follow this link:
Join us live online today: Find out how to store and manage data in the hybrid-cloud era to boost your business - The Register

Read More..

VIEWPOINT: Refocusing as a Digital-First Publication – Georgetown University The Hoya

At our first onboarding event for The Hoya, we remember how a well-known news media professional who then consulted for The Hoya informed us that the only thing we needed to know about a journalist's job is that it is always changing. This was especially evident in 2016.

The Hoya faced headwinds heading into its 98th year. When we started our roles as editor-in-chief and general manager, the newspaper was on a precipice. Print readership and thus advertising revenue had declined year after year. The continued focus on print, however, had disconnected the paper from the majority of our readers, who accessed our content online. The Hoya was also in serious need of investment, though a shrinking topline forced the organization to rely on external funding sources, making budgeting for investment virtually impossible. Against this backdrop, our leadership team engaged in a monthslong process to identify what change was necessary if the paper was going to survive the coming decades.

As we kicked off an assessment to understand The Hoya's future, it quickly became clear that we needed to become a digital-first publication. A pivot to an online daily format with a weekly print edition would allow us to leverage our deep content expertise, diverse and inclusive culture, and entrepreneurial spirit to take on a quickly evolving media landscape.

The leadership team put countless hours into a framework that would eventually form the foundation of this transition. We solicited input from experts, our membership and our readers, knowing much was at stake: no less than the future of an organization that had been integral to our Georgetown University experience. Our team set ambitious goals, but they were exactly what was needed to take The Hoya into the digital age.

In the spring of 2017, having consulted with our most important organizational stakeholders and after 30 years of printing twice weekly, our leadership and the wider membership of The Hoya voted overwhelmingly in favor of transitioning to an online daily format with a weekly print edition. It was time to get to work.

Our North Star was always our readers. Through the ups and downs of the organizational overhaul, we evaluated our work by how it would improve our content and further our relationship with our readers.

Daily online publication necessitated a drastic increase in the volume of our content, and as we considered ways to increase our production capacity, we had to redefine our relationship with students, faculty and members of the community and understand where there were gaps in our coverage. We also understood that online articles have always been met with greater scrutiny simply by nature of the medium, according to a study by Kantar. For our writers and editors, this skepticism meant increased diligence to ensure the factual accuracy of every article. It also meant a renewed emphasis on community outreach in pursuit of balanced representation.

Within the first week of our term, we overhauled our entire online engagement strategy. Our readers had been finding us through Facebook, Twitter and Instagram. Correspondingly, we expanded our multimedia and social media teams and created content that would likely be consumed in a news feed on a mobile device.

To succeed as a digital-first publication, we also needed to become a 21st-century newsroom. Gone were the days of exchanging article edits physically through USBs and undependable hard drives. Cutting one of our print issues allowed us to reallocate resources to building cloud storage systems, introducing online project management and communication platforms, and replacing a technology infrastructure first put together in the late 1990s.

Perhaps the most important transformation of all, however, was cultural. Becoming a daily operation required our staffers to spend more hours in the office, often late into the night. This time was not spent on coursework, part-time jobs or social lives, but on publishing stories and giving a voice to people central to the Georgetown community.

When someone is first hired by The Hoya, they are told that the quality of our product is only as strong as the quality of our staffers; our membership is our strongest asset. Therefore, it was necessary to invest heavily in our staffers and their well-being. We expanded staff professional development opportunities, formalized philanthropy efforts, built a strong network of Hoya alumni and piloted a health and wellness initiative.

Those staffers who have worked with us know our bold predictions are few and far between. Still, as we ruminate on our past experience with The Hoya, we feel confident in one thing: Though The Hoya will undoubtedly continue to evolve in its next 100 years, there is no better training ground to prepare today's student journalists for tomorrow's newsroom.

Toby Hung (COL '18) is a former editor-in-chief of The Hoya. Daniel Almeida (MSB '18) is a former general manager of The Hoya.

Visit link:
VIEWPOINT: Refocusing as a Digital-First Publication - Georgetown University The Hoya

Read More..

Cyber Security Today Millions of files on Americans found open on Internet, and how to avoid juice-jacking – IT World Canada

Millions of files on Americans found open on Internet, and how to avoid juice-jacking

Welcome to Cyber Security Today. It's Monday, January 13th. I'm Howard Solomon, contributing reporter on cyber security for ITWorldCanada.com. To hear the podcast click on the arrow below:

Employees at companies continue to be sloppy at protecting personal data. Here's another example: Someone at Front Rush, a U.S. firm which provides management software for college athletics programs, left a server open to the Internet. That server had more than 700,000 files including athletes' medical records, performance reports, drivers' licences and other personal information. Often this is a configuration problem where the person creating a database or file forgets to check a setting, or an IT staffer doing maintenance or an upgrade does something wrong. Regardless, managers around the world aren't doing enough to make sure this doesn't happen in their organizations. This incident was originally reported by Vice.com.

Here's a similar incident: According to The Register, a researcher found an open database with details on 56 million American residents including home addresses and phone numbers. The database appears to belong to a web site called CheckPeople.com, where, for a fee, you can look up people's names and find addresses. Most of the information seems to be available from public sources. Still, why it was unprotected isn't known. The server is in China. We don't know if this was a database stolen from CheckPeople, or whether an employee put it there and misconfigured it. As of the recording of this podcast CheckPeople hadn't responded to questions.

Misconfigured cloud storage is a big problem for companies. If your firm uses Amazon AWS for storage, there are tools like AWS Security Hub and the new Identity and Access Analyzer that help track down mistakes. If you use Microsoft Azure, there's Azure Security Center. If your firm uses other cloud storage providers, find out what security tools, if any, they offer.
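For AWS specifically, a quick first pass might look something like the sketch below: a minimal boto3 illustration, assuming credentials are already configured in your environment. It flags buckets whose ACLs grant public access or that lack a full public-access block; it is not a substitute for the managed tools mentioned above.

```python
# Minimal sketch: flag S3 buckets that may allow public access.
# Assumes boto3 is installed and AWS credentials are configured.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        block = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        fully_blocked = all(block.values())
    except ClientError:
        # No public-access-block configuration at all deserves a closer look.
        fully_blocked = False

    acl = s3.get_bucket_acl(Bucket=name)
    public_grant = any(
        grant.get("Grantee", {}).get("URI", "").endswith(("AllUsers", "AuthenticatedUsers"))
        for grant in acl["Grants"]
    )

    if public_grant or not fully_blocked:
        print(f"Review bucket: {name}")
```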

Let's talk about juice-jacking. No, it's not a way to steal fruit drinks. Juice-jacking is slang for delivering malware through infected public USB charging stations in airports, hotels and conferences. These stations are offered as a convenience for you to charge mobile devices. But if they've been compromised your smartphone, laptop or tablet will be too. That's right, the power plug and charging cable can deliver malware. That's because they're used for both delivering power and transferring data. Security researchers have demonstrated how it can be done. But how big a problem is it? We're not sure, writer Mike Elgan says on IBM's Security Intelligence blog. But it's better to be safe by not using public charging stations. Nor should you charge your device through someone else's computer. Instead, carry your own charging adapter and cable. If you buy a duplicate, make sure it's a packaged brand name and not from an open box of adapters and cables in a store beside the cash register. Worried about running out of power? Buy and carry a rechargeable USB mobile battery.

Finally, tomorrow is Microsoft's monthly Patch Tuesday, when it will release security updates for Windows and other company software.

That's it for Cyber Security Today. Links to details about these stories can be found in the text version of each podcast at ITWorldCanada.com. That's where you'll also find my news stories aimed at businesses and cyber security professionals. Cyber Security Today can be heard on Mondays, Wednesdays and Fridays. Subscribe on Apple Podcasts, Google Podcasts or add us to your Flash Briefing on your smart speaker. Thanks for listening. I'm Howard Solomon.

View original post here:
Cyber Security Today Millions of files on Americans found open on Internet, and how to avoid juice-jacking - IT World Canada

Read More..

Veeam CTO On Why Veeam Wanted To Be Acquired And What Comes Next – CRN: Technology news for channel partners and solution providers

Looking For Growth In All The Right Places

Veeam, since its founding in 2006, has grown to be one of the top providers of data protection technology, and has more recently invested heavily in the nascent data management market with an eye on the cloud, particularly in the Office 365, Amazon Web Services, and Microsoft Azure environments. The company is also committed to building stronger relationships with the top cloud providers.

However, Veeam has faced one key barrier to growth: While it is the leading provider in Europe, and is arguably one of the largest software vendors of any kind in the world, the Baar, Switzerland-based company has only a fourth-place rank in the all-important U.S. market.

Danny Allan, Veeam's new chief technology officer, told CRN that Veeam's pending acquisition by New York-based private equity company Insight Partners is the key to unlocking the U.S. market and future growth for his company. Insight Partners has expertise not only in helping strong private companies grow, it also has experience in the all-important software-as-a-service technology needed to build a strong cloud-focused business.

For deep insight into why Veeam, a billion-dollar-plus and profitable company, decided to go the private equity route to build a U.S. business, turn the page.

Original post:
Veeam CTO On Why Veeam Wanted To Be Acquired And What Comes Next - CRN: Technology news for channel partners and solution providers

Read More..

New Technique May Be Capable of Creating Qubits From Silicon Carbide Wafer – Tom’s Hardware

Researchers from the University of Chicago have discovered a technique that might be able to produce qubits from defects in commercially available silicon carbide wafers. This could represent a scalable way of creating qubits using off-the-shelf tools.

As published in Science and reported on by IEEE Spectrum, the researchers bought a silicon carbide wafer and shot an electron beam at it. This created defects that behaved as a single electron spin that could be manipulated electrically, optically (with lasers) and magnetically. Basically, the defects act as room-temperature cages for electrons. The electron's spin, an inherent property of electrons, could then be used as a qubit. Individual electron spins can hold their information for up to 1 millisecond.

"Our approach is to see if we can leverage the trillion dollars or so of American industry thats building todays nanoelectronics and see if we can pivot that technology," David Awschalom, one of the researchers and a professor of molecular engineering at the University of Chicago, said, according to IEEE.

However, the researchers' work is still in early stages. They do not have a working quantum computer yet or even a provable qubit.

The work seems similar to one of Intel's two approaches to building quantum computers, which makes use of spin qubits manufactured on Intel's standard 300mm CMOS wafer lines. The difference seems to be that Intel uses silicon wafers instead of silicon carbide. Intel announced testing the chip in 2018. In December, Intel also created a control chip for its quantum chips.

Continue reading here:
New Technique May Be Capable of Creating Qubits From Silicon Carbide Wafer - Tom's Hardware

Read More..

How to verify that quantum chips are computing correctly – MIT News

In a step toward practical quantum computing, researchers from MIT, Google, and elsewhere have designed a system that can verify when quantum chips have accurately performed complex computations that classical computers can't.

Quantum chips perform computations using quantum bits, called qubits, that can represent the two states corresponding to classic binary bits (a 0 or 1) or a quantum superposition of both states simultaneously. The unique superposition state can enable quantum computers to solve problems that are practically impossible for classical computers, potentially spurring breakthroughs in material design, drug discovery, and machine learning, among other applications.
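For readers who want the notation, which the article does not spell out, the state of a single qubit is conventionally written as a superposition of the two basis states:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

Measuring the qubit then yields 0 with probability |α|² and 1 with probability |β|².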

Full-scale quantum computers will require millions of qubits, which isn't yet feasible. In the past few years, researchers have started developing Noisy Intermediate Scale Quantum (NISQ) chips, which contain around 50 to 100 qubits. That's just enough to demonstrate quantum advantage, meaning the NISQ chip can run certain algorithms that are intractable for classical computers. Verifying that the chips performed operations as expected, however, can be very inefficient. The chip's outputs can look entirely random, so it takes a long time to simulate steps to determine if everything went according to plan.

In a paper published today in Nature Physics, the researchers describe a novel protocol to efficiently verify that an NISQ chip has performed all the right quantum operations. They validated their protocol on a notoriously difficult quantum problem running on a custom quantum photonic chip.

"As rapid advances in industry and academia bring us to the cusp of quantum machines that can outperform classical machines, the task of quantum verification becomes time critical," says first author Jacques Carolan, a postdoc in the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE). "Our technique provides an important tool for verifying a broad class of quantum systems. Because if I invest billions of dollars to build a quantum chip, it sure better do something interesting."

Joining Carolan on the paper are researchers from EECS and RLE at MIT, as well as from the Google Quantum AI Laboratory, Elenion Technologies, Lightmatter, and Zapata Computing.

Divide and conquer

The researchers' work essentially traces an output quantum state generated by the quantum circuit back to a known input state. Doing so reveals which circuit operations were performed on the input to produce the output. Those operations should always match what researchers programmed. If not, the researchers can use the information to pinpoint where things went wrong on the chip.

At the core of the new protocol, called Variational Quantum Unsampling, lies a divide-and-conquer approach, Carolan says, that breaks the output quantum state into chunks. "Instead of doing the whole thing in one shot, which takes a very long time, we do this unscrambling layer by layer. This allows us to break the problem up to tackle it in a more efficient way," Carolan says.

For this, the researchers took inspiration from neural networks, which solve problems through many layers of computation, to build a novel quantum neural network (QNN), where each layer represents a set of quantum operations.

To run the QNN, they used traditional silicon fabrication techniques to build a 2-by-5-millimeter NISQ chip with more than 170 control parameters, tunable circuit components that make manipulating the photon path easier. Pairs of photons are generated at specific wavelengths from an external component and injected into the chip. The photons travel through the chip's phase shifters, which change the paths of the photons so they interfere with each other. This produces a random quantum output state, which represents what would happen during computation. The output is measured by an array of external photodetector sensors.

That output is sent to the QNN. The first layer uses complex optimization techniques to dig through the noisy output to pinpoint the signature of a single photon among all those scrambled together. Then, it unscrambles that single photon from the group to identify what circuit operations return it to its known input state. Those operations should match exactly the circuit's specific design for the task. All subsequent layers do the same computation, removing from the equation any previously unscrambled photons, until all photons are unscrambled.

As an example, say the input state of qubits fed into the processor was all zeroes. The NISQ chip executes a bunch of operations on the qubits to generate a massive, seemingly randomly changing number as output. (An output number will constantly be changing as it's in a quantum superposition.) The QNN selects chunks of that massive number. Then, layer by layer, it determines which operations revert each qubit back down to its input state of zero. If any operations are different from the original planned operations, then something has gone awry. Researchers can inspect any mismatches between the expected and recovered operations, and use that information to tweak the circuit design.
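As a rough intuition for the layer-by-layer idea, here is a toy numpy sketch. It is not the Variational Quantum Unsampling protocol itself: the real protocol must learn each inverse layer variationally from measurements, whereas this sketch simply inverts layers it already knows, to show how peeling them off in reverse order traces the output back to the all-zeros input.

```python
# Toy sketch of layer-by-layer unscrambling with *known* layers (numpy only).
import numpy as np

rng = np.random.default_rng(0)
n_qubits, n_layers = 3, 4
dim = 2 ** n_qubits

def random_unitary(d, rng):
    # QR decomposition of a random complex matrix yields a random unitary.
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

layers = [random_unitary(dim, rng) for _ in range(n_layers)]

# Forward pass: the chip scrambles the known all-zeros input state.
state = np.zeros(dim, dtype=complex)
state[0] = 1.0
for u in layers:
    state = u @ state

# Unscrambling: peel the layers off one at a time, newest first.
for i, u in enumerate(reversed(layers), start=1):
    state = u.conj().T @ state
    fidelity = abs(state[0]) ** 2
    print(f"after removing {i} layer(s): fidelity with |000> = {fidelity:.3f}")
# The final fidelity is 1.0: the recovered operations match the circuit design.
```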

Boson unsampling

In experiments, the team successfully ran a popular computational task used to demonstrate quantum advantage, called boson sampling, which is usually performed on photonic chips. In this exercise, phase shifters and other optical components will manipulate and convert a set of input photons into a different quantum superposition of output photons. Ultimately, the task is to calculate the probability that a certain input state will match a certain output state. That will essentially be a sample from some probability distribution.
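As context not spelled out in the article, the standard boson sampling results (Aaronson and Arkhipov) say each output probability is governed by a matrix permanent: for an interferometer described by a unitary U, the probability of observing output configuration T given input S is proportional to

```latex
P(T \mid S) \;\propto\; \left| \operatorname{Per}\!\left( U_{S,T} \right) \right|^{2}
```

where U_{S,T} is the submatrix of U picked out by the occupied input and output modes. Computing permanents is #P-hard, which is why sampling from this distribution is believed to be intractable for classical computers.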

But it's nearly impossible for classical computers to compute those samples, due to the unpredictable behavior of photons. It's been theorized that NISQ chips can compute them fairly quickly. Until now, however, there's been no way to verify that quickly and easily, because of the complexity involved with the NISQ operations and the task itself.

"The very same properties which give these chips quantum computational power make them nearly impossible to verify," Carolan says.

In experiments, the researchers were able to unsample two photons that had run through the boson sampling problem on their custom NISQ chip, and in a fraction of the time it would take traditional verification approaches.

"This is an excellent paper that employs a nonlinear quantum neural network to learn the unknown unitary operation performed by a black box," says Stefano Pirandola, a professor of computer science who specializes in quantum technologies at the University of York. "It is clear that this scheme could be very useful to verify the actual gates that are performed by a quantum circuit [for example] by a NISQ processor. From this point of view, the scheme serves as an important benchmarking tool for future quantum engineers. The idea was remarkably implemented on a photonic quantum chip."

While the method was designed for quantum verification purposes, it could also help capture useful physical properties, Carolan says. For instance, certain molecules when excited will vibrate, then emit photons based on these vibrations. By injecting these photons into a photonic chip, Carolan says, the unscrambling technique could be used to discover information about the quantum dynamics of those molecules to aid in bioengineering molecular design. It could also be used to unscramble photons carrying quantum information that have accumulated noise by passing through turbulent spaces or materials.

"The dream is to apply this to interesting problems in the physical world," Carolan says.

Read the rest here:
How to verify that quantum chips are computing correctly - MIT News

Read More..

The hunt for the ‘angel particle’ continues – Big Think

A theoretical class of particles called Majorana fermions remains a mystery. In 2017, scientists believed they had uncovered evidence for the existence of Majorana fermions. Unfortunately, recent research shows that their findings were actually due to a faulty experimental device, bringing researchers back to the drawing board in the search for the exotic particles.

The Standard Model of particle physics currently is our best means of explaining the fundamental forces of the universe. It classifies the various elementary particles, like photons, the Higgs boson, and the various quarks and leptons. Broadly, its particles are divided into two classes: Bosons, like the photon and Higgs, and fermions, which comprise the quarks and leptons.

There are a few major differences between these types of particles. One, for instance, is that fermions have antiparticles, while bosons do not. There can be an anti-electron (i.e., a positron), but there's no such thing as an antiphoton. Fermions also can't occupy the same quantum state; for instance, electrons orbiting an atom's nucleus can't both occupy the same orbital level and spin in the same direction, but two electrons can hang out in the same orbital and spin in opposite directions because this represents a different quantum state. Bosons, on the other hand, don't have this problem.

But back in 1937, a physicist named Ettore Majorana discovered that a different, unusual kind of fermion could exist: the so-called Majorana fermion.

All the fermions in the Standard Model are referred to as Dirac fermions. Where they and Majorana fermions differ is that the Majorana fermion would be its own antiparticle. Because of this quirk, the Majorana fermion has been nicknamed the "angel particle" after the Dan Brown novel "Angels and Demons," whose plot involved a matter/anti-matter bomb.

Until 2017, however, there remained no definitive experimental evidence for Majorana fermions. But during that year, physicists constructed a complicated experimental device involving a superconductor, a topological insulator (which conducts electricity along its edges but not through its center), and a magnet. The researchers observed that in addition to electrons flowing along the edge of the topological insulator, this device also showed signs of producing Majorana quasiparticles.

Quasiparticles are an important tool that physicists use when searching for evidence of "real" particles. They aren't the real thing themselves, but they can be thought of as disturbances in a medium that represent a real particle. You can think of them like bubbles in a Coca-Cola: a bubble itself isn't an independent object, but rather a phenomenon that emerges from the interaction between the carbon dioxide and the Coca-Cola. If we were to say there was some hypothetical "bubble particle" that really existed, we could measure the "quasi"-bubbles in a Coca-Cola to learn more about its characteristics and provide evidence for this imaginary particle's existence.

By observing quasiparticles with properties that matched theoretical predictions of Majorana fermions, the researchers believed that they had found a smoking gun that proved these peculiar particles really existed.

Regrettably, recent research showed that this finding was in error. The device that the 2017 researchers used was only supposed to generate signs of Majorana quasiparticles when exposed to a precise magnetic field. But new researchers from Penn State and the University of Würzburg found that these signs emerged whenever a superconductor and topological insulator were combined, regardless of the magnetic field. The superconductor, it turns out, acted as an electrical short in this system, resulting in a measurement that looked right, but was really just a false alarm. Since the magnetic field wasn't contributing to this signal, the measurements didn't match theory.

"This is an excellent illustration of how science should work," said one of the researchers. "Extraordinary claims of discovery need to be carefully examined and reproduced. All of our postdocs and students worked really hard to make sure they carried out very rigorous tests of the past claims. We are also making sure that all of our data and methods are shared transparently with the community so that our results can be critically evaluated by interested colleagues."

Majorana fermions are predicted to appear in devices where a superconductor is affixed on top of a topological insulator (also referred to as a quantum anomalous Hall insulator [QAH]; left panel). Experiments performed at Penn State and the University of Würzburg in Germany show that the small superconductor strip used in the proposed device creates an electrical short, preventing the detection of Majoranas (right panel).

Cui-zu Chang, Penn State

Beyond the intrinsic value of better understanding the nature of our universe, Majorana fermions could be put to serious practical use. They could lead to the development of what's known as a topological quantum computer.

A regular quantum computer is prone to decoherence; essentially, this is the loss of information to the environment. But Majorana fermions have a unique property when applied in quantum computers. Two of these fermions can store a single qubit (the quantum computer's equivalent of a bit) of information, as opposed to a regular quantum computer where a single qubit of information is stored in a single quantum particle. Thus, if environmental noise disturbs one Majorana fermion, its associated particle would still store the information, preventing decoherence.

To make this a reality, researchers are still persistently searching for the angel particle. As promising as the 2017 research appeared, it looks like the hunt continues.


Read more:
The hunt for the 'angel particle' continues - Big Think

Read More..

Google's Quantum Supremacy will mark the End of the Bitcoin in 2020 – The Coin Republic

Ritika Sharma Monday, 13 January 2020, 03:49 EST Modified date: Monday, 13 January 2020, 05:00 EST

Whenever quantum computing hits the headlines, it leaves not just Bitcoin holders but every cryptocurrency holder worried about the uncertainty around their holdings.

It is widely believed that the underlying technology of Bitcoin, blockchain, is immutable, meaning it cannot be changed or encrypted without authority over encryption keys.

However, with quantum computers, it is possible to break a blockchain's cryptographic codes. Quantum computing could hit the most significant features of blockchain, such as immutability and security, making it vulnerable.

Google achieved quantum supremacy in late 2019, which some argue poses a threat to Bitcoin and to blockchain more broadly: quantum computing could affect key features such as inalterability and security, making blockchain a highly vulnerable technology.

Later, China joined Google in the quantum supremacy race and announced it was working on quantum technology. With this, the year 2020 might witness the end of the crypto era.

How can Quantum computing break the Blockchain?

The reason behind this fear is quite genuine and straightforward: Bitcoin, like any cryptocurrency, depends on cryptography, hash functions and asymmetric cryptography whose security mainly relies on the limits of classical computing power. The hash function calculates a random-looking number for each block.

The results obtained by this process are effortless to verify but challenging to find. Quantum computing, however, has powerful algorithmic capabilities that are precisely the enemy of this kind of key.
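A small Python sketch of that asymmetry, loosely modeled on proof-of-work (the block data and difficulty here are made up for illustration): finding a qualifying nonce takes brute force, while verifying a claimed one is a single hash call.

```python
# "Hard to find, easy to verify": brute-force a nonce whose SHA-256 hash of
# (block_data + nonce) starts with a few zero hex digits, then verify it
# with one hash call. The difficulty value is arbitrary.
import hashlib

def block_hash(block_data: str, nonce: int) -> str:
    return hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()

def find_nonce(block_data: str, difficulty: int = 4) -> int:
    target = "0" * difficulty
    nonce = 0
    while not block_hash(block_data, nonce).startswith(target):
        nonce += 1          # expensive: nothing better than trial and error
    return nonce

def verify(block_data: str, nonce: int, difficulty: int = 4) -> bool:
    return block_hash(block_data, nonce).startswith("0" * difficulty)  # cheap

nonce = find_nonce("block #42: Alice pays Bob 1 BTC")
print(nonce, verify("block #42: Alice pays Bob 1 BTC", nonce))
```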

Quantum computing uses subatomic particles that can exist in more than one state at a time. This feature makes quantum computing far faster than the technology we use today.

Quantum computers could work 100 million times faster than current systems; that computational power could solve in a few seconds complex mathematical equations that current systems would take 10,000 years to solve.

With such computational power, quantum computers would be capable of reversing one-way functions, making one-way encryption obsolete.

The risk to blockchain is greater if this power falls into the wrong hands. Hackers with a quantum computer could hack the cryptocurrency ledger and take complete control of the blockchain.

Will Google's quantum computing wipe out your Bitcoins?

Google's quantum supremacy applies only against traditional computers on one narrow class of problems; it isn't general-purpose quantum technology. It was bluntly presented as "quantum supremacy," though it is just a step in the wider quantum computing space.

Even if Google's quantum computer demonstrates that its computing power on specific problems far exceeds the best-performing supercomputers, the results of this research do not mean much for Bitcoin. This isn't even close to what we could call breaking Bitcoin or blockchain.

Although Google's quantum supremacy does not pose any threat to Bitcoin, many people in the space are still stressed about the quantum threat theory. Many analysts claim that Shor's quantum algorithm can crack private keys, but again, there is a long way to go before it could break Bitcoin's blockchain.

According to researchers, a quantum computer with 4,000 qubits would undoubtedly be able to break the blockchain. Google's quantum computer has only 53 qubits, which cannot cause any harm to blockchain, and it is worth mentioning that the higher the qubit count, the more difficult the machine becomes to build.

Satoshi Nakamoto's proposed solution to beat quantum supremacy

Satoshi was a true visionary; the things we are concerned about today had already been answered by him. In 2010, Satoshi Nakamoto responded to a question about quantum computers posed by the username llama on Bitcointalk.

He replied that if Bitcoin were suddenly cracked, the signature scheme would be destroyed; but if it were broken slowly, the system would still have time to convert to a stronger function and re-sign all assets. A cruder answer to this question was suggested by the author of Mastering Bitcoin, Andreas Antonopoulos: "If the quantum computer comes, we will upgrade."

The quantum supremacy threat isn't new to the crypto world, and many cryptocurrency projects, such as Ethereum and quantum chains, are focused on making blockchains quantum resistant. Experts in the cryptocurrency space are also advocating the development of quantum encryption technology to ensure the security of funds.

Unless the threat of actual quantum computing with far more powerful processors materializes, Bitcoin and its developers still have time to secure it. Even so, with the continuous development of quantum technology and of chips with more qubits, a sword of Damocles will keep hanging over the head of cryptocurrency.

Follow this link:
Googles Quantum Supremacy will mark the End of the Bitcoin in 2020 - The Coin Republic

Read More..

Bleeding edge information technology developments – IT World Canada

What are some bleeding-edge information technology developments that a forward-thinking CIO should keep an eye on?

Here are a few emerging technologies that have caught my attention. These are likely to have an increasing impact on the world of business in the future. Consider which ones you should follow a little more closely.

A recent advance in quantum computing that a Google team achieved indicates that quantum computing technology is making progress out of the lab and closing in on practical business applications. Quantum computing is not likely to change routine business transaction processing or data analytics applications. However, quantum computing is likely to dramatically change computationally intense applications required for:

Since most businesses can benefit from at least a few of these applications, quantum computing is worth evaluating. For a more detailed discussion of specific applications in various topic areas, please read: Applying Paradigm-Shifting Quantum Computers to Real-World Issues.

Machine learning is the science of computers acting without software developers writing detailed code to handle every case in the data that the software will encounter. Machine learning software develops its own algorithms that discover knowledge from specific data and the software's prior experience. Machine learning is based on statistical concepts and computational principles.
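As a minimal illustration of that idea (a sketch using scikit-learn and its bundled iris dataset, not tied to any particular cloud provider's service), the classifier below derives its own decision rules from labelled examples rather than from hand-written logic.

```python
# Minimal sketch: the model learns its own decision rules from labelled data
# instead of a developer hand-coding them. Uses scikit-learn's bundled iris
# dataset purely for illustration.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)                     # rules inferred from the data
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```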

The leading cloud computing infrastructure providers offer machine learning routines that are quite easy to integrate into machine learning applications. These routines greatly reduce the expertise barriers that have slowed machine learning adoption at many businesses.

Selected business applications of machine learning include:

For summary descriptions of specific applications, please read: 10 Companies Using Machine Learning in Cool Ways.

Distributed ledger technology is often called blockchain. It enables new business and trust models. A distributed ledger enables all parties in a business community to see agreed information about all transactions, not just their own. That visibility builds trust within the community.

Bitcoin, a cryptocurrency, is the most widely known example application of blockchain.

Distributed ledger technology has great potential to revolutionize the way governments, institutions, and corporations interact with each other and with their clients or customers. Selected business applications of distributed ledger technology include:

For descriptions of industry-specific distributed ledger applications, please read: 17 Blockchain Applications That Are Transforming Society.
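To make the trust mechanism concrete, here is a toy Python sketch, not any production ledger (it omits consensus, signatures and networking), showing how hash chaining makes a shared ledger tamper-evident: every block commits to the previous block's hash, so any party holding a copy can detect a change anywhere in the history.

```python
# Toy hash-chained ledger: each block commits to the previous block's hash,
# so any participant holding a copy can detect tampering anywhere in history.
import hashlib, json

def make_block(transactions, prev_hash):
    block = {"transactions": transactions, "prev_hash": prev_hash}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

def chain_is_valid(chain):
    for prev, current in zip(chain, chain[1:]):
        body = {k: v for k, v in current.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if current["prev_hash"] != prev["hash"] or current["hash"] != recomputed:
            return False
    return True

chain = [make_block(["genesis"], prev_hash="0" * 64)]
chain.append(make_block(["Alice pays Bob 5"], chain[-1]["hash"]))
chain.append(make_block(["Bob pays Carol 2"], chain[-1]["hash"]))
print(chain_is_valid(chain))                           # True

chain[1]["transactions"] = ["Alice pays Mallory 500"]  # tamper with history
print(chain_is_valid(chain))                           # False: every copy holder can see it
```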

The Industrial Internet of Things (IIoT) is a major advance on Supervisory Control and Data Acquisition (SCADA). SCADA, in many forms, has been used for decades to safely operate major industrial facilities including oil refineries, petrochemical plants, electrical power generation stations, and assembly lines of all kinds.

IIoT is a major advance over relatively expensive SCADA. IIoT relies on dramatically cheaper components including sensors, network bandwidth, storage and computing resources. As a result, IIoT is feasible in many smaller facilities and offers a huge increase in data points for larger facilities. Business examples where IIoT delivers considerable value include production plants, trucks, cars, jet engines, elevators, and weather buoys.
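As a flavor of the kind of cheap telemetry IIoT relies on, here is a minimal sketch using paho-mqtt (assuming the 1.x client API); the broker address, topic name and sensor reading are placeholders, and a real deployment would add TLS, authentication and an actual sensor driver.

```python
# Minimal IIoT telemetry sketch: publish a simulated sensor reading to an
# MQTT broker. Assumes paho-mqtt 1.x; broker, topic and values are placeholders.
import json, random, time
import paho.mqtt.client as mqtt

client = mqtt.Client(client_id="pump-station-7")
client.connect("broker.example.com", 1883)      # hypothetical broker
client.loop_start()

while True:
    reading = {
        "device": "pump-station-7",
        "temperature_c": round(random.uniform(60.0, 80.0), 2),  # simulated sensor
        "timestamp": time.time(),
    }
    client.publish("plant/line-3/pump-7/telemetry", json.dumps(reading), qos=1)
    time.sleep(10)
```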

The aggressive implementation of IIoT can:

For summary descriptions of specific IIoT applications, please read: The Top 20 Industrial IoT Applications.

RISC-V is an open-source hardware instruction set architecture (ISA) for CPU microprocessors that is growing in importance. It's based on established reduced instruction set computer (RISC) principles. The open-source aspect of the RISC-V ISA is a significant change compared to the proprietary ISA designs of the dominant computer chip manufacturers Intel and Arm.

RISC-V offers a way around paying ISA royalties for CPU microprocessors to either of the monopolists. The royalties may not be significant for chips used in expensive servers or smartphones, but they are significant for the cheap chips required in large numbers to implement the IIoT applications listed above.

For an expanded discussion of RISC-V, please read: A new blueprint for microprocessors challenges the industry's giants.

What bleeding edge information technology developments would you add to this list? Let us know in the comments below.

Go here to read the rest:
Bleeding edge information technology developments - IT World Canada

Read More..