
Why Schrödinger’s Cat is still the most controversial thought experiment in science – BBC Science Focus Magazine

One of the most important tools in the theoretical physicist's toolkit is the thought experiment. If you study relativity, quantum mechanics, or any area of physics applying to environments or situations in which you cannot (or should not) place yourself, you'll find that you spend a lot more time working through imaginary scenarios than setting up instruments or taking measurements.

Unlike physical experiments, thought experiments are not about collecting data, but rather about posing an imaginary question and working through an if/then logical sequence to explore what the theory really means.

Asking "what has to happen if the theory is true?" is invaluable for developing intuition and anticipating new applications. In some cases, a thought experiment can reveal the deep philosophical implications of a theory, or even present what appears to be an unsolvable paradox.

Probably the most famous of all physics thought experiments is that of Schrödinger's Cat, both because it involves (purely hypothetical!) carnage, and because its implications for the nature of reality in a quantum world continue to challenge students and theorists everywhere.

The basic (again, purely hypothetical) experimental setup is this. Imagine you have a radioactive material in which there is a 50 per cent chance of a nuclear decay in some specified amount of time (let's say, one hour).

You put this material in a box along with a small glass vial of poison and a device that will break the vial if a radioactive decay is detected. Then, you put a live cat in the box, close the lid, wait an hour, and then open the box once again.

Based on this setup, it's straightforward to deduce that since the chance the atom decays and triggers the poison is 50 per cent, half the time you do the experiment you should find a living cat, and half the time a dead one, assuming you're not re-using the same cat each time.
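
That expected 50/50 split is easy to check with a quick Monte Carlo sketch. This is purely illustrative; the function name, trial count, and decay probability are arbitrary choices, not anything from the original thought experiment beyond its 50 per cent odds:

```python
import random

def run_cat_experiment(trials=100_000, p_decay=0.5):
    """Simulate repeated runs of the (purely hypothetical) box experiment."""
    # In each trial the atom decays with probability p_decay;
    # a decay breaks the vial, so the cat is found dead on opening.
    alive = sum(1 for _ in range(trials) if random.random() >= p_decay)
    return alive / trials  # fraction of runs with a living cat

print(run_cat_experiment())  # hovers around 0.5
```

Over many trials the living-cat fraction settles near one half, exactly as the classical probability argument predicts.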

But when Erwin Schrödinger described the thought experiment to Albert Einstein in 1935, he did so to highlight an apparent consequence of quantum theory that seemed to both scientists to be complete nonsense: the idea that, before you open the box, the cat is both alive and dead at the same time.

Ultimately, it comes down to the principle of uncertainty in quantum mechanics. Unlike classical mechanics (the kind of physics that applies to our everyday experiences), in quantum mechanics, there seems to be a fundamental uncertainty built into the nature of reality.

When you flip a coin (a classical event), it's only random because you're not keeping careful enough track of all the motions and forces involved. If you could measure absolutely everything, you could predict the outcome every time: it's deterministic.

But in the quantum mechanical version of a coin flip, the radioactive decay, nothing you measure can possibly tell you the outcome before it occurs. As far as an outside observer is concerned, until the measurement of the quantum coin flip occurs, the system will act like it's in both states at once: the atom is both decayed and not decayed, in what we call a superposition.

Superposition is a real phenomenon in quantum mechanics, and sometimes we can even use it to our advantage. Quantum computing is built on the idea that a quantum computer bit (or qubit), instead of being just one or zero, can be in a superposition of one and zero, massively increasing the computer's ability to do many complex calculations at once.
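
As a rough numerical illustration of what a superposition means, here is a minimal sketch of an equal-superposition qubit and its measurement probabilities under the Born rule (plain Python, no quantum library assumed; the variable names are just conventions):

```python
import math

# A qubit state |psi> = a|0> + b|1>, stored as two complex amplitudes.
# The equal superposition -- the quantum "coin" mid-flip:
a = b = complex(1 / math.sqrt(2), 0)

# Born rule: a measurement yields 0 with probability |a|^2 and 1 with |b|^2.
p0, p1 = abs(a) ** 2, abs(b) ** 2
print(p0, p1)  # 0.5 and 0.5; the probabilities always sum to 1
```

Until the measurement, the state genuinely carries both amplitudes at once; the probabilities only describe what happens when you finally look.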

In the case of Schrödinger's Cat, the apparently absurd conclusion that the cat is both alive and dead comes from considering the whole apparatus (the atom, the trigger device, the poison vial, and the cat) to be a single quantum system, each element of which exists in a superposition.

The atom is decayed and not, the device is triggered and dormant, the vial is broken and intact, and the cat is therefore simultaneously dead and alive, until the moment the box is opened.

Whether this conclusion is actually absurd is an open question. What both Schrödinger and Einstein concluded was that true, fundamental uncertainty simply cannot apply to the real, macroscopic world. These days, most physicists accept that uncertainty is real, at least for subatomic particles, but how that uncertainty 'collapses' when a measurement is made remains up for debate.

In one interpretation, any measurement that's performed fundamentally alters reality, though it is usually argued that the trigger device, or at least the cat itself, provides a measurement for that purpose. In another interpretation, called Many Worlds, the entire Universe duplicates itself every time a quantum coin is flipped, and the measurement simply tells you whether you're in the dead-cat or alive-cat universe from now on.

While we can't say how long it will take before we fully understand what's really going on in the black box of quantum superposition, applications of quantum theory are already bringing us incredible technological advances, like quantum computers. And in the meantime, clever thought experiments allow us to follow our curiosity, without running the risk of killing any cats.


Germanium-Tin Transistor Developed as an Alternative to Silicon – Technology Networks


Scientists at Forschungszentrum Jülich have fabricated a new type of transistor from a germanium-tin alloy that has several advantages over conventional switching elements. Charge carriers move faster in the material than in silicon or germanium, which enables operation at lower voltages. The transistor thus appears to be a promising candidate for future low-power, high-performance chips, and possibly also for the development of future quantum computers.

Over the past 70 years, the number of transistors on a chip has doubled approximately every two years according to Moore's Law, which is still valid today. The circuits have become correspondingly smaller, but an end to this development appears to be in sight. "We have now reached a stage where structures are only 2 to 3 nanometers in size. This is approximately equal to the diameter of 10 atoms, which takes us to the limits of what is feasible. It doesn't get much smaller than this," says Qing-Tai Zhao of the Peter Grünberg Institute (PGI-9) at Forschungszentrum Jülich.

For some time now, researchers have been looking for a substitute for silicon, the primary material used in the semiconductor industry. "The idea is to find a material that has more favourable electronic properties and can be used to achieve the same performance with larger structures," the professor explains.

The research is in part focused on germanium, which was already being used in the early days of the computer era. Electrons can move much faster in germanium than in silicon, at least in theory. However, Qing-Tai Zhao and his colleagues have now gone one step further. To optimize the electronic properties even further, they incorporated tin atoms into the germanium crystal lattice. The method was developed several years ago at the Peter Grünberg Institute (PGI-9) of Forschungszentrum Jülich.

"The germanium-tin system we have been testing makes it possible to overcome the physical limitations of silicon technology," says Qing-Tai Zhao. In experiments, the germanium-tin transistor exhibits an electron mobility that is 2.5 times higher than that of a comparable transistor made of pure germanium.


Another advantage of the new material alloy is that it is compatible with the existing CMOS process for chip fabrication. Germanium and tin come from the same main group in the periodic table as silicon. The germanium-tin transistors could therefore be integrated directly into conventional silicon chips with existing production lines.

Apart from classical digital computers, quantum computers could also benefit from the germanium-tin transistor. For some time, there have been efforts to integrate parts of the control electronics directly on the quantum chip, which is operated inside a quantum computer at temperatures close to absolute zero. Measurements suggest that a transistor made of germanium-tin will perform significantly better under these conditions than those made of silicon.

"The challenge is to find a semiconductor whose switching can still be very fast with low voltages at very low temperatures," explains Qing-Tai Zhao. For silicon, this switching curve flattens out below 50 Kelvin. The transistors then need a high voltage and thus a high power, which ultimately leads to failures of the sensitive quantum bits because of the heating. Germanium-tin performs better at these temperatures in measurements down to 12 Kelvin, and there are hopes to use the material at even lower temperatures, says Qing-Tai Zhao.

In addition, the germanium-tin transistor is a further step towards optical on-chip data transmission. The transmission of information with light signals is already standard in many data networks because it is considerably faster and more energy-efficient than data transfer via electrical conductors. In the field of micro- and nanoelectronics, however, data is usually still sent electrically. Colleagues from the Jülich working group of Dr. Dan Buca have already developed a germanium-tin laser that opens up the possibility of transmitting data optically directly on a silicon chip. The germanium-tin transistor, along with these lasers, provides a promising solution for the monolithic integration of nanoelectronics and photonics on a single chip.

Reference:Liu M, Junk Y, Han Y, et al. Vertical GeSn nanowire MOSFETs for CMOS beyond silicon. Commun Eng. 2023;2(1):1-9. doi:10.1038/s44172-023-00059-2

This article has been republished from the following materials. Note: material may have been edited for length and content. For further information, please contact the cited source.


GCHQ-linked fund backs UK quantum start-up in race against China – The Telegraph

Steve Brierley, founder and chief executive of Riverlane, said the Government had identified quantum as "a critical technology the UK has leadership in" so "they want to continue to back companies who are ahead in the marketplace".

Britain's universities are seen as having leading expertise in quantum computers, which use lasers and extremely cold temperatures to manipulate particles and unlock their quantum properties. Several start-ups are developing quantum computers in the UK, with a view to commercialising the technology.

The NSSIF also has a stake in Quantum Motion, which is developing semiconductor technology for the advanced computers, as officials attempt to foster the nascent sector.

While quantum computers promise to be far more powerful than modern classical machines, the newer devices currently have a high error rate that makes them difficult to use reliably.

Riverlane, which works with companies including Rolls Royce and AstraZeneca, has developed hardware decoders and software that can root out these errors and correct them, meaning quantum computers can run more smoothly.

The company plans to develop a quantum semiconductor chip that can perform a similar function by 2025.

The fresh investment has roughly tripled Riverlane's valuation to around £150m, The Telegraph understands.

The new funding round was led by London-listed fund Molten Ventures, along with investment from US-listed computing company Altair. Current investors Amadeus Capital and Cambridge Innovation Capital also joined the deal.

A British Business Bank spokesman said: "The investment in Riverlane recognises the company's role as part of the UK's world-leading quantum technology sector."

The Future Fund, the taxpayer-backed pandemic rescue vehicle launched by Rishi Sunak, has stakes in quantum security companies Arqit and Kets Quantum, and a shareholding in computer maker Oxford Quantum Circuits.


Fellowship winners will continue their studies in England – Yale News

Eight Yale seniors and a recent graduate have been awarded fellowships for graduate study at the universities of Oxford or Cambridge in the United Kingdom.

These fellowship recipients are in addition to the students previously announced in Yale News who have won Rhodes and Marshall Scholarships.

The fellowship winners and their awards are:

Danielle Castro has received a Paul Mellon Fellowship to pursue an M.Phil. in population health sciences at the University of Cambridge. Next month, she will graduate from Yale with a certificate in global health and a joint B.S./M.S. degree in molecular biochemistry. Her thesis is on the development of novel drug candidates for chordoma spine cancers in the laboratory of Craig Crews, the John C. Malone Professor of Molecular, Cellular and Developmental Biology. She has a strong connection to her Peruvian and Indigenous heritage, and is passionate about social justice and reducing health inequities. She has worked toward this goal while interning in the New Haven Public Schools, serving on the board of the HAVEN Free Clinic, and conducting public health research in Connecticut and the Peruvian Amazon. She enjoys meeting and mentoring other first-generation immigrant and low-income students, especially in her role as co-president of Latina Women at Yale.

Aidan Evans has been awarded a Huawei Hisilicon Scholarship to earn a Ph.D. in computer science at the University of Cambridge. He is majoring in computer science and philosophy at Yale and additionally is completing a B.S./M.S. in computer science. During his time at Yale he published research on quantum computing at the premier conference on software engineering. He has also served as a teaching assistant for seven courses, ranging from those on systems programming and computer organization to graduate courses on the interplay of computer science with law. Most recently he has taken on the project of writing a book on the history of Yale's computer science department. At Cambridge, he will study the logic and the foundations of computer science under the supervision of Professor Anuj Dawar.

Beasie Goddu was awarded a Paul Mellon Fellowship for graduate study at the University of Cambridge, where she will pursue an M.Phil. in English literature. She will examine the portrayal of women's rights in early 20th-century British fiction. She is majoring in English at Yale with a concentration in creative writing. Her academic thesis explored women's agency over physical space in the works of Virginia Woolf and E.M. Forster. Her creative writing thesis is a collection of essays about vision. She serves as a writing partner at Yale's Poorvu Center for Teaching and Learning, is a senior editor of The New Journal, a student-run magazine that features creative nonfiction, and is an undergraduate editorial fellow at The Yale Review. She is also president of St. Anthony Hall, an arts and literary society. She aspires to a career in editing, highlighting marginalized female voices.

Tyler Jager was awarded a King's-Yale Fellowship to pursue an M.Phil. in political thought and intellectual history at the University of Cambridge. He will focus on early 20th-century history and efforts to restrict migration and the freedom of movement, particularly in the British Empire. He will graduate from Yale with a joint B.A./M.A. degree in political science and a certificate in human rights. He was the 2022 winner of the Elie Wiesel Prize in Ethics for an essay he wrote on aid workers in the Mediterranean. He has also written about that topic, tenant organization, and lead poisoning in New Haven for a number of campus and national publications, and currently serves as co-editor of BRINK, Yale's undergraduate book review. His senior thesis, an ethnographic study in Greece, explored how aid workers' presence in host communities affects anti-refugee prejudice in European Union external border zones. Jager is a tour guide at the Yale University Art Gallery and was the coordinator of the Yale Hunger and Homelessness Action Project Fast, the university's largest student fundraiser. He has interned at the U.S. Holocaust Memorial Museum and at the journal Foreign Affairs.

Hamzah Jhaveri has received a Keasbey Scholarship to pursue an M.Phil. in social anthropology at the University of Cambridge. At Yale, he majored in anthropology, with a particular interest in the study of moral economics and the corporate form. He has been researching gun culture and commerce in America, and his senior thesis investigates the transformation of the gun-making trade in an early American settlement in Pennsylvania known for its pacifist religious values and socialist economy. Jhaveri served as the editor-in-chief of the Yale Herald, wrote and performed with sketch comedy groups including the Fifth Humor and Playspace, and has been an organizer with the Yale Endowment Justice Coalition, Sunrise New Haven (the local chapter of a national movement to stop the climate crisis and create millions of new jobs), and New Haven Rising (a community organization dedicated to achieving economic, racial, and social justice through collective action). He has spent summers teaching fifth-graders about climate organizing, interning at a First Amendment law firm, researching petrochemical companies, and harvesting microgreens at an urban hydroponics farm.

Elizabeth Hopkinson was awarded a Paul Mellon Fellowship to pursue an M.Phil. in health, medicine, and society at Clare College, Cambridge. She graduated from Yale in December 2022 with a B.A. in environmental studies. Her senior thesis explored end-of-life care using geographic concepts of place and place-making. At Cambridge, she will continue to study how places affect experiences of aging, dying, and disability. She was a leader of FOOT (First-year Orientation Trips), was a first-year counselor in Jonathan Edwards College, a Yale Daily News editor, and a research assistant at the Yale School of Nursing and in the Human Nature Lab. During the height of the COVID pandemic, she worked as an EMT near her home in Westborough, Massachusetts.

Shaezmina Khan has been awarded the Rotary Global Grant Scholarship to pursue an M.Sc. in global governance and diplomacy from the University of Oxford. She is majoring in global affairs at Yale and will obtain a certificate in human rights from Yale Law School. For her senior capstone, Khan worked for the Afghanistan War Commission and assessed U.S. diplomatic efforts to achieve political settlement in Afghanistan between 2002 and 2021. At Oxford, she hopes to focus her research on regional security dilemmas and conflict mediation in the Afghanistan-Pakistan-India region. She is passionate about American foreign policy, national security, diplomacy, and peacebuilding in the Middle East and North Africa region. She served as a policy trainee at the European Commission in Brussels and as a legislative intern for U.S. Congresswoman Rosa DeLauro in Washington, D.C. She served as the executive director of the Yale International Relations Association and president of the Muslim Students Association, and was a research assistant at both the Yale Law School and Jackson School for Global Affairs.

Ethan Pesikoff received a Henry Fellowship to earn a Master of Advanced Studies (MASt) degree in pure mathematics at the University of Cambridge. At Yale, he is majoring in both mathematics and Near Eastern Languages and Civilizations (NELC). He served on the board of the Yale Undergraduate Math Society, which organizes academic support and social activities for students, and he conducted original mathematical research at Williams College and the University of Minnesota during summer breaks. His senior thesis for NELC seeks to understand previously untranslated Akkadian texts from the early second millennium BCE. After completing his MASt at Cambridge, Pesikoff plans to pursue a Ph.D. in mathematics.

Melissa Wang was awarded a Paul Mellon Fellowship to pursue an M.Phil. in U.S. history at the University of Cambridge, where she will study the consolidation of correctional officer power in late 20th-century America and its effect on mass incarceration policy and prisoners' lives. Her research is intended to place correctional officers within a broader history of American law enforcement, militarism, and race. At Yale, she is majoring in history, and ethnicity, race, and migration, and is a scholar in the Multidisciplinary Academic Program in Human Rights. She has served on the board of the Yale Undergraduate Prison Project (YUPP) and Yale Women's Center, and captains the Yale club Wushu team. Her research interests were inspired by work with Stop Solitary Connecticut's legislative campaign as a project leader at YUPP and as a research assistant at the Yale Law School Lowenstein Clinic. A painter, she is also a volunteer with Justice Arts Coalition, a national network and resource for those creating art in and around the criminal legal system.


UK expected to offer $1.25 billion for nation’s semiconductor industry – Computerworld

UK Prime Minister Rishi Sunak is reportedly about to follow in the footsteps of the US and several European governments by announcing a funding package designed to build up the country's domestic semiconductor industry, according to a report this week from Politico.

While the exact amount of the funding could change, according to Politico's sources, a topline figure of £1 billion ($1.25 billion) is expected. The UK government's Department for Science, Innovation and Technology is thought to be the prime mover behind the policy, and Sunak is said to be planning to unveil it at next month's G7 meeting in Japan.

Government efforts to build domestic semiconductor manufacturing capacity have been spurred largely by the events of the pandemic, the 2022 Russian invasion of Ukraine and the ongoing US semiconductor trade dispute with China. The former event, thanks to the consequent enormous upsurge of remote work, created a new wave of semiconductor demand, highlighting the dependence of the global technology sector on foundries in East Asia. US policy dating back to the Trump administration then created a new set of barriers to exports from China, while the invasion of Ukraine further exacerbated strains on the global supply chain.

Hence, in an increasingly unsettled geopolitical situation, national governments whose countries depend on a large supply of computer chips have taken increasingly dramatic steps to either build new production capabilities or buttress existing ones. The US's own CHIPS and Science Act, signed into law by President Biden last summer, appropriated more than $52 billion for a range of incentives, including $39 billion in manufacturing incentives designed to keep semiconductor fabs run by US companies in the country, and to provide major subsidies for companies looking to create new ones.

The UK's plan would fit in with existing government policy regarding the technology sector. During his Spring Statement, Chancellor of the Exchequer Jeremy Hunt announced several measures, including R&D support for small to midsize businesses in the form of tax credits, an annual $1.25 million award for excellence in AI research, and $3.12 billion in financial support for the government's 10-year plan for quantum computing development. Additionally, the government plans to offer new childcare subsidies for tech workers, and to implement retraining initiatives designed to allow older workers to participate in the tech sector.


No need for a super computer: Describing electron interactions efficiently and accurately – Phys.org


One of the outstanding challenges in the field of condensed matter physics is finding computationally efficient and simultaneously accurate methods to describe interacting electron systems for crystalline materials.

In a new study, researchers have discovered an efficient but highly accurate method of doing so. The work, led by Zheting Jin (a graduate student in Yale Applied Physics) and his thesis supervisor, Sohrab Ismail-Beigi, is published in Physical Review B.

Developing methods to accurately describe interacting quantum electrons has long been of interest to researchers in the field because it can provide valuable insights about many important aspects of materials. Describing the electrons at this level is tricky for a few reasons, though. One is that, because they're quantum mechanical, they move in a wavy manner and tracking them is more complicated. The other is that they interact with each other.

Each component of this problem is "OK to deal with separately," said Ismail-Beigi, Strathcona Professor of Applied Physics, Physics, and Mechanical Engineering & Materials Science. "But when you have waviness and interactions, the problem is so complex that nobody knows how to solve it efficiently."

Like many difficult problems in physics and mathematics, one can in principle take a giant computer and numerically solve the problem with brute force, but the amount of computation and storage needed would be exponential in the number of electrons. For example, every time one adds a new electron to the system, the size of the computer needed increases by a factor of two (typically, even a larger factor). This means studying a system with about 50 electrons is infeasible even with today's largest supercomputers. For context, a single iodine atom has 53 electrons, while a small nanoparticle has more than 1,000 electrons.
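
A back-of-the-envelope sketch makes the exponential cost concrete. The assumptions here are the simplest possible: one two-level degree of freedom per particle and 16 bytes per complex amplitude (real electron problems are even larger):

```python
# The quantum state space grows exponentially with particle count:
# n particles need 2**n complex amplitudes. At 16 bytes per amplitude:
def state_vector_bytes(n_particles):
    return (2 ** n_particles) * 16

for n in (10, 30, 50):
    print(n, "particles:", state_vector_bytes(n) / 1e9, "GB")
# 50 particles already require roughly 18 million GB (18 petabytes),
# which is why brute-force simulation stalls at around this scale.
```

Adding a single particle doubles the storage, matching the factor-of-two growth described above.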

"On the one hand, the electrons want to move around; that's to take advantage of the kinetic energy," Ismail-Beigi said. "On the other, they repel each other: 'don't come next to me if I'm here already.' Both effects are captured in the well-known Hubbard model for interacting electrons. Basically, it has these two key ingredients, and it's a very hard problem to solve. No one knows how to solve it exactly, and high-quality approximate and efficient solutions are not easy to come by."
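
For readers who want to see the two ingredients in action, here is a minimal sketch of the smallest nontrivial case: the two-site Hubbard model with two electrons, diagonalized by brute force. The basis ordering and sign conventions below are one common textbook choice, not anything taken from the Yale group's paper:

```python
import numpy as np

# Two-site Hubbard model, two electrons, Sz = 0 sector.
# Basis: |up,dn>, |updn,0>, |0,updn>, |dn,up>
# t = hopping (kinetic energy), U = on-site repulsion.
t, U = 1.0, 4.0
H = np.array([
    [0., -t, -t, 0.],
    [-t,  U, 0., -t],
    [-t, 0.,  U, -t],
    [0., -t, -t, 0.],
])
E0 = np.linalg.eigvalsh(H).min()

# Known closed-form ground-state energy for this tiny case:
exact = (U - np.sqrt(U**2 + 16 * t**2)) / 2
print(E0, exact)  # both are about -0.828 for t=1, U=4
```

The 4x4 matrix is trivial; the point is that the matrix dimension doubles with every added orbital, which is exactly why larger Hubbard systems defeat brute force and motivate the cluster approach described next.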

The Ismail-Beigi team has developed a method related to a class of approaches that use what's known as an auxiliary or subsidiary boson. Typically, these approaches require much less computational resources but are only moderately accurate as they treat one atom at a time. Ismail-Beigi's team tried a different tack. Rather than examining one atom at a time, the researchers treat two or three bonded atoms at a time (called a cluster).

"Electrons can hop between the atoms in the cluster: we solve the cluster problem directly, and then we connect the clusters together in a novel way to describe the entire system," Ismail-Beigi said. "In principle, the larger the cluster, the more accurate the approach, so the question is how large a cluster does one need to get a desired accuracy?"

Researchers have previously tried cluster approaches, but the computational costs have been prohibitively high and the accuracy has been wanting, given the added computational cost.

"Zheting and I found a clever way of matching different clusters together so that the quantities calculated between the different clusters agree across their boundaries," he said. "The good news is that this method then gives a very highly accurate description with even a relatively small cluster of three atoms. Because of the smooth way one glues the clusters together, one describes the long-range motion of the electrons well in addition to the localized interactions with each other. Going into this project, we didn't expect it to be this accurate."

Compared to literature benchmark calculations, the new method is three to four orders of magnitude faster.

"All the calculations in the paper were run on Zheting's student laptop, and each one completes within a few minutes," Ismail-Beigi said. "Whereas for the corresponding benchmark calculations, we have to run them on a computer cluster, and that takes a few days."

The researchers said they look forward to applying this method to more complex and realistic materials problems in the near future.

More information: Zheting Jin et al, Bond-dependent slave-particle cluster theory based on density matrix expansion, Physical Review B (2023). DOI: 10.1103/PhysRevB.107.115153

Journal information: Physical Review B


What is cloud hosting and how do you use it? – TechRadar

With the continuous rise of technology and the need for businesses to establish an online presence, the best web hosting services have become an essential aspect of the digital world.

Among the many types of hosting services available, cloud hosting has proven to be a reliable option for millions of businesses worldwide. However, many people still do not fully understand what cloud hosting is and how it works.

Let's explore the basics of cloud hosting and its benefits, as well as provide steps on how to use it effectively.

Cloud hosting is a service where a website or application is hosted on a network of interconnected servers, known as the cloud, instead of being hosted on a single physical server. The hosting provider manages the cloud infrastructure, which enables you to have access to a range of computing resources, such as CPU, RAM, storage, and bandwidth.

When you sign up for cloud hosting, you can choose the amount of resources you need and pay for only what you use. The hosting provider then allocates the necessary resources from the cloud infrastructure and creates a virtual server instance for you. This virtual server instance can be easily scaled up or down as per your requirements, without the need for any physical hardware modifications.

You can then access your virtual server instance using remote login credentials and manage your website or application, install software, and perform other tasks. The hosting provider takes care of the server maintenance, security, backups, and updates, which helps you focus on your business goals rather than worrying about IT infrastructure.

The cost of the best cloud hosting services can vary depending on several factors such as the cloud provider, the type of hosting plan, the amount of resources required, and the location of the data center. Most cloud providers offer a pay-as-you-go pricing model where you only pay for the resources you use, such as storage, bandwidth, and computing power. This can be a more cost-effective option for businesses that have varying needs throughout the year.

The cost of cloud hosting can range from $10 per month for a basic plan with limited resources to $200 or even thousands of dollars per month for enterprise-level plans with advanced features and dedicated resources.
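
The pay-as-you-go model described above amounts to a simple metered sum over resource categories. The sketch below uses made-up rates purely for illustration; real providers publish their own price lists, and the category names here are hypothetical:

```python
# Hypothetical per-unit rates -- not any real provider's pricing.
RATES = {
    "compute_hours": 0.04,   # $ per vCPU-hour
    "storage_gb":    0.10,   # $ per GB-month
    "egress_gb":     0.09,   # $ per GB transferred out
}

def monthly_cost(usage):
    """Estimate a month's bill from metered usage figures."""
    return sum(RATES[category] * amount for category, amount in usage.items())

# Example: one vCPU running all month, 50 GB stored, 100 GB egress.
print(monthly_cost({"compute_hours": 730, "storage_gb": 50, "egress_gb": 100}))
```

Working through an estimate like this, with your provider's actual rates, is the practical way to compare a pay-as-you-go bill against a fixed monthly plan.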

It's important to note that while cloud hosting can be a cost-effective solution, it's essential to carefully consider your needs and choose the right plan for your business to avoid overspending. Also, you should keep an eye out for any additional charges such as data transfer fees, storage fees, or add-on services that may increase the overall cost.

Cloud hosting is a type of web hosting that utilizes a network of remote servers to store, manage, and process data, whereas VPS hosting is a type of hosting that uses virtualization technology to create a dedicated server environment on a shared server.

With cloud hosting, resources like CPU, RAM, and storage are distributed across multiple servers, whereas with VPS hosting, a portion of the physical server's resources is allocated to each VPS instance.

Cloud hosting is highly scalable and can easily accommodate sudden spikes in traffic by automatically adding resources as needed. On the other hand, VPS hosting is less scalable as the amount of resources allocated to each VPS instance is fixed.
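The automatic scaling described above amounts to a simple rule: provision enough instances for the current load, within plan limits. A toy sketch, with purely assumed capacity figures and limits:

```python
import math

# Toy version of the autoscaling rule cloud platforms apply: size the
# fleet to current traffic, within plan limits. The per-instance
# capacity and the limits are illustrative assumptions only.

def instances_needed(requests_per_sec, capacity_per_instance=500,
                     min_instances=1, max_instances=20):
    """Return how many server instances the current load calls for."""
    needed = math.ceil(requests_per_sec / capacity_per_instance)
    return max(min_instances, min(needed, max_instances))

print(instances_needed(120))   # quiet period: 1
print(instances_needed(4800))  # traffic spike: 10
```

A VPS, by contrast, keeps its fixed allocation no matter what the load does, which is exactly the scalability gap the comparison above describes.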

Web hosting, in the traditional sense, is a service where a website or web application is hosted on a single physical server, and all the resources such as storage, memory, and processing power are shared among the websites hosted on that server. Web hosting is typically used for small to medium-sized websites that do not require a lot of resources.

Cloud hosting, on the other hand, is a type of hosting service where a website or web application is hosted on a network of servers that are interconnected and work together to provide hosting services. In cloud hosting, the website or application is hosted on a virtual server that is created by combining the resources of multiple physical servers.

This allows cloud hosting providers to offer a high level of scalability, reliability, and performance to their customers. Cloud hosting is typically used for larger websites or web applications that require a lot of resources and high levels of availability.

There are many cloud hosting providers available in the market such as Amazon Web Services, Microsoft Azure, Google Cloud Platform, and many more. Choose a provider that offers the features and services that match your requirements.

Cloud hosting providers offer a variety of hosting plans to choose from. You can select the plan that suits your needs in terms of storage, bandwidth, and other resources. Once you have chosen your hosting plan, you can set up your cloud server by following the instructions provided by your hosting provider. You will need to create an account, choose your server location, and configure your server settings.

You can install your software or application on the cloud server using the control panel or command line interface provided by your hosting provider. Next, you will configure your server settings such as security, firewall, and backup options to ensure your website or application is secure and available at all times.

It's important to monitor your cloud server regularly to ensure it's running smoothly and to identify any issues. You can use monitoring tools provided by your hosting provider or third-party tools to monitor your server performance, uptime, and other metrics.
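As a minimal illustration of the uptime metric such monitoring tools report, the sketch below computes the share of periodic health checks that succeeded (the probe data is made up):

```python
# Uptime as monitoring dashboards typically report it: the percentage
# of periodic health-check probes that came back successful.

def uptime_percent(checks):
    """checks: list of booleans, one per health-check probe."""
    if not checks:
        return 0.0
    return 100.0 * sum(checks) / len(checks)

# Suppose 287 of 288 five-minute probes in a day succeeded:
day = [True] * 287 + [False]
print(f"{uptime_percent(day):.2f}%")  # → 99.65%
```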

Here are some benefits and risks of cloud hosting:

Benefits

Scalability: Cloud hosting services offer scalability, which means that resources can be easily scaled up or down as per the changing demands of the user.

Cost-effective: Cloud hosting services can be cost-effective as you only pay for the resources you use, and there is no need for upfront investment in hardware.

High availability: Cloud hosting services offer high availability, meaning that you can access your data and applications at any time and from any location with an internet connection.

Security: Cloud hosting providers usually have advanced security measures in place, including encryption and firewalls, to protect your data.

Flexibility: Cloud hosting services offer a high degree of flexibility as you can access your data and applications from any device with an internet connection.

Risks

Reliance on internet connection: Cloud hosting services rely on an internet connection, which means that if the internet connection goes down, you may not be able to access your data or applications.

Security risks: While cloud hosting providers have advanced security measures in place, there is always a risk of data breaches or cyber-attacks.

Limited control: Cloud hosting services limit the level of control you have over your data and applications.

Compliance concerns: Some industries have specific regulatory requirements that may be difficult to comply with when using cloud hosting services.

Vendor lock-in: It can be difficult to switch cloud hosting providers once data and applications have been migrated to a particular service, which can create a vendor lock-in situation.

Cloud hosting can benefit a wide range of users, including individuals, small businesses, large enterprises, and even government agencies.

Here are some specific use cases where cloud hosting may be particularly beneficial:

Startups: Cloud hosting can be an ideal choice for startups that need to scale their operations quickly and efficiently without investing heavily in infrastructure.

Ecommerce websites: Online retailers can leverage cloud hosting to handle sudden surges in traffic during peak shopping seasons like Black Friday and Cyber Monday.

Mobile app developers: Developers can use cloud hosting to host their backend infrastructure and easily scale up or down as needed to handle fluctuations in user traffic.

Content creators: Bloggers, podcasters, and other content creators can use cloud hosting to store and distribute their content globally, ensuring high availability and fast delivery to their audience.

Enterprises: Large businesses with complex IT infrastructure needs can use cloud hosting to reduce costs, increase scalability, and improve their disaster recovery and business continuity capabilities.

Cloud hosting has revolutionized the way some businesses store and manage data. With its flexibility, scalability, and reliability, it provides a better alternative to traditional hosting. By choosing a cloud hosting provider and configuring your resources, you can take advantage of the benefits of cloud hosting and improve your business's web presence.

TechRadar created this content as part of a paid partnership with Hostinger. The contents of this article are entirely independent and solely reflect the editorial opinion of TechRadar.

See the rest here:
What is cloud hosting and how do you use it? - TechRadar


Decentralized Exchange Vertex Launches on Ethereum Layer 2 Arbitrum – CoinDesk

Vertex, a decentralized exchange (DEX) for the spot and derivatives trading of digital assets, has gone live on Arbitrum (ARB), a popular network built atop the Ethereum blockchain.

Vertex, which had been operating on a test network, combines an off-chain order book layered on top of an on-chain automated market maker on a decentralized, self-custodial exchange. The firm, which has bases in Singapore and the Cayman Islands, counts Jane Street, Dexterity Capital, Hudson River Trading, GSR, Collab+Currency, JST Capital, Big Brain and Lunatic Capital among its early backers.

The messy collapse of FTX and other centralized trading platform blowups last year has fueled a shift toward decentralized exchanges and self-custody. Ethereum layer 2 system Arbitrum is now the fastest-growing blockchain in total value locked and has surpassed Ethereum in daily transaction volume.

The Vertex team has been working on the protocol for about a year. Co-founder Darius Tabatabai said the platform has drawn interest from institutional traders and from retail traders who use Arbitrum.

“We built all the smart contracting ourselves, so we're not forking anything,” Tabatabai said in an interview with CoinDesk. “The [automated market maker] is quite conventional, but we have a bunch of tech under the surface that enables you to do leveraged AMMs, to do looping, and we have an inbuilt money market. So you can think of it as a combination of Aave, dYdX and Uniswap, with an order book.”

Building Vertex on Arbitrum and using an off-chain sequencer for the order book has enabled the venue to process between 10,000 and 15,000 transactions per second with the ability to match orders in 10 to 30 milliseconds, a speed that rivals leading centralized venues and surpasses that on other decentralized exchanges, Tabatabai added.

See the rest here:

Decentralized Exchange Vertex Launches on Ethereum Layer 2 Arbitrum - CoinDesk


Why top business performers are adopting cloud computing – Open Access Government

Cloud computing is undoubtedly on the rise, with global end-user spending on public cloud services alone estimated to reach over $590 billion this year, a 20.7% year-on-year increase. Despite the waves of economic uncertainty experienced these past few years, adopting cloud computing is seen as an opportunity to foster resilience, innovation, growth and scalability.

As such, large enterprises are striving to migrate 60% of their digital infrastructure to the cloud by 2025, including both private and public hosting setups.

Adopting cloud computing brings a host of benefits, including facilitating long-term and secure remote work, reducing the total cost of ownership, and building the organizational agility needed to support sustained growth. However, until recently, the cloud conversation often overlooked one of its key benefits: serving as an enabler of advanced technologies.

Intelligent automation is one of the technologies ideally suited to the cloud environment and to scaling within it. It combines artificial intelligence, robotic process automation (RPA), business process management and related tools to reengineer processes and drive business outcomes. The pandemic accelerated digital transformations around the world, with RPA experiencing massive year-over-year growth. Increasingly, businesses shifted from focusing solely on RPA to combining it with other advanced technologies to enable intelligent automation and meet the growing pressure for digital transformation.


While most organizations have begun their automation journeys, many are not yet maximizing their automation capacity. This is inspiring a drive for combinatorial solutions, innovations across multiple trends, to meet growing digitalization needs and demands.

By adopting cloud computing, organizations can more easily scale their automation across the enterprise, create the capacity to incorporate more advanced automations, and add intelligence to their digital workers faster and securely.

Using a cloud intelligent automation platform provides organizations with a suite of advanced automation technologies at their fingertips, without having to invest in building their own infrastructure and deploying and integrating automation software. Companies can lower the total cost of ownership by leveraging a provider's solution as a service instead of building and maintaining it themselves.

In addition to fiscal savings, organizations avoid the extensive time demands of managing and operating an intelligent automation platform, freeing management and IT teams to focus on other strategic areas, like innovation, while the provider manages the cloud intelligent automation platform.

Organizations don't have to worry about integrating advanced technologies into their platform or keeping up with the latest tech because their cloud platform offers these ready-to-use technologies. The cloud setup ensures companies have access to the latest updates, advancements, and capabilities, which is helpful in a space that continues to see rapid development and change.

Companies can customize their intelligent automation strategy to meet their unique needs. If a business needs more digital workers specializing in robotic process automation (RPA) than business process management (BPM), a cloud-based intelligent automation platform can accommodate these needs.

It can also be modified to meet changing needs. For example, in the airline industry, seasonal adjustments are necessary. Christmas and summertime are peak seasons, so more digital workers are needed to support human workers, but airlines don't need to maintain this supply of digital workers year-round. With a cloud-enabled intelligent automation platform, airlines can adjust the number of digital workers they use on a seasonal basis.

A cloud-based platform is also able to accommodate unexpected changes in need. For instance, when the Covid-19 pandemic began, there was an influx in demand as passengers around the world suddenly had to cancel their flights. With a cloud-enabled intelligent automation platform, organizations could rapidly scale up their digital workforce to support their human workers with the surge in enquiries.

This flexible approach is much more cost-effective than buying on-premises digital workers and not using them all at full capacity or leaving some idle on the shelf throughout the year.

While most organizations have recognized the need for advanced technologies and cloud computing adoption, RPA tends to be the most commonly deployed.

But in order to gain the full benefits of digital transformation, complementary advanced technologies, which include AI and machine learning, need to be used in tandem. This approach enables organizations to automate more end-to-end processes, which facilitates automation at scale and drives overall better business outcomes.

A cloud-enabled intelligent automation platform connects all these technologies, allowing businesses to achieve transformative change faster. Companies benefit from their provider's economies of scale, skills, and experience, sparing them from investing their own resources into developing such capabilities. For organizations with an established Center of Excellence (CoE), a cloud-based intelligent automation platform allows for focus on automation without waiting for IT teams to build and develop the needed infrastructure for each advanced technology.

By utilizing a provider's cloud-based services, CoEs no longer have to deal with execution bottlenecks due to infrastructural challenges. This drastically reduces time to value, empowering organizations to drive their overall transformation plans and make their operations increasingly competitive.

Businesses should start by looking at the bigger picture and making sure their cloud adoption goals are aligned with overall company and digital transformation goals. This type of vision requires high-level executive sponsorship to champion these objectives and visions across the organization.

At the end of the day, nothing will slow down a digital transformation initiative more than an unsupportive work culture. You also need people to look at cloud adoption across the organization and ensure the transformation is coordinated, aligned, and efficient. Similarly, senior team members need to look beyond tasks to end-to-end processes and how cloud-enabled technologies can automate these processes to support workers and customers.


Done right, the cloud is an important enabler of true digital transformation, providing all the tools needed for rapid and transformative digitalization. The infrastructure offers the customizability needed for businesses to devise automation plans that will have the greatest impact on their operations. Adopting cloud computing with flexible deployment options has also opened up this facilitator of transformation to historically hesitant sectors, like healthcare and financial services. With solutions today, organizations can adopt a private or hybrid cloud model to meet investor and regulatory requirements.

Automation at scale better supports workers to tackle complex tasks, allows processes to operate more efficiently, gives more time to valuable resources, fosters innovation, and improves returns on investment.

Across 2023 and beyond, the cloud will play a central role in empowering organizations with the power of intelligent automation at scale.

This piece was written and provided by Adam Lawrence, VP Cloud Solutions, SS&C Blue Prism.


The rest is here:
Why top business performers are adopting cloud computing - Open Access Government


Bitcoin, Ethereum Technical Analysis: BTC Moves Back Above $29000, After Customers Withdraw $100 Billion From … – Bitcoin News

Bitcoin was back above $29,000 on Wednesday, as markets continued to react to concerns over First Republic Bank. It was reported that customers withdrew around $100 billion in deposits from First Republic in March. Ethereum was also higher on the news, climbing back above $1,900.

Bitcoin (BTC) rebounded strongly on Wednesday, as markets reacted to the news that deposits in First Republic Bank fell by $100 billion last month.

BTC/USD surged to a peak of $29,121.97 earlier in today's session, following a low of $27,217.17 the day before.

This move has pushed bitcoin to its highest point since seven days ago, when it was trading above $30,000.

Overall, the surge in price comes as bulls rejected a breakout below a long-term support point at $27,000 on Tuesday.

The relative strength index (RSI) also bounced from a floor of its own at the 44.00 mark, and is now tracking at 54.09.

A ceiling of 55.00 will likely act as a checkpoint for bulls, and should they move beyond this, there is a strong possibility that BTC climbs to $30,000.

In addition to BTC, ethereum (ETH) was also in the green, as prices snapped a three-day losing streak.

Following a low of $1,805.32 on Tuesday, ETH/USD jumped to a peak at $1,919.72 earlier in the day.

As a result of this move, ethereum has hit a five-day high, with price now hovering around a resistance point at $1,915.

The last time ETH bulls broke this ceiling was on April 13, and on that occasion the price went on to reach an 11-month high above $2,100.

In order for something similar to happen this go round, the RSI would need to overcome a hurdle at the 53.00 level.

At the time of writing, the index is tracking at 51.64.
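For readers unfamiliar with the indicator, the RSI cited throughout compares gains to losses over a lookback window (14 periods is the convention). The sketch below uses the simple-average variant; charting tools often apply Wilder's smoothing instead, so their readings can differ slightly, and the closing prices here are made up, not real BTC or ETH data.

```python
# Relative strength index over the last `period` price moves:
# RSI = 100 - 100 / (1 + gains / losses). Summing the moves gives the
# same ratio as averaging them, since the window length cancels.

def rsi(closes, period=14):
    if len(closes) < period + 1:
        raise ValueError("need at least period + 1 closing prices")
    moves = [b - a for a, b in zip(closes, closes[1:])][-period:]
    gains = sum(m for m in moves if m > 0)
    losses = sum(-m for m in moves if m < 0)
    if losses == 0:
        return 100.0  # every move in the window was a gain
    return 100.0 - 100.0 / (1.0 + gains / losses)

# 15 invented closing prices, just enough for one 14-period window:
closes = [100, 102, 101, 103, 105, 104, 106, 108,
          107, 109, 111, 110, 112, 114, 113]
print(round(rsi(closes), 2))  # → 78.26
```

Readings above roughly 70 are conventionally called overbought and below 30 oversold, which is why the article treats levels like 53.00 and 55.00 as mid-range hurdles rather than extremes.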


Should the banking crisis worsen, could we see ethereum hit $3,000 in May? Leave your thoughts in the comments below.

Eliman was previously a director of a London-based brokerage and an online trading educator. He currently commentates on various asset classes, including crypto, stocks and FX, and is also a startup founder.

Image Credits: Shutterstock, Pixabay, Wiki Commons

Disclaimer: This article is for informational purposes only. It is not a direct offer or solicitation of an offer to buy or sell, or a recommendation or endorsement of any products, services, or companies. Bitcoin.com does not provide investment, tax, legal, or accounting advice. Neither the company nor the author is responsible, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with the use of or reliance on any content, goods or services mentioned in this article.

Read the rest here:

Bitcoin, Ethereum Technical Analysis: BTC Moves Back Above $29000, After Customers Withdraw $100 Billion From ... - Bitcoin News
