
Schrödinger Believed That There Was Only One Mind in the Universe – Walter Bradley Center for Natural and Artificial Intelligence

Consciousness researcher Robert Prentner and cognitive psychologist Donald Hoffman will tell a prestigious music and philosophy festival in London next month that the great quantum physicist Erwin Schrödinger (1887–1961) believed that the total number of minds in the universe is one. That is, a universal Mind accounts for everything.

In a world where many scientists strive mightily to explain how the human mind can arise from non-living matter, Prentner and Hoffman will tell the HowTheLightGetsIn festival in London (September 17–18, 2022) that the author of the famous Cat paradox was hardly a materialist:

In 1925, just a few months before Schrödinger discovered the most basic equation of quantum mechanics, he wrote down the first sketches of the ideas that he would later develop more thoroughly in Mind and Matter. Already then, his thoughts on technical matters were inspired by what he took to be greater metaphysical (religious) questions. Early on, Schrödinger expressed the conviction that metaphysics does not come after physics, but inevitably precedes it. Metaphysics is not a deductive affair but a speculative one.

Inspired by Indian philosophy, Schrödinger had a mind-first, not matter-first, view of the universe. But he was a non-materialist of a rather special kind. He believed that there is only one mind in the universe; our individual minds are like the scattered light from prisms:

A metaphor that Schrödinger liked to invoke to illustrate this idea is the one of a crystal that creates a multitude of colors (individual selves) by refracting light (standing for the cosmic self that is equal to the essence of the universe). We are all but aspects of one single mind that forms the essence of reality. He also referred to this as the doctrine of identity. Accordingly, a non-dual form of consciousness, which must not be conflated with any of its single aspects, grounds the refutation of the (merely apparent) distinction into separate selves that inhabit a single world.

But in Mind and Matter (1958), Schrödinger, we are told, took this view one step further:

Schrödinger drew remarkable consequences from this. For example, he believed that any man is the same as any other man that lived before him. In his early essay Seek for the Road, he writes about looking into the mountains before him. Thousands of years ago, other men similarly enjoyed this view. But why should one assume that oneself is distinct from these previous men? Is there any scientific fact that could distinguish your experience from another man's? What makes you you and not someone else? Just as John Wheeler once assumed that there is really only one electron in the universe, Schrödinger assumed that there really is only one mind. Schrödinger thought this is supported by the empirical fact that consciousness is never experienced in the plural, only in the singular. Not only has none of us ever experienced more than one consciousness, but there is also no trace of circumstantial evidence of this ever happening anywhere in the world.

Most non-materialists will wish they had gotten off two stops ago. We started with Mind first, which, when accounting for why there is something rather than nothing, has been considered a reasonable assumption throughout history across the world (except among materialists). But the assumption that no finite mind could experience or act independently of the Mind behind the universe is a limitation on the power of that Mind. Why so?

It's not logically clear (and logic is our only available instrument here) why the original Mind could not grant to dogs, chimpanzees, and humans the power to apprehend and act as minds in their own right in their natural spheres, not simply as seamless extensions of the universal Mind.

With humans, the underlying assumptions of Schrödinger's view are especially problematic. Humans address issues of good and evil. If Schrödinger is right, for example, Dr. Martin Luther King and Comrade Josef Stalin are really only one mind because each experienced only his own consciousness. But wait. As a coherent human being, each could only have experienced his own consciousness and not the other man's.

However, that doesn't mean that they were mere prisms displaying different parts of the spectrum of broken light. The prism analogy fails to take into account that humans can act for good or ill. Alternatively, it is saying that good and evil, as we perceive them, are merely different colors in a spectrum. As noted earlier, many of us should have got off two stops ago.

In any event, Schrödinger's views are certain to make for an interesting discussion at HowTheLightGetsIn.

Schrödinger was hardly the only modern physicist or mathematician to dissent from materialism. Mathematician Kurt Gödel (1906–1978), to take one example, destroyed a popular form of atheism (logical positivism) via his Incompleteness Theorems.

The two thinkers held very different views, of course. But both saw the fatal limitations of materialism (naturalism), and they addressed these limitations quite differently. In an age when Stephen Hawking's disdain for philosophy is taken to be representative of great scientists, it's a good thing if festivals like HowTheLightGetsIn offer a broader perspective and corrective.

You may also wish to read: Why panpsychism is starting to push out naturalism. A key goal of naturalism/materialism has been to explain human consciousness away as nothing but a pack of neurons. That can't work. Panpsychism is not a form of dualism. But, by including consciousness, especially human consciousness, as a bedrock fact of nature, it avoids naturalism's dead end.

See the rest here:

Schrödinger Believed That There Was Only One Mind in the Universe - Walter Bradley Center for Natural and Artificial Intelligence


What is a QPU and how will it drive quantum computing? – IT-Online

A QPU, also known as a quantum processor, is the brain of a quantum computer that uses the behaviour of particles like electrons or photons to make certain kinds of calculations much faster than processors in today's computers.

By Rick Merritt, senior staff writer at Nvidia

Just as GPUs and DPUs enable accelerated computing today, they're also helping a new kind of chip, the QPU, boot up the promise of quantum computing.

In your hand, a quantum processing unit might look and feel very similar to a graphics or a data processing unit. They're all typically chips, or modules with multiple chips, but under the hood the QPU is a very different beast.

So what's a QPU?

A QPU, aka a quantum processor, is the brain of a quantum computer that uses the behaviour of particles like electrons or photons to make certain kinds of calculations much faster than processors in today's computers.

QPUs rely on behaviours like superposition, the ability of a particle to be in many states at once, described in the relatively new branch of physics called quantum mechanics.
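
To make superposition concrete, here is a minimal sketch using IBM's open source Qiskit library (an illustrative choice of SDK, not one the article prescribes). A Hadamard gate puts a single qubit into an equal superposition of 0 and 1, and repeated measurement then yields each value about half the time:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# One qubit, one Hadamard gate: |0> becomes (|0> + |1>)/sqrt(2),
# an equal superposition of both classical values at once.
qc = QuantumCircuit(1)
qc.h(0)

state = Statevector.from_instruction(qc)
print(state)                      # two amplitudes, each 1/sqrt(2)
print(state.sample_counts(1000))  # measuring collapses it: roughly half '0', half '1'
```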

By contrast, CPUs, GPUs and DPUs all apply principles of classical physics to electrical currents. That's why today's systems are called classical computers.

QPUs could advance cryptography, quantum simulations and machine learning and solve thorny optimisation problems.

How does a quantum processor work?

CPUs and GPUs calculate in bits, on/off states of electrical current that represent zeros or ones. By contrast, QPUs get their unique powers by calculating in qubits, quantum bits that can represent many different quantum states.

A qubit is an abstraction that computer scientists use to express data based on the quantum state of a particle in a QPU. Like the hands on a clock, qubits point to quantum states that are like points in a sphere of possibilities.
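
That sphere-of-possibilities picture has a standard mathematical form, the Bloch sphere: any single-qubit state can be written as

```latex
|\psi\rangle = \cos\tfrac{\theta}{2}\,|0\rangle + e^{i\varphi}\sin\tfrac{\theta}{2}\,|1\rangle,
\qquad 0 \le \theta \le \pi, \quad 0 \le \varphi < 2\pi,
```

where the two angles pick out a point on the sphere's surface, much like the clock-hand analogy above. A classical bit, by contrast, is confined to the two poles.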

The power of a QPU is often described by the number of qubits it contains. Researchers are developing additional ways to test and measure the overall performance of a QPU.

Many ways to make a qubit

Corporate and academic researchers are using a wide variety of techniques to create the qubits inside a QPU.

The most popular approach these days is called a superconducting qubit. It's basically made from one or more tiny metallic sandwiches called Josephson junctions, where electrons tunnel through an insulating layer between two superconducting materials.

Qubits inside IBM's Eagle superconducting QPU.

The current state of the art packs more than 100 of these junctions into a single QPU. Quantum computers using this approach isolate the electrons by cooling them to temperatures near absolute zero with powerful refrigerators that look like high-tech chandeliers.


A qubit of light

Some companies use photons rather than electrons to form qubits in their quantum processors. These QPUs don't require expensive, power-hungry refrigerators, but they need sophisticated lasers and beam splitters to manage the photons.

Researchers are using and inventing other ways to create and connect qubits inside QPUs. For example, some use an analogue process called quantum annealing, but systems using these QPUs have limited applications.

It's early days for quantum computers, so it's not yet clear what sorts of qubits in what kinds of QPUs will be widely used.

Simple chips, exotic systems

Theoretically, QPUs may require less power and generate less heat than classical processors. However, the quantum computers they plug into can be somewhat power hungry and expensive.

That's because quantum systems typically require specialised electronic or optical control subsystems to precisely manipulate particles. And most require vacuum enclosures, electromagnetic shielding or sophisticated refrigerators to create the right environment for the particles.

D-Wave shows qubits and QPU in a full system.

That's one reason why quantum computers are expected to live mainly in supercomputing centres and large data centres.

QPUs do cool stuff

Thanks to the complex science and technology, researchers expect the QPUs inside quantum computers will deliver amazing results. They are especially excited about four promising possibilities.

First, they could take computer security to a whole new level.

Quantum processors can factor enormous numbers quickly, a core function in cryptography. That means they could break today's security protocols, but they can also create new, much more powerful ones.
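
The factoring threat comes from Shor's algorithm. Its quantum speed-up lies entirely in finding the period r of a^x mod N; the rest is ordinary number theory that is easy to check on small numbers. A hedged toy sketch in Python, with the period hard-coded since that is precisely the part a QPU would compute:

```python
from math import gcd

# Toy illustration of the classical half of Shor's algorithm.
# A QPU's job is finding r, the period of a^x mod N, exponentially
# faster than known classical methods; here we simply hard-code it.
N, a = 15, 7
r = 4  # 7^1=7, 7^2=4, 7^3=13, 7^4=1 (mod 15), so the period is 4

# For an even period, gcd(a^(r/2) +/- 1, N) usually reveals the factors.
print(gcd(a**(r // 2) - 1, N))  # 3
print(gcd(a**(r // 2) + 1, N))  # 5
```

At cryptographic scales, where N has hundreds of digits, no known classical method finds r in reasonable time, which is exactly what today's RSA-style protocols rest on.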

In addition, QPUs are ideally suited to simulating the quantum mechanics of how stuff works at the atomic level. That could enable fundamental advances in chemistry and materials science, starting domino effects in everything from the design of lighter airplanes to more effective drugs.

Researchers also hope quantum processors will solve optimisation problems classical computers can't handle in fields like finance and logistics. And finally, they may even advance machine learning.

So when will QPUs be available?

For quantum researchers, QPUs can't come soon enough. But challenges span the gamut.

On the hardware level, QPUs are not yet powerful or dependable enough to tackle most real-world jobs. However, early QPUs and GPUs simulating them with software like Nvidia cuQuantum are beginning to show results that help researchers, especially in projects exploring how to build better QPUs and develop quantum algorithms.

Researchers are using prototype systems available through several companies like Amazon, IBM, IonQ, Rigetti, Xanadu and more. Governments around the world are beginning to see the promise of the technology, so they're making significant investments to build ever larger and more ambitious systems.

How do you program a quantum processor?

Software for quantum computing is still in its infancy.

Much of it looks like the kind of assembly-language code programmers had to slog through in the early days of classical computers. That's why developers have to understand the details of the underlying quantum hardware to get their programs running.
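
As a rough illustration of how low-level that code is, the sketch below builds a tiny two-qubit circuit in Python and prints the gate-by-gate OpenQASM text a backend actually consumes. It assumes Qiskit and the `qasm()` method available in releases current when this article appeared (the API has since evolved):

```python
from qiskit import QuantumCircuit

# A two-qubit Bell-state circuit, written at the friendly Python level.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# The gate-by-gate OpenQASM "assembly" it compiles down to,
# which is the level many quantum developers still work near.
print(qc.qasm())
```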

But here, too, there are real signs of progress toward the holy grail: a single software environment that will work across any supercomputer, a sort of quantum OS.

Several early projects are in the works. All struggle with the limitations of the current hardware; some are hampered by the limits of the companies developing the code.

For example, some companies have deep expertise in enterprise computing but lack experience in the kind of high-performance environments where much of the scientific and technical work in quantum computing will be done. Others lack expertise in AI, which has synergies with quantum computing.

Enter hybrid quantum systems

The research community widely agrees that for the foreseeable future, classical and quantum computers will work in tandem. So, software needs to run well across QPUs, CPUs and GPUs, too.

Researchers described a hybrid classical-quantum computer in a 2017 paper.
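
The usual division of labor in such hybrid systems is a variational loop: the quantum side evaluates a parameterized circuit while a classical optimizer steers the parameters. A minimal sketch, assuming Qiskit for the quantum half and SciPy for the classical half (illustrative choices; the article names no particular stack), with a statevector simulation standing in for real QPU hardware:

```python
import numpy as np
from scipy.optimize import minimize
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, SparsePauliOp

def energy(params):
    # Quantum step: evaluate a tiny parameterized circuit.
    # On real hardware, this call would be dispatched to a QPU.
    qc = QuantumCircuit(1)
    qc.ry(params[0], 0)
    state = Statevector.from_instruction(qc)
    return float(np.real(state.expectation_value(SparsePauliOp("Z"))))

# Classical step: an ordinary CPU optimizer tunes the circuit parameters.
result = minimize(energy, x0=[0.1], method="COBYLA")
print(result.x, result.fun)  # theta converges toward pi, energy toward -1
```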

To drive quantum computing forward, Nvidia recently announced the Nvidia Quantum Optimized Device Architecture (QODA), an open platform for programming hybrid quantum systems.

QODA includes a high-level language that's concise and expressive, so it's powerful and easy to use. With QODA, developers can write programs that run on QPUs in quantum computers and GPUs simulating QPUs in classical systems.

Nvidia QODA provides developers a unified platform for programming any hybrid quantum-classical computer.

QODA will support every kind of quantum computer and every sort of QPU.

At its launch, quantum system and software providers including Pasqal, Xanadu, QC Ware and Zapata expressed support for QODA. Users include major supercomputing centers in the US and Europe.

QODA builds on Nvidia's extensive expertise in CUDA software, which accelerates HPC and AI workloads for scientific, technical and enterprise users.

With a beta release of QODA expected before the end of the year, the outlook for QPUs in 2023 and beyond is bright.


See more here:

What is a QPU and how will it drive quantum computing? - IT-Online


The Origin of Zero to What Earth's Minerals Can Tell Us About Aliens (The Galaxy Report) – The Daily Galaxy – Great Discoveries Channel

Posted on Aug 3, 2022 in Astrobiology, Astronomy, Astrophysics, Consciousness, Cosmology, Exoplanets, Extraterrestrial Life, James Webb Space Telescope, Milky Way Galaxy, Science, Science News, Space News, Supernova, Technology, Universe

Today's stories range from "Could We Use the Sun's Gravity to Find Alien Life?" to "The Source of Mysterious Infrared Light" to "When Will the Milky Way's Next Supernova Occur?"

When Will the Next Supernova in Our Galaxy Occur? Scientists have new tools at their disposal to detect and study the dramatic explosion of a star, reports Dan Falk for The Smithsonian. It's been a long wait: 418 years since we've seen a star explode in our galaxy. So are we overdue for a bright, nearby supernova?

The Elusive Origin of Zero: Who decided that nothing should be something? reports Scientific American. Historians, journalists and others have variously identified the symbol's birthplace as the Andes mountains of South America, the flood plains of the Tigris and Euphrates Rivers, the surface of a calculating board in the Tang dynasty of China, a cast iron column and temple inscriptions in India, and most recently, a stone epigraphic inscription found in Cambodia. The tracing of zero's heritage has been elusive.

Could we use the Sun's gravity to find alien life? With a telescope at just the right distance from the Sun, we could use its gravity to enhance and magnify a potentially inhabited planet, reports Big Think. Our strongest nearby source of gravity, the Sun, is itself capable of producing a gravitational lens, but only if the geometry is right: conditions that don't begin until we're 547 times the Earth-Sun distance away.

What if the reality we perceive is just an evolutionary trick? Do we see the world as it really is? Perhaps not. Maybe even likely not, says Robert Prentner in this YouTube video.

With New Study, NASA Seeks the Science behind UFOs: Although modest in scope, a NASA research project reflects shifting attitudes toward the formerly taboo subject of UFOs, reports Adam Mann for Scientific American. NASA's announcement fits in with the suddenly more open-minded zeitgeist regarding UAPs.

Life Helps Make Almost Half of All Minerals on Earth: A new origins-based system for classifying minerals reveals the huge geochemical imprint that life has left on Earth. It could help us identify other worlds with life too, reports Quanta.

Where Do Space, Time and Gravity Come From? Einstein's description of curved space-time doesn't easily mesh with a universe made up of quantum wavefunctions. Theoretical physicist Sean Carroll discusses the quest for quantum gravity with host Steven Strogatz at Quanta.com.

AI Is Discovering Its Own Fundamental Physics and Scientists Are Baffled: AI observed videos of lava lamps and inflatable air dancers and identified dozens of physics variables that scientists don't yet understand, reports Vice Science.

Schrödinger Believed the Universe is One Universal Mind: The quantum physicist and author of the famous Cat Paradox believed that our individual minds are not unique but rather like the reflected light from prisms, reports Mind Matters.

Discovery of new exoplanet raises questions about planet formation, reports University of Florida. The Jupiter-sized world offers two key opportunities to scientists studying how all planets, including those in our own solar system, develop. A mere 1.5-million-year-old infant compared to its probable lifespan of billions of years, the planet is so young it can still provide clues about its birth.

Webb captures stellar gymnastics in the Cartwheel Galaxy, reports the ESA. Webb's high-precision instruments resolved individual stars and star-forming regions within the Cartwheel, and revealed the behavior of the black hole within its galactic center. These new details provide a renewed understanding of a galaxy in the midst of a slow transformation.

Dark Matter Mapped Around Distant Galaxies, reports Physics.com. Gravitational lensing of the cosmic microwave background has been used to probe the distribution of dark matter around some of the earliest galaxies in the Universe.

Is the James Webb Space Telescope finding the furthest, oldest, youngest or first galaxies? An astronomer explains: Michael J. I. Brown explains for Space.com.

Cosmic Buckyballs Could Be The Source of Mysterious Infrared Light, reports Science Alert. Unidentified Infrared Emission (UIE) bands have baffled scientists for decades; according to a theoretical new work, at least some of these bands can be produced by highly ionized buckminsterfullerene, more commonly known as buckyballs.

Astronomers discover 21 new extremely low-mass white dwarf candidates, reports Phys.org. Extremely low-mass (ELM) white dwarfs (WDs) are rare objects, found with only a few exceptions in short-period binaries.

Particle Physicists Puzzle Over a New Duality: A hidden link has been found between two seemingly unrelated particle collision outcomes. It's the latest example of a mysterious web of mathematical connections between disparate theories of physics, reports Katie McCormick for Quanta.com.

A New Private Moon Race Kicks Off Soon: Commercial spacecraft are vying to land on the lunar surface, but can they jump-start a new space economy? reports Scientific American.

Image credit top of page: ESO Observatories, Chile

Curated by The Daily Galaxy Editorial Staff

The Galaxy Report newsletter brings you twice-weekly news of space and science that has the capacity to provide clues to the mystery of our existence and add a much needed cosmic perspective in our current Anthropocene Epoch.



Original post:

The Origin of Zero to What Earth's Minerals Can Tell Us About Aliens (The Galaxy Report) - The Daily Galaxy – Great Discoveries Channel


Post-doctoral Fellow / Research Assistant I/II, Department of Civil Engineering job with THE UNIVERSITY OF HONG KONG | 303412 – Times Higher Education

Work type: Full-time
Department: Department of Civil Engineering (14100)
Categories: Academic-related Staff

Quantum Computation and Simulation Applications Development

Applications are invited for appointment as Post-doctoral Fellow/Research Assistant I/II in Quantum Computation and Simulation Applications Development in the Department of Civil Engineering (Ref.: 515642), to commence as soon as possible for one year, with the possibility of renewal subject to satisfactory performance.

We are seeking a highly-motivated individual who is passionate about developing quantum computation and simulation applications for environmental engineering and related challenges. Applicants should preferably possess a Ph.D. degree or equivalent in Quantum Simulation and Computation, Quantum Information Science, Computer Science, Data Science, or Bioinformatics. However, we also welcome applications from candidates with a strong interest in this area who possess a Ph.D. degree or equivalent in Civil and Environmental Engineering, Physics, Biology, Chemistry, Statistics, Mathematics, or other STEM disciplines.

The appointee should have an inquisitive mind, risk-taking attitude, a willingness to learn new skills and the ability to creatively tackle complex problems. Proficiency in Python is preferred but not required. Basic knowledge in quantum computation, quantum information, or quantum physics and exposure to the IBM QISKit quantum programming stack would be an advantage. Those who plan to pursue research postgraduate studies are welcome to apply. Those with lower qualifications and/or less experience may be considered for appointment as Research Assistant I/II.

The appointee will be part of an international joint research collaboration between HKU and Imperial College London (UK), co-led by Dr. Amy Tan and Dr. Po-Heng (Henry) Lee. The appointee will be working within an international and interdisciplinary group to develop quantum computation and simulation applications to tackle grand challenges in environmental sustainability and livable cities. Specific topic(s) of research would be dependent on the appointee's skill sets and knowledge domain. Examples of research topics could include environmental microbiology and biotechnology, waste management, circular resources, resilient cities, etc. Areas that the appointee might be required to undertake include quantum machine learning, quantum optimization, fault-tolerant codes and their decoding algorithms, quantum circuit synthesis and optimization, quantum computational complexity, etc. The appointee will also assist in grant applications, conduct presentations, prepare reports for funded projects, and perform other ad-hoc duties.

Enquiries about the duties of the post should be sent to Dr. Amy Tan and/or Dr. Po-Heng (Henry) Lee at gyatan@hku.hk and po-heng.lee@imperial.ac.uk, respectively.

A highly competitive salary commensurate with qualifications and experience will be offered, in addition to annual leave and medical benefits. At current rates, salaries tax does not exceed 15% of gross income.

The University only accepts online application for the above post. Applicants should apply online and upload an up-to-date C.V. Review of applications will commence as soon as possible and continue until August 25, 2022, or until the post is filled, whichever is earlier.

See the original post here:

Post-doctoral Fellow / Research Assistant I/II, Department of Civil Engineering job with THE UNIVERSITY OF HONG KONG | 303412 - Times Higher Education


African businesses can raise up to R10m on this crowdfunding platform – Here's who qualifies – Business Insider South Africa

African businesses that seek to grow but are struggling to get funding can secure investment of between R1.5 million and R10 million on crowdfunding platform GoGetta, in exchange for stakes in the business.

The platform, powered by Grovest, an administrator in the small-cap investment space, allows local and international venture capital investors to invest from as little as R1,000 into emerging African businesses.

The platform seeks to solve two investment problems. The first is setting up a way for up-and-coming businesses to access capital. The second is an easy but secure way for investors to put money into these types of ventures.

"Quality African businesses are struggling to raise capital using traditional financiers such as banks. Crowdfunding is a billion-dollar global industry and it's time for Africa to weigh in. GoGetta is a solution to these funding issues and unlocks the potential of Africa's entrepreneurs," said co-founder of GoGetta Sthembiso Zwane.

"Africa is an exciting growth story for investors and our platform showcases some of the best investment opportunities from across the continent," Zwane adds.

GoGetta is looking for businesses within various sectors to list on the platform. Listed businesses have the opportunity to raise investment of between R1.5 million and R10 million in exchange for stakes in the company.

"GoGetta is regulated by the Financial Sector Conduct Authority (FSCA), giving investors confidence in our top-notch compliance and governance capability. Africa's go-getters will also benefit from guidance in formalising their businesses, with a smarter way to access capital to fund growth while retaining control," said GoGetta co-founder and CEO Jeff Miller.

Here's who qualifies

Qualifying businesses include those within the following sectors: fintech, agriculture, consumer, energy, enterprise, healthcare, retail and more.

"We have already signed up qualifying businesses in South Africa and we are seeing significant interest from businesses on the African continent which are ready for investment," said GoGetta co-founder Leat Sacharowitz.

Read the rest here:
African businesses can raise up to R10m on this crowdfunding platform – Here's who qualifies - Business Insider South Africa


Report: How Fragile is the Cloud, Really? – InformationWeek

A severe cloud infrastructure outage often feels less like a service disruption and more like an earthquake. One incident can barrel its way across an entire region, indiscriminately disrupting commerce, travel, medical care, and communication. Reverberations are felt far, far from the site of the event. There's nothing most of us can do to prevent it from happening or stop it once it starts. We just have to wait for the shaking to end and hope that none of our most valuable stuff got smashed.

The cloud is quickly becoming as foundational to life on earth as the ground beneath our feet. AWS, Microsoft Azure, IBM Cloud, Google Cloud, Oracle Cloud, and cloud delivery networks like Fastly are an essential part of more businesses and industries every day, whether those businesses know it or not.

And that's not such a bad thing. Cloud computing does enable extraordinary innovations.

Yet the companies running cloud services and maintaining cloud infrastructure have the same challenges all IT teams do: People are fallible. Market pressures rush release cycles. Legacy systems hold on with an iron fist. Networks are insufficiently segmented. Credential management is lackluster. Misconfigurations happen. Updates are risky. Regulatory compliance is a headache. Automation can't solve every problem. And 99.99% uptime sounds good until the .01% event happens.

But the consequences when something goes wrong, the impact of those .01% incidents, are far, far worse. When shipping, finance, medical care, government all rely on the same infrastructure to conduct basic operations, and that infrastructure is disrupted for hours...

Well, it's worth devoting a whole week of coverage to. So, that's what we're digging into at InformationWeek and Network Computing this week. Here's what we've covered so far, and what's coming up next:

Lessons Learned from Recent Major Outages, by Sal Salamone

Today's more interconnected business world makes infrastructure and cloud outages all the more impactful. Here's a recap of recent outages and their root causes.

Cyber Resiliency: How CIOs Can Prepare for a Cloud Outage, by John Edwards

The dangers posed by a cloud outage are clear and omnipresent. Here's how to prepare your organization for the inevitable worst-case cloud scenario.

COMING THIS WEEK:

Emerging Tech to Help Guard Against the Malevolence of Cloud Outages, by Pam Baker

From the outermost edge of extreme networks to the center of the energy vortex, emerging technologies are set to permanently end cloud outages. Will they eventually eradicate the clouds, too?

Reality Check: Why Your Cloud Provider Won't Be Providing Multi-Cloud Failover, by Brent Ellis

Your IT organization may view failing over from one hyperscaler's cloud to another as the ultimate security when it comes to cloud resiliency. Here's why that's not going to happen, plus a look at alternatives.

Legislators Gear Up to Regulate Cloud Providers for Resilience, by Carlo Massimo

The US, UK, and EU are all weighing regulations that would consider cloud companies "critical infrastructure" and require they meet resiliency standards.


Are Cloud Outages the Result of Choosing Price Over Reliability?, by Joao-Pierre Ruth

Market pressures and risk tradeoffs made by cloud providers might be the key to understanding why outages happen and what lies ahead for resolving cloud outages that can bring down regional, if not national, commerce.

You Get What You Pay For: Cloud Edition, by Jessica Davis

An 8-hour cloud outage during holiday shopping season could cost a retailer millions of dollars. Here's how CIOs are weighing the risks and costs of outages against the complexities and costs of building resilience.

Can You Recover Losses Sustained During a Cloud Outage? By Richard Pallardy and Carrie Pallardy

The cloud comes with tantalizing promises of greater efficiency, improved data security, and boosted profits. But the cloud is not infallible, and outages are inevitable. Here's what IT leaders need to know.

When (and If) to Sue Your Cloud Provider, by Richard Pallardy and Carrie Pallardy

Taking legal action is a slippery slope: Cloud providers are known to have covered their bases well. And besides, is it worth ruining your relationship with the cloud provider? It depends.

What Can Network Managers Do About Cloud Outages? (Not Much), by Sal Salamone

Better observability tools can help net managers maintain some cyber resilience to cloud service outages, but misconfigs and DNS infrastructure are down to the providers.

How Climate Change is Impacting Cloud Resilience, by Samuel Greengard

Datacenter cooling issues are already causing problems for cloud providers. What does a future full of droughts, heat waves, and severe weather events mean for cloud resiliency?

15 Years of Cloud Outages: A Look Back at the InformationWeek Archives

Remember 2008, when "Low" by Flo Rida, featuring T-Pain, was the top single, and Gmail went down for more than 24 hours? We do.

Workspot CEO on Coping with Cloud Outages, by Joao-Pierre Ruth

Read the original post:
Report: How Fragile is the Cloud, Really? - InformationWeek


Microsoft and Auckland Transport announce new cloud agreement – IT Brief New Zealand

Auckland Transport (AT) and Microsoft have announced a new cloud agreement aimed at promoting innovation, reducing costs and improving sustainability in transport services.

A significant part of the agreement involves shifting Auckland Transport's data and computing from on-premises servers to Microsoft Azure cloud, making it easier for the organisation to enhance its workforce and provide better services.

The agreement also involves education, with Microsoft training AT employees in cloud fundamentals, security and other digital skills in order to help them navigate emerging technologies and utilise them to their full potential.

An outcome of the agreement will be a shift in focus for AT, giving the transport provider new opportunities to explore innovative ways technology can be harnessed. This will allow them to create new services and enhanced experiences for customers.

Auckland commuters will benefit from the cost saving and the extra agility and efficiency that public cloud creates for AT. During times of high demand, AT will no longer need to wait for more physical servers to be ordered, and public cloud services can expand to deliver extra capacity as needed. This will help take on and manage more web traffic, transport service updates and card top-up requests.

AT also won't be left paying for unused infrastructure when demand falls, which was a significant problem for the organisation during the COVID-19 lockdowns. The addition of next-generation security services will also boost the resilience of AT's transport systems and better protect customer data.

AT executive general manager business technology Roger Jones says that the new agreement is aided by Microsoft's sustainability values and strategy, which align with the organisation's own.

"At its core, this agreement is about smarter use of resources: using less of the planet's precious resources, optimising operations and increasing our internal capability to make the most of data and modern technologies," he says.

"All of this will help us become a much more agile, efficient organisation that will deliver better services across the region and improve the liveability of our city for many decades to come."

Microsoft has also announced recently that its forthcoming hyperscale datacenter region will be among the most sustainable ever built. The company says it will run on 100% renewable energy from day one and use waterless cooling technologies. Using Microsoft's cloud solutions, AT will easily be able to track emissions across its networks and adjust policies or services to reduce these further.

"One of the things we're getting lots of enquiries about is latency: the ability to upload and download data in almost real time, which AT's CCTV networks at stations and intersections rely on," says Microsoft New Zealand MD Vanessa Sorenson.

"Having a local datacenter region here in Aotearoa means much lower latency than ever, so transport systems can run more smoothly and AT is able to respond faster to security or safety incidents, in partnership with Waka Kotahi and the police," she says.

Originally posted here:
Microsoft and Auckland Transport announce new cloud agreement - IT Brief New Zealand


Google Cloud Platform: Here Are the Five Most Important Things You Should Know About the Cloud – Business Cheshire

The public cloud has become increasingly popular in recent years. More and more companies are using the services of, for example, Google Cloud Platform. What don't you know about the cloud? Here are five facts about the GCP.

When running a digital business, you must decide where to store your data and applications. You can invest in your databases, hardware, and software or lease IT infrastructure.

What is the cloud? It is a service for storing data on a provider's infrastructure and being able to access a whole range of computing services. Depending on the provider's offering, it can include servers (i.e. space for storing files and data), storage, cloud databases, software, or virtual machines.

One of the most popular public clouds is the Google Cloud Platform. Let's take a look at the five most important things to know about Google Cloud Platform, according to FOTC, a certified Google Cloud partner company.

Google Cloud Platform is a set of cloud computing services. The components can be selected freely to create your own tailored infrastructure for your business. The services include, among others:

The same services are available to all Google Cloud Platform users whether global corporations, medium and small companies or startups. The scale of use depends on the needs. New businesses can use the same tools as PayPal, eBay or Twitter but pay proportionally less.

Google Cloud is creating its own network of fibre-optic connections between data centres around the world. Among other things, it owns the longest trans-Pacific fibre-optic cable (9,000 km), between Oregon in the US and Japan. With its own network, Google is able to process and transfer data between locations faster, at speeds of up to 10 Tb/s.

Suppose one of your company's main objectives is to reach users in different corners of the world while maintaining high-speed availability. In that case, you should look at the advantages of GCP's own network.

Cloud services are scalable: consumption levels adapt to load levels. Services can be scaled up (for example, when more users enter the service) and down (when users leave the service).

The public cloud also enables prospective scaling. The more a product grows (the more functionality and users it has), the greater its infrastructure requirements. By using the Google Cloud Platform, you can easily rebuild or expand your infrastructure so that you don't hold back product development.

With scalability also comes cost flexibility. In GCP, you pay for actual usage, often on a per-second or per-minute basis. A small amount of traffic means a low bill, and a large amount of traffic means a proportionally larger bill, but also the certainty of handling the entire load.
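
A back-of-the-envelope sketch of what per-second billing means in practice; the hourly rate below is an assumption for illustration only, not a real GCP price:

```python
# Hypothetical numbers; real GCP rates vary by machine type,
# region and sustained-use discounts.
hourly_rate = 0.10        # assumed $/hour for a small VM
burst_seconds = 37 * 60   # the VM actually ran for 37 minutes

pay_per_use = hourly_rate * burst_seconds / 3600
always_on = hourly_rate * 24 * 30  # the same VM left running all month

print(f"billed for actual usage: ${pay_per_use:.2f}")  # about $0.06
print(f"billed running 24/7:     ${always_on:.2f}")    # $72.00
```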

In terms of cost, it is also worth mentioning the total cost of ownership (TCO), or the total cost of maintaining the infrastructure. By owning your own machines or using dedicated servers, you are dedicating specialist time to maintaining them. In the cloud, the physical infrastructure is the responsibility of the service provider, and you can dedicate the time and energy of your specialists to developing the product and introducing optimisations.

Google Cloud has a partner programme that supports sales, proper operation and the development of the full potential of the proposed services. Partner companies are often able to offer better benefits than the provider itself. They also support the implementation of Google Cloud Platform or training courses.

See the original post:
Google Cloud Platform Here are the Five Most Important Things you Should Know About the Cloud - Business Cheshire


Cloud Computing IaaS In Life Science Market Size, Scope, Growth Opportunities, Trends by Manufacturers And Forecast to 2029 Shanghaiist – Shanghaiist

New Jersey, United States: This unique Cloud Computing IaaS In Life Science Market report begins with planned goals to help industry owners make better and sounder decisions. It covers important data about market growth and briefs on market essentials, market trends, market share, and market size for new entrants. It provides a clear picture of market tactics to help business owners attain larger gains. Area marketplace expansion, trade regulations, technological innovations, and novel product launches are some of the key topics covered in this Cloud Computing IaaS In Life Science market analysis report. It also briefly covers the major impact of COVID-19 on the entire global economy.

Cloud Computing IaaS In Life Science Market research report is the best way of getting detailed overview of industry growth, competition analysis, and a clear view of target customers. It focuses on the entire market scenario to gather all the minute details regarding which factors work best for market and business growth. Key projections for business growth are also emphasized here. Customer needs are the foremost factor to expand product portfolio and consequently expanding the business. By completely understanding customers, it becomes easy for central participants to not get any difficulty while developing or releasing any product or service. Customer understanding is an important success factor for any newbie. It aims at providing all the customer-related details. It also becomes easy to develop an efficient strategy beneficial for business growth.

Get Full PDF Sample Copy of Report: (Including Full TOC, List of Tables & Figures, Chart) @https://www.verifiedmarketresearch.com/download-sample/?rid=4625

Key Players Mentioned in the Cloud Computing IaaS In Life Science Market Research Report:

Cleardata Networks, Dell Inc., Global Net Access (GNAX), Carecloud Corporation, VMware, Carestream Health, IBM Corporation, Iron Mountain, Athenahealth, Inc. and Oracle Corporation.

Cloud Computing IaaS In Life Science Market Segmentation:

Cloud Computing IaaS In Life Science Market, By Component

Software
Hardware
Services

Cloud Computing IaaS In Life Science Market, By Application

Nonclinical Information Systems (NCIS)
Clinical Information Systems

Cloud Computing IaaS In Life Science Market, By Deployment Model

Private Cloud
Public Cloud
Hybrid Cloud

This Cloud Computing IaaS In Life Science market report assists a number of investors, shareholders as well as enterprises in understanding the tough areas of marketing ideas, technical development, key issues, and systematic analysis in order to accomplish long-term competitive gain in the industry. It goes on to talk about basic market facets along with market drivers, restraints, existing problems, forthcoming opportunities, and forecasts. This Cloud Computing IaaS In Life Science market survey depicts a few exact customer insights in order to build technology strategies to make investment useful. It makes use of both primary and secondary methods to offer wide-ranging industry data to help out you in making business choices and introducing new items to the market.

Inquire for a Discount on this Premium Report@ https://www.verifiedmarketresearch.com/ask-for-discount/?rid=4625

To prepare the TOC, our analysts researched the following in depth:

Report Overview: It includes major players of the Cloud Computing IaaS In Life Science market covered in the research study, research scope, market segments by type, market segments by application, years considered for the research study, and objectives of the report.

Global Growth Trends: This section focuses on industry trends where market drivers and top market trends are shed light upon. It also provides growth rates of key producers operating in the Cloud Computing IaaS In Life Science market. Furthermore, it offers production and capacity analysis where marketing pricing trends, capacity, production, and production value of the Cloud Computing IaaS In Life Science market are discussed.

Market Share by Manufacturers: Here, the report provides details about revenue by manufacturers, production and capacity by manufacturers, price by manufacturers, expansion plans, mergers and acquisitions, and products, market entry dates, distribution, and market areas of key manufacturers.

Market Size by Type: This section concentrates on product type segments where production value market share, price, and production market share by product type are discussed.

Market Size by Application: Besides an overview of the Cloud Computing IaaS In Life Science market by application, it gives a study on the consumption in the Cloud Computing IaaS In Life Science market by application.

Production by Region: Here, the production value growth rate, production growth rate, import and export, and key players of each regional market are provided.

Consumption by Region: This section provides information on the consumption in each regional market studied in the report. The consumption is discussed on the basis of country, application, and product type.

Company Profiles: Almost all leading players of the Cloud Computing IaaS In Life Science market are profiled in this section. The analysts have provided information about their recent developments in the Cloud Computing IaaS In Life Science market, products, revenue, production, business, and company.

Market Forecast by Production: The production and production value forecasts included in this section are for the Cloud Computing IaaS In Life Science market as well as for key regional markets.

Market Forecast by Consumption: The consumption and consumption value forecasts included in this section are for the Cloud Computing IaaS In Life Science market as well as for key regional markets.

Value Chain and Sales Analysis: It deeply analyzes customers, distributors, sales channels, and the value chain of the Cloud Computing IaaS In Life Science market.

Key Findings: This section gives a quick look at the important findings of the research study.

For More Information or Query or Customization Before Buying, Visit @ https://www.verifiedmarketresearch.com/product/global-cloud-computing-iaas-in-life-science-market-size-and-forecast-to-2025/

About Us: Verified Market Research

Verified Market Research is a leading Global Research and Consulting firm that has been providing advanced analytical research solutions, custom consulting and in-depth data analysis for 10+ years to individuals and companies alike that are looking for accurate, reliable and up to date research data and technical consulting. We offer insights into strategic and growth analyses, Data necessary to achieve corporate goals and help make critical revenue decisions.

Our research studies help our clients make superior data-driven decisions, understand market forecasts, capitalize on future opportunities and optimize efficiency by working as their partner to deliver accurate and valuable information. The industries we cover span a large spectrum including Technology, Chemicals, Manufacturing, Energy, Food and Beverages, Automotive, Robotics, Packaging, Construction, Mining & Gas, etc.

We, at Verified Market Research, assist in understanding holistic market indicating factors and most current and future market trends. Our analysts, with their high expertise in data gathering and governance, utilize industry techniques to collate and examine data at all stages. They are trained to combine modern data collection techniques, superior research methodology, subject expertise and years of collective experience to produce informative and accurate research.

Having serviced more than 5,000 clients, we have provided reliable market research services to more than 100 Global Fortune 500 companies such as Amazon, Dell, IBM, Shell, Exxon Mobil, General Electric, Siemens, Microsoft, Sony and Hitachi. We have co-consulted with some of the world's leading consulting firms like McKinsey & Company, Boston Consulting Group, and Bain and Company for custom research and consulting projects for businesses worldwide.

Contact us:

Mr. Edwyne Fernandes

Verified Market Research

US: +1 (650)-781-4080
UK: +44 (753)-715-0008
APAC: +61 (488)-85-9400
US Toll-Free: +1 (800)-782-1768

Email: sales@verifiedmarketresearch.com

Website: https://www.verifiedmarketresearch.com/

Originally posted here:
Cloud Computing IaaS In Life Science Market Size, Scope, Growth Opportunities, Trends by Manufacturers And Forecast to 2029 Shanghaiist - Shanghaiist


The Fathers of Kubernetes: Where Are They Now? – Container Journal

Kubernetes, the open source project for container management, has taken the software development world by storm. The platform is used by countless organizations using containers due to its high scalability, elasticity and reliability. According to the CNCF Annual Survey 2021, 96% of organizations are either using or evaluating Kubernetes.

Kubernetes is a de facto option for container orchestration and scheduling. But it wasn't always that way. It took great minds to construct Kubernetes within Google and others to evangelize its use throughout the software industry. And nowadays, new leaders are emerging to carry the torch forward.

Below, we'll revisit the history of Kubernetes and check in with its original creators to see where they are today. We'll also highlight several other prominent figures within the Kubernetes and open source cloud-native community to gauge where the inertia currently is.

For those unfamiliar with the history of Kubernetes, it was born out of Google's internal infrastructure called Borg. At the time, Google employees were working on Google Compute Engine, Google's version of EC2. "Kubernetes was sort of a spiritual successor to Borg," describes Kubernetes co-creator Joe Beda in a 2021 interview with Increment.

Kubernetes emerged due to a combination of the right technology, the right moment and the right people. Google was using Docker, but not in a way that created a competitive edge for Google. "Kubernetes was a way to start aligning some of that thinking," Beda explains. This new thinking around resilience and self-healing was already part of the site reliability engineering doctrine (another Google-bred concept). And such tactics were becoming increasingly necessary for the organization to embrace distributed machines where failure was a constant headache. "Dealing with the dynamism of an ever-changing system drove all the features that eventually ended up in Kubernetes," Beda says in the interview.

Kubernetes was open sourced in 2014 and donated to the Cloud Native Computing Foundation in 2018. The project now receives support from a vast number of institutions and community members, far beyond the scope of Google. Many of its co-founders have gone on to do extraordinary things, so where are they today?

First, there's Kubernetes co-creator Joe Beda. Now semi-retired, this Seattle-based technologist's most recent position was as principal engineer at VMware.

Beda has had an illustrious career, including posts at VMware, Split Software, Heptio, Shippable, CoreOS and Microsoft. As a senior staff software engineer working for Google, Beda was the co-founder and technical lead for Kubernetes. Beda has the privilege of filing the first-ever Kubernetes project commit. In his 10-year stint at Google, he also contributed to many other vital projects, like Google Hangouts and Google Compute Engine.

According to his GitHub, Beda is still an active open source contributor, having recently contributed to Kubernetes, VMware Tanzu, ngrok-k8s, and other projects. What impresses Beda is the sheer uptick in Kubernetes adoption. "Kubernetes is now the anchor for a broader ecosystem and ways of thinking about deploying and managing applications," says Beda in a session with BrightTalk. "We didn't foresee that."

Kubernetes co-founder Brendan Burns is now a corporate vice president at Microsoft, heading Azure projects related to DevOps, including K8s on Azure. Burns' recent work has focused on generating client libraries for working with the Kubernetes API. In the era of Borg, Burns was a senior staff software engineer at Google, where he had an eight-year run.

In a 2017 interview with the ArchiTECHt Show podcast, Burns described his move to Microsoft to aid their container efforts, making containerization easier to use and more flexible for hybrid multi-cloud environments. He also explains how Kubernetes is now bigger than any single company. "I think that every single person who's currently involved with [Kubernetes] could step away, and the project would continue. It has that kind of momentum."

Most recently, Burns has written about strengthening RBAC and confidential computing to protect containerized data. Microsoft is one of many organizations making up the Confidential Computing Consortium, a Linux Foundation project.

Now a self-described self-employed stay-at-home Dad, Kubernetes co-founder and Seattleite Craig McLuckie had a profound career around containerization and making Kubernetes more accessible for developers. At Google, he filled roles as lead product manager and group product manager. He was also the original product lead for Google Compute Engine.

After leaving Google, McLuckie became the founder and CEO of Heptio, and later was vice president of R&D at VMware after VMware acquired Heptio in 2018. McLuckie was also a major proponent behind the birth of the Cloud Native Computing Foundation.

"We still have a lot of work to do as an industry to make the infrastructure technology fade into the background and bring forward the technologies that developers interface with, that enable them to develop the code that drives the business," says McLuckie in a 2019 interview, TechCrunch reports. "Let's make that infrastructure technology really, really boring."

As one generation of innovators steps aside, who is stepping into their shoes? We checked in with some of the top faces in the open source cloud-native world. Here are some key thought leaders at the helm of the Kubernetes era today.

Anyone with an eye half open to enterprise software architecture trends will undoubtedly be familiar with one name in particular: Kelsey Hightower. An often-quoted thought leader and Twitter personality, this celebrity-status developer advocate is currently employed by Google in their cloud computing division. As of 2022, Hightower was a principal engineer at Google working on Google's Cloud Platform.

A self-described minimalist, Hightower has been an evangelist and continuous contributor to Kubernetes since 2014. He co-founded KubeCon in 2015 and even collaborated with Kubernetes co-founders Beda and Burns to write a book on the subject, Kubernetes: Up and Running, published by O'Reilly in 2017.

Perhaps the most well-known speaker on Kubernetes, Hightower is also a prominent person in cloud computing in general. An article by Tom Krazit of Protocol paints a wonderful background of Hightower, from self-taught programmer to entrepreneur, who even managed a comedy routine at one point in the past.

At Google, Hightower has helped develop Google Kubernetes Engine (GKE) and Cloud Functions. He advocates for diversity and inclusion within Google and throughout the tech sphere. Today, Hightower brings a human touch to the developer relations role to help customers onboard Google Cloud products and reduce configuration management obstacles.

Another prominent figure in the cloud-native sphere is James Governor, analyst and co-founder of RedMonk, a developer-focused industry analyst firm. Although Kubernetes has clearly won the container orchestration wars, it still has room to grow. According to Governor, the focus now is on growing the community, broadening the platform, and establishing a strong narrative for event-driven computing and serverless. He also advocates for lowering the developer experience hurdles involved in jumpstarting Kubernetes.

Brian Behlendorf, general manager of Open Source Security Foundation (OpenSSF), a Linux Foundation project, is another prominent contributor and leader within the open source community. He is also the executive director of Hyperledger, an open source blockchain collaboration initiative hosted by the Linux Foundation. Behlendorf was also co-founder of the Apache Project.

In a market where software supply chain attacks are rising, all hands are on the security deck. "I think the software industry this year really woke up to not only the fact these earthquakes were happening," Behlendorf told The New Stack in 2022, "and how it's getting more and more expensive to recover from them."

Kubernetes has become an essential utility for enterprise software development; undoubtedly, it's one of the best ways to manage large container clusters at scale. Thankfully, Kubernetes also benefits from a vibrant culture, represented by over 26,000 virtual and physical attendees at the latest KubeCon + CloudNativeCon Europe 2022. With so much interest, countless other names beyond those mentioned above are now moving the platform forward on a daily basis.

Looking to the future, Kubernetes co-founders tend to agree on one thing: that the core infrastructure should become more boring and fade into the background. "We need, like, the Visual Basic for the cloud," says Brendan Burns. While this doesn't necessarily mean removing the complexity that makes Kubernetes perform well, it could mean improving the developer experience around interfacing with the platform.

Some are also bullish on edge computing embracing cloud-native technologies. "From a futures perspective, it's all about the edge. This is where I see the most excitement. I think it's going to be a huge growth area and a highly disruptive area of innovation over the coming years," Craig McLuckie told the Over The Edge podcast.

Cloud-native is real, it's happening now and it's accelerating faster than ever. What is your organization doing to prepare? Join Container Journal on August 10 for our virtual CloudNativeDay22 to explore the ecosystem beyond Kubernetes and ways to leverage cloud-native technologies to move faster and more securely. Register now!


Read the original:
The Fathers of Kubernetes: Where Are They Now? - Container Journal
