
Tesla reveals $101 million bitcoin profit following its massive cryptocurrency investment – Yahoo News

Tesla is still holding more than $1.3 billion worth of the cryptocurrency. REUTERS/Mike Blake

Tesla says it cashed out $272 million in bitcoin in the first quarter of 2021.

The company made $101 million in profits from the sale, it said Monday.

Tesla disclosed the sale as part of its first-quarter earnings, which showed continued profitability.


Tesla made a splash when it announced that it had bought $1.5 billion worth of bitcoin in February - and it's already profited more than $100 million from the investment.

Elon Musk's automaker on Monday said that it sold $272 million worth of digital assets during the first three months of 2021 and that it netted $101 million from the sale, which represented about 10% of its holdings at the time.

Tesla still has $1.33 billion in digital assets on its balance sheet, it said. The remaining $171 million of the sale proceeds largely reflects what Tesla originally paid for the bitcoin it sold, though converting bitcoin to US dollars can also come with high fees.


The disclosure came as part of Tesla's first-quarter earnings report Monday afternoon, in which the company revealed $10.39 billion in revenue and made $438 million of profit for the three-month period ending March 31.

Tesla turned a profit thanks to $518 million it earned selling regulatory credits to other automakers, and the $101 million it brought in from selling bitcoin padded its bottom line as well. Bitcoin has surged about 60% since Tesla's February disclosure but is down slightly from an all-time high of more than $63,000 in April.

On a conference call with investors and analysts on Monday, CFO Zachary Kirkhorn said the company plans to continue adding bitcoin to its balance sheet from vehicle sales.

Musk has been known to hype up cryptocurrencies on social media and has contributed to bitcoin's massive rally by firing off tweets about it to his more than 50 million followers. His past tweets have also sent bitcoin's meme cousin, Dogecoin, soaring to unprecedented highs.

In March, Tesla began accepting bitcoin as a form of payment for its vehicles.

Read the original article on Business Insider


UW physicists contribute to quantum experiment that may lead to discovery of new subatomic particle – Dailyuw

Claire Anderson @lucky_pennydesigns

A team of physicists from the UW Precision Muon Physics Group has been part of a larger international effort to probe the boundaries of quantum physics. The first results of the Muon g-2 experiment, released earlier this month, revealed a discrepancy between the way a muon should behave in theory and how it behaves in real life.

A muon is a fundamental subatomic particle (like an electron), meaning it cannot be broken down into smaller fragments of matter. While sharing many properties with the electron, the muon exists for only about two millionths of a second before decaying, and it is almost 200 times more massive than the electron.

This mass difference is key, because a particle's sensitivity to external influences scales with its mass squared. Since the muon is about 200 times more massive than an electron, it is roughly 40,000 times more sensitive to any possible effects. Thus, the more accurately we can measure its properties, the more we can learn about quantum mechanics.
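The mass-squared scaling above is easy to check. A minimal sketch, using the CODATA muon-to-electron mass ratio (which the article rounds to 200):

```python
# Muon-to-electron mass ratio (CODATA value; the article rounds this to 200)
mass_ratio = 206.768

# Sensitivity to possible heavy new particles scales with mass squared,
# so the muon's edge over the electron is the ratio squared.
sensitivity_gain = mass_ratio ** 2
print(f"{sensitivity_gain:,.0f}x more sensitive")  # roughly 42,750x, the article's ~40,000x
```

The exact figure comes out slightly above 40,000 because the true ratio is closer to 207 than 200.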

"When we make this measurement of the muon, it's actually a direct probe [because] it's actually interacting with all of the particles and forces that we might not even know about," Brynn MacCoy, a physics Ph.D. student involved with the research group, said. "So the muons might know about something we don't."

The ongoing experiment aims to precisely measure a property of the muon called the g factor (the "g" in the experiment name), which describes how a muon's internal magnet wobbles.

"You can think of it as sort of like a spinning top," Joshua Labounty, another physics Ph.D. candidate involved in the experiment, said. "You spin a top, [and] after a while it starts to wobble around, spin and precess. Muons are doing that same sort of motion, just at a super, super subatomic scale."

However, the theoretically calculated g factor of a muon does not align with the experimentally determined g factor. The g-2 experiment aims to measure the muon's g factor as precisely as possible in order to determine whether this discrepancy is a statistical error or evidence of as-yet undiscovered physics.

The theoretical value of the muon's g factor can be calculated from the Standard Model of particle physics, a theory describing all known particles in the universe and three of the four known fundamental forces. The discrepancy between theory and measurement could be due to unknown particles or forces not yet included in the Standard Model.

"There's a lot of things that are missing from the Standard Model," MacCoy said. "It's not surprising that there could be physics beyond [it]. We actually expect there to be physics beyond the Standard Model."

The Standard Model does not include gravity, and it only accounts for about 5% of the matter and energy in the universe, the rest being unknown substances we call dark matter and dark energy. For years, these questions have led optimistic scientists to search for a "theory of everything," a single theory that unites quantum physics and Einstein's theory of relativity.

"It's accurate to say that at the scale that we can measure, our current model of gravity is correct," Hannah Binney, a physics Ph.D. student involved in the experiment, said.

"And for the most part, at the scale we can measure, our current model of particle physics is correct. It's just that our brain, our physics brain, says, 'Surely we should be able to connect them.'"

There is a wide range of theories attempting to explain the g factor anomaly, but the goal of the experiment isn't to prove or disprove any particular one. Rather, the more accurate a measurement the physicists can make, the more theories they are able to rule out.

"There's a big parameter space that you start off with, and every new experiment sort of erases a little bit of that parameter space that you could still have a particle in, until you finally can maybe shrink down to one area that's still left and say, 'OK, there should be a particle here,'" Labounty said.

In order to confirm the existence of a new particle, the data would have to show a significance of five standard deviations, which is the typical benchmark scientists use to accept a new discovery. This benchmark means that there is a one in 3.5 million chance that the results are a statistical error, and reflects an incredibly high confidence in their accuracy.

The data just released corresponds to a one in 40,000 chance that the results are a fluke. However, this first release represents only 6% of the data planned to be collected over the course of the experiment. The data also comes from the team's first run, in 2018; the researchers are currently on their fourth.
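The odds quoted above follow from standard Gaussian tail probabilities. A minimal sketch, assuming the conventional reading that "1 in 3.5 million" refers to the one-sided 5-sigma tail while "1 in 40,000" corresponds to a two-sided 4.2-sigma tail:

```python
import math

def upper_tail(z):
    """One-sided Gaussian tail probability for a z-score."""
    return 0.5 * math.erfc(z / math.sqrt(2))

p_discovery = upper_tail(5.0)    # one-sided 5-sigma discovery threshold
p_result = 2 * upper_tail(4.2)   # two-sided p-value for a 4.2-sigma result

print(f"5 sigma: about 1 in {1 / p_discovery:,.0f}")   # ~1 in 3.5 million
print(f"4.2 sigma: about 1 in {1 / p_result:,.0f}")    # ~1 in 37,000, i.e. roughly 1 in 40,000
```

This uses only the standard library; `math.erfc` is the complementary error function, which gives the normal tail directly.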

In order to reduce their statistical uncertainty, the researchers plan to continue taking more measurements and analyzing the data from their subsequent experimental runs.

"It's a very unique experience to actually have the prospect of pushing the boundaries of physics and possibly finding something new," Binney said.

Reach reporter Sarah Kahle at news@dailyuw.com. Twitter: @karahsahle



Carlo Rovelli’s Helgoland argues that all reality is relative – New Statesman

Even when you understand the science, it's quite hard to accept that rainbows are not real. You think you can see one, and the person standing next to you agrees. But you are both being fooled. A rainbow is the pattern of coloured light that you perceive when you look out on a very specific set of atmospheric conditions. Because each of your eyes looks out at a slightly different angle, you actually perceive two different rainbows, one in each eye. And the person next to you sees a different rainbow again. None of those rainbows exist out there, outside of a mind.

If you can accept this, maybe you can accept the Italian physicist Carlo Rovelli's perception of the universe. His new book, Helgoland, is an argument that nothing we see and experience actually exists. Just as the rainbow is a manifestation of the angle between you, some water droplets in the sky, and the sun, Rovelli tells us that the atoms, electrons, photons of light and other stuff of the universe manifest only in their interactions with each other. "Individual objects are the way in which they interact," he says. Reality is "a vast web of interacting entities," of which we are a part.

This relational reality is Rovelli's favoured way of interpreting quantum theory, physicists' best mathematical description of how the universe behaves at its most fundamental level. Quantum physics invites such interpretations because it doesn't actually have anything to say about the nature of reality. It was cobbled together, a somewhat Heath Robinson affair, on the back of late 19th-century attempts to make better electric light bulbs.

First came the assertion from the German physicist Max Planck that energy is emitted by atoms in lumps: it pours out like cereal from a packet, not like milk from a bottle. Planck could not justify his idea; he called it "an act of desperation". Nonetheless, it enabled him to explain the ratio of heat to visible light given out by electric light bulb filaments. After this problem was solved, one ingenious hack was forced on top of another until we ended up with a theory that could accurately describe the outcomes of any experiment involving atoms and their ilk.

It's been a huge success; developments of quantum theory have given us innumerable technological and scientific breakthroughs. At the same time, though, the theory has never been able to tell us what the constituents of the universe actually are.

This wouldn't have been a problem if a small cadre of physicists didn't insist that it should be. Early in the quantum story, in the 1920s and 1930s, the Danish physicist Niels Bohr tried his best to stem this tide. When Einstein objected to the apparent randomness at the heart of quantum theory, Bohr allegedly told him to "stop telling God what to do". In the face of efforts to describe what atoms were, Bohr warned that, when it comes to atoms, "language can be used only as in poetry".

Bohr might as well have saved his breath. We now have myriad interpretations of quantum theory, each one an attempt to describe an underlying reality that gives rise to the results we obtain in quantum experiments. And they are, essentially, guesswork.


You might be familiar with some of the guesses. There's the "many worlds" interpretation, for instance, which claims that you are reading this in one of a near-infinite number of alternate universes, each one the host of a different outcome of a single event in the quantum world. Another famous interpretation is the "hidden variables" idea favoured by Einstein, in which an atom is not a particle as we tend to think of it, but consists of a particle and an invisible, undetectable quantum wave that guides the particle's behaviour.

Rovelli's relational interpretation, the central subject of his book, has its roots in the work of a young physicist called Werner Heisenberg. In the summer of 1925, Heisenberg took himself on a retreat to the small, near-treeless North Sea island of Helgoland. Here, he dedicated himself to creating a new mathematical approach to quantum theory: matrix mechanics.

You probably won't have heard of matrix mechanics because it has been suffocated by the popularity of an alternative: Erwin Schrödinger's wave equation. Schrödinger treated isolated quantum entities, such as atoms, as if they were waves. When different quantum waves come together, they create what seem like otherworldly influences between quantum stuff, and strange behaviours such as one entity simultaneously existing in multiple places, or simultaneously moving in multiple directions.

In the commonly accepted view, Schrödinger's waves eventually crash on the shores of their environment (such as our laboratory measuring apparatus), leaving imprints that we have generally taken as evidence for the existence of quantum particles. However, these imprints often reveal a wave-like past to these particles' existence, confounding our understanding of what these things actually are. Hence the American physicist Richard Feynman's evergreen assertion that quantum physics is not actually comprehensible.

Heisenberg appreciated this far earlier than most, declaring Schrödinger's quantum waves to be "repulsive" and "crap", and offering his matrix mechanics, the pure, unadorned mathematics of how one state of an atom relates to another state in the moment of a measurement of its properties, as an alternative. Even Schrödinger conceded that this was more accurate. "It is better," he admitted, "to consider a particle not as a permanent entity but rather as an instantaneous event."

And according to Rovelli, this series of fortunate events is all there is. The properties of a quantum object are only real with respect to some other object at some moment, just as a rainbow is only real in the mind of an observer at the moment of its observation. What's more, a third quantum object might not perceive those same properties at all. Put another way, reality is relative and truth is subjective. "Is it possible that a fact might be real with respect to you and not real with respect to me?" Rovelli asks. "Quantum theory, I believe, is the discovery that the answer is yes."

Rovelli doesn't really push things much further than that; you won't come away from Helgoland with a sense that you finally understand the true nature of reality. He doesn't explain, for instance, what it is that is doing the interacting, if the entities are nothing but their interactions. But it is a pleasure to travel in his company regardless.

That's partly because, as with Rovelli's previous books, the prose is translated from his Italian by the writer Erica Segre and Simon Carnell, a poet, who have made it a delight to read. They describe reality as "a luxuriant stratification": "snow-covered mountains and forests, the smile of friends, the rumble of the underground on dirty winter mornings". With phrasing like this, who cares if there are no real answers?

And let's not pretend that any books on quantum physics can contain satisfying answers about what reality is. How could they, when the theory is not designed to give any? Using it as a guide to the nature of reality leaves us stranded in the mist, like Heisenberg lost in his thoughts on Helgoland.

Aware of the inadequacy of the science, Rovelli offers a second source for his intuitions. In the last third of the book we are seated with him at the feet of the Buddhist philosopher Nagarjuna, who teaches that "there is nothing that exists in itself, independently from something else". For me as a human being, Rovelli says, Nagarjuna teaches "the serenity, the lightness and the shining beauty of the world": "we are nothing but images of images. Reality, including our selves, is nothing but a thin and fragile veil, beyond which there is nothing." Much like that damned rainbow.

Michael Brooks's books include "The Quantum Astrologer's Handbook" (Scribe)

Helgoland, by Carlo Rovelli. Allen Lane, 208pp, £20



Muon g-2 Particle Accelerator Experiment Results Are Not Explained by Our Current Theories of Physics – SciTechDaily

The Muon g-2 ring sits in its detector hall amidst electronics racks, the muon beamline, and other equipment. Credit: Reidar Hahn, Fermilab

The first results from the Muon g-2 experiment at the U.S. Department of Energy's Fermi National Accelerator Laboratory have revealed that fundamental particles called muons behave in a way that is not predicted by scientists' best theory to date, the Standard Model of particle physics. This landmark result, published recently in Physical Review Letters, confirms a discrepancy that has been gnawing at researchers for decades.

The strong evidence that muons deviate from the Standard Model calculation might hint at exciting new physics. The muons in this experiment act as a window into the subatomic world and could be interacting with yet-undiscovered particles or forces.

"This experiment is a bit like a detective story," said team member David Hertzog, a University of Washington professor of physics and a founding spokesperson of the experiment. "We have analyzed data from the Muon g-2's inaugural run at Fermilab, and discovered that the Standard Model alone cannot explain what we've found. Something else, perhaps beyond the Standard Model, may be required."

The Muon g-2 experiment is an international collaboration between Fermilab in Illinois and more than 200 scientists from 35 institutions in seven countries. UW scientists have been an integral part of the team through the Precision Muon Physics Group, constructing sensitive instruments and sensors for the experiment and leading data analysis efforts. In addition to Hertzog, current UW faculty and lead scientists involved include Peter Kammel, research professor of physics; Erik Swanson, a research engineer with the UW's Center for Experimental Nuclear Physics and Astrophysics, or CENPA; Jarek Kaspar, a research scientist; and Alejandro Garcia, a professor of physics.

Lead fluoride crystals, which are used in detectors designed and constructed at the UW that measure muon decay products for the Muon g-2 experiment. Credit: University of Washington

"The UW custom-built instrumentation would not have been possible without the extraordinary dedication and expertise of our CENPA technical staff, who work closely with our postdocs and graduate students," said Hertzog.

A muon is about 200 times as massive as its cousin, the electron. Muons occur naturally when cosmic rays strike Earth's atmosphere, and particle accelerators at Fermilab can produce them in large numbers. Like electrons, muons act as if they have a tiny internal magnet. In a strong magnetic field, the direction of the muon's magnet precesses, or wobbles, much like the axis of a spinning top. The strength of the internal magnet determines the rate at which the muon precesses in an external magnetic field and is described by a number known as the g-factor. This number can be calculated with ultra-high precision.

As the muons circulate in the Muon g-2 magnet, they also interact with a quantum foam of subatomic particles popping in and out of existence. Interactions with these short-lived particles affect the value of the g-factor, causing the muon's precession to speed up or slow down slightly. The Standard Model predicts with high precision what the value of this so-called anomalous magnetic moment should be. But if the quantum foam contains additional forces or particles not accounted for by the Standard Model, that would tweak the muon g-factor further.
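To make the wobble concrete, here is a hedged back-of-envelope sketch of the anomalous precession rate. The relation omega_a = a_mu * eB/m_mu and the roughly 1.45 tesla storage-ring field are standard textbook figures for this experiment, but treat the snippet as an illustration, not as the collaboration's actual analysis:

```python
import math

# Anomalous precession: the muon's spin precesses relative to its momentum
# at omega_a = a_mu * (e * B / m_mu), set by the anomaly rather than the full g-factor.
a_mu = 1.16592e-3          # muon anomaly, (g - 2) / 2
e = 1.602176634e-19        # elementary charge, in coulombs
B = 1.45                   # storage-ring magnetic field, tesla (nominal value)
m_mu = 1.883531627e-28     # muon mass, kg

omega_a = a_mu * e * B / m_mu      # angular frequency, rad/s
f_a = omega_a / (2 * math.pi)      # ordinary frequency, Hz
print(f"{f_a / 1e3:.0f} kHz")      # roughly 229 kHz
```

The result, a few hundred kilohertz, is the wobble frequency the ring's detectors are built to extract from the muon decay products.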

Hertzog, then at the University of Illinois, was one of the lead scientists on the predecessor experiment at Brookhaven National Laboratory. That endeavor concluded in 2001 and offered hints that the muon's behavior disagreed with the Standard Model. The new measurement from the Muon g-2 experiment at Fermilab strongly agrees with the value found at Brookhaven and diverges from theory with the most precise measurement to date.

The new experimental world-average results announced by the Muon g-2 collaboration differ slightly from the accepted theoretical values for the muon.

The combined results from Fermilab and Brookhaven show a difference with theoretical predictions at a significance of 4.2 sigma, a little shy of the 5 sigma (5 standard deviations) that scientists require to claim a discovery. But it is still compelling evidence of new physics. The chance that the results are a statistical fluctuation is about 1 in 40,000.
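The 4.2-sigma figure can be reproduced from the published numbers. A sketch, assuming the values of the muon anomaly a_mu = (g - 2)/2 reported in the April 2021 announcement, quoted in units of 1e-11:

```python
import math

# Experimental world average and Standard Model prediction of the muon
# anomaly (units of 1e-11), with one-sigma uncertainties, as reported
# in the April 2021 announcement.
a_exp, sigma_exp = 116_592_061, 41
a_th, sigma_th = 116_591_810, 43

difference = a_exp - a_th                         # 251 in these units
combined_sigma = math.hypot(sigma_exp, sigma_th)  # uncertainties added in quadrature
significance = difference / combined_sigma
print(f"{significance:.1f} sigma")                # ~4.2 sigma
```

Dividing the theory-experiment gap by the combined uncertainty is exactly how the headline sigma level is obtained.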

"This result from the first run of the Fermilab Muon g-2 experiment is arguably the most highly anticipated result in particle physics over the last years," said Martin Hoferichter, an assistant professor at the University of Bern and member of the theory collaboration that predicted the Standard Model value. "After almost a decade, it is great to see this huge effort finally coming to fruition."

UW research engineer Erik Swanson with equipment used to measure magnetic fields in the Muon g-2 experiment. Credit: University of Washington

The Fermilab experiment, which is ongoing, reuses the main component from the Brookhaven experiment, a 50-foot-diameter superconducting magnetic storage ring. In 2013, it was transported 3,200 miles by land and sea from Long Island to the Chicago suburbs, where scientists could take advantage of Fermilab's particle accelerator and produce the most intense beam of muons in the United States. Over the next four years, researchers assembled the experiment; tuned and calibrated an incredibly uniform magnetic field; developed new techniques, instrumentation, and simulations; and thoroughly tested the entire system.

The Muon g-2 experiment sends a beam of muons into the storage ring, where they circulate thousands of times at nearly the speed of light. Detectors lining the ring allow scientists to determine how fast the muons are wobbling.

Many of the sensors and detectors at Fermilab were constructed at the UW, such as instruments to measure the muon beam as it enters the storage ring and to detect the telltale particles that arise when muons decay. Dozens of scientists including faculty, postdoctoral researchers, technicians, graduate students and undergraduate students have worked to assemble these sensitive instruments at the UW and then install and monitor them at Fermilab.

UW scientists have also been involved in theoretical work around the Muon g-2 collaboration.

"The prospects of the new result triggered a coordinated theory effort to provide our experimental colleagues with a robust, consensus Standard-Model prediction," said Hoferichter, who was a UW research assistant professor from 2015 to 2019. "Future runs will motivate further improvements, to allow for a conclusive statement if physics beyond the Standard Model is lurking in the anomalous magnetic moment of the muon."

In its first year of operation, in 2018, the Fermilab experiment collected more data than all prior muon g-factor experiments combined. The Muon g-2 collaboration has now finished analyzing the motion of more than 8 billion muons from that first run. The UW team was central to this effort, leading to four doctoral theses to date.

Data analysis on the second and third runs of the experiment is underway; the fourth run is ongoing, and a fifth run is planned. Combining the results from all five runs will give scientists an even more precise measurement of the muon's wobble, revealing with greater certainty whether new physics is hiding within the quantum foam.

"So far we have analyzed less than 6% of the data that the experiment will eventually collect," said Fermilab scientist Chris Polly, who is a co-spokesperson for the current experiment and was a lead University of Illinois graduate student under Hertzog during the Brookhaven experiment. "Although these first results are telling us that there is an intriguing difference with the Standard Model, we will learn much more in the next couple of years."

"With these exciting results, our team, in particular our students, is enthusiastic to push hard on the remaining data analysis and future data-taking in order to realize our ultimate precision goal," said Kammel.


Reference: "Measurement of the Positive Muon Anomalous Magnetic Moment to 0.46 ppm" by B. Abi et al. (Muon g-2 Collaboration), 7 April 2021, Physical Review Letters. DOI: 10.1103/PhysRevLett.126.141801


NTT Research and Tokyo Institute of Technology Target Two Applications for CIM – Business Wire

SUNNYVALE, Calif.--(BUSINESS WIRE)--NTT Research, Inc., a division of NTT (TYO:9432), today announced that it has entered into a joint research agreement with Tokyo Institute of Technology (Tokyo Tech) to develop applications for the Coherent Ising Machine (CIM). The two targeted applications for the CIM, an information processing platform based on quantum oscillator networks, are compressed sensing and drug discovery, both of which require extremely high levels of processing on existing computers. Two agreements, signed in 2020, call for collaboration between NTT Research's Physics & Informatics (PHI) Lab and independent research groups in Tokyo Tech's School of Computing, directed by Drs. Yutaka Akiyama and Toru Aonishi. NTT Research will lead the five-year project, which will involve approximately ten researchers working in Tokyo and Sunnyvale.

Tokyo Tech, the largest institution for higher education in Japan devoted to science and technology, is a national research university funded primarily through the government. In its School of Computing, Professor Akiyama specializes in bioinformatics, including genome information processing, drug design and parallel applications; and Associate Professor Aonishi specializes in information science, mathematical physics and statistical mechanics. Drug discovery and compressed sensing are considered appropriate applications for a CIM because both require solving large-scale optimization problems. The search for effective drugs involves an astronomical number of potential matches between pharmaceutically appropriate molecules and the target proteins responsible for a specific disease. In fields such as magnetic resonance imaging (MRI) and computed tomography (CT), compressed sensing, also known as sparse sampling, can deliver highly efficient results by discarding large amounts of data carrying no useful information. The CIM is purpose-built to solve combinatorial optimization problems, a viable approach to both drug discovery and compressed sensing.

"Previous work has focused mainly on understanding how quantum oscillator networks solve combinatorial optimization problems," said Dr. Yoshihisa Yamamoto, Director of the PHI Lab. "Through this new application-oriented work undertaken in collaboration with Professors Akiyama and Aonishi, we believe that we will be able to explore new ways to use the networks by better understanding the requirements of a CIM."

A CIM is a network of oscillators programmed to solve problems that have been mapped to an Ising model, a mathematical abstraction of magnetic systems composed of competitively interacting spins, or angular momenta of fundamental particles. (For a visual representation of how a CIM solves a combinatorial optimization problem, see this video from MIT's Lincoln Laboratory.) The near-term goals of this joint research include formulating the essential part of the intensive computation required for a CIM to screen drug candidate compounds by combining their functional fragments, and developing a CIM-based L0-norm reconstruction algorithm for distorted images. (The L0 norm counts the non-zero elements in a matrix.) Broader expectations are to demonstrate the advantages of a CIM and its related technology in addressing real-world problems and to explore new ways of computing.
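To see what "mapped to an Ising model" means in miniature, here is a toy sketch with made-up couplings. A real CIM handles vastly larger instances with networks of optical oscillators rather than brute-force enumeration; this only illustrates the problem form:

```python
import itertools

# Toy Ising problem: find the spin assignment minimizing
# E(s) = -sum over pairs (i, j) of J[i, j] * s_i * s_j, with each s_i in {-1, +1}.
J = {(0, 1): 1.0, (1, 2): -0.5, (0, 2): 0.8}   # hypothetical couplings

def energy(spins):
    """Ising energy of a spin configuration under the couplings J."""
    return -sum(j * spins[a] * spins[b] for (a, b), j in J.items())

# Enumerate all 2^3 configurations and keep the lowest-energy one.
best = min(itertools.product([-1, 1], repeat=3), key=energy)
print(best, energy(best))
```

Drug-screening and sparse-reconstruction tasks are encoded by choosing the couplings J so that the ground state of this energy corresponds to the desired solution.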

"We are very pleased to have entered into these agreements with NTT Research and look forward to exciting results over the next five years resulting from the collaboration between Professors Akiyama and Aonishi, their groups and their NTT Research counterparts," said Osamu Watanabe, Executive Vice President, Director of the Office of Research and Innovation, Tokyo Tech.

As part of its long-range goal to radically redesign artificial computers, both classical and quantum, the NTT Research PHI Lab has already established joint research agreements with seven universities, one government agency and one quantum computing software company. The other institutions of higher education are Cornell University, Massachusetts Institute of Technology (MIT), Stanford University, California Institute of Technology, Swinburne University of Technology, the University of Michigan and the University of Notre Dame. The government entity is NASA Ames Research Center, and the private company is 1QBit. In January 2021, NTT Research entered a second agreement with Caltech to develop an extremely fast, miniaturized CIM. The PHI Lab's research partners include more than a dozen of the world's leading quantum physicists. In addition to its PHI Lab, NTT Research has two other divisions: its Cryptography & Information Security (CIS) Lab and Medical & Health Informatics (MEI) Lab.

About NTT Research

NTT Research opened its offices in July 2019 as a new Silicon Valley startup to conduct basic research and advance technologies that promote positive change for humankind. Currently, three labs are housed at NTT Research facilities in Sunnyvale: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab. The organization aims to upgrade reality in three areas: 1) quantum information, neuroscience and photonics; 2) cryptography and information security; and 3) medical and health informatics. NTT Research is part of NTT, a global technology and business solutions provider with an annual R&D budget of $3.6 billion.

NTT and the NTT logo are registered trademarks or trademarks of NIPPON TELEGRAPH AND TELEPHONE CORPORATION and/or its affiliates. All other referenced product names are trademarks of their respective owners. © 2021 NIPPON TELEGRAPH AND TELEPHONE CORPORATION


Partners Capital Strengthens Global Leadership Team with Appointment of Senior Executive in Asia Pacific – PR Newswire UK

"We are thrilled to welcome Emmanuel to Partners Capital at a time when the client demand for investment services and solutions across Asia has never been greater," said Arjun Raghavan, who founded the firm's Asia business in 2011 and ran it until becoming CEO in July 2020. "Emmanuel's extensive network of senior relationships with investors, asset managers and financial institutions in Asia-Pacific (APAC) accentuates our ability to holistically serve our global clients."

Pitsilis brings over 25 years of experience in APAC as an investor and leader in the financial services industry. Prior to joining Partners Capital, he was an entrepreneur and early-stage venture investor focused on Asia B2B SaaS and FinTech sectors. Over the last seven years, he co-founded two tech businesses and built a successful venture portfolio. In addition, he spent over 20 years at McKinsey & Company, mostly in Hong Kong, where he was a Senior Partner in the financial services practice focused on building the Asian practice. At McKinsey, his clients included global investment banks, regional financial institutions and senior policy makers at governments, central banks and securities regulators.

"I am excited to join Partners Capital at a time when Asia is at the forefront of global investment opportunities and investment innovation," said Pitsilis. "After meeting Arjun, Adam and Dominik, it was clear to me that the firm has unmatched potential given its deep intellectual capital and global investment talent. The firm's rigorous approach to every facet of investing (research, highly sophisticated risk management and portfolio construction), combined with access to top-tier managers and compelling investment ideas, and an ability to integrate sustainability, is truly distinctive. We are well positioned to help both Asian investors (institutions and a growing number of family offices) in navigating global markets and to help global investors in understanding and investing in Asia."

Pitsilis holds a Master's from École Polytechnique majoring in Pure Mathematics and Quantum Physics, a Master's in Engineering from École des Mines de Paris and an MBA from INSEAD.

About Partners Capital

Founded in 2001, Partners Capital is a wholly independent Outsourced Investment Office (OCIO) primarily serving sophisticated institutions and senior investment professionals in Europe, North America and Asia Pacific. With offices in Boston, New York, London, Singapore, Hong Kong, San Francisco and Paris, the firm is one of the few truly global OCIOs, employing 230 people worldwide and covering all major asset classes. The firm oversees assets in excess of $38 billion.1 Additional information on Partners Capital may be found at http://www.partners-cap.com.

Contact

Prosek on behalf of Partners Capital

pro-partnerscapital@prosek.com

1 As at December 2020

Photo - https://mma.prnewswire.com/media/1497197/Prosek.jpg

Logo - https://mma.prnewswire.com/media/1421509/Partners_Capital_Logo.jpg


SOURCE Partners Capital

See more here:

Partners Capital Strengthens Global Leadership Team with Appointment of Senior Executive in Asia Pacific - PR Newswire UK

Read More..

Lessons Learned from Warehouse Management Systems in the Cloud – DC Velocity

Warehouse Management Systems from some vendors, at least, were a little late to move to the Cloud compared with other supply chain applications, such as Transportation Management Systems. That's for a number of reasons, including concerns about response times for critical real-time systems such as wireless terminals, voice picking and materials handling system communications, as well as limited functionality in some early-to-market Cloud WMS offerings.

Softeon has been delivering Cloud WMS deployments for many years but has really seen adoption take off in the past two years. Currently, about 75% of new WMS deployments are Cloud-based, heading toward nearly 100%.

We also have a number of customers that have easily migrated from existing on-premise WMS implementations to the Cloud.

This is due to the major advantages of Cloud deployments in such areas as the time and cost of implementation and ease of system management post-go-live, which requires few internal customer IT resources.

After a significant number of Cloud WMS deployments, Softeon has gained insights that are potentially of interest to prospective WMS adopters, which we'll summarize below.

Those include:

Supply chain software in the Cloud has clearly achieved critical mass and Cloud-based Warehouse Management Systems are now ready for prime time, providing many advantages to the companies that adopt them.

Go here to see the original:
Lessons Learned from Warehouse Management Systems in the Cloud - DC Velocity


Inside Google's soaring earnings: 'Our cloud services are helping businesses, big and small, accelerate their digital transformations' – WRAL…

RESEARCH TRIANGLE PARK – Google's digital advertising network has shifted back into high gear, with its corporate parent reporting profit that more than doubled after an unprecedented setback during the early stages of the pandemic. Plus, cloud computing from Google is helping customers capitalize on digitalization, triggered in part by responses to the pandemic.

Sundar Pichai, CEO of Google and Alphabet, noted the trends in his summary of Tuesday's earnings report: "Over the last year, people have turned to Google Search and many online services to stay informed, connected and entertained. We've continued our focus on delivering trusted services to help people around the world. Our cloud services are helping businesses, big and small, accelerate their digital transformations."

From Sundar Pichai's earnings conference call:

When I last spoke with you in early February, no one could have imagined how much the world would change, and how suddenly. Our thoughts are with everyone who has been impacted by COVID-19, especially those who've lost loved ones or their livelihoods. It's a challenging moment for the world.

Through it all, we're incredibly grateful for all of the essential workers on the front-lines of this crisis, from health care workers and first-responders, to the grocery store clerks and delivery workers, to teachers grappling with new technology to help children learn remotely, to all of the scientists and researchers working hard to develop vaccines and treatments, and many others who are leading through these difficult times. Thank you. These people fill us with hope and show us the power of human resilience. We'll need that energy and resolve in the months and years ahead.

Today, there is still a great deal of uncertainty regarding the path to recovery. But there are some things that we can understand better with the patterns we are seeing. For example, it's clear from data that people are being more cautious and are seeking authoritative advice and guidance to protect their families' health and safety. A return to normal economic activity depends on how effectively societies manage the spread of the virus. There's no one-size-fits-all, and the timing and pace of recovery will vary from location to location. This is a long-term effort.

It's also clear that this is the first major pandemic taking place in a digital world.

Read full transcript online.

The cloud's growing importance to Google is reflected by the fact that the tech giant is expanding its cloud efforts with a new engineering center in Durham, where it expects to employ 1,000 people.

Added Chief Financial Officer Ruth Porat: "Total revenues of $55.3 billion in the first quarter reflect elevated consumer activity online and broad based growth in advertiser revenue. We're very pleased with the ongoing momentum in Google Cloud, with revenues of $4.0 billion in the quarter reflecting strength and opportunity in both GCP and Workspace."

The robust first-quarter advertising growth provides the latest sign that advertisers are expecting the economy to roar back to life as more people get vaccinated against COVID-19 and burst out of their pandemic cocoons.

That is particularly true in the travel industry, a key part of the ad market that drastically curtailed its spending last year after governments around the world imposed lockdowns to prevent the spread of the novel coronavirus.


Google's vast digital ad empire is now benefiting from that recovery, although company executives warned in a conference call that another wrong turn in the pandemic could discourage the recent consumer splurging that's also spurring advertisers to spend more.

"It's too early to say how durable this consumer behavior will be as economies recover and restrictions on mobility are lifted," Porat said.

Google's sales surged 32% from the same time last year to nearly $45 billion during the January-March period. It's the third consecutive quarter of accelerating ad growth for Google, following an 8% decline during last year's April-June period. That marked the first time Google's quarterly ad revenue had fallen from the previous year since the company went public in 2004.

The resurgence enabled Alphabet to easily surpass the analyst estimates that help set investor expectations.

The Mountain View, California, company earned $17.9 billion, or $26.29 per share, more than double what it reported the same time last year. The profit was inflated by an accounting change of $650 million, or 97 cents per share.

Total revenue, which also includes Googles cloud-hosting service and device sales, climbed 34% from last year.

Analysts had projected earnings of $15.76 per share on revenue of $51.5 billion, according to FactSet. The performance pleased investors, who drove up Alphabet's stock by 4% in extended trading after the numbers came out.

Aside from the one-quarter downturn in ad revenue, Google has mostly thrived throughout the pandemic as people became more dependent on its services, a phenomenon that has strengthened other technology stalwarts such as Apple, Amazon, Microsoft, Facebook and Netflix.

Alphabet's stock is trading above $2,300, nearly double its price when the pandemic was declared 13 months ago. Alphabet's market value is now nearly $1.6 trillion. And if Alphabet's shares follow the same trajectory during Wednesday's regular trading session, the stock will hit a new peak.

Google's critics contend much of its success has come through anti-competitive practices tied to the dominance of its search engine, which has become the de facto gateway into the digital world. Those complaints culminated in a series of lawsuits filed by U.S. regulators last year in cases aimed at reining in Google's ability to expand, if not forcing a breakup of its services.

But the main lawsuit filed by the U.S. Justice Department isn't scheduled to go to trial until September 2023, leaving Google ample time to extend its tentacles even further while fighting a case that it contends is unfounded.

Google's YouTube video site remains one of the company's fastest rising stars, with ad revenue increasing 49% from last year to $6 billion. The company's cloud-computing service is also rapidly expanding; its revenue shot up 46% from last year.

Read more:
Inside Google's soaring earnings: 'Our cloud services are helping businesses, big and small, accelerate their digital transformations' - WRAL...


Cloud Hosting Service Providers Market 2021 Is Booming Across the Globe by Share, Size, Growth, Segments and Forecast to 2027 | Top Players Analysis-…

Industry Growth Insights (IGI) has added a latest report on the Global Cloud Hosting Service Providers Market that covers the full 360-degree scope of the market and various parameters that are speculated to propel the growth of the market during the forecast period, 2021-2028. The market research report provides in-depth analysis in a structured and concise manner, which in turn, is expected to help the esteemed reader to understand the market exhaustively.

Major Players Covered In This Report:

SoftLayer, Google, Distil Networks, Qt Cloud Services, Telax, CompuLab, Red Hat, Amazon, CenturyLink, Acquia, ViaWest, Microsoft, CSC, HP, Fujitsu, Cloud Hosting Service Provider

The research report confers information about latest and emerging market trends, key market drivers, restraints, and opportunities, supply & demand scenario, and potential future market developments that are estimated to change the future of the market. This report also serves the strategic market analysis, latest product developments, comprehensive analysis of regions, and competitive landscape of the market. Additionally, it discusses top-winning strategies that have helped industry players to expand their market share.

Get Exclusive Sample Report for Free @ https://industrygrowthinsights.com/request-sample/?reportId=173122

9 Key Report Highlights

Historical, Current, and Future Market Size and CAGR

Future Product Development Prospects

In-depth Analysis on Product Offerings

Product Pricing Factors & Trends

Import/Export Product Consumption

Impact of COVID-19 Pandemic

Changing Market Dynamics

Market Growth in Terms of Revenue Generation

Promising Market Segments

Impact of COVID-19 Pandemic On Cloud Hosting Service Providers Market

The COVID-19 pandemic had persuaded state government bodies to impose stringent regulations on the opening of manufacturing facilities, corporate facilities, and public places. It had also imposed restrictions on travelling through all means. This led to the disruption in the global economy, which negatively impacted the businesses across the globe. However, the key players in the Cloud Hosting Service Providers market created strategies to sustain the pandemic. Moreover, some of them created lucrative opportunities, which helped them to leverage their market position.

The dedicated team at Industry Growth Insights (IGI) closely monitored the market from the beginning of the pandemic. They conducted several interviews with industry experts and key management of the top companies to understand the future of the market amidst the trying times. The market research report includes strategies, challenges & threats, and new market avenues that companies implemented, faced, and discovered respectively in the pandemic.

On What Basis the Market Is Segmented in The Report?

The global Cloud Hosting Service Providers market is segmented on the basis of:

Products

Cloud-based, On Premise, Cloud Hosting Service Provider

The drivers, restraints, and opportunities of the product segment are covered in the report. Product developments since 2017, products market share, CAGR, and profit margins are also included in this report. This segment confers information about the raw materials used for the manufacturing. Moreover, it includes potential product developments.

Applications

Large Enterprise, Small and Medium Enterprise

The market share of each application segment is included in this section. It provides information about the key drivers, restraints, and opportunities of the application segment. Furthermore, it confers details about the potential application of the products in the foreseeable future.

Regions

North America

Asia Pacific

Europe

Latin America

Middle East & Africa

Note: A country of choice can be included in the report. If more than one country needs to be added in the list, the research quote will vary accordingly.

The market research report provides in-depth analysis of the regional market growth to determine the potential worth of investment & opportunities in the coming years. This Cloud Hosting Service Providers report is prepared after considering the social and economic factors of the country, while it has also included government regulations that can impact the market growth in the country/region. Moreover, it has served information on import & export analysis, trade regulations, and opportunities of new entrants in domestic market.

Buy the complete report @ https://industrygrowthinsights.com/checkout/?reportId=173122

7 Reasons to Buy This Report

Usage of Porter's Five Forces Analysis Model

Implementation of Robust Methodology

Inclusion of Verifiable Data from Respectable Sources

Market Report Can Be Customized

Quarterly Updates On Market Developments

Presence of Infographics, Flowcharts, And Graphs

Provides In-Depth Actionable Insights to Make Crucial Decisions

Ask for discount @ https://industrygrowthinsights.com/ask-for-discount/?reportId=173122

Below is the TOC of the report:

Executive Summary

Assumptions and Acronyms Used

Research Methodology

Cloud Hosting Service Providers Market Overview

Global Cloud Hosting Service Providers Market Analysis and Forecast by Type

Global Cloud Hosting Service Providers Market Analysis and Forecast by Application

Global Cloud Hosting Service Providers Market Analysis and Forecast by Sales Channel

Global Cloud Hosting Service Providers Market Analysis and Forecast by Region

North America Cloud Hosting Service Providers Market Analysis and Forecast

Latin America Cloud Hosting Service Providers Market Analysis and Forecast

Europe Cloud Hosting Service Providers Market Analysis and Forecast

Asia Pacific Cloud Hosting Service Providers Market Analysis and Forecast

Asia Pacific Cloud Hosting Service Providers Market Size and Volume Forecast by Application

Middle East & Africa Cloud Hosting Service Providers Market Analysis and Forecast

Competition Landscape

If you have any doubt regarding the report, please connect with our analyst @ https://industrygrowthinsights.com/enquiry-before-buying/?reportId=173122

About Industry Growth Insights (IGI)

Industry Growth Insights (IGI) has extensive experience in the creation of tailored market research reports in several industry verticals. We cover in-depth market analysis which includes producing creative business strategies for the new entrants and the emerging players of the market. We take care that our every report goes through intensive primary, secondary research, interviews, and consumer surveys. Our company provides market threat analysis, market opportunity analysis, and deep insights into the current and future market scenario.

To provide the utmost quality of the report, we invest in analysts that hold stellar experience in the business domain and have excellent analytical and communication skills. Our dedicated team goes through quarterly training which helps them to acknowledge the latest industry practices and to serve the clients with the foremost consumer experience.

Contact Info: Name: Alex Mathews; Address: 500 East E Street, Ontario, CA 91764, United States; Phone No: USA: +1 909 414 1393; Email: [emailprotected]; Website: https://industrygrowthinsights.com

Continued here:
Cloud Hosting Service Providers Market 2021 Is Booming Across the Globe by Share, Size, Growth, Segments and Forecast to 2027 | Top Players Analysis-...


S/4HANA Cloud extended edition (EX): Buying team overview – TechTarget

To understand whether S/4HANA Cloud extended edition offers the right balance of control and ease, it's important to start with an overview of this ERP system.

Although SAP has been urging its users to migrate to the cloud, it offers cloud, on-premises and hybrid deployment options for S/4HANA. Software flexibility and functionality vary among these. S/4HANA Cloud is the entry-level offering. It's designed for users with relatively simple needs and for subsidiaries of larger enterprises that are running SAP systems. The S/4HANA Cloud extended edition (EX) is one step up in terms of cost, complexity and flexibility.

SAP frequently changes its product offerings' nomenclature and, to that point, previously called S/4HANA Cloud EX the S/4HANA single-tenant edition. As its former name suggests, extended edition is offered on a dedicated cloud landscape.

S/4HANA Cloud EX offers much of the entry-level S/4HANA Cloud version's simplicity and standardization. However, extended edition users have much greater control over their SAP environment because the software offers more customization capabilities and third-party integrations.

The extended edition also has a wider range of user interface options and expanded functionality compared to the multi-tenant version. This comes with a cost, of course. SAP requires a higher user count for EX, and the software's increased complexity makes it more expensive to operate.

S/4HANA Cloud EX is appropriate for midsize and large enterprises that don't want to manage an IT infrastructure but want more options and flexibility than the entry-level version offers.

The extended edition provides the complete range of SAP features, including localization and best practices for 64 countries. It's available in 39 different languages and offers vertical functionality for 25 industries. By contrast, the multi-tenant S/4HANA Cloud ERP functionality is more limited.

Organizations can integrate S/4HANA Cloud EX with SAP SuccessFactors Employee Central, which handles HR; Fieldglass, which handles vendor management; Hybris, which handles e-commerce and CRM; and Ariba Network, which handles procurement and supply chain management.

One of the primary drawbacks of SAP's entry-level S/4HANA Cloud version is its limited customization and enhancement options. The extended edition offers a bit more flexibility but is still limited compared to the more complex "Any Premise" versions.

S/4HANA Cloud EX is likely best for new SAP users who need a bit more flexibility and functionality than the multi-tenant S/4HANA Cloud version can provide, but who are still willing to embrace SAP's standard processes and best practices.

S/4HANA Cloud EX allows ABAP extensions, but modifications that alter SAP's existing functionality are not permitted. As with the basic Cloud version, developers can opt for side-by-side development and create cloud-based apps that run alongside S/4HANA. This approach is likely new for many veteran SAP programmers but can extend S/4HANA functionality to very good effect.
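In practice, the side-by-side pattern described above usually means a cloud app that reads S/4HANA data through its OData services rather than modifying the core system. As a rough, hypothetical sketch (the service path and entity names below are illustrative assumptions, not details from this article), such an app might build an OData query URL like this:

```python
from urllib.parse import urlencode

def build_odata_query(base_url, entity, filters=None, top=None):
    """Build an OData query URL for a side-by-side app reading
    S/4HANA data. Only standard OData system query options
    ($filter, $top) are used; the service and entity are examples."""
    params = {}
    if filters:
        params["$filter"] = filters
    if top is not None:
        params["$top"] = str(top)
    query = "?" + urlencode(params) if params else ""
    return f"{base_url}/{entity}{query}"

# Hypothetical host and service path for illustration only.
url = build_odata_query(
    "https://my-s4-system.example.com/sap/opu/odata/sap/API_SALES_ORDER_SRV",
    "A_SalesOrder",
    filters="SalesOrganization eq '1010'",
    top=5,
)
```

The point of the pattern is that the extension app owns this read-only integration surface, so core S/4HANA upgrades don't break custom logic.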

Power users can take advantage of the product's in-app extensibility features to create new fields and add business logic, among other functions.

Like the entry-level product, S/4HANA Cloud EX supports the use of partner-developed content, but partners must add those configurations manually.

Although some of the more complex S/4HANA configurations offer a perpetual license option, S/4HANA Cloud EX is available only on a subscription basis, as a SaaS product.

Hosting and infrastructure management are included in S/4HANA Cloud EX's monthly fees, and companies usually sign up for multiyear contracts.

S/4HANA Cloud EX users have a few new cloud platform options that aren't available for multi-tenant users. Although SAP manages the environment and it runs on the SAP Cloud Platform (SCP), one of the SAP-approved hyperscalers -- such as Microsoft Azure, AWS or the Google Cloud Platform -- may host the environment.

Unlike the multi-tenant version, the S/4HANA Cloud EX ERP option provides a dedicated environment, so organizations don't share with other users. However, the environment is still highly standardized. It offers a common set of infrastructure processes, services and service-level agreements for every user running this version of S/4HANA. Because of that standardization, S/4HANA Cloud EX users have less flexibility and fewer extensibility options than those using the more complex variations of SAP's latest ERP product.

S/4HANA Cloud is limited to new -- that is, greenfield -- implementations. Companies performing a direct migration from an older SAP version will need to select one of S/4HANA's higher-end versions. Users planning a move from SAP R/3 will need to implement a new system from scratch.

Many of SAP's existing users will likely be dissatisfied with the extended edition because it's built for standardization.

S/4HANA Cloud EX's upgrades offer a bit more flexibility than the basic Cloud version. Upgrades occur twice a year, and users have some control over their timing. For many companies, this offers the best of both worlds: They get fast access to new innovations, but they can also postpone the potential disruptions associated with an upgrade process. Companies facing high-volume seasonal fluctuations or other events will likely value that flexibility.


Extended edition users may be advised to rework their existing business processes to fit S/4HANA's recommended approach rather than tailoring the software. In this respect, EX retains much of S/4HANA Cloud's high degree of standardization.

Continued here:
S/4HANA Cloud extended edition (EX): Buying team overview - TechTarget
