
China In-Vehicle Payment Market Research Report 2023: Multimodal Interaction in Your Car – The Next Wave of Secure In-Vehicle Payments – Yahoo Finance


Dublin, Aug. 03, 2023 (GLOBE NEWSWIRE) -- The "China In-Vehicle Payment Market Research Report, 2023" report has been added to ResearchAndMarkets.com's offering.

This comprehensive analysis delves into the current state of China's in-vehicle payment market, examining its industry chain components, original equipment manufacturer (OEM) and payment platform layouts, consumer surveys, and development trends.

With the increasing demand for in-vehicle payment solutions, this report explores the rising popularity of this technology that allows for payment through in-vehicle communication and the In-Vehicle Infotainment (IVI) system. In-vehicle payment offers car owners the convenience of paying for various services, such as parking, refueling, food ordering, and shopping, all without leaving their vehicles, resulting in a more seamless and enhanced user experience.

Despite its currently low adoption rate, the survey reveals a high willingness among consumers to explore and use in-car payment functionality. The report uncovers the emerging scenarios where users already employ in-car payment, such as parking, highway tolls and refueling/charging, and anticipates a promising future for this payment solution in China.

The in-vehicle payment industry chain is taking shape.

In terms of supply chain, in-vehicle payment involves two major segments: in-vehicle payment device and in-vehicle payment platform.

In-vehicle payment devices are led by communication devices (SIM card, communication module and T-Box), interaction devices (touch/voice/face/gesture/fingerprint interaction), and authentication devices (security chip); in-vehicle payment platforms primarily comprise the cloud platform, payment platform, IVI system, ecosystem service platform, ecosystem service provider, and OEM.

As companies in each segment of the industry chain have built out their positions in recent years, the in-vehicle payment market has kept growing, showing the following two major features.

In-vehicle payment is available in more scenarios.

Foreign automakers including BMW, Mercedes-Benz, Honda and Hyundai, and Chinese automakers such as Great Wall Motor, Xpeng Motors, Geely, Chery and AITO have launched their in-car payment function. They have widely deployed this function in parking, refueling/charging and food ordering scenarios, and are also applying it on a small scale in car wash/maintenance/repair services, feature subscription, ticket booking and other scenarios.

For example, in October 2022, BMW added the BMW ConnectedDrive Store to its IVI system via OTA updates. It enables in-car payment through the IVI system for subscriptions to 13 features, such as front seat heating, steering wheel heating and CarPlay.

Multimodal interaction is being added to in-vehicle payment.

At present, the most common in-car payment methods are scan-to-pay and password-free payment. As in-car multimodal interaction technology improves, face recognition, fingerprint recognition and voice recognition are becoming the new in-car payment interaction and authentication methods.

For example, Mercedes-Benz has added fingerprint recognition and authentication to its latest in-car payment system PAY+; Chery EXEED TX/TXL supports face verification payment, a function allowing users to pay for parking fees or shopping through face recognition. The addition of multimodal interaction makes in-vehicle payment more secure and convenient.

The ecosystem is a key factor affecting in-car payment.

In the mobile payment ecosystem, millions of iOS and Android developers have built a rich variety of applications, meeting consumers' living, work and entertainment needs and making the smartphone an indispensable terminal in users' lives.

In the in-car payment system, financial institutions like China UnionPay and VISA have developed a series of in-car payment systems; Alipay, Banma Zhixing and Huawei among others have built a variety of vehicle ecosystem platforms and launched a range of in-car services covering parking, refueling, travel, shopping and other scenarios.

Compared with mobile payment, the in-vehicle payment ecosystem is still weak at this stage, only meeting the payment needs in specific scenarios. With the development of intelligent cockpit and high-level autonomous driving, drivers will be freed from driving tasks in specific scenarios and pay more attention to other in-car needs. At this time, creating an in-car living space and building a closed-loop ecosystem with payment as the entrance will become a big demand.


In-vehicle Payment Summary and Trends

Summary of Telematics Development

Development Path of In-vehicle Payment

SWOT Analysis of In-vehicle Payment

Mobile Payment Habit Formation

Technological Environment for In-vehicle Payment

There Will Be Bigger Space to Imagine In-vehicle Payment Data Mining and Application

Of the users who have used in-car payment:

Up to 78.9% use in-car payment for parking;

42.1% use in-car payment for highway tolls;

In-vehicle payment is also often used to pay for refueling/charging fees (31.6%), IVI traffic and APP membership (31.6%), feature subscription (21.1%), car maintenance/repair/wash (15.8%), and car insurance (10.5%);

Fewer users use this function in the scenarios of online food ordering and dining (5.3%) and travel (5.3%).

Key Topics Covered:

1 Overview of In-vehicle Payment
1.1 Development History of In-vehicle Payment
1.2 Application Scenarios of In-vehicle Payment
1.3 In-vehicle Payment System Flow
1.4 Mainstream In-vehicle Payment Methods
1.5 In-vehicle Payment Industry Chain
1.6 In-vehicle Payment Chip
1.7 In-vehicle Payment Platform
1.8 In-vehicle Payment Ecosystem
1.9 In-vehicle Payment Business Layout of OEMs
1.10 In-vehicle Payment Patents
1.10.1 In-vehicle Payment Patent Map
1.10.2 In-vehicle Payment Patent Layout of OEMs
1.10.3 In-vehicle Payment Patent Layout of Suppliers
1.10.4 In-vehicle Payment Patent Layout of Ecosystem Companies

2 In-vehicle Payment Consumers
2.1 Overview of In-vehicle Payment Survey
2.2 In-vehicle Payment Usage and Willingness to Use
2.3 Frequent Usage Scenarios of In-vehicle Payment
2.4 Users' Satisfaction with In-vehicle Payment
2.5 Expected Scenarios of In-vehicle Payment
2.6 Differences between Actual and Expected Scenarios of In-vehicle Payment
2.7 Reasons for Using In-vehicle Payment
2.8 Concerns about In-vehicle Payment
2.9 In-vehicle Payment Interaction Modes and Payment Method Preferences

3 In-vehicle Payment Layout of OEMs
3.1 BMW
3.2 Mercedes-Benz
3.3 Honda
3.4 Hyundai
3.5 Renault Samsung Motors
3.6 Jaguar Land Rover
3.7 Ford
3.8 Great Wall Motor
3.9 Xpeng Motors
3.10 Geely
3.11 Chery
3.12 AITO
3.13 SAIC Volkswagen
3.14 SAIC ROEWE
3.15 Other OEMs
3.15.1 Human Horizons' Layout of In-vehicle Payment Application Scenarios
3.15.2 GAC's In-vehicle Payment Patent Filings
3.15.3 Xiaomi's In-vehicle Payment Patent Filings

4 In-vehicle Payment Platforms
4.1 VISA
4.2 China UnionPay
4.3 Alipay
4.4 Huawei
4.5 Other In-vehicle Payment Platforms
4.5.1 Xevo
4.5.2 IPS Group
4.5.3 ZF
4.5.4 DABCO

For more information about this report visit https://www.researchandmarkets.com/r/ovqs9s

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.


Artificial intelligence threats in identity management – Security Intelligence

The 2023 Identity Security Threat Landscape Report from CyberArk surfaced some valuable insights. The 2,300 security professionals surveyed responded with some sobering figures.

Additionally, many feel digital identity proliferation is on the rise and the attack surface is at risk from artificial intelligence (AI) attacks, credential attacks and double extortion. For now, let's focus on digital identity proliferation and AI-powered attacks.

For some time now, digital identities have been considered a potential solution to improve cybersecurity and reduce data loss. The general thinking goes like this: Every individual has unique markers, ranging from biometric signatures to behavioral actions. This means digitizing and associating these markers to an individual should minimize authorization and authentication risks.

Loosely, it is a "trust and verify" model.

But what if the trust is no longer reliable? What if, instead, something fake is verified: something that should never have been trusted in the first place? Where is the risk analysis happening to remedy this situation?

The hard sell on digital identities has, in part, come from a potentially skewed view of the technology world: namely, the assumption that information security technology and malicious actor tactics, techniques and procedures (TTPs) change at a similar rate. Reality tells us otherwise: TTPs, especially with the assistance of AI, are blasting right past security controls.

You see, a hallmark of AI-enabled attacks is that the AI can learn about the IT estate faster than humans can. As a result, both technical and social engineering attacks can be tailored to an environment and individual. Imagine, for example, spearphishing campaigns based on large data sets (e.g., your social media posts, data that has been scraped off the internet about you, public surveillance systems, etc.). This is the road we are on.

Digital identities may have had a chance to successfully operate in a non-AI world, where they could be inherently trusted. But in the AI-driven world, digital identities are having their trust effectively wiped away, turning them into something that should be inherently untrustworthy.

Trust needs to be rebuilt, as a road where nothing is trusted only logically leads to one place: total surveillance.

Identity verification solutions have become quite powerful. They improve access request time, manage billions of login attempts and, of course, use AI. But in principle, verification solutions rely on a constant: trusting the identity to be real.

The AI world changes that by turning identity trust into a variable.

Assume the following to be true: We are relatively early into the AI journey but moving fast. Large language models can replace human interactions and conduct malware analysis to write new malicious code. Artistry can be performed at scale, and filters can make a screeching voice sound like a professional singer. Deep fakes, in both voice and visual representations, have moved from "blatantly fake" territory into "wait a minute, is this real?" territory. Thankfully, careful analysis still permits us the ability to distinguish the two.

There is another hallmark of AI-enabled attacks: machine learning capabilities. They will get faster, better and ultimately prone to manipulation. Remember, it is not the algorithm that has a bias, but the programmer inputting their inherent bias into the algorithm. Therefore, with open source and commercial AI technology availability on the rise, how long can we maintain the ability to distinguish between real and fake?

Think of the powerful monitoring technologies available today. Biometrics, personal nuances (walking patterns, facial expression, voice inflections, etc.), body temperatures, social habits, communication trends and everything else that makes you unique can be captured, much of it by stealth. Now, overlay increasing computational power, data transfer speeds and memory capacity.

Finally, add in an AI-driven world, one where malicious actors can access large databases and perform sophisticated data mining. The delta to create a convincing digital replica shrinks. Paradoxically, as we create more data about ourselves for security measures, we grow our digital risk profile.

Imagine our security as a dam and data as water. To date, we have leveraged data for mostly good means (e.g., water harnessed for hydroelectricity). There are some maintenance issues (e.g., attackers, data leaks, bad maintenance) that are mostly manageable thus far, if exhausting.

But what if the dam fills at a rate faster than the infrastructure was designed to manage and hold? The dam fails. Using this analogy, the play is then to either divert excess water and reinforce the dam, or limit data and rebuild trust.

What are some methods to achieve this?

In closing, risk must be taken to realize future rewards. Risk-free is for fantasy books. Therefore, in the age of a glut of data, the biggest risk may be to generate and hold less data. The reward? Minimized impact from data loss, allowing you to bend while others break.

Senior Director, Educator and Author


Arvato partners with KYP.ai to enhance digital transformation … – Directors Club News

Arvato CRM Solutions has partnered with KYP.ai, a real-time productivity, optimisation and process-mining platform, to continue driving its digital transformation and AI offering.

The strategic partnership will enable Arvato to enhance its already established artificial intelligence and RPA (robotic process automation) solutions, providing clients with further scope for digital transformation through AI-powered intelligent automation.

Arvato CRM Solutions has expanded its product portfolio vastly in the last few years, with multiple solutions available to clients, including RPA, automation, AI, business process outsourcing, and digital road mapping. This partnership offers further benefits to its clients, as well as a brand-new product: ADE, Arvato's Discovery Engine.

Utilising KYP.ai's unique productivity mining platform, ADE can identify opportunities, potential savings, utilisation gaps, and areas for automation. The product aims to improve customer service and experience by understanding how people and processes can work better together. In turn, this will enable Arvato to make more informed decisions and recommendations to its clients, increasing productivity based on real-life data.

Working closely with a leading luxury automotive client, Arvato CRM Solutions has already implemented a customer service pilot campaign. KYP.ai enabled the team to reach its objectives of increasing the number of cases worked on, as well as increasing accuracy and efficiency, delivering better visibility and insight into team activity, and improving ease of use for the team.

Over the pilot campaign, several key results were identified. One of the main outcomes was the discovery of a 19.4% hidden utilisation gap. Solutions such as knowledge pooling and creating a streamlined knowledge repository would allow for a reduction in escalations and in time spent clarifying. Opportunities identified included utilising AI-powered transcription and content capturing tools.

Aside from the business benefits this new tool provides, it's an ideal platform for identifying behaviour changes and well-being issues among employees. Long hours, minimal breaks, and other factors can be highlighted, allowing managers or supervisors to quickly identify issues and put preventative measures in place against burnout or quiet quitting.

Henry Ellender, Head of Sales at KYP.ai, commented: "We're ecstatic to develop this partnership with Arvato. By combining KYP.ai's advanced productivity mining capabilities with Arvato's outstanding customer relationship management experience and innovative technological solutions, we're excited to see how this partnership can further enhance their expertise."

James Towner, Chief Growth Officer at Arvato CRM Solutions, commented: "This partnership brings a fantastic opportunity for growth within our AI capabilities. More importantly, it ensures that we can help our clients identify areas for improvement and implement effective actions for their customer service or back-office teams.

"Our focus is always on ensuring our teams are empowered by technology, through best practices, actions, and procedures. This allows for better decision-making, better customer service, and a better work-life balance for employees."

Debra Maxwell, CEO at Arvato CRM Solutions, added: "Having the ability to provide this digital transformation platform to our clients is paramount. It puts the customer and the employee at the forefront, utilising data mining effectively to help automate relevant tasks.

"It also highlights the value that an innovation-led, digital approach to customer experience can deliver for our clients, both within the public and private sectors."

With digital transformation at the core, Arvato's new AI/IA platform provides its clients with the ability to influence and extract process insights, implement automation, and help their employees excel.

About Arvato CRM Solutions

Arvato CRM Solutions is a trusted partner to the private and public sectors, with expertise in delivering award-winning customer relationship management, business process outsourcing (BPO) and public sector and citizen services.

The business focuses on providing customer service which is driven by technology and powered by its people. It designs and delivers innovative, individual solutions for some of the most respected global consumer brands and UK public sector organisations, through long-term partnerships.

A division of Bertelsmann, Arvato CRM Solutions employs approximately 1,500 people across five UK locations.

For more information, visit: http://www.arvato.co.uk

About KYP.ai

KYP.ai is a productivity mining company fuelling digital change. It enables clients to rapidly understand their abstract processes and the complex interaction between people and technology. KYP.ai's automatically generated, data-driven improvement recommendations aim at delivering the fastest possible transformation ROI. KYP.ai's algorithms help accelerate the implementation of digitally augmented processes, combining unique human impact with machine-driven outcomes.

Find out more at http://www.kyp.ai


From the empirical to the molecular: Understanding biology and … – Open Access Government

The German-American physiologist and experimental biologist Jacques Loeb (1859-1924) was one of the most vigorous promoters of biology as an experimental science in the 19th century. Influenced by the physicist and philosopher Ernst Mach, he at first pursued the goal of engineering life by devising techniques to experimentally control animals' life functions.

In Mach's positivist-empiricist approach to epistemology, understanding life meant controlling life phenomena by physical and chemical means. Loeb became most famous for his success in bringing about artificial parthenogenesis in sea urchins (1899).

Due mainly to new developments in biochemistry and genetics around 1900, which pointed to the crucial role of macromolecules, in particular proteins and DNA (then nuclein) in biology, Loeb abandoned the empiricist-phenomenological approach and the aim of controlling life by purely empirical means. He increasingly focused on the search for molecular mechanisms and causes. He now promoted the view that life was based on the interaction of specific macromolecules.

The principles of biological specificity residing in protein diversity, and of genetic causality based on the nuclein in chromosomes, were crucial for understanding life. Loeb's vision, and the causal-mechanistic approach to which he significantly contributed at an early stage, became the foundation of molecular biology and are also the basis for research in synthetic biology.

Molecular biology, the search for the molecular understanding of the basic structures and functions of life, such as heredity, development, biological information, and also of processes such as evolution and problems such as diseases and their cures, was the most successful branch of 20th-century biology.

This search for molecular mechanisms of life was rejected at the time by vitalists, nature philosophers, morphologists, and positivists/empiricists such as Mach. Today, molecular biology is challenged not by philosophical currents but by another empiricist scientific movement: the big data revolution in genomics. In contrast to the 19th-century positivism/empiricism, which was directed against metaphysical speculation and religion and, in the case of Loeb, was a strategy to fight mysticism and superstition in science, the 21st-century empiricism resulted from the development of new technologies in DNA sequencing and computation. What both have in common is the marginalization or rejection of causal-mechanistic examination and explanation of biological phenomena.

In biology, the existence of large amounts of sequencing and gene expression data and powerful computational methodology tremendously facilitates systems approaches and pattern recognition in many fields of research. But data-driven correlation is also used to replace experimentation and causal analysis.

Todd Golub, the director of the Broad Institute of MIT and Harvard, promotes unbiased genomic surveys that are taking cancer therapeutics, for example, in directions that could never have been predicted by traditional molecular biology. According to him, the large-scale, data-harvesting approach to biological research has significant advantages over conventional, experimental methods (Golub 2010). While many genomics institutes are still pursuing the analysis of molecular mechanisms, the trend towards data mining is also rising, especially among young researchers.

This new empiricist tendency disregards that science is more than statistics, correlations, or pattern recognition. For a complex science like biology, knowledge of mechanisms is crucial to answering questions about issues such as the causal role of genes and genetic variation in development or the effects of perturbations or diseases on an organism. Science aims at causal explanations of normal functions in the cell and also of deviations such as diseases.

Genomicist Edison Liu perceives great danger when experiments and hypotheses are abandoned in favour of big data: "It is fallacious to believe, especially in the complexity of the human body and disease, that you can make consistent predictions simply on data. The term big data is relative and too liberally used: how big is big, and when is data big enough to have confidence in the predictions?

"In biology, there's usually not enough data.

"The other aspect is that we know only what we know. If you had talked to us 25-30 years ago, the argument was: if I knew every single gene element and promoter, I would be able to predict you as a human being. Well, I'm sorry, that doesn't happen. It doesn't happen because what we thought of as the universe of known information is only a small fraction of reality. We now know the complexity of splice variance, the complexity of alternative promoters, the complexity of post-translational modification, the complexity of gene-gene interactions, the complexity of eQTLs (genomic loci that explain variation in expression levels of mRNAs), where distant enhancer sites affect a gene megabases away. This is all new information.


"So, if we were simply to model on old data that we considered was the totality of the biological universe, our predictions would have been mainly wrong. This is why I think the idea that big data in medicine is going to supplant experimentation is not only unreal, it's absolutely dangerous. In fact, I'm really fearful that we're going to fall into the trap of the Dark Ages." (Liu 2022)



Physics Wallah enters UG education, launches 4-year programme in Computer Science and AI – BusinessLine

Edtech unicorn Physics Wallah has launched an undergraduate residential engineering programme, as the company looks to double down on the upskilling segment.

The company has launched the PW Institute of Innovation (PW IOI), with a four-year residential programme in Computer Science and Artificial Intelligence (AI). It will begin in September in Bengaluru with a batch of about 100 students, Vishwa Mohan, Chief Investment Officer of Physics Wallah and President of the institute, told BusinessLine.

ALSO READ | Physics Wallah launches MedEd for NEET PG/NExT exam prep

Physics Wallah has partnered with universities to offer undergraduate degrees, Mohan said, without giving many details. Students will be eligible to pursue a bachelor's degree in parallel with the partner premier institute, said Mohan.

"What we essentially want to do is have experienced faculty and industry leaders, and an industry-oriented curriculum, which will help students with relevant skills. By the time you come out of IOI, you will be ready, and from day one you can be deployed in the industry," he added.

The course is priced at ₹1.5 lakh per year, amounting to ₹6 lakh for the course, and ₹1.25 lakh per year for accommodation. The institute's placement cell will offer assistance in securing job offers, with its partners including Tata IQ, Siemens, LeadSquared, SAP, Oracle, KPMG, Amazon and others.

Previously, PW Skills has placed more than 1,500 learners, with an average salary of ₹22 lakh per annum and a top salary of ₹50 lakh per annum.

Physics Wallah forayed into the upskilling segment after it acquired iNeuron.ai, which was later spun off as PhysicsWallah Skills. The company did not disclose how much it plans to invest in the institute.

Founded in 2020 by Alakh Pandey and Prateek Maheshwari, Physics Wallah had raised $100 million from WestBridge Capital and GSV Ventures at a valuation of $1.1 billion in June last year.


Bures and Sjöqvist metrics over thermal state manifolds for spin qubits and superconducting flux qubits – Phys.org


by Colleges of Nanoscale Science and Engineering


Dr. Carlo Cafaro, SUNY Poly faculty in the Department of Mathematics and Physics, has collaborated with Dr. Paul M. Alsing, Principal Research Physicist at the Air Force Research Laboratory in Rome, NY, on work published in The European Physical Journal Plus.

The tutorial paper, titled "Bures and Sjöqvist Metrics over Thermal State Manifolds for Spin Qubits and Superconducting Flux Qubits," in which Cafaro is lead author, is a useful and relatively simple theoretical piece of work. It combines concepts of quantum physics with elements of differential geometry to clarify in simple terms the differences between two important metrics for mixed quantum states of great use in quantum information science.

The interplay among differential geometry, statistical physics, and quantum information science has been increasingly gaining theoretical interest in recent years.

In this paper, Cafaro and Alsing present an explicit analysis of the Bures and Sjöqvist metrics over the manifolds of thermal states for specific spin qubit and superconducting flux qubit Hamiltonian models. While the two metrics both reduce to the Fubini-Study metric in the asymptotic limit of the inverse temperature approaching infinity for both Hamiltonian models, they observe that the two metrics are generally different when departing from the zero-temperature limit.
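For orientation, the article does not reproduce the definitions, but the Bures distance is standardly constructed from the Uhlmann fidelity between density matrices. The formulas below are the textbook definitions from the quantum information literature, not expressions taken from the paper under discussion:

```latex
% Uhlmann fidelity between density matrices \rho and \sigma
F(\rho,\sigma) = \left[\operatorname{Tr}\sqrt{\sqrt{\rho}\,\sigma\,\sqrt{\rho}}\right]^{2}

% Bures distance induced by the fidelity
d_{B}^{2}(\rho,\sigma) = 2\left(1-\sqrt{F(\rho,\sigma)}\right)
```

For pure states the fidelity reduces to the squared overlap of the state vectors, and the metric induced on the state manifold coincides with the Fubini-Study metric, which is consistent with the zero-temperature (infinite inverse temperature) limit the authors describe.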

Cafaro and Alsing discuss this discrepancy in the case of the superconducting flux Hamiltonian model.

They conclude that the two metrics differ in the presence of non-classical behavior specified by the noncommutativity of neighboring mixed quantum states. Such noncommutativity, in turn, is quantified by the two metrics in different manners. Finally, Cafaro and Alsing briefly discuss possible observable consequences of this discrepancy between the two metrics when using them to predict critical and/or complex behavior of physical systems of interest in quantum information science.

More information: Carlo Cafaro et al, Bures and Sjöqvist metrics over thermal state manifolds for spin qubits and superconducting flux qubits, The European Physical Journal Plus (2023). DOI: 10.1140/epjp/s13360-023-04267-9

Provided by Colleges of Nanoscale Science and Engineering


The lost women of early analytic philosophy – Aeon

A couple of years ago, the library of the University of Groningen in the Netherlands was subject to a massive reclassification. Hundreds of books were provisionally placed higgledy-piggledy on the shelves, atlases leaning against poetry collections, folios of sheet music wedged between a tome on malaria treatments and a study of birds in the Arctic. In the midst of this jumble, one of us was preparing the valedictory lecture that would mark her official retirement as professor of philosophy.

After two hours of thinking and writing, it was time for a break and a leisurely look at the miscellany of intellectual effort on the shelves. A bright blue book drew attention. It was the fourth volume (the rest were nowhere to be seen) of A History of Women Philosophers (1995), edited by Mary Ellen Waithe, which deals with female philosophers in the 20th century. Upon inspection, it contained not only essays on thinkers such as Simone de Beauvoir and Hannah Arendt, but also a chapter on a completely unknown English philosopher, E E Constance Jones (1848-1922). The authors of this chapter, Waithe and Samantha Cicero, argued that Jones had solved Frege's Puzzle two years before Gottlob Frege himself had done so.

Emily Elizabeth Constance Jones (1916) by John Lavery. Courtesy Girton College Cambridge/Wikipedia

This was by all accounts a spectacular claim. Frege, the German mathematician and philosopher born in the same year as Jones, had been the major inspiration for Principia Mathematica, the bible of modern logic that Alfred North Whitehead and Bertrand Russell published between 1910 and 1913. Frege's grand aim was to find a foundation from which the whole of number theory could be derived. In carrying out this project, however, he encountered a philosophical problem. How to account for the fact that an equation like 2 x 2 = 1 + 3 is informative, whereas 4 = 4 is not? It is not just that the symbols on both sides of the identity sign are different. After all, in 7 = VII the symbols on either side of the identity sign differ, but the statement is not informative in the way that 2 x 2 = 1 + 3 is; it simply represents the number seven in two different symbol systems. In later work, Frege used a non-mathematical example to illustrate his problem. Why is the statement 'The morning star is the evening star' informative, whereas 'The morning star is the morning star' is not? Since both 'the morning star' and 'the evening star' refer to the planet Venus, both sentences seem to say nothing more than that Venus is Venus.

Frege solved the problem in his paper 'On Sense and Reference' (1892). He argued that the meaning of a term like 'morning star' is not just its reference (Venus), but also contains another component, the sense, which is the way in which the reference is given to us, in this case as a star that appears in the morning. 'The morning star is the evening star' is informative because the references of 'morning star' and 'evening star' are the same, while their senses are different. In fact, it took the Babylonians quite some time to discover that this star that appears in the morning is the same heavenly body as the star that appears in the evening. 'The morning star is the morning star', on the other hand, is trivially true for the Babylonians as well as for us.
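Frege's distinction can be captured in a small illustrative sketch: model a term as a pair of a sense (its mode of presentation) and a reference (the object it picks out), and count a true identity statement as informative exactly when the references coincide while the senses differ. This is only a toy model of our own devising to make the distinction concrete, not anything found in Frege or Jones:

```python
# Toy model of Frege's sense/reference distinction (illustrative only).
from dataclasses import dataclass

@dataclass(frozen=True)
class Term:
    sense: str      # mode of presentation, e.g. "star visible in the morning"
    reference: str  # the object referred to, e.g. "Venus"

def is_true(a: Term, b: Term) -> bool:
    """An identity statement 'a is b' is true iff the references coincide."""
    return a.reference == b.reference

def is_informative(a: Term, b: Term) -> bool:
    """A true identity is informative when the senses differ."""
    return is_true(a, b) and a.sense != b.sense

morning_star = Term("star visible in the morning", "Venus")
evening_star = Term("star visible in the evening", "Venus")

print(is_true(morning_star, evening_star))         # True: both pick out Venus
print(is_informative(morning_star, evening_star))  # True: distinct senses
print(is_informative(morning_star, morning_star))  # False: trivial 'A is A'
```

On this toy picture, the Babylonians' discovery amounts to learning that two antecedently distinct senses share one reference, which is exactly why the first statement carries information and the second does not.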

Waithe and Cicero discovered that Constance Jones was struggling with a problem similar to that of Frege, for she wanted to know: why is the statement 'A is B' significant while 'A is A' is trivial? Waithe and Cicero argued that in 1890, two years before Frege wrote his classic paper, Jones had published a solution that was basically the same as Frege's.

For any scholar in analytic philosophy, this was breaking news. Both of us have long been teaching the history of analytic philosophy, one of us for more than 30 years. We have taught countless students how, at the University of Cambridge, Bertrand Russell and George Edward Moore revolted against traditional logic and traditional philosophy, thereby founding what became known as analytic philosophy. We have described how, in the 20th century, analytic philosophy branched out in two different directions: a formal one that led to Ludwig Wittgenstein's Tractatus Logico-Philosophicus (1922), the Vienna Circle, and W V Quine's naturalised philosophy; and an informal one consisting of the ordinary language philosophy associated with J L Austin, Gilbert Ryle, and the later work of Wittgenstein. Nowhere did we mention Constance Jones. We simply did not know about her, much less did we suspect that she could have anticipated that crucial building block of analytic philosophy, Frege's distinction between sense and reference.

When we subsequently read Jones's work ourselves, we found that the story is a bit more nuanced than what we had gathered from the chapter by Waithe and Cicero. There are similarities between Jones and Frege, but also some salient differences. It is not just that Jones's approach is simpler than Frege's, dealing only with elementary sentences such as 'A is B'; there are differences that cut much deeper than this. Frege's distinction between sense and reference (in German: Sinn and Bedeutung) does not coincide with Jones's more traditional distinction between what she calls 'determination' and 'denomination', and later 'connotation' and 'denotation', or 'intension' and 'extension'. The extension of the predicate term 'is red', for example, is simply the class of all red things in the world. The Fregean Bedeutung of this term is, however, a concept, more particularly a mathematical function. And while Jones's intensions are properties of real or imagined things, Fregean Sinne (senses) constitute an objective realm separate from any actual or fictional world. (For details on the differences, see the chapter 'E E Constance Jones and the Law of Significant Assertion' by Jeanne Peijnenburg and Maria van der Schaar, forthcoming in the Oxford Handbook of British and American Women Philosophers in the Nineteenth Century, edited by Lydia Moland and Alison Stone.)

By their choices, they influence our ideas about who are and who are not important philosophers

None of this alters the fact that Jones was completely forgotten, even though she had been a very active and respected member of the philosophical community. From 1884 to 1916, Jones taught Moral Sciences at Girton, the first residential college for female students in the UK, where she became Vice-Mistress and later Mistress. Her specialisation was logic: she wrote four books on the subject and many articles in leading philosophical journals such as Mind and Proceedings of the Aristotelian Society. Although her work is firmly rooted in the old Aristotelian syllogistics, it is in some respects surprisingly modern. At a time when logic was generally seen as being about subjective laws of thought, Jones anticipated later developments by staunchly asserting that logic was objective. Moreover, her problem-driven approach and remarkably clear style make her work different from the florid prose of some of her contemporaries and more akin to the later analytic tradition. In 1892, she became a member of the Aristotelian Society. Four years later, she was the first woman to address the Cambridge Moral Sciences Club, and established philosophers such as F C S Schiller, W E Johnson and Bernard Bosanquet engaged in public discussions of her work.

Then why was she forgotten? The history of 20th-century philosophy is largely shaped by handbooks, textbooks, companions or anthologies. By the choices they make, by the texts they rely on, historians, editors and educators influence our ideas about who are and who are not important philosophers. Jones's name is not in the handbooks. Why not? Perhaps it was due to the supremacy of modern mathematical logic, which reduced the old Aristotelian logic that Jones uses to a mere special case. The fact that Russell was personally exasperated by Jones and her Victorian mindset, describing her in a letter to Ottoline Morrell as 'motherly and prissy', may not have helped either. But, whatever the precise causes, Jones does not deserve to be consigned to oblivion.

The case of Constance Jones is one of what we may call historiographical marginalisation: although she was a prolific and respected writer during her lifetime, her work never entered the canon because historians and textbook authors for some reason chose not to include it in their overviews. There are also cases where the marginalisation is historical: a philosopher's significance is insufficiently recognised by her contemporaries. An example of historical marginalisation is the reception of work by the German philosopher, physicist and mathematician Grete Hermann (1901-84). After the dawn of quantum mechanics at the beginning of the 20th century, physicists and philosophers were baffled by its spectacular empirical successes. How could an essentially indeterministic and counterintuitive theory be so effective? Was the world really that weird? Following Albert Einstein, many people suspected the existence of hidden variables that, once discovered, would reveal that quantum mechanics was deterministic after all. Their hopes were dashed in 1932, when the mathematician John von Neumann seemingly proved that any theory about hidden variables is incompatible with quantum mechanics. The quantum mechanical structure, he argued, is such that it simply does not allow the addition of variables that would enable us to identify deterministic causes, on pain of becoming inconsistent.

But he had a challenger. In a paper of 1935, Hermann showed that von Neumann's argument was flawed. The source of the difficulty is an assumption he makes about a sum of noncommuting operators. Von Neumann was right that this assumption holds in quantum mechanics, but he failed to see that it may well be false in an extended theory, encompassing both quantum mechanics and the new or hidden variables. Hermann explained that this failure made his proof essentially circular. Her voice, however, was not heard. Thirty years later, the Irish physicist John Bell independently criticised von Neumann on similar grounds, and the subsequent experimental check of his findings earned Alain Aspect, John Clauser and Anton Zeilinger the Nobel Prize in 2022.

The causes of marginalisation are strong and manifold, ranging from the political, social, cultural or even personal

Although Hermann's argument against von Neumann was mentioned by Max Jammer in his standard work The Philosophy of Quantum Mechanics (1974), and by David Mermin in a paper of 1993, it received little attention at the time. This changed in 2016, when Guido Bacciagaluppi and Elise Crull discovered an unpublished manuscript by Hermann in the archives of the English theoretical physicist Paul Dirac. As it turned out, in 1933, one year after von Neumann's book, Hermann had sent a paper of 25 pages to Dirac, explaining the flaw in von Neumann's argument. Dirac never responded. It is, however, no exaggeration to say that the history of 20th-century physics would have been different if he had, and if the papers by Hermann had been noted earlier.

Historical and historiographical marginalisation occur in all times and places: they arise in the arts, in the sciences, and in all corners of philosophy. While generally lacking justification, the causes of marginalisation are strong and manifold, ranging from the political and social to the cultural and even the personal. More women than men were affected by it, and the history of analytic philosophy is in this respect no exception.

In our recent book Women in the History of Analytic Philosophy (2022), we collected the metadata of articles published in main outlets for analytic philosophers in the first half of the 20th century. In particular, we looked at all the 3,288 articles that appeared in six philosophy journals between 1896 and 1960: Mind, The Monist, Erkenntnis, Analysis, Journal of Symbolic Logic, and Philosophical Studies. In 99.6 per cent of the cases, that is, in 3,274 articles, we were able to identify the gender of the authors. We found that, on average, only 4 per cent of these 3,274 articles were authored by women. Most of these women, 70 in number, are presently forgotten, as is illustrated by recent meetings of the Society for the Study of the History of Analytical Philosophy. Only four of the 246 papers presented at meetings of this society in the period 2015 to 2019 were about female philosophers: less than 2 per cent.
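The percentages reported above follow directly from the raw counts quoted in the text. As a quick check (in Python purely for concreteness; the counts are the survey's own, not recomputed from the journals):

```python
# Back-of-the-envelope check of the survey figures quoted above.
# All counts are taken verbatim from the text.
total_articles = 3288       # articles in the six journals, 1896-1960
gender_identified = 3274    # articles whose author's gender could be identified
papers_on_women = 4         # SSHAP papers on female philosophers, 2015-2019
papers_total = 246          # all SSHAP papers presented in that period

print(round(gender_identified / total_articles * 100, 1))  # 99.6 per cent
print(round(papers_on_women / papers_total * 100, 1))      # 1.6, i.e. less than 2 per cent
```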

In practice, it is often hard to separate historical and historiographical marginalisation, for they typically go hand in hand. If work by female authors is not much read or cited by contemporaries, historians will be disinclined to include it in their textbooks. And if these female philosophers' views are not discussed in textbooks, anthologies or introductions, they are less likely to be studied by the next generation of philosophers.

Susanne K Langer photographed by Richard Avedon. Courtesy the Smithsonian National Museum of American History

A prominent example of the interplay between the two types of marginalisation is the reception of work by Susanne K Langer (1895-1985), one of the first to use the term 'analytic philosophy' in print. Langer was an American logician and a student of Whitehead, the co-author of the aforementioned Principia Mathematica. Whitehead had worked at the University of Cambridge in the UK his entire career but had taken up a position at Harvard University in Massachusetts in his 60s. This move greatly stimulated the dissemination of logical analysis in US philosophy, and Langer was among the most active proponents of the new approach. In 1964, she recalled having been part of a small group of students who looked forward to 'a new philosophical era, that was to grow from logic and semantics'. After completing her PhD, Langer actively contributed to the spread of the new analytic philosophy. She published a number of papers on Principia Mathematica, wrote one of the first American logic textbooks, and co-founded the Association for Symbolic Logic, the first international society for logicians.

Langer's book sold more than half a million copies and was cited in the academic literature c.10,000 times

In the beginning, Langer's work was much respected by her colleagues. Her first books and papers were frequently discussed by analytic philosophers, both in print and in private discussion groups. Members of the celebrated Vienna Circle studied her work in the early 1930s and saw her as one of the major representatives of the analytic approach in the US. (For details, see the chapter 'Susanne Langer and the American Development of Analytic Philosophy' by Sander Verhaegh in our book.)

Then, Langer published what would become her most influential work: Philosophy in a New Key (1942). It sold more than half a million copies and has been cited in the academic literature almost 10,000 times. The book is a plea to expand the scope of logical analysis. Until then, analytic philosophers had used the new logic to analyse science, philosophy and language in general. But Langer suggested applying it to a broader range of phenomena: abstract paintings, sculptures, symphonies, rituals, dreams and myths. All these things, Langer argued, are complex symbols with an internal structure and are therefore suitable subjects for logical analysis. Much as we can investigate the logical form of propositions such as 2 x 2 = 1 + 3 and 'The morning star is the evening star', we can analyse the logical structure of J S Bach's Air on the G String and Piet Mondrian's Composition with Red, Blue, and Yellow.

In the years that Philosophy in a New Key went through reprint after reprint, Langer's work began to be ignored by her former analytic companions. In advocating the study of art, myths and rituals, Langer had proposed research topics that many analytic philosophers relegated to the realm of the irrational. While her colleagues were reconstructing the foundations of probability, arithmetic and quantum mechanics, Langer was studying subjects that were taken to be expressions of emotions and feelings. As a result, there was hardly any discussion of her book within the analytic community, despite her rising fame outside it. Even analytic colleagues who were demonstrably influenced by her book, such as Quine, failed to cite it. By the time that analytic philosophers started to compile anthologies and took the first steps towards documenting the history of their own discipline in the late 1940s, Langer's work was pushed into the background: it was not mentioned, not even her contributions to the development of logic and analysis in the first phase of her career. Today, Langer is well-known among philosophers of art, but her role in analytic philosophy has been forgotten.

In recent years, quite a lot of attention has been given to the ways in which sociopolitical and other external factors shaped the development of analytic philosophy. Were it not for the grim political situation in the 1930s, members of the Vienna Circle would not have immigrated en masse to England and the US. And were it not for the amenable climate at US universities, where rigour and clarity had become key virtues across the humanities and social sciences, their logical positivism would not so quickly have caught on. Even demographic factors played a role. When the first baby boomers started to enter college, in the 1960s and 70s, many departments had turned analytic, and profited from the explosive growth of higher education, creating more and more jobs for analytically minded philosophers.

Textbooks on analytic philosophy tend to present its development as a more-or-less continuous line, where key figures respond to one another: Russell reacting to Frege, Wittgenstein and Rudolf Carnap to Russell, Quine to Carnap, and so on. This way of telling the history has been very effective: it is no exception to find that, at a conference on the history of analytic philosophy, more than half of the papers are about Frege, Russell, Wittgenstein or Carnap. But the actual spread and growth of analytic philosophy is of course richer, more varied and more complex than is suggested by the stylised and regimented narratives that authors of textbooks are necessarily bound to relate. Like the development of any other historical movement, the development of analytic philosophy is full of interesting details that not only fail to match, but even contradict and undermine the general textbook outline. Had scholars given these details more attention, we might have enjoyed a broader and intellectually more diverse canon. For then we might have seen that the development of analytic philosophy was not only driven by purely philosophical arguments, but also by political, sociological and cultural circumstances, some of which made it difficult for particular academics, such as women, to be heard.

Historians can play a role in correcting the omissions, oversights and downright mistakes of our predecessors

We are not suggesting that a broader recognition of the consequences of historical and historiographical marginalisation will lead to a completely novel canon or a radically new history of the tradition. What happened happened: we cannot go back in time and undo the processes that pushed female philosophers into the periphery. We will have to deal with the facts, even if we do not like them and believe they were preventable. It is a fact that only a small percentage of the publications in analytic philosophy were written by women. And it is also a fact that most of them were junior academics and therefore relatively young. Even if women were allowed to get a degree and were able to make it to the vanguard in a male-dominated intellectual climate, they often stopped publishing when they got married. This is why the 70 female authors we identified were responsible for just 131 publications in the journals we investigated, less than two articles per person on average. Only a very small number of women, such as Jones and Langer, had the time and the opportunity to build a comprehensive philosophical research programme.

What we are saying is that historians can play a role in correcting the omissions, oversights and even downright mistakes our predecessors made in writing about (or worse, not writing about) the contributions of female philosophers. For there is an internal, purely philosophical point to be made. Although external factors influenced its development, analytic philosophy is more than the product of sociopolitical and cultural circumstances. In documenting the history of analytic philosophy, there is something to be right or wrong about. Hermann's discovery really was a significant contribution to the debate about the existence of hidden variables, even if her colleagues and later historians failed to see it. And Langer really did play a major role in the development of US analytic philosophy, even though her name is missing from companions and anthologies on the subject. It is true that, until the 1960s, only a few women actively contributed to the development of analytic philosophy, but many of them had ideas that are worth studying. In examining and re-assessing their work, we will be able to discover interesting but forgotten theories, proofs and arguments, shed new light on the development of the tradition, and contribute to a richer, more diverse and philosophically more fertile canon.


The lost women of early analytic philosophy - Aeon


3 Quantum Computing Stocks to Buy Before the Breakout – InvestorPlace

Investing in innovative technology can generate high returns once more people catch on to the opportunity. Quantum computing is one of those technologies: it has rewarded shareholders in recent years but still remains in its early innings. That combination has put a spotlight on quantum computing stocks to buy.

Quantum computing enables quicker calculations and greater efficiency, and can solve problems that traditional computers cannot. The technology is grounded in quantum physics and opens up more possibilities than classical computers, which use binary approaches (i.e., 0s and 1s) to process information.
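The contrast with 0s and 1s can be made concrete. A classical bit is always 0 or 1; a single qubit is described by two amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. The toy simulation below (an illustration only, not real quantum hardware or any vendor's API) shows this probabilistic behaviour:

```python
import random

random.seed(1)

# A qubit state is a pair of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1. Here (3/5, 4/5) is a valid state.
alpha, beta = complex(3 / 5), complex(4 / 5)
assert abs(abs(alpha) ** 2 + abs(beta) ** 2 - 1) < 1e-12

def measure(alpha, beta):
    """Simulate one measurement in the computational basis:
    returns 0 with probability |alpha|^2, else 1."""
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Repeated measurements recover |beta|^2 = 0.64 as the frequency of 1s.
samples = [measure(alpha, beta) for _ in range(100_000)]
print(sum(samples) / len(samples))  # close to 0.64
```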

The technology has already improved our processes in areas like risk management, research & development, and supply chain management.

Investing in quantum computing stocks can yield high returns, and many corporations are investing in the technology. Investors looking for exposure to the industry may want to consider these top quantum computing stocks.


Alphabet (NASDAQ:GOOG) makes the majority of its revenue from its ad network. In the second quarter, Google advertising generated $56.3 billion, or 80.8% of the company's revenue. The advertising market's recovery can lead to more revenue and earnings growth, but that's not the only thing Google has going for it. The conglomerate has expanded into other areas to diversify its income streams, including quantum computing.

Google's Quantum AI is working on technologies that will give researchers more resources and enable them to operate beyond classical capabilities. The company has also developed a quantum computer that is 47 years faster than the world's fastest supercomputer.

That type of speed can expand artificial intelligence's capabilities. Alphabet has the capital to become a leader in the quantum computing industry and has a long history of rewarding shareholders. Alphabet shares are up by 45% year-to-date and have gained 110% over the past five years. The company has a 28 P/E ratio and a $1.65 trillion market cap.


Microsoft (NASDAQ:MSFT) has also rewarded long-term shareholders, generating a 38% year-to-date gain and more than tripling over the past five years. The company also has ambitious goals that revolve around quantum computing.

Microsoft CEO Satya Nadella stated that the company aims to compress the next 250 years of chemistry and materials science progress into the next 25. Microsoft Azure has several quantum elements that help scientists solve more complex problems. The corporation is also working on a quantum supercomputer.

While investors wait for developments in quantum computing, they have plenty to like about Microsoft's business model. Revenue increased by 8% year-over-year in Q4 Fiscal 2023. Net income increased by 20% year-over-year during the same time.

Microsoft Cloud makes up an outsized percentage of total revenue. The cloud segment accounted for $30.3 billion of the company's $56.2 billion in Q4 Fiscal 2023 revenue, or 53.9% of the total. Cloud revenue can gain momentum as quantum computing strengthens Microsoft Azure's value proposition.


IonQ (NASDAQ:IONQ) is a pure-play speculative quantum computing stock that is unprofitable but has high revenue growth. Investors will have to swallow a lofty valuation of a 239 price-to-sales ratio. The 6 price-to-book ratio looks more palatable but still leaves much to be desired.

IonQ has positioned itself as a first mover and a leading player in the quantum revolution. The company expects to generate double the bookings next year and anticipates delivering the first quantum system in Europe in 2023. Being first in an industry with large potential has helped the company command a sky-high valuation.

The company reported $4.3 million in revenue in the first quarter. That's above the high end of the company's guidance range and more than double last year's revenue, which was $2.0 million.

IonQ holds cash and cash equivalents of $525.5 million, which makes up more than 10% of the company's market cap. The company also increased its full-year revenue outlook from $18.8 million to $19.2 million.

Investors can agree that IonQ is growing at a fast clip. It's hard to argue with triple-digit revenue growth. However, rising losses and a lofty valuation make this stock a speculative play in quantum computing. The company can reward shareholders immensely if it becomes a leader in the industry, but this most certainly is a high-risk, high-reward stock. Shares have surged 348% year-to-date but are only up by 41% over the past five years, demonstrating the dramatic price swings the stock has experienced so far.

On the date of publication, Marc Guberti did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Marc Guberti is a freelance finance writer at InvestorPlace.com who hosts the Breakthrough Success Podcast. He has contributed to several publications, including U.S. News & World Report, Benzinga, and Joy Wallet.


3 Quantum Computing Stocks to Buy Before the Breakout - InvestorPlace


Deep learning method developed to understand how chronic pain … – EurekAlert

A research team from the Universidad Carlos III de Madrid (UC3M), together with University College London in the United Kingdom, has carried out a study to analyze how chronic pain affects each patient's body. Within this framework, a deep learning method has been developed to analyze the biometric data of people with chronic conditions.

The analysis is based on the hypothesis that people with chronic lower back pain have variations in their biometric data compared to healthy people. These variations are related to body movements or walking patterns and are believed to be due to an adaptive response to avoid further pain or injury.

However, research to date has found it difficult to accurately distinguish these biometric differences between people with and without pain. Several factors have contributed to this, such as the scarcity of data related to this issue, the particularities of each type of chronic pain, and the inherent complexity of measuring biometric variables.

"People with chronic pain often adapt their movements to protect themselves from further pain or injury. This adaptation makes it difficult for conventional biometric analysis methods to accurately capture physiological changes. Hence the need to develop this system," says Doctor Mohammad Mahdi Dehshibi, a postdoctoral researcher at the i_mBODY Laboratory in UC3M's Computer Science Department, who led this study.

The research carried out by UC3M has developed a new method that uses a type of deep learning called s-RNNs (sparsely connected recurrent neural networks) together with GRUs (gated recurrent units), which are a type of neural network unit used to model sequential data. With this development, the team has managed to capture changes in pain-related body behavior over time. Furthermore, it surpasses existing approaches in accurately classifying pain levels and pain-related behavior.

The innovation of the proposed method has been to take advantage of an advanced deep learning architecture and add additional features to address the complexities of sequential data modelling.The ultimate goal is to achieve more robust and accurate results related to sequential data analysis.
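The press release does not include code, but the core ingredient it describes, a gated recurrent unit stepping through a sequence of sensor readings, can be sketched in a few lines. The following is an illustrative toy in plain Python (random weights, invented dimensions), not the team's actual s-RNN architecture:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def matvec(W, v):
    # Multiply matrix W (list of rows) by vector v.
    return [sum(w * x for w, x in zip(row, v)) for row in W]

def vadd(a, b):
    return [x + y for x, y in zip(a, b)]

class GRUCell:
    """Minimal gated recurrent unit processing one time step."""
    def __init__(self, n_in, n_hid):
        def mat(r, c):
            return [[random.uniform(-0.1, 0.1) for _ in range(c)] for _ in range(r)]
        self.n_hid = n_hid
        self.Wz, self.Uz = mat(n_hid, n_in), mat(n_hid, n_hid)  # update gate
        self.Wr, self.Ur = mat(n_hid, n_in), mat(n_hid, n_hid)  # reset gate
        self.Wh, self.Uh = mat(n_hid, n_in), mat(n_hid, n_hid)  # candidate state

    def step(self, x, h):
        z = [sigmoid(a) for a in vadd(matvec(self.Wz, x), matvec(self.Uz, h))]
        r = [sigmoid(a) for a in vadd(matvec(self.Wr, x), matvec(self.Ur, h))]
        rh = [ri * hi for ri, hi in zip(r, h)]
        h_tilde = [math.tanh(a) for a in vadd(matvec(self.Wh, x), matvec(self.Uh, rh))]
        # The update gate interpolates between the old state and the candidate.
        return [(1 - zi) * hi + zi * hti for zi, hi, hti in zip(z, h, h_tilde)]

# Toy "biometric" sequence: 10 time steps of 3 sensor channels.
cell = GRUCell(n_in=3, n_hid=4)
h = [0.0] * cell.n_hid
for x in ([random.gauss(0, 1) for _ in range(3)] for _ in range(10)):
    h = cell.step(x, h)

# The final hidden state summarises the whole sequence; in a full model it
# would feed a classifier head that predicts pain level or behaviour class.
print(len(h))  # 4
```

The gating is what lets the state carry information across many time steps, which is why the paragraph above stresses capturing changes in body behaviour "over time".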

"One of the main research focuses in our lab is the integration of deep learning techniques to develop objective measures that improve our understanding of people's body perceptions through the analysis of body sensor data, without relying exclusively on direct questions to individuals," says Ana Tajadura Jiménez, a lecturer from UC3M's Computer Science Department and lead researcher of the BODYinTRANSIT project, who leads the i_mBODY Laboratory.

The new method developed by the UC3M research team has been tested with the EmoPain database, which contains data on pain levels and behaviors related to these levels. This study also highlights the need for a reference database dedicated to analyzing the relationship between chronic pain and biometrics. "This database could be used to develop applications in areas such as security or healthcare," says Mohammad Mahdi.

The results of this research can be used in the design of new medical therapies focused on the body and different clinical conditions. "In healthcare, the method can be used to improve the measurement and treatment of chronic pain in people with conditions such as fibromyalgia, arthritis and neuropathic pain. It can help control pain-related behaviors and tailor treatments to improve patient outcomes. In addition, it can be beneficial for monitoring pain responses during post-surgical recovery," says Mohammad Mahdi.

In this regard, Ana Tajadura also highlights the relevance of this research for other medical processes: "In addition to chronic pain, altered movement patterns and negative body perceptions have been observed in eating disorders, chronic cardiovascular disease and depression, among others. It is extremely interesting to carry out studies using the above method in these populations in order to better understand medical conditions and their impact on movement. These studies could provide valuable information for the development of more effective screening tools and treatments, and improve the quality of life of people affected by these conditions."

In addition to health applications, the results of this project can be used for the design of sports, virtual reality, robotics or fashion and art applications, among others.

This research is carried out within the framework of the BODYinTRANSIT project, led by Ana Tajadura Jiménez and funded by the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation program (GA 101002711).

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.


Deep learning method developed to understand how chronic pain ... - EurekAlert


World Premiere of QUANTUM LOVERS: THE MUSICAL to Explore … – Broadway World

The passionate but troubled romance between the young scientists Albert Einstein and Mileva Maric, who met in Switzerland in the early years of the Twentieth Century, is the subject of the world premiere QUANTUM LOVERS: THE MUSICAL, by Hasan Padamsee, a Cornell University Professor of Physics and playwright with three previously produced plays to his credit. This intense and tragic romance, a musicalization of a previous play by Padamsee, was inspired by the books EINSTEIN IN LOVE: A SCIENTIFIC ROMANCE by Dennis Overbye, EINSTEIN IN BERLIN by Thomas Levenson, and ALBERT EINSTEIN/MILEVA MARIC: THE LOVE LETTERS by Albert Einstein, Jurgen Renn, Robert Schulmann and Shawn Smith. Padamsee wrote the musical's book and the lyrics for its 22 musical numbers, with music by Athena Antiporda (Ainna), Roberto Flora, Michaela Catapano, Anshu Jha, and Umukoro Fortune (El Doxa).

QUANTUM LOVERS: THE MUSICAL will be performed three times only: on Friday, August 11 and Saturday, August 12 at 7:30 each night, and on Sunday, August 13 at 2:30 pm. The fully staged performances will be at City Lit Theater, located on the second floor of the Edgewater Presbyterian Church at 1020 W. Bryn Mawr Avenue, Chicago.

Albert Einstein, known as the father of relativity who transformed space, time and gravity, also played a major role in discovering Quantum Physics, a concept mysterious, full of apparent contradictions, difficult to understand, and yet captivating. Like quantum physics, true love is magical but enigmatic, deceptively familiar but incomprehensible. Can the man with dramatic success in revolutionizing space, time and gravity succeed in taming the Quantum? Can Einstein find true love in the whirlpool of his personal experiences?

Padamsee's cast of eight includes Carson Carter (Albert Einstein), Mikaela May (Mileva Maric), Peter Stielstra (Max Planck and Professor Weber), Carleigh Ray (Elsa Lowenthal Einstein), Eliana Tirona (Milana Bota and Young Mileva), Ronnie Lyall (Marcel Grossman), Patricia Lomden (Ruzica Drazic), and Erick Heyer-Fogelberg. The production team is Rachel Fox (Stage Manager), Dominic "Dom" Bonelli (Sound Engineer), Autumn Thielander (Choreographer), Zole Morack (Lighting Designer), and Mario Gallego (Production Assistant).

Padamsee began writing plays as a device to teach physics to his Cornell students. He wrote short plays about fascinating characters and the adventure of their discoveries, and gave his students the option to perform the plays in lieu of writing papers. Since those days he has written the full-length plays CREATION'S BIRTHDAY, QUANTUM LOVERS and QUANTUM WONDERLAND, in addition to QUANTUM LOVERS - THE MUSICAL. Padamsee says of QUANTUM LOVERS, "By exploring two less charted dimensions of Einstein's character, passionate lover and staunch anti-nationalist, the play is transporting, timely and true. It moves the audience to Europe into the time that led to the First World War, strongly connecting to the dominant events of today."


World Premiere of QUANTUM LOVERS: THE MUSICAL to Explore ... - Broadway World
