
Assistant Professor / Associate Professor / Professor, Statistics and Data Science job with National Taiwan University | 299080 – Times Higher…

Institute of Statistics and Data Science

http://stat-ds.ntu.edu.tw

The Institute of Statistics and Data Science at National Taiwan University invites applications for tenure-track faculty positions at all levels (Assistant, Associate, or Full Professor) with expertise in Statistics and Data Science. Academic rank will be commensurate with credentials and experience. The positions will begin in February or August 2023. Applicants should hold a Ph.D. in Statistics, Data Science, or a closely related discipline before the start date.

Description:

The newly established Institute of Statistics and Data Science, College of Science, National Taiwan University, begins its first enrollment in the 2022 academic year. We are hiring additional faculty members to develop our academic programs. To promote professional training and research in statistics and data science, the institute emphasizes statistical theory and methods as well as interdisciplinary applications of data science. Training in statistical theory and methods helps students establish a foundation for quantitative research and analysis, while the applied-statistics perspective of data science cultivates students' professional skills in practical data analysis. We aim to meet the trend and market demand for modern statistics and data science tools.

Documents Required:

How to Apply:

Please submit application materials to Search Committee, ISDS Preparatory Office at NTU (e-mail address: ntusds@ntu.edu.tw), with the subject line "Application for Faculty Position." Applications received by August 31, 2022, will receive full consideration. While early submissions are encouraged, applications will continue to be accepted until all positions are filled. For more information, please visit http://stat-ds.ntu.edu.tw.

For related inquiries, please contact:

Ms. Kui-Chuan KAO

Administrative Assistant

E-mail: ntusds@ntu.edu.tw

Tel: +886 (2) 3366-2833

Link:

Assistant Professor / Associate Professor / Professor, Statistics and Data Science job with National Taiwan University | 299080 - Times Higher...

Read More..

Environmental Factor – July 2022: New initiatives to transform research highlighted at Council meeting – Environmental Factor Newsletter

Precision environmental health, the totality of our environmental exposures, new funding opportunities related to climate change and health (see sidebar), efforts to combat environmental health disparities, and report back of research results were among the topics discussed at the National Advisory Environmental Health Sciences Council meeting held June 7-8.

NIEHS Director Rick Woychik, Ph.D., shared some scientific areas that have come into focus over the past couple of years. Those include precision environmental health and the exposome; computational biology and data science; climate change and health; environmental justice and health disparities; and mechanistic and translational toxicology.

"Studying the exposome -- the totality of an individual's environmental exposures throughout the life course, and their corresponding biological changes -- is critical for the advancement of precision environmental health," noted Woychik. The precision environmental health framework aims to prevent disease by shedding light on how individuals vary in their response to exposures based on their unique genetic, epigenetic, and biological makeup.

To expand knowledge in this area, NIEHS is hosting an upcoming workshop series titled "Accelerating Precision Environmental Health: Demonstrating the Value of the Exposome," which will cover a range of related topics.

To learn more about the workshops and to register, visit https://tools.niehs.nih.gov/conference/exposomics2022/.

Woychik also shared information about the Advanced Research Projects Agency for Health (ARPA-H), which is a new entity within the National Institutes of Health (NIH).

ARPA-H will advance breakthroughs in biomedical research by funding cutting-edge scientific studies and approaches. Council members discussed the importance of such funding for NIEHS grantees, especially early-stage investigators.

On May 31, the U.S. Department of Health and Human Services (HHS) formally established a new Office of Environmental Justice (OEJ) in response to President Biden's Executive Order on Tackling the Climate Crisis at Home and Abroad. OEJ will reside within the Office of Climate Change and Health Equity in the Office of the Assistant Secretary for Health.

Arsenio Mataka, J.D., a senior advisor to the assistant secretary, informed Council that OEJ will work to directly improve the wellbeing of underserved communities.

OEJ will lead HHS efforts to coordinate implementation of the Justice40 Initiative, which aims to deliver 40% of the overall benefits of federal investments in clean energy, water, transit, housing, workforce development, and pollution remediation to disadvantaged communities. The NIEHS Environmental Career Worker Training Program is participating in the Justice40 Initiative (see related story in this issue).

Eliseo Pérez-Stable, M.D., director of the National Institute on Minority Health and Health Disparities (NIMHD), shared his institute's research agenda on environmental health disparities. He outlined important NIMHD and NIEHS collaborations in this area, such as RADx Underserved Populations, an NIH program designed to reduce disparities in COVID-19 morbidity and mortality.

"There is a lot of overlap, and we have much in common," Pérez-Stable noted regarding NIMHD and NIEHS. "We both have a strong sense of the importance of community engagement, and we're both very interested in addressing issues of unequal care and social injustice in health and health care."

NIEHS Division of Extramural Research and Training Deputy Director Pat Mastin, Ph.D., described the division's longstanding commitment to addressing environmental health disparities and promoting environmental justice. He discussed recent NIEHS workshops on racism as a public health issue, advancing environmental health equity, and women's health disparities.

NIEHS Partnerships for Environmental Public Health (PEPH) Program coordinator Liam O'Fallon gave an overview of the program's achievements since its 2009 launch. PEPH includes a diverse network of scientists, community members, educators, health care providers, public health officials, and policymakers. Together, they work to address important health challenges and improve lives by translating research into action.

"PEPH has evolved into a community of practice for our grantees, partners, and NIEHS staff," he said. "It helps to integrate ideas and practices, encourages learning from one another, and enables individuals to solve common problems."

O'Fallon also highlighted an effort to promote report back of environmental health research results to study participants. He is working with NIEHS grantees Julia Brody, Ph.D., from the Silent Spring Institute, and Katrina Korfmacher, Ph.D., from the University of Rochester Medical Center, to develop guidelines and best practices that will make it easier for researchers to share findings, thereby empowering individuals to take steps to improve their health. (Check out this month's NIEHS Director's Corner column to learn more.)

(Ernie Hood is a contract writer for the NIEHS Office of Communications and Public Liaison.)

Read the rest here:

Environmental Factor - July 2022: New initiatives to transform research highlighted at Council meeting - Environmental Factor Newsletter

Read More..

Anaconda wants to chew into the market of Google Colab – Analytics India Magazine

Python is one of the most popular programming languages (especially in the data science community) and consistently leads different indexes and surveys. In fact, recently, Python won the prestigious TIOBE Programming Language of the Year award (2021) for the second time in a row. In the State of Data Science report 2021 by Anaconda, Inc., 63% of respondents said they always or frequently use Python. The survey also showed that 71% of educators are teaching Python, and 88% of students reported being taught Python to enter the data science and machine learning domain.

Image: Anaconda | State of Data Science 2021

But Python can be used by non-programmers too, with applications that spread beyond data science-related problems. Anaconda, Inc. claims it has been taking steps to democratise access to the language. Its latest move toward this goal is the acquisition of PythonAnywhere, a cloud-based Python development and hosting environment. PythonAnywhere allows Python developers to create and host web applications entirely within a cloud-based Python environment, which eases collaboration and sharing within the dispersed teams of today's work scenarios.

Anaconda, Inc. was founded by Peter Wang and Travis Oliphant a decade ago. It is behind the release of Anaconda, a distribution of the Python and R programming languages for data science, machine learning, predictive analytics, and related fields.

Peter Wang, CEO and co-founder of Anaconda, observes that for Python to maintain the growth it has seen and remain the most widely used data science programming language in the world, it will be important to improve its accessibility and remove the barriers to collaboration. The acquisition of PythonAnywhere will allow Anaconda to extend its services to all Python developers while building on capabilities for data scientists, engineers, data science enthusiasts, and students, he adds.

PythonAnywhere was founded in 2012 by Giles Thomas and Robert Smithson and is based out of the United Kingdom. It is an online integrated development environment (IDE) and web-hosting service based on the Python language.

The basic accounts are free, and the company charges for advanced services such as professional web app hosting, etc.

At PyCon US 2022, Anaconda introduced PyScript, which received much appreciation. PyScript allows users to create Python applications in the browser using a mix of Python and standard HTML. It lets a user run many popular Python packages from the scientific stack, like NumPy, pandas, and scikit-learn, and it provides bi-directional communication between Python and JavaScript objects and namespaces. Some have even hailed PyScript as a potential replacement for JavaScript and its wide usage.

PyScript gives users greater flexibility around what they can do in back-end versus front-end development. Similarly, PythonAnywhere gives Python developers greater flexibility around where they do their work: desktop versus cloud. The two announcements, made back to back in a short span of time, look quite strategic on Anaconda's part. They work in sync toward Anaconda's objective of moving Python beyond data science and increasing its use among non-programmers too.

Read more:

Anaconda wants to chew into the market of Google Colab - Analytics India Magazine

Read More..

FEATURED – African Centre of Excellence in Data Science (ACE-DS): Achievements and planned activities – The New Times

The African Centre of Excellence in Data Science (ACE-DS), established in 2016 in the College of Business and Economics at the University of Rwanda, is already bearing fruit: graduates are using their data science skills in their respective domains to boost national and regional development.

Data Science is an interdisciplinary field that combines expertise in statistics, computer science, and mathematics, among other sciences.

The Centre was established at the University of Rwanda on 17 October 2016 with financial support from the Government of Rwanda and the World Bank. Its core mission is to train post-graduate students with combined expertise in statistics, economics, business, computer science, and engineering to use big data and data analytics to solve development challenges.

Prof. Charles Ruranga, the Director of the African Centre of Excellence in Data Science, said that the Centre offers both PhD and Master's programs.

"The courses we offer for Master's and PhD candidates include data science in Data Mining, Biostatistics, Econometrics, Actuarial Science, and Demography," he explained.

He said that many achievements have been recorded since the accreditation of the programs by the Rwanda Higher Education Council (HEC).

"Since 2017, 151 students have enrolled in Master's programs and 49 in PhD programs. Among them, 24% are regional students and more than 29% are female students. 36 MSc students of cohort 1 graduated on 27th August 2021, and currently 15 students at Master's level and one at PhD level are also eligible for graduation. This is a great achievement because we could soon have 51 graduates in total since 2017, one of them at PhD level," Prof. Ruranga said. Graduates help companies use their data and come up with solutions for growing their businesses.

Another major milestone has been collaborative research, with more than 50 papers published in peer-reviewed journals.

A third milestone is organizing more than 10 professional short courses with more than 300 participants.

A fourth milestone is the development of infrastructure for data science training and research at ACE-DS. This includes a high-performance computer, three computer labs with the needed data science software, classrooms with teleconferencing capabilities, and workspaces for Master's and PhD students. ACE-DS has also curated a data science library, including printed and online texts, and has invested in a local area network and a server for data storage.

Finally, ACE-DS has received international accreditation of its PhD and Master's programs from the Data Science Council of America (DASCA), following an evaluation of the designed programs, the available infrastructure, and the teaching capacity of the Centre.

Trainees who followed professional short courses offered by ACE-DS.

ACE-DS has started a Data Driven Incubation Hub (DDIH), which will bring together data science experts, professionals, and students in order to provide accurate and innovative data-driven solutions.

"Now the 5th call for applications is open for another intake, and those who are interested can apply for Master's and PhD programmes," he said. More details about requirements are found on https://aceds.ur.ac.rw/.

Master's programme students graduate after two years, while PhD students graduate after three to four years.

Prof. Ruranga says rapid and sustained economic growth has increased the need for skilled people in data science, and it is a growing field. A career as a data scientist was ranked the third-best job in the US for 2020 by Glassdoor, a job and recruiting site involved in online jobs and career communities in the US.

The work of a data scientist includes breaking down big data into usable information, and creating software and algorithms that help companies and organisations determine optimal operations.

ACE-DS offers professional certifications in collaboration with the Data Science Council of America (DASCA). Requirements and additional information are available on https://aceds.ur.ac.rw.

Students who graduated from the African Centre of Excellence are now working in different sectors of the economy. Mr. Venant Habarugira is one of the graduates who works in the area of statistics.

He was part of the first cohort of UR-ACEDS (2018-2020), completed his studies in 2020, and graduated in August 2021.

"I learned about the ACEDS from the call for applications on the UR website. I pursued the Master of Science in Data Science with a specialization in Demography. As a researcher, statistician and demographer, it was sometimes difficult to undertake alone a research project that requires multidisciplinary skills like statistics, mathematics, computer science and research methodology," he said.

According to Habarugira, Data Science is an interdisciplinary field combining skills in statistics, mathematics, computer science, and applying knowledge from data across a broad range of application domains to solve development challenges.

It uses scientific methods, processes, algorithms and systems to extract knowledge and insights from noisy, structured and unstructured data.

"By pursuing this master's in data science, I acquired extensive knowledge that can help in explaining and understanding demographic and socio-economic phenomena using skills gained in statistics, mathematics and computer science, and in treating very big data, structured or unstructured, by using new methods of big data analytics, programming and machine learning techniques," he said.

He testified that he can use techniques and theories drawn from many fields to write programming code and combine it with statistical knowledge to create insights from data, but also to explain phenomena.

As a researcher and statistician, the skills gained allow him to undertake a research project with minimal support from computer specialists by applying the multidisciplinary skills he gained to the various phenomena under study, he said.

editor@newtimesrwanda.com

Here is the original post:

FEATURED - African Centre of Excellence in Data Science (ACE-DS): Achievements and planned activities - The New Times

Read More..

Discover a promising IT education at The IT University of Copenhagen – Study International News

In the middle of Ørestaden lies The IT University of Copenhagen (ITU), a leading institution that focuses on IT research and education. Here, over 2,000 students benefit from ITU's state-of-the-art teaching and research in computer science, business IT and digital design.

English-German student Nicola Clark is one of them; for her, choosing to study at ITU was one of her best decisions. "Moving to Denmark has been one of the best things that happened to me in both my academic and my personal life," shares Nicola.

She has just finished the second year of her BSc in Data Science, the programme she professes to be right for her future career in many ways. "I chose data science because it's a subject that would play to many of my strengths and interests," enthuses Nicola.

"I previously programmed a little and knew that I enjoyed the problem-solving process that was involved. So I knew that choosing a course where I could develop this skill could work out well."

Indeed, ITU's programmes are exemplary for their close collaboration with industry and their strong, international environment. ITU students pursuing data science are set to become the next generation of in-demand IT professionals. They become data scientists with comprehensive analytical and technical skills driving the decision-making of the future. Not only will they learn to work in interdisciplinary teams and make sense of vast amounts of data, but they will also be able to apply their organisational knowledge and market understanding to make a difference.

For example, its data science module Data Visualisation and Data-driven Decision Making equips students to define data visualisations and identify the most appropriate visualisation strategy. Meanwhile, in the Data Science in Research, Business and Society module, students explore domain-specific approaches and translate technical concepts to real-world concerns through research-based language.

There is a machine learning module as well. It gives a fundamental introduction to machine learning (ML) with an emphasis on statistical aspects. Students focus on both the theoretical foundation for ML and the application of ML methods.

Nicola finds it to be amongst the most challenging, yet most interesting, modules thus far. "Machine learning can often seem quite mysterious and I, like many people, did not know much about the topic when I started my studies two years ago," she shares.

"Most people interact with several different machine learning models every day, often without even realising or paying attention to them. Learning the mathematics and code that underlie such processes provided a fascinating insight into some of the hidden workings of the digital world. It was by far among the most interesting courses I have taken at ITU so far, despite also being one of the most challenging."

The IT University of Copenhagen offers industry-aligned programmes within an international study environment. Source: The IT University of Copenhagen

At ITU, practical learning is emphasised. Second-year students pursuing the data science programme take on a project, Introduction to Natural Language Processing and Deep Learning, that gets them to explore deep neural networks, including representation learning and practical implementations in Python, and work on a real-world problem using natural language processing technology and approaches.

Pair an international student environment with ITU's academic excellence, and an unforgettable, rich student experience is what you'll get. "I have enjoyed many nice beers and chats with people in the park after classes and at the Friday Bar. It's the people here that make studying fun and interesting," shares Nicola.

Plus, with ITU's approximately 19,000 square-metre building, students have a great space not only for learning, but for making memories too. "Some weekends, especially close to deadlines and exams, I revise at home or at the university with friends. Sometimes we book a study room or meet at the café to study together, which can be quite cosy and fun."

Students like Nicola are set to graduate ready for a promising career thanks to ITU's continuous dialogue and close collaboration with relevant industries. Every March and October, ITU organises its IT Match Making fair for companies to hire more-than-qualified students and collaborate with students on projects.

Over 2,000 students benefit from ITU's state-of-the-art teaching and research in computer science, business IT and digital design. Source: The IT University of Copenhagen

Nicola describes her ITU experience thus far as challenging, interesting and transformative. "ITU and data science have played a key role in shaping who I am as a person. I feel that ITU has given me an environment where I have been able to grow both emotionally and academically; I am happy with the person I have become here," she shares.

"There have been some difficult courses, deadlines, and exams. But for me there is always a great sense of achievement when difficult concepts are understood, work is handed in, and exams are passed." Kickstart an education at The IT University of Copenhagen here.

Follow The IT University of Copenhagen on Facebook, Twitter, LinkedIn, Instagram and YouTube

Read the original post:

Discover a promising IT education at The IT University of Copenhagen - Study International News

Read More..

Associate Professor in Data Analytics job with EDINBURGH NAPIER UNIVERSITY | 299104 – Times Higher Education

Job title: Associate Professor in Data Analytics Full time, Permanent

Job reference: 0000016303

Date posted: 29/06/2022

Application closing date: 31/07/2022

Salary: Grade 7, £51,799 to £60,022 per annum

Package: Excellent benefits package

Contractual hours: 35

Basis: Full Time

Job description:

The Business School of Edinburgh Napier University is looking to appoint an Associate Professor in Data Analytics, with a primary interest in using cutting-edge statistical approaches to provide policy makers in the public and private sectors with powerful representations, projections, and scenario forecasts for sustainable urban futures.

As an Associate Professor, you will be joining our Entrepreneurship and Innovation Subject Group and positioned on a Research pathway. With your research and leadership skills, you will contribute to raising the national and international profile of our School and University, support the delivery and ongoing development of our undergraduate and postgraduate programmes, mentor students and colleagues (including early career academics), and become a key member of our Urban Innovation Policy Lab (Unity Lab).

Through multidisciplinary research that connects urban science to technology and innovation studies, the Unity Lab is becoming a major influencer in the international debate on urban sustainability. The research and consultancy activities conducted by the Unity Lab are supporting policy formulation on an international scale, thanks to the cooperation with an expanding network of world-leading organizations that operate in the urban innovation arena. Examples of organizations include the Development Bank of Latin America (CAF), National League of Cities, European Commission, and United Nations.

We welcome applications from candidates working in any areas of data analytics, but preference will be given to applicants with an emerging reputation and interest in geographic data science and mathematical modelling applied to the analysis of urban and regional systems.

About the school

The Business School of Edinburgh Napier University is one of the largest modern Business Schools in Scotland, at the leading edge of innovation, with a strong academic reputation in business research and undergraduate, postgraduate, and executive education. The School is focused on the delivery of a portfolio of programmes that reflect the needs of students and place greater emphasis on practical experiences and application of learning.

We develop employable (and self-employable) graduates who will drive economic growth both at local and international level. We have strong and developing links with business through research-led teaching, practice-based classroom activities, student placements, business support, research and consultancy. More than 1,200 students each year take modules in entrepreneurship and innovation at undergraduate and postgraduate level.

Many of our academic staff engage in consultancy and contract research activities. They maintain close links with industry, through professional body memberships, knowledge transfer programmes, and both placement and live project module supervision.

To learn more about our Business School, watch this short video.

The role

You will join the Entrepreneurship and Innovation Subject Group of Edinburgh Napier Universitys Business School, where the Unity Lab is placed. This ambitious, fast-growing, and multidisciplinary team of academics are making significant contributions to producing new insight into how different types of stakeholders collaborate in formulating strategies for supporting the transformational changes that are required to enable responsible urban innovation. The expertise of the Entrepreneurship and Innovation Subject Group spans across a multitude of knowledge domains that connect social sciences to engineering and technology disciplines. Examples of subject domains in which the group is operating include urban and innovation studies, entrepreneurship, organization studies, public management, sociology, information systems, political science, governance studies, and psychology.

Within the area of data analytics, you will be responsible for developing, designing, and delivering teaching and student-centred learning underpinned by academic scholarship and professional practice. An appetite for designing engaging materials for virtual learning environments is essential.

You will contribute to growing the academic reputation of the University by collaborating in expanding our portfolio of research and commercial projects and in developing new working relationships with businesses and other external organisations interested in applied research related to your subject domain. You will also apply your mentoring skills to nurture the development of our community of early career researchers.

You are expected to have a sufficiently developed research and scholarly profile, with a clear development trajectory, as well as strong leadership skills and a commitment to research excellence. You must be ambitious and enthusiastic about multi-disciplinary and cross-sector research cooperation and work collaboratively across the University, where you will be required to operate independently, but also as part of different research teams.

Who we are looking for

Essential job requirements are highlighted in the Role Profile and particular attention will be paid to the following qualifications and skills:

Benefits:

Salary: Grade 7, £51,799 to £60,022 per annum

For further details, please click here.

Additional information

Closing Date: 31st July 2022

Interviews: Week of 29th August 2022

For informal discussions about this position, please contact Professor Luca Mora (L.Mora@napier.ac.uk) and for other general inquiries please contact recruitment@napier.ac.uk.

Edinburgh Napier is committed to creating an environment where everyone feels proud, confident, challenged and supported, and is a holder of Disability Confident, Carer Positive and Stonewall Diversity Champion status. Please see here for more information.

Originally posted here:

Associate Professor in Data Analytics job with EDINBURGH NAPIER UNIVERSITY | 299104 - Times Higher Education

Read More..

Colorado’s quantum revolution: How scientists exploring a universe of tiny things are transforming the state into a new Silicon Valley – CU Boulder…

Artist's depiction of an atomic clock. (Credit: Steven Burrows/JILA)

Researchers at CU Boulder and LongPath Technologies are using quantum sensors to detect methane leaks from oil and gas sites. (Credit: CU Boulder)

That new quantum revolution began, in many ways, with a clock -- not a wristwatch or a grandfather clock, but a device that can do a lot more with the help of atoms.

Today, scientists at JILA and NIST are developing some of the world's most precise and accurate atomic clocks. They build off decades of work by Nobel laureates Jan Hall, Dave Wineland, and Eric Cornell and Carl Wieman.

First, researchers collect clouds of atoms and chill them down, then trap those atoms in an artificial crystal made of laser light. Next, they hit the atoms with yet another laser. Like pushing a pendulum, that laser beam starts the atoms ticking, causing them to oscillate between energy levels at a rate of quadrillions of times per second.

Child wears a helmet made up of more than 100 OPM sensors. (Credit: FieldLine)

These clocks are also incredibly sensitive. Ye, for example, demonstrated an atomic clock that can register the difference in Earth's gravity if you lift it up by just a millimeter. Ye, who's also the director of CUbit, leads a center on campus funded by the National Science Foundation called Quantum Systems through Entangled Science and Engineering (Q-SEnSE).
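
That millimeter-scale claim can be sanity-checked with textbook gravitational time dilation. The short sketch below is back-of-the-envelope arithmetic, not JILA's published analysis: the fractional frequency shift between two heights is g*dh/c^2.

```python
# Back-of-the-envelope check (not JILA's analysis): gravitational time dilation
# predicts a fractional frequency shift of g*dh/c^2 between two heights.
g = 9.81        # gravitational acceleration, m/s^2
dh = 1e-3       # height difference of 1 millimeter, m
c = 2.998e8     # speed of light, m/s
print(f"fractional shift: {g * dh / c**2:.2e}")  # ~1.1e-19
```

That result sits right at the 1e-19 level of precision today's best optical clocks can resolve, which is why a millimeter of height becomes detectable at all.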

He imagines using such devices to, for example, predict when a volcano is about to erupt by sensing the flow of magma miles below Earth's surface.

"For me, one of the most promising technological avenues is quantum sensors," said Rey, who has worked with Ye over the years to take atomic clocks to greater and greater levels of precision. "We've already seen that quantum can help us do better measurements."

A team of engineers at CU Boulder is using different quantum sensors to detect methane leaking from natural gas operations in the West.

Meanwhile, CU Boulder's Svenja Knappe and her colleagues employ a quantum sensor called an optically-pumped magnetometer, or OPM, to dive into the complex territory of the human brain.

"The first time I went to a neuroscience meeting, the neuroscientists there looked at me like I was from Mars," said Knappe, associate professor in the Paul M. Rady Department of Mechanical Engineering.

Knappe's OPMs each measure about the size of two sugar cubes. They contain a group of atoms that change the orientation of their "spins," a strange property of atoms and particles, in response to the magnetic fields around them. It's a bit like how the needle in a compass always points north. She and her colleagues are employing the sensors to measure the tiny blips of energy that neurons emit when humans move, think or even just breathe.
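
For a rough sense of scale, the spin-precession frequency such sensors track grows linearly with the magnetic field. The sketch below assumes a gyromagnetic ratio of roughly 7 Hz per nanotesla (an approximate figure for rubidium-87; the exact constant depends on the atom and sensor design) and contrasts Earth's field with a brain-scale signal.

```python
# Rough scaling only; the gyromagnetic ratio (~7 Hz/nT, rubidium-87) is an assumed,
# approximate value. Spin precession frequency is proportional to the field.
GAMMA_HZ_PER_NT = 7.0

def precession_hz(field_tesla):
    return GAMMA_HZ_PER_NT * (field_tesla / 1e-9)  # convert tesla -> nanotesla

print(f"Earth's field (~50 uT):  {precession_hz(50e-6):.3g} Hz")   # ~3.5e5 Hz
print(f"Brain signal (~100 fT): {precession_hz(100e-15):.3g} Hz")  # ~7e-4 Hz
```

The many-orders-of-magnitude gap between those two numbers is why brain measurements demand extremely sensitive, well-shielded sensors.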

Neuroscientists are already using helmets embedded with 128 of these sensors to collect maps of the brain's activity, or magnetoencephalograms (MEGs). They are important tools for studying or diagnosing illnesses like schizophrenia and Parkinson's disease. To date, Knappe and her colleagues have sold sensors to about a dozen clients through a Boulder-based company called FieldLine.

"This is not a technology that's 20 years out. Quantum sensors can make an impact on your life now," she said.

Artist's depiction of an Earth-sized planet orbiting a star roughly 100 light-years from our own. (Credit: NASA/Goddard Space Flight Center)

By going smaller and ever more precise, scientists might also pursue questions that have eluded them for decades.

Jun Ye shares his team's new atomic clock, the world's most precise yet. (Credit: NIST)

Learn more about the power of frequency combs. (Credit: NIST)

For instance: What is dark matter?

This mysterious substance constitutes about 84% of the mass in the universe, but scientists have yet to identify what type of particle it's made of. Dark matter is, as far as physicists can tell, completely invisible and rarely interacts with normal matter. But Ye suspects that some candidates for dark matter could bump into the atoms in his atomic clock -- not very often, but often enough that he and his colleagues could, theoretically, detect the disturbance.

"Say you have a clock here in the U.S. and another clock somewhere near the North Pole," Ye said. "If one clock was speeding up while the other was slowing down, and if you have accounted for all other known effects, that might indicate that we're seeing different fields pass Earth as it moves through the universe."

Other researchers in Colorado are narrowing the search for dark matter using a different set of quantum technologies.

Quantum researcher Scott Diddams is turning quantum sensors toward space to look for planets circling stars far away from Earth.

Diddams is a former physicist at NIST who recently joined CU Boulder as a professor in the Department of Electrical, Computer and Energy Engineering. He explained that when exoplanets orbit their home stars, they tug on those stars a little bit, causing them to wobble.

Telescopes on the ground can spot those wobbles, but the changes are very, very faint.

"As a star is being pulled away from us, the colors of light look ever so slightly more red. As it's being pulled toward us, its light will look slightly more blue," he said. "And by slight, I mean less than one part in 1 billion."

He uses a powerful type of laser called a frequency comb to help zero in on that slight shift. More than 20 years ago, the physicist was part of the team that invented these lasers when he was a postdoctoral scientist at JILA; at first, researchers used them to count out the ticking in atomic clocks. But if you also install one of these tools in a telescope on the ground, Diddams said, it can act almost like a ruler for light waves. Astronomers can deploy these rulers to more precisely measure the color of light coming from distant stars, potentially finding planets hiding just out of view.
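
The "one part in a billion" figure from the earlier quote follows from the ordinary Doppler relation: the fractional colour shift equals the star's line-of-sight speed divided by the speed of light. The sketch below assumes a wobble speed of about 0.1 m/s, roughly the size of tug an Earth-like planet induces; the number is illustrative, not a measurement.

```python
# Illustrative arithmetic (assumed wobble speed, not a measured value): the
# fractional Doppler shift of starlight is v/c for line-of-sight speed v.
v = 0.1        # assumed stellar wobble speed, m/s
c = 2.998e8    # speed of light, m/s
print(f"fractional colour shift: {v / c:.1e}")  # ~3e-10, under one part in a billion
```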

Diddams' colleagues are already doing just that every night from two observatories on the ground. Teams led by Penn State have installed frequency combs at the McDonald Observatory in Texas and the Kitt Peak National Observatory in Arizona, and the researchers have plans for more.

"It's a really interesting example of transitioning technology first developed at JILA out of the lab and into real experiments," he said.

Artist's depiction of a laser heating up bars of silicon many times thinner than the width of a human hair. (Credit: Steven Burrows/JILA)

When Antony van Leeuwenhoek first observed the green cells belonging to algae from lakes in the Netherlands, he was seeing the world at about 200 times its normal size.

Physicists Margaret Murnane and Henry Kapteyn have spent their careers trying to look deeper than that -- roughly 100 to 1,000 times deeper.

Margaret Murnane and Henry Kapteyn in their lab on campus. (Credit: Glenn Asakawa/CU Boulder)

A little more than a decade ago, the duo built the world's first X-ray laser that could fit on a tabletop. And they did it by tapping into the quantum nature of electrons and atoms.

The team uses a laser to pluck electrons in atoms, essentially making them vibrate violently, akin to what happens if you pluck a guitar string really hard. In the process, the atoms, like guitar strings, can snap, breaking apart but also emitting excess energy in the form of X-ray light. The resulting beams, which are today among the fastest microscopes on Earth, oscillate more than a quintillion, or a billion billion, times per second.
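
The quoted oscillation rate pins down where these beams sit in the spectrum. The sketch below is a simple unit conversion on an assumed frequency of 10^18 Hz ("a billion billion"), not the group's own numbers.

```python
# Simple unit conversion for an assumed 1e18 Hz ("a billion billion") oscillation.
c = 2.998e8        # speed of light, m/s
h_ev = 4.136e-15   # Planck constant, eV*s
f = 1e18           # assumed frequency, Hz
print(f"wavelength:    {c / f * 1e9:.2f} nm")      # ~0.30 nm
print(f"photon energy: {h_ev * f / 1e3:.1f} keV")  # ~4.1 keV, in the X-ray range
```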

Murnane and Kapteyn's group at CU Boulder has used these lasers to better understand how heat flows in nanodevices thinner than the width of a human hair. They've also found that light can manipulate the magnetic properties of materials more than 500 times faster than scientists previously predicted.

NIST and Imec, a company that develops semiconductors and other devices, are already employing Murnane and Kapteyn's lasers to design new nano-sized electronics. "Seeing is understanding," said Murnane, a JILA fellow and distinguished professor of physics at CU Boulder. "We still can't see everything we need to see to be able to understand nature."

To build the microscopes of tomorrow, Murnane and Kapteyn helped launch a $24 million center on campus called STROBE with funding from the U.S. National Science Foundation. In the 1990s, they started a company called KM Labs to sell their X-ray lasers.

They're also continuing to push the limits of what atoms are capable of. Kapteyn said that the team would like to one day create an X-ray laser so powerful that it could see inside human tissue. Such a machine would allow doctors to zoom in on specific regions of the body, and with much greater resolution than current X-rays.

"There are a lot of quantum questions that are still outstanding in this area around how far we can push this technology," said Kapteyn, JILA fellow and professor of physics.

Artist's depiction of a qubit formed from an ytterbium atom trapped in laser light. (Credit: Steven Burrows/JILA)

There may be no quantum technology that gets as much hype as quantum computers.

Some of the largest technology companies in the world, including Google, Amazon, Microsoft and IBM, are trying their hand at developing computers that are based on unusual properties of quantum physics. Experts believe quantum computers could one day solve problems that even the largest supercomputers on Earth right now couldn't.

Cindy Regal, center, helped to consult on a mural by artist Amanda Phingbodhipakkiya in Denver's Washington Park celebrating women in physics. (Credit: Amanda Phingbodhipakkiya)

But researchers at CU Boulder say quantum computers that can solve problems relevant to the lives of real people may still be a long way away. For one thing, the quantum processors currently available often produce too many errors to do a lot of basic calculations.

"Quantum computers may be really good for the specific tasks they are designed for," said Shuo Sun, a JILA fellow and assistant professor of physics at CU Boulder. "But they won't be good at everything."

Researchers at CU Boulder, however, are striving to design new and better qubits. Like the bits that run your home laptop, qubits form the basis for quantum computers. But they're a lot more flexible: qubits can take on values of zero or one, like normal bits, but they can also exist in a ghostlike state, or superposition, of zero and one at the same time.
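
Numerically, a single qubit is just two complex amplitudes whose squared magnitudes give the odds of reading out zero or one. A minimal sketch, using an equal superposition as the example state:

```python
# A minimal numerical picture of one qubit: two complex amplitudes, with squared
# magnitudes giving the probabilities of measuring 0 or 1.
import numpy as np

state = np.array([1, 1], dtype=complex) / np.sqrt(2)   # equal superposition of 0 and 1
probs = np.abs(state) ** 2                              # [0.5, 0.5]
shots = np.random.choice([0, 1], size=10, p=probs)      # simulated measurement outcomes
print(probs, shots)
```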

Cindy Regal, a fellow at JILA and associate professor of physics at CU Boulder, and her colleagues are using a technique called optical tweezing to make qubits out of neutral atoms, or atoms without a charge. Optical tweezers deploy laser beams to carefully move around and arrange those atoms. Scientists can then build complex lattices made up of atoms, carefully controlling how they interact with each other.

Sun, in contrast, is designing a different kind of qubit by implanting lone atoms inside diamonds and other crystals.

Regal noted that even if these qubits don't wind up in a computer anytime soon, they can still help scientists answer new questions, allowing them to simulate, for example, the physics of weird states of matter in a controlled lab setting.

Some experts have even imagined quantum computers leading the hunt for new medicines. These devices might, decades from now, scan through large databases of molecules, looking for ones with the exact chemical behavior doctors want.

In the end, Murnane said that if quantum researchers want to bring their technologies out of the lab, they need to continue to build connections with researchers beyond physics, collaborating with engineers, materials scientists, astrophysicists, biologists and more.

"If you want your research to have a big impact, you need to look beyond your field," Murnane said.

Ye noted that the most amazing quantum technologies may be the ones that scientists haven't dreamed up yet. As van Leeuwenhoek discovered four centuries ago, the deeper you look into the world, the more surprises you'll find.

"It's just like when people first invented a microscope," Ye said. "There's an entire world down there filled with bacteria, and we never realized that."

Follow this link:
Colorado's quantum revolution: How scientists exploring a universe of tiny things are transforming the state into a new Silicon Valley - CU Boulder...

Read More..

Eight leading quantum computing companies in 2020 | ZDNet

The use of quantum computers has grown over the past several months as researchers have relied on these systems to make sense of the massive amounts of data related to the COVID-19 virus.

Quantum computers are based on qubits, a unit that can hold more data than classic binary bits, said Heather West, a senior research analyst at IDC.

Besides aiding a better understanding of the virus, manufacturers have been using quantum systems to determine supply of and demand for certain products -- toilet paper, for example -- so they can make estimates based on trends, such as how much is being sold in particular geographic areas, she said.

"Quantum computers can help better determine demand and supply, and it allows manufacturers to better push out supplies in a more scientific way,'' West said. "If there is that push in demand it can also help optimize the manufacturing process and accelerate it and actually modernize it by identifying breakdowns and bottlenecks."

Quantum has gained momentum this year because it has moved from the academic realm to "more commercially evolving ecosystems,'' West said.

In late 2019, Google claimed that it had reached quantum supremacy, observed Carmen Fontana, an IEEE member and a cloud and emerging tech practice lead at Centric Consulting. "While there was pushback on this announcement by other leaders in tech, one thing was certain -- it garnered many headlines."

Echoing West, Fontana said that until then, "quantum computing had felt to many as largely an academic exercise with far-off implications. After the announcement, sentiment seemed to shift to 'Quantum computing is real and happening sooner than later'."

In 2020, there have been more tangible timelines and applications for quantum computing, indicating that the space is rapidly advancing and maturing, Fontana said.

"For instance, IBM announced plans to go from their present 65-qubit computer to a 1,000-qubit computer over the next three years," he said. "Google conducted a large-scale chemical simulation on a quantum computer, demonstrating the practicality of the technology in solving real-world problems."

Improved artificial intelligence (AI) capabilities, accelerated business intelligence, and increased productivity and efficiency were the top expectations cited by organizations currently investing in cloud-based quantum computing technologies, according to an IDC survey earlier this year.

"Initial survey findings indicate that while cloud-based quantum computing is a young market, and allocated funds for quantum computing initiatives are limited (0-2% of IT budgets), end users are optimistic that early investment will result in a competitive advantage,'' IDC said.

Manufacturing, financial services, and security industries are currently leading the way by experimenting with more potential use cases, developing advanced prototypes, and being further along in their implementation status, according to IDC.

Quantum is not without its challenges, though. The biggest one West sees is decoherence, which happens when qubits are exposed to "environmental factors" or too many try to work together at once. Because they are "very, very sensitive," they can lose their power and ability to function and, as a result, cause errors in a calculation, she said.

"Right now, that is what many of the vendors are looking to solve with their qubit solutions,'' West said.

Another issue preventing quantum from becoming more of a mainstream technology right now is the ability to manage the quantum systems. "In order to keep qubits stable, they have to be kept at very cold, subzero temps, and that makes it really difficult for a lot of people to work with them,'' West said.

Nevertheless, with the time horizon of accessible quantum computing now shrinking to a decade or less, Fontana believes we can expect to see "an explosion of start-ups looking to be first movers in the quantum applications space. These companies will seek to apply quantum's powerful compute power to solve existing problems in novel ways."

Here are eight companies that are already focused on quantum computing.

Atom Computing is a quantum computing hardware company specializing in neutral atom quantum computers. While it is currently prototyping its first offerings, Atom Computing will provide cloud access "to large numbers of very coherent qubits by optically trapping and addressing individual atoms," said Ben Bloom, founder and CEO.

The company also builds and creates "complicated hardware control systems for use in the academic community,'' Bloom said.

Xanadu is a Canadian quantum technology company with the mission to build quantum computers that are useful and available to people everywhere. Founded in 2016, Xanadu is building toward a universal quantum computer using silicon photonic hardware, according to Sepehr Taghavi, corporate development manager.

The company also provides users access to near-term quantum devices through its Xanadu Quantum Cloud (XQC) service. The company also leads the development of PennyLane, an open-source software library for quantum machine learning and application development, Taghavi said.

In 2016, IBM was the first company to put a quantum computer on the cloud. The company has since built up an active community of more than 260,000 registered users, who run more than one billion circuits every day on real hardware and simulators.

In 2017, IBM was the first company to offer universal quantum computing systems via the IBM Q Network. The network now includes more than 125 organizations, including Fortune 500 companies, startups, research labs, and education institutions. Partners include Daimler AG, JPMorgan Chase, and ExxonMobil. All use IBM's most advanced quantum computers to simulate new materials for batteries, model portfolios and financial risk, and simulate chemistry for new energy technologies, the company said.

By 2023, IBM scientists will deliver a quantum computer with a 1,121-qubit processor, inside a 10-foot tall "super-fridge" that will be online and capable of delivering a Quantum Advantage -- the point where certain information processing tasks can be performed more efficiently or cost effectively on a quantum computer versus a classical one, according to the company.
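
IBM's cloud systems are typically programmed through Qiskit, its open-source SDK (also mentioned below in connection with Orquestra). The following is a minimal illustrative sketch, not an IBM example: it prepares an entangled two-qubit Bell state and inspects it with a local statevector simulation.

```python
# Minimal illustrative sketch (requires `pip install qiskit`): prepare a Bell state
# and check the outcome probabilities with a local statevector simulation.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into an equal superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0

print(Statevector.from_instruction(qc).probabilities_dict())  # ~{'00': 0.5, '11': 0.5}
```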

ColdQuanta commercializes quantum atomics, which it said is "the next wave of the information age." The company's Quantum Core technology is based on ultra-cold atoms cooled to a temperature of nearly absolute zero; lasers manipulate and control the atoms with extreme precision.

The company manufactures components, instruments, and turnkey systems that address a broad spectrum of applications: quantum computing, timekeeping, navigation, radiofrequency sensors, and quantum communications. It also develops interface software.

ColdQuanta's global customers include major commercial and defense companies; all branches of the US Department of Defense; national labs operated by the Department of Energy; NASA; NIST; and major universities, the company said.

In April 2020, ColdQuanta was selected by the Defense Advanced Research Projects Agency (DARPA) to develop a scalable, cold-atom-based quantum computing hardware and software platform that can demonstrate quantum advantage on real-world problems.

Zapata Computing empowers enterprise teams to accelerate quantum solutions and capabilities. It introduced Orquestra, an end-to-end, workflow-based toolset for quantum computing. In addition to previously available backends that include a full range of simulators and classical resources, Orquestra now integrates with Qiskit and IBM Quantum's open quantum systems, Honeywell's System Model H, and Amazon Braket, the company said.

The Orquestra workflow platform provides access to Honeywell's H, and was designed to enable teams to compose, run, and analyze complex, quantum-enabled workflows and challenging computational solutions at scale, Zapata said. Orquestra is purpose-built for quantum machine learning, optimization, and simulation problems across industries.

Recently introduced Azure Quantum provides a "one-stop-shop" to create a path to scalable quantum computing, Microsoft said. It is available in preview to select customers and partners through Azure.

For developers, Azure Quantum offers:

Founded in 1999, D-Wave claims to be the first company to sell a commercial quantum computer, in 2011, and the first to give developers real-time cloud access to quantum processors with Leap, its quantum cloud service.

D-Wave's approach to quantum computing, known as quantum annealing, is best suited to optimization tasks in fields such as AI, logistics, cybersecurity, financial modeling, fault detection, materials sciences, and more. More than 250 early quantum applications have been built to-date using D-Wave's technology, the company said.

The company has seen a lot of momentum in 2020. In February, D-Wave announced the launch of Leap 2, which introduced new tools and features designed to make it easier for developers to build bigger applications. In July, the company expanded access to Leap to India and Australia. In March, D-Wave opened free access to Leap for researchers working on responses to the COVID-19 pandemic. In September, the company launched Advantage, a quantum system designed for business. Advantage has more than 5,000 qubits, 15-way qubit connectivity, and an expanded hybrid solver service to run problems with up to one million variables, D-Wave said. Advantage is accessible through Leap.
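
For context on what an annealer actually consumes, problems are typically phrased as QUBOs: quadratic objectives over binary variables. The toy sketch below is illustrative only, with a made-up three-variable objective solved by brute force; real annealers exist precisely because exhaustive search stops being possible once problems reach thousands or millions of variables.

```python
# Illustrative only: a made-up 3-variable QUBO solved by brute force to show the
# problem format annealers minimize. Real problems are far too large for this.
import itertools

Q = {(0, 0): 1, (1, 1): 1, (2, 2): 1, (0, 1): -2, (1, 2): -2}  # hypothetical coefficients

def energy(x):
    return sum(coeff * x[i] * x[j] for (i, j), coeff in Q.items())

best = min(itertools.product([0, 1], repeat=3), key=energy)
print(best, energy(best))  # (1, 1, 1) with energy -1
```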

Strangeworks, a startup based in Austin, Texas, claims to be lowering the barrier to entry into quantum computing by providing tools for development on all quantum hardware and software platforms. Strangeworks launched in March 2018, and one year later, deployed a beta version of its software platform to users from more than 140 different organizations. Strangeworks will open its initial offering of the platform in Q1 2021, and the enterprise edition is coming in late 2021, according to Steve Gibson, chief strategy officer.

The Strangeworks Quantum Computing platform provides tools to access and program quantum computing devices. The Strangeworks IDE is platform-agnostic, and integrates all hardware, software frameworks, and supporting languages, the company said. To facilitate this goal, Strangeworks manages assembly, integrations, and product updates. Users can share their work privately with collaborators, or publicly. Users' work belongs to them and open sourcing is not required to utilize the Strangeworks platform.

More here:
Eight leading quantum computing companies in 2020 | ZDNet

Read More..

The key to quantum computing AI applications: Flexible programming languages – VentureBeat


The advance of quantum computing promises to reshape artificial intelligence (AI) as it's known and deployed today. This development is drastically expanding AI's enterprise and commercial reach, perhaps even bringing it closer to artificial general intelligence. And it holds another promise: the convergence of quantum computing, AI, and programming languages into a single computational environment.

The potential effects of this coalescence of capabilities are nothing short of formidable. Deep learning applications will run much faster. The problems they solve will reach a complexity defying that of traditional approaches to advanced machine learning. Statistical and symbolic AI will run in tandem, while verticals from energy production to finance reap the benefits.

None of this will occur, however, without the enablement of flexible AI programming languages. Such programming languages are indispensable for writing AI algorithms bolstered by quantum computing to create advanced applications with the power to transform the use cases for which they're deployed.

By availing themselves of these adaptive programming languages, with their support for paradigms such as object orientation, reflection, procedural and functional programming, and meta-programming, organizations can harness this conjunction of capabilities to achieve a degree of horizontal productivity that's not otherwise possible.

As the foundation for writing effective quantum AI applications, adaptive programming languages tailored for this task are immensely helpful to developers. These high-level languages make it easy to shorten the time required to write code while increasing throughput. The best ones involve functional programming, which is often contrasted with, and considered superior to, imperative programming.

The dynamic capability of these AI languages to change while the program is running is superior to languages relying on a batch method, in which the program must be compiled and executed before producing outputs. Plus, these quantum AI programming languages enable both data and code to be written as expressions. Since functions in these frameworks are written like lists, they're readily processed like data, so specific programs can actually manipulate other programs via metaprogramming, which is key to their underlying flexibility. This advantage also translates into performance benefits, with such languages operating much faster in applications such as those for bioinformatics involving genomics aided by various dimensions of AI.
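
The "code as data" point is easiest to see in a language the page discusses elsewhere. The sketch below uses Python's ast module to parse an expression, rewrite it programmatically, and run the result; languages where functions are literally written as lists (the Lisp family) make this manipulation even more direct.

```python
# "Code as data" in Python terms: parse an expression into a syntax tree, rewrite
# the tree programmatically, then compile and evaluate the modified program.
import ast

tree = ast.parse("2 + 3", mode="eval")

class DoubleConstants(ast.NodeTransformer):
    def visit_Constant(self, node):
        return ast.Constant(value=node.value * 2)   # rewrite every numeric literal

new_tree = ast.fix_missing_locations(DoubleConstants().visit(tree))
print(eval(compile(new_tree, "<ast>", "eval")))  # 10
```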

When enabled by flexible programming languages for developing AI, quantum computing allows organizations to perform AI calculations much faster, and at a greater scale, than they otherwise could. These programming languages also underpin both statistical and symbolic AI approaches enhanced by quantum computing. Optimization problems, for example, are traditionally solved in knowledge graph settings supporting intelligent inferences between constraints.

For applications of advanced machine learning (ML), writing AI algorithms fortified by quantum computing reduces the amount of time required for bringing new pharmaceuticals to market, for example. There are even data science applications that are universally applicable for training better ML models with less computational overhead. In all of these use cases, the key to devising AI solutions enhanced by quantum computing is the array of programming languages that empower developers to write algorithms that unequivocally benefit from the speed and scalability of quantum computing methods.

Although there are several others, the two principal ways quantum computing supplies the above benefits are via quantum computations and quantum annealing. Each involves specialized hardware: quantum computers that are more effective than traditional computers for tackling problems at the scale and speed at which AI becomes supercharged. Quantum computers encode information as 0s, 1s, or both simultaneously in quantum bits (qubits), whereas traditional computers can only encode 0s or 1s. The ability to superimpose these states is one of the ways in which quantum machines process gigantic quantities of data at once.

Another is quantum annealing, which mirrors nature in that it solves even NP-hard problems by reaching the lowest energy state of the computer. Traditional computers take an exponential amount of time to solve certain problems, such as optimization issues related to vehicles, fuel consumption, delivery objectives, and others. Quantum annealing methods expedite the time required to reach answers to such problems, providing a degree of actionable efficiency that's pivotal for logistics or routing equipment in the travel and transportation industries.
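
As a purely classical analogue of that "settle into the lowest energy state" idea, here is a short simulated-annealing sketch on a toy binary objective (an illustrative example, not how a quantum annealer is actually programmed): it accepts occasional uphill moves while a temperature parameter cools, so the search can escape local minima.

```python
# Classical analogue only: simulated annealing on a toy binary objective, slowly
# "cooling" toward a low-energy configuration. Quantum annealers pursue a related
# goal using quantum fluctuations and specialized hardware instead.
import math, random

def energy(x):  # toy objective over three binary variables
    return x[0] + x[1] + x[2] - 2 * x[0] * x[1] - 2 * x[1] * x[2]

x, temp = [random.randint(0, 1) for _ in range(3)], 2.0
while temp > 0.01:
    cand = x.copy()
    cand[random.randrange(3)] ^= 1                  # flip one randomly chosen bit
    delta = energy(cand) - energy(x)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        x = cand                                    # accept better (or occasionally worse) moves
    temp *= 0.95                                    # cool down
print(x, energy(x))  # usually ends at (1, 1, 1) with energy -1
```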

The boons of applying quantum computing to accelerate and buttress the overall utility of AI for society and the enterprise are apparent. Much less attention, however, is given to the programming languages that are used to design these quantum AI applications. These frameworks are the gatekeepers for the future of quantum AI. Shrewd organizations are utilizing them to capitalize on this growing development.

Jans Aasman, Ph.D., is an expert in cognitive science and CEO of Franz Inc.

Welcome to the VentureBeat community!

DataDecisionMakers is where experts, including the technical people doing data work, can share data-related insights and innovation.

If you want to read about cutting-edge ideas and up-to-date information, best practices, and the future of data and data tech, join us at DataDecisionMakers.

You might even consider contributing an article of your own!

Read More From DataDecisionMakers

View post:
The key to quantum computing AI applications: Flexible programming languages - VentureBeat

Read More..

Another Metropolitan Quantum Network Being Set Up in the Washington DC Area – Quantum Computing Report


A new metropolitan quantum network is being formed to join ones already started in Chicago; Long Island, New York; and London. This one will be in the Washington DC area and will be called the Washington Metropolitan Quantum Network Research Consortium (DC-QNet). It will include six U.S. government agencies located in the area and two additional affiliates outside of the region, including:

The two government agencies participating in this project outside of the Washington DC region are:

The network will be used to research the distribution of entangled qubits over multi-kilometer distances. It will also be used to study quantum-secure communication channels for exchanging data, networking multiple quantum computers together, distributing ultra-precise time signals, creating clusters of quantum sensors, understanding the metrology needed to operate such a network, and exploring the use of various quantum hardware devices such as quantum memories, single-photon devices, transducers, and other related components.
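
To make the "quantum-secure communication channel" item concrete, the sketch below is a purely classical toy simulation of BB84-style key sifting (a generic textbook protocol, not DC-QNet's actual design): the two parties keep only the bits for which their randomly chosen measurement bases happened to match.

```python
# Classical toy simulation of BB84-style key sifting (generic textbook protocol,
# not DC-QNet's design). With no eavesdropper, matched bases yield matching bits.
import random

n = 20
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("XZ") for _ in range(n)]
bob_bases   = [random.choice("XZ") for _ in range(n)]

# Bob reads Alice's bit when bases match; a mismatched basis gives a random result.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
          if ab == bb]
print(f"kept {len(sifted)} of {n} bits; all agree: {all(a == b for a, b in sifted)}")
```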

At this time, these metropolitan networks are limited in distance to what can be covered via direct links without the use of quantum repeaters. We believe that as viable quantum repeaters are developed in the future, several of these metropolitan quantum networks will be hooked together to create a larger network, and ultimately, a quantum internet.

Additional information about this DC-QNet can be found in a press release posted by the U.S. Naval Research Laboratory (NRL) here.

July 2, 2022


View post:
Another Metropolitan Quantum Network Being Set Up in the Washington DC Area - Quantum Computing Report

Read More..