
Quantum leap: uOttawa partners with TO firm in bid to commercialize high-powered computing technology – Ottawa Business Journal

The University of Ottawa is teaming up with a Toronto-based company to develop and commercialize high-powered quantum computing technology.

The university said this week it has signed a memorandum of understanding with Xanadu, one of the world's leading suppliers of quantum hardware and software, to create new courses aimed at training the next generation of quantum computing experts, as well as to develop algorithms to make high-speed quantum computers even more powerful.

The one-year agreement, which has the option of being renewed, is expected to take effect in September. Sylvain Charbonneau, the university's vice-president of research and innovation, said it will make uOttawa a leader in discovering real-world applications for quantum computing.

"This partnership will help elevate emerging quantum research by giving our students and researchers access to the cutting-edge technologies and expertise held at Xanadu," he said in a statement.

"It has the potential to change lives as we train the next generation of quantum pioneers, and work with industry experts to develop and commercialize real-life applications."

Xanadu will provide an undisclosed amount of funding for the research program. The federal government, which last year said it planned to invest $360 million in a national strategy to advance quantum research, is also expected to help fund the project.

"Combining uOttawa's deep knowledge in quantum photonics with Xanadu's industry-leading expertise in quantum hardware and software will pave the way for tackling today's most important scientific and engineering challenges," Josh Izaac, Xanadu's director of product, said in a statement.

Under the agreement, uOttawa researchers will use Xanadu's hardware and software to test quantum computing technology in real-world settings and help find ways of commercializing it.
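The article does not name specific tools, but Xanadu's best-known software is its open-source PennyLane framework for programming quantum circuits in Python. As a rough illustration of the kind of code such research involves (a generic two-qubit circuit, not anything from the uOttawa program):

```python
import pennylane as qml

# Simulated two-qubit device; Xanadu also exposes photonic hardware backends.
dev = qml.device("default.qubit", wires=2)

@qml.qnode(dev)
def entangled_expectation(theta):
    qml.RY(theta, wires=0)        # parameterized single-qubit rotation
    qml.CNOT(wires=[0, 1])        # entangle the two qubits
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

print(entangled_expectation(0.3))
```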

Charbonneau said Xanadu, which was founded in Toronto in 2016 and now employs more than 130 people, will also help the school create new quantum diploma and certificate programs that straddle the border between science and engineering.

Quantum computing uses the laws of quantum physics, tapping into the world of atoms and molecules to create computers that can solve certain classes of problems far faster than traditional digital computers.

Charbonneau said the technology has a wide range of applications, including encrypting data to make it more difficult for hackers to crack and creating ultra-powerful sensors for industries such as health care and mining.

The veteran academic said recent market research suggests quantum computing will be an $86-billion industry by 2040.

"It's going to be big," he told Techopia on Wednesday afternoon. "If you're (the Department of National Defence) and you want to communicate securely between A and B, you're going to use quantum cryptography for sure."

Charbonneau said uOttawa currently has more than 70 faculty members involved in quantum research, from faculties as diverse as engineering, law and physics. About a dozen of them will be part of the university's quantum research team, and they will be assisted by upwards of 100 graduate and PhD students.

The new deal with Xanadu promises to boost uOttawa's growing expertise in the field of quantum research.

The agreement comes seven years after the launch of the Max Planck uOttawa Centre for Extreme and Quantum Photonics. The facility was created to provide a forum for researchers from the university and the Max Planck Society, a non-profit association of German research institutes, to work together on technology such as high-intensity lasers.

Charbonneau said quantum computing is getting closer to becoming mainstream, and uOttawa hopes to lead the pack when it comes to training developers and programmers.

"Talent really is the new currency, and we're capable of providing it to the ecosystem," he said.

Link:

Quantum leap: uOttawa partners with TO firm in bid to commercialize high-powered computing technology - Ottawa Business Journal

Read More..

Emily Williams, Mark Turiansky Win 2021-22 Winifred and Louis Lancaster Dissertation Awards – Noozhawk

How can we better hold environmental polluters accountable? How can we enhance the efficiency of qubits?

These questions, which loom large for the researchers who study them, are the type of big-issue topics UC Santa Barbara graduate students are encouraged to tackle. And they're the central themes of the dissertations that won the 2021-2022 Winifred and Louis Lancaster Dissertation Awards.

This year's recipients are Emily Williams and Mark Turiansky, selected by the awards committee for dissertations with "significant impact on the field in terms of methodological and substantive contributions."

As global temperatures rise and communities feel the effects of climate change, how do we as a global society address the uneven distribution of harms and gains?

The tropics, for instance, are already bearing the brunt of sea level rise and ocean acidification, yet they are not the places that have generated the magnitude of carbon emissions that cause these events, nor do they benefit in a proportionate way from the activities that cause these emissions.

Elsewhere around the world, weather events of disastrous proportions are increasing in severity and frequency, clearly caused by anthropogenic activity, yet who exactly do we hold accountable?

Inequalities and blind spots such as these are the type of thing that spark Emily Williams' curiosity and activist drive. A long-time environmentalist, she got her first taste of the discipline of environmental studies as an undergraduate at UCSB under the tutelage of the late Professor William Freudenburg.

"He opened my eyes to thinking about the causes of climate change," Williams said. She became conscious of the strategies corporations use to justify their actions and their methods of deflection from their outsized contribution to the problem.

Around that time, Typhoon Haiyan, then the most powerful typhoon on record, struck the central Philippines, becoming a strong and real reminder of global warming's effects. But even more compelling for Williams, who had become part of a civil delegation to the UN Framework Convention on Climate Change (the international climate negotiations space), was the maddening slowness to address these impacts.

Fast-forward several years, and Williams' desire to illuminate the gaps in climate accountability resulted in her dissertation, "Interrogating the science of climate accountability: Allocating responsibility for climate impacts within a frame of climate justice." In it, she builds a best-practices conceptual framework to identify responsibility for climate impacts.

She then tests it using an empirical case study involving the drought in the greater Four Corners region and the Zuni people who live there.

"I had the opportunity to work with very diverse mentors, meaning I got to do the attribution science, engage ethnographic methods, organizational sociology and some science and technology studies-related work," she said. "It's certainly hard to do interdisciplinary work, but if you find a group of mentors that will support you in this effort, it's fascinating."

Among the things she uncovered in her research is the meteorological concept of vapor pressure deficit and its role in driving drought as temperatures increase.

By linking this fundamental principle to vegetation, Williams and her co-authors were able to estimate what the Four Corners region would look like without climate change, and identify the human fingerprint in this whodunit of global warming.
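Vapor pressure deficit (VPD) is the gap between how much moisture the air could hold at a given temperature and how much it actually holds; because saturation vapor pressure rises roughly exponentially with temperature, warming alone pushes VPD up and dries out vegetation. A minimal sketch using the standard Tetens approximation (illustrative only, not Williams' attribution code):

```python
import math

def saturation_vapor_pressure_kpa(temp_c: float) -> float:
    """Tetens approximation for saturation vapor pressure, in kPa."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def vpd_kpa(temp_c: float, relative_humidity: float) -> float:
    """Vapor pressure deficit: saturation pressure minus actual vapor pressure."""
    return saturation_vapor_pressure_kpa(temp_c) * (1.0 - relative_humidity)

# Same 30% relative humidity, two degrees of warming: VPD climbs noticeably.
print(round(vpd_kpa(28.0, 0.30), 2))  # ~2.65 kPa
print(round(vpd_kpa(30.0, 0.30), 2))  # ~2.97 kPa
```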

This ability to definitively attribute effects to human activity can help build a case toward holding polluters accountable, advancing the field of climate justice. It's also what earned Williams the Lancaster Award.

"Emily's outstanding integration of theory with qualitative and quantitative methods and her passionate commitment to climate justice truly set her apart," said her adviser, geography professor David López-Carr.

"Her dissertation makes a significant contribution to the nascent climate accountability literature by being the first to identify the human contribution to regional climate change and to follow those climate change impacts on vulnerable populations at the local level," López-Carr said.

"Her work provides a framework for future researchers and practitioners to advance the important area of climate accountability, with real-world implications for holding accountable those responsible for climate change emissions and for mitigating impacts on vulnerable populations," he said.

"I feel so honored and so humbled to have received this award," said Williams, who plans to complete a short post-doc before moving into the nonprofit world for more advocacy work. "I know for certain that anyone who gets through a Ph.D. program, with all the challenges and opportunities the program presents, deserves such an award.

"I chose my dissertation topic because I believe so deeply in the importance of ensuring climate accountability work is done within principles of justice. I am just so happy that the selection committee thinks this topic is important, too."

The quantum world holds much potential for those who learn to wield it. This space of subatomic particles and their behaviors, interactions and emergent properties can open the door to new materials and technologies with capabilities we have yet to even dream of.

Mark Turiansky is among those at the forefront of this discipline at UCSB, joining some of the finest minds in the quantum sciences as a fellow at the NSF-supported UCSB Quantum Foundry.

"The field of quantum information science is rapidly developing and has garnered a ton of interest," said Turiansky, who developed an abiding interest in physics as a child. "In the past few years, billions of dollars of funding have been allocated to quantum information science."

Enabled by relatively recent technologies that allow for the study of the universe at its smallest scales, quantum researchers like Turiansky are still just scratching the surface as they work to nail down the fundamentals of the strange yet powerful reality that is quantum physics.

At the heart of some of these investigations is the quantum defect: imperfections in a semiconductor crystal that can be harnessed for quantum information science.

One common example is the nitrogen-vacancy center in a diamond: In an otherwise uniform crystalline carbon lattice, an NV center is a defect wherein one carbon atom is replaced with a nitrogen atom, and an adjacent spot in the lattice is vacant. These defects can be used for sensing, quantum networking and long-range entanglement.

The NV center is only one such type of quantum defect, and though well-studied, has its limitations. For Turiansky, this underlined the need to gain a better understanding of quantum defects and to find ways to predict and possibly generate more ideal defects.

These needs became the basis of his dissertation, "Quantum Defects from First Principles," an investigation into the fundamental concepts of quantum defects, which could lead to the design of a more robust qubit, the basic unit of a quantum computer.

To explore his subject, Turiansky turned his attention to hexagonal boron nitride.

"Hexagonal boron nitride is an interesting material because it is two-dimensional, which means that you can isolate a plane of the material that is just one atom thick," he said. By shining light on this material, it is possible to detect quantum defects called single-photon emitters by the bright spots that shine back. These single photons, he said, are inherently quantum objects that can be used for quantum information science.

"The main feat was identifying the defect that was responsible for single-photon emission," Turiansky said. He accomplished it with computational methodologies that he worked to develop in his research.

"One methodology that I've worked on a lot is for nonradiative recombination," he said, describing it in his paper as "fundamental to the understanding of quantum defects, dictating the efficiency and operation of a given qubit."

By applying his methodology, Turiansky was able to determine the origin of these single-photon emitters, a topic of much debate in the community. It's a feat that could be applied to examine other quantum defects, and one that was deemed worthy of the Lancaster Award.

"Mark's work has moved the field forward by systematically identifying promising quantum defects, and providing an unambiguous identification of the microscopic nature of the most promising quantum emitter in hexagonal boron nitride," said Turiansky's adviser, materials professor Chris Van de Walle. "He accomplished this by creatively applying the computational approaches he developed and fruitfully collaborating with experimentalists."

"It's really an exceptional honor to receive such a prestigious award for my research efforts over the last five years," Turiansky said. "It's even more meaningful knowing the high quality of research turned out at UCSB and the fierce competition of my peers.

"I'm incredibly grateful to my adviser, group members, collaborators, friends and family who helped make this achievement possible."

The two Lancaster dissertations are entered into a national competition sponsored by the Council of Graduate Schools. A check for $1,000 and a plaque will be awarded upon completion of entry for the national competition.

Excerpt from:

Emily Williams, Mark Turiansky Win 2021-22 Winifred and Louis Lancaster Dissertation Awards - Noozhawk

Read More..

Inside ‘Everyday AI’ and the machine learning-heavy future – SiliconANGLE News

Artificial intelligence is being used for everything from automating workflows to assisting customers and even creating art.

Dataiku Ltd. calls this "Everyday AI." It's a systematic approach to embedding AI into the organization in a way that makes it part of the routine of doing business. Dataiku has teamed up with cloud-based data warehousing giant Snowflake Inc. to set organizations up for everyday AI and the machine learning-heavy future.

"We believe that AI will become so pervasive in all of the business processes, all the decision-making that organizations have to go through, and that it's no longer this special thing that we talk about," said Kurt Muehmel (pictured, right), chief customer officer of Dataiku. "It's the day-to-day life of our businesses. And we can't do that without partners like Snowflake, because they're bringing together all of that data and ensuring that there is the computational horsepower behind that to drive."

Muehmel and Ahmad Khan (pictured, left), head of artificial intelligence and machine learning strategy at Snowflake, spoke with theCUBE industry analysts Lisa Martin and Dave Vellante at Snowflake Summit, during an exclusive broadcast on theCUBE, SiliconANGLE Media's livestreaming studio. They discussed Everyday AI, making AI scalable and accessible, scaling data science and more. (* Disclosure below.)

One of the biggest issues in AI, historically, has been the amount of data and processing power it takes to train and run machine learning models. Dataiku and Snowflake took advantage of the scalable nature of cloud computing using Snowflakes infrastructure and, using push-down optimization, made AI more accessible and easier to manage.

"Any kind of large-scale data processing is automatically pushed down by Dataiku into Snowflake's scalable infrastructure," Khan explained. "So you don't get into things like memory issues or situations where your pipeline is running overnight and it doesn't finish in time."

The AI focus relates to two big announcements Snowflake made during the summit that include its Snowpark and Streamlit products, including the ability to run Python in Snowflake and easily incorporate models into Dataiku.

"You can now, as a Python developer, bring the processing to where the data lives rather than move the data out to where the processing lives," Khan said. "The predictions that are coming out of models that are being trained by Dataiku are then being used downstream by these data applications for most of our customers. I can write a complete data application without writing a single line of JavaScript, CSS or HTML. I can write it completely in Python, which makes me super excited as a Python developer."
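For readers unfamiliar with Snowpark, the "bring the processing to the data" model looks roughly like the following minimal sketch; the connection parameters and the ORDERS table are placeholders, and this is a generic Snowpark for Python example rather than anything generated by Dataiku:

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

# Placeholder credentials -- substitute your own account details.
connection_parameters = {
    "account": "<account_identifier>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}

session = Session.builder.configs(connection_parameters).create()

# The aggregation compiles to SQL and executes inside Snowflake's compute;
# the underlying rows never leave the warehouse.
orders = session.table("ORDERS")  # hypothetical table
summary = orders.group_by(col("REGION")).agg(avg(col("AMOUNT")).alias("AVG_AMOUNT"))
summary.show()
```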

Here's the complete video interview, part of SiliconANGLE's and theCUBE's coverage of the Snowflake Summit event:

(* Disclosure: TheCUBE is a paid media partner for the Snowflake Summit event. Neither Snowflake Inc., the sponsor for theCUBE's event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

More here:
Inside 'Everyday AI' and the machine learning-heavy future - SiliconANGLE News

Read More..

Snowflake is trying to bring machine learning to the everyman – TechRadar

Snowflake has set out plans to help democratize access to machine learning (ML) resources by eliminating complexities for non-expert customers.

At its annual user conference, Snowflake Summit, the database company has made a number of announcements designed to facilitate the uptake of machine learning. Chief among them are enhanced support for Python (the language in which many ML products are written) and a new app marketplace that allows partners to monetize their models.

"Our objective is to make it as easy as possible for customers to leverage advanced ML models without having to build from scratch, because that requires a huge amount of expertise," said Tal Shaked, who heads up ML at Snowflake.

"Through projects like Snowflake Marketplace, we want to give customers a way to run these kinds of models against their data, both at scale and in a secure way."

Although machine learning is a decades-old concept, only within the last few years have advances in compute, storage, software and other technologies paved the way for widespread adoption.

And even still, the majority of innovation and expertise is pooled disproportionately among a small minority of companies, like Google and Meta.

The ambition at Snowflake is to open up access to the opportunities available at the cutting edge of machine learning through a partnership- and ecosystem-driven approach.

Shaked, who worked across a range of machine learning projects at Google before joining Snowflake, explained that customers will gain access to the foundational resources, on top of which they can make small optimizations for their specific use cases.

For example, a sophisticated natural language processing (NLP) model developed by the likes of OpenAI could act as the general-purpose foundation for a fast food customer looking to develop an ML-powered ordering system, he suggested. In this scenario, the customer is involved in none of the training and tuning of the underlying model, but still reaps all the benefits of the technology.


"There's so much innovation happening within the field of ML and we want to bring that into Snowflake in the form of integrations," he told TechRadar Pro. "It's about asking how we can integrate with these providers so our customers can do the fine-tuning without needing to hire a bunch of PhDs."

This sentiment was echoed earlier in the day by Benoit Dageville, co-founder of Snowflake, who spoke about the importance of sharing expertise across the customer and partner ecosystem.

"Democratizing ML is an important aspect of what we are trying to do. We're becoming an ML platform, but not just where you build it and use it for yourself; the revolution is in the sharing of expertise.

"It's no longer just the Googles and Metas of this world using this technology, because we're making it easy to share."

Disclaimer: Our flights and accommodation for Snowflake Summit 2022 were funded by Snowflake, but the organization had no editorial control over the content of this article.

See the rest here:
Snowflake is trying to bring machine learning to the everyman - TechRadar

Read More..

Using Machine Learning to Automate Kubernetes Optimization - The New Stack – thenewstack.io

Brian Likosar

Brian is an open source geek with a passion for working at the intersection of people and technology. Throughout his career, he's been involved in open source, whether that was with Linux, Ansible and OpenShift/Kubernetes while at Red Hat, Apache Kafka while at Confluent, or Apache Flink while at AWS. Currently a senior solutions architect at StormForge, he is based in the Chicago area and enjoys horror, sports, live music and theme parks.

Note: This is the third of a five-part series covering Kubernetes resource management and optimization. In this article, we explain how machine learning can be used to manage Kubernetes resources efficiently. Previous articles explained Kubernetes resource types and requests and limits.

As Kubernetes has become the de facto standard for application container orchestration, it has also raised vital questions about optimization strategies and best practices. One of the reasons organizations adopt Kubernetes is to improve efficiency, even while scaling up and down to accommodate changing workloads. But the same fine-grained control that makes Kubernetes so flexible also makes it challenging to effectively tune and optimize.

In this article, we'll explain how machine learning can be used to automate tuning of these resources and ensure efficient scaling for variable workloads.

Optimizing applications for Kubernetes is largely a matter of ensuring that the code uses its underlying resources, namely CPU and memory, as efficiently as possible. That means ensuring performance that meets or exceeds service-level objectives at the lowest possible cost and with minimal effort.

When creating a cluster, we can configure the use of two primary resources, memory and CPU, at the container level. Specifically, we can set requests and limits governing how much of these resources our application can use. We can think of those resource settings as our input variables, and the output in terms of performance, reliability and resource usage (or cost) of running our application. As the number of containers increases, the number of variables also increases, and with that, the overall complexity of cluster management and system optimization increases exponentially.

We can think of Kubernetes configuration as an equation with resource settings as our variables and cost, performance and reliability as our outcomes.
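To ground the analogy, here is what those input variables look like when set programmatically; this minimal sketch uses the official Kubernetes Python client, and the container name, image and values are placeholders:

```python
from kubernetes import client

# Per-container resource settings: the "input variables" of the optimization problem.
resources = client.V1ResourceRequirements(
    requests={"cpu": "500m", "memory": "256Mi"},  # what the scheduler reserves
    limits={"cpu": "1", "memory": "512Mi"},       # ceiling before throttling or OOM kill
)

container = client.V1Container(
    name="web",                # placeholder container name
    image="example/web:1.0",   # placeholder image
    resources=resources,
)
```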

To further complicate matters, different resource parameters are interdependent. Changing one parameter may have unexpected effects on cluster performance and efficiency. This means that manually determining the precise configurations for optimal performance is an impossible task, unless you have unlimited time and Kubernetes experts.

If we do not set custom values for resources during the container deployment, Kubernetes automatically assigns these values. The challenge here is that Kubernetes is quite generous with its resources to prevent two situations: service failure due to an out-of-memory (OOM) error and unreasonably slow performance due to CPU throttling. However, using the default configurations to create a cloud-based cluster will result in unreasonably high cloud costs without guaranteeing sufficient performance.

This all becomes even more complex when we seek to manage multiple parameters for several clusters. For optimizing an environment's worth of metrics, a machine learning system can be an integral addition.

There are two general approaches to machine learning-based optimization, each of which provides value in a different way. First, experimentation-based optimization can be done in a non-prod environment using a variety of scenarios to emulate possible production scenarios. Second, observation-based optimization can be performed either in prod or non-prod by observing actual system behavior. These two approaches are described next.

Optimizing through experimentation is a powerful, science-based approach because we can try any possible scenario, measure the outcomes, adjust our variables and try again. Since experimentation takes place in a non-prod environment, we're only limited by the scenarios we can imagine and the time and effort needed to perform these experiments. If experimentation is done manually, the time and effort needed can be overwhelming. That's where machine learning and automation come in.

Let's explore how experimentation-based optimization works in practice.

To set up an experiment, we must first identify which variables (also called parameters) can be tuned. These are typically CPU and memory requests and limits, replicas and application-specific parameters such as JVM heap size and garbage collection settings.

Some ML optimization solutions can scan your cluster to automatically identify configurable parameters. This scanning process also captures the cluster's current, or baseline, values as a starting point for our experiment.

Next, you must specify your goals. In other words, which metrics are you trying to minimize or maximize? In general, the goal will consist of multiple metrics representing trade-offs, such as performance versus cost. For example, you may want to maximize throughput while minimizing resource costs.

Some optimization solutions will allow you to apply a weighting to each optimization goal, as performance may be more important than cost in some situations and vice versa. Additionally, you may want to specify boundaries for each goal. For instance, you might not want to even consider any scenarios that result in performance below a particular threshold. Providing these guardrails will help to improve the speed and efficiency of the experimentation process.
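Pulling those pieces together, an experiment definition might look roughly like the sketch below. This is a generic illustration, not StormForge's actual configuration format; every name and number is hypothetical:

```python
# Hypothetical experiment definition: tunable parameters, weighted goals and guardrails.
experiment = {
    "parameters": {
        "cpu_request_millicores": {"min": 250, "max": 2000, "baseline": 500},
        "memory_request_mib": {"min": 256, "max": 4096, "baseline": 1024},
        "replicas": {"min": 1, "max": 10, "baseline": 3},
        "jvm_heap_mib": {"min": 128, "max": 2048, "baseline": 512},
    },
    "goals": [
        # Maximize throughput and minimize cost, weighted to reflect priorities.
        {"metric": "requests_per_second", "direction": "maximize", "weight": 0.6},
        {"metric": "hourly_cost_usd", "direction": "minimize", "weight": 0.4},
    ],
    "guardrails": {
        # Discard any trial whose p95 latency breaches the service-level objective.
        "p95_latency_ms": {"max": 300},
    },
}
```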

Here are some considerations for selecting the right metrics for your optimization goals:

Of course, these are just a few examples. Determining the proper metrics to prioritize requires communication between developers and those responsible for business operations. Determine the organization's primary goals. Then examine how the technology can achieve these goals and what it requires to do so. Finally, establish a plan that emphasizes the metrics that best accommodate the balance of cost and function.

With an experimentation-based approach, we need to establish the scenarios to optimize for and build those scenarios into a load test. This might be a range of expected user traffic or a specific scenario like a retail holiday-based spike in traffic. This performance test will be used during the experimentation process to simulate production load.

Once we've set up our experiment with optimization goals and tunable parameters, we can kick off the experiment. An experiment consists of multiple trials, with your optimization solution iterating through the following steps for each trial:

The machine learning engine uses the results of each trial to build a model representing the multidimensional parameter space. In this space, it can examine the parameters in relation to one another. With each iteration, the ML engine moves closer to identifying the configurations that optimize the goal metrics.
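At its core this is black-box optimization. The sketch below uses scikit-optimize's Gaussian-process minimizer as a stand-in for the ML engine; the scoring function, parameter ranges and simulated measurements are all hypothetical, and this is not StormForge's actual algorithm:

```python
from skopt import gp_minimize
from skopt.space import Integer

def run_trial(params):
    """One trial: apply the config, run the load test, return a score to minimize.
    The 'measurement' here is faked; a real trial would deploy and measure."""
    cpu_millicores, memory_mib, replicas = params
    hourly_cost = (cpu_millicores / 1000 * 0.03 + memory_mib / 1024 * 0.004) * replicas
    throughput = min(replicas * 120, 800)      # pretend throughput saturates at 800 rps
    return hourly_cost - 0.01 * throughput     # lower is better

search_space = [
    Integer(250, 2000, name="cpu_millicores"),
    Integer(256, 4096, name="memory_mib"),
    Integer(1, 10, name="replicas"),
]

# Each call to run_trial is one trial; the Gaussian-process model guides the next guess.
result = gp_minimize(run_trial, search_space, n_calls=25, random_state=42)
print("best parameters:", result.x, "best score:", result.fun)
```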

While machine learning automatically recommends the configuration that will result in the optimal outcomes, additional analysis can be done once the experiment is complete. For example, you can visualize the trade-offs between two different goals, see which parameters have a significant impact on outcomes and which matter less.

Results are often surprising and can lead to key architectural improvements, for example, determining that a larger number of smaller replicas is more efficient than a smaller number of heavier replicas.

Experiment results can be visualized and analyzed to fully understand system behavior.

While experimentation-based optimization is powerful for analyzing a wide range of scenarios, it's impossible to anticipate every possible situation. Additionally, highly variable user traffic means that an optimal configuration at one point in time may not be optimal as things change. Kubernetes autoscalers can help, but they are based on historical usage and fail to take application performance into account.

This is where observation-based optimization can help. Let's see how it works.

Depending on what optimization solution you're using, configuring an application for observation-based optimization may consist of the following steps:

Once configured, the machine learning engine begins analyzing observability data collected from Prometheus, Datadog or other observability tools to understand actual resource usage and application performance trends. The system then begins making recommendations at the interval specified during configuration.

If you specified automatic implementation of recommendations during configuration, the optimization solution will automatically patch deployments with recommended configurations as they are recommended. If you selected manual deployment, you can view the recommendation, including container-level details, before deciding to approve or not.
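Applying a recommendation ultimately amounts to patching the workload's resource settings. A minimal sketch with the official Kubernetes Python client, using made-up deployment names and values (an actual optimization product would do this through its own controller):

```python
from kubernetes import client, config

config.load_kube_config()   # assumes a local kubeconfig with access to the cluster
apps = client.AppsV1Api()

# Hypothetical recommendation produced by the ML engine.
recommended = {
    "requests": {"cpu": "300m", "memory": "384Mi"},
    "limits": {"cpu": "600m", "memory": "768Mi"},
}

patch = {"spec": {"template": {"spec": {"containers": [
    {"name": "web", "resources": recommended}   # name must match the target container
]}}}}

apps.patch_namespaced_deployment(name="web", namespace="default", body=patch)
```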

As you may have noted, observation-based optimization is simpler than experimentation-based approaches. It provides value faster with less effort, but on the other hand, experimentation-based optimization is more powerful and can provide deep application insights that aren't possible using an observation-based approach.

Which approach to use shouldn't be an either/or decision; both approaches have their place and can work together to close the gap between prod and non-prod. Here are some guidelines to consider:

Using both experimentation-based and observation-based approaches creates a virtuous cycle of systematic, continuous optimization.

Optimizing our Kubernetes environment to maximize efficiency (performance versus cost), scale intelligently and achieve our business goals requires:

For small environments, this task is arduous. For an organization running apps on Kubernetes at scale, it is likely already beyond the scope of manual labor.

Fortunately, machine learning can bridge the automation gap and provide powerful insights for optimizing a Kubernetes environment at every level.

StormForge provides a solution that uses machine learning to optimize based on both observation (using observability data) and experimentation (using performance-testing data).

To try StormForge in your environment, you can request a free trial here and experience how complete optimization does not need to be a complete headache.

Stay tuned for future articles in this series where we'll explain how to tackle specific challenges involved in optimizing Java apps and databases running in containers.

The New Stack is a wholly owned subsidiary of Insight Partners, an investor in the following companies mentioned in this article: StormForge.

Feature image via Pixabay.

View original post here:
Using Machine Learning to Automate Kubernetes Optimization - The New Stack - thenewstack.io

Read More..

Datatonic Wins Google Cloud Specialization Partner of the Year Award for Machine Learning – PR Newswire

LONDON, June 15, 2022 /PRNewswire/ -- Datatonic, a leader for Data + AI consulting on Google Cloud, today announced it has received the 2021 Google Cloud Specialization Partner of the Year award for Machine Learning.

Datatonic was recognized for the company's achievements in the Google Cloud ecosystem, helping joint customers scale their Machine Learning (ML) capabilities with Machine Learning Operations (MLOps) and achieve business impact with transformational ML solutions.

Datatonic has continuously invested in expanding their MLOps expertise, from defining what "good" MLOps looks like, to helping clients make their ML workloads faster, scalable, and more efficient. In just the past year, they have built high-performing MLOps platforms for global clients across the Telecommunications, Media, and e-Commerce sectors, enabling them to seamlessly leverage MLOps best practices across their teams.

Their recently open-sourced MLOps Turbo Templates, co-developed with Google Cloud's Vertex AI Pipelines product team, showcase Datatonic's experience implementing MLOps solutions, and Google Cloud's technical excellence to help teams get started with MLOps even faster.

"We're delighted with this recognition from our partners at Google Cloud. It's amazing to see our team go from strength to strength at the forefront of cutting-edge technology with Google Cloud and MLOps. We're proud to be driving continuous improvements to the tech stack in partnership with Google Cloud, and to drive impact and scalability with our customers, from increasing ROI in data and AI spending to unlocking new revenue streams." - Louis Decuypere - CEO, Datatonic

"Google Cloud Specializations recognize partner excellence and proven customer success in a particular product area or industry," said Nina Harding, Global Chief, Partner Programs and Strategy, Google Cloud. "Based on their certified, repeatable customer success and strong technical capabilities, we're proud to recognize Datatonic as Specialization Partner of the Year for Machine Learning."

Datatonic is a data consultancy enabling companies to make better business decisions with the power of Modern Data Stack and MLOps. Its services empower clients to deepen their understanding of consumers, increase competitive advantages, and unlock operational efficiencies by building cloud-native data foundations and accelerating high-impact analytics and machine learning use cases.

Logo - https://mma.prnewswire.com/media/1839415/Datatonic_Logo.jpg

For enquiries about new projects, get in touch at [emailprotected]. For media / press enquiries, contact Krisztina Gyure ([emailprotected]).

SOURCE Datatonic Ltd

View original post here:
Datatonic Wins Google Cloud Specialization Partner of the Year Award for Machine Learning - PR Newswire

Read More..

Advances in AI and machine learning could lead to better health care: lawyers – Lexpert

Of course, transparency and privacy concerns are significant, she notes, but if the information from our public health care system benefits everyone, is it inefficient to ask for consent for every use?

On the other hand, cybersecurity is another essential consideration, as we've come to learn that there are a lot of malevolent actors out there, says Miller Olafsson, with the potential ability to hack into centralized systems as part of a ransomware attack or other threat.

Even in its more basic uses, the potential of AI and machine learning is enormous. But the tricky part of using it in the health care sector is the need to have access to incredible amounts of data while at the same time understanding the sensitive nature of the data collected.

For artificial intelligence to be used in systems, procedures, or devices, you need access to data, and getting that data, particularly personal health information, is very challenging, says Carole Piovesan, managing partner at INQ Law in Toronto.

She points to the developing legal frameworks in Europe and North America for artificial intelligence and privacy legislation more generally. Lawyers working with start-up companies or health care organizations to build AI systems must help them stay within the parameters of existing laws, says Piovesan, and provide guidance on best practices for whatever may come down the line and help them deal with the potential risks.

More:
Advances in AI and machine learning could lead to better health care: lawyers - Lexpert

Read More..

Machine Learning to Enable Positive Change - An Interview with Adam Benzion – Elektor

Machine learning can enable positive change in society, says Adam Benzion, Chief Experience Officer at Edge Impulse. Read on to learn how the company is preventing unethical uses of its ML/AI development platform.

Priscilla Haring-Kuipers: What Ethics in Electronics are you are working on?

Adam Benzion: At Edge Impulse, we try to connect our work to doing good in the world as a core value to our culture and operating philosophy. Our founders, Zach Shelby and Jan Jongboom, define this as "machine learning can enable positive change in society," and we are dedicated to supporting applications for good. This is fundamental to what and how we do things. We invest our resources to support initiatives like UN Covid-19 Detect & Protect, Data Science Africa, and wildlife conservation with Smart Parks, Wildlabs, and ConservationX.

This also means we have a responsibility to prevent unethical uses of our ML/AI development platform. When Edge Impulse launched in January 2020, we decided to require a Responsible AI License for our users, which prevents use for criminal purposes, surveillance, or harmful military or police applications. We have had a couple of cases where we turned down projects that were not compatible with this license. There are also many positive uses for ML in governmental and defense applications, which we do support as compatible with our values.

We also joined 1% for the Planet, pledging to donate 1% of our revenue to support nonprofit organizations focused on the environment. I personally lead an initiative that focuses on elephant conservation, where we have partnered with an organization called Smart Parks and helped develop a new AI-powered tracking collar that can last for eight years and be used to understand how the elephants communicate with each other. This is now deployed in parks across Mozambique.

Haring-Kuipers: What is the most important ethical question in your field?

Benzion: There are a lot of ethical issues with AI being used in population control, human recognition and tracking, let alone AI-powered weaponry. Especially where we touch human safety and dignity, AI-powered applications must be carefully evaluated, legislated and regulated. We dream of automation, fun magical experiences, and human-assisted technologies that do things better, faster and at a lesser cost. That's the good AI dream, and that's what we all want to build. In a perfect world, we should all be able to vote on the rules and regulations that govern AI.

Haring-Kuipers: What would you like to include in an Electronics Code of Ethics?

Benzion: We need to look at how AI impacts human rights and machine accountability: when AI-powered machines fail, as in the case of autonomous driving, who takes the blame? Without universal guidelines to support us, it is up to every company in this field to find its core values and boundaries so we can all benefit from this exciting new wave.

Haring-Kuipers: An impossible choice ... The most important question before building anything is? A) Should I build this? B) Can I build this? C) How can I build this?

Benzion: A. Within reason, you can build almost anything, so ask yourself: Is the effort vs. outcome worth your precious time?

Priscilla Haring-Kuipers writes about technology from a social science perspective. She is especially interested in technology supporting the good in humanity and a firm believer in effect research. She has an MSc in Media Psychology and makes This Is Not Rocket Science happen.

More:
Machine Learning to Enable Positive Change An Interview with Adam Benzion - Elektor

Read More..

Artificial Intelligence in Drug Discovery Market worth $4.0 billion by 2027 Exclusive Report by MarketsandMarkets – GlobeNewswire

Chicago, June 15, 2022 (GLOBE NEWSWIRE) -- According to the new market research report AI in Drug Discovery Market by Offering (Software, Service), Technology (Machine Learning, Deep Learning), Application (Cardiovascular, Metabolic, Neurodegenerative), End User (Pharma, Biotech, CROs) - Global Forecasts to 2027, published by MarketsandMarkets, the global Artificial Intelligence in Drug Discovery Market is projected to reach USD 4.0 billion by 2027 from USD 0.6 billion in 2022, at a CAGR of 45.7% during the forecast period.
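As a quick sanity check on those figures, the compound annual growth rate over the five-year forecast window is (end / start)^(1/years) - 1; the rounded $0.6 billion and $4.0 billion values give roughly 46%, consistent with the reported 45.7% once unrounded figures are used:

```python
# CAGR sanity check using the rounded figures quoted in the report summary.
start_value = 0.6   # USD billions, 2022
end_value = 4.0     # USD billions, 2027
years = 5

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"{cagr:.1%}")  # ~46.1%; the report's 45.7% reflects unrounded inputs
```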

Browse in-depth TOC on "Artificial Intelligence (AI) in Drug Discovery Market" - 177 Tables, 33 Figures, 198 Pages

Download PDF Brochure: https://www.marketsandmarkets.com/pdfdownloadNew.asp?id=151193446

The growth of this Artificial Intelligence in Drug Discovery Market is driven by the growing need to control drug discovery & development costs and the growing number of cross-industry collaborations and partnerships. On the other hand, a lack of data sets in the field of drug discovery and the inadequate availability of skilled labor are some of the factors challenging the growth of the market.

Services segment is expected to grow at the highest rate during the forecast period.

Based on offering, the AI in drug discovery market is segmented into software and services. In 2021, the services segment accounted for the largest share of the global AI in drug discovery market and is also expected to grow at the highest CAGR during the forecast period. The benefits associated with AI services and the strong demand for AI services among end users are the key factors driving the growth of this market segment.

Machine learning technology segment accounted for the largest share of the global AI in drug discovery market.

Based on technology, the AI in drug discovery market is segmented into machine learning and other technologies. The machine learning segment accounted for the largest share of the global market in 2021 and is expected to grow at the highest CAGR during the forecast period. The machine learning technology segment is further segmented into deep learning, supervised learning, reinforcement learning, unsupervised learning, and other machine learning technologies. The deep learning segment accounted for the largest share of the market in 2021 and is also expected to grow at the highest CAGR during the forecast period.

Request Sample Pages: https://www.marketsandmarkets.com/requestsampleNew.asp?id=151193446

The immuno-oncology application segment accounted for the largest share of the AI in drug discovery market in 2021.

On the basis of application, the AI in drug discovery market is segmented into neurodegenerative diseases, immuno-oncology, cardiovascular diseases, metabolic diseases, and other applications. The immuno-oncology segment accounted for the largest share of the market in 2021, owing to the increasing demand for effective cancer drugs. The neurodegenerative diseases segment is estimated to register the highest CAGR during the forecast period. The role of AI in resolving existing complexities in neurological drug development and strategic collaborations between pharmaceutical companies & solution providers are the key factors responsible for the high growth rate of the neurodegenerative diseases segment.

Pharmaceutical & biotechnology companies segment accounted for the largest share of the global AI in drug discovery market.

On the basis of end user, the AI in drug discovery market is segmented into pharmaceutical & biotechnology companies, CROs, and research centers and academic & government institutes. The pharmaceutical & biotechnology companies segment accounted for the largest share of the AI in drug discovery market in 2021, while the research centers and academic & government institutes segment is projected to register the highest CAGR during the forecast period. The strong demand for AI-based tools in making the entire drug discovery process more time- and cost-efficient is driving the growth of this end-user segment.

Speak to Analyst: https://www.marketsandmarkets.com/speaktoanalystNew.asp?id=151193446

North America is expected to dominate the Artificial Intelligence in Drug Discovery Market in 2022.

North America accounted for the largest share of the global AI in drug discovery market in 2021 and is also expected to grow at the highest CAGR during the forecast period. North America, which comprises the US, Canada, and Mexico, forms the largest market for AI in drug discovery. These countries have been early adopters of AI technology in drug discovery and development. The presence of key established players, a well-established pharmaceutical and biotechnology industry, and a high focus on R&D with substantial investment are some of the key factors responsible for the large share and high growth rate of this market.

Top Key Players in Artificial Intelligence in Drug Discovery Market are:

Players in AI in Drug Discovery Market adopted organic as well as inorganic growth strategies such as product upgrades, collaborations, agreements, partnerships, and acquisitions to increase their offerings, cater to the unmet needs of customers, increase their profitability, and expand their presence in the global market.

Browse Adjacent Markets: Healthcare IT Market Research Reports & Consulting

Browse Related Reports:

Drug Discovery Services Market by Process (Target Selection, Validation, Hit-to-lead), Type (Chemistry, Biology), Drug Type (Small molecules, biologics), Therapeutic Area (Oncology, Neurology), End User (Pharma, Biotech) - Global Forecast to 2026: https://www.marketsandmarkets.com/Market-Reports/drug-discovery-services-market-138732129.html

Artificial Intelligence In Genomics Market by Offering (Software, Services), Technology (Machine Learning, Computer Vision), Functionality (Genome Sequencing, Gene Editing), Application (Diagnostics), End User (Pharma, Research) - Global Forecasts to 2025: https://www.marketsandmarkets.com/Market-Reports/artificial-intelligence-in-genomics-market-36649899.html

Read more:
Artificial Intelligence in Drug Discovery Market worth $4.0 billion by 2027 Exclusive Report by MarketsandMarkets - GlobeNewswire

Read More..

Machine learning-led decarbonisation platform Ecolibrium launches in the UK – Yahoo Finance

The advisory and climate tech-led sustainability solution has opened a new London HQ after raising $5m in a pre-Series A funding round, to support growing demand from commercial and industrial UK real estate owners striving to meet net zero carbon targets

Ecolibrium's Head of Commercial Real Estate Yash Kapila (left) and CEO Chintan Soni (right) will lead the business's UK expansion from its new London HQ. Image credit: Max Lacome

UK expansion builds on considerable success in Asia Pacific, where Ecolibrium's technology has been deployed across 50 million sq ft by globally renowned brands including Amazon, Fiat, Honeywell, Thomson Reuters, Tata Power, and the Delhi Metro

The $5m pre-Series A funding round was co-led by Amit Bhatia's Swordfish Investments and Shravin Bharti Mittal's Unbound venture capital firm

Launches in the UK today having already signed its first commercial contract with Integral, real estate giant JLL's engineering and facilities service business

LONDON, June 13, 2022 /PRNewswire/ -- Machine learning-led decarbonisation platform Ecolibrium has today launched its revolutionary sustainability solution in the UK, as the race to reduce carbon emissions accelerates across the built environment.

Founded in 2008 by entrepreneur brothers Chintan and Harit Soni at IIM Ahmedabad's Centre for Innovation, Incubation and Entrepreneurship in India, Ecolibrium provides expert advisory as well as technology-driven sustainability solutions to enable businesses in commercial and industrial real estate to reduce energy consumption and ultimately achieve their net zero carbon ambitions.

Relocating its global headquarters to London, Ecolibrium has raised $5m in a pre-Series A funding round as it looks to expand its international footprint to the UK. The round was co-led by Amit Bhatia's Swordfish Investments and Shravin Bharti Mittal's Unbound venture capital firm, alongside several strategic investors.

Ecolibrium launches in the UK today having already signed its first commercial contract with Integral, JLL's UK engineering and facilities service business.

The fundraising and UK expansion builds on Ecolibrium's considerable success in Asia Pacific, where its technology is being used across 50 million sq ft by more than 150 companies including Amazon, Fiat, Honeywell, Thomson Reuters, Tata Power, and the Delhi Metro. An annual reduction of 5-15% in carbon footprint has been achieved to date by companies which have deployed Ecolibrium's technology.


Ecolibrium has also strengthened its senior UK management team, as it prepares to roll out its green platform across the UK, by hiring facilities and asset management veteran Yash Kapila as its new head of commercial real estate. Kapila previously held senior leadership positions with JLL across the APAC and EMEA regions.

Introducing SmartSense

At the heart of Ecolibrium's offer is its sustainability-led technology product SmartSense, which assimilates thousands of internet of things (IoT) data points from across a facility's entire energy infrastructure.

This information is then channelled through Ecolibrium's proprietary machine learning algorithms, which have been developed over 10 years by their in-house subject matter experts. Customers can visualise the data through a bespoke user interface that provides actionable insights and a blueprint for achieving operational excellence, sustainability targets, and healthy buildings.

This connected infrastructure generates a granular view of an asset's carbon footprint, unlocking inefficiencies and empowering smart decision-making, while driving a programme of continuous improvement to deliver empirical and tangible sustainability and productivity gains.
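The underlying arithmetic of an energy-driven carbon footprint is metered consumption multiplied by an emission factor for the supplying grid. A minimal sketch with made-up readings and an assumed factor (real deployments would use metered IoT data and region- and year-specific factors):

```python
# Hypothetical half-hourly electricity readings for one asset, in kWh.
readings_kwh = [12.4, 11.8, 13.1, 15.0, 14.2]

# Assumed grid emission factor in kg CO2e per kWh; varies by country and year.
GRID_EMISSION_FACTOR = 0.193

energy_kwh = sum(readings_kwh)
footprint_kg = energy_kwh * GRID_EMISSION_FACTOR
print(f"{energy_kwh:.1f} kWh -> {footprint_kg:.2f} kg CO2e")
```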

Preparing for future regulation

Quality environmental data and proof points are also providing a distinct business advantage at this time of increasing regulatory requirements that require corporates to disclose ESG and sustainability performance. Ecolibrium will work closely with customers to lead the way in shaping their ESG governance.

According to Deloitte, with a minimum Grade B Energy Performance Certification (EPC) requirement anticipated by 2030, 80% of London office stock will need to be upgraded, an equivalent of 15 million sq ft per annum.

Research from the World Economic Forum has found that the built environment is responsible for 40% of global energy consumption and 33% of greenhouse gas emissions, with one-fifth of the world's largest 2,000 companies adopting net zero strategies by 2050 or earlier. Technology holds the key to meeting this challenge, with Ecolibrium and other sustainability-focused changemakers leading the decarbonisation drive.

Chintan Soni, Chief Executive Officer at Ecolibrium, said:"Our mission is to create a balance between people, planet and profit and our technology addresses each of these objectives, leading businesses to sustainable prosperity. There is no doubt the world is facing a climate emergency, and we must act now to decarbonise and protect our planet for future generations.

"By using our proprietary machine learning-led technology and deep in-house expertise, Ecolibrium can help commercial and industrial real estate owners to deliver against ESG objectives, as companies awaken to the fact that urgent action must be taken to reduce emissions and achieve net zero carbon targets in the built environment.

"Our goal is to partner with companies and coach them to work smarter, make critical decisions more quickly and consume less. And, by doing this at scale, Ecolibrium will make a significant impact on the carbon footprint of commercial and industrial assets, globally."

The UK expansion has been supported by the Department for International Trade's Global Entrepreneur Programme. The programme has provided invaluable assistance in setting up Ecolibrium's London headquarters and scaling in the UK market.

In turn, Ecolibrium is supporting the growth of UK innovation, promoting green job creation, and providing tangible economic benefits, as part of the country's wider transition to a more sustainable future.

Minister for Investment Lord Grimstone said: "Tackling climate change is crucial in our quest for a cleaner and green future, something investment will play an important part in.

"That's why I'm pleased to see Ecolibrium's expansion to the UK. Not only will the investment provide a revolutionary sustainability solution to reduce carbon emissions across various sectors, it is a continued sign of the UK as a leading inward investment destination, with innovation and expertise in our arsenal".

About Ecolibrium

Ecolibrium is a machine learning-led decarbonisation platform balancing people, planet and profit to deliver sustainable prosperity for businesses.

Founded in 2008 by entrepreneur brothers Chintan and Harit Soni, Ecolibrium provides expert advisory as well as technology-driven sustainability solutions to enable commercial and industrial real estate owners to reduce energy consumption and ultimately achieve their net zero carbon ambitions.

Ecolibrium's flagship technology product SmartSense is currently being used across 50 million sq ft by more than 150 companies including JLL, Amazon, Fiat, Honeywell, Thomson Reuters, Tata Power, and the Delhi Metro. SmartSense collects real-time information on assets, operational data and critical metrics using internet of things (IoT) technology. This intelligence is then channelled through Ecolibrium's proprietary machine learning algorithms to visualise data and provide actionable insights to help companies make transformative changes to their sustainability goals.

For more information, visit: http://www.ecolibrium.io

For press enquiries, contact: FTI Consulting: ecolibrium@fticonsulting.com, +44 (0) 2037271000

Photo - https://mma.prnewswire.com/media/1837227/Ecolibrium_Yash_Kapila_and_Chintan_Soni.jpg

Cision

View original content to download multimedia:https://www.prnewswire.com/news-releases/machine-learning-led-decarbonisation-platform-ecolibrium-launches-in-the-uk-301566340.html

SOURCE Ecolibrium

See the original post here:
Machine learning-led decarbonisation platform Ecolibrium launches in the UK - Yahoo Finance

Read More..