
Getting Prices Right in 2021 – Progressive Grocer

People will always want to shop and interact with others, so some elements of 2020 will only last as long as there is a virus risk, but for the most part, the trends that grew last year will stay in place, he notes.

Two examples of those newer trends? Grocery deliveries and curbside pickup. But keeping up with consumer demand for those services in 2021 and beyond, and keeping ahead of competitors, will require more food retailers to invest in software as a service (SaaS) or other forms of optimization technology. Price optimization remains one of the best ways to translate what consumers like into viable actions to stay relevant as trends evolve over time, Pavich says.

Such SaaS technology can provide the benefit of a team to retailers that deploy those systems, a reflection of a larger trend in the food retail world in 2021, as business becomes ever more digital and new and real-time data points accumulate more quickly.

Retailers using SaaS pricing platforms also benefited from being part of a community of customers, having strategic partners who were able to monitor broader pricing trends and advise on best practices during very challenging times, Pavich points out. Imagine how Apollo 13 would have turned out if the crew didn't have all of their instruments, gauges, and a room full of experts and scientists in Houston advising them and helping them through their crisis.

Even if a particular food retailer isn't quite ready to deploy a new price and promotional optimization strategy, or has yet to earmark the money for doing so, work toward that goal can be completed now.

The main thing that retailers should be doing is re-evaluating and refreshing their strategies to reflect the new realities of the current market, Pavich advises. This includes taking a new look at pricing zones, competitive indexes, KVIs, category pricing roles and other key aspects of their pricing strategy. High-quality analytics and industry best practices can help retailers build a pricing strategy to meet today's needs while driving sustainable value in 2021 and beyond.

It's hard to go even a day in the world of food retail without hearing about some new deployment of artificial intelligence, or fresh boasting about its near-term promise. That holds true when it comes to pricing and promotional optimization, but don't make the mistake of thinking that everything is about AI.

Other technologies and processes also matter significantly, according to Maia Brenner, a data scientist for Tryolabs, a San Francisco-based data science consulting firm.

That includes machine learning, kind of like the less sophisticated but useful older cousin of AI. Brenner says that mastering price optimization in 2021 requires food retailer tech departments to either master machine-learning techniques or find a partner that can. To do that, retailers need to break past old habits and embrace the future. Many grocery operators proved they could do that in 2020 when they went full force with e-commerce and associated services. Now, that same attitude is required for better price optimization.

Old statistical univariate forecasting tools aren't sufficient, since future sales can't be predicted only by having a look at last year's or last month's sales, Brenner observes. Without doubt, handling new real-time structured or unstructured data sources also helps anticipate the future and make better decisions. For instance, price optimization technologies can be powered up with computer vision and natural language-processing techniques, providing more information to perform better recommendations and personalization.
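To make Brenner's point concrete, here is a minimal sketch of the difference between a naive look-back forecast and a multivariate machine-learning forecast. The feature names (price, promotion flag, day of week, temperature) and the synthetic data are illustrative assumptions, not figures from the article:

```python
# Minimal sketch: a multivariate demand forecast vs. a naive "same day last year" look-back.
# Feature names and data are illustrative assumptions, not fields from the article.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 730  # two years of daily observations for one item
price = rng.uniform(1.5, 3.0, n)
promo_flag = rng.integers(0, 2, n)
day_of_week = np.arange(n) % 7
temperature = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)
units = (120 - 25 * price + 30 * promo_flag + 5 * (day_of_week >= 5)
         + 0.8 * temperature + rng.normal(0, 8, n))

X = np.column_stack([price, promo_flag, day_of_week, temperature])
train, test = slice(0, 700), slice(700, n)

model = GradientBoostingRegressor(random_state=0)
model.fit(X[train], units[train])
ml_forecast = model.predict(X[test])

naive_forecast = units[np.arange(700, n) - 365]  # "same day last year" baseline

print("naive MAE:", mean_absolute_error(units[test], naive_forecast))
print("ML MAE:   ", mean_absolute_error(units[test], ml_forecast))
```

On this synthetic data, the multivariate model cuts the forecast error sharply because it can use price, promotion and seasonality information the look-back baseline ignores.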

She also stresses the importance of investing in real-time data sources and crafting a pricing strategy with enough flexibility to be able to analyze all of that information and respond accordingly and before competitors do.

Another anticipated change serves to further underscore why food retailers this year need to take price optimization even more seriously than has been the case.

In 2020, consumers took fewer trips, went to fewer retailers and spent more per trip, notes Edris Bemanian, CEO of Engage3, a price optimization firm with offices in Davis, Calif., and Scottsdale, Ariz. During this period, prices skyrocketed due to lack of supply, and a significant reduction in promotions driven by the lack of supply. At the same time, a greater percentage of sales than ever before shifted to e-commerce, and retailers had to be more sensitive to pricing actions that would be perceived as unethical or unfair.

In 2021, according to Bemanian, ongoing economic constraints promise to place an even higher focus on price. Price optimization will need to shift to be able to deal with smaller trip sizes, households with less money to spend on each trip, shifts to private label, and smaller pack sizes, he predicts. Historical price optimization models of increasing price to drive profit won't work. Retailers will need to focus on the items that drive their price image to keep traffic up.
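A minimal sketch of the kind of shift Bemanian describes might look like the following: rather than raising prices across the board, hold key value items (KVIs) at their image-defining prices and only optimize the rest. The item list, elasticities and costs are invented for illustration:

```python
# Minimal sketch: profit-seeking price moves, but with key value items (KVIs) held
# at their current, image-defining price. Items, elasticities and costs are invented.
items = {
    #  name          base_price  unit_cost  elasticity  is_kvi
    "milk_1gal":     (3.29,      2.60,      -2.5,       True),
    "bananas_lb":    (0.59,      0.40,      -2.2,       True),
    "cereal_box":    (4.49,      2.80,      -1.2,       False),
    "dish_soap":     (2.99,      1.70,      -0.8,       False),
}

def best_price(base_price, unit_cost, elasticity, is_kvi, base_units=100.0):
    """Grid-search a simple constant-elasticity demand model for the most
    profitable price, capping KVIs at their current price to protect price image."""
    cap = base_price if is_kvi else base_price * 1.15
    candidates = [round(base_price * (0.85 + 0.01 * i), 2) for i in range(31)]
    best = None
    for p in candidates:
        if p > cap:
            continue
        units = base_units * (p / base_price) ** elasticity
        profit = (p - unit_cost) * units
        if best is None or profit > best[1]:
            best = (p, profit)
    return best

for name, (bp, cost, elas, kvi) in items.items():
    price, profit = best_price(bp, cost, elas, kvi)
    print(f"{name:12s} price ${price:.2f}  expected profit {profit:.0f}")
```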

In other words, the future is pretty much now when it comes to better pricing and promotions. The race toward better optimization could even prove more important in the longer term than the fight to win over online shoppers.

See original here:

Getting Prices Right in 2021 - Progressive Grocer


Why Artificial Intelligence May Not Offer The Business Value You Think – CMSWire


Last September, Gartner published its Hype Cycle for AI in which it identified two emerging trends (and five new AI solutions) that would have an impact on the workplace. One of those trends was what Gartner described as the democratization of AI. While there are many ways that this can be interpreted, in simple terms what it means for workers is the general distribution and use of AI across the digital workplace to achieve business goals.

In the enterprise, the target deployment of AI is now likely to include customers, business partners, business executives, salespeople, assembly line workers, application developers and IT operations professionals. As AI reaches a larger set of employees and partners, it requires new enterprise roles to deliver it to a wider audience.

While this was an emerging trend last summer, with COVID-19 and the adoption of many new technologies to enable remote working, the widespread use of AI, though the evidence is still only anecdotal, now appears to be an established fact in the workplace.

Bill Galusha, senior director of marketing at Carlsbad, Calif.-based digital intelligence company ABBYY, points out, however, that this is not a new phenomenon. In the past couple of years, we've seen AI enabling technology like OCR and machine learning become more accessible to non-technical employees and partners through no code/low code platforms, he said.

He points out that the technologies designed to help workers understand and extract insights from content have been in high demand as more digital workers increase the number of tasks knowledge workers have to perform.

In practical terms, these new AI platforms enable users to design cognitive skills that can be easily trained to take unstructured data from types of documents like invoices, utility bills, IDs and contracts, or to access trained cognitive skills available through online digital marketplaces. This new approach of making it easy to train machine learning content models and deliver them as skills in a marketplace is certainly going to fuel the growth and reusability of AI as businesses look to automate all types of content-centric processes across the enterprise, he said.
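As a rough illustration of the kind of trainable skill Galusha describes, the sketch below classifies a document's text and pulls out a couple of fields. The training snippets, labels and regular expressions are invented, and a real platform would also handle OCR and document layout, which this skips:

```python
# Minimal sketch of a trainable document "skill": classify short document text
# (invoice / utility bill / contract) and pull out simple fields.
# Training snippets, labels and regexes are invented for illustration.
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = [
    "Invoice number INV-104 total due $1,200.00 payment terms net 30",
    "Invoice INV-221 amount payable $88.50 remit to accounts receivable",
    "Electricity utility bill account 55-301 kWh used 420 amount due $63.10",
    "Water utility bill account 88-112 usage 9 CCF amount due $41.75",
    "This services contract is entered into by and between the parties hereto",
    "Master contract agreement effective date January 1 obligations of the vendor",
]
train_labels = ["invoice", "invoice", "utility_bill", "utility_bill", "contract", "contract"]

doc_classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                               LogisticRegression(max_iter=1000))
doc_classifier.fit(train_texts, train_labels)

AMOUNT = re.compile(r"\$\s?([\d,]+\.\d{2})")
INVOICE_NO = re.compile(r"INV-\d+")

def apply_skill(text: str) -> dict:
    """Return the predicted document type plus any fields the regexes find."""
    fields = {"doc_type": doc_classifier.predict([text])[0]}
    if (m := AMOUNT.search(text)):
        fields["amount"] = m.group(1)
    if (m := INVOICE_NO.search(text)):
        fields["invoice_number"] = m.group(0)
    return fields

print(apply_skill("Invoice INV-903 for consulting, total due $2,450.00"))
```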


However, if AI is being used widely across the enterprise, it does not necessarily follow that it is providing business value to every organization, according to Chris Bergh, CEO of Cambridge, Mass.-based DataKitchen, a DataOps consultancy and platform provider.

AI is being deployed everywhere we look, but there is a problem that no one talks about. Machine learning tools are evolving to make it faster and less costly to develop AI systems. But deploying and maintaining these systems over time is getting exponentially more complex and expensive, he told us.

Data science teams are incurring enormous technical debt by deploying systems without the processes and tools to maintain, monitor and update them. Further, poor quality data sources create unplanned work and cause errors that invalidate results.

This is the heart of the problem and one that is likely to impact the bottom line of any business that uses AI. The AI code or model is a small fraction of what it takes to deploy and maintain a model successfully. This means that the delivery of a system that supports an AI model in an application context is an order of magnitude more complex than the model itself. You can't manage the lifecycle complexity of AI systems with an army of programmers. The world changes too fast. Data constantly flows and models drift into ineffectiveness. The solution requires workflow automation, he said.
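One small piece of the workflow automation Bergh calls for is an automated drift check that compares live feature distributions against the training baseline and flags when a model needs retraining. The sketch below is illustrative: the Kolmogorov-Smirnov test, the threshold and the feature names are assumptions, not anything prescribed in the article:

```python
# Minimal sketch of an automated drift check: compare live feature distributions
# against the training baseline and flag when retraining may be needed.
# The KS test and the 0.05 threshold are illustrative choices.
import numpy as np
from scipy.stats import ks_2samp

def check_drift(baseline, live, feature_names, alpha=0.05):
    """Return the features whose live distribution differs from training."""
    drifted = []
    for i, name in enumerate(feature_names):
        stat, p_value = ks_2samp(baseline[:, i], live[:, i])
        if p_value < alpha:
            drifted.append((name, round(stat, 3)))
    return drifted

rng = np.random.default_rng(1)
features = ["basket_size", "days_since_last_visit", "avg_item_price"]
baseline = rng.normal(loc=[12.0, 7.0, 3.2], scale=[3.0, 2.0, 0.5], size=(5000, 3))
live = rng.normal(loc=[9.0, 7.1, 3.2], scale=[3.0, 2.0, 0.5], size=(800, 3))  # basket size has shifted

flagged = check_drift(baseline, live, features)
if flagged:
    print("Drift detected, trigger retraining pipeline:", flagged)
else:
    print("No drift detected.")
```

In a production setting the same check would run on a schedule and kick off a retraining job automatically, which is the kind of automation Bergh argues an army of programmers cannot replace.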

There is another problem for businesses too. Given the explosion in the amount of data that is available to them, at first glance you would think that developing AI was getting easier and, consequently, easier to deploy and democratize across the enterprise. Not so, according to Chris Nicholson, CEO of San Francisco-based Pathmind, which develops a SaaS platform that enables businesses to apply reinforcement learning to real-world scenarios without data science expertise.

The real problem, he argues, is that you cannot decouple algorithms from data, and the data is not being democratized, or made available, across the organization. In many cases, as with GDPR, the data is getting harder to access, and because the data is not being democratized, most startups and companies will not be able to train AI models to perform well, because each team is limited to the data it can access.

In a few cases, a general-purpose machine-learning model can be trained and made available behind an API. In this case, developers can build products on top of it, and that very particular type of AI is slowly percolating into products and impacting customers' lives. But, in most cases, businesses have custom needs that can only be met by training on custom data, and custom data is expensive to collect, store, label and stream, he said. At best, AI is a feature. In the best companies, data scientists embed with developers to understand the ecosystem of the data and the code, and then they embed their algorithms in that flow.

Like the discussion around citizen data scientists (and democratizing data science), business leaders need to know what they want this new democratized AI to do. They will not be able to design and build AI models from scratch; that will always require an understanding of what the underlying methods and parameters do, which requires theoretical knowledge.

Given some gray box AI systems, one can envision such systems learning to solve well-defined classes of problems when they are trained or embedded by non-AI experts, Michael Berthold, Switzerland-based KNIME CEO and co-founder, said. Examples he cites are object recognition in images, speech recognition, or probably also quality control via noise and image tracking. Note that already here choosing the right data is critical so the resulting AI is not already biased by data selection.

I think this area will see growth, and if we consider this democratization of AI, then yes, it will grow, he added. But we will also see many instances where the semi-automated system fails to do what it is supposed to do because the task did not quite fit what it was designed to do, or the user fed it misleading data.

It is possible to envision a shallower training enabling people to use and train such preconfigured AI systems without understanding all the algorithmic details. Kind of like following boarding instructions to fly on a plane vs. learning how to fly the plane itself.

If organizations take this path to develop AI, there are two ways enterprises can push AI to a broader audience. Simplify the tools and make them more intuitive, David Tareen, director of AI and analytics at Raleigh, N.C.-based SAS, told us.

Simplified Tools - A tool like conversational AI helps because it makes interacting with AI so much simpler. You do not have to build complex models but you can gain insights from your data by talking with your analytics.

Intuitive Tools - These tools should make AI easier to consume by everyone. This means taking your data and algorithms to the cloud to become cloud native. Becoming cloud native improves accessibility and reduces the cost of AI and analytics for all.

If organizations do this, they will see benefits everywhere. He cites the example of an insurance company: one that uses AI throughout the organization will reduce the cost of servicing claims, reduce the time to service claims, and improve customer satisfaction compared with the rest of the industry. He adds that some enterprise leaders are also surprised to learn that enabling AI across the enterprise involves more than the process itself. Often culture tweaks or an entire cultural change must accompany the process.

Leaders can practice transparency and good communication in their AI initiatives to address concerns, adjust the pace of change, and bring about a successful completion of embedding AI and analytics for everyone, everywhere.

View original post here:

Why Artificial Intelligence May Not Offer The Business Value You Think - CMSWire


Awful Earnings Aside, the Dip in Alteryx Stock Is Worth Buying – InvestorPlace

Shares of Alteryx (NASDAQ:AYX) dropped big in February after the data analytics company reported fourth-quarter numbers that, while beating estimates, revealed structurally weakening growth trends. The guide also called for these weaknesses to persist for the foreseeable future. Naturally, AYX stock plunged after the print.


Bad news aside, this dip is an opportunity for AYX bulls to buy in at a discount.

There's no sugarcoating it, though: Alteryx's growth trends look awful. Customer growth is slowing. Revenue growth is slowing. Margins are compressing. Profits are turning into losses. Pretty much nothing looks good right now.

But we don't value businesses based on what they are today. We value them based on what they will be tomorrow. And tomorrow, Alteryx will once again be a hypergrowth, hyper-profitable company with great prospects.

So, buy the dip in AYX stock!

Here's a deeper look:

Alteryx had a bad fourth quarter, like a really, really bad fourth quarter.

The company added just 128 customers in the quarter. In the year-ago quarter, Alteryx added 474 customers. In each of the past 18 quarters, Alteryx has added more than 200 customers. So, adding just 128 customers in the quarter is unusually bad for Alteryx. Indeed, it rounds out to 16% customer growth, its slowest growth rate ever.

Meanwhile, average revenue per customer dropped 15%, leading to meager revenue growth of just 3%. Revenues are expected to drop next quarter. Last year, this was a 50%-plus revenue growth company. Clearly, things are decelerating, and fast.

Worse yet, this slowdown in growth is killing margins, because the company isn't able to cut back on expense growth as fast as revenue growth is falling. Operating margins compressed two points year-over-year in Q4, and are expected to drop nearly 20 points next quarter.

Things are not going well for Alteryx right now. There's no other way to put it.

It's no wonder that AYX stock fell off a cliff.

When it comes to Alteryx, you have a classic case of near-term pain, long-term gain.

Data-driven decision making is the future of the business world. Alteryx provides an end-to-end platform, which enables this data-driven decision making, by giving enterprises the analytics and tools necessary to turn raw mountains of messy data into clean, actionable insights.

Importantly, Alteryx does this in a friendly, low-code, easy-to-learn and easy-to-use software environment. That is, you don't need to be a data scientist or have a computer science degree in order to make use of the Alteryx platform. Alteryx enables regular Joes to make advanced data-driven decisions.

That's big, because most companies don't have big data skills. All the data scientists in the world are going to work for Facebook (NASDAQ:FB) and Microsoft (NASDAQ:MSFT), while only 6% of large companies and very few small businesses employ even a single data scientist.

So, as we pivot into a data-driven future, most companies are going to lean into low-code, easy-to-use data science platforms to help them make data-driven decisions. Alteryx is the best-in-breed provider of these solutions and, to that end, the company is going to sell a lot of enterprise seats to its data science platform over the next several years.

The company just hit a rough spot amid the pandemic because businesses leaned up their budgets. But we have multiple highly-effective Covid-19 vaccines that are being distributed rapidly, and it increasingly appears that normal is coming back at some point in 2021/22. As normal returns, businesses will re-up their budgets, and the Alteryx growth narrative will once again fire on all cylinders.

So, amid this ephemeral choppiness, it's best to buy AYX stock and ignore the noise.

I've revised my long-term model to account for Alteryx's slowdown in growth in 2020. Still, I believe AYX stock is worth much more than its current price tag.

I see Alteryx as a company that will, post-2020, maintain 10%-plus revenue growth into 2030. Gross margins are up at 90%, so double-digit revenue growth should drive positive operating leverage and allow for sustainable and steady operating margin expansion.

Plugging those assumptions into my model, I see Alteryx growing earnings to $8 per share by 2030. Based on a 35X forward earnings multiple and an 8% annual discount rate, that implies a 2021 price target for AYX stock of $280.

Alteryx had an awful fourth quarter. Oh well. It happens.

This is still a great company, with great growth prospects, and a highly scalable business model. It will sprint back into hypergrowth mode throughout 2021/22 as Covid-19 headwinds on enterprise spending fade.

As the company does that, AYX stock will rebound.

Buy the dip before that rebound.

On the date of publication, Luke Lango did not have (either directly or indirectly) any positions in the securities mentioned in this article.

By uncovering early investments in hypergrowth industries, Luke Lango puts you on the ground-floor of world-changing megatrends. It's how his Daily 10X Report has averaged up to a ridiculous 100% return across all recommendations since launching last May. Click here to see how he does it.

Read more:

Awful Earnings Aside, the Dip in Alteryx Stock Is Worth Buying - InvestorPlace


Scientists use machine learning to tackle a big challenge in gene therapy – STAT

As the world charges to vaccinate the population against the coronavirus, gene therapy developers are locked in a counterintuitive race. Instead of training the immune system to recognize and combat a virus, they're trying to do the opposite: designing viruses the body has never seen, and can't fight back against.

It's OK, really: These are adeno-associated viruses, which are common and rarely cause symptoms. That makes them the perfect vehicle for gene therapies, which aim to treat hereditary conditions caused by a single faulty gene. But they introduce a unique challenge: Because these viruses already circulate widely, patients' immune systems may recognize the engineered vectors and clobber them into submission before they can do their job.


Here is the original post:
Scientists use machine learning to tackle a big challenge in gene therapy - STAT


How to expand your machine learning capabilities by installing TensorFlow on Ubuntu Server 20.04 – TechRepublic

If you're looking to add Machine Learning to your Python development, Jack Wallen shows you how to quickly install TensorFlow on Ubuntu Desktop 20.04.


TensorFlow is an open source development platform for machine learning (ML). With this software platform, you'll get a comprehensive collection of tools, libraries, and various resources that allow you to easily build and deploy modern ML-powered applications. Beginners and experts alike can make use of this end-to-end platform, and create ML models to solve real-world problems.

How do you get started? The first thing you must do is get TensorFlow installed on your machine. I'm going to show you how to make that happen on Ubuntu Desktop 20.04.


The first thing to be done is the installation of the necessary dependencies. It just so happens these are all about Python. Log in to your desktop and install the dependencies with the command:

Python should now be installed, so you're ready to continue on.

How we install TensorFlow is from within a Python virtual environment. Create a new directory to house the environment with the command:

Change into that newly created directory with the command:

Next, create the Python virtual environment with the command:

Once the above command completes, we must then activate the virtual environment, using the source command like so:

After you activate the environment, your command prompt should change, such that it begins with (venv) (Figure A).

Figure A

We've activated our virtual Python environment.

Next, we're going to upgrade Pip with the command:

We can now install TensorFlow with the command:

TensorFlow should now be installed on your system. To verify, issue the command:

The command will return the TensorFlow version number (Figure B).

Figure B

I'm running the Ubuntu 20.04 desktop on a virtual machine, so GPU is an issue.
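The article's exact verification command is not preserved in this copy; from inside the activated virtual environment, a short Python check like the following does the same job:

```python
# A typical verification step (the article's exact command was not preserved here).
# Run inside the activated virtual environment, e.g. as a one-liner or a small script.
import tensorflow as tf

print(tf.__version__)  # prints the installed TensorFlow version string
```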

Make sure, when you're done working, that you deactivate the Python virtual environment with the command:

And there you have it, you've successfully installed TensorFlow on Ubuntu Desktop 20.04 and are ready to start adding machine learning to your builds.
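As a quick next step, here is a minimal, self-contained Keras example of the kind of model TensorFlow lets you build and train end to end. The toy data and the one-neuron network are purely illustrative and are not from the article:

```python
# Minimal TensorFlow/Keras example: learn y = 2x - 1 from a handful of points.
# The data and tiny architecture only confirm the platform works end to end.
import numpy as np
import tensorflow as tf

x = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=np.float32)
y = 2.0 * x - 1.0

model = tf.keras.Sequential([tf.keras.layers.Dense(units=1, input_shape=(1,))])
model.compile(optimizer="sgd", loss="mean_squared_error")
model.fit(x, y, epochs=200, verbose=0)

print(model.predict(np.array([[10.0]], dtype=np.float32)))  # should be close to 19
```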


Read more from the original source:
How to expand your machine learning capabilities by installing TensorFlow on Ubuntu Server 20.04 - TechRepublic


Getting Help from HAL: Applying Machine Learning to Predict Antibiotic Resistance – Contagionlive.com

Highlighted Study: Lewin-Epstein O, Baruch S, Hadany L, Stein GY, Obolski U. Predicting antibiotic resistance in hospitalized patients by applying machine learning to electronic medical records. Clin Infect Dis. Published online October 18, 2020. doi:10.1093/cid/ciaa1576

Appropriate empirical antimicrobial therapy is paramount for ensuring the best outcomes for patients. The literature shows that inappropriate antimicrobial therapy for infections caused by resistant pathogens leads to worse outcomes.1,2 Additionally, increased use of broad spectrum antibiotics in patients without resistant pathogens can lead to unintended consequences.3-5 As technology advances, it may enable clinicians to better prescribe empiric antimicrobials. Lewin-Epstein et al studied the potential for machine learning to optimize the use of empiric antibiotics in patients who may be harboring resistant bacteria.

As machine learning and artificial intelligence technology improves, investigators are examining new ways to implement it in practice. Lewin-Epstein et al studied the potential for machine learning to predict antibiotic resistance in hospitalized patients.6 This study specifically targeted the use of empiric antibiotics, attempting to reduce their use in patients who may be harboring resistant bacteria.

The single-center retrospective study was conducted in Israel from May 2013 through December 2015 using electronic medical records of patients who had positive bacterial culture results and resistance profiles for the antibiotics of interest. The investigators studied 5 antibiotics from commonly prescribed antibiotic classes: ceftazidime, gentamicin, imipenem, ofloxacin, and sulfamethoxazole-trimethoprim. The data set included 16,198 samples for patients who had positive bacterial culture results and sensitivities for these 5 antibiotics. The most common bacterial species were Escherichia coli, Klebsiella pneumoniae, coagulase negative Staphylococcus, and Pseudomonas aeruginosa. The investigators also collected patient demographics, comorbidities, hospitalization records, and information on previous inpatient antibiotic use.

Employing a supervised machine learning approach, they created a model comprising 3 submodels to predict antibiotic resistance. The first 85% of data were used to train the model, whereas the remainder were used to test it. During training, the investigators identified the variable with the highest effect on prediction: the rate of previous antibiotic-resistant infections, regardless of whether the bacterial species was included in the analysis. Other important variables included previous hospitalizations, nosocomial infections, previous antibiotic usage, and patient functioning and independence levels. The model was trained in multiple ways to identify which manner of use would be the most accurate. In one analysis, the model was trained and evaluated on each antibiotic individually. In another, it was trained and evaluated on all 5 antibiotics. The model was also evaluated when the bacterial species was included and excluded. The model's success was defined by the area under the receiver-operating characteristic (auROC) curve and balanced accuracy, which is the unweighted average of the sensitivity and specificity rates.
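For readers who want to see the shape of such an evaluation, the sketch below trains an ensemble of three submodels on an 85/15 split and scores it by auROC and balanced accuracy. The synthetic features only stand in for the kinds of EMR variables the study used (prior resistant cultures, hospitalizations, prior antibiotic use); none of this is the study's data or code:

```python
# Minimal sketch of the evaluation style described: an ensemble of submodels, an
# 85/15 train/test split, and scoring by auROC and balanced accuracy.
# Synthetic features stand in for EMR variables; this is not the study's data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, balanced_accuracy_score

rng = np.random.default_rng(42)
n = 4000
prior_resistant_rate = rng.beta(1, 4, n)        # share of past cultures that were resistant
prior_hospitalizations = rng.poisson(1.5, n)
prior_antibiotic_use = rng.integers(0, 2, n)
X = np.column_stack([prior_resistant_rate, prior_hospitalizations, prior_antibiotic_use])
logit = 3 * prior_resistant_rate + 0.3 * prior_hospitalizations + 0.8 * prior_antibiotic_use - 2
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)   # 1 = resistant isolate

split = int(0.85 * n)                           # first 85% train, remainder test
ensemble = VotingClassifier(
    estimators=[
        ("gb", GradientBoostingClassifier(random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("lr", LogisticRegression(max_iter=1000)),
    ],
    voting="soft",
)
ensemble.fit(X[:split], y[:split])

proba = ensemble.predict_proba(X[split:])[:, 1]
pred = ensemble.predict(X[split:])
print("auROC:            ", round(roc_auc_score(y[split:], proba), 3))
print("balanced accuracy:", round(balanced_accuracy_score(y[split:], pred), 3))
```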

The ensemble model, which was made up of the 3 submodels, was effective at predicting bacterial resistance, especially when the bacteria causing the infection were identified for the model. When the bacterial species was identified, the auROC score ranged from 0.8 to 0.88 versus 0.73 to 0.79 when the species was not identified. These results are more promising than previous studies on the use of machine learning in identifying resistant infections, despite this study incorporating heterogenous data and multiple antibiotics. Previous studies that only included 1 species or 1 type of infection yielded auROC scores of 0.7 to 0.83. This shows that using the composite result of multiple models may be more successful at predicting antibiotic resistance.

One limitation of this study is that it did not compare the model with providers' abilities to recognize potentially resistant organisms and adjust therapy accordingly. Although this study did not directly make a comparison, a previous study involving machine learning showed that a similar model performed better than physicians when predicting resistance. The model in this study performed better than the one in the previous study, which suggests that this model may perform better than providers when predicting resistance. Another limitation of this study is that it did not evaluate causal effects of antibiotic resistance. The authors believe that further research should be conducted in this area to evaluate whether machine learning could be employed to determine further causes of antimicrobial resistance. A third limitation is that this study only evaluated the 5 antibiotics included, which are the 5 antibiotics most commonly tested for resistance at that facility. Additional research and machine learning would likely need to be incorporated to apply this model to other antibiotics.

The authors concluded that the model used in this study could be used as a template for other health systems. Because resistance patterns vary by region, this seems to be an appropriate conclusion. A model would have to be trained at each facility that was interested in employing machine learning in antimicrobial stewardship, and additional training would have to occur periodically to keep up with evolving resistance patterns. Additionally, if a facility would like to incorporate this type of model, they might want to also incorporate rapid polymerase chain reaction testing to provide the model with a bacterial species for optimal predictions. Overall, the results of this study indicate that great potential exists for machine learning in antimicrobial stewardship programs.


See the original post here:
Getting Help from HAL: Applying Machine Learning to Predict Antibiotic Resistance - Contagionlive.com


Postdoctoral Research Associate in Digital Humanities and Machine Learning job with DURHAM UNIVERSITY | 246392 – Times Higher Education (THE)

Department of Computer Science

Grade 7: £33,797 - £40,322 per annum. Fixed Term, Full Time. Contract Duration: 7 months. Contracted Hours per Week: 35. Closing Date: 13-Mar-2021, 7:59:00 AM.

Durham University

Durham University is one of the world's top universities with strengths across the Arts and Humanities, Sciences and Social Sciences. We are home to some of the most talented scholars and researchers from around the world who are tackling global issues and making a difference to people's lives.

The University sits in a beautiful historic city where it shares ownership of a UNESCO World Heritage Site with Durham Cathedral, the greatest Romanesque building in Western Europe. A collegiate University, Durham recruits outstanding students from across the world and offers an unmatched wider student experience.

Less than 3 hours north of London, and an hour and a half south of Edinburgh, County Durham is a region steeped in history and natural beauty. The Durham Dales, including the North Pennines Area of Outstanding Natural Beauty, are home to breathtaking scenery and attractions. Durham offers an excellent choice of city, suburban and rural residential locations. The University provides a range of benefits including pension and childcare benefits and the University's Relocation Manager can assist with potential schooling requirements.

Durham University seeks to promote and maintain an inclusive and supportive environment for work and study that assists all members of our University community to reach their full potential. Diversity brings strength and we welcome applications from across the international, national and regional communities that we work with and serve.

The Department

The Department of Computer Science is rapidly expanding. A new building for the department (joint with Mathematical Sciences) has recently opened to house the expanded Department. The current Department has research strengths in (1) algorithms and complexity, (2) computer vision, imaging, and visualisation and (3) high-performance computing, cloud computing, and simulation. We work closely with industry and government departments. Research-led teaching is a key strength of the Department, which came 5th in the Complete University Guide. The department offers BSc and MEng undergraduate degrees and is currently redeveloping its interdisciplinary taught postgraduate degrees. The size of its student cohort has more than trebled in the past five years. The Department has an exceptionally strong External Advisory Board that provides strategic support for developing research and education, consisting of high-profile industrialists and academics. Computer Science is one of the very best UK Computer Science Departments with an outstanding reputation for excellence in teaching, research and employability of our students.

The Role

Postdoctoral Research Associate to work on the AHRC-funded project Visitor Interaction and Machine Curation in the Virtual Liverpool Biennial.

The project looks at virtual art exhibitions that are curated by machines, or even co-curated by humans and machines; and how audiences interact with these exhibitions in the era of online art shows. The project is in close collaboration with the 2020 (now 2021) Liverpool Biennial (http://biennial.com/). The role of the post holder is, along with the PI Leonardo Impett, to implement different strategies of user-machine interaction for virtual art exhibits; and to investigate the interaction behaviour of different types of users with such systems.

Responsibilities:

This post is fixed term until 31 August 2021 as the research project is time limited and will end on 31 August 2021.

The post-holder is employed to work on research/a research project which will be led by another colleague. Whilst this means that the post-holder will not be carrying out independent research in his/her own right, the expectation is that they will contribute to the advancement of the project, through the development of their own research ideas/adaptation and development of research protocols.

Successful applicants will, ideally, be in post by February 2021.

How to Apply

For informal enquiries please contact Dr Leonardo Impett (leonardo.l.impett@durham.ac.uk). All enquiries will be treated in the strictest confidence.

We prefer to receive applications online via the Durham University Vacancies Site: https://www.dur.ac.uk/jobs/. As part of the application process, you should provide details of 3 (preferably academic/research) referees and the details of your current line manager so that we may seek an employment reference.

Applications are particularly welcome from women and black and minority ethnic candidates, who are under-represented in academic posts in the University. We are committed to equality: if for any reason you have taken a career break or periods of leave that may have impacted on your career path, such as maternity, adoption or parental leave, you may wish to disclose this in your application. The selection committee will recognise that this may have reduced the quantity of your research accordingly.

What to Submit

All applicants are asked to submit:

The Requirements

Essential:

Qualifications

Experience

Skills

Desirable:

Experience

Skills

DBS Requirement: Not Applicable.

Read more:
Postdoctoral Research Associate in Digital Humanities and Machine Learning job with DURHAM UNIVERSITY | 246392 - Times Higher Education (THE)


The head of JPMorgan's machine learning platform explained what it's like to work there – eFinancialCareers

For the past few years, JPMorgan has been busy building out its machine learning capability under Daryush Laqab, its San Francisco-based head of AI platform product management, who was hired from Google in 2019. Last time we looked, the bank seemed to be paying salaries of $160-$170k to new joiners on Laqab's team.

If that sounds appealing, you might want to watch the video below so that you know what you're getting into. Recorded at the AWS re:Invent conference in December, it's only just made it to YouTube. The video is flagged as a day in the life of JPMorgan's machine learning data scientists, but Laqab arguably does a better job of highlighting some of the constraints data professionals at all banks have to work under.

"There are some barriers to smooth data science at JPMorgan," he explains - a bank is not the same as a large technology firm.

For example, data scientists at JPMorgan have to check data is authorized for use, says Laqab: "They need to go through a process to log that use and make sure that they have the adequate approvals for that intent in terms of use."

They also have to deal with the legacy infrastructure issue: "We are a large organization, we have a lot of legacy infrastructure," says Laqab. "Like any other legacy infrastructure, it is built over time, it is patched over time. These are tightly integrated, so moving part or all of that infrastructure to public cloud, replacing rule-based engines with AI/ML-based engines. All of that takes time and brings inertia to the innovation."

JPMorgan's size and complexity is another source of inertia as multiple business lines in multiple regulated entities in different regulated environments need to be considered. "Making sure that those regulatory obligations are taken care of, again, slows down data science at times," says Laqab.

And then there are more specific regulations such as those concerning model governance. At JPMorgan, a machine learning model can't go straight into a production environment. "It needs to go through a model review and a model governance process," says Laqab, "to make sure we have another set of eyes that looks at how that model was created, how that model was developed..." And then there are software governance issues too.

Despite all these hindrances, JPMorgan has already productionized AI models and built an 'Omni AI ecosystem,' which Laqab heads, to help employees identify and ingest minimum viable data so that they can build models faster. Laqab says the bank saved $150m in expenses in 2019 as a result. JPMorgan's AI researchers are now working on everything from FAQ bots and chat bots, to NLP search models for the bank's own content, pattern recognition in equities markets and email processing. The breadth of work on offer is considerable. "We play in every market that is out there," says Laqab.

The bank has also learned that the best way to structure its AI team is to split people into data scientists who train and create models and machine learning engineers who operationalize models, says Laqab. Before you apply, you might want to consider which you'd rather be.



Originally posted here:
The head of JPMorgan's machine learning platform explained what it's like to work there - eFinancialCareers


5 Ways the IoT and Machine Learning Improve Operations – BOSS Magazine


By Emily Newton

The Internet of Things (IoT) and machine learning are two of the most disruptive technologies in business today. Separately, both of these innovations can bring remarkable benefits to any company. Together, they can transform your business entirely.

The intersection of IoT devices and machine learning is a natural progression. Machine learning needs large pools of relevant data to work at its best, and the IoT can supply it. As adoption of both soars, companies should start using them in conjunction.

Here are five ways the IoT and machine learning can improve operations in any business.

Around 25% of businesses today use IoT devices, and this figure will keep climbing. As companies implement more of these sensors, they add places where they can gather data. Machine learning algorithms can then analyze this data to find inefficiencies in the workplace.

Looking at various workplace data, a machine learning program could see where a company spends an unusually high amount of time. It could then suggest a new workflow that would reduce the effort employees expend in that area. Business leaders may not have ever realized this was a problem area without machine learning.

Machine learning programs are skilled at making connections between data points that humans may miss. They can also make predictions 20 times earlier than traditional tools and do so with more accuracy. With IoT devices feeding them more data, they'll only become faster and more accurate.
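As a rough illustration of how this connection-spotting can work in practice, the sketch below runs an IsolationForest over simulated IoT sensor readings and flags the unusual ones for review. The sensor names, figures and contamination rate are invented for illustration:

```python
# Minimal sketch: an IsolationForest flags unusual readings from simulated IoT
# sensors so staff can investigate possible inefficiencies or faults.
# Sensor names, figures and the contamination rate are invented.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# columns: machine temperature (C), power draw (kW), cycle time (s)
normal = rng.normal(loc=[65.0, 12.0, 40.0], scale=[2.0, 1.0, 3.0], size=(2000, 3))
faulty = rng.normal(loc=[80.0, 18.0, 55.0], scale=[2.0, 1.0, 3.0], size=(20, 3))
readings = np.vstack([normal, faulty])

detector = IsolationForest(contamination=0.01, random_state=0)
labels = detector.fit_predict(readings)            # -1 = anomalous, 1 = normal

anomalies = readings[labels == -1]
print(f"Flagged {len(anomalies)} of {len(readings)} readings for review")
print("Mean flagged reading (temp, kW, cycle s):", anomalies.mean(axis=0).round(1))
```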

Machine learning and the IoT can also automate routine tasks. Business process automation (BPA) leverages AI to handle a range of administrative tasks, so workers don't have to. As IoT devices feed more data into these programs, they become even more effective.

Over time, technology like this has contributed to a 40% productivity increase in some industries. Automating and streamlining tasks like scheduling and record-keeping frees employees to focus on other, value-adding work. BPA's potential doesn't stop there, either.

BPA can automate more than straightforward data manipulation tasks. It can talk to customers, plan and schedule events, run marketing campaigns and more. With more comprehensive IoT implementation, it would have access to more areas, becoming even more versatile.

One of the most promising areas for IoT implementation is in the supply chain. IoT sensors in vehicles or shipping containers can provide companies with critical information like real-time location data or product quality. This data alone improves supply chain visibility, but paired with machine learning, it could transform your business.

Machine learning programs can take this real-time data from IoT sensors and put it into action. It could predict possible disruptions and warn workers so they can respond accordingly. These predictive analytics could save companies the all-too-familiar headache of supply chain delays.

UPS' Orion tool is the gold standard for what machine learning can do for supply chains. The system has saved the shipping giant 10 million gallons of fuel a year by adjusting routes on the fly based on traffic and weather data.

If a company can't understand the vulnerabilities it faces, business leaders can't make fully informed decisions. IoT devices can provide the data businesses need to get a better understanding of these risks. Machine learning can take it a step further and find points of concern in this data that humans could miss.

IoT devices can gather data about the workplace or customers that machine learning programs then process. For example, Progressive has made more than 1.7 trillion observations about its customers' driving habits through Snapshot, an IoT tracking device. These analytics help the company adjust clients' insurance rates based on the dangers their driving presents.

Business risks aren't the only hazards the Internet of Things and machine learning can predict. IoT air quality sensors could alert businesses when to change HVAC filters to protect employee health. Similarly, machine learning cybersecurity programs could sense when hackers are trying to infiltrate a company's network.

Another way the IoT and machine learning could transform your business is by eliminating waste. Data from IoT sensors can reveal where the company could be using more resources than it needs. Machine learning algorithms can then analyze this data to suggest ways to improve.

One of the most common culprits of waste in businesses is energy. Thanks to various inefficiencies, 68% of power in America ends up wasted. IoT sensors can measure where this waste is happening, and with machine learning, adjust to stop it.

Machine learning algorithms in conjunction with IoT devices could restrict energy use, so processes only use what they need. Alternatively, they could suggest new workflows or procedures that would be less wasteful. While many of these steps may seem small, they add up to substantial savings.
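A minimal sketch of that idea, with all figures invented for illustration: fit a baseline of expected power draw from production output, then flag the hours that run well above it so the waste can be addressed:

```python
# Minimal sketch of matching energy use to need: fit a baseline of expected power
# from production output, then flag hours that run well above it.
# All figures are illustrative, not measurements.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
hours = 24 * 30                                    # one month of hourly readings
units_produced = rng.integers(0, 120, hours).astype(float)
power_kwh = 5.0 + 0.25 * units_produced + rng.normal(0, 1.0, hours)
power_kwh[200:230] += 12.0                         # equipment left running: wasted energy

baseline = LinearRegression().fit(units_produced.reshape(-1, 1), power_kwh)
expected = baseline.predict(units_produced.reshape(-1, 1))
excess = power_kwh - expected

wasteful_hours = np.where(excess > 5.0)[0]         # 5 kWh over baseline = review threshold
print(f"{len(wasteful_hours)} hours flagged; estimated waste {excess[wasteful_hours].sum():.0f} kWh")
```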

Without the IoT and machine learning, businesses can't reach their full potential. These technologies enable savings companies couldn't achieve otherwise. As they advance, they'll only become more effective.

The Internet of Things and machine learning are reshaping the business world. Those that don't take advantage of them now could soon fall behind.

Emily Newton is the Editor-in-Chief of Revolutionized, a magazine exploring how innovations change our world. She has over three years' experience writing articles in the industrial and tech sectors.

Go here to see the original:
5 Ways the IoT and Machine Learning Improve Operations - BOSS Magazine


Rackspace Technology Study uncovers AI and Machine Learning knowledge gap in the UAE – Intelligent CIO ME

As companies in the UAE scale up their adoption of Artificial Intelligence (AI) and Machine Learning (ML) implementation, a new report suggests that UAE organisations are now on par with their global counterparts in boasting mature capabilities in these fields.

Nonetheless, the vast majority of organisations in the wider EMEA region, including the UAE, are still at the early stages of exploring the technology's potential (52%) or still require significant organisational work to implement an AI/ML solution (36%).

These are the key findings of new research from Rackspace Technology Inc, an end-to-end, multi-cloud technology solutions company, which revealed that the majority of organisations lack the internal resources to support critical AI and ML initiatives. The survey, Are Organisations Succeeding at AI and Machine Learning?, indicates that while many organisations are eager to incorporate AI and ML tactics into operations, they typically lack the expertise and existing infrastructure needed to implement mature and successful AI/ML programmes.

This study shines a light on the struggle to balance the potential benefits of AI and ML against the ongoing challenges of getting AI/ML initiatives off the ground. While some early adopters are already seeing the benefits of these technologies, others are still trying to navigate common pain points such as lack of internal knowledge, outdated technology stacks, poor data quality or the inability to measure ROI.

Other key findings of the report include the following:

Countries across EMEA, including the UAE, are lagging behind in AI and ML implementation, which can be hindering their competitive edge and innovation, said Simon Bennett, Chief Technology Officer, EMEA, Rackspace Technology. Globally, we're seeing IT decision-makers turn to these technologies to improve efficiency and customer satisfaction. Working with a trusted third-party provider, organisations can enhance their AI/ML projects, moving beyond the R&D stage and into initiatives with long-term impacts.


Excerpt from:
Rackspace Technology Study uncovers AI and Machine Learning knowledge gap in the UAE - Intelligent CIO ME
