
Citrix initiates ‘Restructuring Program’ – jobs and facilities to go – The Register

Citrix has initiated a "Restructuring Program" under which the company will reduce headcount and close some offices. The Register understands that staff around the world have already been let go.

The application streamer on Monday emitted a regulatory filing that detailed a plan that includes "elimination of full-time positions, termination of certain contracts, and asset impairments, primarily related to facilities consolidations".

The restructure is forecast to result in charges of approximately $130m to $240m, $65m to $90m of which has been set aside to cover employee severance payments.

Another change is a move for Paul Hough, currently executive veep and chief product officer, to become an advisor to interim CEO Robert Calderoni, who took over after the sudden departure of David Henshall.

Calderoni flagged changes on the company's Q3 earnings call, during which he reported year-on-year revenue growth of one per cent, described 2021 as "a trough year for both margins and cash flow" and lamented that some of the company's structures and sales arrangements are not conducive to growth.

The CEO also described "cash flow headwinds" that came from colossal commissions owed to salespeople who cashed in on COVID-created demand for Citrix's remote working wares. Those payments – about $100 million worth – made for lower margins.

"Clearly we just have to reverse some of the things that we did over the last year or two and make the business more attractive," the interim CEO told investors.

Making things more interesting is that activist investor Elliott Management has reportedly taken a ten per cent stake in the company. Elliott was instrumental in installing Henshall as CEO – a decision that turned Citrix around after difficult years. In April 2020, the management firm was so confident that Citrix was on the right path that its director departed the company's board and signed off saying everything was on track.

Investors appear not to mind the new restructuring plan too much. Citrix's share price is off by three per cent this week, while the NASDAQ exchange that hosts it is up 1.5 per cent.

Read the original post:
Citrix initiates 'Restructuring Program' – jobs and facilities to go - The Register


Is mass cloud adoption going to last forever, or is it just a phase? – ITProPortal

Third-party technology partners have been vital in assisting businesses throughout the pandemic, enabling them to stay resilient and operational as restrictions continue to change. This period of extreme uncertainty has forced companies to adopt online collaboration tools for staff that are geographically dispersed and shift the majority of their data and applications to the cloud.

Recent data from Statista has highlighted that worldwide spending on cloud infrastructure services reached $47 billion by the end of the second quarter of 2021. This rise in the number of businesses locating their data in the cloud was a natural fit during a time of great uncertainty, but now, with a new normal on the horizon, are organizations about to realize they acted on impulse rather than implementing a forever solution?

We recently spoke to 500 senior decision-makers in the UK about their experiences during the pandemic to better understand their working relationships with third-party technology providers, and to find out how successful their cloud investments have been so far.

Over a third (35 percent) of respondents stated they now have a greater understanding of what those working in IT functions within their company can help them achieve within the wider business. Thirty-one percent feel more motivated to learn about how their company can enhance its agility, and 30 percent feel more motivated to learn about how their company can use cloud technologies to enhance products and services.

This shift has provided many of those surveyed with the chance to make strong cases for investment and accelerate decision-making. Worryingly, however, over half (54 percent) of respondents believe unnecessary investments in technologies have been made with third-party technology partners over the last 12 months, increasing to 85 percent amongst owners and proprietors. Two-thirds (68 percent) report it remains to be seen whether all investments made based on that advice will prove suitable long-term.

When it comes to working with third-party cloud partners, only 12 percent of respondents did not face challenges.

Top technology-based obstacles included concerns over data security, compliance, and regulatory issues, unexpected or unpredictable costs, and issues surrounding the management of data and applications between on-premise and cloud storage. Top obstacles to working with third-party cloud partners included an inability to admit fault or shortcomings when warranted, an unwillingness to compromise, a lack of honesty and integrity, a lack of empathy, and an inability to work through conflicts maturely.

As with any relationship, a solid working partnership does not just happen. It takes time, patience, transparency, and input from two parties that truly want to work effectively together.

So what is the secret to a happy, long-lasting professional relationship?

1. Decide on an end goal together

As with any journey, it is key to have a clear destination in mind before embarking and discuss this as a team. Establish a clear strategy and identify the goals and outcomes you hope the chosen cloud will deliver.

Organizations require a clear vision that supports long-term goals, but as the last 12 months have demonstrated, being able to adapt to sudden change – both technological and market-orientated – must be accounted for too. When deciding on outsourced cloud solutions, select a vendor whose processes, procedures, and abilities best fit your planned journey, with the flexibility to alter course if priorities suddenly or drastically shift in a new direction.

2. Share everything with each other

It is important to discuss the entire asset inventory. Mature IT estates may include a variety of platforms such as colocation, clouds, and mainframe, and a careful analysis of each application is required if performance and functionality are to be maintained.

When it comes to migration, in some instances it will be straightforward. In other cases, the application can be refactored to allow for the new environment. Businesses must evaluate whether the best option is to keep the application as is with a third party, and either continue to run it internally, or look for a hosting vendor that can support it in its current state along with cloud offerings for a seamless, integrated solution.

3. Appreciate similarities and differences

Most large organizations use several clouds but may not know how to best use each one individually. In defining a cloud strategy, it is critical to understand differences in operation, management, scale, security, and governance for each. Business goals should drive cloud choice, not the other way around.

4. Be available and resilient

Availability and resiliency are key for every relationship and every business.

One strategy might include using the cloud for data vaulting, replication, and disaster recovery. In such cases, businesses must work through the details with their recovery cloud vendor and third party – such as which applications are business-critical, demanding the high availability that comes from an active environment, and are therefore not appropriate for cloud-based recovery.

5. Establish a long-term plan

Only by both parties understanding the complete business picture can a solid cloud strategy be developed. This includes not only new and innovative technology elements, but also the current IT environment, and future-proofing IT where possible.

For senior decision-makers, it is key to choose the right partner and technology that can support them now and in the years to come. Organizations that do so will be able to better understand and leverage disparate elements into a single cohesive picture, knowing with confidence that the cloud has a place in improving competitive advantage and assuring future success.

Every business was forced to pivot during the pandemic, but the expansion of cloud services is one change that will remain long-term. It's clear that there is more understanding at the top of organizations about the importance of technology investment decisions, and how building a good relationship with technology partners can help overcome challenges.

By working in tandem with knowledgeable third-party cloud partners to adopt a cloud-ready approach – one that identifies possible problems before they occur and prevents a shift back to on-premises data storage when things go wrong – organizations will be better placed to reap the benefits the cloud can offer in 2022 and well beyond.

Chris Huggett, Senior Vice President for EMEA & India, Sungard AS

Go here to see the original:
Is mass cloud adoption going to last forever, or is it just a phase? - ITProPortal


MediaTek’s flagship 5G chip for top-of-the-line Android smartphones is coming right up – The Register

MediaTek is ready to show off its first real flagship processor that it hopes can take on Qualcomm's Snapdragon family in high-end Android smartphones.

The Taiwanese chip designer plans to unveil its Dimensity 2000 system-on-chip at the end of this week, said Dan Nystedt, a financial analyst in Taiwan who is reliable in these matters.

A few days ago, MediaTek shared a teaser video highlighting a chip made using a 4nm process. This component will compete against Qualcomm's top-end Snapdragon, which is expected to be announced at the US firm's annual Snapdragon summit later this month.

While these rival chips go toe-to-toe in higher-end Android smartphones, MediaTek hopes to eventually put its silicon into Arm-compatible Windows 11 PCs, which Qualcomm has a lock on. It is unclear whether the flagship Dimensity 2000 SoC will be used for that market.

The Dimensity chip will be "its entry in the flagship segment," with an opportunity to gain market share, MediaTek's CEO Rick Tsai said during an earnings call in October. The Arm-based SoC will have a 5G modem.

"Today, all major China brands have adopted our 5G flagship SoC. Revenue of the flagship product will begin at the end of this year, and run from first quarter next year," Tsai said, meaning devices containing the system-on-chip are expected to ship soon to buyers.

It remains to be seen if consumers are attracted to premium devices with MediaTek chips, which have mainly gone into low-cost devices in the past. The low-to-middling performance of MediaTek chips has also put off some buyers, though the new silicon may address that.

That said, MediaTek was the top mobile chip designer with a market share of 43 per cent in the second quarter of 2021, growing from 26 per cent in the year-ago quarter, according to Counterpoint Research. Qualcomm's share dropped to 24 per cent compared to 28 per cent in the year-ago quarter. MediaTek's growth was driven by "a competitive 5G portfolio in the low-mid segment and without major supply constraints," Counterpoint said in a statement.

Spokespeople for MediaTek were not available for further comment.

Read more:
MediaTek's flagship 5G chip for top-of-the-line Android smartphones is coming right up - The Register


‘We are not people to Mark Zuckerberg, we are the product’ rages Ohio’s Attorney General in Facebook lawsuit – The Register

Facebook was sued by Ohio's Attorney General Dave Yost on Tuesday for allegedly deceiving shareholders about the potential harm its social media platform inflicted on young users.

The complaint [PDF] was filed on behalf of the state's largest pension fund, Ohio Public Employees Retirement System, and all other investors that acquired Facebook shares between April 29 and October 22 this year.

Facebook founder Mark Zuckerberg, CFO David Wehner, and VP of Global Affairs and Communications (and former British Deputy Prime Minister) Nick Clegg were also listed as defendants.

It's the latest federal securities fraud lawsuit to hit Facebook, also known as Meta for corporate reputation-washing reasons.

Whistleblower and ex-employee Frances Haugen obtained and leaked internal documents that, for one thing, indicated the social media giant optimized its algorithms to keep users hooked on its platform even though doing so was detrimental to the mental health of some netizens.

The data also demonstrated how Facebook's AI content moderation systems were ineffective at removing misinformation, toxic hate speech, and violent videos. Illicit activities such as drug smuggling and human trafficking also flew under the radar.

The documents led to a string of news articles dubbed The Facebook Papers, an effort led by the Wall Street Journal and other publishers. Haugen presented the evidence to and testified before US lawmakers in Congress and in front of British Members of Parliament. The files are also referenced in the Ohio lawsuit.

"Facebook said it was looking out for our children and weeding out online trolls, but in reality was creating misery and divisiveness for profit," Yost claimed in a statement. "We are not people to Mark Zuckerberg, we are the product and we are being used against each other out of greed."

Suddenly, it feels like we're in 2010 again.

The complaint claims Facebook repeatedly told investors it has the most robust set of content policies out there to prevent fake news and harmful content from spreading. But Haugen's leaked documents revealed Facebook knew its social media empire was riddled with flaws that sow dissension, facilitate illegal activity and violent extremism, and cause significant harm to users, the lawsuit alleged.

Despite this knowledge, Facebook opted to maximize its profits at the expense of the safety of its users and the broader public, exposing itself to serious reputational, legal, and financial harm, according to the complaint. The onslaught of bad press caused the company's shares to tumble over 14 per cent, wiping more than $150bn off Facebook's value, the lawsuit continued.

Ohio's Dave Yost wants to turn the case into a class-action lawsuit on behalf of investors affected by the loss. He reckons Facebook should fork out damages and shareholders should be compensated.

"This suit is without merit and we will defend ourselves vigorously," a Meta spokesperson told The Register in a statement.

A similar lawsuit was filed by a shareholder in the Eastern District of New York last month.

See the original post here:
'We are not people to Mark Zuckerberg, we are the product' rages Ohio's Attorney General in Facebook lawsuit - The Register


The Rust Foundation gets ready to Rumbul (we’re sure new CEO has never, ever heard that joke before) – The Register

The Rust Foundation – the US non-profit behind the programming language since Mozilla let the team go – has picked a new CEO: Rebecca Rumbul, formerly director of research and engagement at digital democracy charity mySociety, and before that the Privacy Collective.

Dr Rumbul's appointment at the relatively new foundation reflects the growing importance of the Rust language, which can be seen from the foundation's list of members. Facebook is using it, as is Google, Microsoft, various Linux kernel developers, and Linux lappy vendor System76. There are even a couple of Rust-based OSes, Redox and Theseus.

One reason is of course speed: Rust is consistently one of the fastest languages, right after C and C++. But so are Ada and Fortran, which excite very few people these days. Arguably Rust's most significant rival in recent years is Google's Go language: last year, it was the language most developers said they wanted to learn next.

So let's compare them. Both are curly bracket languages, with C-like syntax that makes them unintimidating for C programmers. Both are designed to be memory-safe. Both compile direct to native code. Both are designed to be simpler, cleaner replacements for C++.

So much for the similarities; now to how they differ. Go was designed to compile fast, to be relatively simple, and to be a good fit for large teams. It has strong support for concurrency with goroutines and channels, but weaker error handling, and it does memory management for you, using garbage collection.

In contrast, Rust is a more complex, flexible language, with a steeper learning curve, and eschews garbage collection for RAII (Resource Acquisition Is Initialization), sometimes called Scope-Bound Resource Management. It only gained concurrency support in 2019's v1.39.

To summarise, you can tell a lot about what they're good for by where they came from: Go was built by a giant provider of web services and Rust by a web-browser company. Go's strength is arguably web services being built by DevOps teams, whereas Rust is for lone coders and small teams, building standalone applications. Since the latter is the heartland of Linux and FOSS, you're likely to hear more about it. Since Mozilla cut it off, it's good to know that Rust now has a new strong and well-funded backer.

Excerpt from:
The Rust Foundation gets ready to Rumbul (we're sure new CEO has never, ever heard that joke before) - The Register


Pace receives NSF grant to expand data science instruction nationwide – Westfair Online

Instructors from Pace University's Dyson College of Arts and Sciences recently received a $499,354 grant from the National Science Foundation that will allow them to expand the teaching of data science skills into introductory biology and environmental science courses.

Specifically, the grant's purpose is to allow the group to continue expanding the Biological and Environmental Data Education Network, which will create a system of instructors addressing the need for data science skills and offer training, resources and professional development opportunities for biology and environmental science instructors to enable them to teach the skill set in their regular coursework.

Aiello-Lammens' and Crispo's roles in the program are to help other instructors learn to incorporate data education into curricula, an approach that the two instructors already practice in all of their courses taught at Pace.

Aiello-Lammens has been involved with the network before, during its original founding through a National Science Foundation incubator grant, of which he was a recipient along with colleagues from Kenyon College and Denison University. Crispo was among the first attendees of the network's activities.

"We think it's vital that our students understand how to make sense of these data and use them to make decisions for what they should be doing – from whether to eat certain foods, consider certain medicines, or accept a particular job," Aiello-Lammens said. "If they have these data science skills in general, then they can apply them both in their work and in their lives."

"Today we're able to collect more data more rapidly, collect it on computers, and analyze it on supercomputers," Crispo said. "It's becoming more challenging to handle data and analyze it – and it's becoming increasingly important to give students the skills to be able to do so."

According to the National Science Foundation, data science skills are increasingly necessary in the fields of science, technology, engineering, mathematics and beyond, but instructors often do not have the training necessary to impart this skill set to their students. Other barriers it cited include curricula that are already perceived as overcrowded, confusion over what the key skills are and a lack of confidence in teaching them.

Data management, data analysis, data visualization, programming, modeling and reproducibility are the main subject areas and strategies the network will focus on imparting to students.
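To make that list concrete, here is a hypothetical classroom-style exercise of the kind such a curriculum might include; the dataset, file name, and column names below are invented for illustration and are not taken from the network's materials.

```python
# Hypothetical teaching example; the file and columns are invented.
import pandas as pd
import matplotlib.pyplot as plt

# Data management: load field observations and drop incomplete records.
df = pd.read_csv("stream_samples.csv")  # hypothetical dataset
df = df.dropna(subset=["site", "temperature_c", "species_count"])

# Data analysis: summarize species counts by sampling site.
summary = df.groupby("site")["species_count"].agg(["mean", "std"])
print(summary)

# Data visualization: plot the relationship students are asked to interpret.
df.plot.scatter(x="temperature_c", y="species_count")
plt.title("Species count vs. water temperature")

# Reproducibility: the figure is produced by a script, not by hand.
plt.savefig("species_vs_temperature.png")
```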

The Biological and Environmental Data Education Network will hold annual meetings, the first of which will be held at Pace's Manhattan campus in 2022 and will focus on diversity and inclusion in data science education.

Its other education expansion efforts will include developing training and education workshops, publishing a curriculum guide and adding members to create a more active and diverse community of educators. Aiello-Lammens and Crispo aim to expand it nationally and internationally.

According to Pace, the program aligns with its new university-wide strategic plan, Pace Forward.

"This grant epitomizes what we believe in at Pace and helps to put Pace at the forefront of educational innovation," said Tresmaine Grimes, dean of Dyson College of Arts and Sciences and School of Education. "The work of Professors Aiello-Lammens and Crispo is inspiring in its aim to be cutting-edge, far-reaching, and cross-disciplinary, and will serve instructors and students not only at Pace, but across the country."

See the rest here:

Pace receives NSF grant to expand data science instruction nationwide - Westfair Online


Snowflake Shapes the Future of Data Science with Python Support – Business Wire

No-Headquarters/BOZEMAN, Mont.--(BUSINESS WIRE)--Snowflake (NYSE: SNOW), the Data Cloud company, today announced at its Snowday event that data scientists, data engineers, and application developers can now use Python – the fastest-growing programming language[1] – natively within Snowflake as part of Snowpark, Snowflake's developer framework. With Snowpark for Python, developers will be able to easily collaborate on data in their preferred language. At the same time, they can leverage the security, governance, and elastic performance of Snowflake's platform to build scalable, optimized pipelines, applications, and machine learning workflows. Snowpark for Python is currently in private preview.

Developers want flexibility when working with data, simpler environments that require less administrative work and maintenance, and immediate access to the data they need. Snowpark brings the programming languages of choice for data to Snowflake. With Snowpark, developers can unlock the scale and performance of Snowflake's engine and leverage native governance and security controls built into Snowflake's easy-to-use platform. In addition to Java and Scala, Snowpark now supports Python, allowing different languages and different users to work together against the same data with one processing engine, without needing to copy or move the data.
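As a rough sketch of what that looks like in practice (the account details, table, and column names here are placeholders, and the pattern follows Snowpark's publicly documented Session and DataFrame APIs rather than anything in the announcement):

```python
# Minimal Snowpark for Python sketch; credentials and names are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

connection_parameters = {
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "<warehouse>",
    "database": "<database>",
    "schema": "<schema>",
}
session = Session.builder.configs(connection_parameters).create()

# Build a lazily evaluated DataFrame; the filter and aggregation below are
# pushed down and run inside Snowflake's engine, so data is never copied out.
orders = session.table("ORDERS")  # hypothetical table
result = (
    orders.filter(col("STATUS") == "SHIPPED")
    .group_by("REGION")
    .agg(avg(col("AMOUNT")).alias("AVG_AMOUNT"))
)
result.show()
```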

As a result of the recently announced partnership with Anaconda, Snowflake users can now seamlessly access one of the most popular ecosystems of Python open source libraries, without the need for manual installs and package dependency management. The integration can fuel a productivity boost for Python developers. Snowflake's recently launched Snowpark Accelerated Program also supports customers with access to numerous pre-built partner capabilities and integrations, from directly within their Snowflake account.

With Snowpark for Python, data teams can:

Novartis, the multi-national healthcare company that provides solutions to address the evolving needs of patients worldwide, needed a way to empower their global team of analysts and data scientists with a powerful data platform that would reduce data preparation time and provide self-service capabilities for building models and running analytics.

"Novartis' mission is to reimagine medicine to improve and extend people's lives, and to do so successfully today we need to leverage digital technologies that continue to put data and data science at the center of our transformation," said Loic Giraud, Global Head of Digital Platform & Product Delivery at Novartis. "As a progressive, data-driven life-science organization, the flexibility and scale of Snowflake's Data Cloud allows us to accelerate our pace of knowledge through data interpretation and insight generation, bringing more focus and speed to our business. Bringing together all available data ultimately unlocks more value for our employees, patients, and health care providers, and data science innovations help us realise this goal."

"Snowflake has long provided the building blocks for pipeline development and machine learning workflows, and the introduction of Snowpark has dramatically expanded the scope of what's possible in the Data Cloud," said Christian Kleinerman, SVP of Product at Snowflake. "As with Snowpark for Java and Scala, Snowpark for Python is natively integrated into Snowflake's engine so users can enjoy the same security, governance, and manageability benefits they've come to expect when working with Snowflake. As we continue to focus on mobilizing the world's data, Python broadens even further the choices for programming data in Snowflake, while streamlining data architectures."


Forward-Looking Statements

This press release contains express and implied forward-looking statements, including statements regarding the availability of Snowpark for Python. These forward-looking statements are subject to a number of risks, uncertainties and assumptions, including those described under the heading "Risk Factors" and elsewhere in the Quarterly Report on Form 10-Q for the fiscal quarter ended July 31, 2021 that Snowflake has filed with the Securities and Exchange Commission. In light of these risks, uncertainties, and assumptions, actual results could differ materially and adversely from those anticipated or implied in the forward-looking statements. As a result, you should not rely on any forward-looking statements as predictions of future events.

About Snowflake

Snowflake enables every organization to mobilize their data with Snowflake's Data Cloud. Customers use the Data Cloud to unite siloed data, discover and securely share data, and execute diverse analytic workloads. Wherever data or users live, Snowflake delivers a single data experience that spans multiple clouds and geographies. Thousands of customers across many industries, including 212 of the 2021 Fortune 500 as of July 31, 2021, use the Snowflake Data Cloud to power their businesses. Learn more at snowflake.com.

[1] According to SlashData, Developer Economics: State of the Developer Nation, 20th Edition.

Read more:

Snowflake Shapes the Future of Data Science with Python Support - Business Wire


UK Railroads Invest in Data Science, AI, and Machine Learning – Cities of the Future

The oldest rail network in the world, with over 32,000 km of track, is investing in artificial intelligence and machine learning to streamline maintenance and deal with weather-related challenges.

Two months ago, during the AI and Big Data Expo, I had the opportunity to meet Nikolaos (Nick) Kotsis, Chief Data Scientist at Network Rail. At the conference, he was one of the speakers, talking about their digital transformation and how Network Rail was implementing new ways to inspect and manage the network's assets, shifting work from traditional planning and maintenance schedules to a proactive predict and prevent approach.

At the time, Kotsis mentioned the data collection happening in real-time, simultaneously from the trains using the network, the people inspecting the tracks, drones, helicopters, and over 30,000 IoT sensors deployed all over the country.

All this data collection allows Network Rail to know what is happening and take immediate action when something goes wrong or needs fixing. But the real magic, which helps predict and prevent incidents, and provides predictive maintenance, happens when AI and Machine Learning are applied to that massive amount of data.

To learn more about how Network Rail's Data Science department works and how it impacts the organization, we reached out to Nick Kotsis again. He answered our questions by email.

PV: As discussed in our previous conversation, Network Rail is undergoing a substantial digital transformation in the field and the data center. Can you tell us a bit about your data science department and its role within the organization?

Nick Kotsis: Our initial plan for the data science function was leaning towards being primarily guidance and advisory; however, once we began engaging with customers, we realised that taking on responsibilities for delivery would not only return cost benefits to the taxpayer but would also help our partner network operate more effectively.

Inspiring confidence and gaining the trust of our customers and suppliers was the focus of my role in the first six months. Since then, we have evolved to become an actual delivery function for data analytics, advanced machine learning, and AI technology packaged into fully integrated digital products that customers can use with minimal training. The end result is a trusted service supported by a skilled, confident team focused on the customer and responding to the most complex problems in Network Rail.

When customers from across the organisation ask for help, it means we do something right.

PV: You said before that NR is a complex machine, managing the network and some of the largest train stations and maintaining the freight trains. How is data analytics (and AI) affecting the different services?

Nick Kotsis: Our organisation is responsible for maintaining a complex infrastructure that is vulnerable to environmental and weather conditions and the constant pressure of hundreds of daily train journeys.

Network Rail maintains 20,000 miles of track. Our job is to maximise the use of data to make the maintenance of the infrastructure a safer and more efficient environment for both our passengers and workforce.

To give you an example, performing remote inspections on track assets with the assistance of AI technology instead of visiting the track is a genuine safety benefit for our maintenance teams. Of course, we are not planning to pause physical inspections, but if we could confidently limit them to absolutely necessary ones, that would be a true benefit.

Data analytics and the more sophisticated machine learning techniques consistently demonstrate high quantitative and qualitative benefits. Examples of these benefits can be seen in: the prediction of incidents; the automation of tasks that would be repetitive and mundane for a human; the complex risk assessment on thousands of assets in a split second; and complexity management using optimisation algorithms which also bring speed to complex decision making.

In our case, we have successfully developed automated risk assessments using computer vision techniques that identify assets for immediate attention on the track and surrounding areas. We also established a preventive maintenance process driven by predictive algorithms which calculate the likelihood of an asset failing days or weeks before it actually happens, allowing us to resolve an issue before it becomes a problem. Both of these systems offer significant improvements to safety along with reduced delays and disruption for passengers.

The advanced big data engineering and algorithmic (AI) logic we use behind the scenes should, and will, continue to expand across every part of our infrastructure. I am confident that eventually, we will reach the desired levels of deployments needed to scale up our operations to prevent the incidents.
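To give a flavour of the predict-and-prevent approach Kotsis describes (purely illustrative: Network Rail's actual features, data, and models are not public, so everything below is invented), a failure-likelihood model of this general shape can be sketched in a few lines of Python:

```python
# Toy predictive-maintenance sketch with entirely synthetic data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Invented per-asset sensor features: vibration level, temperature swing,
# days since last inspection, rainfall index.
X = rng.normal(size=(5000, 4))
# Invented label: 1 if the asset failed within the following two weeks.
y = (X @ np.array([1.2, 0.8, 0.5, 0.9]) + rng.normal(size=5000) > 2.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

# Rank assets by predicted failure probability so maintenance teams can
# inspect the riskiest ones first, before a fault becomes an incident.
risk = model.predict_proba(X_test)[:, 1]
print("Highest-risk assets:", np.argsort(risk)[::-1][:10])
```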

PV: The UK has the oldest rail service worldwide. Nevertheless, many old rail services also operate across Europe and elsewhere, each with complex and specific challenges. Based on your experience, what would you recommend to those organizations starting or undergoing digital transformation?

Nick Kotsis: Finding the correct answer to this is not easy as every organisation will be starting at a different point and has a different maturity trajectory. Also, financial investment for some organisations can be made easier than in others, and thus the digital journey will be very different.

As with most complex initiatives, digital, data, and AI success depends on leadership and commitment to completing the journey. In our case, we are fortunate to have solid technical leadership from our CIO and CTO. They saw our destination at the early stages of development and supported us in making the data science vision a reality.

My advice to colleagues in other organisations? Be clear about the vision and data strategy, engage with customers who need your help, create the right team and set them up for delivery, and focus on projects with clear benefits.


Excerpt from:

UK Railroads Invest in Data Science, AI, and Machine Learning - Cities of the Future


Top Upcoming Data Science Webinars to Attend in 2021 – Analytics Insight

Data science training empowers professionals with data technologies such as Hadoop, Flume, and machine learning. Knowledge and proficiency in these significant data skills give candidates an added advantage in building an improved, competitive career. Here are the top data science webinars for you to attend in 2021.

Nov 17 2021, 11:30pm

Join Adam Mansour, Head of Sales Engineering, and Daniel West, Head of Mid-Market Sales at ActZero, for this fireside chat as they discuss:
- Techniques to detect compromises before a threat is detonated
- Recommendations to achieve greater threat coverage and visibility into attacks
- How to improve security effectiveness by merging data science with cybersecurity
- How security thought leaders and change agents are preparing for the future by leveraging AI/ML

Join here.

Nov 18 2021, 9:30pm

As pharmaceutical companies collect more unstructured Voice of Customer data, manually reading and analyzing each entry becomes more costly and inefficient. The advances of natural language processing and other machine learning techniques make it possible to create an insights generation program where subject matter experts work with technology to analyze more data and accelerate time to insights. Setting yourself up for success and maximizing the value of your VoC data requires changes in strategies and the way you handle data. In this webinar, you'll hear from Seth Tyree, VP-Pharma Insights at Stratifyd, and Tolga Akiner, Data Scientist, as they share their experience and best practices from building an insights generation program. Attendees will:
- Learn about the importance of critical data prep activities like data labeling
- Gain an understanding of key AI models and their roles in insights generation from unstructured VoC data
- See how AI and machine learning can work with your SMEs to accelerate insights

Join here.

Nov 23 2021, 6:30am

This webinar is presented by Fangjin Yang, Co-Founder and Chief Executive Officer of Imply. It will cover data analytics, big data, data science, and real-time analytics.

Join here.

Nov 24 2021, 2:30pm

This data science webinar, "How to Become a Data Scientist", covers all the skills required for becoming a modern-day data scientist.

Agenda

Join here.

Nov 17 2021, 9:30pm

At any given time, organizations are attempting to transform their business (think business process, digital, management, organizational, and cultural transformations) with the common end goals of operational change, business model innovation, and domain expansion. Now is the time to use AI-enabled solutions to drive business transformation, but how is that done in practice? Join Jerry Hartanto, AI Strategist at Dataiku, for an overview of how AI mitigates business transformation risks, accelerates the time to value, and drives tangible outcomes.

Join here.

Follow this link:

Top Upcoming Data Science Webinars to Attend in 2021 - Analytics Insight


The Iconic CEO Erica Berchtold wants to use the science of data to convert you to online shopping – The Australian Financial Review

"I really do believe most of the journey that you have in a physical store can be done in a digital world," she says.

The Iconic leverages data to lure consumers, and keep them. Already, the site will suggest sizes based on other brands you wear, and offers virtual sneaker try-ons. It curates products based on your preferences, sending you a personalised edit of clothing, shoes and accessories you might like – the way a sales assistant might. But Berchtold sees a future where the technology of the metaverse, for example, could be used as a virtual fitting room.
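How might cross-brand size suggestion work under the hood? The toy sketch below is not The Iconic's actual system (the brands, sizes, and purchase history are invented); it simply maps a shopper's size in one brand to the size that shoppers of the same size most often kept in another.

```python
# Toy cross-brand size suggestion; all data here is invented.
from collections import Counter, defaultdict

# (size kept in BrandA, size kept in BrandB) pairs from hypothetical
# purchase histories of shoppers who bought both brands.
history = [("M", "10"), ("M", "10"), ("M", "12"), ("S", "8"), ("L", "12")]

by_size: dict[str, Counter] = defaultdict(Counter)
for a_size, b_size in history:
    by_size[a_size][b_size] += 1

def suggest(brand_a_size: str) -> str:
    """Return the BrandB size most often kept by shoppers of this BrandA size."""
    return by_size[brand_a_size].most_common(1)[0][0]

print(suggest("M"))  # -> "10"
```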

Although The Iconic employs just over 1000 people, half of these are stationed at its fulfillment centre in Yennora, in western Sydney. Of the rest, 120 are in technology roles. Berchtold is recruiting 70 more staff in the tech team, across data, engineering, product and information technology.

"COVID hasn't helped, with the borders being shut," she says. "There is also a lack of investment in technology education here in Australia. We haven't been grooming our own workforce as much as we could have."


Berchtold has spent her career in retail, and revels in the shop floor. The closest thing to that at The Iconic is the raft of customer emails that arrive each week; Berchtold reads them all.

"When I was having a bad day at other jobs, I'd go to the shop floor to remind myself of what we're actually doing," she says. "This is the same thing."

Some feedback will resonate with Berchtold so much she is compelled to send the writer a gift voucher. "Like, we had one customer," says Berchtold. "She wrote, 'I really love this brand. But I hate the models. They're just such skinny things.' I was like, 'Oh, she's cool. I hear that.' And so, we're going to make sure that we're a bit more diverse in the models selected for that particular brand. And I sent her a $50 voucher."

The company has grown significantly in 10 years. In 2011, The Iconic launched with 1000 products from 125 brands. Today, it boasts 165,000 products from 1500 brands, with 500 new arrivals to the site daily.

The Yennora site is Australia's largest fashion fulfillment centre, with a capacity for 3.75 million units, and can fulfil orders in as little as eight minutes. At its Alexandria Hub, a purpose-built production studio in inner Sydney, more than 60 staff pump out curated editorial for the site's 19.5 million monthly users. Its app is Australia's most downloaded fashion app, with 5 million downloads. In the 2020-21 financial year, it sent out 6.2 million parcels, a 1.2 million increase on fiscal 2019.

Numbers aside, Berchtold is most proud that the company has cemented itself as a permanent part of the retail landscape.

"Expanding from clothing to categories such as children's wear, homewares and beauty has been a significant shift for the business. And I'm proud that we are playing a leadership role in things like sustainability, diversity and inclusion."

Last year, the company launched Giving Made Easy, enabling customers to donate pre-loved clothing by downloading a pre-paid postage label and dropping off their donation at any Australia Post location. The company says this has saved more than 25,000 kilograms of textiles from landfill.

It also enacted a Reconciliation Action Plan, endorsed by Reconciliation Australia, and is committed to body diversity, urging vendors to offer extended sizing. This year, it launched a modest dressing capsule.

Not, Berchtold says, that she has a lot of choice. "The average team member is 27," she says. "I do not have to push for this stuff. It is business as usual for young people. I don't need to explain why sustainability matters to anybody in our team."

As for the future of retail and technology, Berchtold is bullish.

"I look at COVID, and the way some retailers were saying, 'We got a chance to experience what it's like to be pure-play online.' Like, you think? We've had a 10-year head start on that stuff. The data and technology is baked into our DNA. I think some retailers would be astounded by it."

"I don't love trying stuff on. So could we use virtual clothing on the site to show people what it looks like on an avatar of themselves? That would be enormously helpful."

The idea behind all The Iconic's technology is to make every transaction more seamless.

The team is constantly testing and learning. Recently, it announced a partnership with AirRobe, an online resale platform. Customers can automatically add their just-bought clothing to AirRobe, so that if they want to sell it later, the information and imagery is ready for them to use.

Berchtold is waiting on early results of the AirRobe experiment, but is confident the re-economy is the future. "Reuse, repair, recycle," she says. "This is the way ahead." A repair program is in the works.

One of the big lessons of the past decade has been learning how to use data.

"You've got the data right there. But there's a real art to knowing what to do with it, the data to actually pay attention to, what you can just not ignore," Berchtold says. "That's actually a real skill, and it is very important and will only continue to grow in importance. And I don't think a lot of retailers would do that easily."

"It's a muscle you have to work, over and over. And we've had 10 years of trial and error in that space. So, we've got a pretty strong muscle."

Here is the original post:

The Iconic CEO Erica Berchtold wants to use the science of data to convert you to online shopping - The Australian Financial Review
