
V&A Consulting Engineers Announces the Release of Their Proprietary VANDA GRAVITY MAIN CLEANING INDEX – GlobeNewswire

OAKLAND, Calif., April 06, 2022 (GLOBE NEWSWIRE) -- The VANDA Gravity Main Cleaning Index was created by V&A to provide a standard for recording condition assessment data during gravity main cleaning with objective criteria. The index presents typical examples of conditions observed when cleaning gravity main segments. This index also becomes the standard rating methodology for machine learning need-for-cleaning predictions (characterized as LOF, or likelihood of failure) generated by V&A's data science services.

"The development of our indexsupports improved data recording and standardized reporting. We are driven to contribute to the water and wastewater industries with innovation, alignment, and collaboration, andthe indexes are just one way we are supporting municipalities," saidDebra Kaye, V&A CE0 & President.

The index is presented as a durable ruler, a tool for professionals assessing and reporting gravity main cleaning conditions. The VANDA Gravity Main Cleaning Index facilitates data collection that builds a foundation for data analysis and improved gravity main cleaning. The adoption of a standardized index positions collection systems for low-cost, sophisticated analytics that provide maintenance supervisors with insights for improving the cleaning process.

"V&A has been working with collection systems for the past two years to understand how to enable data-driven recommendations to gravity main cleaning process decision making. Because gravity mains are typically cleaned prior to CCTV inspection, CCTV data that provides operational guidance for removal of FOG, roots, or debris is limited," said Lars Stenstedt, V&A Data Science Manager. "Promoting the adoption of a cleaning condition assessment data standard is an important step in enabling collection systems to take advantage of available data science techniques and processes to become as efficient as possible with sewer system overflow (SSO) prevention maintenance."

CWEA Annual Conference attendees are invited to visit booth #629 for a free copy of the VANDA Cleaning Index and attend "Gravity Sewer Main Cleaning: Innovative Use of Data Science for Optimization," presented by Lars Stenstedt on April 11 at 10 a.m. at the Hyatt Regency Sacramento, 1209 L Street.

"Padre Dam has been collecting condition assessment data during gravity main cleaning for many years. This type of data has enabled Padre Dam to leverage our cleaning crew knowledge and observations for continuous improvement of our maintenance processes, Padre Dam supports V&A's promotion of the importance of gathering this kind of condition assessment data during gravity main cleaning," said Daniel Lockart, Maintenance Supervisor, Padre Dam Municipal Water District.

"The Town of Hillsborough collects condition assessment data during the cleaning process and has been working with V&A on the development of this standard index for the past year," said Rick Pina, Sewer Supervisor, Town of Hillsborough. "The process of developing this standard by leveraging senior staff and their institutional knowledge has enabled Hillsborough to streamline the onboarding and training of new maintenance staff. Understanding data-driven predictions helps us to focus our maintenance efforts removing FOG, roots and debris, as well as developing future CIP projects for the collection system."

About V&A Consulting Engineers

Headquartered in Oakland, CA, with offices in San Diego, CA, Houston, TX, and Sarasota, FL, and founded in 1979, V&A Consulting Engineers (V&A) is a multidisciplinary engineering team, led by Debra Kaye, CEO, concentrating on civil infrastructure, primarily in the fields of water, wastewater, and light rail transit. Visit http://www.vaengineering.com.

Media Contact: Robin Rhea | rrhea@vaengineering.com | 510.903.6600


This content was issued through the press release distribution service at Newswire.com.

See the original post here:

V&A Consulting Engineers Announces the Release of Their Proprietary VANDA GRAVITY MAIN CLEANING INDEX - GlobeNewswire

Read More..

Goldacre recommendations to improve care through use of data – GOV.UK

Patient care in the NHS will be improved through more efficient and safer access to health data, which will drive innovation and lead to potentially lifesaving research.

Professor Ben Goldacre, Bennett Professor of Evidence-Based Medicine at the University of Oxford, has today published the findings from his independent review into how the NHS can achieve better, broader and safer use of health data. Learning lessons from the pandemic, the review advises how to utilise health data in healthcare and sets out 185 recommendations to the government.

The pandemic has demonstrated the immense value of health data in driving research to improve patient outcomes and save lives, such as the discovery of dexamethasone as the first treatment for COVID-19. Large-scale data analysis enabled better understanding of patient outcomes more rapidly than previously possible. The speed and scale of this data analysis was possible through the interconnected nature of NHS systems, as well as specific legal measures to enable data access quickly.

Data also allows the NHS to continue delivering world-leading care, for instance by helping to understand whether different patient groups respond better to different treatment options, or to anticipate future demands on the healthcare system by tracking the prevalence of disease.

Health and Social Care Secretary, Sajid Javid, said:

Countless lives have been saved through the pandemic after health data enabled ground-breaking research.

As we move forwards, millions of patients could benefit from the more efficient use of health data through boosting innovation and ensuring the NHS can continue to offer cutting-edge care, saving lives.

I want to thank Professor Ben Goldacre, his team, and all those who contributed to this review. This work, alongside our upcoming data strategy, will help to transform the NHS on our road to recovery.

The review makes a range of proposals, including:

Professor Ben Goldacre said:

NHS data is a phenomenal resource that can revolutionise healthcare, research and the life sciences. But data alone is not enough. We need secure, efficient platforms and teams with skills to unleash this potential. This will be difficult, technical work. It is inspiring to see momentum grow for better, broader, safer use of health data across so many sectors.

The government's response to the review will be included in the upcoming Health and Social Care Data Strategy, which will set the direction for the use of data in a post-pandemic healthcare system.

The Goldacre Review was launched on 9 February 2021.

The Data Strategy was published in draft form in June 2021.

The government recently announced £200 million of funding for health data research and development.

Ben Goldacre is a clinical researcher at the University of Oxford where he is Director of the Bennett Institute for Applied Data Science, and Bennett Professor of Evidence-Based Medicine in the Nuffield Department of Primary Care Health Sciences.

He advises government on better uses of data and leads an academic team that uses large health datasets to deliver research papers and tools including:

He is also active in public engagement. His books, including Bad Science, have sold over 700,000 copies in more than 30 countries and his online lectures have over 5 million views.

Original post:

Goldacre recommendations to improve care through use of data - GOV.UK

Read More..

Leadership, talent and the digital future – Bangkok Post

How can leaders approach future-proofing their workforce?

I am frequently amazed at how much digital technology has transformed my work and the work of my people. For me as a senior leader, some aspects are great. It is much easier to communicate with more people inside and outside the organisation. Receiving updates is easier. I can also contribute to things like recruitment more effectively. At the same time, many of the skills I need have been transformed, and some are obsolete.

I am very conscious that the effect of digitisation is much bigger for many of my people. They may be more digitally native, but they do not have the experience of other staff in a rapidly changing workplace.

However, these people, many of whom we can consider the future of our organisations, cannot just be replaced. There simply is no alternative talent available to replace them. Demand far outstrips supply. We face a highly complex situation full of challenges and opportunities. The answer, I believe, is to develop these people as digital talent, with the right blend of capabilities.

The challenge is that technology and possibilities are outpacing the traditional education approaches many companies cling to, even if most development now takes place online. If leaders do not commit to developing digital talent at scale, they will run their businesses into economic choke points.

Additionally, the young people coming into the workforce today do not have the digital skills their industry requires. The available digital talent will be more expensive, and companies will not have a big enough pool of skilled workers in data science and AI.

There is no one-size-fits-all answer to these challenges.

My organisation has gone increasingly digital in the last year or so. We have added entire new departments and completely new types of talent. We are a relatively small and specialised organisation. We do not have the advantages of many big organisations, but we also do not face all the challenges of smaller businesses. We have made many mistakes, and we are still learning, but I would like to share a few of my observations in the hope they may help leaders like myself.

First, leaders need to take some time to understand the digital skills their future business needs. Do not jump at whatever is trendy. For us, data was a capability we needed to build. We also had to learn more about virtual delivery very quickly. Understanding helped us plan for development at scale and effectiveness in the areas critical for our business. I also had to consider our culture to identify potential barriers and actively lead my people in the required direction.

Second, I had to become, along with the rest of the leadership team, somewhat of a digital champion. I had to talk about digital. I had to be seen to become more digitally savvy, which was an enjoyable challenge. I had to highlight potential opportunities to use technology to do things better. Leaders doing this is essential. Leaders who don't become more digitally inclined can become massive barriers to successful wider adoption.

Third, I had to rethink how I could digitally educate my people at scale. I was lucky that I had some people already engaged in their own digital upskilling. Additionally, being in the education business provided some insights and resources. But fundamentally, how we did things transformed.

There were far fewer classes and courses. Instead, there were learning journeys and applications for our organisation's key jobs to be done, and there were challenges. Since many of us were relatively new to this, there was a lot more discussion, teaching each other, and sharing of tips and insights.

It worked, and although we have a long way to go on what is probably a never-ending journey, some of my most senior leaders and people are remarkably more digitally advanced than I would have thought possible.

Finally, I learned that developing hard digital skills by themselves was not sufficient. The data team I mentioned needed to develop the softer and thinking capabilities to not just make sense but to engage everyone else with the data.

In this time of a critical digital skills shortage, I do not believe there are any shortcuts. Leaders need to commit to a holistic approach to future-proofing their workforce. They have to commit to the long term and invest in their people.

They also need to teach their people that it is not just about building digital skills, it is about building the business. Whether you like it or not, for most companies, digital is now the business. Your customers are already more digital, and if your people do not have the digital skills to give them what they want, someone else will.

Even your dinosaurs can be brought along if you make developing their capabilities easy and user-friendly.

Arinya Talerngsri is Chief Capability Officer and Managing Director at SEAC, Southeast Asia's Lifelong Learning Center. She can be reached by email at arinya_t@seasiacenter.com or https://www.linkedin.com/in/arinya-talerngsri-53b81aa. Talk to us about how SEAC can help your business during times of uncertainty at https://forms.gle/wf8upGdmwprxC6Ey9

More here:

Leadership, talent and the digital future - Bangkok Post

Read More..

Canada needs workers so why aren’t more companies hiring the neurodivergent? – CBC News

The founders of a job fair for those with autism don't only want to find careers for an untapped workforce; they also hope employers will realize these highly skilled job seekers can help solve a national labour shortage.

"People with autism are very much capable of working and they are some of the best employees," said Neil Forester who, along with his business partner Xavier Pinto, created the Spectrum Works Job Fair that ran Friday.

Now in its sixth year, the job fair has grown from having 150 attendees to almost 2,000 job seekers with autism, all looking to connect with recruiters and hiring managers at major tech, finance, hospitality and retail companies across the country. Though it's been held in various cities, the job fair was a virtual event this year and last.

Getting companies to take part, though, has been a struggle.

Of the 10,000 employers Forester and his team have reached out to in the last six years, just 40 companies took part in this year's job fair.

"The majority of the time we don't get any response," Forester said.

The creators of the fair say they understand there is a wide range of abilities across the autism spectrum and, while perhaps not every person with autism is employable, both Forester and Pinto are confident a large portion of this community can and wants to work.

And Forester questions why more employers aren't looking at this neurodiverse talent pool to help solve the labour shortages that so many companies are experiencing.

In the last quarter of 2021, Canadian employers were looking to fill 915,500 jobs, up 63 per cent from the year before, according to Statistics Canada.

And with the current unemployment rate so low, "virtually all industries are bumping up against labour shortages," wrote Royal Bank economist Nathan Janzen in an economic update this week.

Even with the demand for workers, employment barriers remain for Canadians with autism.

Data compiled by the Public Health Agency of Canada found that in 2017 just 33 per cent of Canadian adults with autism reported being employed compared to 79 per cent of adults without a disability.

Forester said he was unaware of just how few neurodiverse employees there are in the workforce before he started the job fair.

"I just didn't realize how big of a problem this was or how big of an issue this was to the community," he said.

Javier Herrera is one of the comparatively few Canadians who are both employed and living with autism.

He attended the Spectrum Works job fair last year and got a job offer.

"It was overall a very positive experience. I met not only recruiters, but also other facilitators, coaches, government agencies, non-profits, you name it," said Herrera who now works as a business systems analyst with an insurance company based in Vancouver.

Herrera is encouraged to see that some employers purposefully seek out people with autism, but he feels that "as a society we are still doing baby steps" to get more people who are neurodiverse into the workforce.

That said, there are some companies specifically tapping into this talent pool, including two of the so-called "Big Four" accounting firms.

In the last few years, Ernst & Young has made strides in diversifying its hiring strategy.

The multinational launched the Neurodiversity Centre of Excellence in Toronto in November 2020, with a goal of recruiting employees with autism, ADHD or other sensory and cognitive differences.

"We're dying for talent as an organization," said Anthony Rjeily, a partner at Ernst & Young and the company's neurodiversity program national leader. "So we wanted to see if there was any talent pool out there that we could potentially tap into."

Since the launch of the program, the company has recruited 45 neurodiverse employees to their Toronto, Vancouver, Halifax and Montreal offices and plans to expand recruitment in other cities.

Rjeily said the initiative has more than paid off, noting the retention rate among neurodiverse candidates that the company has hired is 98 per cent.

"The level of creativity, the innovation, the productivity that they are able to deliver is incredible," he said.

Mohit Verma was one of the first people Ernst & Young hired in 2020 through the neurodiversity recruitment program.

"At EY my work revolves around certain sub-competencies such as automation, data science and, to some extent, blockchain," Mohit said in an interview with CBC News. "So far I have been part of five to six main projects."

Deloitte Canada is another corporation with an eye on hiring the neurodiverse.

In an attempt to better understand the barriers and workplace needs of neurodiverse workers, the accounting giant teamed up with Auticon Canada, a global technology consulting firm that employs people with autism, to survey what the needs of employees with autism might be.

The survey, 'Embracing neurodiversity at work: How Canadians with autism can help employers close the talent gap,' was done between July and October 2021. It included 454 respondents with autism who completed the survey online; seven companies with neurodiversity in their workforces were also interviewed over videoconferencing.

In their survey, they found that 41.7 per cent of respondents were underemployed, meaning they were working on a part-time, contract or temporary basis or were doing jobs that were "under their educational capabilities," said Roland Labuhn, a partner with Deloitte Canada.

One of the most eye-opening findings was that the hiring process itself could be a major barrier, as 40 per cent of those polled said the job interview was a "great challenge" for them.

"The people we surveyed felt that the interview was a trick or scary," said Labuhn, who worries that the typical job interview process could eliminate some highly qualified candidates with autism.

With a goal of getting better at both recruiting and retaining neurodiverse workers, companies like Deloitte and Ernst & Young are trying to change the interview process so that it focuses more on competence rather than how a candidate might behave in a certain scenario.

That kind of accommodation provides hope to people like Pinto and Forester.

The inspiration for their job fair came out of Pinto's concerns about his son's future. Xavi, 12, is on the spectrum and is "so creative," his father said.

He's "really focused on what he wants done."

And seeing more employers begin to sign up for the job fair gives him hope that he's helping to create a world in which his son can go after his dreams.

Read more:

Canada needs workers so why aren't more companies hiring the neurodivergent? - CBC News

Read More..

Cloud First is no longer enough; it's Cloud Everywhere that firms want: Som Satsangi – The Financial Express

As public expectations evolve, government IT departments have to continuously find new ways to support the needs of digital citizens, says Som Satsangi, senior vice-president & managing director, Hewlett Packard Enterprise, India. The task is complex, and IT needs to move quickly with limited resources as digital transformation is critical to success, he tells Sudhir Chowdhary. Excerpts:

What are the biggest challenges in the adoption of cloud in the public sector?

The main hindrances to adoption of cloud/achieving digital transformation by the public sector include security, data sovereignty, funding pressures, political complexity, regulatory curbs, and population scale and diversity. The pandemic has only added to the pressure, with increased demand for speed and resiliency to deliver critical services. The technology ambitions of enterprises have shifted from Cloud First to Cloud Everywhere as the hybrid world continues to explode. Newer applications and tech approaches are warranting a rethink of traditional deployment strategies.

How is HPE GreenLake Edge-to-Cloud Platform helping enterprises in their digital transformation?

Digital transformation is no longer a priority, but a strategic imperative, and data is essential to operate in the new digital economy, being at the heart of every modernisation initiative. And yet organisations have been forced to settle for legacy platforms that lack cloud-native capabilities, or go for complex migrations to the public cloud that require customers to adopt new processes and risk vendor lock-in.

The HPE GreenLake cloud services for data and analytics empower customers to overcome these trade-offs and give them one platform to unify and modernise data everywhere. It offers the agility and innovation of the cloud while preserving control of applications and workloads that need to run on premises. It helps public entities accelerate IT modernisation, reduce costs, and harness the power of data. Public sector entities like Steel Authority of India Ltd (SAIL) and ONGC have recently signed partnerships with HPE and deployed the HPE GreenLake edge-to-cloud platform to accelerate their digital transformation efforts.

HPE has played a significant role in India's digital transformation programmes, which have ranged from pure-play data centre builds to running digitisation programmes for the largest insurance company in the country to the national identity programme. We have also played a role in key digital projects and platforms which fundamentally impact every citizen and business in the country.

What are likely to be the top technology trends in 2022?

First and foremost, we believe there will be continued explosion of data at the edge, driven by the proliferation of devices which require secure connectivity. This data will have to be managed through its lifecycle, ensuring organisations gain insights. Second, there will be a mandate for a cloud-everywhere experience that allows customers to manage data and workloads across a distributed enterprise. Third, there will be a growing need to quickly extract value from data to generate insights and build new business models.

The government recently announced plans to set up 9 more supercomputers in Indian institutes. How are you contributing to this space?

This announcement will not only meet the increased computational demands of academia, researchers, MSMEs, and startups working in areas like oil exploration, flood prediction, genomics, and drug discovery but also firm up indigenous capability for developing supercomputers. HPE is committed to the development of an end-to-end HPC (high performance computing) ecosystem spanning processors, servers, and data centres, all effectively integrated to deliver industry-leading use-case capabilities. We are the world's largest supplier of HPC systems and have more than 100 customers in India using our HPC set-up. Almost all the top institutes in the scientific and research sector in India are using our HPC footprint.

Quote: "The HPE GreenLake cloud services for data and analytics give customers one platform to unify and modernise data everywhere. It offers the agility and innovation of the cloud while preserving control of applications and workloads that need to run on premises."

Follow this link:
Cloud First is no longer enough; it's Cloud Everywhere that firms want: Som Satsangi - The Financial Express

Read More..

5 Best IT Support in Cleveland, OH – Kev’s Best

Below is a list of the top and leading IT Support in Cleveland. To help you find the best IT Support located near you in Cleveland, we put together our own list based on this rating points list.

The top rated IT Support in Cleveland, OH are:

Forefront Technology Inc: the team at Forefront is profoundly passionate about the success of their clients. They are a young company made up of engineers and managers that spent a large share of their careers in the global technology operations of corporate America. It was their dream to leave large enterprises and bring their knowledge and expertise to small and medium businesses in a cost-effective way. Their yearning was to impact and enable the visions of those they work with in a powerful manner.

Forefront Technology is an outstanding, nationwide IT services engineering firm specializing in solutions that are open, scalable, and drive greater productivity and competitiveness for their clients. Their solutions and services portfolio provides their enterprise clients with Cloud, Security, Collaboration, Core Infrastructure, and Managed Services. Forefront Technology is a privately held company headquartered in Cleveland, Ohio.

Products/Services:

Manage Cloud Services, Advisory & Consulting, Onsite Resources/Skills Gap Fill, & More

LOCATION:

Address: 1360 W 9th St Suite 215, Cleveland, OH 44113 | Phone: (216) 223-3090 | Website: http://www.myforefronttech.com

REVIEWS:

Excellent customer service. The entire staff is very knowledgeable. I've completed many projects with them and will continue to work with them in the future. Andrew D.

FIT Technologies is proud to assist many clients, varying in size, sector, and service needs throughout Ohio and other locations across the country. They are a company that has improved their service offering over the years, but they have been constant about the way in which they want to do business: in cooperation with their clients. They understand the only way to become the trusted IT advisor for an organization is to have a terrific team of talented people who are committed to customer service. They have created an amicable atmosphere in which employees work together to gain the confidence of their customers by building and helping their tech capabilities.

Products/Services:

Manage, Develop, Strategize, Implement

LOCATION:

Address: 1375 Euclid Ave #310, Cleveland, OH 44115 | Phone: (216) 583-5000 | Website: http://www.fittechnologies.com

REVIEWS:

This IT Team is the best. They are quick in resolving issues and very knowledgeable. Thanks, FIT. Elizabeth H.

Acroment IT Services: since 2004, Acroment Technologies has utilized a one-of-a-kind approach to help small businesses across Northeast Ohio lower their costs, boost productivity, and get the most out of their technology investment. That's more than a decade of providing excellent IT services and solutions, and in the IT industry, that's a lifetime. They believe it's because they realize that your business is not technology; your business simply depends on technology to keep it running smoothly.

When you pick Acroment Technologies to be your IT department, you can stop worrying about trying to keep up with the fast-moving world of technology, because they will take care of it for you.

Products/Services:

Managed Services, Cloud Services, Virtualization, Email & Spam Protection, Data Backup, Free System Assessment

LOCATION:

Address: 1579 W 117th St, Cleveland, OH 44107 | Phone: (216) 255-6300 | Website: http://www.acroment.com

REVIEWS:

They fix everything in a timely manner with communication. Michael S.

Green Line Solutions Business IT Support was founded in 2011 by brothers Brad & Nate Holton. In the nearly 10 years since Green Line was founded, the company has expanded substantially to become a highly trusted name for hundreds of businesses and now supports them across 10 states. While Green Line's customer base and staff have grown greatly, the belief it was founded on has remained the same, and that's what still makes it successful today.

They endeavor to be different from what you may be used to when it comes to IT companies. If you're searching for the best IT services provider in Cleveland, you've found them. They are not here to force you into an expensive and confusing managed services agreement with hidden traps that increase your service charges. They are just here to help you run your business more effectively.

Products/Services:

IT Consulting, Managed Services, IT & Technical Support, Project Management, & More

LOCATION:

Address: 4176 W 130th St, Cleveland, OH 44135 | Phone: (216) 930-9301 | Website: http://www.greenmakesithappen.com

REVIEWS:

Excellent service and friendly staff. Resolved all my issues the same day. John F.

Kloud9 IT-Cleveland was established in 2006, beginning as a simple computer repair and consulting company that would eventually grow into something more. Founder Trent Milliron is an IT professional with years of experience and a distinctive perspective on the tech industry. That experience motivated him to take an innovative approach to IT, opening Kloud9 to help businesses find tech solutions. As an entrepreneur himself, Trent understands both the tech side and the business side and has put his energy into developing an IT process designed for business owners.

At Kloud9, their mission is to give their clients fast, friendly, and professional computer support and telephone solutions while maintaining an unparalleled level of customer service. They build reliable relationships with their customers, employees, and partners. They handle problems in a professional, competent, and timely manner. The communities they serve see them as valuable, contributing members. This emphasis allows them to build lasting relationships and remain competitive in the markets they serve.

Products/Services:

Managed Services, IT Consulting, Business IT Support, Help Desk, Cloud Computing, Cloud Servers, Microsoft Office 365, Virtualization, Virtual Desktops, & More

LOCATION:

Address: 9999 Granger Rd, Cleveland, OH 44125 | Phone: (216) 393-2484 | Website: http://www.kloud9it.com

REVIEWS:

Good people to work with and a good IT service provider. Jacob L.

Ermily has worked as a journalist for nearly a decade, having contributed to several large publications online. As a business expert, Ermily reviews local and national businesses.

The rest is here:
5 Best IT Support in Cleveland, OH - Kev's Best

Read More..

Macro Trends in the Technology Industry, March 2022 – iTWire

As we put together the Radar, we have a ton of interesting and enlightening conversations discussing the context of the blips, but not all of this extra information fits into the Radar format.

These macro trends articles allow us to add a bit of flavor and to zoom out and see the wider picture of what's happening in the tech industry.

The ongoing tension between client- and server-based logic

Long industry cycles tend to cause us to pendulum back and forth between a client and server emphasis for our logic. In the mainframe era we had centralised computing and simple terminals, so all the logic (including where to move the cursor!) was handled by the server. Then came Windows and desktop apps, which pushed more logic and functionality into the clients, with two-tier applications using a server mostly as a data store and with all the logic happening in the client. Early in the life of the internet, web pages were mostly just rendered by web browsers, with little logic running in the browser and most of the action happening on the server. Now with web 2.0 and mobile and edge computing, logic is again moving into the clients.

On this edition of the Radar, a couple of blips are related to this ongoing tension. Server-driven UI is a technique that allows mobile apps to evolve somewhat in between client code updates, by allowing the server to specify the kinds of UI controls used to render a server response. TinyML allows larger machine learning models to be run on cheap, resource-constrained devices, potentially allowing us to push ML to the extreme edges of the network.
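
To make the server-driven UI idea concrete, here is a minimal sketch in Python (chosen purely for illustration; real implementations are usually JSON payloads consumed by native mobile clients). The payload shape, component names and renderer registry are assumptions rather than any particular framework's API: the server decides what to show, while the client only knows how to draw each component type it supports.

```python
# Hypothetical server-driven UI payload: the server declares the screen,
# so layout changes can ship without a client release.
screen_payload = {
    "screen": "order_status",
    "components": [
        {"type": "header", "text": "Your order"},
        {"type": "progress_bar", "value": 0.6},
        {"type": "button", "label": "Track shipment", "action": "open_tracking"},
    ],
}

def render_header(c):
    print(f"== {c['text']} ==")

def render_progress_bar(c):
    filled = int(c["value"] * 20)
    print("[" + "#" * filled + "-" * (20 - filled) + "]")

def render_button(c):
    print(f"({c['label']}) -> triggers '{c['action']}'")

# The client's only hard-coded knowledge is this registry of renderers.
RENDERERS = {
    "header": render_header,
    "progress_bar": render_progress_bar,
    "button": render_button,
}

for component in screen_payload["components"]:
    renderer = RENDERERS.get(component["type"])
    if renderer is not None:
        renderer(component)
    # Unknown component types are skipped so older clients degrade gracefully.
```

The tradeoff is the one this section describes: the more rendering decisions move to the server, the less the client can do offline or optimise natively.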

The takeaway here is not that there's some new right way of structuring a system's logic and data, but rather that it's an ongoing tradeoff that we need to constantly evaluate. As devices, cloud platforms, networks and middle servers gain capabilities, these tradeoffs will change, and teams should be ready to reconsider the architecture they have chosen.

Gravitational software

While working on the Radar, we often discuss things that we see going badly in the industry. A common theme is over-use of a good tool, to the point where it becomes harmful, or of using a specific kind of component beyond the margins in which it's really applicable. Specifically, we see a lot of teams over-using Kubernetes ("Kubernetes all the things!") when it isn't a silver bullet and won't solve all our problems. We've also seen API gateways abused to fix problems with a back-end API, rather than fixing the problem directly.

We think that the gravity of software is an explanation for these antipatterns. This is the tendency for teams to find a center of gravity for behavior, logic, orchestration and so on, where it's easier or more convenient to just continue to add more and more functionality, until that component becomes the center of a team's universe. Difficulties in approving or provisioning alternatives can further lead to inertia around these pervasive system components.

The industry's changing relationship to open source

The impact of open source software on the world has been profound. Linux, started by a young programmer who couldn't afford a commercial Unix system but had the skills to create one, has grown to be one of the most used operating systems of our time. All of the top 500 supercomputers run on Linux, and 90% of cloud infrastructure uses it. From operating systems to mobile frameworks to data analytics platforms and utility libraries, open source is a daily part of life as a modern software engineer. But as industry and society at large have been discovering, some very important open source software has a bit of a shaky foundation.

"It takes nerves of steel to work for many years on hundreds of thousands of lines of very complex code, with every line of code you touch visible to the world, knowing that code is used by banks, firewalls, weapons systems, web sites, smart phones, industry, government, everywhere. Knowing that you'll be ignored and unappreciated until something goes wrong," comments OpenSSL Foundation founder Steve Marquess.

Heartbleed was a bug in OpenSSL, a library used to secure communication between web servers and browsers. The bug allowed attackers to steal a server's private keys and hijack users' session cookies and passwords. The bug was described as catastrophic by experts, and affected about 17% of the internet's secure web servers. The maintainers of OpenSSL patched the problem less than a week after it was reported, but remediation also required certificate authorities to reissue hundreds of thousands of compromised certificates. In the aftermath of the incident it turned out that OpenSSL, a security-critical library containing over 500,000 lines of code, was maintained by just two people.

Log4Shell was a recent problem with the widely-used Log4j logging library. The bug enabled remote access to systems and again was described in apocalyptic terms by security experts. Despite the problem being reported to maintainers, no fix was forthcoming for approximately two weeks, until the bug had started to be exploited in the wild by hackers. A fix was hurriedly pushed out, but left part of the vulnerability unfixed, and two further patches were required to fully resolve all the problems. In all, more than three weeks elapsed between the initial report and Log4j actually having a fully secure version available.

It is important to be very clear that we are not criticizing the OpenSSL and Log4j maintenance teams. In the case of Log4j, it's a volunteer group who worked very hard to secure their software, gave up evenings and weekends for no pay, and had to endure barbed comments and angry tweets while fixing a problem with an obscure Log4j feature that no person in their right mind would actually want to use but which only existed for backwards-compatibility reasons. The point remains, though: open source software is increasingly critical to the world but has widely varying models behind its creation and maintenance.

Open source exists between two extremes. Companies like Google, Netflix, Facebook and Alibaba release open source software which they create internally, fund its continued development, and promote it strongly. We'd call this "professional open source," and the benefit to those big companies is largely about recruitment: they're putting software out there with the implication that programmers can join them and work on cool stuff like that. At the other end of the spectrum there is open source created by one person as a passion project. They're creating software to scratch a personal itch, or because they believe a particular piece of software can be beneficial to others. There's no commercial model behind this kind of software and no one is being paid to do it, but the software exists because a handful of people are passionate about it. In between these two extremes are things like Apache Foundation supported projects, which may have some degree of legal or administrative support and a larger group of maintainers than the small projects, and commercialized open source, where the software itself is free but scaling and support services are a paid add-on.

This is a complex landscape. At Thoughtworks, we use and advocate for a lot of open source software. We'd love to see it better funded but, perversely, adding explicit funding to some of the passion projects might be counterproductive: if you work on something for fun because you believe in it, that motivation might go away if you were being paid and it became a job. We don't think there's an easy answer, but we do think that large companies leveraging open source should think deeply about how they can give back and support the open source community, and they should consider how well supported something is before taking it on. The great thing about open source is that anyone can improve the code, so if you're using the code, also consider whether you can fix or improve it too.

Securing the software supply chain

Historically there's been a lot of emphasis on the security of software once it's running in production: is the server secure and patched? Does the application have any SQL injection holes or cross-site scripting bugs that could be exploited to crack into it? But attackers have become increasingly sophisticated and are beginning to attack the entire path to production for systems, which includes everything from source control to continuous delivery servers. If an attacker can subvert the process at any point in this path, they can change the code and intentionally introduce weaknesses or back doors and thus compromise the running systems, even if the final server on which the software is running is very well secured.

The recent exploit for Log4j, which we mentioned in the previous section on open source, shows another vulnerability in the path to production. Software is generally built using a combination of from-scratch code specific to the business problem at hand, as well as library or utility code that solves an ancillary problem and can be reused in order to speed up delivery. Log4Shell was a vulnerability in Log4j, so anyone who had used that library was potentially vulnerable (and given that Log4j has been around for more than a decade, that could be a lot of systems). Now the problem became figuring out whether software included Log4j, and if so which version of it. Without automated tools, this is an arduous process, especially when the typical large enterprise has thousands of pieces of software deployed.

The industry is waking up to this problem, and we previously noted that even the US White House has called out the need to secure the software supply chain. Borrowing another term from manufacturing, a US executive order directs the IT industry to establish a software bill of materials (SBOM) that details all of the component software that has gone into a system. With tools to automatically create an SBOM, and other tools to match vulnerabilities against an SBOM, the problem of determining whether a system contains a vulnerable version of Log4J is reduced to a simple query and a few seconds of processing time. Teams can also look to Supply chain Levels for Software Artifacts (SLSA, pronounced salsa) for guidance and checklists.
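
As a rough illustration of that "simple query" claim, here is a minimal Python sketch that scans a CycloneDX-style SBOM (JSON) for Log4j components. The file name, the cutoff version and the naive version comparison are illustrative assumptions; in practice the SBOM would be fed to a vulnerability scanner that tracks current advisories.

```python
import json

def find_log4j(sbom_path: str, fixed_version: str = "2.17.1"):
    """Return (version, status) pairs for any log4j-core components in the SBOM."""
    with open(sbom_path) as f:
        sbom = json.load(f)

    findings = []
    for component in sbom.get("components", []):
        if (component.get("group") == "org.apache.logging.log4j"
                and component.get("name") == "log4j-core"):
            version = component.get("version", "unknown")
            # Plain string comparison is NOT a safe version check; it only keeps
            # the sketch short. Use a proper version parser or a scanner.
            suspect = version == "unknown" or version < fixed_version
            findings.append((version, "review" if suspect else "ok"))
    return findings

if __name__ == "__main__":
    # "sbom.json" is a placeholder path for an SBOM generated by your build tooling.
    for version, status in find_log4j("sbom.json"):
        print(f"log4j-core {version}: {status}")
```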

Suggested Thoughtworks podcast: Securing the software supply chain

The demise of standalone pipeline tools

Demise is certainly a little hyperbolic, but the Radar group found ourselves talking a lot about GitHub Actions, GitLab CI/CD, and Azure Pipelines, where the pipeline tooling is subsumed into either the repo or the hosting environment. Couple that with the previously observed tendency for teams to use the default tool in their ecosystem (GitHub, Azure, AWS, etc.) rather than looking at the best tool, technique or platform to suit their needs, and some of the standalone pipeline tools might be facing a struggle. We've continued to feature standalone pipeline tools such as CircleCI, but even our internal review cycle revealed some strong opinions, with one person claiming that GitHub Actions did everything they needed and that teams shouldn't use a standalone tool. Our advice here is to consider both default and standalone pipeline tools and to evaluate them on their merits, which include both features and ease of integration.

SQL remains the dominant ETL language

We're not necessarily saying this is a good thing, but the venerable Structured Query Language remains the tool the industry most often reaches for when there's a need to query or transform data. Apparently, no matter how advanced our tooling or platforms are, SQL is the common denominator chosen for data manipulation. A good example is the preponderance of streaming data platforms that allow SQL queries over their state, or use SQL to build up a picture of the in-flight data stream, for example ksqlDB.

SQL has the advantage of having been around since the 1970s, with most programmers having used it at some point. That's also a significant disadvantage: many of us learnt just enough SQL to be dangerous, rather than competent. But with additional tooling, SQL can be tamed, tested, efficient and reliable. We particularly like dbt, a data transformation tool with an excellent SQL editor, and SQLFluff, a linter that helps detect errors in SQL code.
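
As a small, self-contained illustration of SQL acting as that common denominator, the sketch below runs a typical aggregation transform through Python's built-in sqlite3 module. The table and column names are made up, and SQLite merely stands in for whatever engine would really execute the SQL (a warehouse, a streaming platform such as ksqlDB, or a dbt model).

```python
import sqlite3

# An in-memory database standing in for the real engine.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        (1, 'acme',   120.0),
        (2, 'acme',    80.0),
        (3, 'globex',  45.5);
""")

# The transformation itself is plain SQL -- not Python-specific logic.
transform = """
    SELECT customer,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_spend
    FROM orders
    GROUP BY customer
    ORDER BY total_spend DESC
"""

for row in conn.execute(transform):
    print(row)  # e.g. ('acme', 2, 200.0)
```

Tools like SQLFluff lint exactly this kind of embedded SQL, and dbt turns collections of such SELECT statements into tested, versioned models.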

The never-ending quest for the master data catalogue

A continuing theme in the industry is the importance and latent value of corporate data, with more use cases arising that can take advantage of this data, coupled with interesting and unexpected new capabilities arising from machine learning and artificial intelligence. But for as long as companies have been collecting data, there have been efforts to categorise and catalogue the data and to merge and transform it into a unified format, in order to make it more accessible, more reusable, and to generally unlock the value inherent in the data.

Strategy for unlocking data often involves creating what's called a master data catalogue: a top-down, single corporate directory of all data across the organisation. There are ever more fancy tools for attempting such a feat, but they consistently run into the hard reality that data is complex, ambiguous, duplicated, and even contradictory. Recently the Radar has included a number of proposals for data catalogue tools, such as Collibra.

But at the same time, there is a growing industry trend away from centralized data definitions and towards decentralised data management through techniques such as data mesh. This approach embraces the inherent complexity of corporate data by segregating data ownership and discovery along business domain lines. When data products are decentralised and controlled by independent, domain-oriented teams, the resulting data catalogues are simpler and easier to maintain. Additionally, breaking down the problem this way reduces the need for complex data catalogue tools and master data management platforms. So although the industry continues to strive for an answer to the master data catalogue problem, we think it's likely the wrong question and that smaller decentralised catalogues are the answer.

That's all for this edition of Macro Trends. Thanks for reading and be sure to tune in next time for more industry commentary. Many thanks to Brandon Byars, George Earle, and Lakshminarasimhan Sudarshan for their helpful comments.

More:
Macro Trends in the Technology Industry, March 2022 - iTWire

Read More..

SMBStream for Accelerated VPN-Less Access to SMB shares, is Now Available in the AWS Marketplace – openPR

London, CA, April 09, 2022 --(PR.com)--Storage Made Easy, with a mission of simplifying storage for everyone, announced today that their new SMBStream product can now be launched directly from the AWS Marketplace.

SMBStream provides high-performance, secure access to file servers in the cloud, in data centers, and between geographically distributed offices across the world. Unlike using a VPN, users and applications have speedy access to the file data they need in real-time, and the solution scales as more users are added.

Launching SMBStream from the AWS Marketplace makes it even easier to consolidate file servers into the cloud, to include remote storage in cloud workloads and to integrate distributed file storage into the Enterprise File Fabric platform.

SMBStream Highlights:

Real-time Access - Users are able to access live file storage over the internet. Real-time access means there is no office cache to procure, no snapshots to synchronize, and no global locking challenges.

Fast - SMBStream enables productive use of remote file systems from distributed offices, improving remote file access up to 15 times compared to a traditional VPN.

Secure - Adds key authentication, non-repudiation and AES-256 encryption for secure access over the public internet.

Vendor Neutral - Extends the reach of your SMB-compatible file servers, including Amazon FSx, Nasuni, and NetApp Cloud Volumes.

Adam Faircloth, IT Director at Anthologic, a digital media company, said about SMBStream: "With so much of our team working remotely, accessing local NAS storage takes too long and frustrates users. Using SMBStream, access to those file shares from the cloud is many times faster; it's like magic and just what we needed."

For more information about SMBStream visit: https://storagemadeeasy.com/smbstream/

Here is the original post:
SMBStream for Accelerated VPN-Less Access to SMB shares, is Now Available in the AWS Marketplace - openPR

Read More..

Heard on the Street 4/5/2022 – insideBIGDATA

Welcome to insideBIGDATA's Heard on the Street round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI and deep learning. Enjoy!

Factors influencing the demand for AI in today's world. Commentary by Shubham A. Mishra, Global CEO and Co-Founder of Pixis

AI is fundamentally changing the way businesses communicate with their audiences, in that it's helping improve the accuracy of communication and targeting. With an expected growth of 40.2% over the next few years, AI will be transformative to any business, helping them provide a seamless customer experience. With AI, businesses are able to get data-backed insights into performance, which empowers practitioners across the board to get clarity on the effectiveness of their efforts. Marketers will gain sharper insights into what the audience wants, thus optimizing their business growth. In the next few years, we will witness modern AI generative networks completely rebooting the landscape of digital content creation and empowering brands to hyper-tune their messaging to every single potential customer. With the shift to the cookieless web, AI is going to play an important role in promoting strategic process, and will be in the front seat of executing any campaign because of its power to optimize efficiency through its self-evolving nature.

The Growing Impact of Data Storytelling and How to Harness It. Commentary by Mathias Golombek, CTO, Exasol

As more organizations today become increasingly data-driven, they are using data storytelling to glean the most accurate, meaningful, and actionable insights from their data. Data storytelling provides the much-needed context for painting a clearer picture. Without this context, data insights can fall flat. For business leaders, data storytelling explains what the data is showing and why it matters. According to Exasol's research, nearly all (92%) IT and data decision-makers surveyed agreed that storytelling is an effective means of delivering the findings of data and analytics. Given this trend, there is a major demand for data storytellers across all industries, with companies seeking to build best-in-class data teams with people from different backgrounds and various skillsets. These modern data scientists need to have more than just technical knowledge and advanced data science skills; they also need the ability to interpret data for business-focused stakeholders. To make data storytelling truly successful, organizations must empower knowledge workers to become more data savvy so that they can also interpret the data along with their more technical counterparts. Data storytelling isn't just about being able to work a data platform; it's also about data literacy skills and the ability to communicate more widely: understanding the business context and the importance of numbers, and then breaking those down into a pithy, compelling narrative. Fortunately, there are new, smarter self-service tools that help both teams turn data into stories, including easy-to-use BI tools, self-service data exploration and data preparation solutions, and auto-machine learning tools that enable nearly all employees to interpret complex information on their own, act on their findings and then tell their own data stories.

Apple outage potentially caused by lack of connection between systems and data. Commentary by Buddy Brewer, New Relic Group Vice President & General Manager

As companies scale and tech stacks become more complex, the risk of outages will rise. Outages like this can happen to any company at any time. When an outage happens, the impact to the business can snowball really fast. Not only is the IT team trying to get the system back up and running, they are also fielding what can be a massive influx of requests, ranging from internal stakeholders up to the Board level to customer complaints. Minimizing the time to understand the issue is critical. What makes this difficult is that most companies have observability data scattered everywhere. The first thing any company needs to do to fix the issue is to focus on connecting the data about their systems together, ideally storing it all together so that they can gain a single-pane-of-glass view of their system to resolve issues quickly, minimizing the impact to their end users. Redundancy in the form of failovers, multi-cloud, and more is also important for the resilience of their system.

Why More Companies are Leaving Hadoop. Commentary by Rick Negrin / Vice President, Product Management, Field CTO at SingleStore

While Hadoop started off with the promise of delivering faster analytical performance on large volumes of data at lower costs, its sheen has worn off and customers are finding themselves stuck with complex and costly legacy architectures that fail to deliver insights and analytics fast. It doesn't take more than a quick Google search to see why enterprises around the world are retiring Hadoop. It wasn't built to execute fast analytics or support the data-intensive applications that enterprises demand. Moreover, Hadoop requires significantly more hardware resources than a modern database. That's why more companies are seeking out replacements or, at the very least, augmenting Hadoop. As an industry, we must meet the needs of demanding, real-time data applications. We must ensure there are easier, more cost and energy efficient choices for users who need reliable data storage and rapid analytics for this increasingly connected world.

Snowflake's new cloud service signals a new trend of industry-based cloud offerings. Commentary by Clara Angotti, President at Next Pathway

Snowflake's new Healthcare & Life Sciences Data Cloud is a great example of the new trend toward vertically specialized cloud offerings and services. The use of the cloud is becoming purpose-driven as companies choose cloud data warehouses and cloud platforms based on their ability to enable specialized business change. Companies are looking for industry-specific solutions based on applications, services, security and compliance needs to drive unique business outcomes. As this market becomes more lucrative and competitive, the players will look to differentiate themselves through unique, vertical offerings.

New Data Privacy Laws. Commentary by David Besemer, VP/Head of Engineering at Cape Privacy.

If data is kept encrypted when stored in the cloud, the risks associated with unauthorized access are mitigated. That is, even if data becomes inadvertently exposed to outside actors, the encryption maintains the privacy of the data. The key to success with this approach is to encrypt the data before moving it to the cloud, and then keep it encrypted even while processing the data.
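
A minimal sketch of that "encrypt before it leaves your boundary" idea, using the Fernet recipe from the widely used Python cryptography package (symmetric, AES-based authenticated encryption). The inline key generation and the commented-out upload call are placeholders; real deployments keep keys in a KMS/HSM and often use field-level or format-preserving schemes so the data can still be processed while encrypted.

```python
from cryptography.fernet import Fernet

# Illustration only: real systems fetch the key from a KMS/HSM and never store
# it alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b'{"customer": "acme", "ssn": "000-00-0000"}'  # dummy data

# Encrypt locally, BEFORE the data is handed to the cloud provider...
ciphertext = fernet.encrypt(record)

# ...so anything stored or inadvertently exposed upstream stays opaque.
# upload_to_cloud(ciphertext)  # placeholder for the actual storage call

# Authorized consumers holding the key can still recover the plaintext.
assert fernet.decrypt(ciphertext) == record
```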

How AI Puts a Company's Most Valuable Asset to Work. Commentary by David Blume, VP of Customer Success, RFPIO

Every business leader asks themselves a common question: how do I get my employees to perform their best work? And while effective hiring, pay incentives and a positive workplace environment all play a large role, one aspect that often gets overlooked involves the tools that, once implemented, can improve every aspect of employee efficiency. That's where machine learning-driven response management software comes into play. Response management software that incorporates machine learning helps employees at every level utilize their company's most valuable resource: knowledge. When a company invests in the technology that allows its workers to utilize all their content in an accurate, accessible and effective manner, it can have wide-ranging and substantial benefits to the organization. For higher-level executives, this helps reduce repetitive questions from lower-level staff and minimizes errors as a result of old or inaccurate data. For employees who just joined a company, the onboarding process will be quicker and more streamlined as many of the questions they will have can now be easily accessible, accurate, and addressable via a shared knowledge library. Used properly, response management software can improve employee productivity, resulting in increased ROI and boosted bottom lines.

Cloud Costs On the Rise? Developers are less than thrilled. Commentary by Archera CEO Aran Khanna

It's no secret that cloud costs are devilishly hard to predict and control, but at least they trend downward over time. Every customer faces a visibility puzzle, making it tough to understand which team is causing cloud spending shocks and whether they stem from traffic gains (good) or wasteful deployments (bad). Then throw in complex billing structures, countless resource and payment options, and the fact that customers consume before they pay. Small wonder the house always seems to win when the invoice arrives, regardless of which cloud provider the house is. You can now add price inflation to these challenges, with Google boosting some prices 100% starting October 1st. But some prices (certain archival storage at rest options, among others) are dropping. Or capacity changes: Always Free internet egress will jump from 1 GB to 100 GB per month. Developers, overloaded trying to control cloud costs in a blizzard of choices, are not thrilled with the new flexibility. The answer? A Google FAQ encourages customers to better align their applications to these new business models to mitigate some of the price changes. IT cannot compare millions of choices, however, and still hope to get actual work done. They need a sustainable methodology and the capability for centralized, automated cloud resource management to correctly match provider options to consumption. We recommend a dynamic monthly sequence of establishing visibility, forecasts, then budgets, governance, contract commitments, and monitoring and adjusting. This lets organizations avoid unnecessary spending, even when the world goes upside-down and prices inflate.

Removing Human Bias From Environmental Evaluation. Commentary by Toby Kraft, CEO, Teren

Historically, environmental data has been captured and evaluated through boots on the ground and local-area experts, relying heavily on human interpretation to glean insights. Subject matter and local experts make decisions based on data retrieved from field surveys, property assessments and publicly available data that may or may not be up to date and accurate. Additionally, the human interpretation of these data lends itself to error and inconsistencies, which can have dramatic impacts on the companies using it, such as insurance and construction organizations. Remotely sensed data, geospatial technology, and data science can eliminate human error in environmental data by automating highly accurate, relevant data capture, processing and interpretation. Automated analytics can extract unbiased, reliable and, most importantly, replicable insights across industries to detect, monitor and predict environmental changes. Companies have long used geospatial technology for large-scale asset management; however, the data is generally limited to infrastructure, without much insight into the environmental conditions within which the asset is situated. Asset owners are now combining remotely sensed data, machine learning, and geospatial technologies to manage the environmental data surrounding an asset and proactively mitigate potential threats. Insurance and construction firms can take note and apply the same methodology to underwriting and project scoping, saving time and lowering risk before an asset is even operational.
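
The "replicable" part of automated interpretation is the key contrast with human judgment: the same rule applied to the same rasters always yields the same flags. A toy sketch, assuming two co-registered surveys represented as NumPy arrays (the values and threshold are invented for illustration):

```python
# Minimal sketch of repeatable change detection on remotely sensed data:
# difference two surveys and flag cells that changed beyond a fixed threshold.
import numpy as np

survey_2021 = np.array([[10.1, 10.0, 9.8],
                        [10.2, 10.1, 9.9],
                        [10.0, 10.0, 9.7]])
survey_2022 = np.array([[10.1, 10.0, 9.8],
                        [10.2,  9.3, 9.9],
                        [10.0, 10.0, 8.9]])

change = survey_2022 - survey_2021
flagged = np.argwhere(np.abs(change) > 0.5)   # same rule every run, no judgment call
print(flagged)   # row/col indices of cells needing review
```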

NVIDIA: We Are A Quantum Computing Company. Commentary by Lawrence Gasman, President of Inside Quantum Technology

Quantum has evolved to the point where a semiconductor giant and a Wall Street darling like Nvidia is identifying itself as a quantum computing company. That's a huge development for a company that's been making strides by circling the market, creating the cuQuantum software kit for quantum simulations, currently used by Pasqal, IBM, Oak Ridge National Laboratory (ORNL) and others. The recent announcement of a new quantum compiler and a new software appliance to run quantum jobs in data centers is a further statement that Nvidia is intent on pursuing quantum market opportunities. The company already serves the high-performance computing community with powerful processors and accelerated architectures. This shift will help advance a unified programming model for hybrid classical-quantum systems.
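
The workload cuQuantum accelerates is statevector (and tensor-network) simulation of quantum circuits on GPUs. As a toy illustration of what that computation is, and explicitly not the cuQuantum API, the plain-NumPy sketch below prepares a two-qubit Bell state.

```python
# Toy statevector simulation (illustrative only, not cuQuantum): build a Bell state.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                   # |00>
state = CNOT @ (np.kron(H, I) @ state)           # H on qubit 0, then CNOT
print(np.round(state, 3))                        # ~[0.707, 0, 0, 0.707]
```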

The risks that come with big data: highlighting the need for data lineage. Commentary by Tomas Kratky, founder and CEO, MANTA

The benefits of harnessing big data are obvious: it feeds the applications powering our digital world today, like advanced algorithms, machine learning models, and analytics platforms. To get the desired value, we deploy tens or hundreds of technologies like streaming, ETL/ELT/reverse ETL, APIs or microservices. Such complexity poses some serious risks to organizations, and a solid data management strategy is needed to remove any blind spots in your data pipelines. One heavily overlooked risk with big data architectures (or frankly any complex data architectures) is the risk of data incidents, the extreme costs associated with incident resolution, and the limited availability of solutions enabling incident prevention. An associated risk is low data quality. Data-driven decisions can only be as good as the quality of the underlying data sets and analysis. Insights gleaned from error-filled spreadsheets or business intelligence applications might be worthless or, in the worst case, could lead to poor decisions that harm the business. Thirdly, compliance has become a nightmare for many organizations in the era of big data. As the regulatory environment around data privacy becomes more stringent, and as big data volumes increase, the storage, transmission, and governance of data become harder to manage. To minimize compliance risk, you need to gain a line of sight into where all your organizational data has been and where it's going.
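
Data lineage is naturally modeled as a directed graph: given a problem surfaced in a report, walk upstream to find every source that feeds it, or walk downstream from a broken source to estimate the blast radius of an incident. A minimal sketch of that idea, assuming the `networkx` library and illustrative table names (not MANTA's product):

```python
# Minimal sketch of table-level lineage as a directed graph.
import networkx as nx

lineage = nx.DiGraph()
lineage.add_edges_from([
    ("crm.accounts", "staging.accounts"),
    ("erp.orders", "staging.orders"),
    ("staging.accounts", "warehouse.revenue"),
    ("staging.orders", "warehouse.revenue"),
    ("warehouse.revenue", "bi.quarterly_report"),
])

print(nx.ancestors(lineage, "bi.quarterly_report"))   # everything upstream of the report
print(nx.descendants(lineage, "erp.orders"))          # blast radius of an incident in one source
```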

Why semantic automation is the next leap forward in enterprise software. Commentary by Ted Kummert, Executive Vice President of Products and Engineering, UiPath

Demand for automation continues to skyrocket as organizations recognize the benefits of an automation platform in improving productivity despite labor shortages, accelerating digital transformation during pandemic-induced challenges, and enhancing both employee and customer experiences. Semantic automation enhances automation by reducing the gap between how software robots currently operate and the capacity to understand processes the way humans do. By observing and understanding how humans complete certain tasks, software robots powered by semantic automation can better understand the intent of the user, in addition to relationships between data, documents, applications, people, and processes. Robots that can understand higher levels of abstraction will simplify development and deployment of automation. Further, as software robots continue to learn how to complete tasks and identify similarities, organizations will see better outputs from their automations and can also find new opportunities to scale across the business.

The Best Data Science Jobs in the UK according to data. Commentary by Karim Adib, Data Analyst, The SEO Works

According to a report commissioned by the UK government, 82% of job openings advertised online across the UK require digital skills. While digital industries are booming, some are easier to break into than others, and some pay more than others. The Digital PR team at The SEO Works gathered data on average salaries from top UK job boards Glassdoor, Indeed, and Prospects to reveal some of the most in-demand digital jobs, along with how difficult it is to get started in them. All smart businesses and organizations now use data to make decisions, so there is a growing demand for these jobs. It's also not too hard to get into data science compared with some other digital jobs. All three of the data science jobs analyzed fall between 40 and 60 on the difficulty score because of the long time frame associated with getting into data science and the degree requirements many of the jobs carry. Data Analyst came out with the best salary-to-difficulty ratio in the study, with Data Scientist just behind it. More and more businesses are adopting data as a way to make informed decisions and, as a result, the demand for those who work with data is constantly increasing, making it a great choice for those looking to get into the digital industry.

The Digital Transformation and Managing Data. Commentary by Yael Ben Arie, CEO of Octopai

Businesses have never had access to the amount of data that is infiltrating corporations today. As the world goes through a digital transformation, the amount of information and data that a company collects is enormous. However, the data is only useful when it can be leveraged to improve processes, business decisions, strategy, etc. How can businesses leverage the data that they have? Are businesses even aware of all the data that they possess? And is the data in the hands of the right executives so that information can be used throughout the entire organization and in every department? After all, the digital transformation is turning everyone into a data scientist, and valuable data can't be utilized solely by the BI department. In order for data to be leveraged effectively and to access the full picture, businesses need to automate it; the difference is like running a Google search versus going to the library. Manually researching data flows is almost impossible and leaves the data untrustworthy and prone to errors. Automating this work with a centralized platform that extracts metadata from all systems and spreadsheets and presents it in a visual map provides an accurate and efficient way to discover and track data and its lineage within seconds. The advantage is that any user can find the data they need and make faster business decisions, with insight into the true metrics of the business. One of the most transformative aspects of the digital transformation is that data will become the foundation of corporate growth, propelling all corporate employees to take part in data discovery. As the digital transformation continues to move forward, we should expect to see more emphasis on verifying the accuracy and truth of the data that was uncovered.

Why adaptive AI is key to a fairer financial system. Commentary by Martin Rehak, Co-founder and CEO of Resistant AI

Despite the best intentions of regulators, the world's financial system remains the biggest, most critical, and yet most obscure network underpinning our global economy and societies. Attempts to fight the worst the world has on offer, financial crime, organized crime, drug smuggling, human trafficking, and terrorist financing, are generally less than 0.1% effective. And yet, it is this very same system that the world's largest economies are relying on to sanction Russia into curtailing its aggression in Ukraine. At the heart of this inefficiency is a natural mismatch any data scientist would recognize: an overwhelming amount of economic activity that needs to be detected, prioritized, analyzed, and reported by human financial crime investigators, all within the context of jurisdictionally conflicting and ever-updating compliance regulations. Previous attempts to solve this problem with AI have usually relied on expensive and rigid models that fail both transparency tests with regulators and at catching ever-adaptive criminals, who render them obsolete within months by adopting new tactics. Instead, the path to a financial system that actually benefits law-abiding citizens lies in nimble, fast-deploying and fast-updating multi-model AI anomaly detectors that can explain each and every finding at scale. That path will require constant collaboration between machines, financial crime investigators, and data scientists. Failure to build a learning cycle that includes human insights is, metaphorically, throwing our hands up at the idea of a fairer and safer financial future for all.
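
One building block of such multi-model anomaly detection is an unsupervised detector that scores how unusual each transaction looks relative to the bulk of activity, so investigators can prioritize the strangest cases first. A generic sketch using scikit-learn's isolation forest (not Resistant AI's system; the transactions below are synthetic stand-ins):

```python
# Minimal sketch of one detector in a multi-model setup: an isolation forest
# that flags transactions easily isolated from normal activity.
import numpy as np
from sklearn.ensemble import IsolationForest

# features: [amount, hour_of_day]
transactions = np.array([
    [120, 10], [95, 11], [130, 14], [110, 15], [105, 9],
    [9800, 3],                      # the kind of outlier an investigator should see first
])

detector = IsolationForest(contamination=0.1, random_state=0).fit(transactions)
print(detector.predict(transactions))   # -1 flags anomalies for human review
```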

The need for strong privacy regulations in the US is greater than ever. Commentary by Maciej Zawadziński, CEO of Piwik PRO

The General Data Protection Regulation (GDPR) is the most extensive privacy and security law in the world, containing hundreds of pages' worth of requirements for organizations worldwide. Though it was put into effect by the European Union (EU), it imposes requirements on any organization that targets or collects data related to people in the EU. Google Analytics is by far the most popular analytics tool on the market, but the recent decision of the Austrian Data Protection Authority, the DSB, states that the use of Google Analytics constitutes a violation of GDPR. The key compliance issue with Google Analytics stems from the fact that it stores user data, including information about EU residents, on US-based cloud servers. On top of that, Google LLC is a US-owned company and is therefore subject to US surveillance laws, such as the Cloud Act. Companies that collect data of EU residents need to rethink their choices as more European authorities will soon follow the DSB's lead, possibly resulting in a complete ban on Google Analytics in Europe. The most privacy-friendly approach would be to switch to an EU-based analytics platform that protects user data and offers secure hosting. This will guarantee that you collect, store and process data in line with GDPR.

Could AI support a future crisis by strategically planning a regional supply chain? Commentary by Asparuh Koev, CEO of Transmetrics

If you haven't heard the word enough, collaboration within the supply chain is what will provide sustainable, long-term futures for our retailers, shippers, manufacturers, and suppliers combined. And with the extent and size of today's business networks, joining forces without AI is no longer an option. From the pandemic to the Russia-Ukraine war causing unexpected havoc on an already beaten chain of backlogs, consolidating data sources, forecasting demand, and initiating just-in-case stock planning is the move for successful supply chains. Armed with complete transparency and visibility of data, historically isolated functions can benefit from AI's power to read multitudes of information in seconds and create optimal scenarios for capacity management, route optimization, asset positioning, and last-mile planning. Fully integrated supply chains can work with real-time and historical data to enhance pricing models by understanding the entire market, while improving early detection of disruptions in advance of a crisis. This enables scenario planning on a scale that has never been seen before, which is indispensable in a time of crisis.
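
The demand-forecasting piece of that picture can be illustrated with the simplest possible model: exponential smoothing over a history of shipments to produce a forward planning estimate. This is a generic sketch with an invented series and smoothing weight, not Transmetrics' models, which blend many more data sources.

```python
# Minimal sketch of turning shipment history into a next-month planning estimate
# via simple exponential smoothing.
shipments = [820, 790, 860, 905, 870, 940, 1010, 980]   # monthly units (illustrative)
alpha = 0.4                                              # weight on the most recent month

forecast = shipments[0]
for actual in shipments[1:]:
    forecast = alpha * actual + (1 - alpha) * forecast

print(round(forecast))   # next-month estimate for just-in-case stock planning
```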

US AI Bias Oversight? Commentary by Sagar Shah, Client Partner at Fractal Analytics

There was an era when explainability was the focus, then came the era of fairness, and now it's privacy and monitoring. Privacy and AI can co-exist with the right focus on policy creation, process compliance, advisory and governance. It takes a lot of advisory assistance to educate companies in the ethical use of AI, respecting transparency, accountability, privacy and fairness. Privacy by Design is a pillar that is becoming stronger in a cookie-less world. Many companies are exploring avenues to build personalization engines within this new normal, to make it a win-win for consumer experience and customized offerings. Differential privacy injects noise to decrease correlation between features. However, it is not a foolproof technique, since the injected noise can be traced backwards by a skilled data science professional.
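
The differential-privacy mechanism mentioned above is easy to show in miniature: instead of releasing an exact statistic, release it with noise calibrated to a sensitivity and a privacy budget epsilon. A minimal sketch of the Laplace mechanism (the epsilon value and count are illustrative; choosing epsilon is a policy decision):

```python
# Minimal sketch of the Laplace mechanism for differential privacy:
# release a noisy count instead of the exact value.
import numpy as np

def dp_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

exact = 1_284                          # e.g., users matching a segment
print(dp_count(exact, epsilon=0.5))    # smaller epsilon -> more noise, more privacy
```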

How can bad data ruin companies? Commentary by Ben Eisenberg, Director of Product, Applications and Web at People Data Labs

As all businesses become more data-driven, questions of data quantity will give way to questions of data quality. And with businesses increasingly leaning on data to guide more of their decision-making, the risk associated with bad data has grown. Where once bad data might have had a limited impact, it can now proliferate across multiple systems and processes, leading to widespread dysfunction. To avoid these problems, businesses should prioritize investing in data that is compliantly sourced. Data is increasingly regulated across states and regions, and it's important that any data you acquire from a third party be fully compliant. That means checking your vendors' privacy compliance and questioning their practices around ensuring compliance from their sources. Another tactic is keeping data fresh. Most of the data businesses rely on reflects individual human beings, and human beings are not static. Every year millions of people move, change jobs, get new contact information, take out new loans, and adopt new spending habits. The fresher your records, and the more often you enrich them with fresh data, the more likely you are to avoid the data decay that can diminish the value of your data and lead to problems.

World Backup Day. Commentary by Pat Doherty, Chief Revenue Officer at Flexential

We've learned to expect the unexpected when it comes to business disruption, illustrating the immense need for proper backup solutions. In 2022, investment in Disaster Recovery-as-a-Service (DRaaS) will be a major theme for businesses of all sizes to ensure long-term business success and survival no matter the disruption. Moving DRaaS to a secondary-site cloud environment can ensure that data is safe and secure and that organizations can operate as normal even when employees are not on site.

World Backup Day. Commentary by Indu Peddibhotla, Sr. Director, Products & Strategy, Commvault

Enterprise IT teams today are increasingly starting to realize that backup extends far beyond serving as their last line of defense against cyberattacks. It can now help them take the offense against cybercriminals, by allowing them to discover and remediate cyberattacks before their data is compromised. For example, data protection solutions now have the ability to detect anomalous behaviors indicating a threat to a company's data. In addition, emerging technologies will soon allow enterprise IT teams to create deceptive environments that can trap cybercriminals. These features, coupled with other early warning capabilities, will allow companies to use their backups to detect, contain, and intercept cyberattacks before they can lock, alter, steal, or destroy their data.

World Backup Day. Commentary by Stephen McNulty, President of Micro Focus

When disasters occur, organizations suffer. That is why they see backups, recovery, and security of data and systems as crucial for business continuity. Backups are an essential practice to safeguard data, but they are not the most important step. While they do indeed ensure availability and integrity of data, I believe recovery strategies should take precedence. Here's why: it is the ability to restore data and systems to a workable state, within a reasonable time frame, that makes backups valuable. Without this ability, there is no point in performing the backup in the first place. Furthermore, backups must also be complemented with adequate security controls. To that end, business leaders should consider the Zero Trust model, which implements a collection of solutions covering a range of needs, from access control and privilege management to the monitoring and detection of threats. This will ultimately provide the best protection possible as information travels across devices, apps, and locations.

World Backup Day. Commentary by Brian Spanswick, CISO at Cohesity

While all eyes are on backup today, organizations must strive for holistic cyber resilience and recognize that backup is just one component of a much larger equation. Achieving true cyber resilience means developing a comprehensive strategy to safeguard digital assets, including integrated defensive and recovery measures that give organizations the very best chance of weathering the storm of a cyberattack. Organizations should embrace a next-gen data management platform that enables customers to adopt a 3-2-1 rule for data backups, ensure data is encrypted both in transit and at rest, enable multi-factor authentication, and employ zero trust principles. Only then can organizations address mass data fragmentation challenges while also reducing data proliferation. Further, backups that can be restored to a precise point in time deliver the business continuity required for organizations to not only survive attacks, but continue to thrive in spite of them.

World Backup Day. Commentary by Brian Pagano, Chief Catalyst and VP at Axway

It is important to distinguish between syncing and backup; most people conflate the two. In order to qualify as a backup, you should be able to do a fresh install with complete data recovery. Sync is designed to allow you to work seamlessly across devices by pushing deltas to the cloud. But if something happens to corrupt your local copy, that corruption may get synced and propagate across your devices. Organizations can help customers with backups by allowing easy export of complete data in a standard (not proprietary) format. You as a user have the responsibility of keeping a copy of your data on a local or remote drive that is not connected to sync. You must do this periodically, either manually or with a script that triggers after a certain period.
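
A minimal sketch of the kind of periodic script the author describes: write a timestamped archive of a source directory to a destination that is not part of any sync client. The paths are placeholders; in practice this would be scheduled with cron or Task Scheduler rather than run by hand.

```python
# Minimal sketch of a periodic backup (distinct from sync): a timestamped tar
# archive written to a non-synced destination, restorable on a fresh install.
import tarfile
from datetime import datetime
from pathlib import Path

SOURCE = Path.home() / "Documents"        # what to protect (placeholder)
DEST = Path("/mnt/offline_backup")        # non-synced local or remote drive (placeholder)

DEST.mkdir(parents=True, exist_ok=True)
archive = DEST / f"backup-{datetime.now():%Y%m%d-%H%M%S}.tar.gz"

with tarfile.open(archive, "w:gz") as tar:
    tar.add(SOURCE, arcname=SOURCE.name)  # full copy, independent of any sync state

print(f"Wrote {archive}")
```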

World Backup Day. Commentary by Joe Noonan, Product Executive, Backup and Disaster Recovery for Unitrends and Spanning

World Backup Day is a great reminder for businesses to take a closer look at their full business continuity and disaster recovery (BCDR) plans, which include everything from the solutions they use to their disaster recovery run book. The shift to remote working completely transformed the way organizations protect and store their data. Today, there is a greater focus on protecting data no matter where it lives: on-prem, on the laptops of remote employees, in clouds and in SaaS applications. Recovery time objectives (RTOs) are increasingly shrinking in today's always-on world, with goals being set in hours, if not minutes. Cybercriminals have taken advantage of remote and hybrid work environments to conduct increasingly sophisticated cyberattacks, and the data recovery process post-incident has become more complex due to new cyber insurance requirements. These new regulations include critical audits and tests that businesses must comply with in order to restore their data and receive a payout after an attack, which can slow down the recovery process. With data protection becoming increasingly complex, more organizations are turning to vendors that provide unified BCDR, which includes backup and disaster recovery, AI-based automation and ransomware safeguards, as well as disaster recovery as a service (DRaaS). Unified BCDR has become a necessity due to the growing amount of data organizations must protect and the increasing number of cyberattacks taking place against businesses of all sizes.

See the original post here:
Heard on the Street 4/5/2022 - insideBIGDATA

Accenture Invests in Good Chemistry Company to Drive Quantum Computing in Materials and Life Sciences – HPCwire

NEW YORK, April 6, 2022 – Accenture has made a strategic investment, through Accenture Ventures, in Good Chemistry Company, a company that uses quantum chemistry, machine learning and quantum computing to accelerate new materials design.

The Good Chemistry Company platform, QEMIST Cloud, combines cloud, AI, and quantum computing in an integrated platform designed for developers. The platform's engine enables faster, more accurate, and scalable ways to perform computational chemistry simulations.
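
For readers unfamiliar with what a "computational chemistry simulation" computes, the core task is typically finding the lowest eigenvalue (ground-state energy) of a molecular Hamiltonian. The toy sketch below diagonalizes an invented 2x2 effective Hamiltonian exactly; it is illustrative only and not the QEMIST Cloud API, which operates at vastly larger scale.

```python
# Illustrative only -- not the QEMIST Cloud API. Exact diagonalization of a toy
# two-configuration Hamiltonian to obtain a ground-state energy.
import numpy as np

hamiltonian = np.array([[-1.05,  0.39],
                        [ 0.39, -0.35]])   # invented values, arbitrary units

energies, states = np.linalg.eigh(hamiltonian)
print("ground-state energy:", energies[0])
print("ground-state weights:", np.round(states[:, 0] ** 2, 3))
```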

"We're doubling down on the growth potential of quantum computing and uncovering new ways to navigate its potential while empowering our clients to confidently absorb and access this breakthrough technology," said Tom Lounibos, managing director, Accenture Ventures. "Simulating chemistry in this new way leverages easily and readily accessible computers on the cloud to perform simulations that were previously intractable even on expensive, high-performance computing environments. This brings a competitive advantage to clients and can change pharmaceutical drug discovery and more."

According to the Accenture Technology Vision 2022, 69% of global executives say quantum computing will have a breakthrough or transformational positive impact on their organizations in the future. Quantum is the pinnacle of next-generation problem solving, and Accenture and Biogen have already collaborated with 1QBit to accelerate drug discovery, developing a proof of concept that validated a quantum-computing molecule comparison approach and building an enterprise-ready, quantum-enabled application with transparent processes that generates molecular comparison results with deeper insights about shared traits.

Arman Zaribafiyan, CEO of Good Chemistry Company, said, "With our platform, we are re-imagining the way computational chemistry simulations are done. Simulating chemistry on computers will help drive faster, more accurate and more accessible materials innovation in the decades to come. With Accenture's support and collaboration, we will be able to explore the vastness of chemical space and enable rational materials design at scale."

Carl Dukatz, Accenture's global quantum computing lead, said, "By building on and extending our relationship with 1QBit to the newly formed Good Chemistry Company, we are demonstrating our ongoing commitment to accelerating quantum computing innovation. We are witnessing the emergence of a new class of scalable cloud-based technology that is stretching the boundaries of what computers can solve. We recognize the potential of arming our clients with the next generation of chemistry, material science, and structural design."

Good Chemistry Company is the latest organization to join Accenture Ventures' Project Spotlight, an engagement and investment program that connects emerging technology software startups with the Global 2000 to fill strategic innovation gaps. Project Spotlight offers extensive access to Accenture's domain expertise and its enterprise clients, helping startups harness human creativity and deliver on the promise of their technology.

Terms of the investment were not disclosed.

About Accenture

Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology and Operations services, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at accenture.com.

About Good Chemistry Company

Good Chemistry Company's mission is to enable high-throughput, high-accuracy computational chemistry simulations to accelerate new material designs. Its proprietary QEMIST Cloud, a cloud-based computational chemistry platform, provides the building blocks for computational chemistry developers to build chemical simulation applications and workflows, using emerging algorithms in quantum chemistry, machine learning, and quantum computing. Through simple, easy-to-use APIs, QEMIST Cloud provides access to computational chemistry tools at unprecedented scale, enabled by the power of the cloud. Headquartered in Vancouver, Canada, Good Chemistry Company's interdisciplinary team comprises computational and quantum chemists, software developers, ML engineers and quantum computing scientists. For more information about Good Chemistry Company, visit goodchemistry.com.

Source: Accenture

See original here:
Accenture Invests in Good Chemistry Company to Drive Quantum Computing in Materials and Life Sciences - HPCwire
