
AI experts to med students: Don’t compete with the machine. Collaborate with it – AI in Healthcare

Give and take

Some of the liveliest material, including the question and answer above, emerged during a brief Q&A period following the prepared presentations.

To Parikh's point on machine learning's limitations in clinical settings, Chase added an illustrative anecdote.

"A couple of medical students told me last week that when the sepsis alert goes off in the electronic health record, basically everybody ignores it because they don't believe it," Chase said. "It's a black box and was delivered within the electronic health record. Nobody's tested it on sensitivity and specificity."

"So [you should] hit the pause button and then decide whether or not [an algorithm's] data point applies to your patient."

Demise of the doctors?

Perhaps inevitably, the subject of AI's potential for replacing physicians came up during the Q&A.

There's no question that imaging-based specialties (radiology, pathology, dermatology) have been notably successful using machine learning, Chase responded. But the goal, he said, should be better care by way of a happy outcome for both AI and human image interpreters.

"I think the next generation of radiologists will be operating at a higher level," Chase explained. "They'll be overseeing the cases that are being referred to them by the machine and making sure that you don't over-biopsy a patient because of a false positive."

"And I think that actually will make the profession to some extent more interesting. You're not going to be looking at film after film that ends up being negative."

Machine, meet patient

Parikh urged attendees to imagine a clinical decision-making scenario from the patient's point of view.

"If you were hearing that a machine rather than a human was going to be diagnosing your lung cancer, would you be interested in that? I would imagine that a lot of patients wouldn't be," Parikh said. "There's still a huge demand for a human element to how we practice medicine. That element is never going to be replaced by machines."

"Too often, we've been thinking about these things as adversarial (human versus machine) when the real purpose of a machine is to collaborate with a human."

AMA session summary with video here. Standalone video posted to YouTube here.

More:
AI experts to med students: Don't compete with the machine. Collaborate with it - AI in Healthcare

Read More..

Feeding the machine: We give an AI some headlines and see what it does – Ars Technica

[Image caption: Turning the lens on ourselves, as it were. Is Our Machine Learning?]

There's a moment in any foray into new technological territory when you realize you may have embarked on a Sisyphean task. Staring at the multitude of options available to take on the project, you research your options, read the documentation, and start to work, only to find that actually just defining the problem may be more work than finding the actual solution.

Reader, this is where I found myself two weeks into this adventure in machine learning. I familiarized myself with the data, the tools, and the known approaches to problems with this kind of data, and I tried several approaches to solving what on the surface seemed to be a simple machine-learning problem: based on past performance, could we predict whether any given Ars headline will be a winner in an A/B test?

Things have not been going particularly well. In fact, as I finished this piece, my most recent attempt showed that our algorithm was about as accurate as a coin flip.

But at least that was a start. And in the process of getting there, I learned a great deal about the data cleansing and pre-processing that goes into any machine-learning project.

Our data source is a log of the outcomes from 5,500-plus headline A/B tests over the past five years; that's about as long as Ars has been doing this sort of headline shootout for each story that gets posted. Since we have labels for all this data (that is, we know whether it won or lost its A/B test), this would appear to be a supervised learning problem. All I really needed to do to prepare the data was to make sure it was properly formatted for the model I chose to use to create our algorithm.

I am not a data scientist, so I wasn't going to be building my own model any time this decade. Luckily, AWS provides a number of pre-built models suitable to the task of processing text and designed specifically to work within the confines of the Amazon cloud. There are also third-party models, such as those from Hugging Face, that can be used within the SageMaker universe. Each model seems to need data fed to it in a particular way.

The choice of the model in this case comes down largely to the approach we'll take to the problem. Initially, I saw two possible approaches to training an algorithm to get a probability of any given headline's success:

The second approach is much more difficult, and there's one overarching concern with either of these methods that makes the second even less tenable: 5,500 tests, with 11,000 headlines, is not a lot of data to work with in the grand AI/ML scheme of things.

So I opted for binary classification for my first attempt, because it seemed the most likely to succeed. It also meant the only data point I needed for each headline (besides the headline itself) is whether it won or lost the A/B test. I took my source data and reformatted it into a comma-separated value file with two columns: titles in one, and "yes" or "no" in the other. I also used a script to remove all the HTML markup from headlines (mostly a few HTML tags for italics). With the data cut down almost all the way to essentials, I uploaded it into SageMaker Studio so I could use Python tools for the rest of the preparation.
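That cleanup script isn't shown here; a minimal sketch of the idea, assuming a simple regex-based approach, might look like this:

import re

def strip_html(headline):
    # Remove simple HTML tags (mostly <em>/<i> markup for italics)
    return re.sub(r'<[^>]+>', '', headline)

print(strip_html('New <em>Ars</em> headline'))  # -> "New Ars headline"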

Next, I needed to choose the model type and prepare the data. Again, much of data preparation depends on the model type the data will be fed into. Different types of natural language processing models (and problems) require different levels of data preparation.

After that comes tokenization. AWS tech evangelist Julien Simon explains it thusly: "Data processing first needs to replace words with tokens, individual tokens." A token is a machine-readable number that stands in for a string of characters. "So 'ransomware' would be word one," he said, "'crooks' would be word two, 'setup' would be word three... so a sentence then becomes a sequence of tokens, and you can feed that to a deep-learning model and let it learn which ones are the good ones, which ones are the bad ones."
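As a toy illustration of that mapping (not the exact encoding BlazingText uses internally), assigning each distinct word an integer token can be as simple as:

# Assign each distinct word an integer token, in order of first appearance
sentence = "ransomware crooks setup fake ransomware site"
vocab = {}
tokens = []
for word in sentence.split():
    vocab.setdefault(word, len(vocab) + 1)
    tokens.append(vocab[word])
print(tokens)  # [1, 2, 3, 4, 1, 5]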

Depending on the particular problem, you may want to jettison some of the data. For example, if we were trying to do something like sentiment analysis (that is, determining if a given Ars headline was positive or negative in tone) or grouping headlines by what they were about, I would probably want to trim down the data to the most relevant content by removing "stop words": common words that are important for grammatical structure but don't tell you what the text is actually saying (like most articles).

However, in this case, the stop words were potentially important parts of the data; after all, we're looking for structures of headlines that attract attention. So I opted to keep all the words. And in my first attempt at training, I decided to use BlazingText, a text processing model that AWS demonstrates in a similar classification problem to the one we're attempting. BlazingText requires the "label" data (the data that calls out a particular bit of text's classification) to be prefaced with "__label__". And instead of a comma-delimited file, the label data and the text to be processed are put in a single line in a text file, like so:
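A line in that format, with a made-up headline standing in for the real data, looks like this:

__label__yes our favorite budget cpu coolers put to the test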

Another part of data preprocessing for supervised training ML is splitting the data into two sets: one for training the algorithm, and one for validation of its results. The training data set is usually the larger set. Validation data generally is created from around 10 to 20 percent of the total data.

There has been a great deal of research into what is actually the right amount of validation data; some of that research suggests that the sweet spot relates more to the number of parameters in the model being used to create the algorithm than to the overall size of the data. In this case, given that there was relatively little data to be processed by the model, I figured my validation data would be 10 percent.

In some cases, you might want to hold back another small pool of data to test the algorithm after it's validated. But our plan here is to eventually use live Ars headlines to test, so I skipped that step.

To do my final data preparation, I used a Jupyter notebook (an interactive web interface to a Python instance) to turn my two-column CSV into a data structure and process it. Python has some decent data manipulation and data science-specific toolkits that make these tasks fairly straightforward, and I used two in particular here: pandas and scikit-learn (sklearn).

Here's a chunk of the code in the notebook that I used to create my training and validation sets from our CSV data:
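(The listing below is a reconstruction from the description that follows; the file name and column names are assumptions.)

import pandas as pd
from sklearn.model_selection import train_test_split

# Import the cleaned, two-column CSV into a pandas DataFrame
dataset = pd.read_csv('headlines.csv')
dataset.head()  # peek at the column headers and the first few rows

# Preface every value in the label column with "__label__", per BlazingText
dataset['label'] = '__label__' + dataset['label']

# Force all headline text to lower case
dataset['title'] = dataset['title'].apply(lambda t: t.lower())

# Split into training and validation sets, holding back 10 percent
train, validation = train_test_split(dataset, test_size=0.1)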

I started by using pandas to import the data structure from the CSV created from the initially cleaned and formatted data, calling the resulting object "dataset." Using the dataset.head() command gave me a look at the headers for each column that had been brought in from the CSV, along with a peek at some of the data.

The pandas module allowed me to bulk-add the string "__label__" to all the values in the label column as required by BlazingText, and I used a lambda function to process the headlines and force all the words to lower case. Finally, I used the sklearn module to split the data into the two files I would feed to BlazingText.

Read the rest here:
Feeding the machine: We give an AI some headlines and see what it does - Ars Technica

Read More..

Machine Learning is Set to Detect Driver Drowsiness to Reduce Road Accidents – Analytics Insight

Machine learning is being used to detect driver drowsiness and reduce the number of road accidents each year. Integrating machine learning algorithms with computer vision can help detect whether drivers are feeling drowsy through video streams and facial recognition. IIT Ropar has built an algorithm that can extract facial indicators of drowsiness, such as eye and mouth movements, to detect a driver's state in real time. This is expected to reduce road accidents by alerting drivers in time.

The IIT Ropar team developed three techniques: tracking a driver's operational behavior through steering wheel, accelerator, or brake patterns and speed; monitoring physiological features such as heart rate, head posture, or pulse rate; and using a computer vision system to recognize facial expressions. Machine learning can detect drivers' drowsiness accurately across multiple vehicle models.

Tech companies and institutes have recognized the pressing need for machine learning algorithms in drowsiness detection. Scientists have developed an alert system with the help of video stream processing that analyzes eye blinks through an Eye Aspect Ratio (EAR) computed from the Euclidean distances between eye landmarks. IoT can send a warning message with a degree of collision along with real-time location data. A monitoring system built on a Raspberry Pi with OpenCV and Python helps issue this crucial message on the spot.

EAR is a simple calculation based on the ratio of distances between the length and width of the eyes. The eye aspect is crucial in detecting drowsiness, so EAR can be plotted across multiple frames of a video sequence through computer vision. Three command-line options tell the detector which shape predictor, alarm, and webcam to use. If a driver's EAR starts to decline over multiple frames, the machine learning algorithms can conclude that the driver is drowsy. There is also a Mouth Aspect Ratio (MAR), the ratio of distances between the length and width of the driver's mouth, which detects when the driver yawns and loses control over the mouth. Significant emphasis is also placed on the pupil of the eye, known as Pupil Circularity, which helps detect whether the eyes are half-open or almost closed during driving.
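To make EAR concrete, here is a minimal sketch using the common six-landmark formulation; the landmark ordering and the threshold value are assumptions on my part, not details published by IIT Ropar:

from scipy.spatial import distance

def eye_aspect_ratio(eye):
    # eye: six (x, y) landmark points around one eye,
    # ordered as in the standard 68-point facial landmark model
    a = distance.euclidean(eye[1], eye[5])  # first vertical span
    b = distance.euclidean(eye[2], eye[4])  # second vertical span
    c = distance.euclidean(eye[0], eye[3])  # horizontal span (eye width)
    return (a + b) / (2.0 * c)

# EAR drops toward zero as the eye closes; flagging drowsiness only when
# it stays under a threshold (around 0.25 is typical) for many consecutive
# frames filters out ordinary blinks.
EAR_THRESHOLD = 0.25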

Thus, cutting-edge technology is being put to work reducing road accidents each year with the help of machine learning algorithms. Drowsiness on the road is a natural feeling with numerous causes, so it falls to machine learning algorithms to protect drivers and their families from a devastating loss.



See the original post here:
Machine Learning is Set to Detect Driver Drowsiness to Reduce Road Accidents - Analytics Insight

Read More..

Bank of England to crack down on ‘secretive’ cloud computing services – Reuters

LONDON, July 13 (Reuters) - Cloud computing providers to the financial sector can be "secretive", and regulators need to act to avoid banks' reliance on a handful of outside firms becoming a threat to financial stability, the Bank of England said on Tuesday.

Banks and other financial firms are outsourcing key services to cloud computing companies such as Amazon (AMZN.O), Microsoft (MSFT.O) and Google (GOOGL.O) to improve efficiency and cut costs, with the trend accelerating last year as the COVID-19 pandemic unfolded.

The BoE said cloud computing could sometimes be more reliable than banks hosting all their servers themselves. But big providers could dictate terms and conditions - as well as prices - to key financial firms.

"That concentrated power on terms can manifest itself in the form of secrecy, opacity, not providing customers with the sort of information they need to monitor the risk in the service," BoE Governor Andrew Bailey told a news conference. "We have seen some of that going on."

Bailey did not name specific firms he had concerns about.

Earlier, the BoE's Financial Policy Committee said additional policy measures were needed to mitigate financial stability risks in cloud computing.

"In terms of the standards of resilience and the testing of those standards of resilience, frankly we will have to roll some of that back, that secrecy that goes with it. It's not consistent with our objectives," Bailey said.

Bailey said the BoE understood cloud providers' desire not to reveal too much publicly about their operations, in case it opened the door to cyber-attacks, but that the firms needed to give more information to regulators and customers.

"We have got to strike a balance here," Bailey said.

Google Cloud said cloud's benefits had come into full view during the pandemic, and it welcomed further discussion with policymakers on areas raised by the BoE.

"We're committed to working with financial services customers and regulators to provide them with controls and assurances on risk management, data locality, transparency, and compliance," a Google Cloud spokesperson said.

Amazon and Microsoft had no immediate comment.

The BoE said it welcomed the engagement of the finance ministry and Financial Conduct Authority on how to tackle risks from cloud computing, but that a broader approach may be needed, including other regulators and overseas partners.

Additional reporting by William Schomberg; Editing by Mark Potter

Our Standards: The Thomson Reuters Trust Principles.

Read the original:
Bank of England to crack down on 'secretive' cloud computing services - Reuters

Read More..


Microsoft Windows 365 moves your PC to the cloud – TechSpot

The idea of computing via the cloud has become so commonplace through the pandemic that virtually no one gives it much of a thought anymore. Application suites like Office 365, Microsoft 365, and Google Workspace, communication tools like Zoom, Teams, and Webex, and even file storage services like OneDrive, DropBox, or Google Drive are all just part of how we get things done these days.

For most of us, however, the operating system through which we use these applications and access our files typically comes through the client device: Windows 10 or MacOS on PCs, iOS or Android on smartphones and tablets, etc.

With the launch of Microsoft's latest cloud service, dubbed Windows 365, the company is now streaming the Windows OS and full PC experience from Microsoft's Azure cloud infrastructure to any type of connected computing device, from smartphone to PC, running any major OS. Hence, the Cloud PC.

Truth be told, the concept isn't exactly new; in fact, far from it. There have been numerous variations on delivering a desktop experience from powerful remote computing resources for several decades, dating back to mainframes and terminals, through thin clients and associated servers, to virtual desktops delivered over the cloud via tools like Citrix Workspace.

In fact, Windows 365 is essentially a simplified version of Microsoft's Azure Virtual Desktop offering (which will continue). Win365 is designed for what the company described as the 80% of organizations that are interested in desktop virtualization-type services but lack personnel with the very specific skills necessary to run sophisticated VDI environments.

One other important point of clarification is that Microsoft's current concept of a Cloud PC is not a physical device (though those are likely to come in the future) but rather a cloud-delivered PC experience. The concept of a cloud PC has been bandied about by numerous PC and chip makers for many years. We may finally see future hardware designs optimized for the cloud-delivered desktop experience offered by Windows 365, but not with the initial launch.

Windows 365 serves a full Microsoft Windows experience, including personal apps, data, and settings, from the cloud to any device with an internet connection. Image courtesy of Microsoft.

What Windows 365 does offer is an easily configurable, flexible way to let people working for businesses, schools, and other organizations run a consistent Windows experience across whatever devices they have access to, even a regular Windows PC.

The basic concept is that these organizations can create standardized Windows 10 desktop environments (or Windows 11 once it becomes available later this year), complete with the necessary applications, settings, security protocols, and file access, and then make these standardized environments available to whatever groups of workers they wish, for whatever time frame they wish.

Unlike previous virtual desktop-based solutions, however, Windows 365 keeps the process of configuring these cloud PC desktops simple by limiting options to a few key choices. People who need to access these resources can then launch a simple application on whatever devices they have available and get access to their cloud-delivered Windows desktop. If they switch to another device or start working from another location, the experience, down to the backgrounds, open windows, etc., remains consistent.

For organizations with seasonal workers, project-based temps, etc., this is obviously an ideal solution, because it lets these organizations turn on and turn off access to applications, shared files, etc. on an as-needed basis.

Even businesses that don't have these kinds of part-time employees can benefit, by virtue of things like letting employees use personal devices to access their work resources in a secure, separated way. In addition, there are options to essentially provide super-powered PCs remotely to workers who need them for demanding applications like 3D modeling, graphic design, coding, etc.

By essentially providing access to more cloud-based computing resources (through the simple Endpoint Manager console that Microsoft provides admin access to as part of the Win365 offering), some users can get access to more computing power than they could get from even the most well-configured local PC. In fact, Microsoft has added what it calls a new Watchdog Service that constantly monitors the performance of all Windows 365-connected systems and can provide tools and suggestions on how to fix any issues that may arise.

Despite these assurances, veterans of previous VDI technologies may raise performance-related concerns, because there have certainly been many employees who suffered slowly and painfully through poorly configured virtual desktop solutions in the past. In order to address that, Microsoft said that one other key change it is making with Windows 365 is essentially widening the pipe between the client device and cloud-based computing resources.

Obviously, the speed, quality, and consistency of any broadband connection between a given device and the internet is going to have a potentially even more profound impact on performance, but Microsoft claimed that it has optimized the client-to-cloud connection for Windows 365 to ensure a high-quality experience.

The company has also made several important security enhancements, including a number of simplified baseline settings that leverage tools like Microsoft Defender. In addition, the company claims its security policies are built around zero trust and least privileged access principles, while also offering support for multi-factor authentication through Azure Active Directory (AD). From a device management perspective, the revised Endpoint Manager console lets Cloud PCs and physical PCs be managed side-by-side in an intuitive manner, making it approachable even for small businesses with limited IT resources.

Given the growing use of other cloud-based computing services, such as Microsoft's own OneDrive, it's certainly easier now for workers to navigate the potential complexities of hybrid working environments than it has been in the past. Still, for many organizations, those types of capabilities simply aren't enough, and the need for an even more flexible and far-reaching service like Windows 365 makes a great deal of sense.

Cloud-delivered virtual desktops have proven to be a very effective tool for many more advanced IT organizations throughout the pandemic. They also appear to be a powerful starting point as we enter the new world of hybrid work. Previous complications have certainly limited the use of virtualized desktop systems up until now, so it's good to see Microsoft bring these Cloud PC-based computing models to a wider audience with Windows 365.

Bob O'Donnell is the founder and chief analyst of TECHnalysis Research, LLC, a technology consulting firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on Twitter @bobodtech.

Read the original here:
Microsoft Windows 365 moves your PC to the cloud - TechSpot

Read More..

SmugMug Source Preview: Say Goodbye to NAS Servers and Hello to the Best Cloud Storage Yet – Fstoppers

SmugMug, a household name when it comes to online image hosting, has developed a new service called Source. Already known for its strong tools for displaying and selling images online, SmugMug is taking the next leap toward providing photographers with a new, never-before-seen cloud storage and management service for next to nothing.

Although cloud storage is relatively new, many photographers already use it as a backup or for easy access to their library of images. There are, of course, a few shortcomings to most popular cloud storage platforms: very few are designed with photographers in mind, and none to the degree of SmugMug Source.

The reason I am so excited to preview SmugMug Source is that it is a new product with a multitude of applications for every workflow and photographer, while being very affordable and easy to integrate with what most people currently have. Let's break down some exciting features SmugMug Source brings and how they can be applied in a professional photography workflow.

SmugMug Source supports: .RAF, .RW2, .CR2, .NRW, .ARW, .NEF, .DNG, .SRF, .RAW, .DCR, .ORF, .CRW, .SRW, .CR3, .RWL, .X3F, .MRW, .IIQ, .PEF, .tif, .tiff

Everything from huge Phase One (.iiq) files to a simple .tiff can be processed by Source, making this product relevant to any photographer shooting on a digital camera. While other cloud storage platforms can store these files, few can process them as well as SmugMug Source does.

One unique feature of Source is that it can process photos on the fly. Process, as in edit? No, not quite. What it can do, upon import of raw files, is create a .jpeg for clients to preview. As an event photographer, I remember spending hours on end showing files from my image-processing program to the client. That time can now be saved by uploading photos into SmugMug's cloud storage. Upon upload, it creates a visual .jpg that can be immediately shown to clients using SmugMug's excellent photo-sharing features, such as client galleries. I know for a fact that when I was selling image by image to my clients, they often wanted to see the photos, but before that could be done, I was asked to select the best ones. Now, the whole gallery can be shown at once, which saves time on the photographer's side as well as giving clients the full story. Moreover, this sharing feature allows photographers to deliver photos at unparalleled speeds. Often, at large events, photographers must deliver images every few hours for social media and other online usage. In the past, this would mean dumping photos to two separate hard drives as well as a cloud storage service and exporting .jpgs at the end. Now, much of that work is saved.

Every photographer dreads the day the next drive fails. Often, this is very unpredictable and annoying, but organizing a robust, automated backup system can be quite time-consuming. SmugMug Source eliminates the problem of forgetting to back up. The software makes a backup automatically when the hard drive is plugged in: it watches your catalog of raw assets and, upon any changes, updates the backup on the SmugMug Source end. This also eliminates forgetting to upload. As a fashion photographer, I travel a lot, and there isn't always great internet at every place I open the laptop. However, I rely on online galleries for my clients to be able to select the images they want. Furthermore, with Zoom entering the studio, I can now tether to my computer and automatically upload the files to the cloud for anyone on the team (editors, art directors, retouchers) to access.

As I said, I travel a lot. Most photographers have to travel at some point for assignments; some of my friends spend less time in what they consider home than they do abroad. Traveling and shooting entail bringing loads of drives to store and back up photos on, and there is never enough storage if you ask me. Going away for a few shoot days can take up loads of drive space, and if it's a long project, that space becomes scarce very fast, too fast. SmugMug Source eliminates that problem by allowing photographers to store photos in the cloud. The only thing you need is an internet connection, and you're good to go. When you arrive back at the office, you can simply download the files to your physical storage and edit them.

Speaking of storage, the huge benefit of Source is that it is priced by need. Depending on the size of your project, you can choose up to 512 GB, up to 1 TB, or above. This is priced as follows:

The final benefit that many photographers will appreciate is Lightroom integration. As one of the most popular image-processing programs, Lightroom can be used in conjunction with SmugMug Source. The original plugin has been used and loved by thousands of photographers, and Source takes it to the next level. Storing your raw files in the cloud doesn't mean you can't edit, upload, and sync your entire archive through Lightroom. Not only can you edit from the cloud, but you also have the ability to keep your catalog nice and tidy by managing your raw files alongside the finished photographs.

Overall, SmugMug Source is a fantastic cloud storage platform that does much more than just store files in the cloud. It integrates with Lightroom, makes hard drives redundant on trips, backs up in the background, supports most types of raw files, and ultimately allows for easy previews. This new service has certainly captured my interest, and I will be exploring how to integrate it into my professional fashion photography workflow, as it brings a lot of useful features that will make many photographers' lives easier. The possibilities are amplified when you take a look at SmugMug's other services, which, among other things, allow you to beautifully share and sell photos online.

Link:
SmugMug Source Preview: Say Goodbye to NAS Servers and Hello to the Best Cloud Storage Yet - Fstoppers

Read More..

2021 State of the Cloud: No end in sight – Logistics Management

As companies assess their current technology infrastructures and look for new ways to tackle the rigors of the current business environment, more and more of them are turning to the Cloud.

Whether they're replacing existing, on-premises supply chain software or simply looking to add newer, more modern functionalities, these companies are tapping into Cloud delivery models. Promising faster implementation times, cheaper upfront costs, and less reliance on internal IT teams, these solutions are now being fully embraced by shippers and software vendors alike.

Supply chain organizations are in good company. According to Accenture, the global Cloud services industry has been growing year-over-year since 2010 and is now worth $370 billion (as of 2020). It says worldwide spending on public Cloud services is expected to grow by 18.4% this year, driven in part by the move to more remote work, a shift that requires more flexible, Cloud-based software.

Calling 2020 a pivotal year for the Cloud, Accenture says it played a lead role in facilitating remote work solutions. According to Accenture, the Cloud has become an essential part of continuing business and is the key to unlocking organizational growth.

A software sector that has been gradually moving into the Cloud for over a decade now, supply chain management (SCM) handles a broad range of functions for logistics and supply chain operators. Under that umbrella, supply chain execution (SCE) manages activities like warehouse management (WMS), procurement, transportation management (TMS), global trade management (GTM), yard management (YMS), and labor management (LMS), among others.

Also encompassing supply chain planning (SCP) solutions, SCM touches most links in the typical, end-to-end supply chain. Within that realm, Bart De Muynck, vice president of research at Gartner, says Cloud SCM has experienced steady growth over the last four years.

Consider this: In 2017, De Muynck says, software as a service (SaaS) comprised about 30% of all new SCM implementations, with the remainder being on-premises installations. By 2022, he predicts, that ratio will flip, as SaaS implementations outnumber on-premises installations.

Certain market segments within SCM will move to the Cloud faster than others. According to De Muynck, roughly 62% of all new procurement implementations will take place in the Cloud by 2022, versus a current 40% and around 30% in 2017. He says SCE solutions will follow a similar pattern, having grown from 30% Cloud in 2017 and now on track to exceed 50% Cloud implementations by 2022.

Some of the drivers are economic in nature. With companies watching their spending right now, capital expenditures (capex) are taking a backseat to operating expenses that are quick-to-value and less resource constrained. At the same time, quick, easy implementations have taken precedence over long, drawn-out on-premises software implementations. "Companies are deferring their larger, capex software investments that take a long time and that consume a lot of resources," says De Muynck. "Those have been put on hold."

De Muynck says that other key market drivers right now include the need to replace legacy on-premises warehouse management systems. "We still see a high number of companies with warehouse systems on premises, in a server room in the back of their warehouses," says De Muynck, who adds that some shippers are reluctant to move that data out into the Cloud. Convincing a business to take a box that's physically sitting in its warehouse and put it in a centralized location, let alone into the Cloud, can be difficult.

Transportation management, on the other hand, has historically been one of the most Cloud-first SCM applications. To operate most effectively, TMS must be able to connect to many different carriers, trading partners, and even customers. As a result, this corner of the SCM market tends to be one of the biggest drivers of overall supply chain software Cloud adoption. This, in turn, has helped drive innovation within the segment, as new vendors come on the scene and find new ways to help shippers leverage the Cloud.

Established in 1896, Castellini is a U.S. distributor of fresh produce that provides next-day shipping and cold storage options. With foodservice operations and wholesale locations in Wilder, Ky., and Conley, Ga., the company is the largest distributor of organic produce east of the Mississippi River.

An enVista customer since 2014, Castellini looked to enVista's team to implement a Cloud-based warehouse management system (WMS) to replace its existing, mature system. As the company continued to grow, it needed a best-in-breed WMS that would enable greater flexibility and better service for its customers, as well as offer a competitive edge in the market.

With these specific needs in mind, Castellini selected Blue Yonder for its WMS and engaged enVista for implementation in early March 2020. EnVista created a plan for Castellini within a week of the initial conversation, prior to COVID-19.

As uncertainty surrounding the pandemic grew and restrictions were put into place throughout the country, mid-March became an ideal time to begin the project, providing the distributor with greater flexibility and enhanced service for its customers.

The enVista and Blue Yonder teams worked to create a seamless integration from Castellini's existing, on-premises WMS. Due to restrictions, enVista restructured its methodology to collaborate with both Castellini and Blue Yonder and effectively use tools and technology to ensure success. As a result of this restructure, 90% of the project was done remotely.

At the same time as the Cloud WMS implementation, Castellini also upgraded its enterprise resource planning (ERP) system, adding a new level of communication needed to ensure a seamless go-live. EnVista channeled its expertise and proper implementation methodology to help the distributor ensure the WMS was fully tested against the right systems with the right data.

Training was also provided by enVista to help ensure that the distributor could be self-sufficient post implementation. By ensuring the correct training and education was delivered to the right users on Castellini's team, the company has remained self-sufficient since the system was implemented, thereby reducing the potential need for outsourcing, limiting system errors, and ensuring it can successfully prioritize resources.

"The key factors for choosing to work with enVista were their in-depth knowledge, dedication to our needs and requirements, as well as their ability to supply Castellini with innovative supply chain solutions that have transformed our entire business," says Dan Taylor, Castellini's CIO.

"Some vendors have made TMS much easier and cheaper to implement and use, and this has driven up the use of Cloud-based TMS," says De Muynck, who points to the new crop of last-mile, rail management, and fleet management software solutions as a few examples of the latest vendor innovations in this arena. "We're seeing software providers developing Cloud-based solutions that are fairly reasonably priced and that can be live within 30 days, in some cases."

Over the last year, William Brooks, vice president of the North American transportation portfolio at Capgemini, says he has seen more customers asking for Cloud-based SCM systems. That interest reached new heights during the pandemic, as companies worked to shore up their supply chain operations, manage remotely, and fully leverage their technology investments.

"Cloud adoption is continuing its strong growth trajectory," says Brooks. "I don't see any letup in sight."

At least some of that growth is being driven by the fact that the Cloud is now viewed as a tested and proven software delivery method. Past stigmas concerning data security and the possible loss of control that comes when the Cloud replaces on-premises servers have been dispelled by the 92% of organizations whose systems were already at least somewhat in the Cloud as of 2020, according to a recent InfoWorld survey.

"Companies are more educated on the Cloud, have tested its waters, and realize that it really does work as advertised," says Brooks. "And because of that, the Cloud is becoming more and more mainstream." Those converted organizations also like Cloud software's lower upfront costs, the fact that it doesn't consume internal IT resources, and its ability to scale up easily as a company grows.

Knowing this, SCM vendors have steadily started offering more Cloud-based solutions. Some have made Cloud their core product offerings, says Brooks, and are using pre-built integrations and application programming interfaces (APIs) that allow shippers to hook those applications back into their existing, on-premises systems.

Clint Reiser, director of supply chain research at ARC Advisory Group, says the big news on this front for 2020 was the introduction of Manhattan Associates' Active WMS, a Cloud-native platform built on a microservices architecture.

According to Manhattan, these platforms connect different applications (the microservices), each of which runs a unique process. In retail, for example, these applications may include order management, point of sale, inventory management, and fulfillment, each of which contributes to the overall customer experience.

In 2021, Manhattan followed up with its Active TMS solution, which takes a similar approach with transportation. Reiser sees this as a key development in the push to create even more advanced, Cloud-based SCM solutions. "Manhattan built these solutions using different, interchangeable widgets [microservices]," says Reiser. "This isn't just a lift and shift to the Cloud; the platform is designed on a different infrastructure."

Put simply, the software developer didn't just move its existing WMS and TMS into the Cloud; it completely rearchitected the technology within the microservices environment. De Muynck says this allows shippers to more easily use the solutions, which can be acquired on a microservice-by-microservice basis. It also makes it much easier for vendors to extend their solutions' capabilities, he adds.

Asked whether he thinks other SCM vendors will follow Manhattan's lead on the microservices front, De Muynck says those that are starting from scratch may naturally move in this direction. Established vendors may have to rethink their current application stacks if they decide to move toward microservices.

From the vendor perspective, Reiser says increased Cloud adoption has helped provide stability for the WMS market over the last three to four years. In other words, even companies that backburnered their large capex investments and on-premises software implementations have been willing to give Cloud a try.

In response, vendors have created more Cloud offerings, effectively stabilizing the revenues of the market, says Reiser. "A percentage of the marketplace has moved to SaaS, so now there aren't as many vendors that rely on the software portion of their revenues," he continues. This trend has also stabilized the vendors' relationships with their own customers.

With no end in sight to the Cloud's domination of the supply chain software arena, expect to see new innovations, functionalities, and capabilities hitting the market in 2021 and beyond. As companies continue to emerge from the pandemic and continue their digitalization journeys, De Muynck says that more of them will be seeking Cloud-based solutions that incorporate artificial intelligence (AI), real-time visibility, and advanced analytics capabilities.

The pandemic-related disruptions that have taken place over the last 16 months have suddenly made these needs much more acute, says De Muynck. As a result, companies are investing in these different technologies within the logistics space, where the race is on to get these digital capabilities in place.

Visit link:
2021 State of the Cloud: No end in sight - Logistics Management

Read More..

Consumer Cloud Subscription Market To See Massive Growth By 2026 | Google, Microsoft, Box The JC Star – The JC Star

The latest study released on the Global Consumer Cloud Subscription Market by AMA Research evaluates market size, trends, and forecast to 2026. The Consumer Cloud Subscription market study covers significant research data and proves to be a handy resource document for managers, analysts, industry experts, and other key people who need a ready-to-access, self-analyzed study to help understand market trends, growth drivers, opportunities, upcoming challenges, and competitors.

Free Sample Report + All Related Graphs & Charts @:https://www.advancemarketanalytics.com/sample-report/170778-global-consumer-cloud-subscription-market

Definition and Brief Information about Consumer Cloud Subscription: Cloud storage gives 24/7 access to documents, photos, music, and videos, and users can get all of it wherever they want and on any compatible device, as long as they have an internet connection. Cloud storage also makes sharing easy and, not least, is an admirable way to back up all digital content. Computer systems have been gradually moving away from local storage to remote, server-based storage and processing. Consumers are affected too: they now stream video and music from servers rather than playing them from discs. Most cloud services do offer some level of backup, almost as a consequence of their intended function.

This report also covers data on emerging players, including the competitive situation, sales, revenue, and global market share of top manufacturers such as: Apple (United States), Google (United States), Microsoft (United States), Box (United States), Dropbox (United States), Amazon (United States), Sync Inc. (United States), Hubic (OVH) (France), Mediafire (United States), Pcloud (Switzerland)

Keep yourself up to date with the latest market trends and changing dynamics due to the COVID impact and the global economic slowdown. Maintain a competitive edge by sizing up the available business opportunities in the Consumer Cloud Subscription Market's various segments and emerging territories.

Market Trends:

Rising Volume of Data Being Uploaded on the Servers

Market Drivers:

Presence of Next-Generation Cloud Technologies and Deployment of 5G Networks

Growing Adoption & Penetration of the Internet

Market Opportunities:

Adoption of AI to Achieve Data Integrity

Development of Distributed Storage Arrays

Enquire for customization in Report @:https://www.advancemarketanalytics.com/enquiry-before-buy/170778-global-consumer-cloud-subscription-market

Application:
Individual
Enterprises (SMEs, Large Enterprises)

Type:
Direct Consumer Cloud Subscription
Indirect Consumer Cloud Subscription

What benefits is the AMA research study going to provide?

Regions included are: North America, Europe, Asia Pacific, Oceania, South America, Middle East & Africa

Country-level break-up: United States, Canada, Mexico, Brazil, Argentina, Colombia, Chile, South Africa, Nigeria, Tunisia, Morocco, Germany, United Kingdom (UK), the Netherlands, Spain, Italy, Belgium, Austria, Turkey, Russia, France, Poland, Israel, United Arab Emirates, Qatar, Saudi Arabia, China, Japan, Taiwan, South Korea, Singapore, India, Australia, New Zealand, etc.

Strategic Points Covered in Table of Content of Global Consumer Cloud Subscription Market:

Get More Information:https://www.advancemarketanalytics.com/reports/170778-global-consumer-cloud-subscription-market

Note: In order to provide a more accurate market forecast, all our reports will be updated before delivery, taking into account the impact of COVID-19.

(*If you have any special requirements, please let us know and we will offer you the report as you want.)

Contact Us:

Craig Francis (PR & Marketing Manager)
AMA Research & Media LLP
Unit No. 429, Parsonage Road
Edison, NJ, New Jersey, USA 08837
Phone: +1 (206) 317 1218
sales@advancemarketanalytics.com

See the original post here:
Consumer Cloud Subscription Market To See Massive Growth By 2026 | Google, Microsoft, Box The JC Star - The JC Star

Read More..

Privilege Elevation for Workstations and Servers – Security Boulevard

The good news is that you don't need to take on everything at once. In fact, we suggest you don't.

We find that most organizations start strong when they adopt PAM, getting a vault set up and domain passwords and local shared accounts under control. Then they start to get complacent, stagnating on their journey somewhere between stages two and three.

Most organizations start strong, then stagnate somewhere between stages two and three.

Meanwhile, the organization keeps growing and the IT environment gets more complex and difficult to manage. Service accounts proliferate, unchecked. Identities multiply and become siloed in Active Directory, LDAP, etc. This is especially true for Linux systems in the cloud, which lack centralized management like AD. Cloud platforms like AWS have their own IAM services, which leads to more siloed accounts.

Just as technology mushrooms, the number of privileged users grows exponentially. Business users adopt more applications without IT management, engineering teams spin up more systems, and developers store passwords in libraries and code.

Cyber criminals are getting more sophisticated and emboldened all the time.

To protect your growing attack surface, you can't hold your organization at the Basic stage. The jump to Advanced is an important one, and it's manageable. Let's break it down.

Fundamentally, the Advanced stage of PAM maturity is about implementing a Zero Trust model founded on the Principle of Least Privilege (PoLP). With this approach, users and systems should only have the access and permissions they need to do their jobs, nothing more.

Traditional password vaults offer a basic level of control and fundamental security benefits. Password theft, however, is only one step in a cyber criminal's attack chain. Should an attacker successfully gain access to a system, they will also need the ability to export data without detection so they can sell it on the black market or ransom it off. To further secure your organization and mature your PAM program, privilege elevation solutions should be used. These allow you to assign admin rights to individual tasks, applications, or scripts that require them, for a granular level of control.

There are two parts of your attack surface where maintaining least privilege is essential for a strong security posture: user workstations and servers. In both situations, privilege elevation capabilities allow you to easily assign or revoke privileges for a specific period, providing just-in-time, just-enough access when admin control is absolutely necessary.
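As a generic illustration of what a granular, least-privilege grant looks like (plain Linux sudoers syntax, not any particular vendor's product; the user, host, and command are hypothetical):

# /etc/sudoers.d/webadmin -- least-privilege grant (hypothetical names)
# alice may restart the web server on host web01, and nothing else,
# instead of holding blanket root access.
alice web01 = (root) NOPASSWD: /usr/bin/systemctl restart nginx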

The rest is here:
Privilege Elevation for Workstations and Servers - Security Boulevard

Read More..