
Evolving Ransomware Demands an AI-powered Threat Detection and Response System – Tech Wire Asia

The dissemination of information cuts two ways. On the one hand, commerce is enabled, yet on the other, so too are the criminalized branches of commerce, and as a result, evolved ransomware is one of the most dangerous threats on the internet today. It's a low-cost, high-profit model and the threat is evolving to keep up with changes in how we work.

Ransomware gangs and their associates are in the business of making money and have an ROI mindset. Groups and individuals learn new techniques, capitalising on their abilities to gain access to systems and data, and either steal, ransom-and-return, or just encrypt and charge.

Ransomware's latest variations actively examine the network for shared files on servers and computers to which the compromised host has access privileges, then spread from one device to a large number of others.

Because of the operational downtime and data loss caused by ransomware encrypting file shares, attacks become incredibly costly. When a company is targeted by a ransomware attack, it's an all-hands-on-deck situation that necessitates urgent action to recover systems while business operations are held hostage.

When the target is a cloud service provider, and the systems encrypted are those of its customers, the downtime gets even worse. In 2019, ransomware attacks affected cloud hosting companies DataResolution.net and iNSYNQ, preventing over 30,000 clients from using their services.

In the same year, ransomware evolved from opportunistic attacks to targeted attacks on businesses willing to pay a higher ransom to regain access to their files. And yet companies seem to continue to pay up, rarely admitting to doing so, with an evident rise in the amounts demanded.

Network file encryption in ransomware

Documents saved in shared volumes are often thought of as backups, but they are frequently the sole copy of information, kept there to enable better productivity when sharing information for teamwork (especially important for mobile workers).

With access to documents in network shares, a single host can lock access to documents across multiple departments in a targeted organisation thanks to high-capacity data storage.

There's also the deep integration with many cloud services that's abstracted away from the user, yet highly attractive to attackers. Integrated filesharing services based in the cloud (to take a single example) allow local attacks to spread out into shared resources hosted anywhere. And the more these services are integrated (log in with your Google account credentials), the greater the scope for potential damage to the enterprise at large.

That goes some way to explain why, according to , the number of attacks may be declining: fewer attacks, sure, but increasingly effective, lucrative and impactful ones as methods evolve.

The fact that the total number of detections is decreasing does not mean that businesses should relax and forgo safety measures. Whether it's the needed investment in extra backups, loss of reputation, loss of IP or interruption to business, ransomware is very, very expensive, and in some cases, terminal.

How Vectra AI addresses ransomware

Ransomware's evolution has moved the technology away from broad, automated spray-and-pray attacks and toward highly focused, human-driven attacks. These new ransomware generations frequently rely on stolen credentials to gain privileged access. And identity-based threats are undetectable by signature-based safeguards, at least until the payload drops and code hosted on the victim begins to exhibit atypical behavior.

If ransomware evolves, so must your detection and response. AI is well suited here to detecting hidden and unknown attackers in real time, allowing for quick, decisive action. Machine learning algorithms that detect anomalies can raise red flags early, helping companies isolate potential infections before the encryption payload spreads laterally.
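To make the anomaly-detection idea concrete, here is a minimal, illustrative sketch (not Vectra AI's actual method) that trains a generic outlier detector on baseline per-host file-activity counts and then flags a sudden, encryption-like burst. The feature names, rates and thresholds are assumptions for demonstration only.

```python
# Illustrative only: a generic anomaly detector over per-host file activity.
# This is NOT Vectra AI's algorithm; features and rates are assumed for the demo.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical per-minute features per host: [files renamed, files written, shares touched]
baseline = rng.poisson(lam=[3, 20, 2], size=(500, 3))    # normal office activity
burst = rng.poisson(lam=[400, 800, 40], size=(5, 3))     # encryption-like spike

detector = IsolationForest(contamination=0.01, random_state=0).fit(baseline)

print(detector.predict(burst))   # -1 marks an anomaly; the burst rows should all be -1
```

In practice a platform correlates many such signals (reconnaissance, lateral movement, command and control) rather than a single feature set, but the early-warning principle is the same.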

The Vectra AI platform looks for telltale symptoms of a ransomware compromise, such as reconnaissance, lateral movement, and command and control in network traffic that includes packets from and to cloud and IoT devices.

Vectra AI is a solution that can see and stop ransomware before it can hurt you. Click here to find out more.

Original post:
Evolving Ransomware Demands an AI-powered Threat Detection and Response System - Tech Wire Asia

Read More..

NetApp To Acquire Instaclustr: Moving Up The App Stack – CRN

NetApp Thursday said it plans to acquire Instaclustr, a developer of a platform for delivering fully managed open-source database, pipeline and workflow applications as a service. With the acquisition, Sunnyvale, Calif.-based NetApp said it aims to take its technology to yet a higher layer above its traditional storage focus.

The acquisition is expected to close in 30 to 45 days, subject to regulatory approval. NetApp declined to discuss the value of the acquisition.

NetApp's acquisition of Instaclustr, with its ability to run open-source databases on the cloud and on-premises, is part of what has become NetApp's centerpiece of optimizing the cloud for customers, said Anthony Lye, NetApp's executive vice president and general manager for public cloud services.

[Related: NetApp CEO George Kurian: Dell, HPE Are Doing What We Did In 2014]

"We're helping clients manage storage from on-prem to the cloud," Lye told CRN. "Our OnCommand Insight is now a multitenant, cloud-based monitoring platform. And we optimize storage to compute with Spot."

NetApp's Instaclustr acquisition is the latest in a series of acquisitions that have moved the vendor from a focus on storage to one of optimizing data and applications across public clouds and on-premises.

It comes just a month after NetApp acquired Fylamynt, which brought CloudOps automation to its Spot portfolio of cloud-native services.

NetApp in June 2020 acquired Spot, which develops technology to manage and optimize compute instances on public clouds.

Since then, NetApp has expanded the Spot portfolio to include its Ocean Kubernetes DevOps technology; its CloudJumper acquisition, which gave it the ability to better manage virtual desktop infrastructure and is now known as Spot PC; its CloudHawk security technology, now known as Spot Security; and its Data Mechanics acquisition for optimizing Apache Spark analytics, now known as Ocean for Apache Spark.

NetApp's move to optimize hybrid multi-cloud environments above the storage layer is paying off, Lye said.

"A couple years ago, our goal was to reach $1 billion in ARR [annual recurring revenue] by fiscal year 2025," he said. "Last week, we said at our investor conference we expect $2 billion in ARR by the end of fiscal year 2026."

For NetApp, the goal is to bring all its goodness to wider platforms, Lye said.

"A lot of customers tell us they love our tools, and that we have these cool services," he said. "But they ask us, what else can we do for them? So we have Spot PC, which lets us run their virtual desktop infrastructures. We have Data Mechanics, which provides customers with a fully managed Spark service. Instaclustr will sit nicely on top of everything we do."

Customers have alternatives to Instaclustr, but they don't offer the capabilities NetApp can with Instaclustr, Lye said.

"So for the ability to work across different open-source projects and multi-cloud environments, Instaclustr can be a very valuable service for us."

Lye said NetApp wants to be part of Platform as a Service and not just Infrastructure as a Service.

"We want to shift as more decisions are made by the application teams than by the IT teams," he said. "IT teams used to say they need a data center, servers, storage and on top of that virtualization, Linux and databases. The last people to get in on the decision-making were the app team. Now I want to flip that around and let the app pick the infrastructure. We can do that not just at the storage, compute and network layer, but also at the app layer. Customers don't want to deal with the infrastructure."

NetApp's acquisition of Instaclustr is another brilliant move, said John Woodall, vice president of engineering and NetApp enablement at General Datatech, a Dallas-based solution provider and longtime NetApp channel partner.

"In my opinion, NetApp is accelerating their transition towards more automated application pipelines and a DevOps perspective," Woodall told CRN. "They're moving up the application stack. Instaclustr fits well with the Spot portfolio. Customers really understand how Spot works with applications."

Instaclustr lets NetApp move toward more open-source, cloud-native database applications, and helps it do more at the application layer and not just at the infrastructure layer, Woodall said.

"This has big ramifications for making applications more cloud-aware," he said. "Anthony [Lye] is clearly moving farther into the application and database end of the stack, and making NetApp more application-aware, topology-aware, cloud-aware and even security-aware. This is a different story for NetApp. It's indicative of the transition to running as a services-led, hybrid cloud capabilities-driven business."

See the original post here:
NetApp To Acquire Instaclustr: Moving Up The App Stack - CRN

Read More..

How to combine the power of cloud and edge computing – Raconteur

Like companies all around the world, US fast-food chain Taco Bell responded to the pandemic's commercial impact by accelerating its shift to the cloud. As customers' traditional patterns of restaurant and drive-through consumption changed rapidly and permanently to include kiosk, mobile and web ordering, often through third-party delivery services, Taco Bell moved the remainder of its group IT to cloud services.

But this 100% cloud-based approach stops at the restaurant door. Given that many of its 7,000 outlets don't have fast and/or reliable internet connections, the company has recognised the limitations of the public cloud model and augmented its approach with edge computing. This set-up enables the company to process data near the physical point at which it is created, with only a periodic requirement to feed the most valuable material back to the cloud and receive updates from it.

Taco Bell is just one of thousands of firms seeking to exploit the fast-evolving and much-hyped distributed IT capability that edge computing can offer.

"Edge computing is getting so much attention now because organisations have accepted that there are things that cloud does poorly," observes Bob Gill, vice-president of research at Gartner and the founder of the consultancy's edge research community.

Issues of latency (time-lag) and limited bandwidth when moving data are key potential weaknesses of the centralised cloud model. These drive a clear distinction between the use cases for cloud and edge computing. But the edge is also a focus for many organisations because they want to add intelligence to much of the equipment that sits within their operations and to apply AI-powered automation at those endpoints.

Early adopters include manufacturers implementing edge computing in their plants as part of their Industry 4.0 plans; logistics groups seeking to give some autonomy to dispersed assets; healthcare providers with medical equipment scattered across hospitals; and energy companies operating widely dispersed generation facilities.

"For such applications to be viable and efficient, their data must be processed as close to the point of origin or consumption as possible," says George Elissaios, director of product management at Amazon Web Services. "With edge computing, these applications can have lower latency, faster response times and give end customers a better experience. Edge computing can also aid interconnectivity by reducing the amount of data that needs to be backhauled to datacentres."

In some ways, the emergence of edge computing represents a new topology for IT. So says Paul Savill, global practice leader for networking and edge computing at Kyndryl, the provider of managed infrastructure services that was recently spun out of IBM.

"Companies are looking at the edge as a third landing spot for their data and applications. It's a new tier between the public cloud and the intelligence at an end device, a robot, say," he explains.

But most organisations don't expect their edge and cloud implementations to exist as distinct entities. Rather, they want to find ways to blend the scalability and flexibility they have achieved with the cloud with the responsiveness and autonomy of internet-of-things (IoT) and satellite processors installed at the edge.

Gill believes that cloud and edge are pure yin and yang. Each does things the other doesn't do well. When put together effectively, they are highly symbiotic.

They will need to be, as more and more intelligence is moved to the edge. More than 75 billion smart digital devices will be deployed worldwide by 2025, according to projections by research group IHS Markit. And it is neither desirable nor realistic for these to be interacting continuously with the cloud.

"When you start to add in multiple devices, you see a vast increase in the volume, velocity and variety of the data they generate," says Greg Hanson, vice-president of data management company Informatica in EMEA and Latin America. "You simply can't keep moving all of that data into a central point without incurring a significant cost and becoming reliant on network bandwidth and infrastructure."

In such situations, edge IT performs a vital data-thinning function. Satellite processors sitting close to the end points filter out the most valuable material, collate it and dispatch it to the cloud periodically for heavyweight analysis, the training of machine-learning algorithms and longer-term storage. Processors at the edge can also apply data security and privacy rules locally to ensure regulatory compliance.
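As a rough illustration of that data-thinning pattern, the sketch below buffers raw readings at the edge and periodically ships only a compact summary (counts, aggregates, outliers) upstream. The send_to_cloud() function is a placeholder rather than a real cloud API, and the 3-sigma outlier rule is an arbitrary assumption.

```python
# Minimal sketch of edge-side data thinning; send_to_cloud() is a stand-in for
# a real HTTPS/MQTT upload, and the 3-sigma outlier rule is an arbitrary choice.
import statistics

buffer: list[float] = []

def send_to_cloud(payload: dict) -> None:
    print("uploading summary:", payload)    # placeholder for the real transport

def on_sensor_reading(value: float) -> None:
    """Called for every local reading; nothing leaves the site here."""
    buffer.append(value)

def flush_summary() -> None:
    """Run periodically; forward only the distilled view to the cloud."""
    if not buffer:
        return
    mean = statistics.fmean(buffer)
    spread = statistics.pstdev(buffer) or 1.0
    send_to_cloud({
        "count": len(buffer),
        "mean": round(mean, 3),
        "max": max(buffer),
        "outliers": [v for v in buffer if abs(v - mean) > 3 * spread],
    })
    buffer.clear()

# Example: 1,000 local readings reduced to one small upstream message.
for i in range(1000):
    on_sensor_reading(20.0 + (i % 7) * 0.1)
flush_summary()
```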

Gill notes that edge computing has shifted quickly from concept and hype to successful implementations. In many vertical industries, it is generating revenue, saving money, improving safety, enhancing the customer experience and enabling entirely new applications and data models.

Before achieving such gains, many edge pioneers are likely to have surmounted numerous significant challenges. Given that the technology is immature, there are few widely accepted standards that businesses can apply to it. This means that they're often faced with an overwhelmingly wide range of designs for tech ranging from sensors and operating systems to software stacks and data management methods.

Such complexity is reflected in a widespread shortage of specialist expertise. As Savill notes: "Many companies don't have all the skills they need to roll out edge computing. They're short of people with real competence in the orchestration of these distributed application architectures."

The goal may be to blend cloud and edge seamlessly into a unified model, but the starting points can be very different. There are two fundamentally different, though not totally contradictory, schools of thought, according to Gill. The "cloud out" perspective, favoured by big cloud service providers such as Amazon, Microsoft and Google, views the edge as an extension of the cloud model that extends the capabilities of their products.

The other approach is known as "edge in". In this case, organisations develop edge-native applications that occasionally reach up to the cloud to, say, pass data on to train a machine-learning algorithm.

Adherents of either approach are seeing significant returns on their investments when they get it right.

"We may be in the early phase of exploiting that combination of IoT, edge and cloud, but the capabilities enabling these distributed architectures, the software control and orchestration tools and the integration capabilities, have already reached the point where they're highly effective," Savill reports. "Some companies that are figuring this out are seeing operational savings of 30% to 40% compared with more traditional configurations."

In doing so, they are also heralding a large-scale resurgence of the edifice that cloud helped to tear down: on-premises IT, albeit in a different form.

"In the next 10 to 20 years, the on-premises profile for most companies will not be servers," Elissaios predicts. "It will be connected devices, and billions of them."

See the article here:
How to combine the power of cloud and edge computing - Raconteur

Read More..

SMBStream for Accelerated VPN-Less Access to SMB shares, is Now Available in the AWS Marketplace – IT News Online

PR.com, 2022-04-09

London, CA April 09, 2022 --(PR.com)-- Storage Made Easy, with a mission of simplifying storage for everyone, announced today that their new SMBStream product can now be launched directly from the AWS Marketplace.

SMBStream provides high-performance, secure access to file servers in the cloud, in data centers, and between geographically distributed offices across the world. Unlike using a VPN, users and applications have speedy access to the file data they need in real-time, and the solution scales as more users are added.

Launching SMBStream from the AWS Marketplace makes it even easier to consolidate file servers into the cloud, to include remote storage in cloud workloads and to integrate distributed file storage into the Enterprise File Fabric platform.

SMBStream Highlights:

Real-time Access - Users are able to access live file storage over the internet. Real-time access means there is no office cache to procure, no snapshots to synchronize, and no global locking challenges.

Fast - SMBStream enables productive use of remote file systems from distributed offices. Improves remote file access up to 15 times compared to a traditional VPN.

Secure - Adds key authentication, repudiation and AES-256 encryption for secure access over the public internet.

Vendor Neutral - Extends the reach of your SMB-compatible file servers including Amazon FSx, Nasuni, NetApp Cloud Volumes.

For more information about SMBStream visit: https://storagemadeeasy.com/smbstream/

Contact Information:
Storage Made Easy
Mariado Martinez, Marketing Manager
+442086432885
Contact via Email
http://StorageMadeEasy.com

Read the full story here: https://www.pr.com/press-release/858959

Press Release Distributed by PR.com

More here:
SMBStream for Accelerated VPN-Less Access to SMB shares, is Now Available in the AWS Marketplace - IT News Online

Read More..

The smart lock market is estimated to be valued at USD 2.1 billion in 2022 and reach USD 3.9 billion by 2027, registering a CAGR of 12.9% – Yahoo…

ReportLinker

The forecast covers the period between 2022 and 2027. According to a United Nations report published in July 2018, the global urban population is expected to increase to 4.46 billion in 2021 and 6.68 billion by 2050. Urbanization is expected to increase in 600 large cities worldwide by 2025.

New York, April 08, 2022 (GLOBE NEWSWIRE) -- Reportlinker.com announces the release of the report "Smart Lock Market with COVID-19 Impact by Lock Type, Communication Protocol, Unlocking Mechanism, Vertical and Region - Global Forecast to 2027" - https://www.reportlinker.com/p05169655/?utm_source=GNW Emerging nations are also witnessing rapid urbanization with the development of smart cities. In emerging nations, the concentration of industrial development in cities leads to a growing demand for improved infrastructure. This would ultimately lead to the development of several new and renovated educational and healthcare institutions, public administration offices, shopping malls, stores, and warehouses. Thus, infrastructure development would boost the demand for advanced biometric solutions and smart locks, particularly in technologically advancing countries such as India, China, and Brazil.

The increasing adoption of the Internet of Things has enabled the use of various cyber-physical devices such as smartphones, connected cars, and wearable devices. Smart locks can be operated through their manufacturers' remote servers.

All information regarding the properties of a smart lock and its virtual key is stored in the vendor's cloud server. Smart locks allow for easy sharing of keys with other authorized persons and allow unauthorized entry and breakage of the lock to be monitored and reported.

However, the vendor cloud server can be attacked through code injection, cross-site scripting, password eavesdropping, and other means. Data communication between a smartphone and a smart lock through the Bluetooth Low Energy (BLE) protocol can also be hacked through a man-in-the-middle (MITM) attack.

Server locks & latches: The fastest-growing lock type in the smart lock market. Server locks and latches use cloud-based server systems to access doors from a remote location. The server cover prevents unauthorized access to the user's server.

Locking the front door (available on some models) prevents unauthorized access to the installed drives. Noke is one of the startups working on this model.

The Noke web-based portal provides users with a comprehensive tool to manage lock functions, track digital keys, share real-time data, and even integrate users' existing platforms.

Wi-Fi: The fastest-growing communication protocol in the smart lock market.

Wi-Fi is one of the key technologies responsible for the increasing implementation of the IoT in smart locks. Wi-Fi can be accessed through various devices such as smartphones, personal computers, and tablets.

These devices can be connected to the Internet through a wireless access point. The Wi-Fi Direct standard enables users to connect any two devices without a wireless router.

The growth in the adoption of smartphones and tablets has increased the application scope of IoT in smart locks equipped with Wi-Fi connectivity. Key players offering Wi-Fi-based smart locks are LockState (US), Allegion (Ireland), Gate, and ASSA ABLOY (Sweden).

Touch-based: The fastest-growing unlocking mechanism in the smart lock market. The touch-based unlocking mechanism in smart locks uses the fingerprint recognition technique. Fingerprint recognition is an effective and simple method for identifying and authenticating individuals.

One technology that should be of particular interest to manufacturers in the smart home industry is biometrics. Biometric authentication can complement the new smart home trend and add real value to modern domestic security solutions.

Unlike password-protected smart locks, biometric authentication uses personally identifiable information stored securely on-device (whether the lock itself or a fingerprint-secured access card) for maximum privacy. This makes biometrics difficult to hack and near-impossible to spoof, ensuring that homes stay considerably safer than with password-secured, internet-enabled, or traditional key locks.

Residential: The fastest-growing vertical in the smart lock market. Smart locks are increasingly used in individual houses to ensure security and safety. Controlled access that allows the entry of only authorized persons is considered the critical function of asset security.

Smart locks have become an integral part of smart homes, as they help control the door locks remotely, protecting people and property. The growth of this market can be attributed to the increasing demand for smart homes and rising urbanization across the globe.

The growing urban population is increasing the need for better infrastructure equipped with security systems to protect people and property.

North America: The largest region in the global smart lock market.

North America is one of the most technologically advanced regions and is a large market for smart lock technology. The growing awareness about home security solutions, the benefits provided by smart locks such as connectivity through smart devices, and their remote access features are driving the regional market's growth.

In addition, the recent upswing in the trend of smart homes, rising adoption of IoT-based services, and the large presence of smart lock vendors are the factors supporting the growth of the smart lock market in North America. The US, Canada, and Mexico are the key countries contributing to the growth of the smart lock market in this region.

The study contains insights from various industry experts, ranging from component suppliers to Tier 1 companies and OEMs. The break-up of the primaries is as follows: By Company Type: Tier 1 (55%), Tier 2 (25%), and Tier 3 (20%); By Designation: C-level Executives (75%) and Managers (25%); By Region: APAC (40%), RoW (30%), Europe (20%), North America (10%).

Key players operating in the smart lock market are ASSA ABLOY AB (Sweden), dormakaba Group (Switzerland), Spectrum Brands, Inc. (US), SALTO Systems, S.L. (Spain), Allegion plc (Ireland), Honeywell International Inc. (US), Dahua Technology Co., Ltd (China), Samsung Electronics Co., Ltd. (South Korea), Vivint, Inc. (US), ZKTECO CO., LTD. (China), igloohome Pte Ltd (Singapore), RemoteLock (US), Onity (US), Master Lock Company LLC. (US), MIWA Lock Co. (Japan), SentriLock (US), Avent Security (China), HavenLock, Inc. (US), Shenzhen Vians Electric Lock Co., Ltd. (China), Anviz Global Inc. (US), CANDY HOUSE, Inc. (US), AMADAS (South Korea), Thekeywe (South Korea), Gate Video Smart Lock (US), and DESSMANN Schliessanlagen GmbH (Germany).

Research Coverage: The report segments the smart lock market and forecasts its size, by value, based on lock type, communication protocol, unlocking mechanism, vertical, and region. The report also provides a comprehensive review of market drivers, restraints, opportunities, and challenges in the smart lock market. The report also covers qualitative aspects in addition to the quantitative aspects of these markets.

Key Benefits of Buying the Report: The report will help the leaders/new entrants in this market with information on the closest approximations of the revenue numbers for the overall market and the sub-segments. This report will help stakeholders gain more insights to better position their businesses and plan suitable go-to-market strategies.

The report also helps stakeholders understand the pulse of the smart lock market and provides them information on key market drivers, restraints, challenges, and opportunities. The report also covers the COVID-19 impact on the smart lock market. Read the full report: https://www.reportlinker.com/p05169655/?utm_source=GNW

About Reportlinker: ReportLinker is an award-winning market research solution. Reportlinker finds and organizes the latest industry data so you get all the market research you need - instantly, in one place.

Here is the original post:
The smart lock market is estimated to be valued at USD 2.1 billion in 2022 and reach USD 3.9 billion by 2027, registering a CAGR of 12.9% - Yahoo...

Read More..

Storage requirements for AI, ML and analytics in 2022 – ComputerWeekly.com

Artificial intelligence (AI) and machine learning (ML) promise to transform whole areas of the economy and society, if they are not already doing so. From driverless cars to customer service bots, AI and ML-based systems are driving the next wave of business automation.

They are also massive consumers of data. After a decade or so of relatively steady growth, the data used by AI and ML models has grown exponentially as scientists and engineers strive to improve the accuracy of their systems. This puts new and sometimes extreme demands on IT systems, including storage.

AI, ML and analytics require large volumes of data, mostly in unstructured formats. "All these environments are leveraging vast amounts of unstructured data," says Patrick Smith, field CTO for Europe, the Middle East and Africa (EMEA) at supplier Pure Storage. "It is a world of unstructured data, not blocks or databases."

Training AI and ML models in particular uses larger datasets for more accurate predictions. As Vibin Vijay, an AI and ML specialist at OCF, points out, a basic proof-of-concept model on a single server might expect to be 80% accurate.

With training on a cluster of servers, this will move to 98% or even 99.99% accuracy. But this puts its own demands on IT infrastructure. Almost all developers work on the basis that more data is better, especially in the training phase. This results in massive collections, at least petabytes, of data that the organisation is forced to manage, says Scott Baker, CMO at IBM Storage.

Storage systems can become a bottleneck. The latest advanced analytics applications make heavy use of CPUs and especially GPU clusters, connected via technology such as Nvidia InfiniBand. Developers are even looking at connecting storage directly to GPUs.

"In AI and ML workloads, the learning phase typically employs powerful GPUs that are expensive and in high demand," says Brad King, co-founder and field CTO at supplier Scality. "They can chew through massive volumes of data and can often wait idly for more data due to storage limitations."

Data volumes are generally large. Large is a relative term, of course, but in general, for extracting usable insights from data, the more pertinent data available, the better the insights.

The challenge is to provide high-performance storage at scale and within budget. As OCF's Vijay points out, designers might want all storage on high-performance tier 0 flash, but this is rarely, if ever, practical. And because of the way AI and ML work, especially in the training phases, it might not be needed.

Instead, organisations are deploying tiered storage, moving data up and down through the tiers, all the way from flash to the cloud and even tape. "You're looking for the right data, in the right place, at the right cost," says Vijay.
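A toy sketch of what such a tiering rule can look like is below. The tier names, age cut-offs and access thresholds are invented for illustration and are not taken from any vendor's policy engine.

```python
# Hedged sketch of an age/heat-based tiering rule, in the spirit of
# "right data, right place, right cost". Thresholds are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataSet:
    name: str
    last_accessed: datetime
    reads_last_30d: int

def choose_tier(ds: DataSet, now: datetime) -> str:
    age = now - ds.last_accessed
    if ds.reads_last_30d > 100 or age < timedelta(days=7):
        return "flash"          # hot: active training data
    if age < timedelta(days=90):
        return "object"         # warm: recent but infrequently read
    if age < timedelta(days=365):
        return "cloud-archive"  # cold: kept for retraining and compliance
    return "tape"               # frozen: long-term archive

now = datetime(2022, 4, 9)
print(choose_tier(DataSet("clickstream-2022Q1", datetime(2022, 4, 1), 450), now))  # flash
print(choose_tier(DataSet("clickstream-2019Q3", datetime(2019, 9, 30), 0), now))   # tape
```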

Firms also need to think about data retention. Data scientists cannot predict which information is needed for future models, and analytics improve with access to historical data. Cost-effective, long-term data archiving remains important.

There is no single option that meets all the storage needs for AI, ML and analytics. The conventional idea that analytics is a high-throughput, high-I/O workload best suited to block storage has to be balanced against data volumes, data types, the speed of decision-making and, of course, budgets. An AI training environment makes different demands to a web-based recommendation engine working in real time.

"Block storage has traditionally been well suited for high-throughput and high-I/O workloads, where low latency is important," says Tom Christensen, global technology adviser at Hitachi Vantara. "However, with the advent of modern data analytics workloads, including AI, ML and even data lakes, traditional block-based platforms have been found lacking in the ability to meet the scale-out demand that the computational side of these platforms creates. As such, a file and object-based approach must be adopted to support these modern workloads."

Block-based systems retain the edge in raw performance, and support data centralisation and advanced features. According to IBM's Scott Baker, block storage arrays support application programming interfaces (APIs) that AI and ML developers can use to improve repeated operations or even offload storage-specific processing onto the array. It would be wrong to rule out block storage completely, especially where the need is for high IOPS and low latency.

Against this, there is the need to build specific storage area networks for block storage, usually Fibre Channel, and the overheads that come with block storage relying on an off-array (host-based) file system. As Baker points out, this becomes even more difficult if an AI system uses more than one OS.

As a result, system architects favour file or object-based storage for AI and ML. Object storage is built with large, petabyte capacity in mind, and is built to scale. It is also designed to support applications such as the internet of things (IoT).

Erasure coding provides data protection, and the advanced metadata support in object systems can benefit AI and ML applications.

Against this, object storage lags behind block systems for performance, although the gap is closing with newer, high-performance object technologies. And application support varies, with not all AI, ML or analytics tools supporting AWS's S3 interface, the de facto standard for object.
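Where a tool does speak S3, the object metadata mentioned above can be attached and queried without downloading the data itself. The following sketch uses boto3 against a hypothetical bucket; the bucket, key and metadata values are made up, and a non-AWS object store would need its endpoint_url set.

```python
# Sketch of tagging training objects with metadata over the S3 API (boto3).
# Bucket, key and metadata values are hypothetical; point endpoint_url at a
# non-AWS, S3-compatible store if needed.
import boto3

s3 = boto3.client("s3")  # e.g. boto3.client("s3", endpoint_url="https://objects.example.local")

# Write an object with descriptive metadata so pipelines can filter datasets
# without opening every object.
s3.put_object(
    Bucket="training-data",
    Key="images/batch-0001.tar",
    Body=b"...tar bytes here...",
    Metadata={"label-set": "v3", "sensor": "cam-7", "license": "internal"},
)

# Later, read only the metadata to decide what to pull for a training run.
head = s3.head_object(Bucket="training-data", Key="images/batch-0001.tar")
print(head["Metadata"])   # {'label-set': 'v3', 'sensor': 'cam-7', 'license': 'internal'}
```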

Cloud storage is largely object-based, but offers other advantages for AI and ML projects. Chief among these are flexibility and low up-front costs.

The principal disadvantages of cloud storage are latency, and potential data egress costs. Cloud storage is a good choice for cloud-based AI and ML systems, but it is harder to justify where data needs to be extracted and loaded onto local servers for processing, because this increases cost. But the cloud is economical for long-term data archiving.

Unsurprisingly, suppliers do not recommend a single solution for AI, ML or analytics; the number of applications is too broad. Instead, they recommend looking at the business requirements behind the project, as well as looking to the future.

"Understanding what outcomes or business purpose you need should always be your first thought when choosing how to manage and store your data," says Paul Brook, director of data analytics and AI for EMEA at Dell. "Sometimes the same data may be needed on different occasions and for different purposes."

Brook points to convergence between block and file storage in single appliances, and systems that can bridge the gap between file and object storage through a single file system. This will help AI and ML developers by providing more common storage architecture.

HPE, for example, recommends on-premise, cloud and hybrid options for AI, and sees convergence between AI and high-performance computing. NetApp promotes its cloud-connected, all-flash storage system ONTAP for AI.

At Cloudian, CTO Gary Ogasawara expects to see convergence between the high-performance batch processing of the data warehouse and streaming data processing architectures. This will push users toward object solutions.

"Block and file storage have architectural limitations that make scaling beyond a certain point cost-prohibitive," he says. "Object storage provides limitless, highly cost-effective scalability. Object storage's advanced metadata capabilities are another key advantage in supporting AI/ML workloads."

It is also vital to plan for storage at the outset, because without adequate storage, project performance will suffer.

"In order to successfully implement advanced AI and ML workloads, a proper storage strategy is as important as the advanced computation platform you choose," says Hitachi Vantara's Christensen. "Underpowering a complex, distributed and very expensive computation platform will net lower-performing results, diminishing the quality of your outcome and ultimately reducing the time to value."

Continued here:
Storage requirements for AI, ML and analytics in 2022 - ComputerWeekly.com

Read More..

Six inducted into Missouri S&T Academy of Electrical and Computer Engineering – Missouri S&T News and Research

Six electrical and computer engineers with ties to Missouri University of Science and Technology were inducted into the Missouri S&T Academy of Electrical and Computer Engineering during the academys induction ceremony, which was held Thursday, April 7, at Comfort Suites Conference Center.

Founded in 1980, the academy is a departmental advisory group composed of alumni and other electrical and computer engineers who have made outstanding contributions to their profession. The new inductees were recognized for their service and leadership in electrical and computer engineering.

New members are listed below:

Cameron K. Coursey of Defiance, Missouri, vice president of platforms at AT&T, earned bachelor's and master's degrees in electrical engineering from Missouri S&T in 1987 and 1988, respectively. Coursey began his career as a communications system engineer for McDonnell Douglas Corp. in 1989. In 1991, he moved to SBC Technology Resources Inc. as a member of its technical staff and was promoted to senior member in 1996. From 1999 to 2003, Coursey served as director of SBC Technology Resources and Cingular Wireless, and in 2003 became executive director of AT&T. He has served in his current role since 2009. A senior member of IEEE, Coursey holds 13 U.S. patents and authored two technical books. He is a past member of the board of the 5G Automotive Association.

John M. Haake of St. Charles, Missouri, founder and owner of Titanova Inc., earned bachelor's and master's degrees in electrical engineering from Missouri S&T in 1986 and 1988, respectively. Haake began his career as a principal engineer at McDonnell Douglas Corp., then co-founded Nuvonyx Inc., where he worked until 2007. He then served as product line manager at Coherent for a year before founding Titanova Inc. in 2008. Haake holds 29 U.S. patents and is a member of the Greater St. Charles Chamber of Commerce, NTMA, ASM, ASME and Eta Kappa Nu.

Clay E. Merritt of Bella Vista, Arkansas, earned a bachelor's degree in electrical engineering from Missouri S&T in 1985 and a master's degree in electrical engineering from University College Dublin in 1987. He began his career as a vending machine control board design engineer for Coin Acceptors in Clayton, Missouri. He then moved to Motorola Semiconductor, where he served as a field application engineer and an application engineering manager. In 2008, he was named new product definition manager for Freescale Semiconductor in Austin, Texas, then joined Spansion as microcontroller applications manager in 2013. He served VORAGO Technologies in microcontroller applications and definition from 2015-2019. Merritt published nine application notes, holds two patents and was Motorola's FAE of the Year in 1998. He has managed teams in five different countries and, while working in product definition for Motorola, visited customers in over 30 countries.

Dale L. Morse of Milford, Michigan, who retired from General Motors in 2014, earned a bachelor's degree in electrical engineering from Missouri S&T in 1979. He also holds a master's degree from Rensselaer Polytechnic Institute. Morse began his career as a reliability engineer for General Motors. In 1986 he was named design release engineer, and in 1995 he was named engineering group manager. He also served as university relations team coordinator from 2005-2011. Morse was awarded a patent for Fade Compensated Tone Control Method and Apparatus in 1993. A member of the Order of the Golden Shillelagh at Missouri S&T, Morse served as president of the Motor City Section of the Miner Alumni Association from 2010-2016 and earned the association's Robert V. Wolf Alumni Service Award in 2018.

Russell L. Woirhaye of Stillwell, Kansas, retired design engineer from SEGA Consultants, earned a bachelor's degree in electrical engineering from Missouri S&T in 1971. Woirhaye began his career as a field engineer in Westinghouse Electric Co.'s power generation division. In 1976, he moved to Black & Veatch Consulting Engineers, where he served as design engineer and then project manager. After working as a consulting engineer for a year in 2003, Woirhaye joined SEGA Consultants in 2004 and retired as its design engineer in 2010. He is a retired professional engineer in Kansas, Missouri and Texas, an IEEE Life Member and a past member of the IEEE and IEEE Communications Society.

Zhiping Yang of Campbell, California, signals team lead for Waymo, earned a Ph.D. in electrical engineering from Missouri S&T in 2000. He also holds bachelor's and master's degrees in electrical engineering from Tsinghua University in Beijing. Yang began his career as a technical leader for Cisco Systems Inc. and then served as principal signal integrity engineer for Apple, signal integrity engineer and principal engineer for Cisco Systems (formerly Nuova Systems), senior principal power integrity engineer for Apple and senior hardware manager for Google. He joined Waymo in his current position in 2021. A member of Eta Kappa Nu and an IEEE Fellow, Yang holds over 20 U.S. patents and has authored over 70 conference and journal publications. He holds leadership positions in the IEEE EMC Society, serves as associate editor of two IEEE journals, chaired the NSF Industry Advisory Board IUCRC CEMC Research Center, was an IEEE EMC Society Distinguished Lecturer and received the IEEE EMC Society Technical Achievement Award. Yang was founding chair of the Missouri S&T/UMR EMC Lab Alumni Network and serves as librarian and board member for the IBIS open forum. He earned the Best Symposium Paper Award from the IEEE EMC Society Symposium in 2006 and 2011, and earned the 2006 IEEE PES Prize Paper Award.

Read more here:

Six inducted into Missouri S&T Academy of Electrical and Computer Engineering - Missouri S&T News and Research

Read More..

UNL College of Engineering beginning to see payoff on investments into people, facilities – Lexington Clipper Herald

CHRIS DUNKER Lincoln Journal Star

The empty lot that became a big hole in the ground on the University of Nebraska-Lincoln's campus is beginning to transform again.

As steel beams begin to stretch skyward, outlining what will become Kiewit Hall, a $115 million facility funded through private donations, UNL's College of Engineering is working to transform alongside it.

University of Nebraska College of Engineering Dean Lance Pérez talks about changes in the engineering program at Othmer Hall on Thursday.

"We're going to transform the student experience," Lance Prez, the college dean, told the NU Board of Regents on Thursday during a campus tour.

Regents also toured and spoke throughout the day with personnel at the UNL College of Education and Human Sciences, the College of Journalism and Mass Communications, the Johnny Carson Center for Emerging Media Arts and the College of Law.

Many of the presentations allowed regents to see the results of the board's investments made in faculty, facilities and new programs over the last few years.

In addition to building a new facility Kiewit Corp. in Omaha made the lead gift of $25 million the College of Engineering is also expanding its faculty and financial support for students.

Forty new engineering faculty have been hired over the last three years, Pérez said, with many moving into the 87,000 square feet of newly renovated space known as the Engineering Research Center.

Many of those individuals come to UNL with National Science Foundation career awards, or an equivalent award, which brings research funding along with it, and the chance for undergraduate and graduate students to get valuable hands-on experience.

UNL is also undertaking a professional development program aimed at shifting its engineering curriculum toward more active and collaborative learning, the dean said.

The interactions that kind of learning will foster, between instructors and students and between students and students, and instructors and instructors across disciplines, may lead to new academic programs.

Every first-year student entering the college this fall will enroll in the Complete Engineer program, which teaches non-technical skills such as communication, leadership, teamwork, professional ethics, and civic and social responsibility alongside the engineering education.

Those competencies will be included on the students' transcripts when they graduate, Pérez said.

UNL is also partnering with Kiewit Corp. and the Peter Kiewit Foundation to support students in ways Pérez said are making a College of Engineering education more accessible for Nebraska students.

A second cohort of 10 Kiewit Scholars, whose cost of education is paid for by executives at Kiewit Corp., will enter UNL in the fall. The first cohort of 10 students has earned an average 3.8 GPA and had opportunities to visit projects across the country and meet business leaders.

Starting this fall, the Peter Kiewit Foundation Scholars will pay the cost of education including tuition, room and board and books for 40 students.

The $5 million annual program, which is focused on Nebraska students, as well as those with demonstrated need, received hundreds of applications, Pérez said.

"We're really doing everything we can to make sure that every Nebraskan has an opportunity to study engineering and preserve that access to students across the state," he told regents.

Pérez said the massive investments from the state, university and private sector are beginning to show results.

After peaking at 3,117 students in the 2017-18 school year roughly corresponding to record enrollment across the NU system the College of Engineering experienced three consecutive years of enrollment losses.

Enrollment jumped to 3,023 students last fall, however, and the number of students who have been accepted is also on the rise.

A total of 435 students have placed enrollment deposits for the fall 2022 semester, Pérez said, a nearly 22% increase over the number of enrollment deposits last year at this time.

"If that comes to fruition,"Prez said, referring to students who have been admitted and paid a deposit showing up for fall classes, "we'll really start to see the growth we're talking about."

This year's enrollment deposits are 9% higher than the 398 students who had been admitted to the College of Engineering at this point in 2019 before the pandemic, when enrollments plummeted across the country.

Pérez credited the leadership of NU President Ted Carter, UNL Chancellor Ronnie Green and the state in pushing to keep the university open last year, when many institutions remained closed because of COVID-19.

"We were aggressive in understanding the value of in-person education, while also learning lessons of remote learning and access, so we are entering this fall poised in a position of strength compared to many other institutions," he said.

Pérez said as the College of Engineering watches Kiewit Hall continue to take shape, it will also keep close tabs on how the students it educates and trains take shape as well.

"I feel confident every one of those students will get an offer from a Nebraska company," he said. "Whether or not they take it, that's their decision, of course, but there is plenty of opportunity in the state right now across engineering, computing and construction."

During Thursday's tour, Green pointed out to regents a space across Vine Street to the south of Kiewit Hall where he envisions a future School of Computing building.

UNL proposed the creation of a School of Computing that operates under the College of Engineering in February 2020; the Board of Regents approved the idea in August 2021.

The proposed $80 million facility, which will be a part of a future fundraising effort launched by the University of Nebraska Foundation, will be home to future degree programs in data science, artificial intelligence and other high-tech fields.

"We don't do vocational training we educate students for the future because they are going to be doing jobs we haven't thought of yet," Prez said.

Reach the writer at 402-473-7120 or cdunker@journalstar.com.

On Twitter @ChrisDunkerLJS

More:

UNL College of Engineering beginning to see payoff on investments into people, facilities - Lexington Clipper Herald

Read More..

University of Nottingham Ningbo China’s Chemical and Environmental Engineering Programmes Lead the Way for Future Careers in Sustainable Development -…

NINGBO, China--(BUSINESS WIRE)--The University of Nottingham Ningbo China (UNNC)'s School of Chemical and Environmental Engineering has received multiple international accreditations, including from the Institute of Materials, Minerals and Mining (IoM3) and the Institution of Chemical Engineers (IChemE). In addition, the School has excellent and experienced teaching fellows and receives regular glowing reviews from its international student body.

In recent years, courses related to the disciplines of chemistry and environmental studies have become increasingly popular with students and parents. The world is also recognizing the importance of the disciplines: among the 17 Sustainable Development Goals proposed by the United Nations, several are closely related to the environment.

In the view of Professor Tao Wu, Dean of the Faculty of Science and Engineering at UNNC, the two undergraduate courses of UNNC Chemical Engineering and Environmental Engineering have great potential. "High-end chips need not only chip design and manufacturing talents, but also experts in materials and chemistry."

The School has experienced teaching fellows, and its small class size, with a ratio of 1:8, is advantageous for study, allowing lecturers to guide students in participating in scientific research, to provide access to cutting-edge experiment equipment, and to support students to create and innovate. During each summer vacation, lecturers stay on campus voluntarily to supervise and guide students conducting summer research.

"Carbon neutrality, and the transformation of energy and the manufacturing industry, are critical to the future development of our country and the world," said Yiyang Liu, a UNNC alumnus. Despite receiving job offers from over 30 renowned companies, Yiyang chose to go on to further study at a top-10 international university in the UK.

A degree certificate with international accreditation substantially enhances students' competitiveness in further study and employment. At UNNC, previous employment destinations of the School's graduates include the UNEP office in Beijing, Morgan Stanley China, ExxonMobil and environmental protection departments of government at all levels. According to the employability report, more than 85% of Class of 2020 students who chose to continue their studies were admitted to world top-50 universities.

Visit link:

University of Nottingham Ningbo China's Chemical and Environmental Engineering Programmes Lead the Way for Future Careers in Sustainable Development -...

Read More..

Tony Barrett Promoted to President of Cyber and Engineering at BigBear.ai – Executive Gov

Intelligence executive Tony Barrett has been elevated to the position of president of cyber and engineering at data and digital services company BigBear.ai.

In his new role, Barrett will build upon his work in BigBear's integrated defense solutions division while leading both security efforts and the innovation-minded creation of new strategies, he announced in a LinkedIn post on April 1.

"I am humbled as well as energized by the opportunity and look forward to many more great things at BigBear.ai," Barrett shared.

The executive's experience combines business operations, software and enterprise-level computer system operations, and military combat service. For over two decades, Barrett was an officer, commanding officer, major and deputy director in the U.S. Marine Corps.

After this time, he began a career in the private sector spearheading intelligence-surveillance-reconnaissance data and technology integration across various data origin points, domains and security levels at software company Modus Operandi.

Subsequently, Barrett was Hanscom site lead and director of Department of Defense operations for technology and management consulting firm PCI, which was eventually acquired by BigBear.ai in February 2021.

Barrett's core capabilities lie in intelligence operations, investigations, enterprise architecture and counterinsurgency operations. He also specializes in counterterrorism operations, internal and personal security, and intel systems.

He attended Boston University and graduated cum laude with a bachelor's degree in history in three years' time while on active duty in a Marine enlisted commissioning program.

The promotion of Barrett follows BigBear.ai's February addition of two new senior vice presidents: Todd Hughes assumed the position of SVP of technology and research, and Dan Jones came aboard as SVP of products.

Excerpt from:

Tony Barrett Promoted to President of Cyber and Engineering at BigBear.ai - Executive Gov

Read More..