
Edge Computing: The Future IoT Solution – Electropages

The field of IoT has seen a dramatic rise in internet technologies being integrated into everyday life. However, the lack of security has resulted in social pressure and government action forcing designers to implement stronger security features. How can edge computing help, and why might it become the ultimate solution for IoT in the future?

Since their introduction, IoT devices have exploded in number, with an estimated 20 billion or more deployed globally. While the term Internet of Things, or IoT, is a relatively new one, the use of internet-related technologies dates back to the creation of the internet itself. But the IoT movement is more concerned with simple devices that traditionally would not have internet capabilities (such as sensors and data loggers), which is why it is considered a separate sector from standard internet computing technologies such as computers, laptops, and phones.

The first IoT devices were simple in nature and often targeted at niche markets, including basic remote temperature and humidity logging. As the data being gathered was benign (i.e. not sensitive), security was given minimal concern, with many devices using default passwords and unencrypted messaging protocols. Because the number of IoT devices was initially small and their capabilities limited, these devices went unnoticed by security experts, cybercriminals, and governments alike. But all of this changed as technology improved, devices became more intelligent, and the nature of the data being gathered became more sensitive.

One technology that has accelerated thanks to the IoT sector is AI, fuelled by the unimaginable quantities of data provided by IoT devices. AI systems are being used to power many modern tasks that are otherwise too difficult or too varied to be programmed traditionally using if statements and switch cases to account for every possibility. Examples include speech recognition, voice recognition, image recognition, intelligent search results, and personalised assistants. As stated previously, the first data types gathered by IoT were benign in nature, including temperature and humidity, which could be used to create intelligent systems that respond to those environmental stimuli. But designers quickly realised that with the advancements in microcontroller technologies (for example, the shift from 8-bit to 32-bit ARM), more complex data types could be gathered, including audio and visual. Such systems could be used to create advanced AI IoT devices that not only gather data about their surroundings but send this data to a cloud-based AI system, which can learn from the data and provide better results in the future. For example, the Amazon Echo is an IoT device that submits spoken user requests to a cloud system, where they are analysed both to perform the request and to improve the AI for future use. Very quickly, IoT devices exploded globally with a whole range of integrated features, including accelerometers, magnetometers, motion sensors, cameras, and microphones. But the speed at which these devices were being designed and put to market was far too great, and this is where cybercriminals began to take advantage.

The speed at which IoT designs changed, as well as the sudden increase in demand for IoT devices, saw engineers turn around products in record time. This, combined with the inability of governments to respond to fast-changing markets and the short-sightedness of designers, quickly saw many billions of devices on the market that contained insufficient security measures while handling highly sensitive data. It was not long before cybercriminals used the many weaknesses of IoT devices to perform malicious activities including DDoS attacks, crypto-mining, blackmail, and data selling. Devices on the market would either have a default password or no password, would not use encrypted messaging protocols, would be built on unsecured silicon technologies, or would leave admin privileges in place for the application space (i.e. the firmware would run with full processor privileges). Devices could also leave networks exposed by allowing an attacker to gain easy entry to the device and then utilise its network connection to gain internet or local access (which could allow entry to servers and other devices on the same network).
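To make the contrast concrete, here is a minimal sketch (ours, not from the article) of the difference between the insecure defaults described above and a hardened connection, using the paho-mqtt library's classic API; the broker hostname, credentials and certificate path are placeholders:

```python
# Sketch: insecure IoT defaults versus a hardened MQTT connection.
import ssl
import paho.mqtt.client as mqtt

# Insecure pattern the article describes: shared default credentials
# over the plaintext port 1883.
#   client = mqtt.Client()
#   client.username_pw_set("admin", "admin")
#   client.connect("broker.example.com", 1883)

# Hardened pattern: per-device credentials and TLS-encrypted transport.
client = mqtt.Client(client_id="sensor-001")
client.username_pw_set("sensor-001", "per-device-secret")  # no shared default
client.tls_set(ca_certs="ca.crt", cert_reqs=ssl.CERT_REQUIRED)  # verify broker
client.connect("broker.example.com", 8883)  # TLS port, not plaintext 1883
client.publish("home/livingroom/temperature", "21.4")
```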

After warnings from security specialists and others in the industry, governing bodies around the world have begun to introduce regulations that describe how designers should remove features that leave their designs open to easy attack. So far, the majority of these regulations are concerned with removing default passwords, but as time progresses they may expand to include more features such as mandatory encryption, on-device hardware security, and the need for security when the device is decommissioned. However, there is one emerging technology that may help to solve issues with IoT security: edge computing.

Currently, IoT devices gather data from their surroundings and stream this data to a cloud-based platform, which in turn can provide multiple features including data viewing, data learning, and data processing. For example, an advanced home automation system might have various IoT sensors around a property whose data is streamed to a cloud-based service that determines how environmental controls should be adjusted. This use of the cloud to perform data processing is often called cloud computing and essentially means that the data processing is done remotely from the IoT device responsible for gathering that data. Edge computing, however, is where the IoT device itself is responsible for some proportion of the data processing, either partially or entirely. Early IoT devices were not capable of edge computing due to the limitations of technology at the time, but with the introduction of powerful microcontrollers at equivalent prices, local IoT devices can start to process their own data.

Edge computing holds many advantages over cloud computing, including security, latency, and reliability. Since edge computing devices transmit little data to a cloud-based system (if any at all), sensitive data is less exposed to potential sources of attack. The lack of transmission means that an attacker would need to gain direct entry to the device itself, as opposed to performing a man-in-the-middle attack, attacking the server itself, or spoofing the server. Keeping data local to a device also gives designers more opportunity to protect the data as soon as it is gathered, with the use of memory encryption as well as dedicated security hardware. Edge computing devices can also perform partial processing on sensitive data before sending it to a cloud-based system for further processing, which can help obscure the data and thereby reduce its usefulness to an attacker (i.e. a trained neural net is far less sensitive than visual data from a camera).
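As a rough illustration of that partial-processing idea (our sketch, not any real product's code), the following Python keeps the raw camera frame on the device and transmits only a derived label; the model stub and cloud endpoint are hypothetical:

```python
# Sketch: classify a camera frame locally and transmit only the result,
# never the pixels. classify_locally() stands in for an on-device model.
import json
import urllib.request
import numpy as np

def classify_locally(frame):
    """Stand-in for an on-device neural network (e.g. a quantized classifier)."""
    brightness = float(frame.mean())  # trivial placeholder feature
    return ("occupied", 0.91) if brightness > 100 else ("empty", 0.88)

frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)  # fake frame
label, confidence = classify_locally(frame)

# Only a few bytes of derived data leave the device, not ~900 KB of image.
payload = json.dumps({"device": "cam-07", "label": label, "conf": confidence})
request = urllib.request.Request(
    "https://cloud.example.com/events", data=payload.encode(),
    headers={"Content-Type": "application/json"}, method="POST")
# urllib.request.urlopen(request)  # transmission left commented in this sketch
```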

Processing data locally on a device also means that latency is significantly reduced, which is highly beneficial in applications requiring fast results (such as self-driving cars). The ability to process data locally also removes the need for a constant internet connection, which helps to improve the reliability of the design. Many areas globally still suffer from unreliable internet and can be subjected to large swings in internet speed. The use of edge computing helps to increase the available bandwidth of a local network, which can improve other services such as local servers and other IoT devices and therefore increase the maximum number of devices on a single network (thereby allowing more IoT devices to be integrated).

While the cost of powerful microcontrollers has continued to fall as their capabilities have significantly increased, they are still more expensive than low-end parts, making cheaper microcontrollers more desirable for mass-produced devices. The introduction of regulations also makes it harder to use mid-range devices that have the processing capabilities needed for advanced features, as they may lack the hardware security required, leaving designs exposed. At the same time, the need for AI in modern products further limits the choices for engineers, who may need AI engines on their IoT devices to efficiently run neural networks.

Edge computing provides designers with a whole new paradigm of computing that can deliver low-latency, high-reliability IoT devices combining the best features of cloud computing with local processing. Hardware security features such as secure boot and root-of-trust will become key technologies for securing devices, and the inclusion of AI engines will allow devices to perform the majority of data processing locally. But despite the many security advantages provided by edge computing, designers still need to carefully consider how their device handles sensitive data, how it could potentially be used maliciously, and how they can help not only to protect users but also to contribute to the world stage in an ever more interconnected future.

Read the original:
Edge Computing The Future IoT Solution - Electropages


OnlyOffice review: create and collaborate with this feature-rich office solution review – TechRadar

OnlyOffice ticks a lot of boxes, and is built for collaboration and teamwork. If you're looking for a powerful Microsoft Office alternative for your business, this may be it. Read through our OnlyOffice review to see why we were so impressed with this cloud and server-based office suite.

We tested macOS and web versions, and had a streamlined experience on both. A main landing page presents you with your folders and documents, collaborative folders and tools, and cloud accounts, all with a sleek design.

Like many suites, OnlyOffice will look very familiar to those who have experience with Microsoft Office. You can easily edit text and add elements with the ribbon at the top, while a sidebar supports more advanced features like editing embedded chart data and customizing tables.

The interface is well-organized, and the HTML5 web app is impressively responsive: it really felt like using an on-disk program. The only notable limit we found regarded trackpad zooming: it doesn't work in Safari and is overly sensitive and slow to respond in Chrome.

Some plans (see below) enable businesses to customize the appearance, interface and function of the software at a deep level.

Before diving in, we'd like to note that OnlyOffice supports the addition of premade and homemade plugins, which means that if a feature doesn't exist, you can create it yourself. It's compatible with all Office and OpenDocument filetypes.

Documents

OnlyOffice supports some of the richest text formatting we've seen, in addition to style creation and customization. Page layout options are comprehensive, with margins, custom page sizes, and even personalized watermarks. Columns are supported, but must all be the same size.

List creation is exemplary: hyphens and asterisks result in new lists; a huge range of icons is available; indents cycle through list styles; multilevel lists are supported and customizable; and formatting changes carry through list levels.

A references tab supports automatic Table of Contents creation and customizable footnotes. While OnlyOffice lacks any kind of citation manager, an EasyBib plugin exists, so subscribers can enjoy full integration. Find & Replace supports Replace All, but not finding styles.

Spreadsheets

Our opinion of the Spreadsheets app was mixed. On the one hand, it's certainly powerful: there are lots of built-in formulas, plus support for filtering, Text to Columns and pivot tables. Cell formatting is rich, with customizable number, date, and currency formats. Finally, charts and graphs are easy to create from data and customize.

On the other hand, we found formula input limited. Suggestions appear as you type, but descriptions are available only on hover, and they disappear once the formula is selected. Similarly, argument hints could be clearer or provide examples. Next, error parsing fails to indicate specific problem elements, making it hard to tell what's gone wrong. Finally, #NAME and #VALUE errors give no information when selected, and error tracing is unsupported.

Presentations

Presentations are straightforward and easy to use. Adding slides and elements and choosing highly customizable transitions worked intuitively, as did presenter mode. We did notice that it's impossible to record the timing of a rehearsed slideshow, a feature which MS Office supports.

OnlyOffice is first and foremost a web app, and works incredibly well as such. We did find, however, that the iOS app is a bit cumbersome. This was surprising, given the polished look and smooth interface of the desktop and web app versions. None of the features are easily accessible, as they are with the ribbon, but rather hidden behind icons and menus that don't make their function immediately obvious. There is also no handwriting support.

That being said, most of the functions are present, if a little difficult to find. For example, we were happy to see that embedded graphs worked just fine in word documents, even if editing data takes you to another screen.

Collaboration features are deeply integrated and one of the core functions of OnlyOffice, which is marketed primarily towards businesses looking for a streamlined company-wide solution. Files can be edited by multiple collaborators in real time or by syncing changes, and you can easily invite users or groups from within your network. The web and desktop app support version history, though this feature is regrettably absent from the iOS app.

Finally, cloud sharing is available with services like Google Drive and Dropbox.

OnlyOffice offers four products (Cloud Service, Enterprise, Integration, and Developer), and the pricing scheme is complex.

Cloud Service provides cloud storage and access to the OnlyOffice suite for $5/user/month ($3 billed annually/$2 triennially). Storage starts at 20GB for 12 users and increases up to 500GB for 50 users. For teams of over 50, you must contact OnlyOffice for custom pricing.

Meanwhile, Enterprise gives you access to the office suite, plus other collaborative tools like email and calendars, on a private server. You also get enhanced security options and help with installation. There are three tiers, with the lowest priced starting at $1200/year for up to 50 simultaneous connections.

Integration is built to work with cloud services your company already uses, like Jira or Moodle. The Home Server edition costs a one-time payment of $99 for up to 10 users, while the Single Server edition costs $1100 for 50 users, and can be scaled according to need.

Finally, Developer enables you to build OnlyOffice into your own software or SaaS from the ground up, customizing it at the most basic level to fit your needs and your companys brand. For $1500 per server, you get 20 connections, with the price increasing depending on the number of connections.

If you're looking for the best office suite for your business, OnlyOffice ticks almost all the boxes. It already includes most office features and is highly customizable with powerful plug-in support. Thus, while we lamented certain missing functions, like intuitive error parsing or advanced find & replace, it's theoretically possible to add these. Finally, at $40/month for 10 users with 100GB cloud storage, it's reasonably priced. All in all, this is a great solution for businesses of all sizes.

OnlyOffice is highly customizable and great for businesses, but does require some setting up. If you're looking for a quicker, plug-and-play solution, iWork and OfficeSuite are both good options.

If your business is Mac-based, iWork is the way to go. It's free and supports collaboration and handwriting out-of-the-box. For smaller teams, OfficeSuite's $49.99 Group plan supports up to five users, with highly functional spreadsheets for data analysis.

To see how OnlyOffice fares against the competition, check out our guide to the Best Microsoft Office alternatives.

Read more:
OnlyOffice review: create and collaborate with this feature-rich office solution review - TechRadar


Avoiding DR and High Availability Pitfalls in the Hybrid Cloud – Computer Business Review


The SLAs only guarantee the equivalent of dial tone for the physical server or virtual machine

The private cloud remains the best choice for many applications for a variety of reasons, while the public cloud has become a more cost-effective choice for others, writes David Bermingham, Technical Evangelist at SIOS Technology.

This split has resulted, intentionally or not, in the vast majority of organizations now having a hybrid cloud. But there are many different ways to leverage the versatility and agility afforded in a hybrid cloud environment, especially when it comes to the different high availability and disaster recovery protections needed for different applications.

This article examines the hybrid cloud from the perspective of high availability (HA) and disaster recovery (DR), and provides some practical suggestions for avoiding potential pitfalls.

The carrier-class infrastructure implemented by cloud service providers (CSPs) gives the public cloud a resiliency that is far superior to what could be justified for a single enterprise.

Redundancies within every data center, with multiple data centers in every region and multiple regions around the globe give the cloud unprecedented versatility, scalability and reliability. But failures can and do occur, and some of these failures cause downtime at the application level for customers who have not made special provisions to assure high availability.

While all CSPs define downtime somewhat differently, all exclude certain causes of downtime at the application level. In effect, the service level agreements (SLAs) only guarantee the equivalent of dial tone for the physical server or virtual machine (VM), or specifically, that at least one instance will have connectivity to the external network if two or more instances are deployed across different availability zones.

Here are just a few examples of some common causes of downtime excluded from SLAs:

It is reasonable, of course, for CSPs to exclude these and other causes of downtime that are beyond their control. It would be irresponsible, however, for IT professionals to use these exclusions as excuses for not providing adequate HA and/or DR protections for critical applications.

Properly leveraging the cloud's resilient infrastructure requires understanding some important differences between failures and disasters, because these differences have a direct impact on HA and DR configurations. Failures are short in duration and small in scale, affecting a single server or rack, or the power or cooling in a single datacenter. Disasters have more enduring and more widespread impacts, potentially affecting multiple data centers in ways that preclude rapid recovery.

The most consequential effect involves the location of the redundant resources (systems, software and data), which can be local on a Local Area Network (LAN) for recovering from a localized failure. By contrast, the redundant resources required to recover from a widespread disaster must span a Wide Area Network (WAN).

For database applications that require high transactional throughput performance, the ability to replicate the active instance's data synchronously across the LAN enables the standby instance to be hot and ready to take over immediately in the event of a failure. Such rapid, automatic recovery should be the goal of all HA provisions.

Data is normally replicated asynchronously in DR configurations to prevent the WAN's latency from adversely impacting the throughput performance of the active instance. This means that updates to the standby instance always land after those made to the active instance, making the standby warm and resulting in an unavoidable delay when using a manual recovery process.
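A toy sketch (ours, not any vendor's code) makes the distinction concrete: a synchronous write is acknowledged only once the standby has applied it, while an asynchronous write is acknowledged immediately and replicated behind the scenes:

```python
# Toy model of synchronous (HA) versus asynchronous (DR) replication.
import time

LAN_LATENCY = 0.001  # roughly 1 ms between availability zones
WAN_LATENCY = 0.050  # tens of ms between regions

active, standby, wan_backlog = [], [], []

def write_sync(record):
    """HA pattern: wait for the standby, which therefore stays 'hot'."""
    active.append(record)
    time.sleep(LAN_LATENCY)      # round trip to the standby before the ack
    standby.append(record)
    return "ack"                 # record exists in both places on ack

def write_async(record):
    """DR pattern: ack at once; the standby is 'warm' and lags behind."""
    active.append(record)
    wan_backlog.append(record)   # shipped across the WAN in the background
    return "ack"                 # standby may not have this record yet

def drain_backlog():
    while wan_backlog:
        time.sleep(WAN_LATENCY)  # this latency is why async is used for DR
        standby.append(wan_backlog.pop(0))

write_sync("txn-1")
write_async("txn-2")
print(len(active), len(standby))  # 2 1 -- the async record is still in flight
drain_backlog()                   # the standby eventually catches up
```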

All three major CSPs accommodate these differences with redundancies both within and across data centers. Of particular interest is the variously named availability zone that makes it possible to combine the synchronous replication available on a LAN with the geographical separation afforded by the WAN. The zones exist in separate data centers that are interconnected via a low-latency, high-throughput network to facilitate synchronous data replication. With latencies around one millisecond, the use of multi-zone configurations has become a best practice for HA.

IT departments that run applications on Windows Server have long depended on Windows Server Failover Clustering (WSFC) to provide high availability. But WSFC requires a storage area network (SAN) or some other form of shared storage, which is not available in the public cloud. Microsoft addressed this issue in Windows Server 2016 Datacenter Edition and SQL Server 2016 with the introduction of Storage Spaces Direct (S2D). But S2D has its own limitations, most notably an inability to span multiple availability zones, making it unsuitable for HA needs.

The lack of shared storage in the cloud has led to the advent of purpose-built failover clustering solutions capable of operating in private, public and hybrid cloud environments. These application-agnostic solutions facilitate real-time data replication and continuous monitoring capable of detecting failures at the application or database level, thereby filling the gap in the dial tone nature of the CSPs' SLAs. Versions available for Windows Server normally integrate seamlessly with WSFC, while versions for Linux provide their own SANless failover clustering capability. Both versions normally make it possible to configure different failover/failback policies for different applications.
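As a simplified sketch of what such application-level monitoring does (ours, not any vendor's implementation), the loop below probes the database listener itself rather than the VM, and triggers a failover hook after repeated failures; the hostname and the hook are placeholders:

```python
# Sketch: probe the database itself, not just the VM "dial tone".
import socket
import time

def db_healthy(host, port=1433, timeout=2.0):
    """Application-level probe: can we reach the SQL Server listener?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def promote_standby():
    print("failover: promoting standby node")  # placeholder for real failover

failures = 0
while True:
    if db_healthy("sql-primary.example.com"):
        failures = 0
    else:
        failures += 1
        if failures >= 3:   # require consecutive failures to avoid flapping
            promote_standby()
            break
    time.sleep(5)           # monitoring interval
```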

More information about SANless failover clustering is available in Ensure High Availability for SQL Server on Amazon Web Services. While this article is specific to AWS, the cluster's basic operation is the same in the Google and Azure clouds.

It is worth noting that hypervisors also provide their own high availability features to facilitate a reasonably quick recovery from failures at the host level. But they do nothing to protect against failures of the VM, its operating system or the application running in it. Just like the cloud itself, these features only assure dial tone to a VM.

For DR, all CSPs have ways to span multiple regions to afford protection against widespread disasters that could affect multiple zones. Some of these offerings fall into the category of DIY (Do-It-Yourself) DR guided by templates, cookbooks and other tools. DIY DR might be able to leverage the backups and snapshots routinely made for all applications. But neither backups nor snapshots provide the continuous, real-time data replication needed for HA. For databases, mirroring or log shipping both provide more up-to-date versions of the database or transaction logs, respectively, but these still lag the data in the active instance owing to the best practice of having the standby DR instance located across the WAN in another region.

Microsoft and Amazon now have managed DR as a Service (DRaaS) offerings: Azure Site Recovery and CloudEndure Disaster Recovery, respectively. These services support hybrid cloud configurations and are reasonably priced. But they are unable to replicate HA clusters and normally have bandwidth limitations that may preclude their use for some applications.

One common use case for a hybrid cloud is to have the public cloud provide DR protection for applications running in the private cloud. This form of DR protection is ideal for enterprises that have only a single datacenter and it can be used for all applications, whether they have HA protection or not. In the enterprise datacenter, it is possible to have a SAN or other form of shared storage, enabling the use of traditional failover clustering for HA protection. But given the high cost of a SAN, many organizations are now choosing to use a SANless failover clustering solution.

The diagram below shows one possible way to configure a hybrid cloud for HA/DR protection. The use of SANless failover clustering for both HA and DR has the additional benefit of providing a single solution to simplify management. Note the use of separate racks in the enterprise data center to provide additional resiliency, along with the use of a remote region in the public cloud to afford better protection against widespread disasters.

This hybrid HA/DR configuration is ideal for enterprises with only a single datacenter.

This configuration can also be flipped with the HA cluster in the cloud and the DR instance in the enterprise datacenter. While it would also be possible and even preferable to use the cloud for both HA and DR protection, this hybrid configuration does at least provide some level of comfort to risk-averse executives reluctant to commit 100% to the cloud. Note how using SANless failover clustering software makes it easy to lift and shift HA configurations when migrating from the private to a public cloud.

With multiple availability zones and regions spanning the globe, all three major CSPs have infrastructure that is eminently capable of providing carrier-class HA/DR protection for enterprise applications. And with a SANless failover clustering solution, such carrier-class high availability need not mean paying a carrier-like high cost. Because SANless failover clustering software makes effective and efficient use of the clouds compute, storage and networking resources, while also being easy to implement and operate, these solutions minimize ongoing costs, resulting in robust HA and DR protections being more affordable than ever before.

David Bermingham is Technical Evangelist at SIOS Technology. He is recognized within the technology community as a high-availability expert and has been honored to be elected a Microsoft MVP for the past eight years: six years as a Cluster MVP and two years as a Cloud and Datacenter Management MVP. David holds numerous technical certifications and has more than thirty years of IT experience, including in finance, healthcare and education.

View original post here:
Avoiding DR and High Availability Pitfalls in the Hybrid Cloud - Computer Business Review


From server room to boardroom: the demands of today's CIO – The Union Journal

The significance of IT in business today is such that those holding the reins to it are some of the most important leaders in any company.

A company's security, customer experience, product development, competitive differentiation and business intelligence now fall, or at the very least overlap, into the remit of the CIO or their equivalent. No pressure, then.

The mountainous job of the CIO in 2020 can be an unrecognized one: CIOs are usually expected to deliver day-to-day results while planning closely for the firm's future.

At the same time, they may have to please both internal stakeholders (typically by saving costs and boosting efficiency through technical projects) as well as end users (by delivering a smooth customer experience and keeping customer complaints to a minimum).

This delicate balancing act can be hugely exhausting for any individual, not least the CIO, who is frequently under pressure to innovate ahead of the nearest rival. In fact, the demands can be so intense that the CIO's tenure is one of the shortest in the C-suite, averaging just 4.3 years.

These significant demands and expectations, however, undersell, or more precisely underestimate, the transformative influence that the CIO really wields in today's business.

No longer is the CIO essentially a glorified IT manager, tasked with managing and maintaining the firm's IT infrastructure, data flows and servers.

Instead, as IT has become central to business, CIOs have (some of them, perhaps, unwillingly) transitioned to understanding, and then combining, business and technology objectives to drive the development and growth of their company in today's digital economy.

In a role that has evolved as rapidly as IT itself over the last decade, the CIO must now move between the development team and the C-suite, be agile and quick to respond to a range of unique challenges, and negotiate investment in transformational technology.

Besides bridging the technology divide on behalf of the company internally, CIOs must also be attuned to end users and the optimization of their experience.

CIOs are typically heading the digital migration of customer interactions with the firm across numerous platforms, such as the web, apps and social media, making these interactions faster, more engaging, and more visible to members of the sales and marketing funnels, so the firm is aware of customer expectations.

A vital tool in the CIO's toolbox is the leveraging of third-party productivity tools and services, such as moving physical workloads and processing procedures online into cloud computing environments.

As well as negotiating investment in the right tools and technologies to keep business operations ticking smoothly into a digital future, CIOs must also be prepared to fight for talent budgets, so the right specialists can be onboarded, the CIO can offload certain tasks, and the return on those technology investments can truly be realized.

That might include cloud specialists to lead cloud migrations, data scientists to oversee initiatives with AI and analytics, and even a designated security officer to ensure the organisation is well protected day to day.

In such an extensive role, which is evolving so quickly, leadership skills are now as valuable as technical expertise, and perhaps tactical delegation can help CIOs squeeze an additional couple of months out of that average tenure.


Here is the original post:
From server room to boardroom the demands of today's CIO - The Union Journal


U.S. Census Goes Digital With The iPhone 8 – The Mac Observer

It's census year in the U.S., but this time around it's going to be different. Each enumerator tasked with gathering the data is to be handed an iPhone 8 instead of a pen and paper. CNET looked into how it is all going to work, and the risks involved.

In an effort to make the door-to-door process, which is the most laborious and expensive part of the census, faster and more efficient, the bureau is arming 500,000 enumerators with the Apple iPhone 8. But as the census goes mobile, instantaneously beaming respondents' answers to data centers and cloud servers, it opens itself up to those who may want to access or manipulate such valuable information. The stakes to pull off a census have always been high, but with this year's adoption of new technological methods, the pressure to succeed is even higher.

Check It Out: U.S. Census Goes Digital With The iPhone 8


More here:
U.S. Census Goes Digital With The iPhone 8 - The Mac Observer


Lenovo Teams With Microsoft Azure At The Edge – The Next Platform

In the ever-evolving landscape that is the edge, applications are the driving force. The Internet of Things (IoT) and the billions of connected devices and systems that make it up are giving the edge structure, and the massive amounts of data that those systems and devices generate are the jewels. But it's the applications that run the engine, such as AI-infused software that will pull crucial, real-time information from all that data and allow enterprises to make immediate and informed business decisions.

It is a point Keerti Melkote, Aruba founder and president of Aruba Networks within Hewlett Packard Enterprise as well as president of HPE's Intelligent Edge business, made to us last year, saying that "the interesting part to me is, what's the application ecosystem at the edge look like? That's what drives the infrastructure ecosystem and data ecosystem."

That said, hardware is no doubt part of the land rush out to the edge. OEMs and original design manufacturers for the past several years have been rolling out a plethora of servers and storage systems designed not only for the power and space constraints of the edge but also to be able to connect back to the core datacenter and the cloud, creating a distributed IT environment that can pass applications and data from one to another and can be managed with common tools. They're also being built to be ready for such advanced technologies as 5G networking and the significant upgrades in speed, bandwidth and capacity that will come with it as it ramps over the next several years.

Dell EMC in February rolled out the PowerEdge XE2420, a short-depth two-socket system built for space-constrained and harsh edge environments and using up to four Nvidia GPUs, and the Modular Data Center Micro 415, a pre-integrated system that comes with power, cooling and remote management and can be placed in various edge locations, part of the company's larger strategy of developing a continuum from the edge to the datacenter and cloud with systems using common components. Supermicro earlier this month expanded its edge portfolio with small and rugged systems for outdoor environments, 5G RAN (radio access networks), and AI inference.

Like most system makers, Lenovo looks at the edge as a key growth area. The company's Data Center Group (DCG) for the past few years has been looking to build on its $2.1 billion acquisition in 2014 of IBM's x86 server business, and it put a sharp focus on the edge last year. At the Transform 2019 event, Lenovo introduced the ThinkSystem SE350, an edge server about the size of a notebook powered by Intel's Xeon D chip and Nvidia Tesla T4 GPU accelerators and aimed at workloads like machine learning inference. The rugged system provides 256GB of RAM and 16TB of internal solid-state storage and can tolerate environments with a range of temperatures, a lot of dust and vibration.

"If you look at it from a hardware perspective, I haven't seen anybody deliver something like the SE350 today in the market," Kamran Amini, vice president and general manager of server, storage and software-defined infrastructure at Lenovo DCG, tells The Next Platform. "It's a very unique product that's delivering AI capability, storage, different types of connectivity from Wi-Fi to 10 gig LAN to LTE and cellular, and enabled for future support of 5G. The form factor and the ruggedization and the security built in, if you look at the market, I don't see anybody having something similar."

Lenovo is coming around this week with another server based on the SE350, as well as a family of all-flash and hybrid flash storage systems that deliver NVM-Express throughout, from the compute through the fabric and all the way to the storage data management area. The new offerings are tied closely to Microsoft's Azure software stack for enterprises operating in a hybrid cloud model, with a presence both on-premises and in the cloud. Microsoft actually pre-announced the new ThinkAgile MX1021 at its Ignite 2019 show in November, according to Amini.

The MX1021, coming out of Lenovo's family of hyperconverged infrastructure systems, uses the Azure Stack HCI solutions to push compute further out to the edge. It also can be used in combination with Azure Stack Hub to drive the datacenter-to-cloud-to-edge infrastructure model by enabling enterprises to run Azure cloud services on-premises, and with Azure IoT Hub to connect edge devices to the Azure cloud. It's a 1U, half-width system powered by a Xeon D-2100 with up to eight cores, up to 256GB of RAM and four DIMM slots. It includes multiple M.2 SATA drive bays and provides direct-connect networking to enable a two- to three-node HCI cluster.

On the storage side is the ThinkSystem DM7100 storage family for Azure hybrid cloud environments. It includes the DM7100F, an all-flash model that includes the NVM-Express protocol for improved performance with non-volatile storage like flash, and integrated Azure cloud tiering that drives consistent data management across all enterprise environments. In addition, users can keep data reduction capabilities in the cloud to reduce their cloud footprints. The DM7100H brings flash with support for SAN and NAS.

"With both of these, we're really targeting not just purely the traditional environments, but beyond that at what's happening in retail, what's happening in manufacturing or healthcare," Amini says. "For example, when you look at healthcare, data is very sensitive and you require pure security, encryption of the drives, insurance, full security. You want to be able to provide better performance, data tiering into the cloud, leveraging the healthcare IT administration more efficiently and streamlining what they do. You look at areas like manufacturing, where with our edge MX1021, delivering HCI capabilities shows no loss of sensor data at that remote manufacturing location, also allowing offline operation in poor connectivity environments. We're really looking at this to tackle a broad spectrum, but also looking at how 5G and others are changing the dynamics of what is required by IT, delivering a true edge-to-core-to-cloud offering here between these two products."

For edge environments, it's more than just having a piece of hardware, he says, adding that the key is "leveraging that ecosystem and being first where you truly have a seamless integration between, for example, Azure and Azure HCI on-prem and delivering truly what's needed at the edge, which is the seamless integration, deployment and management across the entire stack with Microsoft. You could think about it also that it's helping customers that want to extend running Microsoft HCI more to the edge. This is a seamless extension to the edge for them, leveraging Lenovo's leading-edge computing technology plus Microsoft HCI Stack. Customers look at edge to have easy access to the cloud while they have access to the core as well."

When looking at the evolution of the edge, verticals are going to play an important role. Industries have different needs that solutions and infrastructure are going to have to address. Lenovo will address those needs not only through DCG, but also through its Intelligent Devices Group, which includes its Nano and Tiny products with their small and micro PCs and other systems, and its IoT business.

"They're looking at how you deliver services as well," Amini says. "How you manage your datacenter, it's all consolidated in a single space. It's one person to 10,000 devices in one place. When you look at edge, the far edge, you're looking at 10,000 pieces that have to be serviced. We're also looking at how we provide different managed services that are a structure of management of the IT, where it can be deployed and the seamless upgrade if you want or managed in that environment. That's where you're going to see continued enhancements of this space happen."

Read this article:
Lenovo Teams With Microsoft Azure At The Edge - The Next Platform


[Update: Launching April 13] New HomeKit-compatible pan-and-tilt Eufy camera seems to be on the way – 9to5Mac

Earlier this week, Eufy started rolling out HomeKit Secure Video support for the eufyCam 2, and it now seems there is a new pan-and-tilt Eufy camera on the way.

If accurate, this would be the first non-static camera from the budget-focused smart home company

Update 4/3: Reported by HomeKit News, an email from Eufy to customers interested in the new camera says that the company will launch the new HomeKit product on April 13th.

HomeKit News reports.

Earlier today, we sent out a tweet about a post in the HomeKit subreddit, where user u/KingKarl-TM uploaded what appears to be an offer from Eufy, via their Facebook page, revealing not one, but two indoor cameras, one of which would appear to be capable of pan and tilt functionality. Given the number of doctored images found online these days, it's prudent to be cautious about such claims; however, Christopher Close from iMore did a bit more digging, and found further evidence of the tilt camera in the Eufy Security app.

Not only does the app reveal the aforementioned camera, albeit in beta, but the existence of a HomeKit code is shown on the base of the device, further pushing this into the realm of being likely to be HomeKit compatible. Stranger things have happened of course, and it wouldn't be beyond the realms of possibility that such a camera doesn't actually get released. Still, given that it appears they at some point have promoted these via Facebook (no sign of this now seemingly exists beyond the Reddit post), it does look rather promising.

As we explained previously, there are two levels of HomeKit support for security cameras, and there's no indication at this stage which the new cameras might support.

HomeKit Secure Video is designed to address the main security weakness with most of today's smarter cameras. In order to do things like detect people and recognize faces, most cameras upload your video stream to the cloud and carry out the analysis there. That means that unencrypted video is stored on someone else's server, representing a tasty target for hackers.

With HSV, all of the people, animal, and vehicle detection is performed locally, on your own iPad, HomePod, or Apple TV, and only encrypted video is sent to Apple's cloud servers.

The situation is, however, complicated by the fact that other cameras have a more basic form of HomeKit compatibility, meaning Siri support and motion-triggered actions, but not HomeKit Secure Video.

Eufy offers a range of more wallet-friendly smart home products, including security cameras and robot cleaning devices. A pan-and-tilt Eufy camera would be a welcome addition to the range.



View post:
[Update: Launching April 13] New HomeKit-compatible pan-and-tilt Eufy camera seems to be on the way - 9to5Mac


Google is using machine learning to improve the quality of Duo calls – The Verge

Google has rolled out a new technology called WaveNetEQ to improve audio quality in Duo calls when the service can't maintain a steady connection. It's based on technology from Google's DeepMind division that aims to replace audio jitter with artificial noise that sounds just like human speech, generated using machine learning.

If you've ever made a call over the internet, chances are you've experienced audio jitter. It happens when packets of audio data sent as part of the call get lost along the way or otherwise arrive late or in the wrong order. Google says that 99 percent of Duo calls experience packet loss: 20 percent of these lose over 3 percent of their audio, and 10 percent lose over 8 percent. That's a lot of audio to replace.

Every calling app has to deal with this packet loss somehow, but Google says that these packet loss concealment (PLC) processes can struggle to fill gaps of 60ms or more without sounding robotic or repetitive. WaveNetEQ's solution is based on DeepMind's neural network technology, and it has been trained on data from over 100 speakers in 48 different languages.

Here are a few audio samples from Google comparing WaveNetEQ against NetEQ, a commonly used PLC technology. Here's how it sounds when it's trying to replace 60ms of packet loss:

Here's a comparison when a call is experiencing packet loss of 120ms:

There's a limit to how much audio the system can replace, though. Google's tech is designed to replace short sounds, rather than whole words. So after 120ms, it fades out and produces silence. Google says it evaluated the system to make sure it wasn't introducing any significant new sounds. Plus, all of the processing also needs to happen on-device since Google Duo calls are end-to-end encrypted by default. Once the call's real audio resumes, WaveNetEQ will seamlessly fade back to reality.
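For a sense of how such a concealment policy fits into a receive pipeline, here is a bare-bones sketch (ours; WaveNetEQ itself is a neural model, so low-level noise stands in for its generated speech):

```python
# Sketch of the policy described above: fill lost 20 ms frames with
# generated audio, fade out at 120 ms, then emit silence.
import numpy as np

SAMPLE_RATE = 16000
FRAME = SAMPLE_RATE // 50   # 20 ms of samples per packet
MAX_FILL_MS = 120           # beyond this, the filler fades to silence

def conceal(frames):
    """frames: list of np.ndarray (received audio) or None (lost packet)."""
    out, lost_ms = [], 0
    for f in frames:
        if f is not None:
            lost_ms = 0
            out.append(f)               # real audio resumes seamlessly
            continue
        lost_ms += 20
        if lost_ms <= MAX_FILL_MS:
            fill = 0.01 * np.random.randn(FRAME)      # stand-in for the model
            if lost_ms == MAX_FILL_MS:
                fill *= np.linspace(1.0, 0.0, FRAME)  # fade out at the limit
            out.append(fill)
        else:
            out.append(np.zeros(FRAME))               # silence past 120 ms
    return np.concatenate(out)

stream = [0.1 * np.random.randn(FRAME), None, None, 0.1 * np.random.randn(FRAME)]
audio = conceal(stream)  # two lost frames (40 ms) filled with generated audio
```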

It's a neat little bit of technology that should make calls that much easier to understand when the internet fails them. The technology is already available for Duo calls made on Pixel 4 phones, thanks to the handset's December feature drop, and Google says it's in the process of rolling it out to other, unnamed handsets.

More:
Google is using machine learning to improve the quality of Duo calls - The Verge


Infragistics Adds Predictive Analytics, Machine Learning and More to Reveal Embedded Business Intelligence Tool – GlobeNewswire

Reveal adds major new features.

Cranbury, NJ, April 03, 2020 (GLOBE NEWSWIRE) -- Infragistics is excited to announce a major upgrade to its embedded data analytics software, Reveal. In addition to its fast, easy integration into any platform or deployment option, Reveal's newest features address the latest trends in data analytics: predictive and advanced analytics, machine learning, R and Python scripting, big data connectors, and much more. These enhancements allow businesses to quickly analyze and gain insights from internal and external data to sharpen decision-making.

Some of these advanced functions include:

"Our new enhancements touch on the hottest topics and market trends, helping business users take actions based on predictive data," says Casey McGuigan, Reveal Product Manager. "And because Reveal is easy to use, everyday users get very sophisticated capabilities in a powerfully simple platform."

Machine Learning and Predictive Analytics

Reveal's new machine learning feature identifies and visually displays predictions from user data to enable more educated business decision-making. Reveal reads data from the Microsoft Azure and Google BigQuery ML platforms to render outputs in beautiful visualizations.
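For context, the kind of BigQuery ML query that yields such predictions looks like the sketch below; the project, dataset and model names are hypothetical, and how Reveal itself ingests the results is not covered here:

```python
# Sketch: fetch predictions from a (hypothetical) BigQuery ML model using
# the google-cloud-bigquery client; Reveal would visualize rows like these.
from google.cloud import bigquery

client = bigquery.Client()  # requires Google Cloud credentials

sql = """
SELECT *
FROM ML.PREDICT(
  MODEL `my_project.sales.revenue_forecast`,  -- model trained in BigQuery ML
  (SELECT region, month, ad_spend
   FROM `my_project.sales.current_inputs`))
"""
predictions = client.query(sql).to_dataframe()  # adds predicted_* columns
print(predictions.head())
```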

R and Python Scripting

R and Python are the leading programming languages focused on data analytics. With Reveal support, users such as citizen data scientists can leverage their knowledge of R and Python directly in Reveal to create more powerful visualizations and data stories. They only need to paste a URL to their R or Python scripts in Reveal, or paste their code into the Reveal script editor.
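The release doesn't spell out the script contract, so purely as a generic illustration, a pasted Python script might derive an extra analytic column with pandas for Reveal to visualize:

```python
# Generic example of a small analytics script a user might paste in:
# derive a rolling-average trend column for visualization.
import pandas as pd

df = pd.DataFrame({
    "month":   ["Jan", "Feb", "Mar", "Apr", "May", "Jun"],
    "revenue": [120, 135, 128, 150, 163, 171],
})

# A 3-month rolling average smooths the series into a trend line.
df["trend"] = df["revenue"].rolling(window=3, min_periods=1).mean()
print(df)
```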

Big Data Access

With support for Azure SQL, Azure Synapse, Google BigQuery, Salesforce, and AWS data connectors, Reveal pulls in millions of records. And it creates visualizations fast: Reveal's been tested with 100 million records in Azure Synapse, and it loads in a snap.

Additional connectors include those for Google Analytics and Microsoft SQL Server Reporting Services (SSRS). While Google Analytics offers reports and graphics, Reveal combines data from many sources, letting users build mashup-type dashboards with beautiful visualizations that tell a compelling story.

New Themes Match an App's Look and Feel

The latest Reveal version includes two new themes that work in light and dark mode. They are fully customizable to match an app's look and feel when embedding Reveal into an application, and provide control over colors, fonts, shapes and more.

More Information

For in-depth information about Reveal's newest features, visit the Reveal blog, Newest Reveal Features: Predictive Analytics, Big Data and More.

About Infragistics
Over the past 30 years, Infragistics has become the world leader in providing user interface development tools and multi-platform enterprise software products and services to accelerate application design and development, including building business solutions for BI and dashboarding. More than two million developers use Infragistics' enterprise-ready UX and UI toolkits to rapidly prototype and build high-performing applications for the cloud, web, Windows, iOS and Android devices. The company offers expert UX services and award-winning support from its locations in the U.S., U.K., Japan, India, Bulgaria and Uruguay.

Follow this link:
Infragistics Adds Predictive Analytics, Machine Learning and More to Reveal Embedded Business Intelligence Tool - GlobeNewswire


Data Science and Machine-Learning Platformss Market Share Opportunities Trends, And Forecasts To 2020-2027 with Key Players: SAS, Alteryx, IBM,…

Global Data Science and Machine-Learning Platforms Market Forecast 2020-2027

This report offers a detailed view of market opportunity by end-user segments, product segments, sales channels, key countries, and import/export dynamics. It details market size and forecast, growth drivers, emerging trends, market opportunities, and investment risks across various segments of the Data Science and Machine-Learning Platforms industry. It provides a comprehensive understanding of Data Science and Machine-Learning Platforms market dynamics in both value and volume terms.

The report provides a basic overview of the industry, including definitions and classifications. The Data Science and Machine-Learning Platforms Market analysis is provided for the international markets, including development trends, competitive landscape analysis, and key regions' development status.

The major players reported in the market include: SAS, Alteryx, IBM, RapidMiner, KNIME, Microsoft, Dataiku, Databricks, TIBCO Software, MathWorks, H2O.ai, Anaconda, SAP, Google, Domino Data Lab, Angoss, Lexalytics, and Rapid Insight.

The final report will add an analysis of the impact of Covid-19 on the Data Science and Machine-Learning Platforms industry.

Get Sample Copy of the Complete Report

The report first introduces the Data Science and Machine-Learning Platforms Market basics: definitions, classifications, applications and industry chain overview; industry policies and plans; product specifications; manufacturing processes; cost structures and so on. It then analyzes the world's main regional market conditions, including product price, profit, capacity, production, capacity utilization, supply, demand and industry growth rate. Finally, the report introduces new project SWOT analysis, investment feasibility analysis, and investment return analysis.

Table of Contents

1 Report Overview

2 Global Growth Trends

3 Market Share by Key Players

4 Breakdown Data by Type and Application

5 North America

6 Europe

7 China

8 Japan

9 Southeast Asia

10 India

11 Central & South America

12 International Players Profiles

13 Market Forecast 2019-2025

14 Analysts Viewpoints/Conclusions

15 Appendix

This report studies the Data Science and Machine-Learning Platforms market status and outlook of global and major regions, from the angles of players, countries, product types and end industries; it analyzes the top players in the global market, and splits the Data Science and Machine-Learning Platforms market by product type and applications/end industries.

Customization of this Report: This report can be customized to meet the client's requirements. Please connect with our sales team ( [emailprotected] ), who will ensure that you get a report that suits your needs. For more relevant reports visit http://www.reportsandmarkets.com

What to Expect From This Report on the Data Science and Machine-Learning Platforms Market:

Development plans for your business based on production costs, product value, and more for the coming years.

A detailed overview of the regional distribution of popular products in the Data Science and Machine-Learning Platforms Market.

How do the major companies and mid-level manufacturers make a profit within the Data Science and Machine-Learning Platforms Market?

An estimate of the barriers facing new players entering the Data Science and Machine-Learning Platforms Market.

Comprehensive research on overall expansion within the Data Science and Machine-Learning Platforms Market to inform product launches and asset development.

To Learn More About This Report

If you have any special requirements for this report, please let us know and we can provide a custom report.

About Us:

Market research helps in understanding the market potential of any product. Reports And Markets is not just another company in this domain, but is part of a veteran group called Algoro Research Consultants Pvt. Ltd. It offers premium progressive statistical surveying, market research reports, and analysis and forecast data for a wide range of sectors, for both government and private agencies all across the world.

For more detailed information please contact us at:

Sanjay Jain

Manager Partner Relations & International Marketing

http://www.reportsandmarkets.com

Ph: +1-352-353-0818 (US)

View original post here:
Data Science and Machine-Learning Platformss Market Share Opportunities Trends, And Forecasts To 2020-2027 with Key Players: SAS, Alteryx, IBM,...
