
Explain MEC like I’m a 7-year-old. – Verizon Communications

Even for a tech-savvy millennial like me, some concepts can be confusing. I find the easiest way to learn -- or teach -- is to break things down into simple terms, as if talking to a child.

So who better to teach me about MEC than a 7-year-old?

Ava Carnes, daughter of V Teamer Lauren Schulz, described MEC in terms I could understand: Ice cream.

If you live in the suburbs and want to go for a treat, you wouldn't drive all the way to the big city; you'd go to your local shop.

In the same way, imagine you live in a smart home and you give a command to your lamp. Currently, that data needs to be transmitted to cloud servers that may be hundreds of miles away, analyzed and then a command is sent back to the lamp to switch on.

But with Mobile Edge Computing (MEC), the command only has to travel to a mini cloud that exists in your own neighborhood. By performing processing tasks closer to you, the end user, MEC improves the performance of applications and our network. The result? An Internet of Things (IoT) that will be smarter, faster, more responsive and more efficient than ever before.

To learn more about the power of MEC, watch the VTalk with Kyle Malady, Srini Kalapala and Valerie Feldmann and another on MEC strategy and partnerships.

Thanks, Ava! You made us all a little smarter. Now, can you help me with my taxes?

Tell us what you think of Up To Speed.

See original here:
Explain MEC like I'm a 7-year-old. - Verizon Communications

Read More..

It's time for smart home devices to have local failover options during cloud outages – Stacey on IoT

Earlier this week, a Nest outage lasted for 17 hours. Nest cameras didn't capture any video footage during that time. In most cases, this downtime was likely a minor inconvenience, if it was noticed at all. But for anyone who experienced some type of incident during the 17-hour window for which that video footage would have been valuable, it's a complete fail.

Yep. And my elderly father fell. Only the two times I needed it history was deleted.

Proud Knights Fan (@3600dollarsgone) February 25, 2020

I'm using the word fail for a specific reason. As smart home systems mature and gain more mainstream acceptance, the failure of a cloud-based device or service becomes less acceptable. One possible solution is to start engineering these devices with some type of local failover, even if it's limited in function.

Google says that the Nest outage was caused by a server update that didn't go as planned. Having managed servers in Fortune 100 companies, I get that. And I'm not specifically calling Nest out here. Amazon Echo devices have occasionally experienced similar outages, as have Ring products, which are part of the Amazon family.

Just in the past 30 days, Ring device owners have experienced some service disruptions, as noted by Ring's outage history page.

And last week, some owners of the PetNet smart feeding system saw their pets go hungry due to a service disruption, with one person saying "My cat starved for over a week" in a Twitter response to PetNet support.

The point here is that people are fully reliant on these types of smart home products to work. Not most of the time, but all of the time. When a supporting cloud service (often paid for in subscription fees) does go down, it can have very negative implications.

So what's the answer?

We need smart home companies to deliver on the promises of local controls for existing products, and we need new products designed to smartly failover in some local capacity.

Last year, Google and Amazon both announced more localized services and smarts at the edge. Yet we haven't seen much progress on this front. If new localized controls and smarts have found their way to our smart homes, I haven't seen either company make a big news splash about it.

When it comes to new products, most of the ones I've seen are still focused on the subscription revenue model, which generally means some sort of cloud service for integrations or video storage. I don't bemoan companies making money from services, but a local failover of some kind would improve the customer experience and, therefore, could sell more products and services in the long run.

Take the example of cloud-connected cameras and video doorbells, which are both a hot category right now. Having them solely dependent on a web connection to some servers is a recipe for disaster. Sure, they need the cloud in many cases for person recognition, data storage, or other services, but they're IP-based devices on a home network, too. Before the smart homes of today, we had IP-based cameras that we could view in real-time from a phone.

Why can't today's smart devices failover to some localized viewing mode and rudimentary notification system during an outage? And if you're going to add that, why not a limited amount of on-device storage, or a storage expansion slot for times like that?
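To make the idea concrete, here is a minimal sketch of what that kind of failover logic could look like on a camera: try the cloud first, and if it is unreachable, buffer the clip to local storage and notify viewers on the home network. All names, paths and behaviors here are illustrative assumptions, not any vendor's actual firmware or API.

```python
import time

LOCAL_BUFFER_PATH = "/mnt/sdcard/clips"  # assumed on-device memory card mount

def upload_to_cloud(clip: bytes) -> None:
    """Stand-in for a vendor cloud upload; here it simulates an outage."""
    raise ConnectionError("cloud service unreachable")

def save_locally(clip: bytes) -> None:
    filename = f"{LOCAL_BUFFER_PATH}/{int(time.time())}.mp4"
    print(f"cloud down, buffering {len(clip)} bytes to {filename}")

def notify_lan_viewers(message: str) -> None:
    # A real device might use mDNS or a local push channel instead of print().
    print(f"LAN notification: {message}")

def handle_new_clip(clip: bytes) -> None:
    try:
        upload_to_cloud(clip)
    except ConnectionError:
        save_locally(clip)  # rudimentary on-device storage during the outage
        notify_lan_viewers("camera is running in local-only mode")

handle_new_clip(b"\x00" * 1024)
```

Even this limited behavior would preserve the footage and the notifications that matter most during an outage, which is the point the column is making.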

Yes, there's cost involved to add such storage or slots for a memory card. But as Wyze has proved with its $20 WyzeCam, it can't be that much money. I have my own 32 GB memory card in my WyzeCam for this very purpose.

Something's got to give here because the smart home is increasingly being relied upon by millions to monitor, react to and inform us of changes in the roof over our head. Server outages are a question of when, not if, even for the best of companies that have large-scale redundancy. It's time for smart device makers to consider building in local failover options for when the inevitable system outage occurs.

Related

Visit link:
It's time for smart home devices to have local failover options during cloud outages - Stacey on IoT

Read More..

Cloud demands new way of thinking about IT – Gadget

Overall shipments of personal computing devices (PCD) will decline 9% in 2020, reaching 374.2-million by the end of this year, as a result of the impact of coronavirus, or COVID-19, on manufacturing, logistics and sales.

According to new projections from the Worldwide Quarterly Personal Computing Device Tracker, International Data Corporation (IDC) has lowered its forecast for PCDs, inclusive of desktops, notebooks, workstations, and tablets.

The long-term forecast still remains slightly positive, with global shipments forecast to grow to 377.2-million in 2024, with a five-year compound annual growth rate (CAGR) of 0.2%. However, this is based on an IDC assumption that the spread of the virus will recede in 2020. Since the figure represents only marginal growth, the ongoing impact of the virus could quickly reduce long-term forecasts into negative expectations. IDC did not provide alternative scenarios should this occur.
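IDC's exact methodology isn't spelled out here, but assuming the 0.2% CAGR is calculated from the 2020 and 2024 shipment endpoints quoted above, a quick back-of-the-envelope check reproduces the figure:

```python
# Shipment endpoints cited in the article, in millions of units.
start_2020, end_2024 = 374.2, 377.2
periods = 4  # compounding steps between the 2020 and 2024 year-end points

cagr = (end_2024 / start_2020) ** (1 / periods) - 1
print(f"implied CAGR: {cagr:.2%}")  # prints roughly 0.20%
```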

The decline in 2020 is attributed to two significant factors: the Windows 7 to Windows 10 transition creates tougher year-over-year growth comparisons from here on out and, more recently, the spread of COVID-19 is hampering supply and leading to reduced demand. As a result, IDC forecasts a decline of 8.2% in shipments during the first quarter of 2020 (1Q20), followed by a decline of 12.7% in 2Q20 as the existing inventory of components and finished goods from the first quarter will have been depleted by the second quarter. In the second half of the year, growth rates are expected to improve, though the market will remain in decline.

"We have already forgone nearly a month of production given the two-week extension to the Lunar New Year break and we expect the road to recovery for China's supply chain to be long, with a slow trickle of labour back to factories in impacted provinces until May when the weather improves," said Linn Huang, research vice president, Devices & Displays. "Many critical components such as panels, touch sensors, and printed circuit boards come out of these impacted regions, which will cause a supply crunch heading into Q2."

"There's no doubt that 2020 will remain challenged as manufacturing levels are at an all-time low and even the products that are ready to ship face issues with logistics," added Jitesh Ubrani, research manager for IDC's Worldwide Mobile Device Trackers. "Lost wages associated with factory shutdowns and the overall reduction in quality of life will further the decline in the second half of the year as demand will be negatively impacted."

Assuming the spread of the virus subsides in 2020, IDC anticipates minor growth in 2021 as the market returns to normal with growth stemming from modern form factors such as thin and light notebooks, detachable tablets, and convertible laptops. Many commercial organizations are expected to refresh their devices and move towards these modern form factors in an effort to attract and retain a younger workforce. Meanwhile, consumer demand in gaming, as well as the rise in cellular-enabled PCs and tablets, will also help provide a marginal uplift.

Worldwide Topline Personal Computing Device Forecast Changes, Year-Over-Year Growth %, 2020-2021 (Annual)

Worldwide Topline Personal Computing Device Forecast Changes, Year-Over-Year Growth %, 2020 (Quarterly)

Source: IDC Worldwide Quarterly Personal Computing Device Tracker, February 19, 2020

Read the rest here:
Cloud demands new way of thinking about IT - Gadget

Read More..

Microsoft is retiring its MCSA, MCSD and MCSE certifications in June 2020 – ZDNet

Microsoft is retiring three of its more popular professional certification categories as part of a move toward "role-based training." Its Microsoft Certified Solutions Associate (MCSA), Microsoft Certified Solutions Developer (MCSD) and Microsoft Certified Solutions Expert (MCSE) certifications are going away, officials said via a blog post on February 28, 2020. Microsoft also has no plans to offer Windows Server 2019 or SQL Server 2019 certifications, officials said.

As of June 30, 2020, all exams for MCSA, MCSD and MCSE will no longer be available. Microsoft is advising that those already working toward these certifications should pass the required exams before that date, as those certifications will no longer be awarded after June 30. Individuals who passed a qualifying exam prior to its retirement will still be able to count it toward a partner competency requirement for 12 months after the exam has retired.

Those who already have MCSA, MCSD or MCSE certifications will be able to reference them for up to two years after the deadline; after that point, they will be marked as "inactive."

Microsoft today published its recommendations for those with MCSA, MCSD and MCSE certifications who are interested in moving to the newer role-based certifications. Unsurprisingly, the recommendations focus on Azure, Microsoft 365, Data and AI, and Dynamics 365.

Microsoft's FAQ about the certification changes said that Windows Server 2019 and SQL Server 2019 content will be included in role-based certifications on an "as-needed basis for certain job roles in the Azure Apps & Infrastructure and Data & AI solution areas."

Microsoft's blog post has a full list of certifications and exams that will be retiring on June 30, 2020.

Here is the original post:
Microsoft is retiring its MCSA, MCSD and MCSE certifications in June 2020 - ZDNet

Read More..

Sticking With Both The Pure And The Storage Strategy – The Next Platform

The best technology companies have always taken something that was complex and done a whole lot of engineering (or, in many cases, re-engineering) to make it usable and consumable, with the right pricing, so it can go mainstream.

Pure Storage was not the pioneer in enterprise flash storage, but it is the unicorn that has grown its product line, customer base, and revenues steadily since its founding in 2009, in the belly of the Great Recession. And like other innovative storage upstarts (Nutanix comes to mind), Pure Storage has had issues making money as it grows its business, but Wall Street seems to have patience that, in the long run, the flash storage appliance maker will be able to grow sales faster than costs, flip to profitability and stay there.

It was not at all obvious at the time when Pure Storage was founded that it would emerge as one of the leaders in all-flash arrays. Four years earlier, Fusion-io created the flash cards that for a long time defined the flash storage market, and eventually the cost of flash came down enough that both Apple and Facebook used Fusion-io devices to underpin database acceleration from 2011 through 2013, to the tune of $150 million to $200 million a year; these two accounts comprised more than half of the company's revenues. Violin Memory, also founded in 2005, created storage appliances that, like those from Pure Storage many years later, were based on homegrown flash modules, not stock SATA or SAS solid state drives, and an operating system that masked many of the reliability and durability issues of flash from the systems that accessed them. A slew of all-flash array makers have launched since then, and some of them are still here, chasing the incumbent storage giants that serve the enterprise datacenter, many of whom have long histories as disk array makers and have added flash to make hybrid devices to try to defend against the all-flash onslaught.

We have said it before, and we will say it again: In the fullness of time, except for the largest hyperscalers, cloud builders, and HPC centers with absolutely monstrous, exascale-class storage problems, most enterprises will at some point be able to move to all-flash storage for their applications and they will abandon disk-based arrays entirely. The mantra for a long time was that disk is the new tape and flash is the new disk, but we think, if you look carefully (let's take the instances and storage services at Amazon Web Services as an example), that tape is the new tape, since AWS Glacier is based on tape libraries with very good caching on the front end, and flash is the new disk, and, unless you need to store trillions of cat videos or immense simulation and modeling inputs and outputs, you probably don't need disk arrays.

Good data that allows comparisons is hard to come by, but it is not a coincidence that, according to data from IDC for the first quarter of 2019, the all-flash array market was about the same size as the hybrid flash array market ($2.47 billion versus $2.81 billion), but there was still another $8.09 billion in disk-based storage. And, we strongly suspect, a lot more capacity was sold on disk than on hybrid and certainly on all flash. Some of that is short-stroking disk arrays for performance (putting data only on the outside third of tracks, where the rotational velocity is highest on the platter and therefore the average access time on files is lower).

At the current rate of revenue decline for disk storage, it will take until 2027 for disk array sales to fall to the level of all-flash arrays and hybrids. This decline could happen more abruptly or not at all. Extrapolation works like that. Still, it is funny to think that, in the longest of runs, when the hyperscalers and the cloud builders are the last customers to be buying disk drives for massive exascale storage farms, enterprises will have outsourced disk arrays to them, and consumers, too. And the real comparison over time is between on-premises flash versus cloud storage.
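For illustration only, here is the sort of simple extrapolation the article is describing, seeded with the Q1 2019 IDC figures quoted above. The constant 5.5 percent annual decline, the flat flash revenue, and the reading of "the level of all-flash arrays and hybrids" as their combined revenue are all assumptions chosen to show the mechanics, not IDC projections.

```python
# Toy straight-line extrapolation; the decline rate is an assumption, not data.
disk = 8.09                      # $B disk-based storage, Q1 2019 (per IDC above)
flash_plus_hybrid = 2.47 + 2.81  # $B all-flash + hybrid, Q1 2019, held flat
annual_decline = 0.055           # assumed constant yearly decline for disk

year = 2019
while disk > flash_plus_hybrid:
    disk *= (1 - annual_decline)
    year += 1
print(f"disk revenue crosses below all-flash + hybrid around {year}")  # ~2027
```

Change the assumed decline rate by a point or two and the crossover year moves by several years, which is exactly the fragility of extrapolation the paragraph above is cautioning against.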

This is one of the big bets that Pure Storage made more than a decade ago, and inasmuch as it has built a business that generated $1.64 billion in its fiscal 2020 ended in January, with around 7,500 customers, and is on track to kiss $2 billion in sales in fiscal 2021 (which is for the most part resident in calendar 2020), then Pure Storage has been successful precisely within its wildest dreams.

Pure Storage is right in there, competing with the other storage array makers: once you take the internal disk arrays that server makers bundle inside their server skins, as well as the ODM storage servers that the hyperscalers and cloud builders buy, out of the mix, call it about 5 percent share. If it were possible to focus just on large enterprise accounts, we think that Pure Storage's share could be somewhere in the range of 2X to 3X this level, which is remarkable for any storage startup and is akin to the rise of EMC back in the 1990s, when it cut its teeth making RAID clusters of cheap disks with smart controllers and giant cache memories that emulated IBM 3880 and 3990 mainframe disk arrays and then rode the Unix server wave up with a POSIX file system on its Symmetrix arrays, completely changing enterprise storage.

However, EMC was smaller when it went public in 1986, and it was profitable, so it could be smaller and still raise a substantial amount of money relative to its size. That year, Dick Egan and Roger Marino cashed in some of their shares on Wall Street and raised $30 million for their company, which posted $66.6 million in sales that year (twice the level of the year before) and had $18.6 million in net income. There was no venture capital beyond the bootstraps of Egan and Marino. When EMC hired Moshe Yanai, who designed the Symmetrix, XIV, and Infinidat storage systems, the company was able to ride the rocket up as IT underwent the Unix revolution and then the dot-com boom. In 1990, when the Symmetrix array launched after three years of development, EMC had 0.2 percent share of the mainframe disk market; five years later, EMC had 41 percent share and IBM had 35 percent share, and IBM had to buy the StorageTek disk array business to cover the embarrassment. In 1994, when EMC was roughly the size of Pure Storage, it generated $1.37 billion in sales but with $251 million in net income, and it was growing five times faster.

That was a different time, and in the 21st century, companies grow faster than that long ramp between 1979 when EMC was founded as a company to sell furniture and 1990, when the company created the product that would define it. And storage upstarts struggle to make profits as they drive the revenue growth that their venture capital investors demand and, if they are lucky, that their Wall Street investors expect when and if they get to go public before being acquired by a storage incumbent who recognizes the threat they pose and has the cash to acquire them before they walk down to lower Manhattan with some empty wheelbarrows.

In the six fiscal years that we have been tracking Pure Storage, the company has an aggregate of $5.37 billion in sales and net losses of $1.19 billion, and in general, as you can see from the chart and table above, it is closing the gap between what it costs to support its revenues and what it needs to generate to cover those costs.

Some of those losses in the early years were covered in part by the $530.9 million in venture funding the company raised in eight rounds between 2009 and 2014, and the $425 million the company raised in its initial public offering in October 2015. Pure Storage had a market value of $3.1 billion when it went public, and it has a value of $3.9 billion today after a pretty bad week for all stocks thanks to the coronavirus; the stock had been trading higher in the summer of 2018, when the company had a market capitalization of $7.2 billion, and presumably those VCs and other Wall Street investors post IPO cashed out and made some tidy profits from their investments. As the fiscal 2020 year came to a close in January, Pure Storage had $697 million in deferred revenue in the bank and $1.3 billion in cash and investments, so it is in pretty good shape for investing in the future and chasing more customers and more deals even if it incurs losses as it has in the past. But somewhere around $500 million per quarter, it is at break even and even though its revenue guidance for fiscal 2021 is only for 16 percent growth year on year, that would put the company at $1.93 billion and closer to breakeven. If the company keeps costs level, that is. Pure Storage may decide to keep spending on the rise in pace to fuel growth; it may have no choice, in fact, particularly if the economy starts to stumble.

The key to the success of Pure Storage to date is that it innovated heavily atop flash storage and created an appliance experience, akin to what we get from our iPhone and its services for those of us who use Apple products, to make complex storage easier to deploy without sacrificing underlying sophistication. The company's innovative Evergreen upgrade program builds in controller upgrades every three years so companies know they can boost the performance of their flash arrays without having to dump their flash storage modules. The company has also expanded its product lines over the years to address new markets, and continued to do so this week with the launch of the FlashArray//X R3 block and file storage product.

Pure Storage started out back in 2011 with the FlashArray//M block and file storage, and the FlashBlade follow-on announced in 2016 was about creating less expensive flash and more scalable arrays that could tackle the big object storage jobs enterprises were beginning to wrestle with thanks to the mountains of unstructured data that they started piling up in the hopes of converting it into money. The FlashBlade initially could scale to two controllers and 30 storage blades, for a maximum of 3.2 PB of usable capacity employing its fattest storage blades, which weigh in at 52 TB. Today, Pure Storage can scale a single object store across ten enclosures for a total of 150 blades, or five times that amount. As important, the flash object storage has 150 GB/sec of bandwidth across those ten chassis and can deliver an aggregate of 24 million IOPS running the NFS file system.

The beefier FlashArray//X series block and file storage debuted in 2017, and they included new DirectFlash Modules that employed NVM-Express internally and that dropped the average access time of data from 1 millisecond for a FlashArray//M to around 500 microseconds with the FlashArray//X. (The average enterprise-class disk array using SAS spinning rust drives was around 30 milliseconds, by comparison.) With the new and improved FlashArray//X systems announced a year ago, which included a revamped NVM-Express over Fabrics (NVMe-oF) interconnect for the flash called DirectFlash Fabric, that average access time was cut in half again to 250 microseconds. The net effect of the last nine years of innovation is that a database query running on block disk storage from 2011 might take 5 minutes to complete, but on the FlashArray//X R2 systems announced last year, that would drop down to 2.5 seconds, an improvement that comes just because of the shift from disk to flash block storage. The goal is to get that average file access down to 100 microseconds and to get that database query down to 1 second.

In the middle of all of that, at the end of 2018, Pure Storage took its Purity storage software stack and carved it up to run on flash-based instances on the Amazon Web Services public cloud to create a virtual FlashArray of sorts called Cloud Block Store.

The FlashArray family is a system with a pair of redundant controllers based on Xeon processors from Intel that uses a mix of off-the-shelf flash SSDs or custom flash modules manufactured by Pure Storage. The top end FlashArray//X90 R3 will top out at over 3 PB, which is a lot of capacity for a single instance of block and file storage in the enterprise. In fact, Matt Kixmoeller, vice president of strategy at the company, tells The Next Platform that the typical customer who wants to do rack-scale block and file flash, external from the servers but accessible by all the servers in that rack, tends to buy arrays in the range of 500 TB to 1 PB of capacity. A blast radius larger than a few racks makes enterprises uncomfortable, and consequently customers tend to go with the FlashArray//X50 and FlashArray//X70 models, not the top end box. Very roughly speaking, the arrays range from around $100,000 for a loaded up FlashArray//X10 to $1 million or more for a FlashArray//X90. You can upgrade from entry level to the big bad box, and upgrade as the box changes.

With the new FlashArray//X R3 models announced this week, the machines are only being sold with the DirectFlash Modules; off-the-shelf flash SAS SSDs will not be supported in the R3 machines. Pure Storage is also delivering a new 1 TB DirectFlash Module based on 3D TLC NAND.

"DirectFlash has been an awesome architecture for Pure, and we want to bring it across the product line," says Kixmoeller. "We have been measuring performance and reliability and power efficiency of our DirectFlash Modules from the early days and we also believed that it would be more reliable as well. We have seen over the years of shipping DFMs, that DFMs have about half the failure rate of SSDs."

The new FlashArray//X R3 systems are also getting a controller boost, with a move to the new Cascade Lake Xeon refresh processors that Intel announced earlier this week. The more capacious variants of the FlashArrays get heftier processors, so it is not just that there is one chip chosen, but rather each model gets its own processors. But in general, the //X R3 controller has about 25 percent more raw compute oomph than the //X R2 controller from last year. For those who are coming from the FlashArray//M R2 models from 2016 and 2017, the controller performance increase is more like 50 percent, says Kixmoeller. And while the 100 microsecond access time is still a goal, the new //X R3 machines have got that down to 150 microseconds, which is still a 40 percent reduction in latency. By the way, customers that want to add persistent memory into their FlashArrays can do so with Intel Optane 3D XPoint SSD modules, which slide into the enclosures. You don't have to do anything special to access these; it's just another kind of storage. In some cases, thanks to the lower latency of Optane, it can be cheaper, says Kixmoeller, to boost the performance of I/O sensitive applications like databases by plugging in some Optane SSDs rather than upgrading the compute on the controller. The Optane upgrade is also less expensive, which is a plus.

One last thing: Pure Storage seems uninterested in adding compute functions to its arrays to run them natively on the devices in a hyperconverged fashion, just as it has not been interested every time we have asked.

Link:
Sticking With Both The Pure And The Storage Strategy - The Next Platform

Read More..

Strategic Embrace and Global Right-wing Embers – Economic and Political Weekly

United States (US) President Donald Trump's whirlwind visit to India is being projected as a roaring success. Prime Minister Narendra Modi accorded a royal welcome to the US President. Trump reciprocated by describing his trip as "very, very wonderful."

The event was a good concoction of diplomacy and politics. The warmth at the occasion conveyed the growing cordiality between the two governments. As a consequence, India and the US are now comprehensive global strategic partners.

Trump is palpably happy because he has returned home with a defence deal worth $3 billion and an assurance from Modi that Indian armed forces will enhance interoperability with their counterparts in the US. This in effect means that India will procure more military machines from the US that would be connected to US military cloud servers.

India has also been introduced to the Blue Dot network. Joining the network would make India liable to get its infrastructure and development projects certified as per the standards determined by the US International Development Finance Corporation (DFC).

With India displaying its willingness to plug and play into US social, political, economic as well as military networks, Trump has returned home satisfied, reassured that India is an able ally that will act as an effective counterpoise to China.

Modi dearly wanted two things from Trump: One, a strong blow against the Western liberals who sanctioned his travel abroad, and two, relief from the fear that the US may not back him for revoking Article 370. Trump has delivered on both counts, but has not obliged Modi by confronting Pakistan directly.

The Indian elite is more than pleased to have bought the MH-60R naval and AH-64E Apache helicopters and has got a promise from Trump that he would add the Haqqani network and Tehrik-i-Taliban in Pakistan to the list of terror groups. All in all, New Delhi is satisfied that Washington no longer ignores it.

The otherwise well-orchestrated diplomatic event was marred by the eruption of violence and mayhem in north-east Delhi. A communal war raged near the Yamuna while Modi and Trump were embracing each other on the banks of the Sabarmati river and learning the lessons of peace at Gandhi Ashram. Oblivious to the rising death toll, Trump and Modi continued to thump each other's backs. The diplomatic show went on uninterrupted.

The indifference of the two leaders to the killings is rooted in the right-wing political ideology that they subscribe to. The subversion of democratic institutions, espousal of anti-immigration rhetoric, and Islamophobia are the common values that Modi shares with Trump. Both think that liberalism is responsible for weakening their respective civilisational moorings.

While the two politicians could be expected to extract their pound of flesh from the crisis, it is surprising that the American secret service that was positioned in Delhi one month prior to Trump's arrival never raised any alarm about the deteriorating security situation in India's capital. American intelligence completely failed to assess the volatile communal situation in Delhi.

The heady mix of ideology and diplomacy had begun in September last year, when India's ambassador to the US, Harsh Vardhan Shringla (now, foreign secretary), met far-right American ideologue and one-time White House chief strategist Steve Bannon in the Indian embassy. Putting diplomacy aside, Shringla boldly tweeted his picture with Bannon and called him the "legendary ideologue and Dharma warrior." Coincidentally, a documentary directed by Errol Morris on Bannon's world view is titled American Dharma. Bannon, known for flaunting his racist image, is the leading light of the international far-right movement, protecting his fellow travellers' political interests.

Trump has aided both Israeli Prime Minister Benjamin Netanyahu and British Prime Minister Boris Johnson in winning elections; he will also come to Modi's rescue when his electoral fortunes begin to plummet. Modi has already done enough to boost Trump's electoral prospects among Indian Americans. Paradoxically, the champions of nationalism and anti-globalism are in the vanguard of the political globalisation project of the global far-right.

Trump's reticence on Modi's attempts to dismantle the citizenship structure in India is no different from his endorsement of Israel's new nation-state basic law. Trump seems convinced that Modi is his Netanyahu in South Asia. Much like Zionism, Brahminical Hindutva shares a distinct affinity with Trump's supremacism. Trump would be as impressed by Hindutva's treatment of Muslims and the plans to disenfranchise them as he is by the Zionists' treatment of the Palestinians.

Trump has imposed a Middle East peace deal in cahoots with Netanyahu. It may not be farfetched to imagine that he may be contemplating a similar one-sided Kashmir plan in concert with Modi; a proposal that may further destabilise the region.

Managing the worst communal conflagration in Delhi since the 1984 anti-Sikh riots, with the US President physically present in Delhi, is the next high point in Modi's political career. He is certainly feeling emboldened to tell the liberals that they have no America left to complain against him.

Read the original here:
Strategic Embrace and Global Right-wing Embers - Economic and Political Weekly

Read More..

Qualcomm And Partners Unveil World's First Boundless XR Over 5G Experience For Commercial Retail Applications – Forbes

ZeroLight Created The VR Content Used In Qualcomm's Demo.

Qualcomm announced its Snapdragon XR2 5G back in December of last year, the second-gen 5G-enabled iteration of the company's mobile XR platform. For the uninitiated, XR is a term Qualcomm has coined to encompass technologies employed for Extended Reality experiences, an amalgam of virtual reality, augmented reality, and mixed reality. Today the company is taking things a step further and announcing a new XR2 5G-based reference design that leverages this new platform and is designed to accelerate boundless XR over 5G device development for OEMs.

In addition, in conjunction with key partners like ZeroLight and NVIDIA, Qualcomm also unveiled an innovative XR over 5G demo experience that could foreshadow sweeping changes in a number of industries.

Qualcomm Snapdragon XR2 5G Based Headset Designs.

The demo features a wireless Qualcomm Snapdragon XR2 5G-enabled reference design headset with six degrees of freedom (6DoF), to enable a completely standalone, high-quality VR/AR experience. In the demo, users can configure and explore a range of Pagani vehicles, and experiment with various colors and interior trim options. The experience is enabled by the Qualcomm Snapdragon XR2 5G headset, which features advanced reprojection technologies to effectively reduce latency, augmented by high-quality graphics rendering on edge servers that's transmitted over a high-speed 5G network.

And that's where Qualcomm's partners come in. ZeroLight created the VR content that's rendered on the edge-cloud servers. That content is streamed to the Snapdragon XR2 reference design headset using NVIDIA's CloudXR platform and a 5G network connection.

"We are now in the era of 5G, which will transform how the world connects and communicates," said Brian Vogelsang, senior director of product management, Qualcomm Technologies, Inc. "With distributed processing from boundless XR over 5G, XR will see one of the most dramatic 5G transformations. Industrial-level processing power will be available on-demand with ultra-reliable [connections] and low latencies, revolutionizing XR's potential for both the consumer and the enterprise."

Cloud-Rendered VR Over 5G.

We have had untethered, standalone HMDs (Head Mounted Displays) for a while now, but those platforms typically lack the compute and GPU resources to render graphically rich, high-fidelity XR applications in real time. To date, high-performance PCs with powerful GPUs have been required to handle workloads like this, but by leveraging NVIDIA's CloudXR technology and the high-bandwidth / low-latency characteristics of 5G, the rendering workload is shifted to the edge cloud and the content is streamed wirelessly to the standalone XR device. This technology eliminates the need for retail, commercial, or enterprise businesses to invest in and maintain expensive, high-performance XR-Ready PCs, in favor of streaming content from the cloud and the simplified configuration of a standalone headset.
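Conceptually, this split-rendering arrangement boils down to a tight loop: the headset sends a small pose packet uplink, the edge server renders and encodes the frame, and the headset reprojects the decoded frame against its latest pose to hide the remaining network latency. The sketch below illustrates that loop only; all class and method names are hypothetical and are not part of the CloudXR or Snapdragon XR2 APIs.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple     # x, y, z from the headset's 6DoF tracking
    orientation: tuple  # quaternion

class EdgeRenderer:
    """Stands in for a GPU render node at the 5G network edge."""
    def render(self, pose: Pose) -> bytes:
        return b"encoded-frame"  # a real server would render and video-encode here

class Headset:
    """Stands in for a standalone XR headset."""
    def track(self) -> Pose:
        return Pose((0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0))
    def reproject_and_display(self, frame: bytes, latest_pose: Pose) -> None:
        # Late-stage reprojection corrects the frame to the newest pose,
        # masking the milliseconds spent on the network round trip.
        print(f"displaying {len(frame)}-byte frame corrected to {latest_pose.position}")

headset, edge = Headset(), EdgeRenderer()
for _ in range(3):                       # three iterations of the streaming loop
    pose = headset.track()               # uplink: small pose packet over 5G
    frame = edge.render(pose)            # heavy rendering happens off-device
    headset.reproject_and_display(frame, headset.track())  # downlink: video frame
```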

While on the subject of XR headsets, Goertek has leveraged Qualcomm's XR2 5G-based reference design to create a VR form factor that includes an IR emitter for hand tracking and head tracking with simultaneous localization and mapping (SLAM), 3D audio and voice commands, and 2Kx2K per-eye dual panel LCD support. The design also has embedded motion tracking technology from Atraxa, electromagnetic tracking technology from Northern Digital Inc. (NDI), and compatibility with embedded eye tracking from Tobii, which includes Tobii Spotlight Technology that uses foveated rendering to reduce the rendering workload.

The culmination of all of this technology could have far-reaching impact on everything from commercial retail to collaborative design applications and entertainment. Qualcomm wasn't specific in terms of when these solutions would be available from OEM and ODM partners, just that the platform will be made available sometime this year as the 5G roll-out continues to gain momentum.

Continue reading here:
Qualcomm And Partners Unveil World's First Boundless XR Over 5G Experience For Commercial Retail Applications - Forbes

Read More..

GIGABYTE Announces Servers with Cooled-by-ZutaCore Technology – HPCwire

SAN JOSE, Calif. and TAIPEI, Taiwan, Feb. 27, 2020 GIGABYTE Technology Co. Ltd, maker of precision engineered servers, and ZutaCore, a waterless, two-phase, liquid cooling company, have announced their partnership to bring to market their first pre-configured, warrantied Cooled-by-ZutaCore GIGABYTE servers. GIGABYTE's rack servers are ready-to-integrate solutions that combine a high level of performance, energy efficiency and overall reliability for the most demanding server applications. ZutaCore's HyperCool technology provides innovative direct-on-chip evaporative cooling to meet and surpass the challenges posed by server-level hot spots and edge computing requirements, while mitigating the risk of IT failure.

This major server OEM partnership with ZutaCore enables the scalability of HyperCool by leveraging GIGABYTE's established global network. Now, for data center owners and operators in need of world-class servers that easily integrate into trusted rack systems equipped with the latest in liquid cooling, this provides a clear path to creating a full solution. For example, the pre-configured GIGABYTE servers can be easily implemented alongside the recently announced Rittal HPC Cooled-by-ZutaCore solutions, which will be demonstrated at next week's OCP Global Summit at booth B9. With this platform, customers have the unprecedented benefit of trusted IT racks from Rittal and leading servers from GIGABYTE, combined with waterless, two-phase, direct-on-chip, liquid cooling technology from ZutaCore for a fully qualified strategy to address the ever-growing demands of high power processors.

"GIGABYTE is thrilled to be the first major server OEM providing Cooled-by-ZutaCore servers to the data center market," said Daniel Hou, CTO, GIGABYTE. "As we invest more into the advancement of liquid cooling technology, we believe ZutaCore is a partner that shares our goal of providing efficient, high performance cooling solutions that look at the massive problem of cooling in a completely new and extremely effective way. It is becoming clear that liquid cooling is an important technology that will be capable of keeping up with future cooling demands. With ZutaCore's direct-on-chip, waterless, two-phase technology we are going beyond what our hyperscale and colocation customers are asking for so we can evolve with them."

Now customers can use easily integrated servers from GIGABYTE combined with a Rittal HPC Cooled-by-ZutaCore technology platform to transform data center economics and push the boundaries of cooling. The HyperCool solution is a complete hardware system, enhanced by a software-defined-cooling platform that alleviates cooling challenges at the chip, server, rack, POD and data center levels, consistently, in any climate. With ZutaCore, customers can triple computing densities on a fraction of the footprint and halve costs. Furthermore, two-phase liquid cooling is prepared for any evolution in high-powered chips, as there is no limit to what it can cool as processors progress upwards of 1000W.

"We have always looked at the full picture to understand the many pain points that drive decision making in data centers," said Udi Paret, President, ZutaCore. "While we feel we have cracked the code on cooling and improving overall data center economics, we know this is not possible without being closely aligned with leading, global companies who address the bigger picture. By bringing to market Cooled-by-ZutaCore GIGABYTE servers we take the guessing out of the equation. Customers already know and trust GIGABYTE and can now extend that belief in a truly scalable cooling platform from ZutaCore to effectively dissipate heat generated by the most powerful processors on the market."

Representatives from GIGABYTE and ZutaCore will attend the 2020 OCP Global Summit from March 4-5 at the San Jose Convention Center. At the Rittal booth, number B9, visitors can view an interactive demo of their Cooled-by-ZutaCore GIGABYTE servers. Specifically, the TO22-Z61 GIGABYTE servers will be available as pre-configured for Cooled-by-ZutaCore solutions.

You can learn more about the Cooled-by-ZutaCore TO22-Z61 GIGABYTE server at:https://www.gigabyte.com/Racklution-OP/TO22-Z61-rev-10-20

About GIGABYTE

GIGABYTE is an engineer, visionary, and leader in the world of tech that uses its hardware expertise, patented innovations, and industry leadership to create, inspire, and advance. Renowned for over 30 years of award-winning excellence, GIGABYTE is a cornerstone in the HPC community, providing businesses with server and data center expertise to accelerate their success. At the forefront of evolving technology, GIGABYTE is devoted to inventing smart solutions that enable digitalization from edge to cloud, and allow customers to capture, analyze, and transform digital information into economic data that can benefit humanity and Upgrade Your Life. For more information and news on GIGABYTE products, visit the official GIGABYTE website: http://www.gigabyte.com.

About ZutaCore

ZutaCore is a waterless, two-phase change, liquid cooling technology company, unlocking the power of cooling and revolutionizing data centers. The HyperCool technology platform alleviates cooling boundaries at the chip, server, rack, POD and data center levels. The HyperCool solution is a complete hardware system, enhanced by a software-defined-cooling platform, that yields unparalleled heat dissipation at the chip level, triples computing densities on a fraction of the footprint and halves costs. Designed by a veteran team in Israel and enabled by 14 patent-pending innovations, HyperCool is a near plug-and-play system that delivers consistent results, in any climate. ZutaCore's R&D center is in Israel, with its HQ office in California. For more information, visit http://www.zuta-core.com/.

Source: ZutaCore

Excerpt from:
GIGABYTE Announces Servers with Cooled-by-ZutaCore Technology - HPCwire

Read More..

Edge Computing Architecture Is Coming to Your Enterprise – IoT World Today

As data and devices proliferate, enterprises will need edge computing architecture as much as they have come to rely on public clouds.

The Internet of Things (IoT) is poised for explosive growth in the coming decade, and IoT device growth is projected to exceed 75 billion worldwide by 2025, a fivefold leap in just 10 years.

With each connected sensor or device, the deluge of data increases, offering enterprises new insight into boosting operational efficiencies, enhancing performance, improving safety and minimizing unplanned downtime.

But unlocking the true potential of IoT hinges on effectively and efficiently processing the data these billions of devices generate. Enter edge computing architecture.

What Is Edge Computing Architecture?

Research firm Gartner defines edge computing as "part of a distributed computing topology where information processing is located close to the [network's] edge, where things and people produce or consume that information."

While cloud-based data centers are often located miles from the data and devices they support, edge computing hardware and services localize data processing resources close to devices. An edge gateway, for example, can process data from edge IoT devices (video cameras, sensors, drones), and transmit relevant information to the cloud or back to the original edge device. This process reduces latency, spares network bandwidth and makes data insights actionable in real time.
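To make the gateway idea concrete, here is a minimal sketch of that filter-at-the-edge pattern: readings are processed locally and only the events worth acting on are forwarded upstream. The threshold, sensor names and notion of a cloud endpoint are illustrative assumptions, not part of any particular product.

```python
ALERT_THRESHOLD_C = 80.0  # assumed temperature limit for an "overheat" event

def send_to_cloud(event: dict) -> None:
    # A real gateway would POST this to its cloud backend; print() stands in here.
    print(f"forwarding to cloud: {event}")

def process_reading(sensor_id: str, temperature_c: float) -> None:
    """Run the decision locally so only relevant events consume bandwidth."""
    if temperature_c >= ALERT_THRESHOLD_C:
        send_to_cloud({"sensor": sensor_id, "temp_c": temperature_c, "event": "overheat"})
    # Normal readings are aggregated or discarded locally rather than streamed upstream.

for sensor, temp in [("press-01", 42.5), ("press-02", 91.3)]:
    process_reading(sensor, temp)
```

Pushing that decision to the gateway is what reduces latency and spares backhaul bandwidth, as the paragraph above describes.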

"Organizations that have embarked on a digital business journey have realized that a more decentralized approach is required to address digital business infrastructure requirements," said Santhosh Rao, senior research director at Gartner. "As the volume and velocity of data increases, so too does the inefficiency of streaming all this information to a cloud or data center for processing."

Edge solutions typically use distributed architectures that balance workloads between the edge layer, a cloud or edge network, and the enterprise layer. While there are several proposed architectures for edge computing, no accepted standard has emerged, nor is there a consensus on what edge architectures physically look like, according to the German Research Center for Artificial Intelligence, though the organization argues that true edge computing architectures must meet a defined set of requirements.

Why Edge Computing Architecture Matters

Edge computing supports various compelling use cases. For example, autonomous delivery vehicles in traffic must respond instantly to pedestrians in their path; relying on a remote server to slow down or brake is not a viable option. Vehicles exploiting edge technology also can communicate with one another directly, sharing information on accidents, traffic jams, upcoming detours or weather conditions.

Edge computing also powers innovation in enterprise security. Surveillance systems can identify potential threats and alert organizations to unusual activity in real time, reducing incidents such as data theft, industrial sabotage and reputational damage.

Go here to read the rest:
Edge Computing Architecture Is Coming to Your Enterprise - IoT World Today

Read More..

The winners of the 2020 SC Awards Honored in the US | SC Media – SC Magazine

Trust Award
Best Authentication Technology
ForgeRock, ForgeRock Identity Platform

All journeys have a beginning, middle and an end, and it's the job of the ForgeRock Identity Platform to ensure that every authentication journey, from start to finish, remains safe for the client and easy for the user.

The platform's Intelligent Authentication feature delivers the unique ability to visually map user authentication journeys with a drag-and-drop interface and, post-implementation, use analytics to measure the user experience.

This makes it possible to offer a more personalized and frictionless authentication experience across channels and digital touchpoints in a manner that caters to customer or employee needs. Meanwhile, the organizations implementing these journeys are able to consolidate multiple logins into a single, consistent and secure experience; audit all login events; and minimize the risk of DDoS attacks and breaches.

One of the keys to Intelligent Authentication's effectiveness is the use of authentication trees that allow for multiple paths and decision points throughout a journey. These trees are composed of various nodes that define actions taken during authentication and can be combined to create unique user experiences.
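As a rough illustration of the tree-of-nodes idea described above, the sketch below models a journey as nodes whose outcomes select the next node. The node names and outcome labels are hypothetical and do not reflect ForgeRock's actual node catalog or APIs.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    action: callable                              # evaluates the step, returns an outcome label
    outcomes: dict = field(default_factory=dict)  # outcome label -> next Node (missing = journey ends)

def run_journey(node: Node, context: dict) -> None:
    """Walk the tree, letting each node's outcome pick the next branch."""
    while node is not None:
        outcome = node.action(context)
        print(f"{node.name} -> {outcome}")
        node = node.outcomes.get(outcome)

# A toy journey: password check, then a risk-based branch to a one-time-password challenge.
otp = Node("OTP Challenge", lambda ctx: "success" if ctx.get("otp_ok") else "failure")
risk = Node("Risk Check", lambda ctx: "high" if ctx.get("new_device") else "low", {"high": otp})
password = Node("Password", lambda ctx: "true" if ctx.get("password_ok") else "false", {"true": risk})

run_journey(password, {"password_ok": True, "new_device": True, "otp_ok": True})
```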

A recent ForgeRock case study demonstrated how the state of Utah benefited from the Identity Platform by saving up to $15 million over five to six years, due to efficiencies from modernizing its identity and access management infrastructure.

In December 2018, ForgeRock enabled its platform to be deployed on any cloud environment, with preconfigured installation packages for 1 million, 10 million and 100 million identities. Customers reported reducing their implementation costs by 25 percent while doubling ROI. The platform is built for limitless scaling, and it supports DevOps practices using Docker and Kubernetes.

Finalists 2020

Trust Award
Best Business Continuity/Disaster Recovery Solution
Semperis, Semperis AD Forest Recovery

It reportedly took 10 days for the global shipping company Maersk to rebuild its network following a devastating NotPetya disk wiper attack in 2017. It was an impressive comeback, but the company spent a large chunk of those 10 days recovering Microsoft Active Directory, a collection of services that are foundational to saving the rest of the network. Altogether, the attack cost Maersk up to $300 million.

Semperis AD Forest Recovery exists to prevent similar disasters from befalling another organization by automating and expediting the restoration effort with a cyber-first, three-click approach that can save millions that would be otherwise lost to business interruptions caused by such threats as ransomware and wipers.

According to Semperis, traditional AD back-up tools only address recovery from IT operational issues, where the AD is impacted but host servers aren't. And legacy approaches such as bare-metal recovery can cause issues because backups contain boot files, executables and other artifacts where malware can linger and lie in wait to cause secondary infections.

AD Forest Recovery's cyber-first approach, on the other hand, separates AD from the underlying Windows operating system and only restores what's needed for the server's role (e.g. a domain controller, DNS server, DHCP server, etc.), virtually eliminating the risk of re-infection, Semperis asserts.

Additionally, the tool's automation helps organizations avoid human errors while accelerating the restoration process, including rebuilding the global catalog, cleaning up metadata and the DNS namespace, and restructuring the site topology. Such capabilities can help organizations reduce downtime to minutes rather than days or weeks, while restoring AD to the same or different hardware, on-premises or in the cloud.

Finalists 2020

Trust Award
Best Cloud Computing Security Solution
Bitglass, Bitglass CASB

It's easy to see the business benefits of cloud-based applications. But figuring out what cloud security solution is best to secure them all in a consistent manner? That's when things can get a little, well, cloudy.

The Bitglass CASB (Cloud Access Security Broker) solution clears up the fog, enabling enterprises to secure any SaaS apps, IaaS instances, data lakes, on-premises apps and private cloud apps built on any platform. The company's total data protection suite provides end-to-end security and comprehensive visibility over corporate data, while limiting sharing and preventing data leakage.

Bitglass CASB protects data on any device, at any time, and from anywhere in the world without the need for agent-based deployments. IT departments can confidently adopt cloud technologies and BYOD policies, knowing they are filling critical security and compliance gaps.

The solution doubles as a mobile device management solution, an identity and access management solution (replete with single sign-on), and a data loss prevention tool that works across any app or workload. This provides a single pane of glass for enterprise IT departments trying to manage disjointed cloud services and security tools.

Bitglass CASB owes its success to its hybrid architecture, which leverages a combination of proxies and API integrations, including reverse proxy, to ensure complete coverage against all risk of data leakage on any app or device.

The solution delivers real-time, advanced threat protection, capable of detecting zero-day threats at upload, at download and at rest. Other standout features include full-strength encryption, as well as unmanaged app control that renders apps read-only to prevent data leakage.

And because the agentless solution can be rolled out quickly and requires no software installations, customers report large operational cost savings.

Finalists 2020

Trust Award
Best Computer Forensic Solution
OpenText, EnCase Forensic, EnCase Endpoint Investigator and EnCase Mobile Investigator

Step aside, New York Yankees and New England Patriots. Your dynasties pale in comparison to that of the EnCase product line from OpenText, which has now won the SC Award for Best Computer Forensic Solution for 10 years running.

Collectively, EnCase Forensic, EnCase Endpoint Investigator and EnCase Mobile Investigator help law enforcement officers gather digital forensic evidence from endpoints such as computers, mobile devices and IoT devices. Meanwhile, the solutions also provide businesses with the tools to examine HR issues, compliance violations, regulatory inquiries and IP theft.

Despite its decade-long winning streak, OpenText isn't resting on its laurels. The company just recently introduced its OpenText Media Analyzer, a new module that allows investigators to quickly analyze large volumes of images and video collected as evidence.

Digital forensic investigators require court-proven tools that can deliver 360-degree visibility, collect evidence from vast datasets, and improve efficiency and effectiveness by automating the laborious investigation processes into a few simple steps.

EnCase Endpoint Investigator provides seamless, remote access to laptops, desktops and servers, ensuring that all investigation-relevant data is discreetly searched and collected in a forensically sound manner. EnCase Forensic offers broad operating system file parsing capabilities and encryption support, allowing users to quickly complete investigations of any operating system. And EnCase Mobile was introduced in 2017 to augment mobile forensic investigations.

User organizations can make confident decisions related to sensitive internal matters due to EnCase's thoroughness and Endpoint Investigator's unique ability to prove the chain of custody of data if a case faces legal challenges. According to EnCase, it is not unusual for users to exceed a 100 percent ROI after their first few investigations.

Finalists 2020

Excellence Award
Best Customer Service
SecurityScorecard

Nobody scored better in customer service this past year than SecurityScorecard.

The security ratings company assesses various companies' cyber postures and assigns a score that security professionals can review, helping them assess the risk of current or future business partners.

The company's customer service superiority starts with the Customer Success Manager (CSM) that each client is assigned as a strategic advisor. The CSM takes customers through a customized on-boarding process, which includes a live demo of the platform that's specific to each client's use case, and helps ensure that project milestones are met.

Supplementing the CSM is the Customer Support team, which reviews, validates and remediates disputed claims or ratings within 48 hours.

Customers also have a dedicated solutions engineer for technical support, while a customer reliability engineer ensures all remediation requests delivered through the platform are resolved in an appropriate and timely manner.

From a sales perspective, SecurityScorecard operates via a pod structure, with each pod focused on a territory supported by a field sales representative or inside sales representative, who acts as an additional line of communication.

Customers also have access to unlimited web-based help, as well as on-site support (via its Professional Services offering) and reading materials, including platform video tutorials, knowledge base articles, supplemental best practice documentation, eBooks, white papers and FAQs.

The company responds to customer feedback via reviews and social media, and its product management team also holds regular user feedback sessions. Additionally, SecurityScorecard has a Customer Advisory Board for knowledge sharing and strategic feedback.

Finalists 2020

Professional Award
Best Cybersecurity Higher Education Program
Capitol Technology University

Capitol Technology University offers its students a bold guarantee: You will receive a job offer within 90 days of commencement, or the school will provide up to 36 additional undergraduate credits, tuition-free, while the search for employment continues.

There's a reason the private South Laurel, Maryland school is so confident: By the time they finish sophomore year, most undergraduate students at Capitol are already employable. Also, the university maintains close relationships with private-sector companies and the nearby Department of Defense, regularly tailoring its curriculum to suit these organizations' needs.

Capitol offers BS, MS and DSc programs. Undergrads gain technical knowledge and basic skills in their first semester, and in their ensuing years earn certifications such as Security+, CEH, and Access Data Forensics. MS students are trained to lead teams of security professionals for cyber defense operations, research and analysis, and can develop specializations (e.g. cyberlaw, forensics and cryptography). And its doctoral program is designed to produce senior cybersecurity leaders who take on challenging careers in cybersecurity and academia.

Capitol offers an extensive variety of cyber lab projects, competitions and clubs. Lab areas include cyber, digital and mobile forensics, identity management, IoT vulnerability assessments, quantum computing and SOC analyst training.

A designated CAE-CDE institution, Capitol was chosen in 2014 to provide Master's-level courses to newly hired NSA security engineers as part of their development program prior to permanent assignment. Capitol has also been selected by over 20 Cyber Scholarship Program scholars over the past 10 years to earn their degrees in cybersecurity and then return to government service in critical cybersecurity positions.

Finalists 2020

Trust Award: Best Data Loss Prevention (DLP) Solution
Digital Guardian, Digital Guardian Data Protection Platform

Combine DLP with EDR and UEBA and what do you get? Well, if you're into anagrams, you might get BEAR PUDDLE, but if you're into cybersecurity, then you get the Digital Guardian Data Protection Platform.

The solution unifies data loss protection capabilities with endpoint detection and response, as well as user and entity behavior analytics, enabling organizations to detect and gain insights into anomalous activity while stopping insider threats and external attackers from exfiltrating data.

A key component is the Digital Guardian Analytics & Reporting Cloud, which incorporates an innovative function that leverages the same endpoint agent, network sensor and management console to prevent data loss. This approach simplifies management, streamlines information sharing, eases the burden on resources and reduces cost.

Users derive a rich set of analytics from monitoring system, user and data events. Alarms are only triggered for high-fidelity events, and when they do occur, security professionals can respond with drag-and-drop incident management and real-time remediation, blacklisting processes as needed.
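As a rough illustration of that alerting model, the Python sketch below fires an alarm only when behavioral confidence and data sensitivity are both high. The field names, sensitivity map and threshold are assumptions for illustration, not Digital Guardian's actual rules engine.

```python
# Illustrative "high-fidelity only" alerting: score each raw event and
# alarm only when confidence and data sensitivity combine to a high risk.
# Field names and thresholds are assumptions, not Digital Guardian's rules.
from dataclasses import dataclass

@dataclass
class Event:
    user: str
    action: str            # e.g. "usb_copy", "cloud_upload"
    classification: str    # e.g. "public", "internal", "restricted"
    confidence: float      # 0.0-1.0 from behavioral analytics

SENSITIVITY = {"public": 0.1, "internal": 0.5, "restricted": 1.0}
ALERT_THRESHOLD = 0.7      # assumed cut-off for a "high-fidelity" alarm

def should_alert(event: Event) -> bool:
    risk = event.confidence * SENSITIVITY.get(event.classification, 0.0)
    return risk >= ALERT_THRESHOLD

events = [
    Event("alice", "cloud_upload", "internal", 0.9),   # risk 0.45 -> no alarm
    Event("bob", "usb_copy", "restricted", 0.85),      # risk 0.85 -> alarm
]
for e in events:
    if should_alert(e):
        print(f"ALERT: {e.user} {e.action} on {e.classification} data")
```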

The solution also comes with analyst-approved workspaces, which point security professionals to events relevant to identifying suspicious activity. Analysts can drill down to follow an investigation and determine next steps, or to create custom dashboards, reports and workspaces.

DG's Data Protection Platform can be deployed as a software-as-a-service or on-premises solution, or as a managed service.

Digital Guardian made significant improvements to its DLP technology this past year. Fully integrated UEBA capabilities were optimized to supplement data classification and rule-based policies with even more granular insights. And the Security Risk Dashboard now allows users to view everything in a single user interface, while prioritizing the security alerts that correspond most closely to sensitive data.

Finalists 2020

Trust Award: Best Database Security Solution
Imperva, Imperva Data Security

After winning Best Database Security Solution in 2019, Imperva retains the honor this year for its Imperva Data Security product offering.

Imperva Data Security is equipped with machine learning and analytics to quickly detect, classify and quarantine suspicious data activity and protect sensitive information on premises, in the cloud and across hybrid IT environments. It also provides security teams with deep context to quickly investigate and remediate security incidents.

Imperva automates a litany of processes, helping users conserve resources. The solution discovers, identifies and classifies sensitive data; assesses database vulnerabilities; monitors data access and usage; analyzes user behavior and flags actions that contradict normal activity; and detects policy violations in real time, sending alerts or even terminating sessions in critical cases. Imperva can monitor and evaluate billions of database events in near real time.
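To sketch what flagging "actions that contradict normal activity" can look like in practice, the short Python example below compares each query's row count against a per-user baseline and escalates extreme deviations. The baseline model, thresholds and response shown here are illustrative assumptions, not Imperva's actual analytics.

```python
# Minimal sketch of per-user behavioral baselining for database activity.
# Thresholds, the baseline model and the alert text are assumptions.
import statistics

class UserBaseline:
    def __init__(self):
        self.row_counts = []                  # history of rows read per query

    def observe(self, rows: int) -> None:
        self.row_counts.append(rows)

    def is_anomalous(self, rows: int, z_cutoff: float = 4.0) -> bool:
        if len(self.row_counts) < 20:         # not enough history yet
            return False
        mean = statistics.mean(self.row_counts)
        stdev = statistics.pstdev(self.row_counts) or 1.0
        return (rows - mean) / stdev > z_cutoff

def handle_query(user: str, table: str, rows: int, baselines: dict) -> str:
    baseline = baselines.setdefault(user, UserBaseline())
    if baseline.is_anomalous(rows):
        # a real product might alert, quarantine or terminate the session here;
        # the anomalous reading is not folded back into the baseline
        return f"ALERT: {user} read {rows} rows from {table} (abnormal volume)"
    baseline.observe(rows)
    return "ok"
```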

Additionally, Imperva features built-in standardized auditing across heterogeneous enterprise databases, and also allows customers to take monitoring and reporting workloads off their database server so that the server can be optimized for database performance and availability.

A Total Economic Impact study commissioned by Imperva found that organizations can save more than $3 million over three years by switching from a legacy database security solution to Imperva Data Security, due to reduced risk and lowered cost of compliance audits. The study further determined that users can achieve a return on investment in fewer than 16 months.

Imperva Data Security offers flexible and predictable licensing to fit the needs of customers regardless of the number, location or type of devices or services used, no matter where the data lives.

Finalists 2020

Trust Award: Best Deception Technology
Attivo Networks, ThreatDefend Platform

Your eyes are not deceiving you. The ThreatDefend Platform from Attivo Networks stands out among deception solutions due to its authentic-looking decoy environment and high-fidelity alert system that reduces false positives.

For user organizations, this results in a sharp reduction in attacker dwell time across all environments, including the network, endpoints, applications, databases, user networks, data centers, the cloud and even specialty attack surfaces like IoT devices, industrial control systems and point-of-sale solutions, all with a focus on high-value assets.

According to Attivo, the challenge with many detection solutions is the time it takes for them to learn the nuances of an organization's digital environment. But ThreatDefend provides immediate detection value with its ability to identify and flag attack engagement as well as spot activities such as reconnaissance, credential harvesting and lateral movement.

Moreover, the platform enables enterprises to accurately mimic their real-life production environments inside the decoy environment, further enhancing its realism via Active Directory integrations. This tricks attackers into interacting with fake assets, revealing themselves in the process.

ThreatDefend's machine learning-based preparation, deployment and management keep deception fresh and authentic. Its BOTsink attack analysis engine generates accurate alerts, which are substantiated with full TTPs and IOCs, simplifying and accelerating incident response while reducing fatigue caused by false alarms.
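The reason decoy engagement yields such high-fidelity alerts is simple: legitimate users have no business touching a decoy, so any interaction is a strong signal of attacker activity. The Python sketch below illustrates that idea in its most basic form; the decoy names, event format and alert payload are invented for illustration and are not the ThreatDefend implementation.

```python
# Conceptual decoy-engagement monitor: any event that touches a decoy asset
# becomes a high-severity alert. Asset names and fields are hypothetical.
DECOY_ASSETS = {
    "10.10.5.21",                # decoy database server
    "fileshare-finance-bak",     # decoy SMB share
    "svc_backup_admin",          # decoy Active Directory credential
}

def inspect_event(event: dict):
    """Return an alert record if the event touches any decoy asset, else None."""
    touched = DECOY_ASSETS & {event.get("dst_host"),
                              event.get("share"),
                              event.get("account")}
    if not touched:
        return None
    return {
        "severity": "high",
        "decoys": sorted(touched),
        "source": event.get("src_host"),
        "note": "decoy engagement implies reconnaissance or lateral movement",
    }

print(inspect_event({"src_host": "10.10.3.7",
                     "dst_host": "10.10.5.21",
                     "account": "svc_backup_admin"}))
```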

When an intruder is detected, the solution recommends potential attack paths for mitigation before a major attack occurs. And its 30-plus native integrations and ThreatOps repeatable playbooks automate and expedite incident response actions such as blocking, isolation and hunting.

Attivo customers have even started to generate additional value by further leveraging ThreatDefend for digital risk management operations, endpoint detection and response, managed services, incident response and continuous assessment/resiliency testing of IT environments.

Finalists 2020

Trust Award: Best Email Security Solution
Proofpoint, Proofpoint Email Security

Email-based attacks come in many forms: malware, credential phishing and fraud schemes among them. But not every threat carries the same weight, and not every target in an organization is equally desirable to cybercriminals.

Proofpoint Email Security is designed to catch and kill all of these species of threats, while also prioritizing them. The solution identifies an organization's most frequently attacked people and surfaces interesting threats from the noise of everyday malicious activity. Security teams can set adaptive controls based on each user's risk profile, enabling an automated response.

Delivered as a cloud-based solution available across all platforms and devices, Proofpoint Email Security combines inbound email analysis and filtering with outbound data protection, encryption and secure file sharing.

To combat polymorphic malware, weaponized documents and malicious URLs, Proofpoint Email Security uses sandboxing with static and dynamic analysis. The solution also provides email isolation, containing URL clicks and preventing malicious content from reaching corporate devices.

To thwart credential phishing attempts and fraud schemes like business email compromise (BEC), Proofpoint incorporates detailed email analysis and classification with full kill-chain analysis, including dynamic sandboxing. It also signatures the output of the kits that attackers use to generate phishing pages and proactively detects lookalike domains.
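Lookalike-domain detection is often approached by normalizing easily confused characters and then measuring how close a sender's domain is to a protected one. The Python sketch below shows one such approach under stated assumptions; the homoglyph map, protected domains and similarity threshold are hypothetical and do not describe Proofpoint's actual detection logic.

```python
# Rough sketch of lookalike-domain detection: normalize confusable
# characters, then compare the sender domain to protected domains.
# The homoglyph map, domains and threshold are illustrative assumptions.
from difflib import SequenceMatcher

HOMOGLYPHS = str.maketrans({"0": "o", "1": "l", "3": "e", "5": "s", "7": "t"})
PROTECTED_DOMAINS = ["example.com", "example-corp.com"]   # hypothetical

def normalize(domain: str) -> str:
    return domain.lower().translate(HOMOGLYPHS)

def is_lookalike(sender_domain: str, threshold: float = 0.85) -> bool:
    raw = sender_domain.lower()
    if raw in PROTECTED_DOMAINS:
        return False                       # our own domain, not a lookalike
    candidate = normalize(raw)
    for legit in PROTECTED_DOMAINS:
        if SequenceMatcher(None, candidate, normalize(legit)).ratio() >= threshold:
            return True                    # close to a protected domain: suspicious
    return False

print(is_lookalike("examp1e.com"))    # True  -> "1" reads as "l"
print(is_lookalike("unrelated.org"))  # False
```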

The solution's automated response capabilities include removing emails from an end user's inbox if they are determined to be malicious after delivery, such as when a URL is weaponized after the email is sent. Meanwhile, the solution's data loss prevention capabilities protect outbound emails by automatically detecting a wide variety of private information and blocking, quarantining or encrypting this info as appropriate.

Finalists 2020

Excellence Award: Best Emerging Technology
OneTrust, Vendorpedia

A 2018 survey of 1,000 companies found that businesses, on average, share sensitive information with about 583 third-party partners.

Unfortunately, it takes only one of those partners to cause a damaging data breach incident that harms customers, violates regulations and can lead to massive fines.

It's imperative that modern security programs extend their security, privacy and compliance expectations to their vendors. Founded in 2016, OneTrust seeks to cut down on third-party risk with its Vendorpedia product, which security pros can use to assess vendors, access research and reference thousands of pre-completed vendor assessments, as well as monitor vendors in accordance with global laws and frameworks.

Vendorpedia lets users automate the entire vendor lifecycle from onboarding to offboarding. Offerings include dynamic assessments with automated risk identification; risk mitigation workflows and tracking; free vendor-chasing services to offload assessment-related work; a global risk exchange with pre-populated research and assessments on roughly 8,000 vendors; contract management and service-level agreement performance monitoring; data flow visualizations and custom dashboards; and a breach and enforcement tracker for ongoing oversight.
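In spirit, automated risk identification from an assessment can be as simple as comparing a vendor's answers against expected values and weighting the gaps, as in the minimal Python sketch below. The question IDs, weights and risk bands are invented for illustration and are not OneTrust's actual assessment logic.

```python
# Simplified vendor-questionnaire scoring: failing answers become tracked
# risk items with a weighted score. All question IDs, weights and bands
# are hypothetical.
QUESTIONS = {
    "encrypts_data_at_rest":   {"expected": True, "weight": 3},
    "has_incident_response":   {"expected": True, "weight": 2},
    "soc2_report_available":   {"expected": True, "weight": 2},
    "subprocessors_disclosed": {"expected": True, "weight": 1},
}

def assess_vendor(answers: dict) -> dict:
    risks = [q for q, meta in QUESTIONS.items()
             if answers.get(q) != meta["expected"]]
    score = sum(QUESTIONS[q]["weight"] for q in risks)
    level = "high" if score >= 4 else "medium" if score >= 2 else "low"
    return {"risk_items": risks, "risk_score": score, "risk_level": level}

print(assess_vendor({"encrypts_data_at_rest": True,
                     "has_incident_response": False,
                     "soc2_report_available": False,
                     "subprocessors_disclosed": True}))
# -> two tracked risk items, risk_score 4, risk_level "high"
```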

The platform stays current with the latest privacy laws and security requirements thanks to OneTrust's 40-plus in-house, full-time privacy researchers and a globally available network of 500 lawyers representing 300 jurisdictions.

"Vendorpedia has allowed us to be more agile and scale rapidly to optimize our business processes and simplify our assessment, mitigation and monitoring of third-party risks," said Jonathan Slaughter, director of compliance, security and privacy at cloud solutions provider ClearDATA.

OneTrust plans to further advance its platform with future updates that will include expansion of its Global Risk Exchange plus enhancements to its depth of research; breach and enforcement automation workflows to enhance incident response; and an autocomplete assessment tool so vendors can respond to questionnaires faster.

Finalists 2020

Excellence Award: Best Enterprise Security Solution
CyberArk, CyberArk Privileged Access Security Solution

Winning back-to-back titles in any endeavor is not an easy accomplishment, but the CyberArk team achieved this level of success by taking home the Best Enterprise Security Solution award in 2019 and once again in 2020.

What CyberArk delivers with the CyberArk Privileged Access Security Solution is the ability to protect its customers as they necessarily invest in digital transformation technologies, move to the cloud, bring on a DevOps team, and adopt IoT and robotic process automation. While these additions certainly make a company more viable, they also greatly increase its attack surface.

In order to continue delivering the highest level of protection against this ever-increasing attack surface, the company in July 2019 unveiled a suite of privileged access security solution products. This includes CyberArk Alero, a dynamic solution for mitigating risks associated with remote vendors accessing critical systems through CyberArk, and CyberArk Endpoint Privilege Manager, a SaaS-based solution that reduces the risk of unmanaged administrative access on Windows and Mac endpoints.

Read the original post:
The winners of the 2020 SC Awards Honored in the US | SC Media - SC Magazine
