
iCloud vs OneDrive – Creative Bloq

When it comes to storing your files, cloud storage is an increasingly popular alternative to traditional hard drives, especially for creative businesses and freelancers. With a good cloud storage provider, you can be sure your files are safely backed up, and you can access your work wherever you are, on all your devices.

Two of the best cloud storage providers currently available are Apple's iCloud Drive and Microsoft's OneDrive. If you want to know which is better suited to you, read this iCloud vs OneDrive comparison, in which we look at the features, performance, support, and pricing of the two providers.


There are several features shared by iCloud and OneDrive, including device backup and sync, sharing files and folders using custom links and scanning documents using the mobile app.

An advantage of OneDrive is that it's well integrated with Microsoft's Office apps: Word, Excel, PowerPoint and Outlook. These can be bought separately or as part of the Microsoft 365 plan, which includes OneDrive storage.

OneDrive offers excellent collaboration options: you can share files and folders, and multiple users can work on a shared file simultaneously. You can work on these either within the Office apps themselves or through the browser-based OneDrive interface, which means that you can access and edit your files from any computer with an internet connection.

iCloud has similar integration with Apple's iWork apps (Pages, Numbers, and Keynote), which are free to Apple device owners. There are a few similar collaborative features, such as the ability to work simultaneously, but unlike OneDrive, you can't access and edit documents from the browser interface, which limits you to editing on fully synced devices.

OneDrive has a useful version history tool that allows you to revert files to previous versions from the past 30 days. This works with all file types: not only Microsoft documents but also PDFs, photos, videos, and more. With Apple's software, while the iWork apps save version history, there's no equivalent versioning feature built into iCloud Drive.

When it comes to security features, both iCloud and OneDrive offer two-factor authentication, but OneDrive has an extra tool called the Personal Vault. It allows you to set up an area within your drive with additional security measures, such as a fingerprint/face scan or a code sent via email or SMS. This customisation means you can divide your files between folders that you want to keep secure and folders where you're more concerned with quick access.

Overall, OneDrive offers a broader and more impressive set of features. See our OneDrive review.

With both iCloud and OneDrive, you can upload files via desktop app, mobile app, or web browser. Both services make this a streamlined and accessible process, with synchronisation being automated and efficient. iCloud has a maximum file size of 50GB, while OneDrive has an impressive 250GB maximum, though unless you work with seriously large video files, it's unlikely that you'll ever need to upload files over 50GB.

iCloud works particularly smoothly with Apple devices. Your iCloud folder system comes built into the Mac's Finder. Similarly, iCloud is built into the way that iPhones and iPads organise various types of files; e.g., you can easily synchronise an iPhone's camera roll with the Photos app across your other devices. There's little in the way of set-up, because if you have an Apple device, you'll have an iCloud account, and all the software needed to store your files on the cloud is built in.

However, if you don't use Apple devices, you might struggle with iCloud, as its Windows app is difficult to work with, and its web browser interface is basic compared to competitors.

OneDrive, meanwhile, being a Microsoft product, runs smoothly on Windows PCs. Its folder system is built into Windows 10's File Explorer, with no fuss in getting started with it. It has well-designed apps not only for a broader range of smartphone and tablet models but also for Mac computers. There's a bit of set-up required, but once this is done, OneDrive operates and synchronises smoothly across various devices.

In other words, if you primarily use Apple devices, you'll find that iCloud runs well, but if not, then OneDrive is the better option in terms of efficient performance.

If you encounter problems using either service, then you can contact tech support for help. Apple's support for iCloud is very efficient. All users, even those on the free plan, can talk to a technician via phone. When we checked, the wait time was just two minutes, though this will vary depending on where in the world you are.

With OneDrive, phone support is only available for business users, but all users can contact support through web chat. However, when we tested this chat, we found the service slow and the responses unhelpful.

So, in terms of being able to contact support, Apple wins. That said, both providers have online FAQ systems (Microsoft's seems more comprehensive than Apple's) and active community forums, so you can often find solutions to your tech problems without the need to contact support.

With both iCloud and OneDrive, all users can access 5GB of storage space for free. While this is a useful way to test out the services, it's not enough storage for most users' needs, especially those of creative professionals. With both providers, you'll need to pay a subscription fee to get more storage.

With iCloud, the options are simple: 50GB of space for $0.99/month, 200GB for $2.99/month, or 2TB for $9.99/month.

Microsoft's pricing options are more complex. The standalone OneDrive plan gives you 100GB for $1.99/month, or you can subscribe to Microsoft 365, which includes 1TB of space and the four Office apps, for either $69.99/year or $6.99/month. There's also a Family plan, which provides 1TB each for six users and Microsoft Office for $99.99/year or $9.99/month. For larger teams, there are business plans that start at $5/user/month.

So, when it comes to a simple comparison of how much space you get versus how much you pay for it, the two services are closely matched, with iCloud perhaps being marginally preferable: the cheap 50GB option is ideal for those who only need a small amount of storage. But the bundling of the Office apps makes Microsoft's plans good value.
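To make that comparison concrete, here is a quick sketch of what each advertised tier costs per gigabyte per month, using the prices quoted in this article (1TB is treated as 1,000GB for simplicity, and the value of the bundled Office apps is ignored):

```python
# Advertised tiers: plan name -> (gigabytes, USD per month).
# Annual prices are converted to their monthly equivalents where quoted.
icloud = {"50GB": (50, 0.99), "200GB": (200, 2.99), "2TB": (2000, 9.99)}
onedrive = {"100GB": (100, 1.99), "Microsoft 365 1TB": (1000, 6.99)}

def cost_per_gb(plans):
    """Monthly cost per gigabyte for each plan, in dollars."""
    return {name: price / gb for name, (gb, price) in plans.items()}

for service, plans in (("iCloud", icloud), ("OneDrive", onedrive)):
    for plan, cost in cost_per_gb(plans).items():
        print(f"{service} {plan}: ${cost:.4f}/GB/month")
```

On these numbers the big tiers are the cheapest per gigabyte, with iCloud's 2TB plan narrowly beating the Microsoft 365 1TB plan, while the two entry-level paid tiers cost almost exactly the same per gigabyte.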

It's for good reason that iCloud and OneDrive are both popular: they're well-priced providers that make it easy to back up your files and access them from various locations and devices.

Overall, our preferred choice is OneDrive. It has a deeper range of collaboration features and a much better web browser interface, and it functions more smoothly across a broader range of devices. Its a good choice for small creative businesses or teams of freelancers where collaboration is essential.

iCloud is only a worthwhile choice if you primarily use Apple devices, as the integration with Apple's system is valuable but comes at the expense of functionality elsewhere. That said, its cheap 50GB plan is ideal for iPhone and Mac owners who want extra personal storage space.


Your occasional storage digest with TrendForce NAND figures and more than 25 other announcements – Blocks and Files

At opposite ends of the storage speed spectrum, IBM is withdrawing LTO6 tape products and Phison has a fast PCIe Gen 4 SSD controller. Meanwhile, NAND revenues are growing steadily, according to TrendForce analysts. Cohesity has hired an EMEA boss, and Lightbits Labs has a new VP for marketing.

Research house TrendForce reported that NAND revenues in 2021's first quarter rose 5.1 per cent quarter-on-quarter. Bit shipments rose by 11 per cent while the overall ASP dropped by five per cent; the bit-shipment rise more than offset the ASP fall. More bits were needed for consumer devices and smartphones. Data centre demand was flat.
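As a rough sanity check on those TrendForce figures, the revenue movement should be approximately the product of bit-shipment growth and the ASP change:

```python
# Revenue change ~= bit shipment growth x ASP change, per the digest above.
bit_growth = 1.11   # bits up 11 per cent Q/Q
asp_change = 0.95   # ASP down ~5 per cent Q/Q

revenue_factor = bit_growth * asp_change  # a little over 1.05
print(f"Implied Q/Q revenue change: {(revenue_factor - 1) * 100:.1f} per cent")
```

This lands close to the reported 5.1 per cent rise; the small gap comes from rounding in the two input percentages.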

Among the suppliers:

Removing 3D XPoint, which TrendForce included in the Intel NAND number, Intel's pure NAND revenues would have risen 9.7 per cent from the prior quarter. Combined Kioxia/WD revenues were $4.7bn, a tad under Samsung's $4.96bn.

Actian announced its Ingres NeXT initiative, with a new version of Actian's OpenROAD 4GL application development environment and support for Ingres relational database management software as a fully managed service in the cloud.

ATTO has released ATTO 360 Tuning, Monitoring and Analytics Software version 3. This can improve the performance of ATTO FastFrame Ethernet SmartNICs by up to 30 per cent. V3 adds support for Remote Direct Memory Access (RDMA), three new profiles for NVMe-over-Fabrics, iSER, and NFS over RDMA, and a new Dell OneFS profile for NFS over RDMA.

AWS announced the general availability of Amazon Elastic Container Service (ECS) Anywhere, which enables customers to run and manage container-based applications on-premises using the same APIs, cluster management, workload scheduling, monitoring, and deployment pipelines they use with Amazon ECS in AWS.

AWS also announced the general availability of Amazon Redshift ML, making it possible for data analysts, database developers, business analysts, and data scientists to create, train, and deploy machine-learning models with familiar SQL commands instead of having to use a separate interface and a different programming language.

Backblaze announced a spring release of its B2 Cloud Storage platform with S3-compatible object lock for immutability, object-lock legal hold, server-side encryption, and cross-origin resource sharing. Later this year, the B2 Cloud Storage platform is set to add a Partner API, giving software partners the ability to provision and manage Backblaze B2's scalable storage natively.

Enterprise file collaborator Egnyte announced new integrations with Google Workspace and Dropbox to help companies secure and govern their sensitive cloud content.

BigID, which supplies a data intelligence and management platform, announced a partnership with Databricks, a data and AI company, to integrate their products. BigID enables customers to find and identify their sensitive and critical data in Delta Lake on Databricks and take action on it for security, privacy, and governance.

Research house DCIG has announced the availability of its 2021-22 DCIG TOP 5 Enterprise Multi-site File Collaboration Solutions report. The report provides guidance for organisations considering SDS-based file storage solutions that enable effective file collaboration.

Druva, which supplies cloud data protection and management as a service, has appointed Redington as its first national distributor in India.

Singapore-based Flexxon has launched an SSD that's said to have AI-powered cybersecurity features built in. There's a video here.

In-memory computing supplier GridGain has released a software-as-a-service version of the GridGain Control Center. This is a cloud-native, subscription-based solution for operations teams to manage their production environments, and enables customers to manage compute clusters that are deployed across single or multi-cloud services, on-premises, and hybrid environments.

HPE has announced that its forthcoming Kubernetes CSI driver will support its newly launched Alletra 6000 and 9000 arrays.

Effective August 30, 2021, IBM will withdraw from marketing its LTO6 tape drive and TS2900 Tape Library with LTO6 HH SAS drive and rack-mount kit products.

Kingston Technology has announced high-performance, enthusiast, and gaming DDR3 and DDR4 memory modules under the Fury brand.

NAND and DRAM supplier Micron now expects its fiscal Q3 2021 revenue to be at or above the high end of the guided range, which was $6.9bn-$7.3bn. The upside is largely driven by an improved pricing environment across both the NAND and DRAM markets.

GPU supplier Nvidia has announced a string of server OEMs for its BlueField-2 DPUs, aka SmartNICs: ASUS, Dell Technologies, GIGABYTE, QCT, and Supermicro.

UK-based object-storage supplier Object Matrix has delivered an additional 600TB node for BT TV's existing MatrixStore On-Prem object-storage system. This has enabled BT TV to migrate content from six MatrixStore nodes being decommissioned. The changeover increases the capacity of the existing system, and saves power and rack space by replacing six legacy nodes with a single new one.

Panzura announced that DCIG has added Panzura CloudFS global file system to its list of top 5 enterprise multi-site file collaboration solutions. Get the DCIG report here.

Amer Sports, an international sporting goods company, is using Percona to support the MySQL databases at the heart of its applications and services for customers. Percona's open-source database team will implement best practices and support services across Amer Sports' MySQL database deployments, with the goal of reducing operational costs, improving performance, and enhancing security.

Flash controller supplier Phison Electronics Corp is shipping its PS5018-E18 PCIe Gen 4 controllers to manufacturer partners with 176-layer replacement gate NAND. It delivers up to 7,400 MB/sec sequential reads and 7,000 MB/sec sequential writes.

HCI supplier Scale Computing announced sales wins at several US educational institutions: Auburn University, Hood College, Virginia Tech, and Community High School District 218 in Illinois.

Virtual SAN provider StorMagic announced that its SvSAN and Zerto's DR and cloud data management and protection software have been validated with HPE ProLiant servers. The three products can be delivered through the HPE Complete Program, so customers can protect edge-to-edge, edge-to-core, or edge-to-cloud workloads.

Chinese supplier TerraMaster has announced its two-bay F2-210 NAS system with up to 36TB capacity for under $150, and its four-bay F4-210 with up to 72TB capacity for under $300. Both are available on Amazon.

Varada, the data lake query-acceleration startup, announced a new capability designed to support text analytics workloads and run faster on exabytes of string-based data. Varada's technology is integrated with Apache Lucene and works on a customer's data lake, serving SQL data consumers out of the box.

WekaIO, the fast filesystem startup, and Arch Platform Technologies have announced the integration of Weka cloud deployments with Arch's studio-in-the-cloud platform. The two say artists and studios can on-ramp to cloud-based creative production with the integrated pair of products and get to market quicker.

Data manager and protector Cohesity has named Richard Gadd as VP and GM of EMEA sales with immediate effect. He comes from a similar position at Hitachi Vantara.

NVMe-over-TCP/IP-based software-defined storage supplier Lightbits Labs has appointed Carol Platz as its VP of Global Marketing. She comes from WekaIO.


NetApp CEO Kurian: ‘We Are Reaching More Customers Than Ever Before With Our Public Cloud Business’ – CRN

NetApp And A Cloud Future

NetApp on Wednesday released the financials for its fiscal year 2021, which turned out to be a very good year on which to build a strong fiscal year 2022. Not only did NetApp see product growth in the fourth quarter for the first time in several quarters, it also reported growth in its cloud business.

During NetApp's quarterly analyst conference call, CEO George Kurian was very clear that the cloud is the key to NetApp's future. "We plan to accelerate our public cloud services and continue to grow our hybrid cloud business," he said. "I am excited about the year ahead and confident in our ability to deliver top-line growth by supporting our customers on their cloud and digital transformation journeys."

Just how important is the cloud to NetApp? While better known as the largest independent storage vendor, NetApp has become the pioneer in technology to allow data to be seamlessly managed and migrated across on-premises, private cloud, hybrid cloud and public cloud environments. Kurian, during the conference call, made it clear that NetApp is on the way to seeing massive growth in cloud annual recurring revenue and is on a roll in terms of growing its customer base in the cloud.

Here are five key things Kurian said on the quarterly conference call.

The Cloud Is NetApp's Future

We intend to leverage our deep infrastructure expertise and our credibility with the cloud providers to expand our multi-cloud infrastructure management services. My confidence in our ability to reach our goal of $1 billion in public cloud ARR [annual recurring revenue] in fiscal year 2025 is further enhanced by the strength and uniqueness of our cloud services position.

Our focused execution last year has set us up well for fiscal year 2022. We have returned to growth, we are gaining share in key storage markets, and our public cloud services are at a scale where they are positively impacting total company billings and revenue growth. Our momentum underscores our value to customers in a hybrid, multi-cloud world. In fiscal year 2022, we plan to accelerate our public cloud services and continue to grow our hybrid cloud business.

NetApp Is Reaching More Cloud Customers Than Ever

We are reaching more customers than ever before with our public cloud business. Over the course of fiscal year 2021, we added approximately 1,500 new-to-NetApp customers with public cloud services, and grew our total cloud customer count by 137 percent from Q4 fiscal year 2020. In addition to adding new cloud customers, existing cloud customers are expanding their spend with us. Our dollar-based net revenue retention rate increased to 252 percent in the fourth quarter.
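For context, a dollar-based net revenue retention rate is generally computed by comparing what an existing customer cohort spends now against what the same cohort spent a year earlier. The sketch below is illustrative (the cohort figures are hypothetical), not NetApp's disclosed methodology:

```python
def net_revenue_retention(cohort_arr_start, cohort_arr_end):
    """Dollar-based NRR as a percentage: a cohort's current ARR over its
    year-ago ARR. Expansion, contraction, and churn within the cohort are
    all captured by the ending figure."""
    return cohort_arr_end / cohort_arr_start * 100

# Hypothetical cohort: $100M of ARR a year ago that has grown to $252M today.
print(net_revenue_retention(100.0, 252.0))
```

A 252 per cent rate therefore means existing cloud customers are spending roughly 2.5 times what they did a year earlier.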

Cloud Annual Recurring Revenue Boomed In 2021

Cloud Volumes, Cloud Insights and Spot all performed well in the quarter, driving our public cloud services ARR to $301 million exiting fiscal year 2021, an increase of 171 percent year over year. ... Cloud Volumes, Cloud Insights and Spot are now the primary growth engines of our public cloud services business. They are well established for enterprise applications, and we are taking each of them into the cloud-native world, further expanding our market opportunity.
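Those two figures pin down the year-ago base: $301 million of ARR after 171 per cent growth implies roughly $111 million exiting fiscal 2020. This is a back-of-the-envelope inference from the quoted numbers, not a disclosed figure:

```python
arr_now = 301.0   # $M, public cloud services ARR exiting fiscal 2021
growth = 1.71     # +171 per cent year over year

arr_prior = arr_now / (1 + growth)  # implied year-ago ARR
print(f"Implied fiscal 2020 exit ARR: ~${arr_prior:.0f}M")
```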

Cloud Annual Recurring Revenue Poised For Huge Growth

We're expanding the number of sellers and revenue-generating teams facing customers. We've seen really good success with Microsoft in terms of their route to market. We're working with the other hyperscalers to also train and expand the range of ways we take our products to market with them. And we're building our customer service and customer success teams.

I feel even more confident that we have the range of capabilities to achieve the $1 billion target in ARR that we indicated. And the customer use cases that we are deploying on our public cloud portfolios, these are run-the-business applications, mission-critical, highly differentiated, business value-creating applications. So, I feel really, really good about where we are.

Storage Product Sales Outlook

In Q4, product revenue grew 6 percent, and our all-flash array business grew 11 percent year over year. Based on our growth, I am confident that we have gained share in the storage and all-flash markets. We advanced our hybrid cloud portfolio with the introduction of ONTAP 9.9 and Astra. This innovation will support continued product revenue growth and share gains through fiscal year 2022. Additionally, we continue to make good progress with Keystone, with many new wins, including our largest-ever Keystone deal.

NetApp Astra offers application-aware data management that protects, moves and manages data-rich Kubernetes workloads. Spot Ocean automates cloud infrastructure for containers, automatically scaling compute resources to maximize utilization and availability while minimizing costs. Cloud Insights for Kubernetes provides simplified infrastructure monitoring to quickly identify performance issues and resource constraints. Together, our public cloud services give our customers, and especially their CloudOps and DevOps teams, a robust suite of multi-cloud infrastructure management services.


The future of storage resides at the intersection of the edge and cloud – ITWeb

Ossama El Samadoni, Sr Sales Director, MERAT.

Information is the lifeblood of companies and in this new data era, the combination of massive amounts of data and unparalleled technology innovation has given businesses of all sizes the opportunity to become disruptive, digital powerhouses. Data has become more diverse than ever before and is now being created, processed and stored everywhere, from edge to cloud.

Tim Berners-Lee, the inventor of the World Wide Web, said: "Data is a precious thing and will last longer than the systems themselves." Today, data is increasingly generated and consumed across a geographically distributed and mobile infrastructure of people, processes and tools. As every aspect of our lives and businesses goes online, organisations are rapidly transforming how and where business happens. The edge is increasingly where data is being created, with Gartner estimating that 75% of data will be generated and processed at the edge.

Edge represents an incredible opportunity for organisations as they digitise business processes and extract value from all the data they collect from their operations. It also represents new challenges, as organisations must find a way to gain real-time insights across a massively distributed set of devices, with large data volumes, in a cost-effective manner. Attempting to manage this growth and data with traditional storage technology, where data is moved to and processed in centralised data centres, comes with its own challenges.

Most organisations have found that no single infrastructure can address all their data requirements, so they utilise different architectures, creating silos of IT resources that are managed and consumed independently. At the same time, IT is under increasing pressure to deliver greater levels of simplicity and agility on the business side.

In order to manage data generated by edge environments more efficiently, enterprise-grade, on-premises storage must now provide the same operational flexibility as cloud, becoming ever more adaptable, automated and easier to integrate with existing management frameworks. By landing storage and cloud capabilities out at the edge, organisations have more latitude to decide where a specific workload is best processed.

But this brings us to the real question. What are some of the enterprise data storage practices that would help unlock the real value of data capital at the edge?

Firstly, the digital leaders of the future can't be built on the technology approaches of the past; IT needs to evolve to provide a technology foundation that accelerates digital innovation. Today's storage infrastructure technology is designed to make hybrid cloud environments, and data produced at the edge, easier to deploy and manage. These purpose-built suites of solutions have evolved to fill an essential role in the data centre, providing ever-expanding levels of performance, capacity and resiliency for mission-critical workloads. Modern storage architecture is helping businesses succeed by not only supporting current business needs, but also allowing IT infrastructure to scale as business dynamics change.

Therefore, organisations must refresh their storage infrastructure on a regular basis and keep up with the increased data demands by eliminating ageing infrastructure that is more susceptible to failures that cause outages/downtime. Modern storage infrastructure also frequently includes advanced data protection features that help ensure the on-premises data remains safe and secure. Data encryption adds an additional layer of protection to this, improving data security and mitigating the potential for data loss.

Next, against the backdrop of post-pandemic recovery and the volume of data being generated at the edge, one thing is clear: the lines between storage and cloud are blurring as organisations demand agility and simplicity from business-critical IT infrastructure to respond to changing market dynamics. According to a recent report from Coherent Market Insights, the GCC and Levant's data storage market is set to reach a record high of $8.5 billion by 2027, nearly tripling from $2.9 billion in 2019. In order to manage and process high volumes of data, enterprises are transitioning from multi-cloud architectures to hybrid cloud storage that leverages the power of cloud for their edge environments. This is because hybrid cloud data storage technology provides the flexibility and resiliency that enterprises need for evolving workloads.
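Taken at face value, the Coherent Market Insights figures imply a compound annual growth rate of roughly 14 per cent over the 2019-2027 period, consistent with the "nearly tripling" characterisation:

```python
# Market size in $bn at the start and end of the forecast window.
start_bn, end_bn = 2.9, 8.5
years = 2027 - 2019

cagr = (end_bn / start_bn) ** (1 / years) - 1
print(f"Implied CAGR: {cagr * 100:.1f} per cent over {years} years")
```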

To conclude

The premise is simple. The edge is a key element of the future of computing, and data storage and cloud are intrinsically linked to that. To effectively ride the data waves, organisations must modernise their data centres and embrace intelligent and adaptable enterprise cloud storage infrastructure. The organisations that do will be prepared to manage the deluge of data that's already here.


Donald Trump, Assault Weapons, Prom: Your Weekend Briefing – The New York Times

(Want to get this briefing by email? Here's the sign-up.)

Here are the week's top stories, and a look ahead.

1. Donald Trump began his next act Saturday night at the North Carolina Republican convention.

In a 90-minute speech, Mr. Trump ran through a litany of conservative culture war issues and ended with an extended frontal attack on voting and American democracy in which he endorsed a long list of Republican voter suppression proposals.

The former president is both a diminished figure and an oversized presence, our White House correspondents write. He shut down his blog after hearing from friends that the site was getting little traffic and making him look small and irrelevant, according to a person familiar with his thinking. But he remains the front-runner for the Republican Party's 2024 presidential nomination in every poll, and believes he could be reinstated to the White House in August.

"If you're a one-term president, you usually go quietly into the night," said a presidential historian. "He sees himself as leading the revolution, and he's doing it from the back of a golf cart."

Newly uncovered emails provided to Congress show that during Trump's final weeks in office, Mark Meadows, his chief of staff, repeatedly pushed the Justice Department to investigate unfounded conspiracy theories about the 2020 election.

2. The U.S. appears to be trying to close the curtain on the pandemic. Across the ocean, in Britain and the European Union, it is quite a different story. Above, Parisians getting coffee last month after the country's lockdown measures had been eased.

America has essentially lifted all rules for people who are vaccinated, while parts of Europe are maintaining limits on gatherings, reimposing curbs on travel and weighing local lockdowns even as infection levels plunge. The split is particularly stark in Britain, which is facing the spread of Delta, a new variant first detected in India.

Thailand is one of many Southeast Asian countries suffering a late-breaking wave. Two nightclubs are at the epicenter of its biggest and deadliest surge.

3. Calling it a "failed experiment," a federal judge overturned California's 32-year assault weapons ban.

The judge, Roger T. Benitez, wrote in his ruling that the firearms banned under the state's law were "fairly ordinary, popular, modern rifles," describing the AR-15 assault rifle as a perfect combination of home defense weapon and homeland defense equipment. Above, AR-15-style rifles at a gun store in Oceanside, Calif., in April.

The judge granted a 30-day stay to allow the state's attorney general to appeal the decision, which is likely to join a number of other closely watched gun rights cases on appeal. The judge's vividly worded opinion, comparing military-style firearms to Swiss Army knives, underscored the growing boldness of gun rights advocates.

4. For the first time in a generation, workers are gaining the upper hand.

Companies are becoming more willing to pay a little more to train workers, to take chances on people without traditional qualifications and to show greater flexibility in where and how people work, our senior economics correspondent writes. Above, Adquena Faine, a former ride-hailing driver who is now building a career as a cloud storage engineer.

The share of job postings that say "no experience necessary" is up two-thirds over 2019 levels, according to one firm. The shift builds on changes already underway in the tight labor market before the pandemic, when the unemployment rate was 4 percent or lower for two straight years.

But polls suggest Americans remain divided on whether President Biden's policies are helping or hurting the recovery. Progressive activists contend that the enhanced pandemic unemployment insurance, which Republicans and many employers decry, is giving workers a bit more leverage. The White House is emphasizing that the benefit will expire in September, as planned.

5. A severe drought of historic proportions has much of the Western half of the U.S. in its grip.

Nearly all of California, Oregon, Nevada, Arizona, New Mexico, Utah and North Dakota are in drought, and in large areas of those states conditions are severe or exceptional. Above, water-intensive almond trees are removed from an orchard in Snelling, Calif.

Wildfires of a size normally seen in summers have already occurred in California, Arizona and New Mexico. Experts are concerned that this summer's wildfires will be severe and widespread. Reservoirs in California hold about half as much water as usual for this time of year.

On the other side of the Pacific, the annual summer monsoon in South Asia begins this month. A million years of data suggests global warming is likely to make monsoons worse.

6. President Biden will head to England this week for a Group of 7 summit and will later hold meetings with European leaders.

Ahead of the summit, finance ministers agreed to back a new global minimum tax rate of at least 15 percent that companies would have to pay regardless of where they are based. Officials said the agreement could reshape global commerce and solidify public finances after more than a year of combating the pandemic.

As E.U. leaders prepare to welcome Biden, the simple fact that he regards Europe as an ally and NATO as vital is almost a revelation. Yet the Trump administration has left scars that some experts say will not soon heal, and there are serious issues to discuss: the withdrawal from Afghanistan, cyberwarfare, trade disputes, vaccines.

Meanwhile, Vice President Kamala Harris is embarking on her first international trip, to Guatemala and Mexico, to address migration to the U.S. by seeking to improve conditions in those countries.

7. At the U.S. Women's Open, a 17-year-old amateur put herself in contention.

Nearly eliminated in qualifying, Megha Ganne, above, a Stanford-bound high-school junior from New Jersey, rose to the top of the leaderboard after two rounds. One of her most famous competitors, Michelle Wie West, 31, wouldn't be in the tournament if crude comments from Rudy Giuliani hadn't inspired her comeback.

8. Stonehenge, Angkor Wat in Cambodia, above, and the Taj Mahal: Demand for once-in-a-lifetime travel is high.

Last year, travelers had to put aside their bucket-list dreams of trekking to Mount Everest base camp or visiting the wonders of the ancient world. Now, as vaccines are available and countries open to visitors, tour companies are reporting a resurgence in interest for summer and fall trips from those hoping to get to these iconic sites.

If you're more of a lounging type, these aerial photographs of pools around the world are soothing, and so are these ideas for do-it-yourself rain gardens.

9. As spring turns to summer, hope blooms at prom.

There were custom-made masks to match outfits. There were silent discos to encourage social distancing. There was dancing, outdoors, on the football field. And there was joy, as American high school rites of passage proved durable, flexible, pandemic-proof. We went to four California high schools to report on Covid-influenced proms.

For more big looks and glam dresses, meet Symone, the drag-queen persona of Reggie Gavin, winner of this season's RuPaul's Drag Race.

10. And finally, relax and read.

The mystery of the $113 million deli. The life and death of your jeans. Kate Winslet, above, without a filter. Find these and more in The Weekender.

Read the original:
Donald Trump, Assault Weapons, Prom: Your Weekend Briefing - The New York Times


HPE lifts earnings outlook: Storage finally returns to growth as overall revenues lift – Blocks and Files

HPE has reported earnings showing a Y/Y increase in overall revenues, as the edge, compute, HPC and storage segments all reported business increases. For the storage segment, it was a particularly welcome reprieve after six consecutive quarters of decline.

The firm was confident enough to issue a third dividend for this fiscal year.

In the first 2021 quarter, ended April 30, HPE reported $6.7bn in revenues, up 11 per cent Y/Y, with a profit of $259m, just 3.9 per cent of revenues. It was HPE's third consecutive quarter of Y/Y top-line growth. In Dell's equivalent quarter, its $938m profit was 9.4 per cent of its revenues, indicating that HPE has comparatively higher costs.
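The margin comparison can be checked with quick arithmetic, a sketch using only the figures reported above (the helper name is ours):

```python
# Quick check of the profit margin cited above (figures in $m, as reported).
hpe_revenue, hpe_profit = 6700, 259

def margin(profit, revenue):
    """Net profit as a percentage of revenue, rounded to one decimal place."""
    return round(100 * profit / revenue, 1)

hpe_margin = margin(hpe_profit, hpe_revenue)  # matches the 3.9 per cent in the text
```

Set against Dell's stated 9.4 per cent, the gap is what drives the "comparatively higher costs" observation.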

Antonio Neri, HPE's president and CEO, said of the results: "We are strengthening our core compute and storage businesses, doubling down in our growth Intelligent Edge and HPC businesses and accelerating our pivot to as-a-service, while also advancing our cloud-first innovation agenda to become the edge-to-cloud platform as-a-service choice for our customers and partners."

Financial summary:

Business segment results:

HPE said the Aruba Edge Services platform supports well over 100,000 customers, with 150 new customers added every day.

Within the storage segment, HPE said there was notable strength in software-defined storage, including Nimble, up 17 per cent Y/Y when adjusted for currency, with strong momentum in dHCI, which grew triple digits. All-flash arrays grew 20 per cent from the prior-year period, led by Primera, the 3PAR replacement array, which was also up triple digits from the prior-year period.

Compared to the latest storage-related results from Pure Storage, Nutanix, Dell Technologies and Snowflake, HPE's storage revenues show a downward trend since the start of its fiscal 2019, with the latest quarter showing growth after six consecutive quarters of decline. It must be very welcome.

HPE's switchover to as-a-service/GreenLake product supply delivered a 30 per cent Y/Y increase in annual recurring revenue (ARR) to $678m.

The outlook for the next quarter is for a low single-digit revenue increase Y/Y; Wells Fargo analyst Aaron Rakers estimated this to mean $6.921bn, a 1.5 per cent Y/Y increase. Neri said in the earnings call: "We expect to continue to see improvement in customer IT spending throughout 2021."

With HPE storage returning to growth, based on strong growth in the Nimble and Primera product lines, HPE must be hoping that the products' announced successors, the Alletra 6000 (Nimble) and 9000 (Primera), will continue to build on this base. Both have an even stronger as-a-service element than before, aiming for the cloud-style hybrid environment the firm believes customers want. Neri said Alletra was propelling HPE's storage business "into our cloud-native software-defined data services business".

The focus is moving away from the storage hardware platforms to subscribed data services delivered across the on-premises and multiple public cloud environments.

We would expect HPE to pivot its on-premises ProLiant and Apollo server base to a consumption model over time, with a lifecycle management focus. Neri said we should stay tuned for more announcements here shortly.



Improve Your Insight by Mixing Qualitative Research With Data Science – Built In

As a data science practitioner, think about all the data you have at your disposal. Then think about the latest data science methods you know about. Now, try answering these two questions: Why do your users adopt or reject your products? And how do they use these products?

Both may sound simple, but they're trick questions. You can much more easily figure out which features of the products are being adopted, when and where they're adopted, and perhaps even by whom.

Answering why and how questions with analytics, however, is much more complicated because you need to better understand your users, the contexts in which they operate, their considerations, and their motivations. Though answers to these questions are critical, data science teams cannot always rely only on assumptions, models, and numbers to understand the choices users make and the decisions that lead them to use a product in the ways they do.

The purpose of this little thought exercise is to suggest that, while analytics has many advantages, it also has its limits. Recognizing these limits will help you broaden your insight and become more innovative.

But to achieve this, you'll need to engage in exploration where parameters and variables are less known, assumptions are mostly absent, and curiosity abounds. You'll need to think in a way that diverges from how you were trained, and you'll need to use data and research methods fundamentally different from those you usually rely on.

In short, think about incorporating qualitative research into your analytics process.

Typical analyses involve getting data from users' devices, their logged activities, or through user experiments such as A/B testing. But to answer why and how, you'll need to learn the perspectives, meanings, and considerations of users from them directly.

Instead of top-down analytics, gather data by getting out into the field: the contexts in which your users operate. Rather than rely on known hypotheses and existing variables to carry out deductive work (by far the more common form of analysis), immerse yourself in an inductive process of qualitative research.

The data collected in qualitative research is different from what youre used to in data science. This is how it works:

Note that, unlike traditional data used in data science, qualitative data are multilayered and complex. Field observations are tied to notes, the notes are then connected to interviews, and both are then connected to the documents.

It's not a linear process: by going back and forth between these interconnected layers of data, patterns emerge, research questions get refined, new behaviors and characteristics are identified, and insights are gained. And because the data is intentionally collected mostly in an unstructured fashion (that is, not as answers to specific, closed-ended questions), the input reflects different perspectives.

All of this can lead to refined questions and hypotheses to further pursue using data science tools.
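The back-and-forth described above is usually operationalized by "coding" notes with thematic labels and then looking for patterns across them. A minimal sketch, in which the notes, the codes, and the helper names are all invented for illustration:

```python
from collections import Counter
from itertools import combinations

# Illustrative only: toy field notes tagged with analyst-assigned "codes"
# (labels for recurring themes), the basic unit of qualitative analysis.
notes = [
    {"text": "User skipped the setup wizard entirely",
     "codes": {"avoidance", "onboarding"}},
    {"text": "Asked a colleague how the export worked",
     "codes": {"peer_help", "confusion"}},
    {"text": "Reread the tooltip twice before clicking",
     "codes": {"confusion", "onboarding"}},
]

def code_frequencies(notes):
    """Count how often each code appears across all notes."""
    return Counter(c for n in notes for c in n["codes"])

def cooccurrence(notes):
    """Count pairs of codes appearing in the same note - a first step
    toward spotting patterns across the interconnected layers of data."""
    pairs = Counter()
    for n in notes:
        for a, b in combinations(sorted(n["codes"]), 2):
            pairs[(a, b)] += 1
    return pairs
```

Frequent codes and frequent co-occurrences are exactly the kind of emerging pattern that can then be turned into hypotheses for the quantitative pipeline.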


In business contexts, qualitative research is mainly reserved for studies of user experience (UX), product and UX design, and innovation. This intensive research work is largely disconnected from the work done by data science teams.

However, if your data context involves people, you should consider bridging this disconnect.

Take, for instance, an engineering team at Indeed that realized they needed to create a new measure for lead quality but didn't have enough background about leads and how to assess their properties. So they spent some time observing, interviewing, and analyzing documents from their account executives. By analyzing these data, they identified features that they'd not considered before and developed the measure they were after.

Being able to collect data on new features and designing machine learning models added significant value. But they realized that as the market, user needs, and their platform continued to evolve, it was important to return to collecting qualitative data from time to time to further inform their analytics pipeline. This ongoing integration of qualitative data and big data resulted in millions of dollars in added revenue.

Or consider the results of having qualitative research and data science teams work together at Spotify. Despite having a wealth of user data at the online streaming service, the company still needed to make sense of users behavior when receiving ads. The data science team followed the standard approach and performed an A/B test (with the intervention being skippable ads and the control being the standard ad experience). The results led the data science team to identify distinct behavior profiles.
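The article gives no numbers for Spotify's experiment, but the standard machinery behind such an A/B comparison is a two-proportion z-test; the counts below are hypothetical:

```python
from math import sqrt, erf

def two_proportion_ztest(x_a, n_a, x_b, n_b):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = x_a / n_a, x_b / n_b
    pooled = (x_a + x_b) / (n_a + n_b)           # pooled success rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf; two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: "completed" sessions in the intervention arm
# (skippable ads) vs. the control arm (standard ad experience).
z, p = two_proportion_ztest(540, 1000, 480, 1000)
```

A significant difference tells you *that* the arms behave differently; as the article goes on to show, it takes qualitative work to learn *why*.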

Interestingly, the company also asked qualitative researchers to directly study users. Their findings were fundamentally different. For instance, they found that some of these profiles had nothing to do with inherent choices but were actually more an outcome of confusion about features and presented information.

Learning from this experience, the company started embracing a mixed methods approach (where qualitative data is integrated with more structured big data) to leverage the benefits of both approaches. They established a common research query, devised a process where researchers continuously communicated, and then triangulated their insight with both qualitative and quantitative data.

The result was more comprehensive research data, where business and design decisions, such as notifying users explicitly about ad skip limits, were based upon insight gained from users and data about users.

There are several ways for a data science team to engage with qualitative data:

You should also familiarize yourself with the three data collections methods (observations, interviews, and document analysis) that are at the core of qualitative research:

Finally, connect with researchers who have experience with immersive research, inside and outside your company. They can help you as you are thinking about ways to collect and analyze qualitative data. And perhaps you can offer to help them with parts of the tedious analysis of the more quantitative bit of their data too.



How data science gives new insight into air pollution in the US – MIT News

"To do really important research in environmental policy," said Francesca Dominici, "the first thing we need is data."

Dominici, a professor of biostatistics at the Harvard T.H. Chan School of Public Health and co-director of the Harvard Data Science Initiative, recently presented the Henry W. Kendall Memorial Lecture at MIT. She described how, by leveraging massive amounts of data, she and a consortium of her colleagues across the nation are revealing, on a grand scale, the effects air pollution levels have on human health in the United States. Their efforts are critical for providing a data-driven foundation on which to build environmental regulations and human health policy. "When we use data and evidence to inform policy, we can get very excellent results," Dominici said.

Overall, air pollution has dropped dramatically nationwide in the past 20 years, thanks to regulations dating back to the Clean Air Act of 1970. "On average, we are all breathing cleaner air," said Dominici. But the research efforts of Dominici and her colleagues show that even relatively low air pollution levels, like those currently present in much of the country, can fall well within national regulations and still be harmful to health. Moreover, recent patterns of decreasing air pollution have left certain geographic areas worse off than others, and exacerbated environmental injustice in the process. "We are not cleaning the air equally for all of the racial groups," Dominici said.

Speaking over Zoom to audience members tuning in from around the world, Dominici shared these findings and discussed the underlying methodologies at the 18th Henry W. Kendall Memorial Lecture on April 21. This annual lecture series, which is co-sponsored by the MIT Center for Global Change Science (CGCS) and MIT's Department of Earth, Atmospheric and Planetary Sciences (EAPS), honors the memory of the late MIT professor of physics Henry W. Kendall. Kendall was instrumental in bringing awareness of global environmental threats to the world stage through the World Scientists' Warning to Humanity in 1992 and the Call for Action at the Kyoto Climate Summit in 1997. The Kendall Lecture spotlights "leading global change science by outstanding researchers," according to Ron Prinn, TEPCO Professor of Atmospheric Science in EAPS and director of CGCS.

How Much Evidence Do You Need? 18th Henry W. Kendall Lecture

In the various studies Dominici discussed, she and her colleagues homed in on a specific kind of harmful air pollution called fine particulate matter, or PM2.5. These tiny particles, less than 2.5 microns in width, come from a variety of sources, including vehicle emissions and industrial facilities that burn fossil fuel. "Particulate matter can penetrate very deep into the lungs [and] it can get into our blood," said Dominici, noting that this can lead to systemic inflammation, cardiovascular disease, and a compromised immune system.

To analyze how much of a risk PM2.5 poses to human health, Dominici and her colleagues turned to the data: specifically, to large datasets about people and the environment they experience. One dataset provided fine-grained information on the more than 60 million Americans enrolled in Medicare, including not only their health history, but also factors like socioeconomic status and Zip code. Meanwhile, a team led by Joel Schwartz, a professor of environmental epidemiology at the Harvard T.H. Chan School of Public Health, amassed satellite data on air pollution, weather, land use, and other variables, combined it with air quality data from the EPA's national network, and created a model that provides daily levels of PM2.5 for every square kilometer in the continental United States over the last 20 years. "In this way we could assign, to every single person enrolled in the Medicare system, their daily exposure to PM2.5," said Dominici.

Combining and analyzing these datasets provided a holistic look at how PM2.5 affects the population enrolled in Medicare, and yielded several important findings. Based on the current national ambient air quality standards (NAAQS) for PM2.5, levels below 12 micrograms per cubic meter are considered safe. However, Dominici's team pointed out that even levels below that standard are associated with a higher risk of death. They further showed that making air quality regulations more stringent by lowering the standard to 10 micrograms per cubic meter would save an estimated 140,000 lives over the course of a decade.

The scope of the datasets enabled Dominici and her colleagues to use not only traditional statistical approaches, but also a method called matching. They compared pairs of individuals who had the same occupations, health conditions, and racial and socioeconomic profiles, but who differed in terms of PM2.5 exposure. In this way, the researchers could eliminate potential confounding factors and lend further support to their findings.
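As a rough illustration of the matching idea described above, pairing can be done by grouping people with identical covariate profiles, so that exposure is the only thing that differs within a pair. All records, covariates, and field names here are invented for illustration:

```python
from collections import defaultdict

# Toy records: matched covariates, differing PM2.5 exposure.
people = [
    {"id": 1, "occupation": "teacher", "smoker": False, "high_pm25": True,  "died": True},
    {"id": 2, "occupation": "teacher", "smoker": False, "high_pm25": False, "died": False},
    {"id": 3, "occupation": "driver",  "smoker": True,  "high_pm25": True,  "died": True},
    {"id": 4, "occupation": "driver",  "smoker": True,  "high_pm25": False, "died": True},
]

def exact_match_pairs(people, covariates=("occupation", "smoker")):
    """Pair an exposed and an unexposed person who share every covariate,
    so the exposure difference is the only thing that varies."""
    groups = defaultdict(lambda: {"exposed": [], "control": []})
    for p in people:
        key = tuple(p[c] for c in covariates)
        groups[key]["exposed" if p["high_pm25"] else "control"].append(p)
    pairs = []
    for g in groups.values():
        pairs.extend(zip(g["exposed"], g["control"]))
    return pairs

pairs = exact_match_pairs(people)
# Average within-pair difference in outcomes (True/False count as 1/0).
excess = sum(e["died"] - c["died"] for e, c in pairs) / len(pairs)
```

The studies described use far richer covariates and matching machinery, but the logic of comparing within matched pairs is the same.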

Their research also illuminated issues of environmental injustice. "We started to see some drastic environmental differences in risk across socioeconomic and racial groups," said Dominici. Black Americans have a risk of death from exposure to PM2.5 that is three times higher than the national average. Asian and Hispanic populations, as well as people with low socioeconomic status, are also more at risk than the national population as a whole.

One factor behind these discrepancies is that air pollution has been decreasing at different rates in different parts of the country over the past 20 years. In 2000, nearly the entire eastern half of the United States had relatively high levels of PM2.5, at 8 micrograms per cubic meter or higher. By 2016, those pollution levels had dropped dramatically across much of the map, but remained high in areas with the highest proportions of Black residents. "Racial inequalities in air pollution exposure are actually increasing over time," said Dominici. She noted that one thing to consider is whether future regulations can tackle such inequities while also lowering air pollution for the entire nation on average.

Issues of both air pollution and environmental injustice have been thrown into stark relief during the Covid-19 pandemic. An early study on Covid-19 and air pollution led by Dominici showed that long-term exposure to higher levels of air pollution increased the risk of dying from Covid-19, and that areas with more Black Americans are even more at risk. Additional research showed that during last year's wildfire season in California, up to 50 percent of Covid-19 deaths in some areas were attributable to the spikes in PM2.5 that result from wildfires.
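Estimates of deaths "attributable to" an exposure, like those above, typically rest on an attributable-fraction calculation; a minimal sketch with an invented relative risk:

```python
def attributable_fraction(relative_risk):
    """Fraction of cases among the exposed attributable to the exposure:
    AF = (RR - 1) / RR."""
    return (relative_risk - 1) / relative_risk

# Illustrative only: if an exposure doubles mortality risk (RR = 2),
# half of the deaths among the exposed are attributable to it.
af = attributable_fraction(2.0)
```

The published studies estimate relative risks from exposure models rather than assuming them, but the final "X percent of deaths" figures come from this kind of arithmetic.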

Due to a lack of data on individual Covid-19 patients, some of these analyses were based on county-level data, which Dominici noted was a major limitation. "Fortunately, in some geographical areas, we've started getting access to individual-level records," said Dominici. Access to more and better data has sparked additional research around the world on the link between air pollution and Covid-19. Dominici was also part of an international collaboration that estimated, for example, that 13 percent of Covid-19 deaths in Europe were attributable to fossil-fuel-related emissions.

For Dominici, a data scientist at heart, findings like these highlight the role of data science in influencing critical environmental policy decisions. As she put it: "Our all being devastated by this pandemic could provide an additional source of evidence of the importance of controlling fossil-fuel-related emissions."



ExcelR has completed the training for 5000+ Students in Pune on Data Science Since their Inception – York Pedia

(YorkPedia Editorial): Pune, Maharashtra, Jun 4, 2021 (Issuewire.com). ExcelR data science course in Pune:

ExcelR, in partnership with Steinbeis University, Germany, provides you with the best data science course or data scientist course to help you reach greater heights in your data science career. The course follows a practical approach where everything is taught with real-life examples and case studies, and at a later stage, you apply your learnings on live projects. Our course material is designed by industry experts with 17+ years of industry experience who are currently the best in the business. On top of it, you get alumni status from Steinbeis University, Germany.


This course makes you ready to get recruited by the top multinational companies in the world. After you are done with the course, you can appear for the exams. You have to score approximately 60% in the exam to obtain a certificate. We have a proper system to help our participants with assignments, projects, resume preparation, and interviews as well. You also get lifetime access to self-paced learning. Students learn everything from Hadoop, Spark, MySQL, Python, R, and Tableau to artificial intelligence technologies. ExcelR trains you to be the best in the market and then places you in a reputable company. 5000+ students from Pune have been placed already. You also get access to attend unlimited batches for 365 days with our JUMBO PASS.

Why get ExcelR training for data science?

ExcelR is one of the best training grounds when it comes to data science courses. Our training is curated for everyone regardless of the knowledge they possess. We help you get placed in top MNCs with our 100% placement assistance. Our placement cell has over 150 corporate partners ready to hire you after the successful completion of the course and exam.

All the data science concepts are practically and theoretically explained with the proper use of case studies. Then we try to give you hands-on experience with the help of different projects where you learn the real-life implementation of theoretical concepts.

Our faculty consists of alumni of IITs, IIMs, and ISBs and possess industry experience of over 15 years. Some of them are Ph.D.s as well.

All thanks to expert trainers, ExcelR has gained a top position among data science training institutes. Students enrolled with ExcelR can choose either classroom sessions or self-learning. However, the lectures are always recorded and made available to the students by the instructors. Students can revisit these lectures as many times as they want for one year. They don't need to pay any charges for this.

The benefits of receiving a data science certification:

Digitalization and the Internet have had a massive impact on the amount of data we consume and produce. The consumption, as well as the production, of data has increased many-fold. And this data needs to be explored and properly evaluated to get valuable insights. This quest for the exploration of data gave rise to data science and has generated various opportunities in the market.

The demand for data scientists is ever-increasing as we will continue to consume and produce data. Plus, the salaries are also very lucrative in the field of data science. Undoubtedly, a career in data science will be rewarding in the long term.

Even right now, there are almost 1.4 lakh (140,000) jobs available for data scientists. Businesses have realized the significance of analyzing data in growing their business. In India, even entry-level professionals in data science are offered salaries of up to 5 LPA.

So, what are you waiting for? Contact us right away and start your data science career with a BANG!



Ways in Which Big Data Automation is Changing Data Science – Analytics Insight

Inventions and discoveries of new trends in the market are constantly getting more refined. Big data automation is undoubtedly one of the most complex and disruptive technologies altering the domain of technology as a whole. However, irrespective of its complexity, big data automation remains a crucial capability for organizations, and its many benefits cannot be overlooked. The core of big data automation lies in finding patterns from which values can be projected.

Industries and organizations receive a deluge of data on a daily basis. The data is then analyzed to harness valuable insights from it. Reportedly, the automation of big data has brought massive benefits to companies: improved operational competence, improved self-service modules, and increased scalability of big data technologies.

At an international conference on data science and analytics conducted by the Institute of Electrical and Electronics Engineers (IEEE), the model of big data automation was a focus. The objective of the conference was to observe and deduce the multiple ways in which big data automation can have a significant impact on data science. It was observed that the role automation plays in data science depends on a few important factors.

This particular factor depends on a pragmatic approach in which analytics is categorized into diverse segments. The study was conducted to find a definite volume of data over a considerable period of time.

In the case of predictive analysis, automation actually reduces the time required. Predictive analyses are often complex, and thus they demand a robust language that makes the identification of prediction problems easy and lucid. Big data automation provides a tailored framework that can work with diverse specifications automatically.

The objective of implementing data automation is to present data in a measurable format. Additionally, automation is deemed an astute assistant to data analysts, as it helps in finding the main prediction problems in a uniform format.

Big data automation plays an impeccable role in determining the improvement trajectory of data science. Automation in data science has opened avenues for business people to leverage its numerous capabilities and eliminate complexity. The fact that the model is a self-service one also makes it cost-effective. Besides, it helps data scientists and analysts stay attentive to value-added activities and deep competencies.



