
How to sync your Google Drive and OneDrive with your PC to access your files offline – Ohionewstime.com

When it comes to file sharing, storage and backup, Google Drive and OneDrive are two of the most popular cloud storage services. Both are multi-platform, with apps available for PC, Mac, Android smartphones, iPads, iPhones and more. There is a general perception that accessing files in cloud storage always requires an active internet connection, but that is not exactly true. This guide describes how to set up sync for Google Drive and Microsoft OneDrive on your computer so you can access your stored files in Windows Explorer (on a Windows PC) or Finder (on an Apple Mac) without an internet connection. That is very useful if your broadband isn't working or you're traveling.

Google Drive

Step 1: To enable Google Drive sync on your PC, you need to download the Google Drive desktop application.

Step 2: After installing Google Drive, you will be prompted to log in to your Google account; do so.

Step 3: Then click the icon in the system tray area to open the Google Drive window. Click the gear icon in the upper right to open the settings.

Step 4: Select options as needed in the settings menu. If you want to control more settings, you can also click the icon in the upper right corner.

Step 5: You should now see the Google Drive option in Explorer next to your local storage. Restart your PC once to verify that the changes took effect.

OneDrive

Step 1: Download OneDrive from Microsoft's official website if it is not yet installed on your Windows PC.

Step 2: Once the app is installed, launch it from the Start menu if it does not open automatically. When the window opens, sign in to your Microsoft account, or create one if you don't have one.

Step 3: After signing in, the app shows the location of your OneDrive folder; you can change it as needed. Continue through the quick usage guide. Finally, the app opens the OneDrive folder.

Step 4: You can manage various settings by right-clicking the OneDrive icon in the system tray and opening the settings in the window that appears. You can access your OneDrive files by clicking the OneDrive folder in the left panel of Explorer.
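Once sync is set up, the Google Drive and OneDrive folders behave like ordinary local directories, so you can also inspect them programmatically. A minimal Python sketch (the default folder locations below are assumptions; both services let you pick a custom path during setup):

```python
from pathlib import Path
import os

def list_synced_files(folder):
    """Return sorted (name, size-in-bytes) pairs for every file under a
    synced folder, e.g. to confirm your files are present locally."""
    root = Path(folder)
    return sorted((p.name, p.stat().st_size)
                  for p in root.rglob("*") if p.is_file())

# Assumed default locations; adjust to wherever you placed the folders.
onedrive_folder = os.environ.get("OneDrive", str(Path.home() / "OneDrive"))
gdrive_folder = str(Path.home() / "Google Drive")
```

Note that with "files on demand" enabled, cloud-only placeholders may also appear in the listing even though their content has not been downloaded yet.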


The rise of resilient cloud-native edge solutions – ITP.net

Datasets are the raw ingredients of knowledge. The cloud-native revolution is fueled by an unprecedented rate of developing methods and applications to transform data into information and knowledge. Seagate's internal deployment of smart manufacturing offers just one example.

The amount of potential knowledge is staggering when we consider the diverse types and enormous amounts of data that can be gathered via a plethora of sensors and IoT devices. Only the limits set by the available physical storage systems may impede the rate of gaining knowledge.

As businesses attempt to expand data gathering, figuring out where to put this data and managing the costs to get it there can inhibit these gains. As they realise that much of today's data starts at the edges of the cloud, organisations are looking for solutions to store data at or near the edge to reduce complexity. Scalability and reliability must be central to these solutions.

ALSO READ:How to modernise a data centre with agile infrastructure

As a member of The Cloud Native Computing Foundation (CNCF), which builds sustainable ecosystems for cloud-native software, Seagate thinks innovating and enabling cloud-native technologies is critical. Cloud-native computing empowers organisations to build and run scalable, reliable applications with an open-source software stack in public, private, and hybrid clouds.

To be reliable means to be resilient in the face of both normal and challenging situations. For edge cloud storage, normal situations include power spikes, hardware defects, software updates and network loss, to name a few. The ultimate goal: applications and services should keep running despite these events, with little to no disruption.

Complete cloud-native edge solutions should offer features that ensure resiliency both within and between regions of edge deployments.

Storing large amounts of data requires large numbers of hard drives and enclosures connected to servers or to a LAN within a region. When a drive fails, recovery should be resilient and rapid; the time required to heal and regain optimal resiliency should be as short as possible. With standard RAID methods, rebuilding the data from a failed 18TB drive can take days. Seagate ADAPT Data Protection Technology hardens data by spreading the rebuild work across all the drives in the enclosure. This reduces the rebuild time to a few hours, without burdening any network or CPU in the deployment.
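The speed-up from spreading rebuild work across an enclosure can be sketched with back-of-the-envelope arithmetic. In this Python sketch, the 150 MB/s per-drive rebuild throughput and the 24-drive enclosure are illustrative assumptions, not Seagate figures:

```python
def rebuild_hours(capacity_tb, per_drive_mb_s, drives_sharing_work=1):
    """Rough rebuild-time estimate: total data to reconstruct divided
    by the aggregate rebuild throughput of the drives doing the work."""
    total_mb = capacity_tb * 1_000_000  # decimal TB -> MB
    seconds = total_mb / (per_drive_mb_s * drives_sharing_work)
    return seconds / 3600

# Classic RAID: a single spare drive absorbs the whole rebuild.
classic = rebuild_hours(18, 150)
# Distributed (ADAPT-style): work spread across, say, 24 drives.
spread = rebuild_hours(18, 150, drives_sharing_work=24)
```

Even this idealized model (real rebuilds are throttled by foreground I/O) shows the rebuild window shrinking from days to hours as the work is parallelized.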

Observing the increasing natural disasters of the past several years, it's clear that a solution for geographical resiliency is also necessary. Geographical resiliency means maintaining one or more copies of data in multiple data centres across multiple regions. While mass-capacity mobile storage and data transfer needs can be serviced by Seagate's Lyve Mobile products and services, real-time failover requires a distributed system or synchronous replication.

Seagate CORTX Intelligent Object Storage Software provides geographic resiliency for vast amounts of unstructured data objects. For Kubernetes persistent storage, file- and block-level volumes offer agility and performance. Meanwhile, LINBIT's LINSTOR block storage management for containers provides an open-source software-defined storage platform that works hand-in-glove with the Kubernetes Container Storage Interface for orchestrated microservice data placement and scheduling. Moving the container to the edge offers a lower total cost of ownership (TCO) than moving data to the core.

The combination of Seagate Exos storage platforms and LINSTOR synchronous replication creates a feature-rich, enterprise-class, resilient solution, open to connect with any other public or private cloud solutions. These platforms' commitment to open source delivers operational agility while providing visibility into all layers of the solution.

While Seagate Exos storage can, of course, be used to present block devices to any storage software stack, Seagate has developed an Exos back-end driver specifically for LINSTOR, enabling fine-grained volume control of the Seagate ADAPT hardware RAID engine. With this driver in place, hardware-based thin provisioning, SSD/HDD auto-tiering and snapshots are now orchestrated by Kubernetes with geo-replication.

Also new to LINSTOR with this solution is multi-host-aware resource management. A single Exos system with over 1.5PB of storage can be directly connected to eight application servers, with LINSTOR connecting Kubernetes Persistent Volume Claims to the fault-tolerant logical volumes. Kubernetes, using LINSTOR resource definitions, composes the pairing of compute node and storage connection for both load balancing and agility.

This cloud-native solution is open for business.


7 Leading Software and Cloud Stocks to Buy – Investorplace.com

While the pandemic meant a setback for a number of sectors, digitalization grew apace, increasing the need for software and cloud services. In turn, demand in the cloud computing industry should grow further as more businesses adapt to work-from-home. Thus, software and cloud stocks should be hot options for investors.

Overall, shares of many software and cloud companies have seen significant returns in the past year. For instance, so far in 2021, the Dow Jones US Software Index is up 27% and hit an all-time high in recent days. And the ISE Cloud Computing Index has returned 10% year-to-date (YTD).

Additionally, according to research group Gartner, worldwide end-user spending on public cloud services is forecast to grow 23.1% in 2021 to a total of $332.3 billion, up from $270 billion in 2020. The same study forecasts the sector to reach $397.5 billion in 2022. Therefore, in this article, I will introduce seven leading software and cloud stocks to buy in the coming months.

Even before the pandemic-forced digitalization, we were already benefitting from the fourth industrial revolution (Industry 4.0, or 4IR), defined as the current trend of automation and data exchange in manufacturing technologies, including cyber-physical systems, the Internet of Things, cloud computing and cognitive computing, and the creation of the smart factory.

Now, as we move on to Industry 5.0, the focus will shift to collaboration and interaction between humans and machines. With that information, the following seven software and cloud stocks deserve your attention in the coming weeks:

Now, lets dive in and take a closer look at each one.

52-Week Range: $420.78 - $642.55

Lets start with a global diversified software company, Adobe. The San Jose, California-based group offers digital marketing and advertising software and services to students, creative artists, small businesses, government agencies, as well as many of the largest global brands worldwide. Founded in 1982, Adobe has more than 23,000 employees worldwide today.

According to the second quarter financial results announced in mid-June, revenue grew 23% year-over-year (YOY) to $3.84 billion. Non-GAAP net income came in at $1.46 billion, up 22.7% YOY. Diluted earnings per share (EPS) was $3.03 on a non-GAAP basis, an increase of 23.7%. Cash flows from operations were a record $1.99 billion, up 67.9% compared to the previous-year quarter. Adobe ended the quarter with $4.25 billion in cash and equivalents.

On the results, CEO Shantanu Narayen said, "Our innovative product roadmap and unparalleled leadership in creativity, digital documents and customer experience management position us for continued success in 2021 and beyond." He added that "the large market opportunity and momentum we are seeing across our creative, document and customer experience management businesses position us well to deliver another record year."

Adobe has returned over 27.5% YTD, and hit a record high in recent days. The shares trade at 44.44 times consensus forward earnings and 21.04 times current sales. Despite the recent run-up in price, the software giant still offers growth potential and hence potentially high investment returns. Investors should keep the stock on their radar to buy the dips.

52-Week Range: $45.86 - $139.00

Burlington, Massachusetts-based Cerence is an artificial intelligence (AI)-powered assistant provider for connected and autonomous vehicles (CAVs). Its services also include software platforms for building automotive virtual assistants.

Cerence reported strong third quarter financial results on Aug. 9. Revenue of $96.8 million implied growth of 28.7% YOY. Non-GAAP net income was $26.1 million, an increase of 110% compared to the prior-year quarter. Non-GAAP diluted EPS stood at 62 cents, up 93.75% YOY. Free cash flow ended the quarter at $21.2 million compared to $13.4 million a year ago.

CEO Sanjay Dhawan commented, "Enhancing our future growth opportunities are the strategic collaborations we announced in the quarter with Sirius XM, Visteon and Harman. In the case of Visteon, the collaboration extends into the two-wheeler market, a new adjacent market in which we are making steady progress."

According to Statista metrics, autonomous vehicles should comprise 12% of car registrations by 2030. Thus, Cerence has the potential to be a winner in the secular growth of the CAV market.

In early June, Japan-based Pioneer announced a strategic partnership with CRNC to develop scalable AI-powered products and services that enhance mobility experiences for drivers and passengers globally.

CRNC stock is up more than 8% YTD. The company's forward price-earnings (P/E) and price-sales (P/S) ratios are 39.22 and 11.39, respectively. Given the potential for secular growth, interested investors should keep the shares on their radar screen with a view to buy for long-term portfolios.

52-Week Range: $17.66 - $33.00

San Francisco, California-based Dropbox provides online file storage and sharing services. The company was founded in 2007 and currently operates in 180 countries. It also has more than 700 million registered users.

Dropbox released Q2 financial results on Aug. 5. Total revenue came in at $530.6 million, up 13.5% from the same period the previous year. Non-GAAP net income was $160.5 million, an increase of 72.2%. Non-GAAP diluted EPS was 40 cents, up 81.8% YOY. Cash and short-term investments ended the quarter at $1.944 billion. Free cash flow was $216 million compared to $119.8 million a year ago.

On the results, CEO Drew Houston remarked, "We're proud of our execution this quarter as we delivered even more value to our customers and shareholders and are excited about the opportunity ahead to build next-generation tools to support the new world of distributed work."

Moreover, regular InvestorPlace.com readers might remember that in March, Dropbox acquired DocSend. This is a secure document sharing and analytics company with more than 17,000 customers. The transaction was priced at $165 million.

DBX stock hit as high as $33 after the release of Q2 results. So far this year, the shares have surged around 36%. The company's consensus forward P/E and P/S ratios stand at 24.94 and 6.16, respectively. Accelerated digital transformation should continue to increase the demand for cloud storage platforms. And given the company's solid position and growth strategy, investors should consider buying into the declines.

52-Week Range: $21.87 - $30.42

Expense Ratio: 0.68% per year

Let's continue with the Global X Cloud Computing ETF, an exchange-traded fund (ETF) that provides exposure to cloud computing companies. These firms include Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS) names, as well as managed server storage space and data center real estate investment trusts (REITs).

The fund started trading in April 2019 and tracks the Indxx Global Cloud Computing Index. As far as sector allocations are concerned, information technology leads the ETF with 80.7%, followed by consumer discretionary (7.6%), communication services (6.4%) and real estate (5.4%).

U.S.-based companies comprise 86.9% of the fund, which currently has 36 holdings. Other countries represented are Canada (4.8%), New Zealand (3.4%), the U.K. (2.5%), and China (2.4%).

The top ten names make up around 44% of net assets of $1.35 billion. The Schaumburg, Illinois-headquartered cloud-based payroll and human capital management (HCM) solutions provider Paylocity Holding (NASDAQ:PCTY); San Jose, California-based online security platform Zscaler (NASDAQ:ZS) and Oklahoma City-based HCM applications provider Paycom Software (NYSE:PAYC) lead the names in the roster.

CLOU has returned around 28% in the last 52 weeks and 3.3% YTD. Potential investors who seek exposure to cloud names could find value below $28.

52-Week Range: $8.90 - $45.00

Denver, Colorado-based data-mining and analytics group Palantir Technologies builds and deploys two software platforms. The first one is Palantir Gotham, which focuses on government intelligence and defense agencies.

Meanwhile, Palantir Foundry is used by leading companies from energy, transportation, financial services, and healthcare sectors. Additionally, it offers Palantir Apollo, the continuous delivery software that powers the SaaS platforms, Foundry and Gotham, in the public cloud.

On Aug. 12, PLTR reported better-than-expected Q2 financial results. Revenue soared 49% YOY to $376 million. Adjusted net income of $98 million implied an increase of almost 644%. Adjusted diluted EPS was 4 cents, up fourfold over the prior-year quarter. Cash flow from operations came in at $23 million, and adjusted free cash flow stood at $50 million.

On the earnings call, Chief Operating Officer Shyam Sankar said, "Cutting-edge product and continued efficiencies in distribution drove exceptionally strong year-over-year Q2 results," and added, "The number of commercial customers grew 32% over last quarter. We closed 62 deals of $1 million or more, 30 of which were for $5 million or more and 21 were for $10 million or more."

Sankar also introduced the Meta-Constellation software, which integrates with existing satellites to build a more efficient AI-enabled decision chain. Looking forward, the company raised its full-year guidance for adjusted free cash flow to more than $300 million, up from more than $150 million.

Though PLTR shares surged more than 10% after the release of Q2 results, they are currently trading at around $24, shy of the record highs seen in late January. So far this year, PLTR stock is up 2.6% and trades at 163.93 times forward earnings and 36.86 times current sales. Buy-and-hold investors could consider investing around current levels.

52-Week Range: $839.40 - $1,650

Canada-based Shopify provides a cloud-based e-commerce platform primarily to small and midsize companies. It offers merchant solutions to over 1.7 million businesses worldwide.

SHOP's Q2 financial results, issued in late July, showed 57% YOY growth in total revenue, which came in at $1.12 billion. Adjusted net income was $284.6 million, or $2.24 per diluted share. Those metrics showed increases of 120% and 113% YOY, respectively. Cash and marketable securities stood at $7.76 billion, compared with $6.39 billion on Dec. 31, 2020.

At a recent Shopify Unite developer conference, CEO Tobi Lütke pointed out, "What used to be two completely distinct industries, the retail industry and the online commerce industry, are now just the commerce industry... Shopify is building the essential infrastructure for this increasingly digital world to allow as many people as possible to participate."

Shopify has proven to be a winner in recent years, and management still invests for growth despite the deceleration in pandemic-fueled gains. SHOP stock trades at 232.56 times forward earnings and 48.47 times current sales. So far this year, the shares have returned 29.5%. Given the overstretched valuation levels, potential investors could consider investing towards $1,400.

52-Week Range: $113.56 - $177.74

Expense Ratio: 0.35% per year

The final choice of software and cloud stocks is the SPDR S&P Software & Services ETF, which tracks the S&P Software & Services Select Industry Index. The fund provides exposure to a range of U.S.-based software businesses.

The funds market value has reached more than $560 million since its inception in late September 2011. XSW currently has 184 equal-weighted stocks. The sub-sectors are application software (55.42%), data processing & outsourced services (19.05%), systems software (16.7%), IT consulting & other services (5.35%) and interactive home entertainment (3.2%).

The top ten holdings weigh almost 6% of total net assets. Leading stocks include the work management platform provider Asana (NYSE:ASAN), Paycom Software and Paylocity.

XSW has returned nearly 9% YTD and 40% in the last 12 months. Given the low expense ratio and the diversified exposure of the fund, investors could consider investing below $165.

On the date of publication, Tezcan Gecgil did not have (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Tezcan Gecgil has worked in investment management for over two decades in the U.S. and U.K. In addition to formal higher education in the field, she has also completed all three levels of the Chartered Market Technician (CMT) examination. Her passion is for options trading based on technical analysis of fundamentally strong companies. She especially enjoys setting up weekly covered calls for income generation.


Deduplication software: Which solution is best? – IDG Connect

With the digital age leading to an unprecedented increase in the amount of data that businesses create, managing it effectively has become increasingly difficult. Many businesses are now turning to deduplication software to help with their data management practices. However, selecting a solution is not easy, and there are multiple choices for businesses to consider.
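At their core, most deduplication products work by chunking data and fingerprinting each chunk so duplicate chunks are stored only once. A toy Python sketch of hash-based deduplication (fixed-size chunks and SHA-256 fingerprints; production systems typically use variable-size, content-defined chunking):

```python
import hashlib

def dedup_chunks(data, chunk_size=4096):
    """Split data into fixed-size chunks, storing each unique chunk once,
    keyed by its SHA-256 fingerprint; return the chunk store plus an
    ordered list of references needed to reassemble the original."""
    store, refs = {}, []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicates cost only a reference
        refs.append(digest)
    return store, refs

def reassemble(store, refs):
    """Rebuild the original byte stream from the reference list."""
    return b"".join(store[d] for d in refs)
```

A stream with many repeated chunks thus shrinks to one stored copy per unique chunk plus a compact reference list, which is the space saving these vendors compete on.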

Over 388,000 professionals have used IT Central Station research to inform their purchasing decisions. Their latest paper looks at the highest-rated deduplication software vendors, profiling each and examining what they can offer the enterprise.

Here's a breakdown of the key players currently active in the market:

Average Rating: 8.5

Top Comparison: HPE StoreOnce

Overview: Revolutionises disk backup, archiving, and disaster recovery with high-speed, inline deduplication.

Average Rating: 8.1

Top Comparison: Dell EMC NetWorker

Overview: A fast, efficient backup and recovery through a complete software and hardware solution.

Average Rating: 8.3

Top Comparison: Dell EMC PowerProtect DD (Data Domain)

Overview: A data backup solution that provides a consistent, high-performance, scale-out architecture for the entire enterprise.

Average Rating: 8.5

Top Comparison: Dell EMC PowerScale (Isilon)

Overview: Offers powerful, affordable, flexible data storage for midsized businesses and distributed enterprises.

Average Rating: 8.1

Top Comparison: Dell EMC PowerProtect DD (Data Domain)

Overview: A single turnkey solution for backup, storage, and deduplication.

Average Rating: 8.5

Top Comparison: HPE StoreOnce

Overview: Advanced integration with backup and enterprise applications for increased performance and ease of use.

Average Rating: 8.7

Top Comparison: Veeam Backup & Replication

Overview: Designed from the ground up for the cloud-integrated systems. Gives businesses the flexibility to easily back up data wherever it resides and replicate the data to a private location of choice.

Average Rating: 8.0

Top Comparison: Veeam Backup for Office 365

Overview: Provides secure, fast back up data to any cloud at up to 90% lower cost compared with on-premises solutions.

Average Rating: 8.5

Top Comparison: Dell EMC Avamar

Overview: Provides data protection, replication, and re-use, whilst supporting long term retention in the public cloud.

Average Rating: 8.7

Top Comparison: Dell EMC PowerProtect DD (Data Domain)

Overview: A software-defined platform for secure cloud storage, deduplication and replication. Expands data storage options while reducing storage footprint and costs.


What Is an FTP Server and How Does It Work? – Server Watch

A file transfer protocol (FTP) server is an intermediary for transferring files between computers on a network.

While FTP servers traditionally were a physical unit in an organization's back end, the adoption of SaaS technology brings those capabilities to the cloud. In either environment, FTP servers are the storage mechanisms that provide the secure transfer of files of varying sizes and formats.

On-premises FTP servers will remain a component for large organizations managing complex and mission-critical file transfer requirements, but the trend towards cloud-based FTP servers is clear. Looking at the existing market, physical FTP servers typically have the fullest set of features needed for enterprises.

This article looks at what a traditional FTP server is, how it works, examples of modern FTP services, and more.

Learn more about the range of server types, functions, and purposes in our Guide to Servers.

The File Transfer Protocol (FTP) is a communication standard for transferring files over a network. Designed for the client-server architecture, FTP servers allow users to sign in and access files. Most FTP servers today implement stronger security with SSH-enabled FTP (SFTP) and TLS-enabled FTP (FTPS). The newest iteration, managed file transfer (MFT), takes an even more robust approach to FTP and is aimed at enterprises.

FTP servers go beyond other servers in facilitating file transfers over the internet, acting as an intermediary between devices. With two devices, known as FTP clients, connected to the internet and to a specific FTP server, the server enables the uploading and downloading of data between the two parties.

FTP servers often differ in how access is secured. Given cybersecurity concerns, some FTP servers allow anonymous connections, while others require a username, password, or MFA.
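The client side of the protocol can be illustrated with Python's standard-library ftplib. The host and credentials below are placeholders; the anonymous defaults only work against servers that permit anonymous access:

```python
from ftplib import FTP

def list_remote_dir(host, path=".", user="anonymous", passwd="guest@example.com"):
    """Open an FTP control connection (port 21 by default), authenticate,
    and return the file names in a remote directory."""
    with FTP(host) as ftp:
        ftp.login(user=user, passwd=passwd)  # anonymous login if allowed
        return ftp.nlst(path)
```

For the encrypted variants the article mentions, ftplib also provides `FTP_TLS` for FTPS, while SFTP (being an SSH subsystem rather than FTP) requires a third-party library.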

Modern FTP server solutions share a typical set of features, as the examples below illustrate.

With the advancement of cloud computing, cloud service providers offer a worthy alternative to traditional FTP servers. Both offer online file transfer and sharing features, but how do they differ?

Though FTP servers have long been the enterprise choice for file transfer, today's cloud solutions tend to offer more advanced features with increasing security awareness. With the convenience of the cloud and a lower cost, the migration of FTP tasks to the cloud is a reasonable trend.

Applications like Google Drive and Dropbox are known for being alternatives to FTP servers. As platforms that can do everything an FTP does, both are examples of cloud-based FTP servers.

The dominant free FTP solution is the open-source software FileZilla. Supporting FTP and its two encrypted forms (SFTP and FTPS), FileZilla works on Windows, Linux, and macOS. Features include support for transferring files larger than 4GB, a tabbed user interface, and configuration for remote file editing, transfer speed limits, and directory synchronization. FileZilla Pro goes further with a long list of integration options like AWS, Azure, Dropbox, GCP, and OpenStack.

Angled more towards businesses, Files.com describes itself as "smart cloud storage for modern teams." The Files.com platform includes features like an API and SDK for in-house development; SSO via any OAuth, SAML, or LDAP provider; and group and multi-level user management. With security and encryption a priority, the firm makes compliance easy with targeted features for HIPAA and GDPR implementation. Plans start at teams of five, with custom quotes for teams over 30.

Hailing from Oakland, California, ExaVault is a dedicated SFTP provider for both modern and traditional file transfer workflows. The vendor boasts a modern web-based interface with proven security and FTP. Features include unlimited users, real-time notifications, and an API for developers. ExaVault offers four plans with quite a range of features across sharing, automation, and advanced-control capabilities. At the upper end, its Enterprise plan includes 1TB+ of file storage, 2 million daily transactions, and unlimited API credentials.

Priding itself as the gold standard of macOS file transfer apps, Panic's Transmit 5 is for everything Apple. Transmit 5 is compatible with 11 cloud services and able to handle FTP, SFTP, WebDAV, and S3. New features include Panic Sync, the firm's method for site syncing that supports local-to-local and remote-to-remote sync. Panic offers licenses for a single copy up to 1,000 copies of Transmit 5 on its website, with discounts for larger orders.

Launched in 1996 in San Antonio, Texas, Globalscape is a managed file transfer (MFT) software vendor providing FTP to organizations using Windows. A range of plans is available, from SMBs to large organizations. Globalscape offers its Enhanced File Transfer (EFT) Arcus product as a cloud-based MFT (MFTaaS) that provides secure access protocols, data visibility, governance, and automation. Alternatively, Globalscape's on-premises EFT offerings (EFT Enterprise and EFT Express) provide advanced features and the highest levels of data security.

Focused on supporting enterprise needs, SmartFile includes on-premises, cloud, and hybrid services for organizations needing scalability. Product features include granular controls for file permissions, tracking of file activity, file versioning, and user management. For enterprises looking for on-premises FTP capabilities, SmartFile's FileHub is a virtual file management server that connects storage globally and load-balances with HAProxy.

In April 1971, a young computer scientist at MIT, Abhay Bhushan, published RFC 114 detailing FTP and e-mail protocols for what would become the internet. As the decade progressed, the Transmission Control Protocol and Internet Protocol (TCP/IP) became the basis for networking and formalized the means of direct and indirect access with a remote host.


This high-performance computer just smashed a world record for solving a millennia-old math problem – ZDNet

The high-performance computer completed the Pi calculation with a precision of exactly 62,831,853,071,796 digits.

A team of Swiss researchers is claiming that its high-performance computer has added 12.8 trillion new digits to the number Pi, in a calculation that reached a record-breaking 62.8 trillion digits in total.

Based at DAViS, the center for data analytics, visualisation and simulation at the University of Applied Sciences of the Grisons, the high-performance computer completed the Pi calculation with a precision of exactly 62,831,853,071,796 digits, smashing the previous record of 50 trillion digits achieved by Timothy Mullican last year.


Before Mullican, the trophy was held by none other than Google, whose team found over 31.4 trillion digits for Pi in 2018.

SEE: Supercomputers are becoming another cloud service. Here's what it means

The Swiss team obtained the result in just over 108 days, that is, three and a half times faster than Mullican, who reached the previous record in 303 days, and is now awaiting verification before the result can be entered into Guinness World Records. Only then will the entire number be made publicly available, but the researchers teased that the last ten known digits of Pi are now: 7817924264.

For most people, the number Pi will only bring back distant memories of math classes, where it was described as the ratio of the circumference of a circle to its diameter, and often shortened to its first few digits: 3.1415.

For centuries, in fact as early as the ancient Babylonians, mathematicians have been trying to calculate the digits of Pi with as much accuracy as possible. But with Pi being an irrational number, meaning that it can never be represented with ultimate precision, the point isn't exactly to find practical uses; rather, the calculation has become an unofficial benchmark for high-performance computing, and an opportunity for scientists to compete against one another.

"We wanted to achieve several goals with the record attempt," said Heiko Rölke, the head of DAViS. "In the course of preparing and performing the calculations, we were able to build up a lot of know-how and optimize our processes. This is now of particular benefit to our research partners, with whom we jointly carry out computationally intensive projects in data analysis and simulation."

DAViS's researchers used a well-established algorithm called the Chudnovsky formula, which was developed in 1988 and is considered the most effective method to calculate the number Pi. Google's team and Mullican also used the Chudnovsky algorithm.

The algorithm was run thanks to another popular computer software program, y-cruncher, that was designed in 2009 by American developer Alexander Lee specifically to compute Pi.
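For a sense of how the Chudnovsky series works in practice, here is a minimal Python sketch using only the standard-library decimal module. It is an illustration of the underlying series, not anything resembling y-cruncher's heavily optimized, disk-backed implementation; each term of the series contributes roughly 14 additional correct digits.

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Compute Pi to about `digits` decimal places with the Chudnovsky series."""
    getcontext().prec = digits + 10        # working precision plus guard digits
    c3_over_24 = 640320 ** 3 // 24         # constant in the term-to-term ratio
    a_k = Decimal(1)                       # current series term (k = 0)
    a_sum = Decimal(1)                     # running sum of a_k
    b_sum = Decimal(0)                     # running sum of k * a_k
    for k in range(1, digits // 14 + 2):   # each term adds ~14 digits
        a_k *= Decimal(-(6 * k - 5) * (2 * k - 1) * (6 * k - 1)) / (k ** 3 * c3_over_24)
        a_sum += a_k
        b_sum += k * a_k
    total = 13591409 * a_sum + 545140134 * b_sum
    pi = 426880 * Decimal(10005).sqrt() / total
    return +pi                             # unary plus rounds to context precision
```

A call like `chudnovsky_pi(100)` reproduces the first hundred digits in a fraction of a second on a laptop; record-scale runs differ mainly in using binary splitting and out-of-core arithmetic rather than this naive loop.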

One of the main challenges, according to the Swiss team, was the amount of memory that was needed to achieve such a large calculation. DAViS's high-performance computer was set up with two AMD Epyc 7542 processors coupled with 1TB of RAM, which isn't sufficient to hold all of the digits they were aiming to come up with. The y-cruncher program, therefore, was used to swap out the digits to an additional 38 hard disk drives (HDD) with a total 16TB of storage space, saving a large part of the RAM on the HDDs.

During operation, the computer and the disks could reach up to 80°C, which is why the system was housed in a server rack with constant air cooling to avoid overheating. The cooling contributed over half of the total 1,700 watts of power that the scientists estimate was required for the full calculation, which would still place the system in 153rd position on the Green500 list.

It is unlikely that Pi's extra 12.8 trillion digits will be used for any practical applications any time soon; the achievement is rather a reflection of scientific ingenuity and high-computing performance.

SEE: What is quantum computing? Everything you need to know about the strange world of quantum computers

The Chudnovsky formula, for example, is known for its complexity: when implementing the algorithm, scientists find that the time and resources necessary to calculate the digits increase more rapidly than the digits themselves, while it becomes more difficult to survive hardware outages as the computation increases.

For the Swiss researchers, the new achievement is a reflection of the capabilities of high-performance computing systems, and their potential for other research areas. "The calculation showed that we are prepared for data and computing power-intensive use in research and development," said Thomas Keller, project manager at the University of Applied Sciences of the Grisons. "The calculation also made us aware of weak points in the infrastructure, such as insufficient back-up capacities."

DAViS supports the use of high-performance computing in machine learning, for example, in a project called Translaturia that is building a computer-aided tool to translate from the Romansh language, spoken predominantly in the Swiss canton of the Grisons and currently under threat of disappearing.

The computing center is also looking at applications of DNA sequence analysis in allergy and asthma research, which also calls for high-performance computing systems. The new record helps prepare the groundwork for future practical applications.

See more here:
This high-performance computer just smashed a world record for solving a millennia-old math problem - ZDNet

Read More..

Synology DS1821+ review: The Russian doll of the digital world – The Australian Financial Review

You'd share disks like that for any number of reasons.

One reason is for extra storage and backups of laptops and tablets.

Another reason is that all data storage fails, eventually. Every hard disk you have, every SD card or built-in gigabyte of storage on your phone or tablet will eventually cark it, losing whatever you have stored on it. Lots of data can and should be backed up to a cloud storage service, but for the very bulky stuff at the least, you should do local backups.

(You should probably be backing up important files locally, too, just in case your cloud system goes out of business, or you forget it's there, or forget the username, or something.)

Of course, the fact that all storage fails means the drives you put into your Synology NAS will fail eventually, too. But like all NAS devices, the Synology can be set up so that all its data is backed up internally. When one hard disk fails, you just pull it out, replace it with a new one, and the system will automatically restore whatever was on the broken drive, good as new.

The new version of Synology's operating system has hundreds of new features and a lovely overhaul of its interface.

This month, Synology has released a new version of the operating system that runs its NAS devices, DiskStation Manager (DSM).

As well as an appealing new user interface (accessed via a web browser on some other device on your network), DSM 7 also supports a hybrid storage system that will store files on Synology's servers in the cloud and bring them down to your device when you need them. Known as Synology C2, it was in beta testing during this review, and we were unable to try it. It looks promising but, like all cloud storage, it could get pricey if you use it for all your storage needs.

Part of what we love about Synology NAS devices, though, is that they're not just capable of running their own DSM software and the dozens of DSM apps. Some of the more powerful models (including the DS1821+) can run virtual computers, too. If you have, say, Windows or Ubuntu Linux running in a box under your bed to control your home automation system 24 hours a day, you can run that as a virtual PC inside DSM.

Better yet, DSM supports a system, known as Docker, that lets you run all manner of containerised apps if you can't find DSM versions or want to run newer versions than those available. (Plex, the excellent home entertainment sharing system, often has a newer version available as a Docker download than as a DSM download, for instance.)
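As an illustration, launching Plex's containerized build from a shell on the NAS might look something like this. The image name plexinc/pms-docker is Plex's published one, but the volume paths and timezone below are hypothetical placeholders you would adjust to your own setup:

```shell
# Hypothetical example: run Plex Media Server in Docker on a Synology NAS.
# Adjust the /volume1/... paths and TZ to match your own volumes and locale.
docker run -d \
  --name=plex \
  --restart=unless-stopped \
  -p 32400:32400/tcp \
  -e TZ="Australia/Sydney" \
  -v /volume1/docker/plex/config:/config \
  -v /volume1/video:/data \
  plexinc/pms-docker
```

Synology's Docker package also offers a point-and-click UI that can set up the same container without touching a shell.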

It so happens that many of the Internet of Things devices here in the Digital Life Labs, such as lights, power switches, doorbells, video cameras, heating controllers and motion detectors, are running as a series of device-specific plugin programs inside a brilliant program known as Homebridge, which is in turn running inside Docker, which is running inside DSM 7, which is running inside the DS1821+, which is under my desk.

It's like the Russian doll of the digital world, except it doesn't get smaller the deeper you delve into it. It gets bigger.

SYNOLOGY DS1821+
Likes: Easy to use. Very powerful. Incredibly useful.
Dislikes: Nothing.
Price: $1599 plus cost of disks, plus more for the optional C2 cloud system, plus even more for solid-state disks that can speed up its performance.

Read the original post:
Synology DS1821+ review: The Russian doll of the digital world - The Australian Financial Review

Read More..

While Cloud Computing Is Secure for the Moment, SecPro Skills Are Hard To Find – CMSWire

PHOTO: Adobe

As the number of attacks on enterprise systems continues to rise, you might think that recruitment of cybersecurity professionals was also rising. In normal circumstances that would indeed be the case, but recent research published in the fifth annual global study of cybersecurity professionals by the Information Systems Security Association (ISSA) and industry analyst firm Enterprise Strategy Group (ESG) indicates there is a major crisis in the cybersecurity industry.

While the solution to these kinds of crises is to train more people across the enterprise, the reality is that many organizations are finding it impossible to fill these posts. In fact, according to the research, the cybersecurity skills crisis continues on a downward, multi-year trend from bad to worse and has impacted more than half (57%) of organizations.

The Life and Times of Cybersecurity Professionals 2021 report surveyed 489 cybersecurity professionals and reveals that the crisis is taking on a number of different nuances that organizations are finding difficult to manage. Among the more striking findings are:

Furthermore, 95% of respondents state the cybersecurity skills shortage and its associated impacts have not improved over the past few years and 44% say it has only gotten worse.

Notably, the three most-often cited areas of significant security skills shortages include cloud computing security, security analysis and investigations, and application security. For many organizations with their dependency on cloud computing, the news is worrying. It also raises the question as to how secure their cloud deployments are and even whether they can trust the cloud at all.

Related Articles:Why Enterprises Are Bringing Their Workloads to Multi-Cloud Environments

This is particularly true with the explosion in remote working. More remote working means greater usage of cloud applications, which has led to increased demand for cybersecurity professionals with skills in cloud computing security, Pieter VanIperen of New York City-based PWV Consultants told us. A significant number of organizations are struggling to find the people to fill these gaps.

There has been a known shortage of software developers in the technology industry for some time, and security is no exception. He said that there are currently about five jobs for every one developer (roughly), so the inability of companies to find cloud computing security pros isn't all about knowledge. Much of the problem is a simple lack of people rather than what people know. Even so, cloud computing is still more secure than traditional methods.

Cloud service providers, he adds, ensure that storage systems are backed up thoroughly so that nothing gets lost, even in the event of a breach. They also have dedicated specialists who can walk businesses through how to use the cybersecurity services they offer. So, yes, cloud computing is still safe. Businesses should make sure they understand the security risks they assume versus what falls under the umbrella of the cloud provider so that proper adjustments can be made, but every business should be utilizing the cloud, he said. Technology is eating the world, digital transformation trends force businesses into the cloud to stay competitive, and while it can be difficult to find developers to keep in-house, there are always experts who can be called upon for assistance.

Related Article:Take Your Cloud Strategy Into the Future

However, the shortage of technicians is not a problem that is going to be solved overnight, Daniel Cohen, VP of Cloud Services at Sunnyvale, Calif.-based Radware, added. He says that avoiding cloud technologies is not the solution. Today's enterprises require 24x7, always-on digital access to either connect to their workforce or end-customers.

To help bridge the gap, organizations need to ensure that there is not only more advanced education and upskilling for our security teams, but also more security awareness training for all employees. Security is everyone's responsibility in our anytime anywhere workplace.

Cybersecurity firms also have a major role to play in managing the shortage. By delivering solutions that leverage advanced technologies, such as machine learning and automation for increased productivity, they can help keep organizations protected even with a dwindling cybersecurity team, he added.

So, what exactly is needed to keep your cloud deployments safe? The skill sets for cloud security professionals are different from those of other cybersecurity skills in two areas, Terumi Laskowsky, a cybersecurity instructor at Denver-based DevelopIntelligence, said.

First, the shared responsibility model for security points to how two parties share the security responsibility for the cloud-based systems: The Cloud Service Provider (CSP) and the Cloud Service Consumer (i.e., the cloud customer).

Think of the CSP as an outsourcer. The CSPs offer their physical infrastructure (i.e., datacenter, servers, network, storage, etc.) and other services to the consumer. The consumer uses them to migrate their existing systems, create new ones and upload their data. Each party (CSP and CSC) is responsible for security for their respective areas of responsibility. But the CSC has ultimate responsibility for ensuring safety of organizational data and systems.

A CSC cloud security professional must be able to vet the security of CSPs, while also managing risk and designing, implementing and managing organizational security controls.

When a company moves into the cloud, the first thing that goes away is the physical servers, networks and storage. Of course, the physical equipment still exists, but it is owned and managed by the CSPs. For the CSC cloud security professionals, almost all the things they manage will be virtual: virtual servers, software-defined networks, virtual storage systems, containers, managed services, serverless offerings and the list goes on.

The physical is abstracted away from the CSC. For example, virtual machines (VMs) abstract the physical infrastructure, containers abstract the operating system and serverless services abstract the runtime engines. The skill sets required to work with the abstracted services are quite different from working with the physical. They may act and look the same, but they are different and often more complex under the hood. In general, as complexity increases, the likelihood of vulnerabilities also increases.

Vulnerabilities arise from assuming that the CSPs are responsible for certain security aspects when they are not. The CSPs will not stop you from creating vulnerable systems. They can only offer advice.

This also is related to consumers exposing sensitive data in the cloud, such as PII (Personally Identifiable Information) and other secrets. CSPs are not going to stop you from doing that because the data is the responsibility of the cloud consumer. Working with virtual environments requires investment in learning the technology and understanding the differences compared to the physical. Since separating networks provide a level of isolation (i.e., security), and routers provide security controls when connecting them, the security professional must learn how to implement security using a different technology.

If an organization does not have enough trained cloud security professionals, all the issues mentioned above go without being addressed properly, she said. Among the issues mentioned above, the lack of visibility related to the shared responsibility model for security can cause issues for the security professionals.

There are two other issues that need to be considered too, according to Scott Caschette, chief information officer of Tampa, Fla.-based Schellman & Company: one is cloud infrastructure security and the other is cloud data security.

As the remote workforce has become larger, more diverse and decentralized so has your corporate data. Long gone are the days of IT providing applications and data to a sedentary group of people within the confines of a physical building and 8-5 schedules. With the explosion of cloud computing, SaaS platforms, mobile devices and portability, your data is everywhere. Like it or not, your users demand it. Therefore, referring to our earlier hemispheres, data security has become less secure by the nature of organic growth.

Security positions in the enterprise can help drive tools, visibility and risk management but once it leaves your border no amount of security skills is going to help, he said. Like water, data wants to be free and will find the path of least resistance and for many, has. Administrators, security engineers and application developers struggle to stay ahead of the curve when it comes to keeping corporate data safe. Training, hygiene, DLP, disk encryption, MFA and anti-malware are a good start but should be considered table-stakes at this point.

On the other hand, he added, "When we talk about infrastructure, corporate data centers and proprietary networks, I think we would be foolish to think that a small team of daytime FTEs can compete with the budgets, skills and quantity of large cloud platforms and SaaS companies. Further, with efficiencies of scale these platforms have tools that can automate much of the inherent risk right out of the tenant."

Read more from the original source:
While Cloud Computing Is Secure for the Moment, SecPro Skills Are Hard To Find - CMSWire

Read More..

The State of Cybersecurity and the Cloud Today – Californianewstimes.com

When cloud computing was first introduced, numerous questions were asked about the ability of cloud service providers to keep infrastructures and data secure. As more businesses and organizations began migrating to the cloud, it became evident that major cloud service providers were better equipped and more capable than many businesses and organizations when it comes to securing data and networks.

However, years later, questions still remain, and customers of cloud services are still on the hunt to find more improved ways to secure their data. Today, cloud security is becoming less of a disconnected practice and more of an essential element of security and data protection strategies. The cloud is everywhere, and this means that securing cloud applications and data should be among the top priorities of a business or organization.

On the face of it, cloud computing and cybersecurity might seem like extreme opposites. Cloud computing requires businesses and organizations to store their data off-site and cybersecurity requires building virtual walls around an organization, protecting data at all costs. Cloud computing means outsourcing and putting your data in the hands of a third party to keep data and transactions safe. Cybersecurity means keeping everything close, putting faith in employees, and trusting that on-site strategies, procedures, and protocols can get the job done.

As more businesses and organizations move their computing and data to the cloud, we see a mutual relationship between the two practices. As a result, we have been introduced to cloud security, the practice of ensuring cybersecurity when depending on cloud computing. Businesses and organizations are accelerating their use of the cloud, but more organizations should slow down and make sure security is implemented in the very beginning.

Cloud security is turning into the new cybersecurity, but it was not easy to reach this point. There was some distrust that made it a challenge for some IT managers to allow data to be stored and protected on off-site servers. Businesses and organizations that migrate to the cloud and benefit from the cost savings will likely find a great amount of success. This makes cloud computing an essential business strategy and this also makes cloud security necessary. Unfortunately, some businesses and organizations do not have effective cloud security structures. When asked for an opinion on the state of cloud security today, Jorge Rojas said the following: We find that for most clients, cloud security is poor.

Cloud security is at the forefront of all IT leaders today. The new workforce that includes in-office, fully remote, and hybrid has caused many new challenges. The attack vector has grown significantly because of this new working ecosystem. This challenge can be addressed but requires organizations to have a full grasp of their IT systems. Every system or application needs to be managed by single sign-on and two-factor authentication. SASE solutions should be implemented as well. At the end of the day, with cloud solutions, it comes down to managing the endpoint. All corporate devices should be managed with a central identity that can ensure patching and compliance, said Holden Watne of Generation IX, a Los Angeles IT Services company.

The cloud has rapidly become one of the most popular technologies for organizations and individuals thanks to its availability on multiple platforms, ease of use, data storage, and on-demand computing capacity. The prevalence of the cloud is incredible, with more users and organizations making use of it each day. This is also the reason for concerns regarding data privacy and the security of cloud computing services, said Anthony Buonaspina, BSEE, BSCS, CPACC, CEO and Founder of LI Tech Advisors.

Cloud computing can offer an effective security solution. Small to medium-sized businesses are especially vulnerable to cyber threats and cyberattacks such as ransomware because they do not always have the tools and resources that are needed to improve their cybersecurity. Moving to the cloud can improve security because cloud vendors have some of the most effective and strong security in the IT space. The state of cybersecurity will go as small business owners go. Meaning, the more technology innovates and can bring the best solutions down to the small user, the better our economy will be insulated from hackers, said Mike Selah of Advantage Industries.

Cloud vendors understand the part they have to play in cybersecurity, but in the end, if a business or organization's data is compromised, it is the organization that will have to respond to complaints and/or pay fines and penalties. Also, if an organization falls victim to a ransomware attack, the cybercriminals will go after the organization for the ransom. Even when you are using cloud computing, you cannot let your guard down at any point. There are still numerous challenges that organizations face when it comes to cloud security.

The biggest challenge is a lack of evolution. Too many companies will do one thing to improve their cyber security and then remain status quo for years at a time. The tools we implement for our customers today are vastly different than they were just six months ago. This needs to be a budget item. There isn't a single company that would not factor paying for internet access into their monthly budget, yet many companies do exactly that when it comes to cyber. Over 80% of companies with 200 employees or less have never conducted an independent third-party cyber security audit. They don't even know what they don't know. We had a customer return to us last week (they left because we were too expensive) because one partner had their email hacked and the hacker sent out thousands of spoof emails to their database. The less expensive vendor could not regain control of the email for them and the customer feels they must now shut down an email account they have used for over a decade, just to block out the hacker, said Mike Selah of Advantage Industries.

The most common challenges organizations face are data breaches that result in loss or exposure of clients' personal and private information. These types of breaches can put an organization at great risk and face huge expenditures in remediation as well as a ruined reputation. One of the major challenges is security breaches due to employee negligence resulting from a lack of cybersecurity training, said Anthony Buonaspina, BSEE, BSCS, CPACC, CEO and Founder of LI Tech Advisors.

Most times we have to advise clients that the cloud is not cheaper than on-site and that there is management involved, i.e., Microsoft provides the platform, but there is no backup, and you need to manage it. For additional security, there are substantial license, setup and monitoring costs, said Jorge Rojas of Tektonic Inc, a Toronto, Canada IT services company.

The biggest challenge that businesses face is the cost of security versus the protection they receive. Basically, what is the best bang for the buck and what security measures should be invested in over others, added Buonaspina.

From my perspective, the biggest issue with cloud security today is that it is extremely easy to add cloud accounts and share information. Trying to control how many different cloud applications are being used by the enterprise and keeping all of them at the same security level has become almost impossible. My advice to any enterprise today is to limit the number of cloud applications that are being used since most of them do not have the security integration that is required in order to keep your data safe, said Ilan Sredni of Palindrome Consulting, Inc.

There were a number of data breaches in 2020 that were of enormous concern for businesses and organizations. In 2021, there has been a dramatic rise in attacks, typically driven by cybercriminals looking to take advantage of poor or lack of security measures. Aside from the familiarity of data breaches that have affected small and large organizations across the globe, there are many lessons that can be learned about how any business or organization could be the next victim.

When organizations moved from on-prem to cloud, everyone definitely took a step back on the security front. But as the adoption is rapidly growing, so are security and organizations' maturity around it. In 2020, there were so many incidents that came to light revolving around cyber security, like CapitalOne, Marriott, SolarWinds etc, said Ashu Singhal of Orion Networks. A few aspects to keep in mind:

1. No infrastructure is big or small when it comes to cybersecurity.
2. Do not just focus on perimeter security (i.e. firewalls are not enough) but also think about the various domains of security (endpoints, applications, network etc). This is especially important in the cloud as not everything is behind a perimeter anymore.
3. Do not just bank on preventive measures, but also have a plan of action in case an incident does happen, so it's not as disruptive.

According to Gartner's cloud security assessment, by the year 2025, 99% of failures in cloud security will be a result of security issues on the customer side, not the cloud provider side. Some of the most catastrophic security breaches in 2020 were the result of deficient cloud security. This trend will continue as long as the organizations do not improve their defenses, said Buonaspina.

Everyone is vulnerable. A good start is to implement MFA whenever possible, in addition to complex passwords and solid patch management of both OS and applications, said Rojas.

When asked to share a few pieces of advice for organizations in 2021 and moving forward, Rojas shared the following:

Buonaspina shared the following:

The cloud is here to stay, with the explosion of working from home due to the pandemic. Bad actors are aware of this and they are cashing in, said Rojas.

The future of cloud security will be equivalent to building bigger walls and deeper moats. The cloud of the future is going to need to be stronger and more resilient than anything before and fortified with defenses that will be able to keep up with threats in real-time. These types of protections are going to have to be more automated, more discriminating, and sophisticated, incorporating AI and technologies that will allow for high levels of security while maintaining a good user experience that doesn't encumber their access to the information they require, said Buonaspina.

See original here:
The State of Cybersecurity and the Cloud Today - Californianewstimes.com

Read More..

WisdomTree – WisdomTree Cloud Computing Fund (WCLD) falls 0.39% in Light Trading on August 19 – Equities.com

WisdomTree Trust - WisdomTree Cloud Computing Fund (NASDAQ: WCLD) shares fell 0.39%, or $0.22 per share, to close Thursday at $56.39. After opening the day at $56.17, shares of WisdomTree - WisdomTree Cloud Computing Fund fluctuated between $56.97 and $56.17. 179,892 shares traded hands, a decrease from their 30-day average of 239,914. Thursday's activity brought WisdomTree - WisdomTree Cloud Computing Fund's market cap to $1,256,087,250.
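The figures in a line like the one above can be sanity-checked from the dollar change and the closing price. A quick illustration (not part of the original report):

```python
# Reconstruct the day's move for WCLD from the reported numbers.
change = -0.22                          # dollar change on the day
close = 56.39                           # Thursday's closing price
prev_close = close - change             # implied previous close: 56.61

# Percentage change is measured against the previous close.
pct_change = change / prev_close * 100  # about -0.39%

# Market cap divided by price implies the shares outstanding.
market_cap = 1_256_087_250
shares_outstanding = market_cap / close # about 22,275,000 shares
```

Running the numbers this way confirms the reported 0.39% fall is consistent with the $0.22 per-share drop.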

Visit WisdomTree Trust - WisdomTree Cloud Computing Fund's profile for more information.

The Nasdaq Stock Market is a global leader in trading data and services, and equities and options listing. Nasdaq is the world's leading exchange for options volume and is home to the five largest US companies - Apple, Microsoft, Amazon, Alphabet and Facebook.

To get more information on WisdomTree Trust - WisdomTree Cloud Computing Fund and to follow the company's latest updates, you can visit the company's profile page here: WisdomTree Trust - WisdomTree Cloud Computing Fund's Profile. For more news on the financial markets be sure to visit Equities News. Also, don't forget to sign-up for the Daily Fix to receive the best stories to your inbox 5 days a week.

Sources: Chart is provided by TradingView based on 15-minute-delayed prices. All other data is provided by IEX Cloud as of 8:05 pm ET on the day of publication.

DISCLOSURE: The views and opinions expressed in this article are those of the authors, and do not represent the views of equities.com. Readers should not consider statements made by the author as formal recommendations and should consult their financial advisor before making any investment decisions. To read our full disclosure, please go to: http://www.equities.com/disclaimer

Read this article:
WisdomTree - WisdomTree Cloud Computing Fund (WCLD) falls 0.39% in Light Trading on August 19 - Equities.com

Read More..