
How the FT prepared for a world without third-party cookies – The Drum

Permutive won the 'Best Sell Side Innovation' category at The Drum Digital Advertising Awards 2020 for its collaboration with The Financial Times. Here, the team behind the entry reveal the challenges faced and strategies used to deliver this successful project.

The challenge

The Financial Times is one of the world's leading news organisations, recognised internationally for its authority, integrity and accuracy. It topped a million subscribers in 2019 (some 75% of them digital), a year ahead of schedule, and has hefty ambitions moving forward. Operating a split revenue model, the FT does not rely on subscriber revenue alone: it also sells advertising, with branded content playing a larger role in delivering that advertising revenue.

However, with the introduction of privacy-focused laws such as GDPR and browser changes, including anti-tracking measures in Apple Safari and Mozilla Firefox, the FT knew it needed to keep ahead of those changes without losing the ability to target its audiences.

Knowing that Google could also clamp down on cookies (it has since announced that third-party cookies will be phased out by 2022), the FT needed to change the way it operated, and fast. Chrome accounted for just under half of the FT's impressions in 2018-19, so if it continued to rely on third-party cookies, its revenue would have taken a big hit. Google's own figures suggest that without third-party cookies, publisher revenues would drop by an average of 52% on its platform.

FT.com needed a real-time and first-party cookie solution to unlock its valuable audiences and provide clients the scale they demanded.

The strategy

The FT turned to Permutive to combat the following challenges:

Privacy and regulation - third-party data is becoming increasingly redundant as laws and browser changes take effect. The FT was also looking for more efficient ways of responding to GDPR requests.

Browser changes - Apple's ITP (Intelligent Tracking Prevention) was the trigger for the FT team to prepare for a cookie-less world. With every new ITP release, Permutive estimates, publishers experience up to a 60% drop in programmatic revenue on that browser.

Reporting was time consuming - The reporting and analysis on its legacy DMP was too manual, making it time consuming and less effective at informing decision-making.

Workarounds were not working - The FT found that most vendor alternatives were not publisher-driven; they were focused on workarounds to keep third-party cookies functioning, and those workarounds were quickly being eliminated by browser updates.

Audience segmentation - The FT knew that its clients wanted to know more about their audiences: from the segment they were targeting to how a campaign performed on site, as well as what learnings they could take into the next campaign.

The solution

All segments from the FT's existing DMP were recreated in Permutive, and since Permutive does not rely on third-party cookies, the FT can collect, analyse and activate its entire audience across all devices and browsers. Additionally, Permutive is built on edge computing, unlike traditional DMPs built in the cloud. This means that data is processed on the user's device and isn't sent back and forth to cloud servers.

All of the FT's segments were historically built using frequency and recency, but with Permutive it is now starting to build segments based on on-site behaviour, such as total engagement time and, potentially, scroll depth. This will help improve CTR within campaigns and is a focus for 2020. That additional layer is also helping its sales team build a stronger narrative to take to market.
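
Permutive's rule syntax is not published in this case study, so the on-device logic described above can only be pictured with a hypothetical Python sketch: behavioural events such as engagement time and scroll depth are evaluated locally, and only the resulting segment labels would ever need to leave the reader's device. The segment names, event fields and thresholds below are invented for illustration and are not Permutive's actual API.

```python
# Hypothetical sketch of on-device audience segmentation.
# Segment names, event fields and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class PageEvent:
    engaged_seconds: float  # time the reader actively spent on the page
    scroll_depth: float     # fraction of the article scrolled, 0.0 - 1.0
    section: str            # e.g. "markets", "technology"

def evaluate_segments(events: list[PageEvent]) -> set[str]:
    """Evaluate behavioural segments locally, so raw events never leave the device."""
    segments = set()
    total_engagement = sum(e.engaged_seconds for e in events)
    deep_reads = [e for e in events if e.scroll_depth >= 0.75]

    if total_engagement >= 300:   # five minutes of active reading
        segments.add("highly-engaged-reader")
    if len(deep_reads) >= 3:
        segments.add("deep-reader")
    if sum(1 for e in events if e.section == "markets") >= 5:
        segments.add("markets-interest")
    return segments

# Only the segment labels, not the underlying events, would be attached to ad requests.
print(evaluate_segments([PageEvent(120, 0.9, "markets"), PageEvent(200, 0.8, "technology")]))
```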

The FT can now target users in milliseconds to serve relevant audience-targeted advertising and deliver more information to clients about its users and campaign performance. It is also more secure: because data doesn't leave the device, the risk of data leaks is mitigated and GDPR compliance is easier to maintain.

The results

The project increased scale, revenue and privacy compliance for the FT. Adopting Permutive helped from a privacy perspective: the previous DMP used a network-wide domain, meaning audience behaviours could be collated across any publisher. Using the FT.com domain removed any risk of data being pooled.

Permutive allows the FT to target users based on engagement and to learn more about its readers and how they interact with its marketing specifically. It has added another weapon to the FT's commercial arsenal: whereas before it had one or two data sources, it can now unlock a whole other layer, proving who its audience is and demonstrating interests relevant to the client beyond demographic information.

The FT makes the most of the analytics feature within Permutive, having previously relied on manually inputting data into Excel sheets to draw information on audience segments. It can now easily segment users at a granular level using all of the information it is collecting about on-site behaviour. For segmentation and analysis, it can also look back at all historical data, with none of the previous limits. This builds a much more insightful picture of all FT users, making it easy to package audiences for advertisers.

"We looked at other vendors but Permutive stood out because it was a publisher-focused, real-time DMP. It was the right time to be talking about a first-party data system, and once we could see how the technology works on device, rather than in the cloud, it made a lot of sense. We're seeing improvements in scale of inventory as well as revenue, and campaign performance is seeing an uplift, too. It's all helping our sales teams build a much stronger narrative to go to market with." - Paris Luc, digital targeting manager, the Financial Times

This project was highly commended at The Drum Digital Advertising Awards 2020.



Nutanix Clusters takes on-premises Nutanix to AWS – Blocks and Files

Nutanix is ready to announce Nutanix Clusters. This brings the on-premises Nutanix experience to AWS and opens another front in the company's battle with VMware.

Sources close to the company say Nutanix Clusters in AWS (NCA) has been in an early-access test phase for many months and is now robust and ready to move into general availability.

NCA runs on bare metal all-flash servers in AWS and uses AWS networking. Customers spin up servers using their AWS account and deploy Nutanix software on them. This process uses AWS CloudFormation, Amazon's facility to model and provision third-party applications in AWS. On-premises Nutanix licenses can be moved to AWS to instantiate NCA there.
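
Nutanix has not shared the template itself, but provisioning through CloudFormation generally follows the pattern in the boto3 sketch below; the stack name, template URL and parameters are placeholders rather than the actual Nutanix Clusters artefacts.

```python
# Minimal sketch of provisioning infrastructure with AWS CloudFormation via boto3.
# The stack name, template URL and parameter names are placeholders, not the
# actual Nutanix Clusters template.

import boto3

cloudformation = boto3.client("cloudformation", region_name="us-east-1")

response = cloudformation.create_stack(
    StackName="example-nutanix-cluster",                     # placeholder name
    TemplateURL="https://example.com/nutanix-cluster.yaml",  # hypothetical template
    Parameters=[
        {"ParameterKey": "NodeCount", "ParameterValue": "3"},  # illustrative parameter
    ],
    Capabilities=["CAPABILITY_NAMED_IAM"],
)

# Block until the bare-metal instances and supporting resources are created.
waiter = cloudformation.get_waiter("stack_create_complete")
waiter.wait(StackName=response["StackId"])
print("Stack created:", response["StackId"])
```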

VMware uses its ESX hypervisor as an overlay atop AWS networking and this can sap resources and become a performance bottleneck, according to Nutanix sources.

NCA supports four types of AWS instance, including Large CPU, Large Memory and Large Storage. The Nutanix Prism management console can be used to manage NCA.

NCA bursts on-premises Nutanix deployments to AWS to cope with spikes in demand, for example an immediate requirement to add 1,000 virtual desktops. It also has disaster recovery capabilities.

Customers can use NCA to migrate on-premises applications running on Nutanix to AWS. There is no need to re-engineer applications as the software runs the Nutanix environment transparently across the public cloud and on-premises worlds.

When no longer needed, a Nutanix cluster in AWS can be spun down with its data stored in S3, incurring S3 charges only, until spun up again. The spun-up NCA is then rehydrated from the S3 store. We understand this go-to-sleep facility will follow the main NCA release in a few months.

A blog by Nutanix CTO Binny Gill provides more background on NCA, and there is more information in an early-access solution brief.


Samsung Electronics: Defining the Boundaries of Communications – marketscreener.com

Communication is about sharing information with others.

The evolution of communications technology has enabled us to be more connected than ever before, meaning that information can be shared anytime and anywhere.

In mobile communications, a business with a well-established global ecosystem spanning equipment manufacturers to telecommunications operators, common rules are essential to keeping the ecosystem moving forward collaboratively. This is where the process of standardization comes in: it sets internationally agreed-upon standards to give users access to better products and services at lower prices. A representative example demonstrating the benefits of international standardization is the global roaming service, which allows users travelling to foreign countries to use their mobile devices as they are.

Standardization is one of the main driving forces behind the growth of the mobile communication industry, since a new generation has been introduced once every decade. 'Large-scale investments into mobile communication have been triggered when each new generation of communications is commercialized,' explained Dr. Han. 'When certain countries or companies run their businesses with proprietary solutions, the risk of failure increases.' This means that the chance of success can increase only when the stakeholders of the mobile communication ecosystem come together to define the most relevant technologies and discuss aspects like implementation early enough. 'Determining communications standards and developing products following these standards is an equitable process,' noted Dr. Han. 'These standards are crucial.'

Standardization is two-fold: the de jure standards mandated by regulators, and the de facto standards established by the global communications industry which, while not compulsory, specify unified ways of operation for stakeholders around the world to follow. The Standards Research team of Samsung's Advanced Communications Research Center oversees both types of standards.

'For example, in order to utilize the extremely high frequency band (mmWave) for 5G, de jure standardization is a prerequisite for the commercialization of any device using the band, which includes assigning a set of frequency bands to mobile communication, setting regulated conditions such as maximum transmission power and out-of-band emission, and ensuring its safety for the human body and existing devices,' explained Dr. Han. 'We are also simultaneously developing protocol technologies and working on de facto standardization to include these technologies into the standards by participating in standards developing organizations such as 3GPP (3rd Generation Partnership Project) and IEEE (Institute of Electrical and Electronics Engineers).' Dr. Han emphasized that both de jure and de facto standards are equally important.

Working as a Communications Standard Expert

Frequency bands are a limited resource. It is inevitable that different parties will clash over acquiring such an in-demand resource, which is why each frequency band is already allocated to a specific purpose, e.g. fixed communications, mobile communication, broadcasting, satellite, or other uses. The extremely high frequency band adopted for 5G was unexplored territory from the perspective of mobile communication, and when Samsung initially proposed it, there was pushback.

Standards experts are supposed to take the initiative of reserving such new spectrums for the mobile communication industry. 'By stressing mobile communication's contribution to the economy, we managed to persuade the governments of each country, and attracted more supporters by showing them the feasibility of applying this extremely high frequency band to mobile communication,' recalled Dr. Han. 'We actively presented many details to justify our claim, including the simulation results of a coexistence study. As a result, we were able to have this extremely high frequency band assigned to 5G.'

'There is no almighty judge when it comes to fairly determining which technology among many candidates should be selected as a part of the standard. Moreover, any technology has its own pros and cons,' said Dr. Han. 'There is a decision-making process inherent to standardization. Proposals are first made by companies, intensive and technical debate on each proposal then follows, and participants finally build a consensus to reach a conclusion. We have to avoid sticking to our own interests. Instead, we are trying to communicate with other stakeholders to find the best way forward based on an understanding of the industry as a whole. When we take care of the ecosystem, proposals that we develop to make it healthy and sustainable will be supported by the majority as a result.' Similar to diplomats, standardization experts participate in global standardization conferences, where they represent their company or their country. They are expected to be the best in their own field. 'As we are contending at the forefront of these international discussions, technical competitiveness is the key requirement for Samsung delegates,' explained Dr. Han. 'Therefore, in our projects, anyone who is most competitive in a certain area is designated as the champion of that area, regardless of which team he or she belongs to.'

Standardization, the Next Phase of 5G

4G is a communications technology designed to enable wireless broadband service for smartphones. In particular, 4G as a universal communications platform aggressively adopted the Internet Protocol that was widely used in wired packet communications. Therefore, many Internet-based services could easily migrate to cellular systems. 5G, in turn, is designed to expand its territory from broadband service for smartphone users to vertical markets including smart factories, automobiles, healthcare, private networks, smart cities, and more. 4G as a universal solution led to huge growth of the communications market; 5G, on the other hand, aims to create new markets based on its new design principle of customizable networks that fulfil the specific requirements of a particular industry sector.

To realize the innovations that 5G has promised, Dr. Han and his team have been working on Rel-16, the second version of 5G. 'Rel-15, the first version of 5G, laid a new framework for the technology and focused on how to provide differentiated experiences to conventional customers, i.e. smartphone users,' noted Dr. Han. 'We joined the global collaboration to develop Rel-16 in order to realize the 5G vision. Rel-16 introduces and enhances 5G's features for vertical markets. For example, V2X (vehicle-to-everything) is for connected cars, industrial IoT communication is for smart factories, and the data analytics function has been improved for network AI.'

Even though 5G has been commercialized, the standardization of 5G for further enhancements will never stop. Until the launch of 6G, the 5G standard will continuously evolve in order to improve and expand 5G. 'As soon as we concluded the development of 5G's second version, we immediately began work on the third version, Rel-17,' commented Dr. Han. 'We have identified some areas in which to improve commercial 5G networks, including coverage expansion and NR-MIMO (Multiple Input, Multiple Output). These will be amended and enhanced in the upcoming versions. Furthermore, we will continue to discover new features to add in order to enable new 5G applications. Innovations we are looking at include media delivery for AR glasses-type devices and edge computing enablers for low latency services from cloud servers close to users.'

Standardization of Edge Computing, Further Enhancement for 5G Services

Samsung is constantly pushing the boundaries of 5G in order to bring its unique experiences to users. One key characteristic of 5G is its ultra-low latency, brought about by its nine-tenths latency reduction in the radio access link between terminal and base station as compared to the previous generation. In order for users to experience the quality of ultra-low latency services, the end-to-end latency between the user terminal and the cloud server should be reduced. Samsung believes that edge computing will solve the rest of this puzzle, this being latency reduction in the backbone network, by placing the server closer to users. Thanks to 5G and edge computing, users will finally be able to enjoy 5G's signature service on their devices.

'The link between a device and its server was out of 3GPP's scope,' said Dr. Han. 'But it is also hard for other standards organizations who are not experts in 5G to develop the standard for edge computing without a complete understanding of 5G systems.' Due to this difficulty, attempts were made to develop edge computing-enabled communication using proprietary solutions - which would lead to serious market fragmentation. 'Samsung initiated discussions on edge computing inside 3GPP and persuaded other participating companies. We are now leading the standardization effort for enabling edge computing in 5G systems as one of the key items of Rel-17.'

In 2009, Samsung began the early stages of 5G research with the question of 'how can we improve cellular networks to be 10 times better than 4G LTE?' Samsung will continue to develop further enhanced technologies for the future of 5G. 'Samsung plays various key roles in the influential standardization organizations for mobile communications and leads those standards and related technologies,' explained Dr. Han. 'Based on our perseverance for over 10 years in this field, we will overcome whatever obstacles we encounter and will make 5G a big success.'

Making a Better World - Through Technology

Dr. Han began working in this field because, when he was a student, he was extremely curious about who made standard specifications, the ground rules that were akin to a communications bible. Today, he is leading the team shaping the future of communications with standards. What resolution has he set?

'When we worked on LTE standards, we did not even expect that the term 'LTE', back then only used by a select group of standards engineers, would become a common and popular term,' noted Dr. Han. 'This experience reminded me that the technologies we create can change the world and the daily lives of people. We are also aware of the high expectations for the 5G we have developed. I firmly believe that our work will benefit the world.'

Dr. Han is also working on promoting Samsung's 6G vision to inspire people in this field. 'In the future, the main customers in the communications market won't just be human, but will include robots and other machines, too,' explained Dr. Han. 'People will start to enjoy hyper-connected experiences and be able to explore reality in a virtual world without temporal or spatial constraints. 6G will present fundamental technologies for such innovations. We will begin communicating with stakeholders as per Samsung's 6G White Paper, published on July 14. Our 5G experience and the insights captured in our 6G vision will help us prepare for the long journey toward another success story with 6G.'

'Moreover, the sustainable growth of society and the communications industry will be key considerations for shaping 6G.'



This Week in Storage, featuring Qumulo, Actifio and more – Blocks and Files

This week, AWS has made Qumulo filer software available in the AWS government cloud; Actifio backs up SAP HANA to object storage in GCP; and LucidLink is doing cloud collab stuff with Adobe Premiere Pro.

Scalable file system supplier Qumulo has announced its availability in the AWS GovCloud (US) through the AWS Marketplace.

Qumulo says Government organisations can now integrate their file data with legacy applications in private cloud and cloud-native applications in AWS GovCloud (US) with a single file data platform.

The company is working with Corsec Security Inc. to gain various US government certifications for its software. The company said it aims to make Qumulo the strategic choice for all types of Controlled Unclassified Information (CUI) and unclassified file data, and it is pursuing FIPS 140-2 and Common Criteria EAL2+ certifications for its platform.

NetApp, a Qumulo competitor, this week announced Azure NetApp Files is in the Azure government cloud.

Copy data manager Actifio is waving a tech validation report from ESG that says it reduced backup and disaster recovery (DR) infrastructure costs by 86 per cent when protecting SAP HANA workloads with Google Cloud Platform (GCP) object storage. The comparison is with legacy backup approaches using high-performance block storage.

ESG found the same high levels of performance from a DR copy running off Google Nearline object storage as from production instances running on Google Persistent Disk block storage.

ESG Senior Validation Analyst Tony Palmer said: "Cloud object storage is typically 10x less expensive than cloud SSD/flash block storage. Actifio's ability to recover an SAP HANA database in just minutes from cloud object storage, while delivering the I/O performance of SSD/flash block storage, is very unique in the industry and reduces cloud infrastructure costs by more than 80 per cent for enterprises."

You can download the ESG Actifio SAP HANA Technology Review.

LucidLink, which supplies accelerated cloud-native file access software, is partnering with Adobe Premiere Pro so its users can edit projects directly from the cloud.

Generally, Adobe Premiere Pro video editing software users edit local files because access is fast. However, team working and remote team working require multi-user access to remote files. LucidLink's Filespaces can provide teams with on-demand access to media assets in the cloud, accessed as if they were on a local drive.

Sue Skidmore, head of partner relations for Adobe Video, said: "With so many creative teams working remotely, the ability to edit Premiere Pro projects directly from the cloud has become even more important. We don't want location to hold back creativity. Now Premiere users can collaborate no matter where they are."

Filespaces provides a centralised repository with unlimited access to media assets from any point in existing workflows. The pandemic has encouraged remote working. Peter Thompson, LucidLink co-founder and CEO, provided a second canned quote: "Our customers report they can implement workflows previously considered impossible. We are providing the missing link in cloud workflows with streaming files."

Actifio has announced technical validation and support for Oracle Cloud VMware Solution (OCVS), Oracle's new dedicated, cloud-native VMware-based environment. OCVS enables enterprises to move their production VMware workloads to Oracle Cloud Infrastructure, with the identical experience in the cloud as in on-premises data centres. It integrates with Oracle's second-generation cloud infrastructure. OCVS is available now in all public regions and in customer Dedicated Region cloud instances.

Taiwan-based Chenbro has announced the RB23712, a Level 6, 2U rackmount server barebone (no CPUs or fitted drives) with 12 drive bays, designed for storage-focused applications in data centre and HPC enterprise environments. It pre-integrates an Intel Server Board S2600WFTR with support for up to two 2nd Generation Xeon Scalable processors. The RB23712 offers Apache Pass, IPMI 2.0 and Redfish compliance, and includes Intel RSTe/Intel VROC options.

Microchip Technology has introduced the latest member of the Flashtec family, the Flashtec NVMe 3108 PCIe Gen 4 enterprise SSD controller with 8 channels. It complements the 16-channel Flashtec NVMe 3016 and provides a full suite of PCIe Gen 4 NVMe SSD functions. The 3108 is intended for use by M.2 and the SNIA Enterprise and Data Center SSD Form Factor (EDSFF) E1.S drives.

Nutanix says it has passed 2,500 customers for Nutanix Files. Files is part of a Nutanix suite for structured and unstructured data management, which includes Nutanix Objects, delivering S3-compatible object storage, and Nutanix Volumes for scale-out block storage.

Penguin Computing has become a High Performance Computing (HPC) sector reseller and solution provider of Pavilion Hyperparallel Flash Arrays (HFA).

Quantum has announced its ActiveScale S3-compatible object store software has been verified as a Veeam Ready Object Solution.

Synology has launched new all-flash storage and a line of enterprise SSDs. The FS3600 storage system is the newest member of Synology's expanding FlashStation family of network-attached storage (NAS) servers. Synology has also announced the release of SAT5200 SATA SSDs and SNV3400 and SNV3500 NVMe SSDs.

The FS3600 features a 12-core Xeon, up to 72 drives, and 56GbitE support. The new SSDs can fit in its enclosure and have 5-year warranties. They integrate with Synology DiskStation Manager (DSM) for lifespan prediction based on actual workloads.

Data replicator and migrator WANdisco said it is the first independent software vendor to achieve AWS Competency Status in data migration.

Zerto is reprinting a short Gartner report: "What I&O leaders need to know about Disaster Recovery to the cloud". The report assumes that by 2023, at least 50 per cent of commodity-server workloads still hosted in the data centre will use public cloud for disaster recovery. It's an eight-minute read and you can get it with minimal registration.


10 billion records sit in unsecured databases – China leads the pack – SecurityBrief New Zealand

China, the United States, India, Germany, and Singapore are the top five countries with the most unsecured databases in the world, or at least that's according to new research from NordVPN.

The security firm partnered with a white hat hacker to scan the internet for unsecured Elasticsearch and MongoDB databases over the space of one year.

The hacker uncovered a total of 9517 unsecured databases, collectively containing more than 10 billion entries (10,463,315,645, to be exact) with data such as emails, passwords, phone numbers, and other sensitive information.

China topped the list with 3794 exposed databases, containing a collective total of more than 2.6 billion (2,629,383,174) entries.

The United States wasn't too far behind, with 2703 exposed databases and 2.4 billion (2,397,583,255) entries.

India had 520 exposed databases with 4.9 million entries; Germany had 361 exposed databases with 248 million entries; Singapore had 355 exposed databases with 2.3 million entries.

Rounding out the top 10 countries with the most exposed databases are France, South Africa, the Netherlands, Russia, and the United Kingdom.

Other countries included South Korea, Ireland, Vietnam, Hong Kong, Brazil, Japan, Canada, Iran, Australia, and Taiwan.

NordVPN warns that although some of the exposed entries could be junk and only used for the purposes of testing, it could be hugely damaging if sensitive information were exposed.

NordVPN points to recent data leaks including a case where 540 million Facebook records were exposed on Amazon cloud servers.

Furthermore, search engines such as Shodan and Censys scan the internet constantly, enabling people to gain access to open databases. NordPass security expert Chad Hammond says anyone could scan the internet in as little as 40 minutes.
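
To illustrate how little effort such discovery takes, the hypothetical Python sketch below probes a single address to see whether an Elasticsearch node answers on its default REST port without credentials. The IP address is a documentation-range placeholder, and probing systems you do not own or have permission to test is illegal in most jurisdictions.

```python
# Illustrative check for an unauthenticated Elasticsearch endpoint.
# The host is a placeholder; only probe systems you own or are authorized to test.

import requests

def is_unsecured_elasticsearch(host: str, port: int = 9200) -> bool:
    """Return True if the node answers its default REST port without credentials."""
    try:
        resp = requests.get(f"http://{host}:{port}/", timeout=5)
        # An open node replies 200 with cluster metadata; a secured one returns 401/403.
        return resp.status_code == 200 and "cluster_name" in resp.json()
    except (requests.RequestException, ValueError):
        return False

if is_unsecured_elasticsearch("203.0.113.10"):  # placeholder address
    print("Endpoint responds without authentication - data is exposed.")
```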

Security threats, such as automated Meow attacks that destroy data without reason or ransom, also place unsecured databases at more risk.

Hammond says, "Every company, entity, or developer should make sure they never leave any database exposed, as this is obviously a huge threat to user data."

He adds that database protection should include data encryption at rest and in motion, identity management, and vulnerability management.

All should be encrypted using trusted and robust algorithms instead of custom or random methods. It's also important to select appropriate key lengths to protect your system from attacks.

Identity management is another important step and should be used to ensure that only the relevant people in an enterprise have access to technological resources.

"Finally, every company should have a local security team responsible for vulnerability management and able to detect any vulnerabilities early on," he concludes.


Finding the Right and Secured Video Platforms for your Business – Security Boulevard

Life is not easy for internet-based OTT service providers, whether they are pay-TV cable operators or new internet players. Users have become used to having all of their entertainment sources on all of their devices all the time, from e-books to digital music, and they expect no compromise when it comes to video. At the same time, entertainment studios increasingly expect their content to be protected from any illegal use.

Generally, the technological complexity of building, maintaining and streamlining these multiscreen OTT services is not going down. OTT players require a variety of capabilities, including video streaming, data protection, application support and other technical infrastructure. However, no single platform stacked with all of these competencies has emerged for OTT operators to depend on to create services that are accessible, interoperable and automated.

Digital Rights Management (DRM) is a digital licensing system that allows content administrators to control how, and by whom, content is consumed.

DRM is often confused with encryption. Encryption is the method of scrambling digital information, while DRM is the comprehensive process of managing content access. It includes the distribution of locking and unlocking keys and back-end licensing systems with features such as policy enforcement and downloaded-playback control.

Content owners need recognised commercial DRM systems to safeguard their content. In order to get access to any kind of content from content authors, broadcasters, OTT operators or network distributors, there is an obligation to use one of a few approved DRM systems.

Hypertext Transfer Protocol Secure (HTTPS) is a method of securing live streaming and video communication over the internet. Netscape initially developed it to secure online traffic using the Secure Sockets Layer (SSL); since then, Transport Layer Security (TLS) support has been added. HTTPS is not specifically tied to video streaming; however, it has become customary to use it for HTTP applications and thus for HTTP-based video streaming. Now let us see how HTTPS works in OTT.

Recently, HTTPS has become more commonly used for streaming. Major video streaming players such as Facebook and Netflix require HTTPS for streaming videos on their platforms. When online traffic is sent in the clear, meaning it is streamed over unsecured HTTP, the metadata for the video streaming session is at risk: anyone can capture data about the browsing session, such as video titles and user ID details. At a higher level, anyone could record and study Netflix traffic to see which content titles are being streamed most, and by whom.

With HTTPS, transactions and metadata exchanged between OTT streaming platforms and users are safeguarded by establishing a secure channel between the two. Hence, HTTPS ensures the confidentiality of users and their video streaming history.

AES (Advanced Encryption Standard) helps protect video content both at rest and in transit. It is a symmetric block cipher that can be applied in software, hardware or other processes to encrypt sensitive content. AES is the successor to DES (the Data Encryption Standard), which was developed in the early 1970s by IBM.

Content protection with AES works much as described for HTTPS: AES encrypts the content in such a way that the user needs special keys to unlock it when requesting access over HTTPS.

In a nutshell, AES encrypts the video stream in such a way that it becomes nearly impossible for fraudsters to steal confidential data from your account; even if they could intercept the video sessions, they still could not watch the videos.
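
The article does not name a specific cipher mode or library, so as one concrete example the Python sketch below uses the cryptography package's AES-GCM implementation with a 256-bit key to encrypt and decrypt a video segment; key storage and delivery, which is the DRM layer's job, is deliberately left out.

```python
# Minimal AES-256-GCM example using the `cryptography` package.
# Key management and delivery (the DRM layer) are deliberately out of scope.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # held by the licensing backend
aesgcm = AESGCM(key)

video_chunk = b"... segment bytes ..."
nonce = os.urandom(12)                      # unique per encrypted segment

ciphertext = aesgcm.encrypt(nonce, video_chunk, None)

# A player that has been issued the key can decrypt; anyone else sees only noise.
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == video_chunk
```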

Essentially, there are three levels of access to videos: openly obtainable videos, membership videos, and protected videos that require special authorization.

In general, any user who has access to a streaming network can view both obtainable and membership videos, depending on the type of access granted to them.

There are exceptions: sometimes a user can access a protected video library but is unable to watch the videos, because they have not been authorized to access the protected content. To gain that access, users need to hold a special key, which is sent to them by email after they place a special request (for example, to upgrade their membership), or by post in some cases.

To obtain this special access, users need to be verified and validated with ID proof and payment card details, ensuring complete user authenticity. These details are stored on servers against each user ID, and a full activity log is maintained with a clear record of their access level. Every time a member logs into their account, a security check is performed to cross-verify these details.
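
The exact key mechanism is not described here, but one common way to implement such a special key is an HMAC-signed token that encodes the user ID and access level, which the server can verify on every login. The sketch below is a hypothetical Python illustration; the secret, field names and access levels are placeholders.

```python
# Hypothetical sketch of issuing and verifying a signed access token.
# The secret, field names and access levels are placeholders for illustration.

import hashlib
import hmac

SECRET = b"server-side-secret"  # placeholder; keep out of source control in practice

def issue_key(user_id: str, access_level: str) -> str:
    payload = f"{user_id}:{access_level}"
    signature = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{signature}"   # e.g. emailed to the user after an upgrade

def verify_key(token: str):
    user_id, access_level, signature = token.rsplit(":", maxsplit=2)
    expected = hmac.new(SECRET, f"{user_id}:{access_level}".encode(), hashlib.sha256).hexdigest()
    return (user_id, access_level) if hmac.compare_digest(signature, expected) else None

token = issue_key("reader42", "premium")
print(verify_key(token))   # ('reader42', 'premium') if the token is untampered
```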

With the arrival of HTML5 video, the player can be written in HTML5/JavaScript and run directly in the browser rather than as a separate application, as long as the browser can connect to one or more DRM systems. With HTML5, there is no longer a requirement to depend on either a platform's built-in player code or third-party player stacks that operate independently from the browser.

A small number of videos have limited viewing rights across the network. With IP-based location restrictions you can prevent your video content from being watched by users in geographies other than the ones that are allowed, which shields your content from global video piracy. Applying geographical restrictions to your video sessions protects them from being downloaded and watched from far-off locations; if that does happen, you have the option to blacklist the whole location to cut off access. Relying on blacklisting alone is not recommended, but it does provide a second layer of protection.
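
A geo-restriction check of this kind typically runs before a playback URL or token is issued. The Python sketch below is a hypothetical illustration: the country lookup is stubbed out, since in practice the viewer's country would come from a GeoIP database or a CDN-supplied header.

```python
# Hypothetical geo-restriction check performed before issuing a playback URL.
# The country lookup is stubbed; real deployments read it from a GeoIP database
# or a CDN-provided header carrying the viewer's country code.

ALLOWED_COUNTRIES = {"GB", "IE"}       # illustrative allowlist for a given title
BLACKLISTED_COUNTRIES = {"XX"}         # placeholder for explicitly blocked regions

def lookup_country(ip_address: str) -> str:
    """Stub: map an IP to an ISO country code (e.g. via a GeoIP database)."""
    return {"198.51.100.7": "GB", "192.0.2.44": "US"}.get(ip_address, "ZZ")

def may_stream(ip_address: str) -> bool:
    country = lookup_country(ip_address)
    if country in BLACKLISTED_COUNTRIES:
        return False
    return country in ALLOWED_COUNTRIES

print(may_stream("198.51.100.7"))   # True  - allowed region
print(may_stream("192.0.2.44"))     # False - outside the allowlist
```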

Every video uploaded through a live streaming service is stored in a data centre administered by a Content Delivery Network (CDN). A CDN is a decentralized network of cloud servers that uses software-based procedures to stream videos globally. It minimizes the chances of encountering shaky video, buffering issues and content delays, and it helps protect your video streams from online attacks such as DDoS attacks.


How to Create an Infrastructure for a Remote-Ready School – EdTech Magazine: Focus on Higher Education

When stay-at-home orders and physical distancing were implemented earlier this year, many schools scrambled to transition to remote learning. But not the Academy of Our Lady of Peace, where I serve as technology director.

The academy, the oldest high school in San Diego and the only all-girls school south of Los Angeles, already had technology in place that enabled us to seamlessly pivot to online instruction. We had zero downtime, and our 750 students didn't miss a single day of learning.

One reason for our success was that we were already fully in a cloud environment. We began the transition to cloud-based solutions a few years ago when our Microsoft Exchange Server reached its end-of-life status. Instead of automatically investing in a new onsite server, we moved forward with G Suite for Education because of its intuitive interface and robust back-end administrative capabilities.



Making the Cloud and Data Center Work Together Effectively – Data Center Frontier

Enterprise customers are weighing the best options for taking advantage of cloud computing models. (Photo: Rich Miller)

A new special report series explores hybrid IT and the colocation industry, and this entry highlights the impacts and evolving relationship between the cloud and the data center.

During the 2019 Data Center World conference, there was a particular question and concern around the impact of cloud on the enterprise data center. Although the cloud will continue to play a critical role in how we deliver core applications and services, it will not replace data center solutions. This is evident in the types of investments that major cloud providers are making so that their solutions run within your on-premises data center. This is the cloud telling all of us that data locality, application performance, and working with local resources are still significant.

In the most recent AFCOM State of the Data Center report, we saw a significant trend in how organizations are leveraging cloud solutions. Trends are showing that cloud now has a broader meaning that goes beyond just public cloud solutions.

Three in four respondents (72%) report noticing a trend of organizations moving away from public cloud and looking to colocation or private data centers. As mentioned earlier, the definition of private and hybrid cloud is becoming increasingly blurred as major cloud providers (AWS Outposts, for example) are offering their native solutions directly on-premises at a data center site. Currently, 52% of respondents have implemented some type of private cloud solution, and 48% are leveraging some sort of public cloud solution.

The cloud trends with the most impact on respondent organizations include IoT (Internet of Things) growth resulting in more big data (47%), data center operations management (DCOM) tools (42%), and integration with AI, data-driven services, and machine learning (39%).

All of this translates to a better understanding of cloud, and where Hybrid IT makes sense.

With a greater understanding of cloud computing, there will also be better integration around Hybrid IT. It's important to examine where cloud and Hybrid IT join forces to make a difference:

In a Hybrid IT scenario, you can leverage cloud-like delivery models to accomplish data security.

In a Hybrid IT environment, enterprises can continue to get value out of their existing infrastructure (sometimes legacy) until a technology or business event makes it worthwhile or necessary to replace it with a cloud-based alternative. This can include significant hardware or software upgrades, the need to decommission or consolidate a part of a data center, a fundamental change in business processes, or even a merger and acquisition.

If you're working with a capable data center partner and you have a good Hybrid IT strategy in place, allowing some of those systems to continue to operate while still being economically feasible can make all the sense in the world. Hybrid IT can act as your gateway into new and emerging technologies by allowing you to adopt those systems at your own pace. And there are significant benefits to making this happen. This includes:


To create a Hybrid IT approach, you have to take a step back and understand how it applies to your business. And you'll need to understand:

To get started, many organizations are turning to providers of retail colocation data centers, hosted colocation data centers, and cloud-based facilities. Most of all, they're turning to partners that are both cloud and edge-ready. To that extent, here's what you need to know to develop a Hybrid IT mindset.

Catch up on the first entry in the special report series, and over the next few weeks, we will also explore the following topics:

Download the full report, "Hybrid IT Supporting Critical Initiatives During a Journey to Digital Modernization", to explore further how hybrid computing is fueling the data center industry.


What is cloud computing? – IMC Grupo

Definition of cloud computing

Cloud computing is a type of service which can be simply described as a way of leasing IT infrastructure without purchasing any hardware. External servers with very high capacity allow users to individually choose the computing power they need: processors, memory, disk space or internet bandwidth. Such a solution makes it possible for everyone to store any number of files and any amount of data, matched accurately to their needs.
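
As one provider-specific illustration of choosing computing power on demand, the boto3 sketch below launches a single virtual machine on AWS with a selected instance type and disk size; other cloud providers expose equivalent APIs, and the machine image ID shown here is a placeholder.

```python
# Illustration of leasing compute on demand: launch one virtual machine with a
# chosen CPU/memory profile and disk size. The AMI ID is a placeholder.

import boto3

ec2 = boto3.client("ec2", region_name="eu-central-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t3.medium",          # picks the CPU/memory profile
    MinCount=1,
    MaxCount=1,
    BlockDeviceMappings=[{
        "DeviceName": "/dev/xvda",
        "Ebs": {"VolumeSize": 100, "VolumeType": "gp3"},   # 100 GB disk
    }],
)

print("Launched instance:", response["Instances"][0]["InstanceId"])
```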

As this technology reduces an entrepreneur's need to deal with advanced IT infrastructure, it is especially beneficial for companies, but private users can also find it worthwhile. Cloud computing providers, such as CloudFerro, handle the operation and management of customers' data centers, as well as integrating applications and securing the collected data.

The use of cloud computing brings many benefits to all enterprises, organizations and institutions. Most importantly, it leads to a significant reduction in costs: customers no longer have to bear the expense of purchasing or maintaining essential hardware and software. Companies which decide to implement this solution in their IT systems do not have to plan the capacity of their resources in advance; the offer is flexible and can be freely changed and extended whenever necessary.

What is more, cloud computing improves an organization's efficiency, as it makes it possible to use all files, applications and programs on various devices at all times. Additionally, thanks to this technology, making and recovering backups becomes much simpler and more effective, which reduces the risk of data loss.

Cloud computing is undoubtedly the future of IT resources. It is a simple solution that will certainly increase your company's efficiency. Therefore, to make managing IT infrastructure easier, look into this type of service and check the offer of your local provider.


3 Top Cloud Computing Stocks to Buy in August – The Motley Fool

Cloud computing has become one of the hottest markets for tech stocks, as many companies have moved their focus from hardware-based products to cloud-based services. This shift in the industry has not only created a lot of new, fast-growing companies that focus solely on cloud services, but it has also helped reinvigorate older tech companies.

Of course, not all cloud companies are experiencing the same growth, and finding the right ones for your stock portfolio with great long-term potential can be an overwhelming task. That's why a few Motley Fool contributors have compiled this list of top cloud computing stocks for you to buy right now. Read on to find out why Twilio (NYSE:TWLO), Microsoft (NASDAQ:MSFT), and Amazon (NASDAQ:AMZN) made the cut.


Brian Withers (Twilio): Twilio might not be a household name, but you've probably experienced its software and not even realized it. You might have received a phone call from your Lyft driver, an SMS text from Airbnb providing updates on your booking status, or a phone call to confirm your concert tickets are still available to sell on StubHub's marketplace. All of these events were powered by Twilio's cloud platform that helps companies integrate communications capabilities into their existing applications.

Founder and CEO Jeff Lawson said that before Twilio, the process to build software-driven messaging capabilities was fragile, slow, and expensive. Companies would have to connect to a network provider, set up a communications data center, adapt existing applications by writing custom code, and likely hire high-priced consultants to integrate it all together. Once it was built, the maintenance and scaling headaches were just the beginning for this complex setup. Twilio has simplified all of this. Software developers can access its powerful communications Super Network of 25 cloud data centers in nine different regions with simple application programming interfaces that can be embedded into an organization's existing software applications.
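
As a sense of how small that integration surface is, here is a minimal example using Twilio's official Python helper library to send an SMS; the account credentials and phone numbers are placeholders.

```python
# Minimal example of embedding messaging with Twilio's Python helper library.
# Account SID, auth token and phone numbers are placeholders.

from twilio.rest import Client

client = Client("ACXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX", "your_auth_token")

message = client.messages.create(
    body="Your driver is arriving now.",
    from_="+15005550006",   # a Twilio-provisioned number (placeholder)
    to="+15005550009",      # the recipient (placeholder)
)

print("Queued message SID:", message.sid)
```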

Twilio makes most of its money (75%) by taking a tiny cut of every text, email, phone call, and video message on its network. The remaining 25% of revenue is driven by large customers who contract to pay subscription fees for unlimited usage of Twilio's products. This usage-based model powered a 51% compound annual growth rate over the last three fiscal years (not including the SendGrid acquisition) and has continued to drive strong growth of 57% and 46% for Q1 and Q2 of this year. Twilio also sports enviable net dollar retention rates equal to or exceeding 125% for the last nine quarters.

But the company's growth is just getting started. With businesses scrambling to deal with the impacts of social distancing, many have accelerated their digital transformation plans. Whether it's a call center associate now working from home, a bank teller interacting with a customer remotely, or a contactless delivery status update, these communication use cases play to Twilio's strengths. What's even more exciting for investors is that these types of customer communications have become must-have capabilities for the brands we use every day.

Although the coronavirus pandemic has been a tailwind for accelerated digital transformation efforts, it's also been a headwind for its travel, transportation, and hospitality customers. As a result of the ongoing uncertainty, management is only projecting its outlook into the next quarter. With third quarter guidance of slightly slower revenue growth of 36% to 38% and a bottom-line loss (after posting a small non-GAAP profit this quarter), the stock took a small step back after the earnings announcement.

Even though its price-to-sales ratio is a lofty 28, the solid long-term prospects for this cloud computing stock make it a compelling buy.


Danny Vena (Microsoft): There's little doubt that Amazon is the undisputed leader in the realm of cloud computing, but biggest doesn't always mean best. For example, many traditional retailers that compete with Amazon are reluctant to line the digital pockets of their biggest rival. For many of them, Microsoft's Azure is a better cloud computing choice.

That's not all. For companies that are longtime users of Office, Microsoft 365, or Dynamics 365 that are already locked into Microsoft's ecosystem, it only makes sense to aggregate many of their services with the same provider.

Microsoft only entered the race for cloud dominance in the past several years, and has bypassed many of the would-be contenders, now trailing just Amazon Web Services (AWS), according to research company Gartner (NYSE: IT) and its much vaunted Magic Quadrant.

Azure has a consistent track record of growing faster than AWS in recent years, and that continued in the quarter ended June 30, 2020. Azure grew 47% year over year during the quarter, while AWS grew 29%. Microsoft noted in its most recent quarterly report that its commercial cloud surpassed $50 billion in revenue for the first time during the trailing-12-month period. For context, AWS reported $40 billion in net sales. Since neither company provides a detailed accounting of what's included in each segment, this isn't an apples-to-apples comparison, but it does show that Microsoft continues to gain ground on its larger rival.

The platform's faster growth isn't the only reason to buy Microsoft stock now. The company has proven to be particularly resistant to the challenges facing many businesses during the pandemic. The company's more personal computing segment, which was expected to be most vulnerable, turned in a stellar performance, getting a boost from its Surface line of notebooks and tablets and from Xbox content and services, which grew an impressive 28% and 65%, respectively.

The productivity and business processes segment also turned in a better-than-expected performance, the result of higher demand due to remote work.

Given the uncertainty wrought by the pandemic and Microsoft's strength across its operating segments, there's never been a better time to add the tech giant's stock to your portfolio.


Chris Neiger (Amazon): Amazon is well-known for its e-commerce dominance, but it's the company's cloud computing segment, Amazon Web Services (AWS), that actually generates most of the company's profits. AWS offers cloud services for data storage, networking, artificial intelligence, and much more -- and it's a huge business for Amazon.

AWS's operating profit was $3.4 billion in the most recent quarter, with impressive operating margins of 31%. The segment's profit easily outpaced the $2.1 billion in operating profit from Amazon's North American e-commerce business.

And not only is AWS a key source of profit for Amazon, but it's also the undisputed leader in the cloud computing infrastructure market. AWS has 33% market share right now, compared to next-in-line Microsoft with 18%.

AWS continues to expand its sales as well, with revenue jumping 29% in the most recent quarter to $10.8 billion. The good news for Amazon is that the cloud computing infrastructure as a service (IaaS) market isn't done growing yet. IaaS will grow from an estimated $50 billion this year to $81 billion in 2022, according to the research firm Gartner.

AWS's dominance in cloud computing, combined with its profitability for Amazon, can't be overstated. As more companies look to cloud computing platforms to host their services, Amazon will surely benefit. The coronavirus has forced more businesses to expand work-from-home services and increase e-commerce sales, and AWS will benefit by being the go-to cloud service for those hosting needs.

Gartner said in a recent press release, "For the remainder of 2020, organizations that expand remote work functionality will prioritize collaboration software, mobile device management, distance learning educational solutions and security, as well as the infrastructure to scale to support increased capacity." As more companies look to the cloud to expand these services, they'll likely rely on Amazon's leading cloud infrastructure service to do so.
