How Technology Is Revolutionizing the Unorganized Parking Sector in India and the Road Ahead – News18

Mobile and cloud-based services have simplified our lives to the point where we can't imagine living without them. But the parking industry has not kept pace with these services and consumer expectations. While traditional parking methods may have worked in earlier times, we now need more than them to simplify parking and squeeze more capacity out of the space available.

In the modern world, where the automobile and mobility industry is evolving at a rapid pace, the urban world needs future-proof technology to enhance the parking experience. Here's a look at the cutting-edge developments made by pioneers in smart parking technology:

Smart Sensors- Tools like UV sensors, geomagnetic sensors, and radar-based sensors help track vehicles inside a parking facility, making it viable to monitor the occupancy of each parking slot. The sensors are connected to a cloud server and to mobile apps that show the real-time status of the parking to consumers and parking managers (a minimal sketch of this sensor-to-cloud flow follows this list).

ANPR Cameras- Smart cameras that scan number plates provide secure, fast access control for the parking lot. The cameras process and transfer information to servers to keep track of vehicle identification, eliminating unauthorized entry of vehicles into the parking facility and automating the ticketing and revenue collection operation.

Mobile Apps- Mobile apps for parking have brought a revolutionary change to the industry. Real-time apps help users avoid long queues and find parking easily, and they eliminate the cash counter by offering cashless payment options. Users can check the status and relevant details of a parking facility, make reservations, and avail themselves of value-added services.

Automated Parking Systems- These systems use compact parking spaces to maximize capacity, converting garages into multiple levels of parking where vehicles are stacked vertically. The mechanical system minimizes land usage and increases the utility of the area.
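
To make the sensor-to-cloud flow mentioned under Smart Sensors concrete, here is a minimal, hypothetical sketch of a slot sensor reporting its occupancy state to a cloud API; the endpoint, payload fields, and API key are illustrative assumptions rather than any particular vendor's interface.

```python
import time
import requests  # pip install requests

# Hypothetical cloud endpoint and credentials, for illustration only.
CLOUD_API = "https://api.example-parking-cloud.com/v1/occupancy"
API_KEY = "demo-key"

def report_slot_status(lot_id: str, slot_id: str, occupied: bool) -> None:
    """Push one occupancy reading so apps and dashboards can show it in real time."""
    payload = {
        "lot_id": lot_id,
        "slot_id": slot_id,
        "occupied": occupied,
        "timestamp": int(time.time()),
    }
    resp = requests.post(
        CLOUD_API,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=5,
    )
    resp.raise_for_status()

# Example reading from a geomagnetic or radar sensor driver (value assumed).
report_slot_status(lot_id="lot-5", slot_id="B-17", occupied=True)
```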

Creating more parking spaces alone is not going to solve the problem; we need a stable vision of parking that manages the relationship between supply and demand. In realizing that vision, smart parking technology will play a crucial role. Here's how technology contributes to managing an unorganized parking sector:

Real-time availability- The latest mobile apps and web apps are making it convenient for the user to check the real-time availability of the parking spaces, directly saving a lot of time and reducing traffic congestion.

Tracking- With evolving technology, the lives of both the administrator and the users of the parking are simplified. The administrator can track every single activity like the vehicles going in and out of the parking, vacancy in the parking lot, monitoring the health status of the devices, and more. The user can also check the availability of parking, the status of the parking sessions, and transactions in real-time.

Seamless transactions- The traditional methods of payment like parking meters, ticket vending machines, and ticket counters have now been replaced with the new payment methods. In the present day, transactions are done through online payment portals, making it hassle-free.

Increased Security- Safety and security are significant in modern times. The devices deployed in a parking facility, such as sensors, cameras, and trackers, add reliability and security. Only authorized vehicles are allowed into the parking area, and parking cameras help enforcement officers capture violations and ensure the safety of parked vehicles 24/7.

Touchless parking- Post Covid-19, consumers and businesses are mindful of minimizing physical touch. In a touchless parking lot, your customers (transient and monthly) can handle entry access, payments, and exit validation entirely through their phones. New developments in the parking industry let drivers keep their hands on the wheel, and the shift from operating everything manually to doing it online is the fastest-growing change in the sector.

Reduced workforce- The complete process of managing parking operations is handled by parking equipment and software, reducing the need for manual workers in the parking garage. Manual entry, opening gates, and directing vehicles are a few of the tasks that used to be done by hand; managing those operations through equipment and software now makes everything automatic.

Minimizes revenue leakage- Smart parking technology helps the industry perform traditional parking operations in a better, more optimized way. There is no misplacement of parking fees because there are no middlemen between the parking operator and the user, and the operator can view all the details about the vehicle count and the transaction associated with each vehicle.

Conclusion

Smart parking technology has delivered the much-needed transformation of the parking industry. It has positively impacted all stakeholders in parking and mobility, helping them deliver future-proof strategies. With modern technology touching our everyday lives, more advancements can be expected in the future of parking.

This opinion is authored by Chirag Jain, Founder & CEO - Get My Parking. All views are personal.

Read more here:
How Technology Is Revolutionizing the Unorganized Parking Sector in India and the Road Ahead - News18

Read More..

ECLIPSE TECH PARTNERS WITH WASABI TO DELIVER HIGH PERFORMANCE HOT CLOUD STORAGE AT COLD STORAGE PRICING – The Herald Journal

BOUNTIFUL, Utah, Dec. 15, 2020 /PRNewswire/ -- Eclipse Tech, a provider of GPU workstations in the cloud, announced a new partnership today with cloud storage company Wasabi. In addition to a user friendly platform with pay-as-you-go pricing, the new partnership will offer Eclipse Tech clients flexible, high speed storage at a cost that is well below that of other storage competitors.

With an increased global need for companies to find remote work setups, many are now transitioning to working in the cloud. Eclipse Tech provides an easy solution for businesses and universities who need to quickly provision resources for remote users with minimal setup. Wasabi storage options are being seamlessly integrated into the Eclipse Tech platform, allowing users to quickly and easily manage their cloud storage and access it in their Eclipse Tech virtual workstation.
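
Because Wasabi exposes an S3-compatible API, integrations like this typically amount to pointing a standard S3 client at a Wasabi endpoint. The sketch below shows that pattern with boto3; the endpoint, bucket name, and credentials are illustrative assumptions, not Eclipse Tech's actual integration code.

```python
import boto3  # pip install boto3

# Point a standard S3 client at a Wasabi service endpoint (endpoint/region assumed).
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",
    aws_access_key_id="WASABI_ACCESS_KEY",
    aws_secret_access_key="WASABI_SECRET_KEY",
)

# Upload a render from a cloud workstation, then list what's in the bucket.
s3.upload_file("render_output.mp4", "my-workstation-bucket", "projects/render_output.mp4")
for obj in s3.list_objects_v2(Bucket="my-workstation-bucket").get("Contents", []):
    print(obj["Key"], obj["Size"])
```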

"We have had many requests from users looking for storage options that are both cost effective and high performance," said Eclipse Tech CTO, Ben Campbell. "We are thrilled to be able to offer Wasabi as a storage option for our users."

"Wasabi Hot Cloud Storage provides simple, predictable, and affordable hot cloud storage for businesses all over the world at 1/5th the price of the competition," said David Friend, Wasabi CEO and co-founder. "Through this partnership, Wasabi will allow Eclipse Tech customers to easily manage their data in the cloud."

For more information visit the Eclipse Tech website at: http://www.eclipsetech.co or for a free demo contact sales@eclipsetech.co

About Eclipse Tech

Eclipse Tech provides a next-generation cloud computing platform for remote work or learning - anywhere in the world. With its powerful, ready-to-use virtual desktop, Eclipse Tech's simple process eliminates the need for expensive hardware usually needed for graphics-heavy workflows, without requiring a technical expert to set it up. Perfect for workflows and collaboration in media & entertainment, construction & engineering, manufacturing, scientific & medical research, gaming, and education. Eclipse Tech provides fully customizable configurations and tailored solutions, including shared storage, with a pay-as-you-go model.

Follow and connect with Eclipse Tech on Twitter, LinkedIn and our blog

About Wasabi

Wasabi provides simple, predictable and affordable hot cloud storage for businesses all over the world. It enables organizations to store and instantly access an infinite amount of data at 1/5th the price of the competition with no complex tiers or unpredictable egress fees. Trusted by customers worldwide, Wasabi has been recognized as one of technology's fastest growing and most visionary companies. Created by Carbonite co-founders and cloud storage pioneers David Friend and Jeff Flowers, Wasabi has secured $110 million in funding to date and is a privately held company based in Boston.

Follow and connect with Wasabi on Twitter, Facebook, Instagram and our blog.

Media Contact: Nick Brown, InkHouse for Wasabi, wasabi@inkhouse.com

Kurt Walker, VP - Growth, 801-362-4761, media.relations@eclipsetech.co

See more here:
ECLIPSE TECH PARTNERS WITH WASABI TO DELIVER HIGH PERFORMANCE HOT CLOUD STORAGE AT COLD STORAGE PRICING - The Herald Journal

Read More..

Amber X is a privacy-focused in-home cloud storage for your photos you can access anywhere – DIYphotography

With all the goings-on at Google and people looking for alternate cloud storage options, sometimes it's easier to just be your own cloud. I use Resilio Sync for this, but that does usually mean building up a dedicated PC. The Amber X wants to simplify that process with an inexpensive cloud-storage-in-a-box solution that lets you store your data and retrieve it anywhere.

Amber X is currently running on Indiegogo, where it's long since passed its goal. It has a couple of days left to go, with pledges starting at $129. But we've got you a special link that'll let you get your hands on one for just $119, about the same cost you'd pay for a year of Google Photos or iCloud for the same storage, except there's no subscription with this!

Amber X's cloud storage system is focused heavily on privacy. All your data is stored locally on your Amber X device and is never stored on AmberCloud itself, and nobody can see your data, not even the Amber team, so there's no need to worry about who's looking at your photos and video. AmberCloud is simply a means to let your mobile device see your Amber X sitting at home and to make the connection.

The Amber X unit itself contains 500GB of internal SSD storage, but you can add virtually unlimited storage to the device using external USB SSDs and hard drives. While it will let you back up just about anything, it's designed primarily to let you back up your camera roll. Once your device is synced, all your photos and videos are automatically backed up to the device's internal SSD.

Setting up is simple. Just plug in power and add it to your home network. Download the app for either iOS, Android, Windows or macOS and run through the setup process. After that, away you go. Once backed up, your photos and videos are automatically organised by the system's built-in AI, which lets you then filter and view images sorted by location, faces detected, which device they were created with, when and more. And, if you wish, you can share select folders, images and videos with specific people that only you allow.

Overall it looks like a neat, self-contained solution to the cloud backup issue faced by just about anybody who shoots photos or videos with their phone. And when you're at home with your device, you're not limited by Internet speeds, so you can back up quickly. And if you run out of space, you can just add another hard drive.

Amber X is already being manufactured and is currently on Indiegogo with a couple of days left. Here's a secret link for DIYP readers that'll get you $10 off the normal Super Early Bird price, bringing it down from $129 to $119. Shipping is expected to begin in February 2021.

See more here:
Amber X is a privacy-focused in-home cloud storage for your photos you can access anywhere - DIYphotography

Read More..

Trend Micro Cloud One – File Storage Security: Designed to mitigate threats across the cloud – Help Net Security

Trend Micro announced the world's first cloud-native, fully serverless file storage security tool for organizations building applications in the cloud. Trend Micro Cloud One File Storage Security is designed to mitigate threats across the cloud environment and support strict compliance requirements.

The explosion of cloud-based file and object storage presents a new attack vector for threat actors to target with malicious files. Cloud One File Storage Security provides automated anti-malware scanning to keep information safe and ease compliance needs.

"Global organizations are increasingly looking to public cloud providers to drive IT agility, cost savings and business growth. But while the provider deals with security of the cloud, the customer is responsible for everything inside their cloud environment," said Mark Nunnikhoven, vice president of cloud research for Trend Micro.

"This is a highly scalable, automated scanning tool that's fast to deploy with no added infrastructure, allowing organizations to confidently store cloud files and data associated with their cloud applications."

Backed by Trend Micro's 30+ years of cybersecurity experience and industry-leading threat intelligence, the tool blocks known bad files and looks for hidden or changing malware variants.

The scanner itself is a lightweight, cloud-native serverless function that's designed for minimal operational overhead. This architecture enables fast, seamless deployment and flexible integration with organizations' existing custom workflows for added value.
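
As a generic illustration of that pattern, and not Trend Micro's actual code, the sketch below shows a serverless function that fires on each new object in an S3 bucket, fetches it, and quarantines anything a scanning routine flags; the scan_bytes() helper and the quarantine bucket name are placeholder assumptions.

```python
import boto3

s3 = boto3.client("s3")
QUARANTINE_BUCKET = "example-quarantine-bucket"  # assumed name

def scan_bytes(data: bytes) -> bool:
    """Stand-in for a real anti-malware engine; returns True when the object looks clean."""
    return b"EICAR" not in data

def handler(event, context):
    # Invoked by an S3 "ObjectCreated" event notification.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        if not scan_bytes(body):
            # Copy the suspicious object aside and remove it from the source bucket.
            s3.copy_object(Bucket=QUARANTINE_BUCKET, Key=key,
                           CopySource={"Bucket": bucket, "Key": key})
            s3.delete_object(Bucket=bucket, Key=key)
```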

The tool supports various compliance requirements that call for anti-malware scanning of cloud files while maintaining data sovereignty.

Trend Micro Cloud One File Storage Security is available now for AWS S3, with support for Microsoft Azure Blob storage and Google Cloud Storage coming soon.

Using Trend Micro's Cloud One platform, teams can implement a range of security services and compliance checks without hindering agile cloud development and deployment. This single cloud-native security platform seamlessly complements and integrates with existing AWS, Microsoft Azure, VMware, and Google Cloud toolsets.

Read this article:
Trend Micro Cloud One - File Storage Security: Designed to mitigate threats across the cloud - Help Net Security

Read More..

Google is finally making it easier to search your Drive cloud storage on mobile – TechRadar

Google Drive has updated its search functionality for mobile users, which should make it quicker and easier for you to find the files you need. The latest changes will see intelligent search suggestions offered within the Android and iOS versions of the Drive app, based on factors like past searches and frequently accessed files.

"We're launching new features for the Google Drive mobile apps that will help you search more quickly and efficiently," a Google Workspace update explained. "Now, Android and iOS users with the latest versions of the Drive app will be able to: See and re-run recent desktop and mobile searches; [and] view and select intelligent suggestions as they type, including suggestions for people, past searches, and keywords, as well as recently accessed files."

Google notes that the launch of the new search features has been driven by workplace shifts brought about by the Covid-19 pandemic. With more people working away from their desks, it has become increasingly important that they can find files quickly using their smartphone.

The coronavirus pandemic has forced businesses to enable remote working on a mass scale, with social distancing measures being enforced around the world. In this new world of work, cloud computing has become vital in allowing organizations to continue operating while keeping their employees safe.

The pandemic has also heightened competition among cloud solution providers, however, leading many to update their service offerings. In addition to the new search functionality, Google Drive also recently tweaked how Microsoft Office documents were edited on the platform.

Google Drive's updated search function will be made available for all Google Workspace and G Suite users over the next couple of weeks, providing they have the latest version of the app installed and have the Web and App Activity privacy option enabled within their Google account.

Read the original:
Google is finally making it easier to search your Drive cloud storage on mobile - TechRadar

Read More..

Wondershare’s Document Cloud is the best way to back up your files – Pocket-lint

(Pocket-lint) - It doesn't matter how fast or secure your physical storage is, if you lose it or it breaks then you're in huge trouble. Data loss can be hard to undo and have devastating consequences. Even if you don't lose any data, having things filed away on different drives and external hardware can make it really annoying to find your files quickly.

That's where Wondershare steps in to help - its superb cloud backup service, Document Cloud, offers secure and quick cloud storage that makes it easy to keep your documents in the cloud, searchable and accessible remotely without any worries. Wondershare Document Cloud is a brilliant way around your storage woes - find out more about what it offers, below, and read on to check out an amazing deal you can grab right now.

The core of the Document Cloud experience is right there in the name - it gives you access to a large storage pool hosted in the cloud, securely. You can store and access up to a gigabyte's worth of data for free using the trial option, but most users take on a membership to gain access to the full 100GB allocation.

That's enough to store practically countless files and documents, and the entire system is built so that you can access it wherever you browse the web. That means that whether you're on your smartphone, a tablet, a laptop or your desktop computer, you can quickly and easily access your files, search them and filter them to find the one you're looking for without needing to get too forensic in your searches.

Your files will also be locked firmly behind robust layers of protection to ensure that you, and only you, have access to them, making sure that you don't have to worry about anyone else being able to snoop through your private files.

It's not just simple storage, though - Wondershare Document Cloud offers a wide range of additional features that take it from being a simple storage platform to a more powerful tool in your working and personal life. One such feature is its electronic signature system, which lets you send documents to named recipients to be signed and authorised officially, which can be hugely useful for those dealing with forms and waivers, like landlords.

Another is the existence of a wide range of document templates you can use to create new files entirely in your library, and the ability to encrypt your documents to ensure that they can't be read by anyone else even if you were to send them by accident. The ability to bulk-send documents if you need to, and to check an audit log of your storage and a history of it, too, are all just more tools to ensure that you can make the most of your file storage, not just leave things to wither in it.

It all adds up to a platform that can do more with your files than other competitors you might have used before.

One of the other areas where Document Cloud can make your life way easier is in team management - if you have a bunch of people working on a small pool of documents, you'll know how quickly things can get chaotic if people are saving new versions over each other's work.

Document Cloud makes it really easy to add team members to your plan, giving them access to whatever you deem they need, via a clever authorization system. You can also set some users as Admins, so that they have a bit more control.

This lets you easily see who has tasks awaiting completion and who's already finished what they were meant to be working on. It's a great tool for keeping track of your team's progress, cleverly built into Document Cloud's other features.

Of course, Wondershare has a whole range of amazing apps offering other features including video editing, photo manipulation, audio recording and more, and one of the other major plusses to using Document Cloud is how it will integrate with these over time.

Wondershare has already fully integrated it with PDFelement, its superb app for editing and creating new PDF files, to let you automatically save your files to the cloud, access them and edit them entirely through the web. That means that even if you're out and about with just your smartphone, you can easily log on and edit a PDF, something that would be unthinkable to a lot of people without Wondershare's help.

This integration will spread to more Wondershare apps to create an ecosystem that's super-powerful, letting you do a wide range of tasks remotely. That's just a sampling of the reasons why Document Cloud is the answer for you, but there's another big reason to try it out now. Wondershare is running a huge 50% off discount, slashing the price of its paid plans in half. So, it's the perfect time to sign up and store your files in the cloud.

View post:
Wondershare's Document Cloud is the best way to back up your files - Pocket-lint

Read More..

Why cloud vendors are investing in new sources of compute power – VentureBeat

In 2014, data was declared the oil of the digital economy, and the analogy remained accurate until recently. In 2020, however, data reflected oil only in its parallels to the 2020 oil glut: too much production, not enough consumption, and the wholesale commoditization of storage.

Today, the overriding demand is for data's refined end product: business insight. And the most crucial link in the data insights supply chain is compute power.

This makes the infrastructure of CPU cycles that enables distillation of value from mountains of data the new oil of the digital economy. And it's driving some dramatic changes in the computing hardware ecosystem. Here's what I mean:

Cloud vendors like AWS came to understand that the core differentiation of their offerings had little to do with data itself and everything to do with what customers can get from their data. Yet deriving value from massive datasets spread across multiple cloud storage instances, and leveraging advanced AI and ML-powered graph analytics and other analytics, takes a lot of processing juice.

The exponential growth in demand for processing capacity (and the costs associated with it) was what initially drove organizations to move to the cloud. Yet once the move to the cloud was a fait accompli, cloud vendors could take a long, hard look at their own processing capabilities.

What they saw was that processing was the hands-down biggest variable cost in the cloud environment. And they realized that buy-versus-build priorities had flipped. Just as Amazon had verticalized deliveries, lowering costs and competing with UPS and FedEx, cloud vendors could verticalize chipmaking, or outsource to competitors other than Intel and AMD.

So they did.

AWS dipped its toes in the silicon waters in 2018, when it began offering services over its first-gen Graviton chips, which were designed with technology licensed from Arm (which NVIDIA is in the process of acquiring). This year, AWS dove headfirst into the chip pool, launching services based on Graviton2, which are touted as massively faster and cheaper than its Intel-based offerings. AWS also announced a new ARM-based supercomputing service two weeks ago.

In 2017, Microsoft announced it was committing to use chips based on Arm technology for cloud purposes. It was among the first to test the Altra processor from Arm server CPU start-up Ampere in March, actively evaluating the chip's capacities in its labs to help bolster Microsoft's hyperscale data centers. Two years ago, Google launched its Tensor Processing Unit (TPU) 3.0, a custom application-specific processor to accelerate machine learning and model training.

Meanwhile, Apple announced in June that it would gradually transition away from Intel-based chipsets in its personal computers, and more recently stated it was going to produce its own cellular modem chips too.

What we're seeing is the decoupling of processing power from its traditional members-only club. Like oil, compute power is moving in the direction of storage and other commodity services. And just like airlines care deeply about oil prices, inasmuch as oil's derivatives are a pillar of their service offering, enterprises will look at computing power as a means to an end.

Cloud vendors will relentlessly pursue ever-cheaper processing power. The entire compute layer will be commoditized, and well see apps routinely running across tens of thousands of CPUs in parallel. Companies that embrace multicloud will be able to split processing intensive tasks between providers, based on highly-competitive and micro-segmented incremental pricing.

Computing power will become a commodity in the full and traditional sense of the word, too. It will be traded on markets like any metal, energy, livestock, or agricultural commodity. Traders will be able to arbitrage processing cycles and hedge with processing futures.

This shift will force cloud vendors to rethink themselves. Differentiation will be based on computing cycle availability and the quality of the algorithms used for AI/ML analysis.

What does all this mean for Intel and AMD? Unless they make some radical changes, I think the expression "old soldiers never die, they just fade away" may be apt. Consider high street retail, whose demise began with the advent of widespread e-retail and accelerated during the pandemic. With the shift to cloud computing, the demand for CPU power on the desktop and in the data center will continue to shrink. And if cloud vendors make their own processing power, we could see traditional chipmakers go the way of Sears.

The burgeoning demand for insights from the petabytes of data that continues to flood into enterprise cloud storage is completely reshaping the computing ecosystem. As cloud vendors step into new verticals to take control of their computing supply chain, the old order of processors stands before a time of dramatic and fundamental change.

David Richards is co-founder and CEO of WANdisco.

Read the rest here:
Why cloud vendors are investing in new sources of compute power - VentureBeat

Read More..

How to Send Large Files Over the Internet – PCMag UK

Have you ever tried to email a file to someone, only for your mail service to tell you that it's too big? It's a frustrating but common problem. Most email services and software restrict the size of file attachments. For example, Gmail and Yahoo limit the size of attached files to 25MB, so that 100MB video isn't going through. Email is not your only choice; many standalone services can take on the job. Here are some ways you can send large files over the internet.

One easy solution is to upload the file to a cloud storage service for the other person to then access and download from their device. Free tiers from Box (10GB), Dropbox (2GB), Google Drive (15GB), iCloud (5GB), and OneDrive (5GB) offer storage space that may solve your issue. However, these services also have upload limits, so you may need to upgrade to a paid plan depending on your needs.
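
As a rough illustration of that approach, the snippet below uses the official Dropbox Python SDK to upload a file and produce a shareable link; the access token and paths are placeholders, and services like Google Drive or OneDrive offer equivalent APIs.

```python
import dropbox  # pip install dropbox

dbx = dropbox.Dropbox("YOUR_ACCESS_TOKEN")  # placeholder token

# files_upload handles files up to roughly 150MB in a single call; bigger files
# should go through the SDK's upload-session methods instead.
with open("holiday_video.mp4", "rb") as f:
    dbx.files_upload(f.read(), "/shared/holiday_video.mp4")

# Create a link the recipient can open in a browser without a Dropbox account.
link = dbx.sharing_create_shared_link_with_settings("/shared/holiday_video.mp4")
print(link.url)
```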

Gmail limits attached files to 25MB in size; anything over that is automatically placed inside Google Drive. You can do this by opening a new email and attaching the file. If it's too large, Google will generate a link to it in Google Drive.

After you try to send your email, you're asked to provide access to the file to your recipient. By default, the file is available just for viewing. You can opt to allow the person to review or edit the file, but they would need a Google account to perform either action.

Once permissions are decided, send the email to its recipient and the person can click the link to view the file in Google Drive. Google may limit you to 15GB for the free tier of Google Drive, but any paid plan will allow you to upload up to 750GB a day (though files larger than this will still go through), with an overall file limit of 5TB.

Yahoo Mail can do the same, but it's a less user-friendly option. If you try to send a large file through Yahoo, an alert will prompt you to save the file on either Google Drive or Dropbox. Pick your service of choice, then manually upload the file to the service.

You then return to the email, click File Attachment, and select Share Files From Google Drive or Share Files From Dropbox. Choose the file and it shows up as an attachment to your email. Once it is sent, your recipient can click the file attachment to view it in Google Drive or Dropbox.

Outlook allows you to attach a file up to 33MB in size. If you try to send something larger, Outlook prompts you to upload and share the file via OneDrive. Select that option and then compose and send your message. The recipient can then open and view the file from your OneDrive space.

Instead of relying on email, you can instead turn to a third-party file transfer website. Just upload the file you wish to send and enter your name and email address along with the name and address of your recipient. The site houses the file online and sends your recipient a download link. How large can the file be? That depends on the service, and what you're willing to pay.

DropSend allows you to compose an email to your recipient and attach the file you want to send. Your recipient receives an email with a link to the file to view it or download it. DropSend offers three personal plans, all requiring a paid subscription. For $5 a month, the Basic plan gives you 10GB of online storage with up to 25 sends a month. For $9 a month, the Standard plan offers 25GB of storage with as many as 50 sends a month along with other bonus features. The $19-a-month Professional plan gives you 25GB of storage with an unlimited number of sends each month and a host of advanced features.

With MyAirBridge, you can upload a file and email a link to a specific recipient or just upload the file and generate a link to share with anyone. You can send a file as large as 20GB for free. A basic $2.99-per-month plan covers files up to 50GB, the $10.99-per-month Pro plan handles files as hefty as 70GB, and the $65.99-per-month Enterprise plan allows files as beefy as 100GB.

Filemail is quick and simple. Fill out an email form with your address and its destination, compose your message, attach your files, and send your message. Your recipient then receives a link to the file for downloading or viewing it online. The free option allows files as large as 5GB, the $10-per-month Filemail Pro plan supports sizes as large as 25GB, and the $15-per-month Business plan handles unlimited file sizes.

WeTransfer is a user-friendly service many have probably already used. Just select the file you want to send from your computer, then add the email addresses and compose your message. Click the Transfer button to send your file to the recipient. A free WeTransfer account allows file sizes up to 2GB. You don't need to create an account, but guests have to enter an emailed verification code for each transfer. For $12 per month, a WeTransfer Pro account allows files as large as 20GB as well as other benefits.

Send Anywhere is an ad-supported file transfer site that can send files as large as 10GB for free. You can upload a file, then secure it with a six-digit key or create an account to generate a shareable link or send an email. If you need to transfer even larger files, a Send Anywhere Plus plan supports file sizes up to 50GB at a cost of $5.99 a month.
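
If you control a bucket on Amazon S3 or any S3-compatible storage, you can approximate what these transfer sites do yourself: upload the file, then email a time-limited download link. Here's a minimal hedged sketch with boto3 (bucket and file names are placeholders, and AWS credentials are assumed to be configured):

```python
import boto3  # pip install boto3

s3 = boto3.client("s3")
BUCKET, KEY = "my-transfer-bucket", "outgoing/big_archive.zip"  # placeholders

# Upload the large file (boto3 switches to multipart uploads automatically).
s3.upload_file("big_archive.zip", BUCKET, KEY)

# Generate a download link that expires after one week, then email it.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET, "Key": KEY},
    ExpiresIn=7 * 24 * 3600,
)
print(url)
```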

Go here to see the original:
How to Send Large Files Over the Internet - PCMag UK

Read More..

Swedish university fined $66,000 for GDPR violations – The Daily Swig

Jessica Haworth, 16 December 2020 at 14:01 UTC. Updated: 17 December 2020 at 14:40 UTC

Umeå University research group held sensitive information on insecure cloud storage

A Swedish university has been fined SEK550,000 ($66,000) for storing sensitive personal information in the cloud without sufficiently protecting the data.

Umeå University, in mid-northern Sweden, violated the General Data Protection Regulation (GDPR) by failing to properly secure data related to a research study on male sexual health, the Swedish Data Protection Authority has ruled.

A research group had gained access to preliminary police reports concerning cases of male rape, a statement from the regulator reads.

On receiving the files, the university group scanned and stored them digitally in a US cloud storage service, despite the institution informing faculty members via its intranet that such sensitive files should not be stored in this way.

The reports contained information on the suspicion of crime, name, personal identity number, and contact details, as well as sensitive data about sexual life and overall health.

In another incident, the research group sent an email to the police requesting further information with one of the scanned reports attached as a reference.

The research group later repeated this action, despite the fact that the police pointed out the inappropriateness of sending sensitive material in unencrypted emails, the report states.

Linda Hamidi, who led the investigation by the Swedish Data Protection Authority, said that the cloud service and the way the university uses it does not provide sufficient protection for this type of personal data.

The report reads: "These events show that the university has not taken necessary measures to ensure a level of security appropriate in relation to the risk."

Umeå University was also faulted for failing to report the data breach under GDPR laws, which came into effect in May 2018.

The report adds: "The Swedish Data Protection Authority also criticizes the university for failing to report the incident as a personal data breach.

"The controller is obliged to notify the DPA of data breaches and furthermore to present to us what has been done to mitigate the effects of the incident and to prevent similar incidents from happening in the future."

Here is the original post:
Swedish university fined $66,000 for GDPR violations - The Daily Swig

Read More..

Apache Pulsar vs. Kafka and other data processing technologies – TechTarget

This article discusses how Apache Pulsar handles storage and compares it to how other popular data processing technologies, such as Apache Kafka, deal with storage. Follow this link and take 35% off Apache Pulsar in Action in all formats by entering "ttkjerrumgaard" into the discount code box at checkout.

Apache Pulsar's multilayered architecture completely decouples the message serving layer from the message storage layer, allowing each to scale independently. Traditional distributed data processing technologies such as Hadoop and Spark have taken the approach of co-locating data processing and data storage on the same cluster nodes or instances. That design choice offered a simpler infrastructure and some possible performance benefits due to reducing transfer of data over the network, but at the cost of a lot of tradeoffs that impact scalability, resiliency and operations.

Pulsar's architecture takes a very different approach that's starting to gain traction in a number of cloud-native solutions. This approach is made possible in part by the significant improvements in network bandwidth that are commonplace today: separation of compute and storage. Pulsar's architecture decouples data serving and data storage into separate layers: Data serving is handled by stateless "broker" nodes, while data storage is handled by "bookie" nodes as shown in Figure 1.

This decoupling has many benefits. For one, it enables each layer to scale independently to provide infinite, elastic capacity. By leveraging the ability of elastic environments (such as cloud and containers) to automatically scale resources up and down, this architecture can dynamically adapt to traffic spikes. It also improves system availability and manageability by significantly reducing the complexity of cluster expansions and upgrades. Further, this design is container-friendly, making Pulsar the ideal technology for hosting a cloud native streaming system. Apache Pulsar is backed by a highly scalable, durable stream storage layer based on Apache BookKeeper that provides strong durability guarantees, distributed data storage and replication and built-in geo-replication.

A natural extension of the multilayered approach is the concept of tiered storage, in which less frequently accessed data can be offloaded to a more cost-effective persistence store such as S3 or Azure Cloud. Pulsar provides the ability to configure the automated offloading of data from local disks in the storage layer to those popular cloud storage platforms. These offloads are triggered based upon a predefined storage size or time period and provide you with a safe backup of all your event data while simultaneously freeing up storage capacity on the local disk for incoming data.

Both Apache Kafka and Apache Pulsar have similar messaging concepts. Clients interact with both systems via topics that are logically separated into multiple partitions. When an unbounded data stream is written to a topic, it is often divided into a fixed number of equal sized groupings known as partitions. This allows the data to be evenly distributed across the system and consumed by multiple clients concurrently.
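
As a minimal sketch of those shared concepts, here is how producing to and consuming from a topic looks with the pulsar-client Python library; the service URL, topic, and subscription names are assumptions, and the topic is presumed to have been created with multiple partitions via the Pulsar admin tools.

```python
import pulsar  # pip install pulsar-client

client = pulsar.Client("pulsar://localhost:6650")

# Publish to the (partitioned) topic; the client spreads messages across partitions.
producer = client.create_producer("persistent://public/default/page-views")
for i in range(10):
    producer.send(f"event-{i}".encode("utf-8"))

# A shared subscription lets several consumers drain the partitions concurrently.
consumer = client.subscribe(
    "persistent://public/default/page-views",
    subscription_name="analytics",
    consumer_type=pulsar.ConsumerType.Shared,
)
msg = consumer.receive()
print(msg.data())
consumer.acknowledge(msg)

client.close()
```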

The fundamental difference between Apache Pulsar and Apache Kafka is the underlying architectural approach each system takes to storing these partitions. Apache Kafka is a partition-centric pub/sub system that is designed to run as a monolithic architecture in which the serving and storage layers are located on the same node.

In Kafka, the partition data is stored as a single continuous piece of data on the leader node, and then replicated to a preconfigured number of replica nodes for redundancy. This design limits the capacity of the partition, and by extension the topic, in two ways. First, since the partition must be stored on local disk, the maximum size of the partition is that of the largest single disk on the host machine (approximately 4 TB in a "fresh" install scenario); second, since the data must be replicated, the partition can only grow to the size of smallest amount of disk space on the replica nodes.

Let's consider a scenario in which you were fortunate enough to have your leader be placed on a new node that can dedicate an entire 4 TB disk to the storage of the partition, and the two replica nodes each only have 1 TB of storage capacity. After you have published 1 TB of data to the topic, Kafka would detect that the replica nodes are unable to receive any more data and all incoming messages on the topic would be halted until space is made available on the replica nodes, as shown in Figure 3. This scenario could potentially lead to data loss, if you have producers that are unable to buffer the messages during this outage.

Once you have identified the issue, your only remedies are either to make more room on the existing replica nodes by deleting data from the disks, which will result in data loss since that data is from other topics and most likely has not been consumed yet, or to add additional nodes to the Kafka cluster and "rebalance" the partition so that the newly added nodes serve as the replicas. Unfortunately, the latter requires recopying the entire 1 TB partition, which is an expensive, time-consuming and error-prone process that requires an enormous amount of network bandwidth and disk I/O. What's worse is that the entire partition is completely offline during this process, which is not an ideal situation for a production application that has stringent uptime SLAs.

Unfortunately, recopying of partition data isn't limited to only cluster expansion scenarios in Kafka. Several other failures can trigger data recopying, including replica failures, disk failures or machine failures. This limitation is often missed until users experience a failure in a production scenario.

Within a segment-centric storage architecture, such as the one used by Apache Pulsar, partitions are further broken down into segments that are rolled over based on a preconfigured time or size limit. These segments are then evenly distributed across a number of bookies in the storage layer for redundancy and scale.

Using the previous scenario we discussed with Apache Kafka, in which one of the bookies' disks fills up and can no longer accept incoming data, let's now look at the behavior of Apache Pulsar. Since the partition is further broken down into small segments, there is no need to replicate the content of the entire bookie to the newly added bookie. Instead, Pulsar would continue to write incoming message segments to the remaining bookies with storage capacity until the new bookie is added. At that point, the traffic will instantly and automatically ramp up on new nodes or new partitions, and old data doesn't have to be recopied.

As we can see in Figure 5, during the period when the fourth bookie stopped accepting segments, incoming data segments 4, 5, 6 and 7 were routed to the remaining active bookies. Once the new bookie was added, segments were routed to it automatically. During this entire process, Pulsar experienced no downtime and was able to continue serving producers and consumers. As you can see, Pulsar's storage system is more flexible and scalable in this type of situation.

About the author

David Kjerrumgaard is the director of solution architecture at Streamlio, and a contributor to the Apache Pulsar and Apache NiFi projects.

See more here:
Apache Pulsar vs. Kafka and other data processing technologies - TechTarget

Read More..