
WATCH: Police forcibly remove outraged undocumented Cornell professor who repeatedly disrupted Ann Coulter talk – The College Fix

Cornell University police on Tuesday had to forcibly remove a professor after she repeatedly disrupted Ann Coulter's guest talk at the Ivy League institution, video and pictures obtained by The College Fix show.

Monica Cornejo, an assistant professor of interpersonal communication, is told to get up and leave the venue by two police officers in a video taken by Russ Nelson, a New York resident who attended the talk, which was open to the public.

The 36-second video Nelson posted on X shows one of the officers put his hands on her arms and tell Cornejo she is under arrest for disorderly conduct, to which she repeatedly responds "don't touch me, do not touch me," and tells them "I am a faculty member."

It's unclear if she was actually cited once she left the room.

Cornejo, one of the first undocumented tenure-track faculty members at Cornell, according to the Daily Sun, was apparently outraged at Coulter and her talk, titled "Immigration: The Conspiracy To End America."

"She was yelling counterpoints to Ann, and was the only disruption," Nelson told The Fix in a text message Tuesday. "She kept objecting quite loudly, gave Ann the finger repeatedly, and after about the sixth outburst the cops came and took her away."

"She had accused everyone in the room of being a racist because we support Ann Coulter. Seems like a stretch to me."

According to Cornejo's faculty bio, she earned her doctorate at UC Santa Barbara in 2022.

Her bio states her research focus is on teaching students about "different ways in which interpersonal communication can reduce or create disparities and inequities in the United States" as well as "the strategies members of minoritized communities utilize to challenge the disparities and inequities that position minoritized group members in a second-class position."

As Coulter gave her speech Tuesday, she was flanked by two personal security team members.

The last time Coulter spoke at Cornell, in November 2022, she was shouted down by rowdy student audience members.

As The College Fix previously reported, her talk was supposed to go on for an hour, but constant, hostile heckling from protesting students disrupted it with aggressive comments, music, and loud noises. A frustrated Coulter ended the speech after about 20 minutes of protests inside the venue.

The university was widely panned for the incident, one of many that prompted alumni and other observers to accuse Cornell of not doing enough to protect free speech.

In a March 13 statement published as a letter to the editor in the Daily Sun, Cornell Provost Michael Kotlikoff explained why Coulter was re-invited: "Having been deeply troubled by an invited speaker at Cornell (any speaker) being shouted down and unable to present their views, I agreed that there could be few more powerful demonstrations of Cornell's commitment to free expression than to have Ms. Coulter return to campus and present her views."

Professor Cornejo and Cornell's media relations department could not be immediately reached for comment late Tuesday regarding the incident.

MORE: Major Cornell donor yanks funding over DEI, demands President Pollack resign

IMAGES/VIDEO: Courtesy of Russ Nelson



Link:
WATCH: Police forcibly remove outraged undocumented Cornell professor who repeatedly disrupted Ann Coulter talk - The College Fix

Read More..

U.S. officials scramble to stop major Internet firms from ditching FISA obligations – The Washington Post

U.S. government officials were scrambling Friday night to prevent what they fear could be a significant loss of access to critical national security information, after two major U.S. communications providers said they would stop complying with orders under a controversial surveillance law that is set to expire at midnight, according to five people familiar with the matter.

One communications provider informed the National Security Agency that it would stop complying on Monday with orders under Section 702 of the Foreign Intelligence Surveillance Act, which enables U.S. intelligence agencies to gather without a warrant the digital communications of foreigners overseas, including when they text or email people inside the United States.

Another provider suggested that it would cease complying at midnight Friday unless the law is reauthorized, according to the people familiar with the matter, who spoke on the condition of anonymity to discuss sensitive negotiations.

The companies' decisions, which were conveyed privately and have not previously been reported, have alarmed national security officials, who strongly disagree with their position and argue that the law requires the providers to continue complying with the government's surveillance orders even after the statute expires. That's because a federal court this month granted the government a one-year extension to continue intelligence collection.

Section 702 requires the government to seek approval from the Foreign Intelligence Surveillance Court for the categories of intelligence it wants to collect. The court has issued certifications for collection involving international terrorism, weapons of mass destruction and foreign governments and related entities. Those certifications are good for one year and were renewed this month at the government's request.

U.S. officials have long argued that the law is a vital means of collecting the electronic communications of foreign government adversaries and terrorist groups. But its renewal has become an unusually divisive flash point, aligning conservative Republicans and liberal Democrats who are wary of granting the government broad surveillance authorities without new restrictions.

The people familiar with the efforts to keep the companies in compliance declined to name them, but they said their loss would deal a significant blow to U.S. intelligence collection.

"It's super concerning," said one U.S. official of the potential loss of intelligence. "You can't just flip a switch and turn it back on again."

U.S. officials began to hear Friday afternoon that the providers were planning to stop compliance unless Section 702 was reauthorized.

Senators are attempting to come to an eleventh-hour agreement on amendments to the legislation Friday night to quickly reauthorize the measure and avoid any lapse. Last week, the House renewed Section 702, but only for two years and only after privacy hawks failed to pass an amendment that would have required U.S. intelligence agencies to obtain a warrant to review Americans' communications collected under the program. That bid failed in a dramatic 212-212 tie vote.

The House approval came despite former president Donald Trump's entreaty on social media to "KILL" the bill.

First passed in 2008 and reauthorized several times since then, the law enables the NSA to collect without a warrant from U.S. tech companies and communications providers the online traffic of non-Americans located overseas for foreign intelligence purposes. Communications to or from foreign targets deemed relevant to FBI national security investigations (about 3 percent of the targets, according to the government) are shared with the bureau. But the law is controversial because some of those communications may involve exchanges with Americans, which the FBI may view without a warrant.

"The House bill represents the biggest expansion of surveillance in the 15 years since Section 702 was originally created, and it is shameful that Congress would be expanding surveillance at a time when reforms are needed," said Jake Laperruque, deputy director of the Center for Democracy and Technology's Security and Surveillance Project.

U.S. security officials, for their part, for years have extolled the benefits of the law, with White House officials saying that the intelligence collected accounts for more than 60 percent of the president's daily briefing. FBI Director Christopher A. Wray recently disclosed that it helped the bureau discover that Chinese hackers had breached the network of a U.S. transportation hub, and that it had helped thwart a terrorist plot last year in the United States involving a potential attack on a critical infrastructure site.

"Failure to reauthorize 702 or gutting it with some kind of new warrant requirement would be dangerous and put American lives at risk," Wray told Congress this month.

Read more:
U.S. officials scramble to stop major Internet firms from ditching FISA obligations - The Washington Post

Read More..

Cyber attack takes Frontier Communications systems offline, affecting millions of broadband customers – ITPro

US telecom provider Frontier Communications was forced to shut down a number of its internal systems after detecting an unauthorized third party in its IT environment, shuttering internet access for millions.

Frontier Communications said it first detected the unauthorized access on 14 April 2024, before reporting the incident to the SEC on 15 April. The company said it had taken its systems down as part of its incident response protocols in an effort to contain the breach.

Frontier said it believes it has contained the incident, with its core IT environment already restored, adding that efforts to restore normal business operations have begun but are still ongoing.

Frontier serves customers in 25 US states, with 3 million broadband subscribers and a fiber optic network covering 5.2 million locations. The attack comes as threat actors continue to target critical national infrastructure organizations to maximize the impact of their attacks.

Frontier says the third party, which it believes was likely a cyber crime group, was able to gain access to personally identifiable information (PII), among other information.

The telecom provider was unable to provide any further information on the specific types of sensitive information accessed by the attackers, or whether the PII pertained to customers or employees.

Some customers took to social media to voice their concern after being without internet for three days since Frontier took its systems down, reporting they cannot access technical support through Frontier's app, website chat, or phone line.


Frontier announced it was experiencing technical issues with its internal support systems and provided a phone number for those who require assistance.

This incident comes hot on the heels of a series of high-profile cyber incidents affecting telecom companies.

A huge cache of AT&T customer data was published on the dark web on 30 March 2024, with the personal data of 73 million current and former customers being exposed.

In February 2024, Australian telecom company Tangerine disclosed a breach that exposed the personal data of 232,000 customers after an external contractor's compromised credentials were used to access a customer database.

As a result, internet providers are increasingly being classified alongside the healthcare, water, and energy sectors as critical national infrastructure (CNI), due to the number of critical services that rely on an internet connection.

In its 2023 annual review, the UK's National Cyber Security Centre included internet providers as part of the critical national infrastructure, defined as organizations which, if compromised, could cause large-scale loss of life, a serious impact on the economy, and other grave social consequences for the community.

The annual review also notes that the cyber threats facing organizations today have changed, with a rise in state-aligned groups launching attacks against critical national infrastructure in rival states.

As such, telecommunications firms should be taking extra precautions to mitigate the potential threats of nation-state affiliated threat actors deploying sophisticated attacks to cripple essential services across the region.

See the rest here:
Cyber attack takes Frontier Communications systems offline, affecting millions of broadband customers - ITPro

Read More..

Researchers create ‘quantum drums’ to store qubits one step closer to groundbreaking internet speed and security – Tom’s Hardware

A device called a quantum drum may serve as "a crucial piece in the very foundation for the Internet of the future with quantum speed and quantum security," says Mads Bjerregaard Kristensen, a postdoc at the Niels Bohr Institute, in a new research piece. The original research paper has an official briefing available for free on Phys.org, and can be found published in full in the Physical Review Letters journal for a subscription fee.

One key issue with quantum computing and sending quantum data ("qubits") over long distances is the difficulty of keeping data in its fragile quantum state, where data loss through "decoherence" becomes a much higher risk. Using a quantum drum at steps along the chain can prevent this decoherence from occurring, enabling longer and even potentially global communication distances.

The current record for sending qubits over a long distance is held by China and Russia, at about 3,800 km, with only encryption keys sent as quantum data. The standard wired qubit transmission range is roughly 1,000 km before loss of photons ruins the data. Quantum drums could potentially address this limitation.

How does a 'quantum drum' work? In a similar manner to how existing digital bits can be converted into just about anything (sound, video, etc.), qubits can be converted as well. However, qubits require a level of precision imperceptible to the human eye, so converting qubits without data loss is quite difficult. The quantum drum seems like a potential answer: its ceramic glass-like membrane was shown to be capable of maintaining quantum states as it vibrates with stored quantum information.

Another important purpose served by these quantum drums is security. Were we to start transferring information between quantum computers over the standard Internet, it would inherit the same insecurities as our existing standards. That's because it would need to be converted to standard bits and bytes, which could become essentially free to decode in the not-so-distant quantum future.

By finding a quantum storage medium that doesn't lose any data and allows information to be transferred over much longer distances, the vision of a worthwhile "Quantum Internet" begins to manifest as a real possibility, and not simply the optimism of quantum computing researchers.

Quantum computing research continues to be a major area of interest, often with highly technical discussions and details on the technology. A research paper on quantum drums and their potential doesn't, of course, mean that this technique will prove to be commercially viable. Still, every little step forward creates new opportunities for our seemingly inevitable quantum-powered future.


Read this article:
Researchers create 'quantum drums' to store qubits one step closer to groundbreaking internet speed and security - Tom's Hardware

Read More..

Cloudflare R2 Storage Introduces Event Notifications and Infrequent Access Storage Tier – InfoQ.com

During the recent Developer Week, Cloudflare announced that the object storage R2 now supports event notifications, which automatically trigger Workers in response to data changes. Additionally, the migration service Super Slurper now extends its support to Google Cloud Storage and a new infrequent access storage tier is available in private beta.

Presently in open beta, event notifications dispatch messages to a queue whenever there's a change to data within a bucket. These messages are subsequently received by a consumer Worker, allowing developers to define any subsequent actions needed. Matt DeBoard, Mengqi Chen, Siddhant Sinha, systems engineers at Cloudflare, and Erin Thames, product designer at Cloudflare, write:

The lifecycle of data often doesn't stop immediately after upload to an R2 bucket: event data may need to be transformed and loaded into a data warehouse, media files may need to go through a post-processing step, etc. We're releasing event notifications for R2 in open beta to enable building applications and workflows driven by your changing data.


Designed for data lakes, storage for cloud-native applications, and web content, Cloudflare R2 enables developers to store unstructured data using S3-like APIs. Dubbed the zero egress fee object storage platform by Cloudflare to emphasize its main differentiator from competing globally distributed object storage services, R2 offers dynamic functionalities that integrate with Cloudflare Workers.
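Because R2 exposes an S3-compatible API, standard S3 tooling works against it. As a rough sketch, here is how an upload might look from Python with boto3; the account ID, credentials, and bucket name are placeholders, not values from the article:

import boto3

# R2's S3-compatible endpoint lives under the account's Cloudflare ID.
r2 = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
    region_name="auto",  # R2 uses "auto" as its region
)

# Upload an object, then list the bucket to confirm it landed.
r2.put_object(Bucket="my-bucket", Key="logs/2024-04-17.json", Body=b'{"ok": true}')
for obj in r2.list_objects_v2(Bucket="my-bucket").get("Contents", []):
    print(obj["Key"], obj["Size"])

An upload like the put_object call above is exactly the kind of bucket change that would emit an event notification for a consumer Worker to act on.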

Initially released last year with support exclusively for Amazon S3, Super Slurper is a migration service that enables developers to move all their data to R2 in "one giant slurp" or "sip by sip," and it now also supports Google Cloud Storage as a source. Migration jobs preserve custom object metadata from the source bucket by copying it onto the migrated objects in R2, and they do not delete any objects from the source bucket.

The private beta release of the Infrequent Access storage class, a cost-effective option with comparable performance and durability, marked the third feature announcement for R2 during Developer Week. This new storage class can be assigned either through APIs or lifecycle policies and is tailored for scenarios involving infrequently accessed data, such as long-tail user-generated content or logs. DeBoard, Chen, Sinha, and Thames add:

In the future, we plan to automatically optimize storage classes for data so you can avoid manually creating rules and better adapt to changing data access patterns.

On Hacker News, user thrixton questions the pricing of the new tier:

So pricing is 1c / GB-month, compared to S3 IA at 1.25c / GB-month, a decent saving but not massive, no archive or deep archive options though, I wonder if / when these will come. What sort of negotiated rates can you get from AWS for bandwidth I wonder, at the moment, that seems like the only real benefit from CF I think.

While the Infrequent Access storage class does not incur egress fees, a data retrieval fee of USD 0.01/GB (the same amount as AWS S3-IA) is charged when data in this tier is accessed.
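To put those rates in perspective, here is a back-of-the-envelope comparison using the per-GB figures above; the workload volumes are invented for illustration, and egress (which R2 does not charge for and AWS prices separately) is ignored:

# Storage rates from the article: R2 IA at $0.01/GB-month, S3-IA at
# $0.0125/GB-month; both charge $0.01/GB on retrieval.
def monthly_cost(gb_stored, gb_retrieved, storage_rate, retrieval_rate=0.01):
    return gb_stored * storage_rate + gb_retrieved * retrieval_rate

tb = 1024  # GB per TB; assume 10 TB stored, 1 TB retrieved per month
print(f"R2 IA: ${monthly_cost(10 * tb, 1 * tb, 0.0100):,.2f}/month")
print(f"S3 IA: ${monthly_cost(10 * tb, 1 * tb, 0.0125):,.2f}/month")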

More here:
Cloudflare R2 Storage Introduces Event Notifications and Infrequent Access Storage Tier - InfoQ.com

Read More..

Audacity Adds Cloud Backups and Device Syncing in 3.5 Update – How-To Geek

Everybody's favorite open-source audio editor just gained some cool new features. The Audacity 3.5 update introduces cloud backups, device syncing, automatic loop tempo detection, and a mess of other improvements.

Audacity's newfound cloud syncing capability relies on audio.com, a free SoundCloud-like platform for sharing, discovering, and collaborating on audio projects. It's a pretty solid solution for backing up or syncing Audacity projects, particularly for users who own multiple PCs. However, you'll need to link your audio.com account with each of your Audacity installations.

If you want to sync Audacity projects to a cloud service like Dropbox or Google Drive, you'll have to do it the old-fashioned way: manually save the project files to your cloud storage platform of choice. Audacity's built-in syncing functionality only supports the audio.com platform.

The 3.5 update also improves some of Audacity's music production capabilities with a new non-destructive pitch shifting tool, automatic tempo detection for imported loops (through audio and metadata analysis), and a refined plugin manager with search functionality.

Audacity says that automatic tempo detection will work best when loops have their BPM listed in the file name ("drum-loop-120-bpm.wav," for example), though audio analysis should detect the correct BPM when importing simple loops.
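As an illustration of that filename heuristic (not Audacity's actual implementation), a BPM-in-the-name check might look something like this in Python:

import re

def bpm_from_filename(name: str) -> int | None:
    # Look for a 2-3 digit number immediately before "bpm", e.g. "120-bpm".
    match = re.search(r"(\d{2,3})[-_ ]?bpm", name.lower())
    return int(match.group(1)) if match else None

print(bpm_from_filename("drum-loop-120-bpm.wav"))  # 120
print(bpm_from_filename("ambient-pad.wav"))        # None: fall back to audio analysis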

Note that some "niche features" were removed from Audacity in this release. The only notable removals are the "EQ XML to TXT Converter," which can be downloaded as a plugin, and the "Vocal Reduction and Isolation" effect. Audacity recommends using Intel OpenVINO plugins in place of the Reduction and Isolation effect, though you can download the original effect if you still need it for old projects.

For a full list of changes and bug fixes in Audacity 3.5, check the changelog. You can also view update notes and track development at GitHub. While Audacity isn't known for rapid development, we've experienced more frequent updates since the open-source software was acquired by Muse Group. Muse Group also owns audio.com, by the way.

The Audacity 3.5 update supports Windows, macOS, and Linux installations. It also boasts improved compatibility with BSD operating systems. Audacity doesn't support automatic updates, so you must install Audacity 3.5 manually.

Source: Audacity

Here is the original post:
Audacity Adds Cloud Backups and Device Syncing in 3.5 Update - How-To Geek

Read More..

Core Scientific to expand Texas Bitcoin mining data center by 72MW – DatacenterDynamics

Bitcoin mining and digital infrastructure provider Core Scientific is planning a 72MW expansion of its data center in Denton, Texas.

The expansion will take place on the company's 31-acre site, originally built in 2022. The facility currently operates 125MW of Bitcoin mining capacity, expandable to 300MW at full build-out.

Other specifications of the expansion have not been disclosed, but completion is expected by Q2 2024.

Adam Sullivan, CEO at Core Scientific, said: "Owning and controlling all of our infrastructure with access to ready power gives us the strategic optionality to expand our mining capacity, deploy upgrades to our proprietary mining technology stack, reallocate miners to optimize for efficiency, and even flex to alternative forms of compute when such opportunities arise."

The company added that the expansion program will deliver more than 20 additional exahash of mining capacity, for an average cost of $200,000 per megawatt.

Core Scientific has another Texas data center, located in Pecos, offering 71MW of capacity over a 100-acre site. The company says its Pecos site can scale up to 234MW.

The company has five other US data centers in North Dakota, Kentucky, Georgia, and North Carolina. Combined, Core Scientific says its facilities have a live supply of 745MW and 372MW of pipeline supply.

Uminers, another cryptomine data center provider, has announced it has broken ground on a data center in Ethiopia.

The facility is set to offer 100MW of IT capacity in its first phase, scheduled for completion in autumn 2024, and host approximately 24,000 high-performance ASIC miners, including models such as the Antminer S21, S21 Hydro, T21, and S19.

Other specifications, such as the location, have not been shared.

Batyr Hydyrov, president at Uminers, said: "The increasing complexity of mining necessitates greater investment in robust equipment. The industrial-scale mining for extracting the remaining bitcoins has never been more pertinent."

He added: "The battle for accessing sufficient electrical power is intensifying globally, as the slowdown in miner sales worldwide is mainly due to scarcity of installation sites and available power capacity."

The company said Ethiopia offers many geographical benefits, including its affordable, eco-friendly hydroelectric power. Plans are also in place for Uminers to expand its footprint across Africa, the Middle East, and South America.

Headquartered in Guangzhou, China, and founded in 2017, Uminers has a total capacity of 90MW in data centers across Hong Kong, Oman, the UAE, the US, and Singapore.

Ethiopia has seen a recent spike in development, with Wingu.Africa, Red Fox, Sun Data World, and ScutiX all developing data centers in the country. Pan-African operator Raxio Data Centres also launched a new facility in Ethiopia in November last year.

Original post:

Core Scientific to expand Texas Bitcoin mining data center by 72MW - DatacenterDynamics

Read More..

Power of Data: Dive Into the Best Analytics Books of 2024 – Analytics Insight

In this day and age, where data is everything, data analytics helps organizations improve the decisions they make, reveal opportunities they might otherwise miss, and anticipate risks before they turn into failures. It gives businesses a data-driven approach to understanding customers' needs, adjusting marketing strategy accordingly and, as a result, improving overall performance. The demand for competent analysts has therefore risen significantly over the last couple of years. Here is a list of the best analytics books for 2024.

This book covers in-depth studies of how to solve comparable challenges in various contexts using Python-based data techniques. The methods described in detail throughout the book include loading, cleaning, merging, reordering, and altering data with the pandas and NumPy libraries.
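A minimal sketch of that load-clean-merge workflow with pandas and NumPy might look like the following; the tables and columns are hypothetical stand-ins for what would normally come from pd.read_csv:

import numpy as np
import pandas as pd

# Hypothetical raw data standing in for a CSV load.
orders = pd.DataFrame({
    "customer_id": [1, 2, 1, 3],
    "amount": [120.0, np.nan, 80.0, 200.0],
})
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "region": ["east", "west", "east"],
})

orders["amount"] = orders["amount"].fillna(0.0)  # clean: fill missing values

# Merge, then reorder: total spend per region, largest first.
report = (orders.merge(customers, on="customer_id")
                .groupby("region")["amount"]
                .sum()
                .sort_values(ascending=False))
print(report)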

This book contains a step-by-step guide to data analytics. It provides a generic process for analyzing problems in whatever business readers are engaged in. Master the fundamentals of data mining, machine learning and, most importantly, reasoning to become a data-driven decision-maker.

The essential elements of reporting, business intelligence, data visualization, and descriptive statistics are emphasized throughout the text. The author wrote the guide with two primary purposes: (1) the programs it includes can be helpful when implementing practical apps, and (2) the exercises included are to be completed in Python. Three basic machine learning methods are highlighted: regression, classification, and clustering.

Data Analytics 101 is a starter pack for data literacy, which is the process of turning data into intelligence. The book describes how one collects and arranges data for machines so that they can interpret it as needed, mentioning crucial machine-learning algorithms such as regression, classification, and clustering, among others.

SQL for Data Analysis begins by teaching readers how to enhance their SQL skills, then shows how to use SQL within their workflow. It also covers advanced techniques that help transform data into insights, including joins, window functions, subqueries, and regular expressions.
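As a flavor of one such technique, here is a window function run through Python's built-in sqlite3 module, the kind of in-workflow usage the book describes; the table and figures are invented for the example:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("east", "2024-01", 100.0), ("east", "2024-02", 140.0),
    ("west", "2024-01", 90.0),  ("west", "2024-02", 70.0),
])

# Running total per region, ordered by month (a window function).
for row in conn.execute("""
    SELECT region, month, revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
"""):
    print(row)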

This booklet is an essential tool for Excel users to learn how to analyze data and the components of data structures. One of the crucial chapters is devoted to critical statistical techniques, including practical exercises with spreadsheets. The author also gives valuable advice on transitioning to Python and R for data analysis and hypothesis testing.

Modern data analytics in Excel is discussed, moving past the limits of traditional data processing and expanding the possibilities of intuitive data representation. The author teaches the reader how to work with datasets using Power Query and Power Pivot for repeatable data wrangling and for the creation of relational data models and analysis measures. The book also gives space to the use of AI and Python in Excel reporting, which is very advanced.

The reader gains knowledge of how to use Excel for data analysis and for relevant reporting on massive amounts of data. The book streamlines the tedious process of creating reports and analyses and builds on models of data visualization.

This book is a hands-on experience where you learn tools for data analysis and apply them to support managerial, economic, and policy decisions. The book has been arranged by subject to include data wrangling, regression analysis, and causal analysis, and there are also real-world data case studies.

For business executives, Storytelling with Data is one of the best books as far as illustrating various data visualization techniques is concerned. The writer shows that even a topic as massive as the universe can be so finely chiseled that it leaves an everlasting mark on the reader.

Continued here:

Power of Data: Dive Into the Best Analytics Books of 2024 - Analytics Insight

Read More..

Quantum Cloud Computing Secured in New Breakthrough at Oxford – TechRepublic

Businesses are one step closer to quantum cloud computing, thanks to a breakthrough made in its security and privacy by scientists at Oxford University.

The researchers used an approach dubbed blind quantum computing to connect two quantum computing entities (Figure A); this simulates the situation where an employee at home or in an office remotely connects to a quantum server via the cloud. With this method, the quantum server provider does not need to know any details of the computation for it to be carried out, keeping the user's proprietary work secure. The user can also easily verify the authenticity of their result, confirming it is neither erroneous nor corrupted.


Ensuring the security and privacy of quantum computations is one of the most significant roadblocks that has held the powerful technology back so far, so this work could lead to it finally entering the mainstream.

Though it has only been tested on a small scale, the researchers say their experiment has the potential to be scaled up to large quantum computations. Plug-in devices could be developed that safeguard a worker's data while they access quantum cloud computing services.

Professor David Lucas, the co-head of the Oxford University Physics research team, said in a press release: "We have shown for the first time that quantum computing in the cloud can be accessed in a scalable, practical way which will also give people complete security and privacy of data, plus the ability to verify its authenticity."

Classical computers process information as binary bits represented as 1s and 0s, but quantum computers do so using quantum bits, or qubits. Qubits exist as both a 1 and a 0 at the same time, but with a probability of being one or the other that is determined by their quantum state. This property enables quantum computers to tackle certain calculations much faster than classical computers, as they can solve problems simultaneously.
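That probabilistic behaviour can be illustrated with a toy simulation. The sketch below uses plain NumPy (no quantum library) to sample repeated measurements of an equal superposition, where each outcome should appear about half the time:

import numpy as np

rng = np.random.default_rng(0)
a = b = 1 / np.sqrt(2)              # amplitudes of a|0> + b|1>, equal superposition
probs = [abs(a) ** 2, abs(b) ** 2]  # measurement probabilities: [0.5, 0.5]

samples = rng.choice([0, 1], size=10_000, p=probs)
print(f"fraction measured 0: {np.mean(samples == 0):.3f}")
print(f"fraction measured 1: {np.mean(samples == 1):.3f}")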

Quantum cloud computing is where quantum resources are provided to users remotely over the internet; this allows anyone to utilise quantum computing without the need for specialised hardware or expertise.


With typical quantum cloud computing, the user must divulge the problem they are trying to solve to the cloud provider; this is because the provider's infrastructure needs to understand the specifics of the problem so it can allocate the appropriate resources and execution parameters. Naturally, in the case of proprietary work, this presents a security concern.

This security risk is minimised with the blind quantum computing method because the user remotely controls the quantum processor of the server themselves during a computation. The information required to keep the data secure, like the input, output and algorithmic details, only needs to be known by the client, because the server does not make any decisions with it.

"Never in history have the issues surrounding privacy of data and code been more urgently debated than in the present era of cloud computing and artificial intelligence," said Professor Lucas in the press release.

"As quantum computers become more capable, people will seek to use them with complete security and privacy over networks, and our new results mark a step change in capability in this respect."

Quantum computing is vastly more powerful than conventional computing, and could revolutionise how we work if it is successfully scaled out of the research phase. Examples include solving supply chain problems, optimising routes and securing communications.

In February, the U.K. government announced a £45 million ($57 million) investment into quantum computing; the money goes toward finding practical uses for quantum computing and creating a quantum-enabled economy by 2033. In March, quantum computing was singled out in the Ministerial Declaration, with G7 countries agreeing to work together to promote the development of quantum technologies and foster collaboration between academia and industry. Just this month, the U.K.'s second commercial quantum computer came online.

Due to the extensive power and refrigeration requirements, very few quantum computers are currently commercially available. However, several leading cloud providers do offer so-called quantum-as-a-service to corporate clients and researchers. Google's Cirq, for example, is an open source quantum computing platform, while Amazon Braket allows users to test their algorithms on a local quantum simulator. IBM, Microsoft and Alibaba also have quantum-as-a-service offerings.
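As a taste of how low the entry barrier has become, a hello-world circuit on Cirq's local simulator needs only a few lines of Python; this is a generic example, not code from the article or the Oxford study:

import cirq

qubit = cirq.LineQubit(0)
circuit = cirq.Circuit(
    cirq.H(qubit),                 # Hadamard gate: |0> -> equal superposition
    cirq.measure(qubit, key="m"),  # measurement collapses it to 0 or 1
)

# Sample the circuit 1,000 times on the local simulator (no hardware needed).
result = cirq.Simulator().run(circuit, repetitions=1000)
print(result.histogram(key="m"))   # roughly {0: ~500, 1: ~500}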


But before quantum computing can be scaled up and used for business applications, it is imperative to ensure it can be achieved while safeguarding the privacy and security of customer data. This is what the Oxford University researchers hoped to achieve in their new study, published in Physical Review Letters.

Dr. Peter Drmota, study lead, told TechRepublic in an email: "Strong security guarantees will lower the barrier to using powerful quantum cloud computing services, once available, to speed up the development of new technologies, such as batteries and drugs, and for applications that involve highly confidential data, such as private medical information, intellectual property, and defence. Those applications exist also without added security, but would be less likely to be used as widely.

"Quantum computing has the potential to drastically improve machine learning. This would supercharge the development of better and more adapted artificial intelligence, which we are already seeing impacting businesses across all sectors.

"It is conceivable that quantum computing will have an impact on our lives in the next five to ten years, but it is difficult to forecast the exact nature of the innovations to come. I expect a continuous adaptation process as users start to learn how to use this new tool and how to apply it to their jobs, similar to how AI is slowly becoming more relevant in the mainstream workplace right now.

"Our research is currently driven by quite general assumptions, but as businesses start to explore the potential of quantum computing for them, more specific requirements will emerge and drive research into new directions."

Blind quantum cloud computing requires connecting a client computer that can detect photons, or particles of light, to a quantum computing server with a fibre optic cable (Figure B). The server generates single photons, which are sent through the fibre network and received by the client.


The client then measures the polarisation, or orientation, of the photons, which tells it how to remotely manipulate the server in a way that will produce the desired computation. This can be done without the server needing access to any information about the computation, making it secure.

To provide additional assurance that the results of the computation are not erroneous or have been tampered with, additional tests can be undertaken. While tampering would not harm the security of the data in a blind quantum computation, it could still corrupt the result and leave the client unaware.

"The laws of quantum mechanics don't allow copying of information, and any attempt to observe the state of the memory by the server or an eavesdropper would corrupt the computation," Dr. Drmota explained to TechRepublic in an email. "In that case, the user would notice that the server isn't operating faithfully, using a feature called verification, and abort using their service if there are any doubts.

"Since the server is blind to the computation, i.e., is not able to distinguish different computations, the client can evaluate the reliability of the server by running simple tests whose results can be easily checked.

"These tests can be interleaved with the actual computation until there is enough evidence that the server is operating correctly and the results of the actual computation can be trusted to be correct. This way, honest errors as well as malicious attempts to tamper with the computation can be detected by the client."
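The statistical idea behind those interleaved tests can be sketched in a toy model: hide "trap" rounds with known answers among the real ones, and a server that tampers is overwhelmingly likely to fail a trap. The simulation below is a deliberately simplified illustration of that principle, not the researchers' actual protocol:

import random

random.seed(1)

def run_session(server_tampers: bool, n_rounds=200, trap_fraction=0.5) -> bool:
    """Return True if the client accepts the session."""
    for _ in range(n_rounds):
        is_trap = random.random() < trap_fraction
        # Model tampering as randomly corrupting each round's outcome.
        correct = (not server_tampers) or random.random() < 0.5
        if is_trap and not correct:
            return False  # a trap failed: the client aborts, as described above
    return True

print("honest server accepted:   ", run_session(server_tampers=False))  # True
print("tampering server accepted:", run_session(server_tampers=True))   # almost surely False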


The researchers found the computations their method produced could be verified robustly and reliably, as per the paper. This means that the client can trust the results have not been tampered with. It is also scalable, as "the number of quantum elements being manipulated for performing calculations can be increased without increasing the number of physical qubits in the server and without modifications to the client hardware," the scientists wrote.

Dr. Drmota said in the press release: "Using blind quantum computing, clients can access remote quantum computers to process confidential data with secret algorithms and even verify the results are correct, without revealing any useful information. Realising this concept is a big step forward in both quantum computing and keeping our information safe online."

The research was funded by the UK Quantum Computing and Simulation Hub, a collaboration of 17 universities supported by commercial and government organisations. It is one of four quantum technology hubs in the UK National Quantum Technologies Programme.

Read the rest here:
Quantum Cloud Computing Secured in New Breakthrough at Oxford - TechRepublic

Read More..

Oracle to Invest More Than $8 Billion in Cloud Computing and AI in Japan – PR Newswire

Planned investment will grow Oracle Cloud Infrastructure's footprint in Japan

Expanded local support and operations team will help customers and partners address digital sovereignty requirements

AUSTIN, Texas and TOKYO, April 17, 2024 /PRNewswire/ -- Oracle Corporation Japan today announced that it plans to invest more than $8 billion over the next 10 years to meet the growing demand for cloud computing and AI infrastructure in Japan. The investment will grow Oracle Cloud Infrastructure's (OCI) footprint across Japan. In addition, to help customers and partners address the digital sovereignty requirements in Japan, Oracle will significantly expand its operations and support engineering teams with Japan-based personnel.

"We are dedicated to meeting our customers and partners where they are in their cloud journey," said Toshimitsu Misawa, member of the board, corporate executive officer and president, Oracle Corporation Japan. "By growing our cloud footprint and providing a team to support sovereign operations in Japan, we are giving our customers and partners the opportunity to innovate with AI and other cloud services while supporting their regulatory and sovereignty requirements."

Oracle plans to increase local customer support of its public cloud regions in Tokyo and Osaka and its local operations teams for Oracle Alloy and OCI Dedicated Region. This will enable governments and businesses across Japan to continue to move their mission-critical workloads to the Oracle Cloud and embrace sovereign AI solutions. Oracle sovereign cloud and AI services can be delivered securely within a country's borders or an organization's premises with a range of operational controls. Oracle is the only hyperscaler capable of delivering AI and a full suite of 100+ cloud services locally, anywhere.

Oracle's Distributed Cloud Delivers the Benefits of Cloud with Greater Control and Flexibility

OCI's distributed cloud lineup supports:

Additional Resources

About Oracle
Oracle offers integrated suites of applications plus secure, autonomous infrastructure in the Oracle Cloud. For more information about Oracle (NYSE: ORCL), please visit us at http://www.oracle.com.

Trademarks
Oracle, Java, MySQL and NetSuite are registered trademarks of Oracle Corporation. NetSuite was the first cloud company, ushering in the new era of cloud computing.

SOURCE Oracle

See the rest here:
Oracle to Invest More Than $8 Billion in Cloud Computing and AI in Japan - PR Newswire

Read More..