
Tech Giant Google Reveals Plans To Merge Its AI Research And DeepMind Divisions – Tech Business News

In a statement on Thursday, Google disclosed intentions to unite its Research and DeepMind divisions, aligning teams dedicated to crafting AI models.

The search engine giant will consolidate teams that focus on building artificial intelligence models across its Research and DeepMind divisions in its latest push to develop its AI portfolio. The move comes amid growing global concerns about AI safety and increasing calls for regulation of the technology.

Gemini, unveiled last year, boasts capabilities to process various data formats, including video, audio, and text. However, Google faced criticism following inaccuracies in historical image generation, prompting a pause in certain image generation functionalities.

While the rollout of Gemini helped boost Alphabet's share price, the model came under fire after inaccuracies in some of its historical image depictions.

Rick Osterloh, previously overseeing Google's hardware efforts, will lead the Platforms and Devices team, emphasising the pivotal role of AI in shaping user experiences.

Osterloh highlights the integration of hardware, software, and AI as crucial for transformative innovations, citing examples such as Pixel's camera technology.

Google emphasises that the consolidation isn't indicative of a shift away from its dedication to the broader Android ecosystem. Instead, it highlights the company's amplified emphasis on integrating AI across its platforms.

See original here:
Tech Giant Google Reveals Plans To Merge Its AI Research And DeepMind Divisions - Tech Business News


Sundar Pichai on merging Android and Pixel teams, Google DeepMind, more – 9to5Google

Sundar Pichai is out with an internal email today detailing the big Platforms & Devices reorganization of Android, Chrome and Pixel, as well as other company-wide changes.

"To truly drive computing forward, we need to do it at the intersection of hardware, software and AI. So we are formalizing the collaboration between DSPA and P&E and bringing the teams together in a new PA called Platforms & Devices."

The Alphabet/Google CEO says this merger will result in "higher quality products and experiences for our users and partners." Specifically, it will "turbocharge the Android and Chrome ecosystems and bring the best innovations to partners faster," with Circle to Search for Samsung cited as an example.

This should speed up decision-making internally. It follows the hardware division in January switching to a functional organization model where, for example, there is one team for hardware engineering across Pixel, Nest, and Fitbit.

Meanwhile, there are other AI changes today. All compute-intensive model building now takes place within Google DeepMind. This gives other teams within Google "single access points for tak[ing] these models and build[ing] generative AI applications."

Meanwhile, Google's Responsible AI teams are moving from Research to DeepMind to be closer to where the models are built and scaled.

"We're standardizing launch requirements for AI-powered features and increasing investments in red team testing for vulnerabilities and broader evaluations to help ensure responses are accurate and responsive to our users' prompts."

Meanwhile, Google Research is getting a clear and distinct mandate to continue investing in foundational and applied computer science research in three key areas.

"Fundamental computer science research is in our DNA and we have some of the world's best computer scientists. We simply would not be the company we are today without the researchers who developed the foundations on which all Google's products are built and are now inventing the foundations for our future."

Pichai ends on a mission-first note:

"We have a duty to be an objective and trusted provider of information that serves all of our users globally. When we come to work, our goal is to organize the world's information and make it universally accessible and useful. That supersedes everything else and I expect us to act with a focus that reflects that."


Originally posted here:
Sundar Pichai on merging Android and Pixel teams, Google DeepMind, more - 9to5Google


Google merges DeepMind and Research teams in latest AI push – Verdict

Google announced on Thursday (18 April) that it will be merging teams that focus on building AI models across its Research and DeepMind divisions, in the latest move by the company to catch up in the GenAI race.

Google will be moving its Responsible AI teams from its Research department to DeepMind in order for it to be closer to where its AI models are being built, according to the company. The Responsible AI team focuses on safe AI development.

The move comes as concern about AI safety grows and global lawmakers increasingly seek effective ways to regulate the rapidly growing technology.

At the start of April, the UK and US governments signed a memorandum of understanding in a partnership to tackle AI safety and ethics.

The two countries previously pledged to work together on AI safety during the UK's AI Safety Summit in November 2023 at Bletchley Park.

Under the partnership, the UK's AI Safety Institute will share its research with the US. The countries have also committed to partnering with other countries on AI safety.


The UK's Science, Technology and Innovation Secretary, Michelle Donelan, described the partnership as a "landmark moment" in AI development.

"We have always been clear that ensuring the safe development of AI is a shared global issue," she said. "Only by working together can we address the technology's risks head on and harness its enormous potential to help us all live easier and healthier lives."

GlobalData forecasts that the overall AI market will be worth $909bn (£712.25bn) by 2030, registering a compound annual growth rate (CAGR) of 35% between 2022 and 2030.

In the GenAI space, revenues are expected to grow from $1.8bn in 2022 to $33bn in 2027 at a CAGR of 80%.
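As a quick arithmetic check (ours, using only the figures quoted above), the GenAI numbers are internally consistent:

```python
# Verify the quoted GenAI growth: $1.8bn (2022) to $33bn (2027).
genai_2022, genai_2027, years = 1.8, 33.0, 5

cagr = (genai_2027 / genai_2022) ** (1 / years) - 1
print(f"implied CAGR 2022-2027: {cagr:.1%}")   # ~78.9%, matching the quoted ~80%
```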


Read more from the original source:
Google merges DeepMind and Research teams in latest AI push - Verdict


Trolling IBM’s Quantum Processor Advantage With A Commodore 64 – Hackaday

[Image: the memory map of the implementation within the Commodore 64's address space; about 15 kB of the accessible 64 kB RAM is used.]

There's been a lot of fuss about the "quantum advantage" that would arise from the use of quantum processors and quantum systems in general. Yet in this high-noise, high-uncertainty era of quantum computing it seems fair to say that the "advantage" part is a bit of a stretch. Most recently an anonymous paper (PDF, starts at page 199) takes IBM's claims for its 127-qubit Eagle quantum processor to their ludicrous conclusion by running the same Trotterized Ising model on the ~1 MHz MOS 6510 processor in a Commodore 64. (Worth noting: this paper was submitted to Sigbovik, the conference of the Association for Computational Heresy.)

We previously covered the same claims by IBM already getting walloped by another group of researchers (Tindall et al., 2024) using a tensor network on a classical computer. The anonymous submitter of the Sigbovik paper based their experiment on a January 2024 research paper by [Tomislav Begušić] and colleagues as published in Science Advances. These researchers also used a classical tensor network to run the IBM experiment many times faster and more accurately, which the anonymous researcher(s) took as the basis for a version that runs on the C64 in a mere 15 kB of RAM, with the code put on an Atmel AT28C256 ROM inside a cartridge from which the C64 then ran it.

The same sparse Pauli dynamics algorithm was used as by [Tomislav Begušić] et al., with some limitations due to the limited amount of RAM, implementing it in 6502 assembly. Although the C64 is ~300,000x slower per datapoint than a modern laptop, it does this much more efficiently than the quantum processor, and without the high error rate. Yes, that means that a compute cluster of Commodore 64s can likely outperform a "please call us for a quote" quantum system, depending on which linear algebra problem you're trying to solve. Quantum computers may yet have their application, but this isn't it, yet.
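For the curious, the computation at issue is short enough to sketch: the IBM experiment evolves a transverse-field Ising model under Trotterized dynamics and reads out single-qubit expectation values. Below is a minimal Python statevector toy of that circuit structure (our own illustration on an 8-qubit chain with arbitrary couplings; it is not the sparse Pauli dynamics method, IBM's 127-qubit layout, or the 6502 code):

```python
import numpy as np

# Trotterized transverse-field Ising dynamics on a small chain:
#   one step = RZZ(2*J*dt) on every edge, then RX(2*h*dt) on every site,
# measuring <Z_0> after each step. Parameters are arbitrary toy choices.
n = 8                       # qubits (tiny compared to Eagle's 127)
J, h, dt, steps = 1.0, 0.7, 0.1, 20

def apply_1q(state, gate, q):
    # reshape so the target qubit is its own axis, contract with the 2x2 gate
    state = state.reshape(2**q, 2, 2**(n - q - 1))
    return np.einsum('ab,ibj->iaj', gate, state).reshape(-1)

def apply_rzz(state, theta, q1, q2):
    # RZZ is diagonal: phase exp(-i*theta/2 * z1*z2) per basis state
    idx = np.arange(2**n)
    z1 = 1 - 2 * ((idx >> (n - q1 - 1)) & 1)
    z2 = 1 - 2 * ((idx >> (n - q2 - 1)) & 1)
    return state * np.exp(-0.5j * theta * z1 * z2)

rx = lambda t: np.array([[np.cos(t/2), -1j*np.sin(t/2)],
                         [-1j*np.sin(t/2), np.cos(t/2)]])

state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                            # start in |00...0>
for step in range(steps):
    for q in range(n - 1):                # ZZ layer on chain edges
        state = apply_rzz(state, 2*J*dt, q, q+1)
    for q in range(n):                    # transverse-field layer
        state = apply_1q(state, rx(2*h*dt), q)
    idx = np.arange(2**n)
    z0 = 1 - 2 * ((idx >> (n - 1)) & 1)   # eigenvalue of Z on qubit 0
    print(f"step {step+1:2d}  <Z_0> = {np.sum(np.abs(state)**2 * z0):+.4f}")
```

An exact statevector like this needs memory exponential in n, which is exactly why the C64 port and the tensor-network papers use cleverer approximate representations instead.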

Thanks to [Stephen Walters] and [Pio] for the tip.

See the article here:
Trolling IBM's Quantum Processor Advantage With A Commodore 64 - Hackaday


French quantum computing powerhouses Pasqal and Welinq announce partnership – Tech.eu

Today, two French quantum computing companies, Pasqal and Welinq, announced a partnership set to bring new standards to the quantum computing industry.

Pasqal builds quantum processors from ordered neutral atoms in 2D and 3D arrays to give its customers a practical quantum advantage and address real-world problems. It was founded in 2019 out of the Institut d'Optique by Georges-Olivier Reymond, Christophe Jurczak, Professor Dr Alain Aspect (Nobel Prize Laureate in Physics, 2022), Dr Antoine Browaeys, and Dr Thierry Lahaye. To date, Pasqal has secured more than €140 million in financing.

Welinq develops and commercialises quantum links based on laser-cooled neutral atom quantum memories to interconnect quantum computers, drastically increasing their computational power and ensuring their deployment in clusters on customer premises.

The company spun out from Sorbonne Université, CNRS and PSL University and was founded in 2022 by Tom Darras, Prof Julien Laurat, Dr Eleni Diamanti and Jean Lautier-Gaud.

Next-generation Quantum Processing Units (QPUs) are expected to execute quantum algorithms relying on a large number of qubits while applying error correction, which would necessitate an even more significant number of physical qubits.

Welinq harnesses a unique solution to interconnect multiple QPUs, significantly enhancing computational power. This facilitates scaling up the number of qubits, optimises QPU deployment and establishes the foundation for expansive quantum networks. Welinq's world-leading quantum memories, which are essential in creating these pivotal quantum links, are central to this breakthrough.

The two companies aim to push the boundaries of quantum processing unit (QPU) interconnectivity. Welinq brings to the partnership its full-stack, turnkey quantum links and the world's most efficient quantum memories based on cold neutral atoms, promising to provide the scalability necessary for achieving fault-tolerant quantum computing.

Pasqal offers expertise in quantum computing with neutral atoms, featuring full-stack capabilities from hardware design and development to software solutions.

By the end of 2024, Welinq targets an industrial prototype of its neutral atom quantum memory with cutting-edge efficiency, storage time, and fidelity. Pasqal aims for a breakthrough in 2024 with 1,000-qubit QPUs. The roadmap peaks in the 2026-2027 horizon with projected 10,000-qubit QPUs and high-fidelity two-qubit gates.

By 2030, they aim to foster a thriving quantum computing ecosystem, driving significant scientific and commercial advancements.

Multiple Pasqal neutral atom quantum processors will be interconnected for the first time, significantly boosting computing power. This represents a substantial step toward developing a complete, fault-tolerant quantum computing architecture that supports distributed computing.

Georges-Olivier Reymond, CEO and co-founder of Pasqal, commented:

"The partnership between Pasqal and Welinq is a strategic step towards practical quantum computing.

Our collaboration is centred on creating tangible solutions by integrating Pasqal's precision in quantum processing with Welinq's innovative networking and quantum memory systems.

This is quantum advancement with real-world application in mind, striving to solve complex problems with greater efficiency and reliability."

According to Tom Darras, CEO & Co-founder of Welinq:

"I am delighted to see that Welinq's unique vision for the scale-up of quantum computing is in alignment with quantum computing leaders like Pasqal,

This is a landmark for boosting the global quantum community towards achieving practical quantum computing in networked quantum computer architectures."


See the rest here:
French quantum computing powerhouses Pasqal and Welinq announce partnership - Tech.eu


Quantum Computing Could be the Next Revolution – Fair Observer

Every few decades, the world witnesses technological revolutions that profoundly change our lives. This happened when we first invented computers, when we created the Internet and most recently when artificial intelligence (AI) emerged.

Today, experts frequently speculate that the next revolution will involve technologies grounded in the principles of quantum mechanics. One such technology is quantum computing. Harnessing the unique properties of quantum mechanics, quantum computers promise to achieve superior computational power, solving certain tasks that are beyond the reach of classical computers.

Quantum computers can potentially transform many sectors, from defense and finance to education, logistics and medicine. However, we are currently in a quantum age reminiscent of the pre-silicon era of classical computers. Back then, state-of-the-art computers like ENIAC ran on vacuum tubes, which were large, clunky, and required a lot of power. During the 1950s, experts investigated various platforms to develop the most efficient and effective computing systems. This journey eventually led to the widespread adoption of silicon semiconductors, which we still use today.

Similarly, today's quantum quest involves evaluating different potential platforms to produce what the industry commonly calls a fault-tolerant quantum computer: a quantum computer that is able to perform reliable operations despite the presence of errors in its hardware.

Tech giants, including Google and IBM, are adapting superconductors (materials that have zero resistance to electrical current) to build their quantum computers, claiming that they might be able to build a reasonably large quantum computer by 2030. Other companies and startups dedicated to quantum computing, such as QuEra, PsiQuantum and Alice & Bob, are experimenting with other platforms and even occasionally declaring that they might be able to build one before 2030.

Until the so-called fault-tolerant quantum computer is built, the industry needs to go through an era commonly referred to as the Noisy Intermediate-Scale Quantum (NISQ) era. NISQ devices contain a few hundred quantum bits (qubits) and are typically prone to errors due to various quantum phenomena.

NISQ devices serve as early prototypes of fault-tolerant quantum computers and showcase their potential. However, they are not expected to clearly demonstrate practical advantages, such as solving large-scale optimization problems or simulating sufficiently complex chemical molecules.

Researchers attribute the difficulty of building such devices to the significant amount of errors (or noise) NISQ devices suffer from. Nevertheless, this is not surprising. The basic computational units of quantum computers, the qubits, are highly sensitive quantum particles easily influenced by their environment. This is why one way to build a quantum computer is to cool these machines to near zero kelvin, a temperature colder than outer space. This reduces the interaction between qubits and the surrounding environment, thus producing less noise.

Another approach is to accept that such levels of noise are inevitable and instead focus on mitigating, suppressing or correcting any errors produced by such noise. This constitutes a substantial area of research that must advance significantly if we are to facilitate the construction of fault-tolerant quantum computers.

As the construction of quantum devices progresses, research advances rapidly to explore potential applications, not just for future fault-tolerant computers, but also possibly for today's NISQ devices. Recent advances show promising results in specialized applications, such as optimization, artificial intelligence and simulation.

Many speculate that the first practical quantum computer may appear in the field of optimization. Theoretical demonstrations have shown that quantum computers will be capable of solving optimization problems more efficiently than classical computers. Performing optimization tasks efficiently could have a profound impact on a broad range of problems. This is especially the case where the search for an optimized solution would usually require an astronomical number of trials.

Examples of such optimization problems are almost countless and can be found in major sectors such as finance (portfolio optimization and credit risk analysis), logistics (route optimization and supply chain optimization) and aviation (flight gate optimization and flight path optimization).

AI is another field in which experts anticipate quantum computers will make significant advances. By leveraging quantum phenomena such as superposition, entanglement and interference (which have no counterparts in classical computing), quantum computers may offer advantages in training and optimizing machine learning models.

However, we still do not have concrete evidence supporting such claimed advantages as this would necessitate larger quantum devices, which we do not have today. That said, early indications of these potential advantages are rapidly emerging within the research community.

Simulating quantum systems was the original application that motivated the idea of building quantum computers. Efficient simulations will likely drastically impact many essential applications, such as materials science (finding new materials with superior properties, such as better batteries) and drug discovery (developing new drugs by more accurately simulating quantum interactions between molecules).

Unfortunately, with the current NISQ devices, only simple molecules can be simulated. More complex molecules will need to wait for the advent of large fault-tolerant computers.

There is uncertainty surrounding the timeline and applications of quantum computers, but we should remember that the killer application for classical computers was not even remotely envisioned by their inventors. A killer application is the single application that contributed the most to the widespread use of a certain technology. For classical computers, the killer application, surprisingly, turned out to be spreadsheets.

For quantum computers, speculation often centers around simulation and optimization being the potential killer applications of this technology, but a definite winner is still far from certain. In fact, the quantum killer application may be something entirely unknown to us at this time and it may even arise from completely uncharted territories.

[Will Sherriff edited this piece.]

The views expressed in this article are the author's own and do not necessarily reflect Fair Observer's editorial policy.

Read the rest here:
Quantum Computing Could be the Next Revolution - Fair Observer


Exploring the Power of Quantum AI: What You Need to Know – Scioto Valley Guardian

Quantum AI is a fascinating field that combines the power of quantum computing with artificial intelligence to unlock new possibilities and revolutionize industries. In this article, we will delve into the basics of quantum computing, explore the next frontier of quantum AI algorithms, examine cutting-edge applications across various industries, discuss the challenges and opportunities that come with this technology, and speculate on the future of quantum AI. Whether you're a seasoned tech enthusiast or simply intrigued by the potential of groundbreaking innovations, the emergence of platforms like quantumai.co underscores the growing importance and accessibility of quantum AI in shaping the technological landscape of tomorrow.

The field of quantum computing represents a fundamental change in computational approach, moving away from classical computing's binary logic and towards the probabilistic domain of quantum mechanics. Fundamentally, quantum computing uses quantum bits, or qubits, to manipulate data by utilizing the laws of superposition and entanglement. Qubits are different from classical bits in that they can exist in more than one state at once. This allows for the processing of information in parallel and exponentially increases computing capacity. Shor's algorithm for integer factorization and Grover's algorithm for database search are two examples of algorithms that highlight the revolutionary potential of quantum computing in resolving intricate issues that are beyond the scope of classical systems.
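To make Grover's square-root speedup concrete, here is a toy statevector sketch in Python (our own illustration; the three-qubit search space and the marked index 5 are arbitrary assumptions, not anything from the article) showing how roughly (π/4)·√N iterations concentrate probability on the marked item:

```python
import numpy as np

# Toy Grover search: n = 3 qubits, N = 8 states, one marked item.
n, marked = 3, 5
N = 2 ** n

state = np.full(N, 1 / np.sqrt(N))          # uniform superposition
iters = int(round(np.pi / 4 * np.sqrt(N)))  # ~2 iterations for N = 8

for _ in range(iters):
    state[marked] *= -1                      # oracle: flip the marked amplitude
    state = 2 * state.mean() - state         # diffusion: inversion about the mean

print(f"P(marked) after {iters} Grover iterations: {state[marked]**2:.3f}")
```

For N = 8, two iterations already push the success probability above 94%, whereas a classical search would expect to check N/2 = 4 entries on average.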

The cutting edge of computational innovation is embodied in quantum AI algorithms, which combine the intelligence of artificial neural networks with the capabilities of quantum computing. Equipped with the concepts of quantum parallelism and superposition, these algorithms go beyond the limitations of traditional machine learning models by enabling rapid data processing and improved optimization methods. The field is expanding at an unprecedented rate, with groundbreaking developments ranging from quantum-inspired optimization algorithms such as the Quantum Approximate Optimisation Algorithm (QAOA) to quantum neural networks for pattern recognition and classification. These algorithms present unmatched prospects for propelling scientific research and expanding technological boundaries.

Quantum AI has the potential to revolutionize a wide range of sectors by spurring creativity and altering long-standing paradigms. By utilizing these optimization algorithms to negotiate complicated market dynamics and open up new paths for profit maximization, quantum AI in finance empowers algorithmic trading tactics, risk management protocols and portfolio optimization procedures. Quantum AI in healthcare heralds a new era of precision medicine and tailored medicines by streamlining drug discovery pipelines, facilitating genome sequencing and analysis, and enabling personalized treatment regimens. This AI also improves inventory management systems, expedites route optimization, and boosts demand forecasting skills in logistics and supply chain management, all of which maximize operational effectiveness and resource utilization.

Although quantum AI has countless potential applications, there are many difficulties and barriers in the way of its actualization. One of the biggest obstacles in the way of effective computing is still the search for fault-tolerant quantum hardware that can maintain stable qubits and reduce quantum decoherence. In addition, interdisciplinary cooperation and coordinated research efforts are required to develop scalable quantum algorithms and error correction codes, overcome current obstacles and realize the full potential of quantum AI. However, these difficulties also present previously unheard-of chances for creativity, teamwork and societal effect, highlighting the revolutionary potential of quantum AI in reshaping both technology and humankind.

Quantum AI is expected to evolve through a trajectory of rapid innovation, revolutionary breakthroughs, and paradigm shifts in computational approaches as we move towards a future driven by quantum energy. From ground-breaking studies in quantum information theory to industrial applications in quantum computing and artificial intelligence, the field of quantum AI is changing at a rate never seen before, changing entire industries, transforming scientific research, and advancing humankind to new heights of comprehension. The potential of quantum AI to surpass imagination and usher in an unprecedented era of technical growth and societal upheaval is contingent upon sustained investment, collaboration, and inventiveness.

To sum up, quantum AI is a cutting-edge technical advancement that embodies a stunning combination of artificial intelligence and quantum computing, with the potential to redefine human achievement. By exploring the complexities of artificial intelligence with quantum mechanics, we can open up new possibilities outside the scope of traditional computing paradigms. We set out on a voyage of exploration and invention as we negotiate the difficulties of quantum AI, driven by the unquenchable quest for knowledge and advancement.

The field of quantum AI is constantly growing, offering numerous chances for groundbreaking discoveries, cross-disciplinary cooperation and societal influence. Quantum AI is a progress accelerator that will lead us to a future filled with limitless potential and unimaginable possibilities, revolutionizing everything from industries to scientific frontiers to solving urgent global concerns. As we approach the dawn of a quantum-powered era, let us seize the opportunity presented by quantum AI and use its revolutionary potential to create a better, more promising future for coming generations.

View original post here:
Exploring the Power of Quantum AI: What You Need to Know - Scioto Valley Guardian


‘Almost very close’ to nuclear weapon: Federal cyber officials brace for quantum computing surprise – Washington Times

Federal cybersecurity officials are preparing for a quantum computing surprise that requires the largest change in encryption ever to safeguard Americans data from foreign hackers.

The Cybersecurity and Infrastructure Security Agency's Garfield Jones said Tuesday that the emergence of a cryptanalytically relevant quantum computer will upend digital security in unprecedented ways and that people need to prepare immediately.

Such a device, dubbed CRQC, would be capable of breaking encryption to expose government secrets and people's personal information to anyone who uses the machine, according to cyber officials.

Nations will rush to develop the tech and keep it hidden from public view in order to steal their enemies' data while upending information security in the process, according to Mr. Jones, CISA associate chief of strategic technology.

"When it drops, it's not going to be, I don't think it's going to be a slow drop," Mr. Jones told cyber officials assembled at the U.S. General Services Administration. "I think once someone gets this CRQC, none of us will know."

Quantum computers promise speeds and efficiency that today's fastest supercomputers cannot match, according to the National Science Foundation. Classical computers have more commercial value now because quantum computers have not yet proven capable of correcting errors involving encoded data.

A cryptanalytically relevant quantum computer, the CRQC, will be capable of correcting errors, according to Mr. Jones, and of performing tasks that other computers cannot approach.
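The reason such a machine threatens RSA-style encryption is Shor's algorithm, whose quantum core finds the period of modular exponentiation exponentially faster than any known classical method; the period then yields the secret factors. A toy classical emulation of that period-finding step (our own sketch; N = 15 and a = 7 are illustrative choices, not figures from the article) shows the idea:

```python
from math import gcd

# Classical emulation of the order-finding step at the heart of Shor's
# algorithm (the step a CRQC would perform quantumly on 2048-bit numbers).
N, a = 15, 7

r = 1
while pow(a, r, N) != 1:    # find the order r of a modulo N
    r += 1

# If r is even and a^(r/2) is not -1 mod N, gcd reveals the factors.
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print(f"order r = {r}, factors: {p} x {q}")   # r = 4 -> 3 x 5
```

Classically, finding r for a 2048-bit N is infeasible; a CRQC would do it in polynomial time, which is why the migration to post-quantum cryptography is underway now.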

Preparations for defense against such technology are underway across the federal government.

Art Fuller, who is leading the Justice Department's post-quantum cryptography efforts, said developing secure systems presents a huge challenge that cannot be solved by flipping a switch.

"This is the largest cryptographic migration in history," Mr. Fuller told officials at Tuesday's event.

Estimates on the timing of the creation of such a quantum computer vary, but Mr. Jones said large-scale quantum computers remain in the early stages of research and development and could still be a ways off.

Regardless, Mr. Jones cautioned digital defenders against delaying preparation for the arrival of such technology.

He described the environment surrounding the development of the CRQC as "almost very close" to a nuclear weapon, with nations competing to obtain the machine and keep it top secret.

"You never know, three years from now, you might have a CRQC, but I think planning and getting that preparation in place will help you protect that data," Mr. Jones said.

The National Security Agency similarly fears the arrival of a CRQC in the hands of America's enemies.

NSA Director of Research Gil Herrera said last month that teams around the world are building with different technologies and could develop something representing a black swan event, an extremely unexpected occurrence with harsh consequences.

"If this black swan event happens, then we're really screwed," Mr. Herrera said, citing potential damage to everything from financial transactions to sensitive communications for nuclear weapons.

Mr. Herrera did not forecast precisely when a nation could develop such a device in remarks at the Intelligence and National Security Alliance event but indicated it may take a long time to achieve.

View original post here:
'Almost very close' to nuclear weapon: Federal cyber officials brace for quantum computing surprise - Washington Times


NJIT Computer Science and Engineering Experts Talk About 'Smart' Cities – NJIT News

Accomplished computer science and engineering professors at New Jersey Institute of Technology were among the featured speakers at a conference about creating smart cities that was organized by two centers of NJIT's Martin Tuchman School of Management: the Leir Research Institute and Hub for Creative Placemaking.

Distinguished Professor of Computer Science Guiling "Grace" Wang talked about her research into developing responsive traffic signals whose timing adjusts based on the volume of traffic. Artificial intelligence is central to that project.

In addition, Wang, who's also associate dean of research at NJIT's Ying Wu College of Computing and director at the university's Center for AI Research, detailed her efforts to use blockchain technology to create a decentralized credential management system for vehicles that's secure and protects privacy.

Assistant Professor of Civil and Environmental Engineering Rayan Hassane Assaad identified 10 technological disruptions that are spurring the development of smart cities, including drones, sensors, wearables, automation, the internet of things and self-driving cars.

Assaad, founding director at the Smart Construction and Intelligent Infrastructure Systems Lab at NJIT's Newark College of Engineering, also emphasized the importance of data collection, interconnectedness and intelligence in managing infrastructure systems, be they in transportation, water management or energy. He ended his presentation with a quote from entrepreneur Jim Rohn: "For things to change, you have to change."

The professors were among nine speakers at the conference, which took place at NJIT. The experts came from a variety of organizations, including the New Jersey Economic Development Authority, Regional Plan Association, National Endowment for the Arts, Gensler design and architecture firm, Choose New Jersey and New Jersey Business Action Center.

The hub for the conference was apropos as NJIT is known for bringing people together, advancing knowledge and spurring innovation, as President Teik C. Lim and Martin Tuchman School of Management Dean Oya Tukel noted in their welcoming remarks.

The keynote speaker was an interesting choice: Ben Stone, the design and creative placemaking director at the NEA. Stone, who holds degrees in fine arts, American studies and city planning, shared nine examples of how design and creativity can help shape and explain new infrastructure projects.

On a day of rapid-fire, PowerPoint-fueled show-and-tell, Stone also detailed NEA grant programs that support community development around the country. Our Town, for example, funded the painting of murals on buildings in Baltimore that teased the transformation of three neighborhoods into an arts and entertainment district.

Stone, whose grandfather taught mechanical engineering at what was then Newark Technical School (now NJIT), illustrated the need to put people first in the development of smart cities, a theme that other speakers echoed throughout the conference.

"The design expertise that our team provides is not just about aesthetics. It's about economics. It's about connections. It's about community development, bringing people together around a shared vision," Stone said.

"Artists and designers are creative problem-solvers who can work alongside all of you," he added, noting that they can serve as "allies in the work you all are doing, allies in community development work, allies in thinking about the future of our communities and thinking about them creatively and holistically together."

View post:

NJIT Computer Science and Engineering Experts Talk About 'Smart' Cities - NJIT News


The Necessary Evil of Computer Science 124 | Magazine – Harvard Crimson

In late March, Noa Kligfeld '24 sent an email to various College mailing lists with the subject line, "HAVE THOUGHTS ABOUT CS 124?" In the email, Kligfeld, a Computer Science concentrator, explained that she was working on a class project to help make CS 124 better. She hoped to share the survey results with Harvard professors to improve the class experience.

Why might Kligfeld hope to make the course better? Take a look at the Spring 2023 QReports: "Do not take this class for pride"; "No social life. You will be maiden-less."

Most students, however, aren't taking Computer Science 124: Data Structures and Algorithms for pride. They're taking it to fulfill the computer science concentration's Algorithms requirement. Hence the course's description as a "necessary evil" in the Q Guide.

Last year, the average number of hours students reported spending on coursework outside of class per week was 16.70, with the plurality of students reporting it took up 18-20 hours a week. But this isn't the only reason the course draws complaints: while many computer science students are hoping to use their degree to go into industry, CS 124 is a theoretical course, focusing more on proofs than programming. This is consistent with Harvard's CS department at large. Many students set foot on campus hearing about how theoretical Harvard's CS department is.

Adam C. Hesterberg, the current assistant director of undergraduate studies for CS, says that the theoretical focus of courses like CS 124 is an attempt to circumvent the rapidly changing industry trends.

"We try to teach skills that will be useful to computer scientists even when the hottest language in industry moves onto the next thing, probably in a few years," he says.

He views the department as equipping students with timeless skills, not the currently useful languages in the industry.

Boaz Barak, a CS professor and co-director of undergraduate studies, also hopes undergraduates will find a broad range of applicability from their CS studies.

"Our goal is to prepare Harvard CS concentrators to many possible career options," Barak says. "Courses that focus too much on practical knowledge may become outdated before students even graduate."

Hesterberg notes that he frequently hears back from former students of CS 124 who state that what they learned in 124 "was really useful to them." Similarly, Barak says that "it is no accident that the material of CS 124 is the one that is often used by tech companies in their interviews."

Computer science concentrator Amulya Garimella '25 agrees, saying she really enjoyed the theoretical aspects of Harvard's computer science classes.

"Understanding deep down why things work is helpful. I think that it also helps you understand how to think about problems like a machine would, which I think is really helpful," says Garimella, a former Crimson magazine editor.

Garimella says that anyone could easily learn a language by searching up the syntax, but the ability to understand the "super base layer" of computer science has allowed her to code better in her research positions and even in a biotech startup.

"Most languages are pretty similar, deep down. They have some very important differences in how they're implemented," she says. "Certainly GPT-5 will be able to code very well, but deep down, it's the math and the theory that even made those advances possible."

Charlie Chen '27, who plans to concentrate in Computer Science, says that he's not concerned about finding a job despite Harvard's CS being largely theoretical.

"I'm not too worried. I feel like with SWE interviews nowadays, a lot of the prep work comes from outside of classes where you have to write code. And Harvard also does have a lot of clubs that provide great relevant experience, like T4SG," he says, referring to Tech For Social Good.

Still, there's the course's rigor: the 16.7 hours a week that caused one QReports writer to say, "Do not take this class if you wish to have work-life balance."

In response to complaints and questions from students about the course's rigor, Barak clarifies, "The instructors of CS 124 have worked at reducing difficulty in recent years, and in particular making it less dense by eliminating material that appears in other courses. We certainly shouldn't make [the] course hard for the sake of being hard."

The growing number of CS concentrators, which roughly doubled in the last decade according to Hesterberg, has also presented a challenge in sourcing enough teaching staff to support students. Barak says this staff shortage likely contributes to the negative experiences students have reported in more challenging classes.

Though the rigorous theoretical and mathematical components have dissuaded many students from pursuing a computer science concentration, others remain undeterred.

"I feel like the material we learn is all really interesting, because it's just very problem solve-y," Chen says.

Typically, those with stronger math backgrounds coming into college have found the theoretical CS classes relatively easier, making the barriers of entry higher for those from under-resourced schools and backgrounds.

"Usually the people who are super exposed to math find it really easy to just pick up coding," says Garimella, who took a linear algebra class in high school. "To keep up, it's definitely a challenge."

In a similar vein, many QReport comments recommend that students have a solid math foundation before taking the course, with one even suggesting that you should take the class "if you excel in Math 55."

Chen, however, says that he feels math background "isn't that big."

"A lot of the hard part of the course comes from being able to absorb a lot of hard information quickly," he says. "It's a lot more a question of how much time you can put into going to class, going to lectures, going to section and office hours."

Garimella says that students might perceive the theoretical CS classes to be harder because they came into college having done well in their studies.

"You come to Harvard, and you might have a math lecture where you just understand none of that. I think that's really disheartening," she says.

"I think people should just stop being scared about these courses, especially if you don't want to go to grad school and your grades don't matter as much," she adds. "I think that people should be more comfortable with going to lecture not understanding anything."

At the end of the day, CS 124 might not be all that different from the courses at other schools.

"Two of my apartment-mates are software engineers at Google who went to MIT, and were complaining pretty similarly about thinking that MIT's CS classes were not really relevant to their jobs," says Hesterberg, laughing. "My MIT alum apartment-mates were impressed at the practical applicability of our CS classes. So it seems like there is some amount of a 'grass is always greener on the other side' aspect."

Magazine writer Chelsie Lim can be reached at chelsie.lim@thecrimson.com.

Staff writer Xinni (Sunshine) Chen can be reached at sunshine.chen@thecrimson.com. Follow her on X @sunshine_cxn.

Go here to see the original:

The Necessary Evil of Computer Science 124 | Magazine - Harvard Crimson
