
Cameron University’s CU in Computing presentation to focus on Computer Science and Information Technology degree programs – Duncan Banner

If you're considering a degree from Cameron University's Department of Computing and Technology but aren't sure of the differences between the department's two undergraduate degree programs -- Computer Science and Information Technology -- the upcoming CU in Computing event has the answers. The free, virtual informational presentation will feature specifics about each degree program, including courses, options and undergraduate research opportunities. Career options will also be covered. CU in Computing will take place on Monday, Nov. 22, from 5:30 to 6:30 p.m.

According to the U.S. Bureau of Labor Statistics, employment in computer and information technology occupations is projected to grow 11 percent by 2029, much faster than the average for all occupations. These occupations are projected to add more than half a million new jobs.

The Bachelor of Science in Computer Science degree program focuses on the study of computing technologies, including hardware and software. It includes the systematic study of computing systems and computation. A student graduating with this degree can easily move into the industry or pursue graduate studies. Graduates of this program go on to various careers in video gaming, business, technology, government, and intelligence and law enforcement. Other career options include the manufacturing sector as software and web developers, database administrators, network engineers and administrators, data analysts, and more.

The Associate in Applied Science and Bachelor of Science in Information Technology degree programs prepare graduates for employment requiring expertise as an information technology specialist. Modern-day businesses and industries employ a wide variety of technologies, and these businesses and industries need technology specialists to develop, implement and maintain the technology. Therefore, the department's information technology curricula are flexible, with options in Cyber Security and Information Assurance, Management Information Systems, and Technology. Graduates of this program will be successful in obtaining a variety of positions in business, industry and government.

To register, go to https://www.cameron.edu/comptech/events. Registrants will receive a secure link to the presentation. For more information, contact Dr. Muhammad Javed, Chair, CU Department of Computing and Technology, at mjaved@cameron.edu or call 580-581-2335.


Bias in Algorithms | The Inference Project – Yale News

Artificial algorithms are increasingly being deployed to inform, endorse, and govern various aspects of today's society. Their reach includes the domains of hiring, lending, medicine, criminal justice, insurance, allocation of public services, social and business interactions, and the dissemination of information and news. Through a synthesis of computational and statistical models for representing concepts, human-generated datasets that provide examples for training, and powerful optimization algorithms that can efficiently navigate through vast and complex landscapes to infer concepts that explain data, such algorithms have taken big strides towards mimicking various aspects of natural intelligence.

These algorithms have led to tremendous economic and social impact but have also been shown to be biased: they can discriminate, reinforce prejudices, polarize opinions, and influence political processes. How can subjective human or societal biases emerge in the objective world of artificial algorithms? And how can we design algorithms free from these limitations?

The search for answers to these questions also leads us to some understanding of the bias in human decision-making algorithms.

Professor Aleksander Mądry, who will lead the post-talk discussion on Monday, November 22, is the Cadence Design Systems Professor of Computing in the MIT Electrical Engineering and Computer Science Department and a member of the Computer Science and Artificial Intelligence Lab at MIT. He received his Ph.D. from MIT in 2011 and, prior to joining the university's faculty, spent a year as a postdoctoral researcher at Microsoft Research New England. He also was on the faculty of EPFL until early 2015. Professor Mądry is currently serving as the Director of the MIT Center for Deployable Machine Learning and is the Faculty Lead of the CSAIL-MSR Trustworthy and Robust AI Collaboration. His research spans machine learning, optimization, and algorithmic graph theory, and he has a strong interest in building on existing machine learning techniques to forge a decision-making toolkit that is reliable and well-understood enough to be safely and responsibly deployed in the real world.

Register in advance for this webinar and the post-talk conversation:


The tech behind popular cryptocurrencies, explained – Popular Science

Whether you're on the head or tail end of the cryptocurrency craze, one thing is for sure: These digital assets are hitting the mainstream hard, and don't seem to be going away anytime soon. Notably, the country of El Salvador recently adopted bitcoin as legal tender, and New York's incoming mayor Eric Adams is intent on transforming New York City into a hotspot for cryptocurrency.

Although only 16 percent of Americans say they invested, traded, or used cryptocurrency, almost 90 percent have heard about it, according to a recent Pew Research Center survey.

Advocates for cryptocurrency and decentralized finance (where people can make financial deals with one another without being moderated by a middleman or central authority like a bank) generally argue that these platforms are transparent and simultaneously anonymous, both good things.

The key to this vision lies in a digital technology called the blockchain, which undergirds all cryptocurrencies. The blockchain serves as a virtual hall of records, or a public ledger, that records every transaction, detailing the amount as well as the sender's and receiver's wallet addresses.

Yet critics and regulatory bodies are worried about the potential for harm from cryptocurrencies, such as people using them for scams, money laundering, or funding illegal activities (not to mention the enormous carbon footprint that some of these cryptocurrencies have). And experts have raised concerns about the resilience of cryptocurrency networks against attacks, and about whether the designs of some systems have warped over time to become centralized or to inherently allow the rich to get richer.

[Related: Cryptocurrency scammers are mining dating sites for victims]

For those who are just wading into crypto territory, here's a basic explainer on how the computer science behind these systems works.

To start at the front end, this is what happens when you send and receive cryptocurrency. Keep in mind that all cryptocurrencies, bitcoin included, are just computer programs, and that these coins are not actually money but snippets of computer code that transfer value from one user to another. To become a part of this process, you first have to create a digital wallet. Bitcoin and Ethereum both have recommendations on which wallets work best with their cryptocurrency, and specialty exchanges like Coinbase and Gemini also offer wallets.

Whenever you create a new wallet, the algorithm running that cryptocurrency will generate a paired private key and public key associated with it. You can think of the public key as something like an address or bank account number, while the private key proves your ownership. These keys are long strings of characters that identify where the crypto should go. Usually, an address only accepts the type of cryptocurrency it's affiliated with (although tools called cross-chain bridges and exchanges can help link up different cryptocurrencies).
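The key-pair idea can be sketched in a few lines of Python. This is only an illustration, with made-up key formats: real wallets derive the public key from the private key with elliptic-curve math (Bitcoin uses the secp256k1 curve) and encode addresses with extra hashing and checksums, so plain SHA-256 here merely stands in for those one-way steps.

```python
import hashlib
import secrets

# Toy key generation: hashing stands in for the one-way elliptic-curve
# step that real cryptocurrencies use to derive public keys.
private_key = secrets.token_hex(32)  # 256 random bits; never shared
public_key = hashlib.sha256(private_key.encode()).hexdigest()
address = hashlib.sha256(public_key.encode()).hexdigest()[:40]

print(len(private_key), len(address))  # 64 40
```

Anyone can recompute the address from the public key, but no one can feasibly recover the private key from either, which is what makes the public half safe to share.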

[Related: 6 apps to get you started on crypto]

"You do not have bitcoins in your possession; you have proof that somebody in the past sent you those bitcoins," says Nicolas Christin, an associate professor of computer science, engineering, and public policy at Carnegie Mellon University.

You can then tap some of the unspent value in your wallet and send it to someone else's public key. When you sign to verify that you want to send the bitcoins, you generate a small personalized piece of code attached to the transaction, and the system creates a mathematical puzzle that locks up that value and scrambles the code. When the recipient is ready to spend the money, they will put a corresponding piece of code into the transaction. Everybody in the network can verify that the two pieces of code fit together (through a process called transaction confirmation, also known as mining; more on that later). This entire operation is called signature verification.
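A drastically simplified sketch of that lock-and-fit idea in Python: real Bitcoin transactions use digital signatures (ECDSA or Schnorr) rather than the bare hash puzzle below, so treat this as an analogy for how anyone on the network can check that the recipient's piece fits without learning the secret ahead of time. The key names are invented.

```python
import hashlib

def lock(secret: str) -> str:
    """The sender locks value to the hash of the recipient's secret."""
    return hashlib.sha256(secret.encode()).hexdigest()

def fits(revealed: str, puzzle: str) -> bool:
    """Anyone on the network can verify that a revealed piece matches the lock."""
    return hashlib.sha256(revealed.encode()).hexdigest() == puzzle

puzzle = lock("carols-key")         # recorded publicly with the transaction
print(fits("carols-key", puzzle))   # True: the rightful owner's piece fits
print(fits("mallorys-key", puzzle)) # False: anyone else's does not
```

The asymmetry Christin describes is visible here: producing a matching piece without the secret means inverting the hash, while checking a candidate piece takes one cheap computation.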

[Related: Bitcoin is having a bumpy rollout as an official currency in El Salvador]

"It's impossible for someone to find a missing piece if they don't have the right information, but it's super easy for anybody to verify that two pieces fit," Christin explains. Bitcoin has very little additional computational capability beyond signature verification. "Satoshi Nakamoto's [the pseudonym of the alleged creator of Bitcoin] vision was to have programmable money, initially. The problem is Bitcoin became very popular very quickly and the developers decided to freeze the features where they were."

However, a new upgrade released last week could open up the possibility for supporting expanded functions beyond signature verification.

Many modern cryptocurrencies derive from the Bitcoin model. For example, Litecoin is in many respects similar to Bitcoin, but the puzzle component was slightly altered. Its creators replaced the mining algorithm used in Bitcoin (called SHA-256) with a function called Scrypt, which they claim takes less energy to run. On the other hand, the creators of Bitcoin Cash branched off from a team that was working on Bitcoin to make a Bitcoin-esque cryptocurrency that can process more transactions per second.

Ethereum, however, takes a different approach. Its blockchain has an added feature called loops, which allows it to repeatedly run a piece of code, and engineers can program on top of it. Ethereum uses a mechanism called gas, which charges the person who initiated the transaction a fee to run a programming instruction. The program burns up the gas as it runs, and when it's out of gas, the program either has completed or is terminated.
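The gas mechanism can be mimicked with a toy interpreter: each instruction burns one unit of gas, and execution halts the moment the gas runs out. (Real Ethereum prices each opcode differently and refunds unused gas; the flat one-unit cost and instruction names here are simplifications.)

```python
def run_with_gas(instructions, gas):
    """Run instructions, burning one unit of gas per step; halt when empty."""
    executed = []
    for op in instructions:
        if gas <= 0:
            return executed, "out of gas"  # program terminated mid-run
        executed.append(op)
        gas -= 1
    return executed, "completed"

print(run_with_gas(["load", "add", "store"], gas=5))  # finishes with gas to spare
print(run_with_gas(["loop"] * 100, gas=10))           # a long loop is cut off after 10 steps
```

This is why loops are safe to allow on Ethereum: a runaway program cannot run forever, because its sender's gas budget bounds the work it can do.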

[Related: NFTs are blowing up the digital art and collectibles worlds. Heres how they work.]

Developers can build a cryptocurrency on top of Ethereum (like the stablecoin DAI), create mortgages, or mint unique non-fungible tokens, since they're all pieces of code. "All of those are pieces of code that are extensions of Ethereum transactions," says Christin.

Ethereum is also credited with the nifty innovation of integrating smart contracts into its blockchain. Ethereum's developers describe these as code scripts that perform some actions or computation if certain conditions are satisfied, comparing the logic of the code to how a vending machine works. If a digital art NFT lives inside a smart contract, for example, the artist can create a royalty schedule that accrues a fee every time the art is transferred on the blockchain.
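As a rough sketch of that royalty logic (names and prices are made up, and amounts are in whole cents to keep the arithmetic exact), the contract's bookkeeping comes down to a few lines:

```python
class RoyaltyNFT:
    """Toy version of a smart contract that pays the artist on every resale."""

    def __init__(self, artist, royalty_percent):
        self.artist = artist
        self.royalty_percent = royalty_percent
        self.owner = artist
        self.artist_earnings = 0  # accrued royalties, in cents

    def transfer(self, buyer, price_cents):
        fee = price_cents * self.royalty_percent // 100
        self.artist_earnings += fee  # the royalty accrues to the artist
        self.owner = buyer           # ownership moves to the buyer

nft = RoyaltyNFT(artist="alice", royalty_percent=10)
nft.transfer("bob", 10_000)    # first sale: 1,000 cents to alice
nft.transfer("carol", 25_000)  # resale: 2,500 more cents to alice
print(nft.artist_earnings)     # 3500
```

On a real blockchain this logic would run automatically on every transfer, with no marketplace or lawyer needed to collect the fee.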

Or, as another example, imagine walking into a bank and asking to borrow $10 million for the day without telling anyone your name. Somebodys going to be reaching for a red button under a desk somewhere, says Ari Juels, a professor of computer science at Cornell Tech. But you can actually do something like this on a blockchain.

You would borrow money using a smart contract, and you use it to do whatever you want to do. Typically, it's used for arbitrage, where you buy and sell tokens at a profit. Then, you pay back the loan, and all of that is contained in a single transaction. The way that blockchains work, if you fail to pay back the loan, the whole transaction can just be aborted, Juels says, which means that it's as though you never borrowed the money to begin with.

Now, to peel back the curtains some more: To keep any cryptocurrency system running, there has to be a way to release new coins into the network, along with a way of maintaining the public ledger that tracks where all the new coins come from and where they go.

But since these cryptocurrencies are all meant to be peer-to-peer, there's no one entity that does all this the way a traditional bank does. Instead, the responsibility of running the system falls to the whole network of participants, which is why they have to come to a form of consensus about whether transactions are valid or invalid. Each transaction made on the blockchain needs to be verified. A batch of transactions makes up a block, and several blocks make up a chain.
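The batching-into-blocks structure is straightforward to sketch: each block stores the hash of the previous one, so tampering with any old block invalidates every link after it. (Real blockchains add Merkle trees, timestamps, and consensus rules on top of this skeleton; the transactions here are placeholders.)

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's canonical (key-sorted) JSON form
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, transactions):
    prev = block_hash(chain[-1]) if chain else "0" * 64  # the genesis block has no parent
    chain.append({"prev_hash": prev, "transactions": transactions})

chain = []
add_block(chain, ["A pays B 1"])
add_block(chain, ["B pays C 2"])

print(chain[1]["prev_hash"] == block_hash(chain[0]))  # True: links intact
chain[0]["transactions"][0] = "A pays B 100"          # tamper with history
print(chain[1]["prev_hash"] == block_hash(chain[0]))  # False: the chain no longer matches
```

Because every participant holds a copy of the chain, a tampered history stands out immediately: the hashes simply stop lining up.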

"The blockchain provides you with a different trust model," says Juels. "The rules are very well defined, and transactions can be executed in a rigorous, programmatic way."

[Related: What exactly is a digital dollar, and how would it work?]

There are a variety of methods used by different cryptocurrencies to accomplish those two standard tasks. Proof-of-work is the process used by most cryptocurrencies, including Bitcoin and Ethereum, to do this. Although all users get to check that the transaction was good in the end, only one user is elected to lead the validation, add the transaction to the blockchain, and receive a reward. These rewards are how new coins get released into the system. This operation is also known as mining. But first, the users, called miners, have to compete against each other to solve a cryptographic puzzle whose difficulty is proportional to the number of people trying to solve it. The puzzle is created by an algorithm. The only way to solve it is to try many different numbers, and powerful computers or processors can try more numbers more quickly, so they are more likely to get the correct answer.
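The number-guessing contest can be sketched directly. This toy miner demands a SHA-256 hash with four leading zero hex digits; Bitcoin actually double-hashes an 80-byte block header against an adjustable numeric target, so this is a simplification of the same idea.

```python
import hashlib

def mine(block_data, difficulty=4):
    """Try nonces until the hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1  # guessing is the only strategy

nonce, digest = mine("A pays B 1 coin")
print(digest[:4])  # "0000": the winning hash meets the difficulty target
```

Verifying the answer takes a single hash, which is why checking a miner's work is cheap even though producing it is expensive; raising `difficulty` by one hex digit makes mining roughly sixteen times harder while verification stays instant.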

With Bitcoin, there is a fixed limit on the number of bitcoins in the system (21 million), and the rewards for mining decrease over time, although miners are still incentivized because they can receive a portion of each transaction as a fee. "The ideal goal of Bitcoin was one vote per CPU. That has ultimately been subverted," says Juels. "People are using specialized mining hardware to participate in the system." As bitcoin mining heated up, people developed and burned through specialty hardware, guzzling up electricity and creating tons of waste.

Proof-of-work still functions according to the original principle of requiring an investment of resources in order to participate in the system to mine blocks, Juels notes.

Meanwhile, in proof-of-stake systems, you pay to play, and have to stake tokens as a resource investment to participate, like putting in a security deposit that you get back once the transactions you added to the blockchain are approved by the network. The system randomly chooses a staker who is online at the time, and they get to validate the transactions and receive the rewards. Because it doesn't require solving puzzles, in theory, it should use less energy.
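The weighted random selection at the heart of proof-of-stake can be sketched as a stake-proportional lottery (the staker names and amounts below are made up, and real protocols use verifiable randomness rather than a local generator):

```python
import random

def pick_validator(stakes, rng):
    """Choose a staker with probability proportional to the tokens they staked."""
    names = list(stakes)
    return rng.choices(names, weights=[stakes[n] for n in names], k=1)[0]

stakes = {"alice": 70, "bob": 20, "carol": 10}  # tokens locked up as deposits
rng = random.Random(42)  # fixed seed so the demo is reproducible
wins = {name: 0 for name in stakes}
for _ in range(10_000):
    wins[pick_validator(stakes, rng)] += 1

print(wins)  # roughly 7,000 / 2,000 / 1,000, matching the 70/20/10 stake ratios
```

Over many rounds each staker's share of the rewards converges to their share of the total stake, which is exactly the "participation proportional to holdings" property Juels describes.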

[Related: Renewable energy cant cure Bitcoins environmental woes]

"In Bitcoin, your participation in the system is proportional to the amount of computation you do," says Juels. "In a proof-of-stake system, it's proportional to the amount of cryptocurrency you hold in the system."

"Typically the way that [both proof-of-work and proof-of-stake] systems work is that the right to create the next block is determined randomly in a kind of lottery where your chances of winning the lottery are proportional to your resources," he adds.

While Ethereum said that it was transitioning to a proof-of-stake system, that jump has not yet happened. The existing cryptocurrency projects that use proof-of-stake have their own variations of it. For example, Cardano uses a proof-of-stake system called Ouroboros that incorporates stake delegation and stake pools. And Solana, a blockchain that you can also build smart contract programs and other decentralized apps on, combines proof-of-stake with another consensus algorithm called proof-of-history to incorporate timestamps on transactions.

Despite proof-of-stake being faster and more energy efficient, many experts have concerns about its stability and its barriers to entry. "In Bitcoin, you can just start mining, in principle, with your laptop. You wouldn't do very well, but you can join the system without any type of previous investment of resources," says Juels. "In the case of these proof-of-stake systems, you need to go buy some coins to participate, or be assigned the coins at the outset of the protocol. There are some people who object to the need to obtain coins in order to participate to begin with, but that is a necessity."

Alternatively, a cryptocurrency project called the XRP Ledger uses a consensus protocol, unlike proof-of-stake or proof-of-work, that's almost democratic, but validators do not receive any rewards.

There's another concept to know, too. Proof-of-storage (otherwise known as proof-of-space) is where you're committing an amount of storage space to the network. "The idea initially was digital preservation: we want to record everything, so at least we can use the disk space for a good purpose. It turns out it's less needed than we thought," says Christin. "There's a need for digital preservation, but it doesn't scale as quickly as a currency would." Juels proposes that these systems could potentially be useful for storing data from NFTs. One project testing out this concept is Filecoin.

Ultimately, despite gaining ground with large finance platforms like PayPal, Mastercard, and Robinhood, the future of cryptocurrency is still uncertain: looming federal regulations could dramatically reshape the community and the ecosystem. And the value of currencies like bitcoin remains volatile, making them risky investments. Wherever the next chapter of cryptocurrency leads, it's indisputable that the popularity of this new wave of technology has already forced large financial institutions to evolve their thinking on how people want to interact with money, and with each other using money.


When will a robot write a novel? Harvard computer scientist shares his thoughts – Harvard Gazette

In terms of words, AI is very good at manipulating the language without understanding or manipulating the meaning. When it comes to novels, there are some genres that are formulaic, such as certain kinds of unambitious science fiction that have very predictable narrative arcs, particular components of world-building and character-building, and very well understood kinds of tension. And we now have AI models that are capable of stringing tens of sentences together that are coherent. This is a big advance because until recently, we could only write one or two sentences at a time, and by the time we got to 10 sentences, the 10th sentence had nothing to do with the first one. Now, we can automatically generate much larger pieces of prose that hold together. So it would likely be possible to write a trashy novel that has a particular form where those components are automatically generated in such a way that it feels like a real novel with a real plot.

When it comes to more sophisticated works, I think the point I would make is that fiction is an extremely effective and important part of our contemporary discourse, because it allows us to set aside our allegiances, to suspend disbelief, and to step into the shoes of somebody with a very different perspective and be really open to that perspective. We are able to imagine other people's lives, we are able to imagine alternative ways of interacting with other people, and we're able to consider them in a really open-minded way.

It's important to note that I'm relying on my expert perspective when I talk about computer science, and my personal perspective when I talk about my experience as a reader. But from the perspective of a reader, I don't think we will have a robot that is able to engage with and manipulate these kinds of meanings.

As a reader, I find that with gripping fiction, it's not just the content of the book, it's not just the plot, it's not just the issues that engage me. It's also the fact that I'm engaging with another human who wants me to reimagine the world. It's part of a discourse.

Another point I would make around the technology is that in recent years researchers have made the distinction between manipulating the language versus manipulating the meaning, and they point out that through tools that expertly manipulate the language, we've created an illusion that machines can understand and manipulate meaning. But that's absolutely not the case. We have created a grand con. And potentially a dangerous one, because we are convincing the rest of society that technology can do things that it actually cannot do.

As told to Colleen Walsh, Harvard Staff Writer



Infosys Commits to 3-Year Investment in Thurgood Marshall College Fund – Yahoo Finance

- TMCF and Infosys will establish a mechanism to increase the number of candidates from Historically Black Colleges and Universities (HBCUs) for high-quality STEM jobs

- The Infosys Foundation USA will support the TMCF Teacher Quality & Retention Program to help prepare aspiring K12 STEM teachers to bring computer science and maker education to their classrooms

NEW YORK, Nov. 17, 2021 /PRNewswire/ -- Infosys (NYSE: INFY), a global leader in next-generation digital services and consulting, today announced a 3-year investment benefitting the Thurgood Marshall College Fund (TMCF), America's largest organization exclusively representing the Black College Community. As part of its commitment to developing and recruiting diverse talent, Infosys will serve as an HBCU Graduate Pipeline Partner, creating an opportunity for at least 1,600 graduating students to interview for and accept jobs within Infosys. Additionally, given its mission to advance access to computer science education for K12 educators, Infosys Foundation USA will serve as the STEM sponsor of TMCF's Teacher Quality & Retention Program (TQRP) 10-day Summer Institute to invest in the future pipeline of Black STEM educators.


"Building a culture of equality doesn't start and end with any one individual, one government, or one business it takes a coalition of the willing to ensure our progress is lasting. Partnering with the TMCF is a fantastic opportunity to highlight the work their organization is doing to develop a culture of engagement that turns young Americans into future STEM leaders. Infosys recognizes that in order to get the right people into the right roles, while also ensuring upward mobility, we can't rely on past approaches. Through this partnership, Infosys renews its commitment to provide equal opportunity for all and strive for a workforce that resembles the cultural makeup of America," said Ravi Kumar, President, Infosys.

Infosys and TMCF will provide graduating students with a path to career training and job placement at Infosys. The program will leverage TMCF's HBCU campus relationships and its talent acquisition team to identify students who are interested in technology careers and have strong analytical skills. Ultimately, students will have an opportunity to interview for roles at Infosys. Infosys will also support on-campus informational sessions to maximize the program's reach, with the eventual goal of moving more than 1,600 students through the career program.


Recent data shows that this kind of investment in HBCUs would profoundly affect the US economy. HBCUs are uniquely positioned to foster such engagement given their assets, experience, and cultural and historical significance.

Dr. Harry L. Williams, President & CEO, Thurgood Marshall College Fund, said, "It's time today's technology leaders champion the talent of HBCU graduates. We believe HBCUs possess the power and the people to create a more equitable society; that's why we work so diligently to ensure their sustainability and strengthen their capacity to continue to produce top talent who are prepared to change the face of leadership in business and beyond. We are thrilled Infosys recognizes the potential of diverse students to transform the future workforce. This impressive group of students will certainly be tomorrow's corporate leaders and experts."

The Infosys Foundation USA will invest in the TMCF TQRP Summer Institute, a 10-day hands-on training program that helps pre-service and aspiring K12 educators to develop their pedagogical skills while equipping them to succeed in challenging teaching environments. As the official STEM sponsor, the Foundation will provide fellows with access to innovative computer science and maker education resources that exist on the Pathfinders Online Institute digital learning platform while helping to build connections to a wider community of peer STEM teachers. Roughly 100 fellows are selected to participate in the program, which is estimated to positively impact nearly 4,000 K-12 students.

"We are thrilled to launch this new initiative with TQRP at a time that our country needs dynamic and diverse educators. The statistics are clear that K12 students have higher rates of educational performance when they have teachers in the classroom who reflect their backgrounds. The TQRP Summer Institute is an exceptional program that reflects the values of the Foundation to promote K12 computer science education in under-represented communities. We look forward to the scaled impact that will result from this new partnership," says Kate Maloney, Executive Director, Infosys Foundation USA.

Graduates of the TQRP program become the next generation of future leaders who possess a passion for teaching in high-need communities. To date, TQRP fellows have impacted over 31,000 K-12 students in high need, urban, and rural areas. Over time the partnership between TQRP and the Infosys Foundation USA will aim to leverage synergies of their respective professional development resources to create expanded enrichment opportunities for the fellows as they enter K12 classrooms and inspire the next generation of STEM leaders.

Infosys and TMCF will host a panel discussion at the Infosys Americas Leadership Forum today to detail expected outcomes of the partnership and its industry impact. For more information about Infosys' investment in TMCF and its commitment to attracting and retaining diverse talent, please tune in to a live stream of the event beginning at 5:20 p.m. ET at https://youtu.be/iHcSXkYW_IA

About Infosys Foundation USA

Infosys Foundation USA was established in 2015 with the mission of expanding computer science and maker education to K-12 students and teachers across the United States, with a specific focus on increasing access for underrepresented communities. The Foundation achieves impact through delivering professional development programs for teachers, partnering with leading nonprofits, and delivering innovative media campaigns that inspire everyone to be creators, not just consumers, of technology. For more information, visit infosys.org/USA.

About Infosys

Infosys is a global leader in next-generation digital services and consulting. We enable clients in more than 50 countries to navigate their digital transformation. With over four decades of experience in managing the systems and workings of global enterprises, we expertly steer our clients through their digital journey. We do it by enabling the enterprise with an AI-powered core that helps prioritize the execution of change. We also empower the business with agile digital at scale to deliver unprecedented levels of performance and customer delight. Our always-on learning agenda drives their continuous improvement through building and transferring digital skills, expertise, and ideas from our innovation ecosystem.

Visit http://www.infosys.com to see how Infosys (NSE, BSE, NYSE: INFY) can help your enterprise navigate your next.

Safe Harbor

Certain statements in this release concerning our future growth prospects, financial expectations and plans for navigating the COVID-19 impact on our employees, clients and stakeholders are forward-looking statements intended to qualify for the 'safe harbor' under the Private Securities Litigation Reform Act of 1995, which involve a number of risks and uncertainties that could cause actual results to differ materially from those in such forward-looking statements. The risks and uncertainties relating to these statements include, but are not limited to, risks and uncertainties regarding COVID-19 and the effects of government and other measures seeking to contain its spread, risks related to an economic downturn or recession in India, the United States and other countries around the world, changes in political, business, and economic conditions, fluctuations in earnings, fluctuations in foreign exchange rates, our ability to manage growth, intense competition in IT services including those factors which may affect our cost advantage, wage increases in India, our ability to attract and retain highly skilled professionals, time and cost overruns on fixed-price, fixed-time frame contracts, client concentration, restrictions on immigration, industry segment concentration, our ability to manage our international operations, reduced demand for technology in our key focus areas, disruptions in telecommunication networks or system failures, our ability to successfully complete and integrate potential acquisitions, liability for damages on our service contracts, the success of the companies in which Infosys has made strategic investments, withdrawal or expiration of governmental fiscal incentives, political instability and regional conflicts, legal restrictions on raising capital or acquiring companies outside India, unauthorized use of our intellectual property and general economic conditions affecting our industry and the outcome of pending litigation and government 
investigation. Additional risks that could affect our future operating results are more fully described in our United States Securities and Exchange Commission filings including our Annual Report on Form 20-F for the fiscal year ended March 31, 2021. These filings are available at http://www.sec.gov. Infosys may, from time to time, make additional written and oral forward-looking statements, including statements contained in the Company's filings with the Securities and Exchange Commission and our reports to shareholders. The Company does not undertake to update any forward-looking statements that may be made from time to time by or on behalf of the Company unless it is required by law.


View original content:https://www.prnewswire.com/news-releases/infosys-commits-to-3-year-investment-in-thurgood-marshall-college-fund-301426631.html

SOURCE Infosys


At SC21, Plenary Wrestles With the Ethics of Mainstreamed HPC – HPCwire

As the panelists gathered onstage for SC21's first plenary talk, the so-called Peter Parker principle ("with great power comes great responsibility") cycled across the background slideshow. For the following hour, five panelists confronted this dilemma: with the transformative power of HPC (and, in particular, HPC-enabled AI) increasingly mainstreamed and deployed by all major sectors of society, industry and government, what ethical responsibilities are conferred to whom, and how can those responsibilities be fulfilled?

The plenary, titled "The Intersection of Ethics and HPC," featured five speakers: Dan Reed, professor of computer science and senior vice president of Academic Affairs at the University of Utah, who moderated the discussion; Cristin Goodwin, general manager and associate general counsel for Microsoft; Tony Hey, a physicist and chief data scientist for Rutherford Appleton Laboratory; Ellen Ochoa, chair of the National Science Board and former astronaut and director of NASA's Johnson Space Center; and Joel Saltz, an MD-PhD working as a professor and chair of the Department of Biomedical Informatics at Stony Brook University.

"We know that advanced computing now pervades almost every aspect of science, technology, business, and society. Think about its impacts on financial institutions, e-commerce, communications, logistics, health, national security. And big tech overall has been in the news lately, and not necessarily in a good way," Reed opened, citing a Pandora's box of issues ranging from the effects of social media and data breaches to deepfakes and autonomous vehicles.

Unintended consequences and unethical actors

Technology, he continued, is also being exploited at scale, with governments and criminals leveraging high-power surveillance and intrusion tools to great effect. Beyond the national security applications and implications, HPC has also become tightly tied to competitiveness for businesses and to the state-of-the-art for forward-facing fields like medicine and consumer technology. HPC, Reed pointed out, is just the latest field to go through this tumultuous adolescence: fields like physics and medicine had experienced similar ethical dilemmas as their capabilities expanded.

As a physicist, Hey agreed, invoking perhaps the most famous step change in the ethical onus on a scientific field. "I think the outstanding example is the Manhattan Project, which developed the atomic bomb during the war," he said. The Manhattan Project, he explained, had been initiated out of fear of an unethical actor, Hitler, who likely would not have hesitated to use such a weapon were it in his possession. "That was the original motivation. But actually, before they tested the nuclear weapons that they'd developed in the Manhattan Project, Germany had surrendered. So the original reason had gone," he said, leaving the scientists to wrestle with their creation. "And I think, really, you can almost replace nuclear weapon technology with AI technology. You can't uninvent it, and we can be ethical about our use, but we'll have enemies who aren't."

These enemies, and wantonly unethical actors generally, were the subject of much discussion. Goodwin, who works to address nation-state attacks at Microsoft, said that while cyberattacks by nation-states were once considered unlikely force majeure events, they're now commonplace: "Microsoft, between the period of August 2018 and this past July, notified over 20,000 customers of nation-state attacks," she said.

"In my space, what I see all the time is the paradigm of unethical abuse," she added, contrasting that with the paradigm of ethical use. "How are you thinking about abuse? The September 12th cockpit door? What are the ways your technology could be abused?" This issue, she said, was particularly spotlit in the wake of Microsoft's ill-fated chatbot, Tay.

"Many people know that Microsoft back in 2016 had released a chatbot, and you could interact on Twitter and it would respond back to you," she recapped. "And in about 24 hours it turned into a misogynistic Nazi, and we took it down very, very quickly. And that forced Microsoft to go and look very, very closely at how we think about ethics and artificial intelligence. It prompted us to create an office of responsible AI and a principled approach to how we think about that."

This kind of unanticipated reappropriation or redirection of a technology (somewhat limited in scope, though offensive, when applied to a chatbot) becomes much more ominous as the technologies expand. Saltz advised the audience to look beyond "what [the] specific application is," citing the relatively straightforward introduction of telehealth, which is now spiraling into the use of AI facial and body recognition to, in combination with medical records, make predictions about a patient's health during a telehealth appointment. "Pretty much every new technical advance, even if it seems relatively limited, can be extended and is being extended to something more major," he said.

Uninclusive models and unsuitable solutions

On the topic of unintended consequences, several of the panelists expressed concern over the bias that can be conferred, often accidentally, to AI models and their predictions through improper design and training. Ochoa referenced a famous case where an AI model was used to predict recidivism in sentencing, which, she said, resulted in the AI essentially predicting where police were deployed. "These things can creep in at various different areas, but they're being used so broadly they're really affecting people's lives," she said.

Indeed, much of this bias can be attributed to sample selection. To that point, Saltz spoke on the use of HPC-powered models to aid in diagnosis, prognosis and treatment. "Medicine is a particular font of ethical dilemmas, there's no doubt, and increasingly, these involve high-end computing and computational abilities," he said. Do you recommend an intensive, scorched-earth treatment for a patient to give them the best chance of beating cancer, or do you recommend a less taxing treatment because they're unlikely to require anything more severe to recover? "So, models can predict this," Saltz said. On the other hand: can models predict this? "This is a major technical issue as well as an ethical issue. One of the main issues, again: if a particular population was used for training, how do you generalize that model? Should you?"

Saltz had a couple of ideas for how to ameliorate these problems, starting with the collection of more data from more groups. "As human beings, we should encourage medical research and make our data available," he said. "There's a lot of work associated with this, but I think that convincing the citizenry that there really is a potential huge upside to participating in research studies and making their data available will be very important to enhance medical progress."

Second, he said, was validating the algorithms. "The FDA has a project dedicated to validation of AI in medicine that we're involved in," he explained. "The notion is that there'd be a well-defined path so that developers of algorithms can know when their algorithm has been deemed good enough to be reliable."

"I think that also speaks to the notion that you want a diverse community looking at those issues," Reed said, "because they will surface things that a less diverse community might not." Recommendations, too, can be asymmetrical: Hey explained how fine-grain tornado prediction enabled disaster agencies to recommend fewer evacuations along a more specific path, but that for groups that might need longer to evacuate, such as people with disabilities or older people, that more targeted, quicker-response approach might be unsuitable. "These things require great consideration of the people who are affected," he said.

Unfathomable explanations and unrepresentative gatherings

One core problem pervades nearly all efforts to reinforce ethics in HPC and AI: comprehension.

"We are a vanishingly small fraction of the population, so how do we think about informed debate and understanding with the broader community about these complex issues?" Reed said. "Because explaining to someone that 'this is a multidisciplinary model with some abstractions based on AI and some inner loops, and we've used a numerical approximation technique with variable precision arithmetic on a million-core system with some variable error rate, and now talk to me about whether this computation is right': that explanation is dead on arrival to the people who would care about how these systems are actually used."

Goodwin said that getting users, stakeholders and the general public to understand the implications of technological developments or threats was something that Microsoft had been wrestling with for some time. "We have context analysts that help us simplify the way we talk about what we've learned so that communities that are not technical, or not particularly comfortable with technical terms, can consume that," she said. "What we believe is that you can't have informed public policy if you can't take the technical detail of an attack and make it relatable for those who need to understand that."

When talking about communication between HPC or AI insiders and the general public, of course, it's important to note the differences between those two groups, differences that span demographics, not just credentials. "The attendees at this conference are not broadly representative of our population," Reed said, gently, looking out at the audience.

Ochoa followed up on that thread, discussing efforts to fold in the "missing millions" that are often left unrepresented by gatherings of, or decisions by, technically skilled, demographically similar experts.

"We try to make sure we're not doing anything discriminatory, right?" she said. "But welcoming is actually much broader than that."

More:

At SC21, Plenary Wrestles With the Ethics of Mainstreamed HPC - HPCwire


Governor Abbott Encourages Texas High Schoolers To Participate In CyberStart America – Office of the Texas Governor

November 17, 2021 | Austin, Texas | Press Release

Governor Greg Abbott today announced Texas high school students will have the opportunity to participate in CyberStart America, an innovative, online cybersecurity talent search and competition sponsored by the National Cyber Scholarship Foundation (NCSF) and the SANS Institute. CyberStart is open to all students in grades 9-12, allowing them to explore their aptitude for cybersecurity and computer science.

"The demand for cybersecurity professionals continues to grow as technology becomes a greater part of our everyday lives," said Governor Abbott. "CyberStart is a fun, engaging way to prepare our students to tackle new challenges in our communities and workforce, and I encourage Texas students to take advantage of this tremendous opportunity to become exceptional leaders in the cybersecurity field."

CyberStart is a series of online challenges that allow students to act as cyber protection agents, solving cybersecurity-related puzzles and exploring related topics such as code breaking, programming, networking and digital forensics. Students who do well in the program can earn access to scholarships and advanced training. Last year, 32,000 students from 4,800 schools around the country played CyberStart, and the NCSF awarded over $4 million in scholarships and advanced training. Of those students, more than 4,000 were from Texas; 732 of them reached the national competition, and 68 were named National Cyber Scholars.
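
The code-breaking puzzles CyberStart poses often start with classical ciphers. As a hedged illustration of the kind of reasoning involved (a hypothetical puzzle, not an actual CyberStart challenge), here is a brute-force crack of a Caesar-shifted message using a known "crib" word:

```python
# Try all 26 Caesar shifts and return the decryption that contains an
# expected crib word. Illustrative of code-breaking basics only.

def caesar_decrypt(text, shift):
    """Shift letters back by `shift` positions, wrapping within the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base - shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return "".join(out)

def crack(ciphertext, crib):
    """Brute-force every shift; return the candidate containing the crib."""
    for shift in range(26):
        candidate = caesar_decrypt(ciphertext, shift)
        if crib in candidate.lower():
            return candidate
    return None
```

Encrypting "the flag is hidden" with a shift of 3 and handing `crack` the ciphertext plus the crib "flag" recovers the plaintext, which is exactly the trial-and-error loop these puzzles teach.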

Learn more about CyberStart America.

Read the original post:

Governor Abbott Encourages Texas High Schoolers To Participate In CyberStart America - Office of the Texas Governor


USF physicists selected as finalists for the ‘Nobel Prize of supercomputing’ – University of South Florida

A team of computational physicists and computer scientists led by researchers from the University of South Florida has reached a new milestone in supercomputing and was selected as a finalist for the fields most prestigious award.

The Gordon Bell Prize, presented annually by the Association for Computing Machinery at the International Conference for High Performance Computing, Networking, Storage and Analysis (SC21), is often referred to as the "Nobel Prize of supercomputing." The award recognizes outstanding achievements in innovative applications of high-performance computing to problems in science, engineering and large-scale data analytics.

The team, led by Ivan Oleynik, professor in USF's Department of Physics, along with Kien Nguyen Cong and Jonathan Willman, both of whom recently completed doctoral degrees at USF, utilized the 200-petaflop Summit supercomputer at Oak Ridge National Laboratory, the fastest supercomputer in the U.S., to explore how carbon atoms behave at extremely high pressures and temperatures.

Ivan Oleynik

Jonathan Willman

Kien Nguyen Cong

"This is one of the most significant fundamental problems that exist in materials science today," Oleynik said. "Furthering our understanding of carbon behavior inside recently discovered exoplanets, or upon enormous compression during inertial confinement fusion implosions, is paramount to advancing our knowledge of the structure of exoplanets or unlocking limitless fusion energy sources. Making an impact through such grand simulations is something we could have never dreamed of."

For the first time, researchers were able to conduct cutting-edge molecular dynamics simulations of several billion carbon atoms with extreme quantum accuracy. To accomplish this, USF researchers worked with partners from Sandia National Laboratories, the Royal Institute of Technology, the National Energy Research Scientific Computing Center and NVIDIA Corporation to develop novel machine learning interatomic potentials describing interactions between carbon atoms with ultimate fidelity as well as to implement them in very efficient GPU-enabled computational algorithms.

By combining this novel methodology with access to Summit, the nation's most powerful supercomputer, the researchers were able to run a 24-hour simulation that uncovered the long-sought synthesis of a high-pressure post-diamond crystalline phase of carbon under extreme conditions. This transformative discovery was made possible not only through access to Summit, but also through the team's combined expertise in innovative atomic-scale machine learning simulation methodology and its algorithmic implementation, which helped unlock the enormous predictive power of computer simulations on one of the most powerful supercomputers in the world.
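
Molecular dynamics of the kind the team scaled to billions of atoms rests on a simple core loop: compute interatomic forces, then advance positions and velocities with a symplectic integrator. The sketch below is a minimal, pure-Python velocity-Verlet integrator for two atoms under a Lennard-Jones pair potential; it is purely illustrative of the scheme and has nothing to do with the team's GPU code or machine-learned carbon potential.

```python
# Minimal velocity-Verlet molecular dynamics: two atoms on a line
# interacting via a Lennard-Jones pair potential (reduced units).
# Production codes apply the same integration scheme, with far more
# accurate (e.g. machine-learned) potentials, to billions of atoms.

def lj_force(r, epsilon=1.0, sigma=1.0):
    """Force along +x on the right-hand atom; positive means repulsive."""
    sr6 = (sigma / r) ** 6
    return 24.0 * epsilon * (2.0 * sr6 ** 2 - sr6) / r

def lj_energy(r, epsilon=1.0, sigma=1.0):
    """Lennard-Jones pair potential energy at separation r."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 ** 2 - sr6)

def simulate(x0, x1, v0, v1, dt=1e-3, steps=1000, mass=1.0):
    """Advance the two-atom system with the velocity-Verlet scheme."""
    f = lj_force(x1 - x0)
    for _ in range(steps):
        v0 -= 0.5 * dt * f / mass   # half-kick (equal and opposite forces)
        v1 += 0.5 * dt * f / mass
        x0 += dt * v0               # drift
        x1 += dt * v1
        f = lj_force(x1 - x0)       # recompute force, second half-kick
        v0 -= 0.5 * dt * f / mass
        v1 += 0.5 * dt * f / mass
    return x0, x1, v0, v1

def total_energy(x0, x1, v0, v1, mass=1.0):
    """Kinetic plus potential energy, used to check the integrator."""
    return 0.5 * mass * (v0 ** 2 + v1 ** 2) + lj_energy(x1 - x0)
```

The hallmark of a symplectic integrator like velocity-Verlet is that total energy stays bounded over very long runs, which is part of what makes 24-hour production simulations trustworthy.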

"This was an enormous simulation that revealed previously unknown behavior of carbon at the atomic scale, in billion-atom simulations at experimental time and length scales," Oleynik said. "Not only have we made this discovery, but in the process we broke the previously held world record for quantum-accurate molecular dynamics by running our simulation 23 times faster. This is an immense leap forward in computational materials science."

The team is now working to publish its science findings from this simulation while awaiting the Nov. 18 announcement of the Gordon Bell Prize winner.

To learn more about the teams work, read their research article.

More:

USF physicists selected as finalists for the 'Nobel Prize of supercomputing' - University of South Florida


4 universities nurturing tomorrow’s women STEM experts – Study International News

Edith Clarke, Mary Engle Pennington, Maria Klawe, Lydia Villa-Komaroff, Rosalind Franklin: what do all these women have in common?

They are all credited with bringing STEM (that's Science, Technology, Engineering and Mathematics) to the forefront and paving the way for more women to break glass ceilings in these disciplines.

Clarke was the first female electrical engineer. Pennington developed the world's first safety standards for milk storage. Klawe increased the number of women studying computer science at her faculty to 40%. Villa-Komaroff showed that bacteria can be engineered to produce insulin, and Franklin provided critical research toward determining the structure of DNA.

Yet, as qualified and impressive as they are, they are outliers. Women are significantly underrepresented in STEM fields (only 28% of the STEM workforce are women) and receive lower pay than men.

If you seek to narrow this gap and change this trend, set your sights on universities that will offer the time, effort and resources to ensure your success. These institutions have role models, plus academic, social and professional opportunities for women in STEM. Check out our pick of universities in Europe that best fit this description:

If the STEM industries are looking for women talent, Coventry University graduates are candidates.

After all, they come from an institution with a long string of accolades: #1 Modern University in the Midlands (Guardian University Guide 2021), University of the Year for Student Experience (The Times and Sunday Times Good University Guide 2019), Gold for Outstanding Teaching and Learning, to name a few. Here, 97% of students found jobs or continued to further studies within six months after graduation (DLHE survey, UG UK, 2016/17, published in 2018).

Source: Coventry University

In the University's two dedicated engineering and computing buildings, students grow into masters of their field of study, as well as gain further personal, cultural and work experience, through taught master's programmes such as Computer Science; Cyber Security; Data Science/Data Science with Computational Intelligence; and Management of Information Systems and Technology.

Aspiring female postgraduate students are welcome to pursue these programmes here. Many graduates remember their Coventry experience as one defined by making lifelong friends, being inspired by dedicated teachers and having opportunities to contribute.

"I found inspiration and was amazed by the possibilities and potential that could be achieved when applying the taught content. I loved what I studied, and it helped me keep going, the idea that with this knowledge, I would be able to improve lives, experiences or even the world itself," shares MSc Data Science graduate Toma Petraviciute.

Umeå University, the largest institution of higher education in northern Sweden, has over 36,000 students from 60 countries, 66% of whom are female. Research conducted at the university contributed to the 2020 Nobel Prize in Chemistry, and it offers 44 international programmes, at least 40 of which are taught in English.

Umeå University's peaceful campus. Source: Umeå University, Facebook

The university's interdisciplinary collaborations span over 2,000 researchers working in top national and international research teams. Students can access top-range facilities and equipment at Umeå, such as the High Performance Computing Center North (HPC2N), the Integrated Science Lab and the NanoLab.

Students like Adriele Pradi recall how their teachers at Umeå made the study experience enjoyable with their teaching expertise and subject knowledge.

"The lecturers are very friendly and supportive. I like the way they focus on creating self-reliance and independence in the students," Cartrine Anyango, another student, from Nairobi, shares.

The Buddy System Programme helps international students meet and mix with domestic students, so they can learn from each other and transition more smoothly into university life.

Master's programmes offered here include Computing Science; Statistics and Data Science; and IT Management.

The University of Edinburgh is world-renowned for many things: it is ranked #16 in the QS World University Rankings 2022 and fourth in the UK for research power, and it counts Nobel laureates and Olympic athletes among its famous alumni.

Sunset at the University of Edinburgh's campus. Source: The University of Edinburgh, Facebook

The University of Edinburgh is also ranked among the top 20 most international universities and is a member of the Universitas 21 research network as well as the European networks: the Coimbra Group, the League of European Research Universities (LERU) and the Una Europa alliance.

Besides being located in Edinburgh, one of the world's top 20 student cities, the university also offers outstanding facilities for learning, including well-stocked libraries and 30 computer labs.

"Due to its superior ratings, cutting-edge learning environment, fascinating postgraduate programme and impressive staff, the university greatly appealed to me," says Zia Barnard, a Biotechnology graduate from Saint Lucia.

A wide array of taught master's programmes is offered here, including the MSc Computer Science, MSc Data Science and MSc Cyber Security, Privacy and Trust.

As the biggest engineering university in the Czech Republic, with 19,000 students, BRNO University of Technology (ranked among the top 2.3% of the world's best universities) has some of the best engineering equipment, housed within excellent research centres such as the Central European Institute of Technology.

Brno in winter. Source: BRNO University of Technology, Facebook

Being located in Brno translates to affordable living costs and close proximity to technology companies for students. Since Brno is dubbed "Europe's Silicon Valley," students can participate in numerous foreign internships and international project collaborations, and 95% of graduates find jobs within six months.

BRNO University offers English-taught masters programmes such as Applied Sciences in Engineering, Industrial Engineering, Information Technology, and Environmental Sciences and Engineering.

Finally, BRNO has an interesting affiliation with women in STEM: Zdena Rábová pioneered IT in Czechoslovakia and worked at the Faculty of Information Technology at BRNO. The Summer School (F)IT for Girls is also offered at BRNO University for female high school students who want to learn IT.

*Some of the institutions featured in this article are commercial partners of Study International

Continued here:

4 universities nurturing tomorrow's women STEM experts - Study International News


SAS and Sphero Address Coding Needs of Students with Visual Impairments – T.H.E. Journal

Computer Science Education

Data analytics company SAS and education technology company Sphero are working together to bring data analytics and robot coding, along with soft skills, to students with visual impairments.

At the heart of the initiative is SAS CodeSnaps, a free app designed to be used in classrooms, camps and clubs to teach kids the basics of computer science, including how to code. Students work together and problem-solve to take on programming challenges using printed coding blocks. When the blocks are scanned with the SAS CodeSnaps app, the program executes on a Sphero robot, such as the BOLT, SPRK+ or Sphero Mini.
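
Conceptually, each scanned block becomes one command in a small program that drives the robot. The sketch below is a toy interpreter in that spirit; the block vocabulary ("roll", "turn") and the RobotState class are hypothetical stand-ins for illustration, not the actual CodeSnaps API or Sphero SDK.

```python
# A toy interpreter for printed coding blocks: each block string is one
# command that updates the robot's position and heading.
# Block names and this class are hypothetical, for illustration only.
import math

class RobotState:
    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self.heading = 0  # degrees; 0 points along the +x axis

    def execute(self, blocks):
        """Run a scanned program, one block string at a time."""
        for block in blocks:
            op, *args = block.split()
            if op == "turn":       # e.g. "turn 90" rotates the heading
                self.heading = (self.heading + int(args[0])) % 360
            elif op == "roll":     # e.g. "roll 10" moves along the heading
                dist = int(args[0])
                rad = math.radians(self.heading)
                self.x += dist * math.cos(rad)
                self.y += dist * math.sin(rad)
        return round(self.x, 6), round(self.y, 6)
```

Running `RobotState().execute(["roll 10", "turn 90", "roll 5"])` ends at (10.0, 5.0), mirroring how students trace the robot's path through a physical course block by block.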

The printable blocks are available in 10 languages. Now the collection also includes an English braille version.

SAS worked with the Perkins School for the Blind to adapt CodeSnaps to meet the needs of students with visual impairments and blindness. Braille is now part of the code blocks, and lessons can incorporate a tactile device, such as a measuring stick to measure distances.

Diane Brauner, manager of Perkins' Paths to Technology website, helped create activities that use sound to help students identify the robot's movements.

The activities were tested during a coding challenge with the Coding Club at The Governor Morehead School in Raleigh, NC. The challenge required students (teams of boys against girls) to send their Sphero through the course, including traveling to a trash can, then going behind it and crossing the finish line.

"No longer sitting on the sidelines or relying on a sighted peer's descriptions, students who are blind or low-vision can fully participate in every aspect of the coding activity," said Brauner in a press release. "With the physical course, SAS CodeSnaps braille blocks and a Sphero robot, blind and low-vision students are studying the physical obstacle course, writing code using the SAS CodeSnaps braille blocks, and following the Sphero robot auditorily."

"Every student should have the opportunity to learn to code," added Ed Summers, director of accessibility at SAS. "With CodeSnaps' interactive, customized resources, teachers of students with visual impairments can find creative ways to integrate computer science into any subject, engaging students with sound and touch."

This is far from SAS' first foray into accessibility. In 2017, the company launched SAS Graphics Accelerator, a tool for making data visualizations accessible to people with visual impairments. SAS Graphics Accelerator generates alternative presentations of SAS data visualizations, including verbal descriptions, tabular data and interactive sonification, which uses non-speech audio to convey details about a graph. Users rely on sound rather than sight to explore bar charts, time series plots, heat maps, line charts, scatter plots and histograms. For example, a sonic representation of a bar chart will shift where the sound is coming from to indicate movement along the x-axis and change the pitch to indicate higher or lower values on the y-axis.
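
The pan-for-x, pitch-for-y mapping described above can be sketched in a few lines. The frequency range and rounding below are illustrative choices for the sketch, not SAS Graphics Accelerator's actual parameters:

```python
# Map a bar chart's values to (pan, frequency) pairs: position along the
# x-axis becomes stereo pan (0.0 = left, 1.0 = right), and the bar's
# height becomes pitch. A real tool would then synthesize these tones.

def sonify(values, f_min=220.0, f_max=880.0):
    """Return one (pan, frequency) tone per data value."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0          # avoid dividing by zero on flat data
    n = len(values)
    tones = []
    for i, v in enumerate(values):
        pan = i / (n - 1) if n > 1 else 0.5
        freq = f_min + (v - lo) / span * (f_max - f_min)
        tones.append((round(pan, 3), round(freq, 1)))
    return tones
```

For the values [0, 5, 10], this yields tones that sweep left to right while rising in pitch from 220 Hz to 880 Hz, which is exactly the kind of audible shape a listener can use to "see" a rising bar chart.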

About the Author

Dian Schaffhauser is a senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning. She can be reached at [emailprotected] or on Twitter @schaffhauser.

See the article here:

SAS and Sphero Address Coding Needs of Students with Visual Impairments - T.H.E. Journal
