
How To Change Core Count and TDP of Intel Xeon Y CPUs on Dell … – ServeTheHome

One feature that Intel has had for some time is the ability to change the personality of its Xeon CPUs dynamically. On CPUs that have Y suffixes, and with compatible servers, we get access to Intel Speed Select Technology, or SST. This allows us to change the core counts, frequencies, and TDP of CPUs between different levels easily. We are going to show how to change the personality of a Y-series Xeon using a Dell PowerEdge R760's iDRAC 9 interface, since it is very easy.

Making this capability work requires two main components: a compatible CPU and a compatible server. Here we have the Intel Xeon Platinum 8452Y.

When these are installed, changing personalities is fairly straightforward. One can use the BIOS setup, or even just the iDRAC 9 BIOS settings page. Here is the SST-Performance Profile dropdown:

Here we can see the options:

These match the Intel Ark page Intel SST-PP:

Once that BIOS setting is selected and the server is rebooted, the new performance profile takes effect.
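For readers who prefer to script the change rather than click through the iDRAC UI, the same BIOS setting is exposed through the iDRAC 9 Redfish API. The sketch below is illustrative only: the endpoint paths follow the standard iDRAC 9 Redfish layout, but the attribute name IntelSstPp and the value Config2 are assumptions to check against your system's BIOS attribute registry, and a configuration job plus reboot is still required for the change to apply.

import requests

IDRAC = "https://192.0.2.10"            # iDRAC address (example value)
AUTH = ("root", "calvin")               # use real credentials or a token in practice
ATTR = "IntelSstPp"                     # assumed attribute name; verify in the BIOS registry

# Read the current SST-PP profile from the BIOS attributes.
bios = requests.get(f"{IDRAC}/redfish/v1/Systems/System.Embedded.1/Bios",
                    auth=AUTH, verify=False).json()
print("Current profile:", bios["Attributes"].get(ATTR))

# Stage a new profile; iDRAC applies it via a configuration job on the next reboot.
requests.patch(f"{IDRAC}/redfish/v1/Systems/System.Embedded.1/Bios/Settings",
               json={"Attributes": {ATTR: "Config2"}},   # "Config2" is an assumed level name
               auth=AUTH, verify=False)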

Using Intel SST-PP is extremely easy. Most STH readers with SST-PP enabled Xeons are probably using the default maximum core count profile. Still, this is an option and is very easy to change. If you are wondering what the use case is for moving to lower core counts, the answer is fairly simple.

We saw a good example of how this is used recently in our Putting the Bare Metal Server in the PhoenixNAP Bare Metal Cloud piece. There, PhoenixNAP uses Supermicro servers with the same Intel Xeon Platinum 8452Y SKUs. Having Intel SST-PP options allows one SKU to be installed and then multiple types of bare metal instances to be served from that single SKU.

See the rest here:
How To Change Core Count and TDP of Intel Xeon Y CPUs on Dell ... - ServeTheHome


‘Proxyjacking’ Cybercriminals Exploit Log4j in Emerging, Lucrative … – Dark Reading

Threat actors have found a lucrative new attack vector that hijacks legitimate proxyware services, which allow people to sell portions of their Internet bandwidth to third parties. In large-scale attacks that exploit cloud-based systems, cybercriminals can use this vector, dubbed "proxyjacking," to earn potentially hundreds of thousands of dollars per month in passive income, researchers from the Sysdig Threat Research Team (TRT) have found.

In a February blog post, Kaspersky researchers described proxyware services like this: "[Users install a client that creates a] proxy server. Installed on a desktop computer or smartphone, it makes the device's Internet connection accessible to an outside party." That outside party, the proxyware service, then resells an agreed-upon portion of the user's bandwidth to other people.

"Depending on how long the program remains enabled and how much bandwidth it is permitted to use, the client accumulates points [for the user]that can eventually be converted into currency and transferred to a bank account," according to researchers at Kaspersky.

In one attack that the Sysdig researchers observed, threat actors compromised a container in a cloud environment using the Log4j vulnerability, and then installed a proxyware agent that turned the system into a proxy server without the container owner's knowledge, the researchers revealed in a blog post on April 4.

This allowed the attacker to "sell the IP to a proxyware service and collect the profit," in an unusual type of Log4j exploit. Usually, Log4j attacks involve an actor dropping a backdoor or cryptojacking payload on the device, Crystal Morin, Sysdig threat research engineer, wrote in the post. "While Log4j attacks are common, the payload used in this case was uncommon," she wrote.

Proxyjacking shares characteristics with cryptojacking in that both profit off the bandwidth of a victim, and both are about equally profitable for the attacker, Morin said. However, the attacks differ in that cryptojackers typically install CPU-based miners to extract maximum value from compromised systems, while proxyjacking mainly uses network resources, leaving a minimal CPU footprint, she wrote.

"Nearly every piece of monitoring software will have CPU usage as one of the first (and rightfully most important) metrics," she wrote. "Proxyjacking's effect on the system is marginal: 1 GB of network traffic spread out over a month is tens of megabytes per day very likely to go unnoticed."

Proxyjacking is a relatively new phenomenon spurred by the growth and use of proxyware services over the last couple of years, the researchers said. As mentioned, these services, such as IPRoyal, Honeygain, and Peer2Profit, are installed as apps or software on Internet-connected devices and, when running, allow someone to share Internet bandwidth with third parties, who pay to route traffic through the app user's IP address.

Proxyware comes in handy for people who want to use someone else's IP address for activity such as watching a YouTube video that isn't available in their region, conducting unrestricted Web scraping and surfing, or browsing dubious websites without attributing the activity to their own IP, the researchers said. Depending on the service, people who share an IP address via proxyware are paid based on the number of hours they run the application.

In the attack investigated by Sysdig researchers, attackers targeted an unpatched Apache Solr service running in Kubernetes infrastructure to take control of a container in the environment, and then downloaded a malicious script from a command-and-control (C2) server, placing it in the /tmp folder, where they had the privileges needed to perform their activity.

"The attacker's first execution was downloading an ELF file renamed /tmp/p32, which was then executed with some parameters, including the email address [emailprotected][.]com and the associated password for their pawns.app account," Morin wrote in the post.

Pawns.app is a proxyware service that has been seen sharing IPs from IPRoyal's proxy network. Indeed, Sysdig TRT correlated the binary downloaded and executed in the malicious script to the command-line interface version of the IPRoyal Pawns application from GitHub, which uses the same parameters, researchers said. In this way, attackers began using the compromised pod to earn money on the service, they said.

Attackers covered their tracks by cleaning the compromised system of their activity, clearing the history, and removing the file they dropped in the containers and the temp files, the researchers added.

While the list of proxyware services reported as being used for proxyjacking is small right now, Sysdig researchers believe that this attack vector will continue to grow and eventually "defenders will uncover more nefarious activities," Morin wrote. "This is a low-effort and high-reward attack for threat actors, with the potential for far-reaching implications."

Researchers estimate that a single proxyjacked IP address kept active around the clock earns an attacker about $9.60 per month. In a modest compromise of, say, 100 IP addresses, a cybercriminal could then net passive income of nearly $1,000 per month from this activity, they said.

When exploiting Log4j on unpatched systems, this figure can climb even higher, as millions of servers are still running vulnerable versions of the logging tool, and more than 23,000 of them can be reached from the Internet, according to Censys, the researchers said. "This vulnerability alone could theoretically provide more than $220,000 in profit per month" for an attacker, Morin wrote.
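The arithmetic behind those estimates is simple enough to check; a quick sketch using the figures quoted above:

# Earnings estimates from the article (illustrative back-of-the-envelope check).
per_ip_per_month = 9.60        # USD for one proxyjacked IP kept active around the clock
modest_compromise = 100        # compromised IP addresses
log4j_reachable = 23_000       # internet-reachable vulnerable servers, per Censys

print(round(per_ip_per_month * modest_compromise, 2))   # ~960 USD/month ("nearly $1,000")
print(round(per_ip_per_month * log4j_reachable, 2))     # ~220,800 USD/month ("more than $220,000")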

To avoid "receiving potentially shocking usage bills" due to proxyjacking activity, organizations need to take actions to mitigate potential attacks, the researchers said. They recommended that organizations set up billing limits and alerts with their respective cloud service provers, which can be an early indicator that something is amiss, Morin wrote.

Morin advised that organizations should also have threat-detection rules in place to receive alerts on any initial access and payload activity preceding the installation of a proxyware service application on their networks.
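Detection rules of this kind are normally written in an EDR, SIEM, or Falco-style engine, but the underlying idea can be illustrated with a naive host-side scan. The client names in the watchlist below come from the services named in this article; treat this as a sketch of the concept, not a production control.

import psutil

# Proxyware client names mentioned in the article (illustrative watchlist).
WATCHLIST = ("pawns", "iproyal", "honeygain", "peer2profit")

# Flag any running process whose name or command line mentions a known proxyware client.
for proc in psutil.process_iter(["pid", "name", "cmdline"]):
    blob = " ".join([proc.info["name"] or ""] + (proc.info["cmdline"] or [])).lower()
    if any(marker in blob for marker in WATCHLIST):
        print(f"possible proxyware client: pid={proc.info['pid']} cmd={blob[:120]}")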

Originally posted here:
'Proxyjacking' Cybercriminals Exploit Log4j in Emerging, Lucrative ... - Dark Reading


Hypercompetitive Cloud Market a Blessing for Cloud OTT Platforms – Analytics India Magazine

Cloud-native OTT platforms were born out of the explosion in the streaming audience. More viewers brought along more security threats, more complex workflows and bigger infrastructures. A cloud-based infrastructure solved a bunch of these problems: scalability became easier and the quality of experience improved by a wide margin.

Manik Bambha, the co-founder and CEO of ViewLift, was early to spot this shift, having founded the cloud-based streaming platform in 2008. "We realised we can help sports, media companies and broadcasters quickly launch their own OTT services without draining their time and resources. These companies can start making money from their digital content in a matter of weeks, rather than months or years, of joining the platform," he explained.

The benefits from this push in cloud infrastructure were far too many. "Cloud native tech is a revolution. In the past, brands had to order servers and wait for them to be ready before they could launch their digital platforms. This process could take between 6-12 months, which was a significant barrier to entry for many businesses. With cloud native technology, brands can launch their platforms in a matter of weeks. Cloud native platforms are built to scale quickly and efficiently, which means that they can handle millions of users in a matter of days," he stated.

During the development process, a cloud-native approach offers greater flexibility and agility. This means developers can easily make changes to the platform without disrupting the user experience. It also means brands can respond to changing market conditions and customer needs quicker than ever before. More for consumers, more for businesses.

Bambha said that this shift in content was a natural one and has been in the making for a long time. "Over the past decade, we have seen a massive shift in the media industry towards over-the-top (OTT) media services. Ten years ago, OTT was largely seen as a pilot or test project, but today it is a key growth strategy for many companies."

The rise of OTT was driven by a number of factors, including increased internet speeds, the proliferation of smart devices and changing consumer habits. "While traditional TV is still the main revenue source for many big brands, it is shrinking and will possibly vanish within the next 10-15 years. Consumers are increasingly turning to OTT services like Netflix, Hulu and Amazon Prime Video for their entertainment needs. OTT is the future of media, and companies that do not adapt to this new reality risk being left behind," he stated.

Since these platforms are married to cloud businesses, it goes without saying that the furiously competitive segment will affect them. "We have stayed ahead in predicting the cloud wars and we have made our OTT solutions cloud-agnostic and multi-cloud capable. Currently, we support AWS and Google Cloud," Bambha said.

Bambha discussed how the increasingly competitive cloud computing market is also opening up new opportunities for OTT content owners. "There's a wider range of cloud providers to choose from, which can help them optimise their costs and improve performance. Additionally, competition drives innovation, which is a win-win for both consumers and the industry," he added.

"One of the most significant applications of AI/ML is our content-recommendation engine. By analysing user behaviour and preferences, ViewLift can provide personalised content recommendations that are more likely to resonate with each individual user. This helps to keep users engaged and coming back for more," he said. Predictive analytics is another area where AI/ML is being used, and ViewLift is also using AI/ML to personalise the user interface, providing a more intuitive and engaging experience.

Bambha's transition to the media and entertainment industry wasn't entirely unforeseen. Formerly a director of engineering at MySpace, followed by a stint as the VP of engineering at Shindig, Bambha has a deep understanding of social media, content and what goes on behind it. "As I continued to solve these technical problems, I began to see how they intersected with business problems, particularly in the domain of digital and OTT media," he signed off.

Excerpt from:
Hypercompetitive Cloud Market a Blessing for Cloud OTT Platforms - Analytics India Magazine


Securing Medical Devices is a Matter of Life and Death – Infosecurity Magazine

When a man arrived in the middle of the night at a North London hospital, emotionally upset, distressed, with seizure-like movements and unable to speak, Isabel Straw, an NHS emergency doctor, at first struggled to find the cause: none of the tests her team performed on him revealed any issues.

That is, until she realized the man had a brain stimulator implanted inside his head and that its malfunctioning was probably the reason for his pain.

Straw, also the director of the non-profit bleepDigital, urged decision-makers at all levels to start investigating further the cybersecurity risks of medical devices, from consumer devices through to implanted and ingested technologies.

"In the past 10 years, we've seen a lot of advances in these technologies, which has opened up new vulnerabilities," she said during a presentation at UK Cyber Week, on April 4, 2023.

The Internet of Medical Things (IoMT), as all these devices have come to be called, is increasingly used in healthcare settings and at home, both outside and inside the body, and is ever more interconnected. The security threats the IoMT poses are therefore becoming more concerning and can have a significant impact on patients' health.

The fear that these devices could start malfunctioning, or even get hacked, is real, and examples of cyber incidents involving IoMT devices are growing. As a result, there needs to be increased coordination between manufacturers and governments to implement more safeguards against security incidents and more capabilities to operate digital forensics, Straw said.

She also insisted that healthcare professionals should be trained on technical issues they could encounter with IoMT devices and on as many models as possible.

"With the patient I mentioned, we had to go through his bag, where we found a remote control for the brain stimulator, which no doctor at the hospital knew about. So, I took a photo of it, did a reverse Google image search and found the manual online after a few hours. We realized the device was just desynchronized, but it took us 13 hours to find someone to reset it. If this happened again tomorrow, we would still not know how to treat him," she explained.

"To this day, we still don't know why it malfunctioned. Often, these medical devices don't have the memory space or the ability for digital forensics," she added.

These devices can process increasing amounts of data, posing a security risk and data privacy concerns.

"Since 2013, the electrodes in brain stimulators have started to be able to read more data, on top of just delivering a voltage. This allowed us to get more data from the patient's brain activity and read it externally, which can be used to personalize the data you're analyzing to the patient's disease. But streaming people's brain data also brings a confidentiality issue," Straw highlighted.

In that case, not only does the brain stimulator need to be secure, but so do the communication streams with the health center, the system the health professional is using, and the cloud servers, as health professionals increasingly use cloud services to process and analyze data.

Another challenge is what to do when someone dies because of a medical device. "If this man had died, what would have happened with his device? Should we bury it with him, or dispose of it? Does it go to the general waste? And what do you mention on the death certificate? These questions are still unanswered, and we don't get training on those issues," Straw noted.

See the article here:
Securing Medical Devices is a Matter of Life and Death - Infosecurity Magazine


Dubai, UAE Dedicated server hosting with Best Data Center … – Digital Journal

High-uptime, low-latency and low-cost dedicated server hosting plans with IPs based in Dubai, UAE

Delhi, Delhi, India, 8th Apr 2023, King NewsWire: Data centers are crucial to running your business, storing and managing vital data. They're also a great way to keep your company secure and ensure that everyone has access to information when they need it.

They're also a great way to simplify scaling when your company needs more capacity. TheServerHost Dubai Dedicated Server solutions can scale up relatively cheaply and in real time.

Dubai data centers are designed to handle demanding computing needs with the greatest efficiency, reliability and security. This means that they need to be built with the latest technologies and be able to adapt quickly to changing requirements.

Among the most important considerations are power, space and cooling capacity, with flexibility and scalability in mind. This is essential to ensuring that your data center is able to keep up with the demands of the business and grow as you do.

Dubai data centers also ensure that your business is well protected from external threats by using multiple layers of security systems, including biometric locks and video surveillance. This can prevent unauthorized people from accessing your servers and other equipment, which can lead to data breaches or malicious attacks.

Redundancy is the act of adding duplicate hardware or systems that can step in to take over the work if the original system fails. This is important in data center operations because it can prevent downtime and keep businesses running.

While redundant equipment helps reduce downtime, it also requires maintenance and care to ensure it works as expected. This is why many data centers have dedicated technicians on staff 24 hours a day.

There are several ways to build redundancy into your business. Some of the most common include having redundant rack power distribution units (PDUs), uninterruptible power supply (UPS) modules, and generators. These redundancy devices help keep your IT equipment powered up in the event of a power outage.

Another way to make sure that your equipment has backup power is by using dual feed or dual substations for utility power. These redundant components help ensure that your servers and other IT devices have plenty of power to keep them operating, even if one side of the power chain fails.

This type of redundancy can save your business money by reducing the amount of time that it takes to get your computer back up and running again. Additionally, it can minimize the impact that downtime has on your business and its customers.

The N value of redundancy is the minimum number of critical components needed for the data center to function at full capacity; it is a standard measurement for all data centers. Configurations are then described relative to N: N+1 adds one spare unit beyond that minimum, while 2N duplicates every critical component. N alone does not account for the additional redundancy required to keep your data center functioning at a high level of resilience.

Security is a vital part of any data center, as it protects critical information and applications from physical threats. Keeping data and applications secure can be an expensive and complicated endeavor, but it is also one that should never be ignored.

The most important thing about security in data centers is the right combination of strategy, architecture, technology and processes to mitigate the risk. By following these best practices, you can rest assured that your company's sensitive data is protected at all times.

First and foremost, you must ensure that you have a system in place that allows you to control access to the data center. This can include biometric readers, mantraps, anti-tailgating systems, and a number of other options.

Second, it must have a system in place that monitors all movement through the data center and prevents unwanted activity. This can be accomplished by using CCTV cameras to record movements in the hallways and at the data center itself.

Third, it must have a system in place to protect data and applications from environmental factors. This can be done by ensuring that the data center is built to withstand major weather events, such as floods, hurricanes, tornadoes and earthquakes.

Fourth, it must have a system in place for managing equipment that's onsite at the data center. This can be done by having a logically segmented network and by protecting the physical devices that make up that network from threats such as malware and viruses.

Finally, it must have a system in place where a firewall can be configured to block traffic based on endpoint identity and endpoint location. This will help you find attacks early before they can spread across your entire network.

A security strategy in a data center must be constantly monitored and adjusted, as the threat landscape changes. This is why it's essential to conduct regular audits and testing to identify vulnerabilities and patch up holes in your security infrastructure.

In addition to implementing the best security technologies and techniques, you must also make sure that your security staff is aware of the protocols they need to follow. This can be achieved by training all employees on the proper use of security measures and why they are needed.

Data centers are responsible for the storage of large amounts of data that businesses need to access. As a result, the management of data center resources becomes an important factor in ensuring that the data is available to meet business demands.

With so much data to manage, businesses are transforming their data center infrastructures into automated systems that help with monitoring, processing, and troubleshooting processes. These tools help to improve operational efficiency and reduce IT staff workloads by minimizing repetitive, time-consuming tasks so that they can focus on higher-level, strategic goals.

Besides improving productivity and operational efficiency, automation can also enhance the security of the data center. It can identify potential security threats, and it can respond to them in a timely manner.

Another benefit of data center automation is that it streamlines the network configuration process by enabling the use of common policy settings for all networks. This eliminates the need to manually implement changes that are necessary to accommodate changing IT needs.

It's also possible to integrate different automation solutions together to create a unified control center. This allows IT to configure event triggers and thresholds for compute, provisioning and deprovisioning resources across different layers of the infrastructure.

As an added bonus, many data center automation tools allow for API programmability. This ensures that applications can be easily integrated with each other and that they maintain a fast data exchange, which is critical for agile IT operations.

With these considerations in mind, the best data center infrastructure will enable businesses to take advantage of new technology while keeping costs down and avoiding unnecessary headaches. With automation in place, organizations will be able to manage their data center more effectively and deliver high-quality services to customers.

AI is the field of computer science that aims to create machines that can learn and think like humans. It encompasses machine learning and deep learning, which allow computers to mimic the neural networks in the human brain.

AI has become an increasingly important technology, and it's being applied in many different industries, including finance, healthcare and manufacturing. Companies use machine learning algorithms to understand data and uncover information about their customers, products, competitors and more.

There are also numerous AI-powered services available to organizations, many of which are provided by cloud providers. These services are aimed at speeding up data prep, model development and application deployment.

Dedicated servers are a great choice for businesses that have a lot of traffic or need enterprise applications. They offer better hardware, security, and experienced support. They also have unlimited bandwidth and dedicated IP addresses, so you can run as many websites as you want. TheServerHost offers a variety of plans and packages, so you can choose one that suits your needs.

TheServerHost Dubai servers are optimized for high-speed performance. They feature multiple high-speed network interfaces, daily security scans, redundant power and network connections, and are built with enterprise-grade hardware. The company also offers a centralized control panel, which makes managing your server easier.

TheServerHost has a team of technical support specialists that can help you with any issues you may have. They are available round the clock and can answer your questions quickly and efficiently. You can also contact them by phone or chat to get an immediate response.

Daily Backup: TheServerHost's daily backup service is free and provides cloud-to-cloud, disaster recovery, migration, deletion control, and search solutions. It can be used to back up databases, email accounts, and other important data.

Managed Services: TheServerHost's managed services can help you with your website and keep it secure and virus-free. They can also update your operating system, install security updates, and maintain your server's performance.

Memcached and Redis Caching: TheServerHost's caching technology speeds up the processing and execution of PHP-based applications, which helps your website load faster. It also stores the most requested and important databases in RAM, which reduces their retrieval time.
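The release describes that caching in the context of PHP applications, but the cache-aside pattern it refers to is language-agnostic. Here is a minimal Python sketch using redis-py; the connection details, key naming, and the stand-in database function are placeholders rather than anything specific to TheServerHost.

import json
import redis

r = redis.Redis(host="localhost", port=6379, db=0)   # placeholder connection details

def fetch_product_from_db(product_id):
    # Stand-in for a real (slow) database query.
    return {"id": product_id, "name": "example product"}

def get_product(product_id, ttl=300):
    """Cache-aside: serve from RAM if present, otherwise hit the database and cache the result."""
    key = f"product:{product_id}"
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)                    # fast path: retrieved from Redis
    row = fetch_product_from_db(product_id)          # slow path: query the database
    r.setex(key, ttl, json.dumps(row))               # keep it in RAM for `ttl` seconds
    return row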

Unmatched Uptime: TheServerHost has a 100% uptime guarantee, so you can rest assured that your site will always be online. They also have a team of dedicated engineers that can quickly respond to any problems you may encounter.

Whether you need a dedicated server for your business or just a personal blog, TheServerHost can provide you with everything you need to make your website a success. They have a variety of packages and plans to suit your needs, including free DNS, a control panel, and live chat support.

The best way to ensure your server is working at peak efficiency is to perform maintenance checks regularly. These include checking hardware and software updates, security upgrades, and RAID alarms. Performing these maintenance tasks can save you a lot of time and money down the road, so it's worth taking the time to do them.

In addition to maintaining your server, TheServerHost also offers a host of other services that can help you stay productive and on track. These include daily backup, daily malware scans, and daily malware removal. They can also help you upgrade your hardware, install new applications, and create a customized hosting plan.

Choosing the right dedicated hosting provider can be tricky. You need to choose a company that offers quality service and a fair price. It's also important to find a company that offers a wide range of features and services, such as managed hosting and unlimited bandwidth.

For Dubai VPS Server visit https://theserverhost.com/vps/dubai

For UAE Dedicated Server visit https://theserverhost.com/dedicated/dubai

Organization: TheServerHost

Contact Person: Robin Das

Website: https://theserverhost.com/

Email: [emailprotected]

Address: 493, G.F., Sector -5, Vaishali, Ghaziabad 201010.

City: Delhi

State: Delhi

Country: India

Release Id: 0804233047

The post Dubai, UAE Dedicated server hosting with Best Data Center Infrastructure TheServerHost appeared first on King Newswire.

Information contained on this page is provided by an independent third-party content provider. Binary News Network and this Site make no warranties or representations in connection therewith. If you are affiliated with this page and would like it removed please contact [emailprotected]

Read this article:
Dubai, UAE Dedicated server hosting with Best Data Center ... - Digital Journal


The Mastodon plugin is now available on the Steampipe Hub – InfoWorld

When Twitter changed hands last November I switched to Mastodon; ever since I've enjoyed happier and more productive social networking. To enhance my happiness and productivity I began working on a Mastodon plugin for Steampipe. My initial goal was to study the fediverse writ large. Which people and which servers are powerful connectors? How do moderation policies work? What's it like to join a small server versus a large one?

These are important questions, and you can use the plugin to begin to answer them. But I soon realized that as a newcomer to a scene that's been evolving for six years, and has not welcomed such analysis, I should start by looking for ways to enhance the experience of reading Mastodon. So I began building a set of dashboards that augment the stock Mastodon client or (my preference) elk.zone. And I've narrated that project in a series of posts.

Last week we released the plugin to the Steampipe Hub. If you've installed Steampipe, you can now get the plugin using steampipe plugin install mastodon. The next phases of this project will explore using the plugin and dashboards in Steampipe Cloud, and speeding up the dashboards by means of persistent Postgres tables and Steampipe Cloud snapshots. Meanwhile, here's a recap of what I've learned thus far.

While the dashboards use charts and relationship graphs, they are mainly tables of query results. Because Steampipe dashboards don't (yet) render HTML, these views display plain text only: no images, no styled text. I've embraced this constraint, and I find it valuable in two ways. First, I'm able to scan many more posts at a glance than is possible in conventional clients, and more effectively choose which to engage with. When I described this effect to a friend he said: "It's a Bloomberg terminal for Mastodon!" As those of us who rode the first wave of the blogosphere will recall, RSS readers were a revelation for the same reason.

Second, I find that the absence of images and styled text has a calming effect. To maintain a healthy information diet you need to choose sources wisely but, no matter where you go, sites deploy a barrage of attention-grabbing devices. I find dialing down the noise helpful, for the same reason that I often switch my phone to monochrome mode. Attention is our scarcest resource; the fewer distractions, the better.

There's a tradeoff, of course; sometimes an image is the entire point of a post. So while I often read Mastodon using these Steampipe dashboards, I also use Elk directly. The Steampipe dashboards work alongside conventional Mastodon clients, and indeed depend on them: I click through from the dashboards to Elk in order to boost, reply, or view images. That experience is enhanced by instance-qualified URLs that translate foreign URLs to ones that work on your home server.

The ability to assign people to lists, and read in a list-oriented way, is a handy Twitter affordance that I never used much because it was easy to let the algorithms govern my information diet. Because Mastodon doesn't work like that, lists have become the primary way I read the fediverse flow. Of the 800+ people I follow so far, I've assigned more than half to lists with titles like *Climate* and *Energy* and *Software*. To help me do that, several dashboards report how many of the people I follow are assigned to lists (or not).

I want as many people on lists as possible. So I periodically review the people I follow, put unassigned people on lists, and track the ratio of people who are, or aren't, on lists. Here's the query for that.
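The post's original query isn't reproduced here, but to give a feel for its shape, here is a hedged approximation run over Steampipe's local Postgres endpoint with psycopg2. The table and column names (mastodon_my_following, mastodon_list_account) are assumptions about the plugin's schema; check the plugin documentation on the Hub for the real ones.

import psycopg2

# Steampipe exposes its tables over a local Postgres endpoint (default port 9193);
# credentials come from `steampipe service start`.
conn = psycopg2.connect(host="localhost", port=9193, dbname="steampipe", user="steampipe")

# Assumed table/column names: people I follow vs. people assigned to at least one list.
SQL = """
with listed as (select distinct account_id from mastodon_list_account)
select
  count(*) filter (where f.id in (select account_id from listed))     as on_lists,
  count(*) filter (where f.id not in (select account_id from listed)) as not_on_lists
from mastodon_my_following f
"""

with conn, conn.cursor() as cur:
    cur.execute(SQL)
    print(cur.fetchone())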

When you read in a list-oriented way, as is also true when you read by following hashtags, there are always people whose chattiness becomes a distraction. To control that I've implemented the following rule: Show at most one original toot per person per list per day. Will I miss some things this way? Sure! But if you've said something that resonates with other people, I'm likely to hear about it from someone else. It's a tradeoff that's working well for me so far.

Here's the SQL implementation of the rule.
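That implementation was published with the original post; as a stand-in, the snippet below sketches the kind of query the rule implies, using the select distinct on idiom discussed next. It can be run over the same connection as the previous sketch, and the table and column names are again assumptions rather than the plugin's documented schema.

# Keep only the first toot per (list, person, day); later rows in each group are dropped.
ONE_TOOT_PER_PERSON_PER_LIST_PER_DAY = """
select distinct on (list, person, day)
  list,
  username         as person,
  date(created_at) as day,
  content
from mastodon_toot_list          -- assumed table of toots per list
order by list, person, day, created_at
"""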

On the home timelines dashboard I've made it optional to include or hide boosts, which can be the majority of items. On the list-reading dashboard I've opted to always exclude them, but the SQL idiom for doing so, select distinct on (person, day), is simple, easy to understand, and easy to change.

I've so far found three ways in which relationship graphs can make Mastodon more legible. First, in Mastodon relationship graphs, I showed how to use SQL-defined nodes and edges to show boost relationships among people and servers. In another article I used the same tools to map relationships among people and tags. And most recently I used them to explore server-to-server moderation.

In all three cases the format conveys information not directly available from tabular views. Clusters of interesting people pop out, as do people who share tags. And when I graphed servers that block other servers I discovered an unexpected category: some servers that block others are themselves also blocked, like infosec.exchange in this example.

The Steampipe combo of SQL-oriented API access and dashboards as code is a uniquely productive way to build relationship graphs that can unlock insights in any domain. As we've seen with Kubernetes, they can help make cloud infrastructure more legible. The Mastodon graphs suggest that the same can happen in the social networking realm.

When you append .rss to the URL of a Mastodon account, or tag, you produce an RSS feed like https://mastodon.social/@judell.rss or https://mastodon.social/tags/steampipe.rss. These feeds provide a kind of auxiliary API that includes data not otherwise available from the primary API: related tags, which appear in the feeds as RSS category elements. Steampipe really shines here thanks to the RSS plugin, which enables joins with the primary Mastodon API. This query augments items in an account's feed with tags that appear in each item.

A similar query drives the graph discussed in Mapping people and tags on Mastodon.

In that example, surfacing the connection between a user, @themarkup, and a pair of tags, scotus and section230, was useful in two ways. First, it helped me instantly spot the item that I most wanted to read, which was buried deep in the search results. Second, it helped me discover a source that I'll return to for guidance on similar topics. Of course I added that source to my Law list!

Everyone who comes to Mastodon appreciates not having an adversarial algorithm control what they see in their timelines. Most of us aren't opposed to algorithmic influence per se, though; we just don't like the adversarial nature of it. How can we build algorithms that work with us, not against us? We've already seen one example: the list-reading dashboard displays just one item per list per person per day. That's a policy that I was able to define, and easily implement, with Steampipe. And in fact I adjusted it after using it for a while. The original policy was hourly, and that was too chatty, so I switched to daily by making a trivial change to the SQL query.

In News in the fediverse I showed another example. The Mastodon server press.coop aggregates feeds from mainstream news sources. I was happy to have those feeds, but I didn't want to see those news items mixed in with my home timeline. Rather, I wanted to assign them to a News list and read them only when I visit that list in a news-reading mindset. The fediverse offers an opportunity to reboot the social web and gain control of our information diets. Since our diets all differ, it ought to be possible, and even easy, for anyone to turn on a rule like *news only on lists, not timelines*. Steampipe can make it so.

When you ask people on Mastodon about these kinds of features, the response is often "Have you tried client X? It offers feature Y." But that solution doesn't scale. It would require massive duplication of effort for every client to implement every such policy; meanwhile, people don't want to switch to client X just for feature Y (which might entail losing feature Z). Could policies be encapsulated and made available to any Mastodon client? It's interesting to think about Steampipe as a component that delivers that encapsulation. A timeline built by SQL queries, and governed by SQL-defined policies, is a resource available to any app that can connect to Postgres, either locally or in Steampipe Cloud.

If you're curious about the Steampipe + Mastodon combo, install the plugin, try out the sample queries, then clone the mod and check out the dashboards. Do they usefully augment your Mastodon reader? What would improve them? Can you use these ingredients to invent your own customized Mastodon experience? Join our Slack community and let us know how it goes!

See the article here:
The Mastodon plugin is now available on the Steampipe Hub - InfoWorld


Software Architecture and Design InfoQ Trends Report – April 2023 – InfoQ.com

Key Takeaways

The InfoQ Trends Reports provide InfoQ readers a high-level overview of the topics to pay attention to, and also help the InfoQ editorial team focus on innovative technologies. In addition to this report and the trends graph, an accompanying podcast features some of the editors discussing these trends.

More details follow later in the report, but first it is helpful to summarize the changes from last year's trends graph.

Three new items were added to the graph this year. Large language models and software supply chain security are new innovator trends, and "architecture as a team sport" was added under early adopters.

Trends which gained adoption, and therefore moved to the right, included "design for portability," data-driven architecture, and serverless. eBPF was removed as it has niche applications, and is not likely to be a major driver in architectural decisions.

A few trends were renamed and/or combined. We consider Dapr as an implementation of the "design for portability" concept, so it was removed as a separate trend. Data-driven architecture is the combination of "data + architecture" and data mesh. Blockchain was replaced with the broader idea of decentralized apps, or dApps. WebAssembly now notes both server-side and client-side, as these are related but separate ideas and may evolve independently in the future.

The portability aspect of "design for portability" is not about being able to pick up your code and move it. Rather, it creates a clean abstraction from the infrastructure. As InfoQ editor Vasco Veloso says, "whoever is designing and building the system can focus on what brings value, instead of having to worry too much with the platform details that they are going to be running on."

This design philosophy is being enabled by frameworks such as Dapr. Daniel Bryant, InfoQ news manager, sees the benefit of the CNCF project as providing a clearly defined abstraction layer and API for building cloud-native services. Bryant said, "[With integration] it's all about the APIs and [Dapr] provides abstractions without doing the lowest common denominator."

A recent article by Bilgin Ibryam described the evolution of cloud-native applications into cloud-bound applications. Instead of designing a system with logical components for application logic and compute infrastructure, cloud-bound applications focus on the integration bindings. These bindings include external APIs as well as operational needs such as workflow orchestration and observability telemetry.

Another technology that supports designing for portability is WebAssembly, specifically server-side WebAssembly. Often WebAssembly is thought of as a client-side capability, for optimizing code running in the browser. But using WebAssembly has significant benefits for server-side code. InfoQ Editor Eran Stiller described the process for creating WebAssembly-based containers.

Instead of compiling it to a Docker container and then needing to spin up an entire system inside that container on your orchestrator, you compile it to WebAssembly and that allows the container to be much more lightweight. It has security baked in because it's meant to run in the browser. And it can run anywhere: in any cloud, or on any CPU, for that matter. Eran Stiller

More information about Dapr and WebAssembly can be found by following those topics on InfoQ.

The news around AI, specifically large language models such as GPT-3 and GPT-4, has been impossible to ignore. This is not simply a tool used by software professionals as the adoption by everyday people and the coverage in all forms of media has demonstrated. But what does it mean to software architects? In some ways, it is too early to know what will happen.

With ChatGPT and Bing, we're just beginning to see what is possible with large language models like GPT-3. This is the definition of an innovator trend. I don't know what will come of it, but it will be significant, and something I look forward to seeing evolve in the next few years. Thomas Betts

While the future is uncertain, we have optimism that these AI models will generally have a positive benefit on the software we build and how we build it. The code-generation capabilities of ChatGPT, Bing chat, and GitHub Copilot are useful for writing code and tests and allowing developers to work faster. Architects are also using the chatbots to discuss design options and analyze trade-offs.

While these improvements in efficiency are useful, care must be taken to understand the limitations of AI models. They all have built-in biases which may not be obvious. They also may not understand your business domain, despite sounding confident in their responses.

This will definitely be a major trend to watch in 2023, as new products are built on large language models and companies find ways to integrate them into existing systems.

Last year, we discussed the idea of data + architecture as a way to capture how architects are considering data differently when designing systems. This year we are combining that idea with Data Mesh under the heading of data-driven architecture.

The structure, storage, and processing of data are up-front concerns, rather than details to be handled during implementation. Blanca Garcia-Gil, a member of the QCon London programming committee, said, "when designing cloud architectures there is a need to think from the start about data collection, storage, and security, so that later on we can derive value from it, including the use of AI/ML." Garcia-Gil also pointed out that data observability is still an innovator trend, at least compared to the state of observability of other portions of a system.

Data Mesh was a paradigm shift, with teams aligned around the ownership of data products. This fits the idea of data-driven architecture, as well as incorporating Conway's Law into the overall design of a system.

While there has been more adoption in designing for sustainability, we chose to leave it as an innovator trend because the industry is just starting to really embrace sustainable systems and designing for a low carbon footprint. We need to consider sustainability as a primary feature, not something we achieve secondarily when trying to reduce costs. Veloso said, "I have noticed that there is more talk about sustainability these days. Let's be honest that probably half of it is because energy is just more expensive and everybody wants to reduce OPEX."

One of the biggest challenges is the difficulty in measuring the carbon footprint of a system. Until now, cost has been used as a stand-in for environmental impact, because there is a correlation between how much compute you use and how much carbon you use. But this technique has many limitations.

The Green Software Foundation is one initiative trying to help create tools to measure the carbon consumed. At QCon London, Adrian Cockcroft gave an overview of where the three major cloud vendors (AWS, Azure, GCP) currently stand in providing carbon measurements.

As the tooling improves, developers will be able to add the carbon usage to other observability metrics of a system. Once those values are visible, the system can be designed and modified to reduce them.

This also ties into the ideas around portability and cloud-native frameworks. If our systems are more portable, that means we will more easily be able to adapt them to run in the most environmentally-friendly ways. This could mean moving resources to data centers that use green energy, or processing workloads during times when the energy available is more green. We can no longer assume that running at night, when the servers are less busy, is the best option, as solar power could mean the middle of the day is the greenest time.

Blockchain and a distributed ledger is the technology behind decentralized apps. Mostly due to changes at Twitter, Mastodon emerged as an alternative, decentralized social network. However, blockchain remains a technology that solves a problem most people do not see as a problem. Because of this niche applicability it remains classified as an innovator trend.

Architects no longer work alone, and architects can no longer think only about technical issues. The role of an architect varies greatly across the industry, and some companies have eliminated the title entirely, favoring principal engineers as the role primarily responsible for architectural decisions. This corresponds to a more collaborative approach, where architects work closely with the engineers who are building a system to continually refine the system design.

Architects have been working collaboratively with software teams to come up with and iterate designs. I continue to see different roles here (especially in larger organizations), but communication and working together through proof of concepts to try out designs if needed is key. Blanca Garcia-Gil

Architecture Decision Records (ADRs) are now commonly recognized as a way to document and communicate design decisions. They are also being used as a collaboration tool to help engineers learn to make technical decisions and consider trade-offs.

The Architecture & Design editorial team met remotely to discuss these trends and we recorded our discussion as a podcast. You can listen to the discussion and get a feel for the thinking behind these trends.

Read the original post:
Software Architecture and Design InfoQ Trends Report - April 2023 - InfoQ.com


The Future of Applied AI: Towards A Hyperpersonalised & Sustainable World – BBN Times

Business leaders are facing the challenges of addressing sustainability goals including reducing carbon footprint and managing energy consumption costs, whilst also ensuring that they position their firm to take advantage of the rapid pace of change and new business opportunities that advancing technology, in particular AI, is enabling across every sector of the economy.

As an Intel Ambassador, I am delighted to continue my collaboration with Intel in relation to the 4th Generation of Intel Xeon Scalable Processors and the potential to scale AI across the economy whilst also helping meet sustainability objectives.

With built-in accelerators and software optimizations, 4th Gen Intel Xeon Scalable Processors have been shown to deliver leading performance per watt on targeted real-world workloads. This results in more efficient CPU utilization, lower electricity consumption, and higher ROI, while helping businesses achieve their sustainability goals.

One may add broad AI or Artificial Broad Intelligence (ABI) into the categories on the lower left side from the image above. We are now in the era of ABI as Multimodal, Multitasking Transformers from the likes of Microsoft, Google, OpenAI, and others enable certain Deep Learning algorithms to perform both vision and natural language processing (NLP) tasks, albeit such powerful algorithms require capable Central Processing Units (CPUs) and Graphical Processing Units (GPUs) that scale to perform well.

Intel 4th Generation Xeon Scalable Processors accelerate AI workloads by 3x to 5x for Deep Learning inference on SSD-ResNet34 and by up to 2x for training on ResNet50 v1.5 with Intel Advanced Matrix Extensions (Intel AMX), compared with the previous generation. Furthermore, in terms of AI performance, 4th Gen Intel Xeon Scalable Processors deliver up to 10X higher PyTorch performance for both real-time inference and training with built-in AMX (BF16) versus the prior generation (FP32).
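As a concrete illustration of how an application reaches those AMX numbers, the BF16 path is normally enabled through the framework rather than hand-written intrinsics. The PyTorch sketch below is generic (the ResNet-50 model and random input are just placeholders); on a 4th Gen Xeon, recent PyTorch builds and the Intel Extension for PyTorch can lower these bfloat16 matrix multiplications onto the AMX tile units where available.

import torch
import torchvision.models as models

# Placeholder model and input; any convolutional or transformer model follows the same pattern.
model = models.resnet50(weights=None).eval()
x = torch.randn(1, 3, 224, 224)

# Optional: the Intel Extension for PyTorch can further prepare the model for bf16 execution.
# import intel_extension_for_pytorch as ipex
# model = ipex.optimize(model, dtype=torch.bfloat16)

# Run inference under bfloat16 autocast on the CPU; on 4th Gen Xeon these
# bf16 matrix multiplications are candidates for the AMX tile units.
with torch.no_grad(), torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    out = model(x)

print(out.shape)   # torch.Size([1, 1000])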

As we enter an era of ever more powerful AI algorithms, such as Transformers with self-attention and Generative AI, and the rise of AI meets the IoT (AIoT), we'll need the kind of capability that 4th Gen Intel Xeon Scalable Processors deliver: more efficient and powerful CPUs that allow AI to scale and process large volumes of data very rapidly in low-latency use cases, and yet at the same time do so with energy efficiency and a reduced carbon footprint as key objectives too.

Microsoft commissioned a report from PWC entitled How AI can deliver a sustainable future in relation to the potential for AI across four sectors of the global economy:

Energy;

Agriculture;

Water;

Transportation.

The results from the report demonstrated the potential of AI to drive a reduction in emissions, whilst also increase jobs and economic growth across the four sectors explored in the report:

Reduction of CO2 emissions by up to 4% globally;

GDP growth of 4.4%, amounting to a vast $5.2 trillion;

Employment growth amounting to a net 38 million jobs created.

The potential for the reduction in GHG emissions (up to 4% globally) is based upon assumptions applied across all four sectors (water, energy, agriculture and transportation) and the role that AI may play across those sectors including but not limited to precision agriculture, precision monitoring, fuel efficiencies, optimising use of inputs, higher productivity.

Furthermore, the resulting gains from Standalone 5G networks were set out by the US EPA and BCG (see right side of the infographic above), whereby the ability of SA 5G networks to enable massive scaling of the AIoT (AI applied onto IoT devices and sensors) and the increased automation flows with machine-to-machine communications may result in both a jobs gain and the potential to reduce GHG emissions.

The latest Intel Accelerator Engines and software optimizations help improve power efficiency across AI, data analytics, networking and storage. Organizations can achieve a 2.9x average performance per watt efficiency improvement for targeted workloads utilizing built-in accelerators compared with the previous generation. This leads to more efficient CPU utilization, lower electricity consumption and higher return on investment, while helping businesses achieve their sustainability and carbon reduction goals.

The 4th Generation of Intel Xeon Scalable Processors provides energy efficiency improvements achieved through innovations within the design of the built-in accelerators. This allows particular workloads to consume less energy whilst running at faster speeds.

The result, per watt, is on average 2.9X over the 3rd Gen Intel Xeon Processors, whilst also allowing for the massive scaling of workloads that will be needed in the new era of the AIoT that we are entering: for example, inferencing and learning increased by 10X, improved compression by 2X, and data analytics by 3X, all achieved with 95% fewer cores. [1]

Another innovation is the Optimized Power Mode feature that, when enabled, provides 20% energy savings (up to 140 Watts on a dual socket system) while only minimally impacting performance (2-5% on select workloads).

The convergence of Standalone (SA) 5G networks, which allow for a massive increase in device connectivity and ultra-low-latency environments, will enable a massive scaling of the Internet of Things (IoT), with internet-connected devices and sensors communicating with human users and with each other (machine to machine). Increasingly, these IoT devices will have AI embedded onto them (on the edge of the network).

Furthermore, Statista forecasts that by 2025 there will be a staggering 75 billion internet-connected devices, or over 9 per person on the planet! And IDC and Seagate forecast that the volume of data generated will increase from 64 zettabytes in 2020 (when we talked about the era of big data) to almost three times that volume, amounting to 175 zettabytes in 2025, with a third of this data consumed in real time! Applying AI will be essential to efficiently manage networks and also to make sense of the data and provide near real-time responses to users.

Furthermore, this new era will allow us to measure, analyse (evaluate) and respond dynamically to our environment (whether that be healthcare, energy, smart cities with traffic, manufacturing, etc). AI capabilities and inference performance will be key to succeed in this era that we are entering into.

Imagine a world where machine-to-machine communication reduces risk: a broken-down car is detected by the car behind it, which then broadcasts a warning to the other vehicles around it; they in turn rebroadcast it and thereby also avoid the traffic jams in which emissions can increase due to slow-moving traffic, as shown in the illustration below.

Intel Xeon Scalable processors provide more networking compute at lower latency while helping preserve data integrity. Organizations can achieve up to 79% higher storage I/O per second (IOPS) with as much as 45% lower latency when using NVMe over TCP, accelerating CRC32C error checking with Intel Data Streaming Accelerator (Intel DSA), compared to software error checking without acceleration.

BCG, in an article entitled Reduce Carbon and Costs with the Power of AI, forecast that AI technology applied towards corporate sustainability goals may yield emission reductions of 2.6 to 5.3 gigatons, or $1 trillion to $3 trillion in value added.

The process for achieving this entails:

Monitoring emissions;

Predicting emissions;

Reducing emissions.

BCG believes that the sectors with the greatest potential for reductions of GHGs due to application of AI include: Industrial goods, transportation, pharmaceutical, consumer packaged goods, energy and utilities.

Intel's vision is to accelerate sustainable computing, from manufacturing to products to solutions, for a sustainable future. Organizations can help reduce their Scope 3 GHG emissions by choosing 4th Gen Intel Xeon Scalable Processors, which are manufactured with 90-100% renewable energy at sites with state-of-the-art water reclamation facilities that in 2021 recycled 2.8 billion gallons of water. For the avoidance of doubt, the statistics provided in this paragraph relate to Scope 3 emissions from embodied carbon, which do not affect operational carbon emissions; however, Scope 3 also includes operational carbon, within which servers form a larger part of the equation.

Use case examples of applying the AIoT towards sustainability include the following:

Sensors that may detect that no person is present in a room and hence switch off the lights and turn the heating (or, in summer, the air conditioning) off or down to a lower level;

Sensors that may realise that a window is open whilst the heating is running and close it;

Predicting issues before they occur, such as burst water pipes, unplanned outages in monitoring, and traffic congestion spots, and trying to reroute traffic or amend the traffic light sequencing to reduce the jams;

In relation to agriculture, applying computer vision on a drone to determine when crops are ripe for harvesting (so as to reduce wasted crops) and also to check for signs of drought and insect infestations;

Deforestation: near real-time analytics of illegal logging.

Renewable energy: drones applying computer vision from Deep Learning algorithms to inspect the blades of wind turbines and the solar panels on solar farms for cracks and damage, thereby improving asset life and enhancing the amount of energy generated.

Energy storage optimisation with Machine Learning algorithms applied towards maximising the operational performance and return on investment for battery storage.

Rolnick et al. (2019) published a paper entitled Tackling Climate Change with Machine Learning (co-authored by leading AI researchers including Demis Hassabis, co-founder of DeepMind, Andrew Y. Ng, and Yoshua Bengio) that set out the potential to reduce emissions by applying AI across the manufacturing operations of a firm, all the way from the design stage with generative design and 3D printing, through supply chain optimization with a preference for low greenhouse gas emissions options, and improving factory energy consumption with renewable supplies and efficiency gains (including predictive maintenance), to detecting emissions and then abating emissions from heating and cooling and optimizing transport routes.

The 4th Generation of Intel Xeon Scalable Processors also have power management tools to enable more control and greater operational savings. For example, new Optimized Power Mode in the platform BIOS can deliver up to 20% socket power savings with a less than 5% performance impact for selected workloads.

Furthermore, the paper by Rolnick et al. sets out how firms may deal with the unsold inventory problem in retail, with some estimates placing the annual cost to the fashion industry at $120 billion. This is both an economic and an environmental wastage. Targeted recommendation algorithms to match supply with demand, and the application of Machine Learning to forecasting demand and production needs, may also help reduce such wastage (a short forecasting sketch follows).
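
As a hedged illustration of demand forecasting to curb over-production, the sketch below trains a gradient-boosted regressor on a synthetic weekly sales history; the features, data and forecast horizon are assumptions for illustration, not any retailer's actual pipeline.

```python
# A minimal demand-forecasting sketch on synthetic weekly sales data;
# the features and promotion flags are hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
weeks = np.arange(52)
promo = rng.integers(0, 2, size=52)
units = 200 + 10 * np.sin(weeks / 8.0) + 30 * promo + rng.normal(0, 5, 52)
sales = pd.DataFrame({"week": weeks, "promo": promo, "units_sold": units})

model = GradientBoostingRegressor(random_state=0)
model.fit(sales[["week", "promo"]], sales["units_sold"])

# Forecast the next four weeks to size orders and avoid unsold stock.
next_weeks = pd.DataFrame({"week": np.arange(52, 56), "promo": [1, 0, 0, 1]})
print(model.predict(next_weeks).round(1))
```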

In the world of the AIoT a customer could be walking along the high street or the mall and a Machine Learning algorithm could offer them personalised product recommendations based upon the stores in close proximity to them.

Both the retail and manufacturing examples would require near real-time responses from the AI algorithms and hence a reason why accelerators within the CPU are important factors to deliver enhanced performance.

The world of the AIoT will require the ability to work within power constrained environments and respond to user needs in near-real time.

Intel enables organizations to make dynamic adjustments to save electricity as computing needs fluctuate. 4th Gen Intel Xeon Scalable Processors have built-in telemetry tools that provide vital data and AI capabilities to help intelligently monitor and manage CPU resources, build models that help predict peak loads on the data centre or network, and tune CPU frequencies to reduce electricity use when demand is lower. This opens the door to greater electricity savings, the ability to selectively increase workloads when renewable energy sources are available, and an opportunity to lower the carbon footprint of data centres.
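
As a hedged illustration of the idea, using generic Linux cpufreq controls rather than Intel's own telemetry stack, the sketch below samples CPU utilisation and drops the frequency governor to powersave when demand is low. The 30% threshold, the 5-second sample window and the requirement for root privileges are assumptions of this toy example.

```python
# A minimal sketch of demand-driven frequency tuning via generic Linux cpufreq
# sysfs controls (not Intel-specific tooling); requires root and a cpufreq-enabled
# kernel, and the 30% utilisation threshold is arbitrary.
import glob
import psutil

GOVERNOR_PATHS = glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor")

def set_governor(governor: str) -> None:
    for path in GOVERNOR_PATHS:
        with open(path, "w") as f:   # writing here needs root privileges
            f.write(governor)

utilisation = psutil.cpu_percent(interval=5)   # sample load over 5 seconds
# Drop to powersave when demand is low; return to performance when it rises.
set_governor("powersave" if utilisation < 30 else "performance")
print(f"CPU utilisation {utilisation:.0f}% -> governor updated")
```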

In addition, only Intel offers processor SKUs optimized for liquid-cooled systems, with an immersion cooling warranty rider available, helping organizations further advance their sustainability goals.

AI will be all around us, across the devices and sensors that we use, allowing for mass hyper-personalisation at scale with near real-time responses to the customer. However, in order to seize these opportunities, business leaders will need to ensure that they have invested in the appropriate technology that can meet the needs of the business and its customers.

We are entering an era where near immediate responses (often on the fly) will be necessary to engage with customers and also to respond dynamically in a world of machine-to-machine communication.

Intel Advanced Matrix Extensions (Intel AMX) allows for efficient scaling of AI capabilities to respond to the needs of the user and the network.

Significantly accelerate AI capabilities on the CPU with Intel Advanced Matrix Extensions (Intel AMX). Intel AMX is a built-in accelerator that improves the performance of Deep Learning training and inference on 4th Gen Intel Xeon Scalable Processors, ideal for workloads like natural language processing, recommendation systems, and image recognition.
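
As a hedged sketch of how a developer would typically reach Intel AMX from application code, the example below runs inference in bfloat16 through PyTorch and the Intel Extension for PyTorch, so that matrix multiplications can be dispatched to AMX on supported 4th Gen Xeon CPUs via oneDNN. The ResNet-50 model and input shape are placeholders, and the example assumes the torch, torchvision and intel_extension_for_pytorch packages are installed.

```python
# A minimal bfloat16 inference sketch; on 4th Gen Xeon CPUs the underlying
# matrix multiplies can be executed with Intel AMX through oneDNN.
# Assumes torch, torchvision and intel_extension_for_pytorch are installed.
import torch
import intel_extension_for_pytorch as ipex
import torchvision.models as models

model = models.resnet50(weights=None).eval()
model = ipex.optimize(model, dtype=torch.bfloat16)   # prepare weights for bf16 paths

sample = torch.rand(1, 3, 224, 224)
with torch.no_grad(), torch.cpu.amp.autocast(dtype=torch.bfloat16):
    output = model(sample)
print(output.shape)   # torch.Size([1, 1000])
```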

4th Gen Intel Xeon Scalable Processors have the most built-in accelerators of any CPU on the market to deliver performance and power efficiency advantages across the fastest growing workload types in AI, analytics, networking, storage, and HPC. With all-new accelerated matrix multiply operations, 4th Gen Intel Xeon Scalable Processors have exceptional AI training and inference performance.

Other seamlessly integrated accelerators speed up data movement and compression for faster networking, boost query throughput for more responsive analytics, and offload scheduling and queue management to dynamically balance loads across multiple cores. To enable new built-in accelerator features, Intel supports the ecosystem with OS-level software, libraries, and APIs.

Performance gains from the 4th Gen Intel Xeon Scalable Processors include the following (source: 4th Gen Intel Xeon Scalable Processors performance index):

Run cloud and networking workloads using fewer cores with faster cryptography. Increase client density by up to 4.35x on an open-source NGINX web server with Intel QuickAssist Technology (Intel QAT) using RSA4K compared to software running on CPU cores without acceleration.

Improve database and analytics performance with 1.91x higher throughput for data decompression in the open source RocksDB engine, using Intel In-Memory Analytics Accelerator (Intel IAA) compared to software compression on cores without acceleration. Speed up data movement solutions with 8.9x increased memory-to-memory transfer using Intel Data Streaming Accelerator (Intel DSA), versus previous-generation direct memory access.

For 5G vRAN deployments, increase network capacity up to 2x with new instruction set acceleration compared to the previous generation.

Security is a key issue in the era of the AIoT as standalone (SA) 5G networks expand and scale.

Businesses need to protect data and remain compliant with privacy regulations whether deploying on premises, at the edge, or in the cloud. 4th Gen Intel Xeon Scalable processors unlock new opportunities for business collaboration and insights, even with sensitive or regulated data. Confidential computing helps protect data in use with hardware-based isolation and remote attestation of workloads. Intel Software Guard Extensions (Intel SGX) is the most researched, updated, and deployed confidential computing technology in data centres today, and has the smallest trust boundary of any such technology. Developers can run sensitive data operations inside enclaves to help increase application security and protect data confidentiality.

Intel's Bosch case study provides an example of an application of security in the IoT sector.

The case study observed that access to raw data sets is ideal for the development of analytics based on Artificial Intelligence. The example sets out how Bosch's autonomous vehicles unit reduced risks associated with data or IP leakage using the open source project Gramine, running on Intel SGX. For more details, please refer to Implementing Advanced Security for AI and Analytics.

By the end of this decade, we may experience a substantial increase in the number of advanced Electric and Autonomous Vehicles (EVs and AVs) on the road, and a world where battery storage will be of greater importance as more renewable energy scales across the grid (following the Inflation Reduction Act in the US, and the continued policies of the UK and EU towards reducing carbon emissions). Powerful CPUs with built-in accelerators can help Machine Learning techniques scale across battery storage facilities to optimise the availability of energy and battery performance. This is relevant for edge and network scenarios with power and battery constraints, such as EVs and power-optimized devices in smart homes and manufacturing facilities.

In this world, mass hyper-personalisation at scale enabled by the AIoT will allow for near real-time engagement with the customer on the fly, as well as greater efficiency and hence less wastage, as Machine Learning and Data Science enable superior prediction of customer needs from the vast amounts of data that will be created.

One may imagine users engaging in retail or entertainment on their way to and from work, with the EV/AV recognising the passengers using Computer Vision from Deep Learning and personalising the environment of the car (entertainment, etc.) to the user profile. The AV/EVs will go from one journey to another, adjusting to different passengers and allowing each user to use their time efficiently and as they wish (engaging with brands, working, entertainment). However, even before more advanced EV/AVs arrive, there are many opportunities for firms to seize in the era of the AIoT for near real-time engagement with the customer whilst also reducing wastage (for example, better matching supply and demand, improved demand forecasting, and aligning supply chain and manufacturing processes).

The 4th Gen Intel Xeon Scalable processors enable a more secure environment for developing IoT services and applications across the edge of the network, in turn enabling businesses to create new opportunities with greater confidence around security.

This vision of scaling and enabling a secure AIoT aligns with my own personal vision of applying AI and related data analytics and digital technology to deliver on sustainability objectives, whilst also delivering a world of genuine mass hyper-personalisation at scale whereby firms can truly respond to their customers' needs in real time and further tailor their offerings to the individual customer's needs.

We're entering an exciting new era, from this year and across the rest of this decade, in which AI will scale rapidly across the devices and sensors around us, as well as the remote cloud servers that will remain important for training algorithms, acting as data lakes, and enabling analytics on historic data to improve learning outcomes for AI, improve the personalisation of services, and identify opportunities to further enhance operational efficiencies across organisations.

We'll be able to measure and evaluate emissions and energy consumption around us, identify wastage and reduce inefficiencies.

The AI algorithms across the Edge of the network will require energy-efficient CPUs to operate in power-constrained environments and to achieve a reduced carbon footprint. The 4th Gen Intel Xeon Scalable Processors allow organisations to scale AI capabilities, provide hyper-personalisation at scale, and manage their internal operations at the Edge more efficiently, whilst also helping security and sustainability goals to be met.

Imtiaz Adam

Data Scientist

Postgraduate in Computer Science with research in AI, Sloan Fellow

More here:
The Future of Applied AI: Towards A Hyperpersonalised & Sustainable World - BBN Times

Read More..

How excited should we be for the future of artificial intelligence? – Daily Cardinal

Disruptive technologies have impacted our lives in innumerable ways. Electric cars, virtual reality, 5G and numerous other ingenious human accomplishments have changed the way we live. But the latest breakthrough in tech is artificial intelligence (AI).

As a student, I see how it impacts both students' ability to complete their schoolwork and professors' teaching styles. However, with this added ease for students comes added pressure for professors.

Hours of lecturing can now be replaced with a simple prompt and some elementary follow-up questions. I've witnessed first-hand that teachers are making learning content we can't get from AI their top priority in class. It makes sense. What is the purpose of going through the entire college process just to have a computer do the exact same thing in significantly less time?

Making sure there is a distinction between the capabilities of humans and those of AI should be a top priority; otherwise, our day-to-day lives would seem merely superficial. But as AI constantly increases its own capabilities, this becomes a seemingly impossible task.

The generative AI chatbot known as ChatGPT has taken college campuses by storm. Now, the popularity and power of ChatGPT have led to an AI arms race with major tech firms such as Google and Microsoft squaring off at the top.

The speed at which new technologies are being released is remarkable. This innovation is fueled by competition and, of course, money. However, AI may be a disruptive technology we might not want gaining power too quickly. For all of the good that comes out of AI, it's also important to be mindful of the dangers that can come from it. Unlike electric cars, virtual reality and 5G, AI has the capability to act in human-like ways, raising a whole new list of issues.

Earlier this year, AI engines from Google and Microsoft passed the Turing test. In the Turing test, a human interrogator converses with an AI engine; if the interrogator is unable to distinguish the AI's responses from a human's, the AI has passed the test.

Most of these questions are not centered on high-level intelligence but rather on language cues and elementary-level problem solving. Before the development of these chatbots, passing the Turing test was a rarity, to say the least. Now, it seems ever more common.

This accelerated development of AI that is indistinguishable from humans is dangerous. For example, AI can now engage in manipulation and initiate a task as simple as a common scam. This isn't to say we have reached The Terminator levels, where the human race could be at risk, but it is certainly important to monitor as the capabilities of AI increase.

Additionally, equipping bad actors with this kind of AI could have catastrophic impacts throughout the world. As a result, AI experts and global tech leaders such as Elon Musk recently called for a pause in the development of this powerful technology.

In an interview with Yoshua Bengio, often referred to as a Godfather of AI, Musk said, "We've reached the point where these systems are smart enough that they can be used in ways that are dangerous for society." This consensus among tech leaders is not a call to stop AI development altogether, but rather to understand how to control the AI that has already been developed before developing it further.

Ensuring we can control AI before rushing to expand it seems like common sense; however, as history shows, greed often takes control of people when they stumble upon an opportunity as lucrative as AI could be. To simply hope that every AI specialist across the world will slow down for the sake of safety is not something to bet on.


So, next time ChatGPT is doing your math homework or writing your 10-page essay for your class, keep in mind the consequences of this intelligence as well as its benefits. AI could be the answer to humanity's most complex problems or the downfall of all our previous accomplishments.

Gianluca Sacco is a sophomore at UW-Madison studying Economics and Political Science. Do you agree that Artificial Intelligence requires increased vigilance? Send all comments to opinion@dailycardinal.com


Read the original:
How excited should we be for the future of artificial intelligence? - Daily Cardinal

Read More..

9 examples of artificial intelligence in finance – Cointelegraph

Artificial Intelligence (AI) is transforming the financial sector, revolutionizing how banks, financial institutions and investors operate. Here are nine examples of AI in finance, and how they are changing the industry:

AI algorithms can analyze transactions in real time, detect anomalies and patterns that may indicate fraudulent activities, and alert banks to take appropriate actions. An example of fraud detection using AI is PayPal's fraud detection system. PayPal uses machine learning algorithms and rule-based systems to monitor transactions in real time and identify potentially fraudulent activities.

The system examines data points like the user's location, transaction history and device information to identify abnormalities and patterns that may hint at fraudulent behavior. The technology can notify PayPal's fraud investigation team about a possibly fraudulent transaction so that they can look into it further or block the transaction. The number of fraudulent transactions on the network has dramatically decreased thanks to this AI-powered solution, making PayPal safer and more secure to use.
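
As a hedged illustration of the general technique (anomaly detection on transaction features, not PayPal's actual system), the sketch below trains an Isolation Forest on routine transactions and flags outliers for review; the features, figures and contamination rate are assumptions for illustration.

```python
# A minimal anomaly-detection sketch for fraud screening; the transaction
# features (amount, hour of day, distance from home) and data are hypothetical.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
normal_txns = np.column_stack([rng.normal(60, 20, 500),   # amount, USD
                               rng.normal(14, 4, 500),    # hour of day
                               rng.normal(5, 3, 500)])    # km from home
model = IsolationForest(contamination=0.01, random_state=0).fit(normal_txns)

new_txns = np.array([[55.0, 13.0, 4.0],       # looks routine
                     [4800.0, 3.0, 900.0]])   # large, late-night, far from home
for txn, flag in zip(new_txns, model.predict(new_txns)):
    print(txn, "flag for review" if flag == -1 else "approve")
```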

AI-powered chatbots can provide personalized financial advice, answer customer queries, and automate routine tasks like opening new accounts or updating customer information.

The chatbot KAI from Mastercard, which helps clients with account queries, transaction histories and expenditure tracking, is an example of how AI is being used in customer support. KAI uses machine learning algorithms and natural language processing to offer consumers tailored help and financial insights across a variety of channels, including SMS, WhatsApp, and Messenger.

AI can analyze past and present market trends, spot patterns, and forecast future prices. AI algorithms can also execute transactions in real time, using pre-programmed rules and conditions, optimizing investing strategies and seeking to maximize returns.

Financial institutions and investors benefit significantly from this technology, which enables them to make data-driven decisions and maintain an advantage in the fiercely competitive world of trading.
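
As a hedged illustration of "pre-programmed rules and conditions", the sketch below generates a simple moving-average crossover signal on a synthetic price series; it is not any production trading system, and the window lengths and data are assumptions for illustration.

```python
# A minimal rule-based trading-signal sketch (moving-average crossover)
# on a synthetic daily price series; windows and data are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
prices = pd.Series(100 + np.cumsum(rng.normal(0, 1, 250)))  # synthetic closes

fast = prices.rolling(10).mean()
slow = prices.rolling(50).mean()
signal = np.where(fast > slow, "hold long", "stay in cash")

print(pd.DataFrame({"close": prices.round(2), "signal": signal}).tail())
```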

Related: What are artificial intelligence (AI) crypto coins, and how do they work?

By analyzing complex financial data, artificial intelligence can identify potential risks and forecast future scenarios, providing valuable insights that enable banks and other financial institutions to make well-informed decisions.

An example of risk management using AI is BlackRock's Aladdin platform. To analyze enormous volumes of financial data, spot risks and opportunities, and give investment managers real-time insights, the Aladdin platform combines AI and machine learning algorithms.

By examining elements like market volatility, credit risk and liquidity risk, the platform assists investment managers in monitoring and managing risks. Investment managers may enhance their investment strategies and make data-driven decisions thanks to Aladdin's risk management capabilities, which lower the risk of losses and boost returns.

AI can analyze vast amounts of financial data and provide insights into investment trends, risks and opportunities, helping investors make informed decisions. An example of portfolio management using AI is Wealthfront, a robo-advisor that uses AI algorithms to manage investment portfolios for clients.

To create customized investment portfolios for clients based on their goals, risk tolerance and financial position, Wealthfront combines classic portfolio theory and AI. As market conditions and the client's goals change, the platform automatically rebalances the portfolio while continuously monitoring its performance. Many investors find Wealthfront an appealing alternative because of its AI-powered portfolio management, which enables customized and optimized investing plans.
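
As a hedged illustration of the rebalancing idea (a simple threshold rule, not Wealthfront's actual methodology), the sketch below compares current portfolio weights against targets and proposes trades when drift exceeds a band; the weights, holdings and 5-point band are assumptions for illustration.

```python
# A minimal threshold-rebalancing sketch; target weights, holdings and the
# drift band are hypothetical.
target = {"stocks": 0.70, "bonds": 0.25, "cash": 0.05}
holdings_usd = {"stocks": 85_000, "bonds": 20_000, "cash": 5_000}
band = 0.05  # rebalance when an asset drifts more than 5 points from target

total = sum(holdings_usd.values())
for asset, value in holdings_usd.items():
    weight = value / total
    drift = weight - target[asset]
    if abs(drift) > band:
        trade = -drift * total                      # negative drift -> buy more
        action = "buy" if trade > 0 else "sell"
        print(f"{asset}: weight {weight:.2%}, {action} {abs(trade):,.0f} USD")
    else:
        print(f"{asset}: weight {weight:.2%}, within band")
```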

AI algorithms can analyze credit histories, financial statements and other data to provide accurate credit scores, enabling lenders to make better lending decisions. For instance, ZestFinance's Zest Automated Machine Learning (ZAML) platform uses AI to analyze credit risk factors and provide more accurate credit scores, improving lending decisions and reducing the risk of default.
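
As a hedged illustration of the general approach (a basic logistic-regression scorer, not ZestFinance's ZAML platform), the sketch below fits a default-probability model to synthetic borrower features; the features, labels and the example applicant are assumptions for illustration.

```python
# A minimal credit-scoring sketch on synthetic data; features are hypothetical
# (debt-to-income ratio, credit utilisation, years of credit history).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
X = np.column_stack([rng.uniform(0.05, 0.6, 1000),   # debt-to-income
                     rng.uniform(0.0, 1.0, 1000),    # utilisation
                     rng.uniform(0, 25, 1000)])      # years of history
# Synthetic default labels loosely driven by the first two features.
y = (0.5 * X[:, 0] + 0.4 * X[:, 1] + rng.normal(0, 0.1, 1000) > 0.5).astype(int)

model = LogisticRegression().fit(X, y)
applicant = np.array([[0.35, 0.80, 3.0]])
print(f"estimated probability of default: {model.predict_proba(applicant)[0, 1]:.2f}")
```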

AI-powered robo-advisors can provide personalized financial advice and investment strategies based on a client's financial situation, goals and risk tolerance. For instance, Bank of America's AI chatbot, Erica, can provide personalized financial advice, answer customer queries and automate routine tasks.

AI can analyze a range of data points, including demographic information, health records and driving history, to provide accurate insurance underwriting. For instance, to improve accuracy and lower fraud in the insurance market, Lemonade, an AI-powered insurtech company, employs AI algorithms to evaluate claims and underwrite insurance policies.

Related: A brief history of artificial intelligence

AI can help financial institutions comply with complex regulations by analyzing transactions, detecting fraud, and ensuring compliance with Know Your Customer (KYC) and Anti-Money Laundering (AML) regulations.

For instance, ComplyAdvantage helps businesses comply with legal obligations and avoid fines by using AI and machine learning algorithms to monitor financial transactions and identify potential money laundering activities.

See more here:
9 examples of artificial intelligence in finance - Cointelegraph

Read More..