
Chinese AI Learns To Beat Top Fighter Pilot In Simulated Combat – Forbes

A Chinese AI system has defeated a top human pilot in a simulated dogfight, according to Chinese media. The AI was pitted against Fang Guoyu, a Group Leader in a PLA aviation brigade and a previous champion in such contests.

"At first, it was not difficult to win against the AI," said Fang in a report in Global Times, a Chinese state newspaper. But as the exercise continued the AI learned from each encounter and steadily improved. By the end it was able to defeat Fang using tactics it had learned from him, coupled with inhuman speed and precision.


"The AI has shown adept flight control skills and errorless tactical decisions," said brigade commander Du Jianfeng.

The Chinese exercise of setting human pilots against AI aims to improve both. The AI gives the pilots a new and challenging opponent which thinks out of the box and can come up with unexpected tactics, while each dogfight adds to the AI's experience and helps it improve.

The AI was developed by a number of unspecified research institutes working with the aviation brigade, according to the report.

In the culmination of DARPA's AlphaDogfight exercise, the Falco AI decisively beat a skilled human pilot in simulated combat between F-16s.

The event echoes DARPA's AlphaDogfight competition last year, which featured human and AI pilots fighting it out in simulated F-16s. In the initial rounds, different AIs competed to find the best. In the final round, the winning AI, Falco from Heron Systems, took on the human champion, an unnamed U.S. Air Force pilot. The AI triumphed, scoring a perfect 5-0 win in a series of encounters.

AIs have significant advantages in this situation. One is that they are fearless and highly aggressive compared to human pilots; another term might be reckless. They can react faster than any human and can track multiple aircraft in all directions, identifying the greatest threats and the best targets in a rapidly changing situation. They also have faster and more precise control: Falco was notably skilled at taking aim and unleashing a stream of simulated cannon fire at opponents who were still lining up their shot. Whether these advantages would carry over into a messy real-world environment is open to question; further planned exercises by DARPA, the USAF and others may help settle the matter.

DARPA's ACE program, of which AlphaDogfight was part, plans to port dogfighting algorithms onto small drones and test various scenarios of one-on-one, one-versus-two, and two-versus-two encounters in the next year. At the same time, it is also preparing for combat autonomy on a full-scale aircraft. This may utilize existing "dumb" QF-16 target aircraft, the drone versions of F-16s used for air-to-air combat practice.

The QF-16, an unmanned version of the F-16 used as an aerial target, could be upgraded to a dogfighter with smart software.

The contest for AI supremacy between the U.S. and China is attracting increasing attention, with the National Security Commission on AI (NSCAI) concluding in March that, for the first time since World War II, America's technological predominance is under threat. China has created hundreds of new AI professorships and developed an efficient ecosystem for AI start-ups, with tax breaks and lucrative government contracts on offer.


AI fighter pilots are just a tiny piece in the military balance, and not a meaningful indicator on their own. However, the fact that China chooses to publicize the latest development sends a message that it is hard on America's heels, if not drawing ahead, in direct military applications of AI. If its AI can really learn skills that rapidly from contests with human pilots, then, like DeepMind's AlphaGo, it may now be competing with versions of itself and developing tactics and levels of skill impossible for humans.

Meanwhile, in the larger evolutionary contest between humans and AIs, the machines have just taken another tiny step forward in chipping away at our superiority. The new Top Gun movie out later this year may be nostalgic in more ways than one.

Continue reading here:
Chinese AI Learns To Beat Top Fighter Pilot In Simulated Combat - Forbes

Read More..

Different Types of Robot Programming Languages – Analytics Insight

Robots are among the most effective applications of modern science: they not only reduce human labor but also carry out repetitive activities with far fewer errors. Many businesses are expressing an interest in robotics, and automated machines have gained popularity in recent years. With that in mind, let's discuss robot programming languages.

For robots to perform tasks, they must be programmed. Robot programming is the process through which robots receive instructions from computers, and a robot programmer must be fluent in several programming languages. So let's get started.

There are reportedly around 1,500 programming languages worldwide that have played some role in programming robots. In this section, we will go through the top languages in use today.

The easiest way to get started with robotics is to learn C and C++. Both are general-purpose programming languages with largely overlapping features; C++ extends C with additional capabilities such as classes. That helps explain why C++ is the most popular robot programming language: it enables a low-level hardware interface and delivers real-time performance.

C++ is a mature choice for getting the best results from a robot. In frameworks such as WPILib, a C++ robot program is structured around three methods: the constructor, Autonomous, and OperatorControl. The constructor runs initialization code when the robot class is created, at the start of the program.

It initializes sensors and creates other WPILib objects. The Autonomous method then runs without driver input for a set amount of time, after which the robot moves on to the teleoperated phase, where the OperatorControl method takes over.
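The three-phase flow described above can be sketched in a few lines. The snippet below is plain Python rather than real WPILib C++, and the DemoRobot class and run_match driver are hypothetical names used only to illustrate the constructor/Autonomous/OperatorControl sequence:

```python
class DemoRobot:
    """Hypothetical class mirroring the three-phase robot structure."""

    def __init__(self):
        # Constructor phase: runs once at program start to initialize
        # sensors and other objects (WPILib creates its objects here).
        self.log = ["init"]

    def autonomous(self):
        # Autonomous phase: runs without driver input for a set period.
        self.log.append("autonomous")

    def operator_control(self):
        # Teleoperated phase: the OperatorControl method takes over.
        self.log.append("teleop")


def run_match(robot):
    """Drive the phases in the order a match would run them."""
    robot.autonomous()
    robot.operator_control()
    return robot.log


print(run_match(DemoRobot()))  # ['init', 'autonomous', 'teleop']
```

A real WPILib program would inherit from a framework base class and let the runtime call these methods at the right times; the driver function here only stands in for that scheduling.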

Python is a powerful programming language for creating and testing robots. For automation and post-process (offline) robot programming, it outperforms many other platforms. You can use it to build a script that computes, records, and activates a robot program.

Nothing has to be taught by hand on the robot itself, which enables rapid testing and visualization of simulations, programs, and logic. Python typically requires fewer lines of code than other languages and includes a large number of libraries for fundamental functions. Python's primary goal is to make programming easier and faster.

Any object can be created, modified, or deleted, and the robot's motions can be coded in the same script, all with very little code. This is why Python ranks among the finest robot programming languages.
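As a minimal sketch of computing, recording, and scripting motions in one place, the interpolate_move helper below is hypothetical (it is not any robot vendor's API); it simply computes evenly spaced joint-angle waypoints that a controller could then play back:

```python
def interpolate_move(start, goal, steps):
    """Compute evenly spaced waypoints from start to goal, per joint."""
    path = []
    for i in range(1, steps + 1):
        t = i / steps  # fraction of the move completed at this step
        path.append(tuple(round(s + (g - s) * t, 3)
                          for s, g in zip(start, goal)))
    return path


# Move a two-joint arm from (0, 0) to (90, 45) degrees in 3 steps.
waypoints = interpolate_move((0.0, 0.0), (90.0, 45.0), steps=3)
print(waypoints)  # [(30.0, 15.0), (60.0, 30.0), (90.0, 45.0)]
```

In a real setup, each waypoint would be sent to the robot's motion controller; here the whole move is computed and recorded in a handful of lines of script.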

Java is a programming language that enables robots to perform activities similar to those performed by humans, and it provides a variety of APIs to meet the demands of robotics. It also has strong support for artificial intelligence programming.

It enables you to construct high-level algorithms for search and neural networks, and the same code can run unchanged on many machines.

Java is not compiled directly into machine code; instead, the Java Virtual Machine interprets its bytecode at execution time. That portability has made Java quite popular in robotics and, for many developers, preferable to alternative robot programming languages. Large AI systems such as IBM Watson have made heavy use of Java.

Microsoft's .NET platform is used to create apps with Visual Studio, and it provides a good basis for anyone interested in pursuing a career in robotics. Programmers primarily use .NET for port and socket development.

It supports various languages while allowing for horizontal scaling. It also offers a uniform environment and makes skills learned in C++ or Java easier to apply. All of the tools and IDEs have been thoroughly tested and are accessible on the Microsoft Developer Network.

In addition, interoperability between languages is smooth. As a result, we can confidently rank .NET among the best robot programming languages.

MATLAB and its open-source cousins, such as Octave, are extremely popular in robotics engineering. For data analysis, it is considerably ahead of many other robot programming languages. MATLAB is not really a general-purpose programming language in the traditional sense, yet it excels at engineering solutions based on complex mathematics.

Robot developers use MATLAB to create sophisticated graphs from their data, which is quite helpful in developing a complete robotic system. It is a tool that lets you simulate your methods and observe the outcome, and engineers can use those simulations to fine-tune the system design and eliminate mistakes.
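That simulate-then-tune loop is language-agnostic; here it is reduced to a plain-Python toy (not MATLAB code), where a hypothetical simulate function models a proportional controller so a gain can be compared in simulation before touching hardware:

```python
def simulate(gain, target=1.0, steps=50):
    """Simulate a first-order system driven by a proportional controller
    and return the final tracking error."""
    position = 0.0
    for _ in range(steps):
        position += gain * (target - position)  # one control update
    return abs(target - position)


# Comparing gains in simulation shows which converges faster,
# without risking mistakes on the physical system.
print(simulate(gain=0.1) > simulate(gain=0.5))  # True
```

Each candidate gain is scored by its final error, which is exactly the kind of fine-tuning the passage describes doing before committing to a system design.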

There have been cases where MATLAB has been used to build a complete robot, so it deserves a place among the top languages. The KUKA KR6 is one of the best-known examples: its developers used MATLAB to design and simulate the robot.

Lisp was one of the first robot programming languages, introduced to let computer programs use mathematical notation. Rooted in AI research, it is still used in parts of robot operating systems.

Tree data structures, automatic storage management, syntax highlighting, and higher-order functions are among its features. As a result, it is simple to use and helps eliminate implementation mistakes once an issue has been identified.

This problem-solving takes place at the prototype stage rather than in production. Lisp also includes capabilities such as the read-eval-print loop (REPL) and self-hosting compilation.

Pascal was one of the earliest programming languages to hit the market, and it is still quite useful, especially for newcomers. Derived from the ALGOL tradition, it teaches excellent programming habits, and manufacturers have used it as the basis for their own robot programming languages.

ABB's RAPID and KUKA's KRL are two examples. Nevertheless, most developers consider Pascal obsolete for everyday use, even while highlighting its value for newcomers.

It will help you learn other robot programming languages more quickly, but it is only recommended for complete novices. Once you've gained some robotics programming experience, you can transition to another language.

And that's a wrap. We hope you found this article on robot programming languages helpful. We've covered the pros and cons of the top options so you can choose the language most appropriate for your needs. Robotics has a promising future, so now is an ideal moment to get started.

See the article here:
Different Types of Robot Programming Languages - Analytics Insight

Read More..

Engineering Team Trains for Success at National I-Corps – University of Arkansas Newswire

Photo by Russell Cothren

Min Zou, professor of mechanical engineering.

U of A researchers and the discoveries they make can change our world for the better. But taking inventions from the laboratory to the marketplace, where industries and government can deliver new technology to consumers, is a daunting challenge. That's where industry mentorship, university support and the National Science Foundation Innovation Corps come in. U of A I-Corps teams participating in the regional and national programs are supported by the Division of Research and Innovation and the Division of Economic Development.

Min Zou, professor of engineering, and graduate student Sujan Ghosh formed a partnership to explore the opportunity to commercialize Zou's patent-pending innovation in solid lubricant coating. Their team, CoatingEngineering, was connected with an industry mentor in Ty Keller, research and development director for Hytrol Conveyor Company in Jonesboro, Arkansas.

Ghosh and Zou saw the potential in solid lubricant coating to improve material handling and reduce costs for businesses and were ready to take the next step. Their team was accepted into the National Science Foundation Innovation Corps (NSF I-Corps) Site at the U of A. The NSF I-Corps program strives to "reduce the time and risk associated with translating promising ideas and technologies from the laboratory to the marketplace."

Working with Ed Pohl, professor and department head in industrial engineering; Cynthia Sides, director of innovation and industry partnerships; Bob Beitle, professor in chemical engineering; and Sarah Goforth, executive director of the Office of Entrepreneurship and Innovation, CoatingEngineering refined its business focus and began investigating opportunities in the marketplace. Pohl stated that "our NSF I-Corps site is designed to assist early stage teams with their customer discovery process, and we are extremely excited to see this team continue on to the National I-Corps program, which will help them continue to mature the commercialization of their technology."

Keller, a 2005 U of A graduate in mechanical engineering, has engaged the team in discussions with industry, offered advice and supported the team's grants and proposals. "Not only is this exciting to me from a discovery standpoint, Hytrol sees the business potential in this technology. It could be a win-win for them by shrinking the distance from lab to market."

The team's potential for commercial implementation and success qualified them for the prestigious NSF National I-Corps program, hosted by the I-Corps South Node, along with teams from institutions such as MIT, UC Berkeley and University of Texas at Austin. During the intensive, seven-week program, Ghosh and Zou completed about 100 interviews with industry representatives. What they learned is shaping the future of CoatingEngineering.

"We received feedback that we weren't expecting at first," Ghosh said. "Businesses were less concerned about cost and more about what the benefits are to society at large."

Ghosh and Zou heard from many companies that they were looking for technology that is green and sustainable, reducing both noise and energy consumption.

"This has been so valuable because meeting with industry and customer segments helps explain where the challenges and opportunities are," Zou said. "The earlier we can learn from different markets, the better we can tailor our technology. Professors and researchers read journals for the latest developments in our field, but industry engagement helps us meet the challenges of the marketplace."

Along with the success of CoatingEngineering, the university's efforts to promote the commercialization of innovation are yielding very promising results, work that Chancellor Steinmetz highlighted as essential to the future success of the university in his "2020: Focus on the Future" white papers.

Weston Waldo, venture development program manager of U of A Technology Ventures, works closely with inventors like Zou, connecting Technology Ventures with university spinoff companies and entrepreneurial teams considering leveraging university innovations, and empowering them to take advantage of the many resources available in Northwest Arkansas and beyond. "CoatingEngineering is the 'dream team' because they have a faculty member in a STEM field, a graduate student/research expert learning the ropes of entrepreneurship and a knowledgeable industry mentor who is a senior leader with domain expertise matched to the innovation," Waldo said.

In its annual report to Congress on the National I-Corps program, NSF reported that 50 percent of National I-Corps team participants through summer 2020 have gone on to form a company, and these companies have raised $760 million.

For those who are interested in learning more about I-Corps, please contact one of the following people:

Follow this link:

Engineering Team Trains for Success at National I-Corps - University of Arkansas Newswire

Read More..

Building a resilient cloud-based business – The Manila Times

In today's uncertain landscape, many entrepreneurs and managers think that to future-proof their businesses, they need to rush into cloud migration because the rest of the market is already doing it. Although it is important to keep the information technology (IT) infrastructure as modern as possible, the large investment involved can have significant implications for other crucial parts of the organization. As such, before committing fully to cloud migration, businesses should have a roadmap that shows how they will benefit from the move.

Michael Bathon, vice president of cloud services at Rimini Street, discusses with The Manila Times the pertinent issues businesses should focus on in developing a cloud-based roadmap tailor-fit to address an organization's unique and specific needs.

The Manila Times (TMT): The pandemic has pushed migration to the cloud across industries around the world. We're interested to know that while it's a very positive development, does it follow that most businesses must upgrade to best-of-breed systems as soon as possible?

Michael Bathon (MBa): I have to say "everybody else is doing it" isn't a sufficient reason for any business to make the jump entirely to the public cloud. At the same time, as your infrastructure requirements and costs continue to grow over time, you will have fewer resources that can be directed toward innovative projects.

If your competition is enhancing its security and disaster recovery posture while reducing digital infrastructure costs per unit, you risk setting yourself up for failure down the road. Overhauling your IT infrastructure is daunting to be sure; anytime you make a large investment in an area that touches just about every corner of the business, the overall implications can be high.

But that is the key point - not every business needs a complete overhaul. Cloud migration should come only after you build a business roadmap that shows how your organization will benefit from the move.

TMT: You're talking about a roadmap for cloud migration. That can only mean, first of all, priorities to ensure your journey to the cloud is secure and will bring the business to the desired destination. What would be your guideposts in developing such a roadmap?

MBa: One of the first and most obvious things cloud migration does for your business is free up your workforce. With fewer servers and/or data centers to run, your talent can focus on new and "cool" initiatives that help drive your core business forward. Instead of using resources to maintain systems, you enable teams to perform more valuable and rewarding work and help improve morale across the organization.

Making the move to a public cloud provider also allows businesses to secure talent. The reality is, if you're not looking at the cloud now, members of your IT team might leave, because really, nobody wants to work for an organization that is solely based on antiquated systems, with no sign of modernization on the horizon.

Finally, a cloud roadmap allows your IT team to shift their focus back to your business. How much time does your IT team spend running hardware? And is it really driving new business? What if you could turn that on its head so the team could focus on something new? It's certainly more attractive to both business leaders and those in the trenches to work on innovative projects rather than maintaining ageing systems.

TMT: We understand the "people first" concept in successfully running any enterprise. What else should a cloud migration roadmap entail, especially for those just starting out?

MBa: If you look at your current IT processes (testing, support, and anything else you can think of), migrating them as-is to the cloud probably won't work the way you envision, because the cloud is a different paradigm.

I strongly encourage teams that are new to the cloud to do a readiness assessment to determine which processes are (or are not) "cloud ready." Once your core data, whether HR data, customer data, or client supply chain data, is in the cloud, you can start consuming cloud-based tools that allow you to unlock the real value of your data.

An overlooked reality is that very few businesses, if any, have migrated to 100 percent cloud-based infrastructure. On the flip side, it also rarely makes sense to invest in the most current enterprise resource planning system on the market (and the associated support costs that come with it).

The cloud is inevitable, and what a cloud rollout looks like for your business will likely differ from that of other organizations. But at the end of the day, the key question is this: Would you rather spend time supporting your existing applications and implementing new backup software just to keep things in order, or do you want to build in a level of flexibility that will help you grow and, ultimately, future-proof your business?

Rimini Street is a leading third-party support provider for Oracle and SAP software products and a Salesforce partner.

Read the rest here:
Building a resilient cloud-based business - The Manila Times

Read More..

What does Apple's Xcode Cloud mean for the future of apps? Here's what devs say – Digital Trends

For consumers and outside observers, Apple's Worldwide Developers Conference (WWDC) is always a chance to see what lies in store when the next versions of its operating systems come to their devices. For developers, though, it is all about learning what Apple is doing under the hood. At this year's event, Apple revealed Xcode Cloud, a new feature of its Xcode development app that Apple believes will make life easier and simpler for app builders.

Folks at Apple told us they were incredibly excited for Xcode Cloud and disappointed that developers could not be on-site when it was announced at the company's online event, and a quick perusal of the Twittersphere brings up a wealth of devs giddy with expectation for the new feature.

But what exactly is Xcode Cloud, and why is Apple convinced it is such a big deal? To find out, we sat down with both engineers at Apple and the developers it's targeting to see how Xcode Cloud might impact their work, to hear out any apprehensions they might have, and to tease out what it could mean for the future of apps.

Let's start with the basics. To make apps for Apple platforms, developers use an Apple-created Mac app called Xcode. It's been around since 2003 and remains one of the most important pieces of software in Apple's catalog. Xcode Cloud is one of the biggest updates to Xcode in years, bringing new functionality that many developers previously had to leave Xcode for.

Apple positions Xcode Cloud as a tool that puts previously complex capabilities within reach of all developers. I asked Wiley Hodges, the Director for Tools and Technologies at Apple, what the company was hearing from developers that led to the creation of Xcode Cloud.

"We've seen that there are tasks like distributing the apps to beta testers, like managing feedback and crash reports, that are really critical to building great apps," Hodges said. "And we've seen that more and more of our developers have been interested in continuous integration and using this automated build and automated test process to constantly verify the quality of software while it's being built."

Those are exactly the problems Xcode Cloud is meant to address.

Xcode Cloud lets developers run multiple automated tests at once and uses continuous integration (CI) so app code can be quickly iterated and updated. It also simplifies the distribution of app builds to beta testers and lets devs catch up on feedback. It can build apps in the cloud rather than on a Mac to reduce load, and allows for the creation of advanced workflows that automatically start and stop depending on set conditions.

"We wanted to bring these tools and services within reach of all our developers, because right now it's been something that I think was more on the advanced level for developers to get this set up and running as part of their process," Hodges explained.

That sounds promising enough. But what do actual developers think?

Putting those tools front and center is something several developers told us was a key attraction of Xcode Cloud. Now that previously quite specialized capabilities have been integrated into the main tool they use to build apps, there is much less need to find third-party alternatives and add extra steps to their workflows.

Denys Telezhkin, a software engineer at ClearVPN, summed this feeling up in an interview with Digital Trends.

"I was very interested [in Xcode Cloud] as there have been a variety of problems with different CIs," he told me. "For example, Microsoft Azure is difficult to configure, GitHub Actions is expensive, and so on."

With everything integrated into Xcode Cloud, leaning on unreliable alternatives could become unnecessary. Of course, Apple will be happy to steer developers away from its rivals.

But the chief impetus, Hodges insists, was something different: "The motivation for Xcode Cloud came from our observation that while there was a group of devoted Xcode Server users, most developers still weren't implementing continuous integration. We started looking at the obstacles that prevented adoption and came to the conclusion that a cloud-hosted CI offering would be the best way to get broad adoption of CI as a practice, particularly with smaller developers for whom setting up and managing dedicated build servers was a bigger challenge."


For devs, it's about more than just CI, though. Scott Olechowski, Chief Product Officer and Co-Founder of Plex, got to try out a beta version of Xcode Cloud before Apple's WWDC announcement. He told me the potential benefits are wide-ranging.

"Seeing tools and services like Xcode Cloud integrated directly into the dev platform got us excited, since it should really help us be more efficient in our development, QA [quality assurance], and release efforts."

Part of that increased efficiency will likely come from Xcode Cloud's collaboration tools. Each team member can see project changes from their colleagues, and notifications can be sent when a code update is published. The timing is auspicious, given the way the ongoing pandemic has physically separated teams all over the globe. Yet it was also coincidental, said Hodges.

"The reality is we've been on this path for quite a while, literally years and years, and so I think the timing may be fortuitous in that regard. This is definitely a long-term project that was well underway before our unfortunate recent events."

If there is one thing Apple is great at, it's building an ecosystem of apps and products that all work together. Unsurprisingly, Xcode Cloud reflects that: it connects to TestFlight for beta testers, lets you run builds on multiple virtual Apple devices in parallel, plays nicely with App Store Connect, and more. For many developers, that integration could have a strongly positive impact on their work.

Vitalii Budnik, a software engineer at MacPaw's Setapp, told me having everything in one place will mean more time spent actually coding and less time juggling multiple tools and options. For Budnik's MacPaw colleague, Bohdan Mihiliev of Gemini Photos, the app distribution process will become faster and smoother than it currently is.

Apple sees Xcode Cloud as something that can improve life for developers large and small. Alison Tracey, a lead developer on Xcode Cloud at Apple, emphasized the way Xcode Cloud levels the playing field for smaller developers as well.

"With the range of options that exist to you in the configuration experience when you're setting up your workflows, you really can support the needs of a small developer or somebody that's a small development shop or somebody that's new to continuous integration, all the way up to more of the advanced power users."

This ranges from a simple four-step onboarding process to integrating Mac apps and tools like Slack and dashboards thanks to built-in APIs.

It's not all smooth sailing, though. Apple refused to divulge pricing details for Xcode Cloud at WWDC, saying more information would not be available until the fall. Many developers I spoke to were concerned about that to one degree or another, and it seems to be putting a slight damper on the excitement devs are feeling about Xcode Cloud's potential.

Questions have also been raised about Xcode Cloud's value to developer teams that create apps for both Apple and non-Apple platforms, since Xcode can only be run on the Mac. I put this to Alex Stevenson-Price, Engineering Manager at Plex, since Plex has apps for Mac, Windows, Linux, Android, iOS, and many other systems. He told me that Plex's various apps are built by different teams using different tools, so while Xcode Cloud is a great new string in the Apple team's bow, it will not be of much use to the non-Apple teams, which will not be using Xcode anyway.


Of course, it should not come as a surprise that Apple has limited interest in providing tools for rival ecosystems. If you want Xcode Cloud's benefits when building an Android app, you are out of luck, but Xcode has always been restricted (Apple might say focused) in that way. That could pose problems for developers who ship the same app on both iOS and Android or any number of other platforms.

Other developers told me they will have to wait and see whether Xcode Cloud's reputed benefits play out in reality. Its usefulness for solo developers was also questioned, partly because a number of its features are aimed at teams with multiple members.

For instance, Lukas Burgstaller, the developer behind apps like Fiery Feeds and Tidur, told me Xcode Cloud's utility depends on the setting.

"While I don't think I'm going to use it for my personal projects [as] I feel like continuous integration is moderately helpful at best for a solo developer setup, I will definitely start using it in my day job as an iOS team lead, where we were planning to set up some sort of CI for over a year but never got to it."

But even if he might not use every feature, Burgstaller still described Xcode Cloud as a "finally" announcement, saying he was extremely happy Apple is adding it to Xcode.

It is still early days for Xcode Cloud. Like many of the other updates and new features announced at WWDC 2021, from iOS 15 to macOS Monterey, it is currently only available to beta testers. Despite a few concerns, and bad memories of the spotty launch of another developer tool, Mac Catalyst, a few years ago, the benefits seem to far outweigh the drawbacks, at least according to the developers I spoke to.

In fact, none of those devs said Xcode Cloud was completely without merit, suggesting there will be something for most people who work to create apps for the Apple ecosystem. Provided Apple continues to improve it as developer needs change, and as long as its pricing is not egregiously expensive, Apple might be onto a winner with Xcode Cloud.

As always, the proof is in the pudding, and a lot will depend on the state Xcode Cloud finds itself in at launch. For many developers, though, its fall release can't come soon enough.

Read the original post:
What does Apple's Xcode Cloud mean for the future of apps? Here's what devs say - Digital Trends

Read More..

Travelling to the cloud – ITWeb

When lockdown began, all industries suffered, but some were hit harder than others. Much like travel and tourism, the transport industry in general, and cross-country passenger bus services in particular, struggled to survive. Unable to operate at all for a period, these businesses found their customers, and the need to keep them happy, more important than ever: they certainly wanted their clients to return in droves once the lockdown ended.

The forward-thinking ones have been investing in customer-facing applications as well as internal IT workloads, such as their core platform for information management, online ticketing, storage, analytics and other services.

Stone He, President, Huawei Cloud (Southern Africa), points out that it is no surprise to discover that these companies are determined to continuously improve their end-user experience and provide travellers with more real-time information.

A critical foundation for doing so is cloud computing, which is why an increasing number were already shifting to the cloud even before the pandemic struck. "During lockdown, cloud proved critical in making sure their business and production systems could continue to run without any impact," he says.

These organisations prioritised the development and consolidation of hybrid workloads, not only for the customer-facing systems, but also for their internal IT systems. Many are now searching for more agile, resilient and cost-effective cloud services to replace their current service providers or eliminate the need to host their own data centres.

"Another key advantage of the cloud is that it helps them overcome scalability challenges related to the processing and analysis of their data," he adds. For businesses that deal with multiple national routes, thousands of passengers and any number of destinations, the sheer volume of data that needs to be processed is enormous.

This is why these entities need a world-class cloud provider that offers local hosting and more cost-efficient solutions to support their businesses.

Some key challenges a move to a locally hosted, international cloud provider can help to solve, adds He, include: significant cost reductions; access to local support services and a dedicated local cloud team; and a reduction in operational complexity.

It goes without saying that any such move would require a seamless migration, since these businesses, like those in most other sectors, simply cannot afford to have their production workloads go down during a migration.

Asked what sort of benefits such businesses would glean from a move to a locally sited cloud data centre, He suggests it would enable access to real-time travel status updates and the opportunity to suggest alternative routes if things do go awry. Most vitally, it would allow head office to easily communicate with staff on the ground, who would have a full view of the situation. Furthermore, they can leverage the cloud to provide greater safety, thanks to real-time vehicle diagnostics, not to mention keeping track of passenger counts and, ultimately, being able to effectively deploy resources when responding to the needs of the business.

"The days of maintaining and replacing on-premises servers at Intercape are over and the decision to move our production servers to cloud was made in March 2020," says Karl Rosch, IT Manager at Intercape. He continued: "After investigating various platforms, the decision was made to move to Huawei Cloud, and the cost and reliability of Huawei Cloud made it a very attractive offering."

Rosch also found that the transition to Huawei Cloud was seamless, with local support from a Huawei engineer who assisted with the set-up and migration project. "After more than a year on the Huawei platform, the uptime has been excellent and support from the Huawei team has been outstanding," said Rosch.

The transnational transport sector is already being impacted by new disruptors like Uber. Although these are not direct competition yet, these companies still need to offer the flexibility that customers exposed to ride-sharing apps have come to expect. This means a move away from the rigid approaches to timetables and scheduling of the past. "The flexibility and scalability of the cloud will be a huge benefit with regard to how they manage their operations and approach their customers moving forward," explains He. "What's more, only the cloud can provide a genuine foundation to ensure easy adoption of future technology advances, particularly around machine learning and the Internet of Things. Thanks to the cloud, these digital technologies can be quickly deployed, enabling these organisations to not only keep their future innovations on track, but also any potential disruptors at bay," He concluded.

Read the rest here:
Travelling to the cloud - ITWeb

Intel: I'm already the biggest DPU shipper – Blocks and Files

Big beast Intel has come crashing out of the semiconductor jungle into the Data Processing Unit gang's watering hole, saying it's the biggest DPU shipper of all and that it got there first anyway.

Data Processing Units (DPUs) are programmable processing units dedicated to running data centre infrastructure-specific operations such as security, storage and networking, offloading them from existing servers so those servers can run more applications. DPUs are built from specific CPU chips, FPGAs and/or ASICs by suppliers such as Fungible, Nvidia, Pensando, and Nebulon. It's a fast-developing field, and device types include in-situ processors, SmartNICs, server offload cards, storage processors, composability hubs and components.

Navin Shenoy, Intel EVP and GM of its Data Platforms Group, said at the Six Five Summit that Intel has already developed and sold what it calls Infrastructure Processing Units (IPUs), which are what everyone else calls DPUs. Intel designed them to enable hyperscale customers to reduce server CPU overhead and free up cycles so applications run faster.

Guido Appenzeller, Data Platforms Group CTO at Intel, said in a statement: "The IPU is a new category of technologies and is one of the strategic pillars of our cloud strategy. It expands upon our SmartNIC capabilities and is designed to address the complexity and inefficiencies in the modern data centre."

An IPU, he said, enables customers to balance processing and storage, and Intel's system has dedicated functionality to accelerate applications built using a microservice-based architecture. That's because Intel says inter-microservice communications can take up from 22 to 80 per cent of a host server CPU's cycles.
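As a back-of-the-envelope illustration of that figure, the capacity an offload could return to applications is simple arithmetic. The 64-core host below is a hypothetical, not an Intel specification:

```python
# Toy estimate: if infrastructure tasks (here, inter-microservice
# communication) consume a given fraction of a host CPU, offloading them
# to an IPU/DPU frees that fraction for application work.

def freed_capacity(total_cores: int, infra_fraction: float) -> float:
    """Cores' worth of capacity returned to applications after offload."""
    if not 0.0 <= infra_fraction <= 1.0:
        raise ValueError("infra_fraction must be between 0 and 1")
    return total_cores * infra_fraction

# Applying the article's 22-80 per cent range to a hypothetical 64-core host:
low, high = freed_capacity(64, 0.22), freed_capacity(64, 0.80)
print(f"{low:.1f} to {high:.1f} cores freed")  # 14.1 to 51.2 cores freed
```

The real saving depends on workload mix, of course, but the range explains why hyperscalers find the offload attractive.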

Intel's IPU can:

Patty Kummrow, Intel's VP in the Data Platforms Group and GM of the Ethernet Products Group, offered this thought: "As a result of Intel's collaboration with a majority of hyperscalers, Intel is already the volume leader in the IPU market with our Xeon-D, FPGA and Ethernet components. The first of Intel's FPGA-based IPU platforms are deployed at multiple cloud service providers and our first ASIC IPU is under test."

The Xeon-D is a system-on-chip (SoC) microserver Xeon CPU, not a full-scale server Xeon CPU. Fungible and Pensando have developed specific DPU processor designs instead of relying on FPGAs or ASICs.

There are no DPU benchmarks, so comparing performance between different suppliers will be difficult.

Intel says it will produce additional FPGA-based IPU platforms and dedicated ASICs in the future. They will be accompanied by a software foundation to help customers develop cloud orchestration software.

Read this article:
Intel: I'm already the biggest DPU shipper Blocks and Files - Blocks and Files

Using Zero Trust Security to Protect Applications and Databases – Server Watch

Applications and databases play vital roles for organizations hosting services and for consumers accessing data resources, and protecting them is a top priority for any data center.

Connected to an internet full of hackers, billions of devices, and malware, networks are vulnerable to an array of web-based threats. Not long ago, the priority for network security was securing the network perimeter, but forces like remote work and the widespread adoption of cloud and edge computing make defending the perimeter increasingly tricky.

It's no longer a question of whether malicious actors can gain access; it's whether they're able to move laterally within the network when they do. As zero trust has evolved from buzzword to product in the last decade, a consensus has emerged that a microsegmentation-based framework is the surest defense against the next generation of threats. To preserve server security, zero trust ensures intruders will never reach an organization's crown jewels.

Here we look at why zero trust is a significant boost to application and database security and how to adopt a zero trust architecture.

Downtime, machine failure, and cyberattacks can be devastating to organizations. When data is offline or unavailable, personnel and customers alike aren't pleased. Knowing this, administrators secure the network with a suite of software and security tools to keep the network running and data available. For the data center, power redundancy and backup and disaster recovery solutions are essential protections.

Another crucial example of a network tool is the traditional firewall, placed at the network edge to prevent intruders and malicious packets from gaining entry. Because the perimeter has long been the cybersecurity priority, security policies inside the network, and for traffic between network segments, changed little. But as network perimeters have transformed over the years, gaining access to a network gateway has never been easier.

A malicious actor with initial access can move laterally through the network, escalate privileges, and compromise sensitive data. Several attacks this year, including the SolarWinds Orion breach, showed how skilled advanced persistent threats (APTs) can mask their activity while spreading malware across network systems.

In reversing the paradigm of designing devices to inherently trust other devices (allow all), zero trust calls for granular controls between network segments and, eventually, a day where only pre-categorized traffic is permissible (deny all). Because organizations from SMBs up to large enterprises require extensive data and application sharing capabilities, the network architect's objective isn't to disrupt business-critical access; instead, it is to ensure abnormal traffic gets identified and managed.

By following the steps provided, network stakeholders can ensure that the organizations most important assets are secure, maximize visibility into network traffic, and adjust control policies to maintain regular business.

Today's network perimeter is rarely still. From the rise of remote work to the boom in endpoint devices in use, fully protecting an organization's attack surface is no longer possible.

Network administrators need to take a bird's eye view of their network and define where the most critical data and resources reside. Dubbed the "protect surface," every organization has network segments vital to business continuity that likely deserve more substantial security than other segments. Applications with client data, operational technology (OT) that controls industrial processes, and Active Directory come to mind.

With protect surfaces identified, the process of defining users and privileges begins. Who is accessing what resources? Does a user with initial access have access to the whole network segment or just a fraction of the data resources within an application?

Applications and databases are responsible for storing and transmitting critical data across global networks. When resources move from defined protect surfaces, the flow, destination, device, time, location, user and role are all data points administrators need to inform next steps.
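As a sketch of what capturing those data points might look like, here is a minimal flow record in Python; the field names and values are illustrative, not taken from any particular monitoring product:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class FlowRecord:
    """One observed data flow leaving a protect surface (illustrative fields)."""
    source: str       # protect surface the data left
    destination: str  # where it went
    device: str       # device that moved it
    user: str
    role: str
    location: str
    timestamp: datetime

flows: list[FlowRecord] = []

def record_flow(source, destination, device, user, role, location):
    """Append an observation so administrators can later analyze movement."""
    flows.append(FlowRecord(source, destination, device, user, role,
                            location, datetime.now(timezone.utc)))

# Hypothetical observation: an analyst's report pulls from a client database.
record_flow("crm-db", "reporting-app", "srv-042", "jdoe", "analyst", "HQ")
print(len(flows))  # 1
```

Accumulating records like these is what makes the next step, analyzing traffic flows for vulnerabilities, possible.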

Analyzing how data moves produces a picture of how malicious actors could access your most important data and system controls. Equipped with valuable insight into traffic flows and vulnerabilities, administrators can start to test their findings.

At the heart of zero trust in practice is microsegmentation, the act of segmenting network components to ensure appropriate access levels for the relevant data resources.

The network fabric makes enforcing access between segments in your infrastructure seamless for data centers and software-defined data centers (SDDCs). By contrast, network fabrics aren't ideal for microsegmentation in cloud environments. Also fit for an SDDC environment, a virtual machine manager, also known as a hypervisor, can serve as an enforcement point for comprehensive network management.

And last but not least, next-generation firewalls (NGFW) are a popular choice for implementing microsegmentation because of their flexibility in deployment. Across environments, NGFWs can form a distributed internal layer of security throughout the network.

No matter the microsegmentation route, administrators now can establish granular policy rules based on their prior findings. Essential information for establishing valid policies includes clearly defining:

With the organization network mapped out, and all packets, users, privileges, and protect surfaces defined, it's time to configure policies to reflect an optimized security approach. Policies can be applied one application at a time, or en masse once the approach is found successful. Administrators can then test flipping the trust switch for the first time. From allowing all traffic to denying all traffic except what's prescribed, the network has taken a giant leap.
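The shift from allow-all to deny-all can be sketched in a few lines. This is a toy illustration of the policy model only (the rule fields are invented), since real enforcement lives in firewalls and hypervisors rather than application code:

```python
# Minimal default-deny ("zero trust") policy check. Anything not explicitly
# pre-categorized as allowed is denied, reversing the old allow-all default.

ALLOW_RULES = [
    # (user_role, source_segment, destination_segment) - illustrative values
    ("analyst", "office-lan", "reporting-app"),
    ("dba", "admin-vlan", "crm-db"),
]

def is_allowed(role: str, source: str, destination: str) -> bool:
    """Permit only pre-categorized traffic; everything else is denied."""
    return (role, source, destination) in ALLOW_RULES

print(is_allowed("analyst", "office-lan", "reporting-app"))  # True
print(is_allowed("analyst", "office-lan", "crm-db"))         # False: denied by default
```

Note that the analyst's lateral move toward the database is blocked even though the analyst has legitimate access elsewhere, which is exactly the containment microsegmentation is after.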

Flipping the trust switch comes with its share of hiccups. As key personnel and clients begin using the network in its zero trust configuration, the IT department is sure to see a rise in technical support requests. Every request for greater access informs network and database administrators on adjusting controls to reflect the living organization's security framework. Monitor these requests and continue to track how sensitive data moves to optimize changes to policies.

There are no one-size-fits-all zero trust solutions. While vendors offer support, insight, and experience in implementing zero trust, a zero trust framework is custom to the organization and network it serves. With that in mind, the implementation process described above isn't set in stone. Organizations with initiative can take steps today to start building a zero trust network architecture.

Zero trust covers the gamut of the OSI model to protect the organization's digital infrastructure. Implementing zero trust from the network to application layers, databases, and software programs gives stakeholders the visibility to feel confident about the organization's security posture.

While an intimidating endeavor, moving towards zero trust is a process worth initiating to organize and secure your organization's data resources for years to come.

While databases and applications have long been mainstream components of the enterprise network, security services for protecting them are still a complex marketplace. To learn more about the industry, check out eSecurity Planet's Top Database Security Solutions for 2021.

Read the original here:
Using Zero Trust Security to Protect Applications and Databases - Server Watch

How the FBI Is Trying to Break Encryption Without Actually Breaking Encryption – Gizmodo

Since at least the 1990s, federal officials have publicly worried that encrypted communications give aid to terrorists and criminals. More often than not they have, to some degree, been right.

In the early 2000s, Los Zetas, the infamous Mexican cartel, actually created its own military-grade encrypted radio network, which it used to mask the movements of its narco-trafficking supply chain. Around the same time, al Qaeda and other terrorist Mujahideen groups began using self-engineered encryption software in the hopes of avoiding the all-seeing eye of America's national security state. Other criminal groups quickly followed suit and, today, the need for dark capabilities has given rise to companies that intentionally court and sell exclusively to underworld clientele. These firms, which allegedly go to great lengths to protect their customers, appear to have a short life span, however: in the last few years, a number of prominent encryption platforms and other technologies have been infiltrated and dismantled by law enforcement, with the most recent example occurring just a week ago.

Last Tuesday, the U.S. Department of Justice announced Trojan Shield, a bold, over-the-top law enforcement operation. In it, the FBI used a high-level criminal informant to co-opt and then run an encrypted chat platform, called ANOM, designed specifically for transnational criminal organizations. Rather than infiltrate an existing platform, the feds had decided to create and operate their own. When drug traffickers and money launderers flocked to ANOM, the FBI and other authorities were waiting, ready to intercept and study all of the communications the crooks offered up. It was the honeypot to end all honeypots: a baited trap on a global scale.

Certainly, the short-term payoff from the operation has been overwhelming: all last week, governments throughout the world continued a parade of hundreds of arrests, with police holding press conferences and gleefully trotting out indictments related to the operation. Alleged biker gangs, Italian crime families, and drug traffickers throughout the world were all ensnared. In the U.S., the Justice Department indicted 17 people allegedly involved in managing ANOM (despite the FBI's secret role), arresting a majority of them. The operation has also revealed a deluge of intelligence about the ways in which international criminal syndicates operate, which will doubtless help inform future investigations targeting such groups.

And yet, one of the operation's long-term goals, as stated by police, seems elusive, if not quixotic. "We aim to shatter any confidence in the hardened encrypted device industry with our indictment and announcement that this platform was run by the FBI," said Acting U.S. Attorney Randy Grossman during a press conference last week. Similarly, Suzanne Turner, the special agent in charge of the FBI's San Diego Field Office, said that this should be considered a warning to criminals. "[Those] who believe they are operating under an encrypted cloak of secrecy, your communications are not secure," Turner said. She later added that the operation would hopefully keep criminals guessing as to whether a platform was a legitimate business or one secretly run by the feds.

Grossman and Turner's statements mark a turning point in a decades-long effort by the U.S. government to undermine encrypted communication, which has proliferated into the mainstream in recent years, from Signal to iMessage, WhatsApp to Google Messages. If the cops can't break encrypted technologies, they'll break our confidence in them instead, even if it means crossing the line themselves.

"Encrypted messaging apps are pretty much untouchable by law enforcement," said James A. Lewis, a security professional with the Center for Strategic and International Studies, in a phone call. Lewis has studied the issue for years.

"People used to speak by air-conditioners, or go for a walk in the park," he said, referencing Godfather-type scenarios in which criminals would sneak around to avoid wiretapping. Now, he said, everybody, including the mafia, has a smartphone in their pocket. Thus, the temptation to rely on such easy methods of communication is strong. "It's just a general shift to relying on messaging," he said. "Criminals have moved with the rest of the population."

The companies that preceded ANOM, many of which were infiltrated and dismantled by cops, worked hard to conceal their activities, which were done in the service of criminal ecosystems centered around drug dealing and murder, government officials have argued. For instance, Phantom Secure, a now-defunct phone company that offered modified, encrypted Blackberry and Android devices, reportedly sold a majority of its services to Mexican drug cartels, which used the devices to communicate with underlings and strategize narcotics shipments. Two other platforms that were recently taken down by police, Sky Global and EncroChat, allegedly functioned in very much the same way.

Similarly, the devices used by the kinds of groups ensnared in Trojan Shield are far different from your average civilian encrypted chat app like Signal or WhatsApp, both of which use end-to-end encryption, meaning only the sender and recipient have access to any conversations. Most often, they are modified phones that have had their GPS, mic, and camera capabilities disabled, and include a specialized encrypted chat app that functions on a closed loop with other devices specifically designed to communicate with each other. On top of this, the government claims companies that sell such devices will often offer covert protection to their customers, helping to remotely wipe the contents of phones if they are confiscated by police. With all of these benefits, criminals have little incentive to give up these types of services, because they are simply too useful to their operations.
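The idea behind end-to-end encryption can be sketched with a toy key exchange: the two endpoints derive a shared secret, and a relay server in the middle never learns it. The code below is purely illustrative (a raw Diffie-Hellman exchange over a deliberately small prime, with a hash-derived XOR keystream); real apps like Signal and WhatsApp use vetted protocols and authenticated ciphers, never anything like this:

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters: 2**127 - 1 is a Mersenne prime, but far
# too small for real use. Illustration only - not a secure construction.
P = 2**127 - 1
G = 5

def keypair():
    """Private exponent plus the public value that is safe to send via a relay."""
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both endpoints compute the same secret; the relay never can."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_crypt(key, data):
    """XOR with a hash-derived keystream; the same call encrypts and decrypts."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

a_priv, a_pub = keypair()   # Alice's phone
b_priv, b_pub = keypair()   # Bob's phone
ciphertext = xor_crypt(shared_key(a_priv, b_pub), b"meet at noon")
# The relay sees only a_pub, b_pub, and ciphertext; Bob can still decrypt:
print(xor_crypt(shared_key(b_priv, a_pub), ciphertext))  # b'meet at noon'
```

The point of Trojan Shield was that the FBI did not break this math; it owned the endpoints and the relay, so the math never protected anyone.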

"A lot of the encryption is un-hackable," Lewis said. "If you can get access to the device then your chances are better, but if you are just intercepting traffic, it can be exceptionally difficult, maybe even impossible, [to hack it]."

That unbridgeable impasse is partially why the FBI and other federal agencies have spent the last 30 years waging a slow-motion campaign against the use of encryption. During the first so-called Crypto Wars in the 1990s, national security politicos in the Clinton administration argued that the proliferation of encryption technologies worldwide would effectively create a force field around corruption. Ever since then, federal officials have, in one way or another, aggressively pursued a workaround for the technology, often employing strategies that threatened civil liberties and treated Americans' privacy as an afterthought.

This has gone through a number of different iterations. When the '90s lobbying to halt encryption's export didn't work, the feds quickly turned to a different strategy: lobbying the private sector to install backdoors in their encrypted networks so that the FBI could enjoy intimate access to Americans' protected communications. Beginning in the mid-2000s, the Justice Department and the FBI went on a charm offensive, trying to explain to Congress and the American people why they really needed to do this. That campaign has lasted for years, with ongoing lobbying by the FBI director continuing to the present moment.

With Trojan Shield, a whole new tactic seems to have emerged in the government's ongoing battle against encryption, one that is far more psychological than legal. Here, the bureau seems to be attempting to shake overall confidence in encrypted platforms, inspiring doubt over whether those communications are really secure or just a giant honeypot with an FBI agent lingering in the rearview. In so doing, they're basically trying to undermine a technology that serves as one of the few protections for everyday people's privacy in a world intentionally designed to eviscerate it.

Jennifer Lynch, the surveillance litigation director at the Electronic Frontier Foundation, said that the recent operation was concerning, adding that she doubted the FBI even had the legal authority in the U.S. to carry out Trojan Shield, which is probably why it partnered with more than 100 countries, according to the DOJ.

"We still don't know a lot about how this investigation occurred and how all of the data-sharing transpired among the different countries that were involved," Lynch said in a phone interview. "What we do know, however, is concerning enough. The FBI said that they geo-fenced communications of Americans. That says to me that even the FBI doesn't believe they have legal authority under the Fourth Amendment or our federal wiretapping act to do what they did."

Expanding on that point, Lynch noted the bureau's partnership with Australia, which recently passed the TOLA Act. The law allows the Australian government to compel private companies and technologists to re-engineer software and products so that they can be used to spy on users. Australia's laws also allow for extensive wiretapping powers, ones that far outstrip those available in the U.S., Lynch said.

"Basically, the FBI is laundering its surveillance through another country," she said.

Alternatively, Lewis argues that the challenges posed by encryption force law enforcement to get creative in how they combat the increasing use of the technology by criminal groups.

"You have to get a subpoena, you have to get the company to cooperate," said Lewis, explaining the current restrictions when police try to investigate malfeasance via encrypted chat platforms. "The company won't, in many cases, have access to the unencrypted data. That's where something like this becomes attractive [to criminals]."

Even with high-powered entities like the National Security Agency, the data they intercept won't necessarily be useful in traditional law enforcement investigations, he said. "The NSA is not in the law enforcement business," he said. "They're not collecting evidence. So even in the cases where they have intercepted traffic, it could not be used in court," said Lewis. "So you've got technology problems and legal problems."

If the operation has seeded doubt about the security of the platforms for criminal use, then it's done its job, he argues.

"It's certainly planted a seed of doubt in their minds," he said of the criminals. "Uncertainty really helps. It means they'll want to do more face-to-face meetings or something else other than talk on the phone, which may make them easier to catch," he said.

Of course, the FBI plants seeds of doubt by chucking handfuls of the stuff at everyone within earshot: it's not just criminals who will fear that someone's reading every text, it's all of us. And for Lynch, that's an injustice.

"I think that what the FBI did is highly suspect," she said, "and I think that we should all be concerned about it, because it makes us question the privacy and security of our communications."

Read more here:
How the FBI Is Trying to Break Encryption Without Actually Breaking Encryption - Gizmodo

Vergecast: Windows 11 leaks, RCS encryption, and this week in antitrust – The Verge

Every Friday, The Verge publishes our flagship podcast, The Vergecast, where co-hosts Nilay Patel and Dieter Bohn discuss the week in tech news with the reporters and editors covering the biggest stories.

In this episode, the show is split into three sections. First, Nilay and Dieter talk to Verge senior editor Tom Warren about this week in Microsoft: leaks of the Windows 11 UI, announcements from E3 2021, and Microsoft CEO Satya Nadella doubling as the company's chairman.

In section two of the show, Verge politics reporter Makena Kelly returns to explain the continuing push by the US government to enact antitrust legislation against tech monopolies: this week, five new bills were introduced and the Senate confirmed a new commissioner of the FTC.

In part three, Verge managing editor Alex Cranz joins in to chat about this week in gadgets and Google: the company is adding end-to-end encryption to its Messages app, Sonos officially announced its picture-frame speaker, and Tesla's Model S Plaid made its big debut.

You can listen to the full discussion here or in your preferred podcast player.

Here is the original post:
Vergecast: Windows 11 leaks, RCS encryption, and this week in antitrust - The Verge
