
Satoshi Trial (COPA v Wright): COPA forgery experts dismantled on the stand – CoinGeek

Through three weeks of COPA v Wright, one factor has hovered over the proceedings like a phantom: hundreds of documents submitted into evidence which the Crypto Open Patent Alliance says were forged by Dr. Craig Wright to support his claim to be Satoshi Nakamoto.

Jonathan Hough KC, barrister for COPA, spent almost the entire six days of Dr. Wright's time on the stand accusing Dr. Wright of having made the forgeries, one by one. Armed with expert reports from supposed forensic experts Patrick Madden, Arthur Rosendahl and others, Hough KC would put things to Dr. Wright like, "This document's metadata lists a last edit time of years and our expert says this means it's manipulated, isn't that right Dr. Wright?" or "This OpenOffice document's timestamp is from a time before the version used to create it was released. It's a forgery, isn't that right?"

Dr. Wright had an answer to practically everything. He'd say that he works with a complex computer environment which is slightly difficult to reproduce (but far from impossible), involving the use of virtual machines and shared environments such as Citrix which are known to produce the kind of document anomalies that have been seized upon by COPA. He's also particular about which release and version he uses for his software, so what COPA's experts called anomalies are a consequence of, for example, opening a document that was created in an old version of a program via a new one. Other anomalies, Dr. Wright would sometimes say, aren't anomalies at all, and any expert worth their paycheck would know that.

Yet backed with multiple expert reports, Hough KC and his clients must have felt confident that Dr. Wright's explanations were passing through Justice Mellor without having any impact. After all, experts are valued in court cases precisely because they are independent from the case, and Dr. Wright most certainly is not.

If that was COPA's hope, it fell apart almost as soon as their first expert took the stand on Monday.

First up was Patrick Madden. Madden is COPA's primary document authenticity expert, having submitted no fewer than five expert reports in advance of the case. Within minutes, Craig Orr KC (barrister for Dr. Wright) had elicited an admission that shook many of the assumptions underpinning COPA's forgery allegations.

Keeping in mind that many of Madden's conclusions on document manipulation were reached because of anomalous timestamping, Orr KC asked Madden:

"MS Word documents contain an edit time metadata counter, yes? It's true that if a user is accessing MS Word through Citrix, the edit time counter will start to run when MS Word is opened on the remote server," posed Orr KC, and Madden agreed.

"And it will continue to run until MS Word is closed on the remote server."

Madden, as he often did during his testimony, quickly interjected, seemingly anticipating Dr. Wright's defences as Orr KC formulated his questions: "Or if a separate MS Word launches on the same remote session."

"But subject to that, the counter will continue to run until MS Word is closed. And that's because a user operating MS Word through Citrix is interacting with an instance of MS Word running on the remote server?"

"Yes," admitted Madden.

So, Orr KC surmised, all that would have to happen to have an edit time spanning a period of years or even decades is for a user to simply open a Citrix session on their local computer, open MS Word on the remote server, open a document in MS Word, disconnect their local computer without closing MS Word on the remote server and then at a later point in time reconnect to the remote server using their local computer. Madden said yes.
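For readers wondering what this "edit time counter" actually is: a modern .docx file is a ZIP archive of XML parts, and the cumulative editing time is recorded as a TotalTime element (in minutes) inside docProps/app.xml. The following is a minimal Python sketch of how anyone can inspect it; the file name is a placeholder, and older binary .doc files store the counter differently.

```python
import zipfile
import xml.etree.ElementTree as ET

# OOXML extended-properties namespace used by docProps/app.xml
APP_NS = "{http://schemas.openxmlformats.org/officeDocument/2006/extended-properties}"

def total_edit_minutes(path):
    """Return the cumulative edit time (in minutes) recorded in a .docx file."""
    with zipfile.ZipFile(path) as docx:              # a .docx is a ZIP of XML parts
        root = ET.fromstring(docx.read("docProps/app.xml"))
    node = root.find(APP_NS + "TotalTime")           # editing time, in minutes
    return int(node.text) if node is not None and node.text else None

minutes = total_edit_minutes("example.docx")         # placeholder file name
if minutes is not None:
    print(f"Recorded edit time: {minutes} minutes (~{minutes / (60 * 24):.1f} days)")
```

An implausibly large value here is exactly the kind of anomaly the exchange above was about: it can indicate manipulation, or simply a remote session left open.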

Assuming this was news to Justice Mellor (an assumption we shouldn't make, given his technical background), he must have been left wondering why so much time was spent goading Dr. Wright over timestamp anomalies caused by Citrix when Madden's response affirmed what Dr. Wright had told the court days earlier.

When it came time for Orr KC to challenge Madden's actual conclusions, things really started to fall apart.

One of those conclusions was reached by Madden in respect of an OpenOffice document submitted by Dr. Wright as part of his case. Madden found that the timestamps, which dated it to March 2008, could only be explained by Dr. Wright backdating the clock on the computer on which the document was created, because the OpenOffice version used was not released until well after 2008.

"Are you aware that Dr. Wright explained in his evidence that he created this document using LaTeX, and deliberately set the metadata to use OpenOffice 2.4 to obscure the version number by making it appear as though he wrote the document using OpenOffice?"

At the mention of LaTeX, an expert in Madden's position should have immediately tapped out. He isn't a LaTeX expert and, as he would go on to admit, isn't at all familiar with LaTeX. Therefore, the only good-faith move open to him as an expert before the court was to acknowledge he cannot offer an opinion on Dr. Wright's explanation. Yet, for some reason (the reason would become clear by the end of Orr KC's cross-examination), Madden wouldn't cede the point.

"I'm not a LaTeX expert, but from being familiar with other not dissimilar programs and the concept of document conversion, looking through this has the structure and feel of a document created using OpenOffice."

Just how the court is supposed to assimilate Madden's opinion on the "feel" of the document, he did not say. The exchange went on:

"So you can't answer as to whether or not what Dr. Wright says is technically possible," said Orr KC.

"No, however, he would then have had to have been instructing the software to put in the level of detail about the build to name in the document which hadn't been released yet."

"That's a question of fact on which you are unable to express any opinion."

Madden, not getting the point: "It's a question of how he would have known, and then the fact is this information is recorded in the document."

"I do suggest that clock manipulation is not the only possible explanation for the metadata you have observed in this document."

Madden, still not getting the point: "I don't believe LaTeX was used for this because it is an OpenOffice file. This document is an ODT file in my opinion, created with OpenOffice."

"You are not a LaTeX expert. And again, as in other places, you are jumping to conclusions here in a desire to reach a conclusion you want to reach: that these documents are not authentic."

Orr KC's genial façade had steadily been giving way to a firmer tone throughout this line of questioning, and by that last comment his tone had made clear he'd heard all he needed to from Madden on that point.
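For context on why that generator string matters: an ODT file is also a ZIP archive, and the application, version and build that wrote it are recorded in a meta:generator element inside meta.xml, alongside the creation date. The following is a minimal Python sketch of how to read it; the file name and the sample output are illustrative, not taken from the trial evidence.

```python
import zipfile
import xml.etree.ElementTree as ET

# OpenDocument namespaces used by meta.xml
NS = {
    "office": "urn:oasis:names:tc:opendocument:xmlns:office:1.0",
    "meta": "urn:oasis:names:tc:opendocument:xmlns:meta:1.0",
}

def odt_metadata(path):
    """Return the generator string and creation date recorded in an ODT file."""
    with zipfile.ZipFile(path) as odt:               # an ODT is a ZIP of XML parts
        root = ET.fromstring(odt.read("meta.xml"))
    info = root.find("office:meta", NS)
    return {
        "generator": info.findtext("meta:generator", default="", namespaces=NS),
        "created": info.findtext("meta:creation-date", default="", namespaces=NS),
    }

print(odt_metadata("example.odt"))                   # placeholder file name
# Illustrative output (not from the evidence):
# {'generator': 'OpenOffice.org/2.4$Win32 OpenOffice.org_project/...',
#  'created': '2008-03-15T10:22:00'}
```

This is the "level of detail about the build" the exchange turned on: the field is written by the application, but, like any metadata, it can also be set deliberately by whatever tool produces the file.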

In Madden's defence, on cross-examination we are seeing only the parts of his expert analysis that Dr. Wright's barristers see fit to highlight. For some of these items, Madden may have phrased his analysis less forcefully than Hough KC made it sound when putting it to Dr. Wright on the stand. But that is precisely why declaring Dr. Wright's case a write-off after his testimony would have been foolish: it would be like judging an NFL player solely off his career highlight (or lowlight) reel. Without the context of the 99% of the story not captured in those highlights, it's impossible to draw a conclusion one way or the other.

That caveat aside, however, the strategy of Orr KC was clear: get Madden to confirm the blanket statements made by Dr. Wright to excuse the supposed manipulations (such as his Citrix environment) and drill down on enough of Madden's specific conclusions to demonstrate to the court that he is speaking beyond his expertise and clinging to the conclusion that Dr. Wright is a serial forger.

This culminated in a killing blow right as Madden was about to be home free. Orr KC brought Madden to the wording of his first expert report, which contains an acknowledgement that the volume of work "has been too much to do alone and I have been assisted by Bird & Bird." Orr KC asked why Madden wouldn't recruit an assistant rather than rely on the law firm for whom he is supposed to be serving as an independent expert. Madden answered that he didn't want to rely on the work an assistant did, preferring to keep his expert analysis wholly his own. So he relied on Bird & Bird.

"The majority of which was just finessing the language," Madden said without a hint of self-awareness.

"So when we see language in your reports such as 'Dr. Wright's position is speculative and unfounded', that's your language?" asked Orr KC, whose dramatic pauses were becoming longer and his expression of disbelief sterner.

"No, that's my language," Madden laughed nervously.

Orr KC must have known what he had in his hands at that point. He asked Madden if he kept an office at Bird & Bird (no, he just met with them three or four times in their office, said Madden) and how it's possible for Madden to be both dictating his analysis to Bird & Bird and reviewing it at the same time ("that would be sections where I'm demonstrating the findings and as I'm talking through it one of the people at Bird and Bird would have been writing up what I'm saying," he answered).

"They were drafting the report for you, weren't they?" said Orr KC.

"No, they were helping with the assembly of it, but the actual content is mine," came Madden's answer.

"Have you adopted a similar approach in any other case?"

"I produce my draft, I'll be asked to explain bits in more detail, and have had that happen here."

"I asked you: have you adopted a similar approach to that you've adopted with Bird and Bird in any other case?"

"Not quite the same, no," came the begrudging admission.

"You must be aware of the overriding importance of retaining your independence as an expert. The approach you have adopted has undermined that independence, hasn't it?"

Madden of course said no, but the damage had surely already been done.

Madden's testimony contrasted with that of another of COPA's expert witnesses: Arthur Rosendahl, who is put forward as an expert on LaTeX. Unlike Madden, Rosendahl readily accepted (and even pre-empted the point) that certain areas he could have commented on were beyond his area of expertise. There were also no questions from Orr KC about the process Rosendahl had used to prepare his report or his independence more generally, underscoring by contrast the complaints made about Madden.

Beyond that, however, Orr KC found that Rosendahl's opinion contained many of the same problems as those put to Madden.

Like Madden, Rosendahl had also failed to replicate Dr. Wright's computer environment. Dr. Wright made clear in his witness statements that he used a combination of Windows and Linux when using LaTeX, and that he had used the MiKTeX distribution for Windows and a TeXLive distribution for Linux. Despite this, Rosendahl performed his analysis using only TeXLive.

Rosendahl's answer to this was to say that it was not clear from Dr. Wright's explanation whether he was using one specific OS or not, or whether he was flitting between the two as he worked.

"It seemed to me like he was using either one or the other, and I used the environment I was most familiar with."

It's hard to imagine the court being satisfied with this answer. It was open to Rosendahl to come back with a request for more information from Dr. Wright so he could perform accurate tests, but he evidently chose to operate on an erroneous assumption instead. Such nuances in testing environment are likely to be significant in this case, because COPA's case is that the LaTeX code provided by Dr. Wright does not compile into a pixel-perfect representation of the Bitcoin White Paper like Dr. Wright says it should. As COPA tells it, the slightest difference in appearance between the compiled output of Dr. Wright's code and the original Bitcoin White Paper is evidence of forgery, so a failure to take into account the full extent of Dr. Wright's environment is inexcusable. This failing is especially dramatic given that Rosendahl knew his report would be deployed against an individual facing serious allegations of forgery.

Rosendahl was also, like Madden, forced to admit that Dr. Wright's explanations as to certain anomalies within his evidence were technically possible (albeit unlikely). In his report, Rosendahl highlighted features such as a lack of hyphenation over line breaks: LaTeX is set by default to allow words to break across lines, whereas the white paper PDF shows no such breaks. The lack of hyphenation, according to Rosendahl, means that the PDF cannot have been created in LaTeX.

Except within minutes of having Rosendahl on the stand, Orr KC got him to admit, just as he had done with Madden, that it is possible for a person to change the default LaTeX behaviour so that it does not use hyphenation. Orr KC elicited the same admission regarding Rosendahl's conclusion that because the White Paper uses different fonts for headers and body text, it can't have been created in LaTeX: it would be uncommon, conceded Rosendahl, but possible.
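For the technically curious, both concessions are easy to demonstrate. The sketch below is a minimal LaTeX document that suppresses hyphenation via the standard penalty settings and sets a heading in a different font family; it illustrates what Rosendahl conceded is possible, and makes no claim about what was actually in Dr. Wright's source.

```latex
% Minimal sketch: disable hyphenation and mix fonts, two behaviours
% Rosendahl conceded are possible in LaTeX. Package and values are
% common choices, not taken from the trial evidence.
\documentclass{article}
\usepackage{helvet}      % provides a sans-serif family for headings
\hyphenpenalty=10000     % a penalty of 10000 makes hyphenated breaks
\exhyphenpenalty=10000   % effectively forbidden, so no line-break hyphens
\begin{document}
\section{\sffamily A Sans-Serif Heading}
Body text set in the default serif font, with hyphenation across
line breaks effectively disabled by the penalty settings above.
\end{document}
```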

So, after two days of hearing from the experts whose reports COPA so scandalously used to attack Dr. Wright while on the stand, where is COPA's case left? Nowhere. In relation to both experts COPA had put up for cross-examination, Dr. Wright's barrister Craig Orr KC was able to show that neither had replicated Dr. Wright's working environment in order to produce their reports despite knowing it was non-standard, and many of the expert conclusions Hough KC lobbed at Dr. Wright were shown to be either beyond the expertise of the expert or simply non sequiturs.

If that sounds unusual, wait until you hear the real absurdity underpinning the forgery allegations against Dr. Wright: as he's said repeatedly over the course of the trial, the library's worth of documents submitted into evidence in this case is largely his life's work spanning decades. Many of these documents are said by COPA to be poor forgeries, and yet those same documents form the foundation for a vast empire of patents that have already been granted to Dr. Wright and nChain. This means they have already undergone (and survived) rigorous trials at the hands of patent attorneys and patent offices around the world.

Paid-for experts produced by COPA may muddy the waters for some (and full credit to Craig Orr KC for the limited purification he was able to do while they were on the stand), but the proof of Dr. Wright's work is already in the patent pudding.


Huawei to bring cloud computing to Egypt – DCD – DatacenterDynamics

Huawei is set to launch a new local cloud service in Egypt next month.

As first reported by the South China Morning Post, the Chinese tech giant is also planning to develop an AI cloud computing center in Hong Kong, its first outside of mainland China.

The new cloud region in Egypt will add to Huawei's 85 cloud availability zones, spread across 30 regions.

The company is the second largest cloud provider in China, and has been steadily increasing its global presence. Last year, it launched new cloud regions in Turkey and Saudi Arabia, the latter of which was located in an STC/Center3 data center in Riyadh.

Huawei is also developing third availability zones in both Brazil and Mexico. The Brazilian region is expected to go live in 2024.

It is unclear which data center the Egypt cloud region will be hosted in. In 2019, Huawei had shared plans for a cloud data platform that would be hosted in a Telecom Egypt data center in Cairo. At the time, this was intended to be Huawei's first cloud computing region in Africa and the Middle East, but it appears that the company had a change of heart, and its first service in the region was Riyadh, Saudi Arabia, which launched last year.

Elsewhere, Huawei lists several regions across China and Hong Kong; Dublin, Ireland; Amsterdam, the Netherlands; Paris, France; Bangkok, Thailand; Singapore; Jakarta, Indonesia; Riyadh, Saudi Arabia; Istanbul, Turkey; Johannesburg, South Africa; Mexico City, Mexico; Sao Paulo, Brazil; Buenos Aires, Argentina; Lima, Peru; and Santiago, Chile.

Despite the global growth, many countries continue to deem Huawei a high-risk vendor due to its close ties to the Chinese government. The US and UK are among those to have banned its equipment from their networks.

In addition to the new cloud region, Huawei has also shared plans for an AI cloud computing center in Hong Kong. This will be the first outside of mainland China, with the company already having developed such facilities in Gui'an, Ulanqab, and Wuhu, from which customers can access the Huawei Cloud Ascend AI service with ready-to-call AI models.

Jacqueline Shi, president of Huawei Cloud Global Marketing and Sales Service, said: "At Huawei Cloud, AI is a key strategy. We're building a solid cloud foundation for everyone, for every industry, to accelerate intelligence."

The company has its own generative AI model named Pangu, the third version of which was launched in July 2023.


Alibaba Cloud cuts prices, hard, for multi-year commitments – The Register

Alibaba Cloud has made significant price cuts for those willing to use its datacenters in mainland China and commit to multi-year deals.

The Chinese giant has detailed [in Chinese] price reductions of up to 36 percent for some instance types in its Elastic Compute Service (ECS).

Object storage prices can fall 55 percent under some deals, helped by the extension of Alibaba Cloud's reserved capacity terms from one year to between two and five years.

Database as a service costs have also been reduced, by up to 40 percent.

A free traffic allowance has been increased from 10 to 20 gigabytes.

The Register understands Alibaba hopes to win more customers in mainland China, and to encourage local businesses to adopt cloud and consider AI.

Alibaba Cloud customers outside mainland China are welcome to use the lower prices. The Register expects a decision to do so will only come after very close consideration of Chinese data protection and security laws, as few orgs are comfortable storing sensitive data offshore, never mind in a famously complex jurisdiction like the People's Republic.

The discounts were announced weeks after Alibaba Cloud revealed that its growth had stalled, other than among Alibaba Group companies. The hyperscaler also sought to slough off low-margin contract-based customers.

Those challenges put into perspective the discounts for long-term commitment to the Alibaba Cloud: customers that sign up for the deals could help the hyperscaler address both of its problems.

Alibaba Cloud is not alone in trying to make long-term commitments attractive. The likes of Azure and AWS have made reserved capacity the cheapest way to consume their services, a tactic that makes sense as it allows them to more predictably cover the costs of their infrastructure. Alibaba would understand that aspect of cloud economics, and almost certainly employs the finance wonks capable of modelling it for its own business.

That it has reached similar conclusions about how best to price its cloud using long deals is therefore no surprise, especially for an organization whose cloudy thinking has so often reached the same conclusions as its Western rivals.


‘The year of the data cloud:’ Salesforce results, guidance impress Wall Street – Seeking Alpha


Salesforce (NYSE:CRM) was in focus on Thursday after the cloud computing software giant reported strong fourth-quarter results and provided guidance for fiscal 2025, leaving many on Wall Street to see this year as "the year of the data cloud."

Shares rose fractionally in premarket trading.

"Strong demand for Data Cloud offering, which is now approaching $400M in [annual recurring revenue] (nearly +90% yr-yr)," Baird analyst Rob Oliver wrote in a note. "While macro environment remains challenging, strong trends in AI and data could be sources of upside this year."

Oliver reiterated his Outperform rating and boosted his price target to $355 from $310.

Stifel analyst J. Parker Lane also said 2024 will likely be the "year of the data cloud," as he maintained his Buy rating on Salesforce and bumped his price target to $350 from $330.

Looking to the next fiscal year, Salesforce expects to generate sales within a range of $37.7B to $38B, below the $38.65B that analysts were forecasting. Full-year earnings are expected to be between $9.68 and $9.76 per share, above the $9.61 per share estimate.

For the period ending Jan. 31, Salesforce earned an adjusted $2.29 per share as revenue rose 11% year-over-year to come in at $9.29B. Subscription and support revenue during the period rose 12% year-over-year to $8.75B, while service revenue came in at $2.16B. Platform and other revenue was $1.72B during the period, while marketing and commerce revenue rose 8.2% year-over-year to $1.287B.

Analysts had expected the Dow 30 component to earn $2.27 per share on revenue of $9.22B.

Salesforce also announced its first-ever quarterly dividend of $0.40 per share and boosted its share buyback program by $10B. The company also said that it had returned $1.7B to shareholders in the fourth quarter in the form of buybacks.

Salesforce appears to be in a strong position when it comes to monetizing artificial intelligence, especially via its Einstein offering, which some on Wall Street believe could boost sales considerably.

"We believe [AI] is a major land grab opportunity that could significantly benefit CRM over the coming years and could increase overall revenue by $4 billion+ annually based on our estimates and field work by 2025," Wedbush Securities analyst Dan Ives wrote in a note. Ives maintained his Outperform rating and $325 price target.

Lane also expressed optimism around Salesforce's ability to generate revenue from AI in short order.

"We note that Einstein contributions in the guide are minimal, so early adoption has the potential to drive upside from the [high single digit] growth guide," Lane wrote. "We remain confident in the company's positioning in the nascent AI space and expect its early investments into building AI-powered solutions/models organically will begin to pay off as soon as FY25."


Stack Overflow and Google Cloud Announce Strategic Partnership to Bring Generative AI to Millions of Developers – PR Newswire

Partnership brings together the leading knowledge platform for developers with Google Cloud's leading AI platform

NEW YORK and SUNNYVALE, Calif., Feb. 29, 2024 /PRNewswire/ -- Stack Overflow and Google Cloud today announced a strategic partnership that will deliver new gen AI-powered capabilities to developers through the Stack Overflow platform, Google Cloud Console, and Gemini for Google Cloud.

Through the partnership, Stack Overflow will work with Google Cloud to bring new AI-powered features to its widely adopted developer knowledge platform. Google Cloud will integrate Gemini for Google Cloud with Stack Overflow, helping to surface important knowledge base information and coding assistance capabilities to developers. Google Cloud will also surface validated technical knowledge from Stack Overflow directly in the Google Cloud console, giving developers easy access to trusted and accurate knowledge and code backed by the millions of developers that have contributed to the Stack Overflow platform for 15 years.

"This partnership brings our enterprise AI platform together with the most in-depth and popular developer knowledge platform available today," said Thomas Kurian, CEO at Google Cloud. "Google Cloud and Stack Overflow will help developers more effectively use AI in the platforms they prefer, combining the vast knowledge from the Stack Overflow community and new AI capabilities, powered by Vertex AI and Google Cloud's trusted, secure infrastructure."

"In the AI era, Stack Overflow has maintained that the foundation of trusted and accurate data will be central to how technology solutions are built, with millions of the world's developers coming to our platform as one of the few high quality sources of information with community attribution at its core," said Prashanth Chandrasekar, CEO of Stack Overflow. "This landmark, multi-dimensional AI-focused partnership, which includes Stack Overflow adopting the latest AI technology from Google Cloud, and Google Cloud integrating Stack Overflow knowledge into its AI tools, underscores our joint commitment to unleash developer creativity, unlock productivity without sacrificing accuracy, and deliver on socially responsible AI. By bringing together the strengths of our two companies, we can accelerate innovation across a variety of industries."

Extending Stack Overflow to Gemini for Google Cloud and the Google Cloud Console

Stack Overflow and Google Cloud are partnering to unleash developer productivity by bringing together Gemini for Google Cloud with Stack Overflow's trusted, community-vetted knowledge. Gemini for Google Cloud is already trained on an extensive collection of publicly available information and code from open-source and third-party platforms. Now, Gemini for Google Cloud will also provide developers with suggestions, code, and answers from Stack Overflow, utilizing the new OverflowAPI.

In addition, developers using Gemini for Google Cloud will be able to access Stack Overflow directly from the Google Cloud console, bringing them greater access to information so they can ask questions and get helpful answers from the Stack Overflow community in the same environment where they already access Google Cloud developer services and manage cloud applications and infrastructure.

Supercharging Stack Overflow's developer engagement platform with Google Cloud's AI services

Stack Overflow has selected Google Cloud as the platform of choice to host and grow its public facing developer knowledge platform. In addition, Stack Overflow plans to leverage Google Cloud's state-of-the-art AI capabilities to improve their community engagement experiences and content curation processes. The use of Google Cloud AI technology is expected to result in an accelerated content approval process and further optimized forum engagement experiences for Stack Overflow users.

To learn more about Stack Overflow, visit https://stackoverflow.co. To learn more about Gemini for Google Cloud, visit https://cloud.google.com/duet-ai.

The first set of new integrations and capabilities between Stack Overflow and Gemini for Google Cloud will be available in the first half of 2024 and previewed at Google Cloud Next, April 9-11.

About Stack Overflow

Across both its public and private platforms, Stack Overflow is empowering developer communities to discover the information, answers, and learning opportunities they need when they need them. Millions of the world's developers and technologists visit Stack Overflow to ask questions, learn, and share technical knowledge, making it one of the most popular websites in the world. Stack Overflow's market-leading knowledge sharing and collaboration platform, Stack Overflow for Teams, helps more than 15,000 organizations distribute knowledge, increase efficiency, and innovate faster. Founded in 2008, Stack Exchange, Inc., the owner of Stack Overflow, is headquartered in New York, NY. Stack Overflow is a registered trademark of Stack Exchange, Inc.

About Google Cloud

Google Cloud accelerates every organization's ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google's cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

SOURCE Google Cloud


Top Cloud Computing Bootcamps to Enroll in 2024 – Analytics Insight

In today's fast-paced job market, staying ahead of the curve is essential. One area where this is particularly true is cloud computing and modular technology solutions. These cutting-edge fields are crucial for excelling in the modern corporate world. It's no surprise that cloud and distributed computing have consistently ranked high on LinkedIn's annual list of the skills employers want most.

For those interested in a career path in cloud computing, enrolling in a Cloud Computing Bootcamp can be a smart move to gain basic knowledge and skills. However, with so many alternatives available, determining the ideal one may be difficult. To help you navigate this, we've curated a list of the top cloud computing Bootcamps that are worth your time and effort.

This program, offered in partnership with Caltech CTME, provides in-depth training on cloud computing, focusing on Azure and AWS. Participants gain skills in designing and deploying dynamically scalable and reliable cloud applications. The program includes live learning, hands-on exercises, industry case studies, masterclasses by CTME faculty, and full-spectrum career services.

This 8-month program (4 hours per week) covers all aspects of cloud computing, including core distributed systems, cloud applications, and cloud networking. The specialization consists of 6 courses that progress from basic to advanced topics. Participants earn a course completion certificate by successfully completing a hands-on project.

This program explores all three types of cloud computing: IaaS, PaaS, and SaaS. Participants learn about cloud technologies such as AWS, GCP, Azure, vSphere, and OpenStack, and how to use them to enhance business productivity and effectiveness. The program lasts for 8 months and involves instructor-led training.

This learning path consists of 6 cloud computing courses totaling 10 hours of comprehensive learning material. Participants gain a strong foundation in cloud computing, understanding the differences between AWS, Azure, and GCP, exploring common features of all cloud services, and learning about different job options related to cloud development.

This course covers basic cloud computing concepts along with AWS fundamentals. With around 8 hours of on-demand video, participants gain an understanding of the fundamental systems underlying the cloud and build their expertise from basic to advanced levels.

This program is designed to help participants become AWS Cloud Architects, capable of leading an organization's cloud computing strategy. The 3-month program (10 hours per week) covers planning, designing, and building highly available and secure cloud infrastructure. Prior knowledge of programming, AWS, and cloud computing is necessary.

This training program is ideal for those seeking a foundational understanding of cloud computing. Participants learn cloud computing basics in a platform-agnostic way, understanding which cloud provider or deployment model is right for them. The program also covers the benefits and limitations of major cloud service platforms, including AWS, Azure, and GCP.

This program aims to make participants proficient in cloud applications and architecture. Participants explore core cloud computing skills such as AWS CloudFormation, EC2, S3, Azure Resource Manager, VPC, Route53, Azure App Services, and more. The program also prepares participants for cloud architect certifications like Azure Architect and AWS Solutions Architect.

This program is suitable for individuals who want to learn the basics of cloud computing, including its different forms, benefits, and what makes the technology powerful. The program includes numerous hands-on demos of cloud computing solutions, as well as discussions on IaaS and SaaS solutions. The expert-led program lasts around 2 hours.

This short course (1-hour duration) is ideal for those seeking a foundational understanding of cloud computing. Participants learn how cloud computing is used, why it's important, and the "as a service" models like PaaS, FaaS, IaaS, and SaaS. The course also covers the difference between servers and serverless and highlights the jobs that can be obtained with cloud skills.

In conclusion, gaining expertise in cloud computing can be highly beneficial for your professional career. Cloud skills are in high demand, and professionals can expect lucrative salaries as they continue to gain more experience in the field. By choosing the right Cloud Computing Bootcamp, you can enhance your skills and advance your career in this rapidly evolving field.


Alibaba Cloud announces a 55% price reduction – Chinadaily.com.cn – China Daily

Liu Weiguang, president of public cloud business at Alibaba Cloud Intelligence. [Photo provided to chinadaily.com.cn]

Alibaba Cloud, the cloud computing unit of Chinese heavyweight Alibaba Group Holding Ltd, announced on Thursday a price reduction of up to 55 percent on more than 100 core cloud products, marking the biggest price cut in the company's history.

This move aims to make public cloud services more inclusive and accessible in the era of artificial intelligence, and speed up the popularization of cloud computing in all walks of life, the company said.

The new prices, with an average of 20 percent reduction, are effective on Thursday, covering over 100 cloud products, such as computing, storage and database.

"As the biggest cloud service provider in China and the Asia-Pacific region, we see tremendous growth prospects in China's digital market. That's why we decided to launch the price reduction campaign to lower the threshold of cloud services for more enterprises and developers to reap the technological dividends and accelerate the adoption of advanced public cloud services across various industries in China," saidLiu Weiguang,president of public cloud business at Alibaba Cloud Intelligence.

Liu said with the rapidly increasing amount of data in China, businesses will need robust, high-performance and cost-effective computing power to help handle and analyze their massive amounts of data, adding Alibaba Cloud aims to become the most open cloud and help customers to turn AI into productivity.

Cloud infrastructure services expenditure on the Chinese mainland grew 18 percent year-on-year to $9.2 billion in the third quarter of 2023, according to market consultancy Canalys. Alibaba Cloud led the cloud infrastructure services market, taking up 39 percent of total spending.



JFrog integrates Qwak AI tech to speed up application delivery – SC Media

A new collaboration between software supply chain firm JFrog and machine learning platform provider Qwak AI will leverage machine learning models to speed up the delivery of artificial intelligence applications at scale, SiliconAngle reports.

Under the integration, JFrog Artifactory and Xray will integrate natively with Qwak's ML Platform in a modern DevSecOps and MLOps workflow. JFrog's universal ML Model registry will be able to make use of a centralized MLOps platform to enable the building, training, and deployment of models with greater governance, visibility, versioning, and security, whether on-premises or in the cloud. Machine learning engineers, data scientists, developers, and DevOps teams can also make use of the integration to easily develop projects at scale. "There are still hurdles to overcome in terms of bringing ML models to production, such as bridging the gap between MLOps and DevSecOps workflows," said Gal Marder, JFrog's executive vice president of strategy. "We can provide customers with a complete MLSecOps solution that helps bridge this gap by bringing ML models in line with other software development processes."


Why the Cloud Is Critical to Modern Applications in Retail – BizTech Magazine

The use of applications in retail has reached an all-time high. Experts predict that 2024 will see retail apps being used more than ever, almost as much as navigation and weather apps. But here's the challenge: How can retailers modernize their apps efficiently?

Traditionally, most retailers updated their applications inside their data centers. The data generated by these applications was typically very dispersed, and the applications themselves usually were not well integrated. That's an increasingly obsolete approach. The rapid pace of change in retail makes application agility crucial.

But retailers simply cant afford to completely redesign their customer-facing mobile apps every time conditions shift. Instead, they need to be able to release new features without interrupting uptime or performance.

Enter cloud-native applications. This is a modernized approach that enables retailers to adapt to market changes and consumer needs by offering scalable, flexible and resilient infrastructure. Here are a few ways that retailers can use the cloud to modernize their applications:


Deployment speed is a key attribute of top DevOps performers. In fact, the highest-performing DevOps teams deliver on-demand deployment, with multiple deployments each day, according to Google's research on the topic. The more efficient the development, the faster the deployment.

Cloud platforms help expedite the cycle. DevOps teams can leverage myriad tools and services provided by cloud platforms to build, test and deploy applications more efficiently. From there, teams can also add any number of automated solutions into the mix.


Coca-Cola Argentina is one example of this. Leveraging Amazon Web Services and its automated continuous delivery pipeline, Coca-Cola Argentina created Wabi, a mobile app for product ordering and delivery. With this technology, delivery takes 30 minutes instead of hours. And, as detailed on the AWS website, the cloud drove the creation of the app and now helps keep it updated efficiently.

"Using AWS, we can deploy new features for Wabi in several months instead of the year it could sometimes take in our previous environment," Alejandro Arauz, digital operations director at Coca-Cola Argentina and cofounder of Wabi, notes in an AWS case study.

Retailers frequently scale physical storefronts in accordance with traffic and seasonal demand. Now, retailers can do the same with applications. The cloud makes it simple and cost-effective to scale up during periods of growth or scale down during seasonal downtimes. This ensures that retailers meet customer demand without overinvesting in underutilized infrastructure.

If a retailer wants to prepare for the holidays, for example, it can troubleshoot applications in advance to make sure they run smoothly under pressure. Mobile users expect websites to load in under three seconds, and stress-testing applications before deployment can reduce loading times. This can also help retailers avoid uninstalls of e-commerce apps, which can cost businesses up to $68,000 per month.
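As a rough illustration of that kind of pre-season check, the sketch below fires concurrent requests at an endpoint and flags responses slower than the three-second expectation cited above. It uses only the Python standard library; the URL and thresholds are placeholders, and a real retailer would reach for a dedicated load-testing tool.

```python
# Minimal stress-test sketch: concurrent GETs with latency measurement.
# URL, request count, and latency budget are illustrative placeholders.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.com/"          # placeholder endpoint
REQUESTS, WORKERS, BUDGET_S = 50, 10, 3.0

def timed_get(_):
    """Fetch the URL once and return the elapsed time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    latencies = list(pool.map(timed_get, range(REQUESTS)))

worst = max(latencies)
print(f"avg {sum(latencies) / len(latencies):.2f}s, worst {worst:.2f}s")
print("OK" if worst <= BUDGET_S else "Over budget: investigate before peak season")
```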


When it comes to modernizing applications, another key asset of the cloud is simplified security compliance. By offering standardized, up-to-date security protocols and automated compliance controls across an entire digital infrastructure, the cloud reduces the complexity of security standards and keeps retailers updated in real time.

That's why 60 percent of consumer-facing applications will be running on public clouds by 2025, according to Boston Consulting Group's March 2022 report, The Keys to Scaling Digital Value. What's more, almost 40 percent of data warehouses and analytics workloads and more than 30 percent of core business applications will be running on public clouds by the same year. Retailers who make the shift to the cloud now will fare better in the long term.


The cloud also offers convenience. With a centralized dashboard, employees can reduce their "toggling tax," or the time lost by switching between applications, notes the Harvard Business Review. In this way, the cloud can improve the customer experience while also increasing employee engagement and productivity.

Cloud platforms also seamlessly integrate with other services and applications. This facilitates collaboration across teams and creates a more cohesive application ecosystem. Overall, retailers who embrace a cloud-native approach are positioned for greater success in the fast-paced market. They also have more agility and cost-efficiency to meet consumer demands and stay competitive.


Unveiling the Threat Landscape: Exploring the Security Risks of Cloud Computing – Security Boulevard

In the digital era, cloud computing has become synonymous with agility and scalability for businesses and individuals. However, critical security risks and threats inherent in cloud environments come alongside the myriad benefits. This blog aims to dissect the nuances of cloud security risks, shedding light on the challenges commonly faced when securing digital assets in the cloud.

Before delving into the specific risks associated with cloud security, it's crucial to understand the foundational concept of the Shared Responsibility Model. This model represents a new approach to securing cloud environments. Unlike traditional on-premise solutions, with the Shared Responsibility Model, cloud security is a collaborative effort between cloud service providers (CSPs) and their users.

The Shared Responsibility Model defines the division of responsibilities between the CSP (cloud service provider) and the user. The CSP secures the underlying infrastructure, including the physical data centers, networking, and hypervisors. On the other hand, users are entrusted with securing their data, applications, and configurations within the cloud.

This balanced approach ensures that neither party bears the entire burden of cloud security, fostering a cooperative relationship that leverages the expertise of both CSPs and users. The model shifts based on the type of cloud service: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), or Software as a Service (SaaS).

The development of this model was necessitated by the dynamic nature of cloud computing, where traditional security models became inadequate. Oversight of the Shared Responsibility Model is a shared endeavor, with constant communication and collaboration required to adapt to evolving threats and technological advancements.

Understanding this model is fundamental to comprehending the subsequent discussion on security risks in cloud computing. It lays the groundwork for organizations to make informed decisions, implement effective security measures, and navigate compliance complexities in the cloud.

Now, let's delve into the specific risks associated with cloud security.

Cloud environments, known for their intricate configurations through web-based interfaces or Infrastructure as Code (IaC), are susceptible to misconfigurations. Cloud resources' dynamic and scalable nature introduces challenges, making it crucial for teams to adapt and effectively manage configurations. This includes addressing risks such as Broken Object Level Authorization (OWASP API1) and Security Misconfiguration (OWASP API8), where improper configurations may lead to unauthorized access or vulnerabilities.
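As one concrete illustration of catching such misconfigurations, the sketch below uses AWS's boto3 SDK to flag S3 buckets that lack a full public-access block. It assumes AWS credentials are already configured; a real audit would also inspect bucket policies and ACLs, and other providers expose equivalent APIs.

```python
# Minimal misconfiguration audit sketch: flag S3 buckets without a full
# public-access block. Assumes configured AWS credentials and boto3.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        cfg = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        fully_blocked = all(cfg.values())   # all four block settings enabled?
    except ClientError as err:
        if err.response["Error"]["Code"] != "NoSuchPublicAccessBlockConfiguration":
            raise
        fully_blocked = False               # no public-access block configured at all
    if not fully_blocked:
        print(f"REVIEW: {name} is not fully protected from public access")
```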

Cloud platforms offer diverse services, each demanding specific access controls. The scalability of cloud environments complicates the consistent management of access permissions. Teams must navigate complex IAM settings unique to each cloud provider. This challenge aligns with risks such as Broken Authentication (OWASP API2) and Broken Function Level Authorization (OWASP API5), where weak authentication mechanisms or flawed access controls can result in unauthorized access.

Cloud computing involves data transmission over networks and storage in shared infrastructures. Encryption is vital due to the distributed and multi-tenant nature of cloud services. Teams must implement encryption measures compatible with cloud environments to protect data across various states. This aligns with risks such as Broken Object Property Level Authorization (OWASP API3), emphasizing the importance of encryption at the object property level.

Cloud environments consist of numerous interconnected components, making monitoring and logging complex. Specialized tools are required to track activities across virtual machines, containers, and cloud services; incomplete monitoring may lead to the oversight of critical security events. This challenge corresponds to risks like Improper Inventory Management (OWASP API9), highlighting the need for comprehensive monitoring.

Cloud service providers regularly update their platforms, requiring teams to manage patches for virtual machines, containers, and other services. The dynamic nature of cloud infrastructure demands agile patch management to address vulnerabilities promptly. This aligns with risks such as Unrestricted Resource Consumption (OWASP API4), where successful attacks can lead to resource exhaustion or denial of service.

Cloud environments are susceptible to various disruptions, necessitating effective disaster recovery plans. These plans should align with cloud services, including backup strategies and the ability to restore operations cloud-natively. This challenge relates to risks such as Unrestricted Access to Sensitive Business Flows (OWASP API6), emphasizing the importance of planning for potential disruptions.

Cloud computing relies heavily on APIs for seamless service integration. Insecure APIs pose a specific threat in cloud environments, where integration is essential. Teams must be vigilant in securing APIs and verifying the security practices of third-party services. This corresponds to risks such as Unsafe Consumption of APIs (OWASP API10), underlining the importance of secure API practices in cloud-based services.

At the heart of these security challenges lie application programming interfaces (APIs), the pivotal components that facilitate seamless connections between software without the need for a human login. APIs, however, present a unique set of challenges. Whether dealing with open-source or proprietary software, the API landscape demands a meticulous approach to identifying and addressing potential risks.

The OWASP API Security Top 10 offers a comprehensive list of common issues associated with APIs, ranging from broken object-level authorization to the unsafe consumption of APIs. This framework underscores the tendency to place unwavering trust in API functionality, often overlooking inherent vulnerabilities. Notably, the list highlights the need for organizations to scrutinize API usage, considering additional technologies that can augment protection, especially for services intended for a wider audience.
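To make the first item on that list concrete, the sketch below shows the object-level check whose absence defines Broken Object Level Authorization: verify that the authenticated caller owns the object, not merely that the object exists. Flask and the two helper functions are illustrative stand-ins for whatever framework and data layer an API actually uses.

```python
# Minimal BOLA sketch: the ownership check is the whole point.
# get_current_user/load_invoice are hypothetical stand-ins.
from flask import Flask, abort, jsonify

app = Flask(__name__)

def get_current_user():
    """Stand-in: resolve the caller from a session or token."""
    ...

def load_invoice(invoice_id):
    """Stand-in: fetch the invoice record from storage."""
    ...

@app.route("/invoices/<int:invoice_id>")
def get_invoice(invoice_id):
    user = get_current_user()
    invoice = load_invoice(invoice_id)
    if invoice is None:
        abort(404)
    if invoice["owner_id"] != user["id"]:   # the object-level authorization check;
        abort(403)                          # without it, any valid ID leaks data
    return jsonify(invoice)
```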

As the cloud security landscape evolves, understanding APIs' critical role in vulnerabilities and solutions becomes paramount. By acknowledging the challenges and proactively implementing robust security measures, organizations can fortify their cloud infrastructure against potential threats, ensuring a resilient and protected digital ecosystem.

The cloud landscape faced challenges as Microsoft grappled with authentication issues, drawing attention from attackers and security experts, including Tenable. The heart of the matter lay in insufficient access control to Azure Function hosts, a critical component of Microsoft's Power Platform (Power Apps, Power Automate). This revelation underscored the importance of transparency in cloud security, emphasizing the need for robust measures to secure cloud authentication.

Tenable CEO Amit Yoran described how the vulnerability allowed attackers to interact with Azure Functions without authentication, exploiting a flaw in the creation and operation of custom connectors within the Power Platform. This scenario exposed a potential risk wherein attackers could traverse different customers' connectors by determining hostnames, posing a serious threat to data integrity.

Microsoft swiftly addressed the Power Platform Custom Code information disclosure vulnerability, as detailed in a technical note. Affected customers were promptly notified via Microsoft 365 Admin Center, ensuring a proactive approach to risk mitigation.

Recent challenges, such as the unprepared shift to remote work and smart home security concerns, have introduced new dimensions to cloud security.

The rapid adoption of remote work infrastructure requires secure frameworks and comprehensive policies to mitigate risks. Organizations should prioritize endpoint security, enforcing the use of virtual private networks (VPNs) and regularly updating security protocols on remote devices.

Smart home devices, previously non-networked, now serve as potential breach points, emphasizing the need for user awareness and safe configuration practices. Employee education programs should include guidelines on securing home networks, updating router passwords, and ensuring the security of connected devices.

Cloud configuration should prioritize security over speed. Rushed setups often result in misconfigurations that expose sensitive data. Organizations should allocate sufficient time for detailed cloud security risk assessment, including comprehensive stress testing to identify potential weak points. Continuous monitoring and automated configuration management tools contribute to ongoing security.

BYOD policies demand careful consideration of potential risks. While the flexibility of BYOD policies enhances employee convenience, organizations should implement strict security measures. This includes regularly updating security software on employee devices, conducting periodic security training, and implementing mobile device management (MDM) solutions.

Phishing attacks and social engineering methods continue exploiting technical and human vulnerabilities.

Implementing multi-factor authentication, security software, and regular training are essential measures.

Phishing attacks often target the human security element, relying on unsuspecting users to divulge sensitive information. Organizations should conduct regular and simulated phishing exercises to enhance employee awareness. Multi-factor authentication (MFA) adds an extra layer of protection, requiring additional verification beyond passwords.

Regular training sessions on recognizing social engineering tactics and ongoing communication about emerging threats contribute to a vigilant and security-conscious workforce. Additionally, organizations should invest in advanced email filtering solutions to detect and block phishing attempts before reaching employee inboxes.

Identify and encrypt sensitive data, ensuring secure storage of encryption keys.

While VPN services provide secure transit for data, organizations should also focus on encrypting data at rest. This involves identifying and classifying sensitive data, applying encryption algorithms, and securely storing encryption keys. Regularly updating encryption protocols in response to evolving threats enhances the overall security posture.
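As a minimal illustration of encryption at rest, the sketch below uses the widely adopted Python cryptography package (pip install cryptography). The key handling is deliberately simplified; in practice, keys belong in a KMS or secrets manager, never on the same disk as the data they protect.

```python
# Minimal encryption-at-rest sketch with symmetric (Fernet) encryption.
# Key handling simplified for illustration; use a KMS/vault in practice.
from cryptography.fernet import Fernet

key = Fernet.generate_key()                    # store in a secrets manager, not on disk
fernet = Fernet(key)

record = b"customer_ssn=123-45-6789"           # illustrative sensitive data
ciphertext = fernet.encrypt(record)            # safe to write to disk or object storage
assert fernet.decrypt(ciphertext) == record    # round-trips only with the same key
print("ciphertext prefix:", ciphertext[:32])
```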

Implement cloud security solutions, such as Kaspersky Hybrid Cloud Security, for comprehensive protection.

End-to-end encryption ensures that data remains secure from the origin device to its destination. This practice safeguards sensitive information even if intercepted during transit. Organizations should promote the use of applications and services that prioritize end-to-end encryption.

Cloud security solutions, such as Kaspersky Hybrid Cloud Security, provide a holistic approach to protecting cloud environments. These solutions offer threat detection, vulnerability management, and real-time monitoring. Regularly updating and configuring these solutions according to evolving threats enhances their effectiveness.

Secure smart home devices, use VPNs for remote work, and regularly update software for increased security.

Test cloud security setups and conduct regular audits to identify and address vulnerabilities proactively.

Multi-factor authentication adds an extra layer of security beyond passwords. Organizations should prioritize its implementation across cloud services, ensuring user access requires multiple verification forms.
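For illustration, the sketch below shows the TOTP flow behind most MFA prompts, using the pyotp package (pip install pyotp): a secret is enrolled once (typically via a QR code), then each login verifies a short-lived six-digit code in addition to the password.

```python
# Minimal TOTP sketch of the second factor in a typical MFA flow.
import pyotp

secret = pyotp.random_base32()                 # generated at enrollment, stored server-side
totp = pyotp.TOTP(secret)

code = totp.now()                              # what the user's authenticator app displays
print("current code:", code)
print("verifies:", totp.verify(code))          # True within the 30-second window
print("stale/forged:", totp.verify("000000"))  # almost certainly False
```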

Securing smart home devices involves more than just individual device security. Organizations should guide employees on securing their home networks, using VPNs for remote work, and updating router passwords. This comprehensive approach extends the organizations security perimeter to include employee home environments.

Regularly updating software is a fundamental yet often overlooked aspect of cloud security. Organizations should implement automated patch management systems to ensure that all software, including operating systems and applications, is up-to-date. Conducting regular security audits helps identify potential vulnerabilities and weaknesses that attackers may exploit.

Effectively securing the cloud requires a dual focus on understanding and mitigating security risks and threats. By embracing the shared responsibility model, navigating security and compliance challenges in cloud computing, and implementing proactive measures against potential threats, organizations can confidently harness the power of cloud computing while safeguarding their digital assets. In this ever-evolving landscape, a comprehensive and strategic approach to cloud security is key to a resilient and protected digital infrastructure. Stay vigilant, stay secure.
