
How to make self-hosting and local-first software work – The Verge

For a while, I really thought I could be a self-hoster. After months of talking to people about platforms and security and what it means that we really don't own any of the data and apps we use every day, my big plan was to buy a mini PC and run my life off my own device.

A lot of Docker experimentation later, I pretty much gave up. (As one person put it to me, if you ever find yourself typing in an IP address and port number, you've officially exited the realm of things most people will ever do.) And so this episode of The Vergecast, the fourth and final in our series about connectivity, became about something else. Self-hosting is a nice idea and a totally impractical reality for most people; signing into cloud services and downloading apps is just so much easier to do!

But there are plenty of people out there who think we don't have to choose. They think it's possible to build software that both belongs to us and works across all our devices, that is collaborative and user-friendly and has an offline mode. They even have a term for this, "local-first" software, and point to apps like Obsidian as proof that it can work.

After that, we get to one more idea about software: that the solution isn't to change the way we acquire and access software but rather to change the things we can do to that software. In his book The Internet Con, activist and author Cory Doctorow argues that interoperability might be the solution to most of our tech woes. Interop could turn the internet from a series of walled gardens into a teeming forest of interconnected services that are only as successful as they are good. But that requires some legal changes and some big new ideas about how we build and use software.

Software has connected us and connected everything. So how do we connect to our software? That's the question of this episode. The answer doesn't quite look like Plex servers and NAS systems, but it might be the next best thing.

See the original post here:
How to make self-hosting and local-first software work - The Verge


AI in the Global South: Opportunities and challenges towards more … – Brookings Institution

The advent of the Fourth Industrial Revolution and its accompanying technological advancements have commanded the attention of countries worldwide, leading to an unprecedented adoption of and interest in leveraging artificial intelligence (AI). As AI increases in ubiquity, countries in the Global South, including Africa, Southeast Asia, Latin America, and the Caribbean, have begun to capitalize on the opportunities presented by these technologies despite early development being primarily concentrated in the West. Given the historical development challenges that countries in the Global South have faced, AI stands to help advance progress in critical domains such as agriculture, healthcare, and education. However, rising concerns about the ethical implications of using AI also present new challenges for countries in this region to address, along with handling existing development priorities.

As the development of AI advances rapidly and demonstrates the potential to bolster economic growth, governments in the Global South must understand how to make progress toward enacting robust AI regulation and building thriving AI ecosystems that sustainably support startup growth and the development of research and engineering talent. A part of this progress will involve the equitable inclusion of countries from the Global South in roundtable convenings, working groups on AI, and high-level advisory bodies such as those initiated by the U.K. government, OECD, and the United Nations. However, the inclusion of the Global South in these initiatives should also be a goal of countries that have dominated the current AI discourse. The White House's plans to lead critical global conversations and collaborations on AI, as outlined in the recently issued Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence, should deliberately seek participation from stakeholders and governments in the Global South to inclusively work towards an equitable era of AI.

Current landscape of AI in the Global South

Over the past decade, the Global South has adopted AI tools to address traditional development challenges, and there exists a diverse array of use cases for AI across countries in the Global South, particularly in agriculture, healthcare, and education. Within agriculture, projects have focused on identifying banana diseases to support farmers in developing countries, building a deep learning object detection model to aid in-field diagnosis of cassava disease in East Africa, and developing imagery observing systems to support precision agriculture and forest monitoring in Brazil. In healthcare, projects have focused on building predictive models to keep expecting mothers in rural India engaged in telehealth outreach programs, developing clinical decision support tools to combat antimicrobial resistance in Ghana, and using AI models to interpret fetal ultrasounds in Zambia. In education, projects have focused on identifying at-risk students in Colombia, enhancing English learning for Thai students, and developing teaching assistants to aid science education in West Africa. There is much anticipation for an increase in AI innovation over the next decade as companies, governments, and various organizations actively work to expand their development.

The emergence of AI within the Global South has also provided opportunities to democratize current AI practices, lead towards the development of more inclusive AI systems, and increase participation from communities underrepresented in AI development. Grassroots organizations, such as Masakhane and Ghana NLP, have emerged to focus on developing datasets and machine translation tools to help expand access to low-resource African languages. Efforts such as Deep Learning Indaba, Khipu, AI Saturdays Lagos, and Data Science Africa, amongst many others, have been instrumental in growing communities of AI researchers and developers within Africa and Latin America by hosting conferences and workshops and helping build local expertise in emerging technology. Many of these issues were recently raised in a Brookings webinar that included leaders from civil society organizations in the Global South, who helped to frame productive dialogues around these and other issues. As calls for localized development of AI systems increase, these organizations will become even more critical in ensuring that AI development meets the needs and interests of local communities.

Large tech companies have continued to expand their footprint in the Global South by establishing research labs, development centers, and engineering offices. IBM was one of the first big tech companies to establish an industry research lab in the Global South, building IBM Research India in 1998. Since then, IBM has launched research labs in São Paulo and Rio de Janeiro (2010), Nairobi (2013), and Johannesburg (2016). India has continued to be a market entry point for big tech research labs in the Global South, with Microsoft opening Microsoft Research India in Bangalore in 2005. Microsoft has continued to expand its reach, opening its Africa Development Center with two locations in Nairobi and Lagos in 2019 and launching the Microsoft Africa Research Institute, also in Nairobi, in 2020. Google has also built up its research presence in the Global South, launching AI research labs in Accra in 2018 and Bangalore in 2019. While developing research labs in the Global South is one step in advancing progress within AI, it is also important to understand that significant infrastructure and human capital are necessary to maintain these labs and develop broader AI ecosystems.

Challenges of AI in the Global South

Despite the opportunities presented by AI, several challenges remain. Infrastructure shortfalls plaguing countries in the Global South could hinder the development of AI in these regions, given the large amounts of data needed to train AI systems and the substantial computing resources that training consumes. Africa particularly struggles with internet access. In the last decade, internet penetration in Africa rose significantly, from 8% in 2011 to 36% in 2021. Limited internet penetration within Africa can largely be attributed to inadequate access to electricity and insufficient investment in crucial internet infrastructure components such as fiber optic cables, cellular towers, and base stations. According to data from the World Bank, 80.7% of the urban population in Sub-Saharan Africa is connected to electricity. Comparatively, South Asia has an urban electricity connectivity rate of 99.9%, and Latin America and the Caribbean have a rate of 99.5%. The Sub-Saharan figure drops to 30.4% in rural areas, compared to 98.3% in rural South Asia and 96.5% in rural Latin America and the Caribbean.

Concerns regarding the negative impacts of AI have led to a wide array of conversations regarding its use in domains such as healthcare, employment, and policing. However, many of these concerns have primarily been concentrated on the West, excluding perspectives on how AI may affect countries in the Global South. The Global South has also routinely become a destination for outsourced data labeling labor, with companies, such as Sama and Scale AI, relying on workers from this region. Recent coverage has highlighted the harms data workers and content moderators in East Africa and South Asia have faced when exposed to graphic content. And there are concerns that this exploitation could continue or even worsen. The lack of robust data protection and AI policies in the Global South could potentially lead to greater levels of misuse as AI grows in reach. Given that AI legislation is still in early development globally, there is an opportunity for countries in the Global South to circumvent the potential negative impacts of AI. To counter these harms, governments representing these countries must make a concerted effort to draft AI strategies and move towards enactment to protect vulnerable communities and enable responsible innovation. They must also be prominently represented at the table in multilateral global conversations on AI.

The road forward for AI in the Global South

The World Bank's estimates indicate that connecting the 100 million Africans residing in remote areas would necessitate an investment of at least $100 billion. Plans are underway to improve connectivity in the Global South, starting with the African continent. 2Africa, the longest subsea internet cable ever designed, is currently under deployment, with 46 connections to land-based networks across 33 countries in Africa, Asia, and Europe. While most of the world's estimated 485 in-service subsea internet cables are owned by large telecommunications companies, big tech companies including Amazon, Google, Meta, and Microsoft have increased their subsea development efforts, currently owning or co-owning around 30 cables, with many more in development. However, this growing interest comes with concerns about algorithmic colonization and the potential for the interests of large companies to override those of local communities, who may bear the brunt of harms imposed by technical systems.

As countries in the Global South begin to build their respective AI capabilities, investing in cloud computing infrastructure independent of external platforms such as AWS, Google Cloud, or Microsoft Azure can help manage high computing costs and ensure that data is stored in compliance with local laws and regulations. However, there will also be a need for governments to invest in building data and cloud computing centers and training the necessary personnel to maintain them.

Governments in the Global South must also build the capacity to train local researchers and developers to help expand their respective AI ecosystems. Since AI research and development has traditionally been concentrated in Western countries such as the United States, Canada, and the United Kingdom, much effort is needed to close this gap. Forging partnerships with external entities can help countries in the Global South speed up the progress of building local AI capacity. Countries such as Nigeria have established partnerships with Microsoft to help equip citizens with digital skills, while other efforts from Google have trained nearly eight million people in Latin America in digital skills since 2017. However, given the unstable political environments and economies in some regions of the Global South, these efforts could be hampered if brain drain continues to increase. It is essential that, as AI talent is trained within these countries, those professionals are encouraged to stay and contribute to local economies. It will also be vital that countries in the Global South build their respective AI capabilities by supporting the creation and maintenance of thriving AI ecosystems that promote entrepreneurship and support local innovation through research labs and digital hubs.

Additionally, governments in the Global South should integrate digital skills training into primary and secondary school curricula to help build a pipeline of researchers and developers in emerging technologies. Some early efforts have been seen in Kenya, where the government's Digital Economy Blueprint focuses on incorporating topics in computer literacy, ICT skills, coding, digital citizenship, and online safety into the K-12 curriculum. Within the rest of the Global South, plans to implement educational upskilling have been noted in digital transformation initiatives from Brazil, Costa Rica, India, Jamaica, Malaysia, Panama, Rwanda, and South Africa. However, given that many of these plans to improve digital skills training are still in development or early stages of implementation, there is much work to do to realize the potential of sustainable AI workforces in the Global South.

Finally, AI can deliver many other opportunities for countries in the Global South, ranging from more streamlined healthcare systems and improved access to education to expanded economic growth. PwC estimates that AI could contribute up to $15.7 trillion to the global economy by 2030. However, excluding China, only $1.7 trillion of this economic impact is expected to reach the Global South. This outlook is concerning given that the Global South holds a majority of the world's population, potentially limiting the economic advantages of AI to those already reaping the benefits of emerging technologies. Cautious approaches to building, implementing, and governing these technologies are needed. AI should not be seen as a panacea for solving development issues. Countries should invest equitably in building capacity for traditional social services while making efforts toward building sustainable AI ecosystems. To work towards these goals, countries in the Global South should continue participating in international collaboration and partnerships and leverage existing expertise within their respective countries. The ability of the Global South to innovate despite existing challenges will help define the role of the global majority as the world transitions into a digital future that AI will inevitably shape.
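For readers who want to check the arithmetic behind these projections, the share of PwC's estimated impact expected to reach the Global South works out to roughly a tenth. The short sketch below is our own back-of-the-envelope check using the figures cited above; the variable names are illustrative, not part of the PwC analysis.

```python
# Back-of-the-envelope check on the PwC figures cited above (in $ trillions).
global_total = 15.7   # projected AI contribution to the global economy by 2030
global_south = 1.7    # portion expected to reach the Global South (excluding China)

# The Global South's slice of the projected impact, despite holding
# a majority of the world's population:
share = global_south / global_total
print(f"{share:.1%}")  # prints roughly 10.8%
```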

Continue reading here:
AI in the Global South: Opportunities and challenges towards more ... - Brookings Institution


Lumentum to Acquire Cloud Light to Accelerate Data Center Speed and Scalability – Yahoo Finance

Combination expected to deliver a more than five-fold expansion in Lumentum's cloud intra-data center served opportunity

Served opportunity expected to grow at over 30% CAGR through 2028, driven by investments to support the rapid proliferation of Artificial Intelligence and Machine Learning (AI/ML) applications

Acquisition expected to be immediately accretive to non-GAAP earnings per share and to more than double Lumentum's cloud intra-data center infrastructure revenue in the 12-month period following the transaction close

Lumentum to host an investor call today, October 30, at 5:30 a.m. PT to discuss the transaction

SAN JOSE, Calif., October 30, 2023--(BUSINESS WIRE)--Lumentum Holdings Inc. (NASDAQ: LITE) ("Lumentum") and Cloud Light Technology Limited ("Cloud Light") today announced that they have entered into a definitive agreement under which Lumentum will acquire Cloud Light with a transaction value of approximately $750 million, subject to certain adjustments. At the time of closing, transaction consideration will be paid in cash and through the assumption and substitution of outstanding unvested Cloud Light options. The transaction has been unanimously approved by the Boards of Directors of both companies and by Cloud Light's shareholders.

The acquisition of Cloud Light is expected to accelerate Lumentum's push into the fastest growing segments of the multibillion-dollar opportunity for optical modules used in cloud computing data center infrastructure. Cloud Light has a demonstrated track record of developing and manufacturing the highest-speed connectivity solutions at the leading edge of new and rapidly growing technology transitions. Nearly all of Cloud Light's more than $200M in revenue over the last 12 months was derived from 400G or higher speed transceiver sales. In the most recent quarter, over half of Cloud Light's optical transceiver revenue was derived from 800G modules.

With this acquisition, Lumentum will be well-positioned to serve the growing needs of cloud and networking customers, particularly those focused on optimizing their data center infrastructure for the demands of AI/ML. Lumentum will be able to deliver immediate customer value with a more comprehensive product and technology portfolio, enabling customers to more effectively manage the escalating compute and interconnect requirements of AI workloads. The combination also brings best-in-class design and assembly, test, and packaging capabilities together with Lumentums global scale and customer reach.


"With Cloud Light, we are making a strategic investment to significantly expand our opportunities in the cloud data center and networking infrastructure space," said Alan Lowe, Lumentum president and CEO. "Cloud Light provides us with the highest speed transceiver solutions at scale and complements our advanced component capabilities. This results in a broad product and technology portfolio that addresses a wide range of cloud operator needs."

"We are confident that this transaction will deliver substantial, long-term value to our stockholders, with immediate earnings accretion and accelerated revenue growth. We look forward to welcoming Cloud Light's very talented team to Lumentum," concluded Mr. Lowe.

"Today's announcement is a pivotal milestone in the history of Cloud Light, and a testament to the hard work and dedication of our employees," said Dr. Dennis Tong, Cloud Light Founder and CEO. "We founded the company with a vision that our deep expertise in high-volume precision manufacturing would result in a superior value proposition for cloud data center customers. Having worked closely with the technology teams within leading cloud operators, we believe we can build upon our success to date and further accelerate cloud data center growth by combining Lumentum's advanced photonic integration and transmission technologies with our highly automated packaging and manufacturing processes. We look forward to joining the Lumentum team and beginning an exciting new chapter."

Compelling Strategic and Financial Benefits

Captures AI inflection with expanded intra-data center opportunity: The highly complementary combination squarely positions Lumentum as a leader in providing photonics to cloud operators, enabling more than a five-fold expansion in the company's served opportunity inside of data centers. With the advent of generative AI, cloud network needs for 400G and higher speed optical transceivers have accelerated rapidly, with the opportunity for these intra-data center products anticipated to grow at a 30% CAGR and exceed $10B by 2028. Cloud Light provides the highest speed optical transceiver products to leading hyperscale cloud customers, with even higher-speed solutions well along in development, complementing Lumentum's existing portfolio of laser transmitters and other integrated components for data center transceivers.
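The growth claim above can also be run backwards: if the intra-data-center opportunity exceeds $10B in 2028 after compounding at 30% per year, the implied market today is under $3B. The sketch below is our own inference from the release's figures; the 2023 baseline and the five-year window are assumptions, not numbers stated by Lumentum.

```python
def implied_base(future_value: float, cagr: float, years: int) -> float:
    """Discount a future market size back through `years` of compound growth."""
    return future_value / (1.0 + cagr) ** years

# $10B in 2028, at 30%/year compound growth, over an assumed
# five-year window starting in 2023:
base_2023 = implied_base(10.0, 0.30, 5)
print(round(base_2023, 2))  # roughly 2.69 ($B)
```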

Better equipped to address future customer roadmaps: AI models are driving an exponential increase in compute requirements, with performance now doubling every 3 to 4 months, compared to the historical doubling every two years under Moore's Law. Scaling workloads even higher is limited by existing network and interconnect bottlenecks. With the addition of Cloud Light, Lumentum better addresses these challenges both now and into the future by combining advanced optical interconnect technologies based on chip-scale photonic integration with highly automated packaging and manufacturing technologies.

More strategic partner to cloud and AI infrastructure customers: With Cloud Light, Lumentum can more effectively address the growing and diverse needs of cloud operators and AI infrastructure providers. This includes providing next-generation optical connectivity and optical switching within data centers, as well as coherent pluggable modules and optical line system components for data center interconnect applications. With Lumentum's vertical integration capabilities in components and diversity in its global manufacturing footprint, customers can benefit from improved security of supply and superior technology and cost roadmaps.

Amplifies edge and metro networking opportunities: Cloud Light's proven capabilities in low-cost product development and high-volume manufacturing are relevant not only for cloud data center and data center interconnect solutions, but also for network edge applications, including those served by direct detect and coherent tunable DWDM transceivers. A broader set of networking customers will benefit from the added advanced packaging technologies and manufacturing capabilities offered by this combination.

Immediately accretive to Lumentum's earnings and accelerates Lumentum's growth: The transaction is expected to be immediately accretive to Lumentum's non-GAAP earnings per share and is expected to more than double Lumentum's cloud data center infrastructure revenue in the 12-month period following the transaction close. In the last twelve months, over 90 percent of Cloud Light's revenue was derived from 400G and higher speed products, and in the most recent quarter, over half of Cloud Light's optical transceiver revenue was derived from 800G transceiver products.

Transaction Financing and Approvals

Lumentum intends to finance the transaction through cash from its balance sheet. The transaction is expected to close by the end of calendar 2023, subject to receipt of regulatory approvals and other customary closing conditions.

Advisors

BofA Securities served as the exclusive financial advisor to Lumentum and Wilson Sonsini Goodrich & Rosati, Professional Corporation served as legal advisor. Morgan, Lewis & Bockius LLP served as legal advisor to Cloud Light.

Transaction Conference Call

Lumentum will hold a conference call today, October 30, 2023, at 5:30 a.m. PT/8:30 a.m. ET to discuss this announcement. A live webcast of the call and the replay will be available on the Lumentum website at http://investor.lumentum.com. To listen to the live conference call, dial (888) 259-6580, (206) 962-3782, or (416) 764-8624 and reference the conference ID 95405221. Supporting materials for the call's presentation will be posted on http://investor.lumentum.com under the "Events and Presentations" section prior to the call.

A conference call replay will be available from October 30, 2023, at 11:30 a.m. ET through November 6, 2023, at 11:59 p.m. ET. To access the replay, dial (877) 674-7070 or (416) 764-8692 and reference the passcode 405221 #.

This press release is being furnished as an exhibit to a Current Report on Form 8-K filed with the Securities and Exchange Commission and will be available at http://www.sec.gov/.

About Lumentum

Lumentum (NASDAQ: LITE) is a market-leading designer and manufacturer of innovative optical and photonic products enabling optical networking and laser applications worldwide. Lumentum optical components and subsystems are part of virtually every type of telecom, enterprise, and data center network. Lumentum lasers enable advanced manufacturing techniques and diverse applications including next-generation 3D sensing capabilities. Lumentum is headquartered in San Jose, California with R&D, manufacturing, and sales offices worldwide. For more information, visit http://www.lumentum.com.

About Cloud Light

Cloud Light Technology Limited designs, markets, and manufactures advanced optical modules for automotive sensors and data center interconnect applications. The companys core team has over 18 years of experience in the design of advanced optical modules, with a rich heritage in advanced manufacturing and delivering superior quality and customer experience. Founded in 2018, Cloud Light is headquartered in Hong Kong, with R&D centers in Hong Kong and Taiwan and state-of-the-art manufacturing facilities in Dongguan, China and Southeast Asia. For more information, visit http://www.cloudlight.com.hk.

Forward-Looking Statements

This communication contains forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended. Forward-looking statements generally relate to future events, including the timing of the proposed transaction and other information related to the proposed transaction. In some cases, you can identify forward-looking statements because they contain words such as "may," "will," "should," "expects," "plans," "anticipates," "could," "intends," "target," "projects," "contemplates," "believes," "estimates," "predicts," "potential" or "continue" or the negative of these words or other similar terms or expressions that concern the proposed transaction and our expectations, strategy, plans or intentions regarding it. Forward-looking statements in this communication include, but are not limited to, (i) expectations regarding the timing, completion and expected benefits of the proposed transaction, (ii) expectations and beliefs with respect to customers, the opportunity and market that the combined company will serve, products and technologies of the combined company and future operations, and (iii) the expected impact of the proposed transaction on Lumentum's business and financial results. Expectations and beliefs regarding these matters may not materialize, and actual results in future periods are subject to risks and uncertainties that could cause actual results to differ materially from those projected.
These risks include the risk that the transaction may not be completed in a timely manner or at all; the ability to secure regulatory approvals on the terms expected in a timely manner or at all; the effect of the announcement or pendency of the transaction on our business relationships, results of operations and business generally; risks that the proposed transaction disrupts current plans and operations; the risk of litigation and/or regulatory actions related to the proposed transaction; changing supply and demand conditions in the industry; and general market, political, economic and business conditions. The forward-looking statements contained in this communication are also subject to other risks and uncertainties, including those more fully described in filings with the Securities and Exchange Commission, including Lumentum's Annual Report on Form 10-K for the fiscal year ended July 1, 2023 as well as other filings made by Lumentum with the SEC from time to time and available at http://www.sec.gov. These forward-looking statements are based on current expectations, and with regard to the proposed transaction, are based on Lumentum's current expectations, estimates and projections about the expected date of closing of the proposed transaction and the potential benefits thereof, its business and industry, management's beliefs and certain assumptions made by Lumentum, all of which are subject to change.

Lumentum undertakes no obligation to update the information contained in this communication or any other forward-looking statement.

Category: Financial

View source version on businesswire.com: https://www.businesswire.com/news/home/20231030099888/en/

Contacts

Investors: Kathy Ta, 408-750-3853, investor.relations@lumentum.com Media: Noël Bilodeau, 408-439-2140, media@lumentum.com or Hotwire Global, lumentum@hotwireglobal.com

Link:
Lumentum to Acquire Cloud Light to Accelerate Data Center Speed and Scalability - Yahoo Finance


Governor Gianforte Meets with President Tsai of Taiwan – the State of Montana NewsRoom

Governor's Office

TAIPEI, Taiwan – Governor Greg Gianforte today met with President Tsai Ing-wen of Taiwan to strengthen Montana's relationship with its longstanding partner and ally.

"Montana and Taiwan share nearly four decades of history as trading partners and friends, built on our common values of individual liberty, innovation, and free enterprise," Gov. Gianforte said. "It was an honor to meet with President Tsai this morning to convey our appreciation for this longstanding partnership and our optimism for the future."

Governor Gianforte meeting with President Tsai of Taiwan

President Tsai hosted Gov. Gianforte at the Presidential Office Building in Taipei for the meeting.

"I would like to begin by welcoming Governor Gianforte as he leads this economic and trade delegation. On behalf of the people of Taiwan, I extend our sincere gratitude to him for making Taiwan the first stop of his trip. I am also grateful to Governor Gianforte for reopening the State of Montana Asia Trade Office-Taiwan after taking office in 2021 to continue strengthening bilateral economic and trade cooperation through concrete action," President Tsai said.

President Tsai continued, "Since taking office, Governor Gianforte has also been keen to promote tax incentive and deregulation legislation. This has created more job opportunities and enhanced the investment climate."

During the meeting, the governor and president reflected on the Montana-Taiwan sister state relationship and discussed opportunities for economic collaboration.

Below is a transcript of President Tsai's remarks, translated:

I would like to begin by welcoming Governor Gianforte as he leads this economic and trade delegation. On behalf of the people of Taiwan, I extend our sincere gratitude to him for making Taiwan the first stop of his trip. I am also grateful to Governor Gianforte for reopening the State of Montana Asia Trade Office-Taiwan after taking office in 2021 to continue strengthening bilateral economic and trade cooperation through concrete action.

Taiwan and Montana have long enjoyed close cooperation in agriculture and tourism. Large numbers of Taiwanese tourists visit such famous attractions as Yellowstone National Park, and agricultural goods from Montana such as beef and wheat are very popular among domestic consumers.

Montana also boasts key US industry clusters in the photonics, optoelectronics, and optics sectors, and is actively developing its biotechnology, cloud computing, and aerospace industries. Since taking office, Governor Gianforte has also been keen to promote tax incentive and deregulation legislation. This has created more job opportunities and enhanced the investment climate.

Taiwan is Montana's seventh-largest trade partner. As the bilateral trade environment becomes even more favorable, Taiwan and Montana can continue to deepen partnerships in such key industries as photonics, optoelectronics, optics, and semiconductors to foster mutually beneficial development.

Taiwan and Montana also share the values of freedom and democracy. I want to thank Governor Gianforte, as well as the Montana Senate and House of Representatives, for passing Taiwan-friendly resolutions supporting our international participation. Taiwan will deepen collaboration with even more like-minded democratic partners to jointly address global challenges.

This delegation led by Governor Gianforte is a key step toward bolstering bilateral relations. We anticipate that, through the efforts of both sides, Taiwan and Montana will enjoy even broader and more far-reaching exchanges going forward.

And below is a transcript of Governor Gianforte's remarks:

President Tsai, thank you for taking the time to meet with us. It's an honor to be in Taiwan with you today. Beginning in 1985, with the signing of our sister state relationship, Montana has shared a strong bond with Taiwan. This bond has been forged through trade and educational exchanges and our shared values of freedom and free enterprise.

As you mentioned, our legislature in Montana reinforces this bond every two years with a joint resolution in support of Taiwan. In honoring that partnership, I proudly reopened the Montana Asian trade office in the heart of Taipei during my first year in office, ushering in new opportunities for the people of Taiwan and Montana. And now two years later, it's only fitting that I lead my first international trade mission as governor to this great country.

Montana is best known around the world for our beautiful vistas and wide-open spaces. Yellowstone and Glacier National Parks are both right in our backyard and we see millions of visitors every year. While our wide-open spaces are admired for their beauty, they also create excellent conditions for growing wheat and raising beef. Just a four-hour drive north of Yellowstone National Park sits Montana's Golden Triangle. There our farmers seed more than two million acres of wheat each year. Hot days and cool nights in the summer make for some of the finest wheat in the world. Earlier this year, we were honored to host a delegation of flour millers in Montana to showcase where and how we grow these crops.

While agriculture surely remains the bedrock of our trade relationship with Taiwan, we're also seeing rapid growth in other industries like education, bioscience, and photonics.

The Ministry of Foreign Affairs and the Ministry of Education have been funding two programs within the state of Montana. One, housed at the University of Montana within the Mansfield Center, will begin early next year and is dedicated to growing the Mandarin program for both high school and college students in Montana. The second, at Montana Tech and Minghsin University of Science and Technology, will focus on short-term exchanges related to international business particularly as it relates to the semiconductor workforce. Globally, it's expected that education will be a $7 trillion industry by 2025. Montana would like to be part of this, and our universities have set forth a serious agenda to create more connections internationally.

Montana ranks sixth nationally among states for bioscience industry growth over the last five years and fifth in the United States for growth in research and development. This means our bioscience industry is expanding and we're at the tip of the spear for cutting-edge research.

Montana also has one of the highest per capita concentrations of optics, photonics, and quantum computing companies in the United States. The 21st century will depend on photonics as much as the 20th century depended on electronics. Globally the photonics industry is expected to grow from $1.5 trillion to nearly $2 trillion by 2025. It is imperative that we work to build this industry for our state, and Taiwan is a key partner in this.

During our visit here to Taipei, we will sign a memorandum of understanding with the Taiwan-USA Industrial Cooperation Promotion Office and the Taiwan Photonics Industry and Technology Development Association to continue discussions and relationships begun by our state office.

I speak on behalf of our entire delegation, Madam President, when I say I'm filled with optimism. Optimism about the future of Montana and Taiwan, and excited about the possible extensions of our partnership. Again, on behalf of the first lady and myself, thank you for hosting us today.

Later, the governor joined members of the delegation for an appreciation luncheon with Taiwan's Deputy Minister of Foreign Affairs Roy Chun Lee, who supported Montana's trade mission.


Originally posted here:
Governor Gianforte Meets with President Tsai of Taiwan - the State of Montana NewsRoom


What Tools are Needed to Better Handle Cloud in Healthcare? – Healthcare IT Today

There are a lot of amazing things the cloud can do in healthcare. But as a newer technology, it still falls short in some areas. So as we look into the cloud and consider adding it to our organizations, we also need to look at additional tools that can help us better manage our cloud efforts in healthcare.

To get a starting point on what tools we need to get a better handle on the cloud in healthcare, we reached out to our wickedly smart Healthcare IT Today Community. The following are their recommendations for the necessary tools.

Vladimir Dabi, Tech Lead at Inviggo

Any Infrastructure-as-Code tools are very helpful for managing cloud infrastructure. These can be cloud-specific tools like AWS's CloudFormation or the AWS CDK, or open source tools like Terraform with an AWS/GCP/Azure provider. Using IaC with version control like Git gives you reproducible infrastructure at the click of a button, as well as a history of changes. To better handle cloud in healthcare, SaaS tools like Secureframe can be used. Secureframe connects to your cloud account and audits your infrastructure against a chosen compliance framework (SOC 2, HIPAA, GDPR, CCPA, etc.), which makes this process much faster.
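As an illustration of the IaC approach described above, a minimal Terraform configuration might look like the following sketch. The bucket name, region, and resource purpose are hypothetical; a real healthcare deployment would need far more hardening.

```hcl
# Hypothetical sketch: a version-controlled Terraform configuration.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}

# An S3 bucket for sensitive exports; the name is illustrative only.
resource "aws_s3_bucket" "phi_exports" {
  bucket = "example-phi-exports"
}

# Default server-side encryption, the kind of control a compliance
# audit (HIPAA, SOC 2) would expect to find in code.
resource "aws_s3_bucket_server_side_encryption_configuration" "phi_exports" {
  bucket = aws_s3_bucket.phi_exports.id
  rule {
    apply_server_side_encryption_by_default {
      sse_algorithm = "aws:kms"
    }
  }
}
```

Because this file lives in Git, every change to the infrastructure leaves an auditable history, which is exactly the property the compliance tooling relies on.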

Jay Ackerman, President & CEO at Reveleer

To better handle cloud in healthcare, several tools are needed. These tools collectively enable healthcare organizations to efficiently manage cloud resources, ensure security and compliance, optimize performance, and leverage advanced technologies for improved patient care and operational excellence. Some essential tools include:

Adopting cloud technology in healthcare brings both opportunities and challenges. While cost savings, scalability, and flexibility are among the benefits, considerations for data security, compliance, and proper migration strategies are essential. Healthcare organizations should assess their unique needs and risks when choosing between public, private, or multi-cloud models. They should also invest in appropriate tools to effectively manage and secure their cloud infrastructure.

Paul Brient, Chief Product Officer at athenahealth

In healthcare IT, the term "cloud" can encompass a range of solutions, but simply being in the cloud doesn't automatically translate to seamless interoperability, effortless and cost-free software updates, or anytime, anywhere access. These aspects are critical, especially as healthcare moves toward a value-based care model where connectivity is essential for delivering high-quality patient care and closing care gaps.

Said another way, simply moving your existing on-premises technology to the cloud doesn't unlock the full potential of cloud technology. The full benefits of cloud technology come when you transition to a true multi-tenant, single-instance SaaS solution. These systems are more flexible, more scalable, and lend themselves better to interoperability. When all users are on the same instance, updates are seamless, and importantly, integrations for interoperability only need to be built once for all customers. With on-premises technology that is migrated to the cloud, you still must manage upgrades and build point-to-point integrations.

True SaaS solutions also allow practices to harness the network effect of being part of a larger system, where the system can learn from all the usage of the system to benefit all users. So, for example, if a payer changes their rules and starts denying claims, this learning can be quickly incorporated to avoid denials across the network. Or if a care gap is closed for a patient at one practice, all other practices caring for that patient can see this immediately.

Finally, with the dramatic increase in AI capabilities, having all users on a common system allows for an extremely powerful AI training and deployment environment. AI models can learn from the work being done across all practices to streamline administrative tasks and greatly improve the user experience.

Doug Parent, CEO at RingRx

At RingRx, we utilize a combination of cloud infrastructure management tools to ensure efficient and secure operations. These tools provide centralized control over cloud resources, enabling us to monitor, manage, and optimize our cloud infrastructure. Additionally, we leverage specialized healthcare technology solutions for compliance, security, and patient communication, ensuring seamless integration within our cloud environment. To better handle cloud in healthcare, dedicated tools for data encryption, access control, audit logging, and HIPAA compliance verification are essential to maintaining the highest standards of data protection and privacy.

Sundar Shenbagam, Chief Technology Officer at Edifecs

Instead of focusing on tools, I would focus on approaches. The reason is that every major cloud vendor provides a plethora of tools to better understand usage and monitor details, plus APIs and tools to automate solutions. The first approach is to take advantage of the tooling and capabilities available in the cloud, like Kubernetes containers, to manage workloads rather than trying to manage them outside the cloud, which results in big maintenance headaches. Once we take advantage of such tools, the next step is to set up a monitoring and alerting framework using cloud vendor tools, and to use APIs and tools to resolve resource issues automatically by predicting upcoming failures. Never forget the human angle, since DevOps engineers are the ones who need to act on resolving issues when automation fails.

Matt Donahue, Chief Technology Officer at CloudWave

Managing cloud infrastructure in healthcare requires a range of tools. These include tools to gain visibility into the usage, consumption, and performance of cloud resources. Financial analysis tools are crucial for tracking and optimizing cloud spending, as unexpected costs can arise. Security tools, including user access provisioning and audit logs, are also essential to protect patient data and ensure regulatory compliance. Additionally, backup and recovery solutions in the cloud are important for data protection and disaster recovery, as they often offer cost-effective options compared to traditional backup systems.

Ajaya Loya, Senior Engineering Manager, Cloud Infrastructure & Security Team at LeanTaaS

Employing Infrastructure as Code (IaC), applying containerization strategies for hosting applications, ensuring visibility into cloud spending, and utilizing Cloud Security Posture Management (CSPM) tools for identifying misconfigurations are essential approaches to efficiently oversee cloud infrastructure, bolster security, and expedite innovation while enhancing the adaptability of change cycles.

So many great ideas here! Thank you to everyone who took the time to submit your expertise and thank you to everyone reading this article! We could not do this without your support. Let us know what tools you think are necessary to have a better handle on cloud in healthcare either in the comments down below or on social media. We'd love to hear from you!

Get Fresh Healthcare & IT Stories Delivered Daily

Join thousands of your healthcare & HealthIT peers who subscribe to our daily newsletter.

See more here:
What Tools are Needed to Better Handle Cloud in Healthcare? - Healthcare IT Today


New Webinar: 5 Must-Know Trends Impacting AppSec – The Hacker News

Oct 30, 2023 | The Hacker News | Webinar / Web App Security

Modern web app development relies on cloud infrastructure and containerization. These technologies scale on demand, handling millions of daily file transfers; it's almost impossible to imagine a world without them. However, they also introduce multiple attack vectors: exploits of file uploads when working with public clouds, vulnerabilities in containers hosting web applications, and many other persistent threats.

We surveyed organizations responsible for securing critical web applications used by healthcare, financial services, technology, and other critical infrastructure verticals to learn how they tackle the most destructive threats and summarized our findings in the OPSWAT 2023 State of Web Application Security Report. The survey report revealed that:

In this webinar, join our panel of web application security experts as they expand on the insights gathered while protecting the world's most critical applications.

Our experts will also share five must-know web application security insights, including:

Platforms like Microsoft Azure, Amazon Web Services, and Google Cloud Platform are ubiquitous for hosting web applications. However, embracing public cloud hosting without implementing the requisite security measures exposes applications to data breaches.

Despite significant advantages, containers may bring additional security risks. Malware or vulnerabilities hidden in containers hosting web applications can disrupt business, risk customer data, and lead to compliance violations.

You must check files for malware and sensitive data to prevent breaches and ensure compliance. Our panel will outline pitfalls and tools you can use to avoid costly and embarrassing data leaks.

Organizations must implement automated tools, services, and standards that enable teams to develop, secure, deploy, and operate applications safely.

Although most organizations are increasing their security budgets, most use only five or fewer AV engines to detect malicious files. Surprisingly, very few disarm files carrying potentially dangerous payloads using Content Disarm and Reconstruction (CDR) technology.

Join our panel of cybersecurity veterans Emo Gokay, Multi-Cloud Security Engineer at EY Technologies and George Prichici, VP of Products at OPSWAT, as they share insights and strategies gathered from the frontlines of securing critical infrastructure from advanced and persistent malware.

Register now to walk away with five key web application security insights and strategies.

More:
New Webinar: 5 Must-Know Trends Impacting AppSec - The Hacker News


AWS to offer Sovereign Cloud for the EU. Germany’s pleased, but will it go down as well elsewhere? – diginomica

AWS has become the latest Infrastructure-as-a-Service (IaaS) provider to announce that it's setting up a dedicated European Sovereign Cloud that will allow customers to keep all metadata they create in the EU - and help the US company to push even further into the European public sector market.

According to the firm, only EU-resident AWS employees who are located in the EU will have control of the operations and support for the AWS European Sovereign Cloud. The official announcement states:

Located and operated within Europe, the AWS European Sovereign Cloud will be physically and logically separate from existing AWS Regions, with the same security, availability, and performance of existing AWS Regions, giving customers additional choice to meet their data residency, operational autonomy, and resiliency needs. The AWS European Sovereign Cloud will launch with its first AWS Region in Germany and will be available to all European customers.

AWS currently has 102 Availability Zones across 32 geographic regions, and has plans to launch 15 more Availability Zones and five more AWS Regions in Canada, Germany, Malaysia, New Zealand, and Thailand. AWS infrastructure in Europe consists of eight AWS Regions in Frankfurt, Ireland, London, Milan, Paris, Stockholm, Spain, and Zurich.

But there's a need for the Sovereign offering, argued Max Peterson, VP of Sovereign Cloud at AWS, in a blog posting:

When we speak to public sector and regulated industry customers in Europe, they share how they are facing incredible complexity and changing dynamics with an evolving sovereignty landscape. Customers tell us they want to adopt the cloud, but are facing increasing regulatory scrutiny over data location, European operational autonomy, and resilience. We've learned that these customers are concerned that they will have to choose between the full power of AWS or feature-limited sovereign cloud solutions.

He added:

We're taking learnings from our deep engagements with European regulators and national cybersecurity authorities and applying them as we build the AWS European Sovereign Cloud, so that customers using the AWS European Sovereign Cloud can meet their data residency, operational autonomy, and resilience requirements. For example, we are looking forward to continuing to partner with Germany's Federal Office for Information Security (BSI).

The German citation is interesting. A recent article on the Politico website revealed that Germany's Federal Commissioner for Data Protection and Freedom of Information was questioning whether AWS cloud hosting was suitable for use in storing police data.

Still, AWS is clearly doing something right with the Sovereign Cloud announcement as it managed to get Claudia Plattner, President of the German Federal Office for Information Security (BSI) to chip in an obliging comment for the launch press release:

The development of a European AWS cloud will make it much easier for many public sector organizations and companies with high data security and data protection requirements to use AWS services. We are aware of the innovative power of modern cloud services, and we want to help make them securely available for Germany and Europe. The C5 (Cloud Computing Compliance Criteria Catalogue), which was developed by the BSI, has significantly shaped cybersecurity cloud standards, and AWS was, in fact, the first cloud service provider to receive the BSI's C5 attestation. In this respect, we are very pleased to constructively accompany the local development of an AWS cloud, which will also contribute to European sovereignty, in terms of security.

And Plattner is backed up by no less a personage than Dr. Markus Richter, CIO of the German federal government, Federal Ministry of the Interior, who declared:

This will give businesses and public sector organizations more choice in meeting digital sovereignty requirements. Cloud services are essential for the digitization of the public administration. With the German Administration Cloud Strategy and the EVB-IT Cloud contract standard, the foundations for cloud use in the public administration have been established. I am very pleased to work together with AWS to practically and collaboratively implement sovereignty in line with our cloud strategy.

Others in the EU may need more convincing. The Defence Committee of the Italian lower House of Parliament has recently been discussing the need for a national government cloud tied into the idea of that becoming a European government cloud.

AWS isnt the first IaaS provider to announce a Sovereign Cloud rollout. Oracle, for example, beat its rival to it with its own iteration in 2022, with the first two regions being located in Germany and Spain, with operations and support restricted to EU residents and specific EU legal entities.

Then there is the question of the UK. No longer part of the EU, but - for the time being - adhering to EU data protection standards, can organizations in Brexit Britain tap into an EU-dedicated Sovereign Cloud? AWS makes no reference to this in its blurb to date. It may again follow Oracle's lead here, where UK customers can indeed access the EU Sovereign Cloud.

And, of course, the announcement by AWS comes against the backdrop of the UK's Competition and Markets Authority (CMA) launching a formal investigation into alleged anti-competitive dominance of the country's £7.5 billion cloud services market, with AWS and Microsoft firmly in its sights. Given that one reason for the Sovereign Cloud is to make it easier for the public sector to adopt AWS as a provider, it's tempting to wonder how far this will be seen as a benefit by the CMA...

Excerpt from:
AWS to offer Sovereign Cloud for the EU. Germany's pleased, but will it go down as well elsewhere? - diginomica


Compare Ansible vs. Docker use cases and combinations – TechTarget

Ansible and Docker are popular tools for organizations looking to accelerate software releases via automation. While these tools couldn't be more different, they can work well together -- each accomplishing a different, important task for deploying and running software.

Docker enables you to run an application in a maintainable and repeatable manner. An admin can write a Dockerfile that stores all an application's commands and then build a Docker image and run it on most systems.

With Ansible's automation capabilities, admins can easily provision and configure servers. Ansible is developed in Python and can run on most machines to install updates to dependencies, configure software or orchestrate running applications.

Let's explore Ansible and Docker to see how these tools differ and how they can work together.

Supported by all cloud providers, Docker is often the path of least resistance for the deployment of an application to the cloud. Advantages include the ability to quickly get new containers up and running, the portability of a Docker image and the benefits of turning an application's environment configuration into a Dockerfile.

A Docker container doesn't load an OS, which means it can start running in seconds. Applications can scale quickly as well -- simply spin up more containers using Docker deployment tools, such as AWS Fargate, Google Cloud Run or Azure Container Instances. It's also possible to switch an app between cloud providers or even host the app on premises because Docker containers are easy to configure in cloud-hosted and local environments.

If you work with a Docker image, then your application's deployment is contained in a Dockerfile. A Dockerfile is written with commands to configure the application's running environment within the Docker container. Examples of simple commands in the Dockerfile include the COPY command and ENTRYPOINT command. The COPY command copies the application's executable file into the Docker container, while the ENTRYPOINT command runs the executable as the container's main process.
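As a sketch of the Dockerfile commands described above, the base image and jar path below are hypothetical; the point is simply how COPY and ENTRYPOINT fit together.

```dockerfile
# Hypothetical example of the two commands discussed in the text.
FROM eclipse-temurin:17-jre

# COPY places the application's executable file inside the container image.
COPY target/app.jar /opt/app/app.jar

# ENTRYPOINT runs that executable as the container's main process.
ENTRYPOINT ["java", "-jar", "/opt/app/app.jar"]
```

Building the image with `docker build -t myapp .` and running it with `docker run myapp` reproduces the same environment on any machine with Docker installed.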

Since these configurations are captured in code, they can be kept in a version control repository. GitHub, for example, maintains a history of all the changes to the application's Dockerfile. This enables developers to collaboratively maintain the application's running environment.

Enabling developers to create and maintain their application's running environment is critical to a team's DevOps efforts, which aim to break down the barrier between writing application code and running and operating an application.

Ansible automates processes, which can help organizations improve a wide range of efficiencies. For IT tasks, such as the management of developer machines, Ansible makes it possible to create and maintain the setup of a developer's machine in code.

To do this, Ansible uses playbooks, which contain plays. A play is the basic unit of Ansible execution and contains variables, roles and an ordered list of tasks that can repeatedly be run. For example, a play can contain the necessary steps to install and configure a custom Secure Sockets Layer/Transport Layer Security (SSL/TLS) certificate and can be executed on any machine that Secure Shell (SSH) can access.

A playbook could also provision a server to run an application. The benefit of using Ansible, much like Docker, is that keeping these configurations in code provides a working history of changes.

It's possible to use Ansible with Docker to manage more infrastructure. Docker containers offer a way to efficiently build an application, but they require a service to handle their configuration, deployment and orchestration. Ansible is useful for this -- an Ansible playbook can install and configure Docker on the host machine, install a Docker image and run that image as a service.
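A playbook along those lines might look like the following sketch. The host group, image name, and port mapping are hypothetical, and the Docker tasks assume the community.docker collection is installed.

```yaml
# Hypothetical playbook: install Docker, pull an image, run it as a service.
- name: Deploy a containerized app with Docker
  hosts: app_servers
  become: true
  tasks:
    - name: Install the Docker engine (package name varies by distro)
      ansible.builtin.package:
        name: docker.io
        state: present

    - name: Ensure the Docker daemon is running and enabled at boot
      ansible.builtin.service:
        name: docker
        state: started
        enabled: true

    - name: Pull the application image
      community.docker.docker_image:
        name: example/app
        tag: "1.0"
        source: pull

    - name: Run the image as an always-restarting service
      community.docker.docker_container:
        name: app
        image: example/app:1.0
        restart_policy: always
        published_ports:
          - "8080:8080"
```

Running `ansible-playbook -i inventory deploy.yml` then brings any machine in the `app_servers` group to the same state, with the change history living in version control.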

Cost and customization are the advantages of controlling an application's Docker runtime setup and configuration. Other container tools, such as Fargate, are more costly than running a basic server on premises. When you create your own Ansible playbooks, you also get fuller configurability.

Ansible has a huge advantage over provisioning machines by hand for organizations that don't run applications in the cloud. Using cloud-based tools, such as Fargate, can be compelling, but Ansible and Docker can provide a company with complete control of the server on which their application runs.

Here is the original post:
Compare Ansible vs. Docker use cases and combinations - TechTarget


Top Tech Conferences in November 2023 – Spiceworks News and Insights

The season of technology conferences is almost over. November will be relatively less crammed in terms of the number of important events in the month. Nevertheless, there are a few learning-oriented conferences that the tech community can take advantage of before easing off into the holiday season.

Spiceworks News & Insights lists its top five curated conferences to attend this November. Here's what November 2023 holds in store.

This two-day affair caters to system admins, developers, managers, and DevOps practitioners and centers around modern software delivery, especially in the context of cloud applications, open source, and everything in between.

The conference will feature 16 talk sessions delivered by 21 industry experts, who will touch on topics such as security in today's GitOps environment, how ChatGPT can help with observability, the impact of AI on DevSecOps, and more.

Location: AXA Auditorium, Barcelona, Spain

Date(s): November 9-10, 2023

Attendance: Offline; paid registration required

Weeks before the event, Web Summit announced a new CEO: former Wikimedia Foundation CEO Katherine Maher. The executive, who also chairs the board of privacy-focused messaging app Signal, said, "In recent weeks, Web Summit has been at the center of the conversation, rather than the host. Its purpose was overshadowed by the personal comments of the event's founder and former CEO, Paddy Cosgrave."

Cosgrave's comments on the conflict in Israel-Gaza may have introduced a hiccup in the run-up to the premier event, expected to host 70,000 people in mid-November. However, the event, now in its 15th year, expects smooth sailing. Google, Meta, Amazon, Intel, Siemens, and others said they would not attend this year's Web Summit.

With an expected attendance of more than 1,000 speakers, 800 investors, 2,600+ startups, and over 300 partners, Web Summit 2023 will create networking, learning, awareness, lead generation, and investment opportunities.

Location: Altice Arena & Fil, Lisbon, Portugal

Date(s): November 13-16, 2023

Attendance: Offline; paid registration required

The most significant announcement Microsoft can make right now is the introduction of its own artificial intelligence (AI) chip. According to a report from The Information, that's exactly what Microsoft will do at its annual conference for developers and IT pros.

Microsoft has made significant investments in AI over the years, including a total of $11 billion in ChatGPT and Dall-E creator OpenAI. The company is rapidly incorporating AI into its products, such as its flagship operating system, Windows 11, productivity suite Microsoft 365, and more. So, pushing to build prowess for an in-house dedicated AI silicon, something that has propelled NVIDIA to over $1 trillion in market capitalization, is a no-brainer.

Expect AI-driven advancements in Copilot, product and functionality demos, and the company's vision for the future.

Microsoft Ignite 2023 tickets sold out last month, but you can still catch the keynotes and other talks online by registering with the company.

Location: Seattle Convention Center | Summit, Seattle, WA

Date(s): November 14-17, 2023

Attendance: Offline; paid registration required; Free registration for online sessions (Nov 15-18)

See More: The Path Forward: A Recap of Gartner IT Symposium/Xpo 2023

AWS re:Invent serves as the biggest cloud computing company's annual gateway to showcase its latest cloud innovations to the industry and to let IT pros and decision-makers meet with AWS experts and, of course, each other.

The conference goes beyond breakout sessions, featuring Bootcamps, AWS Builder Labs, and more (totaling 2,000+ technical sessions) for hands-on learning experiences.

Location: Multiple venues in Las Vegas, NV

Date(s): November 27 - December 1, 2023

Attendance: Paid registration required for offline attendance; Free online streaming

The London edition of the international technology conferences and expo world series organized by TechEx will be held over two days in November-December 2023. TechEx will be managing the following co-located events:

Location: Olympia, London, U.K.

Date(s): November 30 - December 1, 2023

Attendance: Offline with paid and free passes allowing different levels of access

Which of these events are you planning to attend? Share with us on LinkedIn, X, or Facebook. We'd love to hear from you!

Image source: Shutterstock

Read more:
Top Tech Conferences in November 2023 - Spiceworks News and Insights


5 cloud asset management best practices to optimize performance – TechTarget

User dissatisfaction with IT asset management stems largely from the difficulties associated with identifying and tracking assets that aren't subject to central procurement or readily available for physical inventory. Most cloud users know these challenges exist for cloud assets, as well.

Many cloud services reduce the ability of central asset management to see all the assets that support cloud connection and use. This obscurity leads to a key principle of cloud asset management: Traditional asset tracking must be magnified and enhanced for cloud use.

The general goals of IT asset management (ITAM) systems include the following:

Utilization efficiency has increasingly broadened to include cost management versus quality of experience, which links asset management to performance. This shift is particularly important for cloud assets because of their elastic -- and somewhat ephemeral -- nature. However, the performance efficiency of cloud assets demands a new look at asset tracking.

It's possible for midsize companies to use simple tools, such as a desktop spreadsheet or database program, to organize ITAM data if their assets are on premises. However, that approach is problematic for cloud assets because of their ephemeral nature. Companies instead need to discover assets from logs, a process that often introduces transcription errors.

No matter what tools enterprises use, the right approach to cloud asset management is essential for optimized performance.

Enterprises can't tag a cloud asset, inspect it for inventory or read serial numbers. But cloud assets dependably come with bills. Cloud asset management starts with the centralized collection of all cloud billing.

Remove charges for management features and other non-hosting elements. Leave charges for hosting application components or databases, as well as cloud feature usage directly linked to these elements. This set of charges is the baseline for cloud asset management, so it's important to capture the explicit resource name and code, the nature of the resource, and the organization and individual responsible for ordering it.
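The filtering step above can be sketched as follows. The field names (resource_name, charge_type, owner) are illustrative assumptions, not any provider's actual billing schema.

```python
# Charge types we treat as hosting application components or databases;
# everything else (management features, support plans) is excluded.
HOSTING_CHARGE_TYPES = {"compute", "database", "storage"}

def billing_baseline(records):
    """Keep only hosting-related charges and the fields needed for tracking."""
    baseline = []
    for rec in records:
        if rec["charge_type"] not in HOSTING_CHARGE_TYPES:
            continue  # e.g. management consoles, non-hosting fees
        baseline.append({
            "resource_name": rec["resource_name"],
            "resource_code": rec["resource_code"],
            "nature": rec["charge_type"],
            "owner": rec["owner"],
            "cost": rec["cost"],
        })
    return baseline

records = [
    {"resource_name": "vm-web-01", "resource_code": "i-123",
     "charge_type": "compute", "owner": "team-a", "cost": 42.0},
    {"resource_name": "console", "resource_code": "mgmt-1",
     "charge_type": "management", "owner": "it", "cost": 5.0},
]
print(billing_baseline(records))
```

The point of the baseline is that every surviving record names the resource, its nature and the responsible owner, which later steps depend on.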

The next step is primary governance and security screening of cloud assets. Companies should validate the features of all cloud services against their security and governance policies to ensure the services don't contain features that could compromise policies.

If enterprises haven't done this step already, they should perform it before undertaking other cloud asset management tasks. At this point, the goal is to assess the services against general governance and security policies. Address application-specific issues later, as described below.
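The general screening pass can be modeled as checking each service's advertised features against a policy deny-list. The feature names and the deny-list here are hypothetical examples, not real policy items.

```python
# Features that would compromise governance or security policy (assumed).
PROHIBITED_FEATURES = {"public-bucket-default", "unencrypted-at-rest"}

def screen_service(service):
    """Return any policy violations for a single cloud service."""
    violations = set(service["features"]) & PROHIBITED_FEATURES
    return sorted(violations)

svc = {"name": "object-store", "features": ["versioning", "unencrypted-at-rest"]}
print(screen_service(svc))  # ['unencrypted-at-rest']
```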

While cloud teams can audit cloud costs now, it's probably better to do so when it's possible to make associations between billed costs and applications.

Next, associate cloud assets with application deployments. This step is particularly important for performance optimization.

A bill records the use of a cloud asset, while a log records the user. In some cases, cloud deployment tool logs and scripts -- such as those from Kubernetes or DevOps pipelines -- might be required to link application components to specific cloud assets and their costs. This association helps assess how resources are actually used, as it links them to specific applications and users. That association then links cloud usage monitoring with application performance monitoring (APM).
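In essence, this step is a join between billing records and deployment logs on a shared resource identifier. A minimal sketch, assuming both sources carry a common resource_code field:

```python
def associate(bills, log_entries):
    """Attach to each billed resource the application that used it."""
    resource_to_app = {e["resource_code"]: e["app"] for e in log_entries}
    return [
        # Resources with no matching log entry are flagged for follow-up.
        {**b, "app": resource_to_app.get(b["resource_code"], "unattributed")}
        for b in bills
    ]

bills = [{"resource_code": "i-123", "cost": 42.0},
         {"resource_code": "db-9", "cost": 13.5}]
log_entries = [{"resource_code": "i-123", "app": "checkout"}]
linked = associate(bills, log_entries)
print(linked)
```

The "unattributed" bucket is useful in its own right: charges that cannot be tied to any application are exactly the ones asset management should investigate first.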

For the remainder of cloud asset management processes, the application link to the cloud establishes tracking and performance baselines.

The log connection is the basis for managing licenses and application costs. For each application license, the link to cloud resources and assets indicates a use or deployment that might then be regulated under software license terms. This applies not only to applications but to any middleware customers deploy in the cloud.
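Once deployments are attributable to products, license exposure reduces to comparing observed deployment counts against entitlements. A sketch, with illustrative product names:

```python
from collections import Counter

def license_exposure(deployments, entitlements):
    """Return products whose cloud deployments exceed licensed counts."""
    in_use = Counter(d["product"] for d in deployments)
    return {p: in_use[p] - entitlements.get(p, 0)
            for p in in_use if in_use[p] > entitlements.get(p, 0)}

deployments = [{"product": "dbms-x"}, {"product": "dbms-x"}, {"product": "mq-y"}]
entitlements = {"dbms-x": 1, "mq-y": 5}
print(license_exposure(deployments, entitlements))  # {'dbms-x': 1}
```

The same check applies to middleware customers deploy in the cloud, not just end-user applications.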

At this stage, enterprises should conduct application-specific security and governance reviews, as connections between cloud assets and applications are now auditable. It's also practical to assess the pricing and billing policies now, because the link to the application is a link to the business case that justifies it.

In hybrid cloud applications, this step reveals a link between the cloud and data center for specific applications. That then requires cloud asset management practices to interlock with ITAM practices to gain complete asset visibility.

Performance optimization requires enterprises to assess cloud resource costs versus application performance levels. This step requires time-correlation of billing for cloud resources and application performance levels. Teams can derive the former from the cloud bill-to-application associations and the latter from APM tools. If ITAM procedures manage data center assets, that data can be correlated with cloud data via the common application linkage.
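Once cost and performance series are aligned on the same time buckets, the correlation itself is straightforward. The sketch below uses Pearson correlation on hourly cost (from the bill-to-application association) and p95 latency (from APM); the numbers are purely illustrative.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hourly cloud cost vs. hourly p95 latency for one application (illustrative).
cost = [10, 12, 15, 22, 30]
latency_ms = [120, 118, 125, 140, 170]
print(round(pearson(cost, latency_ms), 2))  # 0.98 -- cost rises with latency
```

A strong positive correlation like this suggests spending is tracking load rather than buying headroom, which is the kind of signal that feeds the cost-versus-quality-of-experience analysis described above.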

This is a monitoring and data analysis/correlation task, with the aim to augment traditional cloud logging and APM tools with AI tools for analysis. AI is a technology, not a goal, so it's critical to ensure that the selected tools fit the mission optimally.

The cloud is a target of performance analysis, so it could be a candidate for hosting some or all components of asset management. This approach has its benefits and challenges, beyond the traditional ones of comparing cost and features with what's available for use on premises.

Cloud tools are often tailored to the provider that offers them, and they're more difficult to adapt to multi-cloud or hybrid cloud environments. This point is particularly important for asset management applications but is often missed when enterprises consider options.

No single tool can fit a typical enterprise's cloud asset management challenge. A combination of tools and strong procedural controls is essential. Relying on a tool or tools without those controls is almost surely going to lead to performance, cost and governance problems.
