Category Archives: Cloud Hosting
InterSystems cloud integration service comes to NZ and more briefs – Healthcare IT News
NZ launch of InterSystems' cloud-based integration engine
InterSystems has launched in New Zealand its managed integration engine cloud service, HealthShare Health Connect Cloud.
The platform-as-a-service solution streamlines interoperability and data integration between clinical systems and applications. It also manages cloud migration infrastructure, enabling providers to "more quickly and easily connect distributed datasets, efficiently move workloads and information across cloud solutions and on-premise services."
The company introduced the data integration engine to the country following increased demand for healthcare interoperability, particularly the HL7 FHIR data sharing standard.
WA Health extends cloud contract with Atos
Atos will continue delivering managed services, hybrid cloud, and core infrastructure services to Health Support Services (HSS), the ICT service provider for WA Health, following a contract extension.
The French IT company was recently awarded a contract worth A$242 million ($159 million) to continue supporting the digitalisation of the WA public health system for the next five years.
It has been working with HSS on the digital transformation of WA Health since 2019. That initial contract was for the delivery of private cloud, managed public cloud, hybrid cloud orchestration, co-location and managed services for over 2,000 servers, hosting of over 1,000 applications, and fully managed public cloud services utilising three hyperscalers.
Victoria to complete Altera EMR rollout across Gippsland
The state government of Victoria has committed to finishing the deployment of an integrated EMR system across the rural region of Gippsland.
It recently announced A$12 million in funding over four years to support health services in the region to upgrade their systems. This includes the rollout of Altera Digital Health's Sunrise EMR to remaining health services under the Gippsland Health Alliance (GHA): Gippsland Southern Health Service, Kooweerup Regional Health Service, Omeo District Health, Orbost Regional Health, South Gippsland Hospital, and Yarram and District Health Service.
This comes almost a year after the GHA rolled out the EMR solution across major regional and subregional health services under its wing.
The GHA implementation, Altera claims, is the widest EMR deployment on Microsoft Azure in Australia.
First go-live of cloud-based chemo prescribing solution in SA
Altera has also started delivering an enterprise chemotherapy prescribing system to South Australian hospitals, with the first go-live at Lyell McEwin Hospital and Mount Gambier and Districts Health Service.
In 2021, the company, which exclusively distributes the iQemo system in Australia and New Zealand, won the competitive tender to deliver the end-to-end solution, which includes predefined regimens, prescribing, scheduling and dispensing, and chemotherapy administration and reporting.
A rollout across regional hospitals and health centres is now underway, with implementation at major sites such as Royal Adelaide and Flinders hospitals set for the first quarter of 2024.
Cubiko releases MyMedicare module
GP intelligence platform Cubiko has introduced a new module to support clinics in their implementation of MyMedicare.
MyMedicare is a voluntary patient registration model launched by the Australian government as part of broader efforts to strengthen the Medicare insurance scheme.
In helping "formalise" the connection between patients and their GPs, the free MyMedicare module on Cubiko allows practices to directly import CSV lists of registered patients from MyMedicare in theHealth Professional Online Services,doing away with the administrative burden of registering patients to their clinics manually. It also provides actionable insights into various patient cohorts.
Additionally, the module, which is available on the Cubiko Assist Dashboard, has tools that aid in identifying and enrolling patients with upcoming appointments who are eligible but not yet registered for MyMedicare.
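As a rough illustration of that workflow (not Cubiko's actual implementation, and not the real HPOS/MyMedicare export schema), the sketch below reads a hypothetical CSV of registered patients and flags patients with upcoming appointments who are not yet on the list:

```python
import csv

# Hypothetical example only: file names and column headers are assumptions,
# not the actual MyMedicare export format or any Cubiko API.
registered = set()
with open("mymedicare_registered.csv", newline="") as f:
    for row in csv.DictReader(f):
        registered.add(row["patient_id"])

# Cross-check an assumed appointment export to find eligible,
# not-yet-registered patients with upcoming appointments.
with open("upcoming_appointments.csv", newline="") as f:
    unregistered = [row for row in csv.DictReader(f)
                    if row["patient_id"] not in registered]

print(f"{len(unregistered)} upcoming patients are not yet registered for MyMedicare")
```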
More:
InterSystems cloud integration service comes to NZ and more briefs - Healthcare IT News
Offline backups are a key part of a ransomware protection plan – TechTarget
Ransomware is a major threat today, and it can be particularly harmful when it targets data backups. Offline backups are one method IT administrators lean on to protect against ransomware.
Offline backups are stored on an isolated storage infrastructure that is disconnected from production applications and infrastructure, as well as from the primary backup environment. The result is an air-gapped backup copy that businesses can use for recovery in the event that the primary backup copy becomes compromised.
Historically, an offline backup environment would be a good fit for data that requires less frequent access, such as long-term retention data, and data that is less business-critical. However, the simultaneous rise of cyber attacks and introduction of data privacy legislation have led to an increase in offline backups for mission-critical, frequently accessed data.
While offline backups are an effective ransomware protection option, implementing them is a complex process, and there are numerous paths to get there. Before deciding to use offline backups for ransomware protection, organizations must consider some key factors: the backup method's practicality, cost, effectiveness and ability to meet recovery objectives.
The longstanding approach to creating an offline backup environment is shipping backup copies to an off-site, disconnected tape storage location.
The problem with this approach is that today's IT operations teams are understaffed and significantly strapped for time, particularly in the area of cybersecurity. Many simply do not have the cycles to deploy and manage yet another infrastructure -- especially considering that the isolated infrastructure will require manual software updates to avoid security vulnerabilities.
A potential pitfall of these alternatives is infiltration of the isolated environment. As a result, the environment must be closely audited for network isolation, control over when the network connection is open, and role-based access to and control over the network and vault environment.
In addition, IT operations staff must look for an option that has data immutability and indelibility. Immutability renders the backup copy read-only, so no one can make unapproved changes to the data. Indelibility inhibits the backup copy from being deleted before the conclusion of a dedicated hold period. These safeguards help protect against data exfiltration and corruption in the event that a malicious actor is able to access the isolated environment.
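As one hedged illustration of how immutability and indelibility can be enforced in practice: object storage services such as Amazon S3 expose both through Object Lock. The minimal boto3 sketch below writes a backup copy in compliance mode with a fixed retention date; the bucket name, key and local file are placeholders, and a real vaulting workflow would involve far more than this.

```python
from datetime import datetime, timezone
import boto3

s3 = boto3.client("s3")

# The vault bucket must be created with Object Lock enabled; the name is a placeholder.
s3.create_bucket(
    Bucket="example-backup-vault",
    ObjectLockEnabledForBucket=True,
)

# Write a backup copy that is read-only (immutable) and cannot be deleted
# (indelible) until the retention date passes, even by privileged users.
with open("2023-11-25-full.bak", "rb") as backup_file:
    s3.put_object(
        Bucket="example-backup-vault",
        Key="backups/2023-11-25-full.bak",
        Body=backup_file,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime(2024, 11, 25, tzinfo=timezone.utc),
    )
```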
For any implementation, admins must consider the backup window. They must know how long it will take to complete the backups, as well as any potential lags or gaps between backup jobs. This fundamentally affects the business's ability to meet required recovery points.
Also important to factor in is the required recovery time. Both the backup window and recovery time are largely dependent on the frequency and size of backup jobs, as well as how much data the organization backs up.
New options are emerging that offer operational isolation, such as hosting the data off site in the cloud or through a service provider. These methods require a network connection to production-facing portions of the environment in order to transfer the backup copy to the isolated environment.
There are a few drawbacks to using the cloud for offline data backups. Since it is isolated, but not completely offline like tape libraries, the cloud is easier for a ransomware attack to reach.
In addition, any cloud-hosted option is potentially subject to egress fees when data is recovered. This is important for IT operations staff to be aware of upfront because it is potentially a very expensive factor to overlook.
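A back-of-the-envelope example makes the point; the per-gigabyte rate below is an assumed figure for illustration, not any provider's published pricing.

```python
# Hypothetical egress-cost estimate for a full restore from cloud-hosted backups.
restore_size_gb = 50_000       # assume a 50 TB recovery
egress_rate_per_gb = 0.09      # assumed USD rate; actual pricing varies by provider and tier

estimated_egress_cost = restore_size_gb * egress_rate_per_gb
print(f"Estimated egress cost: ${estimated_egress_cost:,.2f}")  # ~$4,500 under these assumptions
```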
Krista Macomber, senior analyst at Futurum Group, writes about data protection and management for TechTarget's Data Backup site. She previously worked at Storage Switzerland and led market intelligence initiatives for TechTarget.
Go here to read the rest:
Offline backups are a key part of a ransomware protection plan - TechTarget
Which Cloud Stocks are Benefitting from the AI Boom? – Opto RoW
WisdomTree's Cloud Computing Fund appears not to have capitalised on the AI rally this year, despite cloud models being fundamental to AI programs. However, two-thirds of the global cloud infrastructure market is accounted for by three Magnificent Seven companies: Google, Microsoft and Amazon. These companies are already shaping themselves for the next generation of cloud and AI computing.
However, market leaders like Amazon have gained 74% year-to-date.
Cloud providers are partnering with AI firms, and preparing for quantum computing.
"Cloud software companies have not participated in 2023's AI rally," wrote Christopher Gannatti, Global Head of Research at WisdomTree, in a 15 November report entitled "How Will You Use AI: Cloud Software".
Despite the fact that most AI applications run on cloud platforms, WisdomTree's Cloud Computing Fund [WCLD], which tracks an index of emerging public companies focused on delivering cloud-based software to customers, has lagged behind the Nasdaq 100 over the past year.
This could suggest a future bounce in the fund, assuming that the underlying thesis that cloud is fundamental to AI applications is correct, and that the market is yet to price this in. However, there are other possible reasons for cloud stocks' underperformance.
One is that emerging cloud stocks are expensive by traditional valuation metrics (such as dividends and earnings), and are likely to remain so because of their large expected growth trajectories. Another is that cloud stocks appear to be highly rate-sensitive. On one level, this reflects the fact that cloud stocks tend to be growth stocks; high interest rates are a headwind because they discount the future returns of these stocks.
However, cloud companies seem to be especially sensitive: the WCLD fund's price-to-sales ratio recently fell below that of the Russell 1000 Growth Index.
Gannatti is not discouraged by this fact, though, given that the fund's projected sales growth far outweighs that of the growth index.
Gannatti's hypothesis, however, relies on the notion that cloud-based computing underpins AI applications. While this may be true, there is a disconnect between the cloud-hosted products from which WCLD's holdings derive the bulk of their revenue, and the cloud providers that actually host other applications, such as AI. Those companies have already seen large gains, and more could be on the way.
Research consultancy Gartner is of the opinion that generative AI is among the technologies that could push public cloud spend to $678.8bn in 2024. This marks a 20.4% year-over-year increase from the $563.6bn forecast for 2023.
The highest-growth segment of the cloud market is forecast to be infrastructure-as-a-service (IaaS), which is expected to grow 26.6%, followed by application platform-as-a-service (PaaS) which is expected to grow by 21.5%.
Synergy Research Group puts global cloud infrastructure spending at $68bn in Q3, up approximately 18% year-over-year. The market research firm contends that the increased demand for cloud infrastructure services brought about by AI is helping to overcome other headwinds the sector is facing, such as challenging economic and political situations.
According to Synergy, Microsoft [MSFT] increased its share of the global cloud-infrastructure market by nearly two percentage points to 11% in the third quarter (Q3), while market leader Amazon [AMZN] stayed within its long-standing market share band of 32–34%, though towards the bottom end of that range.
Between them, Amazon, Microsoft and Google [GOOGL] accounted for 66% of the global market in Q3. The tier-two providers that had the highest year-on-year growth rates were Oracle [ORCL], Snowflake [SNOW], MongoDB [MDB], VMware [VMW], Huawei and China Telecom [0728:HK].
Generative AI poses significant challenges for cloud providers. The largest among them (Microsoft, Alphabet's Google and Amazon) are ramping up their capital spend to ensure that they can offer sufficient capacity, the Financial Times reported in early November. Between them, the companies' capital spend totalled $32bn in the three months to September, nearly 50% more than the same period in 2020 and 10% more than in the quarter to June.
This extra capacity is enabling the major cloud providers to form groundbreaking partnerships that centre on the potential of generative AI.
In October, Google Cloud announced a partnership with Moody's [MCO] to develop AI systems for financial experts. While the partnership does not explicitly cite the use of cloud-based systems, its three key goals are likely to benefit from Google's role in the segment: to create specialised large language models for financial analysis, to provide access to Moody's data via Google's data warehouse BigQuery, and to enable search of Moody's financial data using Google's Vertex AI.
Elsewhere, data security and management provider Cohesity announced a partnership with AWS for its Turing generative AI features on 13 November. This will aim to improve data management, security and analysis; enrich data interaction and learning; and enable dynamic training of Turing via retrieval augmented generation.
While AI is adding to the demands on cloud providers, quantum computing, which requires large amounts of cloud power, has the potential to multiply those pressures in the not-so-distant future. This, in turn, may generate a growth area.
Poised to take advantage of AI- and quantum-related demand, a consortium of French academic and industrial researchers dubbed the CLUSSTER project is developing a cloud service tailored towards AI, high-performance computing (HPC) and quantum computing.
The system will be available for HPC and AI from 2024, and will later be expanded to incorporate quantum computing.
Microsoft has already made its own move into the space, with the launch of its Integrated Hybrid feature for Azure Quantum. "Hybrid quantum applications with a mix of classical and quantum code together will empower today's quantum innovators to create a new class of algorithms," said Microsoft's press release.
The three predominant cloud service providers (Amazon, Microsoft and Google) are all among the so-called Magnificent Seven stocks that have seen outsized gains in the past year thanks to the AI boom.
These three companies can be directly linked to the AI programs underlying this trend: Anthropic hosts mainly on Amazon's AWS, OpenAI hosts its applications such as ChatGPT on Microsoft's Azure, while Midjourney is hosted on Google Cloud.
The stocks have gained 74%, 58.8% and 54.4% respectively in the year to 21 November. In the same period, WCLD gained 22.4%. WCLD's top three holdings as of 21 November are Crowdstrike [CRWD], ZScaler [ZS] and Shopify [SHOP]. Their share prices gained 98.48%, 71.01% and 100.89% in the same time period.
Disclaimer: Past performance is not a reliable indicator of future results.
CMC Markets is an execution-only service provider. The material (whether or not it states any opinions) is for general information purposes only, and does not take into account your personal circumstances or objectives. Nothing in this material is (or should be considered to be) financial, investment or other advice on which reliance should be placed. No opinion given in the material constitutes a recommendation by CMC Markets or the author that any particular investment, security, transaction or investment strategy is suitable for any specific person.
The material has not been prepared in accordance with legal requirements designed to promote the independence of investment research. Although we are not specifically prevented from dealing before providing this material, we do not seek to take advantage of the material prior to its dissemination.
CMC Markets does not endorse or offer opinion on the trading strategies used by the author. Their trading strategies do not guarantee any return and CMC Markets shall not be held responsible for any loss that you may incur, either directly or indirectly, arising from any investment based on any information contained herein.
*Tax treatment depends on individual circumstances and can change or may differ in a jurisdiction other than the UK.
Continued here:
Which Cloud Stocks are Benefitting from the AI Boom? - Opto RoW
The Cloud County Ground to host two Middlesex T20s in 2024 – Essex Cricket
Posted on Thursday 23 November 2023
The Cloud County Ground will host two of Middlesex's home Vitality Blast matches in 2024.
Lord's will remain Middlesex's main ground, hosting four of their seven home South Group matches, leaving three to be played at other venues.
In recent seasons, both Radlett Cricket Club and Merchant Taylors' School have been Middlesex's preferred home out-grounds; however, in 2024 the decision has been taken to play two matches at Chelmsford.
Fixture information will be revealed in line with the full fixture announcement at 10am on Thursday.
Ticket information for the two Chelmsford fixtures will be announced in due course.
Speaking of today's announcement, Essex Chief Executive Officer, John Stephenson, commented:
This is a fast-moving initiative, but we believe it will be of mutual benefit to both counties as it allows the Club to grow our revenue at a challenging time and make best use of our primary asset, the ground.
It was excellent to hear that the Middlesex squad and staff are looking forward to playing at Chelmsford, and we are excited to welcome them here for two Vitality Blast fixtures next year.
Middlesex Chief Executive Officer, Andrew Cornish, added:
We have gone to lengths to be transparent and open with our members when discussing the financial position the Club is in, and moving forwards we need to continue to take every step we can to ensure we remain rigorous in our control of the Club's costs.
The cost of setting up the infrastructure of an out-ground venue is a significant liability the Club has historically had to factor into our financial model every year, increasingly so in recent seasons with the enhancements we have made to the Member experience at out-ground matches.
As we continue to scrutinise every cost the Club incurs, out-ground set-up costs stand out as an area which we could make a significant positive impact on.
We have, as a result, been in discussion with our friends at Essex, who have been very receptive to the idea of hosting us at The Cloud County Ground for two of our Blast matches this year.
For Middlesex, this represents an opportunity for us to make a significant financial uplift to the Club, which is something that we simply cannot ignore, and with Chelmsford just over 20 minutes away on the train out of Stratford station, we felt it provided a sensible solution.
Equally importantly, we have consulted with the professional playing group on this decision, who were unanimous in their support, given their desire to play more cricket at first-class venues, with first-class facilities available to them.
All at Middlesex, the players and the staff, along with all at Essex, are excited about what this opportunity represents and we are looking forward to Middlesex competing at The Cloud County Ground this summer.
Our thanks go to our Members and our supporters for their understanding of this venue choice, and we hope you appreciate the sound financial reasons for the decision. Whilst this may seem like a radical decision, it is one that we need to make to ensure that Middlesex gets back on a more stable financial footing.
See the rest here:
The Cloud County Ground to host two Middlesex T20s in 2024 - Essex Cricket
3 Stocks That Could Turn $1,000 Into $5,000 by 2030 – The Motley Fool
Every investor wants big gains. But, not every trade pans out as hoped. Plenty of "surefire mega-winners" end up crashing and burning because their risks are underestimated while their upsides are overblown. Names like action-camera maker GoPro or meal kit company Blue Apron (now owned by Wonder Group) come to mind. At one point both stocks were all the rage. Then reality struck, upending each of them.
Those flops don't necessarily mean you have to give up hope on finding a mega-winner though. It just means you need a stricter set of criteria, searching for companies with a superior product or service doing business in proven, sustainable growth markets.
To this end, here's a closer look at three stocks that could turn a $1,000 investment in them into a $5,000 holding in a matter of years. In all three cases, most investors aren't seeing an important detail about their respective businesses, or aren't looking far enough into the future.
You likely know companies like Nvidia, Intel, and Dell Technologies make the technological brains found within a data center. But have you ever thought about who assembles all of this equipment into a final, functioning product? It's not Dell or Intel. Companies like Super Micro Computer (SMCI -0.34%) handle this work. And there's plenty of it.
Super Micro Computer describes itself as "a leading provider of application-optimized, high-performance server and storage solutions that address a broad range of computational-intensive workloads." It goes on to explain its "server Building Block Solutions coupled with extensive in-house design and manufacturing enables the company to rapidly develop, build, and test server and storage systems, subsystems, and accessories with unique configurations."
In other words, while a customer may want to utilize a specific brand of computer processors in its data centers, that organization actually calls Super Micro Computer to make it happen. Genesis Cloud chose Super Micro to help build its new machine-learning apparatus using Nvidia's purpose-built HGX H100 artificial intelligence (AI) processors, for instance. Web-hosting outfit Absolute Hosting tapped Super Micro Computer to build its network of virtual private servers around Advanced Micro Devices' EPYC processor.
And business has been good. Last quarter's top line of $2.1 billion was up 14% year over year, extending a long streak of comparable growth. Earnings are growing accordingly.
Now take a step back and look at the bigger picture. As much as the artificial intelligence and cloud computing markets have grown in just the past few years, they've still only scratched the surface of their eventual size. Technology market research outfit IDC believes the cloud computing and storage infrastructure market will grow by more than 11% per year through 2027, while Straits Research expects the artificial intelligence infrastructure market to expand at an average pace of nearly 21% through 2030.
Both tailwinds of course bode well for Super Micro Computer attracting sustained investor interest, particularly in light of its competitive edge. That is, it's homing in on one of the fast-growing AI market's unexpected challenges. As CEO Charles Liang explained during the company's fiscal 2024 first-quarter earnings call, "Our high power efficiency systems, free-air and liquid-cooling expertise has become one of our key differentiators of success." It matters since, as Liang adds, "I anticipate that up to 20% or more of global data centers will transition to liquid-cooled solutions in just a few years."
Amazon (AMZN 0.02%) is the Western Hemisphere's dominant e-commerce name. This dominance isn't the reason Amazon shares could generate 400% returns by 2030 though. In fact, rising operating costs and improving competition somewhat threaten the company's online shopping market leadership.
Rather, the reason to take a shot on Amazon stock right now is the potential it has on a couple of other fronts. One of these opportunities is the aforementioned cloud computing.
In terms of revenue, e-commerce and digital content services (like video and music streaming) are Amazon's biggest businesses, accounting for more than 80% of its top line through the first three quarters of the year. That's not the case for operating profits, however. Although Amazon's cloud computing arm, Amazon Web Services (AWS), only makes up 16% of the company's revenue, this business drives nearly 74% of Amazon's operating profits. This income should more or less grow in step with the cloud computing market's growth.
And the other investment-worthy opportunity? While profit margins on e-commerce may now be under permanent pressure, Amazon is changing its business model in a subtle but significant way. That is, it's getting serious about advertising. Web traffic to Amazon.com is so strong that it now makes sense to allow its third-party sellers to pay Amazon to feature their products. The company did $12 billion worth of advertising business last quarter, but Insider Intelligence believes this full year's advertising revenue tally of $44.9 billion will reach $67.6 billion as soon as 2025.
This is high-margin revenue, too, simply leveraging the traffic Amazon.com is already drawing.
Given these initiatives and their current growth trajectories, Amazon's current annualized revenue of $550 billion could readily swell to well over $1 trillion. Net income could grow at an even faster clip as the company's more profitable cloud computing arm becomes a bigger piece of the business, expanding from around $20 billion now to on the order of $100 billion by 2030. Amazon shares will likely follow the lead of that prospective quadrupling of the company's bottom line.
Last but not least, put Symbotic (SYM -4.51%) on your watch list of stocks that could turn $1,000 into $5,000 by 2030.
It's not a household name, but there's a good chance you or someone living in your household has benefited from Symbotic's products. The company makes robotics solutions, specializing in warehouse automation. Its customers include Walmart, Target, and Albertsons just to name a few -- all companies that are ramping up their e-commerce operations at the same time they're looking for greater cost efficiency. Symbotic helps in both ways, leveraging artificial intelligence to do what once required humans to handle in a less efficient way.
Some investors may recall robotics was something of a hot-button issue a couple of years before the COVID-19 pandemic, promising growth that forever seemed around the corner. Then the pandemic took hold, further delaying the realization of the tech's potential. Interest in the technology has seemingly withered away in the meantime, for investors as well as its next batch of users.
It's an opportunity, however, you may want to put back on your radar even if most other investors haven't.
See, with a handful of usage cases now proving the potential of this inventory-handling solution, would-be customers are becoming interested again. Indeed, market research outfit LogisticsIQ believes the warehouse automation market is set to grow at an average annualized pace of 15% between now and 2028, while Mordor Intelligence puts the growth rate figure closer to 16%.
Symbotic is clearly capitalizing on this market's growth too, and then some. Last quarter's top line of $392 million crushed the year-ago comparison of $244 million as well as analysts' consensus estimate of just under $307 million. But that's still just the beginning. The analyst community expects Symbotic to be doing $2.5 billion worth of business by 2025, with some analysts calling for a top line of $4.0 billion in 2026. That's more than three times its top line of $1.1 billion for its recently completed fiscal year.
The real story here not enough investors are seeing yet, however, is the bottom line. The company's been in the red for most of its existence, but now with enough scale and enough market growth ahead, sustainable profits are in sight. Symbotic swung to an EBITDA profit of $13 million for the quarter ending in September, in fact, and analysts expect the company to sustain this trend, turning last fiscal year's per-share loss of $0.37 into a profit of $0.04 this year en route to earnings of $0.57 per share next fiscal year.
Trading around $52 per share, Symbotic's present price doesn't even come close to reflecting the company's potential profits come 2030. A bottom line in the ballpark of $2.00 (or maybe even more) per share by then isn't out of the question.
See the article here:
3 Stocks That Could Turn $1,000 Into $5,000 by 2030 - The Motley Fool
Simplifying AI development with Azure AI Studio – InfoWorld
Microsoft Azure has been at the heart of Microsoft's AI ambitions for many years now. It began with making the deep learning products of Microsoft Research available as Azure Cognitive Services. Then Microsoft added tools to roll your own cloud-hosted machine learning, using Azure to train models and host the resulting services. Now Azure is the home for Microsoft's growing family of Copilots, which both build on Azure OpenAI's generative AI models and give customers access to those same models.
Supporting all of these tools, plus providing a framework for customizing cloud service models, required Azure to provide more than one development environment. The result was, to say the least, complex and hard to understand. Fortunately, the Azure AI team has been working on a replacement, Azure AI Studio, that unifies Azures AI development tools, building on responsible AI concepts and supporting a mix of pre-defined and custom AI models.
The development of Azure AI Studio involves a fundamental change in the way we use AI models. Instead of simply making an API call to a single model, we're now building pipelines that mix different aspects of a model, or even chain different models to deliver a multimodal application. Tools like LangChain, Semantic Kernel, and Prompt Flow are now essential frameworks for taming and controlling the output of generative AI, grounding it in our own data.
For example, we can have a computer vision application that identifies objects in a picture, feeding that list into a generative AI large language model to produce a text description of the image, before using a voice generator to read that description to a visually impaired user holding a camera.
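A minimal sketch of that kind of chain, in plain Python, is shown below. The three helper functions are hypothetical stand-ins for a vision model, a generative AI model and a voice generator; they are not actual Azure AI SDK calls, just placeholders to show how the stages compose.

```python
# Hypothetical multimodal pipeline: image -> object list -> description -> audio.
# detect_objects, describe_scene and synthesize_speech are illustrative stubs,
# not real Azure AI SDK functions.

def detect_objects(image_bytes: bytes) -> list[str]:
    """Call a computer vision model and return the objects it finds."""
    ...

def describe_scene(objects: list[str]) -> str:
    """Prompt a generative AI model to turn an object list into a text description."""
    ...

def synthesize_speech(text: str) -> bytes:
    """Call a voice generator and return audio for the description."""
    ...

def describe_photo_aloud(image_bytes: bytes) -> bytes:
    """Chain the three stages so a camera image becomes spoken output."""
    objects = detect_objects(image_bytes)
    description = describe_scene(objects)
    return synthesize_speech(description)
```

Frameworks like Prompt Flow, LangChain or Semantic Kernel essentially formalise this kind of chaining, adding prompt management, grounding data and observability around each step.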
As a result, Microsoft is bringing its various Azure AI development tools into one new environment, Azure AI Studio. Introduced in a public preview at Ignite 2023, Azure AI Studio is, for now, focused on building Copilots, Microsoft's name for generative AI-powered applications. AI Studio includes support for mixed-model multi-modal tools, and for the Azure AI SDK. The overall aim is to allow you to experiment inside the Studio before building your refined model into a production service.
While Azure AI Studio is in public preview, using Azure OpenAI models in your application requires approval from Microsoft. You will need to be working on a project for an approved enterprise customer, which requires you to be working directly with a Microsoft account team. You will also need to have a specific use case for your project, as this will be used to scope access to the service for both you and your users. For example, if your application will use sensitive data, you will likely be required to limit your application to internal users on secured internal networks.
There's no need to create a new resource to work with Azure AI Studio; it's a standalone service that sits outside the Azure Portal. Simply log in with an Azure account to start working. AI Studio opens to an introductory home screen that gives you access to a catalog of models, as well as the Azure OpenAI service. Other options provide links to the familiar Cognitive Services APIs, and to content safety tools that help you reduce the risk of including unsuitable materials in training data or in the prompts used in an AI-powered application.
There are four tabs in Azure AI Studio: Home, Explore, Build, and Manage. On the Home tab, in addition to the links to the rest of the service, you'll see a number of sample projects that are hosted on GitHub. These will give you the necessary scaffolding to start building your own code. One sample shows you how to build an Azure AI-powered Copilot, and another shows you how to mix different AI services to build a multi-modal application.
Getting started is simple enough. You begin by creating an AI-specific resource to manage the VMs and services used for your application. Azure AI Studio walks you through a familiar Azure set-up wizard, creating this resource and its AI services. Interestingly, the default includes the renamed Azure Cognitive Search, now called Azure AI Search. This is an interesting choice, as it indicates Microsoft is taking an opinionated approach to AI application architectures, requiring an external store of embeddings to ground your application and reduce the risk of hallucinations due to prompt overruns.
You can now add an AI model to your Azure AI Studio instance, for example using an Azure OpenAI generative AI model. This is added to the resource group you're using for your AI application, ensuring that you're controlling network access to avoid unauthorized access to your API. This lets you lock access down to a specific VNet, so the only access comes from your application. For even more control, you can disable public network access completely, creating private endpoints on specific subnets.
There's a large catalog of available models. You're not limited to OpenAI models; there's support for Meta's Llama, open-source models on Hugging Face, Nvidia's collection of foundation models, and Microsoft Research models. You can choose models directly or use a list of inference tasks to pick and choose the model that's right for your project. Usefully, the catalog is interactive, and you can try out basic interactions before deploying a model into a project.
Building an AI-powered application in Azure AI Studio can be quite simple. Once you've created a deployment and selected your choice of model, it's ready to start using. There's a simple playground you can use to test out prompts and model operation, for example looking at completions or running an AI-driven chat session. Initially you won't be using the model with your own data, so it will only give you generic answers.
Once you're satisfied with your basic prompts and the performance of the model you're using, you can start to modify its behavior by adding data. Data sources can be uploaded files, Azure Blob storage, or an Azure AI Search index. This last option allows you to quickly bring in a pre-processed vector index, which will increase accuracy and speed. Files can include PowerPoint, Word, PDF, HTML, Markdown, and raw text. New data will be indexed by Azure AI Search, ready to ground your AI model.
Azure AI Studio keeps you notified of costs at all steps of the process, so you can make informed decisions about what features to enable. This includes whether to use vector search or not. Once the data has been ingested, you can use the playground to test your models responses again, ensuring that they are now grounded.
The model can now be deployed as a web app for further testing, adding authentication for other tenant users via Entra ID. At this point you can export the playground contents to Prompt Flow for additional development.
Prompt Flow is Azure AI Studio's tool for chaining models, prompts, and APIs to build complex AI-powered applications. It gives you the tools to manage system-level prompts, user input, and services, using them as part of a flow, much like those built in Semantic Kernel or LangChain.
Prompt Flow gives you a visual view of the elements of your application, and how each step feeds into the next, allowing you to construct and debug Copilot-like services by linking nodes that perform specific functions. These can include Python, allowing you to bring in data science tools. While you can build your own flows from scratch, Prompt Flow comes with a set of basic templates that provide the necessary scaffolding for further development. These include scaffolds for building long chats with a conversation memory.
Using Prompt Flow allows you to work in both Azure AI Studio and in Visual Studio Code, giving you your choice of development environment. Using a code-based approach loses the visual flow graph, with connections and flow elements defined in YAML. However, the Prompt Flow extension for VS Code not only allows you to work with the code of your flow contents, but gives you a visual editor and a view of your flow graph.
Azure AI Studio is still in preview, but it's already offering an interestingly opinionated take on AI application development. Microsoft's collection of AI tools shows that the company has adopted generative AI wholesale and incorporates the lessons it has learned in producing trustworthy Copilots. The result promises to be a fast path to bringing generative AI to your applications and data.
See the original post:
Simplifying AI development with Azure AI Studio - InfoWorld
Preserving our digital content is vital. But paying $38,000 for the privilege is not – The Guardian
Opinion
Storing online data in perpetuity is not just about photos and texts but thoughts and ideas. Platforms such as WordPress are starting to act, but it must be at a realistic price
Sat 25 Nov 2023 11.00 EST
Way back in 2004 the two founders of Google, Larry Page and Sergey Brin, thought that it would be a cool idea to scan all the printed books in the world and make them available online. This was at the time when their company's motto (apart from the guff about not being evil) was to organise all the world's information. Given that the obvious places to look for large collections of books are university libraries, they decided to start there, so they set out to persuade university librarians to let them scan their holdings.
One of the first institutions they approached was a very large American university: they went to visit its librarian and found him very supportive of their ambitious project. Accordingly, the deal was easily sealed. Afterwards, though, the boys noticed that their librarian friend seemed pensive, and so asked him what was wrong. "Nothing's wrong," he replied. "I'm just wondering how we can ensure that these scans will be available to readers in 400 years' time when Google is no longer around. Because it won't be."
When the librarian told me the story, he remarked that the two lads looked astonished: the thought that Google might be mortal seemed never to have occurred to them. But of course he was right: the lives of most corporations are short. In the US, for example, the average lifespan of S&P 500 companies is 21 years and declining. So if we wish to ensure that things are preserved in perpetuity, we need to ensure the institutions that curate them are likewise very long-lived. Given that, it seemed appropriate that our conversation was taking place in the university library in Cambridge, an institution that has been around for more than 800 years and may well be around for another 800.
With digital artefacts, however, preservation involves more than just the longevity of buildings and institutions; it also involves continuity of the technology needed to access older digital artefacts. It's like the problem of how to now view those charming VHS videos you shot when the kids were small, but on steroids. There's a nice cartoon about this somewhere that shows a Nasa control room at the moment when one of the agency's 30-year-old probes has just begun sending back data from deep space. Staff members are joyously celebrating and back-slapping. And then a guy asks: "Can anyone here remember how to install Windows 95?"
As our world becomes remorselessly digitised, we should be worried about this. When historians of the future start digging for the records of our time they will encounter many black holes. This applies not just to institutions and corporations, but also to each one of us. Think of the billions of photographs we cheerfully upload to social media every day; they are now stored in the cloud of giant server farms owned by tech companies. But when you die, they will be effectively gone for good unless you have thoughtfully arranged access to your account for a friend or family member. Likewise for all your emails and tweets, not to mention those Facebook, Instagram, WhatsApp and Signal chats that you so enjoyed and which charted your social life. All gone once the grim reaper has called, unless there are arrangements in place for the storage of and access to them in perpetuity.
Apart from social media posts, the other invaluable source for future social historians seeking insight into the ways humans lived in the run-up to climate catastrophe is the blogosphere. This is what produces the user-generated content celebrated by the legal scholar Yochai Benkler in his book The Wealth of Networks: a space for writing and conversation that lies outside the market. It's effectively a digital instantiation of the German philosopher Jürgen Habermas's idea of the public sphere: an area in social life where individuals can come together to freely discuss and identify societal problems. And as such it's a space in which important conversations happen.
Some blogging is done on corporate platforms such as Blogger, but much of the best stuff is on personal blogs hosted by individual writers. These people (of whom I am one) pay for their own web-hosting, generally write for no payment and are difficult or impossible to censor. But when they die, their blogs generally die, and their thinking, good, bad and indifferent, is lost to posterity.
So it was interesting that Matt Mullenweg, the founder of the dominant WordPress blogging platform, recently came up with a proposition: "Secure your online legacy for a century." That sounded like good news. The bad news was the fee: $38,000. This is clearly absurd, "a way of rich people assuring their precious content lives on after their death", as one critic put it. But if we were serious about preserving a record of what people are saying and thinking at this moment in human history, there's a germ of a good idea here, provided it comes with a realistic price attached.
Tech tactics: A lovely essay on the ethics of Silicon Valley by Sherry Turkle on the Crooked Timber blog.
Mind the gap: Tim Harford tackles employment inequality and "greedy jobs" that eat your time on his blog.
Stylish approach: Can a computer write like Eudora Welty? An intriguing essay by Randy Sparkman on the Literary Hub site probes the real utility of a large language model.
See the rest here:
Preserving our digital content is vital. But paying $38,000 for the privilege is not - The Guardian
What has the advent of cloud computing done to the speed of Tech … – Medium
The big three cloud providers, otherwise called hyperscalers (AWS, Microsoft Azure and GCP), offer scalable computing for everyone. This means startups can take massive advantage of it, so a shortage of compute power is no longer a hindrance to technological innovation.
Let me explain what I mean: ultra-powerful, ultra-reliable, ultra-scalable and ultra-secure cloud computing resources are available worldwide wherever there is internet access. This means businesses small and large can utilize the following benefits with a few clicks:
1. Rapid Elasticity: Cloud-hosted application usage and traffic surges are not an issue for well load-balanced applications. Elasticity allows a new app service to climb from zero users to a million with a few clicks and in seconds (a minimal autoscaling sketch follows this list).
2. Remote Connectivity, Global Access and Collaboration: Cloud computing facilitates global collaboration by providing a centralized platform accessible from anywhere. This has led to increased collaboration on projects with teams distributed across the globe, fostering diverse perspectives and expertise. VPN tunnels and connections can be provisioned and can scale to thousands of simultaneous connections in seconds, making company resources available to work-from-anywhere employees.
3. No server infrastructure planning required: Cloud computing allows businesses to scale resources up or down based on demand. This flexibility enables rapid development and deployment of applications without the need for extensive infrastructure planning.
4. No large upfront investments required: Cloud services usually follow a pay-as-you-use model, eliminating the need for large upfront investments in hardware; these have instead been replaced by cloud spend budgets. This cost efficiency empowers startups and small businesses to access advanced computing resources, fostering innovation across a broader range of organizations.
5. Test and destroy enables rapid deployment: Cloud services provide developers with the tools to quickly prototype and deploy applications. This rapid development cycle allows for faster innovation and shorter time-to-market for new products and features.
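As a minimal sketch of point 1, rapid elasticity is typically wired up through an autoscaling policy rather than manual capacity planning. The example below uses boto3 to attach a target-tracking scaling policy to an assumed, pre-existing AWS Auto Scaling group; the group name and target value are placeholder assumptions.

```python
import boto3

autoscaling = boto3.client("autoscaling")

# Target-tracking policy: instances are added or removed automatically so that
# average CPU utilisation across the group stays near 60%.
# "example-web-asg" is a placeholder for an existing Auto Scaling group.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="example-web-asg",
    PolicyName="keep-cpu-around-60",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 60.0,
    },
)
```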
Read the original post:
What has the advent of cloud computing done to the speed of Tech ... - Medium
Microsoft unveils mysterious "Azure Boost" feature that could give … – TechRadar
Microsoft has introduced a new hardware upgrade option called Azure Boost designed to improve the performance of virtual machines on its cloud computing platform.
In an explainer, the company outlined how Azure Boost looks to "[offload] server virtualization processes traditionally performed by the hypervisor and host OS onto purpose-built software and hardware."
Microsoft says that customers can expect a range of benefits including improved networking, revised storage, performance, and security.
Microsoft said that Azure Boost-compatible VM hosts contain the new Microsoft Azure Network Adapter (MANA), which means up to 200 Gbps bandwidth.
Offloading storage operations also sees improvements, with up to 17.3 GBps and 3.8 million IOPS for local storage and up to 12.5 GBps throughput and 650 K IOPS for remote storage.
The cloud provider has confirmed that 17 instance types are already supported, and future Azure virtual machines will also be compatible with Azure Boost.
The explainer confirms that Azure Boost applies to both Linux VMs and Windows VMs, and already, some versions of Ubuntu (20.04 and 22.04 LTS) and Windows Server (2016, 2019, and 2022), among others, have support for the MANA driver (via The Register).
AWS has a similar setup, called Nitro, which underpins its EC2 instances. Like Azure Boost, it promises to offload functions to dedicated hardware and software, in turn improving performance and optimizing cost efficiency.
Whether you're an AWS customer or an Azure customer preparing to take advantage of Boost, the performance enhancements could allow you to use fewer resources, which is better for both the environment and your business's bank account.
See the article here:
Microsoft unveils mysterious "Azure Boost" feature that could give ... - TechRadar
eWEEK TweetChat, December 12: Tech in 2024, Predictions and … – eWeek
On Tuesday, December 12 at 11 AM PT, @eWEEKNews will host its monthly #eWEEKChat. The topic will be the future of technology in 2024 and beyond, and it will be moderated by James Maguire, eWEEK's Editor-in-Chief.
We'll discuss, using X (formerly known as Twitter), current and evolving trends shaping the future of enterprise technology, from AI to cloud to cybersecurity. Our ultimate goal: to offer guidance to companies that enables them to better keep pace with evolving tech trends.
See below for:
The list of experts for this month's Tweetchat currently includes the following; please check back for additional expert guests:
The questions we'll tweet about will include the following; check back for more/revised questions:
The chat begins promptly at 11 AM PT on December 12. To participate:
2. Open Twitter in a second browser. On the menu to the left, click on Explore. In the search box at the top, type in #eweekchat. This will open a column that displays all the questions and all the panelists replies.
Remember: you must manually include the hashtag #eweekchat for your replies to be seen by that day's tweetchat panel of experts.
That's it; you're ready to go. Be ready at 11 AM PT on December 12 to participate in the tweetchat.
NOTE: There is sometimes a few seconds of delay between when you tweet and when your tweet shows up in the #eWeekchat column.
July 25: Optimizing Generative AI: Guide for Companies
August 15: Next Generation Data Analytics
September 12: AI in the Enterprise
October 17: Future of Cloud Computing
November 14: The Future of Generative AI
December 12: Tech in 2024: Predictions and Wild Guesses
*all topics subject to change
View post:
eWEEK TweetChat, December 12: Tech in 2024, Predictions and ... - eWeek