
How To Transfer Files From An Android Phone To Mac – SlashGear

The Apple ecosystem is tailored so that all Apple products connect seamlessly with one another, and transferring files between these devices requires no technical know-how at all. Sending files from your iPhone or iPad to your Mac is as easy as turning on AirDrop. The Apple-developed feature lets you wirelessly share images, documents, videos, and other files. It combines Bluetooth and Wi-Fi technology to create a direct peer-to-peer (P2P) connection between devices, making it convenient for quickly sharing files without the need for email, messaging apps, or physical cables.

But with Android, the process is different and a little more involved: there's no built-in feature that makes swapping files with your MacBook seamless. As an Android user, you'll usually have to pair a USB cable with a third-party application, or resort to other makeshift methods and hope they do the trick. It won't be as smooth as AirDrop, that's for sure.

Google introduced a service known as Android File Transfer, specifically designed to support the seamless transfer of files between Android devices and MacBooks. This service is compatible with Mac systems running macOS 10.7 (Lion) and later versions. While it may exhibit occasional glitches, it remains one of the most effective methods for transferring files between your Android device and MacBook. Here's how to use it:

You're now all set to browse through your Android device's files and folders and copy stuff over. If Android File Transfer doesn't load up automatically when you connect your USB cable from your MacBook to your Android device, don't worry. Go to Launchpad on your MacBook and open Android File Transfer to access your files and folders. If you see a "No Android device found" error instead of your files, it probably means your Android device and MacBook aren't connected properly. Try swapping out your cable or disconnecting and reconnecting to give it another shot.
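If Android File Transfer keeps misbehaving, the Android Debug Bridge (`adb`) from Google's platform-tools is a command-line fallback the article doesn't cover. A minimal Python sketch, assuming `adb` is installed on your Mac and USB debugging is enabled on the phone:

```python
import subprocess

def build_pull_command(device_path: str, local_dir: str) -> list[str]:
    """Build an `adb pull` command that copies a file or folder
    from the connected Android device to the Mac."""
    return ["adb", "pull", device_path, local_dir]

def pull(device_path: str, local_dir: str) -> None:
    # Requires Android platform-tools on PATH and USB debugging
    # enabled under Developer Options on the phone.
    subprocess.run(build_pull_command(device_path, local_dir), check=True)

# Example: copy the camera folder into the current directory.
# pull("/sdcard/DCIM/Camera", ".")
```

This bypasses Android File Transfer entirely, which can help when the "No Android device found" error persists across cables.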

This is an alternative approach that isn't very popular for transferring files between your MacBook and Android device. While lots of cloud storage options are available, it's likely that you already have a preferred service you use regularly, such as Google Drive, Microsoft OneDrive, Dropbox, or another platform. To make file transfers between these two devices using your chosen cloud storage, it's important to ensure that your selected cloud service supports both your MacBook and Android devices. Once you've got that checked off, follow these steps:

If you don't typically use any of these cloud storage options, create an account with a suitable service. Install the software on your MacBook and grab the matching app on your Android device. Now, you're all set to transfer files back and forth. It could be anything: photos, videos, GIFs, text files, scripts, apps, you name it. Once you upload them to your cloud storage service, you'll be able to access and copy them from both your MacBook and Android device directly from the cloud. If you frequently send large files using this cloud storage method, consider upgrading to a premium plan to expand your cloud storage capacity.
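Most desktop clients for these services (Google Drive, OneDrive, Dropbox) maintain a locally synced folder on your Mac, so the "upload" step can be as simple as copying a file into that folder and letting the client sync it. A rough sketch; the synced-folder path is an assumption and varies by service and account:

```python
import shutil
from pathlib import Path

def upload_via_synced_folder(file_path: str, synced_folder: str) -> Path:
    """Copy a file into the cloud client's locally synced folder;
    the desktop client then uploads it in the background."""
    src = Path(file_path)
    dest_dir = Path(synced_folder).expanduser()
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    shutil.copy2(src, dest)  # copy2 preserves timestamps
    return dest

# Hypothetical path -- check where your client actually syncs:
# upload_via_synced_folder("report.pdf", "~/Library/CloudStorage/GoogleDrive")
```

Once synced, the file shows up in the service's Android app automatically.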


10 expert Bitcoin price predictions for 2024 – Finbold – Finance in Bold

Although Bitcoin (BTC) has been going through a bit of a stagnation lately, along with the majority of the cryptocurrency market, quite a few experts are bullish about its long-term future, with the most optimistic prognoses for the maiden cryptocurrency going as high as $1 million.

As it happens, the author of the book Undressing Bitcoin, marketing consultant and podcast host Layah Heilpern, listed ten finance and investment experts and companies, along with their predictions for the price of Bitcoin in the future, in an X post shared on September 14.

Specifically, Heilpern started off with billionaire investor and venture capitalist Tim Draper, who had originally predicted that Bitcoin would reach the price of $250,000 by June 2023 but later pushed his prognosis back to 2025, stating he hadn't expected the United States regulators to be so aggressive.

Meanwhile, one of the early pioneers in the cryptocurrency field and CEO of blockchain company Blockstream, Adam Back, stated in August that Bitcoin could hit $100,000 before the 2024 halving, the same number as offered by Robert Kiyosaki, who more recently said Bitcoin could even soar to $1 million if the world economy crashed.

At the same time, $1 million by 2030 is the price target for Bitcoin predicted by Cathie Wood, the CEO of global asset manager ARK Investment Management, back in June 2023; she has reiterated that Bitcoin is a hedge against inflation, a view shared by many other experts.

In July, Mike Novogratz, the CEO of crypto investment firm Galaxy Digital, said Bitcoin would undoubtedly reach $500,000 in the next five years or so due to its adoption pace and unique features, such as being tailor-made to be an anti-inflation store of value.

Furthermore, Fundstrat Global Advisors co-founder Tom Lee believes that Bitcoin is heading toward $180,000 by the end of 2024, particularly if the US Securities and Exchange Commission (SEC) approves a spot Bitcoin exchange-traded fund (ETF).

Arthur Hayes, the co-founder of crypto exchange BitMEX, sees $70,000, slightly above Bitcoin's all-time high (ATH) of $69,045 from November 2021, as the most likely scenario in a rally triggered by the possible decision of the US Federal Reserve on cutting interest rates, as he wrote in his digest on September 12.

On the other hand, investment banking and asset management giant JPMorgan Chase (NYSE: JPM) is a bit more conservative in its estimation, projecting $45,000 as a bull case scenario price for Bitcoin, provided it equals gold in risk capital or volume-adjusted terms in investors' portfolios.

That said, in its recent Blockchain Letter, published on August 22, the team at Pantera Capital, one of the leading names in crypto asset management, highlighted its bullish outlook for 2024 and projected that Bitcoin could rise to around $148,000 in its next four-year halving cycle if past trends hold.

In the meantime, Standard Chartered, one of the leading international banks in the United Kingdom that offers Bitcoin and crypto custody in the European Union through its subsidiary, Zodia Custody, has recently boosted its original $100,000 end-2024 forecast for Bitcoin to $120,000.

Finally, Heilpern added her own forecast of $75,000 for the flagship decentralized finance (DeFi) asset to the list, adding in the comments that the timeline for achieving this price target is some point in 2025.

As things stand, Bitcoin is currently changing hands at $26,631, up 1.07% on the day and growing 3.07% across the past week, while still recording losses of 8.68% on its monthly chart, as per the latest information retrieved on September 15.

All things considered, time will tell which of the above prognoses, ranging from a modest (at this point) $45,000 to a whopping $1 million per unit of the Proof-of-Work (PoW) cryptocurrency, proves the most accurate (if any).

Disclaimer: The content on this site should not be considered investment advice. Investing is speculative. When investing, your capital is at risk.


Bitcoin’s long-term price momentum is breaking, Wolfe Research says – CNBC

Bitcoin's price this week is near a familiar and key support level, but its latest stop there is a little more worrying than previous ones this year, according to Wolfe Research. Earlier this week, bitcoin fell below $25,000 for the first time since March. The flagship cryptocurrency tested this threshold at various points this year, but it always rebounded. While bitcoin has been floating in the narrow range of $25,000 to $30,000 all year, investors have cheered its resilience, knowing potential catalysts such as developments around bitcoin exchange-traded funds were on the horizon.

That long-term momentum is starting to break, however. That means crypto investors who have been patiently waiting for better days may want to consider their positions.

"With support in this area on dual fronts, it makes sense that price would hold and consolidate in this region. However, as we look around, the crypto landscape is growing ever more concerning," Wolfe analyst Rob Ginsberg said in a note Wednesday. "Coins are taking out crucial levels of support across the board and there's not many bright spots to be found at the moment."

"Longer term momentum is starting to break in bitcoin," he added. "This is often one of our more reliable warning signs and part of the reason we are currently bearish on the broader market. Short term price action is never our worry, it's when longer term trends start to break, that we want to take notice and pivot accordingly."

Bitcoin's 50-day moving average began turning lower in August and recently crossed below its 200-day moving average. Although it may be losing momentum, the 200-day moving average is still ascending. The next level of support below $25,000 is at $20,000, Ginsberg said. Further, ether is testing the $1,600 threshold, which has Wolfe "highly concerned." Failure to bounce from that level makes $1,500 the next level to test on the downside, the firm said. "As the retail investor comes under pressure and liquidity is drained, our concerns will only grow for crypto prices," Ginsberg added. CNBC's Michael Bloom and Nick Wells contributed reporting.


Aerial imagery datasets to be stored in the cloud for faster, easier … – RNZ

These aerial shots taken before and after Cyclone Gabrielle show a section of Gisborne's Waipaoa River between Ormond and Te Karaka. Photo: LINZ

Scientists working on future disasters will have faster, easier access to past datasets, as Toitū Te Whenua Land Information New Zealand (LINZ) moves towards cloud storage.

As part of the open data programme run by Amazon Web Services, 20 terabytes of aerial imagery going back to the 1970s is initially being made available to the public on the cloud.

LINZ is the second organisation to join the programme, following GNS Science.

Amazon Web Services country manager Tim Dacombe-Bird said storing data in the cloud made it readily available globally, and accessible quickly.

"Previously, to share these large datasets... [LINZ] would have to ship hard drives around the country or they'd have to break the datasets up into smaller components," he said.

"It adds a layer of complexity, but it also adds the burden of time for the analysis."

Users, including scientists, researchers, governments, and members of the public, would now be able to access all of these images through the Registry of Open Data.

Aerial imagery can be used when disasters hit. During Cyclone Gabrielle, an aerial view allowed agencies to assess the damage and allocate resources more efficiently.

LINZ made satellite and aerial imagery covering a third of the North Island available on the cloud for the agencies involved in the response.

LINZ head of location information Aaron Jordan said it was usually those outside the disaster zone who were dealing with the data.

"What typically happens in an event is those who aren't affected are often the ones that are collecting and coordinating the imagery... and then provisioning it to people who can make assessments," Jordan said.

"Those assessments can then be passed down through other channels, like radio, to people on the ground."

In those cases, the faster they could get the information, the better.

Jordan said Toitū Te Whenua would be adding more images as they became available, eventually extending back to the 1940s.

The storage is paid for by the AWS Open Data Sponsorship Program, which covers the cost of storage for publicly available, high-value, cloud-optimised datasets.


Hunter Biden sues former Trump aide over release of laptop … – Los Angeles Times

WASHINGTON

Hunter Biden sued a former Trump administration aide in California federal court Wednesday, alleging that he published emails, images, videos and recordings belonging to Biden online.

The 13-page suit accuses Garrett Ziegler, his company and 10 unnamed defendants of improperly accessing, tampering with, manipulating, altering, copying and damaging computer data that they do not own in violation of the state's computer fraud laws.

Ziegler, a former aide to White House trade advisor Peter Navarro, has emerged in far-right media circles as one of the Biden family's most outspoken critics. An attorney for Ziegler did not immediately answer a request for comment Wednesday.

The suit centers around a now-infamous laptop, purportedly left by Biden at a Wilmington, Del., repair shop and found by Republican operatives weeks before the 2020 election, which has become a central part of allegations of corruption involving President Biden's son.

Ziegler has said his nonprofit research group Marco Polo has posted online over the last two years thousands of emails, photos, text messages and other documents purportedly from Hunter Biden's iPhone backup and cloud storage.

The suit alleges that Ziegler and the unnamed co-defendants have ignored requests to stop releasing the information and return it to Biden, claiming that they instead doubled down and vowed to continue violating the law.

Biden is seeking a jury trial to determine damages as well as an injunction to prevent Ziegler from continuing to share the information online.

Biden's attorneys say Ziegler has waged a sustained, unhinged and obsessed campaign against [Hunter Biden] and the entire Biden family for more than two years.

The filing describes how Ziegler and the 10 unnamed defendants allegedly obtained data belonging to Hunter Biden from other Trump allies in January 2021 and disseminated tens of thousands of emails, thousands of photos, and dozens of videos and recordings on the internet. It accuses Ziegler and the defendants of hacking an encrypted iPhone and Biden's online cloud storage.

In March, Biden's legal team filed a lawsuit against John Paul Mac Isaac, a computer repairman in Delaware who claimed he obtained the laptop. Biden's lawyers also previously referred Ziegler to federal and state prosecutors for alleged criminal behavior.


Ateliere and qibb Partner to Enable Turnkey Hybrid Storage … – StreetInsider.com

Customers can seamlessly integrate on-premises and cloud storage, eliminating bottlenecks, increasing efficiency and cost savings, and opening doors to new business opportunities

CENTURY CITY, CA / ACCESSWIRE / September 14, 2023 / Ateliere Creative Technologies, a leading developer of cloud-native media supply chain solutions and a 2023 "IDC Innovator," has announced a partnership with media application integration specialists, qibb, to support hybrid storage workflows across the media supply chain. Ateliere's cloud-native media supply chain workflows and open API seamlessly integrate with qibb's media workflow connectors, ensuring the swift automatic transfer of local files to the cloud and vice versa. This collaboration unlocks a world of possibilities for Ateliere Connect customers who want to retain their on-premises operation while leveraging the benefits of the cloud, maintaining a cohesive workflow without the need for bespoke integrations in the process.

"This partnership is not just about technology; it's about delivering value. Support for hybrid workflows is critical for us as we work to make our solutions more accessible to businesses who are not yet fully operating in the cloud," says Ateliere CEO Dan Goman. "Our partnership with qibb extends our offerings to companies that store and heavily leverage content on on-premises storage yet want the added benefits that the cloud provides."

The highly efficient and flexible media ecosystem resolves orchestration and workflow management challenges for working on-premises and in the cloud. Customers with large on-premises storage pools can now easily design cloud workflows based on business needs as opposed to technical workarounds.

"At qibb, we're experts at making media applications work together smoothly. Ateliere is a great example to show how we can connect to services outside of our Node Catalog," says Roman Holzhause, qibb Chief Technology Officer. "Via our community-based GraphQL connector, all Ateliere objects, events, job creation and workflow progress become available. This allows integrating Ateliere seamlessly with more than 100 third-party applications in the qibb catalog - including leading media applications and cutting-edge AI solutions. Together, we will ensure the highest level of customer satisfaction by enabling customized workflows and automations."

About the Ateliere - qibb Integration

qibb's compatibility with Ateliere's GraphQL API and media system integrations enable easy workflows with Ateliere's low-code environment. All objects, events, job creation and workflow progress are available via the API, allowing third-party applications like qibb to operate or react with custom behavior.

Connect with Ateliere to learn how their hybrid workflow capabilities power modern media supply chains: https://www.ateliere.com/company/contact-us.

Book a Meeting with Ateliere at IBC2023

The IBC2023 show will be held at the RAI Convention Center in Amsterdam from September 15th through September 18th. Attendees can book a private meeting with an Ateliere expert or sales professional to discuss their media supply chain and streaming needs and how Ateliere solutions can help. To request a day and time, visit https://www.ateliere.com/events/ibc-show-2023.

Press Briefings at IBC2023

Members of the press are invited to connect with Ateliere for a private demonstration on stand 5.B63. Press attending the show can book a media briefing at the booth or schedule a virtual briefing before the show through [emailprotected].

About qibb

qibb enables future-proof media solutions and makes integration of professional media workflows easy. qibb is the pioneering integration platform to create and maintain low-code media workflows along the digital media supply chain. A community-driven ecosystem of adapters (nodes) and pre-integrations (flows) flanked by a versatile feature toolkit makes virtualized systems integration a reality. With qibb it is possible to integrate several apps into automated workflows - all on one cloud platform. The integration platform is developed by experienced digital and media experts at the tech company Techtriq. www.qibb.com

About Ateliere

Ateliere Creative Technologies is a leading cloud-native media supply chain company that empowers media companies and content creators to reach consumers on a global scale. The Ateliere suite of SaaS solutions incorporates cutting-edge workflows and formats to make the vision for a studio in the cloud a reality. The nucleus of the Ateliere platform, Ateliere Connect, delivers core competencies in IMF, parallel scaling, and geographically distributed workflows. Ateliere is built by a team of experts with decades of combined experience at companies such as Amazon, HBO, Netflix, and Microsoft.

Find out more at https://www.ateliere.com, and follow us on X (@TeamAteliere), Instagram (@AteliereTech), LinkedIn (https://www.linkedin.com/company/ateliere-creative-technologies/), and Facebook (@AteliereCreativeTechnologies).

Media Contact:

Kristin Canders
Grithaus Agency
[emailprotected]
+1 (207) 974-7744

SOURCE: Ateliere

View source version on accesswire.com: https://www.accesswire.com/784140/ateliere-and-qibb-partner-to-enable-turnkey-hybrid-storage-integration-for-the-modern-media-supply-chain


Domain-Driven Cloud: Aligning your Cloud Architecture to your … – InfoQ.com

Key Takeaways

Domain-Driven Cloud (DDC) is an approach for creating your organization's cloud architecture based on your business model. DDC uses the bounded contexts of your business model as inputs and outputs a flexible cloud architecture that supports all of the workloads in your organization and evolves as your business changes. DDC promotes team autonomy by giving teams the ability to innovate within guardrails. Operationally, DDC simplifies security, governance, integration and cost management in a way that promotes transparency for IT and business stakeholders alike.

Based on Domain-Driven Design (DDD) and the architecture principle of high cohesion and low coupling, this article introduces DDC including the technical and human benefits of aligning your cloud architecture to the bounded contexts in your business model. You will learn how DDC can be implemented in cloud platforms including Amazon Web Services (AWS) and Microsoft Azure while aligning with their well-architected frameworks. Using illustrative examples from one of our real customers, you will learn the 5 steps to implementing DDC in your organization.

DDC extends the principles of DDD beyond traditional software systems to create a unifying architecture spanning business domains, software systems and cloud infrastructure.

Our customers perpetually strive to align "people, process and technology" together so they can work in harmony to deliver business outcomes. However, in practice, this often falls down as the Business (Biz), IT Development (Dev) and IT Operations (Ops) all go to their separate corners to design solutions for complex problems that actually span all three.

What emerges are business process redesigns, enterprise architectures and cloud platform architectures, all designed and implemented by different groups using different approaches and localized languages.

What's missing is a unified architecture approach using a shared language that integrates BizDevOps. This is where DDC steps in, with a specific focus on aligning the cloud architecture and the software systems that run on it to the bounded contexts of your business model, identified using DDD. Figure 1 illustrates how DDC extends the principles of DDD to include cloud infrastructure architecture, and in doing so creates a unified architecture that aligns BizDevOps.


In DDC, the most important cloud services are AWS Organizational Units (OUs) that contain Accounts and Azure Management Groups (MGs) that contain Subscriptions. Because 100% of the cloud resources you secure, use and pay for are connected to Accounts and Subscriptions, these are the natural cost and security containers. By enabling management and security at the higher OU/MG level and anchoring these on the bounded contexts of your business model, you can now create a unifying architecture spanning Biz, Dev and Ops. You can do this while giving your teams flexibility in how they use Accounts and Subscriptions to meet specific requirements.
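The containment relationship described here, where resources live in Accounts/Subscriptions that roll up to OUs/MGs named after bounded contexts, can be sketched as a tiny data model. The names below are illustrative only, not a real AWS or Azure API:

```python
from dataclasses import dataclass, field

@dataclass
class Account:                      # AWS Account or Azure Subscription
    name: str
    monthly_cost: float = 0.0

@dataclass
class ManagementGroup:              # AWS OU or Azure Management Group,
    bounded_context: str            # named after a bounded context
    accounts: list[Account] = field(default_factory=list)

    def total_cost(self) -> float:
        # Costs roll up naturally because every resource
        # lives in exactly one Account/Subscription.
        return sum(a.monthly_cost for a in self.accounts)

orders = ManagementGroup("Orders", [Account("orders-prod", 1200.0),
                                    Account("orders-nonprod", 300.0)])
print(orders.total_cost())  # 1500.0
```

The point of the model: because containment is strict, cost and security boundaries fall out of the hierarchy for free.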

The benefits of aligning your cloud architecture to your organization's business model include:

DDC may not be the best approach in all situations. Alternatives such as organizing your cloud architecture by tenant/customer (SaaS) or legal entity are viable options, too.

Unfortunately, we often see customers default to organizing their cloud architecture by their current org structure, following Conway's Law from the 1960s. We think this is a mistake and that DDC is a better alternative for one simple reason: your business model is more stable than your org structure.

One of the core tenets of good architecture is that we don't have more stable components depending on less stable components (aka the Stable Dependencies Principle). Organizations, especially large ones, like to reorganize often, making their org structure less stable than their business model. Basing your cloud architecture on your org structure means that every time you reorganize, your cloud architecture is directly impacted, which may impact all the workloads running in your cloud environment. Why do this? Basing your cloud architecture on your organization's business model enables it to evolve naturally as your business strategy evolves, as seen in Figure 2.


We recognize that, as Ruth Malan states, "If the architecture of the system and the architecture of the organization are at odds, the architecture of the organization wins". We also acknowledge there is work to do with how OUs/MGs and all the workloads within them best align to team boundaries and responsibilities. We think ideas like Team Topologies may help here.

We are seeing today's organizations move away from siloed departmental projects within formal communications structures to cross-functional teams creating products and services that span organizational boundaries. These modern solutions run in the cloud, so we feel the time is right for evolving your enterprise architecture in a way that unifies Biz, Dev and Ops using a shared language and architecture approach.

Both AWS's Well-Architected framework and Azure's Well-Architected framework provide a curated set of design principles and best practices for designing and operating systems in your cloud environments. DDC fully embraces these frameworks and at SingleStone we use these with our customers. While these frameworks provide specific recommendations and benefits for organizing your workloads into multiple Accounts or Subscriptions, managed with OUs and MGs, they leave it to you to figure out the best taxonomy for your organization.

DDC is opinionated on basing your cloud architecture on your bounded contexts, while being 100% compatible with models like AWS's Separated AEO/IEO and design principles like "Perform operations as code" and "Automatically recover from failure". You can adopt DDC and apply these best practices, too. Tools such as AWS Landing Zone and Azure Landing Zones can accelerate the setup of your cloud architecture while also being domain-driven.
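Guardrails like these are typically attached at the OU/MG level as policy documents. In the spirit of "Perform operations as code", here is an illustrative sketch (not a production policy) that builds an AWS service control policy denying activity outside approved regions:

```python
import json

APPROVED_REGIONS = ["us-east-1", "us-west-2"]  # assumption: your approved regions

def region_guardrail(regions: list[str]) -> dict:
    """Build a service control policy (SCP) denying requests outside
    the approved regions. A production policy would also exempt
    global services (IAM, CloudFront, etc.) via NotAction."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyOutsideApprovedRegions",
            "Effect": "Deny",
            "Action": "*",
            "Resource": "*",
            "Condition": {"StringNotEquals": {"aws:RequestedRegion": regions}},
        }],
    }

print(json.dumps(region_guardrail(APPROVED_REGIONS), indent=2))
```

Attached to a top-level OU, a policy like this is inherited by every Account beneath it, which is exactly the inheritance DDC leans on.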

Do you think a unified architecture using a shared language across BizDevOps might benefit your organization? While a comprehensive list of all tasks is beyond the scope of this article, here are the five basic steps you can follow, with illustrations from one of our customers who recently migrated to Azure.

The starting point for implementing DDC is a set of bounded contexts that describes your business model. The steps to identify your bounded contexts are not covered here, but the process described in Domain-Driven Discovery is one approach.

Once you identify your bounded contexts, organize them into two groups: domain contexts and technical contexts.

To illustrate, let's look at our customer, a medical supply company. Their domain and technical contexts are shown in Figure 3.


Your organization's domain contexts would be different, of course.

For technical contexts, the number will depend on factors including your organization's industry, complexity, regulatory and security requirements. A Fortune 100 financial services firm will have more technical contexts than a new media start-up. With that said, as a starting point DDC recommends six technical contexts for supporting all your systems and data.

You don't have to create all of these up front; start with Cloud Management and build out as needed.

With your bounded contexts defined, it's now time to build a secure cloud foundation for supporting your organization's workloads today and in the future. In our experience, we have found it helpful to organize your cloud capabilities into three layers based on how they support your workloads. For our medical supply customer, Figure 4 shows their contexts aligned to the Application, Platform and Foundation layers of their cloud architecture.


With DDC, you align AWS Organizational Units (OUs) or Azure Management Groups (MGs) to bounded contexts. By align, we mean you name them after your bounded contexts. These are the highest levels of management and through the use of inheritance they give you the ability to standardize controls and settings across your entire cloud architecture.

DDC gives you flexibility in how best to organize your Accounts and Subscription taxonomy, from coarse-grained to fine-grained, as seen in Figure 5.

DDC recommends starting with one OU/MG and at least two Accounts/Subscriptions per bounded context. If your organization has higher workload isolation requirements, DDC can support this too, as seen in Figure 5.

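The recommended starting point, one OU/MG with a Prod and a NonProd Account/Subscription per bounded context, is mechanical enough to generate. A sketch; the naming convention is an assumption, not a DDC requirement:

```python
def starter_taxonomy(bounded_contexts: list[str]) -> dict[str, list[str]]:
    """One OU/Management Group per bounded context, each holding
    a Prod and a NonProd Account/Subscription."""
    return {ctx: [f"{ctx.lower()}-prod", f"{ctx.lower()}-nonprod"]
            for ctx in bounded_contexts}

print(starter_taxonomy(["Orders", "Distributors", "Payers"]))
# {'Orders': ['orders-prod', 'orders-nonprod'], ...}
```

Teams with stricter isolation requirements can swap this function for a finer-grained one (e.g. one Account/Subscription per workload per environment) without changing the OU/MG layer above it.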

For our customer who had a small cloud team new to Azure, separate Subscriptions for Prod and NonProd for each context made sense as a starting point, as shown in Figure 6.


Figure 7 shows what this would look like in AWS.


For our customer, further environments like Dev, Test and Stage could be created within their respective Prod and NonProd Subscriptions. This provides them isolation between environments with the ability to configure environment-specific settings at the Subscription level or below. They also decided to build just the Prod Subscriptions for the six technical contexts to keep things simple to start. Again, if your organization wanted to create separate Accounts or Subscriptions for every workload environment, this can be done too while still aligning with DDC.

From a governance perspective, in DDC we recommend domain contexts inherit security controls and configurations from technical contexts. Creating a strong security posture in your technical contexts enables all your workloads that run in domain contexts to inherit this security by default. Domain contexts can then override selected controls and settings on a case-by-case basis balancing team autonomy and flexibility with required security guardrails.
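This inherit-then-override behavior can be modeled as a simple merge: baseline controls from the technical contexts apply everywhere, and a domain context overrides only selected settings. An illustrative sketch with hypothetical control names:

```python
BASELINE = {                       # defined once in technical contexts
    "encryption_at_rest": "required",
    "public_ip_allowed": False,
    "log_retention_days": 365,
}

def effective_controls(baseline: dict, overrides: dict) -> dict:
    """Domain contexts inherit the baseline and override case by case."""
    merged = dict(baseline)        # copy so the baseline stays untouched
    merged.update(overrides)
    return merged

orders_controls = effective_controls(BASELINE, {"log_retention_days": 730})
print(orders_controls["log_retention_days"])   # 730
print(orders_controls["encryption_at_rest"])   # required
```

In practice the merge is performed by the cloud platform's policy inheritance (OU/MG to Account/Subscription), but the semantics are the same: overrides are deliberate, everything else is inherited.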

Using DDC, your organization can grant autonomy to teams to enable innovation within guardrails. Leveraging key concepts from Team Topologies, stream-aligned teams can be self-sufficient within domain contexts when creating cloud infrastructure, deploying releases and monitoring their workloads. Platform teams, primarily working in technical contexts, can focus on designing and running highly-available services used by the stream-aligned teams. These teams work together to create the right balance between centralization and decentralization of cloud controls to meet your organization's security and risk requirements, as shown in Figure 8.


As this figure shows, policies and controls defined at higher level OUs/MGs are enforced downwards while costs and compliance are reported upwards. For our medical supply customer, this means their monthly Azure bill is automatically itemized by their bounded contexts with summarized cloud costs for Orders, Distributors and Payers to name a few.

This makes it easy for their CTO to share cloud costs with their business counterparts and establish realistic budgets that can be monitored over time. Just like costs, policy compliance across all contexts can be reported upwards with evidence stored in the Compliance technical context for auditing or forensic purposes. Services such as Azure Policy and AWS Audit Manager are helpful for continually maintaining compliance across your cloud environments by organizing your policies and controls in one place for management.
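Because every Account/Subscription rolls up to a context-named OU/MG, itemizing the bill by bounded context amounts to a group-by over the billing export. A sketch; the rows and column names are hypothetical:

```python
from collections import defaultdict

billing_rows = [  # hypothetical billing-export rows
    {"context": "Orders", "cost": 820.0},
    {"context": "Orders", "cost": 380.0},
    {"context": "Distributors", "cost": 510.0},
    {"context": "Payers", "cost": 240.0},
]

def cost_by_context(rows: list[dict]) -> dict[str, float]:
    """Sum costs per bounded context, mirroring the roll-up the
    OU/MG hierarchy performs automatically."""
    totals: dict[str, float] = defaultdict(float)
    for row in rows:
        totals[row["context"]] += row["cost"]
    return dict(totals)

print(cost_by_context(billing_rows))
# {'Orders': 1200.0, 'Distributors': 510.0, 'Payers': 240.0}
```

The same grouping gives business stakeholders budgets in their own vocabulary (Orders, Distributors, Payers) rather than in account IDs.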

With a solid foundation and our bounded contexts identified, the next step is to align your workloads to the bounded contexts. Identifying all the workloads that will run in your cloud environment is often done during a cloud migration discovery, aided in part by a configuration management database (CMDB) that contains your organization's portfolio of applications.

When aligning workloads to bounded contexts, we prefer a workshop approach that promotes discussion and collaboration. In our experience, this makes DDC understandable and relatable to the teams involved in the migration. Because teams must develop and support these workloads, the workshop also highlights where organizational structures do (or do not) align to bounded contexts. This workshop (or a follow-up one) can also identify which applications should be independently deployable and how the teams' ownership boundaries map to bounded contexts.

For our medical supply customer, this workshop revealed that a shared CI/CD tool in the Shared Services context needed permissions to deploy a new version of their Order Management system in the Orders context. This drove a discussion about how secrets and permissions would be managed across contexts, identifying new secrets-management capabilities that were prioritized during the cloud migration. By creating a reusable solution that worked for all future workloads in domain contexts, the cloud team built a new capability that improved the speed of future migrations.

Figure 9 summarizes how our customer aligned their workloads to bounded contexts, which are aligned to their Azure Management Groups.


Within the Orders context, our customer used Azure Resource Groups for independently deployable applications or services that contain Azure Resources, as shown in Figure 10.


This design served as a starting point for their initial migration of applications from a data center to Azure. Over the next few years, their goal was to refactor these applications into multiple independent microservices. When that time came, they could do so incrementally, one application at a time, by creating additional Resource Groups for each service.

If our customer were using AWS, Figure 10 would look very similar but would use Organizational Units, Accounts, and CloudFormation Stacks for organizing independently deployable applications or services that contain resources. One difference between the cloud providers is that AWS allows nested stacks (stacks within stacks), whereas Azure Resource Groups cannot be nested.

For networking, workloads running in domain contexts can access shared services in technical contexts only if their networks are connected or access is explicitly permitted. While the Network technical context contains centralized networking services, by default each Account or Subscription aligned to a domain context has its own private network containing subnets that are independently created, maintained, and used by the workloads running inside them.

Depending on the total number of Accounts or Subscriptions, this may be desired, or it may be too many separate networks to manage (each potentially with its own IP range). Alternatively, core networks can be defined in the Network context and shared with specific domain or technical contexts, thereby avoiding every context having its own private network. The details of cloud networking are beyond the scope of this article, but DDC enables multiple networking options while still aligning your cloud architecture to your business model. Bottom line: you don't have to sacrifice network security to adopt DDC.

Having identified where each workload will run, the next step is to move each one into the right Account or Subscription. While this was a new migration for our customer (greenfield), for your organization this may involve re-architecting your existing cloud platform (brownfield). Migrating a portfolio of workloads to AWS or Azure and the steps for architecting your cloud platform are beyond the scope of this article, but with respect to DDC, here is a checklist of the key things to keep in mind:

For brownfield deployments of DDC that are starting with an existing cloud architecture, the basic recipe is:

Your cloud architecture is not a static artifact; the design will continue to evolve as your business changes and new technologies emerge. New bounded contexts will appear that require changes to your cloud platform. Ideally much of this work is codified and automated, but in all likelihood you will still have some manual steps as your bounded contexts evolve.

Your Account/Subscription taxonomy may change over time too, starting with fewer to simplify initial management and growing as your teams and processes mature. The responsibility boundaries of teams, and how these align to bounded contexts, will also mature over time. Methods like GitOps work nicely alongside DDC to keep your cloud infrastructure flexible and extensible over time and continually aligned with your business model.

DDC extends the principles of DDD beyond traditional software systems to create a unifying architecture spanning business domains, software systems, and cloud infrastructure (BizDevOps). DDC is based on the software architecture principles of high cohesion and low coupling used when designing complex distributed systems, like your AWS and Azure environments. Employing the transparency and shared-language benefits of DDD when creating your organization's cloud architecture results in a secure-yet-flexible platform that naturally evolves as your business changes over time.

Special thanks to John Chapin, Casey Lee, Brandon Linton and Nick Tune for feedback on early drafts of this article and Abby Franks for the images.

Original post:
Domain-Driven Cloud: Aligning your Cloud Architecture to your ... - InfoQ.com

Read More..

50 programs that fix Microsoft Windows problems fast | PCWorld – PCWorld

No matter how much experience you have with Microsoft's Windows, it can still be improved by turning to software and tools that make the operating system that much better.

Take Windows 11, for example: When Microsoft introduced it with extremely strict system requirements in autumn 2021, it was only a matter of time before those barriers could be circumvented.

To install the new operating system on older PCs, the registry first had to be changed manually. Later, this could be simplified with a batch file, and now even that is superfluous thanks to Rufus, a small tool for creating bootable USB sticks. With just a few additional mouse clicks, you can run Windows 11 on almost any computer.

Let's start this look at helpful, dead-simple Windows software with that very program, before diving deeper into several different categories.

If you'd prefer to wade into a deep, powerful program, check out our guide to Microsoft Sysinternals, the best Windows troubleshooting tool.

Download the installation file for Windows 11 from Microsoft via the option Download a Windows 11 disk image (ISO).

Now insert a flash drive with at least 8GB of storage space into the computer. Start Rufus, click Select on the interface, select the Windows 11 ISO file, and then click Start. The Customize installation dialog then appears, giving you an option to avoid Windows 11's obligation to set up an online account. Follow the setup process until Windows 11 is installed.

This Rufus-loaded flash drive will not only reinstall Windows 11 on any PC, but also upgrade any Windows 10 installation via the setup.exe file!

If you want to move an older Windows 10 system 1:1 to a new PC, we recommend Easeus Todo Backup. With it, you create an image of your old computer's storage, from which you restore your system with all settings, programs, and data on your new PC. In addition, Windows 10 can then be upgraded to Windows 11 without any problems.

To install an older Windows version (for example, Windows 8.1, 10, or 11 version 21H2), save it to your storage with Windows ISO Downloader and create a setup flash drive from it with Rufus.

Depending on the Windows version, the hardware, and the history of the PC, you may need a product key to activate the operating system for the new installation. You can read this key from your existing system with Showkeyplus.

Reset Windows Update Tool solves various update problems: Almost 20 features are available after starting the program with administrator rights.

If you have installed several versions of Windows, Linux, or other operating systems on your computer, you can use Easy BCD to adjust the boot entries and their prioritization.

Creating a flash drive as a multiboot system for booting different live systems was a complex matter for a long time. Ventoy fundamentally changes that. With this tool, all you have to do is make the flash drive bootable by clicking Install, and then simply save the ISO files to the stick within Windows. After booting from the flash drive, you select the desired live system via the Ventoy interface. The key benefit here is that you don't need to create a new bootable flash drive when a new system version appears; you simply replace the older ISO file with the new one. Another plus: you can continue to use free space on the USB stick to save and transport your data.

Glary Utilities makes both problem analysis and problem elimination possible with a mouse click; everything else is done automatically by the software in the background.

IDG

The promise of 1-click maintenance is hit or miss on the PC. The causes of possible errors are too varied and the solutions too complex. That said, Ccleaner and Glary Utilities are always worth a try. You can start the system analysis and the subsequent problem elimination with just one mouse click.

Bootracer requires a few more clicks. The program analyzes the start-up process and breaks it down into individual segments, showing at a glance which process or autostart program has a problem. You can choose to start the boot analysis as a normal complete Windows start, or limit it to the system without autostarting software. The wizards make Bootracer easy to use, including the necessary restarts. You can see in the details where and why your start-up may take an unusually long time, which helps you get to the bottom of the cause or, if it makes sense, exclude the software in question from autostarting when Windows loads.

The analysis tools Hwinfo and Speccy show whether something is wrong with your hardware. Both programs provide a wealth of information and sensor measurement data. Even more information on your processor is provided by CPU-Z and Core Temp, while GPU-Z digs deep into graphics card details. Unknown Device Identifier is helpful in identifying unknown components; the tool shows many more components than the native Windows device manager.

Memtest86 tests the main memory for errors, while Crystaldiskinfo analyzes SSDs and magnetic hard drives by reading out the SMART parameters. A look at the overall status shows whether everything is OK. The tool sounds an alarm in the event of abnormal values, which is very important for drives full of personal data.

Snappy Driver Installer recognizes outdated hardware drivers and automatically updates them with the latest versions if desired.

IDG

Snappy Driver Installer checks whether your installed hardware drivers are up to date. The tool starts without installation. Click the option Download indices only and wait until the system analysis is complete. Now, if desired, activate the Restore point field. To replace all obsolete drivers, continue by clicking Select all > Install at the top left. Alternatively, check the obsolete entries individually. Due to the file sizes of some drivers, downloading and installing may take some time.

Windows Explorer and the desktop are always in use on a computer, even if mostly unconsciously. While Microsoft has equipped the file manager with tabs in Windows 11, you have to retrofit file explorer tabs in Windows 10.

To do this, install Qttabbar, restart the PC, open Explorer, and click on the down arrow in the View tab under the Options symbol on the right. There, you activate the list entry Qttabbar to show the new tab bar.

The free version of Tidy Tabs allows up to three tabs in a window, across programs (for example, an Excel sheet, a Word document, and Outlook). It also makes it easier to view pictures, with a large preview that you open and close by pressing the space bar.

Treesize Free shows you at a glance which files are eating up space on your hard drive. The already-mentioned Ccleaner also offers a quick search for duplicate files (Tools > Duplicate Finder). Other tools like Anti-Twin offer more options, but are more complicated to use.

Small intervention, big effect: Capslock Goodbye prevents you once and for all from accidentally pressing the Caps Lock key.

IDG

Clipboard Master can do more and is more convenient than the Windows clipboard. If you're annoyed by accidental presses of the Caps Lock key, Capslock Goodbye is the tool for you. The software deactivates the key or assigns an alternative function to it. Desktop OK restores the icon placement on the desktop if it has been mixed up.

Three other programs provide more order (and more space on the hard disk): Should I Remove It shows which pre-installed software you can safely delete on new computers. O&O Appbuster makes it easier to remove Windows apps that hardly anyone needs. Unchecky prevents the secret installation of unwanted programs and toolbars. It's a godsend.

Remote help, i.e. taking over a remote computer via the Internet, is not only efficient, but also very simple with just one click in TeamViewer Quicksupport. The person who needs help starts the tool and gives his or her displayed ID and password to the helper. The helper takes care of the rest with the fully comprehensive Teamviewer software, which is also free for private use.

Changes to hard drives and partitions have a profound effect on your system and are often not easy to undo. Smaller tools can cause much less damage: Superdiskformatter, for example, allows you to change the file system (FAT32, NTFS, etc.) of any partition except the Windows partition. Fat32formatter formats almost arbitrarily large data media with a FAT32 file system, and Drive Letter Changer assigns fixed drive letters to USB drives.

Raidrive assigns a drive letter to cloud storage services such as Dropbox, Google Drive, and OneDrive for quick 1-click access in Explorer. To do this, select a storage service and a letter via Add, log in with your credentials, and allow Raidrive to access the cloud. Done!

Cloudevo combines several cloud storage accounts under Windows into one drive. This is convenient and allows you to store even oversized files on the internet.

IDG

Cloudevo also simplifies the handling of online storage by combining diverse cloud storage pools into a single drive with theoretically unlimited capacity. This also works with several free accounts from the same provider. While you as a user only see your Cloudevo drive, the service behind it automatically distributes the stored data across the various cloud storage providers.

Drive pooling with local drives, for example with several USB drives, can be done with Liquesce. Data that is too large to be sent by email can be forwarded with O&O FileDirect. The software creates an access link to your PC, which the recipient can use to transfer the shared data. Your computer must be switched on and online during the transfer.

If Windows blocks access to certain files, thus preventing deletion, renaming, or copying, Lock Hunter releases them again. You can change other permissions of folders and files with Attribute Changer.

Have I Been Pwned? and HPI Identity Leak Checker are not programs to install; a simple mouse click is enough here too. Just type in your email address and you'll know whether your account has been affected by one of the countless account and password hacks. If so, be sure to change the corresponding password!

Defender UI provides a new interface for Microsoft's security features that is easier to use than the one the integrated Windows Defender offers. The software contains four predefined security profiles and clearly groups together many security features and settings otherwise scattered throughout the operating system.

Virustotal Uploader simplifies the process of uploading potentially dangerous files to the Virustotal scanning platform. Instead of manually calling up the website, selecting the file, and uploading it, this free app works much faster via the Windows context menu. The browser plug-in I don't care about cookies eliminates the hassle of deselecting cookies on many websites by blocking or hiding the usual pop-up dialogs. It is best to combine the add-on with the automatic deletion of all cookies when you close your browser.

Wipe is suitable as a supplement for removing online traces. The software not only deletes browser data, but also temporary files and more.

You can securely delete data from your hard drive with Eraser: The name says it all.

Ungoogled Chromium is a special fork of the free Chromium browser, on which Google Chrome is also based. Unlike Chrome, Ungoogled Chromium does without any Google services for more privacy.

Simple Code Generator creates QR codes for private information that you wouldn't want to entrust to an online QR generator, such as email addresses, Wi-Fi access information, Outlook or personal contacts, and the like.

Last but not least is USB-Logon. It lets you create a USB stick for fast and secure Windows logon without a password. USB-Logon is a good alternative for PCs without a Windows Hello-enabled webcam or fingerprint sensor.

The Microsoft PowerToys tool collection has grown to over 20 amazingly helpful features, many of them with (almost) one-click operation. For example, Always On Top keeps any program window visible on the monitor at all times; Awake overrides the power settings for a certain time; FancyZones allows multiple windows to be easily arranged, even under Windows 10; the File Explorer add-ons show the contents of various file formats as a large preview; and Image Resizer changes the size of photos simply via the context menu. Also via the context menu, PowerRename enables automatic renaming of files.

Because we had to wait so long for it, the new PowerToys feature Paste as Plain Text is downright ingeniously simple. The keyboard shortcut Ctrl-Windows-Alt-V inserts the content stored in the clipboard unformatted into any program, which is ideal, for example, for quickly transferring web content into a word processor.

This article has been translated from German to English and originally appeared on pcwelt.de.

Read more from the original source:
50 programs that fix Microsoft Windows problems fast | PCWorld - PCWorld

Read More..

Why IT leaders should deploy generative AI infrastructure now – TechTarget

In the past several months, rampant excitement about the potential benefits of generative AI technology has increased the technology's priority status across enterprise organizations worldwide.

According to a recent research report from TechTarget's Enterprise Strategy Group, "Beyond the GenAI Hype: Real-world Investments, Use Cases, and Concerns," 42% of organizations said they are in a generative AI proof of concept if they haven't already deployed it in production. Our research showed that generative AI ranks higher than cloud in overall strategic business initiatives, which highlights how critical these projects are now.

In other words, the adoption rate for generative AI projects is expected to be massive and unlike anything we've seen from enterprise technology. And as a result, there is a high likelihood that your own executive team is currently in a conflicted state: They are excited about the potential productivity benefits of generative AI, but they are concerned about the risks to data privacy.

Regardless of the pace of AI adoption within your own organization, the expected overall adoption rate means that if your organization lags in adopting and deploying generative AI products, your competition will gain an increased advantage.

Organizations need to move quickly when it comes to generative AI, but they should do so in a manner that enables them to start small, scale quickly and mitigate risk associated with data privacy, compliance and security. With that necessity in mind, Nutanix has introduced GPT-in-a-Box.

The product combines the following elements:

There is a lot to like in this packaging, but most important is its simplicity of design. Nutanix is known for simplicity, which is a hallmark of its HCI technology.

Overall, there is likely going to be a longer than usual "crawl" phase before you get to "run" with generative AI within your organization. But if you want to get a leg up on generative AI initiatives, don't waste time trying to deploy the perfect infrastructure for what the ideal use will be in three to five years. In fact, few -- if any -- organizations truly have a strong grasp on what the ideal use will be.

We do have a sense of what those uses will look like in general. According to Enterprise Strategy Group's generative AI research report, the more commonly identified uses improve productivity, efficiency and the overall customer experience.

As a result, organizations should seek to speed up infrastructure deployment to enable their data science teams to get started on identifying the right data and models. As an example, the Nutanix product enables organizations to start quickly, and it gives them the flexibility to scale and adapt as needed.

The ability to deploy the product on premises is also important. While public cloud services will likely support most generative AI products, a separate Enterprise Strategy Group research study, "Multi-cloud Application Deployment and Decision Making," found that 29% of organizations identified AI/machine learning workloads as not being candidates for cloud deployment.

Some of those organizations will be launching AI initiatives that will use sensitive data or data sets with privacy concerns. Or maybe the data and compute requirements in the cloud are simply too costly for organizations just getting started. According to the multi-cloud research report, the cost of low-latency performance in the cloud is the most common reason organizations decided that an on-premises workload is not a candidate for the public cloud.

Ultimately, when it comes to generative AI, speed is of the essence. And given the increased executive-level priority on generative AI workloads, IT leaders must be proactive.

GPT-in-a-Box is simple to use and flexible, but Nutanix is not the only provider that has announced a strengthened Nvidia partnership and a product for generative AI. Always evaluate your options.

See the original post here:
Why IT leaders should deploy generative AI infrastructure now - TechTarget

Read More..

Bitcoin clean energy usage reportedly exceeds 50% Will Tesla start accepting BTC payments? – Cointelegraph

Elon Musk said in 2021 that Tesla would accept Bitcoin payments once miners were using roughly 50% clean energy sources with a positive future trend, a benchmark that may recently have been met.

In a Sept. 14 thread on X (formerly Twitter), Bloomberg analyst Jamie Coutts reported that the percentage of Bitcoin (BTC) mining energy coming from renewable sources had exceeded 50%, with falling emissions and a dramatically rising hash rate. According to Coutts, the push toward renewable energy sources was the result of miners dispersing from China in the wake of the country's mining ban starting in 2021, as well as certain nations turning to mining to monetize stranded and excess energy.

Countries investing in BTC mining include El Salvador (which has also recognized the cryptocurrency as legal tender since 2021), Bhutan, Oman, and the United Arab Emirates. The 50% energy benchmark could mean a greater move toward adoption by one of the biggest companies in the world.

Related: Tesla's diamond hands: EV maker's Bitcoin holdings see no change in Q2

Musk, the CEO of Tesla, owner of X, and founder of SpaceX, announced that Tesla would stop accepting BTC payments in May 2021, citing the rapidly increasing use of fossil fuels for Bitcoin mining and transactions at the time. Since establishing a sustainable energy source threshold of 50% for when the firm would resume payments, Musk has acknowledged a positive trend toward green energy sources but hasn't changed Tesla's policy.

The Tesla CEO did not appear to have publicly announced any move to resume BTC payments. At the time of publication, the price of Bitcoin was $26,572, having risen more than 2% in the last seven days.

Magazine: Bitcoin is on a collision course with Net Zero promises

Continue reading here:
Bitcoin clean energy usage reportedly exceeds 50% Will Tesla start accepting BTC payments? - Cointelegraph

Read More..