
Did you know that Signal is to blame for the encryption used by WhatsApp? – Gearrice

Signal is one of the preferred messaging applications for users who care about their privacy. Features like end-to-end encryption, the use of a PIN, and the ability to blur faces in photos make it a favorite in this area. What few knew is that Signal is also behind the encryption protocol that protects WhatsApp messages.

Although the two companies have had public spats over Facebook's privacy scandals, a few years ago they struck a partnership. Open Whisper Systems, the company behind Signal, announced an agreement in 2014 to implement its encryption protocol in WhatsApp. The company would integrate TextSecure into all messages, voice memos, files, and calls.

After a year of work, Signal confirmed that the operation was complete and that all WhatsApp clients for iOS, Android, Windows Phone, Nokia, and BlackBerry would have end-to-end encryption. For its part, WhatsApp announced to its users that messages and calls were protected and that no one outside the chat could read or listen to them.

Remember that warning message at the top of every chat? It exists thanks to the partnership between the two companies.

Signal considered every scenario in which third parties might read conversations, including the possibility of using an old version of the client to receive messages in plain text. WhatsApp users can verify that chats with their contacts are encrypted: the client uses scannable QR codes or a string of numbers shared by both sender and recipient.
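The underlying idea can be sketched in a few lines of Python: reduce each party's public identity key to a numeric fingerprint, then combine the two halves in a canonical order so both phones display the same digits. This is a simplified illustration, not Signal's exact derivation (the real protocol also mixes in a version tag and user identifiers), and the keys below are placeholders.

    import hashlib

    def numeric_fingerprint(identity_key: bytes, iterations: int = 5200) -> str:
        # Iteratively hash the public key, then condense the first 30 bytes
        # of the digest into six five-digit groups (30 digits per party).
        digest = identity_key
        for _ in range(iterations):
            digest = hashlib.sha512(digest + identity_key).digest()
        return " ".join(
            f"{int.from_bytes(digest[i:i + 5], 'big') % 100000:05d}"
            for i in range(0, 30, 5)
        )

    def safety_number(key_a: bytes, key_b: bytes) -> str:
        # Both sides sort the two halves, so sender and recipient
        # compute the identical 60-digit string to compare.
        return "  ".join(sorted([numeric_fingerprint(key_a), numeric_fingerprint(key_b)]))

    # Placeholder keys; real clients use the actual identity public keys.
    print(safety_number(b"alice-identity-key", b"bob-identity-key"))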

Signal became popular thanks to Edward Snowden, the NSA analyst who leaked classified information. Snowden recommended using TextSecure and RedPhone, two Open Whisper Systems applications that would later be merged into what we now know as Signal. The app was later endorsed by movements such as Black Lives Matter and by Elon Musk himself.

Unlike WhatsApp, Signal is open source and is governed by a foundation created by hacktivist Moxie Marlinspike and WhatsApp co-founder Brian Acton. The latter joined the project in February 2018 with a contribution of $50 million and the goal of making private communication accessible to everyone.

Despite their collaboration almost a decade ago, WhatsApp and Signal have had public fights. One of the most recent occurred in 2021, when Signal exposed Facebook's practices for displaying targeted advertising. Signal planned an ad campaign that showed how much information Meta collects from its users.

The idea was not well received by Mark Zuckerberg, who accused Signal of lying to gain publicity. Facebook did not allow Signal to display the ads and temporarily disabled its account on the social network. Although users never saw the advertising, the damage was done and Facebook came out looking foolish.

Although Signal does not have WhatsApp's user numbers, it is, together with Telegram, an alternative for those who want to protect their conversations. The next time you see the end-to-end encryption notice in one of your WhatsApp chats, remember that the feature exists thanks to the work of Signal's engineers.


The 7 best apps you can use as a WhatsApp alternative – Business Insider

WhatsApp has achieved its widespread global popularity thanks to its multi-platform support, secure encryption, and wide array of communication features. But the app isn't for everyone, and you might be looking for a WhatsApp alternative that mimics many of the best aspects of the app. Here are seven of the best alternatives to WhatsApp.

Telegram is a popular messaging app that offers voice, video, and text chat along with a mix of security options. Voice calls are automatically encrypted end-to-end, for example, and the app offers a secret chat mode in which your texts are also end-to-end encrypted, plus have a self-destruct timer. Routine texts aren't as secure, though. It's also one of the few apps that lets you lock the app itself, so if someone gets physical access to your phone, they still can't read your messages in Telegram. It's free for iPhone, Android, and even runs in a browser.

If you're looking for the strongest security in a WhatsApp alternative, Signal is probably what you need. The app leans hard into privacy with complete end-to-end encryption for voice, video, and text conversations when you chat with other Signal users. And the company behind Signal is a US-based non-profit organization that has no profit incentive to sell your data. You can set messages to self-destruct as well. On the other hand, the app is missing a sense of fun you might be looking for; there are no stickers or built-in GIFs, for example. It's available for free for both iPhone and Android.

While most messaging apps are free, Threema is not: it costs $5 on both iPhone and Android (and you can use it in a web browser as well). In return, you get complete end-to-end encryption for voice, video, and text messages, as well as file exchanges. It's also one of the very few apps that does not require you to sign up or confirm your account using your phone number, making it highly anonymous. You might also appreciate some of the extra features, like the ability to create polls and surveys, as well as search for images using natural language.

Facebook Messenger is an incredibly popular messaging app thanks to its connection to Facebook, and it's available for free for both iPhone and Android (plus you can use it in a browser). It includes all the basics, such as voice, video, and text messaging, as well as stickers and GIFs, if you are interested in that sort of thing. If you're concerned about privacy and security, though, it's worth noting that while Facebook is rolling out end-to-end encryption to Messenger, that feature is coming slowly and may not be fully available until 2023.

Kik is a little more than just a messaging app; it has a social component that makes it easy to meet new people to chat with, if you're looking for more than just a way to stay in touch with friends and family. Sort of like TikTok, you can join live broadcasts and watch and chat with a large number of other users. And when it comes to getting friends and family online, you can let people scan a QR code found in your profile to get them into a chat instantly. It's free for iPhone and Android, though there are a fair number of ads to contend with. If you value your privacy though, you can sign up with an email address without revealing your phone number.

Part of Microsoft's office productivity suite, Skype has always had something of a business focus, so it might not spring to mind for casual users. It's a full-featured communication app, though, and suitable for both work and recreation. As you'd expect, it can easily do voice, video, and text chats, as well as exchange files. Thanks to a variety of plug-ins, you can record video and voice calls when using Skype on the desktop, though it's also available for iPhone and Android. Perhaps the best reason to use Skype is for its more advanced features, though, like the ability to get real-time language translation, and the fact that you can use Skype to make calls to landline phones (with low international rates).

Viber doesn't have the same name recognition as many other communication apps, but like Skype, it is a solid option if you want an app that can also place calls to mobile phones and landlines in addition to chatting with other app users. Free for iPhone and Android, it has voice, video, and text chat features, and it does this with end-to-end encryption for secure communication.

Dave Johnson

Freelance Writer


Database Encryption Market Analysis and Demand with Forecast Overview to 2028 – NewsOrigins

The research analysis on the Database Encryption market offers a thorough assessment of the major growth opportunities, roadblocks, and other expansion channels that will affect the industry's growth between 2022 and 2028.

Additionally, the research report predicts that during the projection period this marketplace will display a healthy CAGR and produce commendable returns.

The document provides a comprehensive analysis of the economic condition to assist stakeholders in developing effective growth strategies for their future investments. The report also provides information on well-known companies that are operating in this industry sector, including information on their business portfolios and development trends as well as key information on the market segmentations.



Real-World Cloud Attacks: The True Tasks of Cloud Ransomware Mitigation – DARKReading

In Part 1 of our tales of real-world cloud attacks, we examined two common cloud attacks. The first started from a software-as-a-service (SaaS) marketplace, demonstrating the breadth of potential access vectors for cloud attacks and how one can enable lateral movement into other cloud resources, including a company's AWS environment. The second demonstrated how attackers take over cloud infrastructure to inject cryptominers for their own profit.

As we have witnessed, more attacks have moved onto the cloud, so it was only a matter of time before ransomware attacks did, too. Let's look at two scenarios where attackers leveraged ransomware to gain profits, and how unique cloud capabilities helped victims avoid paying the ransom.

The first case (or rather cases, as this attack has appeared numerous times) is the notorious MongoDB ransomware, which has been ongoing for years. The attack itself is simple: attackers use a script to scan the internet (and now, common cloud vendor address spaces) for hosts running MongoDB exposed to the internet. The attackers then try to connect to the MongoDB instance with an empty admin password. If successful, the attack erases the database and replaces it with a double-extortion ransom note: pay, and your data will be returned; don't pay, and your data will be leaked.
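Defenders can run essentially the same probe against their own hosts before attackers do. Here is a minimal sketch using pymongo; the hostname is a placeholder, and a hardened server should reject the unauthenticated command.

    from pymongo import MongoClient
    from pymongo.errors import OperationFailure, ServerSelectionTimeoutError

    def accepts_unauthenticated_admin(host: str, port: int = 27017) -> bool:
        # Try an admin command without credentials, the same probe the
        # ransomware scripts rely on.
        client = MongoClient(host, port, serverSelectionTimeoutMS=3000)
        try:
            client.admin.command("listDatabases")
            return True                      # open to the world: fix immediately
        except OperationFailure:
            return False                     # authentication required: good
        except ServerSelectionTimeoutError:
            return False                     # unreachable (firewalled): good

    print(accepts_unauthenticated_admin("mongo.example.com"))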

Intervention was necessary to address the second part of the extortion scheme: data leakage. Luckily, the company had data backups, so recovery was easy, but the database contained considerable amounts of personally identifiable information (PII) which, if leaked, would be a major crisis for the company. This forced it into the position of either paying a hefty ransom or dealing with the press. MongoDB's default logging, unfortunately, cannot provide a definitive answer about what data was accessed, as not all types of data-collection commands are logged by default.

This is where the cloud infrastructure became an advantage. While MongoDB may not log every command, AWS logs the traffic going in and out of servers, because it charges for network costs. Correlating the network traffic leaving the attacked server with the times when the attackers were connected to the compromised MongoDB server provided proof that the data could not have been downloaded.
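As an illustration of that correlation, the sketch below pulls the instance's NetworkOut metric from CloudWatch over the attacker's session window and compares total egress against the database size. The instance ID, time window and database size are hypothetical stand-ins, not details from the actual case.

    from datetime import datetime, timezone
    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Hypothetical attack window, taken from the MongoDB connection logs.
    start = datetime(2022, 8, 1, 2, 0, tzinfo=timezone.utc)
    end = datetime(2022, 8, 1, 4, 0, tzinfo=timezone.utc)

    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="NetworkOut",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
        StartTime=start,
        EndTime=end,
        Period=300,             # 5-minute buckets
        Statistics=["Sum"],     # bytes sent per bucket
    )

    egress_bytes = sum(point["Sum"] for point in stats["Datapoints"])
    db_size_bytes = 50 * 1024**3  # size of the PII database, e.g. 50 GB

    # If total egress during the session is far below the database size,
    # the data cannot have been downloaded in that window.
    print(f"egress during attack window: {egress_bytes / 1024**2:.1f} MiB")
    print("exfiltration plausible" if egress_bytes >= db_size_bytes else "exfiltration ruled out")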

This allowed the company to avoid paying the ransom and ignore the threat. As expected, nothing further was heard from the attackers.

Another company experienced an attack on its main servers running on AWS EC2, where it was hit by a ransomware Trojan, not unlike those seen on on-premises servers. As often occurs these days, this was another double-extortion ransomware attack and the company needed help dealing with both issues.

Luckily, thanks to the company's cloud architecture and preparedness, there were AWS snapshots of the environment going back 14 days. The attackers were unaware of the snapshots and had not disabled them in their attack. This allowed the company to immediately revert to the day before the data encryption, resolving the first part of the attack with minimal effort. That still left two challenges: the potential data leak and the eradication of the attackers from the environment.
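A rough sketch of that recovery step with boto3: find the newest snapshot of the affected volume taken before the ransomware ran, then restore it to a fresh volume. The volume ID, timestamps and availability zone are hypothetical.

    from datetime import datetime, timezone
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Hypothetical volume ID and detonation time from the investigation.
    volume_id = "vol-0123456789abcdef0"
    detonation = datetime(2022, 8, 2, 0, 0, tzinfo=timezone.utc)

    snapshots = ec2.describe_snapshots(
        OwnerIds=["self"],
        Filters=[{"Name": "volume-id", "Values": [volume_id]}],
    )["Snapshots"]

    # Newest snapshot taken before the ransomware ran.
    clean = max(
        (s for s in snapshots if s["StartTime"] < detonation),
        key=lambda s: s["StartTime"],
    )

    # Restore to a fresh volume; as the investigation later showed, the
    # original hosts were compromised, so attach it to a *new* instance.
    volume = ec2.create_volume(SnapshotId=clean["SnapshotId"], AvailabilityZone="us-east-1a")
    print("restoring", clean["SnapshotId"], "->", volume["VolumeId"])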

To address these challenges, there was a full investigation of the breach, which turned out to be quite complex due to the hybrid nature of their environment. The attackers compromised a single account with limited access, used by an IT person. They then identified a legacy on-premises server where that individual was an admin and used it to take over the Okta service account, allowing privilege escalation. Finally, using a decommissioned VPN service, they were able to hop to the cloud environment. Using the elevated privileges, they took over the EC2 servers and installed the malware.

The investigation yielded two significant findings. The first was the attack timeline. It showed that the compromise of all hosts occurred before the earliest snapshots were taken, indicating that the recovered servers were compromised and could not be used. New servers were installed, the data was transferred to them, and the original affected servers were purged.

The second finding was even more surprising. Malware analysis identified that the attackers used rclone.exe to copy the files to a remote location. The connection credentials were hardcoded in the malware, so the company was able to connect to the same location, identify its files, and remove them, eliminating the attackers' access to the data and eradicating the extortion aspect of the attack.

As these real-life scenarios reveal, attackers are infiltrating the cloud and cloud breaches are on the rise. It's time for organizations to prepare for cloud incidents. Cybercriminals are leveraging cloud capabilities in attacks, and you should use them, too, to protect your organization and prevent a crisis from hitting the headlines.


Finance Cloud Market to be Worth $101.71 Billion by 2030: Grand View Research, Inc. – Yahoo Finance

SAN FRANCISCO, Sept. 1, 2022 /PRNewswire/ -- The global finance cloud market size is anticipated to reach USD 101.71 billion by 2030, according to a new report by Grand View Research, Inc. The market is expected to expand at a CAGR of 20.3% from 2022 to 2030. Financial organizations are modernizing their processes and embracing different aspects of digital transformation owing to the convenience offered by cloud solutions. Financial institutions using the cloud model benefit from improved disaster recovery, fault tolerance, and data protection.
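For readers unfamiliar with the metric, CAGR compounds annually: end value = start value x (1 + rate)^years. A quick sanity check in Python of what the headline figures imply (assuming the 20.3% rate compounds from a 2021 base, which the report may define differently):

    # Illustrative only; the report's exact base year and value are not given here.
    target_2030 = 101.71              # USD billion, projected market size
    cagr = 0.203                      # 20.3% compound annual growth rate
    years = 9                         # 2021 -> 2030

    implied_2021_base = target_2030 / (1 + cagr) ** years
    print(f"implied 2021 market size: ~${implied_2021_base:.1f}B")  # ~$19.3B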


Key Industry Insights & Findings from the report:

In terms of solution, the security segment accounted for the largest revenue share of USD 6.63 billion in 2021 and is projected to maintain its position during the forecast period. Rising security concerns, as organizations move toward cloud-based services and tools and adopt digital transformation strategies as part of their infrastructure development, are driving the segment's growth. The governance, risk & compliance segment is expected to register the highest CAGR of 22.6% during the projected period.

Based on service, the managed segment led the market with a 64.8% share in 2021 and is expected to retain its position during the forecast period. Managed services allow businesses to outsource all or a portion of their IT operations & infrastructure so they may concentrate on their main corporate objectives. By lowering operational expenditure (OPEX) and capital expenditure (CAPEX), outsourcing enables contact center-based businesses to lower the cost of network and IT spending.

The professional services segment is expected to register the highest CAGR of 23.2% in terms of revenue during the forecast period of 2022 to 2030.

In terms of deployment, the public cloud segment held the largest revenue share of USD 10.22 billion in 2021 and is projected to maintain its position during the forecast period. As a user of the public cloud, organizations are not in charge of administering cloud hosting services. The management and upkeep of the data center where data is stored fall under the purview of the cloud service provider.

This eliminates protracted procurement procedures and the wait for operations teams to set up servers, install operating systems, and create connectivity. The public cloud also reduces expenditure because businesses only pay for the resources they use, cutting down on wasteful spending on idle resources. The private cloud segment is expected to register a CAGR of 22.9% during the assessment period.

Based on application, the wealth management segment held the largest revenue share of 29.6% in 2021 and is projected to maintain its position during the projected period. Moving wealth management systems to the cloud could assist in providing agile and flexible solutions that could help create a strategic competitive edge while positioning the business for long-term success. Companies are entering into partnerships for the adoption of cloud-based wealth management services.

The large enterprises segment dominated the market with a share of 68.1% in 2021. The small & medium enterprises segment is likely to register the highest CAGR of 24.0% during the forecast timeline. The growth of this segment is mainly due to the numerous benefits of cloud computing, including improved customer relationship management, regulatory compliance, data analysis, and assistance in detecting frauds in the financial sector.

According to a survey conducted by Ernst & Young Global Limited, a U.K.-based company, in March 2022, 39% of medium enterprises had made progress toward the cloud.

For instance, in January 2022, Avaloq, a provider of business process as a service (BPaaS) and software as a service (SaaS), announced that it is extending its long-standing partnership with RBC Wealth Management, part of the Royal Bank of Canada, throughout Asia, to switch to a cloud-based SaaS model and update the wealth management platform with cutting-edge solutions. The asset management segment is anticipated to register the highest CAGR of 23.3% during the assessment period.

In terms of end-use, the banking and financial services segment generated the largest revenue of USD 13.76 billion in 2021 and is projected to retain its dominance during the projected period. The need to distinguish and personalize services has made it essential for banks to modernize their core technology foundation to cloud-based infrastructure. This was further expedited by the pandemic's requirement for distant operations and the exponential growth of digital transactions.

For instance, in July 2020, Microsoft and Finastra, one of the largest fintech organizations, which offers solutions for the financial sector, announced a global strategic partnership to accelerate transformation in financial services. The insurance segment is anticipated to register the highest CAGR of 23.5% during the projected timeline.

North America dominated the market in 2021 with a revenue share of 35.0% and is expected to expand at a CAGR of 18.9% during the forecast period. Asia Pacific is likely to register the highest CAGR of 21.6% during this timeline, owing to the rapid increase in digitalization and sustained national investment in technological advancements. The rapid rise of banking and insurance organizations as well as the increasing demand for cloud services support the Asia Pacific market's expansion.

Read 100-page full market research report, "Finance Cloud Market Size, Share & Trends Analysis Report By Solution, By Service, By Deployment, By Enterprise, By Application, By End-use, By Region, And Segment Forecasts, 2022 - 2030", published by Grand View Research.

Finance Cloud Market Growth & Trends

The volume of data breaches has surged in recent years, forcing financial companies to step up their security measures. According to the Financial Services Sector Exposure Report 2018-2021 by Constella Intelligence, a global threat intelligence organization, there were 6,472 breaches and data leaks found between 2018 and 2021, with more than 3.3 million records stolen from 20 Fortune 500 organizations.

The COVID-19 pandemic had a positive effect on the market for finance cloud. The financial sector has significantly altered its existing business strategy, improving its business performance and modernizing the old product lines with more cost-effective strategies. To maintain effective internal operations in the event of a pandemic, banks and other financial institutions have embraced the cloud much more widely. As a result, there has been a significant increase in demand for financial cloud during this period.

The market is anticipated to benefit from strategies adopted such as frequent launches, developments, and innovations by market players in the finance cloud industry. For instance, in May 2021, Google Cloud officially confirmed the data share solution for financial services. The data share solution is created to enable sharing of market data with enhanced security and ease across the capital markets, including data consumers like asset managers, investment banks, and hedge funds, as well as market data issuers like exchanges and other providers.

Finance Cloud Market Segmentation

Grand View Research has segmented the global finance cloud market based on solution, service, deployment, enterprise, application, end-use, and region:

Finance Cloud Market - Solution Outlook (Revenue, USD Million, 2017 - 2030)

Financial Forecasting

Financial Reporting & Analysis

Security

Governance, Risk & Compliance

Others

Finance Cloud Market - Service Outlook (Revenue, USD Million, 2017 - 2030)

Professional Services

Managed Services

Finance Cloud Market - Deployment Outlook (Revenue, USD Million, 2017 - 2030)

Public Cloud

Private Cloud

Hybrid Cloud

Finance Cloud Market - Enterprise Outlook (Revenue, USD Million, 2017 - 2030)

Finance Cloud Market - Application Outlook (Revenue, USD Million, 2017 - 2030)

Finance Cloud Market - End-Use Outlook (Revenue, USD Million, 2017 - 2030)

Finance Cloud Market - Regional Outlook (Revenue, USD Million, 2017 - 2030)

North America

Europe

Asia Pacific

Latin America

Middle East & Africa

List of Key Players in Finance Cloud Market

Check out more related studies published by Grand View Research:

Fintech-as-a-Service Market - The global fintech-as-a-service market size is expected to reach USD 949.49 billion by 2030, growing at a CAGR of 17.2% from 2022 to 2030, according to a new report by Grand View Research, Inc. The increasing adoption of financial technology-based solutions and platforms globally is anticipated to drive the growth of the market. The increasing adoption of artificial intelligence, cloud-based software, and big data integrated with financial services is expected to drive the growth of the market for fintech-as-a-service.

Smart Finance Services Market - The global smart finance services market size is expected to reach USD 46.85 million by 2028 and is expected to grow at a CAGR of 2.9% from 2022 to 2028, according to a new report by Grand View Research, Inc. The crucial growth factors of the market include the growing demand for the various IoT-based ATM services, such as installation and management services, across the globe.

Artificial Intelligence In Fintech Market - The global artificial intelligence in fintech market size is expected to reach USD 41.16 billion by 2030, growing at a CAGR of 16.5% from 2022 to 2030, according to a new report by Grand View Research, Inc. Artificial intelligence (AI) is widely used in financial organizations to improvise their precision levels, enhance their efficiency and instant query resolving through digital banking channels. AI technology like machine learning can help organizations raise their value by improving loan underwriting and eliminating financial risk.

Browse through Grand View Research's Next Generation Technologies Industry Research Reports.

About Grand View Research

Grand View Research, a U.S.-based market research and consulting company, provides syndicated as well as customized research reports and consulting services. Registered in California and headquartered in San Francisco, the company comprises over 425 analysts and consultants, adding more than 1,200 market research reports to its vast database each year. These reports offer in-depth analysis of 46 industries across 25 major countries worldwide. With the help of an interactive market intelligence platform, Grand View Research helps Fortune 500 companies and renowned academic institutes understand the global and regional business environment and gauge the opportunities that lie ahead.

Contact: Sherry James, Corporate Sales Specialist, USA
Grand View Research, Inc.
Phone: 1-415-349-0058
Toll Free: 1-888-202-9519
Email: sales@grandviewresearch.com
Web: https://www.grandviewresearch.com
Grand View Compass | Astra ESG Solutions
Follow Us: LinkedIn | Twitter


View original content:https://www.prnewswire.com/news-releases/finance-cloud-market-to-be-worth-101-71-billion-by-2030-grand-view-research-inc-301616196.html

SOURCE Grand View Research, Inc.


Everything as a Service?: Government and the Cloud – Government Technology

State and local IT leaders are galloping toward the cloud, whether they know it or not.

Thirty-seven percent say they moved on-premise infrastructure to public cloud this past year, according to the 2022 CompTIA Public Technology Institute (PTI) State of City and County IT National Survey. And 32 percent said that migrating systems and applications to the cloud will be a top priority in the next two years.

Yet much of state and local cloud adoption still goes unrecognized as such, said Alan Shark, vice president of public sector and executive director at PTI. "People say they're not in the cloud very much, and when you start asking questions, it turns out that they're very much in the cloud."

Taken together, formal migrations to the cloud and adoption of cloud-based SaaS indicate a steady shift in IT resources: away from on-prem legacy solutions and toward the cloud. "It's continuing to gain traction," Shark said.

In Utah, for example, the building that housed the state's main data center is slated to be demolished, and CIO Alan Fuller is seizing the moment to jump-start his cloud migration. "We're hoping to get better scalability, elasticity, security, redundancy and lower cost," he told GT at the NASCIO Midyear Conference in May. "We have a multi-vendor cloud strategy and we want to move as many services and applications as we can from on-premise to the cloud."

Here, a range of state and local leaders help to paint a picture of the state of cloud adoption: the early wins, and the challenges yet to be faced.

The push came from a Deloitte assessment, which found in part that the state could uncover a lot of savings by consolidating its data center investments. "Through cloud migration, we could avoid a $15 to $30 million investment that would have been necessary just to keep our primary data center facility going," Sloan said.

The state subsequently demolished its primary data center. (The site is now a parking lot.) Sloan reports that 80 percent of that activity moved directly to the cloud, and an additional 5 percent migrated to a shared data center with a much smaller physical footprint.

In the process, the IT team identified about 96 data centers across state agencies, including everything from formal data centers to servers running in spaces under desks. Ninety-two of those have since been decommissioned, with only a few outliers kept online.

In order to ensure the push to cloud aligns with specific agency needs, the IT team has established relationships with multiple cloud service providers, including Amazon Web Services, Microsoft Azure, Google Cloud and IBM's Z Cloud.

The way a particular agency's infrastructure has matured over time influences which cloud option is best suited to its needs. The child protection agency, for example, had already replaced an aging case management system with a solution that was based on Microsoft Dynamics. "That went into the Dynamics 365 cloud," Sloan said. "We've allowed the agencies to have a voice in selecting the cloud providers that best meet their needs."

It shouldn't come as a surprise that Sloan makes a strong case for the benefits of cloud adoption. As government services and data shift to the cloud, he said, "you inherently gain access to scalability and elasticity, as well as a whole set of feature functions that you don't inherently have in your current on-premise data center environment."

There's a benefit on the personnel side as well. "You get people out of having to do the managing, the administration, all the care and feeding for physical servers. Now the same number of people are able to do more things. They have more time to put into more valuable activities," he said.

One big adjustment has been the shift from capital to operational budgeting. Here, Sloan advocates for a gradual approach. "It takes time. If you try and flip a big switch, there can be some real challenges, depending upon where you are in your capex cycle," he said.

"If you're not at the end of a capex cycle, you potentially end up having duplicate costs," he said. "So we work with our people to plan ahead. We will say: You're not going to be buying new servers the next time this comes around, so if you're in year two of your five-year server cycle, plan now. You'll need to make the transition when those are up on their life cycle."

The city has its office productivity tools and online collaboration tools in the cloud, along with an e-signature platform and a records request tool. Next up: enterprise resource planning for human capital management and finance, followed by public safety.

"We will have a big ERP system launching here the first week of October, and right on the tail of that, probably in late January or early February, we're doing a major public safety systems swap-over to cloud," said CIO Chris Seidt.

While the transition has gone smoothly, Seidt is candid about the bumps he's hit along the road, and the need to manage thoughtfully through the details of a cloud migration. When subscribing to cloud services, for example, it's important to pay close attention to ensure you are subscribing to everything you need, he said.

Because there may be different tiers of subscription models for different cloud services, "you really have to look carefully at what is included in that, because it may not include all of the pieces that you would need as an organization in order to maintain your security posture, in order to gain access to functionality," he said.

In fact, Louisville initially undersubscribed for its office productivity suite. "It lacked some of the practical security aspects that really needed to accompany it," Seidt said. "If you stepped up a tier, a lot of those security tools were just included. Of course, there's a cost jump there too."

In addition to working through those technical details, Seidt has also focused heavily on change management.

"One big barrier is staff willingness to go along," he said. "I've got a couple systems engineers that have been with me for decades, and now we're asking them to do cloud. So one of the first steps was to make sure people got comfortable. When people are worried about losing their jobs, you have to get out ahead of that. We certainly didn't go into our cloud journey looking to reduce our headcount. If anything, we wanted to repurpose it."

To that end, the city's cloud journey has included not just messaging about the virtues of cloud, but also training to reskill IT for the task that lies ahead. Seidt said he's quadrupled the training spend since 2018, in an effort to upskill staff for the cloud transition.

"When you let them know that you're willing to train them in those new skill sets, it makes them more engaged," Seidt said. "Some of our systems guys are our biggest champions now, because they see the bigger picture."

While others work with multiple cloud providers, Seidt for now is focused solely on AWS engagements. With a single provider, he said, he can more easily control and manage his cloud resources.

At the same time, Seidt has been working with others in the city to repurpose the data center space he's no longer using.

"In a lot of city governments, the data center serves as more than just a place to house compute and storage. It may also have your network core redundancy. It may also have fiber terminations for municipally owned fiber. There's a lot of other things that live there," he said.

As he moves to decommission compute and storage, he's looking for opportunities to refresh the HVAC systems for downsized demand and potentially repurpose those spaces for other uses.

As for the shift from capex to opex spending, Seidt said good communication is the key to success. "It does require a lot of cooperation with your chief financial officer and your elected officials, making sure that they understand the business case for it and the benefits that come with it," he said.

"We have been slow to embrace the cloud," said Director of Information Services Eric Romero. "We're so busy here, it is hard to find the time to get up to speed, to learn the newer technologies and to get to that comfort level."

The city has nonetheless already shifted a few applications to the cloud, including 311, permitting and a nearly billion-dollar road program that's leveraging cloud-based tools for project management. Romero said he's already seeing the benefits of a modernized approach. "The upside is certainly that it's less taxing on my department, my resources, my staff," he said.

Still, Romero has questions. He's particularly concerned about data portability, or the lack thereof. "Once all the data is in the cloud, how do we get it out of the cloud in a way that we can either use it for historical purposes or convert it for use in another system?" he said, noting that his sense is that cloud service providers don't want to make this easy.

He said he'd like to see more data on cloud reliability, too: downtime metrics and the like. He'd also like some assurance that a cloud provider would be able to respond quickly to an urgent need.

"When we have an issue, something that might not seem critical to them might be extremely critical to us. If we had it in-house, my guys would be working on it 24 hours a day until it got resolved," he said. "That might come through with the contract process, but will they really understand it?"

At this point, when he does make a move, it's typically a matter of cloud by necessity. "For example, we're running an old version of Exchange for email and every time that we get a patch from Microsoft, we have issues getting the servers back online," he said. "We just finally came to the conclusion that we had to move."

In tandem with such efforts, Romero has been communicating with folks on the budgeting side to ensure a smooth transition to opex as needed.

American Rescue Plan money is helping bridge the gap between capex and opex for the moment, but eventually those new cloud-based contracts will come due again, maybe two or five years down the road, and Romero said it will be important to have solidified the funding model in support of that spending by then.

"I was very upfront with our finance folks and the administration here, saying: These are my needs and we are going to use ARP funds for this, but we've got to address the budget or else in five years we're going to be cutting these services."

In the long run, he said, changes to the budget process and incremental steps toward the cloud together will put the IT team on a better footing. Cloud will ultimately lower the management burden, freeing talent to tackle higher-level tasks. And a move to the cloud could also make it easier to hire the people he needs.

"As candidates come in and they see that you're not fully embracing the cloud, that can be a detriment to their considering the position," he said. Seen in this light, cloud becomes increasingly a must-have for state and local governments looking to attract the best and brightest in a highly competitive talent market.

More to the point, Romero describes cloud as an inevitable technological evolution.

"We know that this is coming. We're seeing more and more: not just software applications, but solutions that are being pushed to a subscription-based model, most often cloud-based," he said. "We know that it's coming and at some point, we will get there."


From 5G to 6G: The race for innovation and disruption – TechRepublic


Connectivity is all about faster, better and increased data transfer between endpoints. The race for wireless connections, beginning in 1979 with the first 1G technology deployed in Tokyo by Nippon Telegraph and Telephone (NTT), has led the world to 5G and 6G four decades later.

McKinsey Technology Trends Outlook 2022 reveals that advanced connectivity, which includes 5G, 6G, low-Earth-orbit satellites and other technologies, is driving growth and productivity across industries with an investment of $166 billion in 2021. Unlike other new technologies like artificial intelligence (AI) or mobility, the technology has a high adoption rate.

In a report shared with TechRepublic, Market Research Future explains that the COVID-19 pandemic was a significant catalyst for implementing 5G globally.

"With the power to transform industries faster, with greater capacity and less latency, 5G tech will impact transportation, banking systems, traffic control, remote healthcare, agriculture, digitized logistics and more," Market Research Future says.

New technologies like AI, machine learning, industrial Internet of Things (IIoT), new intelligent cars, and augmented and virtual reality applications in the metaverse also require faster download times and increased data communications in real-time. 5G and 6G are expected to boost these new trends.


Market Research Future explains that the deployment of 5G does not come without challenges. The standardization of spectrum and the complexity of 5G network installation are the most prominent. MIT Tech Review adds that 6G will also face challenges and will require cross-disciplinary innovation, new chips, new devices and software.

The next generation of cellular technologies offering higher-spectrum efficiency and high bandwidth have seen their share of debate. As McKinsey explains, many still wonder if 5G can completely replace the 4G LTE network and what percentage of networks will have 5G.

By May 2022, the Global Mobile Suppliers Association had identified 493 operators in 150 countries investing in 5G technology, plus an additional 200 companies with technology that could potentially be used for 5G. New announcements of smartphones with 5G rose by 164% by the end of 2020, and cataloged 5G devices increased by 60%.

While new consumer products have rapidly adapted to 5G capabilities, industrial and business devices have not.

"Shifting from 4G LTE to private 5G may not be cost-effective for all players; this would depend on a player's technological aspirations and planned use cases," McKinsey said.

Market Research Future explains that $61.4 billion is driving this very competitive market, which is expected to reach $689.6 billion by 2027. But infrastructure equipment, device and software providers have been restraining growth.

MIT explains that 6G shares similar challenges with 5G but also presents new ones. 6G engineers must work on infrastructure, devices and software to build the next generation of communication systems. 6G connectivity cannot be achieved by simply scaling or updating today's technology.

MIT adds that 6G uses more sophisticated active-antenna systems, which integrate further with other radio access technologies such as WLAN (wireless local area network), Bluetooth, UWB (ultra-wideband) and satellite. Fitting all this tech into a smartphone requires reimagining components like chips and radio transceiver technology.

"This will require very creative electrical and computer engineering as well as disruptive industrial engineering and power management," MIT explained.

New 6G chips are essential to handle the increased computing power. Low latency, the capacity to process a very high volume of data messages with minimal delay, is already a challenge for 5G and will be even harder to achieve with 6G.

Low latency is essential for interactive data, real-time data and applications, and virtual environments or digital twins. These are all requirements for AI, the metaverse and the industrial sector. 6G latency will be reduced by using nearby devices, creating a signal on a 3-dimensional network.


To solve these problems, new semiconductor materials, intelligent surfaces, AI and digital twin technology developments are being used to test concepts, develop prototypes, and manage and enhance the network.

McKinsey stresses that 5G has shown that only a few telecommunications companies have been able to monetize 5G well enough to get a good return on investment (ROI). Therefore, capital expenditures and maintenance costs will be closely watched. Additionally, large capital investments are required to build new technology and networks, representing another business challenge.

In its Dresden plant in Germany, Volkswagen replaced wired connections between machinery and now updates finished cars with over-the-air updates and connects unmanned vehicles with edge-cloud servers. Michelin uses new connectivity technologies for real-time inventory management, and Bosch equipped their first factory with 5G, enabling automation, connecting hundreds of end-points and synchronizing robotics with human factory workers. These are just some examples McKinsey gives of how advanced connectivity is disrupting industries.

Connectivity is expected to increase the annual rate of data creation by up to 25%, connect 51.9 billion devices by 2025 and impact the global GDP (gross domestic product) by more than $2 trillion. Additionally, 5G and 6G are expected to contribute to closing the digital divide allowing hundreds of millions of people to be connected for the first time.

In automotive and assembly, 5G and 6G are used to enhance maintenance and navigation, prevent collisions and drive the first fleets of autonomous vehicles. Healthcare devices and sensors connected to low-latency networks will improve patient treatment and monitoring with real-time data, significantly impacting treatment for patients with chronic disease that require constant checks.

Aerospace and defense are using 5G to boost their capacity and performance, while retail has improved inventory management, supply chain coordination and payment process and has created metaverse experiences thanks to the technology. The construction and building industry is printing 3D structures and using high-speed digital twins and applications, and the mining and natural resources sector is turning to smart exploration and exploitation with the digitalization of practices and automation of operations.

Leaders from almost every industry are considering engaging with new connectivity technologies. McKinsey says they should consider advanced connectivity a key enabler of revolutionary capabilities. From digital transformations to driving efficiency through automation and enabling technologies reliant on high-quality connectivity, such as cloud computing and IoT, connectivity will continue to drive the way the world works and lives.


Ransomware attackers expand the attack surface. This Week in Ransomware Friday, Sept 2 – IT World Canada

Ransomware continues to grow and expand, both in the number of attackers and the number of potential victims. This week we feature some of the attackers' strategies described in recent news items.

What's next: ransomware in a box? New Agenda ransomware can be customized for each victim

A new ransomware strain called Agenda, written in Google's open source programming language Go (aka Golang), was detected and reported by researchers at Trend Micro earlier this week. There has been a trend toward using newer languages like Go and Rust to create malware, particularly ransomware.

The fact that many of these languages operate cross-platform makes them a much greater threat. Go programs are cross-platform and stand-alone: they compile to self-contained binaries that run without any Go runtime installed on the host system.

In addition, the creators have added a new wrinkle, making this new variant easily customizable. The strain is being sold on the dark web as ransomware-as-a-service (RaaS). Qilin, the threat actor selling it, claims affiliates can easily customize the ransomware for each victim.

Finally, Agenda has a clever detection-evasion technique also used in another ransomware variant, REvil: it changes the user's password and enables automatic login with the new credentials. This allows the attacker to reboot into safe mode and control the victim's system.
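Defenders can hunt for that configuration directly. Below is a minimal, Windows-only detection sketch in Python that flags the auto-logon registry values the technique depends on; it is an illustrative heuristic, not Trend Micro's detection logic, and legitimate kiosk setups also use these values.

    # Windows-only: uses the standard-library winreg module.
    import winreg

    WINLOGON = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon"

    def autologon_configured() -> bool:
        # Check for the auto-logon values an attacker would set.
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, WINLOGON) as key:
            try:
                enabled, _ = winreg.QueryValueEx(key, "AutoAdminLogon")
                user, _ = winreg.QueryValueEx(key, "DefaultUserName")
            except FileNotFoundError:
                return False  # values absent: auto-logon not configured
            return str(enabled) == "1" and bool(user)

    if autologon_configured():
        print("Auto-logon is enabled -- confirm this is intentional")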

Trend Micro reported that this allowed one attacker to move from reconnaissance to full-fledged attack in only two days. On the first day, the attacker scanned a Citrix server, and on the second day mounted a customized attack.

For more information you can review the original Trend Micro posting.

New Linux ransomware families

Another way that threat actors are expanding the attack surface is by targeting Linux, one of the predominant operating systems used on internet and cloud servers. RaaS offerings are increasingly targeting Linux systems.

Although regarded as a very secure operating system, and despite a consistent move to patch vulnerabilities, the large number of Linux offerings used world-wide ensures there are a significant number of vulnerabilities at any given time. Failure to update and patch systems creates a large potential target base.

But software vulnerabilities are not the only area of weakness. Configuration mistakes are often the more likely factor in the breach of a Linux system, according to researchers at Trend Micro.

Remarkably, these include easily remedied configuration issues.

To quote Trend's report, "given the prevalence of Linux, ransomware actors find the operating system to be a very lucrative target."

Ransomware going to the dogs is no joke

As RaaS and customizability become more and more prevalent, there's an increasing ability to target smaller and more specific groups. We are familiar with ransomware attacking healthcare organizations, but recently the United Veterinary Services Association wrote to its members with recommendations to increase ransomware prevention after an attack hit more than 700 animal health networks around the world.

It is a reminder that no group, regardless of size or type of business, is immune to ransomware. Every organization must communicate the need to have, at a minimum, the basics of ransomware protection in place.


The Network Binds The Increasingly Distributed Datacenter – The Next Platform

Before founding software-defined networking startup PlumGrid and then moving to VMware when it bought his company in 2016, Pere Monclus spent almost 12 years with Cisco Systems, at a time when much of enterprise networking was still in the corporate datacenter and the shift to network virtualization and the migration to the cloud were just getting underway.

Cisco was dominant in the datacenter networking space and fed organizations a steady stream of hardware, from routers to switches to silicon. The company carried an expansive view of its role in networking.

"At Cisco, we were thinking always we have to control the end-to-end of the network," Monclus, vice president and chief technology officer of VMware's Networking and Security business unit, tells The Next Platform. "The idea was we have to control the edge of the network so the core doesn't fall, because the core was where most of the markets were. We would have core routers, core switches and then take it all the way to the access to create the end-to-end networking as a principle, because from a Cisco perspective, what we were delivering was an end-to-end connectivity solution with our protocols."

About a year after Monclus left Cisco to found PlumGrid, VMware bought Nicira for $1.26 billion, a move that allowed the company, already a significant datacenter presence through its server and storage virtualization, to absorb networking into its increasingly software-defined world. NSX and networking have evolved over the past ten years to become a key part of VMware's own adaptation to an IT world that has broken well beyond the datacenter boundaries and out to the cloud and the edge. With containers, microservices and Kubernetes, software now dictates to hardware rather than the other way around.

It's also a world where the network is now the tie that binds this increasingly decentralized IT environment, becoming the main thoroughfare for applications and data moving between the datacenter, cloud and edge, and a central focus of organizations' security measures. All this was on full display this week at VMware's Explore 2022 conference, which allowed the company to tout its ongoing expansion into the cloud and out to the edge, and its networking portfolio's central role in helping to make this happen.

The evolution of networking at VMware has taken several steps, Monclus says. At the time of the Nicira acquisition, enterprises would spend weeks or months putting the network in place before the applications that would run atop it could be put into production.

When VMware got into networking, the company heard from customers that they could quickly create an application and get a server up and running, but it took them weeks to configure the network, he says. "We started that journey with network virtualization and the first story [for networking] was about automation and agility. The question was: if I create a VM, could I just connect it to the network and give it an IP address? That was kind of the early days of network virtualization."

As more workloads and data were making their way out of the datacenter, security of the network became increasingly important, which is why VMware embraced micro-segmentation, a way to manage network access and separate workloads from one another to reduce an organization's attack surface and more easily contain breaches by preventing the lateral movement of attackers. The acquisition two years ago of network security startup Lastline helped fuel the vendor's distributed IDS/IPS technology to complement the east-west protection delivered by micro-segmentation.

In June, the company added to its lateral security for network and endpoint technologies with a broad threat intelligence capability called Contexa. It sits in the infrastructure and offers visibility into both traditional and modern applications.

VMware over the years has put networking and security capabilities into the hypervisor and made them available as services in its own cloud offering and those of hyperscalers like Amazon Web Services and Google Cloud. It's also making NSX and its growing security capabilities, including those from Carbon Black, which it bought in 2019 for $2.1 billion, key parts of the multicloud strategy.

The vendor at Explore rolled out a broad range of enhancements to its networking and security portfolio, all aimed at making it easier for enterprises to manage and secure their multicloud environments. It also gave a look at what the near-term future holds with the introduction of a number of network- and security-focused projects.

VMware is embedding network detection and visibility capabilities into Carbon Black Cloud's endpoint protection program, a move that is now in early access and brings together visibility into both the network and endpoints. It also is adding threat prevention tools like IDPS, malware analysis, sandboxing and URL filtering to its NSX Gateway Firewall, and enhanced bot management to the NSX Advanced Load Balancer (ALB). The last two, along with Project Watch, which aims to bring a continuous risk and compliance assessment model to multicloud environments, are part of VMware's Elastic App Secure Edge (EASE), a strategy announced last year to offer a range of data plane services around networking and security.

As we noted earlier this week, VMware also is embracing data processing units (DPUs) from Nvidia for a number of its cloud-based offerings, including vSphere 8 and, in this case, NSX. Cloud providers like AWS and Oracle already are using DPUs, and many in the industry believe that servers and other hardware will routinely include the chips in the near future. Monclus says customers will gravitate toward DPUs or smartNICs for performance and security. For organizations like telcos, which demand high performance and whose datacenters are revenue-generating facilities, enabling CPUs to offload networking or compute tasks to DPUs is attractive.

There is a tradeoff: they may save 15 percent in CPU utilization, which they can sell back to customers, but there is also the cost of the DPUs themselves. Where datacenters are a cost factor, however, the draw is increased security through the workload isolation the DPUs offer, and that likely will be a fast-growing use case for the chips, Monclus says.

Looking to the near future, VMware offered a look at Project NorthStar and Project Trinidad, along with the aforementioned Project Watch. Project NorthStar, now in technical preview, is a software-as-a-service (SaaS) network and security offering that will deliver services, visibility and controls to NSX users, who can manage them via a central cloud control plane.

The services include VMware's NSX Intelligence, ALB, Network Detection and Response, and Web Application Firewall.

"We are taking the control plane of NSX and turning it into a SaaS service to enable true multicloud solutions," Monclus says. "When we have a policy as a service, it works on vSphere environments but it works across VMware Cloud, VMware Cloud Network, AWS, Google, Azure, and we have the same advanced protection, we have the same load balancer."

Both Project Trinidad and Project Watch are aimed at addressing the needs of modern workloads, he says. They're not tied to physical endpoints; instead, the API becomes the endpoint. Project Trinidad uses AI and machine learning models to learn what normal, expected east-west API traffic patterns between microservices look like, so that anything anomalous can be quickly detected.

"We basically discover all the API, the schemas, API data and we create a baseline and we can start from the baseline," Monclus says. What Project Trinidad introduces is AI/ML-driven deep correlation between workflows and microservices.
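Stripped of the ML, the baseline idea can be illustrated with a toy sketch: record the (caller, callee, endpoint) triples observed during normal operation, then flag anything outside that set. Project Trinidad itself builds statistical models of API schemas and data rather than a literal set lookup; the service names below are made up.

    from typing import NamedTuple

    class ApiCall(NamedTuple):
        caller: str    # microservice making the request
        callee: str    # microservice receiving it
        endpoint: str  # API route, e.g. "GET /orders/{id}"

    baseline: set[ApiCall] = set()

    def learn(call: ApiCall) -> None:
        # Training window: record normal east-west traffic.
        baseline.add(call)

    def is_anomalous(call: ApiCall) -> bool:
        # Enforcement: anything never seen during training is flagged.
        return call not in baseline

    learn(ApiCall("cart", "orders", "POST /orders"))
    learn(ApiCall("orders", "billing", "POST /charges"))

    # A cart service suddenly reading billing data stands out.
    print(is_anomalous(ApiCall("cart", "billing", "GET /charges")))  # True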

As noted, Project Watch brings continuous security, compliance and risk assessment, as well as automated and encrypted connectivity across clouds, spanning AWS, Google Cloud and Microsoft Azure virtual private clouds (VPCs) and virtual networks (VNETs), and security operations, and it integrates workflows from such areas as security and cloud operations and lines of business onto a single platform.

It also addresses the challenge of not only enabling networks and security to adapt to modern workloads but also ensuring that legacy hardware that can't make that change stays secure.

VMware will assess and report the security risks enterprises face, giving them the data they need to make decisions, he says, adding that the vendor wants to create a continuous monitoring model akin to high availability, which uses the metric of three 9s, four 9s, and so forth. "We are trying to create a metric of how well you're running your datacenter or your applications from across security points."


Opinion: The line between data and privacy is measured only by success – Gambling Insider

If there is a modern subject of discussion which elicits a strong response from the public, it is that of data privacy.

Tech businesses such as Apple have made a conscious push towards informing the public about the subject, while making their products ever more resistant to underhanded data retrieval.

In the gaming industry, data has become a way of life. Every bit that can be used is used by the industry to target new players, improve the flow of casinos (making it easier and more profitable to attract customers) and generally track market trends.

However, there is a catch.

At G2E Asia this year, Qlik's Senior Director of Solutions and Value Engineering, Chin Kuan Tan, revealed the results of Qlik's research into player preferences relating to data usage in the gaming and hospitality industry, which threw up some interesting conundrums.

Tan's presentation showed that 72% of people will stop engaging with a company completely if they have concerns over data collection, while 76% of players prefer hyper-personalisation over mass-marketing techniques.

The duality of these two statistics shows that gaming operators are walking a knife edge when it comes to how the data gleaned from customers is used; and with the increased focus on data on an individual scale, the manner in which operators market themselves to customers has to evolve.

The more the industry uses servers and algorithms to solve and modernise everyday tasks, the more it relies on data collection to operate. This is something that Oosto CMO Dean Nicolls spoke about in a recent interview with Gambling Insider in relation to the company's facial recognition software.

When asked how Oosto's system protects the faces of the millions of people who enter the locations where its technology is used, Nicolls spoke in depth about the subject in the upcoming September/October edition of Gambling Insider magazine:

"You might think a lot of the data is traversing from the casino to our central servers or to our cloud servers; that's not the case. Everything is done locally. Traditionally, in a Vegas casino, all the servers are sitting on the premises and they are running our algorithms themselves, so we're not getting that data on our servers. Now, naturally any data that goes from the camera to servers still needs to be encrypted; and it is, both in transit and at rest, but it isn't going anywhere on our servers."

The Oosto CMO's comments show a willingness to please the audience, though perhaps they also show a brushing-off, a desire to get around the question as quickly and easily as possible without being drawn into a larger conversation about ethical data practices.

On the whole, Oosto appears to do a good job of protecting the data of innocents, a difficult task when your business relies on filming and recognising large quantities of people en masse. However, the failure to categorically explain the safety precautions in place, beyond using the term "encrypted", feels telling.

Data and the gaming industry is an odd mix, then.

In the modern day, the industry demands that players be protected from those who would do harm with their data, while players themselves are ready to quit if they feel vulnerable for even a second when signing up for a service.

Gaming companies want to use data to further the consumer experience while retaining and reassuring customers that any data provided will not be sold or used in other, nefarious ways, as has been reported frequently since Edward Snowden's revelations in 2013.

Customers want what they always have wanted: a seamless service that benefits them without risk. But with the online nature of the modern world, this risk is accepted as long as it is mitigated, leaving gaming companies juggling the subjects of personalised experiences, data loss and customer satisfaction.
