
New 5 Star Community Center will support veterans and community groups – SC Times

Following a move, a former St. Cloud business building is now a new community center to be used by veterans groups, the St. Cloud Police Department and other community groups.

The former 5 Star Plumbing, Heating and Air location at 1522 Third St. N in St. Cloud is now the 5 Star Community Center. Owner Dave Sherwood said the company donated the location after its move to Sauk Rapids.

Sherwood said he has several veterans in his family, and he saw them go through difficult times and need support. When he saw the work St. Cloud Stand Down does, he was impressed with the program.

"It was a really easy choice for me to make," Sherwood said of the building donation.

St. Cloud Stand Down is a non-profit veterans organization formed in 1998 that focuses on providing resources to veterans, including providing clothes and necessities, haircuts, meeting spaces, connections and help accessing assistance.

More: St. Cloud Stand Down sharpens focus on serving female veterans with new boutique

Bob Behrens, president of St. Cloud Stand Down, said the organization's presence at the building will be relatively minimal because the group already has a location at 724 33rd Ave. N, across from the former Electrolux building. However, he said the new community center will "be very well utilized" by the police department, veterans organizations and the community.

St. Cloud Stand Down owns the building, but the St. Cloud Police Department will manage the community meeting rooms in the basement and the Disabled American Veterans Organization will have a drop site for its clothing donation program in the back portion of the main floor. The main level also has space for veterans groups and community groups to meet, and will be managed by St. Cloud Stand Down.

St. Cloud Police Department Sgt. Tad Hoeschen said the department plans to use the space for community engagement work in the neighborhood. That could mean hosting neighborhood meetings, doing outreach with local businesses and community members, hosting parent education or working with other community partners.

"Honestly, the sky's the limit," Hoeschen said.

The community center won't be occupied like the COP House is, but Hoeschen said it's another opportunity to "meet people where they're at."

"This is just one more option for us," Hoeschen said.

More: St. Cloud Police Department makes strides in community policing

The building was signed over to St. Cloud Stand Down in the summer, while Sherwood was in the hospital with a serious case of COVID-19 that he didn't expect to outlive. He signed the paperwork from the hospital, he said.

"I wanted to make sure no matter what happened to me, we got the place (taken care of)," he said.

Sarah Kocher is the business reporter for the St. Cloud Times. Reach her at 320-255-8799 or skocher@stcloudtimes.com. Follow her on Twitter @SarahAKocher.



OVH: The cloud should be open, reversible, interoperable – The Register

Interview OVHcloud is perhaps best known for cloud computing, hosting, and dedicated servers in its network of datacenters, but the company has also made news in other arenas: taking on Microsoft in the European courts, for instance, or the fire at one of its datacenters that destroyed customer data.

Backups? It's the cloud, right?

CEO Michel Paulin addresses the fallout from the March 2021 fire during a chat with The Register. "We have decided to increase the level of resilience," he says, "above all the regulations."

This comprises, according to Paulin, containerization, fire extinguishing systems, and batteries. The opening of a new datacenter is being delayed "just to readapt with higher standards of security and safety of our servers."

A recent report on the conflagration highlighted a lack of an automatic fire extinguisher system and a wooden ceiling that was only rated to resist fire for an hour. Expensive lessons, we'd wager, were learned.

Affected customers who assumed their data was safe were not pleased. A class-action lawsuit was filed over the situation and Paulin notes that more clarity is required over who is responsible for backups. "Many customers, especially small customers, were not really very interested in that and they believed 'because it's in the cloud... my backups are secured.'"

The devil will be in the legal detail. As for the assumption that backups were part of the deal in the time before the "incident," Paulin says: "In fact, it was not. So we decided to clarify that."

Backups are now the default and it is up to the customer to opt out.

It's a timely reminder to all those who have shoveled their data into the cloud over the years to check their contracts to see exactly what Ts&Cs they and their vendor have agreed.

OVHcloud's legal team was also recently called to action for an entirely different reason. Along with several other companies, OVH has filed a complaint with the European Commission's antitrust unit over how Microsoft runs its licensing operation. Namely, trying to run the Windows giant's services somewhere that isn't Azure can result in some hefty fees. Not ideal for competing cloud vendors.

"We see today that there are a lot of practises to avoid [allowing] the customer to choose, but also to use trojan horses to impose their cloud. And this is a type of claim we've made against Microsoft," says Paulin.

CEO Michel Paulin, left, and Hiren Parekh, VP for Northern Europe

"Because we are not on Azure, and we refuse to resell Azure, the conditions financially, legally, and technically are very different. We pay more... if we agree to resell Azure, most of them change; the pricing changes, it's less expensive..."

Unsurprisingly, Paulin also isn't keen on the antics of the other major cloud vendors, such as Google and AWS. "I think it's not good for customers. Because in the end, if only one or two companies or three companies have 100 per cent of the market, innovation, pricing, everything will be very bad."

In calendar Q1, AWS, Microsoft and Google sucked in 62 percent of customers' spending globally on infrastructure cloud services, according to Canalys.

Paulin worries that due to their size, dominance, and deep pockets, the big tech vendors "have a lot of capacity to impose their solutions on the rest of the world and on the rest of the customers," thus eroding freedom of choice and innovation in the long term.

"Each time there is a new player, which is threatening a little bit, this type of monopoly immediately buys them or they are killed," he tells us.

It all sounds awfully familiar to anyone who has followed the activities of some of the current cloud giants over the years, even those that seem to have a born-again attitude to openness nowadays.

"The cloud should stay open," says Paulin. "Reversible, interoperable."

By "reversible," the CEO refers to the occasionally heart-stopping fees demanded by vendors for extracting a customer's data. "Interoperable," however, brings the EU GAIA-X initiative to mind.

"The principle of GAIA-X," explains Paulin, "is to maintain openness. On paper, everybody is OK; the governments, the customers, and industry."

For the uninitiated, GAIA-X is Europe's data infrastructure initiative to take on the US and Chinese cloud businesses, the idea being to address the concept of digital sovereignty. It started off with 22 members and has swelled to 343, including Microsoft, Amazon, Google Ireland and other titans.

Paulin used the word "sabotage" in reference to some of the players in the project. "They don't want to make in practice the objective, which is to create an open cloud," he said.

As the charge to the cloud accelerates, there is a danger that by the time GAIA-X is ready to go, the world will have moved on to the extent that the initiative is irrelevant. "Yes, this is a threat," Paulin admits. "Speed of execution [and] speed of implementation will be a key factor of success."

"As we are many, it's difficult to find sometimes a consensus to be able to move forward quickly," he adds. And, of course, the processes could end up being delayed by other factors (or contributors perhaps more keen on the status quo), "to be sure that they will never be visible, and they will never be implemented."

According to Paulin, 80 percent of OVHcloud's revenues come from Europe, although the company is keen to expand outside of the region. He also reckons the future is not just hybrid, but also multi-cloud as customers minimize their exposure to local regulations.

"Multi-cloud is the fact that the customer will have the capacity to give workloads, storage, and to exchange data across the different vendors easily. And not very costly. And again, sometimes to be compliant with the constraints of data sovereignty or any type of regulations."

OVHcloud also makes its own servers, and while Paulin is keen to boast of the company's approach to sustainability thanks to its designs that require "zero air-conditioning in our datacenters," OVH is up against the supply chain issues faced by the rest of the industry.

"We have a capacity to switch our CPUs and GPUs and to offer a similar type of services but with different hardware, depending on the how the supplies are working.

"We have stocks," he adds, which goes some way to mitigate the supply chain risk and keep its French and Canadian production lines ticking over.

OVHcloud's stock price is currently not far off its IPO price from last year, having risen as 2021 drew to a close and subsequently fallen back during 2022. It reported €382 million in revenues for its first half-year 2022 results, an increase of 13.3 percent year-on-year. The company subsequently raised its revenue growth guidance to 15-17 percent from 12.5 to 15 percent. France, however, remains the big beast in terms of its revenues, accounting for nearly half at €190 million.


NZ could become land of the long ‘green’ cloud, NZTE report finds – Stuff

New Zealand is well-placed to build a green data centre industry providing cloud computing services to Australians, a report commissioned by NZTE has concluded.

But the report by management consultant Analysys Mason found high wholesale electricity prices could be a fly in the ointment, and that the country was likely to face competition from Tasmania, which shares much of the advantage of New Zealand's cool climate.

Technology giants Microsoft and Amazon Web Services (AWS) have budgeted billions of dollars to build new data centres in Auckland that will allow them to serve more of their customers locally, instead of from Sydney and further afield.

AWS has estimated it will invest $7.5 billion in its New Zealand facilities over 15 years.


But Analysys Mason also emphasised the potential for the South Island to become a hub for serving up cloud computing services to Australians from green data centres, following the expected availability of a new submarine fibre-optic cable network connecting Southland to multiple continents from 2025.

NZTE general manager of investment Dylan Lawrence said the report showed New Zealand, and specifically the South Island, are strong potential locations to build green data centres and serve overseas demand.

"Data centres all over the world are power hungry. Building them here in New Zealand means investors and centre operators can take advantage of our renewable energy sources," he said.

"In the South Island, they can also take advantage of the lower temperatures, which in turn helps lower the centres' emissions."

Analysys Mason estimated that co-location data centres set up to provide services to third parties could be raking in annual revenues of $898m a year in New Zealand by 2030, almost double their revenues last year.

It also forecast the power requirements of data centres in New Zealand would grow from 81 megawatts last year to 303MW by 2030, which would be equivalent to a little over half of the demand of the Tiwai Point aluminium smelter.
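As a rough cross-check (our arithmetic, not the report's), growth from 81MW in 2021 to 303MW in 2030 implies demand compounding at about (303/81)^(1/9) - 1 ≈ 16 per cent a year; and if Tiwai Point's commonly cited draw of roughly 570MW is assumed, 303/570 ≈ 0.53, which is consistent with the "little over half" comparison.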

Lawrence said the further development of the industry would have a number of benefits.

"More data centres here will help support our businesses to move to the cloud, improve connectivity and decrease our reliance on offshore centres. And it will provide a potential export opportunity and therefore income."


The data centre industry has been booming worldwide, creating weightless export opportunities for countries with cool climates and cheap renewable power.

New Zealand-founded technology company Datagrid is planning to build a huge data centre on a 43 hectare site in North Makarewa, near Invercargill, with an eye to serving the Australian market.

Its chief executive, Remi Galasso, is also the driving force behind the venture to connect Southland to Australia, North America and Southeast Asia with the Hawaiki Nui cable network, and is involved in a Chilean government initiative that could connect Southland to South America and potentially Antarctica.

Analysys Mason said Southland's 240 terabit-per-second connection on the Hawaiki Nui cable network would give it an advantage over Tasmania in becoming a hub for Australian cloud computing services.

Two existing internet cables connecting Tasmania had a limited capacity of 1Tbps each while a third cable had multiple reliability issues, it said.


Tasmania is the obvious regional competitor to Southland in the bid to build a bigger data centre industry, but both may share the spoils, a report suggests.

Those cables were also not connected directly to landing stations in Sydney or Melbourne.

But spot electricity prices were much lower in Tasmania at about 2.9 US cents (NZ 4.6c) a kilowatt-hour and all of its electricity was renewably generated, Analysys Mason said.

Spot electricity prices have hovered above NZ 20c/kWh in New Zealand for much of this year and spiked above 50c/kWh on Friday.
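For comparison (our arithmetic, not the consultant's), 20c divided by 4.6c is roughly 4.3, so New Zealand spot prices have lately been running at more than four times Tasmania's equivalent rate.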

But data centres in New Zealand could still be competitive if they negotiated favourable supply contracts direct with generators, the consultant said.

Overall, it expected the South Island and Tasmania might share the spoils.

Galasso said Southland and Tasmania were the only locations in the region that had the advantage of large hydropower stations and cool weather.

The demand for electricity from the data centre industry was likely to be such that both would become hubs for cloud computing, he said.


Hawaiki Cable founder Remi Galasso's drive to get NZ better connected with subsea cables has been key to creating more opportunities for hosting cloud computing services in the country.

Analysys Mason's estimate that Kiwi data centres would need 303MW of power by 2030 could be an underestimate if the Chilean government's Humboldt subsea cable progressed and New Zealand became an important data exchange between South America and Singapore, he said.

Datagrid was finalising its design ahead of putting in a resource consent application for its North Makarewa data centre, he said.

British data centre company Lake Parime plans to open a smaller data centre facility in Central Otago by October that will be powered by Contact Energy's Clyde hydro scheme.

It is expected to be used in part for the environmentally-controversial practice of Bitcoin mining.

Another company, Grid Share, is involved in a similar initiative at Pioneer Energy's Monowai Power Station.

A fourth business, local start-up T4 Group, is planning to build a mid-sized, high-spec $50m data centre in Southland that would be used to house more critical cloud computing applications.


Significance of colocation in hybrid working – CXOToday.com

Colocation in itself is not sufficient for a complete digital transformation. The entire functionality now depends on a host of services. Business owners may have difficulty accessing data due to security breaches, disasters, or decentralised networks that disrupt business continuity. A range of supplementary measures therefore needs to be taken to ensure your hybrid model of work has uncontested data availability.

With geographically distributed teams, an added onus falls on technology to ensure virtual collaboration, asynchronous communication, and result-based tracking. In these scenarios, Colocation Data Centers play a critical role in handling Cloud storage, virtual meetings, VOIP traffic, and web-based applications.

Here are some of the crucial services to ask from your Colocation Data Center provider:

It is not a revelation that effective business continuity depends on creating ample redundancies within the Data Center ecosystem. Where IT teams are already overextended in the pandemic age, managing dispersed endpoints and increasingly complex cyber threats, BaaS (Backup-as-a-Service) protects the organisation's information by replicating its entire contents to an offsite location. This makes an organisation less susceptible to evolving threats and frees up resources for revenue-generating operations.

In essence, organisations should look for:

When there are bundles of assets and information to protect, an organisation needs to develop a varied set of capabilities to keep up with today's DDoS and data loss threats. SECaaS (Security-as-a-Service) consolidates them all to provide network security, vulnerability scanning, identity and access management, encryption, intrusion prevention, continuous monitoring and security assessments. It is, in effect, outsourcing the security of your company within a Cloud architecture. This ensures that you don't have to increase your costs while scaling your business in a hybrid model.

Essentially, organisations must look for:

Data loss is accelerating at an unprecedented pace due to both natural and man-made disasters. In the face of this uncertainty and its impact on a hybrid style of working, it is vital to document and deploy a Disaster Recovery (DR) solution that provides failover for your Cloud computing environment. DRaaS (Disaster-Recovery-as-a-Service) is an effective option that can help you save recurring costs, ensure a faster recovery time, build in-house controls, and get cohesive data security.

Organisations should look for:

STaaS (Storage-as-a-Service) works as Cloud storage rented from a Cloud Service Provider (CSP) for data repositories, multimedia storage, backup, and DR. It is designed to handle heavy workloads without disrupting ongoing business operations. A key benefit here is the offloading of any cost or effort involved in managing a full-fledged infrastructure and technology, while giving you the bandwidth to scale up resources on demand. You can respond to market conditions faster and spin up new applications, which turn out to be service differentiators in the evolving digital landscape.

Organisations should look for:

Round-the-clock monitoring and issue resolution are the two indispensable aspects required to keep an organisation's remote environment functional. This includes software patching, reporting, and performance tuning on a need-to-implement basis. In totality, a third-party vendor within a Colocation Data Center can cover your applications, middleware, web hosting, data management and other mission-critical functions within the ecosystem, allowing you to access technology and scalability without breaking the budget.

To ensure 24/7 availability, organisations should look for:

A unified policy around the Clouds is crucial to optimise application experience and automate Cloud-agnostic connectivity. Hence, a Cloud networking solution should deliver extensive integrations, flexibility, scalability and security while promising a consistent Colocation environment that consolidates all enterprise functions. The ideal way to ensure seamless remote communication is to go for a low-latency route optimisation engine with maximum network performance.

Organisations should look for:

To sum up

The pandemic has given rise to this new trend of working from home and office simultaneously, thus calling for a better infrastructure to support the trend. As this model is taken up by most companies, a hybrid data center architecture is needed as well, particularly for colocation providers. It is only this way that they can adjust to the changing dynamics of digital India.

(The author is Mr. Nikhil Rathi, Founder & CEO, Web Werks, and the views expressed in this article are his own.)


Global Virtual IT Lab Software Market Forecasts, 2021-2022 & 2028: Benefits of Using Virtual Sandbox Tests & Increasing Spending on…

DUBLIN--(BUSINESS WIRE)--The "Virtual IT Lab Software Market Forecast to 2028 - COVID-19 Impact and Global Analysis By Deployment and Organization Size" report has been added to ResearchAndMarkets.com's offering.

Virtual IT Lab Software Market is expected to grow from US$ 1,461.29 million in 2021 to US$ 3,174.88 million by 2028; it is estimated to grow at a CAGR of 11.7% during 2021-2028.
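As a quick sanity check (our arithmetic, not the report's), the two market-size figures are consistent with the stated growth rate over the seven-year forecast window: (3,174.88 / 1,461.29)^(1/7) - 1 ≈ 0.117, or roughly 11.7% a year.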

Benefits of Using Virtual IT Lab Software for Educating Employees on Advanced Tools/Projects

Several businesses use Virtual IT Lab Software to educate employees on new development practices and train them on advanced and modern tools/projects. The software is well-suited for employee training because it provides real-world resources without impacting the live applications, websites, or networks.

Moreover, practical training for employees is essential for the success of each organization as it increases the productivity of an employee. As technology has become an integral part of operations, employees must be trained to use the advanced tools effectively. Also, the virtual employment training approach is a cost-effective method.

Additionally, employee training extends an employee's knowledge, creates subject matter and in-house experts, reduces attrition rates and recruitment costs, and improves working productivity.

In the Virtual IT Lab Software Market ecosystem, some online course providers offer virtualized IT lab platforms as part of their training platform and range of courses. Several core Virtual IT Lab Software products are also used within companies for internal purposes. For instance, ReadyTech Corporation offers employee training through its virtual training labs platform.

ReadyTech's virtualized IT labs platform provides employees a hands-on virtual training environment where they can practice actual on-the-job skills without jeopardizing the company's production systems. Virtual employment training with the help of Virtual IT Lab Software offers several benefits to the organization, which drives the Virtual IT Lab Software Market.

With the digital transformation, the use of cloud-based platforms is increasing due to their simple deployment and reduced deployment time and cost. Moreover, the internet infrastructure has matured in developed countries and is flourishing in several developing countries, allowing end users to access cloud-based platforms.

A few benefits of cloud-based Virtual IT Lab Software are the secure hosting of critical data, improved security and scalability, and quick recovery of files. The backups are stored on a private or shared cloud host platform. Cloud-based Virtual IT Lab Software also reduces repair and maintenance costs and enhances customer satisfaction.

Therefore, due to the multiple benefits of cloud-based Virtual IT Lab Software, the adoption of the software is increasing by large enterprises and small & medium enterprises (SMEs) for educational purposes, which fuels the Virtual IT Lab Software Market growth.

Key Findings of Study:

In 2021, the cloud-based segment led the Virtual IT Lab Software Market. Virtual training has a promising future: as technology becomes more integrated into daily life and work, more advancements and changes are expected in the near future. The cloud-based segment generated a significant share in 2021.

Most virtual training labs are cloud-based, allowing learners to be educated from any location in the world, in any time zone, at any time, and at any speed. Users do not need an IT team to set up a complex training lab with virtual training labs. Instead of setting up each workstation with the appropriate software and files for in-person instruction, the virtual lab is set up once in the cloud and easily accessible by learners anywhere, contributing to the Virtual IT Lab Software Market growth.

Based on organization size, the Virtual IT Lab Software Market is segmented into SMEs and large enterprises. In 2021, the large enterprises segment led the Virtual IT Lab Software Market. The growing popularity among large enterprises and the benefits associated with the utilization of virtual IT labs are the major factors anticipated to drive the segment.

Virtual IT Lab Software for large enterprises offers an integrated corporate training platform for every aspect of the business. Large enterprises can educate their employees online faster and more efficiently. Companies can use this software to intelligently simplify, automate, and optimize their entire learning process.

It enables businesses to teach their personnel by building online courses, encouraging active learning, and increasing course completion rates in a simple and user-friendly environment.

In 2021, North America accounted for the largest share in the global Virtual IT Lab Software Market.


For more information about this report visit https://www.researchandmarkets.com/r/yzcnqx


Codestone Reinforces Position as One of Leading SAP Partners in the UK – InsideSAP

Codestone, one of the top SAP partners in the UK that provides comprehensive enterprise resource planning (ERP) transformation services, recently expanded its IT capabilities with the acquisition of SAP Gold Partner Clarivos.

In March 2021, Codestone confirmed that it had formed a partnership with FPE Capital, a highly renowned investment firm that specialises in the software and services sector. As a leading provider of SAP and Microsoft solutions to the small and medium-sized enterprise (SME) and mid-market segments in the United Kingdom, Codestone benefited from the major investment partnership by ensuring that the company stays ahead in terms of innovation in the ever-changing industry.

Fast-forward to May this year, Codestone once again established its position in the IT services market by acquiring Clarivos, an SAP Business ByDesign ERP and SAP Analytics Cloud specialist. Clarivos, which was founded in 1996, provides technological consulting and solutions that aim to empower, innovate, and reinvent the functions of the Chief Financial Officer departments.

According to the statement released, the acquisition was successfully completed on May 3, 2022. Codestone stated that it anticipates the transaction will create £10 million in additional revenues and £2 million in incremental adjusted earnings before interest, taxes, depreciation and amortisation (EBITDA) over the next 12 months following completion.

Codestone Chief Executive Officer and Co-Founder Jeremy Bucknell said:

This agreement complements our award-winning SAP ERP and Microsoft Cloud offerings and creates one of the most differentiated SAP services in the market. It is perfectly aligned with our customer-centric, cloud-first strategy and will drive greater focus on strategic execution and operational excellence to the Group's already significant value creation.

With Codestone's cloud-first approach, Clarivos intends to assist companies in achieving speedy results while also future-proofing their businesses by operating more efficiently through cloud solutions and implementing cutting-edge technologies. Several honours, including the recently concluded SAP UK & Ireland Partner Excellence Awards for Outstanding Performance and the SAP EMEA North Partner Excellence Awards for Outstanding Performance, have been given to Clarivos for its fast development in SAP Business ByDesign implementations.

Sharing his thoughts about the latest acquisition, Kirit Patel, Executive Director of Strategy and Execution at Clarivos, remarked:

Codestone's multi-capability Microsoft credentials, cloud hosting expertise and comprehensive support provide much in-demand services to our customers. Our mid-market and large enterprise customers will benefit from more comprehensive ERP delivery, cloud and managed services from a pedigree organisation that is widely recognised in this space.

Clarivos will bring approximately 70 professionals to Codestone who have collectively completed more than 350 successful EPM and ERP projects throughout the EMEA market. The Clarivos management team will continue to be part of the expanded organisation, and Kirit Patel will now serve as the Director of Mid-Market and Large Enterprise and as a member of the Codestone Operational Board of Directors.

Furthermore, mergers and acquisitions (M&A) in the SAP partner ecosystem have increased in recent years as some of the German enterprise software giant's major UK partners seek to strengthen their positions. These include Accenture snapping up Edenhouse, one of the country's largest SAP Platinum Partners, specialising in the sale, hosting, support, and implementation of SAP services and products to mid-sized enterprises. Last year also saw Sapphire Systems purchasing InCloud Solutions, a specialist in SAP's cloud-based ERP software, SAP Business ByDesign.


Data centres, submarine cables drive Andile Ngcaba to the ‘edge’ – ITWeb

Andile Ngcaba, chairman of Convergence Partners.

Businessman Andile Ngcaba's investment management firm Convergence Partners is looking to disrupt Africa's edge computing market.

The company is now eagerly awaiting the landing of submarine cables on the African continent to fully exploit the market.

In an interview with ITWeb last week, Ngcaba, who is now based in Silicon Valley, said the sprawling data centres in South Africa and the rest of Africa present significant opportunities for edge computing to thrive.

The interview followed Convergence Partners' company inq. last week reaching an agreement to acquire 100% of Syrex, a provider of hyper-converged cloud technology solutions in SA, for an undisclosed amount.

Edge computing is a distributed computing paradigm that brings computation and data storage closer to the sources of data. From a performance standpoint, edge computing is able to deliver much faster response times.
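To put the performance point in concrete terms (our own back-of-the-envelope figures, not from the article): light in optical fibre travels at roughly 200,000km/s, so a round trip to a data centre 1,500km away costs about 2 × 1,500 / 200,000 ≈ 15ms in propagation delay alone, whereas an edge node 50km away costs about 0.5ms, before any processing time is counted.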

According to Markets and Markets, the edge computing market is expected to grow from $36.5 billion in 2021, to $87.3 billion by 2026, at a compound annual growth rate of 19%.

Market research firms Forrester and Gartner have forecast that edge computing will become mainstream this year, taking advantage of 5G, the internet of things (IOT) and cloud-native software.

Inq. is one of the leading edge computing technology companies in Africa. The Convergence Partners-owned firm connects over 1 200 corporations across nine African countries.

According to Ngcaba, the company is set to expand into more African countries as it looks to extend its footprint.

He explained the merger with Syrex will result in more job hires across the continent.

Also speaking during the interview, David Herselman, MD of Syrex, said the coming of hyper-scalers such as Microsoft and Amazon Web Services to SA presents an opportunity for edge computing, as well as for the inq.-Syrex merger.

He believes the growth of IOT applications will also spur the edge computing market.

Of late, there has been a flurry of activity in SA's data centre space. As data demand and cloud adoption continue to cause a dramatic surge in traffic, data centres are becoming increasingly important on the continent, with more companies migrating components of their IT infrastructure into the cloud.

According to Ngcaba, the imminent arrival of subsea cables will also result in a jump in data traffic in Africa, further giving impetus to edge computing service providers.

These include the Google-funded Equiano cable, as well as the other one being bankrolled by the 2Africa Consortium.

However, he did not disclose how the company will capitalise on the landing of these cables, preferring to say: Watch this space.

Convergence Partners-backed broadband infrastructure company CSquared and Google in March announced the landing of the Equiano subsea cable system in Lomé, Togo, marking the cable's first stop along Africa's Atlantic coast.

The Equiano cable will run from Portugal along the West Coast of Africa connecting Europe to Togo, Nigeria, Namibia, SA and St Helena.

CSquared is the landing partner in Togo for the cable system, while Convergence Partners is an investor in the Seacom undersea cable.

Commenting on edge computing developments in Africa, Jon Tullett, senior research manager for cloud/IT services at the IDC, says edge computing is more typically a function of local data centre operations, served by local data centre interconnect and cross-connect networks.

"So it's the growth of investment in data centre facilities that's more relevant in boosting edge applications, but better international connectivity does spur adoption of data centre services; it's all related," says Tullett, referencing the connectivity that will be brought by the subsea cables.

David Herselman, managing director of Syrex.

Data centre investment presents opportunities for edge computing, but the growing opportunities within edge computing are, in turn, a big part of the investment strategies behind data centre construction, he adds.

Similar points apply within hosted private cloud solutions, for often quite similar reasons, Tullett explains.

Edge computing has a lot to offer, but like cloud, it's not perfect for every scenario. Very low-latency applications, heavy analytic processing close to the source, scaling out application deployment across delivery networks, sensitivities regarding data sovereignty: these are the sorts of characteristics which indicate an application may be well-suited for edge deployment.

He notes IOT is a good example of a broad spectrum of applications which often benefit.

Africa Analysis telecoms analyst Dobek Pater says the new cables will provide additional capacity and are likely to result in a further decrease in the price of international bandwidth.

To this extent, he points out that international connectivity will certainly not present constraints in terms of accommodating the expected increase in traffic from the introduction of new technologies, such as 5G and IOT, which are expected to generate significantly more traffic.

According to Pater, much of the traffic nowadays, especially in SA, is domestic rather than international.

Therefore, it is just as important to have a very good domestic backbone network. With some of the ICT service providers, including data centre and cloud companies, expanding their operations across Africa, subsea cable systems will play an increasingly important role in connecting such operations across different countries or regions in Africa.

While Pater also believes the growth of the data centre market will propel the edge computing market, he notes: "We will also need to see the build-out of smaller data centres or aggregation nodes closer to where the traffic is generated or where apps are consumed for proper edge computing."

This will become increasingly important with applications requiring low latency and quick turn-around times for data computing or analysis and returning information to the user.

The edge of the network needs to move closer to the end-user. The data centre environment will be segmented into three tiers: large national or regional data centres, mainly for large-scale storage of data and hosting of the cloud environment; smaller data centres distributed regionally, which will aggregate traffic regionally and redirect it; as well as small data centres, such as containers, close to the network edge, where the fast turn-around data computing requirements will be addressed.

Concluding, Pater urges African organisations to work with their ICT service providers to determine how edge computing can best serve their requirements.


Will Computers Be Able to Imitate the Human Brain? – Techopedia

Many science fiction works have been crafted on the idea that machines will become increasingly human. Machines that think, learn and make decisions in the same way humans do have been the object of speculative fear even as scientists and engineers work to create them.

While The Singularity is hardly looming on the horizon, there have been some fascinating developments in the world of artificial intelligence and machine learning that we wanted to delve into.

Researchers at Purdue University are building human brain-inspired hardware for artificial intelligence (AI) to help AI learn continuously over time. The goal of the project is to make AI more portable so that it can be used in isolated environments, such as in robots in space or in autonomous vehicles. By embedding AI directly into hardware rather than running it as software, these machines could operate more efficiently.

MIT engineers have designed a brain-inspired chip by putting tens of thousands of artificial brain synapses, or memristors, on just one chip that's smaller than one piece of confetti. Memristors (memory transistors) are silicon-based components that mimic the information-transmitting synapses of the human brain. This so-called "brain-on-a-chip" could one day be built into small, portable AI devices that could perform complex computational tasks currently only performed by supercomputers.

And researchers at Northwestern University and the University of Hong Kong have developed a device modeled after the human brain that simulates human learning. The device is able to learn by association via synaptic transistors that process and store information at the same time.

Here are more details on these three projects that aim to enable computers to mimic the human brain:

In a paper published in Science in February, Purdue researchers explained how computer chips could rewire themselves dynamically to take in new data as the brain does, enabling AI to continue learning over time.

To enable learning, the brain is continuously forming new connections between neurons. As such, to build a computer or machine inspired by the brain, the circuits on a computer chip also have to change. However, a circuit that a computer has been using for years is the same as the circuit that was built for the computer in the factory.

Consequently, researchers must be able to "continuously program, reprogram, and change the chip," according to Shriram Ramanathan, a professor in Purdue University's School of Materials Engineering whose work involves discovering how materials could imitate the brain to improve computing.

Ramanathan and his team built new hardware that can be reprogrammed with electrical pulses on demand. The team's thinking is that because this device is adaptable, it will be able to take on all functions necessary to build a computer inspired by the human brain.

"Through simulations of the experimental data, the Purdue teams collaborators at Santa Clara University and Portland State University showed that the internal physics of this device creates a dynamic structure for an artificial neural network that is able to more efficiently recognize electrocardiogram patterns and digits compared with static networks," according to Purdue. "This neural network uses 'reservoir computing,' which explains how different parts of a brain communicate and transfer information."

The team now aims to demonstrate this on large-scale test chips that could be leveraged to develop a brain-inspired computer, the researchers said.

MIT researchers are working toward the day when people "might be able to carry around artificial brains [that can work] without connecting to supercomputers, the Internet, or the cloud," according to a statement.

"Like a brain synapse, a memristor would also be able to 'remember' the value associated with a given current strength, and produce the exact same signal the next time it receives a similar current," the statement noted. "This could ensure that the answer to a complex equation, or the visual classification of an object, is reliable a feat that normally involves multiple transistors and capacitors."

In a paper published in "Nature Nanotechnology," the scientists explained how their brain-inspired chip could remember a gray-scale image of Captain America's shield (each pixel was matched to a corresponding memristor on the chip) and recreate the same crisp image of the shield numerous times.

In the paper, MIT's researchers highlighted the fact that the "brain-on-a-chip" could be used for performing complex tasks on mobile devices, tasks that currently only supercomputers can handle.

Taking a page from the book of Russian psychologist Ivan Pavlov, who trained dogs to associate the sound of a bell with food, researchers at Northwestern and the University of Hong Kong have trained their computing device to associate light with pressure, according to a statement. The research was published in the journal "Nature Communications."

This new device imitates the brain by using electrochemical "synaptic transistors" to process and store information at the same time, the statement noted. "These synapses enable the brain to work in a highly parallel, fault tolerant, and energy-efficient manner," the statement noted. The device's organic, plastic transistors work the way a biological synapse works.

"With its brain-like ability, the novel transistor and circuit could potentially overcome the limitations of traditional computing, including their energy-sapping hardware and limited ability to perform multiple tasks at the same time," according to the statement. "The brain-like device also has higher fault tolerance, continuing to operate smoothly even when some components fail."

In current computer systems, memory and logic are physically separated; however, bringing those functions together would save space and reduce the cost of energy. And the soft, plastic-like polymers of the new computing device would enable researchers to integrate it into smart robotics, wearable electronics, and even devices implanted in people.

While a robot uprising is still firmly in the realm of science fiction, these advancements are just some of the ways scientists are working to replicate the brain's biological machinery and that one day might lead to computers that function like the human brain.


Explained: CERT-In's new cybersecurity norms, and why it is likely to issue a clarification about them – The Indian Express

CERT-In is learnt to be working on releasing more details of the cybersecurity directive issued in April, which has been opposed by industry stakeholders. According to sources, the agency could clarify that the norms apply only to VPN providers who offer Internet proxy-like services to general Internet subscribers, and not to corporate VPN service providers.

What are these norms that CERT-In is clarifying?

The norms, released on April 28, asked VPN service providers, along with data centres and cloud service providers, to store information such as names, email IDs, contact numbers, and IP addresses (among other things) of their customers for a period of five years. Entities are also required to report cybersecurity incidents to CERT-In within six hours of becoming or being made aware of them.
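For illustration only (this is our own sketch, not any CERT-In specification, and every name in it is invented), the obligations described above reduce to two constraints a provider's systems would have to encode: retain the listed subscriber fields for five years, and report an incident within six hours of becoming aware of it. A minimal Python sketch of those two rules might look like this:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Figures taken from the directive as described above; everything else is invented for illustration.
RETENTION_PERIOD = timedelta(days=5 * 365)   # customer records kept for five years
REPORTING_WINDOW = timedelta(hours=6)        # incidents reported to CERT-In within six hours


@dataclass
class SubscriberRecord:
    """Categories of customer information the directive asks providers to store."""
    name: str
    email_id: str
    contact_number: str
    ip_address: str
    collected_at: datetime

    def still_within_retention(self, now: datetime) -> bool:
        # True while the record falls inside the five-year retention window.
        return now - self.collected_at < RETENTION_PERIOD


def reporting_deadline(became_aware_at: datetime) -> datetime:
    # Latest time by which an incident noticed at `became_aware_at` must be reported.
    return became_aware_at + REPORTING_WINDOW
```

Whether "five years" is counted in days or calendar years, and what exactly starts the six-hour clock, is precisely the sort of detail the expected clarification would need to spell out.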

The norms have triggered concerns over privacy, and CERT-In is expected to clarify that private information of individuals will not be affected by the directions.

"These directions do not envisage seeking of information by CERT-In from service providers on a continual basis as a standing arrangement. CERT-In may seek information from service providers in case of cyber security incidents and cyber incidents, on a case-to-case basis, for discharge of its statutory obligations to enhance cyber security in the country," according to a person aware of the clarifications that CERT-In is in the process of finalising.

The agency is also likely to include in its clarifications that the April 28 directive to store such information and share it with CERT-In will override any contractual obligation VPN providers may have with their customers of not disclosing such information.

Queries sent to the IT Ministry and CERT-In Director General Sanjay Bahl were not immediately answered.

But why has CERT-In felt the need to issue a clarification?

Prominent VPN providers, a large part of whose value proposition is ensuring anonymity of their users on the Internet, have questioned the directives, and some providers like NordVPN are even considering pulling their servers from India should the directive be enforced on them.

"At the moment, our team is investigating the new directive recently passed by the Indian government and exploring the best course of action. As there are still at least two months left until the law comes into effect, we are currently operating as usual. We are committed to protecting the privacy of our customers, therefore, we may remove our servers from India if no other options are left," Laura Tyrylyte, head of public relations at Nord Security, said.

VPN providers like Surfshark have claimed that their technology does not allow the logging of users' information. "Surfshark has a strict no-logs policy, which means that we don't collect or share our customer browsing data or any usage information," Gytis Malinauskas, head of the legal department at Surfshark, said.


"Moreover, we operate only with RAM-only servers, which automatically overwrite user-related data. Thus at this moment, we would not be able to comply with the logging requirements even technically. We are still investigating the new regulations and their implications for us, but the overall aim is to continue providing no-logs services to all of our users," Malinauskas said.

How has the government responded to these concerns?

Speaking to The Indian Express earlier this month, IT Minister Ashwini Vaishnaw had said there was nothing to worry about in CERT-In's norms. "There is no privacy concern. Suppose somebody takes a mask and shoots, wouldn't you ask them to remove that mask? It is like that," Vaishnaw had said during an interview.

Explaining the need for the rules, he had said, "Cybersecurity is something which is continuously evolving. So we have issued very comprehensive guidelines from CERT-In. Ultimately, if there is a threat to you, the police and you would both have to work together."

"The basic concept (of the guidelines) is that the people who are actually running the infrastructure should take all possible steps to make sure that things are in place and, if there is any breach, immediately inform us so that we can take action," Vaishnaw said.


In India, kids exposed to online risks more than any country: Report – Hindustan Times

Children have spent more and more time on the internet since the beginning of the Covid-19 pandemic and, with online education for even lower classes becoming popular, exposure to the internet is also beginning at an early age, which leaves kids vulnerable to cyberbullying and other dangers. According to a report by internet security company McAfee -- Life Behind the Screens of Parents, Tweens, and Teens -- children hit their online stride when they are between 15 and 16, at which point mobile usage jumps so much it approaches levels they will carry into adulthood.

Research by McAfee, an American computer security software firm, surveyed 15,500 parents and more than 12,000 of their children in ten countries, including the United States, the United Kingdom, Mexico and India, to understand how they protect themselves and their loved ones on the internet.

Worldwide, 90 per cent of teens between 15 and 16 said they used a smartphone or mobile device. This marked a noteworthy 14-percentage-point spike in usage when compared to children 10 to 14 years old, 76 per cent of whom said they use a smartphone or mobile device, the May 12 report stated.

Indian kids exposed to online risks more than any country

Researchers also found the US has the highest cyberbullying rate (28 per cent) and high exposure to online risks, while India had the highest exposure to online risks out of any country. India also had some of the earliest mobile maturity, as per the data.

Global trends on cyberbullying

By the age of 17 or 18, reports of cyberbullying increased to 18 per cent, attempted theft of online accounts to 16 per cent, and unauthorised use of personal data to 14 per cent, the data showed.

Data also showed that 73 per cent of children look to parents - more than any other resource - for help in terms of online safety. Parents, however, seem to lag behind a bit in actually taking active steps to secure their child from cyberbullying.

The report noted: "Parents take more precautions, such as installing antivirus software, using password protection, or sticking to reputable online stores when shopping, on their own devices than they do on their children's connected devices."

Secret lives of teens and tweens online

According to the research, more than half of the surveyed children (59 per cent) act to hide their online activity, from clearing browser history to omitting details about what they are doing online.

Do girls experience more dangers online?

According to the research, there is a gender bias when it comes to parents protecting kids from online threats. Data shows girls are more protected than boys, but it is the boys who encounter more issues online.

Girls aged 10-14 were more likely than boys of the same age to have parental control on personal computers or laptops in almost every country surveyed, while boys were more likely to hide their activity from parents.

23 per cent of parents said they would check the browsing and email history on the PCs of their daughters aged 10 to 14. But for boys aged 10 to 14, this is only 16 per cent.

22 per cent of parents restrict access to certain sites for girls. For boys this is just 16 per cent.

