Kingsport’s Cloud Park to close on August 1 WJHL | Tri-Cities News & Weather – WJHL-TV News Channel 11

Originally posted here:
Kingsport's Cloud Park to close on August 1 WJHL | Tri-Cities News & Weather - WJHL-TV News Channel 11

The Preserve hosts pro disc golf tournament this weekend in Clearwater – SC Times

The best disc golf professionals will make their way to Central Minnesota this weekend for The Preserve Championship.

The tournament will take place at Airborn Disc Golf Preserve in Clearwater, Minnesota. This weekend is one of a handful of disc golf tournaments throughout the country as the top professionals compete for the top spot not only at The Preserve, but on the tour.

This disc golf course was designed by one of the top disc golf professionals in the world, Cale Leiviska, back in May 2020.

"Cale has built a great course," said Seth Fendley, operations director for the Disc Golf Pro Tour. "That property has three disc golf courses on it and he takes and creates a larger scale championship course out of those three courses. He pairs it down into one and creates a championship course for the tournament."

Since the opening, Airborn Disc Golf Preserve has been a popular place for many disc golf enthusiasts, especially in the Minnesota area. There are three courses at the Clearwater location.

READ MORE: World-class disc golfer opens high-caliber courses in Clearwater

Timberwolf and Lynx are geared more toward the experienced disc golf player, as there are plenty of obstacles and challenges throughout those courses. The Red Fox was made for inexperienced disc golfers; it's a par-three course that helps new players develop a love for the sport.

This weekend, the players will compete on the Black Bear course, which plays as a par 65 for the men and a par 66 for the women. Both layouts measure about 10,000 feet.

All of these courses are located right next to the Mississippi River in Clearwater. The area used to be a golf course, so it was a perfect opportunity for Leiviska to turn it into a disc golf course.

"Minnesota, in and of itself, is one of the major states for disc golf in the United States," Fendley said. "You only have like California and Texas that have more disc golfers in it. The crazy thing about it is that in Minnesota you can disc golf year round but most people don't want to whereas California and Texas you can disc golf all year.

"That's just a testament to show how strong Minnesota disc golf is in the country."

The Preserve Championship will start on Friday and finish on Sunday with the best disc golf players in the country, and even the world. The 2021 women's world champion, Catrina Allen, will be playing this weekend and she's actually from Pipestone, Minnesota.

The 2021 men's world champion, James Conrad, will also be playing this weekend along with many other top players on both the men's and women's side.

There will be 30 men from Minnesota playing in the tournament including three from St. Cloud: Aidan Guthrie, Jonny Twofingers Estes and Louis LaPorta. There will also be seven women from Minnesota playing this weekend and none are from the St. Cloud area.

"We have the best of the best playing this weekend in Clearwater," Fendley said. "It'll be a good weekend with a lot of competition."

The Preserve Championship had its inaugural tournament last year and Nikko Locastro narrowly edged out a win over Calvin Heimburg and Simon Lizotte for the men. On the women's side, Paige Pierce brought home the victory over Catrina Allen and Missy Gannon.

The Disc Golf Pro Tour and Leiviska agreed to have a maximum of 1,000 people at this weekend's event. They believe they can host more on the course, but due to the COVID-19 pandemic, they wanted to keep the tournament controlled.

Tickets for the weekend are sold out; only about 100 remained heading into the final week, and those were bought up by the middle of the week.

For those who couldn't buy tickets but still want to watch the action this weekend, there is a live stream of the top leaders online. The live stream can be found on the Disc Golf Network, a paid subscription service.

Fendley said the Disc Golf Pro Tour plans to return to Airborn Disc Golf Preserve for future years to continue a yearly tournament.

"The Preserve is a beautiful course with plenty of challenges for these players," Fendley said. "I know a lot of them are looking forward to playing this weekend."

Brian Mozey is the high school sports reporter for the St. Cloud Times. Reach him at 320-255-8772 or bmozey@stcloudtimes.com. Follow him on Twitter @BrianMozey.

Read more:
The Preserve hosts pro disc golf tournament this weekend in Clearwater - SC Times

Red Hat and Nutanix Announce Strategic Partnership to Deliver Open Hybrid Multicloud Solutions – Business Wire

RALEIGH, N.C. & SAN JOSE, Calif.--(BUSINESS WIRE)--Red Hat, the world's leading provider of enterprise open source solutions, and Nutanix (NASDAQ: NTNX), a leader in hybrid multicloud computing, today announced a strategic partnership to enable a powerful solution for building, scaling and managing cloud-native applications on-premises and in hybrid clouds. The collaboration brings together industry-leading technologies, enabling installation, interoperability and management of Red Hat OpenShift and Red Hat Enterprise Linux with Nutanix Cloud Platform, including Nutanix AOS and AHV.

Key elements of the partnership include:

Because of its distributed architecture, Nutanix Cloud Platform delivers an IT environment that is highly scalable and resilient, and well-suited for enterprise deployments of Red Hat OpenShift at scale. The platform also includes fully integrated unified storage, addressing many tough challenges operators routinely face in configuring and managing storage for stateful containers.

More information on the partnership is available here.

Supporting Quotes

Rajiv Ramaswami, president and CEO, Nutanix

"This partnership brings together Red Hat's industry-leading cloud native solutions with the simplicity, flexibility and resilience of the Nutanix Cloud Platform. Together, our solutions provide customers with a full stack platform to build, scale, and manage containerized and virtualized cloud native applications in a hybrid multicloud environment."

Paul Cormier, president and CEO, Red Hat

"We have a vision to enable open hybrid clouds, where customers have choice and flexibility. Our partnership with Nutanix brings a leading hyperconverged offering to the open hybrid cloud, driving greater choice for our joint customers in how they deploy their containerized workloads and backed by a joint support experience."

Eric Sheppard, research vice president, IDC

"Organizations around the world are deploying an increasingly diverse mix of modern and cloud-native workloads. This Red Hat and Nutanix partnership, and in particular the collaborative support agreement between the two companies, helps to bring virtualized applications and Red Hat OpenShift-based containerized workloads running on Nutanix's Cloud Platform together in a way that will benefit exactly these types of organizations and help to drive increased simplicity, agility, scalability within today's complex hybrid-cloud world."

Ritch Houdek, senior vice president, Technology, Kohl's

"We are thrilled to see two of our technology partners announce this strategic relationship. As we manage the complexities of hybrid cloud, we believe this relationship will unlock new hosting and deployment options for VM and container-based workloads. These new options will support our goals of being fast, efficient and friction-free as we deliver new experiences to our customers."

Gautam Roy, chief technology officer, Unum

"The insurance industry is in the midst of a transformation to rapidly adapt to customers' demands. We protect 39 million lives around the world with our products and services. As we work to modernize our IT infrastructure to support a seamless digital experience for our customers and employees, Nutanix and Red Hat help simplify our technology stack and advance our cloud transformation."

About Red Hat, Inc.

Red Hat is the world's leading provider of enterprise open source software solutions, using a community-powered approach to deliver reliable and high-performing Linux, hybrid cloud, container, and Kubernetes technologies. Red Hat helps customers integrate new and existing IT applications, develop cloud-native applications, standardize on our industry-leading operating system, and automate, secure, and manage complex environments. Award-winning support, training, and consulting services make Red Hat a trusted adviser to the Fortune 500. As a strategic partner to cloud providers, system integrators, application vendors, customers, and open source communities, Red Hat can help organizations prepare for the digital future.

About Nutanix

Nutanix is a global leader in cloud software and a pioneer in hyperconverged infrastructure solutions, making clouds invisible, freeing customers to focus on their business outcomes. Organizations around the world use Nutanix software to leverage a single platform to manage any app at any location for their hybrid multicloud environments. Learn more at http://www.nutanix.com or follow us on social media @nutanix

Red Hat's Forward-Looking Statement Language

Except for the historical information and discussions contained herein, statements contained in this press release may constitute forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995. Forward-looking statements are based on the company's current assumptions regarding future business and financial performance. These statements involve a number of risks, uncertainties and other factors that could cause actual results to differ materially. Any forward-looking statement in this press release speaks only as of the date on which it is made. Except as required by law, the company assumes no obligation to update or revise any forward-looking statements.

Red Hat, Red Hat Enterprise Linux, the Red Hat logo and OpenShift are trademarks or registered trademarks of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Linux is the registered trademark of Linus Torvalds in the U.S. and other countries.

Read more here:
Red Hat and Nutanix Announce Strategic Partnership to Deliver Open Hybrid Multicloud Solutions - Business Wire

Navigating data sovereignty through complexity – Information Age

Laurent Michel, director of public affairs at Platform.sh, looks at how businesses can effectively navigate data sovereignty

Data legislation across the world can be difficult to navigate.

Where is your data? This was a simple question to answer not too long ago, when you could simply point to a server. Today, it's far more difficult, and not simply because of the cloud.

Businesses do not build their online presence overnight. It grows as the company grows and new needs arise. More developers get involved, and data flows may not be as clear as they once were. Once a business becomes a multinational, the problem can quickly get out of control. Add in political uncertainty and the sheer range of software tools available today, and it becomes very difficult for companies to innovate while maintaining data governance across all of their websites and web applications.

Data sovereignty is the concept that data is subject to the laws of the country in which it is processed. In a world of rapid adoption of SaaS, cloud and hosted services, it is easy to see the issues that data sovereignty can raise.

In simpler times, data wasn't something businesses needed to be concerned about and could be shared and transferred freely with no consequence. Businesses that also had a digital presence operated on a small scale and with low data demands hosted on on-premise infrastructure. This meant that data could be monitored and kept secure, much different from the more distributed and hybrid systems that many businesses use today.

With so much data sharing and lack of regulation, it all came crashing down with the Cambridge Analytica scandal in 2016, prompting strict laws on privacy.

Frederik Maris, vice-president, EMEA at Splunk, spoke to Information Age about the split European perceptions around data. Read here

The concept that data may be subject to the laws of more than one country presents mounting challenges for organisations. The General Data Protection Regulation (GDPR) is one such regulation that sent shockwaves throughout the world of IT. The regulation applies to the processing of EU residents' personal data, regardless of where that processing takes place. If a company is not GDPR compliant, it risks regulatory fines of up to €20 million or 4% of global annual turnover (whichever is greater).

Fines are no empty threat either: over the course of 2020, more than 220 GDPR fines were handed out. Yet even with this threat, many companies still struggle to manage their own data strategy.
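
To make the cap concrete, the fine ceiling is simply the larger of the two figures named above. A minimal worked sketch, with made-up turnover figures purely for illustration:

```python
# Worked example of the GDPR fine cap described above: the greater of
# EUR 20 million or 4% of global annual turnover. Turnover figures are invented.
def max_gdpr_fine(global_annual_turnover_eur: float) -> float:
    return max(20_000_000.0, 0.04 * global_annual_turnover_eur)

if __name__ == "__main__":
    for turnover in (100_000_000.0, 2_000_000_000.0):
        print(f"Turnover EUR {turnover:,.0f} -> maximum fine EUR {max_gdpr_fine(turnover):,.0f}")
```

For any business with global turnover above €500 million, the 4% arm is the one that dominates.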

GDPR was the first major data compliance regulation but is not the only one. As businesses operate more internationally, they will need to be aware of the data policies from the region they are collecting from and where they are storing it.

When dealing with on-premise infrastructure, governance is clearer, as it must follow the rules of the country it's in. However, when it's in the cloud, a business can store its data in any number of locations regardless of where the business itself is. It's down to the business to make sure it is aware of where the data is being secured and that it is compliant wherever it is.

Many small businesses take advantage of the cost savings associated with large cloud hosting providers such as Google and Microsoft. When looking at this in the context of sovereignty, it raises the question of who is responsible for its governance. To make matters more complicated, cloud vendors don't always inform customers of the regulatory stakes of selecting one cloud region over another.

Azure, for example, operates on a shared responsibility model where, depending on the service a customer is using, they could be partly responsible for a breach or misuse of data. As cloud usage increases, it's important for teams to be fully aware of their responsibilities to avoid any issues.
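
One practical control is to pin data to a named region at creation time and then verify where the provider reports it as residing. Below is a minimal, hedged sketch using the google-cloud-storage Python client purely as an illustration; the bucket name and region are placeholders, and other providers expose equivalent location settings.

```python
# Minimal sketch: create a bucket pinned to an EU region and verify where the
# provider reports it as residing. Assumes the google-cloud-storage client library
# and default credentials; the bucket name and region are placeholders.
from google.cloud import storage

def create_eu_bucket(bucket_name: str, region: str = "europe-west1") -> storage.Bucket:
    client = storage.Client()
    # A bucket's location is fixed at creation time, so this is the point
    # where data residency is actually decided.
    return client.create_bucket(bucket_name, location=region)

def audit_bucket_location(bucket_name: str) -> str:
    client = storage.Client()
    bucket = client.get_bucket(bucket_name)
    return bucket.location  # e.g. "EUROPE-WEST1"

if __name__ == "__main__":
    bucket = create_eu_bucket("example-sovereignty-demo-bucket")
    print(audit_bucket_location(bucket.name))
```

Because the location cannot change after creation, recording it at that point gives compliance teams a simple artefact to audit against.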

Jack Watts, EMEA leader, AI at NetApp, discusses the need to double down on cloud deployments when it comes to the AI journey. Read here

Gartner predicts that cloud spending will reach $332 billion by the end of 2021, so with more complexity in the future and likely more regulation, businesses will need to get a tighter grasp on their data. Here are a few ways they can do this:

As countries begin to adopt more complex data governance policies, the job will be on CTOs to navigate through this complexity and make sure that they have an accurate view of the whole business's cloud environment, to ensure they are secure, compliant and responsible. Ultimately, selecting the right partner that offers solutions that combine performance, price predictability and total sovereignty over data to support growth is imperative.

The rest is here:
Navigating data sovereignty through complexity - Information Age

GeekTek, MSP of the Year, to Rebrand as XOverture – WFMZ Allentown

LOS ANGELES, July 30, 2021 /PRNewswire-PRWeb/ -- GeekTek, the boutique managed IT services provider and consultancy headquartered in Los Angeles, and winner of various awards from CRN, MSPmentor, and Channel Partner Insight, including 2019 MSP of the Year, announced today that it was rebranding as XOverture.

According to GeekTek Founder and CEO/CTO Eric Schlissel, "The new brand better encapsulates the quality and professionalism of our services; the personality and values of our company; and our status as a rapidly-growing premium IT services and consulting firm that embraces progress and change."

The revamped branding can be viewed on the company's new website, XOverture.com.

A New Brand for the "New Normal" and Beyond

The company, established in 1998 and operating under the GeekTek name for over two decades, is known for its value-oriented and compassionate, partner-like approach to IT services and consulting, which combines top-tier, CIO-level guidance with award-winning services and support.

It operates internationally with offices in locations including Los Angeles, Toronto, and Hyderabad, and clients in multiple countries including the US, Canada, the UK, and Japan.

In an email to current clients sent last month, Schlissel said the idea for the rebrand came about during a "quiet period" last year after the company had shifted all of their clients to "the new normal of remote work".

"We took the opportunity to reflect, ask some big questions, and challenge our assumptions. What we quickly discovered was the disconnect between who we are and how our brand shows up in the marketplace."

An IT Overture to Business Success

According to an FAQ on the company's website, the name XOverture is meant to convey they are an "'overture' to success and growth for businesses through the masterful deployment and orchestration of IT."

In addition to the new name, the visual style of the new brand is meant to be "modern and clean yet also relatively light, reflecting an experienced, dependable, high-tech services and consulting company that's friendly and approachable and is a pleasure to work with."

Looking Ahead

Regarding future plans, the company stated: "Our top priority is continuing to support and secure our current clients as well as we can. That includes the continual refinement of our support processes, retaining and bringing on quality technical talent, and keeping up on the latest developments in tech to ensure we're providing you with the best IT advice, management, and protection."

As for existing clients, Schlissel stated in his email, "the new branding will not result in any changes to your services, pricing, main points of contact, or dedicated support team. There will also be no changes to how you interact with our team, and all existing communication methods will continue to work."

The rebranding follows the 2020 spinoff of GeekTek's cannabis IT services operations into a new company called Cure8 (http://www.cure8.tech), in order to better support the unique needs of the one-of-a-kind and fast-growing cannabis industry. Cure8 already works with leading cannabis companies including Canopy Growth, Tweed, Tokyo Smoke, Superette, Spiritleaf, and Northern Helm, and is expected to expand its footprint throughout the US and Canada in 2021 and into 2022.

Moving forward, both XOverture and Cure8 will operate under the parent company Base8, whose name also hints at the team's musical interests, being a reference to both computing and the octal musical scale.

-

About XOverture

XOverture, formerly known as GeekTek, is a boutique IT services company that provides small & medium-sized businesses (SMBs) with the IT guidance, structure, & support they need to grow.

They're known for their top-tier service quality, industry-best response times, and partner-like approach. Services they offer include 24/7 managed services, full IT audits, IT project management, and cloud hosting and management.

Their approach to IT services emphasizes:

Media Contact

Shankar Iyerh, XOverture, +1 (323) 987-7771, marketing@xoverture.com

SOURCE XOverture

More here:
GeekTek, MSP of the Year, to Rebrand as XOverture - WFMZ Allentown

What are the top ten web hosting trends? Film Daily – Film Daily

Web hosting is one of the most important things for online businesses, so it is imperative to hire the right web hosting service. These services are constantly adopting new techniques and features, which means you should always choose a company that keeps pace with recent trends in the market.

It also means it is always a good idea to be familiar with the web hosting trends in the market, since that knowledge enables you to make the right choice. With that in mind, here are the top ten web hosting trends this year:

Green web hosting is becoming a new trend in the hosting field. It is a major step towards reducing environmental pollution and conserving energy, and hosting companies are adopting greener methods in order to reduce their carbon footprint.

Websites and their related data are stored in data centers. These centers have to be kept in a cool environment to ensure that the data is stored in a protected and safe manner, and maintaining them releases a huge amount of carbon.

Companies are now using greener methods to maintain their data centers, which is a great step.

Cloud hosting is another major trend that has emerged in the market, and it is an entirely distinct area of creativity and innovation within the field. Rather than a single server, the website's data is stored on multiple servers that are physically situated in different locations and connected to each other through the internet. This makes cloud hosting much more convenient: data is processed faster, there is more storage space, and it removes the need for individual physical servers.

If you are familiar with the recent Google update, you will know that websites that do not use HTTPS are labeled as not secure. This means the reliability, authenticity, credibility and rating of the website are negatively impacted, which in turn hurts rankings and website traffic. Using HTTPS has multiple benefits: you can build trust with visitors, improve the ranking of the website, and keep the site as safe as possible. Thus, you should follow this trend.
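
As a rough illustration of the operational side of staying on HTTPS, the sketch below uses Python's standard ssl and socket modules to connect to a site and report how many days remain before its TLS certificate expires; the hostname is a placeholder.

```python
# Minimal sketch: check a site's TLS certificate expiry using only the standard library.
# The hostname is a placeholder; swap in your own domain.
import socket
import ssl
from datetime import datetime, timezone

def cert_days_remaining(hostname: str, port: int = 443) -> int:
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' is formatted like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    expires = expires.replace(tzinfo=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

if __name__ == "__main__":
    print(cert_days_remaining("example.com"), "days until the certificate expires")
```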

Taking backups is one of the most important things when it comes to a website; it is of paramount importance. Backups of website information can be done manually, but web hosting companies now offer automated processes through which backups are made easily and more conveniently. Some web hosting companies also provide CMS-based backup administration, which backs up only the most significant data. It is one of the most basic trends in the market and is followed by a variety of providers, including webhotel-guiden.dk.
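
To make the automation idea concrete, here is a minimal, hedged sketch of the kind of scheduled job a host might run: it archives a site directory into a timestamped tarball and prunes older copies. The paths and retention count are placeholder assumptions.

```python
# Minimal sketch of an automated backup job: archive a site directory into a
# timestamped tarball and keep only the most recent copies.
# The paths and retention count are placeholder assumptions.
import tarfile
from datetime import datetime
from pathlib import Path

SITE_DIR = Path("/var/www/example-site")        # what to back up (assumed path)
BACKUP_DIR = Path("/var/backups/example-site")  # where archives go (assumed path)
KEEP_LAST = 7                                   # retention: number of archives to keep

def run_backup() -> Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = BACKUP_DIR / f"site-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SITE_DIR, arcname=SITE_DIR.name)
    # Prune older archives beyond the retention window.
    archives = sorted(BACKUP_DIR.glob("site-*.tar.gz"))
    for old in archives[:-KEEP_LAST]:
        old.unlink()
    return archive

if __name__ == "__main__":
    print("Wrote", run_backup())
```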

Do It Yourself is the new trend, and it has entered the field of website development and hosting. It lets you build your website and manage it according to your own preferences and demands, and all of the components of the website can be rearranged to suit your needs. There are many online tools that enable you to make your own website, and they are simple to use and productive at the same time. Thus, website development is now much easier than it used to be, which is a positive trend gradually emerging in the market.

One of the major reasons people want to hire professional web hosting companies is that they provide better security features, something you will not get with free options. Providing clients with digital security is one of the most important requirements, and thanks to advances in technology, web hosting companies are now able to provide better security features to their customers.

Another trend gradually emerging in the field of web hosting is the rise of managed web hosting companies. Managed services are more personalized, customized and catered to the specific needs and preferences of customers, and they keep processes running more smoothly. Given that the industry is shifting towards a more customer-oriented approach, this is one of the trends that every customer or potential customer should eagerly look forward to.

The need for web hosting is increasing at a rapid pace. This means that competition is also increasing when it comes to pricing. The companies are getting extremely competitive. In fact, price is now a major consideration for customers while hiring companies. Potential customers are looking out for platforms which provide the lowest price. The trend is now changing to cheaper prices and better quality. Thus, while hiring any service, you should consider this trend.

Gone are the days when hiring a web hosting company meant getting nothing more than web hosting. The trends have changed: there is now a concept of additional services that are essentially extra to the core service, and it attracts more customers for the companies. By doing so, not only do the customers benefit, but the companies are able to attract more business and enhance their service and brand reputation. It is one of the finest trends helping the industry as a whole.

Web hosting companies need to retain their existing customers and win new ones in order to survive the competition. There is no doubt that one web hosting company cannot satisfy all the customers in the market, and this is where the concept of market targeting comes into play. Certain segments of the market are targeted by certain web hosting companies, and this is how the industry is being run.

Read the rest here:
What are the top ten web hosting trends? Film Daily - Film Daily

Cloud services: Finding the right fit – IT Brief Australia

Article by Hyland country manager for A/NZ Jamie Atherton.

As digital transformation continues to drive new solutions and organisations look for more flexible options, moving services to the cloud is becoming the deployment choice for many.

While on-premises deployments are still a valuable option, many companies are shifting their IT protocols to support cloud-based, multi-cloud or hybrid environments.

But not all services are the same, and a careful evaluation of the pros and cons of each model is required to maximise the agility, speed, security, affordability and delivery they can derive from their chosen model.

According to a recent Gartner report, the cloud market in Australia is expected to exceed $10.6 billion in 2021, increasing 18.4% from 2020.

There are many reasons why a move into the cloud makes sense. Right up front, it removes the challenge of budgeting for large-scale implementations and the associated effort of saving up funds.

Cloud services administered on the vendor side can either be owned by the vendor or leased by them. If a vendor owns their own data centres, the onus to efficiently manage that centre falls on the vendor.

The software-as-a-service (SaaS) model relies on monthly subscriptions and therefore negates any significant up-front investment. The vendor administers the software platform, leaving less hands-on work for an organisation's IT team.

Upgrades, updates and security patches are performed in the cloud, negating the time, effort and headaches related to making major changes to a platform. In addition, cloud-based updates can be rolled out continuously, with little to no disruption to the end-user and providing the most up-to-date version of the platform.

In the past, an IT administrator would sacrifice a night of sleep to perform updates so that business could continue the next day with an updated application. Now, cloud-based software is updated and maintained with a much higher frequency, taking place automatically at scheduled times. As a result, services are always available, and business operations can continue with little disruption.

Content security is another major priority driving transitions from legacy on-premises platforms to modern solutions. A reputable cloud-based platform will have high-level security built-in and enterprise-grade security at the data centre level.

But cloud services are not one-size-fits-all. As efficient as it is to have software stored in the cloud, there is still the expectation that services are correctly administered on the vendor side.

This means all applications and files are available 100% of the time, or as close to that figure as possible. This necessitates working with a vendor that has a good reputation for managing its software in the cloud and that provides a very high level of uptime.
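
When comparing vendors on uptime, it helps to translate an advertised percentage into actual downtime. A quick sketch, where the SLA figures are illustrative rather than any vendor's actual commitment:

```python
# Convert an advertised uptime percentage into allowed downtime per year and month.
# The sample SLA values are illustrative only.
MINUTES_PER_YEAR = 365 * 24 * 60

def downtime_minutes(uptime_percent: float, period_minutes: int = MINUTES_PER_YEAR) -> float:
    return period_minutes * (1 - uptime_percent / 100)

if __name__ == "__main__":
    for sla in (99.0, 99.9, 99.99):
        per_year = downtime_minutes(sla)
        per_month = downtime_minutes(sla, period_minutes=30 * 24 * 60)
        print(f"{sla}% uptime -> about {per_year:.0f} min/year, {per_month:.1f} min/month of downtime")
```

The jump from 99% to 99.99% is the difference between roughly 87 hours and under an hour of downtime a year, which is why the uptime figure deserves scrutiny before signing.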

Finally, when evaluating cloud platforms, the data centre model itself should be considered and judged on its merits. For example, will applications be stored in the vendor's own data centre, co-located in a shared space, or on third-party hardware?

If an organisation chooses services provided by a public cloud offering, it is not necessarily the role of a third-party data centre to maintain the deployment of services. While the data itself will undoubtedly be monitored, the actual management of the deployment does not reside with the data centre, so the overall level of involvement is not the same.

The vendor will be responsible for actively managing the cloud environment in which its software resides and ensuring that the delivery to the end-user is optimal at all times. This adds complexity to a deployment, as the vendor needs to manage their relationship with the customer and the cloud service provider.

Risk is generally reduced if content and data reside outside the organisation's premises in a vendor-owned data centre. It is further reduced if that data is replicated to more than one location. On-premises deployments are only as safe as the organisation's own hardware. They will generally not have the same level of high-grade security, failovers, redundancy and disaster recovery (DR) as a professional-grade data centre.

A vendor's data centre space itself is likely to be purpose-built: storm- and weather-proof, with high-grade air-conditioning, uninterrupted and backup power supplies, and enterprise-grade levels of security. All this adds up to a safer, lower-risk environment for hosting content in the cloud.

Using a vendor's hardware for cloud services also removes the capital expenditure required to house that data within the organisation, not to mention the cost involved in running it efficiently. It reduces the need to deploy IT resources to constantly maintain the hardware, and there are no end-of-life replacement expenses.

A cloud services model also works best for public sector bodies, which often require data to be stored in more than one location and have strong governance around the movement of data. In most cases, governments will not allow data to leave the physical bounds of their country. This means the data must be stored either in a hybrid-cloud scenario or at two separate, locally-based data centres. Vendors who want to work in the public sector will be aware of this and have very tight governance and visibility over the data of all their customers.

Cloud services are of high value for organisations looking to keep pace with the rapidly evolving business world because they offer flexibility, cost savings, security and business continuity. But not every model works for every organisation, so it pays to be diligent and find the right fit.

The rest is here:
Cloud services: Finding the right fit - IT Brief Australia

Corporate Participation in the Open Source Community | Expert Advice – LinuxInsider.com

Open-source software is prolific in technology today. Just about everything from supercomputers to consumer electronics is powered by at least one piece of open source code.

But many businesses find themselves launching open-source products at a rapidly accelerating pace without truly understanding either the benefits that come with it or the potential pitfalls that must be avoided.

Let's talk about what open source means to your business, and how you can leverage it to serve both your customers and your business needs.

What exactly is open-source software? The open-source movement originated decades ago as a philosophical commitment to providing truly free (as in speech) software. This philosophy sternly rejected any attempts to commercialize or otherwise limit the use of such applications.

However, open-source developers today typically adopt a more nuanced approach, prioritizing accessibility and long-term sustainability of their projects over an unwavering commitment to absolute neutrality.

Some common tenets of modern open source include:

Although simple in concept, there are many open source licenses in common use. The Open Source Initiative (OSI), an organization dedicated to the promotion and protection of open source communities, has approved dozens of licenses as qualifying for open source per their policy. It should be recognized that the OSI itself has no legal authority on the validity of a license and acts merely as an industry steward.

Why so many licenses?

Some project authors want to retain a measure of control over their work to ensure that attribution is retained and that any modifications to their work must also be published as open source.

In such cases, a more restrictive (sometimes called copyleft) license such as GPLv3 is typically chosen. Other authors may not care as much about these considerations and opt for a more liberal license such as the MIT or Apache 2.0 licenses.

Selecting the best license for a project is a common source of confusion, so much so that GitHub has created a tool dedicated to helping guide inexperienced developers in making the decision.
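
Licensing diligence cuts both ways: before settling on a license for your own project, it is worth knowing what licenses your dependencies declare, since copyleft terms can constrain your choice. A small, hedged sketch using Python's standard importlib.metadata; the License field is self-reported by each package and may be missing or imprecise.

```python
# Minimal sketch: list the declared license of every installed Python distribution.
# Relies only on the standard library; the 'License' metadata field is self-reported
# by each package and may be missing or imprecise.
from importlib import metadata

def declared_licenses() -> dict[str, str]:
    result = {}
    for dist in metadata.distributions():
        name = dist.metadata.get("Name", "unknown")
        license_field = dist.metadata.get("License") or "not declared"
        result[name] = license_field
    return result

if __name__ == "__main__":
    for name, lic in sorted(declared_licenses().items()):
        print(f"{name}: {lic}")
```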

With an understanding of the open-source concept, it's only natural to ask why a business would entertain the idea of giving away software that took time and money to create. But there are plenty of ways that open source can benefit your business.

Leveraging Community Support

Many of the tools developed by your business don't contribute directly to its bottom line. For instance, I developed NetBox to help manage network infrastructure while working as a network engineer at DigitalOcean, a cloud hosting provider.

The software was certainly valuable, but it wasn't something we were positioned to sell directly to our customers. So, we decided to release it to the public as an open source project. In exchange, we received a ton of ideas and contributions from external collaborators who helped improve and maintain the software for everyone's benefit, including our own.

Product Fit Testing

Suppose you have an idea for a new product and want to test its fit in the market. You might consider launching the initial product with only essential features under an open source license. This will allow you to gauge market interest in the product and, assuming there is sufficient interest, provide a way to collect feedback from early adopters to address bugs and request additional features well in advance of the full products launch.

Promote Brand Awareness

Open-source projects need not be large or complex endeavors. In fact, many of the most useful projects comprise just a few hundred lines of code: perhaps a handy utility or convenient library useful to other developers. Some aren't even code at all, but curated collections of notes or resources.

Such projects are fairly cheap to develop and maintain, but help to amplify your company's brand within the community. Having already correlated your brand with a useful resource in the mind of a potential customer makes them more likely to consider your paid products.
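
As a sense of the scale involved, a genuinely useful project can be as small as the hedged sketch below: a retry decorator with exponential backoff of the kind many teams publish as a standalone utility. The names are illustrative, not a real package.

```python
# Illustrative micro-utility: retry a flaky callable with exponential backoff.
# Small enough to live in a single file, yet useful enough to publish.
import functools
import time

def retry(times: int = 3, base_delay: float = 0.5, exceptions=(Exception,)):
    """Retry the wrapped function up to `times` times, doubling the delay each attempt."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            delay = base_delay
            for attempt in range(1, times + 1):
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    if attempt == times:
                        raise
                    time.sleep(delay)
                    delay *= 2
        return wrapper
    return decorator

@retry(times=3, base_delay=0.1)
def fetch_status() -> str:
    # Placeholder for a flaky operation, e.g. an HTTP call.
    return "ok"

if __name__ == "__main__":
    print(fetch_status())
```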

Open Source as the Product

Perhaps paradoxically, the requirement that open source be made freely available does not preclude it from being sold. Many successful businesses have been built around selling primarily open-source products, usually by offering additional value in the form of support or extending functionality.

One of the most common approaches to commercialized open source is offering paid support and/or hosting for an open-source product. For example, WordPress is an extremely popular open-source content management system. Anyone can download and install it for free, but they also have the option of purchasing a subscription through wordpress.com. Their paid support plans include hosting, support, and other features and integrations depending on the pricing tier selected.

Another commercialization strategy is to split a product into both open-source and closed-source components, an approach referred to as open core. GitLab, a software development tool, is one such example. While the base product is made available for free as open source, many of its more advanced features are limited to paid versions.

Depending on the nature of your product, this approach can be a good fit as you reap the benefits of having an open-source community while also securing revenue from advanced users.

Launching an Open-Source Product

When you first publish an open-source project, it's important to have a plan for drawing people to participate. It can be surprisingly difficult to give something away. One impediment is the absence of a price tag. We're so accustomed to relying on price as an objective (though not always accurate) indicator of value that it can be difficult to convey the value of a free product.

The most successful open-source projects focus on solving specific problems or fulfilling a well-defined set of needs. When publicizing your project, explain why you decided to create a new project versus using something that already exists, and how users can expect to gain from adopting your solution.

Be specific: Does it address a use case not covered well by existing solutions? Is it faster? More configurable? Your project's introductory document (often a simple README file) should spell out its value proposition in no more than a few sentences.

Once you have this in place, you'll need to actively work on drawing people to your project. This can be done through marketing and social channels, but there's a line to tread. You don't want to be perceived as spammy or overly promotional, as this will quickly turn people off.

Consider what types of users would benefit most from adopting your software, and how you can best reach them. Blog posts demonstrating (and not just talking about) the benefits of your software tend to be very powerful; videos even more so. You can leverage social channels to promote these rather than linking directly to the project itself.

Depending on the nature of your project, you might also benefit by actively seeking out complaints about problems that you can solve. For example, perhaps you've developed an API client for a particular service or technology. Keep watch for people struggling in that area and suggest how your project may be of service to them where appropriate.

Twitter, Stack Overflow, and public chats can be especially good mediums for this sort of interaction. But again, don't be spammy. People tend to have a higher tolerance for the promotion of open-source software relative to paid products, but it's still a pretty low threshold. Only interject when you're reasonably confident that the audience stands to benefit from what you have to say.

Participating in External Open-Source Projects

Your business can also benefit from engaging with external open-source projects. Perhaps there are open-source tools your business relies upon that would benefit from your contributions. Or maybe there are community-maintained projects that enable your customers to better consume your commercial offerings. Contributing to outside open source projects can yield benefits both tangible and intangible.

When considering an external contribution, it's crucial to first understand the project's contribution policy. Well-established projects will typically have a formal document spelling out how contributions are to be proposed and submitted, whereas a smaller project may simply rely on established convention. Take some time to familiarize yourself with recent contributions to the project and make an earnest effort to abide by their precedent.

Also consider the real value of your proposed contribution to the project's user base. Contrary to popular belief, proposed contributions to open-source projects do place a substantial burden on a project's maintainers. Each submission must be carefully reviewed for appropriateness, functionality, security, documentation, and so on before it can be merged into the project. Unsolicited changes are often more of a burden than a boon for maintainers.

When in doubt, always discuss your proposal with the maintainers before proceeding with a submission of new code for review.

Given the continued growth of and interest in open-source software in recent years, there's no question that the trend will continue well into the foreseeable future.

By establishing and nurturing open-source communities focused both internally and externally, businesses as well as their customers can benefit tremendously.

It's clear that corporations that adopt an open-source strategy in earnest will be well positioned to lead in their markets.

Read the original:
Corporate Participation in the Open Source Community | Expert Advice - LinuxInsider.com

Sending data to the cloud, NC makes long-awaited election system updates – WRAL.com

By Jordan Wilkie, Carolina Public Press

By the end of the summer, all 100 county boards of elections in North Carolina will be rid of the computer servers that hold voter registration data. The information will be stored in the cloud instead.

This is an early step in what will be a years-long and nearly $3 million process to upgrade state and county election systems to improve security, usability and efficiency, according to the N.C. State Board of Elections.

The state will upgrade its voter registration and back-end data management, which are essential for running elections but little seen or understood by voters. The changes will not affect voting machines or the election equipment that makes, scans and counts ballots.

Originally designed in 1998 and put in place statewide in 2006, North Carolina's current election information management system is made up of a network of data servers in the state office and every county, woven together by a network of computer programs.

That was almost another geological era of cybersecurity risk management, according to John Sebes, co-founder and chief technology officer at the nonprofit Open Source Election Technology Institute. Back then, election administrators were not worrying about computer hacks from foreign nations or even criminals looking to make a buck.

"We have to recognize it's not just the technology front that's evolved so much; it's the threat," Sebes said.

The scope of the projects shows how election administration has evolved since the turn of the century. Running elections now requires handling ever more data managed through increasingly complex voting technologies, all while protecting against the kinds of cybersecurity threats that challenge major corporations and the federal government.

Updates planned over the next three years will make cybersecurity practices more consistent across all 100 county boards of elections, streamline updates to the back-end systems, write new software for use at the county and state levels, and replace the state servers with new hardware, according to Brian Neesby, chief information officer for the State Board of Elections.

Moving voter registration data from county servers to the cloud lays the foundation for all the other changes.

"This is a big step toward the implementation of modernization as opposed to talking about modernization," said Derek Bowens, Durham County's election director.

The State Board of Elections often consults with county election directors on elections improvements, and Bowens said he hopes that the board will consult with directors like him in the process of designing the new election management system.

Several other agencies have an interest in how the State Board of Elections runs. The board coordinates its security stance with other state agencies, like the Department of Information Technology and the Department of Public Safety, and with federal agencies, including the National Guard, FBI and Department of Homeland Security.

When complete, the state's election infrastructure will be more resistant to computer attacks, and managing election data should be easier, Neesby said. Election security experts agree, with one important caveat: if it is done correctly.

North Carolina is adding Microsoft into the mix to take advantage of the kind of computer servers and security that only a multinational tech company can provide.

The state's plan to move the counties' voter registration systems to the cloud by the end of the summer means putting the data on Microsoft's servers to be accessed remotely in each county.

"Overall, the migration will improve our security posture because we will limit the surface area of attack; the cloud will allow us to exert easier control over our security practices," Neesby wrote in an email to CPP.

Relying on companies like Microsoft can be a two-edged sword, according to Duncan Buell, chair emeritus in computer science and engineering at the University of South Carolina. If the contract is written well and the software that will connect the counties to the cloud is secure, the move will likely be an improvement.

But since Microsoft serves some of the most important government and commercial clients, it is a huge target, Buell said.

"They will be attacked by everybody, and they have been attacked by everybody," Buell said.

In the past, governments have been hesitant to store sensitive state data on a private company's computers, Sebes said, because it raises questions about data ownership and custodianship.

But in an age when the technological capacity of companies like Microsoft far outpaces what local governments are able to offer, and with the development of specific products for government use, states are getting over old fears, Sebes said.

"All you're really losing control of is where the hardware lives and who does the physical security, and who does the personnel security for the physical data center staff," Sebes said. "That's a reasonable amount to give up."

In March 2020, malware froze Durham County's website and many of its computer systems. The hack did not seem to target the county's voting systems, which were not directly affected. However, since the attack happened so close to the primary election and threatened to delay post-election audits, it raised alarms.

The county Board of Elections installed a localized version of its election management system and was quickly able to overcome other inconveniences like disabled phone lines and limited access to emails. In the end, the audits were only slightly delayed.

This kind of computer attack has become more common across the country and is just one of many ways hackers can inject chaos into an election.

Though the ultimate impact on Durham's primary election was minimal, the incident showed the importance of running election systems independently of other government systems, creating redundancy in the system and establishing backup plans should something fail.

Doomsday scenarios include scrambling the voter registration system so it is impossible to know who can vote, cutting power to the grid in major cities or a successful disinformation campaign convincing enough people to not trust the election results. Another worst-case scenario is the much-discussed but low likelihood of a hack into voting machines.

Though Russian state hackers probed voter registration databases in all 50 states in 2016, successfully infiltrated Illinois' system and compromised one voting systems contractor, no votes were changed, and the outcome of the election was legitimate.

The sudden awareness of foreign nations' attempts to interfere in U.S. democracy sparked a dramatic response from Congress and federal agencies. The federal government found state election systems were vulnerable and needed significant security upgrades.

In January 2017, the Department of Homeland Security designated election infrastructure as part of the nations critical infrastructure, meaning it is among the most important systems keeping the country functioning.

The designation and its accompanying changes fueled a consensus among election experts and the federal and state governments that the 2020 elections were the most secure elections ever held in the United States.

There is also consensus that states still have room to improve.

Counties and the Board of Elections have wanted to upgrade the voter registration system for the better part of a decade, Neesby said, but they did not have the funding or the staff.

Election officials weren't originally thinking about security. They just wanted to make a cumbersome system more streamlined, according to Neesby.

But with the cybersecurity threat to U.S. democracy laid bare in 2016, the federal government increased funding and resource sharing with states to shore up their election systems.

At the moment, federal funds through the Help America Vote Act pay for almost two-thirds of the state board's IT personnel, according to a statement from the board's spokesperson, Pat Gannon, released in opposition to the Senate's proposed budget, which would cut off these resources.

"Without federal funding, the state could not modernize its election management system," Gannon wrote.

A funding loss would put the state in a difficult position, as its current systems are outdated and subject to predictive faults, memory and functional limitations and inadequate reliability, according to the part of the state board's IT report focused on replacing state servers.

North Carolina is not an outlier in using old technology. Without funding or an external force, state governments are often reluctant to upgrade their election systems, Buell said.

South Carolina only upgraded its voter registration software after the state bought new electronic poll books that didn't work with the old programs, Buell said. He served on the Richland County Board of Elections for two years and worked with the League of Women Voters to advocate for election security.

North Carolina counties will not experience much change when the state board transfers the registration data to the cloud, according to Sara LaVere, elections director for Brunswick County. The login process and interface will change a little, and the state will dispose of the county server, she said.

The bigger changes will be phased in step by step over the next couple of years, according to Neesby. The state is rewriting software for the entire registration management system, designed for use in the cloud.

"The current system was built in 1998, which is an antiquated coding and software platform that has reached end-of-life for software and hardware functionality," the state board's IT report reads. "Modernization is necessary for functional and security reasons."

Editor's note: This article first appeared online at Carolina Public Press, an independent, in-depth and investigative nonprofit news service for North Carolina that allows WRAL News and other media outlets to republish its work.

Read the original:
Sending data to the cloud, NC makes long-awaited election system updates - WRAL.com

Cloud Storage vs. Local Storage: Which Storage Solution is the Best for Your Business? – Enterprise Storage Forum

If we had to define the most common question we get asked, it would be: "Which storage solution is best for my business?" Businesses have been asking us this since cloud storage became feasible and affordable for small businesses. Unfortunately, it doesn't have an easy answer. Or, at the very least, it doesn't have an easy answer at the level of your whole business.

In reality, virtually every business out there makes use of both types of storage, at least to some degree. The formal name for this approach is hybrid storage, though most companies using it wouldn't use the term for the now-mundane mix of public cloud storage, on-device temporary storage, and long-term hard drive backups.

Nevertheless, for those new to the subject or those who need a quick refresher course, it's possible to quickly summarize the differences between these types of storage.

Generally speaking, local storage is defined as data that you store on-premises. This includes every flash, hard, or backup drive you have, no matter how small; the hard drive in your personal laptop (or even on your phone) is technically local storage if you are taking it into work.

Local storage is great in some ways, and not so good in others.

Also read: Top NAS Storage Devices for 2021

Cloud storage is a type of storage where your data is not stored on your own servers, but instead you access files and programs over an internet connection. In slightly more technical terms, it's when you access any information stored in a datacenter via the internet.

You can check out our guides to how cloud storage works if you want to know more, but for most small business owners there is a more important question: Is cloud storage worth the cost?

Also read: Best Cloud Storage for Business & Enterprise 2021

One drawback is speed: with cloud storage, you are limited by the speed of your internet connection. While you are only likely to notice this if you are transferring large amounts of data, it can be an issue in growing businesses.

As you can see from the lists above, both types of storage have pros and cons. Most businesses make use of both simultaneously, mixing local storage of critical, high-security files with cloud servers that allow staff remote access to the data they use every day.

In short, if you are not already using cloud storage for at least a portion of your storage needs, you should know that it will more than likely save your business significant costs.
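
As a back-of-the-envelope way to sanity-check that claim for your own workload, the sketch below compares a flat per-GB-per-month cloud rate against a local drive amortized over its expected lifetime. Every number in it is an assumed placeholder, not a quoted price, and the local model deliberately omits the power, administration, redundancy, and replacement costs that often dominate in practice.

```python
# Rough cost comparison: cloud object storage vs. an amortized local drive.
# All prices, capacities, and lifetimes below are assumed placeholders.
# Note: the local model ignores power, admin time, redundancy, and failure
# replacement, which are often the dominant on-premises costs.
def cloud_monthly_cost(gb_stored: float, price_per_gb_month: float = 0.02) -> float:
    return gb_stored * price_per_gb_month

def local_monthly_cost(drive_price: float = 250.0,
                       drive_capacity_gb: float = 8000.0,
                       lifetime_months: int = 60,
                       gb_stored: float = 1000.0) -> float:
    # Charge only for the fraction of the drive actually used, spread over its lifetime.
    fraction_used = min(gb_stored / drive_capacity_gb, 1.0)
    return (drive_price / lifetime_months) * fraction_used

if __name__ == "__main__":
    gb = 1000.0
    print(f"Cloud: ${cloud_monthly_cost(gb):.2f}/month for {gb:.0f} GB (assumed rate)")
    print(f"Local: ${local_monthly_cost(gb_stored=gb):.2f}/month for {gb:.0f} GB (assumed drive)")
```

Plugging in your own figures, including staff time and backup hardware on the local side, is what turns this from a toy into a real comparison.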

Read next: Best Hybrid Cloud Storage Vendors & Software 2021

Read the original here:
Cloud Storage vs. Local Storage: Which Storage Solution is the Best for Your Business? - Enterprise Storage Forum
