
Announcing @HDScorp to Exhibit at @CloudExpo New York | #Cloud #Storage – SYS-CON Media (press release)

SYS-CON Events announced today that Hitachi Data Systems, a wholly owned subsidiary of Hitachi, Ltd., will exhibit at SYS-CON's 20th International Cloud Expo, which will take place June 6-8, 2017, at the Javits Center in New York City.

Hitachi Data Systems (HDS) will be featuring the Hitachi Content Platform (HCP) portfolio. This is the industry's only offering that allows organizations to bring together object storage, file sync and share, cloud storage gateways, and sophisticated search and analytics to create a tightly integrated, simple and smart cloud-storage solution. HCP provides massive scale, multiple storage tiers, powerful security, cloud capabilities, and multitenancy with configurable attributes for each tenant, all backed by legendary Hitachi reliability. Designed to preserve data for long durations, HCP carries built-in data protection mechanisms and can evolve fluidly as storage technologies change. Furthermore, HCP addresses a wide range of challenges through a thriving community of third-party software partners as well as traditional and cloud storage protocols. HCP eliminates the need to maintain separate systems for each workload and bridges the gap between traditional, Mode 1 applications and modern, Mode 2 operations. Customers enjoy faster time-to-value and service providers accelerate time-to-market by eliminating the hassles of do-it-yourself integration.

Only Hitachi Data Systems powers the digital enterprise by integrating the best information technology and operational technology from across the Hitachi family of companies. HDS combines this experience with Hitachi expertise in the Internet of Things to deliver the exceptional insights business and society need to transform and thrive.

For more information, visit http://www.hds.com.

The World's Largest "Cloud Digital Transformation" Event

@CloudExpo / @ThingsExpo 2017 New York (June 6-8, 2017, Javits Center, Manhattan)

@CloudExpo / @ThingsExpo 2017 Silicon Valley (Oct. 31 - Nov. 2, 2017, Santa Clara Convention Center, CA)

Full Conference Registration Gold Pass and Exhibit Hall Here

Register For @CloudExpo Here via EventBrite

Register For @ThingsExpo Here via EventBrite

Register For @DevOpsSummit Here via EventBrite

Sponsorship Opportunities

Sponsors of Cloud Expo / @ThingsExpo will benefit from unmatched branding, profile building and lead generation opportunities through:

For more information on sponsorship, exhibit, and keynote opportunities, contact Carmen Gonzalez (@GonzalezCarmen) today by email at events (at) sys-con.com, or by phone 201 802-3021.

Secrets of Sponsors and Exhibitors Here

Secrets of Cloud Expo Speakers Here

All major researchers estimate there will be tens of billions of devices - computers, smartphones, tablets, and sensors - connected to the Internet by 2020. This number will continue to grow at a rapid pace for the next several decades.

With major technology companies and startups seriously embracing Cloud strategies, now is the perfect time to attend @CloudExpo | @ThingsExpo, June 6-8, 2017, at the Javits Center in New York City, NY and October 31 - November 2, 2017, Santa Clara Convention Center, CA. Learn what is going on, contribute to the discussions, and ensure that your enterprise is on the right path to Digital Transformation.

Track 1. FinTech
Track 2. Enterprise Cloud | Digital Transformation
Track 3. DevOps, Containers & Microservices
Track 4. Big Data | Analytics
Track 5. Industrial IoT
Track 6. IoT Dev & Deploy | Mobility
Track 7. APIs | Cloud Security
Track 8. AI | ML | DL | Cognitive Computing

Delegates to Cloud Expo / @ThingsExpo will be able to attend 8 simultaneous, information-packed education tracks.

There are over 120 breakout sessions in all, with Keynotes, General Sessions, and Power Panels adding to three days of incredibly rich presentations and content.

Join Cloud Expo / @ThingsExpo conference chair Roger Strukhoff (@IoT2040), June 6-8, 2017, at the Javits Center in New York City, NY and October 31 - November 2, 2017, Santa Clara Convention Center, CA for three days of intense Enterprise Cloud and 'Digital Transformation' discussion and focus, including Big Data's indispensable role in IoT, Smart Grids and (IIoT) Industrial Internet of Things, Wearables and Consumer IoT, as well as (new) Digital Transformation in Vertical Markets.

Financial Technology - or FinTech - Is Now Part of the @CloudExpo Program!

Accordingly, attendees at the upcoming 20th Cloud Expo / @ThingsExpo June 6-8, 2017, at the Javits Center in New York City, NY and October 31 - November 2, 2017, Santa Clara Convention Center, CA will find fresh new content in a new track called FinTech, which will incorporate machine learning, artificial intelligence, deep learning, and blockchain into one track.

Financial enterprises in New York City, London, Singapore, and other world financial capitals are embracing a new generation of smart, automated FinTech that eliminates many cumbersome, slow, and expensive intermediate processes from their businesses.

FinTech brings efficiency as well as the ability to deliver new services and a much improved customer experience throughout the global financial services industry. FinTech is a natural fit with cloud computing, as new services are quickly developed, deployed, and scaled on public, private, and hybrid clouds.

More than US$20 billion in venture capital is being invested in FinTech this year. @CloudExpo is pleased to bring you the latest FinTech developments as an integral part of our program, starting at the 20th International Cloud Expo June 6-8, 2017 in New York City and October 31 - November 2, 2017 in Silicon Valley.

@CloudExpo is accepting submissions for this new track, so please visit http://www.CloudComputingExpo.com for the latest information.

Speaking Opportunities

The upcoming 20th International @CloudExpo | @ThingsExpo, June 6-8, 2017, at the Javits Center in New York City, NY and October 31 - November 2, 2017, Santa Clara Convention Center, CA announces that its Call For Papers for speaking opportunities is open.

Submit your speaking proposal today! Here

Our Top 100 Sponsors and the Leading "Digital Transformation" Companies

(ISC)2, 24Notion (Bronze Sponsor), 910Telecom, Accelertite (Gold Sponsor), Addteq, Adobe (Bronze Sponsor), Aeroybyte, Alert Logic, Anexia, AppNeta, Avere Systems, BMC Software (Silver Sponsor), Bsquare Corporation (Silver Sponsor), BZ Media (Media Sponsor), Catchpoint Systems (Silver Sponsor), CDS Global Cloud, Cemware, Chetu Inc., China Unicom, Cloud Raxak, CloudBerry (Media Sponsor), Cloudbric, Coalfire Systems, CollabNet, Inc. (Silver Sponsor), Column Technologies, Commvault (Bronze Sponsor), Connect2.me, ContentMX (Bronze Sponsor), CrowdReviews (Media Sponsor), CyberTrend (Media Sponsor), DataCenterDynamics (Media Sponsor), Delaplex, DICE (Bronze Sponsor), EastBanc Technologies, eCube Systems, Embotics, Enzu Inc., Ericsson (Gold Sponsor), FalconStor, Formation Data Systems, Fusion, Hanu Software, HGST, Inc. (Bronze Sponsor), Hitrons Solutions, IBM BlueBox, IBM Bluemix, IBM Cloud (Platinum Sponsor), IBM Cloud Data Services/Cloudant (Platinum Sponsor), IBM DevOps (Platinum Sponsor), iDevices, Industrial Internet of Things Consortium (Association Sponsor), Impinger Technologies, Interface Masters, Intel (Keynote Sponsor), Interoute (Bronze Sponsor), IQP Corporation, Isomorphic Software, Japan IoT Consortium, Kintone Corporation (Bronze Sponsor), LeaseWeb USA, LinearHub, MangoApps, MathFreeOn, Men & Mice, MobiDev, New Relic, Inc. (Bronze Sponsor), New York Times, Niagara Networks, Numerex, NVIDIA Corporation (AI Session Sponsor), Object Management Group (Association Sponsor), On The Avenue Marketing, Oracle MySQL, Peak10, Inc., Penta Security, Plasma Corporation, Pulzze Systems, Pythian (Bronze Sponsor), Cosmos, RackN, ReadyTalk (Silver Sponsor), Roma Software, Roundee.io, Secure Channels Inc., SD Times (Media Sponsor), SoftLayer (Platinum Sponsor), SoftNet Solutions, Solinea Inc., SpeedyCloud, SSLGURU LLC, StarNet, Stratoscale, Streamliner, SuperAdmins, TechTarget (Media Sponsor), TelecomReseller (Media Sponsor), Tintri (Welcome Reception Sponsor), TMCnet (Media Sponsor), Transparent Cloud Computing Consortium, Veeam, Venafi, Violin Memory, VAI Software, Zerto

About SYS-CON Media & Events

SYS-CON Media (www.sys-con.com) has been connecting technology companies and customers since 1994 through a comprehensive content stream - featuring over forty focused subject areas, from Cloud Computing to Web Security - interwoven with market-leading full-scale conferences produced by SYS-CON Events. The company's internationally recognized brands include, among others, Cloud Expo (@CloudExpo), Big Data Expo (@BigDataExpo), DevOps Summit (@DevOpsSummit), @ThingsExpo, Containers Expo (@ContainersExpo) and Microservices Expo (@MicroservicesE).

Cloud Expo, Big Data Expo and @ThingsExpo are registered trademarks of Cloud Expo, Inc., a SYS-CON Events company.


Why the hybrid vs multi-cloud position is not up for debate – Cloud Tech

The Cloud Industry Forum's latest study shows some 88% of UK businesses are now using the cloud, with over half of these favouring the hybrid approach, whereby all data is processed and stored over a combined public and private cloud infrastructure.

At the same time, there's been a debate raging for some time over the merits of a multi-cloud approach, where businesses can utilise a variety of cloud services, public and private, to deliver a "horses for courses" solution.

Okay, so what's the difference? Aren't both approaches one and the same thing? On first pass they may appear so, but the difference is that hybrid effectively connects a public cloud, such as Microsoft Azure, with your private on-premise cloud IT and applications. This way a business can cost-effectively access highly elastic compute resources from the chosen provider, perhaps for managing and storing additional workloads at peak times (think retailers on Black Friday) and for general day-to-day applications. But all the mission-critical stuff remains on-site in the private camp for various reasons, such as security and privacy regulations.

The subtle difference with multi-cloud is that businesses mix and match a range of public and private clouds to achieve best-of-breed applications and services. In practice, however, it may well be the case that only one public cloud service, such as Azure, is selected for delivery of day-to-day applications and for soaking up additional processing and storage requirements at busy times, while several private cloud IT systems are deployed to ensure optimum sharing of workloads using a mix of specialist apps and services.
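The distinction can be sketched as a toy placement rule. Everything here is illustrative: the workload attributes, the cloud names and the routing logic are invented for the example, not drawn from any vendor API.

```python
# Illustrative sketch only: a toy placement rule in the spirit of a hybrid /
# multi-cloud strategy. Names and attributes are hypothetical.

def place_workload(name, mission_critical, bursty):
    """Pick a (hypothetical) deployment target for a workload."""
    if mission_critical:
        return "private-cloud"   # security / privacy regulations keep it on-site
    if bursty:
        return "public-azure"    # elastic capacity for peak demand
    return "public-azure"        # general day-to-day applications

placements = {
    w["name"]: place_workload(**w)
    for w in [
        {"name": "payments-db",   "mission_critical": True,  "bursty": False},
        {"name": "web-frontend",  "mission_critical": False, "bursty": True},
        {"name": "intranet-wiki", "mission_critical": False, "bursty": False},
    ]
}
print(placements)
```

In a real multi-cloud setup the rule would of course span more than one public provider; the point of the sketch is only that placement is decided per workload rather than once for the whole estate.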

There's no one-size-fits-all. One provider or cloud infrastructure may not necessarily provide the optimum solution for each and every workload or application. In either case there will be pros and cons, and which route to take will ultimately be decided on a case-specific basis.

This said, there's no longer any reason to limit your options by going for a one-or-the-other approach. Why not have both hybrid and multi-cloud solutions at your disposal as and when? And with the availability of solutions such as Azure Stack, the exact same cloud experience can be replicated privately on-premise.

Much, as always, depends on the resources of the data centre(s) being used. Whether hybrid, multi-cloud or a mix of both, there is one thing they must have in common to make everything interoperate seamlessly, securely and consistently: connectivity.

More to the point, high-speed, low-latency connectivity, and plenty of it, to ensure sufficient redundancy and failover options. However, these multi- and hybrid-cloud environments are only going to be as good as the weakest link: the public cloud's connection to the data centre.

This increasingly calls for data centres that bypass the internet with cloud gateways, allowing faster, more secure virtual private network connections directly into global public cloud network infrastructures, such as Microsoft's Azure ExpressRoute.

Aside from this, and the requisite level of scalable power to rack, the other key factor to consider is a data centre's level of engineering competence, necessary not only for configuring and interconnecting these complex environments, but also for helping businesses bring their legacy IT into the equation: older equipment and software which is still playing a critical role and is just too valuable to sideline.

With the right data centre, there's no longer a need to debate whether hybrid is better than a multi-cloud model. Businesses are free to follow a best-of-breed cloud strategy, using a best-of-all-worlds approach to deliver the right applications and services at the right time to exactly where they're needed: quickly, securely and consistently.


How to Help Your Customers Combat Public Cloud Bill Shock – Talkin’ Cloud

Consumption-based pricing and the ability to spin cloud servers up and down as needed have opened organizations up to a whole new way of doing business, one that requires a lot less upfront capital. But something that still eludes many organizations is how to monitor the costs of their cloud usage before they get out of hand.

Lynn LeBlanc, CEO and founder of hybrid IT service provider HotLink, says that public cloud is typically more of a so-called black hole than on-premise infrastructure. "One advantage of on-premise infrastructure is you buy it, and it's yours, and it doesn't cost you any more to use it up," she says.

"If you're on a pay-as-you-go plan of public cloud, while you get that upfront benefit of not having to extend all that capital... really managing that consumption, it's a new problem that people aren't used to having to solve," she tells Talkin' Cloud in an interview.

To address this issue with its own corporate IT customers, HotLink launched managed services last year, in conjunction with its hybrid HotLink Cloud-Attach Platform, that provide AWS cost optimization and load optimization. These services were born out of the company's own experience dealing with what LeBlanc calls "bill shock" at the end of each month.

"I would get these [AWS] bills at the end of the month and honestly I never knew how much they were going to be, and they weren't really organized in a way that we could deconstruct them," she says.

According to LeBlanc, while an Amazon bill shows her how much data was transferred to and from the cloud, how much storage she consumed and other metrics, it will not "tell me who was doing what, what were they doing with it... so when it came to trying to figure out, wow, suddenly we got this monster bill that, by the way, was way more than we budgeted for, why? What were we doing?"

"I know that our engineering VP always dreaded when the bill came in, because he knew there was going to be this fire drill to try and figure out what it was," she says.

For engineering it was particularly problematic because it was impossible to figure out how many cloud resources individual developers were using. So the company developed its own tools to monitor cloud usage, some of which use the Amazon API.

"What we found was the reporting we created for ourselves could be really useful for our customers," LeBlanc says. "We gave them some of those tools, but then we found there wasn't that much discipline, that you really have to have if you care about managing cost in the cloud."

"There are a lot of tools that do all kinds of things related to cost, but it still depends on somebody really actively managing it. We found that when it came to corporate IT, and particularly the upper mid-market, I just don't think there was the discipline around 'you've got to watch this stuff every day' if you really want, at the end of the month, to have the economic benefit that you envisioned when you started to use this resource."

HotLink started offering managed services around its various products last year, including disaster recovery as a service and general cloud management, but it wasn't until August 2016 that the company productized its cloud cost optimization and load optimization and made them part of its managed services offerings.

HotLink monitors AWS cloud costs on behalf of its customers with its cloud cost optimization module. LeBlanc explains: "We find out from them their budget for public cloud usage for each month, we monitor it daily to see what's powering on, what's not. Perhaps an extra-large instance is powered on but there's almost no CPU usage. This means somebody should have just powered it off."
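The daily check LeBlanc describes can be sketched as a simple filter over instance metrics. This is a minimal illustration only, not HotLink's product: the instance records and the 5% threshold are invented, and a real version would pull utilization from the CloudWatch API (for example via boto3) rather than a hard-coded list.

```python
# Hedged sketch: flag running instances whose average CPU looks too low to
# justify keeping them powered on. Data and threshold are hypothetical.

IDLE_CPU_PCT = 5.0  # assumed cutoff for "almost no CPU usage"

def idle_instances(instances, threshold=IDLE_CPU_PCT):
    """Return names of running instances that look idle and could be powered off."""
    return [
        i["name"]
        for i in instances
        if i["state"] == "running" and i["avg_cpu_pct"] < threshold
    ]

fleet = [
    {"name": "xlarge-build-01", "state": "running", "avg_cpu_pct": 0.4},
    {"name": "web-01",          "state": "running", "avg_cpu_pct": 41.0},
    {"name": "batch-07",        "state": "stopped", "avg_cpu_pct": 0.0},
]
print(idle_instances(fleet))  # ['xlarge-build-01']
```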

For companies that are just getting started in monitoring cloud costs, LeBlanc says it can be hard to figure out how to budget. Similar to a household budget, it is best to have some number of months of usage before you can create a realistic budget. It also requires a bit of a behavioral change, she says.
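As a toy illustration of that household-budget analogy (all figures invented), a baseline monthly budget could be derived from a few months of observed spend plus some headroom:

```python
# Illustrative only: average a few months of observed cloud spend and pad with
# a safety margin. The 20% headroom and the dollar figures are made up.

def monthly_budget(past_monthly_spend, headroom=0.20):
    """Baseline budget = average historical spend plus a safety margin."""
    baseline = sum(past_monthly_spend) / len(past_monthly_spend)
    return round(baseline * (1 + headroom), 2)

print(monthly_budget([4200.0, 3900.0, 5100.0]))  # 5280.0
```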

"There's a bit of corporate IT that's sort of opaque. They don't necessarily know what all these applications are doing, so let's say, for example, they're using the public cloud for disaster recovery. They don't necessarily know how often a given application goes through a major update cycle for its database. They just never paid attention to that because they don't manage things at the application level for the most part; they're looking at it more holistically."

Not being application-aware from an operational point of view is a change in behavior that is necessary when you have a new unit of financial measure in the public cloud, LeBlanc says. When you don't know the application, there is a tendency to overprovision, she adds. HotLink makes recommendations to customers about right-sizing the pre-defined instances based on the performance and utilization of the resources, she says.

With its load optimization module, HotLink provides load balancing recommendations for on-premise data transfer; intelligent scheduling for bandwidth management; and AWS account configuration for fewer bottlenecks and faster throughput, the company says.

"[W]hat you're trying to do is keep from saturating the network that is being used for a lot of purposes... we haven't had any customers where they just didn't have enough bandwidth, but if it's not being accurately managed they will have bottlenecks," she says.


Amihan Global Strategies Launches AMIHAN CLOUD BLOCKS, a Cloud-Native Infrastructure to Accelerate Enterprise … – PR Newswire (press release)

LOS ANGELES, April 25, 2017 /PRNewswire/ -- Amihan Global Strategies announces the immediate availability of AMIHAN CLOUD BLOCKS, a cloud-native infrastructure solution that accelerates digital transformation by augmenting the existing IT systems of large enterprises with the agility of remote servers in the cloud.

AMIHAN CLOUD BLOCKS merges the best of two worlds: world-class technology, based on Google's pioneering Kubernetes Platform, and regional expertise catering to the specific needs of the largest enterprises in the ASEAN region. Designed as a fully managed service solution and a 100% OPEX subscription model, AMIHAN CLOUD BLOCKS is the only cloud solution that is optimized for Southeast Asia.

INFRASTRUCTURE FOR INNOVATION

"The cloud is step one of digital transformation," said Winston Damarillo, Executive Chairman of Amihan. "It's the infrastructure for innovation that allows you to experiment, grow, and adapt in ways that are necessary to keep up with the pace of customers' digital appetites."

AMIHAN CLOUD BLOCKS enables companies to deploy new services faster, harmonize legacy and next-generation apps in a unified ecosystem, and perform complex data analytics all in a single platform. It aims to transform the IT infrastructure of legacy companies, starting with their data center: the nucleus that powers business applications and houses business intelligence.

AMIHAN CLOUD BLOCKS is based on Kubernetes, a container management tool that was initially developed by Google and is now managed by the Cloud Native Computing Foundation (CNCF), of which Amihan is a member. Furthermore, AMIHAN CLOUD BLOCKS is built on the NEC DX 2000 hardware platform, which delivers world-leading density of compute, memory and storage.

"NEC is excited to collaborate with Amihan to deliver a state-of-the-art Cloud Native Platform in Southeast Asia," says Tatsunori Shibata (Head of Go-to-market, Bigdata & Cloud platform business, NEC Corporation). "The NEC DX 2000 is a perfect fit for the workloads of the fastest growing digital companies in the region."

A SIMPLER DIGITAL TRANSFORMATION, ON DEMAND

Digital transformation is a complex process that involves drastic changes in culture, business practices and IT systems. AMIHAN CLOUD BLOCKS simplifies this by minimizing the time spent on compatibility issues, unifying data sources, investing in new systems, and meeting the latest cybersecurity and data privacy standards.

AMIHAN CLOUD BLOCKS enables a deliberate transformation of a company's existing IT systems to the cloud through its virtualization platform based on OpenStack, which will house legacy Linux and Windows servers; Acaleph Storage, a data-secure enterprise-wide storage platform; and Kubernetes-managed Docker to accelerate application development. In addition, AMIHAN CLOUD BLOCKS can seamlessly federate with Google Cloud Platform to enable a scalable hybrid cloud across all its services.

"Enterprises are using open source technologies like Kubernetes to deploy cloud native architecture models that support fast, agile application development," said Dan Kohn, Executive Director of The Cloud Native Computing Foundation. "We are pleased to support our members in their efforts to deliver engineered solutions for the modern enterprise."

Media Contact: Rexy Josh Dorado Phone: 216.526.7842 Email: rdorado@agsx.net

About Amihan:

Amihan Global Strategies is a digital transformation accelerator that advises and partners with some of the largest institutions in the ASEAN region. Amihan helps companies build a digital roadmap, access ideal technology, and act on their vision to become future-ready organizations. Amihan has offices in Manila, Cebu, Singapore and Los Angeles.

Related Files

CLOUDBLOCKS Data Sheet.pdf


Related Links

AMIHAN CLOUD BLOCKS Product Page

Amihan Homepage

This content was issued through the press release distribution service at Newswire.com. For more info visit: http://www.newswire.com.

To view the original version on PR Newswire, visit: http://www.prnewswire.com/news-releases/amihan-global-strategies-launches-amihan-cloud-blocks-a-cloud-native-infrastructure-to-accelerate-enterprise-digital-transformation-300445731.html

SOURCE Amihan

http://www.amihan.net


IBM-Nvidia Servers Achieve High-Performance Computing Milestone In Oil Industry – Forbes


Each server, dubbed Minsky, is equipped with two Power8 central processing units (CPUs) and four Nvidia Tesla P100 graphics processing units (GPUs). With the servers running on IBM's cloud, Stone Ridge's simulation took less than two hours to ...


Botnet Controllers in the Cloud – Spamhaus

Cloud computing is popular these days. Millions of users consume computing power out of the cloud every day. Cloud computing comes with several advantages over traditional server hosting, such as scalability and quick deployment of new resources.

As of January 2017, several large botnet operators appear to have discovered the benefits of cloud computing as well, and have started to deploy their botnet controllers in the cloud.

Since early 2017, we at Spamhaus have seen a significant increase in the number of botnet controllers (botnet command and control servers, C&C, C2) popping up at legitimate cloud computing providers. Most have been at Amazon's Cloud Computing platform "AWS" but we have recently seen an increase in new botnet controllers hosted on Google's Cloud Computing platform "Compute Engine". The chart below documents the numbers of newly detected botnet controllers at Amazon AWS and Google Compute Engine.

This chart is based only on botnet controllers. It does not include other fraudulent infrastructure, such as payment sites for ransomware (TorrentLocker, Locky, Cerber etc) or malware distribution sites. We have seen a spike in those types of criminal infrastructure at Amazon and Google as well.

Neither Amazon nor Google is handling abuse reports about botnet controllers, malware distribution sites, and other types of criminal activity on their clouds in a timely manner. Both allow botnet controllers to remain online for weeks at a time, despite multiple abuse reports and reminders.

Spamhaus has reached out repeatedly to both Amazon and Google about these abuse problems, but has received no relevant response from either so far.

As we lack any useful feedback from Amazon and Google on the causes of these ongoing abuse problems, we can only speculate. Previous experience with issues at cloud providers suggests that a weak or non-existent customer verification process might be the root cause. Other factors which could lead to such problems include a weak Acceptable Use Policy, or a corporate culture and management not supportive of Abuse Desk policy enforcement.

We encourage Amazon and Google to take the appropriate actions to stop all outstanding abuse problems on their networks, just as all responsible hosting networks must do. These are the specific issues which we are presently tracking at those networks:

Open SBL Advisories in the responsibility of amazon.com: https://www.spamhaus.org/sbl/listings/amazon.com

Open SBL Advisories in the responsibility of google.com: https://www.spamhaus.org/sbl/listings/google.com
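For operators who want to automate such checks, SBL data is also queryable over DNS using the standard DNSBL convention: reverse the IPv4 octets and append the zone. A minimal sketch follows; the zone name reflects the usual Spamhaus convention, and the example address 192.0.2.1 is a documentation IP, not a real listing.

```python
# Hedged sketch of a DNSBL check against the SBL. A positive answer in the
# zone indicates a listing; NXDOMAIN indicates none. Network use is optional.

import socket

def dnsbl_query_name(ipv4, zone="sbl.spamhaus.org"):
    """Build the DNSBL query name for an IPv4 address (reversed octets + zone)."""
    octets = ipv4.split(".")
    return ".".join(reversed(octets)) + "." + zone

def is_listed(ipv4, zone="sbl.spamhaus.org"):
    """True if the address resolves in the zone, i.e. appears to be listed."""
    try:
        socket.gethostbyname(dnsbl_query_name(ipv4, zone))
        return True
    except socket.gaierror:
        return False

print(dnsbl_query_name("192.0.2.1"))  # 1.2.0.192.sbl.spamhaus.org
```

Note that Spamhaus applies usage limits to its public DNSBL mirrors; high-volume users are expected to arrange a data feed rather than hammer the public zone.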

In addition, Amazon and Google must take necessary and appropriate steps to prevent further abuse of all types from being generated on their network. That includes reacting to abuse reports from many sources including, but not limited to, SBL listings, and effectively prohibiting all services to spammers and other abusive users.

We at Spamhaus are continuing to monitor the situations at Amazon AWS and Google Compute Engine, and may take additional action(s) to protect Spamhaus users from further abuse generated on those networks. We are by no means happy to publish a complaint of this nature against two such established Internet companies, yet at this time we are very concerned about the ongoing abuse tolerated by networks which should be setting reputational standards for legitimate hosting, and not for supporting botnets.

How hosting providers can battle fraudulent sign-ups: https://www.spamhaus.org/news/article/687/how-hosting-providers-can-battle-fraudulent-sign-ups

Spamhaus Botnet Summary 2016: https://www.spamhaus.org/news/article/733/spamhaus-botnet-summary-2016

M3AAWG Hosting Best Practices 2015: https://www.m3aawg.org/sites/default/files/document/M3AAWG_Hosting_Abuse_BCPs-2015-03.pdf


How Cloud Computing has Enabled SMBs to Embrace ERP – Social Barrel (blog)

Posted By Firdaus on Apr 25, 2017


Source: https://www.linkedin.com/pulse/key-drivers-companies-moving-cloud-erp-diwakar-rai

Enterprise Resource Planning (ERP) initially improved the efficiencies of large companies and organizations. As the ERP industry has evolved and technologies have improved, ERP is now cost-effective for medium and even smaller businesses that need to streamline processes.

The principal technology that has made this feasible is the evolution of cloud computing. As PC Mag writes, many businesses are accessing their data over cloud-based platforms rather than storing it all locally. Many enterprises in the small and midmarket business (SMB) sector have already taken to the cloud for hosting and storage. This makes ERP solutions easier and, in many cases, more cost-effective to implement for the smaller company that needs to manage processes, improve communication, and create synergies within the business.

ERP software ensures the smooth functioning of inventory management, order processing, and production while tracking and controlling other aspects of the business such as payroll, cash, raw materials, and orders. It also allows for faster communication and information sharing between internal members of the company as well as business partners, suppliers, and clients.

Early ERP systems were too costly for SMBs, so they mainly ended up patching together various systems in random parts of the organization, but cloud-based ERP offers several benefits to smaller companies wanting to implement it. Firstly, it takes away the initial cost required for non-cloud systems. There is also no need for companies to have in-house expertise to manage and maintain the hardware. All applications are hosted in the cloud and accessed via an internet connection.

The companies creating ERP systems have enjoyed this opportunity as well and have designed some innovative systems to work with the cloud. At the same time, they have come up with creative pricing structures to make ERP affordable and highly attractive to SMBs. There are several ways to pay, and pay-per-transaction pricing models often suit smaller businesses.
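A toy comparison shows why a pay-per-transaction model can suit smaller businesses. All prices here are invented for illustration, not real vendor pricing.

```python
# Hedged sketch: break-even between two hypothetical ERP pricing models.
# A low-volume business pays less per transaction; past a break-even volume,
# a flat subscription wins. All figures are made up.

PRICE_PER_TXN = 0.05   # hypothetical per-transaction price
FLAT_MONTHLY  = 500.0  # hypothetical flat monthly subscription fee

def monthly_cost_per_txn(transactions, price_per_txn=PRICE_PER_TXN):
    return transactions * price_per_txn

def cheaper_model(transactions):
    """Which hypothetical model costs less at this monthly volume?"""
    if monthly_cost_per_txn(transactions) < FLAT_MONTHLY:
        return "per-transaction"
    return "flat"

for txns in (2_000, 8_000, 15_000):
    print(txns, monthly_cost_per_txn(txns), cheaper_model(txns))
```

Under these assumed prices the break-even sits at 10,000 transactions a month; below that the usage-based model is the cheaper choice, which matches the article's point about smaller businesses.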

The entry barrier has lowered, taking away some maintenance concerns. The savings on capital expenditure can be better used to grow the business; plus, there are efficiencies achieved from having the system in place. It also makes planning and budgeting easier as ERP costs will ultimately be a percentage of turnover.

But how do you know whether your small or medium-sized business could benefit from switching to an ERP? Business.com offers four scenarios that may indicate it's time to take the plunge:

Another benefit the cloud offers is that as the business grows, the ERP system is very easy to adjust. It is highly adaptable, so you can start with the core functionality required and add to it over time. This is particularly useful in turbulent markets where short-term futures are not secure. A small, growing company can reap the benefits of ERP without any major up-front costs or long-term, onerous obligations.

You pay only for what you need, as you need it, and scale it over time if necessary. This eliminates the risk of investing in software that is too small for your needs and quickly outgrown, or more powerful than needed and underutilized. A system can even be scaled down where necessary if volume or turnover drops dramatically.

You want to select a product that is not only right for you now but will be in the future as well. While most ERP systems are scalable, the cloud makes scaling a bit faster, easier, and less expensive. As the technology evolves, so does the ERP industry. Once a massive system reserved only for the largest companies, ERP is now nimble, flexible and affordable enough for all but the smallest of companies, thanks to advances in cloud computing.


Author: Firdaus

I work as an IT consultant in the Toronto area and I love to write blogs about a variety of subjects. My passion for writing stems from the desire that everyone should have access to meaningful information. Whether it is a blog about society, culture, technology, or social media, I don't want to miss the opportunity of sharing my thoughts with my friends and audience. Since I believe in a mutual exchange of ideas, I am always on the lookout for feedback on my writing.

Read more from the original source:
How Cloud Computing has Enabled SMBs to Embrace ERP - Social Barrel (blog)

Read More..

Mucheru urges private sector to boost investment in internet security – The Standard (press release)

NAIROBI, KENYA: Mr. Joe Mucheru, the ICT Cabinet Secretary, has urged the private sector to bolster investment in cybersecurity to curb the growing incidence of cybercrime targeting Kenya's digital economy.

Though a recent report by Jumia Business Intelligence and GSMA Mobile showed that Kenya leads Africa in internet penetration, with over 30 million people having access to the internet, the positive trajectory has been accompanied by a sharp rise in cybercrime targeting financial institutions and mobile money transaction platforms.

Mr. Mucheru urged the private sector to invest in cybersecurity infrastructure to complement ongoing Government efforts to curb the vice, citing investment opportunities for players in the internet security space to ensure security infrastructure matches current threat trends.

"We have the Computer and Cybercrimes Bill, 2016, which is headed to parliament. We want to introduce stiffer penalties for cybercrime and online corporate espionage," said Mr. Mucheru when he presided over a funds drive to improve the infrastructure of Lenana School.

The Kenya Cybersecurity Report 2016, published by Serianu Limited, estimates that about 44 percent of financial institutions run on a paltry cybersecurity budget of $1-1,000 annually, while about 33 percent of financial institutions in Kenya spend nothing on cybersecurity at all.

"With more than 75.3 percent of Kenyan citizens formally included in financial services, one would logically expect a corresponding increase in cybersecurity investments in the financial services sector. Regrettably, the opposite is the case for Kenyan banks," said Teddy Njoroge, the ESET East Africa Country Manager, during the recent Connected Summit 2017.

Mr. Mucheru said jobs were moving online with the freelancing economy in the United States clocking US$1 trillion and about 34 percent of Americans working online.

"The very essence of introducing Ajira Digital in partnership with the Rockefeller Foundation and the Kenya Private Sector Alliance was to tap online job opportunities for the youth. We will not relent on this initiative because of challenges posed by cybercrime," he added.

Over one-third of organizations that experienced a breach in 2016 reported substantial customer, opportunity, and revenue loss of more than 20 percent, according to the Cisco 2017 Annual Cybersecurity Report.

In the report, Chief Security Officers cited budget constraints, poor compatibility of systems, and a lack of trained talent as the biggest barriers to advancing their security postures. Business leaders also revealed that their security departments are increasingly becoming complex environments with 65 percent of organizations using between six and 50 security products, thus raising the potential for security effectiveness gaps.

The Cabinet Secretary urged local businesses to focus on opportunities in addressing potential challenges around the Internet of Things (IoT) and mobile communications, which are fast becoming new targets of attack by cybercriminals.

"In the next one year we expect that over 40 million new devices, mainly smartphones, will be imported into the country. All of these are potential new targets, especially if users are not aware of the cyber risks," he explained.

ESET East Africa, which specialises in internet security, notes that effective infrastructural cybersecurity measures come at a budgetary cost, which must be respected by C-suite executives if organisations are to tame the constantly evolving cyber threat landscape.

While it is laudable that up to 63 percent of financial organisations in Kenya have an in-house cybersecurity department, the Serianu 2016 report indicates that only 29 percent of the employees within those departments hold cybersecurity training certificates.

Go here to see the original:
Mucheru urges private sector to boost investment in internet security - The Standard (press release)

Read More..

CipherLoc Releases Encryption Performance White Paper – Yahoo Finance

Company's Innovative Approach makes Encryption Processing Faster

AUSTIN, TX / ACCESSWIRE / April 25, 2017 / CipherLoc Corporation (CLOK), a leading provider of highly secure data protection technology, today announced the availability of a new white paper, "An Analysis of Encryption Performance," which summarizes how CipherLoc's technology, designed to make encryption faster, safer, and massively scalable, can reduce the latency penalty associated with deploying encryption algorithms.

The foundation of modern security relies on encryption technology to protect sensitive data and maintain user privacy. However, it is a well-known fact that encryption technology comes with a price: increased latency. These latencies get worse when the key sizes are increased to stave off the continued advances in computing horsepower. This is one of the reasons why encryption has not been ubiquitously deployed despite the many security and privacy benefits it can provide.

The just-released white paper takes the most widely used encryption algorithm on the market today, the Advanced Encryption Standard (AES), and analyzes its performance both when running in standalone mode (i.e., without the CipherLoc utility included) and when running with the CipherLoc acceleration utility enabled. The white paper also compares how a highly secure AES algorithm enabled with CipherLoc's utility performed vis-à-vis several other well-known encryption algorithms.
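The kind of standalone-versus-accelerated comparison the white paper describes can be reproduced with a small timing harness. The sketch below is an assumption about the general methodology, not CipherLoc's actual test code, and uses a stand-in XOR transform since AES is not in the Python standard library:

```python
import os
import timeit

def baseline_encrypt(data: bytes, key: bytes) -> bytes:
    # Stand-in transform (repeated-key XOR) used only to exercise the
    # harness. A real benchmark would call an AES implementation here.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def benchmark(encrypt, data, key, runs=5):
    """Return the best wall-clock time in seconds over several runs."""
    return min(timeit.repeat(lambda: encrypt(data, key), number=1, repeat=runs))

payload = os.urandom(64 * 1024)   # 64 KiB test payload
key = os.urandom(32)              # 256-bit key

t = benchmark(baseline_encrypt, payload, key)
print(f"baseline: {t * 1000:.2f} ms for {len(payload)} bytes")
```

The same `benchmark` call applied to an accelerated encrypt function would give the two numbers needed to compute the percentage improvement the paper reports.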

The white paper shows how the latency penalty associated with the AES encryption process is dramatically reduced when that same algorithm is accelerated using CipherLoc's innovative encryption utility. Depending on the file size and underlying hardware, performance improvements of up to 50% can be seen which in turn leads to better utilization, higher efficiency, and greater scalability when deploying encryption technology.

CipherLoc's solution is not a new algorithm but rather a utility that can be added on top of any existing cryptographic encryption algorithm. Once applied, the CipherLoc utility not only improves the underlying encryption processing speed, it simultaneously strengthens the data protection security. It also makes encryption algorithms more scalable by obviating the need to constantly increase the key size to maintain current levels of security. CipherLoc also simplifies the key management infrastructure by using one-time-use ephemeral keys that do not require key storage. In short, CipherLoc's easy-to-deploy technology makes encryption processing faster, stronger, and massively scalable.
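The ephemeral-key claim rests on a standard idea: derive a fresh key per message from a shared secret and a random nonce, so only the nonce travels alongside the ciphertext and no per-message key is ever stored. A minimal stdlib sketch of that derivation step (an illustration of the general technique, not CipherLoc's actual scheme):

```python
import hashlib
import hmac
import os

def ephemeral_key(shared_secret: bytes, nonce: bytes) -> bytes:
    # One-time key derived per message and discarded after use. Both
    # sides recompute it from the shared secret plus the transmitted
    # nonce, so no per-message key ever needs to be stored.
    return hmac.new(shared_secret, nonce, hashlib.sha256).digest()

secret = os.urandom(32)   # long-term shared secret (illustrative)
nonce = os.urandom(16)    # sent in the clear with the ciphertext
key = ephemeral_key(secret, nonce)
assert len(key) == 32     # 256-bit one-time key
```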

Testing results are available upon request at info@cipherloc.net or by contacting us via the Company's website at http://www.cipherloc.net.

About CipherLoc Corporation (CLOK)

CipherLoc Corporation is a data security solutions company whose vision is simple - Protect the World's Data. Our highly innovative solutions are based on our patented Polymorphic Cipher Engine which is designed to take existing encryption algorithms and make them better, faster, stronger, and massively scalable. We deliver easy-to-deploy software solutions that can be added to any existing product, service, or application. In short, we keep information safe in today's highly dangerous world. For further information, please go to http://www.cipherloc.net

Read More

Go here to see the original:
CipherLoc Releases Encryption Performance White Paper - Yahoo Finance

Read More..

Encrypted Chat Took Over. Let’s Encrypt Calls, Too – WIRED

As end-to-end encrypted messaging apps have exploded in popularity, several well-known services have added encrypted calls as well. Why not, right? If it works for text-based chat, voice seems like a natural extension. If only it were that easy.

Encrypting calls has plenty of value, keeping conversations strictly between the two parties and out of reach of government wiretaps or criminal snooping. But a host of technical challenges with facilitating the calls themselves has slowed the spread of voice over internet protocol overall. Bandwidth is expensive. Firewalls and network filters make it harder to route data streams. Even basic call quality issues, like delays and echoes, prove difficult to fix. Adding encryption on top of all of this takes additional resources and specialized developers.

All of which has delayed encrypted callingbut not stopped it. And a new groundswell of enthusiasm is bringing more options than ever.

The challenges of making reliable encrypted calling start with the underlying premise of internet-based calls: they're hard. While VoIP calling has become more reliable over the years, it remains technically challenging in itself, especially when people use cellular data instead of more stable ethernet or Wi-Fi connections.

Despite those challenges, Signal, the well-regarded secure communication platform, has offered encrypted calling since 2014. And when WhatsApp followed in 2016, bringing encrypted calls and video chat to more than a billion users, it helped shake off some longstanding inertia. Other secure messaging apps like Wire and Telegram have added encrypted calling over the last year. Signal itself even rolled out call quality improvements in February.

Signal developer Open Whisper Systems open-sources its code, so that companies can borrow from it to build their own encrypted chat and calling features. For example, while WhatsApp's overall setup is proprietary, it bases the key exchange for its end-to-end encrypted messages and calls on Signal Protocol. Its users have to trust that it is implementing true end-to-end encryption in the way it claims. In exchange, it brings some form of end-to-end encryption to an enormous user base that would probably otherwise have little exposure to or protection from the feature. And customers who don't have faith in a large provider like WhatsApp now have other options, given the recent proliferation of both VoIP in general and encryption specifically.
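Signal Protocol's key agreement is built on Diffie-Hellman exchanges. The toy finite-field version below shows only the core idea of two parties deriving a shared secret without ever transmitting it; the real protocol (X3DH over Curve25519) is far more involved, and the parameters here are illustrative, not production-safe:

```python
import secrets

# Toy parameters for illustration only; real deployments use
# standardized groups or elliptic curves such as Curve25519.
P = 2**127 - 1   # a Mersenne prime, large enough to show the idea
G = 3

alice_secret = secrets.randbelow(P - 2) + 1   # never leaves Alice
bob_secret = secrets.randbelow(P - 2) + 1     # never leaves Bob

alice_public = pow(G, alice_secret, P)   # safe to send in the clear
bob_public = pow(G, bob_secret, P)       # safe to send in the clear

# Each side combines its own secret with the other's public value;
# (G^a)^b == (G^b)^a mod P, so both arrive at the same shared key.
alice_shared = pow(bob_public, alice_secret, P)
bob_shared = pow(alice_public, bob_secret, P)
assert alice_shared == bob_shared
```

An eavesdropper who sees only `alice_public` and `bob_public` cannot feasibly recover the shared value, which is why the exchanged keys can travel over an untrusted network.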

"There's so much happening right now in this space which is really exciting," says Nathan Freitas, the founder and director of the Guardian Project, a privacy and security nonprofit that worked on an encrypted calling platform called Open Secure Telephony Network. "In 2012 there was just Skype, basically. Google Hangouts didn't even exist. FaceTime existed, kind of. So we're really happy when there's so much public innovation that includes privacy and security."

Though not nearly as much as there could be, if everyone could get on the same page.

As with messaging, end-to-end encrypted calls require that both ends of the conversation use the same system. In other words, using Signal to call a landline won't cut it; you need to connect with another Signal user. Given this reality, many developers naturally gravitate toward implementing encryption in closed systems; it's easier both to manage and to monetize.

For users, though, this approach has downsides. Unless the developer makes the product fully open source, or allows for extensive independent auditing, there's no guarantee that the encryption implementation works as advertised. The lock-in factor also limits who you can safely communicate with, which slows adoption.

Imagine, instead, an open communication standard that includes end-to-end encryption. It would allow secure communication with more people across different products and interfaces, because the protocols facilitating the end-to-end encryption would be the same.

The Guardian Project's OSTN experiment attempted to create exactly that sort of comprehensive, open communication suite. It focuses on using existing open, interoperable communication standards, employing classic protocols like ZRTP, which was developed in the mid-2000s by PGP creator Phil Zimmermann, and SRTP, which was developed in the early 2000s at Cisco. It also coordinates and controls its voice calls using the Session Initiation Protocol, developed by the telecom industry in the mid-1990s.

That retro backbone didn't come by choice; there simply aren't a lot of more modern open protocol options available. Most big VoIP-plus-encryption advances have come from private companies like Skype (now owned by Microsoft), Google, and Apple, who offer varying degrees of encryption protection for calls and tend to value locked-in users over interoperability. That left OSTN with old tools.

"While they're very powerful, these are things that are 10, 20, 30 years old in terms of the architecture and the thinking," Freitas says. "They're definitely showing their age."

And while a few smaller services, like PrivateWave and Jitsi, have adopted OSTN, the decision by larger companies to go it alone has limited its open-protocol dreams. That's especially a shame for people who need absolute guarantees of security.

With proprietary apps, it can be hard for a user to tell if end-to-end encryption is enabled on both ends. Or, in the case of apps whose encryption protocols have not been fully vetted, whether it works as advertised to begin with.

"For mainstream services, crypto is a nice add-on to give users the idea that they can feel more secure, but that's completely different than when your [customers] are people who are under threat," says Bjoern Rupp, the CEO of the boutique German secure communication firm CryptoPhone. "If you have to fear for your life, not all secure communication systems are designed for that."

Encryption die-hards can host their own system using open standards like OSTN, similar to how you might host your own email server. Though it takes some technical know-how, it's an option that gives users real control and that isn't possible with closed systems. Another option is to use a security-first service like CryptoPhone that offers an integrated, one-stop solution.

CryptoPhones can only call other CryptoPhones, but the company made that choice so it could control the security and experience of both hardware and software. To reconcile this closed system with transparency, its software is open source and invites independent review. It also has over a decade of experience. "CryptoPhone has been making high-end commercial products for secure voice calling for a long time," the Guardian Project's Freitas says. "They had these crypto flip phones, which were awesome."

None of which leaves the average consumer with widespread encrypted calling that works across multiple services. There may be some help on the way, though, in the form of a new, open, decentralized communication standard called Matrix that includes end-to-end encryption for chat, VoIP calling, and more. Matrix could be a clean, easy-to-implement standard underlying other software. For instance, if Slack and Google Hangouts both used the Matrix standard, you would be able to Slack someone from Hangouts and vice versa, similar to how you can send emails to anyone using their email address, regardless of what provider they use.
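The interoperability Matrix promises comes from a published HTTP client-server API: any client that can form the same request can talk to any homeserver. The sketch below only constructs the URL and JSON body for a basic `m.room.message` event, since the homeserver, room ID, and access token would all be deployment-specific:

```python
import json
from urllib.parse import quote

def build_message_request(homeserver: str, room_id: str,
                          txn_id: str, text: str):
    """Build the URL and JSON body for a Matrix m.room.message event.

    The actual request would be an authenticated HTTP PUT; here we
    only construct it, as the server details are hypothetical.
    """
    url = (f"{homeserver}/_matrix/client/v3/rooms/{quote(room_id)}"
           f"/send/m.room.message/{quote(txn_id)}")
    body = json.dumps({"msgtype": "m.text", "body": text})
    return url, body

url, body = build_message_request(
    "https://example.org",            # hypothetical homeserver
    "!room:example.org", "txn1", "hello from another client")
print(url)
```

Because every Matrix-speaking service accepts this same event shape, two products built on the standard can exchange messages without any bespoke bridge between them.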

"The net owes its existence to open interoperability," says Matthew Hodgson, technical lead of Matrix. "Then people build silos to capture value, which is fair enough, but you get to a saturation point where the silos start really stifling innovation and progress through monopolism."

The catch, of course, is getting buy-in from companies that have little incentive, or getting new services built on a standard like Matrix to take off. Walled gardens tend to produce more profit than open ones.

Still, having these new options is an important first step. And combined with the broader proliferation of encrypted voice-calling apps, change finally seems to be coming from a lot of directions at once. "I think there's a longer-term project going on called the internet," Freitas says. "Some of us still believe in it."

Originally posted here:
Encrypted Chat Took Over. Let's Encrypt Calls, Too - WIRED

Read More..