
Genesis Mining & CryptoCurrency Steemit


Link:
Genesis Mining & CryptoCurrency Steemit

Read More..

UNODC launches cryptocurrency investigation training program – EconoTimes

Friday, May 12, 2017 5:53 AM UTC

The United Nations Office on Drugs and Crime (UNODC), the agency dedicated to fighting drug trafficking and organized crime, announced that it has developed a leading Cryptocurrency Investigation Train-the-Trainers course.

UNODC said the course, the first on cryptocurrency investigations, was delivered in recent weeks. The Cryptocurrency Investigation Training course brought together law enforcement experts from 22 nations, as well as UNODC regional staff, who learned about the business profile and global ecosystem of cryptocurrencies, including bitcoin and Ethereum.

"The training was a unique program where I could not only understand the conceptual framework of cryptocurrency but also get familiar with the ongoing criminal activities using bitcoins and the nuances of their investigation, including bitcoin-facilitated illicit international trafficking of drugs and weapons," a participant from India highlighted, presenting an overview of the training.

Practitioners on the course gained first-hand information about how to conduct bitcoin tracing as part of a wider financial investigation, where to obtain information, and how to collaborate internationally on casework. The training course also focused on developing a new set of skills for participants, including understanding the cryptocurrency concept and cooperating internationally on cryptocurrency cases.
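
At its core, bitcoin tracing rests on the fact that every transaction's outputs publicly name the addresses receiving its value, so funds can be followed hop by hop through the chain. Below is a minimal sketch of that one-hop step using the public blockchain.info raw-transaction endpoint; the URL shape and JSON field names are assumptions to verify against the service's current documentation, and a real investigation would use dedicated tooling such as Chainalysis.

```python
# One hop of transaction-graph tracing: list where a transaction's outputs
# went. Uses the public blockchain.info raw-transaction endpoint; URL shape
# and JSON field names should be verified against current documentation.
import requests

def spent_outputs(txid: str):
    """Yield (address, value_in_btc) for each output of a transaction."""
    tx = requests.get(f"https://blockchain.info/rawtx/{txid}", timeout=10).json()
    for out in tx.get("out", []):
        addr = out.get("addr", "<non-standard script>")
        yield addr, out["value"] / 1e8  # values are reported in satoshis

if __name__ == "__main__":
    for addr, btc in spent_outputs("<transaction-hash>"):  # substitute a real txid
        print(f"{btc:.8f} BTC -> {addr}")
```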

For the Cryptocurrency Investigation Training course, UNODC partnered with cryptocurrency industry leaders like Chainalysis Inc. in order to assist law enforcement officers and analysts to trace illegal financial flows.

"We partnered with UNODC on creating the comprehensive train-the-trainers program on analysis, tracing and investigation of cryptocurrencies. It is a highly important topic: the use of virtual currencies is steadily growing, and today analysts, law enforcers and prosecutors need training to collect and disseminate data gathered by exchanges and brokers, thus supporting the expertise of national agencies in preventing misuse of this innovative technology for criminal purposes," Michael Gronager, CEO of Chainalysis, stated.

The second course was focused on the analysis of cryptocurrency transactions, chokepoints investigation, bitcoin AML framework and case studies.

See original here:
UNODC launches cryptocurrency investigation training program - EconoTimes

Read More..

Decreased Risk And Latency With Cloud Storage In Space – Business Solutions Magazine

By Cliff Beek, President, Cloud Constellation Corporation

In today's nanosecond environment, getting important data from Point A to Point B as quickly as possible can be the difference between success and failure for business as well as for government. Transmitting this data requires a series of communications hubs that are working at optimal levels. This may mean several seconds of latency, a definite competitive disadvantage. However, new advances in technology give VARs and Cloud Service Providers an opportunity to show their customers a more efficient way.

Standard communications architecture slows the flow of information with its multiple hops and interchanges through terrestrial networks. Simultaneously, this method exposes the data to monitoring and manipulation along the way. Even the most efficient cloud network requires third-party data centers to replicate globally to provision worldwide offices effectively.

This is the current state of affairs. But imagine data could be securely transmitted from a single corporate network hub to any location worldwide in less than a second. The solution to this problem will be found through new space-based data center technology, creating a telecom backbone around the globe. Such a network will allow data to flow freely around the world without restriction and without fear of interception, enabling CIOs to virtually provision any remote office in less than one third of a second, regardless of proximity, without any latency, jurisdictional or cybersecurity issues.

An independent space-based network infrastructure for cloud service providers and their enterprise and government customers is now possible. This new paradigm would allow customers to experience secure storage and provisioning of sensitive data around the world. By placing data on satellites accessible from everywhere via ultra-secure dedicated terminals, many of today's data transport challenges will be solved. Space-based storage offers a convenient solution to the issues of both security and jurisdiction while offering unprecedented transit speed.

This is not a completely new idea, but rather a re-envisioning of current practices. Organizations and government entities already enjoy the communications benefits of the satellites ringing the earth. Using the technologies that would enable space-based cloud storage, they can enjoy even faster and more secure communications and offer services that would not otherwise be possible.

Delivery of drone audio and video becomes much faster with space-based network infrastructure. At present, there is a latency of more than two seconds in the delivery of real-time drone video. Like driving a car with a two-second blinder, maneuverability and agility are constrained. Using a sky-based telecom system, latency will be reduced to less than one second.
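
As a rough sanity check on these latency figures, raw propagation delay is simply path length divided by the speed of light. The sketch below uses an illustrative ~20,000 km satellite relay path, an assumption rather than Cloud Constellation's actual orbital design, and contrasts it with a long fibre route.

```python
# Back-of-envelope propagation delay: distance divided by the speed of light.
# The 20,000 km relay path is an illustrative assumption for a ground ->
# satellite constellation -> ground route, not an actual design figure.
C_KM_PER_S = 299_792  # speed of light in vacuum, km/s

def one_way_ms(distance_km: float) -> float:
    return distance_km / C_KM_PER_S * 1_000

print(f"space relay path: ~{one_way_ms(20_000):.0f} ms one way")
# Light in fibre travels at roughly c/1.5, before any router queuing is added.
print(f"intercontinental fibre (~15,000 km): ~{one_way_ms(15_000) * 1.5:.0f} ms "
      "plus delay at every routing hop")
```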

This is great news not just for drone data: the space-based network would expedite 4K HDTV between two live audiences as well. Currently, studios employ parlor tricks to mimic live two-audience interaction, and video error correction must be applied at each server stop in both directions to meet the studios' demanding 4K HDTV specifications. Using a space-based system, latency would be reduced to about one second and require just one video error correction at the end. Studios would be enabled for true live audience interaction, a major market differentiator.

In addition, video streaming services would be able to bypass congested, expensive networks, and expedited live video delivery would become a reality. Another beneficial development is that Cloud Service Providers will be able to sell services for competitive expansion without adding more capital or operational expenditures.

E.T.: The Extraterrestrial Solution

Today's data communications strategies would have seemed miraculous just a few years ago, yet ongoing technology trends create demand that makes those strategies inadequate now. Latency and security risks abound. However, using a space-based approach removes the middleman, as it were, enabling enterprises and governments to have the speed and privacy they need to succeed. It's an extraterrestrial offering VARs and CSPs can add to their solutions mix, creating differentiation and cost savings.

Cliff Beek is the president of Cloud Constellation Corporation. He has extensive experience with the management and financing of equity-backed ventures within areas of satellite, mobile broadband, mobile app development, and cloud infrastructure entities. Beek founded Star Asia Technologies and Laser Light Communications and served as the EVP at CoCo Communications. He holds an MBA from the Wharton School, University of Pennsylvania. Cliff can be reached online at @Cliff_Edges.

Go here to see the original:
Decreased Risk And Latency With Cloud Storage In Space - Business Solutions Magazine

Read More..

Microsoft blurs the line between desktop and cloud with OneDrive update – TechCrunch

At Microsoft's Build 2017 event today, the company introduced a new feature for its OneDrive cloud storage service that will allow you to access your files without using up device storage. Called Files On-Demand, the feature will roll out along with the Windows 10 Fall Creators Update and will give users more control over which files are stored locally versus in the cloud.

The feature is basically Microsoft's own take on Dropbox's Smart Sync, as it lets you view all the files and folders directly in File Explorer, instead of only seeing those you've opted to sync to your device.

The idea is that the more you use online storage (creating files, uploading photos and working across multiple devices), the more you need to be able to see all your documents, and not worry about whether you had forgotten to sync something to your current device.

But while Dropbox's Smart Sync was introduced as a feature for business users, OneDrive's Files On-Demand supports your personal and work OneDrive accounts, as well as SharePoint Online team sites.

In addition, your OneDrive files won't just be visible on your desktop; they'll also be shown in the file picker when you're using a Windows Store application. When you select a given file from the picker, it will automatically download and open in the app like a local file would.

The user interface in Windows 10 File Explorer will also be updated so you can better see which files are available on the device and in the cloud, thanks to the use of a small cloud icon in a column labeled Status. You'll still be able to tell how much storage space the file will require if you choose to download it, as its size information is available in Explorer as well, even if the file isn't yet on your device.

You can also choose to make certain folders or files always available offline with a right-click, then choosing Always keep on this device from the menu.

However, you generally won't have to go through this process; it's only really necessary if you know you'll be spending a lot of time offline and don't want to lose access to your more critical folders and files. Most times, while you're connected to the internet, you can just double-click files as usual to open them in an app, and the file will automatically download to your device.
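
For readers who want to see which state a file is in programmatically rather than via the Status column, the Windows-only sketch below reads the file-attribute bits that mark cloud placeholders. The two attribute constants are the documented Win32 values for placeholder behaviour, but treat them, and their exact semantics on a given Windows build, as assumptions to verify.

```python
# Windows-only sketch: read the attribute bits that mark OneDrive cloud
# placeholders. The constants are the documented Win32 values, but verify
# them (and their semantics on your Windows build) before relying on this.
import ctypes

kernel32 = ctypes.windll.kernel32
kernel32.GetFileAttributesW.restype = ctypes.c_uint32

FILE_ATTRIBUTE_OFFLINE = 0x00001000
FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS = 0x00400000  # contents fetched on first read
INVALID_FILE_ATTRIBUTES = 0xFFFFFFFF

def is_cloud_placeholder(path: str) -> bool:
    attrs = kernel32.GetFileAttributesW(path)
    if attrs == INVALID_FILE_ATTRIBUTES:
        raise OSError(f"cannot read attributes for {path}")
    return bool(attrs & (FILE_ATTRIBUTE_OFFLINE | FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS))

print(is_cloud_placeholder(r"C:\Users\me\OneDrive\report.docx"))  # placeholder path
```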

The feature has another benefit for larger organizations using SharePoint Online team sites: it will reduce network bandwidth consumption by cutting the continual syncing that occurs today when changes are made.

At present, when anyone makes a change, files are re-downloaded on all synced devices. This is no longer necessary, as the files in their updated format will be available whenever the user clicks them.

While Files On-Demand works well on desktop devices, on mobile, connectivity is often more of an issue. Microsoft addressed this by introducing a new option that will allow you to save entire folders to your mobile device, so you can open the files they contain once you have a connection again. Changes others make to those files while you're offline will be automatically updated once you're connected again.

This feature is available now on Android for Office 365 Personal and Home subscribers and OneDrive business accounts. iOS users will receive the update in a few months.

However, iOS users can today use OneDrive with iMessage. You can share an entire folder or file directly within the iMessage interface. These documents and photos can also be instantly previewed there, through this new integration.

Read more from the original source:
Microsoft blurs the line between desktop and cloud with OneDrive update - TechCrunch

Read More..

Microsoft is on the edge: Windows, Office? Naah. Let’s talk about cloud, AI – The Register

Build At the Build 2017 developer conference today, Microsoft CEO Satya Nadella marked a Windows milestone, 500 million monthly active users, and proceeded to say very little about Windows or Office.

Instead he, along with Scott Guthrie, EVP of the Microsoft Cloud and Enterprise Group, and Harry Shum, EVP of Microsoft's Artificial Intelligence and Research group, spent most of their time on stage, in Seattle, talking about Azure cloud services, databases, and cross-platform development tools.

Arriving on stage to give his keynote address, Nadella said in jest that he thought it would be an awesome idea on such a sunny day "to bring everyone into a dark room to talk about cloud computing."

Office and Windows can wait.

Microsoft watchers may recall that its cloud-oriented businesses have been doing well enough to deserve the spotlight. In conjunction with the company's fiscal second quarter earnings report in January, the Windows and Office empire revealed that Azure revenue grew 93 per cent year-on-year.

During a pre-briefing for the press on Tuesday, Microsoft communications chief Frank Shaw described "a new worldview" for the company framed by the "Intelligent Edge" and the "Intelligent Cloud."

Nadella described this newborn weltanschauung as "a massive shift that is going to play out in the years to come."

He mused about a software-based personal assistant to illustrate his point. "Your personal digital assistant, by definition, will be available on all your devices," he said, to make the case that the centralized computing model, client and server, has become outmoded. Data and devices are dispersed.

In other words, all the data coming off connected devices requires both local and cloud computing resources. The revolution will not be centralized.

That could easily be taken as reheated Cisco frothing about the explosive growth of the Internet of Things and bringing processing smarts to the edge of the network. But Microsoft actually introduced a new service that fit its avowed vision.

Microsoft's bipolar worldview, the Intelligent Edge and the Intelligent Cloud, manifests itself in a novel "planet scale" database called Azure Cosmos DB. It's a distributed, multi-model database, based on the work of Microsoft researcher Leslie Lamport, that promises to make data available locally, across Microsoft's 34 regions, while also maintaining a specified level of consistency across various instances of the data.
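
As a concrete illustration of that tunable consistency, the sketch below connects with the azure-cosmos Python SDK (a v4 client that postdates the Build 2017 announcement) and requests Session consistency explicitly; the endpoint, key and database/container names are placeholders.

```python
# Sketch with the azure-cosmos Python SDK (v4): connect with an explicit
# consistency level, one of Cosmos DB's tunable guarantees. The endpoint,
# key, and database/container names are placeholders.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(
    "https://my-account.documents.azure.com:443/",
    credential="<primary-key>",
    consistency_level="Session",  # Strong, BoundedStaleness, Session, ConsistentPrefix, Eventual
)
db = client.create_database_if_not_exists("telemetry")
container = db.create_container_if_not_exists(
    id="events", partition_key=PartitionKey(path="/deviceId")
)
container.upsert_item({"id": "evt-1", "deviceId": "dev-42", "reading": 21.5})
```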

An Intelligent Meeting demonstration, featuring Cortana, showed how AI has the potential to exchange and coordinate data across multiple services. But "potential" requires developer work: it will take coding to create the Cortana Skills necessary to connect the dots and manage the sort of cross-application communication that knowledge workers accomplish today through application switching, copying, and pasting.

Conveniently, the Cortana Skills Kit is now in public preview, allowing developers to extend the capabilities of Microsoft's assistant software to devices like Harman Kardon's Invoke speaker.

Beyond code, it will take data associated with people and devices in an organization to make those connections. That's something Microsoft, with its Azure Active Directory, its Graph, and LinkedIn, has in abundance.

A demonstration of real-time image recognition to oversee a construction worksite showed how a capability like image recognition might be useful to corporate customers. Cameras spotted unauthorized people and located requested equipment on-site. It looked like something companies might actually find useful.

Artificial intelligence as a general term sounds like naive science fiction. But as employed by Microsoft, it refers to machine learning frameworks, natural language processing, computer vision, image recognition or the like.

"We believe AI is about amplifying human ingenuity," said Shum.

Microsoft's concern is convincing developers and corporate clients to build and adopt AI-driven applications using Microsoft cloud computing resources, rather than taking their business to AWS or Google Cloud Platform.

One way Microsoft hopes to achieve that is by offering cloud computing outside the cloud, on endpoints like IoT devices. The company previewed a service called Azure IoT Edge to run containerized functions locally. It's a way of reducing latency and increasing responsiveness, which matters for customers like Sandvik.

The Swedish industrial automation biz has been testing Azure IoT Edge to anticipate equipment failure in its workplace machines, in order to shut them down before components break, causing damage and delays.
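
In that spirit, the sketch below shows the general shape of a local anomaly gate an Azure IoT Edge module might run: score telemetry on the device and raise a shutdown alert without a cloud round-trip. It uses the later azure-iot-device Python SDK, and the input/output names and vibration threshold are illustrative assumptions, not Sandvik's implementation.

```python
# Sketch of a local anomaly gate running as an Azure IoT Edge module:
# telemetry is scored on the device, and a shutdown alert is raised without
# waiting on the cloud. Input/output names and the threshold are illustrative.
import json
from azure.iot.device import IoTHubModuleClient, Message

VIBRATION_LIMIT_MM_S = 4.0  # illustrative threshold, not a Sandvik figure

def main():
    client = IoTHubModuleClient.create_from_edge_environment()
    while True:
        msg = client.receive_message_on_input("telemetry")  # blocking receive
        reading = json.loads(msg.data)
        if reading.get("vibration", 0.0) > VIBRATION_LIMIT_MM_S:
            # Local decision: flag the machine for shutdown before parts break.
            alert = Message(json.dumps({"machine": reading["id"], "action": "stop"}))
            client.send_message_to_output(alert, "alerts")

if __name__ == "__main__":
    main()
```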

See the original post here:
Microsoft is on the edge: Windows, Office? Naah. Let's talk about cloud, AI - The Register

Read More..

Cloud Atlas: How to Accelerate Application Migrations to the Cloud – Talkin’ Cloud

It's a common misconception for people to imagine that business applications can be beamed up, Star Trek style, into the cloud, and that the IT team just needs to press a few buttons and whoosh, the migration is done. If only it were that easy.

In the first place, it's important to note that there are some applications that should not, or cannot, be moved. Legacy applications may be difficult to virtualize, requiring significant development work before they can be migrated. Some applications may be sensitive to latency, so for performance reasons they should stay on-premise. Others may be governed by regulations which prohibit their moving outside of a given jurisdiction or geographic region. Despite these constraints, we've found through working with large enterprise organizations that around 85 percent of applications can potentially be migrated to the cloud.

But then there are multiple challenges which need to be addressed if the migration is to be done smoothly and securely. First, the application's existing network flows need to be mapped, so that the IT team knows how to reconnect the application's connectivity post-migration. This is extremely hard to do in complex environments. There's usually little to no up-to-date documentation, and attempting to understand the requirements and then painstakingly migrate and adjust every firewall rule, router ACL and cloud security group to the new environment manually is an extremely time-consuming and error-prone process. A single mistake can cause outages, compliance violations and holes in the business's security perimeter.

Just how long could this process take? In AlgoSec's experience, an experienced consultant can manually map around one application per day, or five per week, depending on the number of network flows in the application and its complexity. This means a team of five consultants would take around a year to map the 1,200 applications in a typical large enterprise. If the organization does have good documentation of its applications, and an accurate configuration management database, it may be possible to cut this time by 50 percent.
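
The arithmetic behind that estimate is easy to check; the sketch below simply restates the figures given above.

```python
# Checking the mapping-effort estimate from the stated figures.
apps = 1_200
apps_per_week = 5 * 5           # 5 consultants x 5 apps per week each
weeks = apps / apps_per_week    # 48 weeks, roughly a working year
print(weeks, weeks * 0.5)       # ~24 weeks with good documentation and a CMDB
```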

But given the work and time involved - not to mention cost - in mapping applications manually, some organizations may ask if they really need to do it before migration. The answer is definitely yes, unless they plan to move only one or two applications in total and can afford to manage without those applications for hours or days, in the likely event that a problem occurs and connectivity is disrupted. Having comprehensive maps of all the applications that need to be migrated is essential: this atlas of connectivity flows shows the way forward to smooth, secure cloud migrations.

With an atlas of existing connectivity maps, organizations can tackle the migration process itself. This can be done manually using the APIs and dashboards available on all cloud platforms, but it's slow work, and it's all too easy to make costly mistakes. Some cloud service providers offer native automation tools, but these often only address the cloud provider's environment, and they don't provide visibility, automation or change management across your entire estate. Even some third-party cloud management tools which are capable of spanning multiple clouds will not necessarily cover your on-premise networks.

The most effective way to accelerate application migrations is with an automation solution that supports both the existing on-premise firewall estate and the new cloud security controls, and can accurately define the flows needed in the new environment based on the atlas of existing connectivity flows, as well as the security and compliance needs of the new environment. In fact, the right automation solution can also discover and map your enterprise applications and their connectivity flows for you, without requiring any prior knowledge or manual configuration by security, networking or application teams.

Businesses can then use the solution to navigate through the actual migration process to the cloud, automatically generating the hundreds of security policy change requests that are needed across both the on-premise firewalls and cloud security controls. This dramatically simplifies a process that is extremely complex, drawn-out and risky, if attempted manually.
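
Conceptually, that generation step looks like the sketch below, which turns a discovered flow map into per-tier ingress rules. The flow records and rule shape are hypothetical; a real tool (AlgoSec's included) targets each firewall's and cloud provider's own API and rule formats.

```python
# Hypothetical translation step: discovered application flows in, per-tier
# ingress rules out. Record and rule shapes are illustrative only.
discovered_flows = [
    {"src": "10.1.0.0/24", "dst_tier": "web", "port": 443,  "proto": "tcp"},
    {"src": "web",         "dst_tier": "db",  "port": 5432, "proto": "tcp"},
]

def to_ingress_rules(flows):
    rules = {}
    for f in flows:
        rules.setdefault(f["dst_tier"], []).append({
            "IpProtocol": f["proto"],
            "FromPort": f["port"],
            "ToPort": f["port"],
            "Source": f["src"],  # a CIDR, or another tier's security group
        })
    return rules

for tier, ingress in to_ingress_rules(discovered_flows).items():
    print(tier, ingress)
```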

After the applications have been migrated, the automation solution should be used to provide unified security policy management for the entire enterprise environment, from a single console.

While there isn't yet a method for beaming applications up instantly into the cloud, automation makes the process both fast and relatively pain-free by eliminating time-sapping, error-prone manual processes, such as connectivity discovery and mapping, during the migration itself and in ongoing management. Automation helps organizations to boldly go where they haven't easily been able to go before.

About the Author

Edy Almer is responsible for developing and executing the company's product strategy. Previously Mr. Almer served as VP of Marketing and Product Management at Wave Systems, an enterprise security software provider, following its acquisition of Safend, where he served in the same role. Prior to Safend, Mr. Almer managed the encryption and endpoint DLP products within the Endpoint Security Group at Symantec. Previously he managed the memory cards product line at M-Systems prior to that company's acquisition by Sandisk in 2006. Mr. Almer's operational experience includes the launch of 3G services projects at Orange, Israel's fastest growing cellular operator, resulting in 100,000 new 3G customers within a year of its launch. As the CTO of Partner Future Comm, Mr. Almer developed the product and company strategy for potential venture capital recipient companies. Mr. Almer has a B.Sc. in Electrical Engineering and an MBA.

More:
Cloud Atlas: How to Accelerate Application Migrations to the Cloud - Talkin' Cloud

Read More..

Demands of IoT, Quantum and Cognitive Workloads Drive IBM’s Cloud Datacenter Expansion – Redmondmag.com

Datacenter Trends

The company is not only expanding its datacenters to meet growing individual demand, but to meet the computing burden of new and emerging technologies.

IBM just announced the opening of four new cloud datacenters in the U.S. -- two in Dallas, Texas, and two in Washington, D.C. The new facilities were designed to handle demanding cognitive workloads running on IBM's Bluemix cloud platform. Each facility has the capacity for thousands of physical servers and offers a range of cloud infrastructure services, including bare metal servers, virtual servers, storage, security services, and networking, the company said.

IBM's growing cloud datacenter network now extends across 19 countries and comprises 55 facilities.

Big Blue launched a strategic initiative to deploy its cloud datacenters in key local markets around the world about two years ago, and the new U.S. facilities are part of that strategy. Late last year, the company opened a cloud datacenter in Norway, which was the first in the Nordic region. It also opened cloud datacenters in Seoul, South Korea, and Chennai, India. The company reportedly plans to open four additional datacenters before the end of the second quarter, including two in London, one in Australia, and one in San Jose, Calif.

But a significant expansion effort planned for 2017 emphasizes enhancing the capabilities of existing facilities to accommodate growing demand for support of blockchain technology, quantum and cognitive computing, and the Internet of Things (IoT), said John Considine, GM of IBM's Cloud Infrastructure group, in a statement.

"This expansion is not about increasing the number of countries we operate in, but the capacities of the markets we're already in," explained Francisco Romero, VP of IBM's Cloud Infrastructure Operations group. "We're essentially growing with the demand for IBM's analytic and cognitive capabilities. As the number of clients leveraging those capabilities and the data sets grow, the demand for our infrastructure grows."

IBM is betting on growing demand for these technologies, and on the resulting demand for datacenters that can handle the higher-end workloads, to offset the costs of the added hardware.

"We're doing a lot of work within the datacenter to support technologies that are very AI-friendly at scale," Romero said. "We already have CPUs available in those datacenters that cognitive workloads take advantage of. And we continue to work with our hardware vendors to incorporate more and more cognitive-specific capabilities into the datacenter, but doing it at scale, so the total-cost-of-ownership equation works as well as possible for the overall business."

This is a bet IBM appears to be winning. In April, the company reported a 33% increase in revenue from its cloud services during the last quarter, and total cloud revenues of $14.6 billion over the past 12 months.

About the Author

John has been covering the high-tech beat from Silicon Valley and the San Francisco Bay Area for nearly two decades. He serves as Editor-at-Large for Application Development Trends (www.ADTMag.com) and contributes regularly to Redmond Magazine, The Technology Horizons in Education Journal, and Campus Technology. He is the author of more than a dozen books, including The Everything Guide to Social Media; The Everything Computer Book; Blobitecture: Waveform Architecture and Digital Design; John Chambers and the Cisco Way; and Diablo: The Official Strategy Guide.

See the original post:
Demands of IoT, Quantum and Cognitive Workloads Drive IBM's Cloud Datacenter Expansion - Redmondmag.com

Read More..

European tech unicorn OVH opens APAC HQ in Melbourne – The Australian Financial Review

OVH vice-chairman Laurent Allard says the business is growing rapidly in the Asia-Pacific region.

In a boost to the Victorian tech sector, French cloud infrastructure unicorn OVH is setting up an Asia-Pacific region headquarters in Melbourne to take on the likes of Amazon Web Services and Microsoft Azure, and it intends to employ up to 80 locals within three years.

The infrastructure-as-a-service company, which has more than 20 data centres in Europe, Canada and the US, has also built centres in Sydney and Singapore, as part of the company's expansion to the region.

OVH vice-chairman Laurent Allard told The Australian Financial Review while in Australia for the launch of the Asia-Pacific hub that the company's expansion into the region and the US was part of a bigger vision to become the global leader in cloud infrastructure-as-a-service.

"I'm not from this type of company background where the focus is on small and medium-sized businesses. I'm used to dealing with large enterprises and big transformation projects," he said.

"It was clear to me that the OVH model worked well but it was important to become the global leader, whereas at the time [in 2015] it was the European leader. We started out with no account manager for large accounts and had a very tech-driven company with gaps in terms of how to drive the company's strategy, but I had that expertise to bring and I'm very pleased with our complementary skill sets."

Until February 2015, Mr Allard had been the group chief technology officer of global IT and business process services provider CGI and until 2008 he was chief information officer of AXA Tech.

OVH, which already has 5000 customers in the Asia-Pacific region that are predominantly Australian businesses, was founded in 1999 by Octave Klaba with only $4000 and has remained a family-owned business in Roubaix, France.

Mr Klaba and his family still own 80 per cent of the business, which was valued at more than €1 billion in 2016.

The milestone valuation came after a €250 million capital raising led by New York-based private equity firms KKR & Co and TowerBrook Capital Partners.

OVH provides businesses with either public or hosted private cloud infrastructure, as well as bare metal cloud servers. But it considers its point of difference to be its ability to provide businesses with a hosted private cloud that works like a public one, and can be scaled up within 10 minutes to provide additional capacity during expected or unexpected busy periods.

The company said it selected Melbourne to be its Asia-Pacific region hub because of its talent pool and liveability, and OVH APAC expansion adviser Emmanuel Goutallier said the state government had actively engaged with the company ahead of the move.

"The state government has been extremely supportive. We started to engage with them mid last year and they have helped us understand the benefits of being in Melbourne and they've exposed us to something important the education community here in Melbourne, since we want to hire locally," he said.

"We have a strong in-house education program, but it needs to be complemented by external programs."

As well as establishing a presence in Australia, OVH will bring with it its Digital Launch Pad program. This is expected to be up and running by the end of June 2017 and will provide a range of free resources to start-ups on application, including up to $100,000 of cloud infrastructure support.

It is also in discussions with LaunchVic on how to support the local start-up ecosystem.

OVH joins a growing list of international tech companies which have set up offices in Melbourne, including Hired, Square, Slack and Zendesk.

Part of its decision to set up shop in Australia was to provide a local service to its customers here, fitting with the company's decision to create a wholly owned subsidiary for its US operations earlier this year, allowing it to operate in a way best suited to the US market.

"Our business is global, but we have to have a global-local approach. Technology can be a great generic asset, but the business solutions require proximity," Mr Allard said.

"We believe cloud is the way to build business solutions, but at the end of the day you need people on the ground and you need that proximity."

OVH is also in discussions with major telecommunications providers and cloud application businesses about establishing local partnerships in the Asia-Pacific region.

In the 2016 fiscal year OVH reported revenue of €320 million, and by 2020 the company wants to hit €1 billion in revenue. By the end of 2017, the company also expects to have grown to 27 data centres, which will likely include more centres in the Asia-Pacific region.

The company has also pledged to invest €1.5 billion in its services over the next five years.

See the article here:
European tech unicorn OVH opens APAC HQ in Melbourne - The Australian Financial Review

Read More..

Dell Refreshes PowerEdge Line for First Time in 3 Years – Virtualization Review

News

It's part of an effort to make servers more cloud ready.

If Dell wants to keep being a leading hardware vendor in the cloud age, it needs servers that can keep up with the needs of more demanding infrastructure.

Cloud computing -- whether public, private or hybrid -- puts more strain on the underlying systems than the familiar, traditional datacenter model where everything stayed on-premises. Dell, recognizing that need, has updated its lineup to handle the changing model of computing that can have files, storage, networking and compute anywhere.

At the core of the company's new lineup of datacenter offerings, outlined this week at Dell EMC World in Las Vegas, is an upgraded version of the flagship Dell EMC PowerEdge servers, the first developed by the newly-merged company.

The company kicked off the datacenter portion of the conference with the launch of its PowerEdge 14G servers (due out this summer), which are tied to the release of Intel's next-generation Xeon processors, code-named "Skylake Purley." It's the first refresh of the PowerEdge server line in three years and, in keeping with any refresh, the new systems offer the typical boosts in feeds and speeds. And while the PowerEdge refresh will appeal to anyone looking for the latest servers, the release is also the key component of the entire Dell EMC converged and hyper-converged systems portfolio, as well as new purpose-built appliances and engineered systems.

In addition to a new line of tower and rack-based servers, the PowerEdge 14G will be the core compute platform for the forthcoming Azure Stack system and a new portfolio of datacenter tools, including a new release of its NetWorker data protection offering and upgrades to the VxRail 4.5, VxRack and XC Series engineered systems (Windows Server, Linux and VMware, among others). "This is our 14th generation of servers, which is actually the bedrock of the modern datacenter," said David Goulden, president of Dell EMC, during the opening keynote session.

The new PowerEdge 14G servers will be available for traditional datacenter applications as well as Web-scale, cloud-native workloads. Among the key upgrades Dell EMC will deliver in the new PowerEdge server line are increased app performance and response times. The company claims the servers will offer a 19x boost in Non-Volatile Memory Express (NVMe) low-latency flash storage, single-click BIOS tuning that will allow for simplified and faster deployment of CPU-intensive workloads, and the ability to choose from a variety of software-defined storage (SDS) options.

"We knew we had to accelerate the workloads. We had to reduce the latency to make sure we have handled the performance to transform peoples' businesses," said Ashley Gorakhpurwalla, president of the Server Solutions division at Dell EMC. The server's new automatic multi-vectoring cooling allows a greater number of GPU accelerators, which the company claims can increase the number of VDI users by 50 percent.

In addition to the performance boost, company officials are touting a more simplified management environment. The servers will support the new OpenManage Enterprise console and an expanded set of APIs, which Dell EMC said will deliver intelligent automation. The company described the new OpenManage Enterprise as a virtualized enterprise system management console with a simple user interface that supports application plugins and customizable reporting. A new Quick Sync feature offers server configuration and monitoring on mobile devices. It boasts a 4x improvement in systems management performance over the prior version and can offer faster remediation with its ProSupport Plus and SupportAssist, which the company claims will reduce the time to resolve failures by up to 90 percent.

Dell EMC has also added some noteworthy new security capabilities embedded in the hardware that offer new defenses. They include SecureBoot, BIOS Recovery, signed firmware and an iDRAC RESTful API that conforms to Redfish standards. It also has better protection from unauthorized access control changes, with a new System Lockdown feature, and a new System Erase function that ensures all data is wiped from a machine when it is taken out of commission.
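
Because the iDRAC API conforms to Redfish, basic inventory and health checks use standard, vendor-neutral HTTPS paths. The sketch below assumes the conventional iDRAC system ID "System.Embedded.1" and placeholder credentials; confirm both against your firmware's documentation.

```python
# Redfish in practice: standard HTTPS paths for inventory and health.
# "System.Embedded.1" is the conventional iDRAC system ID; the address
# and credentials are placeholders to confirm against your firmware docs.
import requests

IDRAC = "https://192.0.2.10"  # placeholder management address

resp = requests.get(
    f"{IDRAC}/redfish/v1/Systems/System.Embedded.1",
    auth=("root", "<password>"),
    verify=False,  # iDRACs commonly ship with self-signed certificates
    timeout=10,
)
system = resp.json()
print(system.get("Model"), system.get("PowerState"),
      system.get("Status", {}).get("Health"))
```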

The new PowerEdge servers were part of a number of other key datacenter offerings announced by the company this week. "Our new 14G servers will be built into our full Dell EMC product portfolio, bringing out our seventh generation of storage and data protection as well," Goulden said.

The servers will be offered with a variety of the company's new software-defined enterprise storage systems, including a new version of the Dell EMC ScaleIO software-defined storage platform and upgrades to the company's Elastic Cloud Storage (ECS) platform. These include the ECS Dedicated Cloud Service, for hybrid deployments of ECS; ECS.Next, which will offer upgraded data protection and analytics; and the new Project Nautilus SDS offering, for storing and streaming IoT data. The servers will also power Dell EMC's new Ready Node portfolio, designed to transition traditional datacenters into cloud-scale infrastructure.

In addition to storage, Dell EMC said the PowerEdge 14G will power the company's new Open Networking switches, including what the company claims is a top-of-rack switch offering more than 2x the in-rack throughput speed of traditional 10GbE switches and a unified platform for network switching, as well as a new line for small and midsize organizations.

About the Author

Jeffrey Schwartz is editor of Redmond magazine and also covers cloud computing for Virtualization Review's Cloud Report. In addition, he writes the Channeling the Cloud column for Redmond Channel Partner. Follow him on Twitter @JeffreySchwartz.

See the rest here:
Dell Refreshes PowerEdge Line for First Time in 3 Years - Virtualization Review

Read More..

bizEDGE NZ – Breaking down cloud computing and its nuances … – bizEDGE NZ

You're the CIO of a large manufacturing company.

One of the IT goals for 2017 is to shift processes and programs to the cloud.

You've decided that the public cloud is a good option, eliminating the need for an internal private cloud build.

However, have you and the team considered what mix of cloud and cloud-related capabilities will work for the specific goals of the enterprise?

There is a persistent idea in the IT world and other business units that cloud equals 'good'.

But enterprises must be more deliberate in deciding if the technology that will allow for the best results is pure cloud.

Part of the problem is that the term 'cloud' has been used to encompass a wide variety of technology.

In fact, Gartner predicts that by 2019, 'cloud' will be a ubiquitous term like 'network' as business solutions assume use of public cloud as a common asset.

"Cloud computing represents one of the most misunderstood, yet valuable, innovations in current IT and business strategies," says Gartner vice president Daryl Plummer.

"However, the value of cloud computing is reduced by the inability of many end-user organisations and managed service cloud providers to sort through technology provider cloud options to find the correct mix of cloud and cloud-related capabilities that they need, he adds.

Misaligned expectations will cause many cloud projects to fail.

Cloud and cloud-related offerings range from cloud-enabling technologies to pure cloud choices.

CIOs should consider the spectrum of cloud capabilities to support their intended value propositions.

Cloud-enabling is not cloud computing; however, it does include the technologies that allow customers to adopt cloud models.

This includes technologies such as virtualisation software, physical servers, WAN networks and data centre colocation.

These technologies support companies that wish to build a private cloud, and form the building blocks via necessary architecture and physical setup.

They should be used to make cloud delivery more reliable, efficient or agile.

Cloud-inspired technologies are closer on the spectrum to a true cloud model, but lack several key pure cloud attributes such as cloud APIs and usage-based pricing.

The goal for these technologies is not purity, but rather maximum levels of consumer control and specific hosting desires.

Companies that select these technologies are less concerned with the benefits of pure cloud and more concerned with a higher degree of customisation and control while attaining some level of virtualisation, standardisation, and automation.

They do not provide for true hyperscale, full self-service, ultra-rapid provisioning, or provider variety and speed of innovation.

Pure cloud options are generally, though not always, delivered in the public cloud rather than the private.

Pure cloud includes examples such as Amazon Web Services, Salesforce, Google App Engine and Microsoft Azure.

Pure cloud enables innovation brought about by continuous delivery of new service capabilities, and if the business goal is maximum agility, pure cloud options are a good fit.

This area of cloud offerings is growing rapidly, although the bulk of IT remains cloud-inspired, as many enterprises move from 'cloud maybe' to 'cloud first'.

Cloud-enhanced offerings are built on top of pure public cloud computing.

This type of offering assumes that public cloud computing already exists and builds on top of it to capture value from cloud services.

Examples include streaming video services, machine learning platforms, security as a service and GE's Predix platform, the 'social network for machines'.

This area is focused on providers generating new business growth in cloud-enhanced areas rather than traditional cloud areas.

The goal is to discover how enterprises can uniquely benefit from public cloud computing.

Article written by Gartner brand content manager Kasey Panetta

Read this article:
bizEDGE NZ - Breaking down cloud computing and its nuances ... - bizEDGE NZ

Read More..