
Finally, a good use for AI: Machine-learning tool guesstimates how well your code will run on a CPU core – The Register

MIT boffins have devised a software-based tool for predicting how processors will perform when executing code for specific applications.

In three papers released over the past seven months, ten computer scientists describe Ithemal (Instruction THroughput Estimator using MAchine Learning), a tool for predicting the number of processor clock cycles necessary to execute an instruction sequence when looped in steady state, along with a supporting benchmark and algorithm.

Throughput stats matter to compiler designers and performance engineers, but it isn't practical to make such measurements on-demand, according to MIT computer scientists Saman Amarasinghe, Eric Atkinson, Ajay Brahmakshatriya, Michael Carbin, Yishen Chen, Charith Mendis, Yewen Pu, Alex Renda, Ondrej Sykora, and Cambridge Yang.

So most systems rely on analytical models for their predictions. LLVM offers a command-line tool called llvm-mca that provides a model for throughput estimation, and Intel offers a closed-source machine code analyzer called IACA (Intel Architecture Code Analyzer), which takes advantage of the company's internal knowledge about its processors.
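
To give a flavour of what these analytical tools look like in practice, the following sketch pipes compiler output into llvm-mca; the source file name and target CPU here are illustrative assumptions, and flags can vary between LLVM versions:

    # Compile a hot loop to assembly and feed it to LLVM's analyzer, which
    # reports estimated cycles, uops and port pressure for the block.
    clang -O3 -S -o - loop.c | llvm-mca -mcpu=skylake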

Michael Carbin, a co-author of the research and an assistant professor and AI researcher at MIT, told the MIT News Service on Monday that performance model design is something of a black art, made more difficult by Intel's omission of certain proprietary details from its processor documentation.

The Ithemal paper [PDF], presented in June at the International Conference on Machine Learning, explains that these hand-crafted models tend to be an order of magnitude faster than directly measuring the throughput of basic blocks (sequences of instructions without branches or jumps). But building these models is a tedious, manual process that's prone to errors, particularly when processor details aren't entirely disclosed.

Using a neural network, Ithemal can learn to predict throughput using a set of labelled data. It relies on what the researchers describe as "a hierarchical multiscale recurrent neural network" to create its prediction model.
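
The papers are the authoritative source for the architecture, but purely as an illustration of the idea, a hierarchical predictor of this kind can be sketched in PyTorch: one recurrent network summarises the tokens of each instruction, a second summarises the sequence of instructions, and a linear layer regresses the cycle count. All names and dimensions below are assumptions for the sketch, not Ithemal's actual code:

    # Illustrative two-level recurrent throughput predictor (assumes PyTorch).
    import torch
    import torch.nn as nn

    class ThroughputPredictor(nn.Module):
        def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, embed_dim)   # token id -> vector
            self.token_rnn = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            self.instr_rnn = nn.LSTM(hidden_dim, hidden_dim, batch_first=True)
            self.regress = nn.Linear(hidden_dim, 1)            # -> predicted cycles

        def forward(self, block):
            # block: a basic block, as a list of instructions, each a 1-D
            # LongTensor of token ids (opcode and operand tokens)
            instr_vecs = []
            for tokens in block:
                _, (h, _) = self.token_rnn(self.embed(tokens).unsqueeze(0))
                instr_vecs.append(h[-1])                       # one summary per instruction
            seq = torch.stack(instr_vecs, dim=1)               # (1, n_instrs, hidden_dim)
            _, (h, _) = self.instr_rnn(seq)
            return self.regress(h[-1]).squeeze()               # steady-state cycle estimate

Training would pair basic blocks with measured cycle counts and minimise an error metric such as the mean absolute percentage error the paper reports.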

"We show that Ithemals learned model is significantly more accurate than the analytical models, dropping the mean absolute percent error by more than 50 per cent across all benchmarks, while still delivering fast estimation speeds," the paper explains.

A second paper, presented in November at the IEEE International Symposium on Workload Characterization, "BHive: A Benchmark Suite and Measurement Framework for Validating x86-64 Basic Block Performance Models," describes the BHive benchmark for evaluating Ithemal and competing models: IACA, llvm-mca, and OSACA (Open Source Architecture Code Analyzer). It found Ithemal outperformed the other models except on vectorized basic blocks.

And in December at the NeurIPS conference, the boffins presented a third paper, "Compiler Auto-Vectorization with Imitation Learning," which describes a way to automatically generate compiler optimizations that outperform LLVM's SLP vectorizer.

The academics argue that their work shows the value of machine learning in the context of performance analysis.

"Ithemal demonstrates that future compilation and performance engineering tools can be augmented with datadriven approaches to improve their performance and portability, while minimizing developer effort," the paper concludes.

Excerpt from:
Finally, a good use for AI: Machine-learning tool guesstimates how well your code will run on a CPU core - The Register

Read More..

Cloud as the enabler of AI’s competitive advantage – Finextra

Data is just the beginning. Financial institutions (FIs) are now hyper-focused on surfacing meaningful, timely, and actionable insights from proprietary and third-party data. Technologies such as the cloud and artificial intelligence (AI) are forming new partnerships between humans and machines.

The barriers to entry have fallen, and FIs are no longer merely testing and experimenting with machine learning (ML), a subset of AI that allows computers to perform tasks by relying on patterns rather than explicit instructions. ML is now being deployed in key departments such as risk management, pre-trade analytics and portfolio optimisation.

Finextra spoke to Geoffrey Horrell, director of applied innovation at Refinitiv's London Lab, and Joe Rothermich, Refinitiv's head of labs, Americas, about their recent report "Smarter Humans. Smarter Machines: Insights from the Refinitiv 2019 Artificial Intelligence/Machine Learning Global Study," how ML processes can be deployed in the cloud, and how the cloud has become an enabler of competitive advantage.

The AI explosion

Rothermich starts off by comparing AI to the explosion of the internet, "when suddenly you had the ability to quickly scale up servers and create websites. I think we are starting to see that with data, AI and machine learning algorithms."

"In the past, there used to be a huge barrier to entry, and although the machine learning algorithms haven't changed dramatically since the early 2000s, we now have the ability to test out new ideas, train models and implement them in production systems easily."

Traditional infrastructure prevents scalability and digital transformation, Rothermich explains. His team was one of the early adopters of Hadoop in financial services, and he recalls how building the infrastructure and prepping the data required substantial up-front investment in time and equipment.

The industry has moved to the cloud in order to make data accessible immediately, so algorithms can be written and tested at a faster rate, which in turn lowers the cost of production. Refinitiv Labs uses an extensive breadth of data across all asset classes that has been extensively curated and enriched and, as a result, is now ML-ready.

Providing the productivity edge

Coupling access to this real-time data in the cloud allows clients to receive new insights at a faster rate, for use in risk assessment, transaction analysis and regulatory reporting, for example. Rothermich discusses how data such as accounting data, market data and text mining of news, events, filings and transcripts is used to predict the likelihood of a company defaulting on its debt within a year.

Rothermich adds that recent research using deep learning has allowed the model to generalise better, freeing it from fixed vocabularies, and it could even adapt to multiple languages. Refinitiv Labs is conducting research in other areas, such as M&A prediction, combining fundamental and text data to predict financial events.

Financial use cases

The growth of easy-to-use cloud infrastructure, the open-source Python ecosystem and capabilities that help with machine learning workflows allow FIs to test out new databases or new computing infrastructures easily.

"Implementing deep learning requires a lot more compute power and a lot more training data. We are working on this by using the cloud to scale up and conduct these experiments, leveraging machine learning frameworks without up-front investments in time and cost being an issue," Rothermich says.

Returning to risk management, it is evident that the scalability of the cloud also allows FIs to process massive amounts of data and obtain a response at a rapid rate. But from a regulatory standpoint, there are issues around data, requiring the traceability of experiments and proof that there are no biases.

But what type of risk use case is machine learning being used to address? Horrell extends the credit risk example to answer this question, stating that with investment risk, "it's about getting a much more real-time view of probability of default compared to traditional credit ratings, which tend to be lagging indicators."

"By the time you see substantial deterioration in a company's fortunes that equates to a credit rating downgrade, the damage to the investment is already done. We know that there is more unstructured information out there that would give an early indicator of different kinds of financial distress, or other leading indicators of a higher probability of distress."

"To incorporate additional types of information using machine learning, different models for different data sets must be maintained and many, many test iterations must be run through. You also have to have a large capacity to handle the data, and to backtest it to see whether that additional unstructured data can give you that early indication that there might be a problem with the company," Horrell explains.

Sharing and parallelising with the cloud

In addition to this, while the cloud helps smaller teams become more agile when setting up a project and allows for faster experimentation, it also allows FIs of all sizes to change direction when required, enhancing creativity and productivity.

In the front office, new horizons have opened up in terms of the types of data financial services institutions can now analyse to power their investment and trading strategies. The rise of alternative data feeds into that, and the cloud creates many opportunities to look at this data.

The cloud can handle the scale of these datasets and provide the techniques and ML approaches to make sense of them and help FIs find completely new ways of generating investment ideas.

Rothermich explains that "sharing code, sharing resources and sharing data is a lot easier in the cloud. And some of the tasks that are completed during a machine learning research project are very easily parallelisable and easy to scale up if the cloud resource is there."

On parallelisation, Horrell continues to say that because of the flexibility of the cloud, the technology can be applied to areas where it would not normally be. For instance, multiple risk models can be run, and data can be analysed in different ways from a risk point of view.

Rothermich highlights that hedge funds, in conversation, revealed that one of the biggest tasks they face is evaluating new datasets, in addition to ingesting, mapping and validating that data. The cloud's capacity for data has helped with loading up and merging new content sets and new, alternative datasets. This form of rapid data onboarding and evaluation gives FIs an informational edge.

Democracy vs. data

While there is a definite democratisation emerging, with anyone being able to access data in the cloud, Horrell adds that ultimately, "you cannot do data science without the data. The better quality your data, the better quality your results."

Read "Smarter Humans. Smarter Machines: Insights from the Refinitiv 2019 Artificial Intelligence/Machine Learning Global Study" here.

More:
Cloud as the enabler of AI's competitive advantage - Finextra

Read More..

Dell’s Latitude 9510 shakes up corporate laptops with 5G, machine learning, and thin bezels – PCWorld

This business workhorse has a lot to like.

Dell Latitude 9510 hands-on: The three best features

Dell's Latitude 9510 has three features we especially love: The integrated 5G, the Dell Optimizer Utility that tunes the laptop to your preferences, and the thin bezels around the huge display.

The Dell Latitude 9510 is a new breed of corporate laptop. Inspired in part by the company's powerful and much-loved Dell XPS 15, it's the first model in an ultra-premium business line packed with the best of the best, tuned for business users.

Announced January 2 and unveiled Monday at CES in Las Vegas, the Latitude 9510 weighs just 3.2 pounds and promises up to 30 hours of battery life. PCWorld had a chance to delve into the guts of the Latitude 9510, learning more about what's in it and how it was built. Here are the coolest things we saw:

The Dell Latitude 9510 is shown disassembled, with (top, left to right) the magnesium bottom panel, the aluminum display lid, and the internals; and (bottom) the array of ports, speaker chambers, keyboard, and other small parts.

The thin bezels around the 15.6-inch screen (see top of story) are the biggest hint that the Latitude 9510 took inspiration from its cousin, the XPS 15. Despite the size of the screen, the Latitude 9510 is amazingly compact. And yet, Dell managed to squeeze in a camera above the display, thanks to a teeny, tiny sliver of a module.

A closer look at the motherboard of the Dell Latitude 9510 shows the 52Wh battery and the areas around the periphery where Dell put the 5G antennas.

The Latitude 9510 is one of the first laptops we've seen with integrated 5G networking. The challenge of 5G in laptops is integrating all the antennas you need within a metal chassis that's decidedly radio-unfriendly.

Dell made some careful choices, arraying the antennas around the edges of the laptop and inserting plastic pieces strategically to improve reception. Two of the antennas, for instance, are placed underneath the plastic speaker components and plastic speaker grille.

The Dell Latitude 9510 incorporated plastic speaker panels to allow reception for the 5G antennas underneath.

Not ready for 5G? No worries. Dell also offers the Latitude 9510 with Wi-Fi 6, the latest wireless networking standard.

You are constantly asking your PC to do things for you, usually the same things, over and over. Dell's Optimizer software, which debuts on the Latitude 9510, analyzes your usage patterns and tries to save you time with routine tasks.

For instance, the ExpressSign-in feature logs you in faster. The ExpressResponse feature learns which applications you fire up first and loads them faster for you. ExpressCharge watches your battery usage and will adjust settings to save battery, or step in with faster charging when you need some juice, pronto. Intelligent Audio will try to block out background noise so you can videoconference with less distraction.

The Dell Latitude 9510's advanced features and great looks should elevate corporate laptops in performance as well as style. It will come in clamshell and 2-in-1 versions, and is due to ship March 26. Pricing is not yet available.

Melissa Riofrio spent her formative journalistic years reviewing some of the biggest iron at PCWorld: desktops, laptops, storage, printers. As PCWorld's Executive Editor she leads PCWorld's content direction and covers productivity laptops and Chromebooks.

Go here to read the rest:
Dell's Latitude 9510 shakes up corporate laptops with 5G, machine learning, and thin bezels - PCWorld

Read More..

Here’s why digital marketing is as lucrative a career as data science and machine learning – Business Insider India

In an interview with Business Insider, Mayank Kumar, founder and MD of upGrad, explained how digital literacy is becoming a buzzword in the ecosystem. The requirement for experienced marketers is being replaced by demand for data-driven marketers.

In fact, Kumar says that professionals with 10+ years of experience in traditional marketing or sales are feeling a palpable need to upskill, and to do so fast.

As per LinkedIn, digital marketing specialist is one of the top 15 emerging job roles in India, with Mumbai, Bangalore and Delhi attracting the most talent. However, the role is no longer confined to traditional aspects of social media or content marketing: specialists now have to acquire skills in Google Ads, social media optimisation, Google Analytics and search engine optimisation (SEO).

Nearly doubled salaries

They earn as much as data scientists and other techies who work in full-stack development, which is one of the best-paying software roles.

"The top 20% of the transitioned learners graduated with an average hike of 177%, which is way above any industry benchmark. Those who were previously in profiles like software testing, software development, traditional marketing, sales and operations are now working with leading companies like HDFC Life, Facebook, IBM, Uber, Zomato, and Microsoft," upGrad said in a statement.

upGrad provides an industry connect for professionals who want to transition from their existing job roles.

"We started our in-house placement support team which provides holistic placement services like resume building, interview preparation support and salary negotiation tips. As of today, we have over 300 corporates hiring from upGrad's talent pool and we plan to add 50 new companies every quarter."

See also: This Indian startup gains as students from Tier 2 and 3 cities opt for online digital courses

Data scientists with three years' experience can earn Rs 20 lakh per annum

Here is the original post:
Here's why digital marketing is as lucrative a career as data science and machine learning - Business Insider India

Read More..

Home Office reinforces commitment to AWS with 100m cloud hosting deal – ComputerWeekly.com

The UK Home Office has reinforced its commitment to using Amazon Web Services (AWS) by signing a four-year, £100m deal with the public cloud provider.

News of the deal was made public on 7 January 2020 following the publication of the award notice on the government's Contracts Finder website.

Although details of the procurement have only just emerged, the award notice confirms that the contract officially started on 12 December 2019, and will run until 11 December 2023.

In a statement to Computer Weekly, the Home Office confirmed that the deal is effectively a renewal of a pre-existing contract between the two entities.

"The award of the public cloud hosting services contract to Amazon is a continuation of services already provided to the Home Office," a departmental spokesperson told Computer Weekly. "The contract award provides significant savings for the department over a four-year term."

The Home Office is renowned for being a heavy user of cloud technologies, and is, according to the government's own Digital Marketplace IT spending league table, by far the biggest buyer of off-premise services and technologies via the G-Cloud procurement framework.

According to its data, the Home Office has an evidenced spend of £772.63m on cloud services procured via G-Cloud, with £123.41m of this occurring during the 2019/2020 financial year so far. AWS appears to account for about £45.5m of the total spent by the Home Office to date.

In second place is the Department for Work and Pensions, which has spent about half of the Home Office's total through G-Cloud since the inception of the framework in 2012, having bought £345.23m of services through it to date.

The Home Office recently published a case study outlining the steps it is taking to ensure its increasing use of off-premise technologies is proceeding in a cost and performance-efficient way.

As reported by Computer Weekly, the department released details of how its Immigration Technology team had embarked on a programme of IT resource optimisation-focused work that had already generated savings of 40% during the previous year.

This work included ramping up its use of discounted cloud compute capacity during off-peak periods or by purchasing resources up-front for a lower price, and ensuring that systems were only running as and when needed to keep running costs down.

"By continuing these techniques, the team is confident it can increase cloud cost savings by at least another 20% as it continues to experiment," the department said at the time.

Follow this link:
Home Office reinforces commitment to AWS with 100m cloud hosting deal - ComputerWeekly.com

Read More..

Ways In Which Cloud Hosting Affects SEO Services And Results – HostReview.com

The primary purpose of SEO is to reach more and more people, and for that you will need to increase the visibility of your site. This will attract more viewers and increase the chances of converting them into prospective customers.

In order to achieve this goal, you will need to design and implement the best search engine optimization strategies for your business website. One of the best ways to do so, besides using the best SEO tools, is to focus on the technical aspects of your SEO. The most important, beneficial and effective technical solution is cloud hosting.

This tech solution is rapidly becoming popular among businesses, SEO professionals and niche specialists such as SEO for Dentists experts, with more and more of them making the switch to this method.

However, if you are planning to make a move to cloud hosting and if you already have SEO for your site, there are a few things that you should be aware of. These are:

There are several different ways in which you can analyze the effects of cloud hosting on your SEO compared with physical hosting. The following, however, are a few specific ways to analyze it precisely.

Considering the advantages

Any investment made in any business should be advantageous in the short run or the long run, preferably both. Therefore, when you want to invest in cloud computing for your SEO, you will need to consider the benefits it will bring to your business along with the return on your investment.

The advantages of local hosting are many and diverse, and are best explained with a relevant example. Assume that you reside in New York and run a house painting service. This is what happens with SEO and search while using cloud computing.

This means that a site with the URL paintyourhouse.ny will show up before paintyourhouse.com or paintyourhouse.co.us. Why? The simple reason is that search engines prioritize locally hosted servers when it comes to SEO and page ranking.

On the other hand, with traditional physical hosting, the only solution available for companies that cater to customers all across the world is to buy hosting space on different servers: one for the US, one for the UK, one for Australia, and so on and so forth. This takes a lot of time, effort, money, monitoring, tracking and maintenance.

However, with cloud hosting all such hassles can be overcome easily, because the platform will have different servers from all over the world in one place. This takes the need to buy hosting space from multiple servers out of the equation. The cloud hosting platform will also provide businesses with free parking, allowing them to host their websites from one platform but still get the advantage of local hosting.

Server downtime issue

One of the most significant issues that affects SEO negatively is server downtime. Its most significant impact is that your site will have a fairly low ranking in the SERPs compared with sites hosted on servers that do not experience such downtime issues.

This is a significant problem with physical hosting. In this type of hosting you will either be sharing one server with several other websites or have only one server dedicated to your site. When and if the server goes down, the site also goes down.

With cloud hosting, however, your site will never be inaccessible, because it will sit not on one specific server but on a collection of servers. When one server goes down, another picks up the slack immediately. Since your site will never go down, it will eventually help you retain good SEO rankings.

The mobile advantage

These days, mobile devices, especially smartphones, are used extensively to surf the internet. It is therefore essential for a better SEO result that your site has the mobile advantage, serving both mobile devices and desktop computers with measurable, considerable and equal value.

If you host your site on the cloud, you will be able to access a lot of data and advanced analytics metrics regarding the mobile web that the cloud hosting company typically makes available. This data will be very helpful for crafting an SEO strategy that is effective both for the mobile and for the desktop web.

In short, cloud hosting is the way to go today, as it gives you far easier access to all this relevant data than a physical hosting platform will provide.

Read more from the original source:
Ways In Which Cloud Hosting Affects SEO Services And Results - HostReview.com

Read More..

WSL 2: Where is it, and where is it going? – TechRepublic

With Windows 2004 in the final run-up to launch, what's happening to Microsoft's Linux tools?

It's been a while since Microsoft unveiled the re-architecture of its Windows Subsystem for Linux (WSL) at its Build conference. Since then it's been tested as part of the 20H1 series of Insider previews and will launch as part of Windows 10's next major update, which will be called Windows 2004.

That update is now close to feature complete, with only bug fixes expected between now and its likely April launch date. The long delay between completion and launch is part of Microsoft's new approach to Windows 10 updates, giving it longer in the Slow and Release Preview rings to identify and fix bugs and issues. That provides an opportunity to experiment with WSL 2 and look at how it will fit into your toolchain.

There's a big change at the heart of WSL 2. Instead of using a translation layer to convert Linux kernel calls into Windows calls, WSL 2 now offers its own isolated Linux kernel running on a thin version of the Hyper-V hypervisor. The WSL 2 hypervisor is similar to that used by the Windows Sandbox, letting Windows and Linux share the same timers to avoid one OS dominating the other. That allows Linux files to be hosted in a virtual disk with a Linux native ext4 file system using the 9p protocol for interactions between Windows and Linux.

It's important to note that using WSL 2 and the Windows hypervisor platform can affect using other virtualisation tools with Windows. Make sure you have one that can work with Hyper-V before you switch to WSL 2.

You're not getting the latest and greatest Linux kernel with WSL 2. Microsoft has made the decision to base it on the Kernel.org long-term support releases. Initially that means using Linux 4.19, with plans to rebase on new releases as they enter LTS. Microsoft has made its own modifications, keeping memory use to a minimum and only supporting specific devices. You shouldn't expect Microsoft to add additional device support -- it's not building a Linux desktop, only providing a way of running Linux binaries in Windows with a focus on developers building applications for cloud-hosted Linux systems.

With a WSL 2 install the virtual disk is initially limited to 256GB. If you need more space, you have to use Windows' DiskPart tool to resize the VHD manually. Once the disk has been resized, you then need to use Linux's filesystem tools to resize its file system. In practice, 256GB should be enough for most purposes -- especially if you're passing files to and from Windows, and using Windows tools alongside Linux.
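
As a rough sketch of that procedure (the .vhdx path, distribution package name and target size below are placeholders; Microsoft's WSL documentation has the exact steps for your install), growing the disk looks something like this:

    # From an elevated PowerShell prompt, after stopping WSL with: wsl --shutdown
    diskpart
    DISKPART> select vdisk file="C:\Users\<you>\AppData\Local\Packages\<distro>\LocalState\ext4.vhdx"
    DISKPART> expand vdisk maximum=512000
    DISKPART> exit

    # Then, inside the WSL 2 distribution, grow the ext4 file system itself:
    sudo mount -t devtmpfs none /dev   # expose block devices if they aren't visible
    mount | grep ext4                  # note the root device, e.g. /dev/sdb
    sudo resize2fs /dev/sdb            # expand the file system to fill the enlarged VHD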

Running in a thin hypervisor gives WSL 2 some advantages over traditional virtual machines. Microsoft can preload much of the OS in memory before starting up, giving it a very fast boot time. The intent is to give WSL 2 the feel of an integrated Windows command-line application, and by booting quickly it's possible to go from startup to working in only a few seconds.

WSL 2, here running Ubuntu 18.04.3 on Windows 10, now uses an isolated Linux kernel running on a thin version of Hyper-V.

Image: Simon Bisson/TechRepublic

Microsoft has significantly extended the utility of the underlying WSL management tooling by adding more features to the wsl command that manages the WSL service, bringing in commands that were previously part of wslconfig. You can now use it to switch a distribution downloaded from the Windows Store between WSL 1 and WSL 2, as well as defining which is the default WSL distribution, as shown below. There's no change to the wsl.conf files used to manage WSL 1 installs, so you can use the same configuration files to mount drives and set up network configurations.
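
For example, version switching and defaults are one-liners; the distribution name comes from whatever wsl --list reports on your machine:

    # From a PowerShell prompt:
    wsl --list --verbose           # show each distro and whether it runs WSL 1 or WSL 2
    wsl --set-version Ubuntu 2     # convert an existing distro to WSL 2 (or back to 1)
    wsl --set-default-version 2    # make WSL 2 the default for new installs
    wsl --set-default Ubuntu       # pick the default distribution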

SEE: Windows 10: A cheat sheet (TechRepublic)

Moving from a translation layer to a virtual machine does affect how WSL 2 works with networking, and that can disrupt using tools like X410 for X-based graphical applications. Currently, loopback addresses are only shared one way, from Windows to WSL. Internally, WSL has its own IP address, and if you're configuring X you need a script to automatically set the DISPLAY environment variable before launching any X application in WSL 2.
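
A commonly used workaround, shown here as a sketch rather than an official fix, reads the Windows host's address out of the /etc/resolv.conf file that WSL 2 generates and points DISPLAY at it:

    # In ~/.bashrc inside WSL 2: aim X clients at an X server (such as X410)
    # listening on the Windows host; ":0.0" assumes the first display.
    export DISPLAY=$(awk '/nameserver/ {print $2; exit}' /etc/resolv.conf):0.0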

Microsoft's new Terminal is another part of the WSL 2 story. It's a big update on the old Windows command-line experience, with support for it, for PowerShell, for Azure's Cloud Shell, and for all your WSL installs, both WSL 1 and WSL 2. Reworking the Windows Terminal adds support for console text effects, so you can use more Linux applications without worrying about display compatibility. Some features, like graphical backgrounds, show how customisable the Terminal is, while others, like the ability to split terminals into multiple panes, add features that mimic classic Unix features.

The Windows Terminal brings a new monospaced console font to Windows, Cascadia Code. It's an important update to the original Windows terminal fonts, making consoles easy on the eye. While not yet the default, it's actually well worth switching your terminal configurations to use the new font. Cascadia is installed alongside the Windows Terminal, although if you want to manage your own installs you can find the font on GitHub.

One important development has been the release of remote editing for Visual Studio Code, available in both WSL 1 and WSL 2. Using the WSL release of Ubuntu, type 'code' to launch Visual Studio Code. The first time you do this it'll download the server components into your WSL install. Now when you need to edit a file in WSL, all you need to do is type 'code' followed by the file name, and it will open in a Windows-hosted Visual Studio Code window, saving automatically into WSL. Remoting into WSL from Windows allows you to use compilers and debuggers inside Linux, keeping your code where it belongs.
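
In practice the whole workflow is two commands (the project directory here is hypothetical):

    cd ~/my-project   # inside the WSL distribution
    code .            # first run installs the VS Code server, then the editor opens on Windows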

If you're using the new Docker Desktop tools with WSL 2, you can use this integration to work directly with your Linux containers from the Windows desktop. While it's still very much in beta, Docker Desktop shows promise, if only to indicate that enterprise software platforms are looking very closely at WSL, and at the benefits of a hybrid operating system.

Microsoft's switch to hosting WSL on Hyper-V is a step forward; it allows Microsoft to quickly support changes to the Linux kernel without having to modify its Windows integration layer, while offering complete API support to Linux binaries. The result is an effective hybrid of the two operating systems, especially once you get WSL 2 working with X. But don't expect it to be a complete Linux desktop for every user: WSL remains targeted at developers who want to bring existing macOS- and Linux-based UNIX toolchains to Windows to build containers for cloud-native applications.

More:
WSL 2: Where is it, and where is it going? - TechRepublic

Read More..

The biggest govtech deals of the week (13/01/20) – NS Tech

Read the rest here:
The biggest govtech deals of the week (13/01/20) - NS Tech

Read More..

Lost in Migration? Attributing carbon when outsourcing to cloud – Data Economy

Connectivity and data drive modern economies, with the demand for digital solutions only set to grow. At the same time, increasing awareness and concern about climate change means data centres are under attack for their energy use and high-cost operations.

In response, AECOM's Andrew Williamson, technical director and electrical specialist, explores innovative approaches to modernise data centres and enhance their sustainability.

The need to reduce emissions is at an all-time high. Currently, electronics account for about five per cent of total global energy usage, with the ICT sector predicted to use around 20 per cent of the world's electricity by 2025, contributing up to 5.5 per cent of global carbon emissions.

As technology advances and a billion more people come online in developing countries, that figure is likely to rise even higher, potentially hitting up to 14 per cent by 2040, as the Internet of Things (IoT), artificial intelligence (AI) and smart solutions, such as driverless cars, become part of our everyday lives.

All of this intensifies the power burden on data centres, which already consume over two per cent of the world's electricity. Many data centres are designed with considerable redundancy to enhance uptime and availability while handling potential peak loads that have yet to be experienced, building significant inefficiency into the facilities.

A Two-Fold Solution

There are actions data centre owners and occupiers can take now to increase their sustainability and secure a greener future. Firstly, where possible, switch to renewable energy sources, including battery energy storage instead of combustion-based backup generation for short-term resilience. Secondly, modernise infrastructure to improve the energy efficiency of servers, storage devices and other ICT equipment.

The benefits of modernised data centres

Upgrading legacy installations is an effective way to increase capacity without footprint increases, provided it can be achieved in an energy-efficient manner. It also offers other crucial long-term benefits, such as strengthening competitiveness, reliability, safety, flexibility, environmental integration, and security and monitoring.

For example, modernised data centres are better equipped to minimise downtime and respond to incidents, via the careful design of the redundancy and resilience of power supplies and critical mechanical systems. Through modular and scalable design, they can respond more effectively to changing customer needs and limit operating expenses. Power costs directly influence decisions to locate data centres, so accurately estimating the price of power both now and in the future is vital for modern facilities.

In addition, with increasing processing power comes an increased fire risk. Modern data centres are equipped with fire detection and prevention systems and have effective evacuation and rescue built into the layout. Using advanced technologies for cooling and heat recovery, modern data centres are better able to integrate into their community environments. Studying and redesigning using analysis of factors such as airflow, heat propagation, audible noise and electromagnetic compatibility is a key component of any upgrading initiative.

5 Core Steps to Modernise your Data Centre

Working with clients, we've identified the following five essential steps to upgrade smarter, faster and better:

1) Mobilise an Effective Project Team

The project team needs to include all relevant stakeholders from inside the organisation, as well as professional partners with experience in several key areas. These include finance and economics, architecture, planning and consenting, mechanical and electrical systems engineering, and utility connections, such as electric power, district heating, potable and wastewater. An integrated, multi-disciplinary team will deliver a holistic design, which does not have built-in inefficiencies due to design margins at the interfaces between different components and sub-systems.

2) Consider your Options

There is no universally applicable process for deciding how to modernise a data centre. With many possible solutions available, it's important that the project team evaluates its options carefully during the planning stage to make sure it selects the most effective, value-led approach for the facility. Analysis and simulation tools, such as integrated safety-in-design, computational fluid dynamics, electromagnetic transient simulations, thermodynamic models, and Monte Carlo reliability, availability and maintainability simulations, are available to support this process, quantifying reliability and resilience, safety risks and energy savings potential, along with a range of impacts on the surrounding area.

3) Build a Strategically Focused Business Case

Although financial and economic analysis is an important part of the optioneering and planning phases, to help build a strong business case for modernisation, other factors also need to be considered. Alongside cost savings and return on investment, strategic factors, including brand-building, local community acceptance, future-proofing, and positioning in developing power and energy markets, and the so-called triple-bottom-line framework, which measures social and ecological impacts as well as economic impacts, must also be included.

4) Plan and Deliver

Each component of the modernised data centre must be designed considering all interfaces and possible conflicts. Sophisticated tools, including BIM, finite-element analysis and numerical integration simulation solutions, can assess aspects, such as fluid dynamics, thermodynamics, electrical transient performance, electromagnetic compatibility, audible noise and arc flash risk. Our team uses these tools in planning, designing and delivering data centres for major global hyper-scalers and colocation providers to mitigate risks upfront and focus attention where needed as early as possible.

5) Continually Validate Results and Refine Operational Procedures

Rigorous and objective assessments of real-world performance are often neglected in the euphoria of completing a project. Measuring energy savings, environmental effects and impacts on nearby uses of shared utilities is often required for regulatory approvals. These measurements, in turn, also inform improvements to operational procedures, which can lead to further savings across the whole project.

Unlocking Value and Efficiency

By integrating the engineering and analysis function into a single project organisation, our team has worked to strengthen the optioneering, planning and delivery stages of data centre modernisation projects.

For us, interaction and optimisation between sustainability engineering, utility connection designs, mechanical and electrical engineering, and a broad range of technical and environmental functions is key to unlocking value, which can be difficult to achieve when you use discrete, specialist scientists and engineers.

Continue reading here:
Lost in Migration? Attributing carbon when outsourcing to cloud - Data Economy

Read More..

UKCloud announces new discounts to help organisations thrive and build an increasingly independent national capability – RealWire

UKCloud will provide a 25% discount on specialist assessment services alongside various promotions such as free discovery workshops and offers on sovereign cloud services

London, 9th January 2020: UKCloud, the multi-cloud experts dedicated to making transformation happen across the UK public sector, has today announced a variety of promotions on its data assessment services and private cloud solutions, which are optimised to enable the UK public sector, and other organisations in regulated industries, to improve their autonomy and data security posture before the UK leaves the EU.

With the new government and new Parliament in place, it is now certain that the UK will leave the EU on 31 January 2020. But the Withdrawal Agreement sets a more challenging date, 31 December 2020, by which time the UK will leave the EU even if a deal has not been agreed. No trade deal of this size or complexity has ever been negotiated with the EU in such a short period, and the deal won't just cover the flow of products and services but also data.

The EU's new Data Protection Supervisor has already said that the UK is 13th in the queue of countries negotiating data deals with Brussels. A deal is critical for thousands of businesses, especially in the health and insurance sectors, as more than three-quarters of UK data transfers are with EU member states, according to the technology trade association techUK, yet it is typical for this type of deal to take several years to negotiate and agree. Simultaneously, there is growing concern about the excessive dependency that organisations in the Western world have on U.S. technology providers that are subject to foreign legislation such as the U.S. CLOUD Act. This all creates complexity and uncertainty for those with sensitive or secure data requirements.

UKCloud specialises in helping organisations in the Public Sector and similarly regulated industries adopt the right mix of cloud services without compromising on the compliance, connectivity and sovereignty requirements that are associated with valuable datasets which need to be protected and nurtured.

Michael Shenouda, medical director at Open Medical, said: "Due to the nature of data held by the NHS, we needed a solution that would provide security, assurance and UK sovereignty, whilst also giving us the ability to scale our services. UKCloud ticked all the boxes, and being focused on the UK public sector and governed by UK jurisdiction further helped enhance the decision."

In response to the increasing need for organisations in the UK to become more adaptable and less dependent on foreign services, UKCloud is running a variety of offers across a suite of services and solutions which apply to various stages of an organisation's preparedness:

Abbey Bux, head of forensic computing at The Insolvency Service, said: "UKCloud completed a consulting exercise for us which involved discovery, design and migration of data into the cloud for a high-profile environment. We were impressed with the team right from initial engagement. They were customer-centric and knowledgeable, had a strong pedigree in upholding security standards and have a credible portfolio of cloud-based services which they have been delivering for a number of years. The migration went really well, communication was consistent and thorough, we were really pleased with the technical expertise and advice, and we have subsequently appointed UKCloud to provide us with a managed service for this environment."

On a limited basis, UKCloud is offering various discounts and promotions on these services to organisations looking for cost effective options to best prepare their IT systems and datasets to drive better agility to handle the emerging complexity and uncertainty.

Leighton James, chief technology officer at UKCloud, said: "For much of the last decade, the UK has been creating the right environment for digital transformation. Now is the time to further invest in our sovereign digital infrastructure, which enables organisations across the UK to nurture and protect data as the national asset that it is. UKCloud's mission is to use our unique knowledge and experience to help organisations reduce the time, cost and risk of developing our national capability and harnessing our invaluable national datasets."

To book your free discovery workshop, please contact: info@ukcloud.com or visit http://www.ukcloud.com/contact

- ends -

About UKCloud

UKCloud provides assured, agile and value-based true public and multi-cloud solutions that enable our customers to deliver enhanced performance through technology.

UKCloud. Making Transformation Happen.

Additional information about UKCloud can be found at http://www.ukcloud.com or by following us on Twitter at @ukcloudltd, while information about UKCloud Health can be found at http://www.ukcloudhealth.com or on Twitter at @ukcloudhealth and information about UKCloudX can be found at http://www.ukcloudX.com or on Twitter at @ukcloudx

Media Contact

Ellie Robson-Frisby, Head of Marketing. E: erobsonfrisby@ukcloud.com M: 07775 538135

See the rest here:
UKCloud announces new discounts to help organisations thrive and build an increasingly independent national capability - RealWire

Read More..