
Top 10 Best Machine Learning Companies to Join in 2022 – Analytics Insight

Machine learning is a blessing. Here are the 10 best machine learning companies to join in 2022

Machine learning is a blessing. The industrial sector saw a radical shift when machine learning and AI came into the limelight. Machine learning companies are evolving at a fast pace and have emerged as key players among IT firms. Machine learning refers to the development of intelligent algorithms and statistical models that improve programs without coding them explicitly. For example, ML can make a predictive analysis app more precise over time. ML frameworks and models require an amalgamation of data science, engineering, and development skills. As we become increasingly dependent on technology to make our lives faster and smoother, machine learning has also become an integral part of our lives. It is now widely used by organizations around the world, many of which have started building in-house data science teams. Some of these teams primarily focus on analysing business data to generate valuable insights, while the rest try to incorporate machine learning capabilities into their company's products.



More:

Top 10 Best Machine Learning Companies to Join in 2022 - Analytics Insight

Read More..

A Guide to ECCO: Python Based Tool for Explainability of Transformers – Analytics India Magazine

Accountability is required for any decision-making tool in an organization. Machine learning models are already being used to automate time-consuming administrative tasks and to make complex business decisions. To ensure the soundness of the model and the business decisions built on it, scientists and engineers must understand the inner mechanics of their models, which are commonly treated as black boxes. That opacity is no longer inevitable, as various tools, such as ELI5, are available to track the inner mechanics of a model. In this article, we'll look at how to explain the inner workings of language models like transformers using a toolbox called ECCO.

Let's start the discussion by understanding the explainability of machine learning models.

Explainability in machine learning refers to the process of explaining a machine learning model's decision to a human. The term model explainability refers to the ability of a human to understand an algorithm's decision or output. It's the process of deciphering the reasoning behind a machine learning model's decisions and outcomes. With black box machine learning models, which develop and learn directly from data without human supervision or guidance, this is an important concept to understand.

A human developer would traditionally write the code for a system or model. With machine learning, the system instead evolves from the data: machine learning is used to improve the algorithm's ability to perform a specific task or action by learning from that data. Because the underlying functionality of the machine learning model was developed by the system itself, it can be difficult to understand why the system made a particular decision once it is deployed.

Machine learning models are used to classify new data or predict trends by learning relationships between input and output data. The model will identify these patterns and relationships within the dataset. This means that the deployed model will make decisions based on patterns and relationships that human developers may not be aware of. The explainability process helps human specialists understand the algorithm's decisions, after which the model can be explained to non-technical stakeholders.

Machine learning explainability can be achieved using a variety of tools and techniques that vary in approach and machine learning model type. Traditional machine learning models may be simpler to comprehend and explain, but more complex models, such as deep neural networks, can be extremely difficult to grasp.

When machine learning has a negative impact on business profits, it earns a bad reputation. This is frequently the result of a misalignment between the data science and business teams. Based on this, there are a few areas where explainability helps:

Understanding how your models make decisions reveals previously unknown vulnerabilities and flaws. Control is simple with these insights. When applied across all models in production, the ability to quickly identify and correct mistakes in low-risk situations adds up.

In high-risk industries like healthcare and finance, trust is critical. Before ML solutions can be used and trusted, all stakeholders must have a thorough understanding of what the model does. If you claim that your model is better at making decisions and detecting patterns than humans, you must be able to back it up with evidence. Experts in the field are understandably skeptical of any technology that claims to be able to see more than they can.

When a model makes a bad or rogue decision, it's critical to understand the factors that led to that decision, as well as who is to blame for the failure, in order to avoid similar issues in the future. Data science teams can use explainability to give organizations more control over AI tools.

The terms explainability and interpretability are frequently used interchangeably in the disciplines of machine learning and artificial intelligence. While they are very similar, it is instructive to note the distinctions, if only to get a sense of how tough things may become as you advance deeper into machine learning systems.

The degree to which a cause and effect may be observed inside a system is referred to as interpretability. To put it another way, it's your capacity to predict what will happen if the input or computational parameters are changed.

Explainability, on the other hand, relates to how well a machine's or deep learning system's internal mechanics can be articulated in human terms. It's easy to miss the subtle contrast between interpretability and explainability, but consider this: interpretability is the ability to comprehend the mechanics without necessarily knowing why they behave that way, while explainability is the ability to explain what is happening in depth.

Many recent advances in NLP have been powered by the transformer architecture, yet until recently we had little insight into why Transformer-based NLP models have been so successful. To improve the transparency of Transformer-based language models, ECCO, an open-source library for the explainability of Transformer-based NLP models, was created.

ECCO offers tools and interactive explorable explanations to aid the examination of, and intuition about, these models. Input Saliency visualizes how important each input token was for a given prediction. Hidden state evaluation is applied to all layers of a model to determine the role each layer plays. Non-negative matrix factorization of neuron activations is used to uncover underlying patterns of neuron firings, revealing firing patterns tied to linguistic properties of the input tokens; neuron activations tell us how a group of neurons spikes or responds while making a prediction.

In this section, we will take a look at how ECCO can be used to understand the workings of transformer models while predicting sequence-based output. Mainly, we'll see how weights are distributed at the final layer while predicting the next token, and we will also analyze all layers of the selected model.

To start with ECCO, we can install it using pip: pip install ecco

Also make sure you have installed PyTorch.

First, we will start by generating a single token by passing a string to the model. GPT-2 is used because of how well it generates the next tokens in a sequence, much as a human would. The code below shows how we load the pre-trained model and use it for prediction. The generate method takes an input sequence, and we can additionally specify how many tokens we need the model to generate by passing generate=<some number>.

While initializing the pre-trained model, we set activations=True so that we capture the firing status of all the neurons.
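A minimal sketch of that setup, assuming ecco and a PyTorch backend are installed; the from_pretrained call and the activations flag follow the library's documentation, though exact argument names may vary between versions:

```python
import ecco

# Load a small pre-trained GPT-2 variant through ecco and ask it to capture
# neuron activations so they can be inspected after generation.
# A 6-layer distilled GPT-2 matches the layer count described later in this
# walkthrough; swap in 'gpt2' for the full 12-layer model.
lm = ecco.from_pretrained('distilgpt2', activations=True)
```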

Now we'll generate a token using the generate method.
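The exact prompt used in the original walkthrough is not shown in the article, so the input string below is only a placeholder; the specific tokens discussed next depend on the sentence actually used:

```python
# Placeholder prompt -- substitute the sentence you want the model to continue.
text = "The first thing we need to understand about the model is"

# Ask the model for one new token; do_sample=False keeps generation greedy,
# and the returned object exposes the inspection methods used below.
output = lm.generate(text, generate=1, do_sample=False)
```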

From the generate method, tokens 6 and 5 are generated as first and of, respectively.

The model has a total of 6 decoder layers and the last layer is the decision layer where the appropriate token is chosen.

Now we will observe the status of the last layer and see the top 15 tokens that the model considered. Here we observe the status for position/token 6, which can be achieved with the output.layer_predictions method as below.

output.layer_predictions(position=6, layer=5, topk=15)

As we can see, the token first comes up with the highest contribution.

Similarly, we can check how different tokens would rank at the output layer. This can be done by explicitly passing the token IDs to the rankings_watch method. The token IDs themselves can easily be obtained from the tokenizer of the pre-trained model that we selected initially.

Below are the generated token IDs.

Now we'll supply these IDs to see the rankings.

output.rankings_watch(watch=[262, 717, 621], position=6)

At the decision layer, we can see that the first rank is achieved by the token first, and the rest are not even close to it. Thus we can say the model has correctly identified the next token and assigned proper weights to the possible tokens.
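The other explorables mentioned earlier, input saliency and neuron-activation factorization, can be produced from the same output object. A short sketch, assuming the installed ecco version still exposes the saliency() and run_nmf() helpers described in its documentation (both render interactive views inside a notebook):

```python
# Visualize how much each input token contributed to the generated token.
output.saliency()

# Factorize the captured neuron activations to surface groups of neurons
# that tend to fire together, then render the interactive exploration view.
nmf = output.run_nmf(n_components=8)
nmf.explore()
```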

In this article, we have seen what model explainability is and how important it becomes when deploying a model to production. As language models become more common, tools are needed to aid in debugging models, explaining their behavior, and developing intuitions about their inner mechanics. ECCO is one such tool, combining ease of use, visual interactive explorables, and a variety of model explainability methods. This article focused on ML model explainability and gave a glimpse of the ECCO toolbox.

Read this article:

A Guide to ECCO: Python Based Tool for Explainability of Transformers - Analytics India Magazine

Read More..

What to know about the Minnesota redistricting plans going before a special judicial panel this week – MinnPost

A passel of lawyers will gather Tuesday morning in a large conference room in the Minnesota Judicial Center and take their last shot at influencing the special five-judge panel charged with drawing new congressional and legislative districts for the state.

The oral arguments in Wattson v. Simon will be the final public part of the process triggered 10 months ago by a lawsuit asking the Minnesota Supreme Court to address the assertion that the 2020 Census made the state's current political lines unconstitutional.

After the hearing, the panel will spend six weeks doing two things. The first will be drawing eight new congressional districts, 67 state Senate districts and 134 state House districts. The second will be waiting, at least until Feb. 15, to be certain that the divided state Legislature will fail in its redistricting duties.

Lawyers for each of the four groups proposing new maps, known as intervenors, will be given time to make the case that their vision is the correct one and that the visions of the other three plaintiffs are not.


On Dec. 7, the four groups filed documents detailing their plans; on Dec. 17, all four filed briefs defending their plan and critiquing the plans of the others. Those briefs give an early look at what will be talked about during oral arguments Tuesday.

A typical civil case involves two parties: a plaintiff and a respondent. This one involves five: four that have proposed maps and Secretary of State Steve Simon, who was sued. The map-makers are known as the Wattson Plaintiffs (for lead plaintiff Peter Wattson, a former legislative lawyer involved in past redistricting efforts); the Anderson Plaintiffs (representing Republican Party interests); the Sachs Plaintiffs (representing DFL interests); and the Corrie Plaintiffs (for lead plaintiff Bruce Corrie, who are advocating for maximum representation for communities of color).

A fifth group, though not formal intervenors, has submitted a series of friend-of-the-court filings, with a December 8 filing accepted by the court but a Dec. 29 filing commenting on the four intervenors' plans rejected [December 8 Minnesota Special Redistricting Panel Brief, December 29 Minnesota Motion for Leave]. Calling themselves the Citizen Data Scientists, the group of 12 Minnesota residents is made up of professors, practitioners, and researchers in data science, computer science, mathematics, statistics, and engineering at some of Minnesota's leading institutions of higher education, who applied computational redistricting, a relatively new field that uses high-performance computers and optimization algorithms to systematically search through millions of possible combinations of district boundaries.

Here is a sampling of how the four intervenors defended their own work and attacked the others in the lengthy briefs that were filed in mid-December.

The Wattson plaintiffs' proposal follows a least-change approach that advocates that court-drawn lines make just enough changes to restore population balance while following other legal mandates set by the panel. Based on the 2020 Census, Minnesota's congressional districts should have 713,312 residents, state Senate districts should have 85,172 and state House districts should have 42,586.

The other principles set out by the judicial panel include: not harming communities of color; not overly dividing local government boundaries; not dividing the reservations of American Indian tribes; crafting districts that are contiguous and convenient for voters; preserving communities of people with shared interests; and avoiding drawing lines with the purpose of protecting, promoting or defeating any incumbent, candidate or political party.

The Wattson plaintiffs' proposal for new congressional districts.

"The plans submitted by the other parties in this matter fail to adhere to this Panel's redistricting principles for some obvious reasons, and some not so obvious reasons ... A less obvious but very important reason is that the plans of the Anderson Plaintiffs and Sachs Plaintiffs were drawn for the purpose of promoting, protecting or defeating an incumbent, candidate or party," notes the Wattson plaintiffs' brief. "The districts created by these parties can be explained on no ground other than attempting to gain a partisan advantage."


The Wattson Plaintiffs have argued that the only way to know if a plan was drawn to help an incumbent or party is to know where incumbents live and how proposed lines would impact future elections. This comes despite the panel's assertion that it will not draw districts based on the residence of incumbent office holders and will not consider past election results when drawing districts.

Wattson forges ahead anyway, citing a partisan index the plaintiffs created to apply past election results to new lines.

One example that Wattson cites is how the DFL-friendly Sachs Plaintiffs' plan shifts voters from the 3rd Congressional District (now held by DFL Rep. Dean Phillips) and the 5th Congressional District (now held by DFL Rep. Ilhan Omar) to make the 2nd Congressional District (now held by DFL Rep. Angie Craig) safer for Democrats.

"The net effect of these changes is that CD 5 is much less convenient. It is sandwiched between CD 3 and CD 4 and is shaped like a T or a hammer," the Wattson brief states.

The Wattson brief also points out that the Corrie Plaintiffs' new 8th Congressional District includes three GOP incumbents: U.S. Reps. Pete Stauber, Michelle Fischbach and Tom Emmer, while the DFL-leaning Sachs Plaintiffs' plan puts both Emmer and Fischbach in the same district.

"By just narrowly including Representative Emmer in CD 7 (Corrie Plaintiffs' Plan) and narrowly including Representative Fischbach in CD 6 (Sachs Plaintiffs' Plan), with no justification other than population, it is apparent that these pairings were done to defeat Republican incumbents," the Wattson brief states.

The GOP-leaning group of intervenors said they base their congressional plan on a geographic distribution of seats established in previous redistricting processes.

"Each of the Opposing Parties' congressional redistricting plans propose drastic reconfigurations to Minnesota's existing congressional districts and fail to meet this Panel's redistricting criteria," the Anderson brief states, by combining rural and suburban communities into the same district. "Doing so negatively impacts the ability for rural voters to elect representatives that reflect their priorities and concerns."


The Anderson Congressional Plan, on the other hand, preserves the unique interests of rural, suburban/exurban, and urban Minnesotans.

Anderson takes issue with a new 8th Congressional District proposed by the Corrie Plaintiffs that reaches across the northern part of the state from North Dakota to Lake Superior.

The Anderson plaintiffs' proposal for new congressional districts.

Anderson also accuses DFL-leaning plans of helping the DFL win more seats in Congress: "By moving first ring suburbs, which have natural affinities with and similarities to Minneapolis and St. Paul, to districts comprised largely of highly suburban and exurban areas, these parties put more DFL-leaning voters in the perennially toss-up Third and Second districts," Anderson wrote. "At the same time, removing first ring suburbs and adding outer suburban voters to the urban Fourth and Fifth districts pose no real risk to DFL candidates, incumbents, or the party, because the Fourth and Fifth districts have had highly reliable DFL majorities for decades."

The DFL-leaning group relies heavily on testimony given during the five-judge panel's public hearings in October and criticizes others, especially Anderson and Wattson, for not taking that testimony into account. (For their part, those intervenors say Sachs cherry-picks testimony that supports their decisions and disregards others.)

Sachs also accuses the Wattson plaintiffs of overly strict adherence to its least-change philosophy. "Rather than draw districts that are responsive to the state's geography and demographics, they instead pursue what they characterize as a least-change approach, one that rigidly focuses on calcified lines on a map and not the wishes and needs of Minnesotans statewide," the Sachs brief states. "Their overemphasis on staticity for its own sake has produced proposed maps that are non-responsive to the clear wishes of Minnesotans as expressed to the Panel and that will consequently fail to accurately reflect the human geography of the state."

The Sachs plaintiffs' proposal for new congressional districts.

Sachs also criticizes Wattson for using election analyses and incumbent location data. "The Sachs Plaintiffs maintain that these sorts of partisan considerations ask the Panel to delve into troubling political waters," Sachs stated. "Whether the parties' proposed plans avoid impermissible political entanglements should instead be judged based on the degree to which they otherwise satisfy the Panel's neutral redistricting criteria, particularly evidence in the record regarding the suitability of joining communities within the same district and dividing others among different districts."


Sachs also objects that Anderson and Wattson continue to have a First Congressional District that runs across the entire border with Iowa, accusing them of slavish devotion to prior district lines. The Sachs plan instead joins the southwest counties with a new 7th Congressional district that would run north and south from Iowa to Canada.

While both Corrie and Sachs criticize the Wattson plan for the least-change approach and a desire to avoid splitting local governments and precincts, they do so with different conclusions. Said Sachs: "the Wattson Plaintiffs have ignored the Redistricting Principles laid out by this Panel, and instead prioritized their own principles, particularly preserving voting precincts and ensuring political competitiveness based on past election results."

But Corrie sees much different motives. "In stark contrast to the Panel's directive, the Wattson brief makes clear that its maps were created to ensure each incumbent is protected and unabashedly describes how districts were created based on where incumbents live and how to solidify their votes. Throughout their discussion, the Wattson Plaintiffs make scant mention of Minnesota's BIPOC communities. Rather, they pursue incumbent protection in the guise of protecting minority voting rights, perhaps hoping this Panel will not see they have directly contravened this Panel's Redistricting Principles."

The Corrie plaintiffs' proposal for new congressional districts.

The Corrie Plaintiffs' House Plan has 24 districts with 30% or greater minority voting-age population. The Sachs Plaintiffs' House Plan also has 24, but the Wattson Plaintiffs' plan has only 21 and the Anderson Plaintiffs' only 18. The Corrie House Plan is the only plan that creates a district (HD 2B) where American Indian/Native American residents constitute 44.5% of the district population, giving this community the ability to elect candidates of choice when voting in alliance with others.

And Corrie explains its choice to spread the 8th Congressional District from east to west as a way to get the states tribal nations into a single district.

"As the only map proposal that places all of northern Minnesota in one district, thereby bringing together the three largest American Indian reservations (Red Lake Nation, White Earth Nation, and Leech Lake Band of Ojibwe) as well as four other tribal reservations (Bois Forte Band of Chippewa, Fond du Lac Band of Lake Superior Chippewa, Mille Lacs Band of Ojibwe, and Grand Portage Band of Lake Superior Chippewa) and trust lands, the Corrie Congressional Map is the only map that abides by the Court's Redistricting Principles."

Here is the original post:

What to know about the Minnesota redistricting plans going before a special judicial panel this week - MinnPost

Read More..

Upcoming NSTDA Supercomputer in Thailand to Use Nvidia A100 GPUs – CDOTrends

The upcoming supercomputer at Thailand's National Science and Technology Development Agency (NSTDA) will harness hundreds of GPUs, making it the largest public high-performance computing system in Southeast Asia, says Nvidia.

Powered by 704 Nvidia A100 Tensor Core GPUs, the new system will be 30 times faster than the current TARA HPC system. According to information from Nvidia's product page, the A100 is available in 40GB or 80GB variants and offers up to 294 times higher AI inference performance over traditional CPUs.

The new supercomputer will be hosted at the NSTDA Supercomputer Centre (ThaiSC) to drive research by engineers and computational and data scientists from academia, government, and industry sectors. It is expected to support research projects in areas such as pharmaceuticals, renewable energy, and weather forecasting.

"The new supercomputer at NSTDA will expand and enhance research in Thailand, speeding up the development of breakthroughs that benefit individuals and industries in the country," said Dennis Ang, the senior director of enterprise business for worldwide field operations in the SEA and ANZ region at Nvidia.

"NVIDIA A100 incorporates building blocks across hardware, networking, software, libraries, optimized AI models, and applications to enable extreme performance for AI and HPC," said Ang.

"We chose NVIDIA A100 because it is currently the leading solution for HPC-AI in the market. Even more important is that many HPC-AI software applications are well supported by NVIDIA technology, and the list will keep growing," explained Manaschai Kunaseth, chief of operations at ThaiSC.

When operational, the additional power of the new supercomputer will allow users at ThaiSC to scale up existing research projects. Specifically, the new supercomputer will accelerate innovation for Thailand's efforts with more advanced modeling, simulation, AI, and analytics capabilities.

Kwanchiva Thangthai, of the National Electronics and Computer Technology Center's Speech and Text Understanding Team, expects to see massive efficiency gains in speech recognition research pipelines. "We can gain competitive performance and provide a free-of-charge Thai speech-to-text service for everyone via AIForThai," she said.

The supercomputer is expected to commence operation in the second half of 2022.


Original post:

Upcoming NSTDA Supercomputer in Thailand to Use Nvidia A100 GPUs - CDOTrends

Read More..

January 2022: Insight into how metabolites affect health aided by new data platforms – Environmental Factor Newsletter

Gary Siuzdak, Ph.D., from the Scripps Research Institute, highlighted exciting technologies that he said will advance the field of metabolomics and a wide range of scientific discovery, during a Dec. 7 NIEHS lecture. Metabolomics is the large-scale study of chemical reactions involving metabolites, which are small molecules that play important roles in cells, tissues, and organisms.

According to Siuzdak, research in this field originally focused on identifying metabolites that serve as biological signs of disease, which scientists call biomarkers. However, metabolomics has evolved into a more comprehensive tool for understanding how metabolites themselves can influence health and illness.

"The most important area where metabolomics can be applied is in looking for active metabolites that affect physiology," Siuzdak said. "For example, metabolites can impact and even improve the way we respond to medicine or exposure to toxic agents."

Siuzdak developed data analysis platforms called XCMS and METLIN that enable scientists to discover how metabolites can alter critical biological processes, and the tools have been cited in more than 10,000 scientific projects, he noted.

Through XCMS and METLIN, which now contains detailed data on 860,000 molecular standards, the Scripps Center for Metabolomics has strengthened research worldwide, across a variety of disciplines, said Siuzdak, the center's director.

Continued development of databases like METLIN is vital to the success of the metabolomics field, noted David Crizer, Ph.D., a chemist in the NIEHS Division of the National Toxicology Program. He is a member of the institute's Metabolomics Cross-Divisional Group, which hosted Siuzdak's talk.

METLIN is designed to help scientists identify molecules in organisms, whether metabolites, toxicological agents, or other chemical entities, according to Siuzdak. He noted that the database encompasses more than 350 chemical classes, and there now are more than 50,000 registered users in 132 countries.

"Our goal is to identify as many metabolites and other chemical entities as possible, and given the advances in other fields of biology, this data is long overdue," Siuzdak said.

"We are finding metabolites that were previously unknown, quite regularly," he added. "The more comprehensive METLIN is, the better chance we have of eventually identifying all molecules. To this end, I am constantly looking for ways to facilitate growth of the platform."

A metabolite called indole-3-propionic acid (IPA) is of particular interest to Siuzdak. IPA is a human gut-derived metabolite originally identified by his lab in a 2009 paper in the Proceedings of the National Academy of Sciences, and it has since been examined in thousands of studies. Researchers have discovered that it is a multifunctional molecule that can aid immune function, among other roles.

"In retrospect, it makes sense that a metabolite derived from a gut microbe could modulate the immune system, which is probably why it still generates so much excitement," he said.

IPA could be especially relevant with respect to autoimmune diseases, Siuzdak added.

"For example, most people who die from COVID-19 don't succumb to the virus itself but to an overactive immune response that causes them to develop respiratory ailments," he said. A metabolite that modulates this effect could be very beneficial, noted Siuzdak.

"Overall, we are pursuing one primary goal in the development of METLIN, which is to use experimental data generated from molecular standards to help identify these key, physiologically relevant molecules," he said.

Citation: Wikoff WR, Anfora AT, Liu J, Schultz PG, Lesley SA, Peters EC, Siuzdak G. 2009. Metabolomics analysis reveals large effects of gut microflora on mammalian blood metabolites. Proc Natl Acad Sci U S A 106(10):3698-3703.

(John Yewell is a contract writer for the NIEHS Office of Communications and Public Liaison.)

Read the original post:

January 2022: Insight into how metabolites affect health aided by new data platforms - Environmental Factor Newsletter

Read More..

Why Banks Are Slow to Embrace Cloud Computing – The New York Times

In North America, banks handle only 12 percent of their tasks on the cloud, but that could double in the next two years, the consulting firm Accenture said in a survey. Jamie Dimon, chief executive of JPMorgan Chase, said the bank needed to adopt new technologies such as artificial intelligence and cloud technology as fast as possible.

Jan. 4, 2022, 7:23 a.m. ET

Wells Fargo plans to move to data centers owned by Microsoft and Google over several years; Morgan Stanley is also working with Microsoft. Bank of America has saved $2 billion a year in part by building its own cloud. Goldman said in November that it would team up with Amazon Web Services to give clients access to mountains of financial data and analytical tools.

Cloud services enable banks to rent data storage and processing power from providers including Amazon, Google or Microsoft, which have their own data centers dotted around the globe. After moving to the cloud, banks can access their data on the internet and use the tech companies' computing capacity when needed, instead of running their own servers year-round.

Seeing a big opportunity to sell cloud-computing services to Wall Street, some tech giants have hired former bankers who can use their knowledge of the rules and constraints under which banks operate to pitch the industry.

Scott Mullins, AWS's head of business development for financial services, previously worked at JPMorgan and Nasdaq. Yolande Piazza, vice president for financial services at Google Cloud, is the former chief executive of Citi FinTech, an innovation unit at Citigroup. Bill Borden at Microsoft and Howard Boville at IBM are Bank of America alumni.

"Cloud providers are moving at a much faster development pace when you think of security, compliance and control structures, compared with individual banks," said Mr. Borden, a corporate vice president for worldwide financial services at Microsoft. The cloud, Mr. Borden and the other executives said, enables companies to increase their computer processing capabilities when they need it, which is much cheaper than running servers on their own premises.

But glitches do occur. One week after Goldman teamed up with Amazon, an AWS outage halted webcasts from a conference hosted by the bank that convened chief executives from the biggest U.S. financial firms. The glitch also caused problems for Amazon's Alexa voice assistant, Disney's streaming service and Ticketmaster. AWS and its competitor, Microsoft Azure, both had outages recently.

Banking regulators in the United States, including the Federal Reserve, Federal Deposit Insurance Corporation and Office of the Comptroller of the Currency, have jointly underscored the need for lenders to manage risks and have backup systems in place when they outsource technology to cloud providers. The European Banking Authority warned firms about concentration risk, or becoming overly reliant on a single tech company.

The Financial Industry Regulatory Authority, which oversees broker-dealers, firms that engage in trading activity, has already moved all its technology to the cloud. The group previously spent tens of millions of dollars a year to run its own servers but now rents space on AWS servers for a fraction of that amount, said Steven J. Randich, FINRA's chief information officer.

Go here to read the rest:
Why Banks Are Slow to Embrace Cloud Computing - The New York Times

Read More..

How I fell into the self-hosting rabbit hole in 2021 – Windows Central


In some corners of the Internet, self-hosting is a big thing. There's a huge community to be found in places like Reddit, some great podcasts and so many helpful resources to learn from.

But what does self-hosting actually mean?

In simple terms, it's all about hosting your own services over reliance on a public cloud, wherever that comes from. It could be file storage, it could be a media server, a home automation system, security cameras, you name it, there's probably someone who's at least tried to self host it.

I've been very happy with my little home setup and the journey it took me on. I've started to learn some skills along the way and as we go into 2022, here's a little about the what, the why, and the still to come.

Public cloud services are extremely convenient. That's why they're so popular. And I'm not saying services like OneDrive are bad; far from it. But for reasons I don't fully understand (perhaps getting older and angrier, perhaps lockdown boredom), through 2021 I started taking more interest in which companies had access to my data.

Some of it is about privacy, but there's also a growing skepticism deep within me about reliance on a few big cloud providers for too many services. The recent AWS outages serve as a stark reminder that when something goes wrong, I can't access my doorbell properly. This seems like a ridiculous problem to have.

So, I started looking at what I could do to both be more mindful of my data and the rabbit hole then led to self-hosting. I haven't replaced everything that relies on someone else's infrastructure, but I've made a start. And perhaps I'm a little surprised at how enjoyable the whole process has been.

One of the best parts of this whole process has been starting to learn some new skills. I'm hardly an expert in any of them, but without getting into self-hosting I'd probably never have touched any of this.

I've been learning to use Linux since the first great lockdown of 2020, when I got bored and thought I'd give it a try. But that's always been on a desktop level like Windows 11. Through self-hosting, I've started to dabble in the world of servers and Docker containers while learning more about tools like SSH and even networking basics.

I love learning new IT skills, but in recent years I've definitely been a bit of a slacker. In recent months I've dabbled with Ubuntu Server, Docker, Portainer, building my own configuration files and the wonders of VLANs, all through getting into self-hosting. And the great thing is that it really is an endless rabbit hole. One thing leads to another which leads to another and so on.

A year ago, even though I'd been spending more time using Linux, in particular WSL on Windows 10, the command line still daunted me. Going into 2022 I find that working in a terminal is strangely satisfying.

So, to the good stuff: what am I actually self-hosting? I've tried a bunch of different apps and services across a number of devices. I started out on my Synology NAS before branching out a little and repurposing old hardware. I have a 2012 Apple Mac Mini that's useless as a Mac now, but it makes a fantastic little server box. My old Raspberry Pi 3 was found in a drawer and that's doing work as well.

The main service I've fallen in love with this year is AdGuard Home. I've used Pi-hole in the past but never really got attached to it. As good as it is, my inner noob is much more at home with the slick user interface and more beginner-friendly approach you get from AdGuard Home. It's running on my Raspberry Pi right now, though 2022 may see it relocated when I finally get proper fiber.

Also running on that same Raspberry Pi is a rudimentary local file server. It's a simple Samba setup, mostly in existence because I read a blog post on it and thought I'd give it a try. It's set up with a small USB flash drive connected to the Pi and I've been using it to share files across my home network that I don't need long-term or syncing to all my devices. It probably won't be around too much longer, but it's been handy.

On the old Mac Mini is where the bulk of the load lives right now. It's not running macOS of course, instead using Ubuntu Server 20.04 LTS. Even for a computer approaching 10 years old, Ubuntu Server is extremely lightweight with no desktop environment weighing it down. And currently, everything on here lives in a Docker container which is managed through the Portainer GUI.

Portainer is especially good for Docker novices like me, as it removes the need to handle Docker Compose files directly. You can either use one of the pre-selected app templates or simply point it at the Docker image for the service you want to set up and leave it to do the rest. To access the apps, all that's needed is a web browser.

Currently being hosted there is:

And over on my Synology NAS, I've finally got round to setting up a Plex server again with live TV and DVR thanks to HDHomeRun integration. I've also been using Nextcloud these last couple of months, set up on a DigitalOcean remote instance. Nextcloud is really fantastic and I use it for a number of purposes, including syncing offline copies of my work for the site, managing calendars and email, file backups and sync, and even handling RSS.

All of the services I'm self-hosting, with the exception of Plex, are free and open-source, which has been another priority throughout 2021.

To say I'm hooked on self-hosting is an understatement. It's addictive, and much like building a gaming PC, there's always the "next step" with hardware and software.

In 2022, I'm determined to carry on what I've learned and build upon it, developing more skills and trying things I previously would have run away from like Forrest Gump at full tilt. So I'll be dipping my toes into the world of homelabs, which goes hand-in-hand with self-hosting.

A homelab can be a giant server rack full of thousands of dollars of gear, or it can be a Raspberry Pi. And anything in between. Most of the services I've been using are incredibly lightweight which means there's really no need to spend money on new, expensive hardware.

I am looking to expand though. On the list of things to start learning in 2022 is virtualization with Proxmox/VMWare ESXi, Kubernetes, and (finally) starting to learn to code, well, something. I plan to pick up another 2012 Mac Mini or something similarly old, small and cheap, to maybe cluster together, and at least a Raspberry Pi 4.

I've already built a cheap home server rack from an IKEA Lack table of all things (post coming on that in the new year) and grabbed some ex-enterprise network gear for peanuts to set up a fully wired network in my office, separate from the home Wi-Fi.

We've no idea if 2022 is going to be better or worse than 2020 and 2021 at this point, but I'm going into it with a new hobby and a pretty massive to-do list.

Read the rest here:
How I fell into the self-hosting rabbit hole in 2021 - Windows Central

Read More..

The future of web hosting: 5 things to look out for in 2022 – TechRadar

The year 2021 proved that the online world has the power to keep businesses moving through turbulent times. It played a leading role in maintaining relationships between businesses and their customers and it even sparked areas of growth, highlighting the importance of a strong and resilient online presence.

Now we are in 2022, the world of web hosting, domains and website building is likely going to continue shifting to keep up with changing and growing needs of business.

From how the function of domains will change, to the future of server centres and sustainability, here is what might be in store for the world of web tech this new year.

Domains are a finite resource and so it's only a matter of time before businesses are forced to start thinking outside the box. 2022 will be the year in which new domains start to gain popularity.

The UK domain name market has been overly saturated with requests for .com and .co.uk extensions for some time. When unavailable, website owners won't go straight for a secondary extension like .me or .cloud. Instead, they'll go back and change something about their company name by adding in a dash or a dot between words. It's only when that compromise isn't available that they look down the list to changing the extension type.

The way domains are used now has changed drastically from ten years ago and this will only continue to evolve in this new year. Often a Google search will be the fastest route to a website and therefore the specific domain name is far less relevant.

Businesses also need to factor in what kind of audience they attract and how they navigate the internet. Everyone has different behavioural traits online, but more often we want to find the required information as fast as possible.

Accessibility and simplicity are key. And because of this everything has got to get shorter and faster to access, including weblinks. That's where these new domains will come in.

We hope to see new domain extensions take off in 2022 including .shop and .cloud, however for this change to take place, the market has to get behind it.

When it comes to the future of website building, web hosting platforms will need to adapt to ensure they are able to provide businesses with the opportunity to add widgets, new pages and update themes, all at the tap of a screen.

In short, there is going to be much more of an emphasis on convenience and ease than ever before. Website building will be focussed on inputting basic information into an app: and if it's not easy to use, businesses will lose customer interest.

Take the integration of social shops as an example: with the ever-increasing popularity of platforms such as Facebook, Instagram and TikTok shops, it's more important than ever for web builders to offer quick integrations that sync with these channels easily.

Businesses don't want the hassle of managing a website, and so in 2022, web hosting platforms that can provide infrastructure support such as upgrades and admin will be favored, enabling businesses to focus on other key areas, like revenue generation and business management.

We have many customers that already consume a lot of our underlying support. Demand for server support, as an example, is on the rise as businesses look for ease of use.

Software developers want to log into a server, install it and be done. Over the next year outsourcing activity such as the management of servers will only grow in importance.

The largest threat to our industry right now is cybersecurity, and this will only intensify over the coming year.

Conversations around cybersecurity have become part of managing a business, and identifying who is responsible for the upkeep of security in certain areas, such as email accounts and GDPR adherence, is crucial.

Cybercriminals are out there, and they will continue to grow in prominence. In 2022, greater focus will be placed on security within web hosting platforms. SSL certificates will become increasingly important to provide customers with the confidence that they can operate in a safe and secure space where daily functions won't be compromised.

Environmental initiatives will be on the agenda for businesses across a range of sectors as the business world turns its attention to how we can build a more sustainable world and reduce carbon emissions.

With stats estimating that data centers are responsible for 1% of the world's consumed electricity, this is going to become a high priority in meeting environmental business targets: data centers that are run on renewable energy, like solar or wind, will rise in popularity.

The key take-outs are simple. Businesses must now pursue the most convenient channels, that add ease to their everyday operations and allow them to focus on revenue generating activities. Traditional uses of domains are going to decline, making way for a search engine focused future.

We will also see an increase in businesses placing importance on their web infrastructure and outsourcing the support they need to maintain it. Finally, the push to find greener server suppliers has only just begun, and businesses are going to be on the lookout for suppliers that help to contribute to their carbon reduction pledges.

There is no denying that the online world is changing. Crucially, hosting and web building platforms as well as domain providers must keep up by offering services businesses require to stay ahead of the curve in 2022.

Read the rest here:
The future of web hosting: 5 things to look out for in 2022 - TechRadar

Read More..

New Connectivity Is Bringing Roads Up to Speed – Wired.co.uk

Three big trends in connectivity are set to revolutionise our roads in the next five to ten years: edge computing, 5G and vehicle-to-everything (V2X) communications. In 2022, edge-node sensors will be installed at roadsides around the world as an initial step in bringing all three together as part of the transport infrastructure.

Edge-based computing focuses on shifting the intelligence and data processing much closer to the original data source. Edge servers, known as edge nodes, are deployed in close proximity to vehicle networks. Unlike cloud computing, they can deliver real-time information without any delay.

The movement of data from cloud-based to edge-based has started but, in 2022, edge will become more prevalent, leading to consumers experiencing the first benefits of true 5G-based communication. For motorists, this will mean instantaneous content and high-precision, location-based services. For businesses, it will mean being able to reach customers more effectively.

Practically, this will be achieved by plug-and-play edge nodes arriving at roadsides. Unlike previous technology upgrades, such as adding cell towers, this won't require disruptive roadwork: the nodes will primarily be hooked up wirelessly.

These edge nodes, incorporated into roadside units, will act as location references to support real-time, precise services. They will provide drivers with information on what is happening on the road, such as temporary blocks or the closest available parking at their destination. These time savings will translate into fuel and cost savings. Uber drivers, for example, who earn money based on the number of journeys or deliveries they complete, will be able to pinpoint a specific assigned rider, even in a crowded area, or the exact front door for a delivery.

Businesses will also no longer operate solely from the cloud, which relies on customers reaching them via an app or website. Instead, some will start on the edge and connect with consumers they know are located close to their services. A car wash, for example, will be able to send a discount code to customers driving nearby.

These developments have already begun, with a number of edge-node infrastructure projects due to be deployed or further scaled in 2022. New York is piloting roadside units to improve safety, using them to deliver two-way alerts via an app that pings to keep vulnerable road users, such as cyclists and pedestrians, protected in high-traffic areas. In Austin, Texas, 5G roadside equipment will be deployed to provide real-time traffic updates and lane guidance. This will support safety and mobility applications, including signal controllers that grant traffic priority for emergency vehicles, and apps that issue alerts warning of incidents or road works.

View post:
New Connectivity Is Bringing Roads Up to Speed - Wired.co.uk

Read More..

Healthcare for the new normal world reimagined with digital analytics at the core – ETHealthworld.com

By Abhishek Rungta

In this new normal world, there has been a significant trend toward predictive and preventative approaches in public health due to a growing need for patient-centric, or value-based, medical treatment. Rather than just treating symptoms as they arise, doctors can spot patients who are at high risk of acquiring chronic diseases and intervene before they become a problem.

Preventive therapy may assist to avoid long-term difficulties and costly hospitalizations, lowering expenses for the practitioner, insurance company, and patient.

Data analytics is critical since it helps organizations improve their results. Healthcare analytics uses current and historical data to gain insights, macro and micro, and to support decision-making at both the patient and business level. The application of healthcare analytics has the potential to lower treatment costs, forecast disease outbreaks, avoid preventable diseases, and much more. In short, data analytics in healthcare helps to enhance overall patient care and quality of life.

A Statista report has revealed that by 2025 the market for health-related analytics will increase to about $28 billion. Here we will discuss the prospects of digital analytics in the healthcare industry in this new normal world.

Electronic Health Records (EHR)

According to the HealthIT.gov report, 75% of healthcare professionals claim that their EHR helps them provide better patient care, which leads to increased patient satisfaction and fewer prescription mistakes, among other advantages. It helps both the patients and clinics in ample ways, like:

Fitness devices

Nowadays, a considerable population wears fitness devices like Fitbit, Apple Watch, etc. These wearable devices help to keep track of the physical activities of the people using them.

The data acquired by these gadgets is transferred to cloud servers. By doing so, doctors can utilize the data to determine an individual's overall health and create appropriate wellness programs. Besides, the data can help doctors learn about particular health-related tendencies.

Prediction of disease outbreaks

Data analytics may also be used to forecast patterns in the transmission of disease, helping doctors, hospitals, and other healthcare workers to be better prepared.

For example, the WHO (World Health Organization) collects data on reported cases of influenza. Based on that, vaccine makers prepare the influenza vaccines so that influenza can be prevented as much as possible. The CDC says, "The influenza viruses in the seasonal flu vaccine are selected each year based on surveillance data indicating which viruses are circulating and forecasts about which viruses are the most likely to circulate during the coming season." Besides, health has become the new passport because vaccinations define travel restrictions. Detecting the next inevitable variant, epidemic, or pandemic needs the dynamic analytics of AI and IoT.
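To make the forecasting idea concrete, here is a purely illustrative sketch with made-up weekly counts; it is not how WHO or CDC surveillance systems actually work, just a minimal trend projection:

```python
# Illustrative only: project the next few weeks of reported cases from a
# simple trend fit. Real surveillance systems use far richer data and models;
# the numbers below are synthetic.
import numpy as np

weeks = np.arange(1, 13)                          # 12 weeks observed
cases = np.array([120, 135, 150, 180, 210, 260,
                  330, 400, 470, 560, 640, 730])  # reported cases per week

# Fit exponential growth by regressing log(cases) on time.
slope, intercept = np.polyfit(weeks, np.log(cases), deg=1)

future_weeks = np.arange(13, 17)                  # next 4 weeks
forecast = np.exp(intercept + slope * future_weeks)
print(forecast.round())                           # projected case counts
```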

Fraud prevention

Fraud in healthcare is one of the biggest issues across the globe. According to a report, nearly 60 billion dollars are lost annually due to health care fraud and abuse.

Predictive analytics, when combined with trained machine learning models, may detect specific irregularities that indicate fraudulent behavior, allowing for early detection.
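As a minimal sketch of that idea (the feature columns and data below are synthetic and hypothetical, not drawn from any real claims system), an unsupervised anomaly detector can flag unusual claims for review:

```python
# Illustrative sketch: flag unusual claims with an unsupervised anomaly detector.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Hypothetical columns: claim amount, number of procedures, days in hospital.
claims = rng.normal(loc=[2000.0, 3.0, 2.0], scale=[500.0, 1.0, 1.0], size=(1000, 3))
claims[:10] *= 6  # exaggerate a handful of claims to simulate fraud

detector = IsolationForest(contamination=0.01, random_state=0).fit(claims)
flags = detector.predict(claims)  # -1 marks likely anomalies
print(int((flags == -1).sum()), "claims flagged for manual review")
```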

Reduction of healthcare costs

Healthcare costs are gradually rising every year. According to PwC's Health Research Institute (HRI) report, a 6.5% medical cost trend in 2022 is expected.

But the use of data analytics can help to reduce healthcare costs. Hospitals and doctors may acquire precise models for cutting costs and patient risk by using predictive and prescriptive data analytics.

The more insights this data analytics provides physicians, the better patient care they may provide. This information also indicates that they are more likely to have shorter hospital stays or fewer admissions or re-admissions. Eventually, patients benefit from lower healthcare costs.

By Abhishek Rungta, Founder & CEO, Indus Net Technologies.

(DISCLAIMER: The views expressed are solely of the author and ETHealthworld does not necessarily subscribe to it. ETHealthworld.com shall not be responsible for any damage caused to any person / organisation directly or indirectly.)

Go here to see the original:
Healthcare for the new normal world reimagined with digital analytics at the core - ETHealthworld.com

Read More..