
Friction in Data Analytics Workflows Causing Stress and Burnout – InfoQ.com


Shane Hastie: Good day folks. This is Shane Hastie for the InfoQ Engineering Culture podcast. Today I'm sitting down literally across the world from Matthew Scullion. Matthew is the CEO of Matillion who do data stuff in the cloud. Matthew, welcome. Thanks for taking the time to talk to us today.

Matthew Scullion: Shane, it is such a pleasure to be here. Thanks for having us on the podcast. And you're right, data stuff. We should perhaps get into that.

Shane Hastie: I look forward to talking about some of this data stuff. But before we get into that, probably a good starting point is, tell me a little bit about yourself. What's your background? What brought you to where you are today?

Matthew Scullion: Oh gosh, okay. Well, as you said, Shane, Matthew Scullion. I'm CEO and co-founder of a software company called Matillion. I hail from Manchester, UK. So, that's a long way away from you at the moment. It's nice to have the world connected in this way. I've spent my whole career in software, really. I got started very young. I don't know why, but I'm a little embarrassed about this now. I got involved in my first software startup when I was, I think, 17 years old, back in the late nineties, on the run-up to the millennium bug and also, importantly, as the internet was just starting to revolutionize business. And I've been working around B2B enterprise infrastructure software ever since.

And then, just over 10 years ago, I was lucky to co-found Matillion. We're an ISV, an independent software vendor. So, you're right, we do data stuff. We're not a solutions company, so we don't go in and deliver finished projects for companies. Rather, we make the technologies that customers and solution providers use to deliver data projects. And we founded that company in Manchester in 2011. Just myself and my co-founder Ed Thompson, our CTO, at the time, and we were shortly thereafter joined by another co-founder, Peter McCord. Today, the company's international. About 1,500 customers around the world, mostly, in revenue terms certainly, large enterprise customers spread across well over 40 different countries, and about 600 Matillioners. Roughly half in R and D, building out the platform, and half running the business and looking after our clients and things like that. I'm trying to think if there's anything else at all interesting to say about me, Shane. I am, outside of work, lucky to be surrounded by beautiful ladies, my wife and my two daughters. And so, between those two things, Matillion and my family, that's most of the interesting stuff to say about me, I think.

Shane Hastie: Wonderful. Thank you. So, the reason we got together to talk about this was a survey that you did looking at what's happening in the data employment field, the data job markets, and in the use of business data. So, what were the interesting things that came out of that survey?

Matthew Scullion: Thanks very much for asking about that, Shane. And you're quite right, we did do a survey, and it was a survey of our own customers. We're lucky to have quite a lot of large enterprise customers that use our technology. I mean, there's hundreds of them. Western Union, Sony, Slack, National Grid, Peet's Coffee, Cisco. It's a long list of big companies that use Matillion software to make their data useful. And so, we can talk to those companies, and also ones that aren't yet Matillion customers, about what they've got going on in data, wider in their enterprise architecture, and in fact, with their teams and their employment situations, to make sure we are doing the right things to make their lives better, I suppose. And we had some hypotheses based on our own experience. We have a large R and D team here at Matillion, and we had observations about what's going on in the engineering talent market, of course, but also feedback from our customers and partners about why they use our technology and what they've got going on around data.

Our hypothesis, Shane, and the reason that Matillion exists as a company, really is, as I know, certainly you and probably every listener to this podcast will have noticed, data has become a pretty big deal, right? As we like to say, it's the new commodity, the new oil, and every aspect of how we work, live and play today is being changed, we hope for the better, with the application of data. It's happening now, everywhere, and really quickly. We can talk, if you want, about some of the reasons for that, but let's just bank that for now. You've got this worldwide race to put data to work. And of course, what that means is there's a constraint, or set of constraints, and many of those constraints are around people. Whilst we all like to talk about and think about the things that data can do for us, helping us understand and serve our companies' customers better is one of the reasons why companies put data to work. Streamlining business processes, improving products, and increasingly, data becoming the product itself.

All these things are what organizations are trying to do, and we do that with analytics and data visualization, artificial intelligence and machine learning. But what's spoken about a lot less is that before you can do any of that stuff, before you can build the core dashboard that informs an area of the business, what to do next with a high level of fidelity, before you can coach the AI model to help your business become smarter and more efficient, you have to make data useful. You have to refine it, a little bit like iron ore into steel. The world is awash with data, but data doesn't start off useful in its raw state. It's not born in a way where you can put it to work in analytics, AI, or machine learning. You have to refine it. And the world's ability to do that refinement is highly constrained. The ways that we do it are quite primitive and slow. They're the purview of a small number of highly skilled people.

Our thesis was that every organization would like to be able to do more analytics, AI and ML projects, but they have this kink in the hose pipe. There's a size nine boot stood on the hose pipe of useful data coming through, and we thought that was likely causing stress and constraint within enterprise data teams. So we did this survey to ask and to say, "Is it true? Do you struggle in this area?" And the answer was very much yes, Shane, and we got some really interesting feedback from that.

Shane Hastie: And what was that feedback?

Matthew Scullion: So, we targeted the survey on a couple of areas. And first of all, we were saying, "Well, look, this part of making data useful in order to unlock AI, machine learning and analytics projects. It may well be constrained, but is it a big deal? How much of your time on a use case like that do you spend trying to do that sort of stuff?" And this, really, is the heart of the answer, I think. If you're not involved in this space, you might not realize. Typically it's about 60%, according to this and previous survey results. 60% of the work of delivering an analytics, AI and machine learning use case isn't in building the dashboard, isn't in the data scientist defining and coaching the model. It isn't in the fun stuff, therefore, Shane. The stuff that we think about and use. Rather, it's in the loading, joining together, refinement and embellishment of the data to take it from its raw material state, buried in source systems, into something ready to be used in analytics.

So, any time a company is thinking about delivering a data use case, they have to recognize that the majority of the work is going to be in refining the data to make it useful. And so, we then asked for more information about what that was like, and the survey results were pretty clear. 75% of the data teams that we surveyed, at least, reported to us that the ways that they were doing that were slowing them down, mostly because they were either using outdated technology to do that, pre-cloud technology repurposed to a post-cloud world, and that was slowing this down. Or because they were doing it in granular ways. The cloud, I think many of us think it's quite mainstream, and it is, right? It is pretty mainstream. But it's still quite early in this once-in-a-generation tectonic change in the way that we deliver enterprise infrastructure technology. It's still quite early. And normally in technology revolutions, we start off doing things in quite manual ways. We code them at a fairly low level.

So, 75% of data teams believe that the ways that they're doing migration of data, data integration and maintenance, are costing their organizations time, productivity and money. And that constraint also makes their lives less pleasant personally than they otherwise could be. Around 50% of our user respondents in this survey revealed this unpleasant picture, Shane, to be honest, of constant pressure and stress that comes with dealing with inefficient data integration. To put it simply, the business wants, needs and is asking for more than they're capable of delivering, and that leads to 50% of these people feeding back that they feel under constant pressure and stress, experiencing burnout, and actually, this means that data professionals in such teams are looking for new roles and looking to go to areas with more manageable work-life balances.

So, yeah, it's an interesting correlation between the desire of all organizations, really, to make themselves better using data, the boot on the hose pipe slowing down our ability to do that, meaning that data professionals are maxed out and unable to keep up with the demand. And that, in turn, leads to stress and difficulty in attracting and retaining talent into teams. Does that all make sense?

Shane Hastie: It does indeed. And, certainly, if I think back to my experience, the projects that were the toughest, it was generally pretty easy to get the software product built, but then to do the data integration or the data conversions as we did so often back then, and making that old data usable again, were very stressful and not fun. Is that still the case?

Matthew Scullion: It's still the case and worse by an order of magnitude, because we have so many systems now. Separately, we also did a survey, probably need to work on a more interesting way of introducing that term, don't I, Shane? But we talk to our clients all the time. And another data point we have is that in our enterprise customers, our larger businesses, so this is typically businesses with, say, revenue of 500 million US dollars or above, the average number of systems that they want to get data out of and put to work in analytics projects is just north of a thousand different systems. Now, that's not in a single use case, but it is across the organization. And each of those systems, of course, has got dozens or hundreds, in many cases thousands of data elements inside it. You look at a system like SAP, I think it has 80,000 different entities inside, and that would count as one system on my list of a thousand.

And in today's world, even a company like Matillion, we're a 600-person company. We have hundreds of modern SaaS applications that we use, and I'd be fairly willing to bet that we have a couple of new ones being created every day. So, the challenge is becoming harder and harder. And at the other side of the equation, the hunger, the need to deliver data projects, is much, much more acute, as we race to change every aspect of how we work, live and play, for the better, using data. Organizations that can figure out an agile, productive, maintainable way of doing this at pace have a huge competitive advantage. It really is something that can be driven at the engineering and enterprise architecture and IT leadership level, because the decisions that we make there can give the business agility and speed as well as making people's lives better in the way that we do it.

Shane Hastie: Let's drill into this. What are some of the big decisions that organizations need to make at that level to support this, to make using data easier?

Matthew Scullion: Yeah, so I'm very much focused, as we've discussed already, on this part of using data, the making it useful. The refining it from iron ore into steel, before you then turn that steel into a bridge or ship or a building, right? So, in terms of building the dashboards or doing the data science, that's not really my bag. But the bit that we focus on, which is the majority of the work, like I mentioned earlier, is getting the data into one place, the de-normalizing, flattening and joining together of that data. The embellishing it with metrics to make a single version of the truth, and make it useful. And then, the making sure that process happens fast enough, reliably, at scale, and can be maintained over time. That's the bit that I focus on. So, I'm answering your question, Shane, through that lens, and in my belief, at least, it's right to focus on that bit, because it's not the bit that we think about, but it's the majority of the work.

First of all, perhaps it would be useful to talk about how we typically do that today in the cloud, and people have been doing this stuff for 30 years, right? So, what's accelerating the rate at which data is used and needs to be used is the cloud. The cloud's provided this platform where we can, almost at the speed of thought, create limitlessly scalable data platforms and derive competitive advantage that improves the lives of our downstream customers. Once you've created that latent capacity, people want to use it, and therefore you have to use it. So, the number of data projects and the speed at which we can do them today, massively up and to the right because of the cloud. And then, we've spoken already about all the different source systems that have got your iron ore buried in them.

So, in the cloud today, people typically use one of two main ways to make data useful, to do data integration, to refine it from iron ore into steel. So, the first thing that they do, and this is very common in new technology, is that they make data useful in a very engineering-centric way. The great thing about coding, as you and I know well, is that you can do anything in code, right? And so we do, particularly in earlier technology markets. We hand code making data useful. And there's nothing wrong with that, and in some use cases, it's, in fact, the right way to do it. There's a range of different technologies we can use: we might be doing it in SQL or dbt. We might be doing it using Spark and PySpark. We might even be coding in Java or whatever. But we're using engineering skills to do this work. And that's great, because A, we don't need any other software to do it. B, engineers can do anything. It's very precise.
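To make the hand-coded approach concrete, here is a minimal Python sketch of the kind of refinement step being described. The source systems, field names and metric are all hypothetical, and a real pipeline would more likely use SQL, dbt or PySpark: the point is simply that the load-join-aggregate work lives entirely in engineer-written code.

```python
# Hypothetical sketch of hand-coded data refinement: join raw records
# from two "source systems" and derive a metric (total spend per region).

raw_orders = [  # e.g. rows extracted from an orders system
    {"order_id": 1, "customer_id": "C1", "amount": 120.0},
    {"order_id": 2, "customer_id": "C2", "amount": 80.0},
    {"order_id": 3, "customer_id": "C1", "amount": 50.0},
]
raw_customers = [  # e.g. rows extracted from a CRM
    {"customer_id": "C1", "region": "EMEA"},
    {"customer_id": "C2", "region": "AMER"},
]

def refine(orders, customers):
    """Join orders to customers and total the spend per region."""
    region_of = {c["customer_id"]: c["region"] for c in customers}
    totals = {}
    for o in orders:
        region = region_of.get(o["customer_id"], "UNKNOWN")
        totals[region] = totals.get(region, 0.0) + o["amount"]
    return totals

print(refine(raw_orders, raw_customers))  # {'EMEA': 170.0, 'AMER': 80.0}
```

Precise and flexible, as the interview notes, but every join, rename and metric is code that only an engineer can write and maintain.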

But it does have a couple of major drawbacks when we are faced with the need to innovate with data in every aspect of how we work, live and play. And drawback number one is it's the purview of a small number of people, comparatively, right? Engineering resources in almost every organization are scarce. And particularly in larger organizations, companies with many hundreds or many thousands of team members, the per capita headcount of engineers in a business that's got 10,000 people, most of whom make movies or earth-moving equipment or sell drugs or whatever it is, is low, right? We're a precious resource, us engineers. And because we've got this huge amount of work to do in data integration, we become a bottleneck.

The second thing is data integration just changes all the time. Any time I've ever seen someone use a dashboard, read a report, they're like, "That's great, and now I have another question." And that means the data integration that supports that data use case immediately needs updating. So, you don't just build something once, it's permanently evolving. And so, at a personal level for the engineer, unless they want to sit there and maintain that data integration program forever, we need to think about that, and it's not a one and done thing. And so, that then causes a problem because we have to ramp new skills onto the project. People don't want to do that forever. They want to move on to different companies, different use cases, and sorry, if they don't, ultimately they'll probably move on to a different company because they're bored. And as an organization, we need the ability to ramp new skills on there, and that's difficult in code, because you've got to go and learn what someone else coded.

So, in the post-cloud world, in this early new mega trend, comparatively speaking, one of the ways that we make data useful is by hand-coding it, in effect. And that's great because we can do it with precision, and engineers can do anything, but the downside is it's the least productive way to do it. It's the purview of a small number of valuable, but scarce people, and it's hard to maintain in the long term. Now, the other way that people do this is that they use data integration technology that solves some of those problems, but that was built for the pre-cloud world. And that's the other side of the coin that people face. They're like, "Okay, well I don't want to code this stuff. I learned this 20 years ago with my on-premise data warehouse and my on-premise data integration technology. I need this stuff to be maintainable. I need a wider audience of people to be able to participate. I'll use my existing enterprise data integration technology, ETL technology, to do that."

That's a great approach, apart from the fact that pre-cloud technology isn't architected to make best use of the modern public cloud platforms and hyperscalers like AWS, Azure and Google Cloud, nor the modern cloud data platforms like Snowflake, Databricks, Amazon Redshift, Google BigQuery, et al. And so, in that situation, you've gone to all the trouble of buying a Blu-ray player, but you're watching it through a standard definition television, right? You're using the modern underlying technology, but the way you're accessing it is out of date. Architecturally, the way that we do things in the cloud is just different to how we did it with on-premises technology, and therefore it's hard to square that circle.

It's for these two reasons that today many organizations struggle to make data useful fast enough, and why, in turn, they're in this lose-lose situation: the engineers are either stressed out and burnt out, stuck on projects that they want to move on from, or bored because they're doing low-level data enrichment for weeks, months, or years and not able to get off it, as the business's insatiable demand for useful data never goes away and they can't keep up. Or the organization is unable to serve the needs of the business and to change every aspect of how we work, live and play with data. Or honestly, Shane, probably both. It's probably both of those things.

So our view, and this is why Matillion exists, is that you can square this circle. You can make data useful with productivity, and the way that you do it is by putting a technology layer in place, specifically designed to talk to these problems. And if that technology layer is going to be successful, we think it needs to exhibit a couple of things. The first one is it needs to solve for this skills problem, and do that by making it essentially easier whilst not dumbing it down, and by making it easier, making a wider audience of people able to participate in making data useful. Now, we do that in Matillion by making our technology low-code, no-code, code optional. Matillion's platform is a visual data integration platform, so you can dive in and visually load, transform, synchronize and orchestrate data.

A low-code, no-code environment can make a single engineer far more productive, but perhaps as, if not more, importantly, it means it's not just high-end engineers that can do this work. It can also be done by data professionals, maybe ETL guys, BI people, data scientists. Even tech-savvy business analysts, financiers and marketers. Anyone that understands what a row and a column is can pretty much use technology like Matillion. And the other thing that the low-code, no-code user experience really helps with is managing skills on projects. You can ramp someone onto a project that's already been up and running much more easily, because you can understand what's going on, because it's a diagram. You can drop into something a year after it was last touched and make changes to it much, much more easily because it's low-code, no-code.

Now, the average engineer, Shane, in my experience, often is skeptical about visual 4GL or low-code, no-code engineering, and I understand the reasons why. We've all tried to use these tools before. But, in the case of data, at least, it can be done. It's a technically hard problem, one that we've spent the last seven, eight years perfecting, but you can build a visual environment that creates a high-quality push-down ELT instruction set for the underlying cloud data platform as well as, if not better than, we could by hand, and certainly far faster. That pure ELT architecture, which means that we get the underlying cloud data platform to do the work of transforming data, gives us scalability and performance in our data integrations. That's really important, and that can be done, and that's certainly what we've done at Matillion.
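As a toy illustration of the push-down ELT idea (this is not Matillion's actual engine, and the spec format is invented), the sketch below compiles a small declarative transform specification into SQL and hands it to a database, with SQLite standing in for a cloud data platform, so the platform itself does the transformation work rather than the tool.

```python
import sqlite3

# Toy push-down ELT: a declarative transform spec is compiled to SQL
# and executed by the database engine itself, so the "data platform"
# (here SQLite, standing in for Snowflake/BigQuery/etc.) does the work.

spec = {
    "source": "orders",
    "group_by": "region",
    "aggregate": ("SUM", "amount", "total_amount"),
}

def compile_to_sql(spec):
    """Compile the hypothetical spec into a single GROUP BY query."""
    fn, col, alias = spec["aggregate"]
    return (f'SELECT {spec["group_by"]}, {fn}({col}) AS {alias} '
            f'FROM {spec["source"]} GROUP BY {spec["group_by"]}')

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (region TEXT, amount REAL)")
db.executemany("INSERT INTO orders VALUES (?, ?)",
               [("EMEA", 120.0), ("AMER", 80.0), ("EMEA", 50.0)])

sql = compile_to_sql(spec)
rows = sorted(db.execute(sql).fetchall())
print(rows)  # [('AMER', 80.0), ('EMEA', 170.0)]
```

The design point is that the tool only generates instructions; the heavy lifting, and therefore the scalability, stays inside the data platform.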

The other criteria I'll just touch on quickly. The people that suffer with this skills challenge the most are larger businesses. Smaller businesses that are really putting data to work tend to be either technology businesses or technology-enabled businesses, which probably means they're younger and therefore have fewer source systems with data in them. A higher percentage of their team are engineering team members. They're more digitally native. And so, the problem's slightly less pronounced for that kind of tech startup style company. But if you're a global 8,000, manufacturing, retail, life sciences, public sector, financial services, whatever type of company, then your primary business is doing something else, and this is something that you need to do as a part of it. The problem for you is super acute.

And so, the second criteria that a technology that's going to solve this problem has to have is it has to work well for the enterprise, and that's the other thing that Matillion does. So, we're data integration for the cloud and for the enterprise, and that means that we scale to very large use cases and have all the right security and permissions technology. But it's also things like auditability, maintainability, integration to software development life-cycle management, and code repositories and all that sort of good stuff, so that you can treat data integration in the same way that you treat building software, with proper, agile processes, proper DevOps, or as we call them in the data space, data-ops processes, in use behind the scenes.

So, that's the challenge. And finally, if you don't mind me rounding out on this point, Shane, it's like, we've all lived through this before. Nothing's new in IT. The example I always go back to is one from, I was going to say the beginning of my career. I'd be exaggerating my age slightly there, actually. It's more like it's from the beginning of my life. But the PC revolution is something I always think about. When PCs first came in, the people that used them were enthusiasts and engineers, because they arrived in a box of components that you had to solder together. And then, you had to write code to make them do anything. And that's the same with every technology revolution. And that's where we're up to with data today. And then later, visual operating systems abstracted the backend complexity of the hardware and underlying software, and allowed a wider audience of people to get involved, and then, suddenly, everyone in the world was using PCs. And now, we don't really think about PCs anymore. It's just a screen in our pocket or our laptop bag.

That's what will happen, and is happening, with data. We've been in the solder-it-together-and-write-code stage, but we will never be able to keep up with the world's insatiable need and desire to make data useful by doing it that way. We have to get more people into the pass rush, and that's certainly what we at Matillion are trying to do, which suits everyone. It means engineers can focus on the unique problems that only they can solve. It means business people closer to the business problems can self-serve, and in a democratized way, make the data useful that they need to understand their customers better and drive business improvement.

Shane Hastie: Some really interesting stuff in there. Just coming around a little bit, this is the Engineering Culture podcast. In our conversations before we started recording, you mentioned that Matillion has a strong culture, and that you do quite a lot to maintain and support that. What's needed to build and maintain a great culture in an engineering-centric organization?

Matthew Scullion: Thanks for asking about that, Shane, and you're right. People that are unlucky enough to get cornered by me at cocktail parties will know that I like nothing more than to bang on about culture. It's important to me. I believe that it's important to any organization trying to be high performance and change the world like, certainly, we are here in Matillion. I often say a line when I'm talking to the company, and I have to be careful with this one, because it could be misinterpreted. The most important thing in Matillion, it's not even our product platform, which is so important to us and our customers. It's not our shiny investors. Matillion was lucky to become a unicorn stage company last year. I think we've raised about 300 million bucks in venture capital so far from some of the most prestigious investors in the world, who we value greatly, but they're not the most important thing.

It's not even, Shane, now this is the bit I have to be careful saying, it's not even our customers in a way. We only exist to make the lives of our customers better. But the most important thing at Matillion is our team, because it's our team that makes those customers' lives better, that builds those products, that attracts those investors. The team in any organization is the most important thing, in my opinion. And teams live in a culture. And if that culture's good, then that team will perform better, and ultimately do a better job at delighting its customers, building its products, whatever they do. So, we really believe that at Matillion. We always have, actually. The very first thing that I did on the first day of Matillion, all the way back in January of 2011, which seems like a long, long time ago now, is I wrote down the Matillion values. There's six of them today. I don't think I had six on the first day. I think I embellished the list afterwards. But we wrote down the Matillion values, these values being the foundations that this culture sits on top of.

If we talk about engineering culture specifically, I've either been an engineer or been working with or managing engineers my whole career. So, 25 years now, I suppose, managing or being in engineering management. And the things that I think are the case about engineering culture are, first of all, engineering is fundamentally a creative business. We invent new, fun stuff every day. And so, thing number one that you've got to do for engineers is keep it interesting, right? There's got to be interesting, stimulating work to do. This is partly what we heard in that data survey a few minutes ago, right? If you're making data useful through code, it might be interesting for the first few days, but for the next five years, maintaining it's not very interesting. It gets boring, stressful, and you churn out of the company. You've got to keep engineers stimulated, give them technically interesting problems.

But also, and this next one applies to all parts of the organization. You've got to give them a culture, you've got to give each other a culture, where we can do our best work. Where we're intellectually safe to do our best work. Where we treat each other with integrity and kindness. Where we are all aligned to delivering on shared goals, where we all know what those same shared goals are, and where we trust each other in a particular way. That particular way of trusting each other, it's trusting that we have the same shared goal, because that means if you say to me, "Hey, Matthew, I think you are approaching this in the wrong way," then I know that you're only saying that to me because you have the same shared goal as I do. And therefore, I'm happy that you're saying it to me. In fact, if you didn't say it to me, you'd be helping me fail.

So, trust in shared goals, the kind of intellectual safety born from respect and integrity. And then, finally, the interest and stimulation. To me, those are all central to providing a resonant culture for perhaps all team members in an organization, but certainly engineers to work in. We think it's a huge competitive advantage to have a strong, healthy culture. We think it's the advantage that's allowed us, in part, but materially so, to be well on the way to building a consequential, generational organization that's making the world's data useful. Yes, as you can tell, it's something I feel very passionate about.

Shane Hastie: Thank you very much. A lot of good stuff there. If people want to continue the conversation, where do they find you?

Matthew Scullion: Well, me personally, you can find me on Twitter, @MatthewScullion. On LinkedIn, just hit Matthew Scullion Matillion, you'll find me on there. Company-wise, please do go ahead and visit us at matillion.com. All our software is very easy to consume. It's all cloud-native, so you can try it out free of charge, click it and launch it in a few minutes, and we'd love to see you there. And Shane, it's been such a pleasure being on the podcast today. Thank you for having me.


What Schools Miss When They're Missing Relationship Data – EdSurge

Last month, a new study in Nature revealed a key predictor of economic mobility: connectedness. Specifically, researchers at Opportunity Insights found that relationships with higher-income students dramatically improved low-income students' chances of upward mobility in adulthood, even more than traditional success metrics like school quality.

The Opportunity Insights team garnered praise for the sheer size of the data set they built to reach their findings: Their Social Capital Atlas consists of a staggering 21 billion data points on connection, mined from de-identified Facebook data from 72 million users. The analysis also yielded a new species of school-level data, charting the degree of economic connectedness within individual high schools and colleges across the country.

This new research raises a bigger question for education leaders striving for more equitable outcomes: What kinds of relationship data do schools need to understand the trajectories their students are on, and the relationships and resources at their disposal?

Unfortunately, legacy education data systems rarely contain much in the way of relationship data.

That's not to say schools fly entirely blind. Schools can keep track of which students are paired with which teachers. They can assign advisors or mentors to students who are struggling. They can administer culture and belonging surveys that measure how students and staff experience and perceive their community.

But rosters and climate surveys only get you so far. They lean institution-centric, rather than student-centric. In other words, they rarely reveal the actual relationships and networks at play in students' lives. Moreover, they tell schools nothing about students' connections with family, friends, coaches, neighbors and the like that make up a young person's actual network, and often contain valuable assets that schools could tap into.

How might schools go about discovering who students know? One obvious strategy to gain a more complete picture of students networks is to ask students themselves.

Often, this takes the form of an activity called relationship mapping, which I describe in greater detail in a new report for the Christensen Institute, "Students' hidden networks: Relationship mapping as a strategy to build asset-based pathways."

Relationship mapping has low-tech roots. For decades, social workers have created pen-and-paper ecomaps with clients to reveal their social supports and stressors.

"Network mapping, ecomapping, relationship mapping: it's all the idea of trying to get on paper, 'Who are the people in your life?'" said Sarah Schwartz, a clinical psychologist and leading mentoring researcher whom I interviewed. "When I do it with young people, I use a blank piece of paper, put their name in the middle and start drawing lines and asking them, 'Who's in your school? Who's in your community? Who's in your neighborhood? Who are your caregivers' friends? Who's in your religious community?'" explained Schwartz.
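The interview exercise Schwartz describes translates naturally into a simple data structure. Below is a minimal, hypothetical Python sketch (the names, spheres, and function are illustrative assumptions, not any real tool's API) that groups a student's named connections by the sphere of life they come from:

```python
from collections import defaultdict

def build_ecomap(student, connections):
    """Group a student's named connections by the sphere of life
    they come from: school, community, neighborhood, family, etc."""
    spheres = defaultdict(list)
    for name, sphere in connections:
        spheres[sphere].append(name)
    return {"student": student, "spheres": dict(spheres)}

# The prompts mirror the interview questions above.
ecomap = build_ecomap("Jordan", [
    ("Ms. Rivera", "school"),
    ("Coach Lee", "community"),
    ("Mr. Okafor", "neighborhood"),
    ("Aunt Dana", "family"),
])
print(ecomap["spheres"]["school"])  # ['Ms. Rivera']
```

Even a structure this small hints at why digital tools beat pen and paper: once connections are records rather than pencil lines, they can be timestamped, updated, and tracked over time.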

This practice has been slow to migrate from paper into the digital realm. Even fairly popular programs like Harvard's Making Caring Common's virtual Relationship Mapping Strategy rely on simple spreadsheets.

Pen-and-paper and spreadsheets may suffice for short activities and small programs. But they risk a static approach to relationship data. With better tools, that data could serve as a powerful, dynamic indicator over time. Luckily, a range of entrepreneurs are starting to build tools that could supercharge schools' ability to access and store secure data on students' networks in ways that help both young people and the institutions that serve them keep track of their connections.

Some tools have emerged from researchers focused on the power of network science to improve outcomes. For example, a new open-source research tool, Network Canvas, developed through the Complex Data Collective, streamlines the process of designing network surveys, interviewing subjects, and analyzing and managing social network data.

Another tool, built by researchers at Visible Networks Lab (VNL) and called PARTNERme, uses an interactive interface where kids and parents can draw their social connections, identify who helps them with the things they need, and highlight the pressing needs for which they have the least social support.

The resulting map aims to "make invisible networks visible," according to VNL's founder Danielle Varda, a researcher and faculty member at the University of Colorado Denver School of Public Affairs.

"By visualizing these types of things, we make a very complex problem easier to see and therefore more tangible to address," Varda said.

For the past two years, VNL has worked with the Annie E. Casey Foundation to support youth research fellows conducting qualitative research on how the PARTNERme assessment can best detect social supports in young people's lives.

Other tools are starting to emerge to help young people identify and maintain connections. Palette is a startup focused on fostering more communication across students' support networks. The goal, in founder Burck Smith's words, is to "better connect and manage the adults that are most influential in a student's success." Palette is still in beta, but will launch a half dozen or so pilot programs this fall in advising, coaching, mentoring and counseling programs.

Other startups are pairing relationship maps with network-building curriculum. My Opportunity Hub (MyOH), an app in development by Edward DeJesus, founder of Social Capital Builders, Inc., nudges young people to keep the connections in their lives (teachers, family members and mentors) updated on their progress, and to build new connections with those in industries they are interested in. The tool goes hand in hand with DeJesus's Foundations in Social Capital Literacy curriculum, which teaches young people about building and mobilizing networks. The app aims to make maintaining connections more manageable. At any given time in the course of Social Capital Builders' experiential curriculum, young people are keeping a select five to six individuals, whom DeJesus and his team dub "Opportunity Guides," up to date on their successes and challenges.

Tools like MyOH demonstrate the potential of pairing relationship-building curriculum with data and visualization tools. Others are starting to take a similar tack. For example, iCouldBe, an online mentoring program and college and career curriculum, is currently building a student-facing connections map where students will be able to visualize their networks on an ongoing basis. (Notably, students served by iCouldBe prefer the term "connections" to "networks.") While students make their way through the curriculum, the map will automatically populate any connections with teachers, coaches, and counselors that students identify, and urge students to develop new connections with people they would like to meet.

For iCouldBe, this marks a promising evolution from data-driven mentorship to data-driven network building. "We have this enormous database on the backend of the program and use data science tools to really look at how mentees engage in the program. For every single week of the program we see a weekly score based on mentees' and mentors' engagement," said Kate Schrauth, executive director of iCouldBe. "We're going to be looking to take these data science tools and add all of the metrics from the enhanced connections map so that we can understand how mentees are engaging with these broader networks over longer periods of time."
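iCouldBe's actual scoring model isn't published in this piece, but a weekly engagement score of the kind Schrauth describes can be sketched in a few lines. Everything below (the field names, weights, and caps) is an illustrative assumption, not iCouldBe's implementation:

```python
def weekly_engagement(mentee_msgs, mentor_msgs, activities_done):
    """Toy weekly score: an equal-weight blend of message exchange
    and completed curriculum activities, capped at 100."""
    message_score = min(mentee_msgs + mentor_msgs, 10) * 5   # up to 50
    activity_score = min(activities_done, 5) * 10            # up to 50
    return message_score + activity_score

# One active week: five messages exchanged, two activities completed.
print(weekly_engagement(mentee_msgs=3, mentor_msgs=2, activities_done=2))  # 45
```

A real program would tune the weights against outcomes; the point of the sketch is that a per-week number like this is what lets a backend chart engagement "over longer periods of time."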

Better tools for assessing and maintaining connectedness offer myriad upsides when it comes to the complex challenges schools are facing this year. First, as researchers like VNL's Danielle Varda have long documented, connectedness and mental health are deeply intertwined. Given that concerns about students' mental health are top of mind among district leaders, schools would be wise to invest not just in interventions, but also in data focused on social connectedness.

Second, mapping networks can help create more resilient systems. In the early months of the pandemic, some school districts were lauded as innovative for initiatives that ensured someone, anyone, from the district reached out to students daily. As Herculean as those efforts were, they were also a reflection of how ill-prepared schools were to leverage and coordinate existing connections in students' lives. If more crises upend school as we know it, data on who students know and can turn to offers an invaluable safety net for centralized systems trying to operate under decentralized conditions.

Of course, limited time, financial resources, and network science expertise in schools may hamper adoption of these kinds of tools. Startups hoping to gain a foothold may need to be as much in the business of relationship mapping development as in the business of change management and consulting (which many of the tool providers above offer). Others are betting on adoption first outside of traditional systems. "The first step of our strategy toward greater district adoption of PARTNERme is to partner with community-based organizations that provide services to schools to prove the value of using the tool," said Varda of VNL's approach.

But if the recent buzz around economic connectedness is any indication, there's significant interest from schools and the communities that support them in doubling down on the crucial role that relationships play in young people's lives. Relationships and the resources they can offer, often dubbed social capital, drive healthy development, learning and access to opportunity. It's time these connections become part and parcel of the data that schools collect to drive and measure their progress.

Read this article:

What Schools Miss When They're Missing Relationship Data - EdSurge


NVIDIA and Dell Technologies Deliver New Data Center Solution for Zero-Trust Security and the Era of AI – NVIDIA Blog

Dell PowerEdge Servers Built With NVIDIA DPUs, NVIDIA GPUs and VMware vSphere 8 to Help Enterprises Boost AI Workload Performance and Build Foundation for Zero-Trust Security; Available to Experience Today on NVIDIA LaunchPad

VMware Explore - NVIDIA today announced a new data center solution with Dell Technologies designed for the era of AI, bringing state-of-the-art AI training, AI inference, data processing, data science and zero-trust security capabilities to enterprises worldwide.

The solution combines Dell PowerEdge servers with NVIDIA BlueField DPUs, NVIDIA GPUs and NVIDIA AI Enterprise software, and is optimized for the VMware vSphere 8 enterprise workload platform, also announced today.

Enterprises can experience the combination of these technologies on NVIDIA LaunchPad, a hands-on lab program that provides access to hardware and software for end-to-end workflows in AI, data science and more.

"AI and zero-trust security are powerful forces driving the world's enterprises to rearchitect their data centers as computing and networking workloads are skyrocketing," said Manuvir Das, head of Enterprise Computing at NVIDIA. "VMware vSphere 8 offloads, accelerates, isolates and better secures data center infrastructure services onto the NVIDIA BlueField DPU, and frees the computing resources to process the intelligence factories of the world's enterprises."

"Dell and NVIDIA's long tradition of collaborating on next-generation GPU-accelerated data centers has already enabled massive breakthroughs," said Travis Vigil, senior vice president, portfolio and product management, Infrastructure Solutions Group, Dell Technologies. "Now, through a solution that brings NVIDIA's powerful BlueField DPUs along with NVIDIA GPUs to our PowerEdge server platform, our continued collaboration will offer customers performance and security capabilities to help organizations solve some of the world's greatest challenges."

Running on BlueField, vSphere 8 supercharges the performance of workloads. By offloading to the DPU, customers can accelerate networking and security services, and save CPU cycles while preserving performance and meeting the throughput and latency needs of modern distributed workloads. The combination increases performance and efficiency, simplifies operations and boosts infrastructure security for data center, edge, cloud and hybrid environments.

"Distributed modern applications with AI/ML and analytics are driving the transformation of data center architecture by leveraging accelerators and providing better security as part of the mainstream application infrastructure," said Krish Prasad, senior vice president and general manager, VMware Cloud Platform Business, VMware. "Dell PowerEdge servers built on the latest VMware vSphere 8 innovations, and accelerated by NVIDIA BlueField DPUs, provide next-generation performance and efficiency for mission-critical enterprise cloud applications while better protecting enterprises from lateral threats across multi-cloud environments."

NVIDIA AI Enterprise Support for VMware vSphere 8 Coming Soon

As NVIDIA-Certified Systems, the Dell PowerEdge servers will be able to run the NVIDIA and VMware AI-Ready Enterprise Platform, a solution that features the NVIDIA AI Enterprise software suite and VMware vSphere.

A comprehensive, cloud-native suite of AI and data analytics software, NVIDIA AI Enterprise is optimized to enable organizations to use AI on familiar infrastructure. It is certified to deploy anywhere from the enterprise data center to the public cloud and includes global enterprise support to keep AI projects on track.

An upcoming release of NVIDIA AI Enterprise will bring support for new capabilities introduced in VMware vSphere 8, including the ability to support larger multi-GPU workloads, optimize resources and easily manage the GPU lifecycle.

Availability

With NVIDIA LaunchPad, enterprises can get access to a free hands-on lab of VMware vSphere 8 running on NVIDIA BlueField-2 DPUs.

Dell servers with vSphere 8 on NVIDIA BlueField-2 DPU will be available later in the year. NVIDIA AI Enterprise with VMware vSphere is now available and can be experienced on NVIDIA LaunchPad hands-on labs.

NVIDIA CEO Jensen Huang and VMware CEO Raghu Raghuram discussed how the collaboration is driving the next era of computing in a fireside chat at VMware Explore.

View original post here:

NVIDIA and Dell Technologies Deliver New Data Center Solution for Zero-Trust Security and the Era of AI - NVIDIA Blog


Cloud Computing in Higher Education Market Facts, Figures, Analytical Insights, and Forecast 2022-2030 – Taiwan News

According to the Astute Analytica study on the global Cloud Computing in Higher Education Market, the size of the market will increase from US$ 2,693.5 Million in 2021 to US$ 15,180.1 Million by 2030, registering a remarkable compound annual growth rate (CAGR) of 22% from 2022 to 2030.

The segmentation section of the report examines every segment and highlights those with a strong impact on the global Cloud Computing in Higher Education Market. The segmentation served as the foundation for identifying businesses and examining their financial standings, product portfolios, and future growth potential. The second step entailed evaluating the core competencies and market shares of top firms in order to predict the degree of competition. A bottom-up method was used to assess the market's overall size.

Request Sample Copy of Research Report @ https://www.astuteanalytica.com/request-sample/cloud-computing-higher-education-market

On the basis of institute type, the technical schools segment is estimated to hold the highest market share in 2021 and is also expected to register the highest CAGR over the forecast period, owing to increasing demand for cloud computing in technical schools. Based on ownership, the private institutes segment is anticipated to hold the largest market share, owing to increasing funding in private institutes for the adoption of cloud computing services, whereas the public institutes segment is expected to grow at the highest CAGR over the forecast period. In terms of application, the administration segment holds a major share of the cloud computing in higher education market in 2021, whereas unified communication is expected to register the highest CAGR over the forecast period due to the increasing trend of e-learning. Finally, by deployment, the hybrid cloud segment held the largest market share in 2021.

Market Dynamics and Trends

Drivers

The increasing adoption of SaaS-based cloud platforms in higher education, the growing uptake of e-learning, rising IT spending on cloud infrastructure in education, and the increasing application of quantum computing in the education sector will boost the global cloud computing in higher education market during the forecast period. Software-as-a-Service (SaaS) is a cloud computing delivery model; in the higher education sector, SaaS applications include hosting various management systems for educational institutes and managing other activities. Moreover, the higher education industry is witnessing increased adoption of e-learning due to its easy accessibility and high effectiveness: users such as drop-outs, transfer learners, and full-time employees increasingly rely on e-learning training and education to upgrade their skills. Furthermore, higher education institutes are rapidly moving toward cloud-based services to save on intensive IT infrastructure costs and boost the efficiency of operations.

Restraints

Cybersecurity and data protection risks, lack of compliance with SLAs, and legal and jurisdictional issues are restraining factors that inhibit the growth of the market during the forecast period. Data privacy issues pose a threat to higher education institutions' migration to the cloud: federal regulations, along with state and local laws, govern information security in the education environment. Moreover, the level of complexity in the cloud is high, as deployments usually involve several service providers, which makes it hard for users to make changes or intervene. The cloud computing industry also faces various legal and jurisdictional issues that can run on for years due to regional laws.

Cloud Computing in Higher Education Market Country Wise Insights

North America Cloud Computing in Higher Education Market-

The US holds the major share in terms of revenue in the North America cloud computing in higher education market in 2021 and is also projected to grow at the highest CAGR during the forecast period. In terms of institute type, technical schools hold the largest market share in 2021.

Europe Cloud Computing in Higher Education Market-

Western Europe is expected to register the highest CAGR in the Europe cloud computing in higher education market during the forecast period. Within the region, Germany held the major share of the Europe market in 2021, owing to a strong focus on R&D-driven innovation and technology adoption.

Asia Pacific Cloud Computing in Higher Education Market-

India holds the highest share in the Asia Pacific cloud computing in higher education market in 2021 and is expected to register the highest CAGR during the forecast period, owing to strong growth opportunities: end users such as schools and universities are turning toward cloud services in order to offer high-quality services that help users collaborate, share and track multiple versions of a document.

South America Cloud Computing in Higher Education Market-

Brazil is projected to grow at the highest CAGR in the South America cloud computing in higher education market over the forecast period. Based on ownership, the private institutes segment holds the major share of the South America market in 2021, owing to increasing funding in private institutes for the adoption of cloud computing services.

Middle East Cloud Computing in Higher Education Market-

Egypt holds the highest share in the Middle East cloud computing in higher education market in 2021, while the UAE is projected to grow at the highest CAGR during the forecast period. In terms of application, administration holds a major share of the market in 2021, whereas unified communication is expected to register the highest CAGR over the forecast period due to the increasing trend of e-learning.

Africa Cloud Computing in Higher Education Market-

South Africa holds the highest share in the Africa cloud computing in higher education market in 2021. By deployment, the private cloud segment is expected to witness the highest CAGR during the forecast period, owing to the security benefits of private cloud deployment.

Competitive Insights

The global Cloud Computing in Higher Education Market is highly competitive, with players vying to increase their presence in the marketplace. Some of the key players operating in the market include Dell EMC, Oracle Corporation, Adobe, Inc., Cisco Systems, Inc., NEC Corporation, Microsoft Corporation, IBM Corporation, Salesforce.com, NetApp, Ellucian Company L.P., VMware, Inc. and Alibaba Group, among others.

Segmentation Overview

Global Cloud Computing in Higher Education Market is segmented based on institute type, ownership, application, deployment and region. The industry trends in the global cloud computing in higher education market are sub-divided into different categories in order to get a holistic view of the global marketplace.

Following are the different segments of the Global Cloud Computing in Higher Education Market:

Download Sample Report, SPECIAL OFFER (avail an up-to-30% discount on this report): https://www.astuteanalytica.com/industry-report/cloud-computing-higher-education-market

By Institute Type, the Global Cloud Computing in Higher Education Market is sub-segmented into:

By Ownership, the Global Cloud Computing in Higher Education Market is sub-segmented into:

By Application, the Global Cloud Computing in Higher Education Market is sub-segmented into:

By Deployment, the Global Cloud Computing in Higher Education Market is sub-segmented into:

By Region, the Global Cloud Computing in Higher Education Market is sub-segmented into:

North America

Europe

Western Europe

Eastern Europe

Asia Pacific

South America

Middle East

Africa

Request Full Report- https://www.astuteanalytica.com/request-sample/cloud-computing-higher-education-market

About Astute Analytica

Astute Analytica is a global analytics and advisory company which has built a solid reputation in a short period, thanks to the tangible outcomes we have delivered to our clients. We pride ourselves in generating unparalleled, in-depth and uncannily accurate estimates and projections for our very demanding clients spread across different verticals. We have a long list of satisfied and repeat clients from a wide spectrum including technology, healthcare, chemicals, semiconductors, FMCG, and many more. These happy customers come to us from all across the globe. They are able to make well-calibrated decisions and leverage highly lucrative opportunities while surmounting fierce challenges, all because we analyze for them the complex business environment, segment-wise existing and emerging possibilities, technology formations, growth estimates, and even the strategic choices available. In short, a complete package. All this is possible because we have a highly qualified, competent, and experienced team of professionals comprising business analysts, economists, consultants, and technology experts. In our list of priorities, you, our patron, come at the top. You can be sure of the best cost-effective, value-added package from us, should you decide to engage with us.

Contact us:
Aamir Beg
BSI Business Park, H-15, Sector-63, Noida - 201301, India
Phone: +1-888 429 6757 (US Toll Free); +91-0120-4483891 (Rest of the World)
Email: sales@astuteanalytica.com
Website: http://www.astuteanalytica.com

Visit link:
Cloud Computing in Higher Education Market Facts, Figures, Analytical Insights, and Forecast 2022-2030 - Taiwan News


Importance of Integrating Cloud Computing and Internet of Things – CIO Applications

Integration of IoT and cloud computing solutions is the future of the internet, solving several business obstacles and opening up new research and business opportunities.

FREMONT, CA: Cloud computing enables businesses to store, manage, and process data on cloud-enabled platforms that offer scalability, connectivity, and flexibility. Many cloud computing models facilitate digital transformation, efficiency, and growth for enterprises when successfully implemented. However, when coupled with the internet of things (IoT), the cloud allows never-before-seen capabilities that accelerate corporate growth.

In an IoT ecosystem, different cloud services and solutions perform numerous roles. Some cloud computing services incorporate machine learning, business intelligence tools, and SQL engines to accomplish complicated IoT-related activities.

Allows for remote computer access: With the cloud's vast storage capacity, IoT deployments eliminate the need for on-site infrastructure. With continual development and advancements in internet-based technologies and devices that support advanced cloud solutions, cloud technology has entered the mainstream. Cloud solutions are replete with IoT support and enable businesses to use remote computing services with a simple click or command.

Safety and Privacy: With cloud technology and IoT, enterprises can significantly minimize security concerns by automating tasks. IoT-enabled cloud technology offers preventive, detective, and corrective controls. With efficient authentication and encryption processes, it also provides robust security safeguards to its consumers. In IoT products, protocols such as biometrics facilitate managing and protecting user identities and data.

Data Integration: Current technological advancements have not only seamlessly merged IoT and the cloud but also offer real-time connectivity and communication. This facilitates the extraction of real-time information on important business activities and on-the-fly data integration with 24/7 connectivity. Cloud-based solutions with robust data integration capabilities can store, process, and analyze the significant volumes of data collected from many sources.

Minimal Reliance on Hardware: Several IoT solutions offer plug-and-play hosting services enabled by cloud integration with IoT. With cloud-enabled IoT hosting, no on-premises hardware or infrastructure is required to provide the agility needed by IoT devices. It is now simple for businesses to adopt large-scale IoT initiatives across several platforms and transition to omnichannel communication.

Organizational Continuity: Due to their agility and dependability, cloud computing solutions can ensure business continuity in the event of an emergency, data loss, or natural disaster. Cloud services are provided through a network of data servers located in numerous geographical areas that maintain multiple backup copies of data. IoT-based operations continue functioning in a disaster, and data recovery is simplified.

Communication Among Multiple Devices and Touchpoints: Cloud solutions allow IoT devices and services to communicate and interact with one another. The cloud and IoT can interact with and connect devices by providing many robust APIs. Cloud-based communication capabilities expedite the contact process.
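As a concrete, deliberately generic illustration of device-to-cloud communication over such an API, the sketch below serializes one sensor reading as JSON and shows how it would be POSTed to a cloud ingestion endpoint. The endpoint URL, device name, and field names are placeholders, not any particular vendor's API:

```python
import json
import time
import urllib.request

def build_reading(device_id, temperature_c):
    """Serialize one sensor reading as a JSON payload for a cloud API."""
    return json.dumps({
        "device": device_id,
        "temperature_c": temperature_c,
        "ts": int(time.time()),  # epoch seconds
    }).encode("utf-8")

def send(endpoint, payload):
    """POST the reading to the cloud; returns the HTTP status code."""
    req = urllib.request.Request(
        endpoint,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

payload = build_reading("greenhouse-7", 21.4)
# send("https://example.com/ingest", payload)  # supply a real endpoint to run
```

Production deployments typically use a messaging protocol such as MQTT with authentication and retries rather than a bare HTTP POST, but the shape of the exchange (structured payload in, acknowledgment back) is the same.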

Response Time and Data Processing: Combining edge computing and IoT technologies reduces response times and accelerates data processing. To get the most out of IoT, it should be implemented alongside both cloud computing and edge computing solutions.

More:
Importance of Integrating Cloud Computing and Internet of Things - CIO Applications


FACT SHEET: Biden-Harris Administration Launches the Apprenticeship Ambassador Initiative to Create Equitable, Debt-Free Pathways to High-Paying Jobs…

Today, the Biden-Harris Administration is announcing the launch of the Apprenticeship Ambassador Initiative, a national network of more than 200 employers and industry organizations, labor organizations, educators, workforce intermediaries, and community-based organizations who are committed to strengthening and diversifying Registered Apprenticeship. Registered Apprenticeship is a high-quality, debt-free, equitable earn and learn model with a nationally recognized credential system that helps employers hire a more demographically diverse workforce and provides workers with on-the-job learning experience, job-related instruction with a mentor, and a clear pathway to a good-paying job. First Lady Jill Biden, Secretary Marty Walsh, and Secretary Gina Raimondo are hosting a discussion at the White House today with leaders of the Initiative.

The Apprenticeship Ambassadors have existing Registered Apprenticeship programs in over 40 in-demand industries and have committed to expand and diversify these programs over the next year by collectively: developing 460 new Registered Apprenticeship programs across their 40 industries, hiring over 10,000 new apprentices, and holding 5,000 outreach, promotional, and training events to help other business, labor, and education leaders launch similar programs. Ambassadors will also use their expertise to scale innovative practices and increase access to Registered Apprenticeship for underserved populations, including women, youth, people of color, rural communities, people with arrest or conviction records, and people with disabilities.

This new Initiative builds on President Biden's efforts to expand Registered Apprenticeships, which include investing hundreds of millions of dollars in Registered Apprenticeships and pre-apprenticeships and launching an Apprenticeship Accelerator that cuts the time it takes to get approval to start a new program from months to days. The Administration's efforts have already helped develop over 4,000 new Registered Apprenticeship programs, added 6,700 new employer partners participating in Registered Apprenticeship programs, and led to the hiring of more than one million apprentices.

The Apprenticeship Ambassador Initiative will deliver long-lasting economic benefits for both workers and employers. About 93 percent of workers who complete Registered Apprenticeships gain employment and earn an annual average starting wage of $77,000. Registered Apprenticeships also help employers attract, train, and retain a skilled and diverse workforce, and reap a $1.47 return for every dollar spent on Registered Apprenticeships. The Initiative will help to ensure there is a skilled, diverse workforce to implement the President's economic agenda, including tackling the supply chain challenge and filling new clean energy jobs created by the Inflation Reduction Act, manufacturing and technology jobs created by the CHIPS and Science Act, infrastructure jobs created by the Bipartisan Infrastructure Law, and roles in other high-demand sectors like health care and cybersecurity.

For example, through the Initiative:

These efforts complement the Administration's work to expand Registered Apprenticeships to build a skilled, diverse workforce in high-demand areas, including those bolstered by the President's economic agenda. For example, the Administration launched the Talent Pipeline Challenge, a nationwide call to action for employers, education and training providers, state, local, Tribal, and territorial governments, and philanthropic organizations to make tangible commitments that support equitable workforce development, including launching or scaling Registered Apprenticeships in critical infrastructure sectors: broadband, construction, electric vehicle charging, and battery manufacturing. Clean energy tax credits in the Inflation Reduction Act include a significant bonus for businesses that hire using Registered Apprenticeship programs and pay prevailing wage rates, ensuring our clean energy investments create high-quality training pathways that lead to good-paying jobs. State and local governments are using American Rescue Plan Fiscal Recovery Funds to expand pre-apprenticeships and Registered Apprenticeships in response to the negative economic impacts of the pandemic.

The Department of Labor is also taking additional steps to expand Registered Apprenticeship to serve at least 1 million apprentices annually within the next 5 years. These steps include:

The Department of Commerce is expanding Registered Apprenticeships, including by:

The Department of Education is expanding Registered Apprenticeship, including by:

Visit Apprenticeship.gov to start a program, become an apprentice, become an Apprenticeship Ambassador, or learn more about the Apprenticeship Ambassador Initiative and National Apprenticeship Week in November where many Ambassadors will showcase their commitments.

###

See the article here:
FACT SHEET: Biden-Harris Administration Launches the Apprenticeship Ambassador Initiative to Create Equitable, Debt-Free Pathways to High-Paying Jobs...


Digital Transformation Market Trends, Size, Share, Growth, Industry Analysis, Advance Technology and For – Benzinga

"Microsoft (US), SAP (Germany), Baidu (China), Adobe Systems (US), Alibaba (China), IBM (US), Google (US), Marlabs (US), Salesforce (US), Broadcom (CA Technologies) (US), Equinix (US), Oracle (US), Hewlett Packard Enterprise (US), HCL Technologies (India), Tibco Software (US), Alcor Solutions (US), Smartstream (US), Yash Technologies (US), Interfacing (US)."

Digital Transformation Market by Component, Technology (Cloud Computing, Big Data & Analytics, Mobility & Social Media Management, Cybersecurity, AI), Deployment Mode, Organization Size, Business Function, Vertical and Region - Global Forecast to 2027

The global Digital Transformation Market size is expected to grow at a Compound Annual Growth Rate (CAGR) of 21.1% during the forecast period, reaching USD 1,548.9 billion by 2027 from USD 594.5 billion in 2022. Major drivers for the digital transformation market are the scalability of digital efforts, the economic advantages of cloud-based digital transformation solutions, the emergence of ML and AI, changes in the customer intelligence landscape, and the rising adoption of big data and related technologies. The major restraint on the market is issues related to privacy and data security. Critical challenges facing the digital transformation market include concerns related to modernizing IT and a lack of skilled personnel. Underlying opportunities in the digital transformation market include the rise in government initiatives and financial support for adopting digitization and the demand for personalized digital transformation.
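As a quick arithmetic check, the two market sizes quoted above are consistent with the stated 21.1% CAGR over the five-year forecast window. A minimal sketch in Python, using only the figures from the report:

```python
# Sanity-check the report's CAGR claim from its own figures.
start = 594.5   # USD billion, 2022 market size
end = 1548.9    # USD billion, 2027 forecast
years = 5       # forecast window, 2022-2027

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")

# Conversely, compounding the 2022 base at 21.1% for five years
# lands close to the forecast 2027 figure.
projected = start * (1 + 0.211) ** years
print(f"Projected 2027 size: USD {projected:.1f} billion")
```

Both directions agree to within rounding, so the headline numbers are internally consistent.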

Download PDF Brochure:https://www.marketsandmarkets.com/pdfdownloadNew.asp?id=43010479

By vertical, the healthcare, life sciences & pharmaceuticals segment is expected to grow at the highest CAGR during the forecast period

The digital transformation market is segmented by vertical into Healthcare, Life Sciences & Pharmaceuticals, BFSI, Telecommunication, Manufacturing, Retail & Ecommerce, Government & Defense, Media & Entertainment, IT/ITES, Energy and Utilities, and other verticals, such as travel & hospitality, transportation & logistics, and education. The healthcare, life sciences & pharmaceuticals vertical is expected to grow at the highest CAGR during the forecast period. The healthcare sector is progressively embracing digital transformation technologies such as big data & analytics and cloud computing for the management of electronic health records and healthcare information. Additionally, customers in the BFSI vertical require immediate access to their accounts and transaction details. For banks and other financial service providers, the automation of many activities, including lending and compliance management, is another benefit of digital transformation solutions. The use and management of digital transformation solutions in the healthcare, life sciences & pharmaceuticals, and BFSI sectors therefore presents enormous growth prospects for service integrators, research & consulting vendors, and hardware integration service providers.

Cloud segment to grow at the highest CAGR during the forecast period

By deployment mode, the digital transformation market is segmented into cloud and on-premises. On-premises digital transformation solutions have been adopted by various businesses for the tighter control and management of data they afford, while the cloud deployment mode offers scalability and flexibility and tends to be chosen by end users concerned about cost. Adoption of on-premises digital transformation solutions has been greatly affected by the growing uptake of cloud computing. Offering cutting-edge and reliable cloud solutions is a goal for many cloud-based digital transformation service providers. The primary benefits of cloud deployment include easy deployment, low deployment cost, easy accessibility, and upgradeability. During the forecast period, the cloud segment is anticipated to grow at the highest CAGR in the digital transformation market.

Request Sample Pages:https://www.marketsandmarkets.com/requestsampleNew.asp?id=43010479

Some major players in the digital transformation market include Microsoft (US), SAP (Germany), Baidu (China), Adobe Systems (US), Alibaba (China), IBM (US), Google (US), Marlabs (US), Salesforce (US), Broadcom (CA Technologies) (US), Equinix (US), Oracle (US), Hewlett Packard Enterprise (US), HCL Technologies (India), Tibco Software (US), Alcor Solutions (US), Smartstream (US), Yash Technologies (US), Interfacing (US), Kissflow (India), eMudhra (India), ProcessMaker (US), Process Street (US), Happiest Minds (India), Scoro (UK), Dempton Consulting Group (Canada), Brillio (US), and Aexonic Technologies (India). These players have adopted various organic and inorganic growth strategies, such as new product launches, partnerships and collaborations, and mergers and acquisitions, to expand their presence in the global digital transformation market.

Oracle is a world leader in providing a wide range of products, services, and solutions to satisfy the needs of business IT environments, including platforms, applications, and infrastructure. Businesses of all sizes, governments, educational organizations, and resellers are among Oracle's clients. Through a global sales team and the Oracle partner network, the company sells its products and services both directly and indirectly. It focuses on creating, producing, and selling application software, databases, and hardware systems. The company offers SaaS solutions that use cutting-edge technologies such as blockchain, IoT, AI, and ML. It operates in more than 175 countries through three business segments: cloud and license, hardware, and services. It serves more than 430,000 customers across a variety of business verticals and has a global presence across the Americas, Europe, Asia Pacific, and the Middle East & Africa.

Adobe is a multinational software firm with a wide range of products. The firm is divided into three segments: Print & Publishing, Digital Media, and Digital Marketing. Adobe licenses its technology to hardware manufacturers, software developers, and service providers for use in their products and solutions. Web experience, analytics, social media optimization, testing and targeting, and campaign administration are all included in the company's marketing cloud. It uses SaaS, managed service, term subscription, and pay-per-use business models to deliver its products, and it offers six cloud-based marketing solutions. Adobe provides products such as Adobe Sign, Adobe Stock, Experience Manager, and Advertising Cloud. It serves a number of business verticals, including telecommunications, media and entertainment, retail, financial services, and government.

IBM is a multinational technology and advisory firm that provides infrastructure, hosting, and consulting services. The firm is divided into five main business units: systems, global business services, global technology services, cloud and cognitive software, and global financing. It serves a number of industries, including aerospace and defense, government, manufacturing, healthcare, oil & gas, automotive, electronics, insurance, retail & consumer goods, banking & finance, life sciences, telecommunications, media & entertainment, chemicals, and more. With customers in more than 175 nations, IBM has a significant presence in the Americas, Europe, the Middle East & Africa, and Asia Pacific. The company positions its platform as a preferred platform for corporate applications.

Media Contact
Company Name: MarketsandMarkets Research Private Ltd.
Contact Person: Mr. Aashish Mehra
Email: Send Email
Phone: 18886006441
Address: 630 Dundee Road Suite 430
City: Northbrook
State: IL 60062
Country: United States
Website: https://www.marketsandmarkets.com/Market-Reports/digital-transformation-market-43010479.html

Press Release Distributed by ABNewswire.com. To view the original version on ABNewswire visit: Digital Transformation Market Trends, Size, Share, Growth, Industry Analysis, Advance Technology and Forecast 2027



Amazon opens new office in Johannesburg – MyBroadband

Amazon Web Services (AWS) has opened a new office in Johannesburg, saying the move supports growing customer demand.

"The new office will support South Africa's burgeoning cloud market, and provide a range of services to organisations of all sizes, including startups, enterprises, and public sector agencies," Amazon said in a statement on Thursday.

The new office continues Amazon's growing investment in South Africa.

AWS was initially developed in Cape Town by South African Internet pioneer Chris Pinkham, who proposed the elastic compute cloud in an internal Amazon paper in late 2003.

EC2 would become AWS's first product.

Pinkham is well-known in South African tech circles for founding the country's first commercial ISP in 1993, The Internetworking Company of Southern Africa (Ticsa).

UUNET bought Ticsa in 1996, and Pinkham took a break from the tech scene, including sailing around the world.

In 2000, he joined Amazon to run its network engineering department.

However, in 2003 he wanted to move back to Cape Town from Seattle.

Not wanting to lose Pinkham, former Amazon CEO Jeff Bezos asked if he would look into developing EC2.

Pinkham said by late 2004 or early 2005 they had their first engineers on board in the Constantia area in Cape Town.

Thus began Amazon's presence in South Africa. Amazon EC2 was officially launched in August 2006.

This is AWS's second office in Johannesburg. The company launched its first local office on 13 August 2015.

"Johannesburg offers an incredible talent pool of highly skilled and creative people," said AWS country manager Chris Erasmus.

"It is home to many notable South African enterprises leading the way in digital innovation, as well as fast-growing startups."

Erasmus said they had seen increased adoption of AWS technology in the country, fuelling the need to service Amazon's customers from their centre of operations.

"We look forward to fostering the country's pioneering spirit alongside our customers by helping them accelerate their digital transformation and deliver innovative new products and services to the South African economic landscape."



JCGC Longshots: Chess Chief ‘A Real Warrior,’ Tax Will ‘Break Out Of That One Hole Running’ – Horse Racing News – Paulick Report

by NYRA Press Office | 09.02.2022 | 4:06pm

Chess Chief (outside) edges Owendale in the New Orleans Classic

The Estate of James J. Coleman Jr.'s graded stakes winner Chess Chief looks to light up the tote board in Saturday's Grade 1, $1 million Jockey Club Gold Cup, a Win and You're In qualifier for the Grade 1 Breeders' Cup Classic in November at Keeneland.

Contested at the Classic 10-furlong distance on Saratoga Race Course's main track, Chess Chief will break from post 4 under jockey Manny Franco.

The 6-year-old son of Into Mischief was fifth in this event last year and most recently was third in the Alydar August 4 at the Spa. He closed out his 2021 campaign with a win in the Tenacious at Fair Grounds Race Course, where he has posted all five of his wins through 34 career starts and purse earnings of $894,369.

The hard-knocking Virginia-bred made the grade in March 2021 in the Grade 2 New Orleans Classic and finished sixth in his title defense this spring.

Despite Chess Chief being made the longest shot at 30-1 on the morning line, trainer Dallas Stewart said he is expecting his horse to run a big race.

"He's a real warrior. He ran in this race last year but was coming off a bad grass race," said Stewart. "This year he's had a race over the track and a couple of real good works over the main track. He galloped as fresh as a 2-year-old this morning."

Chess Chief's most recent work was at five-eighths on August 28 over the Spa main track, covering the ground in 1:01.25.

Stewart is no stranger to Grade 1 success at the Spa, including the 2017 Personal Ensign with Forever Unbridled and the 2015 Ballerina with her full-sister Unbridled Forever.

"We won a couple Grade 1s here before," said Stewart. "We only bring a small amount of horses, but it's just like any other race. You've got to get in there and fight it out."

It has been a long road back to top company for 2019 Grade 2 Jim Dandy winner Tax, who returned to the races from a 19-month layoff in July to score a wire-to-wire victory in the Battery Park at Delaware Park. The 6-year-old son of Arch will now try for a Grade 1 victory in Saturday's $1 million Jockey Club Gold Cup going 10 furlongs for 3-year-olds and up at Saratoga Race Course.

Trained and co-owned by Danny Gargan with R.A. Hill Stable, Tax last faced Grade 1 company in the Pegasus World Cup Invitational in January 2021 at Gulfstream Park, his last race before an injury that forced his lengthy respite.

"Off that big a layoff, it was so rewarding," Gargan said of the Battery Park effort. "And they made him the favorite when he hadn't run in 532 days. He's a really cool horse."

Tax made a quick ascent to the graded ranks as a juvenile, graduating at second asking in a maiden claiming race at Keeneland where he was haltered by Gargan for $50,000. He followed with a third in the Grade 2 Remsen before making the grade in his sophomore debut with an off-the-pace score in the Grade 3 Withers, both at Aqueduct Racetrack. He punched his ticket to the Grade 1 Kentucky Derby with a runner-up finish to Tacitus in the Grade 2 Wood Memorial that April and subsequently finished 14th in the Run for the Roses.

Tax went on to have a prosperous second half of his 3-year-old season that included a close fourth in the Grade 1 Belmont Stakes, his determined three-quarter-length score over Tacitus in the Jim Dandy and a runner-up finish to Performer in the Grade 3 Discovery at the Big A to close out the year. As a 4-year-old, he competed in his first Pegasus World Cup and won the Grade 3 Harlan's Holiday at Gulfstream Park.

Gargan said he is excited to have Tax back in Grade 1 company.

"We didn't know it would take this long, but it's pretty cool," Gargan said. "It's a big step forward and he's going to have to run a big step forward, so it will be fun to see if he still has that desire to be in that kind of caliber. If he does, we'll keep doing it and if not, we'll go back and figure it out. I would love to see him hit the board."

Tax will run 10 furlongs for the first time since an even fifth-place finish in the 2019 Travers at the Spa. Gargan said the dark bay gelding, who is 3-for-3 going 1 1/16 miles, may be at his distance limits in the Jockey Club Gold Cup, but that his class will carry him when he exits the inside post under Kendrick Carmouche.

"He'll be on the lead and hopefully he runs big. We'll break out of that one hole running," said Gargan. "The only thing that worries me is that the mile and a quarter might not be his best distance. It's funny to say this, but he's undefeated at a mile and a sixteenth. But he ran good in the Belmont. There were a couple different options here, but we waited on this race and we'll see how it goes. He likes this track and he's a happy horse."



Clyde Jared Torea to play in Malaysia Rapid Age-Group Chess Championship 2022 – PhilBoxing.com


By Marlon Bernardino, PhilBoxing.com, Fri, 02 Sep 2022

MANILA---Clyde Jared Torea, a 7-year-old Grade 1 student of Pulo Elementary School in Cabuyao City, Laguna, will head to Malaysia with hopes of achieving an Elo rating, aside from bringing honor to the country.

Clyde Jared will see action in the Open Under-8 division of the 11th Dato' Ng Chee Cheong Open Rapid Age-Group Chess Championship 2022, to be held on September 4, 2022 at the Cititel Midvalley, Midvalley Megamall in Kuala Lumpur, Malaysia.

His older sister, Zafirah Jahly, will also play in the Under-10 section.

Clyde Jared, Cabuyao City's latest discovery, will also represent Laguna in the Batang Pinoy in Vigan City, Ilocos Sur this December.

Other notable players of Cabuyao City are AGM Dr. Fred Paez, Vince Angelo Medina, Tyrone de los Santos, Alfred Rapanot, Apollo Agapay, Michael Angelo Palma, Jeremy Marticio, Jersey Marticio and Woman Candidate Master AIM Alexandra Sydney Paez.

Major supporters were Gov. Ramil Hernandez, Congresswoman Ruth Hernandez, Mayor Dennis Hain, and Ferly Orozco-Bolaños. Other supporters include Roly Dela Cruz, Virgie Herrera, Andy Banzuela, Sandie Javier, Marlon Gapaz, Kathleen Caparas, JP Ocampo, Gemma Delos Reyes and Marlon Delos Santos. -Marlon Bernardino-


