
SVG PLAY: How On-Prem, the Cloud, and Growing File Sizes Are Impacting Storage and Archive Planning for Sports-Media Companies – Sports Video Group

A panel of leaders from the NBA, MLB, NHL, Imagen, and DataCore discusses striking a balance

For the first time since prior to the COVID-19 outbreak, the SVG Sports Content Management Forum returned to New York City. Hosted at The Westin New York at Times Square on July 27, the event brought together technology leaders from various networks and vendors for a day of networking, information exchange, and idea sharing around topics like storage, archiving, metadata, and more.

Sports leagues, broadcasters, and content producers of all sizes are challenged with The Big Archive Question: Where do I store my stuff? And, with more content being created every single year and file sizes continuing to grow with the arrival of 4K and HDR, this question has never been bigger.

In this panel discussion from the 2022 SVG Sports Content Management Forum, technology leaders and content owners discuss finding a happy medium between on-premises and the cloud, file-format and codec considerations, retention policies and the question of keeping/discarding physical tapes, and much more.

Panelists:
Tom Blake, Imagen, Commercial Director
Alex Grossman, DataCore, VP, Product Management and Product Marketing
Chris Halton, NBA, SVP, Media Technology and Operations*
Adam Japhet, Major League Baseball, Senior Director of Corporate Infrastructure
Grant Nodine, NHL, SVP, Technology

Moderator:
Tab Butler, SVG Sports Content Management Committee, Former Chairman

*Since the recording of this panel, Chris Halton has announced that he has stepped down from his position at the NBA.

SVG PLAY is your new home for all Sports Video Group live-event and long-form video content. As an SVG member or sponsor, you receive simple access to all SVG event panels, case studies, keynotes, and more, all in one place. To visit SVG PLAY, CLICK HERE.

See the article here:
SVG PLAY: How On-Prem, the Cloud, and Growing File Sizes Are Impacting Storage and Archive Planning for Sports-Media Companies - Sports Video Group

These Three Coins Will Make Your Crypto Winter Very Mild: Pugglit Inu, Ethereum, and Filecoin – NewsBTC

Cryptocurrency markets continue to offer numerous opportunities for long-term profitability when investment trends are positive. Pointing to a productive period, coin investors expect altcoins to appreciate strongly during this stretch.

Alongside established altcoins, projects that are still in development are also attracting attention. Pugglit Inu (PUGT) is presented as one of the thriving ecosystems being developed during this period.

Pugglit Inu (PUGT) is described as a meme-coin project. Developed on the Binance Smart Chain, the ecosystem provides low-cost transfers using multi-chain and cross-chain technologies. Transaction speeds in the ecosystem are likewise said to be kept as high as possible.

Backed by a comprehensive marketing program, Pugglit Inu (PUGT) plans to reach large audiences by actively using social media platforms. The ecosystem, which is said to be continuously expanding with long-term profitability in mind, has already attracted intense interest from traders.

Pugglit Inu (PUGT) also offers a chance to earn passive income and participate in decentralized governance through its staking program, which is said to offer potentially high returns for users.

Ethereum (ETH) is the primary native asset of the Ethereum platform. The token was created to incentivize the people who run the Ethereum protocol on their computers; they are rewarded with this digital asset, which keeps the network secure.

Like Bitcoin, Ethereum (ETH) is produced through mining. Ethereum miners use computers or specialized hardware for this process, typically GPUs (video cards) or ASIC chips.

Mining is a system in which users are rewarded for solving cryptographic puzzles with various devices, drawing on the computational power of the hardware mentioned above. For this reason, Ethereum (ETH) is a good choice both as an investment tool and for users who want to mine.
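As a rough illustration of the puzzle-solving described above, the sketch below implements a toy proof-of-work loop in Python. It is not Ethereum's real mining algorithm (Ethereum mining used Ethash, not plain SHA-256), and the block payload and difficulty target are made up for the example.

```python
import hashlib

def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    """Toy proof-of-work: find a nonce whose SHA-256 hash starts with
    `difficulty` zero characters. Illustrative only; real Ethereum mining
    used the Ethash algorithm, not plain SHA-256."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example block payload", difficulty=4)
print(f"nonce={nonce} hash={digest}")
```

The harder the difficulty, the more hashes must be tried before a valid nonce is found, which is why miners lean on the GPU and ASIC computational power mentioned above.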

Filecoin (FIL) is a decentralized storage network. It turns cloud storage into an algorithmic market. The market runs on a blockchain with a native protocol token (called FIL) and lets participants earn rewards for supplying storage to clients.

Filecoin (FIL) was created by Protocol Labs, which also built libp2p and IPFS. The Filecoin (FIL) network achieves its purpose by allowing clients to pay providers for storing or retrieving data. In this way, it provides an economic incentive for miners (anyone with spare storage space) to join the network and contribute their resources.

The result is a storage infrastructure that is more decentralized, secure, and censorship-resistant than what traditional cloud providers such as Amazon S3 or Azure Storage offer.
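To make the pay-for-storage mechanism concrete, here is a toy model of a storage deal in Python. The class, field names, and pricing are hypothetical illustrations of the economics described above; they are not Filecoin's actual deal structures or APIs.

```python
from dataclasses import dataclass

@dataclass
class StorageDeal:
    """Hypothetical toy model of a client paying a provider to store data.
    Field names and pricing are illustrative, not Filecoin's real deal fields."""
    client: str
    provider: str
    size_gib: float
    price_fil_per_gib_epoch: float
    epochs: int

    def total_cost_fil(self) -> float:
        # The client pays the provider for every GiB stored, for every epoch of the deal.
        return self.size_gib * self.price_fil_per_gib_epoch * self.epochs

deal = StorageDeal(client="alice", provider="provider-7", size_gib=32,
                   price_fil_per_gib_epoch=1e-7, epochs=1_000_000)
print(f"Total deal cost: {deal.total_cost_fil():.2f} FIL")
```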

The bear market has taken its toll on the prices of many cryptocurrencies, but there are still a few gems to be found. In this article, we've highlighted three coins that we believe will outperform the market in the long run: Pugglit Inu (PUGT), Ethereum (ETH), and Filecoin (FIL).

While no trade is without risk, we believe these coins could offer good potential for returns despite the current market conditions.

Disclaimer: This is a paid release. The statements, views and opinions expressed in this column are solely those of the content provider and do not necessarily represent those of NewsBTC. NewsBTC does not guarantee the accuracy or timeliness of information available in such content. Do your research and invest at your own risk.

More here:
These Three Coins Will Make Your Crypto Winter Very Mild: Pugglit Inu, Ethereum, and Filecoin - NewsBTC

Does mission critical data mean taking things slow? Nah, let's take it to the Max – Blocks and Files

Sponsored Feature: We're used to hearing how data is the secret to unlocking your organization's potential, if only you're brave enough to let it flow freely.

The problem is that experienced tech leaders and data specialists in large organizations are likely to be far less cavalier about letting financial, credit or medical data just flow. As Dell engineering technologist Scott Delandy explains, "They're relatively risk averse, just by the nature of the types of applications that they run."

The mission-critical data systems underpinning these applications have been built up over years, decades even, with a sharp focus on quality, stability and reliability, he says, because "You always expect your credit card to work. You always expect the thing that you bought to ship and usually get there on time. And you expect to walk out of the hospital."

At the same time, these organizations know that they do need to run new workloads, pouring their data into AI for example, or building out new, more agile but still critical applications on containers and microservices rather than traditional monoliths.

"Now, as they're being asked to support next-generation workloads, they don't want to have to rebuild everything from scratch," says Delandy. "They want to take the existing infrastructure, [or just] the operational models that they have in place, and they want to be able to extend those to these new workloads."

Containerized OS

These were the challenges Dell had in mind when it began to plan the refresh of its PowerMax line, the latest in a series of flagship storage systems which is at the heart of tech infrastructure in the vast majority of Fortune 500 companies. The most recent update introduces two new appliances, the 2500 and the 8500, which feature new Intel Xeon Scalable CPUs, NVMe dynamic fabric technology, and 100Gb Infiniband support, as well as a new storage operating system, PowerMaxOS 10.

Given the focus on containerized applications, it shouldn't be a surprise that the new OS is containerized itself, making it easier for Dell to develop and launch new features, and to share them across both PowerMax and the vendor's other storage platforms.

"Because of the way the microcode, the software, the operating environment has been developed, that gives us the ability to cross-pollinate data services between the different platforms," says Delandy. One example of this cross-pollination is a new set of file capabilities, which first appeared on the PowerStore platform and is now also available on PowerMax.

But users still face the challenge of how to straddle the traditional VM world and modern applications built around containers (an architecture that was never designed with persistent storage in mind) and access the same automated, low-touch, invisible type of workflow.

"And that's why a lot of the work that we've been doing [is] around integrating to things like CSI (Container Storage Interface)," he explains, "by putting a level of automation between the CSI API and the automation that we have on the infrastructure."

This is based on Dell's Container Storage Modules technology, which is "the connection between the CSI APIs and the infrastructure APIs, and it allows you to do all of those higher-level things around replication, visibility, provisioning, reporting."

This is a key example of "taking the things that people have already built for and have had in place for decades and saying, Okay, I'm just gonna go ahead and plug these new workloads in."

It also allows for active-active Metro replication for both VM and containerized applications, even though CSI block data is managed very differently from VM block data, Delandy explains. "We've been doing that in the VM world for, like, 100 years, right?"

Radical density

The software improvements combine with the hardware improvements to enable what Delandy describes as "radical density," offering more effective capacity with less physical storage, to the tune of 4PB of effective capacity in a 5U enclosure, or 800TB of effective storage per rack unit.

One significant contributor to this is the ability to support higher density flash, while also supporting very granular capacity upgrades.

Key to this is Flexible RAID, which allows single drives to be added to a pre-existing RAID group. This means, "When we put an initial configuration onto a user's floor, we can use very dense flash technology, because we know if we start off with 15-terabyte drives, we'll see 30-terabyte drives in the next release. We know that when the customer needs to upgrade, we can just add another 15 terabytes versus having to add 150 terabytes."

Further flexibility comes courtesy of Dell's Advanced Dynamic Media Enclosure technology, which decouples the compute and storage elements of the PowerMax appliance. This allows more options on the balance of compute nodes versus storage capacity, as well as scalability. It also heads off the dilemma users face when an array starts topping out on performance but, because they have no way to upgrade the controllers, they are forced to add another entire array.

But even with the improvements that have led to this radical density, the remorseless growth of data means that admins still have to consider just how much they want to keep on their premium storage platforms. PowerMax has had multi-cloud capabilities since its previous generation of appliances, with "the ability to be able to connect in and copy and move data between primary block stores on site to an S3 object store. That can either be a private object store, like in the Dell world, it could be a PowerScale, or it could be an ECS platform. Or it could be a cloud provider."

The latest generation, Delandy continues, brings higher performance, resiliency, and high availability. But also, he says, a better understanding of the use cases. For example, analysis of users' arrays suggests up to a quarter of capacity is taken up with snapshot data. There are perfectly good reasons why companies want to keep snapshots, but it also makes perfectly good sense to move them off the primary storage and into the cloud, for example.

"Now you can run more transactional databases, more VMs, more Oracle, more SQL, more applications that need the throughput and the processing power of the array, versus just holding all this stale, static data."

It's big, but is it secure?

Whether the data is transactional or static, security is "the number one thing" that users want to talk about these days, Delandy says. Often the conversation is simply a question of highlighting to users the pre-existing features in the system: "It's really helping them understand what things they can do and what types of protection they already have, and how to enable what are the best practices around the security settings for that."

But the biggest concern customers have is "somebody getting into the environment and not finding out fast enough that you've been breached."

Two new features are crucial here. One is the inclusion of hardware root of trust, which sees cryptographic keys fused onto the controller chips. Everything then has to be authenticated against these, from boot ups to upgrades and driver updates. This significantly reduces the risk of a bad actor obtaining backdoor access to the systems.

In addition, PowerMax now uses anomaly detection to monitor the storage and detect changes to the types of datasets being written, including any failure to meet the 4:1 data reduction rates the system's updated reduction algorithms can deliver. "One of the things that we look at is what's the reducible versus non-reducible rate, and how does that change. We have it set to be so sensitive that if we start to see changes in reducibility rates, that can indicate that something is being encrypted," explains Delandy.
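The sketch below is not Dell's implementation; it is a minimal Python illustration of the general idea, with an assumed 4:1 baseline and a made-up tolerance. Encrypted data barely compresses, so a sharp drop in the achieved reduction ratio is worth an alert.

```python
# Illustrative only, not Dell PowerMax code. It shows the general idea that a
# sudden drop in data reducibility can signal that incoming writes are encrypted.
def check_reducibility(ratios: list[float], baseline: float = 4.0,
                       tolerance: float = 0.25) -> list[int]:
    """Return indices of samples whose reduction ratio falls more than
    `tolerance` (as a fraction) below the expected baseline (e.g. 4:1)."""
    floor = baseline * (1 - tolerance)
    return [i for i, ratio in enumerate(ratios) if ratio < floor]

recent_ratios = [4.1, 4.0, 3.9, 2.1, 1.3]   # the last samples look incompressible
alerts = check_reducibility(recent_ratios)
if alerts:
    print(f"Anomaly at samples {alerts}: reducibility dropped; possible encryption in progress")
```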

It's a huge advantage for customers if they can get an indication of ransomware being at work within minutes, because encryption due to ransomware typically takes days, weeks, or even months. The ability to introduce automation, and to balance both the long- and short-term view, is crucial to the whole PowerMax ethos. Dell has sought to take what was already a highly reliable platform and make it simultaneously a highly flexible platform on which it can deliver rapid innovation.

But, as Delandy says, Dell has to ensure it is taking a deliberate, targeted approach to actually solving customer problems. Or, put another way, "We're not just doing it because it's cool."

Sponsored by Dell.

Read the original here:
Does mission critical data mean taking things slow? Nah, let's take it to the Max - Blocks and Files

Halıcıoğlu Data Science Institute (HDSI) and UCTV Have Teamed Up to Launch the Data Science Channel – University of California San Diego

The Halıcıoğlu Data Science Institute (HDSI) and University of California Television (UCTV) have teamed up to launch the Data Science Channel

The UC San Diego Halıcıoğlu Data Science Institute (HDSI) and University of California Television (UCTV) have established a partnership to create the Data Science Channel to serve the data science community.

A collaboration between HDSI and UCTV, the Data Science Channel will serve as a central platform to promote the people and programs that are affiliated with HDSI to highlight the diversity of their research, education, and outreach programs. A place for the university and general community, the channel will raise awareness of the growing role and demand for data science as a field and career pathway. In addition, the channel will cover unique and leading-edge perspectives of the nearly endless domains to which data science can be applied.

The Data Science Channel will provide viewers with a range of programming, including public lectures, faculty profiles, short documentaries, video lessons, podcasts, and acquired content to support the following areas of focus:

Public outreach (external, general community)

Building a data science community (external, specific data science community)

Building UCSD data science community (internal, within UCSD community)

The goal is to engage the community, give a face to the efforts of faculty, create a visual connection for potential collaborators, and highlight recent research. The short documentaries will demonstrate the immediate and future impacts of HDSI's ongoing work. The video lessons will support all levels of learners engaged in data science. Podcasts will provide support and insightful conversation on the go. The acquired content from partners helps create a pipeline for those working in the same sphere to encourage collaboration.

"We are pleased to partner with UCTV to offer this unique programming from our world-class data science faculty," said Rajesh K. Gupta, Director of HDSI.

This partnership with UCTV will offer many benefits and prove a valuable resource for promoting all things data science. Saura Naderi, the Outreach Coordinator at HDSI, commented, "We are hoping the Data Science Channel can serve as a resource to our data science peers, in regard to highlighting cutting-edge research, but also to be a platform for creating visibility into the field for those just starting their exploration."

The community is invited to explore this exciting new channel: https://uctv.tv/data-science/

About the Partners

UCTV is committed to creating fact-based educational and informational video programs that provide viewers access to the university's scientific, intellectual, civic, and artistic resources, adding value and documenting the endless developments this field offers.

The reach of UCTV spans over 800,000 homes in San Diego County via cable and more than 4 million homes throughout the state, including San Diego, LA, Bay Area, Inland Empire, and Santa Barbara. Its YouTube presence is also strong, with over 1.1 million subscribers.

The mission of the Halıcıoğlu Data Science Institute (HDSI) is to establish the scientific foundations of data science, develop new methods and infrastructure, and equip students and industry partners to use data science to solve the world's most pressing problems.

Founded in 2018, HDSI is a fully independent academic unit that works collaboratively with schools, divisions, departments, centers, students, and faculty across multiple disciplines throughout the entire campus. It is the administrative home for the undergraduate B.S. major and minor in Data Science at UC San Diego, as well as three graduate programs: the online Master of Data Science, the MS in Data Science, and the Ph.D. in Data Science.

##

See the original post:

Halıcıoğlu Data Science Institute (HDSI) and UCTV Have Teamed Up to Launch the Data Science Channel - University of California San Diego

Data Science Certifications: 3 Key Things to Know – Dice Insights

Do you need certifications for a successful career in data science? That's a big and complicated question. Some organizations solely recruit data scientists with certifications; others are happy to hire anyone who can show mastery of data science principles and tools.

If nothing else, a certification can signal you are up to date with current data science skills. Although data science is very much an in-demand role (and supposedly "the sexiest job of the 21st century"), employers still want assurance you can analyze massive datasets for key insights. For example, having a certification from the Data Science Council of America (DASCA) indicates that you already have 3-5 years of practical and professional experience in data science.

"What really helps is understanding the pros and cons of each data science cert, from price and prerequisites to niche tool coverage and applicable data examples," says Troy Kranendonk, senior curriculum manager for Pluralsight Skills Content Development.

"Generally, each [certification] provides some exposure to data problems that need solving," Kranendonk adds. "I would argue that having a conceptual understanding of the material found in a cert is much different than getting your hands dirty with the data to solve an actual business problem and then clearly communicating the solution to stakeholders."

In other words, having a data science certification looks good in your application materials, but you should also demonstrate that you can apply your knowledge in practical ways to data science challenges. While some certifications cover broad data science principles, data scientists with some experience may gravitate toward a certification in a specialization such as machine learning.

Three common (and popular) data science certifications include:

Choosing the right data science certification can depend on whether you already have years of work experience under your belt or not. "Some certs are entry-level, while others contain concrete prerequisites with the understanding that you already have years of experience with things like statistical analysis," Kranendonk points out.

Kranendonk says tooling, skill, and industry-specific certifications with hands-on validation will be the most valuable in the near future. "The Titanic and Iris datasets commonly used for data science practice will only get you so far," he says. "Most organizations are slowly understanding that people need data skills coupled with the actual industry-specific data used in the real world."

This means having a data science certification with a focus on a particular discipline or field, whether that's healthcare, finance, marketing, or some other industry. It's also a good idea to continuously take inventory of your skills and knowledge of topics and tooling.

"From there you can shrink your gaps and train your weaknesses with certs that align with your personal analysis," Kranendonk says. "This will allow you to quickly determine where you're starting from: ground zero, a bootcamp, a degree in data analytics, or a master's in data science."

For those just starting out in the wonderful world of data, you can opt to pursue well-rounded certifications such as those mentioned above, but keep in mind that some employers may want you to have a specialized certification for particular languages or tools; for example, a Python certification.

If you want a certification that shows your skills with a range of tools, consider the IBM Data Science Professional Certification:

This certification consists of nine courses ranging from open source tools and data science methodology to data visualization and machine learning.

"Stay hungry and don't stop learning and practicing," Kranendonk advises. "In today's world, staying up to date with data skills is just as valuable as learning them for the first time, given how fast technology changes and how quickly tech skills become outdated."

See the original post:

Data Science Certifications: 3 Key Things to Know - Dice Insights

Analytics and Data Science News for the Week of August 26; Updates from Incorta, SAS Software, Sisu, and More – Solutions Review

The editors at Solutions Review have curated this list of the most noteworthy analytics and data science news items for the week of August 26, 2022.

Keeping tabs on all the most relevant analytics and data science news can be a time-consuming task. As a result, our editorial team aims to provide a summary of the top headlines from the last month in this space. Solutions Review editors will curate vendor product news, mergers and acquisitions, venture capital funding, talent acquisition, and other noteworthy analytics and data science news items.

SAS Viya with SingleStore enables the use of SAS analytics and AI technology on data stored in SingleStores cloud-native real-time database. The integration provides flexible, open access to curated data to help accelerate value for cloud, hybrid and on-premises deployments. Through SingleStores data compression and SAS analytic performance, the companies aim to reduce the complexity of data management and integration, as well as the computational time required to train sophisticated models.

Read on for more.

The newly released data applications further extend the reach and agility of Incorta, and the new Incorta Marketplace opens the door to customizations and packaged solutions from partners and the community. General availability of the Incorta component SDK (software development kit), along with an expanding list of data connectors and data destinations, bolsters the platform's openness and extensibility.

Read on for more.

Sisu automatically diagnoses key drivers of metric change, automates trend and anomaly detection, predicts changes before they occur, and connects to third-party systems to help organizations make the right decisions and drive better business outcomes. Sisu customers can now predict the future impact of metric change to confidently plan actions based on statistically relevant results and make better decisions.

Read on for more.

For consideration in future analytics and data science news roundups, send your announcements to the editor: tking@solutionsreview.com.

Tim is Solutions Review's Editorial Director and leads coverage on big data, business intelligence, and data analytics. A 2017 and 2018 Most Influential Business Journalist and 2021 "Who's Who" in data management and data integration, Tim is a recognized influencer and thought leader in enterprise business software. Reach him via tking at solutionsreview dot com.

The rest is here:

Analytics and Data Science News for the Week of August 26; Updates from Incorta, SAS Software, Sisu, and More - Solutions Review

Quick Study: How to Find the Right Data – InformationWeek

There's a hoarder mentality lingering in the data collection and management business. No, they aren't clinging to 30-year-old magazines or a roomful of plastic shopping bags. It's a strategy under which some enterprises insist on collecting every byte of data possible, just in case we need it someday.

The hoarder mentality isn't as prevalent as it was in the early days of big data, but some data professionals and marketers just can't let go of their virtual shopping bags.

There's a delicate balancing act that enterprises face in this era of advanced analytics and AI. Not enough data can leave a company eating competitors' dust. Too much data can be hard to manage and can even make the company legally liable under myriad data privacy regulations. Good decisions, trustworthy AI apps, and efficient data management call for enterprises to collect, collate, and use the right data for them.

In this Quick Study you'll see some of InformationWeek's articles on data quality, data management, and how the right data can drive company success.

Data Quality: How to Show the ROI for Projects

Data quality is critical to enterprise success, but it also can be hard to quantify. Here are some key steps that you can take to measure and communicate the tangible return on investment for your data quality initiatives.

AI and Machine Learning Need Quality Assurance

Artificial intelligence and machine learning are not set and forget technologies. They need quality assurance to operate, and continue to operate, as intended.

How to Elevate Your Organization's Use of Data Analytics

Here are three best practices for leveling up your organization's use of analytics and attaining ROI with an enterprise analytics program. Think tools and company culture.

Seeking an Oasis in a Data Desert

Gaps in data quality, particularly due to supply chain issues during the pandemic, are becoming a serious influence on planning effective machine learning models.

The Cost of AI Bias: Lower Revenue, Lost Customers

A survey shows tech leadership's growing concern about AI bias and AI ethics, as negative events impact revenue, drive customer losses, and more.

3 Ways Data Problems Can Stop Your Business

In today's data-rich world, it's important to focus not only on how data can be used to benefit the business, but also on where a flawed data strategy raises hurdles for the organization.

9 Ways to Reduce the Environmental Impact of Data

IT leaders can reduce the environmental impact of their data by considering a set of data sustainability principles, according to a Gartner analyst.

Priorities of Highly Successful Chief Data Officers

Is your data organization focused on the right areas? A survey of chief data officers looks at how these data executives can enable success in their organizations by the projects they choose to prioritize.

Chief Data Officers Help Steer Digital Transformations

Chief Data Officers are prioritizing data quality, ROI from data and analytics investments, and data sharing.

From AI to Teamwork: 7 Key Skills for Data Scientists

Today's data scientists need more than proficiency in AI and Python. Organizations are looking for specialists who also feel at home in the C-suite.

Why to Create a More Data-Conscious Company Culture

The drive to greater transparency in data requires efforts beyond breaking down data silos. Here's how and why to focus on cultivating a more data-literate workforce.

Data Science: A Guide to Careers and Team Building

Here's a collection of curated articles to help IT professionals learn how to make a career out of data science or how to build a team of data scientists.

CIOs Take Center Stage on ESG Strategies, Battling an Overflow of Data

With environmental, social and governance strategies forming a core part of organizational business plans, CIOs need to tap into various areas of expertise to ensure ESG efforts are organized and integrated enterprisewide.

Creating a Data Literate Culture in Your Organization

Everyone in the organization needs to understand how to access data, keep it secure and think critically about its potential use cases and applications.

An Insider's Look at Intuit's AI and Data Science Operation

Intuit's Director of Data Science speaks with InformationWeek about how the company's data operations have grown and evolved from just a few data scientists trying to sell executives on the value of data projects to becoming an AI-driven platform company.

Deployment Risk: AutoML & Machine Learning Without Expertise

Learn exactly what AutoML is, the value data scientists bring, and best practices on how to use AutoML to kickstart projects within your business.

IBM Databand.ai Acquisition Targets Data Observability

IBM says the deal to acquire the data observability startup will further Big Blue's own mission to provide observability for business.

GDPR Anniversary: Farewell to Global Data Lakes

Here's where we're at with the regulation and the data challenges organizations face today. While individuals are concerned about privacy, organizations struggle to balance data privacy with the need to leverage AI, machine learning, and analytics to compete.

Data Fabrics: Six Top Use Cases

A data fabric management architecture optimizes access to distributed data while intelligently curating and orchestrating it for self-service delivery. Here's a look at some ways a data fabric may be able to help your organization.

AI Set to Disrupt Traditional Data Management Practices

The growth of advanced analytics such as machine learning and artificial intelligence is set to drive a disruption in traditional data management operations, according to Gartner.

Beyond the Data Warehouse: What Are the Repository Options Today?

With the rise of unstructured big data, a new wave of data repositories has come into use that don't always involve a data warehouse.

Here is the original post:

Quick Study: How to Find the Right Data - InformationWeek

Centre asks Offshore IITs to Offer Courses in Data Science and AI – Analytics India Magazine

The Ministry of Education has proposed that offshore IIT campuses offer undergraduate degree programmes in areas such as data science and artificial intelligence. The proposal, now under consideration by the Centre, is based on feedback from Indian embassies abroad.

A survey report shows most of the universities in the target nations have undergraduate programmes in conventional disciplines. "From the feedback shared by the ambassadors of the identified host nations, the most frequently mentioned disciplines are related to computer science or IT, data sciences, AI, machine learning or robotics, electrical, electronics, mining, metallurgy, petroleum and energy," the report read.

Further recommendations were made on the various modes of admission, including JEE, GATE, SAT, GRE, and JAM. "A JEE or JEE (Advanced) exclusively for offshore campuses can be conceived in the future if it is economically and logistically viable," said the report.

The committee noted that academic programmes such as the Bachelor of Technology (BTech) and Master of Technology (MTech) would be named Bachelor of Science (BS) and Master of Science (MS), designations commonly used for international degrees.

The 17-member committee is led by IIT Council Standing Committee chairperson Dr K Radhakrishnan and also includes the directors of IIT Bombay, IIT Kharagpur, IIT Madras, IIT Delhi, IIT Kanpur, IIT Guwahati and IIT Dhanbad.

Flexible joint faculty contracts were proposed, with provisions for deputing faculty members from the existing IITs to the proposed institutes abroad in their formative years. The committee set up by the Centre for the global expansion of IITs, drawing on input from Indian missions abroad, identified the UK, UAE, Egypt, Saudi Arabia, Qatar, Malaysia and Thailand as prospective locations for offshore campuses under the IIT brand name.

Continued here:

Centre asks Offshore IITs to Offer Courses in Data Science and AI - Analytics India Magazine

Where DataOps and Opportunities Converge – DevOps.com

In today's data age, getting data analytics right is more essential than ever. A robust data analytics implementation enables businesses to hit key performance metrics, build data- and AI-driven customer experiences (think "personalize my feed") and catch operational issues before they spiral out of control. The list of competitive advantages goes on, but the bottom line is that many organizations successfully compete based on how effectively their data-driven insights inform their decision-making.

Unfortunately, implementing an effective data analytics platform is challenging due to orchestration (DAG alert!), modeling (more DAGs!), cost control ("Who left this instance running all weekend?!") and fast-moving data landscapes (data mesh, data fabric, data lakehouses, and so on). Enter DataOps. Recognizing modern data challenges, organizations are adopting DataOps to help them handle enterprise-level datasets, improve data quality, build more trust in their data and exercise greater control over their data storage processes.

DataOps is an integrated and agile process-oriented methodology that helps businesses develop and deliver effective analytics deployments. It aims to improve the management of data throughout the organization.

While there are multiple definitions of DataOps, below are common attributes that encompass the concept while going beyond data engineering. Here's how we define it:

We broadly define DataOps as a culmination of processes (e.g., data ingestion), practices (e.g., automation of data processes), frameworks (e.g., enabling technologies like AI) and technologies (e.g., a data pipeline tool) that help organizations to plan, build and manage distributed and complex data architectures. DataOps includes management, communication, integration and development of data analytics solutions, such as dashboards, reports, machine learning models and self-service analytics.

DataOps is attractive because it eliminates the silos between data, software development and DevOps teams. The very promise of DataOps encourages line-of-business stakeholders to coordinate with data analysts, data scientists and data engineers. Via traditional agile and DevOps methodologies, DataOps ensures that data management aligns with business goals. Consider an organization endeavoring to increase the conversion rate of their sales leads. In this example, DataOps can make a difference by creating an infrastructure that provides real-time insights to the marketing team, which can help the team to convert more leads. Additionally, an Agile methodology can be employed for data governance, where you can use iterative development to develop a data warehouse. Lastly, it can help data science teams use continuous integration and continuous delivery (CI/CD) to build environments for the analysis and deployment of models.

The amount of data created today is mind-boggling and will only increase. It is reported that 79 zettabytes of data were generated in 2021, a number estimated to reach 180 zettabytes by 2025. In addition to the increasing volume of data, organizations today need to be able to process it in a wide range of formats (e.g., graphs, tables, images) and with varying frequencies. For example, some reports might be required daily, while others are needed weekly, monthly or on demand. DataOps can handle these different types of data and tackle varying big data challenges. Add in the internet of things (IoT), such as wearable health monitors, connected appliances and smart home security systems, and organizations also have to tackle the complexities of heterogeneous data.

OK, so, how can we make this a reality? First, to manage the incoming data from different sources, DataOps can use data analytics pipelines to consolidate data into a data warehouse or any other storage medium and perform complex data transformations to provide analytics via graphs and charts.
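As a minimal sketch of that consolidation step (the file, column, and table names are hypothetical, and SQLite stands in for a real warehouse), a scripted pipeline might look like this:

```python
import sqlite3
import pandas as pd

# Minimal sketch of the consolidation step described above. File, column, and
# table names are hypothetical; a real pipeline would run under an orchestrator
# (e.g. a DAG scheduler) and load into a proper warehouse instead of SQLite.
def load_sources() -> pd.DataFrame:
    crm = pd.read_csv("crm_leads.csv")    # e.g. lead_id, email, created_at
    web = pd.read_csv("web_events.csv")   # e.g. lead_id, page_views
    return crm.merge(web, on="lead_id", how="left")

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df["page_views"] = df["page_views"].fillna(0).astype(int)
    df["created_at"] = pd.to_datetime(df["created_at"])
    return df

def publish(df: pd.DataFrame, db_path: str = "warehouse.db") -> None:
    with sqlite3.connect(db_path) as conn:
        df.to_sql("leads_enriched", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    publish(transform(load_sources()))
```

The consolidated table can then feed the graphs and charts mentioned above, or the real-time marketing dashboard from the earlier lead-conversion example.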

Second, DataOps can use statistical process control (SPC), a lean manufacturing method, to improve data quality. This includes testing data coming from data pipelines, verifying that it is valid and complete, and checking that it falls within defined statistical limits. This enforces continuous testing of data from sources to users by running tests that monitor inputs and outputs and ensure business logic remains consistent. In case something goes wrong, SPC notifies data teams with automated alerts. This saves them time, as they don't have to manually check data throughout the data life cycle.
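Here is a minimal illustration of the SPC idea. The monitored metric (a pipeline's daily row count) and the conventional 3-sigma control limits are assumptions for the example, not tied to any particular DataOps tool:

```python
import statistics

# Illustrative SPC-style check: flag a new observation that falls outside
# mean +/- 3 standard deviations of recent history, and raise an alert.
def control_limits(history: list[float]) -> tuple[float, float]:
    mean = statistics.mean(history)
    sigma = statistics.stdev(history)
    return mean - 3 * sigma, mean + 3 * sigma

def check_new_value(history: list[float], value: float) -> bool:
    lower, upper = control_limits(history)
    in_control = lower <= value <= upper
    if not in_control:
        print(f"ALERT: {value} outside control limits [{lower:.1f}, {upper:.1f}]")
    return in_control

daily_row_counts = [10_120, 9_980, 10_240, 10_050, 10_110, 9_950, 10_200]
check_new_value(daily_row_counts, 4_300)   # a sudden drop triggers an automated alert
```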

Around 18% of a data engineer's time is spent on troubleshooting. DataOps enables automation to help data professionals save time and focus on more valuable, high-priority tasks.

Consider one of the most common tasks in the data management life cycle: data cleaning. Some data professionals have to manually modify and remove data that is incomplete, duplicated, incorrect or flawed in any number of ways. This process is repetitive and doesn't require any critical thinking. You can automate it either by writing customized scripts or by installing a dedicated data cleaning tool.
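A minimal version of the scripted approach might look like the following; the column names and cleaning rules are hypothetical and would normally come from a team's own data-quality checks:

```python
import pandas as pd

# Minimal cleaning script of the kind described above. Column names and rules
# are hypothetical; in practice they come from the team's data-quality checks.
def clean(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates()                           # remove duplicate records
    df = df.dropna(subset=["customer_id", "amount"])    # drop rows missing required fields
    df = df[df["amount"] >= 0]                          # discard clearly invalid values
    df["email"] = df["email"].str.strip().str.lower()   # normalize formatting
    return df.reset_index(drop=True)

raw = pd.DataFrame({
    "customer_id": [1, 1, 2, None],
    "amount": [19.99, 19.99, -5.00, 42.00],
    "email": [" A@Example.com ", " A@Example.com ", "b@example.com", "c@example.com"],
})
print(clean(raw))
```

Run on a schedule, a script like this turns a repetitive manual chore into an automated step of the pipeline.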

Additional processes throughout the data life cycle can be automated via DataOps as well.

To develop your own DataOps architecture, you need a reliable set of tools that can help you improve your data flows, especially when it comes to crucial aspects of DataOps like data ingestion, data pipelines, data integration and the use of AI in analytics. A number of companies provide DataOps platforms for real-time data integration and streaming that ensure the continuous flow of data with intelligent data pipelines spanning public and private clouds. Looking to increase the likelihood of success of data and analytics initiatives? Take a closer look at DataOps and harness the power of your data.

Read the original post:

Where DataOps and Opportunities Converge - DevOps.com

NASA MUREP PBI/HBCU Data Science Equity, Access and Priority for Research and Education (DEAP) – Space Ref

Engagement Opportunities in NASA STEM 2022 (EONS2022)

Appendix N: MUREP PBI/HBCU Data Science Equity, Access and Priority for Research and Education (DEAP)

NOTICE OF FUNDING OPPORTUNITY (NOFO) RELEASED August 22, 2022

This National Aeronautics and Space Administration (NASA) Notice of Funding Opportunity (NOFO), entitled Engagement Opportunities in NASA STEM (EONS) 2022, solicits proposals for competitive funding opportunities in support of the Minority University Research and Education Project (MUREP), administered by NASA's Office of STEM Engagement (OSTEM). EONS-2022 is an omnibus announcement that includes a wide range of NASA STEM Engagement opportunities for basic and applied science and technology research and education. Specific opportunities will be issued periodically throughout the year as Appendices to this solicitation, with individual requirements and milestones.

The following Appendix to EONS-2022 has been released: Appendix N: MUREP PBI/HBCU Data Science, Equity, Access and Priority for Research and Education (MUREP DEAP)

Full Proposals due at 5:00pm Eastern Time on Monday, October 24, 2022

NASA OSTEM's MUREP program solicits proposals from Predominantly Black Institutions (PBIs) and Historically Black Colleges and Universities (HBCUs) to establish Data Science Institutes (DEAP Institutes) for data-intensive research in science and engineering that can accelerate discovery and innovation in a broad array of NASA Science Mission Directorate research domains. The DEAP Institutes will lead innovation by collaborating closely with NASA mentors to harness diverse data sources and to develop and apply new methodologies, technologies, and infrastructure for data management and analysis research. The DEAP Institutes will support convergence between science and engineering research communities as well as expertise in data science foundations, systems, and applications. In addition, the DEAP Institutes will enable breakthroughs in science and engineering through collaborative, co-designed programs implementing NASA open science principles and architecture to formulate innovative data-intensive approaches to address critical national challenges.

Successful MUREP DEAP proposals will be funded as multi-year cooperative agreements not to exceed three (3) years. Please see the full Appendix N: MUREP DEAP for more details.

For general inquiries, please contact: MUREPDEAP@nasaprs.com.

A pre-proposal teleconference for the MUREP DEAP opportunity will be held on Wednesday, September 14, 2022 at 4:00 PM Eastern Time. During this session, the MUREP DEAP team will give an in-depth overview of the opportunity and highlight information contained in the EONS 2022 document regarding proposal preparation and requirements. Please visit the MUREP DEAP landing page in NSPIRES for information on how to join the call. Any changes to this session will be posted here as well. Proposers are strongly advised to check for updates prior to the call.

For more information regarding this opportunity, please visit the ENGAGEMENT OPPORTUNITIES IN NASA STEM (EONS-2022) page on the NASA Solicitation and Proposal Integrated Review and Evaluation System (NSPIRES) website and click on List of Open Program Elements.

Go here to see the original:

NASA MUREP PBI/HBCU Data Science Equity, Access and Priority for Research and Education (DEAP) - Space Ref
