
Winning Scientific Research Announced for Heart Failure and Social/Structural Determinants of Health – Diagnostic and Interventional Cardiology

July 15, 2022 The American Heart Association, the leading voluntary organization devoted to longer, healthier lives, recognizes structural racism as a major cause of poor health and premature death from heart disease and stroke.[1] Areas in the U.S. with more social vulnerabilities have higher premature mortality from cardiovascular disease.[2] The American Heart Association and the Association of Black Cardiologists hosted a six-month data challenge in which researchers tested the relationships between heart failure and health disparities, social determinants of health and structural determinants of health. The results were evaluated by a peer review group of nearly 30 experts in the field. Four teams of researchers are winners.

"Congratulations to these researchers for their exceptional work in the heart failure data challenge," said Michelle A. Albert, M.D., M.P.H., past president of the Association of Black Cardiologists and president of the American Heart Association. "Improving our understanding of how social determinants of health impact certain populations in order to develop consequential targeted solutions requires harmonization of different types of data. These teams must be commended for their efforts at addressing health equity, one of the most pressing areas in healthcare."

Ambarish Pandey, M.D. (University of Texas Southwestern Medical Center, Dallas) led the study "Impact of Social and Structural Determinants of Health on Hospital Length of Stay among Heart Failure Patients according to Race," with his team Matthew W. Segar, M.D., M.S. (Texas Heart Institute, Houston), Shreya Rao, M.D., M.P.H. (University of Texas Southwestern Medical Center, Dallas), and Sandeep Das, M.D., M.P.H., M.B.A. (University of Texas Southwestern Medical Center, Dallas).

The study leveraged the American Heart Association's Get With The Guidelines - Heart Failure registry data to identify key ZIP code-level social determinants of health parameters that are significantly associated with prolonged length of stay following heart failure hospitalization.

Gargya Malla, M.D., M.P.H., Ph.D. (University of Alabama at Birmingham) led the project "Neighborhood Disadvantage and Risk of Heart Failure Among Black and White Adults: The Reasons for Geographic and Racial Differences in Stroke (REGARDS) Study." Her team included researchers from the University of Alabama at Birmingham; Weill Cornell Medicine, New York; East Carolina University, Greenville, North Carolina; and Drexel University, Philadelphia. The study investigated the risk of incident heart failure associated with living in disadvantaged neighborhoods and whether this association differed between white and Black adults in the U.S.

Jeffrey Tran, M.D., and Nancy Sweitzer, M.D., Ph.D. (University of Arizona Sarver Heart Center, Tucson, Arizona) led the study "The Effect of Socioeconomic Determinants of Health on the Prescription of Angiotensin Receptor Blocker/Neprilysin Inhibitors at Discharge from the Hospital," which investigated how socioeconomic determinants of health affect the odds that patients with heart failure with reduced ejection fraction are prescribed angiotensin receptor blocker/neprilysin inhibitors at hospital discharge. The study leveraged multilevel multiple imputation to handle missing data and make use of existing data to the fullest extent possible.

Vishal Rao, M.D., M.P.H. (Duke University Medical Center, Durham, North Carolina) and Melissa Caughey, Ph.D. (University of North Carolina at Chapel Hill, North Carolina) led the project "In-Hospital Outcome Differences in Patients Hospitalized for Heart Failure Across Neighborhood Socioeconomic Disadvantage in the American Heart Association Get With The Guidelines - Heart Failure Registry." The team explored the association between socioeconomic disadvantage and in-hospital heart failure outcomes in patients from diverse neighborhoods in the Get With The Guidelines - Heart Failure registry.

The research findings from all the winning studies are currently under consideration for publication in peer-reviewed scientific journals and are not yet publicly available.

"These types of data challenge projects provide much-needed insights into the relationships between heart failure and social determinants of health. Data challenges bring in top-level scientists who provide novel and effective solutions," said Jennifer Hall, Ph.D., chief of data science for the American Heart Association.

Health disparities include environmental threats, individual and behavioral factors, inadequate access to health care, poverty and educational inequalities. Social determinants of health include resources such as food supply, housing, economic and social relationships, education and health care. Structural determinants of health include economic, governing and social policies that affect pay, working conditions, housing and education.

Researchers had access to the American Heart Association's Get With The Guidelines - Heart Failure registry data on the American Heart Association's Precision Medicine Platform to conduct their analyses. The Precision Medicine Platform is an easy-to-use research interface that allows researchers to collaborate from anywhere in the world in a secure, cloud-based environment. With artificial intelligence and deep machine learning capabilities, the Precision Medicine Platform gives researchers the power and speed to bring their data together collaboratively and accelerate their findings into impactful discoveries for patients faster than ever before.

For more information:www.heart.org

[1] Churchwell K, Elkind MSV, Benjamin RM, et al. Call to action: structural racism as a fundamental driver of health disparities: a presidential advisory from the American Heart Association. Circulation. 2020;142(24).

[2] Khan SU, Javed Z, Lone AN, et al. Social vulnerability and premature cardiovascular mortality among US counties, 2014 to 2018. Circulation. 2021;144(16):1272-1279.

View original post here:

Winning Scientific Research Announced for Heart Failure and Social/Structural Determinants of Health - Diagnostic and Interventional Cardiology

Read More..

AI micro-credential helps working professionals boost career options – University of Florida

From agriculture to health care, manufacturing to retail and banking, artificial intelligence is transforming the economy and giving businesses a competitive edge by helping them improve the products and services they deliver to customers.

And now working professionals can gain their own competitive edge by adding an artificial intelligence (AI) micro-credential through the University of Florida's Office of Professional and Workplace Development. This is the first micro-credential to be offered at UF, joining eight online or hybrid certificate programs.

Micro-credentials have emerged as an ideal way for working professionals to become proficient in a specific area through short, non-credit courses that culminate in a competency-based badge.

Earning a micro-credential helps fill knowledge gaps, especially for those in the workforce who have limited time and cannot commit to a semester or longer of learning.

The AI micro-credential program allows participants to learn skills they can leverage for career advancements, from a new job to a raise, new title or additional responsibilities and status with a current employer.

People with such skills are in high demand. There were nearly 15,000 AI-related job postings in Florida in 2021, according to the AI Index 2022 Annual Report.

UF launched its AI micro-credential program in the fall in partnership with NVIDIA, which provided funding. NVIDIA is a leading manufacturer of high-end graphics processing units and chip systems used in two-thirds of the world's supercomputers, including UF's HiPerGator.

The AI micro-credential consists of seven non-credit courses offered in various modalities that allow people with any level of machine learning background to participate. The courses are available to everyone, from faculty and staff to the broader community.

Regina Rodriguez, provost fellow for professional education, says the program is a great way for anyone to learn skills in AI that will help them play a role in a world of growing reliance on technology.

"The courses that we are launching at UF are for those who may not have any understanding of AI," she said. "So really starting with foundation courses and teaching the users of AI versus the developers of AI."

To earn the micro-credential, participants complete a 15-hour foundation course that focuses on ethics in AI and a second, 15-hour fundamentals course with a focus on either engineering or STEM.

After the foundation courses are completed, participants choose a final course, designed to demonstrate how AI is used in various fields to solve real world problems, from one of four focus areas.

The current focus areas are agriculture and life sciences, applied data science, business and engineering. UF is adding specialization courses from the College of Medicine in mid-August and the College of Law in the fall. The program is also collaborating with UF Research Computing to offer short courses and access to the HiPerGator supercomputer.

Upcoming 15-hour course offerings include Ethics of AI on Aug. 1, AI in Business on Sept. 3, AI in Agriculture and Life Sciences on Oct. 3 and AI in Applied Data Science on Nov. 1.

For people who are interested in learning about AI but do not want to commit to earning a micro-credential, one- and four-hour asynchronous courses are available apart from the 15-hour hybrid courses needed to earn the micro-credential. The one-hour course is free and provides users with basic knowledge about AI; the four-hour course costs $249 and earns participants a certificate of completion. However, the shorter courses do not provide as much faculty interaction and discussion as the $1,095 micro-credential courses.

The courses are priced comparably to other universities' professional development courses. UF faculty and staff can take the courses for free to hone their craft.

The micro-credential program is also beneficial for companies and executives looking to expand their knowledge. CEOs and other C-Suite business leaders working in and outside of AI have taken courses offered by the program.

Pete Martinez, the CEO of Sivotec Bioinformatics and a former IBM vice president, participated in the Ethics of AI course, which he described as intellectually stimulating.

"The University of Florida Ethics of AI program provides a proactive approach for executives to engage in deep thought through a multidisciplinary forum on the ethical impacts of AI innovations," he said in a message to Rodriguez. "What I found of great value was the involvement from industry in its development. If treated as a pure academic program, it would lose the real-life implications of policies and regulations."

The program has recently partnered with FlapMax, a conference and training program for AI startups, to provide over 80 worldwide companies in its network with webinars and info sessions on AI in agriculture. The program has also partnered with the technology company L3Harris to develop and deliver short courses for industry professionals learning about deep neural network-based solutions.

Rodriguez says the program is just getting started with artificial intelligence offerings.

"You can start to become an expert in AI today," she said. "It doesn't matter what stage of educational background you're in."

Emma Richards July 15, 2022

Follow this link:

AI micro-credential helps working professionals boost career options - University of Florida

Read More..

New CEO not likely to change Tibco once merged with Citrix – TechTarget

The revelation that Tibco will have a new CEO once its merger with Citrix is complete came as a surprise to some, but the change in leadership will likely not have a significant impact on the analytics vendor's platform development or its customers.

Tibco, founded in 1997 and based in Palo Alto, Calif., is a subsidiary of Vista Equity Partners and Evergreen Coast Capital Corp.

In January 2022, Vista and Evergreen reached an agreement to acquire Citrix, a digital workspace technology vendor founded in 1989 and based in Ft. Lauderdale, Fla., with the acquisition expected to close during the third quarter of this year.

Once completed, Vista and Evergreen plan to merge Citrix and Tibco to create a single company that will join Citrix's digital workspace and application delivery platform with Tibco's analytics and data management capabilities.

On Monday, Vista revealed that rather than appoint Tibco CEO Dan Streetman or Citrix chairman and interim president and CEO Bob Calderoni to lead the combined entity, it will instead bring in Tom Krause as the new CEO.

Krause was promoted to president of semiconductor giant Broadcom in 2020 and helped oversee Broadcom's recent $61 billion acquisition of VMware. Before that, he was Broadcom's chief financial officer for four years. Streetman and Calderoni will remain in their roles until Tibco and Citrix are combined.

The move to bring in Krause was somewhat surprising but makes sense given the current economic climate, according to Doug Henschen, an analyst at Constellation Research.

He noted that Krause has a financial background, while Streetman was a sales leader before ascending to CEO and Citrix currently has an interim leader. And with the sharp declines in the stock market in 2022 and fears of a recession, the appointment of Krause indicates that Vista is placing an emphasis on the monetary health of Tibco and Citrix.

"We've just had a major shakeout in the financial markets and Vista appears to be more concerned about financial management at this time," Henschen said.

While Streetman won't be Tibco's CEO once the merger with Citrix is complete, the analytics vendor will still have its product and development leaders in place, which suggests stability for current Tibco customers, he added.

Tibco offers three separate analytics platforms: Spotfire enables deep data exploration, streaming analytics and data science; WebFOCUS specializes in scalable reporting that allows thousands of users to view and work with the same data; and Jaspersoft is designed for developers to embed BI within applications.

The analytics tools help make up Tibco's "predict" portfolio. In addition, the vendor has a "connect" portfolio that includes its cloud capabilities and a "unify" portfolio that addresses data management.

Meanwhile, despite the pending change at the top, Nelson Petracek remains Tibco's chief technology officer and Matt Quinn is still its chief operating officer. And at the product level, Mark Palmer is its senior vice president of analytics, data science and data products.

"Reports to the CEO at each brand unit can steer software direction," Henschen said.

In fact, the day after it was revealed that Streetman will eventually depart Tibco and Krause will become its new CEO, the vendor released ModelOps, an anticipated tool first unveiled in preview more than a year ago that will enable organizations to quickly deploy data science models at scale.

While Henschen expressed some surprise at the move to appoint a new leader of Tibco once it merges with Citrix, David Menninger, an analyst at Ventana Research, noted that acquisitions often lead to changes in leadership.

And though Tibco wasn't technically acquired in Vista and Evergreen's deal to buy Citrix, its merger with Citrix will result in a changed company. Citrix, meanwhile, is indeed getting a new owner.

"I'm never surprised when a change of ownership results in a change in leadership," Menninger said. "The acquirer usually often believes there is some untapped opportunity in the organization they are acquiring which the existing leadership did not recognize."


Similarly, Donald Farmer, founder and principal at TreeHive Strategy, said it's not a shock that Vista and Evergreen plan to put a new CEO in place once Tibco and Citrix have been joined, noting that neither Tibco nor Citrix is acquiring the other in the same way Tibco bought IBI in 2020, so it makes sense that neither company's leader will be CEO.

"I'm not surprised there is a new CEO," he said. "It makes sense to drive a new direction for the unified company. This is, after all, not a takeover by one of the other, but more like a real merger."

While a new CEO is set to take over once Tibco and Citrix join forces, it remains to be seen whether the two companies are a good fit.

The vendors' technologies do not have an obvious synergy, though at the time Vista and Evergreen's acquisition of Citrix was first revealed, Tibco's Streetman said the changing nature of the workforce, with many more people working from home than just a few years ago, served as part of the motivation for the move.

"I don't really see the synergies between Tibco and Citrix," Menninger said. "Obviously, both are software companies, but there is not a lot of overlap between Tibco's data and analytics capabilities and Citrix's digital workspace technology."

At the time the acquisition and resulting merger of Tibco and Citrix was first revealed, Henschen speculated that perhaps the greatest benefit to Tibco will be exposure to Citrix's customer base of more than 400,000.

However, six months later Henschen noted that the reasons for the merger still aren't clear.

"I'm still puzzling over the combination a bit and haven't seen synergistic messaging and positioning," he said. "The Tibco and Citrix sites are still displaying the messaging and positioning that was in place before the acquisition. We'll see if things change quickly in the wake of Krause's appointment."

Farmer, meanwhile, said he is a bit more bullish on the merger given how many more people work remotely than before the COVID-19 pandemic.

By combining Tibco and Citrix, the new company has the potential to deliver enterprise infrastructure capabilities to organizations with remote employees while also providing high-level analytics capabilities.

"This shift [to remote work] represents a challenge to any company delivering enterprise infrastructure," Farmer said. "There should be significant opportunities for the new company to deliver the entire hybrid work experience from the networking experience to the real-time data and analytics experience."

He cautioned, however, that a merger between companies the size of Tibco and Citrix could be complex, and if it proves unwieldy could hurt Tibco's product development pipeline.

"The merger could be complicated, messy and a drag on innovation," Farmer said. "If it plays out that way, this will be an opportunity lost, because the market is moving very quickly toward new working practices and new infrastructure to support it. Tom Krause has his work cut out to make this both effective and efficient."

View original post here:

New CEO not likely to change Tibco once merged with Citrix - TechTarget

Read More..

DataCamp Courses, Skill Tracks and Pricing Forbes Advisor – Forbes

Editorial Note: We earn a commission from partner links on Forbes Advisor. Commissions do not affect our editors' opinions or evaluations.

If you work in tech or are hoping to break into the field, you must keep your technical skills sharp to be competitive in the job market. But not everyone has the time or money to return to school for a degree. Fortunately, online learning platforms for coding are becoming more popular.

DataCamp is one such platform that helps you enhance your coding skills or deepen your knowledge of subjects like data science and machine learning. In this article, you'll learn about DataCamp and how it differs from its competitors.

DataCamp is an online learning platform that teaches students new technical skills or helps them brush up on their current skill set. DataCamp is a self-paced, non-proctored approach to learning, similar to competing providers like Codecademy and CodeCamp. DataCamp teaches data science, machine learning and skills like business intelligence and SQL tools.

When you sign up with DataCamp, youll experience a hands-on approach to learning that includes regular skill assessments to track your progress. Courses include challenges and projects featuring real-world elements to help you figure out how to apply your new skills in the workplace.

Through a series of courses or career paths, DataCamp can help you learn coding languages like Python, R, SQL and Scala, along with products like Tableau, Power BI, Oracle SQL and Excel.

DataCamp has a few different paid tiers and one free offering. The free service level is relatively limited but allows you to complete six courses and provides unlimited access to DataCamp's job board. You'll also get to create and maintain a professional profile on DataCamp's site.

Paid membership levels are as follows:

DataCamp offers a full suite of courses and career paths to explore. Below, we've included details on several of the more popular courses offered.

Time to Completion: 4 hours

Course Format: Self-paced

Can Courses Be Completed Fully Online? Yes

Careers this Course Prepares Learners for: Data scientist

Overview of What to Expect in this Course: This course introduces students to the Python programming language and discusses how the language is used in the field of data science. Students learn to work with data in lists and how to use functions and packages. The course culminates with exposure to NumPy, Python's package for scientific and numerical computing.
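As a rough illustration of the kind of exercise such a course builds toward (a minimal sketch; the variable names and values are made up here, not DataCamp's own material), moving from a plain Python list to a NumPy array looks like this:

import numpy as np

heights = [1.73, 1.68, 1.71, 1.89]   # a plain Python list, in metres
np_heights = np.array(heights)        # a NumPy array enables element-wise math
print(np_heights * 100)               # converts every height to centimetres in one step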

Time to Completion: 4 hours

Course Format: Self-paced

Can Courses Be Completed Fully Online? Yes

Careers this Course Prepares Learners for: Data scientist

Overview of What to Expect in this Course: One of the main responsibilities of a data scientist is to convert raw data into meaningful information. This SQL course teaches data extraction and manipulation using SQL, MySQL, Oracle and PostgreSQL. The course breaks down into four chapters.

Time to Completion: 4 hours

Course Format: Self-paced

Can Courses Be Completed Fully Online? Yes

Careers this Course Prepares Learners for: Data analyst

Overview of What to Expect in this Course: This course introduces students to the open-source language R. Students learn about key concepts like vectors, factors, lists and data frames. The R course aims to help students develop the skills they'll need to do their own data analysis in R.

Time to Completion: 3 hours

Course Format: Self-paced

Can Courses Be Completed Fully Online? Yes

Careers this Course Prepares Learners for: Data Analyst

Overview of What to Expect in this Course: Power BI is a widely used business intelligence platform that allows users to create impactful data models. DataCamps Power BI course teaches students to use the drag-and-drop functionality and other methods to load and transform data using Power Query.

More in-depth and time-intensive than individual courses, DataCamp's skill tracks give a more well-rounded look at popular IT areas. These tracks include programming in Python and R and data visualization. Below, we provide some details on DataCamp's most popular skill tracks.

Time to Completion: 22 hours

Course Format: Self-paced

Can Courses Be Completed Fully Online? Yes

Careers this Course Prepares Learners for: Data scientist, data analyst

Overview of What to Expect in this Course: This track offers an in-depth look at programming in R and other coding languages used by data scientists. Students undergo a series of exercises to learn about common R elements, including vectors, matrices and data frames. More advanced courses in this skill track introduce concepts like conditional statements, loops and vectorized functions.

Time to Completion: 88 hours

Course Format: Self-paced

Can Courses Be Completed Fully Online? Yes

Careers this Course Prepares Learners for: Researcher, data scientist

Overview of What to Expect in this Course: This course teaches students how to use Python like a data scientist. Students learn to work with data: importing, cleaning, manipulating and, most importantly, visualizing. A series of interactive exercises introduces learners to some of the most popular Python libraries, like pandas, NumPy and Matplotlib.

Time to Completion: 73 hours

Course Format: Self-paced

Can Courses Be Completed Fully Online? Yes

Careers this Course Prepares Learners for: Data engineer

Overview of What to Expect in this Course: In addition to Python, this in-depth skill track comprises 19 courses that introduce students to languages like Shell, SQL and Scala. Learners also gain exposure to big data tools like AWS Boto, PySpark, Spark SQL and MongoDB. Through six self-paced projects, students create their own databases and data engineering pipelines.

Time to Completion: 61 hours

Course Format: Self-paced

Can Courses Be Completed Fully Online? Yes

Careers this Course Prepares Learners for: Machine learning scientist, machine learning engineer

Overview of What to Expect in this Course: This skill track focuses on models. Comprising 15 courses, the track teaches students about creating, training and visualizing models, among the most important tools for a machine learning engineer. Students are also introduced to Bayesian statistics, natural language processing and Spark.


Yes, reviews indicate that DataCamp is suitable for beginners. Even the provider's more in-depth course offerings are very basic in nature, presenting material in simple, easy-to-understand formats.

When it comes to employers, a certificate from DataCamp does not carry the same weight as a degree. Still, a DataCamp certificate indicates that you have invested time and energy into learning a career-related skill set.

Possibly, though a DataCamp course on its own may not be enough for you to land a job. You should also gain hands-on experience and network with professionals in your desired field.

View original post here:

DataCamp Courses, Skill Tracks and Pricing Forbes Advisor - Forbes

Read More..

Radiologists hope to use AI to improve readings – University of Miami: News@theU

The Miller School of Medicine Department of Radiology is working with the University's Institute for Data Science and Computing to design an artificial intelligence tool that could help them diagnose patients in a more individualized way.

Over the years, new technology has helped radiologists diagnose illnesses on a multitude of medical images, but it has also changed their jobs.

While in the past these physicians spent more time speaking with patients, today they spend most of their time in the reading room, a dark space where they scrutinize images alongside a patient's electronic medical records and other data sources to diagnose an illness.

A radiologist's job is often solitary. And it is a trend that University of Miami Miller School of Medicine radiologists Dr. Alex McKinney and Dr. Fernando Collado-Mesa hope to change.

The two physicians have been working with the University's Institute for Data Science and Computing (IDSC) to create an artificial intelligence toolbox that will draw on a massive database of deidentified data and medical images to help doctors diagnose and treat diseases based not only on imaging data but also on a patient's unique background and circumstances. This would include risk factors like race and ethnicity, socioeconomic and educational status, and exposure. The physicians say it is a necessary innovation at a time when narrow artificial intelligence in radiology is only able to make a binary decision, such as positive or negative for one disease, rather than scanning for a host of disorders.

"We believe the next iteration of artificial intelligence should be contextual in nature, which will take in all of a patient's risk factors, lab data, past medical data, and will help us follow the patient," said McKinney, who is also the chair of the Department of Radiology. "It will become a form of augmented interpretation to help us take care of the patient."

According to Collado-Mesa, this toolbox will not just say yes or no, disease or no disease. It will point to the data around it to consider a variety of issues for each individual patient, to put its findings into a context, including future risk.

Current artificial intelligence tools are also limited to a specific type of medical image, and cannot, for example, analyze both MRI (magnetic resonance imaging) and ultrasound at the same time. In addition, the patient data that is used in these diagnosis tools is typically not inclusive of a range of demographic groups, which can lead to a bias in care. Having a tool that draws upon the examples of millions of South Florida patients, while maintaining their privacy, will help radiologists be more efficient and comprehensive, McKinney noted.

"Right now, there is just so much data for radiologists to sift through. So, this could help us as our tech-based partner," McKinney added.

All of these factors led Collado-Mesa and McKinney to try to create a better alternative, and they spoke with IDSC director Nick Tsinoremas, also a professor of biochemistry and molecular biology. Tsinoremas and IDSC's advanced computing team came up with the idea of utilizing an existing tool called URIDE, a web-based platform that aggregates deidentified patient information for faculty research, and adding in the deidentified images from the Department of Radiology.

They hope to unveil a first version of the toolbox this summer and plan to add new elements as more imaging data is added. It will include millions of CT scans, mammograms, and ultrasound and MRI images, along with radiographs, McKinney pointed out.

"We don't want to rush this because we want it to be a high-quality, robust toolbox," said Collado-Mesa, an associate professor of radiology and breast imaging, as well as chief of innovation and artificial intelligence for the Department of Radiology.

Both physicians and Tsinoremas hope that the artificial intelligence tool will help answer vital research questions, like: what risk factors lead to certain brain tumors? Or, what are the most effective treatments for breast cancer in certain demographic groups? It will also use machine learning, a technique that constantly trains computer programs how to utilize a growing database, so it can learn the best ways to diagnose certain conditions.

"Creating this resource can help with diagnosis and will allow predictive modeling for certain illnesses, so that if a person has certain image characteristics and clinical information that is similar to other patients from this database, doctors could predict the progression of a disease, the efficacy of their medication, and so on," Tsinoremas said.

To ensure the toolbox will be unbiased, the team is also planning to add more images and data of all population groups in the community, as it is available, as well as to monitor the different elements constantly and systematically within the toolbox to make sure it is performing properly.

The radiologists plan to focus first on illnesses that have a high mortality or prevalence in the local population, like breast cancer, lung cancer, and prostate cancer, and to add others with time.

The technology could allow them to spend more time with patients and offer more personalized, precision-based care based on the patients genetics, age, and risk factors, according to both physicians.

"Artificial intelligence has the potential to advocate for the patients, rather than a one-size-fits-all approach to medicine based on screening guidelines," McKinney said. "This could help us get away from that, and it would hopefully offer more hope for people with rare diseases."

But as data is added in the future, the researchers hope to expand their work with the tool. And they hope that physicians across the University will use it to conduct medical research, too.

"This is a resource that any UM investigator could potentially access, provided that they have the approvals, and it could spark a number of different research inquiries to describe the progression of disease and how patients respond to different treatments in a given time period. These are just some of the questions we can ask," Tsinoremas said.

Read the rest here:

Radiologists hope to use AI to improve readings - University of Miami: News@theU

Read More..

Addressing the issues of dropout regularization using DropBlock – Analytics India Magazine

Dropout is an important regularization technique used with neural networks. Despite effective results in general neural network architectures, it has some limitations with convolutional neural networks and therefore falls short of the goal of building robust deep learning models. DropBlock, a regularization technique proposed by researchers at Google Brain, addresses the limitations of the general dropout scheme and helps in building effective deep learning models. This article covers the DropBlock regularization methodology, which significantly outperforms existing regularization methods.

By preserving the same number of features, the regularization procedure reduces the magnitude of the features. Let's start with the dropout method of regularization to understand DropBlock.

Deep neural networks include several non-linear hidden layers, making them highly expressive models capable of learning extremely complex correlations between their inputs and outputs. However, with minimal training data, many of these complex associations will be the consequence of sampling noise, thus they will exist in the training set but not in the true test data, even if they are derived from the same distribution. This leads to overfitting, and several ways for decreasing it have been devised. These include halting training as soon as performance on a validation set begins to deteriorate.

There are two best ways to regularize a fixed-sized model.

Dropout is a regularization strategy that solves two difficulties. It eliminates overfitting and allows for the efficient approximate combination of exponentially many distinct neural network topologies. The word dropout refers to the removal of units (both hidden and visible) from a neural network. Dropping a unit out means removing it from the network momentarily, along with all of its incoming and outgoing connections. The units to be dropped are chosen at random.

Applying dropout samples a thinned network from the full neural network: all the units that survive dropout make up the thinned network. A neural network with n units can therefore be seen as a collection of 2^n possible thinned networks. All of these networks share weights, so the total number of parameters stays at or below the original level. A new thinned network is sampled and trained each time a training instance is presented. Training a neural network with dropout can thus be compared to training an ensemble of 2^n thinned networks with extensive weight sharing, where each thinned network is trained very rarely, if at all.
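As a rough illustration of the idea (a minimal NumPy sketch using the common inverted-dropout convention, not code from the original paper), each training step samples a Bernoulli mask and rescales the surviving units:

import numpy as np

def dropout(activations, keep_prob=0.8, rng=np.random.default_rng(0)):
    # Each unit survives independently with probability keep_prob,
    # so every call samples a different "thinned" sub-network.
    mask = rng.binomial(1, keep_prob, size=activations.shape)
    return activations * mask / keep_prob  # rescale so the expected activation is unchanged

hidden = np.ones((4, 8))                  # a batch of hidden-layer activations
thinned = dropout(hidden, keep_prob=0.8)  # roughly 20% of units are zeroed out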


Dropout is a method for improving neural networks that lowers overfitting. Standard backpropagation learning creates brittle co-adaptations that work for the training data but fail on data that has not yet been observed. Random dropout disrupts these co-adaptations because it makes the presence of any one hidden unit unreliable. However, dropping features at random is risky, since it might remove something crucial to solving the problem.

To deal with this problem, the DropBlock method was introduced. It combats the major drawback of dropout: dropping features at random is an effective strategy for fully connected networks, but it is less fruitful in convolutional layers, where features are spatially correlated.

DropBlock is a structured dropout method in which units in a contiguous region of a feature map are dropped together. Because activation units in convolutional layers are spatially correlated, DropBlock performs better than dropout in those layers. Block size and the drop rate (γ) are the two primary parameters of DropBlock.

Like dropout, DropBlock is not applied during inference. Inference can be understood as evaluating an averaged prediction over an ensemble of exponentially many sub-networks. These sub-networks are a special subset of the sub-networks covered by dropout, in which each network does not see contiguous regions of the feature map.

The whole algorithm rests on two main hyperparameters: the block size and the rate at which units are dropped.

As every zero entry in the sampled mask is expanded into a block of side block_size, more features are dropped from the feature map, and the percentage of weights learned in each training iteration falls, thus lowering overfitting. Because more semantic information is removed when a model is trained with a bigger block size, the regularization is stronger.

According to the researchers, the block size is fixed for all feature maps, regardless of a feature map's resolution. When the block size is 1, DropBlock resembles dropout, and when the block size covers the whole feature map, it resembles SpatialDropout.

The number of features that will be dropped depends on the rate parameter (γ). In dropout, the binary mask is sampled from a Bernoulli distribution with mean 1 - keep_prob, assuming we wish to keep every activation unit with probability keep_prob.

We must, however, adjust the rate parameter (γ) when sampling the initial binary mask to account for the fact that every zero entry in the mask will be expanded by block_size² and that the blocks must lie entirely inside the feature map. DropBlock's key subtlety is that some dropped blocks will overlap, so the mathematical expression for γ can only be an approximation.
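For reference, a small sketch of that approximation as given in the original DropBlock paper (feat_size denotes the feature-map side length; the function name is illustrative):

def dropblock_gamma(keep_prob, block_size, feat_size):
    # gamma ≈ (1 - keep_prob) / block_size^2 * feat_size^2 / (feat_size - block_size + 1)^2
    # The first factor spreads the drop probability over a whole block; the second
    # corrects for blocks having to fit entirely inside the feature map.
    return ((1.0 - keep_prob) / block_size ** 2) * \
           (feat_size ** 2 / (feat_size - block_size + 1) ** 2)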

Let's look at an example from the researchers' test results. The researchers applied DropBlock to a ResNet-50 model to check the effect of block size. The models were trained and evaluated with DropBlock in groups 3 and 4, so two ResNet-50 models were trained.

The first model has higher accuracy compared to the second ResNet-50 model.

The syntax provided by the KerasCV library to use DropBlock for regularizing neural networks is shown below.

keras_cv.layers.DropBlock2D(rate, block_size, seed=None, **kwargs)
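As a rough usage sketch (assuming keras_cv and TensorFlow are installed; the model, rate and block_size below are illustrative choices, not a recommendation), the layer sits between convolutional blocks and, like dropout, is active only during training:

import tensorflow as tf
import keras_cv

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    keras_cv.layers.DropBlock2D(rate=0.1, block_size=7),  # drops contiguous 7x7 regions
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])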

Hyperparameter:

DropBlock's resilience is demonstrated by the fact that it drops semantic information more effectively than dropout. Both convolutional layers and fully connected layers can use it. With this article, we have covered DropBlock and its robustness.

Follow this link:

Addressing the issues of dropout regularization using DropBlock - Analytics India Magazine

Read More..

Opinion: The hazards of TikTok — a genuine concern – Post Register

Visit link:
Opinion: The hazards of TikTok -- a genuine concern - Post Register

Read More..

How VDI and DaaS Help Companies Thrive in a World of Hybrid Work – BizTech Magazine

How Does VDI Compare To Desktop as a Service?

Like desktop virtualization, Desktop as a Service grants workers remote access to a virtual iteration of a desktop that lives elsewhere. A notable difference is that with DaaS, virtual desktops are made available through the cloud by a third-party service provider, not a company's self-managed servers supported by virtualization technology from providers such as Citrix, VMware, Avaya and others.

The first requirement of VDI is to ensure that the infrastructure is in place to create the servers and data centers that will host virtual machines. The VMs then live on the servers with a business's chosen operating system and apps. All of this is the responsibility of the organization deploying desktop virtualization, and it should be overseen by a skilled IT department, which will manage hardware, software, security updates and patches.

Remote employees can connect to a virtual desktop using a tablet, smartphone, desktop or laptop that has access to the company's network. Often, thin clients or zero clients (computers that have no onboard operating system or storage) are used because of their cost-effectiveness and suitability for remote work.

Virtual desktops come in two forms: A persistent virtual desktop is a singular instance that preserves an employee's personalized applications, data and settings from login to login. A nonpersistent virtual desktop gives users a clean slate each time they log in for work.

DaaS shares some commonalities with desktop virtualization. Employees still access the virtual iteration of a persistent or nonpersistent desktop using a device of their choice, but the virtual machines are not hosted on on-premises servers. Instead, they are managed offsite by a third-party cloud-service provider such as Microsoft Azure, Google Cloud or Amazon Web Services and accessed through an application or web browser.

After companies choose a provider, partner and subscription plan, the setup process is quick. "We can be up and running in a matter of minutes or hours compared with the days or weeks that it could take if you had to procure hardware so that you could install that software on-premises," says Sachin Sharma, director of product marketing for VMware Horizon, the company's virtual desktop and DaaS offering.

Spinning VMs up or down is quick and easy, and has proved especially helpful for companies with seasonal workers or contractors who require temporary access, or companies that experience fluctuations in their workforce. Once a company is up and running with DaaS, its third-party provider oversees all maintenance, security, storage and upgrades.


VDI and DaaS share some clear benefits.

They both offer workers a consistent user experience, from any location, that replicates the one they would have in a headquarters or branch office. "It can restore your work environment, and you can have that be consistent regardless of location, device or connectivity," says Lotz. As more companies or employees migrate to hybrid models, that consistency will prove helpful.

VDI and DaaS also both offer security benefits. The increase in endpoints that results from further adoption of hybrid work also increases a company's vulnerability to cyberattacks and phishing. But with a virtualized infrastructure, a server is the only entry point for an attack and is typically well protected. "Your data lives inside the firewall instead of being scattered in somebody's end-user access device," says Sharma.

Another bonus for those adopting virtualized desktop experiences is a lighter load for IT staff. "Whether it's within the company or through a third-party provider, managing everything within one place, instead of in a distributed fashion, really helps IT reduce the time they spend in deploying patches, updating machines and updating VMs," says Sharma.

Read more:
How VDI and DaaS Help Companies Thrive in a World of Hybrid Work - BizTech Magazine

Read More..

On the Talent Hunt: Work-life balance in the construction sector? Yes, it’s possible with some creativity – TODAY

I work for a mechanical and electrical engineering firm in the construction sector.

Even before the pandemic, it was a herculean task recruiting Singaporean workers given the long hours and nature of our jobs. The situation is now even more challenging amid a tight labour market.

While we managed to complete projects which were delayed during Covid-19, we also clinched new jobs that needed project engineers on site.

Without sufficient manpower, our existing staff would be stretched more thinly, and we had to decline participation in some tenders over the past two years.

Thus, recruitment and retention have become a main focus for the company.

With work-life harmony a buzz phrase in recent years, we started looking at how we can change some practices to be a more appealing employer.

Construction is not the first industry that comes to mind when one talks about work-life harmony. It is even harder for a medium-sized enterprise like us due to our limited resources.

For instance, we found it difficult to implement some flexible work arrangements (FWAs) such as work-from-home (WFH) as our work requires our staff to be on site.

We realised that we had to be creative to achieve better work-life harmony.

First, we started allowing staggered reporting times for our office-based staff, who can choose a shift that suits their family commitments.

Second, for staff who need to be on site, we stepped up digitalisation efforts with cloud servers and a human resources (HR) mobile application so that they can work remotely from site offices and other convenient locations.

Supervisors are also given autonomy to provide staff with time-off to attend to emergencies.

By 2024, a new set of tripartite guidelines will kick in for employers to consider staff FWA requests fairly.

Before this takes place, we have started considering WFH requests for reasons such as caregiving and when employees' children are sick.

These requests are approved by supervisors and HR to ensure that business needs are still met.

We believe that with family-friendly policies, our employees will feel more valued and committed to the company. This in turn boosts productivity and staff morale.

This is also why we have joined the Government's Made for Families initiative to show our commitment towards families.

To further ease our manpower crunch, we work closely with the Building and Construction Authority to offer sponsorships and scholarships for students who will then join us under a bond after they graduate.

We recruited one graduate under the programme this year and are looking at hiring another.

We have also found that utilising digital tools such as Building Information Modelling, which incorporates 3D models to create and manage a construction project throughout its life cycle, helps us attract young talent by showing that we are progressive.

With the easing of border restrictions, our manpower situation is definitely better now, but as we navigate these uncertain times, we will continue to adapt our strategies to ensure that our business and people goals can be met.

ABOUT THE WRITER:

Mrs Sarah Tham, 41, is an associate director of DLE M&E, a mechanical and electrical engineering firm founded in 1975. It currently has over 300 employees.

If you are a business owner with an experience to share or know someone who wishes to contribute to this series, write to voices [at] mediacorp.com.sg with your full name, address and phone number.

Here is the original post:
On the Talent Hunt: Work-life balance in the construction sector? Yes, it's possible with some creativity - TODAY

Read More..

Old computer technology points the way to future of quantum computing Terrace Standard – Terrace Standard

Researchers have made a breakthrough in quantum technology development that has the potential to leave today's supercomputers in the dust, opening the door to advances in fields including medicine, chemistry, cybersecurity and others that have been out of reach.

In a study published in the journal Nature on Wednesday, researchers from Simon Fraser University in British Columbia said they found a way to create quantum computing processors in silicon chips.

Principal investigator Stephanie Simmons said they illuminated tiny imperfections on the silicon chips with intense beams of light. The defects in the silicon chips act as a carrier of information, she said. While the rest of the chip transmits the light, the tiny defect reflects it back and turns into a messenger, she said.

There are many naturally occurring imperfections in silicon. Some of these imperfections can act as quantum bits, or qubits. Scientists call those kinds of imperfections spin qubits. Past research has shown that silicon can produce some of the most stable and long-lived qubits in the industry.

These results unlock immediate opportunities to construct silicon-integrated, telecommunications-band quantum information networks, said the study.

Simmons, who is the university's Canada Research Chair in silicon quantum technologies, said the main challenge with quantum computing was being able to send information to and from qubits.

"People have worked with spin qubits, or defects, in silicon before," Simmons said. "And people have worked with photon qubits in silicon before. But nobody's brought them together like this."

Lead author Daniel Higginbottom called the breakthrough immediately promising because researchers achieved what was considered impossible by combining two known but parallel fields.

Silicon defects were extensively studied from the 1970s through the '90s, while quantum physics has been researched for decades, said Higginbottom, who is a post-doctoral fellow at the university's physics department.

"For the longest time people didn't see any potential for optical technology in silicon defects. But we've really pioneered revisiting these and have found something with applications in quantum technology that's certainly remarkable."

Although the technology is in an embryonic stage, Simmons said quantum computing is the rock 'n' roll future of computers that can solve anything from simple algebra problems to complex pharmaceutical equations or formulas that unlock deep mysteries of space.

"We're going to be limited by our imaginations at this stage. What's really going to take off is really far outside our predictive capabilities as humans."

The advantage of using silicon chips is that they are widely available, understood and have a giant manufacturing base, she said.

"We can really get it working and we should be able to move more quickly and hopefully bring that capability mainstream much faster."

Some physicists predict quantum computers will become mainstream in about two decades, although Simmons said she thinks it will be much sooner.

In the 1950s, people thought the technology behind transistors was mainly going to be used for hearing aids, she said. No one then predicted that the physics behind a transistor could be applied to Facebook or Google, she added.

"So, we'll have to see how quantum technology plays out over decades in terms of what applications really do resonate with the public," she said. "But there is going to be a lot because people are creative, and these are fundamentally very powerful tools that we're unlocking."

Hina Alam, The Canadian Press

RELATED: US intel warns China could dominate advanced technologies

RELATED: Elon Musk claims Tesla will debut a humanoid robot next year


Go here to see the original:
Old computer technology points the way to future of quantum computing Terrace Standard - Terrace Standard

Read More..