
10 Reasons A Promotion Might Not Be Right For Your Data Science Career | by Banji Alo | Jul, 2024 – DataDrivenInvestor

4 min read

Who doesn't like a promotion?

Not everyone does.

Turning down a promotion or refusing to climb the corporate ladder might seem counterintuitive, but it is becoming increasingly common.

Matt is a professional data scientist who has been in the industry for over ten years and does not want to be promoted.

Similarly, Ben, who's in tech, is comfortable in his current role. He loves what he does and doesn't see himself taking up a higher role soon.

Here are some reasons why a promotion might not be for you.

Work can be fun.

Individuals who genuinely love their work don't want to be promoted. Their skills match perfectly with the role, and they feel confident when delivering tasks.

Confidence is key to overall job satisfaction.

They might face new challenges and have no idea how to solve them, but their passion for the role makes them go all out looking for a solution.

They get into their flow state and lose track of time while at work.

Read more here:

10 Reasons A Promotion Might Not Be Right For Your Data Science Career | by Banji Alo | Jul, 2024 - DataDrivenInvestor

Read More..

State approves new William & Mary school, the first in 50 years – Daily Press

WILLIAMSBURG – William & Mary's new School of Computing, Data Sciences, and Physics was officially approved Tuesday, giving students a new avenue into working in a data-rich world, the university announced.

The State Council of Higher Education for Virginia, the state agency that governs new schools and programs, approved the school Tuesday, according to a news release.

The school will bring together four programs: applied science, computer science, data science and physics.

"I appreciate SCHEV's shared commitment to preparing broadly educated, forward-thinking citizens and professionals," W&M President Katherine A. Rowe said in the release. "The jobs of tomorrow belong to those prepared to solve tomorrow's problems. Machine learning, AI, computational modeling – these are essential modes of critical thinking and core to a liberal arts education in the 21st century," she said.

The school will be operational in fall 2025, and a national search for a dean of the school is underway.

W&M's Board of Visitors approved the creation of the school in November. Its approval at the state level makes it the university's sixth school – the first since the creation of the Raymond A. Mason School of Business in 1968.

Establishing the standalone School of Computing, Data Sciences, and Physics will increase the visibility of the programs and their growing career fields, the university said.

"Innovation has been part of William & Mary since its inception, and this school will serve as the catalyst for countless new discoveries, partnerships and synergies," Provost Peggy Agouris said in a statement. "The School of Computing, Data Sciences, and Physics is launching at a pivotal time within these dynamic fields, and I'm incredibly proud to continue our journey of interdisciplinary growth and excellence across our undergraduate and graduate program offerings."

The four academic areas in the new school are experiencing strong growth in external investment (over $9 million in 2023) and student numbers, according to the university.

Undergraduate students will not apply to the school directly; instead, second-year students who meet the criteria will be allowed to enter the school. Students will be able to double major or minor in other programs at the university while attending the new school, according to the release.

Sam Schaffer, samuel.schaffer@virginiamedia.com

Originally Published: July 25, 2024 at 12:10 p.m.

Here is the original post:

State approves new William & Mary school, the first in 50 years - Daily Press

Read More..

What Does the Transformer Architecture Tell Us? | by Stephanie Shen | Jul, 2024 – Towards Data Science

14 min read

The stellar performance of large language models (LLMs) such as ChatGPT has shocked the world. The breakthrough was made by the invention of the Transformer architecture, which is surprisingly simple and scalable. It is still built of deep learning neural networks. The main addition is the so-called attention mechanism, which contextualizes each word token. Moreover, its unprecedented parallelism endows LLMs with massive scalability and, therefore, impressive accuracy after training over billions of parameters.
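The attention mechanism mentioned above can be sketched in a few lines. Below is a minimal, illustrative NumPy implementation of scaled dot-product self-attention; it is a simplification that omits the learned query/key/value projection matrices and the multi-head structure of a real Transformer layer.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: each token's output is a weighted
    mix of all value vectors, weighted by query-key similarity."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # (seq, seq) similarity matrix
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V                  # contextualized token vectors

# Toy example: a "sentence" of 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
out = attention(X, X, X)  # self-attention: Q = K = V = X
print(out.shape)
```

Because every token attends to every other token in one matrix multiplication, the whole sequence is processed in parallel, which is the source of the scalability the article describes.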

The simplicity that the Transformer architecture has demonstrated is, in fact, comparable to the Turing machine. The difference is that the Turing machine controls what the machine can do at each step. The Transformer, however, is like a magic black box, learning from massive input data through parameter optimizations. Researchers and scientists are still intensely interested in discovering its potential and any theoretical implications for studying the human mind.

In this article, we will first discuss the four main features of the Transformer architecture: word embedding, the attention mechanism, single-word prediction, and generalization capabilities such as multi-modal extension and transfer learning. The intention is to focus on why the architecture is so effective instead of how to build it (for which readers can find many …

Original post:

What Does the Transformer Architecture Tell Us? | by Stephanie Shen | Jul, 2024 - Towards Data Science

Read More..

AI Hackathon hopes to encourage kids to go into the tech field – WRTV Indianapolis

INDIANAPOLIS – Data breaches by hackers are a growing problem around the world. That is one reason the need for cybersecurity professionals is expected to grow.

That's why the Department of Defense, in partnership with local Indianapolis tech leaders, hosted a hackathon.


"We know that the national security challenges of the future are going to be in the high-tech space and so this is a great opportunity for kids to be exposed to ways to serve their country and also get exposed to these really exciting technologies, Andrew Kossack, the Executive Vice President of the Applied Research Institute, said.


The U.S. Bureau of Labor Statistics projects cybersecurity jobs to grow 32 percent by 2032.

At the hackathon, local high school students learned skills that could lead them to careers in advanced robotics, artificial intelligence, and data science.


"It allows you to make connections with people," Naman Vyas, a student going into his freshman year, said. People that can help you get jobs and explore opportunities in the future.

Vyas is on the robotics team at his school. He is interested in the tech sector, potentially to someday keep data safe from would-be hackers.


"White hat hacker, what they do is they hack into websites and figure out ways to protect it better, Vyas said. I think that would be a really cool job. Like problem solving, trying to find the problems so you can fix those problems and make it a lot safer for websites and companies."

Students like Vyas are very attractive to tech leaders.

"There aren't too many industries that you can get into today that don't have some type of tech aspect, Stacey Arnold, with the Luddy School of Informatics, Computing, and Engineering, said. We are really committed to ensuring that students have the tools they need to be able to navigate the world with having a tech savvy skill.


Winners of the hackathon challenge were given money to attend Indiana University. IU awarded two $5,000 scholarships.

Go here to read the rest:

AI Hackathon hopes to encourage kids to go into the tech field - WRTV Indianapolis

Read More..

How Data Science Is Transforming The Clinical Trial Process – TechiExpert.com

One of the biggest challenges for drug developers is proving that their drugs work as promised outside the lab. This is where clinical trials come in. The clinical trial process allows developers to test their drugs with human subjects in controlled environments to determine their efficacy and safety.

Indeed, the clinical trial process is crucial for drug development, as it can significantly impact patients' lives. However, clinical trials can be slow and expensive, with many factors that can affect the outcomes. Fortunately, using data science and adopting advanced technology, such as automating clinical trials with Formedix ryze software, are powerful ways to help drug developers understand how well their drugs work in real-world conditions.

Additionally, platforms like Evidation.com play a crucial role by utilizing real-world research to measure health in everyday life, which helps researchers understand the effectiveness of treatments outside controlled environments. This approach provides valuable insights and generates compelling evidence, accelerating decision-making in clinical trials. To learn more about the benefits of these platforms, read more here.

Furthermore, data scientists have helped doctors determine which patients would benefit most from specific treatments based on their genetic makeup, symptoms, past medical history, and family history. This process allows doctors to provide the best care possible for their patients while also ensuring they do not miss any patients who may need additional care or attention.

That said, read on to learn more about how data science is changing clinical trials.

In the medical field, many clinical studies rely on patients willing to participate in a clinical trial. This can be challenging, as many patients don't want to be involved in studies. One way to ensure that patients participate in clinical trials is by using data science techniques designed to optimize patient recruitment and retention.

The medical field can improve patient recruitment with the help of data scientists who understand how to use machine-learning techniques for this purpose. Notably, these machine learning techniques include identifying high-value targets, developing strategies for reaching them, and evaluating their results once they have been implemented. Data scientists can use these insights to improve future recruitment strategies.

Retention rates are also crucial in ensuring that patients remain engaged in studies. Data scientists may use machine learning techniques to identify factors that lead to high retention rates among participants and then use this information to improve retention rates over time.

Advancements in patient recruitment software powered by AI are further enhancing these efforts, enabling a more efficient and precise approach to identify eligible participants. This technology utilizes vast datasets to optimize patient recruitment strategies, ensuring trials are populated quickly with suitable candidates, thereby reducing timelines and costs.

Clinical trials are vulnerable to poor study design, poor data collection practices, and misleading results. That's where data science comes in: it's transforming clinical trials by strengthening risk-based monitoring. Researchers achieve this by identifying and analyzing the relationship between clinical trial data and the variables in the clinical trial environment.

The main goal of this strategy is to improve clinical trials by preventing adverse events during the process. In addition, it helps to identify any potential issues that may arise during the process and make sure they're resolved before proceeding with further research.

Accordingly, the first step in implementing this strategy is data collection from various sources, including patient input surveys and feedback forms, as well as other objective measures such as lab test results or physician observations. Then these data sets are analyzed using statistical methods such as regression analysis or nonparametric statistics.

Then, researchers can get viewable, easy-to-understand reports that show all the information they need on each patient and how they fared while participating in the trial.

Clinical trials are a crucial part of drug development. Apart from helping researchers test the safety and effectiveness of new drugs, they also provide data that can inform future research, which is essential for companies working on different kinds of medicines.

However, there has been a long-standing problem with the predictability of clinical trials. Researchers have long struggled to determine how successful their clinical trials will be, and they often can't predict this until after patients have been enrolled in the study. The uncertainty about the value of a trial can make it difficult for pharmaceutical companies to invest in drug research.

Accordingly, data science has helped address this issue by providing evidence about how well clinical trials are likely to perform in practice. Machine learning algorithms can predict how successful a clinical trial is likely to be based on its design, allowing pharmaceutical companies to make more informed decisions about whether to invest resources in it.

Data science is revolutionizing clinical trials by helping identify ideal locations for clinical trials. Clinical trials are usually conducted at multiple sites around the country, which increases their cost and slows down the process of getting new drugs to market.

Using machine learning techniques, data scientists can identify which sites are most suitable for conducting clinical trials based on factors such as proximity and other resources that must be made available. They can then use this information to help create a list of potential sites that meet all criteria needed for conducting a trial effectively.

With that, companies no longer have to spend money on traveling costs or rent out unused buildings. They can conduct clinical trials in one location instead of spreading them out over multiple locations.

Conclusion

The role of data science in clinical trials will be essential to watch in the coming years. As pharmaceutical companies try to streamline this expensive, time-consuming process, finding ways to use data science more effectively will be critical. It'll lead to more drugs being approved by the FDA and help drive down the costs associated with clinical trials.

See the article here:

How Data Science Is Transforming The Clinical Trial Process - TechiExpert.com

Read More..

Authority Backlinks Service on Cloud Hosting Platforms Launched by LinkDaddy – Newsfile

July 23, 2024 11:31 PM EDT | Source: Plentisoft

Miami, Florida--(Newsfile Corp. - July 23, 2024) - LinkDaddy's latest updates help business owners to get their own marketing content placed on top cloud hosting sites, where it can help to improve their search engine rankings, or help them to rank for a larger selection of keywords.

Authority Backlinks Service On Cloud Hosting Platforms Launched By LinkDaddy

To view an enhanced version of this graphic, please visit: https://images.newsfilecorp.com/files/8814/217420_efd45d15d4a29a1c_002full.jpg

More information about how backlinking can improve search rankings and updated marketing techniques from LinkDaddy can be found at https://linkdaddy.com/cloud-authority-backlinks

Business owners commonly use the LinkDaddy content and backlinking service to expand their targeted marketing areas, reach new demographics, or improve the search rankings for new products or services. LinkDaddy is now able to place marketing content on 15 popular hosting services, including several of the most highly ranked options.

Although content can be hosted nearly anywhere online, LinkDaddy limits its hosting to servers with exceptionally high domain authority. This helps to build credibility with the search algorithms, as each new piece of content gives the client's business a boost to its own domain authority.

As Tony Peacock, LinkDaddy CEO, says "We craft high-quality content tailored to your specific keywords. This content is designed to resonate with your target audience and align with your website's niche."

While many marketing techniques provide short-term results, cloud backlinking has been shown to provide long-term and cumulative benefits. As each new piece of content with backlinks goes live, and the search engine algorithms find it on high-authority sites, client brands will be moved further up in the search results.

Clients can choose from 3 different packages on the LinkDaddy website, with each package containing a unique list of high-authority hosting options. Each client will receive content specific to their brands, products, and services, a personalized HTML page on a popular service, and will have their marketing content posted on up to 5 different, highly reputable hosting services.

Tony Peacock clarifies that, "Using the Cloud Stacking method helps your content show up more in search engine results. When your content ranks higher, it's easier for your audience to find it. This can lead to more people visiting your site, more engagement, and ultimately, more conversions."

More information about building backlinks with LinkDaddy and using content to improve search rankings can be found at https://linkdaddy.com/cloud-authority-backlinks/

To view the source version of this press release, please visit https://www.newsfilecorp.com/release/217420

SOURCE: Plentisoft

Continue reading here:
Authority Backlinks Service on Cloud Hosting Platforms Launched by LinkDaddy - Newsfile

Read More..

Cutting An IoT Fan Free Of The Cloud – Hackaday

The cloud is supposed to make everything better. You can control things remotely, with the aid of a benevolent corporation and their totally friendly servers. However, you might not like those servers, and you might prefer to take personal control of your hardware. If that's the case, you might like to follow the story of [ouaibe] and their quest to free a fan from the cloud.

The unit in question was a tower fan from Dreo. [ouaibe] noted that there was already a project to control the fans using Home Assistant, but pure lower-level local control was the real goal here. Work began on pulling apart the Dreo Android app to determine how it talked to the fan, eventually turning up a webserver on board, but little progress. The next step was to disassemble the unit entirely. That turned up multiple PCBs inside, with one obviously for wireless communication and another hosting a Sino Wealth microcontroller. Dumping firmwares followed, along with reverse engineering the webserver, and finally establishing a custom ESPHome integration to fully control the fan.

[ouaibe] has shared instructions on how to cut your own fan from the cloud, though notes that the work won't be extended to other Dreo products any time soon. In any case, it's a great example of just how much work it can take to fully understand and control an IoT device that's tethered to a commercial cloud server. It's not always easy, but it can be done!

See the original post here:
Cutting An IoT Fan Free Of The Cloud - Hackaday

Read More..

[News] Tencent Cloud Releases Self-developed Server OS, Supporting China's Top Three CPU Brands – TrendForce

Due to challenges in exporting high-performance processors based on x86 and Arm architectures to China, the country is gradually adopting domestically designed operating systems.

According to industry sources cited by Tom's Hardware, Tencent Cloud recently launched the TencentOS Server V3 operating system, which supports China's three major processors: Huawei's Kunpeng CPUs based on Arm, Sugon's Hygon CPUs based on x86, and Phytium's FeiTeng CPUs based on Arm.

The operating system optimizes CPU usage, power consumption, and memory usage. To optimize the operating system and domestic processors for data centers, Tencent has collaborated with Huawei and Sugon to develop a high-performance domestic database platform.

Reportedly, TencentOS Server V3 can run GPU clusters, aiding Tencents AI operations. The latest version of the operating system fully supports NVIDIA GPU virtualization, enhancing processor utilization for resource-intensive services such as Optical Character Recognition (OCR). This innovative approach reduces the cost of purchasing NVIDIA products by nearly 60%.

TencentOS Server is already running on nearly 10 million machines, making it one of the most widely deployed Linux operating systems in China. Other companies, such as Huawei, have also developed their own operating systems, like OpenEuler.

Read more

(Photo credit: Tencent Cloud)

Read the rest here:
[News] Tencent Cloud Releases Self-developed Server OS, Supporting Chinas Top Three CPU Brands - TrendForce

Read More..

Surge in AI server demand from cloud service providers: TrendForce – InfotechLead.com

TrendForce's latest industry report reveals sustained high demand for advanced AI servers from major cloud service providers (CSPs) and brand clients, projected to continue into 2024.

The expansion in production by TSMC, SK Hynix, Samsung, and Micron has alleviated shortages in the second quarter of 2024, significantly reducing the lead time for NVIDIA's flagship H100 solution from 40–50 weeks to less than 16 weeks.

Key Insights:

AI Server Shipments: AI server shipments in Q2 are estimated to rise by nearly 20 percent quarter-over-quarter, with an annual forecast now at 1.67 million units, representing a 41.5 percent year-over-year growth.

Budget Priorities: Major CSPs are prioritizing budgets towards AI server procurement, overshadowing the growth of general servers. The annual growth rate for general server shipments is a mere 1.9 percent, with AI servers expected to account for 12.2 percent of total server shipments, a 3.4 percentage point increase from 2023.

Market Value: AI servers are significantly boosting revenue growth, with their market value projected to exceed $187 billion in 2024 – a 69 percent growth rate – comprising 65 percent of the total server market value.

Regional Developments:

North America and China: North American CSPs like AWS and Meta are expanding proprietary ASICs, while Chinese companies Alibaba, Baidu, and Huawei are enhancing their ASIC AI solutions. This trend will likely increase the share of ASIC servers in the AI server market to 26 percent in 2024, with GPU-equipped AI servers holding about 71 percent.

Market Dynamics:

AI Chip Suppliers: NVIDIA dominates the GPU-equipped AI server market with a nearly 90 percent share, whereas AMD holds about 8 percent. When considering all AI chips used in AI servers (GPU, ASIC, FPGA), NVIDIA's market share is around 64 percent for the year.

Future Outlook: Demand for advanced AI servers is anticipated to remain robust through 2025, driven by NVIDIA's next-generation Blackwell platform (GB200, B100/B200), which will replace the Hopper platform. This shift is expected to boost demand for CoWoS and HBM technologies, with TSMC's CoWoS production capacity estimated to reach 550–600K units by the end of 2025, growing by nearly 80 percent.

Memory Advancements: Mainstream AI servers in 2024 will feature 80 GB of HBM3, with future chips like NVIDIA's Blackwell Ultra and AMD's MI350 expected to incorporate up to 288 GB of HBM3e by 2025. The overall HBM supply is projected to double by 2025, fueled by sustained demand in the AI server market.
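The growth figures above can be sanity-checked with a quick back-of-the-envelope calculation. The sketch below derives the implied 2023 baselines from the stated 2024 projections and growth rates; these derived baselines are my own arithmetic, not figures quoted by TrendForce.

```python
def implied_prior_year(current_value, yoy_growth_pct):
    """Given this year's value and its year-over-year growth rate,
    return last year's implied value."""
    return current_value / (1 + yoy_growth_pct / 100)

# 1.67M AI server shipments forecast for 2024 at 41.5% YoY growth
shipments_2023_m = implied_prior_year(1.67, 41.5)

# $187B AI server market value in 2024 at 69% growth, 65% of the total market
ai_value_2023_b = implied_prior_year(187, 69)
total_market_2024_b = 187 / 0.65

print(f"Implied 2023 AI server shipments: ~{shipments_2023_m:.2f}M units")
print(f"Implied 2023 AI server market value: ~${ai_value_2023_b:.0f}B")
print(f"Implied 2024 total server market value: ~${total_market_2024_b:.0f}B")
```

The arithmetic is internally consistent: roughly 1.18 million units and $111 billion in 2023, against a total 2024 server market value of about $288 billion.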

Conclusion:

The AI server market is experiencing unprecedented growth, with significant contributions to revenue and technological advancements. As major CSPs and tech giants continue to invest heavily in AI infrastructure, the industry is set for transformative developments through 2025.

View original post here:
Surge in AI server demand from cloud service providers: TrendForce - InfotechLead.com

Read More..

How decentralization could have prevented the global Microsoft meltdown – Cointelegraph

The widespread collapse of Microsoft's Windows operating system, which disrupted key services worldwide, is being touted as a vindication of blockchain and decentralized technology.

Zain Cheng, the chief technology officer of Web3 development firm Horizen Labs, told Cointelegraph the Microsoft outage underscores the vulnerabilities of centralized systems, where single points of failure can lead to widespread disruption.

As Cheng points out, the disruption was indeed widespread. From July 18 to 19, businesses, supermarkets, broadcasters, airlines and banks ground to a halt as 8.5 million systems encountered the "blue screen of death."

In the aftermath, the technical issues were blamed on everything from CrowdStrike cybersecurity software to a regulatory deal with the European Union.

However, cryptonians view the problem as something far more existential: centralization.

Wes Levitt, head of strategy at decentralized cloud network Theta Labs, told Cointelegraph:

Cheng added, "In contrast, decentralized blockchain networks like Bitcoin and Ethereum remained fully operational during the outage, highlighting their resilience."

"This incident demonstrates the benefits of decentralization, where distributed trust and verification processes reduce the risk of widespread failures and enhance system stability," Cheng said.

Whatever the narrative and the precise nature of the system flaw, Microsoft and CrowdStrike are pointing the finger of blame elsewhere.

They may have a hard time convincing the public, who are less than impressed with the consequences of their failure. Levitt says that even if these companies' public relations attempts fail to shift public perception, there's little an unhappy public can do.

"Ultimately, the CrowdStrike debacle will lead to some erosion of trust in legacy systems, but there isn't much the average consumer can do about it – these are services that you rely on every day that (for now) usually don't have a viable alternative," Levitt said.

Cheng again compared centralized and decentralized systems in terms of reliability.


"Public trust in centralized services has already been steadily eroding, and the CrowdStrike crash has intensified frustrations. Critical services like airlines, banks and shops were incapacitated, but DeFi [decentralized finance] continued on decentralized platforms," Cheng said.

Developers will need to invest considerable time and effort to make decentralized alternatives to existing systems.

"As Web3 matures, the reliability and security of decentralized alternatives will become even more relevant. Individuals and businesses are increasingly seeking more dependable and resilient solutions to counter centralized system failures," said Cheng.

"These events will likely spur increased interest and development in hardened, decentralized alternatives such as those built on blockchain networks," said Levitt.

In the wake of the CrowdStrike crash, decentralization advocates are enjoying their "told you so" moment.

The banking sector is one of the major industries disrupted from July 18 to 19. Cointelegraph asked Cheng if it is possible to argue that Bitcoin is now more reliable than traditional finance.

"Bitcoin kicked off the movement of immutable blockchain platforms, decentralized applications and continuous globally accessible networks, setting the stage for these innovations to outshine traditional finance," Cheng said.

Levitt remains slightly more cautious about extolling the virtues of decentralization, pointing out that any discussion about blockchain networks should also come with a few caveats.


"It is possible to argue that Bitcoin is more reliable than TradFi, and that certainly was the case this week, although we should be quick to remember that we still have our own work to do to battle the ills of network congestion on blockchain networks before we start bragging too much about network effectiveness," Levitt said.

As Levitt pointed out, not all blockchain networks are created equal or enjoy the same level of robustness as Bitcoin.

Levitt said: "This event should also call attention to concerns about centralization and single points of failure on blockchain networks; Bitcoin may have never gone down, but that isn't true of many other chains."

Meanwhile, centralized exchanges and other sectors within crypto are not likely to be quite so invulnerable, leading to calls from within the industry to consider moving to decentralized cloud-based architectures.

Read the rest here:

How decentralization could have prevented the global Microsoft meltdown - Cointelegraph

Read More..