Developers today are a worried lot, with auto code-generation platforms like GitHub Copilot, Amazon CodeWhisperer, and even ChatGPT and Bard threatening to make their jobs redundant in the coming months. Writing code is no longer the Herculean task it used to be, as just about anyone can now build new applications and tools simply by giving prompts.
Many developers are even being asked to learn prompt engineering in the name of upskilling and boosting productivity. So, what does one do now? Move on and focus on optimising LLMs.
LLMs demand heavy compute. Recently, OpenAI CEO Sam Altman said that he was worried about the shortage of GPUs for powering OpenAI's models. Optimising LLMs is clearly the need of the hour, and this is where research is increasingly shifting.
Take a look at the open source community, which is coming up with models like Falcon-7B that perform on a par with, or even better than, GPT-based models, even on a single device. These models require far less computation, improving efficiency and performance. Contributing to the open source ecosystem is what developers need to focus on, since even Google and OpenAI admit they cannot compete with what the community offers.
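As a rough illustration of what "running on a single device" looks like in practice, the sketch below loads an open 7B checkpoint with 8-bit quantisation so it fits on one consumer GPU. It assumes the Hugging Face transformers, accelerate, and bitsandbytes packages and the tiiuae/falcon-7b checkpoint; exact flags may differ between library versions.

# Minimal sketch: running an open 7B model on a single GPU via 8-bit quantisation.
# Assumes: pip install transformers accelerate bitsandbytes (flags may vary by version).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # place layers on the available GPU/CPU automatically
    load_in_8bit=True,       # 8-bit weights roughly halve memory versus fp16
    trust_remote_code=True,  # Falcon ships custom modelling code
)

prompt = "Explain why smaller, efficient models matter:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))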
Building and improving the efficiency of models comes down to better algorithms underneath them. Recently, DeepMind released AlphaDev, an agent built on AlphaZero that discovered a sorting routine three times faster than the human-written one. This is one of the major breakthroughs in reducing the computation requirements of AI systems through better sorting algorithms.
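To give a flavour of the kind of routine involved, the toy sketch below compares Python's built-in sort against a fixed three-element sorting network, the sort of tiny, branch-light kernel AlphaDev optimised at the assembly level. It is purely illustrative and is not DeepMind's discovered algorithm; at the Python level the interpreter overhead dominates, so no speedup is claimed here.

# Illustrative toy: a compare-and-swap network for exactly three items.
import random
import timeit

def sort3(a, b, c):
    # Three comparisons, no loops: the fixed-size pattern AlphaDev worked on.
    if a > b:
        a, b = b, a
    if b > c:
        b, c = c, b
    if a > b:
        a, b = b, a
    return a, b, c

data = [tuple(random.random() for _ in range(3)) for _ in range(10_000)]

t_builtin = timeit.timeit(lambda: [tuple(sorted(t)) for t in data], number=20)
t_network = timeit.timeit(lambda: [sort3(*t) for t in data], number=20)
print(f"built-in sorted: {t_builtin:.3f}s, sorting network: {t_network:.3f}s")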
Replit is another great example of a platform boosting the hard-core developer community. Apart from banking on the phone-based developer ecosystem and building a mobile experience optimised for it, Replit came to the rescue of developers by starting Replit Bounties.
Using Replit, non-developers can put up a bug or a problem as a bounty, and developers can solve it to earn Cycles, Replit's virtual currency, which can be a great source of income.
Now that developers have a way to earn money, there are plenty of problems that need solving. Since LLMs are compute heavy, they are also carbon intensive. Alok Lall, sustainability head at Microsoft India, told AIM, "When we look at reducing emissions, it is very easy to look at infrastructure and get more efficient hardware like servers, heating, ventilation, and cooling systems, but addressing and understanding the main ingredient, the code, is the most important."
This is why Microsoft partnered with Thoughtworks, GitHub, and Accenture in 2021 to launch the Green Software Foundation, which aims to make coding and software development sustainable. It clearly shows that making models more sustainable through more efficient algorithms is of utmost importance for many companies and developers.
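One concrete way for developers to start "addressing the code" is to instrument their workloads and estimate the emissions they produce. The sketch below assumes the open source codecarbon package (one of several such tools, not a Green Software Foundation product); the heavy_workload function is a hypothetical stand-in for real training or inference work.

# Rough sketch: estimating the carbon footprint of a compute-heavy job with codecarbon.
# Assumes: pip install codecarbon
from codecarbon import EmissionsTracker

def heavy_workload():
    # Placeholder for a compute-heavy task such as model training or inference.
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker(project_name="llm-efficiency-demo")
tracker.start()
try:
    heavy_workload()
finally:
    emissions_kg = tracker.stop()  # returns the estimated kg of CO2-equivalent

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")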
Even if we consider generative AI, or more specifically LLM-based models, to be a bubble that will eventually burst, there is plenty of space that requires research and development by developers. For example, DeepMind's AlphaFold for predicting protein structures is one of the crucial fields that needs more exploration.
Banking on this, Soutick Saha, bioinformatics developer at Wolfram, recently developed the ProteinVisualisation paclet, a tool that brings biomolecular structures to everyone to build further on. He described how he has worked with six programming languages over the last 12 years, yet was able to develop this by learning the Wolfram Language in just five months.
In India, the rise of open source semiconductor technology like RISC-V for designing chips has drawn more startups into the chip design industry, and RISC-V startups in the country are increasingly getting funded.
Then there is quantum computing. NVIDIA opened the floor for research in the field by replicating CUDA's success and building QODA (Quantum Optimised Device Architecture). The open source platform is built for integrating GPUs and CPUs in one system, so developers, not prompt engineers, can dive into the field.
Similarly, Quantum Brilliance, a company focused on developing miniaturised, room-temperature quantum computing solutions, open-sourced its Qristal SDK. This will further allow developers to innovate quantum algorithms for quantum accelerators. The SDK includes C++ and Python APIs, with support for NVIDIA CUDA for creating quantum-enhanced designs.
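I have not verified the Qristal API itself, so the sketch below uses Qiskit purely as a stand-in to show what a minimal quantum program looks like through a Python API: preparing a Bell state and reading off its measurement probabilities, the kind of building block such SDKs expose.

# Stand-in sketch using Qiskit (NOT the Qristal SDK): prepare and inspect a Bell state.
# Assumes: pip install qiskit
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

qc = QuantumCircuit(2)
qc.h(0)       # put qubit 0 into superposition
qc.cx(0, 1)   # entangle qubit 1 with qubit 0

state = Statevector(qc)              # simulate the circuit exactly on a classical machine
print(state.probabilities_dict())    # expect roughly {'00': 0.5, '11': 0.5}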
No-code platforms typically excel at creating simple or straightforward applications. However, when it comes to building complex systems with intricate business logic, integrations, and scalability requirements, hard-core developers are still essential. They can focus on architecting and building robust, scalable, and efficient systems that demand advanced technical knowledge.
There is a lot more to do, and we are just getting started. Now, more than ever, is the time to quit sulking and complaining about prompt engineers. Instead of rolling their eyes at these auto code-generation platforms, developers can leverage their creativity and adaptability to solve complex problems and build architectures, while letting these platforms do the laborious task of writing code.