To many, AI is just a horrible Steven Spielberg movie. To others, it's the next generation of learning computers. But what is artificial intelligence, exactly? The answer depends on who you ask. Broadly, artificial intelligence (AI) is the combination of computer science and robust datasets, deployed to solve some kind of problem.
Many definitions of artificial intelligence include a comparison to the human mind or brain, whether in form or function. Alan Turing wrote in 1950 about thinking machines that could respond to a problem using human-like reasoning. His eponymous Turing test is still a benchmark for natural language processing. Later, Stuart Russell and Peter Norvig observed that humans are intelligent, but not always rational. Russell and Norvig saw two classes of artificial intelligence: systems that think and act like a human being, versus those that think and act rationally. Today, we've got all kinds of programs we call AI.
Many AIs employ neural nets, whose code is written to emulate some aspect of the architecture of neurons or the brain. However, not all intelligence is human-like, nor is it necessarily the best idea to emulate neurobiological information processing. That's why engineers limit how far they carry the brain metaphor: it's more about how phenomenally parallel the brain is, and about its distributed memory handling. As defined by John McCarthy in 2004, artificial intelligence is the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable.
Moreover, the distinction between a neural net and an AI is often a matter of philosophy more than of capabilities or design. Many AI-powered systems are neural nets under the hood, and we also call some neural nets AIs. For example, OpenAI's powerful GPT-3 is a type of neural net called a transformer (more on these below). A robust neural net's performance can equal or outclass a narrow AI's. There is much overlap between neural nets and artificial intelligence, but the capacity for machine learning can be the dividing line.
Conceptually: In terms of its logical structure, an AI needs three fundamental parts. First, there's the decision process: usually an equation, a model, or just some code. AIs often perform classification or apply transformations, and to do that, the AI must be able to detect patterns in the data. Second, there's an error function: some way for the AI to check its work. And third, if the AI is going to learn from experience, it needs some way to optimize its model. Many neural networks do this with a system of weighted nodes, where each node has both a value and a relationship to its network neighbors. Values change over time; stronger relationships have a higher weight in the error function.
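As a loose illustration of those three pillars (not any particular product's implementation), here is a minimal sketch in Python; the linear model, class name, and learning rate are placeholders:

```python
import numpy as np

# A minimal sketch of the three pillars. The linear "decision process"
# is a stand-in for whatever equation, model, or code a real AI uses.
class TinyModel:
    def __init__(self, n_inputs):
        self.weights = np.zeros(n_inputs)      # weighted nodes

    def decide(self, x):                       # 1. decision process
        return x @ self.weights

    def error(self, x, target):                # 2. error function: check its work
        return np.mean((self.decide(x) - target) ** 2)

    def optimize(self, x, target, lr=0.01):    # 3. learn from experience
        grad = 2 * x.T @ (self.decide(x) - target) / len(target)
        self.weights -= lr * grad              # bigger error signals move the weights more

x = np.array([[1.0, 2.0], [3.0, 4.0]])
target = np.array([5.0, 11.0])
model = TinyModel(2)
for _ in range(2000):
    model.optimize(x, target, lr=0.02)
print(round(model.error(x, target), 4))        # the error shrinks as the model learns
```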
Deep learning networks have more hidden layers than conventional neural networks. Circles are nodes, or neurons.
Physically: Typically, an AI is just software. AI-powered software services like Grammarly and Rytr use neural nets, like GPT-3. Those neural nets consist of equations or commands, written in languages like Python or Common Lisp. They run comparisons, perform transformations, and suss out patterns from the data. They usually run on server-side hardware, but which hardware isn't important. Any conventional silicon will do, be it CPU or GPU. However, there are dedicated hardware neural nets: a special kind of ASIC called neuromorphic chips.
Not all ASICs are neuromorphic designs, but all neuromorphic chips are ASICs. Neuromorphic design is fundamentally different from CPU design, and it only nominally overlaps with a GPU's multi-core architecture. But it's not some exotic new transistor type, nor any strange and eldritch kind of data structure. It's all about tensors. Tensors describe the relationships between things; they're a kind of mathematical object that can carry metadata, just like a digital photo has EXIF data.
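As a rough illustration of that metaphor (the numbers and labels here are made up), a rank-2 tensor can be stored as a plain array, with its metadata riding alongside:

```python
import numpy as np

# A rank-2 tensor: each entry describes a relationship (here, a link
# strength) between a pair of items in a tiny network.
relationships = np.array([
    [0.0, 0.8, 0.1],
    [0.8, 0.0, 0.5],
    [0.1, 0.5, 0.0],
])

# Like EXIF data on a photo, descriptive metadata travels with the raw numbers.
metadata = {"items": ["sensor_a", "sensor_b", "sensor_c"], "units": "link strength"}
print(relationships.shape, metadata["items"])
```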
Modern Nvidia RTX GPUs have a huge number of tensor cores. That makes sense if you're drawing moving polygons, each with some number of properties or effects that apply to it. But tensors can handle more than just spatial data. The ability to parallelize tensor calculations is also why GPUs get scalped for crypto mining, and why they're used in cluster computing, especially for deep learning. GPUs excel at organizing many different threads at once.
But no matter how elegant your data organization might be, it still has to filter through multiple layers of software abstraction before it ever becomes binary. Intel's neuromorphic chip, Loihi 2, affords a very different approach.
Loihi 2 is a neuromorphic chip that comes as a package deal with a software ecosystem named Lava. Loihi's physical architecture invites, almost requires, the use of weighting and an error function, both defining features of AI and neural nets. The chip's biomimetic design extends to its electrical signaling. Instead of ones and zeroes, on or off, Loihi fires in spikes with an integer value capable of carrying much more data. It begs to be used with tensors. What if you didn't have to translate your values into machine code and then binary? What if you could just encode them directly?
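To make the graded-spike idea concrete, here is an illustrative integrate-and-fire neuron in plain Python. This is a generic sketch of the concept, not Lava code or Loihi's actual signaling scheme; the threshold and decay values are arbitrary.

```python
# Illustrative only: a neuron that accumulates input and, on crossing a
# threshold, fires a spike carrying an integer magnitude rather than a bare 1 or 0.
class GradedSpikingNeuron:
    def __init__(self, threshold=10, decay=1):
        self.potential = 0
        self.threshold = threshold
        self.decay = decay

    def step(self, weighted_input):
        self.potential = max(0, self.potential - self.decay) + weighted_input
        if self.potential >= self.threshold:
            spike = self.potential   # the spike's value carries extra information
            self.potential = 0       # reset after firing
            return spike
        return 0                     # no spike this timestep

neuron = GradedSpikingNeuron()
print([neuron.step(x) for x in [3, 4, 5, 0, 2, 9]])  # fires only when the potential crosses 10
```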
Machine learning models that use Lava can take full advantage of Loihi 2s unique physical design. Together, they offer a hybrid hardware-software neural net that can process relationships between multiple entire multi-dimensional datasets, like an acrobat spinning plates.
AI tools like Rytr, Grammarly, and others do their work in a regular desktop browser. In contrast, neuromorphic chips like Loihi aren't designed for use in consumer systems. (At least, not yet.) They're intended for researchers. Neuromorphic engineering has a different strength: it allows silicon to perform another kind of biomimicry. Brains are extremely cheap, in terms of power use per unit of throughput. The hope is that Loihi and other neuromorphic systems can mimic that power efficiency to break out of the Iron Triangle and deliver all three: good, fast, and cheap.
If the three-part logical structure of an AI sounds familiar, that's because neural nets have the same three logical pillars. In fact, from IBM's perspective, the relationship between machine learning, deep learning, neural networks, and artificial intelligence is a hierarchy of evolution. It's just like the relationship between Charmander, Charmeleon, and Charizard: they're all separate entities in their own right, but each is based on the one before, and they grow in power as they evolve. We still have Charmanders even though we also have Charizards.
Artificial intelligence as it relates to machine learning, neural networks, and deep learning. Image: IBM
When an AI learns, it's different from just saving a file after making edits. To an AI, learning involves changing its process.
Many neural nets learn through a process called back-propagation. Typically, a neural net is a feed-forward process, because data only moves in one direction through the network. It's efficient, but it's also a kind of ballistic (unguided) process. In back-propagation, however, later nodes in the process get to pass information back to earlier nodes. Not all neural nets perform back-propagation, but for those that do, the effect is like changing the coefficients in front of the variables in an equation.
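Here is a toy back-propagation loop written from scratch with NumPy. The network shape, learning rate, and XOR task are arbitrary choices for illustration; the point is that gradients computed at the output flow backward to adjust the earlier layer's weights.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(5000):
    # forward (feed-forward) pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # error function: mean squared error
    loss = np.mean((out - y) ** 2)
    # backward pass: later nodes hand gradients back to earlier ones
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # nudging the weights is like changing the coefficients in an equation
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

print(round(loss, 4))  # with most random seeds, the loss shrinks toward zero
```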
We also divide neural nets into two classes, depending on what type of problems they can solve. In supervised learning, a neural net checks its work against a labeled training set or an overwatch; in most cases, that overwatch is a human. For example, SwiftKey learns how you text and adjusts its autocorrect to match. Pandora uses listeners' input to finely classify music in order to build specifically tailored playlists. 3blue1brown even has an excellent explainer series on neural nets, where he discusses a neural net using supervised learning to perform handwriting recognition.
Supervised learning is great for fine accuracy on an unchanging set of parameters, like alphabets. Unsupervised learning, however, can wrangle data with changing numbers of dimensions. (An equation with x, y and z terms is a three-dimensional equation.) Unsupervised learning tends to win with small datasets. Its also good at recognizing patterns we might not even know to look for.
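A small, hedged sketch of the two regimes using scikit-learn's bundled digits dataset (the model choices here are arbitrary examples, not the ones the products above actually use):

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

X, y = load_digits(return_X_y=True)

# Supervised: the labels y act as the "overwatch" the model checks itself against.
clf = LogisticRegression(max_iter=5000).fit(X, y)
print("supervised training accuracy:", round(clf.score(X, y), 3))

# Unsupervised: no labels at all; the model groups the digits by pattern alone.
clusters = KMeans(n_clusters=10, n_init=10, random_state=0).fit_predict(X)
print("first ten cluster assignments:", clusters[:10])
```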
Transformers are a special, versatile kind of AI capable of unsupervised learning. They can integrate many different streams of data, each with its own changing parameters. Because of this, they're great at handling tensors. Tensors, in turn, are great for keeping all that data organized. With the combined powers of tensors and transformers, we can handle more complex datasets.
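At the heart of a transformer is attention, which is itself just tensor math. Here is a minimal, self-contained sketch of scaled dot-product self-attention in NumPy; it's a bare-bones illustration rather than a full transformer, and the toy data is random:

```python
import numpy as np

def self_attention(x):
    """Every token looks at every other token and takes a weighted mix."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)                     # pairwise relationships: a tensor
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over each row
    return weights @ x                                # blend the values by attention weight

tokens = np.random.default_rng(0).normal(size=(4, 8))  # 4 toy tokens, 8 dimensions each
print(self_attention(tokens).shape)                     # (4, 8)
```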
Video upscaling and motion smoothing are great applications for AI transformers. Likewise, tensors are crucial to the detection of deepfakes and alterations. With deepfake tools reproducing in the wild, it's a digital arms race.
The person in this image does not exist. This is a deepfake image created by StyleGAN, Nvidia's generative adversarial neural network.
Video signal has high dimensionality. It's made of a series of images, which are themselves composed of a series of coordinates and color values. Mathematically and in computer code, we represent those quantities as matrices or n-dimensional arrays. Helpfully, tensors are great for matrix and array wrangling. DaVinci Resolve, for example, uses tensor processing in its (Nvidia RTX) hardware-accelerated Neural Engine facial recognition utility. Hand those tensors to a transformer, and its powers of unsupervised learning do a great job picking out the curves of motion on-screen and in real life.
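For a quick sense of that dimensionality, one second of 1080p video already fills a four-dimensional array (this is just raw NumPy bookkeeping, not DaVinci Resolve's actual pipeline):

```python
import numpy as np

# One second of 30 fps, 1080p RGB video: (frames, height, width, color channels)
video = np.zeros((30, 1080, 1920, 3), dtype=np.uint8)
print(video.shape)                 # (30, 1080, 1920, 3)
print(video.nbytes / 1e6, "MB")    # roughly 187 MB, uncompressed
```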
In fact, that ability to track multiple curves against one another is why the tensor-transformer dream team has taken so well to things like natural language processing. And the approach can generalize. Convolutional transformers, a hybrid of a CNN and a transformer, excel at image recognition on the fly. This tech is in use today, for things like robot search and rescue or assistive image and text recognition, as well as the much more controversial practice of dragnet facial recognition, à la Hong Kong.
The ability to handle a changing mass of data is great for consumer and assistive tech, but it's also clutch for things like mapping the genome and improving drug design. The list goes on. Transformers can also handle different kinds of dimensions, not just spatial ones, which is useful for managing an array of devices or embedded sensors, like weather tracking, traffic routing, or industrial control systems. That's what makes AI so useful for data processing at the edge.
Not only does everyone have a cell phone; there are embedded systems in everything. This proliferation of devices gives rise to an ad hoc global network called the Internet of Things (IoT). In the parlance of embedded systems, the edge represents the outermost fringe of end nodes within the collective IoT network. Edge intelligence takes two main forms: AI on edge and AI for edge. The distinction is where the processing happens. AI on edge refers to network end nodes (everything from consumer devices to cars and industrial control systems) that employ AI to crunch data locally. AI for the edge enables edge intelligence by offloading some of the compute demand to the cloud.
In practice, the main differences between the two are latency and horsepower. Local processing is always going to be faster than a data pipeline beholden to ping times. The tradeoff is the computing power available server-side.
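A hedged sketch of how that tradeoff might look in code; local_model, cloud_client, and the 50 ms budget are hypothetical placeholders, not any real device's API:

```python
def classify(frame, latency_budget_ms, local_model, cloud_client):
    """Route inference locally or to the cloud based on how long we can wait."""
    if latency_budget_ms < 50:                # hard real-time: keep it on-device
        return local_model.predict(frame)     # AI on edge: less horsepower, no ping
    return cloud_client.predict(frame)        # AI for the edge: more compute, more latency
```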
Embedded systems, consumer devices, industrial control systems, and other end nodes in the IoT all add up to a monumental volume of information that needs processing. Some phone home, some have to process data in near real-time, and some have to check and correct their own work on the fly. Operating in the wild, these physical systems act just like the nodes in a neural net. Their collective throughput is so complex that in a sense, the IoT has become the AIoT: the artificial intelligence of things.
As devices get cheaper, even the tiny slips of silicon that run low-end embedded systems have surprising computing power. But having a computer in a thing doesn't necessarily make it smarter. Everything's got Wi-Fi or Bluetooth now. Some of it is really cool. Some of it is made of bees. If I forget to leave the door open on my front-loading washing machine, I can tell it to run a cleaning cycle from my phone. But the IoT is already a well-known security nightmare. Parasitic global botnets live in consumer routers. Hardware failures can cascade, like the Great Northeast Blackout of summer 2003, or when Texas froze solid in 2021. We also live in a timeline where a faulty firmware update can brick your shoes.
There's a common pipeline (hypeline?) in tech innovation. When some Silicon Valley startup invents a widget, it goes from idea to hype train to widgets-as-a-service to disappointment, before finally figuring out what the widget's actually good for.
Oh, okay, there is an actual hypeline. Above: The 2018 Gartner hype cycle. Note how many forms of artificial intelligence showed up on this roller coaster then, and where they are now. Image: Gartner, 2018
This is why we lampoon the IoT with loving names like the Internet of Shitty Things and the Internet of Stings. (Internet of Stings devices communicate over TCBee-IP.) But the AIoT isn't something anyone can sell. It's more than the sum of its parts. The AIoT is a set of emergent properties that we have to manage if we're going to avoid an explosion of splinternets, and keep the world operating in real time.
In practice, artificial intelligence is often the same thing as a neural net capable of machine learning. They're both software that can run on whatever CPU or GPU is available and powerful enough. Neural nets often have the power to perform machine learning via back-propagation. There's also a kind of hybrid hardware-and-software neural net that brings a new meaning to machine learning, made using tensors, ASICs, and Intel's neuromorphic engineering. Furthermore, the emergent collective intelligence of the IoT has created a demand for AI on, and for, the edge. Hopefully we can do it justice.
Go here to see the original:
What Is Artificial Intelligence? - ExtremeTech