In this contributed article, editorial consultant Jelani Harper takes a look at how word embeddings are directly responsible for many of the rapid advancements natural language technologies have made over the past couple of years. They're foundational to the functionality of popular Large Language Models like ChatGPT and other GPT iterations. These mathematical representations also have undeniable implications for textual applications of Generative AI.
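As a rough illustration of the idea (not drawn from the article itself), word embeddings map words to dense numeric vectors so that semantic relatedness becomes measurable as geometric closeness. The minimal sketch below uses hand-made toy vectors and cosine similarity; the specific words, values, and dimensionality are illustrative assumptions, not output from any real model.

```python
# Minimal sketch: word embeddings as vectors, with cosine similarity as a
# proxy for semantic relatedness. The 4-dimensional vectors are toy values
# chosen only for illustration; real models use hundreds of dimensions.
import numpy as np

embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.70, 0.12, 0.06]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 means more similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```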
Heard on the Street – 6/5/2023
Welcome to insideAI News’s “Heard on the Street” round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace.
Video Highlights: Fine Tune GPT-J 6B in Under 3 Hours on IPUs
Did you know you can run GPT-J 6B on Graphcore IPU in the cloud? Following the now-infamous leaked Google memo, there's been a real storm in the AI world recently around smaller, open-source language models like GPT-J that are cheaper and faster to fine-tune and run, and that perform just as well as larger models on many language tasks.
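For readers who want a feel for what working with GPT-J looks like at the library level, here is a minimal sketch using the Hugging Face transformers API. It only loads the model and generates a continuation; the video's actual fine-tuning workflow runs on Graphcore IPUs with Graphcore's own tooling, which is not reproduced here, and the model identifier and prompt below are assumptions for illustration.

```python
# Minimal sketch (not the video's IPU workflow): load GPT-J 6B with the
# Hugging Face transformers library and generate a short continuation.
# The Hub identifier "EleutherAI/gpt-j-6B" is assumed for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Smaller open-source language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```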
UVA Researchers Built an AI Algorithm That Understands Physics
Normally, when testing the behavior of materials under high heat or explosive conditions, researchers have to run simulation after simulation, a data-intensive process that can take days even on a supercomputer. However, with a deep learning algorithm created by Stephen Baek, Phong Nguyen and their research team, the process takes less than a second on a laptop.
ChatGPT: A Fraud Fighter’s Friend or Foe?
In this contributed article, Doriel Abrahams, Head of Risk, U.S., Forter, explores how ChatGPT can combine with social engineering to conduct fraud, some of the generative AI trends he anticipates will play out this year, and how existing fraud rings could use the technology to manipulate businesses and consumers alike.
Top Data Science Ph.D. Dissertations (2019-2020)
The American Mathematical Society (AMS) recently published, in its monthly Notices journal, a long list of all doctoral degrees in mathematics and statistics conferred from July 1, 2019 to June 30, 2020. The degrees come from 242 departments in 186 universities in the U.S. I enjoy keeping a finger on the pulse of research in my field, so I went through the entire published list and picked out 48 dissertations with high relevance to data science, machine learning, AI and deep learning. The list below is organized alphabetically by state.
ICLR 2023 Paper Award Winners
The International Conference on Learning Representations (ICLR), the premier gathering of professionals dedicated to the advancement of the many branches of artificial intelligence (AI) and deep learning, announced 4 award-winning papers and 5 honorable mention papers.
One AI Introduces CutGPT, a Slim and Task-Oriented Generative AI Solution
One AI, the language AI platform that enables businesses to tune and deploy generative AI capabilities in days, has announced the release of CutGPT, a slim and task-oriented generative AI solution.
Video Highlights: Building Machine Learning Apps with Hugging Face: LLMs to Diffusion Modeling
In this video presentation from our friends over at FourthBrain, Jeff Boudier, Product Director at Hugging Face, discusses building machine learning apps with Hugging Face, from LLMs to diffusion modeling.
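As a rough illustration of the kind of building block the presentation covers, the sketch below uses the transformers pipeline API for text generation. The model name and prompt are illustrative assumptions rather than the specific examples from the talk, and the diffusion-modeling side (e.g., the diffusers library) is not shown.

```python
# Minimal sketch of a Hugging Face app building block: a text-generation
# pipeline. The model ("gpt2") and prompt are illustrative choices only.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator("Building machine learning apps with Hugging Face is",
                   max_new_tokens=25)
print(result[0]["generated_text"])
```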
Book Review: Math for Deep Learning
One of my favorite learning resources for gaining an understanding of the mathematics behind deep learning is "Math for Deep Learning" by Ronald T. Kneusel from No Starch Press. If you're interested in getting up to speed quickly with how deep learning algorithms work at a basic level, then this is the book for you.