
Token 1.2: Use Cases for Foundation Models and Key Benefits

(but don’t give up your decision tree yet!)

In our 1.1 episode about the paradigm shift from data-centric to task-centric machine learning (ML), we mentioned that task-centric ML focuses on using foundation models to perform a wide range of tasks efficiently and with fewer training examples. In the comment section, Vu Ha, startup advisor at AI2 Incubator and Semantic Scholar’s first engineer, pointed out that he first used the term “task-centric” in October 2021, way ahead of time!

In that article, he proposed the following visualization of task-centric ML:

Image Credit: AI2 Incubator, Insights 4

Now, let's dig deeper into the world of foundation models and find out what makes them such game-changers. We'll also take a look at the cases where traditional ML is still king.

Basics first: what are foundation models?

Foundation models (FMs) are large-scale neural networks trained on broad data. They can perform many tasks without task-specific training, often through zero-shot or few-shot learning. They may also handle multiple types of data (text, images, audio), making them a superset of large language models (LLMs).
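To make the zero-shot vs. few-shot distinction concrete, here is a minimal sketch of how the two prompting styles differ. This is illustrative only: the task wording, example texts, and labels are assumptions for the sketch, not anything from a specific model's API, and the actual call to a model is omitted.

```python
# Sketch: zero-shot vs. few-shot prompting for a sentiment-classification task.
# The prompt strings below would be sent to a foundation model; the send step
# is omitted, since it depends on whichever model/provider you use.

def zero_shot_prompt(task: str, text: str) -> str:
    # Zero-shot: the model sees only a task description and the query.
    return f"{task}\nText: {text}\nLabel:"

def few_shot_prompt(task: str, examples: list[tuple[str, str]], text: str) -> str:
    # Few-shot: a handful of labeled demonstrations precede the query,
    # letting the model infer the task format without any fine-tuning.
    demos = "\n".join(f"Text: {t}\nLabel: {label}" for t, label in examples)
    return f"{task}\n{demos}\nText: {text}\nLabel:"

# Hypothetical labeled examples (assumptions for illustration).
examples = [
    ("Great product, works as advertised!", "positive"),
    ("Terrible support, never again.", "negative"),
]

prompt = few_shot_prompt("Classify the sentiment.", examples, "I love it.")
print(prompt)
```

The point is that adapting the model to a new task is just a change of prompt text, not a new training run, which is why foundation models need so few task-specific examples.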

Let’s dive a little deeper

Foundation models, a term coined at Stanford University “to underscore their critically central yet incomplete character”, have altered the trajectory of ML by lowering the barrier to entry for building complex AI systems. They've proven especially adept at one thing: versatility. Unlike the niche specialization of earlier models, these computational behemoths are surprisingly flexible across an array of tasks.

These models employ…

This is the last week when you can subscribe to Turing Post for only $50/year or $6/month and gain access to ALL our articles. This includes the FM series, which is packed with knowledge, the Unicorn Chronicle, which offers insights into the main genAI companies, and other bonuses. Plus, you'll have the satisfaction of supporting independent tech journalism. Thank you!
