Fill-in-the-Blank: Machine Learning Foundations and Core Concepts

Complete the sentences by filling in the blanks. Each correct answer earns points!

15 Questions • 150 Total Points
1. ________ is a field of AI focused on statistical algorithms that learn from data and generalize to unseen data without explicit programming.

Context: Machine learning definition

2. Deep learning is a subset of ________, and ML is a subset of AI.

Context: Relationship between deep learning, ML, and AI

3. ________ is the ability to perform accurately on new, unseen examples after training on a finite dataset.

Context: Generalization meaning

4. A ________ measures the discrepancy between model predictions and true outcomes.

Context: Loss function concept
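
For reference while answering, two standard instances of this concept can be written out explicitly; the notation below (ŷ for the model's prediction, y for the true outcome) is generic textbook notation rather than anything specific to this quiz:

```latex
% Squared error, common in regression:
\ell(\hat{y}, y) = (\hat{y} - y)^2

% Binary cross-entropy, common in classification, with \hat{y} \in (0, 1):
\ell(\hat{y}, y) = -\, y \log \hat{y} \;-\; (1 - y) \log(1 - \hat{y})
```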

5. Many ML/deep learning algorithms can be described as minimizing ________ on training data under a theoretical framework.

Context: Empirical risk minimization (ERM)

6. Minimizing a loss function on training data (empirical risk minimization) causes ________, which leads to improved predictive performance.

Context: Cause→effect: ERM leads to parameter learning and better prediction
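
A minimal sketch of what questions 5–6 describe, assuming a linear model trained with squared-error loss; the data, learning rate, and variable names here are illustrative only:

```python
# Empirical risk minimization by gradient descent on a toy linear model.
# Everything here (model, data, hyperparameters) is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 training examples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=100)

w = np.zeros(3)                          # parameters to be learned
lr = 0.1
for _ in range(200):
    residual = X @ w - y
    grad = 2 * X.T @ residual / len(y)   # gradient of the mean squared error
    w -= lr * grad                       # parameter update: this is the "learning"

print(w)  # close to true_w once the empirical risk has been driven down
```

Driving the training loss down is what updates the parameters, which is the cause→effect chain question 6 asks about.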

7. Finite training sets and uncertainty about the future cause learning theory to provide ________ rather than absolute guarantees.

Context: Cause→effect: uncertainty leads to probabilistic bounds

8. ________ provides a theoretical framework for describing when learning algorithms can achieve near-optimal performance with high probability.

Context: PAC learning meaning
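
One concrete result from this framework, stated for a finite hypothesis class in the realizable setting (a standard textbook bound, included for orientation rather than taken from the quiz material):

```latex
% With probability at least 1 - \delta, ERM over a finite class H returns a
% hypothesis with true error at most \epsilon once the sample size m satisfies:
m \;\ge\; \frac{1}{\epsilon}\left(\ln |H| + \ln \frac{1}{\delta}\right)
```

The ε and δ here are exactly the probabilistic, rather than absolute, guarantees that question 7 refers to.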

9. Clustering groups similar unlabeled data points into k clusters, which causes the data to be represented compactly using ________.

Context: Cause→effect: clustering leads to centroid-based compact representation

10. k-means clustering partitions data into k clusters, each represented by a ________.

Context: k-means clustering term
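
A minimal sketch of the algorithm questions 9–10 describe (Lloyd's algorithm); the data and parameter choices are illustrative:

```python
# k-means (Lloyd's algorithm): alternate between assigning points to their
# nearest centroid and moving each centroid to the mean of its cluster.
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]  # init from data
    for _ in range(n_iter):
        # Assignment step: label each point with its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: recompute each centroid as its cluster's mean
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=0.0, size=(30, 2)), rng.normal(loc=5.0, size=(30, 2))])
centroids, labels = kmeans(X, k=2)
print(centroids)  # the k centroids are the compact representation from question 9
```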

11. A system that predicts posterior probabilities of a sequence given its history can be used for optimal data compression because arithmetic coding turns probabilistic predictions into ________ representations.

Context: Cause→effect: posterior prediction enables optimal compression
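
The link asserted in question 11 can be made precise with Shannon's code-length identity; this is standard information theory rather than quiz-specific material:

```latex
% Ideal code length for symbol x_t given its history x_{<t}:
\ell(x_t) = -\log_2 P(x_t \mid x_{<t}) \text{ bits}
% Arithmetic coding achieves a total length within about 2 bits of
% \sum_t -\log_2 P(x_t \mid x_{<t}), so better predictions yield shorter codes.
```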

12. An optimal compressor can encode symbols effectively given prior history, which allows it to be used for ________.

Context: Cause→effect: optimal compression enables prediction
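
Question 12 is the converse direction. Inverting the identity above suggests how a compressor's code lengths define an implicit predictive distribution; this form is a hedged reconstruction of the standard compression–prediction duality, with ℓ(·) denoting the compressor's output length in bits:

```latex
% The probability a compressor implicitly assigns to symbol x after history h:
P(x \mid h) \approx 2^{\,\ell(h) - \ell(hx)}
% Shorter continuations correspond to more probable symbols, so the
% compressor can be read as a predictor.
```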

13. Supervised learning trains models using ________ data to predict outputs such as class labels or numeric values.

Context: Supervised learning meaning
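
For this question and the next, the contrast comes down to whether the training data carries output labels; a tiny illustrative snippet (all values made up):

```python
# Supervised learning sees (input, label) pairs; unsupervised learning sees inputs only.
X = [[5.1, 3.5], [6.2, 2.9], [4.8, 3.1]]   # inputs (feature vectors)
y = ["cat", "dog", "cat"]                   # labels: present only in the supervised case

supervised_data = list(zip(X, y))  # learn a map from inputs to labels (classification)
unsupervised_data = X              # find structure in the inputs alone, e.g. clusters
print(supervised_data[0])          # ([5.1, 3.5], 'cat')
```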

14. Unsupervised learning finds structure in ________ data, with clustering grouping similar points into clusters.

Context: Unsupervised learning meaning

15. ML overlaps with statistics and data mining, but ML emphasizes generalizable prediction while data mining (KDD) emphasizes discovering previously unknown ________ in data.

Context: Relationship to data mining (KDD) vs ML goals