
Welcome to Quick Linear Algebra, where we uncover the math behind all the buzzwords — without turning your notebook into an ancient scroll of equations.

If you’ve ever thought,

“Why are data scientists obsessed with matrices?”

the answer is simple: because Excel walked so linear algebra could run. 🏃‍♂️💨


🧮 Why Linear Algebra Matters for Business ML

Machine learning is basically “math applied to big tables.”

Linear algebra is the grammar for working with those tables — helping models:

  • Combine customer data 🧾

  • Transform variables 📈

  • Make predictions 🔮

In short: Linear algebra = data manipulation with style.


📊 Meet the Stars: Scalars, Vectors, and Matrices

| Term | Symbol | Think Of It As | Business Example |
| --- | --- | --- | --- |
| Scalar | $x$ | A single number | One product’s price |
| Vector | $\vec{x}$ | A list of numbers | Prices of all your products |
| Matrix | $A$ | A table of numbers | Product prices × stores |
| Tensor | $\mathcal{T}$ | Multi-dimensional array | Price × store × day |
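If you want to see these four objects in code, they map directly onto NumPy array shapes. A minimal sketch (the prices and dimensions here are made up for illustration):

```python
import numpy as np

scalar = 19.99                              # one product's price
vector = np.array([19.99, 4.50, 7.25])      # prices of all products
matrix = np.array([[19.99, 4.50],           # product prices x 2 stores
                   [18.99, 4.75]])
tensor = np.zeros((2, 2, 7))                # price x store x day (one week)

print(vector.shape)   # (3,)   -- a list of numbers
print(matrix.shape)   # (2, 2) -- a table of numbers
print(tensor.ndim)    # 3      -- a multi-dimensional array
```

The `shape` attribute tells you how many dimensions you’re working with — a scalar has none, a vector one, a matrix two, a tensor three or more.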

So yeah, matrices are just glorified spreadsheets — except they actually behave when you multiply them.


💡 Visualizing the Concept

| You See This In Excel | ML Sees This As |
| --- | --- |
| Columns of “Customer ID,” “Spend,” “Region” | Feature Matrix $X$ |
| Column of “Churned = Yes/No” | Target Vector $y$ |

Every ML model does something like this:

$$ \hat{y} = X \beta $$

Where:

  • $X$ → all your features (inputs)

  • $\beta$ → learned weights (importance of each feature)

  • $\hat{y}$ → predicted outcome

📊 Translation: “Combine all business variables with their importance weights → get your prediction.”
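In NumPy, that entire equation is one line. A quick sketch — the feature values and weights below are made up for illustration:

```python
import numpy as np

# 3 customers, 2 features (ad spend, discount) -- hypothetical numbers
X = np.array([[10.0, 5.0],
              [ 8.0, 7.0],
              [12.0, 3.0]])
beta = np.array([2.0, 3.0])   # learned weights (assumed here, not fitted)

y_hat = X @ beta              # matrix-vector product: one prediction per row
print(y_hat)                  # [35. 37. 33.]
```

Each prediction is a weighted sum of that customer’s features — exactly what “combine business variables with their importance weights” means.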


⚙️ Core Operations (Without the Fear)

| Operation | Math Form | Plain English | Business Analogy |
| --- | --- | --- | --- |
| Addition | $A + B$ | Combine datasets | Merge sales from two regions |
| Multiplication | $A \times B$ | Weighted sum | Apply importance to each factor |
| Dot Product | $\vec{x} \cdot \vec{y}$ | Measure alignment | How similar two customers’ behaviors are |
| Transpose | $A^T$ | Flip rows ↔ columns | Switch from “per store” view to “per product” view |
| Inverse | $A^{-1}$ | Undo a transformation | Back out discounts to get list price |
| Identity Matrix | $I$ | Do nothing (neutral) | Like “no change” in a KPI dashboard |
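All six operations are one-liners in NumPy. A minimal sketch with toy matrices (the numbers carry no business meaning):

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

print(A + B)                 # addition: combine datasets
print(A @ B)                 # matrix multiplication: weighted sums
print(x @ y)                 # dot product: 1*3 + 2*4 = 11.0
print(A.T)                   # transpose: flip rows and columns

A_inv = np.linalg.inv(A)     # inverse: undo the transformation
print(A @ A_inv)             # ~ the identity matrix (up to rounding)
print(np.eye(2))             # the identity itself: "do nothing"
```

Note that `@` (not `*`) is matrix multiplication in NumPy; `*` multiplies element by element, which is a different operation entirely.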

🧩 Practice Corner #1: “The Matrix Manager”

You’re analyzing product data:

$$
X = \begin{bmatrix} 10 & 5 \\ 8 & 7 \end{bmatrix}
\quad \text{(columns: AdSpend, Discount)},
\qquad
\beta = \begin{bmatrix} 2 \\ 3 \end{bmatrix}
$$

Compute ( X \cdot \beta ).

🧠 Hint: Multiply row-by-row and add the results.

| Row | Calculation | Result |
| --- | --- | --- |
| 1 | (10 × 2) + (5 × 3) | 35 |
| 2 | (8 × 2) + (7 × 3) | 37 |

✅ So predicted sales = [35, 37]. Boom. You just did linear algebra — and a regression prediction.
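You can check your hand calculation in one line of NumPy:

```python
import numpy as np

X = np.array([[10, 5],
              [ 8, 7]])      # rows = products; columns = AdSpend, Discount
beta = np.array([2, 3])      # weights from the exercise

print(X @ beta)              # [35 37] -- matches the row-by-row calculation
```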


📦 Why Businesses Should Care

Because every model doing:

  • Forecasting,

  • Segmentation, or

  • Optimization

is secretly crunching matrix math behind the scenes.

| ML Task | Linear Algebra Role |
| --- | --- |
| Linear Regression | Combine feature matrix with coefficients |
| PCA / Dimensionality Reduction | Rotate data into simpler shapes |
| Recommendations | Compute similarity between users/items |
| Deep Learning | Multiply enormous matrices really, really fast |

TL;DR: If data is the new oil, matrices are the refineries.


🧩 Practice Corner #2: “Matrix or Mayhem?”

Decide whether each business example uses linear algebra (✅) or not (❌):

| Scenario | Linear Algebra? |
| --- | --- |
| 1. Adding total monthly sales from all stores | ❌ (simple sum) |
| 2. Predicting sales using ad spend and discounts | |
| 3. Comparing customer segments via similarity scores | |
| 4. Counting how many customers churned last quarter | |

📘 Quick Recap

  • Scalars → single numbers ✅

  • Vectors → columns of data ✅

  • Matrices → multi-feature tables ✅

  • Dot products → similarity or weighted sums ✅

  • Linear algebra → Excel formulas with superhero capes


🧭 Up Next

Next stop: Calculus Essentials → We’ll see how models “learn” — by using calculus to reduce their sadness (loss) one tiny derivative at a time. 💧📉


Notations of Basic Algebra

1. Number:

  • A simple number: 5

  • A negative number: -3

  • A decimal number: 2.718

  • A fraction: $\frac{1}{2}$

2. Vector:

  • A column vector: $\mathbf{v} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$

  • A row vector: $\mathbf{u} = \begin{bmatrix} a & b & c \end{bmatrix}$

3. Dot Product:

  • The dot product of two vectors $\mathbf{a}$ and $\mathbf{b}$: $\mathbf{a} \cdot \mathbf{b} = a_1b_1 + a_2b_2 + \cdots + a_nb_n$

  • Example with specific vectors: $\begin{bmatrix} 1 \\ 2 \end{bmatrix} \cdot \begin{bmatrix} 3 \\ 4 \end{bmatrix} = (1 \times 3) + (2 \times 4) = 3 + 8 = 11$

4. Multiplication:

  • Scalar multiplication: $3 \times 4 = 12$ or $3 \cdot 4 = 12$

  • Matrix-vector multiplication: $A \mathbf{x} = \mathbf{b}$, where $A$ is a matrix and $\mathbf{x}, \mathbf{b}$ are vectors. Example: $\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5 \\ 6 \end{bmatrix} = \begin{bmatrix} (1 \times 5) + (2 \times 6) \\ (3 \times 5) + (4 \times 6) \end{bmatrix} = \begin{bmatrix} 17 \\ 39 \end{bmatrix}$

  • Matrix-matrix multiplication: $AB = C$. Example: $\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 2 & 3 \\ 4 & 5 \end{bmatrix} = \begin{bmatrix} (1 \times 2) + (0 \times 4) & (1 \times 3) + (0 \times 5) \\ (0 \times 2) + (1 \times 4) & (0 \times 3) + (1 \times 5) \end{bmatrix} = \begin{bmatrix} 2 & 3 \\ 4 & 5 \end{bmatrix}$

5. Inverse:

  • The inverse of a matrix $A$ is denoted $A^{-1}$, such that $AA^{-1} = A^{-1}A = I$, where $I$ is the identity matrix.

  • Example of a $2 \times 2$ inverse: If $A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$, then $A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}$, provided $ad - bc \neq 0$. Example with numbers: If $A = \begin{bmatrix} 2 & 1 \\ 4 & 3 \end{bmatrix}$, then $ad - bc = (2 \times 3) - (1 \times 4) = 6 - 4 = 2$, and $A^{-1} = \frac{1}{2} \begin{bmatrix} 3 & -1 \\ -4 & 2 \end{bmatrix} = \begin{bmatrix} 1.5 & -0.5 \\ -2 & 1 \end{bmatrix}$.

6. Identity Matrix:

  • A $2 \times 2$ identity matrix: $I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}$

  • A $3 \times 3$ identity matrix: $I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$

7. Identity and Inverse Multiplication:

  • Multiplication of a matrix by its inverse results in the identity matrix: $AA^{-1} = I$. Example: $\begin{bmatrix} 2 & 1 \\ 4 & 3 \end{bmatrix} \begin{bmatrix} 1.5 & -0.5 \\ -2 & 1 \end{bmatrix} = \begin{bmatrix} (2 \times 1.5) + (1 \times -2) & (2 \times -0.5) + (1 \times 1) \\ (4 \times 1.5) + (3 \times -2) & (4 \times -0.5) + (3 \times 1) \end{bmatrix} = \begin{bmatrix} 3 - 2 & -1 + 1 \\ 6 - 6 & -2 + 3 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = I$

8. Transpose:

  • The transpose of a matrix $A$ is denoted $A^T$ or $A'$. The rows of $A$ become the columns of $A^T$, and vice versa.

  • Example: If $A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}$, then $A^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}$.

  • Transpose of a vector: If $\mathbf{v} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}$, then $\mathbf{v}^T = \begin{bmatrix} 1 & 2 & 3 \end{bmatrix}$.
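Every example in the notations above can be verified in a few lines of NumPy, using the same numbers:

```python
import numpy as np

# Dot product: 1*3 + 2*4 = 11
a = np.array([1, 2])
b = np.array([3, 4])
print(a @ b)                               # 11

# Matrix-vector multiplication: A x = b
A = np.array([[1, 2], [3, 4]])
x = np.array([5, 6])
print(A @ x)                               # [17 39]

# Inverse of [[2, 1], [4, 3]] and the A A^{-1} = I check
M = np.array([[2.0, 1.0], [4.0, 3.0]])
M_inv = np.linalg.inv(M)
print(M_inv)                               # [[ 1.5 -0.5]
                                           #  [-2.   1. ]]
print(np.allclose(M @ M_inv, np.eye(2)))   # True

# Transpose: rows become columns
B = np.array([[1, 2, 3], [4, 5, 6]])
print(B.T)                                 # [[1 4]
                                           #  [2 5]
                                           #  [3 6]]
```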

# Your code here