# Quick Linear Algebra
Welcome to Quick Linear Algebra, where we uncover the math behind all the buzzwords — without turning your notebook into an ancient scroll of equations.
If you’ve ever thought,

> “Why are data scientists obsessed with matrices?”

the answer is simple: because Excel walked so linear algebra could run. 🏃‍♂️💨
## 🧮 Why Linear Algebra Matters for Business ML
Machine learning is basically “math applied to big tables.”
Linear algebra is the grammar for working with those tables — helping models:
- Combine customer data 🧾
- Transform variables 📈
- Make predictions 🔮
In short: Linear algebra = data manipulation with style.
## 📊 Meet the Stars: Scalars, Vectors, and Matrices
| Term | Symbol | Think Of It As | Business Example |
|---|---|---|---|
| Scalar | \(x\) | A single number | One product’s price |
| Vector | \(\vec{x}\) | A list of numbers | Prices of all your products |
| Matrix | \(A\) | A table of numbers | Product prices × stores |
| Tensor | \(\mathcal{T}\) | Multi-dimensional array | Price × store × day |
So yeah, matrices are just glorified spreadsheets — except they actually behave when you multiply them.
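If you prefer code to tables, here is a minimal NumPy sketch of the same four objects (all the prices are made up for illustration):

```python
import numpy as np

scalar = 19.99                             # a single number: one product's price
vector = np.array([19.99, 4.50, 7.25])     # prices of all products
matrix = np.array([[19.99, 4.50],          # rows = products, columns = stores
                   [18.99, 4.75]])
tensor = np.stack([matrix, matrix * 0.9])  # two days of price-by-store data

print(vector.shape)  # (3,)
print(matrix.shape)  # (2, 2)
print(tensor.shape)  # (2, 2, 2): day x product x store
```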
## 💡 Visualizing the Concept
| You See This In Excel | ML Sees This As |
|---|---|
| Columns of “Customer ID,” “Spend,” “Region” | Feature Matrix \(X\) |
| Column of “Churned = Yes/No” | Target Vector \(y\) |
Every ML model does something like this:
\[
\hat{y} = X \cdot \beta
\]

Where:

- \(X\) → all your features (inputs)
- \(\beta\) → learned weights (importance of each feature)
- \(\hat{y}\) → predicted outcome
📊 Translation: “Combine all business variables with their importance weights → get your prediction.”
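In NumPy that formula is a one-liner. Here is a minimal sketch with made-up weights (not a trained model):

```python
import numpy as np

X = np.array([[1200.0, 0.10],   # each row is one product: [ad spend, discount]
              [ 800.0, 0.25]])
beta = np.array([0.03, -50.0])  # hypothetical learned weights

y_hat = X @ beta                # X · beta: weighted sum of features per row
print(y_hat)                    # [31.  11.5]
```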
## ⚙️ Core Operations (Without the Fear)
| Operation | Math Form | Plain English | Business Analogy |
|---|---|---|---|
| Addition | \(A + B\) | Combine datasets | Merge sales from two regions |
| Multiplication | \(A \times B\) | Weighted sum | Apply importance to each factor |
| Dot Product | \(\vec{x} \cdot \vec{y}\) | Measure alignment | How similar two customers’ behaviors are |
| Transpose | \(A^T\) | Flip rows ↔ columns | Switch from “per store” view to “per product” view |
| Inverse | \(A^{-1}\) | Undo a transformation | Back out discounts to get list price |
| Identity Matrix | \(I\) | Do nothing (neutral) | Like “no change” in a KPI dashboard |
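Each operation in the table maps onto a single NumPy call. A quick sketch with toy matrices:

```python
import numpy as np

A = np.array([[1.0, 2.0], [3.0, 4.0]])
B = np.array([[5.0, 6.0], [7.0, 8.0]])
x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])

print(A + B)             # addition: combine datasets element-wise
print(A @ B)             # matrix multiplication: weighted sums
print(x @ y)             # dot product: 1*3 + 2*4 = 11.0
print(A.T)               # transpose: rows become columns
print(np.linalg.inv(A))  # inverse: "undoes" A (only if A is invertible)
print(A @ np.eye(2))     # identity: multiplying by I leaves A unchanged
```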
## 🧩 Practice Corner #1: “The Matrix Manager”
You’re analyzing product data, where the columns of \(X\) are AdSpend and Discount:

\[
X =
\begin{bmatrix}
10 & 5 \\
8 & 7
\end{bmatrix}, \quad
\beta =
\begin{bmatrix}
2 \\ 3
\end{bmatrix}
\]

Compute \(X \cdot \beta\).
🧠 Hint: Multiply row-by-row and add the results.
| Row | Calculation | Result |
|---|---|---|
| 1 | (10×2) + (5×3) | 35 |
| 2 | (8×2) + (7×3) | 37 |
✅ So predicted sales = [35, 37]. Boom. You just did linear algebra — and a regression prediction.
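You can double-check the hand calculation in one line of NumPy:

```python
import numpy as np

X = np.array([[10, 5],
              [ 8, 7]])
beta = np.array([2, 3])

print(X @ beta)  # [35 37], matching the table above
```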
## 📦 Why Businesses Should Care
Because every model doing:

- Forecasting,
- Segmentation, or
- Optimization

is secretly crunching matrix math behind the scenes.
| ML Task | Linear Algebra Role |
|---|---|
| Linear Regression | Combine feature matrix with coefficients |
| PCA / Dimensionality Reduction | Rotate data into simpler shapes |
| Recommendations | Compute similarity between users/items |
| Deep Learning | Multiply enormous matrices really, really fast |
TL;DR: If data is the new oil, matrices are the refineries.
## 🧩 Practice Corner #2: “Matrix or Mayhem?”
Decide whether each business example uses linear algebra (✅) or not (❌):
| Scenario | Linear Algebra? |
|---|---|
| 1. Adding total monthly sales from all stores | ❌ (simple sum) |
| 2. Predicting sales using ad spend and discounts | ✅ |
| 3. Comparing customer segments via similarity scores | ✅ |
| 4. Counting how many customers churned last quarter | ❌ |
## 📘 Quick Recap
- ✅ Scalars → single numbers
- ✅ Vectors → columns of data
- ✅ Matrices → multi-feature tables
- ✅ Dot products → similarity or weighted sums
- ✅ Linear algebra → Excel formulas with superhero capes
## 🧭 Up Next
Next stop: Calculus Essentials → We’ll see how models “learn” — by using calculus to reduce their sadness (loss) one tiny derivative at a time. 💧📉
## Notations of Basic Algebra
1. Number:
   - A simple number: \(5\)
   - A negative number: \(-3\)
   - A decimal number: \(2.718\)
   - A fraction: \(\frac{1}{2}\)
2. Vector:
   - A column vector: \(\mathbf{v} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}\)
   - A row vector: \(\mathbf{u} = \begin{bmatrix} a & b & c \end{bmatrix}\)
3. Dot Product:
   - The dot product of two vectors \(\mathbf{a}\) and \(\mathbf{b}\): \(\mathbf{a} \cdot \mathbf{b} = a_1b_1 + a_2b_2 + \cdots + a_nb_n\)
   - Example with specific vectors: \(\begin{bmatrix} 1 \\ 2 \end{bmatrix} \cdot \begin{bmatrix} 3 \\ 4 \end{bmatrix} = (1 \times 3) + (2 \times 4) = 3 + 8 = 11\)
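The same dot product in NumPy:

```python
import numpy as np

a = np.array([1, 2])
b = np.array([3, 4])

print(np.dot(a, b))  # (1*3) + (2*4) = 11
print(a @ b)         # the @ operator computes the same dot product
```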
4. Multiplication:
   - Scalar multiplication: \(3 \times 4 = 12\) or \(3 \cdot 4 = 12\)
   - Matrix-vector multiplication: \(A \mathbf{x} = \mathbf{b}\), where \(A\) is a matrix and \(\mathbf{x}, \mathbf{b}\) are vectors. Example: \(\begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix} \begin{bmatrix} 5 \\ 6 \end{bmatrix} = \begin{bmatrix} (1 \times 5) + (2 \times 6) \\ (3 \times 5) + (4 \times 6) \end{bmatrix} = \begin{bmatrix} 17 \\ 39 \end{bmatrix}\)
   - Matrix-matrix multiplication: \(AB = C\). Example: \(\begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} 2 & 3 \\ 4 & 5 \end{bmatrix} = \begin{bmatrix} (1 \times 2) + (0 \times 4) & (1 \times 3) + (0 \times 5) \\ (0 \times 2) + (1 \times 4) & (0 \times 3) + (1 \times 5) \end{bmatrix} = \begin{bmatrix} 2 & 3 \\ 4 & 5 \end{bmatrix}\)
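Both multiplications in NumPy, using the same numbers as the examples above:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
x = np.array([5, 6])
print(A @ x)   # matrix-vector product: [17 39]

I2 = np.eye(2, dtype=int)
B = np.array([[2, 3],
              [4, 5]])
print(I2 @ B)  # multiplying by the identity leaves B unchanged
```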
5. Inverse:
   - The inverse of a matrix \(A\) is denoted as \(A^{-1}\), such that \(AA^{-1} = A^{-1}A = I\), where \(I\) is the identity matrix.
   - Example of a \(2 \times 2\) inverse: If \(A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}\), then \(A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}\), provided \(ad - bc \neq 0\).
   - Example with numbers: If \(A = \begin{bmatrix} 2 & 1 \\ 4 & 3 \end{bmatrix}\), then \(ad-bc = (2 \times 3) - (1 \times 4) = 6 - 4 = 2\), and \(A^{-1} = \frac{1}{2} \begin{bmatrix} 3 & -1 \\ -4 & 2 \end{bmatrix} = \begin{bmatrix} 1.5 & -0.5 \\ -2 & 1 \end{bmatrix}\).
6. Identity Matrix:
   - A \(2 \times 2\) identity matrix: \(I_2 = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}\)
   - A \(3 \times 3\) identity matrix: \(I_3 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}\)
7. Identity and Inverse Multiplication:
   - Multiplying a matrix by its inverse yields the identity matrix: \(AA^{-1} = I\). Example: \(\begin{bmatrix} 2 & 1 \\ 4 & 3 \end{bmatrix} \begin{bmatrix} 1.5 & -0.5 \\ -2 & 1 \end{bmatrix} = \begin{bmatrix} (2 \times 1.5) + (1 \times -2) & (2 \times -0.5) + (1 \times 1) \\ (4 \times 1.5) + (3 \times -2) & (4 \times -0.5) + (3 \times 1) \end{bmatrix} = \begin{bmatrix} 3 - 2 & -1 + 1 \\ 6 - 6 & -2 + 3 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = I\)
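NumPy computes the inverse for you and lets you verify \(AA^{-1} = I\) numerically. Floating-point results can be off by tiny rounding errors, hence the `np.allclose` check:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [4.0, 3.0]])
A_inv = np.linalg.inv(A)

print(A_inv)                              # [[ 1.5 -0.5] [-2.   1. ]]
print(A @ A_inv)                          # the identity matrix (up to rounding)
print(np.allclose(A @ A_inv, np.eye(2)))  # True
```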
8. Transpose:
   - The transpose of a matrix \(A\) is denoted as \(A^T\) or \(A'\). The rows of \(A\) become the columns of \(A^T\), and vice versa.
   - Example: If \(A = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}\), then \(A^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}\).
   - Transpose of a vector: If \(\mathbf{v} = \begin{bmatrix} 1 \\ 2 \\ 3 \end{bmatrix}\), then \(\mathbf{v}^T = \begin{bmatrix} 1 & 2 & 3 \end{bmatrix}\).
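And the transpose in NumPy, using the same examples; then try these notations yourself in the cell below:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])
print(A.T)  # shape (3, 2): rows become columns

v = np.array([[1], [2], [3]])  # a 3x1 column vector
print(v.T)  # a 1x3 row vector: [[1 2 3]]
```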
```python
# Your code here
```