Time Complexity measures how many steps (operations) an algorithm takes to finish, based on the size of its input. It is NOT about seconds or minutes — it is about counting how much work the computer has to do as the input grows.
Think of scanning groceries at a checkout counter. If you have 5 items, the scanner beeps 5 times. If you have 100 items, it beeps 100 times. The number of beeps grows directly with the number of items — that is Linear Time, written as O(n).
Step 1: Look at your code and find the parts that repeat (loops, recursive calls).
Step 2: Ask "How many times does this repeat, based on the input size n?"
Step 3: Write it in Big-O form. Drop constants and small terms.
2n + 100 steps → O(n)
n × n steps → O(n²)
5 steps (always) → O(1)
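Here is one way to picture "2n + 100 steps" in code. The function and its step counter are illustrative, not part of the lesson:

```javascript
// A hand-counted sketch: this hypothetical function does
// 2n + 100 "steps" for an input of length n.
function countSteps(arr) {
  let steps = 0;
  for (let i = 0; i < arr.length; i++) steps++; // first pass: n steps
  for (let i = 0; i < arr.length; i++) steps++; // second pass: n steps
  for (let i = 0; i < 100; i++) steps++;        // fixed setup: 100 steps
  return steps; // 2n + 100 total
}

console.log(countSteps(new Array(10)));   // 120
console.log(countSteps(new Array(1000))); // 2100
```

At n = 1000 the "+100" is under 5% of the total work, which is why Big-O keeps only the dominant term: O(n).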
Each algorithm counts its operations one by one — the one that finishes with the fewest steps is the fastest:
O(1): always 1 step, no matter what n is
O(n): 6 steps when n = 6
O(n²): 36 steps when n = 6
```javascript
function getFirstItem(arr) {
  return arr[0]; // Always 1 step
}
// 10 items? 1 step.
// 10 million items? Still 1 step.
// Time Complexity = O(1)
```
```javascript
function printAll(arr) {
  for (let i = 0; i < arr.length; i++) {
    console.log(arr[i]); // 1 step per item
  }
}
// 10 items = 10 steps
// 1000 items = 1000 steps
// Time Complexity = O(n)
```
```javascript
function printAllPairs(arr) {
  for (let i = 0; i < arr.length; i++) {   // outer: n times
    for (let j = 0; j < arr.length; j++) { // inner: n times each
      console.log(arr[i], arr[j]);
    }
  }
}
// 10 items = 100 steps (10 × 10)
// 1000 items = 1,000,000 steps (1000 × 1000)
// Time Complexity = O(n²) — VERY SLOW for big inputs
```
O(1) — Constant: Instant. Example: accessing an array by index.
O(log n) — Logarithmic: Cuts the problem in half each step. Example: binary search.
O(n) — Linear: Touches every item once. Example: a single for-loop.
O(n log n) — Linearithmic: Efficient sorting. Example: merge sort.
O(n²) — Quadratic: Loop inside loop. Example: bubble sort. Avoid for large data!
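O(log n) is the least intuitive entry in the list above, so here is a minimal binary search sketch (array and variable names are illustrative). Each comparison discards half of the remaining range, which is where the logarithm comes from:

```javascript
// Binary search on a SORTED array: each iteration halves the
// range [lo, hi], so it takes about log2(n) comparisons.
function binarySearch(sorted, target) {
  let lo = 0, hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid; // found it
    if (sorted[mid] < target) lo = mid + 1; // discard the left half
    else hi = mid - 1;                      // discard the right half
  }
  return -1; // not present
}

console.log(binarySearch([1, 3, 5, 7, 9], 7)); // 3
console.log(binarySearch([1, 3, 5, 7, 9], 4)); // -1
```

For 1,000,000 sorted items this needs only about 20 comparisons, since 2^20 ≈ 1,000,000.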
Let's break this down from scratch, no prior knowledge needed.
What is an "operation"?
An operation is any single thing the computer does: adding two numbers, comparing values, printing something on screen, or looking up an item. Each of these counts as 1 step.
Why do we count steps instead of time?
Because time depends on your machine. A gaming PC finishes faster than an old laptop, but both run the SAME number of steps. Steps are universal. Seconds are not.
How does input size matter?
The "input" is the data you give to your code. If your code processes a list of names, the input size (called n) is the number of names. Time complexity answers: "If n doubles, does my work double? Or does it explode?"
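The "if n doubles" question can be answered empirically. This sketch (counter functions are illustrative) counts the steps of a single loop versus a nested loop as n doubles:

```javascript
// Step counters for a single loop (O(n)) and a nested loop (O(n²)).
function linearSteps(n) {
  let steps = 0;
  for (let i = 0; i < n; i++) steps++;
  return steps;
}

function quadraticSteps(n) {
  let steps = 0;
  for (let i = 0; i < n; i++)
    for (let j = 0; j < n; j++) steps++;
  return steps;
}

// Double n from 100 to 200:
console.log(linearSteps(200) / linearSteps(100));       // 2 — work doubles
console.log(quadraticSteps(200) / quadraticSteps(100)); // 4 — work quadruples
```

Doubling the input doubles the work for O(n) but quadruples it for O(n²) — that is the "explosion" the question warns about.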
The Big Idea — Big-O Notation
We write time complexity using Big-O notation like O(n), O(1), or O(n²). This tells us the GROWTH RATE of the work:
The Golden Rule
We drop constants when writing Big-O. If your code does 3n + 5 steps, we write O(n) because at massive scale, the "3" and the "5" barely matter. What matters is that it grows linearly.
Imagine two apps that both search for a contact name. App A checks every single contact one by one. App B jumps straight to the right letter. For 100 contacts, both feel instant. For 10 million contacts, App A freezes and App B still works in a blink. Time Complexity tells you WHICH app will survive at scale.
"A loop runs from 1 to n. What is the time complexity?"
O(n) — because the loop runs once for each item, so the number of steps equals the number of items.
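Sketched in code (the function name is illustrative), the quiz loop looks like this:

```javascript
// A loop that runs from 1 to n: exactly n iterations, so O(n).
function loopOneToN(n) {
  let steps = 0;
  for (let i = 1; i <= n; i++) {
    steps++; // one unit of work per iteration
  }
  return steps;
}

console.log(loopOneToN(50));   // 50
console.log(loopOneToN(1000)); // 1000
```

The step count equals n exactly, and doubling n doubles the steps — the signature of linear time.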