Algorithms

An introduction

by Pang Yan Han

• Computer Science major at NUS (soon to be Year 4)
• TA'ed the CS2010 (Algorithms II) module twice
• Avid problem solver

Big thanks to Calvin for giving me this chance

First time talking about a tech topic in a non-class setting

Aims

• An introduction to some practical algorithms
• A little bit of fun theory
• Increase your interest in the field

What is an algorithm? (Defn from Google)

Algorithm (pronounced AL-ger-rhythm)

A process or set of rules to be followed in calculations or other problem-solving operations, esp. by a computer.

Motivation 1

Finding all pages where a keyword appears in a book

Motivation 1 - Finding all pages where a keyword appears in a book

Algorithm 1

number of occurrences = 0

while there are still pages in the book:
    while there are still words on the page:
        if the current word is the keyword:
            increment number of occurrences by 1
            we are done with the page
        else:
            move on to the next word
    turn to the next page

Is that an algorithm?

Yes. But I think we all agree it is not a very good one.

Motivation - Finding all pages of a book where a keyword appears

Algorithm 2

Suppose I am a very rich person. So for each page of the book, I pay 1 person $10 to check if that page contains the keyword.

number of occurrences = 0
Hire 1 person for each page of the book to execute Algorithm 1
While there are still people counting:
    wait for someone to finish counting his/her page
    add to the number of occurrences

Motivation - Finding all pages of a book where a keyword appears

Algorithm 2 - The rich man's algorithm

Is that a good algorithm?

Yes and No.

Motivation - Finding all pages of a book where a keyword appears

Algorithm 2 - The rich man's algorithm

Yes

• Tedium is removed.
• Time taken is roughly the time it takes for 1 person to go through 1 page of the book.

Motivation - Finding all pages of a book where a keyword appears

Algorithm 2 - The rich man's algorithm

No

• If the book has 1000 pages, I have to pay $10,000.
• The amount of money paid depends on the number of pages in the book.

Motivation - Finding all pages of a book where a keyword appears

Algorithm 3

Go to the index of the book and look for the keyword

If we find the keyword, then we can retrieve all the pages it appears in
Otherwise, the keyword is probably not inside the book


Is this a good algorithm?

Yes

• Very fast
• No need to pay many people to do it (save money)
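Algorithm 3 can be sketched in a few lines of Python: build the index once (a dictionary from each word to the set of pages it appears on), then every keyword query is a single lookup. The page contents below are made up for illustration.

```python
def build_index(pages):
    """pages: list of strings, one per page (page numbers start at 1)."""
    index = {}
    for page_no, text in enumerate(pages, start=1):
        for word in text.lower().split():
            index.setdefault(word, set()).add(page_no)
    return index

pages = [
    "the quick brown fox",
    "jumps over the lazy dog",
    "the fox returns",
]
index = build_index(pages)
print(sorted(index.get("fox", set())))  # pages containing "fox": [1, 3]
print(sorted(index.get("cat", set())))  # keyword not in the book: []
```

Building the index costs one pass over the book, but after that every lookup is fast, which is exactly why real books ship with an index.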

Algorithm Analysis

(Or, Big O Notation in < 10 min)

NOTE: Not meant to make fun of how this is taught in a formal CS curriculum.

Just showing the most relevant stuff quickly.

Constant Time Operations

• Array indexing
• Variable assignment
• Integer/Character/Floating point Comparison
• Bitwise operations

We refer to such operations as having $O(1)$, or more accurately, $Θ(1)$ time complexity.

Example 1

              
// N is assigned somewhere
int y = 4;
int x = 0;
for (int i = 0; i < N; i++) { // loop runs N times
    x += y;                   // addition is a Θ(1) operation
}


Time complexity? $Θ(N)$

aka. Linear Time

Example 2

              
// N is assigned somewhere
int y = 4;
int x = 0;
for (int i = 0; i < N; i++) {      // loop runs N times
    for (int j = 0; j < N; j++) {  // loop runs N times
        x += y;                    // addition is a Θ(1) operation
    }
}


Time Complexity? $Θ(N^2)$

Example 3

              
// A and N are declared above
for (int i = 0; i < N; i++) {
    for (int j = 0; j < N; j++) {
        for (int k = 0; k < N; k++) {
            A[i][k] = min(A[i][k], A[i][j] + A[j][k]);
        }
    }
}


Time Complexity? $Θ(N^3)$

aka. Cubic Time

I guess the pattern is becoming clear

Example 4

              
// N assigned somewhere
// we want to find key
int low, high, mid, ans;
low = 0;
high = N;
ans = -1;
while (low < high) {
    mid = (low + high) / 2;
    if (A[mid] == key) {
        ans = mid;
        break;
    } else if (A[mid] < key) {
        low = mid + 1;
    } else {
        high = mid;
    }
}


Time Complexity? $Θ(log(N))$ in the worst case

aka. Logarithmic Time

Example 5

              
def merge(lst, start, mid, end):
    nr = end - start + 1
    aux = [0 for i in xrange(nr)]
    fp = start
    sp = mid + 1
    for i in xrange(nr):
        if fp > mid:
            aux[i] = lst[sp]
            sp += 1
        elif sp > end:
            aux[i] = lst[fp]
            fp += 1
        elif lst[sp] < lst[fp]:
            aux[i] = lst[sp]
            sp += 1
        else:
            aux[i] = lst[fp]
            fp += 1
    for i in xrange(nr):
        lst[start + i] = aux[i]

def mergesort(lst, start, end):
    if start >= end:
        return lst
    mid = (start + end) / 2
    mergesort(lst, start, mid)
    mergesort(lst, mid + 1, end)
    merge(lst, start, mid, end)

if __name__ == '__main__':
    lst = [7, 6, 5, 4, 3, 2, 1]
    mergesort(lst, 0, 6)
    print lst


Time Complexity? $Θ(N log(N))$

aka. Linearithmic Time

Example 6

              
NEGINF = -1000000000
def choose(i, N, value, weight, wtRem):
    if wtRem < 0:
        return NEGINF
    elif i >= N:
        return 0
    else:
        return max(
            value[i] + choose(i + 1, N, value, weight, wtRem - weight[i]),
            choose(i + 1, N, value, weight, wtRem)
        )

if __name__ == '__main__':
    value = [5, 7, 8, 9, 10]
    weight = [1, 13, 2, 6, 16]
    N, weightRem = 5, 35
    print choose(0, N, value, weight, weightRem)


Time Complexity? $Θ(2^N)$

aka. Exponential Time

Example 7

              
def permute(idx, nr, lst, chosen, p, allPerms):
    if idx >= nr:
        allPerms.append([x for x in p])
    else:
        for i in xrange(nr):
            if not chosen[i]:
                chosen[i] = True
                p[idx] = lst[i]
                permute(idx + 1, nr, lst, chosen, p, allPerms)
                chosen[i] = False

def perm(N):
    lst = [x for x in xrange(1, N+1)]
    chosen = [False for i in xrange(N)]
    p = [0 for i in xrange(N)]
    allPerms = []
    permute(0, N, lst, chosen, p, allPerms)
    print allPerms

if __name__ == '__main__':
    perm(6)


Time Complexity? $Θ(N!)$

aka. Factorial Time

What you've just seen

Algorithm Analysis

• An analysis framework independent of:
• time taken for a given computer to execute a certain type of instruction
• the compiler you are using
• most programming languages, as long as they are strict, i.e. non-lazy (yes, I am talking about Haskell)
• A way to quantify the time efficiency of algorithms

So, why do we care?

Given a computer that can execute 1 billion ops per second

N               10         1000            10000           1 million        1 billion
$Θ(1)$          Constant   Constant        Constant        Constant         Constant
$Θ(log(N))$     3.32 ns    9.97 ns         13.3 ns         19.9 ns          29.9 ns
$Θ(N)$          10 ns      1 us            10 us           0.001 s          1 s
$Θ(N log(N))$   33.2 ns    9.97 us         0.133 ms        0.0199 s         29.9 s
$Θ(N^2)$        100 ns     0.001 s         0.1 s           16.67 min        31.7 years
$Θ(N^3)$        1 us       1 s             16.67 min       31.7 years       3.17e10 years
$Θ(2^N)$        1 us       3.4e284 years   1.995e3001 s    9.9e301020 s     Too big...
$Θ(N!)$         3.63 ms    4.02e2558 s     2.85e35650 s    8.26e5565699 s   Too big...

NOTE: $2^N$ (from $N = 10000$) and $N!$ (from $N = 1000$) required Wolfram Alpha to compute, and it overflowed at $N = 10^9$ for those two.

Why study Algorithms?

Practical Concerns

• We want to solve problems fast. In particular, big problems should be solved in a reasonable amount of time.
• More efficient algorithms = solve bigger problems
• Some problems are just impractical without efficient algorithms
• Computers not getting faster ( Herb Sutter - The Free Lunch Is Over )
• Learn a larger toolkit of general problem solving and algorithm design techniques for future problems

1. Compilers / Interpreters

1. Compilers / Interpreters

• Without them, we would be writing in 0s and 1s
• Commonly agreed that using higher level languages = greater productivity

1. Compilers / Interpreters - Efficient algorithms

• Modern compilers are divided into 4 phases
• Lexing
• Parsing
• Semantic Analysis
• Code Generation

1. Compilers / Interpreters - Lexing

• Lexing - converting program from string representation to valid lexical tokens
• Eg. if (true) { } -> IF LPAREN BOOLTRUE RPAREN LBRACE RBRACE
• Theory: Finite Automata
• Interface to programmers: Regular Expressions
• A Regular Expression is one of:
• One single character
• Union of 2 Regular Expressions
• Concatenation of 2 Regular Expressions
• Kleene Star of a Regular Expression
• Things like lookahead, lookbehind, backtracking, etc. are extensions, not part of true regular expressions

1. Compilers / Interpreters - Lexing

• Regular Expressions compiled down to DFA (Deterministic Finite Automata)
• String matching is done in $Θ(M)$ time, where $M$ is the length of the string
• That is very efficient
• Commonly used tool: Flex
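The $Θ(M)$ matching can be seen with a tiny table-driven DFA. The automaton below is a made-up 3-state example recognizing strings over {a, b} that end in "ab"; matching is one table lookup per input character, which is essentially what a Flex-generated lexer does.

```python
# Transition table of a hypothetical DFA for "strings ending in ab".
# State 0: start; state 1: just saw 'a'; state 2: just saw "ab" (accepting).
DFA = {
    0: {"a": 1, "b": 0},
    1: {"a": 1, "b": 2},
    2: {"a": 1, "b": 0},
}
ACCEPTING = {2}

def dfa_match(s):
    state = 0
    for ch in s:
        state = DFA[state][ch]  # one table lookup per character: Θ(M) total
    return state in ACCEPTING

print(dfa_match("aab"))  # True
print(dfa_match("aba"))  # False
```

Note there is no backtracking anywhere: the string is scanned exactly once, regardless of the pattern's complexity.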

1. Compilers / Interpreters - Parsing

• From lexical tokens, generate an abstract syntax tree
• Efficient algorithms:
• LR Parsing (Left to right, Rightmost derivation) - linear time algorithm invented by Donald Knuth. Gave rise to a widely used parsing method known as LALR (Look Ahead LR)
• LL Parsing (usually linear time) - GCC uses a handwritten LL(1) parser
• Commonly used Tool: Bison

1. Compilers / Interpreters

• Efficient algorithms for lexing and parsing enable faster compilation
• Not covered here, but backends of modern compilers require efficient algorithms as well, to handle problems such as spilling
• Application of fast lexing and parsing algorithms other than Compilers:
• Syntax highlighting in IDEs

2. Operating Systems

Operating Systems

• OS needs to store a list of tasks for future execution, and have an efficient way of retrieving the next task, inserting new tasks, deleting tasks, etc
• Linux Kernel uses a Red-Black Tree data structure to store the tasks (source: Wikipedia)
• OS will be much slower without this efficient data structure, which supports $Θ(log(N))$ search, insert, delete

3. Computer Networks

3. Computer Networks

• Modern computing infrastructure impossible without efficient computer networks and their underlying algorithms
• Case in point: Network Subsystem is a huge part of Linux Kernel
• Example Problems:
• Network Layout
• Routing
• Congestion

4. File Compression

4. File Compression

• Most of us here should be familiar with these:
• Enable gzip compression on servers
• Minify JavaScript and CSS for production
• Compression algorithms should:
• compress files down to a much smaller size
• run efficiently. Just imagine the number of requests a popular web application receives.
• Huffman Coding: Basis behind many compression algorithms
• Other examples, such as JPEG and MP3

5. Filesystems

5. Filesystems

• Hard disk - slowest memory component
• Registers: Instant, Main memory: Nanoseconds, Hard Disk: Milliseconds
• Efficient data structure required for fast retrieval and storage: B-Tree (and variants)
• Main ideas: Balanced $N$-way Trees. Leaves are never too far from root. Root is small enough to be held in memory. Minimizes the number of disk seeks.

Hopefully that should be sufficient to convince you that efficient algorithms are necessary

Terminology

• Vertex (A point)
• Edge (a line joining 2 vertices)
• A graph is a collection/set of vertices and edges

Terminology (cont)

• Weight - a number associated with an edge
• Directed graph - a graph whose edge has direction
• A graph can be one of:
• unweighted and undirected
• unweighted and directed
• weighted and undirected
• weighted and directed

A graph, in this context, refers to this type of graph

And not this type of graph:

Social networks

Probably not a coincidental name

The Internet

And a whole lot more

In fact, anything with some kind of connection/relation

Graph Representations

Data structures to represent a graph

Two major representations: the adjacency matrix and the adjacency list

Graph Representations - Summary

Adjacency matrix:

• $Θ(1)$ check for whether an edge exists
• Space is $Θ(V^2)$; acceptable for dense graphs

Adjacency list:

• Less space used for sparse graphs
• $Θ(V+E)$ traversal compared to $Θ(V^2)$ for the adjacency matrix

Graph Representations - Summary

• Graph representations are inter-convertible (try it!)
• Most of the time, adjacency list is used (at least for most of the graph algorithms I've seen so far)
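The two representations can be sketched side by side for the same small made-up graph (4 vertices, undirected, unweighted, edges 0-1, 0-2, 1-2, 2-3):

```python
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
V = 4

# Adjacency matrix: Θ(V^2) space, Θ(1) edge-existence check
matrix = [[0] * V for _ in range(V)]
for u, v in edges:
    matrix[u][v] = matrix[v][u] = 1

# Adjacency list: Θ(V + E) space, Θ(V + E) full traversal
adj = [[] for _ in range(V)]
for u, v in edges:
    adj[u].append(v)
    adj[v].append(u)

print(matrix[0][2])  # 1: edge 0-2 exists
print(adj[2])        # neighbours of vertex 2: [0, 1, 3]
```

Converting between them is a couple of loops, which is why they are inter-convertible.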

Approach

A mix of:

• Algorithmic Problems from:
• UVa Online Judge
• Codeforces
• "Stab in the dark" guesses as to what algorithms are used by some well-known software - may be highly inaccurate, so don't take my word for it

Algorithmic Problems

Problems designed by experts to be solved using a combination of well-studied algorithms and algorithmic techniques, ingenuity, and sometimes math.

An example of such a problem

UVa 10908 - Largest Square

Features of such problems

• usually long problem statement
• very properly defined input and output format
• write an actual computer program (usually in C, C++, Java; some judges support Python and other languages) that passes all the hidden test cases of a computer online judge
• the computer program must finish running within a given time limit (recall our algorithm analysis section earlier)
• if one gets it wrong, there may not be any clear indication as to what exactly went wrong

Why?

• Only proven approach I know of to improve in algorithms
• Not just implementing a "textbook" algorithm, but modifying and often combining several approaches
• Identifying the "right" approach(es) will sharpen problem solving skills
• Can study code from many experts

A note of caution

• Material can be a bit academic / math-like / not real-world / "fake" / boring / (insert any other adjective)
• In other words: it may be a bit hard
• Problem statements are very long; so I will summarize them
• Most of the actual time is spent identifying the problem type; I can't detail that here due to time constraints

Graph algorithms

Breadth First Search (BFS)

From a source vertex $S$, traverse (walk through) the graph starting from vertices directly adjacent to $S$ (or 1 hop away from $S$), then vertices 2 hops away from $S$, then those 3 hops away from $S$, and so on.

Demo

http://yanhan.github.io/bfsvis/

BFS - Analysis

• Time complexity: $Θ(V+E)$
• Space complexity: $Θ(V)$
• Traverses vertices in order of increasing distance from source
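A minimal BFS sketch on an adjacency list, returning the hop distance from the source to every vertex ($-1$ if unreachable). The graph here is made up.

```python
from collections import deque

def bfs(adj, source):
    dist = [-1] * len(adj)  # -1 means "not visited yet"
    dist[source] = 0
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if dist[v] == -1:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

# 0 - 1, 0 - 2, 1 - 3, 2 - 3, 3 - 4
adj = [[1, 2], [0, 3], [0, 3], [1, 2, 4], [3]]
print(bfs(adj, 0))  # [0, 1, 1, 2, 3]
```

Each vertex enters the queue at most once and each edge is examined at most twice, giving the $Θ(V+E)$ bound above.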

Example Problem 1 - UVa 11561 Getting Gold

http://uva.onlinejudge.org/external/115/11561.html

UVa 11561 - Abridged Problem statement

http://uva.onlinejudge.org/external/115/11561.html

Given a map with the following characters:

• P - player
• G - gold
• T - trap
• # - wall
• . - normal floor

When standing next to (left, right, up, down) a trap, the player knows there is a trap but not where the trap is.

Find the maximum number of gold pieces he can retrieve.

UVa 11561 - Example 1 & Explanation

http://uva.onlinejudge.org/external/115/11561.html

UVa 11561 - Example 2 & Explanation

http://uva.onlinejudge.org/external/115/11561.html

UVa 11561 - Summary

http://uva.onlinejudge.org/external/115/11561.html
• Example of an implicit graph
• Vertices = squares in the map, edges = left, top, right, bottom

Time Complexity analysis:

• $H ≤ 50, W ≤ 50$
• Read input: $\sim (H*W)$
• BFS: $\sim (H*W + 4*H*W)$
• Total: $\sim (6*H*W) = 6 * 50 * 50 = 15,000$ ops per test case

Example Problem 2 - UVa 11792

Krochanska is Here!

http://uva.onlinejudge.org/external/117/11792.html

UVa 11792 - Abridged Problem Statement

http://uva.onlinejudge.org/external/117/11792.html

Given some train lines containing train stations from $1$ to $N$, such as:

• 1-2-3-4-5
• 3-5-1-4-2

where an important train station is one which appears in more than 1 train line.

Determine the important train station with total minimal travel cost to all other important train stations.

UVa 11792 - Example 1 + Explanation

http://uva.onlinejudge.org/external/117/11792.html

Train lines:

• 1-2-3-4-5
• 3-5-1-4-2

Important stations: 1, 2, 3, 4, 5

IDEA: Build graph using adjacency list

• 1: 2, 4, 5
• 2: 1, 3, 4
• 3: 2, 4, 5
• 4: 1, 2, 3, 5
• 5: 1, 3, 4

UVa 11792 - Example 1 + Explanation

http://uva.onlinejudge.org/external/117/11792.html

Train lines:

• 1-2-3-4-5
• 3-5-1-4-2

Important stations: 1, 2, 3, 4, 5

Stn   1   2   3   4   5   Total
1     0   1   2   1   1   5
2     1   0   1   1   2   5
3     2   1   0   1   1   5
4     1   1   1   0   1   4
5     1   2   1   1   0   5

Distances computed by BFS, with source dist = 0

Minimum node = 4

UVa 11792 - Example 2 + Explanation

http://uva.onlinejudge.org/external/117/11792.html

Train lines

• 3-5-1-2-4-7-6
• 3-6-1

Important stations: 1, 3, 6

Stn   1   3   6   Total
1     0   2   1   3
3     2   0   1   3
6     1   1   0   2

Minimal node: 6

UVa 11792 - Summary

http://uva.onlinejudge.org/external/117/11792.html
• Vertices: Train Stations
• Build graph using adjacency lists representation
• Perform BFS from every important train station to compute the shortest path from it to all other important train stations

UVa 11792 - Time Complexity

http://uva.onlinejudge.org/external/117/11792.html
• $N ≤ 10,000$ (total train stations)
• $1 ≤ S ≤ 100$ (important stations)
• BFS: $\sim (S*N) \approx 1,000,000$ ops per test case

BFS Use 1 - Bipartite Graph Checking

Bipartite Graph = 2-colorable graph

Left graph is bipartite, Right graph is not
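A bipartite check is a small twist on BFS: colour the source 0, give every newly reached neighbour the opposite colour, and report failure if an edge ever joins two same-coloured vertices. A sketch on made-up graphs:

```python
from collections import deque

def is_bipartite(adj):
    color = [-1] * len(adj)
    for s in range(len(adj)):  # loop over components, in case graph is disconnected
        if color[s] != -1:
            continue
        color[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if color[v] == -1:
                    color[v] = 1 - color[u]  # opposite colour
                    q.append(v)
                elif color[v] == color[u]:
                    return False             # same-coloured neighbours
    return True

print(is_bipartite([[1], [0, 2], [1, 3], [2]]))  # path 0-1-2-3: True
print(is_bipartite([[1, 2], [0, 2], [0, 1]]))    # triangle: False
```

Any odd cycle (like the triangle) forces a colour clash, which is exactly the non-bipartite case.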

More efficient algorithms!

Example: Graph Matching

• Bipartite Graphs: Hopcroft-Karp Algorithm - Average $Θ(E*log(V))$ [source: Wikipedia]
• General Graph: Edmonds Blossom Algorithm - $Θ(V^3)$ [Advanced Algorithm... I still don't fully understand it]

BFS Use 2 - Graph Flood Fill

Minesweeper Game

Clicking on an empty square reveals as many squares as possible, expanding until numbered squares or the borders of the puzzle are reached.

Clicking on numbered squares / mines reveals only that 1 square
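Flood fill is just BFS on the grid: start from the clicked square, expand only through empty squares, and stop at numbered squares. A sketch using the 4-neighbour moves from earlier (a real minesweeper also reveals diagonals); the grid is made up.

```python
from collections import deque

def reveal(grid, r, c):
    H, W = len(grid), len(grid[0])
    seen = {(r, c)}
    q = deque([(r, c)])
    while q:
        i, j = q.popleft()
        if grid[i][j] != '.':  # numbered square: revealed, but don't expand
            continue
        for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < H and 0 <= nj < W and (ni, nj) not in seen:
                seen.add((ni, nj))
                q.append((ni, nj))
    return seen

grid = [
    "..1",
    "..1",
    "111",
]
print(len(reveal(grid, 0, 0)))  # 8 squares revealed; (2,2) stays hidden
```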

Weighted Single Source Shortest Path Problem

Given a weighted graph (with non-negative edge weights) and a source vertex $S$, find the shortest distance from $S$ to all other vertices in the graph.

Efficient algorithm: Dijkstra's Algorithm (pronounced dike-stra)

Edsger W. Dijkstra (1930-2002)

A Case against the GO TO Statement

The Linux Kernel today:

On a serious note

Edsger W. Dijkstra (1930-2002)

• Dijkstra's algorithm (Weighted single source shortest paths)
• Shunting Yard algorithm (Parsing arithmetic expressions, eg. 1 + 2 * 4)
• Formulated the Dining Philosophers Problem
• Invented the Semaphore
• Self-stabilizing token ring algorithm
• Part of first ALGOL compiler team

His work is still very relevant today!

Dijkstra's Algorithm - an example

Example Graph. Source = 0, Destination = 5

Shortest path

Dijkstra's Algorithm - Key ideas

• Use an array to keep track of shortest distance from source vertex
• Source vertex has distance of 0
• All other vertices have initial distance infinity
• Use a priority queue data structure to retrieve vertex with least distance

Dijkstra's Algorithm - Key ideas (2)

• Start: Push source vertex $S$, along with its distance $0$, onto priority queue
• Main loop: Retrieve vertex $v$ with least distance. Stop when the priority queue is empty
• Perform relaxation operation on $v$ [by adding the distance from $S$ to $v$ and weight of edge $v-w$ to see if we can obtain a possibly shorter distance from $S$ to neighbor $w$, in which case we put $w$ and its new distance onto the priority queue]
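The key ideas above fit in a short "lazy deletion" sketch: a distance array initialised to infinity, a priority queue of (distance, vertex) pairs, and obsolete entries skipped when popped (the case the walkthrough below calls "obsolete distance"). The graph here is made up; edges are (neighbour, weight) pairs.

```python
import heapq

def dijkstra(adj, source):
    INF = float('inf')
    dist = [INF] * len(adj)
    dist[source] = 0
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:          # obsolete entry: skip
            continue
        for v, w in adj[u]:      # relaxation
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

adj = [
    [(1, 4), (2, 1)],  # vertex 0
    [(3, 1)],          # vertex 1
    [(1, 2), (3, 5)],  # vertex 2
    [],                # vertex 3
]
print(dijkstra(adj, 0))  # [0, 3, 1, 4]
```

Note the direct path 0-1 (weight 4) loses to the relaxed path 0-2-1 (weight 3), which is the relaxation step doing its job.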

Dijkstra's Algorithm - an example

Initialize source distance and put it onto pq

Dijkstra's Algorithm - an example

Active vertex - vertex 0

Dijkstra's Algorithm - an example

Relaxation on vertex 0

Dijkstra's Algorithm - an example

Active vertex - vertex 1

Dijkstra's Algorithm - an example

Relaxation on vertex 1

Dijkstra's Algorithm - an example

Active vertex - vertex 3

Dijkstra's Algorithm - an example

Relaxation on vertex 3

Dijkstra's Algorithm - an example

Active vertex - vertex 2

Dijkstra's Algorithm - an example

Relaxation on vertex 2

Dijkstra's Algorithm - an example

Active vertex - vertex 4

Dijkstra's Algorithm - an example

Relaxation on vertex 4 (no changes)

Dijkstra's Algorithm - an example

Active vertex - vertex 4 [Case of obsolete distance]

Dijkstra's Algorithm - an example

Active vertex - vertex 5

Dijkstra's Algorithm - an example

Relaxation on vertex 5 (no changes)

Dijkstra's Algorithm - an example

Active vertex - vertex 5 [Case of obsolete distance]

Done =)

Dijkstra's Algorithm - Time Complexity

$$Θ((E+V)*log(V))$$ using a binary min-heap for priority queue implementation

Source: Wikipedia

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html
• imho, not an easy problem, or at least I am not creative enough
• idea came from seeing the same concept used in another similar problem
• At time of writing, only 126 people solved it (26 Sep 2013)
• Don't take this number too seriously; there are many other online judges, UVa is just one that I am more familiar with

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Abridged Problem Statement

Given a weighted directed graph with $N ≤ 500$ vertices, $M ≤ 10000$ edges where edge weights are positive, compute the distance of the almost shortest path from source vertex $S$ to destination vertex $D$, or output -1 if no such path exists.

An almost shortest path is a path from $S$ to $D$ which does not include any edges in any shortest path from $S$ to $D$.

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Example 1

Blue edges = Shortest Path

Red edges = Almost Shortest Path

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Example 2

Blue edges = Shortest Path

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Example 3

Blue edges = Shortest Path

Red edges = Almost Shortest Path

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Approach 1

• Compute all possible shortest paths, and obtain a set of all the edges involved in the shortest paths
• In the worst case, given $M = 10000$ and $N = 500$, there are $$O({10000\choose 499 }) \approx 1.33 * 10^{859}$$ possible paths, among which some are shortest paths [I don't think this is a tight upper bound; it was confirmed by a comment on a question I asked on Quora]
• Verdict: TLE (Time Limit Exceeded)
• Food for thought: Is it a wonder why algorithms are important, no matter how fast computers get?

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Approach 2

• Compute a shortest path using Dijkstra's Algorithm
• Each of those edges are definitely in a shortest path; use a set $S$ to take note of them
• We put those edges on a queue, and while the queue has some edge, we take an edge out, and run Dijkstra's algorithm on the graph, barring use of that edge
• If Dijkstra's algorithm gives us a path of equal length as that of the shortest path, record any new edges to $S$ and put them onto the queue

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Approach 2 (Continued)

Correctness aside (this algorithm is wrong), this algorithm is approximately $$Θ(M*(M+N)*log(N))$$ which takes about $$10000 * (10000 + 500) * log(500) \approx 941,407,350$$ operations per test case in the worst case, assuming $M = 10000, N = 500$

Verdict: TLE

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Approach 3 (A correct approach)

• Perform 2 x Dijkstra's Algorithm, once from the source vertex $S$ using edges in their original direction, once from the destination $D$ using the edges in reverse direction
• That yields us 2 shortest paths arrays, forward and backward
• The Dijkstra's Algorithm ran forward yields a shortest path distance $K$

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Approach 3 (A correct approach)

• Loop through each of the $M$ edges in our edgeset $E$
• An edge $e$ has 2 adjacent vertices $u$ and $v$. If $$forward[u] + e.weight + backward[v] == K$$ then $e$ must be part of some shortest path, so we can remove $e$ from our edge set $E$ (Recall that $K$ was the shortest path distance found previously)
• Using our new edge set $E'$, perform Dijkstra's Algorithm in the forward direction, and output the almost shortest path distance
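Approach 3 can be sketched compactly on a tiny made-up graph. Edges are directed (u, v, w) triples; forward[] and backward[] come from Dijkstra runs from $S$ (original edges) and from $D$ (reversed edges), and an edge lies on some shortest path exactly when forward[u] + w + backward[v] == K.

```python
import heapq

def dijkstra(n, edges, src):
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((v, w))
    dist = [float('inf')] * n
    dist[src] = 0
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

def almost_shortest(n, edges, S, D):
    forward = dijkstra(n, edges, S)
    backward = dijkstra(n, [(v, u, w) for u, v, w in edges], D)  # reversed edges
    K = forward[D]
    # drop every edge lying on some shortest path, keep the rest
    pruned = [(u, v, w) for u, v, w in edges
              if forward[u] + w + backward[v] != K]
    d = dijkstra(n, pruned, S)[D]
    return -1 if d == float('inf') else d

# Two parallel routes: 0 -> 1 -> 3 (length 2) and 0 -> 2 -> 3 (length 3)
edges = [(0, 1, 1), (1, 3, 1), (0, 2, 1), (2, 3, 2)]
print(almost_shortest(4, edges, 0, 3))  # 3
```

Here the shortest path 0-1-3 (length 2) is pruned, leaving 0-2-3 as the almost shortest path of length 3.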

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Approach 3 (A correct approach) - Example

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Approach 3 (A correct approach) - Time Complexity

• 3 x Dijkstra's Algorithm = $Θ((M + N) * log(N))$
• 1 x Loop through edge set = $Θ(M)$
• Given $M = 10000, N = 500$, that is roughly $$3 * (10000 + 500) * log(500) + 10000 \approx 292,422$$ operations per test case, which is pretty efficient

UVa 12144 - Almost Shortest Path

http://uva.onlinejudge.org/external/121/12144.html

Final words on this problem

• First approach: $$\approx O({10000 \choose 499}) \approx 1.33 * 10^{859}$$
• Second approach (which is wrong): $$\approx Θ(M*(M+N)*log(N))$$ $$\approx 10000 * (10000 + 500) * log(500)$$ $$\approx 941,407,350$$
• Third approach: $$\approx 3 * Θ((M + N) * log(N)) + Θ(M)$$ $$\approx 3 * (10000 + 500) * log(500) + 10000$$ $$\approx 292,422$$

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Abridged Problem Statement

• You are a ninja stuck in a deep canyon with a left wall and a right wall, each of which is $N$ metres tall
• Initially, you are at the 1 metre point of the left wall, and the water level in the canyon is 0 metres
• You can perform one of three actions per turn:
• Go up the current wall by 1 metre
• Go down the current wall by 1 metre
• Jump $K$ metre(s) upwards to the adjacent wall
Provided that there are no spikes on your destination point

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Abridged Problem Statement

• After an action from you, the water level in the canyon will rise up by 1 metre
• If the height of the position you are on is $≤$ the water level, you will perish
• Determine if you can escape the canyon. You escape the canyon if you are able to reach a position $> N$ metres (whether by climbing from $N$ to $N+1$ or jumping from some height $H$ to $H+K > N$, where $H ≥ 1, K ≥ 1$)
• It is guaranteed that your starting position is safe and does not have any spike

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Problem Constraints

• $1 ≤ N, K ≤ 10^5$
• Time Limit per test case: 2 seconds
• Memory Limit per test case: 256 megabytes

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 ($N = 7, K = 3$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

Initial, Water level = 0

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 ($N = 7, K = 3$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

After Turn 1, Water level = 1

Ninja jumps up 3 metres to right wall

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 ($N = 7, K = 3$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

After Turn 2, Water level = 2

Ninja goes down 1 metre

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 ($N = 7, K = 3$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

After Turn 3, Water level = 3

Ninja jumps 3 metres up to left wall

Ninja successfully escapes the next turn

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 2 ($N = 6, K = 2$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

Initial, Water Level = 0

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 2 ($N = 6, K = 2$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

After turn 1, Water Level = 1

Ninja jumps 2 metres up to right wall

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 2 ($N = 6, K = 2$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

After turn 2, Water Level = 2

Ninja climbs 1 metre down

Ninja perishes

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 3 ($N = 3, K = 2$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

Initial, Water Level = 0

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 3 ($N = 3, K = 2$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

After turn 1, Water Level = 1

Ninja climbs up 1 metre

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 3 ($N = 3, K = 2$)

Legend: N = ninja, N = ninja previous pos, | = wall, X = spike

After turn 2, Water Level = 2

Ninja climbs up 1 metre

Ninja escapes the next turn

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Approach

• Model this as a graph problem
• Vertices = (Wall, Height on wall, Water Level)
• Edges = go up 1 metre, go down 1 metre, jump $K$ metres up to adjacent wall
• We only consider feasible states, i.e. Height on wall $>$ water level
• If we find a state where Height on wall $≥ N$, then ninja can escape, so output "YES"
• If on state with height $H$, we find that $H+K > N$, then ninja can escape as well, so output "YES"

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Approach

• In particular, the ninja should want to escape as quickly as possible (there is no harm in escaping faster)
• Model this as a weighted single source shortest path problem
• State: D[Wall][Height] = lowest water level at which we can reach the given height on the given wall

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 Walkthrough ($N = 7, K = 3$), initial

D[wall][height]
     1   2   3   4   5   6   7
L    0
R

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 Walkthrough ($N = 7, K = 3$), after turn 1

D[wall][height]
     1   2   3   4   5   6   7
L    0   1
R                1

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 Walkthrough ($N = 7, K = 3$), after turn 2

D[wall][height]
     1   2   3   4   5   6   7
L    0   1
R            2   1

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 Walkthrough ($N = 7, K = 3$), after turn 3

D[wall][height]
     1   2   3   4   5   6   7
L    0   1                   3
R            2   1

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Example 1 Walkthrough ($N = 7, K = 3$), after turn 4

D[wall][height]
     1   2   3   4   5   6   7
L    0   1                   3
R            2   1

Codeforces 198B - Jumping on Walls

http://codeforces.com/problemset/problem/198/B

Time Complexity

• Dijkstra's Algorithm: $Θ((E+V)*log(V))$
• $V \approx 2 * N = 2 * 10^5$ (2 walls, each of height $N$)
• $E \approx 3 * V = 6 * 10^5$ (3 actions: go up 1 metre, go down 1 metre, jump up $K$ metres to adjacent wall)
• Overall $$\approx (6 * 10^5 + 2 * 10^5) * log_2(2 * 10^5)$$ $$\approx 14,087,712$$
• Time Limit per test case = 2 seconds, ok

Practical Applications of Dijkstra's Algorithm

Dijkstra's Algorithm - Application 1

Actual algorithms are definitely much more complicated

Dijkstra's Algorithm - Application 2

OSPF (Open Shortest Path First)

Source: Wikipedia

A* algorithm

Source: Wikipedia

Travelling Salesman Problem (TSP)

Travelling Salesman Problem (TSP)

Problem Description

Given a set of $N$ cities and the distances between any pair of them, is there a way to travel through each of the $N$ cities once with a total distance $≤ K$ (for some given $K$), while starting and ending at the same city?

Travelling Salesman Problem (TSP)

Example

Question: Is there a TSP tour with total distance $≤ 50$?

Seems like we have to examine all possible tours to find out, and there are $N!$ of them

Small insight: Fix the start/ending city, only $(N-1)!$ tours
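The brute-force check over all $(N-1)!$ tours can be sketched directly; the distance matrix below is reconstructed from the tour costs worked out on the surrounding slides (0-1 = 4, 0-2 = 7, 0-3 = 15, 1-2 = 20, 1-3 = 2, 2-3 = 60).

```python
from itertools import permutations

dist = [
    [0, 4, 7, 15],
    [4, 0, 20, 2],
    [7, 20, 0, 60],
    [15, 2, 60, 0],
]

def has_tour_at_most(dist, K):
    """Fix city 0 as the start/end; try all (N-1)! orderings of the rest."""
    n = len(dist)
    best = float('inf')
    for order in permutations(range(1, n)):
        tour = (0,) + order + (0,)  # closed tour
        total = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        best = min(best, total)
    return best <= K, best

print(has_tour_at_most(dist, 50))  # (True, 44)
```

For $N = 4$ that is only 6 tours; the slides that follow enumerate exactly these.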

Travelling Salesman Problem (TSP)

Path 0-1-2-3-0

Total distance $= 4 + 20 + 60 + 15 = 99$

Travelling Salesman Problem (TSP)

Path 0-1-3-2-0

Total distance $= 4 + 2 + 60 + 7 = 73$

Travelling Salesman Problem (TSP)

Path 0-2-1-3-0

Total distance $= 7 + 20 + 2 + 15 = 44$

Travelling Salesman Problem (TSP)

Path 0-2-3-1-0

Total distance $= 7 + 60 + 2 + 4 = 73$

Travelling Salesman Problem (TSP)

Path 0-3-1-2-0

Total distance $= 15 + 2 + 20 + 7 = 44$

Travelling Salesman Problem (TSP)

Path 0-3-2-1-0

Total distance $= 15 + 60 + 20 + 4 = 99$

Travelling Salesman Problem (TSP)

2 tours with distance $≤ 50$

• $0 - 2 - 1 - 3 - 0$
• $0 - 3 - 1 - 2 - 0$

Travelling Salesman Problem (TSP)

• Seems like we have to check all tours!
• And there are a huge number of tours, exponential in the number of cities

Seems like exhaustive search is the only viable algorithm

Travelling Salesman Problem (TSP)

Did you observe another aspect of the TSP?

Given any tour, for example $0 - 2 - 1 - 3 - 0$

Can I easily verify whether its total distance is $≤ K$ ($50$ in this case)?

Yes!
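Verifying a single tour only means summing $N$ edge weights and checking each city appears once. A minimal sketch (function name mine; `dist` is the example's matrix):

```python
def is_tour_within(dist, tour, K):
    """Check in O(N) time whether `tour` starts and ends at the same
    city, visits every city exactly once, and has total distance <= K."""
    n = len(dist)
    if tour[0] != tour[-1] or sorted(tour[:-1]) != list(range(n)):
        return False
    return sum(dist[a][b] for a, b in zip(tour, tour[1:])) <= K

dist = [[0,  4,  7, 15],
        [4,  0, 20,  2],
        [7, 20,  0, 60],
        [15, 2, 60,  0]]
print(is_tour_within(dist, [0, 2, 1, 3, 0], 50))  # True  (distance 44)
print(is_tour_within(dist, [0, 1, 2, 3, 0], 50))  # False (distance 99)
```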

Travelling Salesman Problem (TSP)

The TSP, given in the form above, is called a decision problem

Notice how we only need to output a YES / NO answer

To get this answer, we just need to find any tour with distance $≤ K$

The TSP, given in this decision problem form, falls under a very important class of problems known as the NP-Complete problems

If we change the problem to that of finding the shortest tour, then TSP becomes NP-hard

Travelling Salesman Problem (TSP)

To date, the most efficient exact algorithm for solving the optimization version of TSP has time complexity $Θ(N^2*2^{N})$

It was published by Michael Held and Richard Karp in 1962
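A sketch of the Held-Karp dynamic program (my own illustrative Python, assuming a symmetric distance matrix indexed from 0, not the authors' original presentation). Instead of trying permutations one by one, it computes the shortest path to every (subset of cities, last city) pair, which is where the $Θ(N^2*2^{N})$ bound comes from:

```python
from itertools import combinations

def held_karp(dist):
    """Shortest TSP tour length via dynamic programming over subsets."""
    n = len(dist)
    # dp[(mask, j)] = shortest path from city 0 through exactly the
    # cities in bitmask `mask`, ending at city j (city 0 is implicit).
    dp = {(1 << j, j): dist[0][j] for j in range(1, n)}
    for size in range(2, n):
        for subset in combinations(range(1, n), size):
            mask = 0
            for j in subset:
                mask |= 1 << j
            for j in subset:
                prev = mask ^ (1 << j)
                dp[(mask, j)] = min(dp[(prev, k)] + dist[k][j]
                                    for k in subset if k != j)
    full = (1 << n) - 2  # every city except city 0
    return min(dp[(full, j)] + dist[j][0] for j in range(1, n))
```

On the 4-city example from the earlier slides this returns 44, matching the minimum found by brute force.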

So problems such as the TSP are strictly of theoretical interest, right?

Practical areas where the TSP shows up

The TSP itself!

• Shouldn't be too surprising
• The problem originated with actual travelling salesmen, dating back to the 1800s
• Still very common today for people who need to travel large distances very often
• Interesting note: Abraham Lincoln probably had to routinely "solve" this problem in his days as a lawyer in the US circuit courts

Vehicle Routing

• Very similar to the previous slide
• $N$ packages to deliver to $N$ different places, find optimum route
• Other similar problems: Airline routing and scheduling

VLSI (Very Large Scale integration)

• Need to solder/drill/engrave $N$ points on a circuit board using a mechanical arm
• TSP: Find an order to visit each of the $N$ points such that the total distance the mechanical arm needs to move is minimized

VLSI (Very Large Scale integration)

• Scan chain: route linking scan points with input and output connections
• TSP: Determine order of scan points to minimize scan chain length
• Shorter scan chain helps to save wiring space, and saves time during testing (signals sent more quickly)

Mapping Genomes

• Obtain accurate placement of genetic markers (segments of DNA which appear exactly once in genome and are easily identified)
• Expose DNA to X-rays to break it into fragments; combine fragments with rodent DNA to form hybrid cell lines
• IDEA: markers which are close on the genome very unlikely to be broken apart by the X-rays
• Create "experimental distance" between every pair of markers (likely by their frequency of occurrence on the same hybrid cell lines)
• TSP: Find tour with minimal cost, thereby determining order of markers in genome

Aiming Telescopes

• For large-scale telescopes, rotating them to make an observation is very time consuming
• TSP: Given a set of observations, obtain an ordering of observations that minimizes the total rotation time of the telescope

Computer Games

• Set of textures required to make computer graphics look realistic
• Desirable to store textures sequentially, but not possible unless textures are duplicated many times, which leads to unrealistic storage requirements
• IDEA: Minimize breaks between set of textures needed in a scene.
• A break occurs when the textures are stored in more than one location on the disk
• City = Texture
• Cost from city $u$ to $v$ = number of sets of textures that contain one of the textures but not the other
• TSP: Obtain an ordering of textures that minimizes the total cost

More on NP-Completeness

NP-Completeness

You will be a millionaire if you prove $P = NP$ (or otherwise)

NP-Complete Problems

• Decision problems solvable by a Non-deterministic Turing Machine in Polynomial time
• Polynomials are of the form $N^{K}$, for some constant $K$
• Earlier, we showed that it was easy to verify if some path is a solution to an instance of the decision version of TSP
• easy = polynomial time
• In layman's terms, for the TSP, this means that if you could try out all possible TSP tours in parallel, it would take a polynomial amount of time to determine if there is a tour with distance $≤ K$ (I might be wrong on this way of explaining it though)

NP-Complete Problems

• More formally, Polynomial Time algorithms = Efficient
• Yes, even $N^{1000}$ counts, though you won't really find such algorithms in practice

NP-Complete Problems

2 features

• Polynomial Time verification
• Every NP-Complete problem can be converted to it, in polynomial time

Other "features"

• Search space is exponential in input size, and we seem to need to explore it all
• Not a true feature

NP-Complete Problems

• In particular, focus on the last point: "Every NP-Complete problem can be converted to it, in polynomial time"
• Wikipedia shows a long list of NP-Complete problems
• Do researchers then attempt to convert every known NP-Complete problem to the one they are interested in proving to be NP-Complete?
• No. They only need to convert any one NP-Complete problem to it
• In particular, this is known as a reduction

NP-Complete Problems

• The astute reader will notice: But there must be some initial NP-Complete problem, right?
• Yes. The SAT (satisfiability) problem
• Proof is a bit technical. Will not show here
• Version in "Introduction to the Theory of Computation" (Michael Sipser) is fairly understandable, encouraged reading for people who are interested

NP-Complete reductions

Polynomial Time reductions

• Proof to show that $Y$ is at least as hard as $X$ (where $X$ is a known NP-Complete problem)
• If there exists a Polynomial Time algorithm for $Y$, then there exists a Polynomial Time algorithm for $X$ (this only works if the reduction from $X$ to $Y$ itself runs in Polynomial Time)
• Can you spot the algorithm?
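The algorithm hiding in the statement above is just composition: run the reduction, then run $Y$'s solver; since polynomials compose, the whole thing stays polynomial. A toy sketch (the example problems and names here are mine, purely to show the shape):

```python
def solve_X_via_Y(instance_x, reduce_x_to_y, solve_Y):
    """If both steps run in polynomial time, so does the whole:
    polynomials compose under substitution and addition."""
    instance_y = reduce_x_to_y(instance_x)  # polynomial-time reduction
    return solve_Y(instance_y)              # polynomial-time solver for Y

# Toy example: X = "does this list contain a negative number?",
# reduced to Y = "is this number below zero?" by taking the minimum.
contains_negative = lambda xs: solve_X_via_Y(xs, min, lambda y: y < 0)
print(contains_negative([3, -1, 7]))  # True
print(contains_negative([3, 1, 7]))   # False
```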

NP-Complete reductions

This is also the reason you hear people saying that if there is an efficient algorithm for solving 1 specific NP-Complete problem, then there is an efficient algorithm for solving all of them

Coping with NP-Completeness

• We showed some very practical applications earlier on, so people do require "solutions" to instances of such problems daily
• People have obviously worked around the lack of efficient exact algorithms
• Main approaches (still very active research areas):
• Heuristics
• Approximation Algorithms
• Randomized Algorithms
Way beyond what I know; motivated people can pursue these topics on their own =)

The Halting Problem

The Halting Problem

Problem Statement: Is there an algorithm which can tell whether an arbitrary computer program will terminate?

• Posed and answered by Alan Turing in 1936, before the first electronic computer was invented
• The answer is no; no such algorithm can exist

The Halting Problem

Suppose $P$ is a program that does not halt if its input program $I$ halts, and halts if its input program $I$ does not halt

The Halting Problem

Notice that we can feed $P$ to itself as the input.

Question: Does $P$ halt when $P$ itself is the input?

Case 1: $P$ halts. Then by our assumption, the input to $P$ does not halt, so $P$ does not halt. Contradiction.

The Halting Problem

Case 2: $P$ does not halt. Then by our assumption, the input to $P$ must have halted, so $P$ halts. Contradiction.

The Halting Problem

So $P$ neither halts nor does not halt. This is a contradiction. Therefore, such a program/algorithm $P$ can never exist.

The Halting Problem - Implications

• Independent of how fast computers become or how much memory they have
• There are certain things that cannot be computed; you've just seen one of them

The Halting Problem - Implications

• Think this is of no practical consequence?
• Think again
• You can change the "whether an arbitrary input program halts" to other things, such as
• whether an input program prints a certain string, such as "hello world"
• whether an input program calls a certain function
• whether a certain variable in an input program will ever contain a certain value
Those are undecidable as well
• Do people still try to determine if a program terminates, and other properties? Yes
• Falls under the field of Program Analysis. However, results may be approximate

Conclusion

• Hopefully this presentation was helpful
• On the flip side, you might have gained some insight as to why this isn't exactly the most popular area among CS undergrads these days
• For the people who know all these, you will find it too simple
• For those new to these, you might find it too much information to digest. It's normal.
• Takeaway message: Efficient algorithms are important

Thank you!

This presentation was created using

Reveal.js

https://github.com/hakimel/reveal.js/

The math symbols are formatted using

MathJax

http://www.mathjax.org/

which is incorporated as part of Reveal.js

Nice looking graphs like this one:

are drawn using:

Microsoft Paint

Just Kidding!

Nice looking graphs like this one:

are drawn using:

GraphViz

http://www.graphviz.org/