
## CSE2304 Algorithms and Data Structures, Progress 1999

This will be added to after each lecture.

### week 1: 1-5 March

• Lecture 1 (3/3/99): Admin, prac's, prac#1 (max sum interval), Linux, gcc.
Abstract data type (ADT), ideal properties, unambiguous, implementation v. spec', e.g. Int, e.g. Tree, prefix, infix, postfix.
• Lecture 2 (5/3/99): [?How to calculate the number of bits set to `1' in a computer word that is `w' bits long? Can do it in ~log(w) instructions!]
Queue ADT and Breadth First tree Traversal (see C/Tree/). Fibonacci numbers (rabbits, see Recursion section); binary rec' Fib' slow, linear rec' Fib' O(n), tail recursion removed gives iterative, finally a fast O(log(n)) version, and its proof by induction.

### week 2: 8-12 March, [prac-1]

• Lecture 3 (10/3/99): The foam-cubes of Hanoi (thanks to volunteers).
Binary search tree, its ordered property, insert and delete in binary search tree. See [demo], insert.c in [C/Tree/] and BST in [tree-notes].
Non-examinable, for interest: recursion without self-reference, see Y.a68 in [A-68/]. Challenge: is it possible to program Y in C?
• Lecture 4 (12/3/99): Re prac#1, there is a linear time solution! Hint: use cumulative sums, keep the minimum cumulative sum (MCS) to date and the greatest gap ever seen between the MCS and the current cumulative sum.
Parse trees, recursive-descent parsing, see *.h, parse.c and driverParser in [C/Tree/]. See functions `operand', `term', `exp', `sequence'. Examples. Left associative operators.
Re prac#2, suggest you use my parser, create a routine to traverse a (parse-) Tree and generate MIPS code during the traversal.

### week 3: 15-19 March [prac-2] Grp-A.

• Lecture 5 (17/3/99): [Edit Distance], uses, problem description, recurrence & exponential-time algorithm; saving results in a matrix gives an O(n²)-time and -space algorithm, examples. (Here's some [DNA].)
• Lecture 6 (19/3/99): [Hirschberg's], O(n)-space, O(n²)-time algorithm for edit distance; it is an example of divide and conquer.
(Also for interest only, [Strassen's] O(n^log₂7)-time, i.e. ≈O(n^2.81), matrix multiplication algorithm, another divide-and-conquer algorithm. Challenge: I'd like to see a C version!)

### week 4: 22-26 March (prac-2 Grp-B).

• Lecture 7: Yet another example of divide and conquer: Knuth's O(n^log₂3)-time integer multiplication algorithm; see [here].
Sorting is (i) important in its own right, (ii) provides examples for program proof and complexity analysis. e.g. [Selection sort]. Correctness: loop-invariants:
```
while (condition)
{  /* invariant */
   body
}
/* not condition && invariant => result, if well designed */
```
• Lecture 8: Time & space complexity, testing and stability of (e.g.) selection sort. [Insertion Sort] note that its weaker invariant (weaker than selection sort's), allows a faster (best & average cases) algorithm. [Dutch national flag] problem (2 colours as preamble to partition and quick-sort).

### week 5: 29 March - Thurs' 1 April

• Lecture 9: Three-colour [Dutch national flag] problem, its relationship to partitioning and [Quick Sort]. Speed-up techniques for Quicksort: use insertion sort if N<=4 (say), try for a better estimate of the median, 3-way partition, `...`
Heap Sort, heap property, down-heap(), the sort. (We will see a heap later, as a priority queue, in e.g. [Kruskal's algorithm].)

### week 6: 12-16 April [prac-3] Grp-A

• Lecture 10: (Briefly recap pre-easter down-heap and also up-heap.) [Tables], the abstract data type, its properties. Table by unsorted array and sorted array. Binary search, the loop invariant, termination, correctness, 2-way v. 3-way test, [f(x)=0] and difference between int and real (float) datatypes w.r.t. termination.
• Lecture 11: (Recap properties of lookup-Table abstract data type.) Table by [AVL-] (i.e. height-balanced binary search) -tree & [demonstration]. Keeping a balance: Left-2 rotation, Left-3 rotation (Right-2 and Right-3 are mirror images).

### week 7: 19-23 April (prac-3 Grp-B)

• Lecture 12: Examples of [AVL-tree] insertion and deletion. Min number `n[h]' of nodes for some AVL tree of height h, examples for h=1,2,3,4,5,...; n[h]=1+n[h-1]+n[h-2], so n[h]>2*n[h-2]. Hence h<=2*log(n/2), i.e. h is O(log(n)), and so are insert, delete & search. Every internal (non-leaf) node of an AVL tree has two non-null subtrees.
[2-3-Trees] another height-balanced structure.
• Lecture 13: Tables by ... [2-3-Trees], [2-3-4-Trees], [B-Trees], [Tries]. Examples of insertion in the above.

### week 8: 26-30 April [prac-4] Grp-A

• Lecture 14: [PATRICIA-trees] guarantee that every internal node is a genuine branching node (cf. [Tries]). Examples of insertion in a P'-tree.
Hashing: (i) for lookup (hash-) tables (reminder from 1st year), (ii) application in Rabin's [String Search] algorithm.
NB. The Boyer-Moore string-searching algorithm is not examinable.
• Lecture 15: [Graph], G=<V,E>. Applications of graphs. Examples of directed and undirected, weighted and unweighted graphs (based on the campus map, see also prac'5). A dense graph has |E|~|V|², but a sparse graph has |E|<<|V|². Representation of a graph by an adjacency-matrix (dense) or by adjacency-lists (sparse).

### week 9: 3-7 May (prac-4 Grp-B)

• Lecture 16: Algorithms on [Directed Graphs]: Dijkstra (single-source shortest-paths), Floyd (all-pairs shortest-paths), Warshall (transitive closure).
• Lecture 17: [Undirected Graphs]: the minimum spanning tree problem, definition, applications, Prim's algorithm and Kruskal's algorithm.

### week 10: 10-14 May [prac-5] Grp-A

• Lecture 18: Detailed look at the data-structures behind [Kruskal's] algorithm: priority-queue of the edges implemented by a heap, `up_heap()` & `down_heap()` (NB. smallest-on-top heap), and partition of the set of vertices; complexity O(|E|*log|E|)-time. I.e. for a dense graph use Prim's algorithm, for a sparse graph use Kruskal's.
• Lecture 19: Depth-first and breadth-first traversal of graphs, e.g. the campus map.
[Directed Acyclic Graph] (DAG), examples: parse-trees & -DAGs, building projects, subject prerequisites, call-graph of routines in a program for recursion detection,`...` Topological-sort of a DAG.

### week 11: 17-21 May (prac-5 Grp-B)

• Lecture 20: Since the last lecture there is a DAG / topological sort demo [here].
Example application of binary (parse) trees used to represent and to manipulate boolean expressions (i.e. propositional logic, logic design, ...) [here]. (Might be useful for CSE2303 Formal Methods I !)
[Suffix Tree] is a data-structure that can be used to find (a) substrings in a text (string) quickly, (b) longest repeated substring of a text, (c) longest common substring of two (or more) texts, (d) longest palindrome of a text.
NB. The algorithm to construct a suffix tree in linear time is not examinable!
• Lecture 21: Abstract Data Types (ADTs) and notation, e.g.: SxT={(s,t)|s in S and t in T}, S->T = set of functions from S to T, also S+T, S*, S+, etc. Examples of use: Stacks, Queues; note similarities and differences between (i) push, pop & top and (ii) addq, popq & front. Other examples: Lists, Tables, Trees, even Int & Set, etc..
ADTs are useful as specifications as opposed to implementations, e.g. in Software Engineering.

### week 12: 24-28 May [prac-6] Grp A

• Lecture 22: [Recurrence relations], their solution and proof. Logarithmic-, linear-, & exponential-complexity and examples of algorithms having these complexities. Big-O notation, examples, order-k differences and uses, and practical hints for estimating program complexity when analysis is difficult.
• Lecture 23: Numerical (i.e. floating-point) computations: numerical [errors] due to limited accuracy. Simple methods of [integration]: rectangle rule, trapezoidal rule & Simpson's rule.

### week 13: 31 May-4 June (prac-6 Grp-B)

• Lecture 24: Finish numerical integration and [Simpson's rule]. The remainder was taken up with answering questions about possible exam questions, revision, etc.
• Lecture 25: Revision Q&A "tutorial".

© L.Allison 1999, Comp. Sci. and S.W.E., Monash University, Australia.