Thursday, July 13, 2017

Binary Search

Given a sorted array arr[] of n elements, write a function to search a given element x in arr[].
A simple approach is to do a linear search: compare x with each element of arr[] one by one. The time complexity of that approach is O(n). Another approach that performs the same task is Binary Search.
Binary Search: Search a sorted array by repeatedly dividing the search interval in half. Begin with an interval covering the whole array. If the value of the search key is less than the item in the middle of the interval, narrow the interval to the lower half. Otherwise narrow it to the upper half. Repeatedly check until the value is found or the interval is empty.
[Figure: worked example of binary search. Image source: http://www.nyckidd.com/bob/Linear%20Search%20and%20Binary%20Search_WorkingCopy.pdf]
The idea of binary search is to use the information that the array is sorted to reduce the time complexity to O(Logn).
We basically ignore half of the elements after just one comparison.
  1. Compare x with the middle element.
  2. If x matches the middle element, return the mid index.
  3. Else if x is greater than the mid element, x can only lie in the right half subarray after the mid element, so recur for the right half.
  4. Else (x is smaller), recur for the left half.
Recursive implementation of Binary Search
// Java implementation of recursive Binary Search
class BinarySearch
{
    // Returns index of x if it is present in arr[l..r], else
    // return -1
    int binarySearch(int arr[], int l, int r, int x)
    {
        if (r>=l)
        {
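            // Same as (l+r)/2, but avoids overflow for large l and r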
            int mid = l + (r - l)/2;
 
            // If the element is present at the middle itself
            if (arr[mid] == x)
               return mid;
 
            // If element is smaller than mid, then it can only
            // be present in left subarray
            if (arr[mid] > x)
               return binarySearch(arr, l, mid-1, x);
 
            // Else the element can only be present in right
            // subarray
            return binarySearch(arr, mid+1, r, x);
        }
 
        // We reach here when element is not present in array
        return -1;
    }
 
    // Driver method to test above
    public static void main(String args[])
    {
        BinarySearch ob = new BinarySearch();
        int arr[] = {2,3,4,10,40};
        int n = arr.length;
        int x = 10;
        int result = ob.binarySearch(arr,0,n-1,x);
        if (result == -1)
            System.out.println("Element not present");
        else
            System.out.println("Element is present at index "+result);
    }
}
/* This code is contributed by Rajat Mishra */

Output:
Element is present at index 3
Iterative implementation of Binary Search
// Java implementation of iterative Binary Search
class BinarySearch
{
    // Returns index of x if it is present in arr[], else
    // return -1
    int binarySearch(int arr[], int x)
    {
        int l = 0, r = arr.length - 1;
        while (l <= r)
        {
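            // Same as (l+r)/2, but avoids overflow for large l and r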
            int m = l + (r-l)/2;
 
            // Check if x is present at mid
            if (arr[m] == x)
                return m;
 
            // If x greater, ignore left half
            if (arr[m] < x)
                l = m + 1;
 
            // If x is smaller, ignore right half
            else
                r = m - 1;
        }
 
        // if we reach here, then element was not present
        return -1;
    }
 
    // Driver method to test above
    public static void main(String args[])
    {
        BinarySearch ob = new BinarySearch();
        int arr[] = {2, 3, 4, 10, 40};
        int n = arr.length;
        int x = 10;
        int result = ob.binarySearch(arr, x);
        if (result == -1)
            System.out.println("Element not present");
        else
            System.out.println("Element is present at index "+result);
    }
}

Output:
Element is present at index 3
Time Complexity:
The time complexity of Binary Search can be written as
T(n) = T(n/2) + c 
The above recurrence can be solved either using the Recurrence Tree method or the Master method. It falls in case II of the Master method, and the solution of the recurrence is Θ(Logn).
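For intuition, the recurrence can also be unrolled directly (a sketch, assuming n is a power of 2):
T(n) = T(n/2) + c = T(n/4) + 2c = ... = T(n/2^k) + k*c
The recursion bottoms out at T(1) after k = Log2(n) halvings, so T(n) = T(1) + c*Log2(n) = Θ(Logn).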
Auxiliary Space: O(1) in case of iterative implementation. In case of recursive implementation, O(Logn) recursion call stack space.
Algorithmic Paradigm: Divide and Conquer

Merge Sort

Like QuickSort, Merge Sort is a Divide and Conquer algorithm. It divides the input array into two halves, calls itself for the two halves, and then merges the two sorted halves. The merge() function is used for merging the two halves: merge(arr, l, m, r) is the key process, which assumes that arr[l..m] and arr[m+1..r] are sorted and merges the two sorted sub-arrays into one. See the following C implementation for details.
MergeSort(arr[], l,  r)
If r > l
     1. Find the middle point to divide the array into two halves:  
             middle m = (l+r)/2
     2. Call mergeSort for first half:   
             Call mergeSort(arr, l, m)
     3. Call mergeSort for second half:
             Call mergeSort(arr, m+1, r)
     4. Merge the two halves sorted in step 2 and 3:
             Call merge(arr, l, m, r)
The following diagram from Wikipedia shows the complete merge sort process for an example array {38, 27, 43, 3, 9, 82, 10}. If we take a closer look at the diagram, we can see that the array is recursively divided into two halves until the size becomes 1. Once the size becomes 1, the merge process comes into action and starts merging arrays back until the complete array is merged.
[Figure: merge sort recursion diagram for {38, 27, 43, 3, 9, 82, 10}, from Wikipedia]


/* C program for Merge Sort */
#include<stdlib.h>
#include<stdio.h>
 
// Merges two subarrays of arr[].
// First subarray is arr[l..m]
// Second subarray is arr[m+1..r]
void merge(int arr[], int l, int m, int r)
{
    int i, j, k;
    int n1 = m - l + 1;
    int n2 =  r - m;
 
    /* create temp arrays */
    int L[n1], R[n2];
 
    /* Copy data to temp arrays L[] and R[] */
    for (i = 0; i < n1; i++)
        L[i] = arr[l + i];
    for (j = 0; j < n2; j++)
        R[j] = arr[m + 1+ j];
 
    /* Merge the temp arrays back into arr[l..r]*/
    i = 0; // Initial index of first subarray
    j = 0; // Initial index of second subarray
    k = l; // Initial index of merged subarray
    while (i < n1 && j < n2)
    {
        if (L[i] <= R[j])
        {
            arr[k] = L[i];
            i++;
        }
        else
        {
            arr[k] = R[j];
            j++;
        }
        k++;
    }
 
    /* Copy the remaining elements of L[], if there
       are any */
    while (i < n1)
    {
        arr[k] = L[i];
        i++;
        k++;
    }
 
    /* Copy the remaining elements of R[], if there
       are any */
    while (j < n2)
    {
        arr[k] = R[j];
        j++;
        k++;
    }
}
 
/* l is for left index and r is right index of the
   sub-array of arr to be sorted */
void mergeSort(int arr[], int l, int r)
{
    if (l < r)
    {
        // Same as (l+r)/2, but avoids overflow for
        // large l and r
        int m = l+(r-l)/2;
 
        // Sort first and second halves
        mergeSort(arr, l, m);
        mergeSort(arr, m+1, r);
 
        merge(arr, l, m, r);
    }
}
 
/* UTILITY FUNCTIONS */
/* Function to print an array */
void printArray(int A[], int size)
{
    int i;
    for (i=0; i < size; i++)
        printf("%d ", A[i]);
    printf("\n");
}
 
/* Driver program to test above functions */
int main()
{
    int arr[] = {12, 11, 13, 5, 6, 7};
    int arr_size = sizeof(arr)/sizeof(arr[0]);
 
    printf("Given array is \n");
    printArray(arr, arr_size);
 
    mergeSort(arr, 0, arr_size - 1);
 
    printf("\nSorted array is \n");
    printArray(arr, arr_size);
    return 0;
}

Output:
Given array is
12 11 13 5 6 7

Sorted array is
5 6 7 11 12 13
Time Complexity: Merge Sort is a recursive algorithm, and its time complexity can be expressed as the following recurrence relation.
T(n) = 2T(n/2) + Θ(n)
The above recurrence can be solved either using the Recurrence Tree method or the Master method. It falls in case II of the Master method, and the solution of the recurrence is Θ(nLogn).
The time complexity of Merge Sort is Θ(nLogn) in all 3 cases (worst, average and best), as merge sort always divides the array into two halves and takes linear time to merge the two halves.
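As a sketch of where the nLogn comes from (assuming n is a power of 2), unroll the recurrence: each of the Log2(n) levels of the recursion tree does Θ(n) total merge work.
T(n) = 2T(n/2) + c*n = 4T(n/4) + 2*c*n = ... = 2^k * T(n/2^k) + k*c*n
At k = Log2(n) this gives T(n) = n*T(1) + c*n*Log2(n) = Θ(nLogn).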
Auxiliary Space: O(n)
Algorithmic Paradigm: Divide and Conquer
Sorting In Place: No (in a typical implementation)
Stable: Yes
Applications of Merge Sort
  1. Merge Sort is useful for sorting linked lists in O(nLogn) time. The case of linked lists is different mainly due to the difference in memory allocation: unlike arrays, linked list nodes may not be adjacent in memory, but items can be inserted in the middle in O(1) extra space and O(1) time, so the merge operation of merge sort can be implemented without extra space. Linked lists also lack the cheap random access that Quick Sort relies on, while merge sort accesses data sequentially, so merge sort suits them better (this is discussed further in the QuickSort section below; a sketch of the in-place list merge follows this list).
  2. Inversion Count Problem
  3. Used in External Sorting
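To make point 1 concrete, here is a minimal C sketch of merging two already-sorted singly linked lists by relinking nodes in O(1) extra space; struct Node and mergeLists are illustrative names, not part of the array code above.
/* Sketch: merge two sorted singly linked lists in O(1) extra space */
struct Node { int data; struct Node* next; };

struct Node* mergeLists(struct Node* a, struct Node* b)
{
    struct Node dummy;           /* temporary head node */
    struct Node* tail = &dummy;
    dummy.next = NULL;

    while (a != NULL && b != NULL)
    {
        if (a->data <= b->data) { tail->next = a; a = a->next; }
        else                    { tail->next = b; b = b->next; }
        tail = tail->next;
    }
    tail->next = (a != NULL) ? a : b;  /* append whichever list remains */
    return dummy.next;
}
No auxiliary array is allocated; the merge only rewires next pointers.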

QuickSort

Like Merge Sort, QuickSort is a Divide and Conquer algorithm. It picks an element as the pivot and partitions the given array around the picked pivot. There are many different versions of quickSort that pick the pivot in different ways.
  1. Always pick the first element as the pivot.
  2. Always pick the last element as the pivot (implemented below).
  3. Pick a random element as the pivot.
  4. Pick the median as the pivot.
The key process in quickSort is partition(). The target of partition() is: given an array and an element x of the array as the pivot, put x at its correct position in the sorted array, put all smaller elements (smaller than x) before x, and put all greater elements (greater than x) after x. All this should be done in linear time.
Pseudo Code for recursive QuickSort function:
/* low  --> Starting index,  high  --> Ending index */
quickSort(arr[], low, high)
{
    if (low < high)
    {
    /* pi is partitioning index, arr[pi] is now
           at right place */
        pi = partition(arr, low, high);

        quickSort(arr, low, pi - 1);  // Before pi
        quickSort(arr, pi + 1, high); // After pi
    }
}
Partition Algorithm
There can be many ways to do the partition; the following pseudo code adopts the method given in the CLRS book. The logic is simple: we start from the leftmost element and keep track of the index of smaller (or equal) elements as i. While traversing, if we find a smaller element, we increment i and swap the current element with arr[i]. Otherwise we ignore the current element.
Pseudo code for partition()
/* This function takes last element as pivot, places
   the pivot element at its correct position in sorted
    array, and places all smaller (smaller than pivot)
   to left of pivot and all greater elements to right
   of pivot */
partition (arr[], low, high)
{
    // pivot (Element to be placed at right position)
    pivot = arr[high];  
 
    i = (low - 1)  // Index of smaller element

    for (j = low; j <= high- 1; j++)
    {
        // If current element is smaller than or
        // equal to pivot
        if (arr[j] <= pivot)
        {
            i++;    // increment index of smaller element
            swap arr[i] and arr[j]
        }
    }
    swap arr[i + 1] and arr[high]
    return (i + 1)
}
Illustration of partition() :
arr[] = {10, 80, 30, 90, 40, 50, 70}
Indexes:  0   1   2   3   4   5   6 

low = 0, high = 6, pivot = arr[high] = 70
Initialize index of smaller element, i = -1

Traverse elements from j = low to high-1
j = 0 : Since arr[j] <= pivot, do i++ and swap(arr[i], arr[j])
i = 0 
arr[] = {10, 80, 30, 90, 40, 50, 70} // No change as i and j 
                                     // are same

j = 1 : Since arr[j] > pivot, do nothing
// No change in i and arr[]

j = 2 : Since arr[j] <= pivot, do i++ and swap(arr[i], arr[j])
i = 1
arr[] = {10, 30, 80, 90, 40, 50, 70} // We swap 80 and 30 

j = 3 : Since arr[j] > pivot, do nothing
// No change in i and arr[]

j = 4 : Since arr[j] <= pivot, do i++ and swap(arr[i], arr[j])
i = 2
arr[] = {10, 30, 40, 90, 80, 50, 70} // 80 and 40 Swapped
j = 5 : Since arr[j] <= pivot, do i++ and swap arr[i] with arr[j] 
i = 3 
arr[] = {10, 30, 40, 50, 80, 90, 70} // 90 and 50 Swapped 

We come out of the loop because j is now equal to high.
Finally we place pivot at correct position by swapping
arr[i+1] and arr[high] (or pivot) 
arr[] = {10, 30, 40, 50, 70, 90, 80} // 80 and 70 Swapped 

Now 70 is at its correct place. All elements smaller than
70 are before it and all elements greater than 70 are after
it.
Implementation:
Following is a C implementation of QuickSort.
/* C implementation of QuickSort */
#include<stdio.h>
 
// A utility function to swap two elements
void swap(int* a, int* b)
{
    int t = *a;
    *a = *b;
    *b = t;
}
 
/* This function takes last element as pivot, places
   the pivot element at its correct position in sorted
    array, and places all smaller (smaller than pivot)
   to left of pivot and all greater elements to right
   of pivot */
int partition (int arr[], int low, int high)
{
    int pivot = arr[high];    // pivot
    int i = (low - 1);  // Index of smaller element
 
    for (int j = low; j <= high- 1; j++)
    {
        // If current element is smaller than or
        // equal to pivot
        if (arr[j] <= pivot)
        {
            i++;    // increment index of smaller element
            swap(&arr[i], &arr[j]);
        }
    }
    swap(&arr[i + 1], &arr[high]);
    return (i + 1);
}
 
/* The main function that implements QuickSort
 arr[] --> Array to be sorted,
  low  --> Starting index,
  high  --> Ending index */
void quickSort(int arr[], int low, int high)
{
    if (low < high)
    {
        /* pi is partitioning index, arr[pi] is now
           at right place */
        int pi = partition(arr, low, high);
 
        // Separately sort elements before
        // partition and after partition
        quickSort(arr, low, pi - 1);
        quickSort(arr, pi + 1, high);
    }
}
 
/* Function to print an array */
void printArray(int arr[], int size)
{
    int i;
    for (i=0; i < size; i++)
        printf("%d ", arr[i]);
    printf("\n");
}
 
// Driver program to test above functions
int main()
{
    int arr[] = {10, 7, 8, 9, 1, 5};
    int n = sizeof(arr)/sizeof(arr[0]);
    quickSort(arr, 0, n-1);
    printf("Sorted array: \n");
    printArray(arr, n);
    return 0;
}
Output:
Sorted array:
1 5 7 8 9 10
Analysis of QuickSort
The time taken by QuickSort in general can be written as follows.
 T(n) = T(k) + T(n-k-1) + Θ(n)
The first two terms are for the two recursive calls; the last term is for the partition process. k is the number of elements smaller than the pivot.
The time taken by QuickSort depends upon the input array and partition strategy. Following are three cases.
Worst Case: The worst case occurs when the partition process always picks the greatest or smallest element as the pivot. If we consider the above partition strategy where the last element is always picked as the pivot, the worst case occurs when the array is already sorted in increasing or decreasing order. Following is the recurrence for the worst case.
 T(n) = T(0) + T(n-1) + Θ(n)
which is equivalent to
 T(n) = T(n-1) + Θ(n)
The solution of the above recurrence is Θ(n²).
Best Case: The best case occurs when the partition process always picks the middle element as the pivot. Following is the recurrence for the best case.
 T(n) = 2T(n/2) + Θ(n)
The solution of the above recurrence is Θ(nLogn). It can be solved using case 2 of the Master Theorem.
Average Case:
To do average case analysis, we need to consider all possible permutations of the array and calculate the time taken by each of them, which does not look easy.
We can get an idea of the average case by considering the case when partition puts n/10 elements in one set and 9n/10 elements in the other set. Following is the recurrence for this case.
 T(n) = T(n/10) + T(9n/10) + Θ(n)
The solution of the above recurrence is also O(nLogn).
Although the worst case time complexity of QuickSort is O(n²), which is worse than that of many other sorting algorithms such as Merge Sort and Heap Sort, QuickSort is faster in practice because its inner loop can be implemented efficiently on most architectures and for most real-world data. QuickSort can also be implemented in different ways by changing the choice of pivot, so that the worst case rarely occurs for a given type of data. However, merge sort is generally considered better when the data is huge and stored in external storage.
What is 3-Way QuickSort?
In the simple QuickSort algorithm, we select an element as the pivot, partition the array around the pivot, and recur for the subarrays on the left and right of the pivot.
Consider an array which has many redundant elements. For example, {1, 4, 2, 4, 2, 4, 1, 2, 4, 1, 2, 2, 2, 2, 4, 1, 4, 4, 4}. If 4 is picked as the pivot in simple QuickSort, we fix only one 4 and recursively process the remaining occurrences. In 3-Way QuickSort, an array arr[l..r] is divided into 3 parts:
a) arr[l..i] elements less than pivot.
b) arr[i+1..j-1] elements equal to pivot.
c) arr[j..r] elements greater than pivot.
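A minimal sketch of the 3-way partition in C (Dutch national flag style); the function name partition3 and the output parameters i and j are illustrative, not from a specific library:
/* Sketch: 3-way partition around pivot = arr[r].
   On return, arr[*i..*j] holds the elements equal to the pivot. */
void partition3(int arr[], int l, int r, int* i, int* j)
{
    int pivot = arr[r];
    int lt = l, gt = r, k = l;
    while (k <= gt)
    {
        if (arr[k] < pivot)
        {
            int t = arr[k]; arr[k] = arr[lt]; arr[lt] = t;
            lt++; k++;
        }
        else if (arr[k] > pivot)
        {
            int t = arr[k]; arr[k] = arr[gt]; arr[gt] = t;
            gt--;
        }
        else
            k++;
    }
    *i = lt;   /* first index equal to the pivot */
    *j = gt;   /* last index equal to the pivot  */
}
The sort then recurs on arr[l..i-1] and arr[j+1..r], skipping the whole block of pivot-equal elements.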
How to implement QuickSort for Linked Lists?
QuickSort on Singly Linked List
QuickSort on Doubly Linked List
Can we implement QuickSort Iteratively?
Yes, please refer to Iterative Quick Sort.
Why Quick Sort is preferred over MergeSort for sorting Arrays
Quick Sort in its general form is an in-place sort (i.e. it doesn't require any extra storage), whereas merge sort requires O(N) extra storage, N denoting the array size, which may be quite expensive. Allocating and de-allocating the extra space used by merge sort increases the running time of the algorithm. Comparing average complexity, we find that both sorts have O(NlogN) average complexity, but the constants differ. For arrays, merge sort loses due to its use of O(N) extra storage space.
Most practical implementations of Quick Sort use a randomized version. The randomized version has an expected time complexity of O(nLogn). The worst case is still possible in the randomized version, but it does not occur for any particular pattern (such as a sorted array), and randomized Quick Sort works well in practice.
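As a sketch of the randomized version in C (reusing the swap() and partition() functions from the code above; rand() comes from <stdlib.h>), one common approach is to move a randomly chosen element to the last position and then reuse the same partition:
/* Sketch: randomized pivot selection on top of the Lomuto
   partition() defined earlier. Requires #include <stdlib.h>. */
int randomizedPartition(int arr[], int low, int high)
{
    int r = low + rand() % (high - low + 1);  /* random index in [low, high] */
    swap(&arr[r], &arr[high]);                /* move random pivot to the end */
    return partition(arr, low, high);
}
quickSort() would simply call randomizedPartition() instead of partition().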
Quick Sort is also a cache friendly sorting algorithm as it has good locality of reference when used for arrays.
Quick Sort is also tail recursive, so tail call optimization can be applied.
Why MergeSort is preferred over QuickSort for Linked Lists?
The case of linked lists is different, mainly due to the difference in memory allocation of arrays and linked lists. Unlike arrays, linked list nodes may not be adjacent in memory. Unlike an array, in a linked list we can insert items in the middle in O(1) extra space and O(1) time. Therefore the merge operation of merge sort can be implemented without extra space for linked lists.
In arrays, we can do random access, as elements are contiguous in memory. Let us say we have an integer (4-byte) array A, and let the address of A[0] be x; then to access A[i], we can directly access the memory at (x + i*4). Unlike arrays, we cannot do random access in a linked list. Quick Sort requires a lot of this kind of access: to access the i'th index in a linked list, we have to travel each and every node from the head to the i'th node, as we don't have a contiguous block of memory. Therefore, the overhead increases for quick sort. Merge sort accesses data sequentially, and the need for random access is low.
How to optimize QuickSort so that it takes O(Log n) extra space in worst case?
Please see QuickSort Tail Call Optimization (Reducing worst case space to Log n).
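The core idea, as a hedged sketch in C (again reusing partition() from above): recurse only on the smaller side and loop on the larger side, so the recursion depth stays O(Log n) even in the worst case:
/* Sketch: QuickSort that recurses on the smaller partition and
   iterates on the larger one, bounding stack depth by O(Log n). */
void quickSortOpt(int arr[], int low, int high)
{
    while (low < high)
    {
        int pi = partition(arr, low, high);
        if (pi - low < high - pi)
        {
            quickSortOpt(arr, low, pi - 1);   /* smaller left side  */
            low = pi + 1;                     /* loop on right side */
        }
        else
        {
            quickSortOpt(arr, pi + 1, high);  /* smaller right side */
            high = pi - 1;                    /* loop on left side  */
        }
    }
}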

Tuesday, July 11, 2017

What are the best books on algorithms and data structures?

I am looking for books to brush up my concepts of algorithms and data structures, and I am looking for self-study books.
Adam D'Angelo 
The most standard book is Introduction to Algorithms, by Cormen, Leiserson, Rivest, and Stein. It's used in a lot of college intro algorithms courses.
Venkateswara Rao Sanaka
This is my personal preference. There might be many other good books.

1. CLRS - The classic comprehensive textbook on algorithms. A must-read at least once in a programmer's career.

2. Introduction to Algorithms: A Creative Approach by Udi Manber - An excellent book on various algorithm categories. Many interesting interview questions seen on web portals can be found in this book. The chapter-end exercises are an asset; one must attempt the "Creative Problems" section at the end of every chapter. If a programmer wants to know the power of induction as a problem-solving approach, he must read this book. Strongly recommended.

3. The Algorithm Design Manual by Skiena - Lots of algorithmic problems, discussions, war stories, related problems, and interesting exercises. It helps in modeling a problem in different ways. A book every passionate programmer must work through. Don't read this unless you have good insight into algorithms.

4. Algorithms by Dasgupta - A concise book on a few algorithmic categories; pick any chapter based on interest and attempt the end-of-chapter exercises.

5. Algorithms 4e by Sedgewick - A relatively beginner-level book; it covers graphs, strings, hashing, searching, sorting, etc. very well, and follows an OOP approach in Java. Strongly recommended for beginners, though nothing stops a professional. Its companion web portal contains plenty of interesting exercises. There are other books by Sedgewick on algorithms as well. Recommended for learning data structures.

6. Introduction to the Design and Analysis of Algorithms by Levitin - An introductory book on algorithm design. Recommended for beginners. One can enjoy the explanations and solve the end-of-section exercises.

On Programming Style:
1. Programming Pearls by Bentley - A must-read book on the design and implementation of computer programs.

2. The Practice of Programming by Kernighan - Written during the Unix days, still one of the best resources on program design and implementation principles.

3. Advanced Programming in the Unix Environment by W. Richard Stevens - It covers many Unix internals and kernel-level APIs, and it follows an excellent programming style. Stevens' books are among the best in their category; I would say they stand at the level of CLRS in the algorithms category. Highly recommended.
Siddharth Goel
Well, I would definitely NOT suggest Introduction to Algorithms. It is certainly an advanced book and is more of a reference book that covers a lot of topics. My personal recommendation would be to go for a MOOC rather than a book. Coursera offers some courses on algorithms, of which two are the most famous -
  1. From Princeton University by Bob Sedgewick and Kevin Wayne.
  2. From Stanford University by Tim Roughgarden (Part 1 and Part 2).
Both are awesome courses and provide a very structured way to learn algorithms. You can have a look at those two courses. I would also encourage you to have a look at Siddhanth Deshpande's answer to Coursera: Which is better for beginners to learn the subject, the "Algorithms - Design and Analysis" course taught by Tim Roughgarden of Stanford, or the "Algorithms" course taught by Kevin Wayne and Robert Sedgewick of Princeton? It also covers some good points regarding those courses.

Top 10 Algorithms and Data Structures for Competitive Programming

This post covers the most important top 10 algorithms and data structures for competitive coding.
Topics:
  1. Graph algorithms
  2. Dynamic programming
  3. Searching and Sorting
  4. Number Theory and Other Mathematical Algorithms
  5. Geometrical and Network Flow Algorithms
  6. Data Structures
The links below cover the most important algorithm and data structure topics:

Graph Algorithms
  1. Breadth First Search (BFS)
  2. Depth First Search (DFS)
  3. Shortest Path from source to all vertices (Dijkstra)
  4. Shortest Path from every vertex to every other vertex (Floyd Warshall)
  5. Minimum Spanning Tree (Prim)
  6. Minimum Spanning Tree (Kruskal)
  7. Topological Sort
  8. Johnson’s algorithm
  9. Articulation Points (or Cut Vertices) in a Graph
  10. Bridges in a graph
All Graph Algorithms

Dynamic Programming
  1. Longest Common Subsequence
  2. Longest Increasing Subsequence
  3. Edit Distance
  4. Minimum Partition
  5. Ways to Cover a Distance
  6. Longest Path In Matrix
  7. Subset Sum Problem
  8. Optimal Strategy for a Game
  9. 0-1 Knapsack Problem
  10. Assembly Line Scheduling
All DP Algorithms


Searching And Sorting
  1. Binary Search
  2. Quick Sort
  3. Merge Sort
  4. Order Statistics
  5. KMP algorithm
  6. Rabin-Karp
  7. Z algorithm
  8. Aho Corasick String Matching
  9. Counting Sort
  10. Manacher’s algorithm: Part 1, Part 2 and Part 3
All Articles on Searching, Sorting and Pattern Searching.

Number Theory and Other Mathematical Algorithms
Prime Numbers and Prime Factorization
  1. Primality Test | Set 1 (Introduction and School Method)
  2. Primality Test | Set 2 (Fermat Method)
  3. Primality Test | Set 3 (Miller–Rabin)
  4. Sieve of Eratosthenes
  5. Segmented Sieve
  6. Wilson’s Theorem
  7. Prime Factorisation
  8. Pollard’s rho algorithm

Modulo Arithmetic Algorithms
  1. Basic and Extended Euclidean algorithms
  2. Euler’s Totient Function
  3. Modular Exponentiation
  4. Modular Multiplicative Inverse
  5. Chinese remainder theorem Introduction
  6. Chinese remainder theorem and Modulo Inverse Implementation
  7. nCr%m
Miscellaneous:
  1. Counting Inversions
  2. Counting Inversions using BIT
  3. Logarithmic exponentiation
  4. Square root of an integer
  5. Heavy-Light Decomposition
  6. Matrix Rank
  7. Gaussian Elimination to Solve Linear Equations
  8. Hungarian algorithm
  9. Link-Cut Tree
  10. Mo’s algorithm
  11. Factorial of a large number in C++
  12. Factorial of a large number in Java
  13. Russian Peasant Multiplication
  14. Catalan Number
All Articles on Mathematical Algorithms

Geometrical and Network Flow Algorithms
  1. Convex Hull
  2. Graham Scan
  3. Line Intersection
  4. Interval Tree
  5. Matrix Exponentiation
  6. Max Flow (Ford-Fulkerson Algorithm and Edmonds-Karp Implementation)
  7. Min cut
  8. Stable Marriage Problem
  9. Hopcroft–Karp Algorithm for Maximum Matching
  10. Dinic’s algorithm
All Articles on Geometric Algorithms

Data Structures
  1. Binary Indexed Tree or Fenwick tree
  2. Segment Tree (RMQ, Range Sum and Lazy Propagation)
  3. K-D tree (See insert, minimum and delete)
  4. Union-Find Disjoint Set (Cycle Detection, Union by Rank and Path Compression)
  5. Tries
  6. Suffix array
  7. Sparse table
  8. Suffix automata
  9. Suffix automata II
  10. LCA and RMQ
All Articles on Advanced Data Structures.
How to Begin?
Please see How to begin with Competitive Programming?
How to Practice?
Please see http://practice.geeksforgeeks.org/
What are top algorithms in Interview Questions?
Top 10 algorithms in Interview Questions
How to prepare for ACM – ICPC?
How to prepare for ACM – ICPC?
This is an initial draft. We will soon be adding more links and algorithms to this post. Please write comments if you find anything incorrect, or if you want to share more information about the topic discussed above.