python-algorithms¶
python-algorithms is a collection of algorithms implemented in
Python 3.6.
You don’t need to install this project as a module (via
pip): usually you need only one algorithm rather than the whole
pack, so just copy and paste the source code. For easy navigation,
use the links to the source code below.
algorithms.arithmetic¶
GCD¶
Greatest common divisor (gcd) of two or more integers, which are not all zero, is the largest positive integer that divides each of the integers.
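For reference, the classic Euclidean algorithm computes the gcd; this is an illustrative sketch and may differ from the project’s own implementation:

```python
def gcd(a, b):
    """Greatest common divisor via the Euclidean algorithm."""
    while b:
        a, b = b, a % b  # replace (a, b) with (b, a mod b) until b is 0
    return abs(a)
```

For example, gcd(54, 24) reduces through (24, 6) and (6, 0) to 6.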
LCM¶
The least common multiple, lowest common multiple, or smallest common multiple of two integers a and b, usually denoted by LCM(a, b), is the smallest positive integer that is divisible by both a and b. [Wikipedia]
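The LCM follows directly from the gcd via the identity lcm(a, b) = |a*b| / gcd(a, b); a minimal sketch (not necessarily the project’s implementation):

```python
from math import gcd

def lcm(a, b):
    """Smallest positive integer divisible by both a and b."""
    return abs(a * b) // gcd(a, b)
```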
algorithms.graphs¶
-
get_edges
(graph)[source]¶ Function for getting the edges of the graph
Parameters: graph – graph representation, for example: {1: {2: 1, 3: 5}, 2: {3: 2}, 4: {1: 2}} Returns: edges of the graph
-
get_nodes
(graph)[source]¶ Function for getting the nodes of the graph
Parameters: graph – graph representation, for example: {1: {2: 1, 3: 5}, 2: {3: 2}, 4: {1: 2}} Returns: nodes of the graph
-
num_of_edges
(graph)[source]¶ Function for calculating number of edges
Parameters: graph – graph representation, for example: {1: {2: 1, 3: 5}, 2: {3: 2}, 4: {1: 2}} Returns: number of edges in the graph
-
num_of_nodes
(graph)[source]¶ Function for calculating number of nodes in the graph
Parameters: graph – graph representation, for example: {1: {2: 1, 3: 5}, 2: {3: 2}, 4: {1: 2}} Returns: number of nodes in the graph
-
prepare_direct_graph
(edges)[source]¶ Function for converting a list of edges to a dict graph representation
Parameters: edges – list of tuples, example: [(1,2), (2,3)] Returns: dict that represents the graph Return type: defaultdict(list) Examples
>>> prepare_direct_graph([(1, 2), (3, 4), (2, 4), (2, 3)])
defaultdict(list, {1: [2], 3: [4], 2: [4, 3]})
-
prepare_undirect_graph
(edges)[source]¶ Function for converting a list of edges to a dict graph representation
Parameters: edges – list of tuples, example: [(1,2), (2,3)] Returns: dict that represents the graph Return type: defaultdict(list) Examples
>>> prepare_undirect_graph([(1, 2), (3, 4), (2, 4), (2, 3)])
defaultdict(list, {1: [2], 2: [1, 3, 4], 3: [2, 4], 4: [2, 3]})
-
prepare_weighted_direct_graph
(edges)[source]¶ Function for converting a list of edges with weights to a dict graph representation
Parameters: edges – list of (node_a, node_b, weight) tuples, example: [(1, 2, 3), (3, 2, 1)] Returns: dict that represents the graph Return type: defaultdict(dict) Examples
>>> prepare_weighted_direct_graph([(1, 2, 1), (4, 1, 2), (2, 3, 2), (1, 3, 5)])
defaultdict(dict, {1: {2: 1, 3: 5}, 2: {3: 2}, 4: {1: 2}})
-
prepare_weighted_undirect_graph
(edges)[source]¶ Function for converting a list of edges with weights to a dict graph representation
Parameters: edges – list of (node_a, node_b, weight) tuples, example: [(1, 2, 3), (3, 2, 1)] Returns: dict that represents the graph Return type: defaultdict(dict) Examples
>>> prepare_weighted_undirect_graph([(1, 2, 1), (4, 1, 2), (2, 3, 2), (1, 3, 5)])
defaultdict(<class 'dict'>, {1: {2: 1, 4: 2, 3: 5}, 2: {1: 1, 3: 2}, 4: {1: 2}, 3: {2: 2, 1: 5}})
-
reverse_graph
(graph)[source]¶ Function for reversing the direction of every edge in the graph
Parameters: graph – graph representation, for example: {1: [2], 2: [1, 3, 4], 3: [2, 4], 4: [2, 3]} Returns: reversed graph Examples
>>> reverse_graph({1: [2], 2: [1, 3, 4], 3: [2, 4], 4: [2, 3]})
defaultdict(<class 'list'>, {2: [1, 3, 4], 1: [2], 3: [2, 4], 4: [2, 3]})
-
reverse_weighted_graph
(graph)[source]¶ Function for reversing the direction of every edge in the graph (weights are preserved)
Parameters: graph – graph representation, for example: {1: {2: 1, 3: 5}, 2: {3: 2}, 4: {1: 2}} Returns: reversed graph Examples
>>> reverse_weighted_graph({1: {2: 1, 3: 5}, 2: {3: 2}, 4: {1: 2}})
defaultdict(<class 'dict'>, {2: {1: 1}, 3: {1: 5, 2: 2}, 1: {4: 2}})
BFS¶
Breadth-first search (BFS) is an algorithm for traversing or searching tree or graph data structures. It starts at the tree root (or some arbitrary node of a graph, sometimes referred to as a ‘search key’) and explores the neighbor nodes first, before moving to the next-level neighbors. [Wikipedia]
Worst-case performance: O(V + E) = O(b^d)
Worst-case space complexity: O(V) = O(b^d)
DFS¶
Depth-first search (DFS) is an algorithm for traversing or searching tree or graph data structures. The algorithm starts at the root node (selecting some arbitrary node as the root in the case of a graph) and explores as far as possible along each branch before backtracking. [Wikipedia]
Worst-case performance: O(V + E)
Worst-case space complexity: O(V)
https://wikipedia.org/wiki/Depth-first_search
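No standalone DFS function is documented in this section; a minimal recursive sketch (the function name and signature are assumptions, not the project’s API) could look like:

```python
def dfs_recursive(graph, node, visited=None):
    """Return vertices in DFS visiting order.

    `graph` maps a node to a list of its neighbours,
    e.g. {1: [2, 3], 2: [4], 3: [], 4: []}.
    """
    if visited is None:
        visited = []
    visited.append(node)
    for neighbour in graph.get(node, []):
        if neighbour not in visited:
            dfs_recursive(graph, neighbour, visited)  # go deep before siblings
    return visited
```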
-
bfs_iterative
(graph, root)[source] Iterative version of the BFS algorithm
Parameters: - graph – dict representation of the graph
- root – start point
Returns: vertices in order of visiting
Return type: list
Examples:
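The Examples field above is empty; a self-contained sketch of an iterative BFS consistent with the documented signature (illustrative only, not the project’s code):

```python
from collections import deque

def bfs_iterative(graph, root):
    """Visit vertices level by level starting from `root`;
    returns the visiting order as a list."""
    visited = [root]
    queue = deque([root])
    while queue:
        node = queue.popleft()           # FIFO queue gives breadth-first order
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.append(neighbour)
                queue.append(neighbour)
    return visited
```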
Dijkstra¶
Dijkstra’s algorithm is an algorithm for finding the shortest paths between nodes in a graph, which may represent, for example, road networks. It was conceived by computer scientist Edsger W. Dijkstra in 1956 and published three years later [Wikipedia]
Worst-case performance: O(|E| + |V| log |V|)
-
dijkstra
(graph, start, target)[source]¶ Solves the shortest path problem using Dijkstra’s algorithm
Parameters: - graph – graph representation
- start – start node
- target – target node
Returns: distance between start and target nodes Return type: int Examples
>>> graph = prepare_weighted_undirect_graph( [(1, 2, 7), (1, 3, 9), (1, 6, 14), (6, 3, 2), (6, 5, 9), (3, 2, 10), (3, 4, 11), (2, 4, 15), (6, 5, 9), (5, 4, 6)])
>>> dijkstra(graph, 1, 6)
11
Bidirectional Dijkstra¶
Bidirectional search is a graph search algorithm that finds a shortest path from an initial vertex to a goal vertex in a directed graph. It runs two simultaneous searches: one forward from the initial state , and one backward from the goal, stopping when the two meet in the middle. [Wikipedia]
-
bidi_dijkstra
(graph, start, target)[source]¶ Calculate the shortest path via Dijkstra’s algorithm with a bidirectional optimization: the search runs from both the start and target nodes, switching between them at each step
Parameters: - graph – graph representation
- start – start node
- target – target node
Returns: length of the shortest path between start and target nodes
Return type: int
Examples:
>>> graph = prepare_weighted_undirect_graph([(1, 2, 7), (1, 3, 9), (1, 6, 14), (6, 3, 2), (6, 5, 9), (3, 2, 10), (3, 4, 11), (2, 4, 15), (6, 5, 9), (5, 4, 6)])
>>> bidi_dijkstra(graph, 1, 6)
11
Cycle detection (DFS)¶
Bellman-Ford (shortest path with negative arbitrage)¶
Kruskal¶
Bellman-Ford (with negative cycle detection)¶
Bipartite¶
In the mathematical field of graph theory, a bipartite graph (or bigraph) is a graph whose vertices can be divided into two disjoint and independent sets U and V such that every edge connects a vertex in U to one in V. Vertex sets U and V are usually called the parts of the graph. Equivalently, a bipartite graph is a graph that does not contain any odd-length cycles. [Wikipedia]
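The odd-cycle characterization above suggests a direct check by 2-colouring with BFS; this is a sketch under an assumed dict-of-neighbour-lists representation (the function name is not from the project):

```python
from collections import deque

def is_bipartite(graph):
    """Check bipartiteness by 2-colouring with BFS.

    `graph` maps a node to a list of neighbours (undirected adjacency).
    """
    colour = {}
    for start in graph:                      # handle disconnected components
        if start in colour:
            continue
        colour[start] = 0
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for neighbour in graph.get(node, []):
                if neighbour not in colour:
                    colour[neighbour] = 1 - colour[node]  # opposite part
                    queue.append(neighbour)
                elif colour[neighbour] == colour[node]:
                    return False             # same-colour edge: odd cycle
    return True
```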
BST check¶
Function to check whether the tree is a correct Binary Search Tree (BST)
-
check_if_bst
(node, mini=-inf, maxi=inf)[source]¶ Check if the given tree is Binary Search Tree (BST)
Parameters: - node – root node of the tree; the node argument must have .left, .right and .data attributes
- mini – lower bound used internally by the recursion; should be omitted when calling
- maxi – upper bound used internally by the recursion; should be omitted when calling
Returns: bool – True if it’s a BST and False if not
Examples
Precondition:
>>> class Node:
...     def __init__(self, data):
...         self.data = data
...         self.left = None
...         self.right = None
>>> root = Node(4)
>>> root.left = Node(2)
>>> root.right = Node(6)
>>> root.left.left = Node(1)
>>> root.left.right = Node(3)
>>> root.right.left = Node(5)
>>> root.right.right = Node(7)
Example itself:
>>> check_if_bst(root)
True
Strongly connected components¶
In the mathematical theory of directed graphs, a graph is said to be strongly connected or diconnected if every vertex is reachable from every other vertex.
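One standard way to find the strongly connected components is Kosaraju’s two-pass DFS; the sketch below assumes the dict-of-successor-lists representation used elsewhere in this document (the function name is an assumption, not the project’s API):

```python
def kosaraju_scc(graph):
    """Strongly connected components via Kosaraju's algorithm.

    `graph` maps a node to a list of successors.
    Returns a list of components, each sorted.
    """
    visited, order = set(), []

    def dfs(node, adj, out):
        visited.add(node)
        for nxt in adj.get(node, []):
            if nxt not in visited:
                dfs(nxt, adj, out)
        out.append(node)                     # post-order: record on finish

    # First pass: finish order on the original graph.
    nodes = set(graph) | {v for vs in graph.values() for v in vs}
    for node in nodes:
        if node not in visited:
            dfs(node, graph, order)

    # Second pass: DFS on the reversed graph in reverse finish order.
    reversed_graph = {}
    for node, succs in graph.items():
        for succ in succs:
            reversed_graph.setdefault(succ, []).append(node)

    visited.clear()
    components = []
    for node in reversed(order):
        if node not in visited:
            component = []
            dfs(node, reversed_graph, component)
            components.append(sorted(component))
    return components
```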
Topological sort¶
In the field of computer science, a topological sort or topological ordering of a directed graph is a linear ordering of its vertices such that for every directed edge uv from vertex u to vertex v, u comes before v in the ordering. [Wikipedia]
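The ordering described above can be produced with Kahn’s algorithm (repeatedly remove vertices with no incoming edges); a sketch under the assumed dict-of-successor-lists representation:

```python
from collections import deque

def topological_sort(graph):
    """Topological ordering of a DAG via Kahn's algorithm.

    `graph` maps a node to a list of successors.
    """
    indegree = {node: 0 for node in graph}
    for succs in graph.values():
        for succ in succs:
            indegree[succ] = indegree.get(succ, 0) + 1

    queue = deque(node for node, deg in indegree.items() if deg == 0)
    order = []
    while queue:
        node = queue.popleft()
        order.append(node)
        for succ in graph.get(node, []):
            indegree[succ] -= 1              # "remove" edge node -> succ
            if indegree[succ] == 0:
                queue.append(succ)
    return order
```

If the graph contains a cycle, the returned list is shorter than the number of vertices, which doubles as a cycle check.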
algorithms.greedy¶
Covering segments¶
Given a set of n segments {[a(0), b(0)], [a(1), b(1)], …, [a(n−1), b(n−1)]} with integer coordinates on a line, find the minimum number m of points such that each segment contains at least one point.
-
covering_segments
(segments: [<class 'tuple'>]) → list[source]¶ Function for finding a minimum set of points such that each segment contains at least one of them
Parameters: segments – list of tuples with start and end point coordinates Returns: list of points covering all segments Examples
>>> covering_segments([(4, 7), (1, 3), (2, 5), (5, 6)])
[3, 6]
Fractional Knapsack¶
Given weights and values of n items, we need to put these items in a knapsack of capacity W to get the maximum total value in the knapsack. Items may be taken fractionally.
Complexity: O(n log n)
-
fractional_knapsack
(capacity: int, items: [<class 'tuple'>]) → float[source]¶ Function solves fractional knapsack problem
Parameters: - capacity – total capacity of backpack
- items – list of tuples [(value, weight),…]
Returns: float - maximum value that can be placed in backpack
Examples
>>> fractional_knapsack(50, [(60, 10), (100, 20), (120, 30)])
240.0
algorithms.search¶
Binary Search¶
In computer science, binary search is a search algorithm that finds the position of a target value within a sorted array. Binary search compares the target value to the middle element of the array; if they are unequal, the half in which the target cannot lie is eliminated and the search continues on the remaining half until it is successful. If the search ends with the remaining half being empty, the target is not in the array.
-
binary_search
(sorted_array, target_element)[source]¶ Binary search algorithm
Parameters: - sorted_array – list of sorted elements (integers)
- target_element – element to find
Returns: position of target_element on success, -1 if the element is not found
Examples
>>> binary_search([1, 2, 3, 4], 5)
-1
>>> binary_search([x for x in range(10000)], 26)
26
Closest Pair¶
The closest pair of points problem or closest pair problem is a problem of computational geometry: given n points in metric space, find a pair of points with the smallest distance between them. The closest pair problem for points in the Euclidean plane[1] was among the first geometric problems that were treated at the origins of the systematic study of the computational complexity of geometric algorithms. [Wikipedia]
-
brute_force_distance
(points)[source]¶ Brute force solution for closest pair problem
Complexity: O(n^2)
Parameters: points – list of points in the following format [(x1, y1), (x2, y2), … , (xn, yn)] Returns: Minimum distance between points Examples
>>> brute_force_distance([(2, 3), (12, 30), (40, 50), (5, 1), (12, 10), (3, 4)])
1.4142135623730951
-
n_log_n_squared_distance
(points)[source]¶ O(n (log n)^2) solution for closest pair problem
Complexity: O(n (log n)^2)
Parameters: points – list of points in the following format [(x1, y1), (x2, y2), … , (xn, yn)] Returns: Minimum distance between points Examples
>>> n_log_n_squared_distance([(2, 3), (12, 30), (40, 50), (5, 1), (12, 10), (3, 4)])
1.4142135623730951
Fibonacci¶
In mathematics, the Fibonacci numbers are the numbers in the following integer sequence, called the Fibonacci sequence, and characterized by the fact that every number after the first two is the sum of the two preceding ones: F(n) = F(n−1) + F(n−2), with F(0) = 0 and F(1) = 1.
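The recurrence translates directly into an iterative loop; a minimal sketch for reference (the project’s implementation may differ):

```python
def fibonacci(n):
    """Return the n-th Fibonacci number (0-indexed: F(0)=0, F(1)=1)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b  # shift the window (F(k), F(k+1)) forward
    return a
```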
Fibonacci with modulo¶
Calculating (n-th Fibonacci number) mod m
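The Fibonacci sequence modulo m is periodic (the Pisano period), so F(n) mod m can be computed after reducing n by that period; a sketch under that standard approach (function names assumed):

```python
def pisano_period(m):
    """Length of the period of the Fibonacci sequence modulo m."""
    previous, current = 0, 1
    for i in range(m * m):                      # period is at most m*m steps
        previous, current = current, (previous + current) % m
        if (previous, current) == (0, 1):       # sequence restarts at 0, 1
            return i + 1
    return 1                                    # only reached for m == 1

def fibonacci_mod(n, m):
    """Compute F(n) mod m by reducing n modulo the Pisano period."""
    n %= pisano_period(m)
    a, b = 0, 1
    for _ in range(n):
        a, b = b, (a + b) % m
    return a
```

This keeps the iteration count bounded by the period even for huge n.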
Rabin-Karp¶
In computer science, the Rabin–Karp algorithm or Karp–Rabin algorithm is a string searching algorithm created by Richard M. Karp and Michael O. Rabin (1987) that uses hashing to find any one of a set of pattern strings in a text. For text of length n and p patterns of combined length m,
its average and best case running time is O(n+m) in space O(p), but its worst-case time is O(nm).
-
rabin_karp
(text, pattern)[source]¶ Rabin-Karp algorithm that finds all occurrences of pattern in text
Parameters: - text – text to search in
- pattern – pattern to search for
Returns: list of positions where the pattern occurs in the text
Examples
>>> rabin_karp('AABAACAADAABAABA', 'AABA')
[0, 9, 12]
>>> rabin_karp('aaaaa', 'aaa')
[0, 1, 2]
algorithms.sorting¶
Merge Sort¶
In computer science, merge sort (also commonly spelled mergesort) is an efficient, general-purpose, comparison-based sorting algorithm. Most implementations produce a stable sort, which means that the implementation preserves the input order of equal elements in the sorted output. Merge sort is a divide-and-conquer algorithm that was invented by John von Neumann in 1945.
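No merge sort signature is listed in this section; a minimal stable implementation for reference (illustrative, not necessarily the project’s code):

```python
def merge_sort(array):
    """Stable divide-and-conquer sort; returns a new sorted list."""
    if len(array) <= 1:
        return array
    mid = len(array) // 2
    left = merge_sort(array[:mid])
    right = merge_sort(array[mid:])
    # Merge the two sorted halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= keeps equal elements stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```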
Quick Sort¶
Quicksort (sometimes called partition-exchange sort) is an efficient sorting algorithm, serving as a systematic method for placing the elements of an array in order. Developed by Tony Hoare in 1959 and published in 1961, it is still a commonly used algorithm for sorting. [Wikipedia]
https://en.wikipedia.org/wiki/Quicksort
Average complexity: O(n log n)
Worst case: O(n^2)
-
quick_sort
(array, low, high)[source]¶ Quick Sort function sorts array
Parameters: - array – list of elements
- low – lower index; the initial call should pass 0
- high – upper index; the initial call should pass len(array)-1
Returns: sorted array
Return type: list
Examples
>>> quick_sort([4, 9, 4, 4, 1, 9, 4, 4, 9, 4, 4, 1, 4], 0, 12)
[1, 1, 4, 4, 4, 4, 4, 4, 4, 4, 9, 9, 9]
>>> quick_sort([1, 10, 32, 4.], 0, 3)
[1, 4.0, 10, 32]
algorithms.dynamic_programming¶
Knapsack¶
The knapsack problem or rucksack problem is a problem in combinatorial optimization: Given a set of items, each with a weight, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and is as large as possible.
-
optimal_weight
(capacity, weights)[source]¶ Function calculates the optimal weight for a rucksack from a given list of weights
Parameters: - capacity – max capacity of the rucksack
- weights – list of weights
Returns: maximum achievable weight that does not exceed the capacity
Examples
>>> optimal_weight(165, [23, 31, 29, 44, 53, 38, 63, 85, 89, 82])
165
algorithms.hash_tables¶
Hash Chain¶
A hash chain is the successive application of a cryptographic hash function to a piece of data. In computer security, a hash chain is a method to produce many one-time keys from a single key or password. [Wikipedia]
-
class
HashChain
(bucket_count)[source]¶ Bases:
object
Implementation of a hash table with chaining
Examples
>>> hash_chain = HashChain(5)
>>> hash_chain.add("world")
>>> hash_chain.add("HellO")
>>> hash_chain.check(4)
HellO world
>>> hash_chain.find("World")
no
>>> hash_chain.find("world")
yes
>>> hash_chain.delete("world")
>>> hash_chain.check(4)
HellO
>>> hash_chain.delete("HellO")
>>> hash_chain.add("luck")
>>> hash_chain.add("GooD")
>>> hash_chain.check(2)
GooD luck
Explanation: The ASCII code of ’w’ is 119, for ’o’ it is 111, for ’r’ it is 114, for ’l’ it is 108, and for ’d’ it is 100. Thus, h(“world”) = 4. It turns out that the hash value of “HellO“ is also 4. We always insert in the beginning of the chain, so after adding “world” and then “HellO” in the same chain index 4, first goes “HellO” and then goes “world”. Of course, “World” is not found, and “world” is found, because the strings are case-sensitive, and the codes of ’W’ and ’w’ are different. After deleting “world”, only “HellO” is found in the chain 4. Similarly to “world” and “HellO”, after adding “luck” and “GooD” to the same chain 2, first goes “GooD” and then “luck”.