Commit 7b41700

index all tables

1 parent 780b28a · commit 7b41700

12 files changed: +15 −14 lines
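Every change in this commit follows the same pattern: the generic `(((table)))` index term preceding each table is replaced with a descriptive term under a shared `Tables` primary entry, so the generated back-of-book index groups all of the book's tables in one place. In AsciiDoc, triple parentheses declare a concealed index term of the form `(((primary, secondary)))`. A minimal before/after sketch of the pattern (the table body is illustrative, drawn from the HashMap chapter quoted below, not copied from the commit):

```asciidoc
// Before: every table shares the same flat index entry "table"
(((table)))
.Time complexity for a Hash Map
|===
| Operation | Time Complexity

| Search by key | O(1)
| Search by value | O(n)
|===

// After: one primary entry "Tables" with a descriptive secondary entry,
// so the index lists every table under "Tables"
(((Tables, HashMap complexities)))
.Time complexity for a Hash Map
|===
| Operation | Time Complexity

| Search by key | O(1)
| Search by value | O(n)
|===
```

The concealed form renders nothing inline; it only emits entries when the book is built with an index-capable backend such as the PDF converter.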

book/chapters/algorithms-analysis.adoc

Lines changed: 3 additions & 2 deletions

@@ -47,7 +47,7 @@ TIP: Algorithms are instructions on how to perform a task.
 Not all algorithms are created equal. There are "good" and "bad" algorithms. The good ones are fast; the bad ones are slow. Slow algorithms cost more money to run. Inefficient algorithms could make some calculations impossible in our lifespan!
 
 To give you a clearer picture of how different algorithms perform as the input size grows, take a look at the following problems and how their relative execution time changes as the input size increases.
-(((table)))
+(((Tables, Algorithms input size vs Time)))
 
 .Relationship between algorithm input size and time taken to complete
 [cols=",,,,,",options="header",]
@@ -111,7 +111,7 @@ When we are comparing algorithms, we don't want to have complex expressions. Wha
 TIP: Asymptotic analysis describes the behavior of functions as their inputs approach to infinity.
 
 In the previous example, we analyzed `getMin` with an array of size 3; what happen size is 10 or 10k or a million?
-(((table)))
+(((Tables, Operations of 3n+3)))
 
 .Operations performed by an algorithm with a time complexity of `3n + 3`
 [cols=",,",options="header",]
@@ -153,6 +153,7 @@ There are many common notations like polynomial, _O(n^2^)_ like we saw in the `g
 
 Again, time complexity is not a direct measure of how long a program takes to execute but rather how many operations it performs in given the input size. Nevertheless, there’s a relationship between time complexity and clock time as we can see in the following table.
 
+(((Tables, Input size vs clock time by Big O)))
 .How long an algorithm takes to run based on their time complexity and input size
 [cols=",,,,,,",options="header",]
 |===============================================================

book/chapters/array.adoc

Lines changed: 2 additions & 1 deletion

@@ -203,7 +203,7 @@ Runtime: O(1).
 == Array Complexity
 
 To sum up, the time complexity on an array is:
-(((table)))
+(((Tables, Array Complexities)))
 
 // tag::table[]
 .Time/Space complexity for the array operations
@@ -217,6 +217,7 @@ To sum up, the time complexity on an array is:
 (((Constant)))
 (((Runtime, Constant)))
 
+(((Tables, JavaScript Array buit-in operations Complexities)))
 .Array Operations timex complexity
 |===
 | Operation | Time Complexity | Usage

book/chapters/big-o-examples.adoc

Lines changed: 1 addition & 1 deletion

@@ -245,7 +245,7 @@ Factorial start very slow and then it quickly becomes uncontrollable. A word siz
 == Summary
 
 We went through 8 of the most common time complexities and provided examples for each of them. Hopefully, this will give you a toolbox to analyze algorithms.
-(((table)))
+(((Tables, Common time complexities and examples)))
 
 // tag::table[]
 .Most common algorithmic running times and their examples

book/chapters/graph.adoc

Lines changed: 1 addition & 2 deletions

@@ -283,8 +283,7 @@ include::{codedir}/data-structures/graphs/node.js[tag=removeAdjacent, indent=0]
 ----
 
 == Graph Complexity
-
-(((table)))
+(((Tables, Graph adjacency matrix/list complexities)))
 
 // tag::table[]
 .Time complexity for a Graph data structure

book/chapters/linear-data-structures-outro.adoc

Lines changed: 1 addition & 1 deletion

@@ -18,7 +18,7 @@ In this part of the book, we explored the most used linear data structures such
 .Use a Stack when:
 * You need to access your data as last-in, first-out (LIFO).
 * You need to implement a <<Depth-First Search for Binary Tree, Depth-First Search>>
-(((table)))
+(((Tables, Array/Lists/Stack/Queue complexities)))
 
 // tag::table[]
 .Time/Space Complexity of Linear Data Structures (Array, LinkedList, Stack & Queues)

book/chapters/linked-list.adoc

Lines changed: 1 addition & 1 deletion

@@ -233,7 +233,7 @@ Notice that we are using the `get` method to get the node at the current positio
 == Linked List Complexity vs. Array Complexity
 
 So far, we have seen two liner data structures with different use cases. Here’s a summary:
-(((table)))
+(((Tables, Array/Lists complexities)))
 
 // tag::table[]
 .Big O cheat sheet for Linked List and Array

book/chapters/map-hashmap-vs-treemap.adoc

Lines changed: 1 addition & 1 deletion

@@ -15,7 +15,7 @@
 == TreeMap Time complexity vs HashMap
 
 As we discussed so far, there is a trade-off between the implementations.
-(((table)))
+(((Tables, HashMap/TreeMap complexities)))
 
 // tag::table[]
 .Time complexity for different Maps implementations

book/chapters/map-hashmap.adoc

Lines changed: 1 addition & 1 deletion

@@ -287,7 +287,7 @@ https://github.com/amejiarosario/dsa.js/blob/7694c20d13f6c53457ee24fbdfd3c0ac571
 == HashMap time complexity
 
 Hash Map it’s very optimal for searching values by key in constant time *O(1)*. However, searching by value is not any better than an array since we have to visit every value *O(n)*.
-(((table)))
+(((Tables, HashMap complexities)))
 
 // tag::table[]
 .Time complexity for a Hash Map

book/chapters/queue.adoc

Lines changed: 1 addition & 1 deletion

@@ -63,7 +63,7 @@ You can see that the items are dequeue in the same order they were added, FIFO (
 == Queue Complexity
 
 As an experiment, we can see in the following table that if we had implemented the Queue using an array, its enqueue time would be _O(n)_ instead of _O(1)_. Check it out:
-(((table)))
+(((Tables, Queue complexities)))
 
 // tag::table[]
 .Time/Space complexity for queue operations

book/chapters/set.adoc

Lines changed: 1 addition & 1 deletion

@@ -205,7 +205,7 @@ This method has an average runtime of *O(1)*.
 We can say that `HashMap` in on average more performant O(1) vs. O(log n). However, if a
 rehash happens, it will take *O(n)* instead of *O(1)*. A `TreeSet` is always *O(log n)*.
 
-(((table)))
+(((Tables, HashSet/TreeSet complexities)))
 
 // tag::table[]
 .Time complexity HashSet vs TreeSet
