
17.2 The accounting method

17.2-1

Suppose we perform a sequence of stack operations on a stack whose size never exceeds \(k\). After every \(k\) operations, we make a copy of the entire stack for backup purposes. Show that the cost of \(n\) stack operations, including copying the stack, is \(O(n)\) by assigning suitable amortized costs to the various stack operations.

For every stack operation, we charge twice.

  • First, we charge the actual cost of the stack operation.
  • Second, we charge the cost of copying an element later on.

Since the size of the stack never exceeds \(k\) and there are \(k\) operations between consecutive backups, the credit accumulated since the last backup is always enough to pay for copying the at most \(k\) elements currently on the stack. Therefore, the amortized cost of each operation is constant, and the cost of the \(n\) operations, including the copying, is \(O(n)\).
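
To make the bookkeeping concrete, here is a minimal simulation sketch, assuming for simplicity that each operation is a unit-cost \(\text{PUSH}\) or \(\text{POP}\); the `simulate` helper and the random operation mix are illustrative choices, not part of the exercise. It counts the actual work, including the backup copies made after every \(k\) operations, and checks that it never exceeds \(2n\), the total amortized charge.

```python
import random

def simulate(n, k, seed=0):
    """Simulate n unit-cost stack operations on a stack whose size never
    exceeds k, copying the entire stack after every k operations.
    Returns the total actual cost, including the copying."""
    random.seed(seed)
    stack = []
    total = 0
    for i in range(1, n + 1):
        # Force a POP when the stack is full and a PUSH when it is empty.
        if stack and (len(stack) == k or random.random() < 0.5):
            stack.pop()          # POP: actual cost 1
        else:
            stack.append(i)      # PUSH: actual cost 1
        total += 1
        if i % k == 0:
            total += len(stack)  # backup: copy at most k elements
    return total

# Charging $2 per operation ($1 for the operation itself, $1 banked toward
# the next backup) means the total cost should never exceed 2n.
for k in (1, 5, 50):
    n = 10_000
    cost = simulate(n, k)
    assert cost <= 2 * n
    print(f"k = {k}: total cost {cost} <= 2n = {2 * n}")
```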

17.2-2

Redo Exercise 17.1-3 using an accounting method of analysis.

Let \(c_i =\) cost of \(i\)th operation.

\[ c_i = \begin{cases} i & \text{if $i$ is an exact power of $2$}, \\\\ 1 & \text{otherwise}. \end{cases} \]

Charge \(\$3\) (the amortized cost \(\hat c_i\)) for each operation.

  • If \(i\) is not an exact power of \(2\), pay \(\$1\), and store \(\$2\) as credit.
  • If \(i\) is an exact power of \(2\), pay \(\$i\), using the \(\$3\) charged for this operation together with previously stored credit.
\[
\begin{array}{cccc}
\text{Operation} & \text{Amortized cost} & \text{Actual cost} & \text{Credit remaining} \\\\
\hline
1  & 3 & 1 & 2  \\\\
2  & 3 & 2 & 3  \\\\
3  & 3 & 1 & 5  \\\\
4  & 3 & 4 & 4  \\\\
5  & 3 & 1 & 6  \\\\
6  & 3 & 1 & 8  \\\\
7  & 3 & 1 & 10 \\\\
8  & 3 & 8 & 5  \\\\
9  & 3 & 1 & 7  \\\\
10 & 3 & 1 & 9  \\\\
\vdots & \vdots & \vdots & \vdots
\end{array}
\]

Since the amortized cost is \(\$3\) per operation, \(\sum\limits_{i = 1}^n \hat c_i = 3n\).

We know from Exercise 17.1-3 that \(\sum\limits_{i = 1}^n c_i < 3n\).

Then we have

\[\sum_{i = 1}^n \hat c_i \ge \sum_{i = 1}^n c_i \Rightarrow \text{total credit} = \text{total amortized cost} - \text{total actual cost} \ge 0.\]

Since the amortized cost of each operation is \(O(1)\), and the amount of credit never goes negative, the total cost of \(n\) operations is \(O(n)\).
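
The credit argument can also be checked mechanically. The sketch below, with the hypothetical helpers `actual_cost` and `check` (not code from the text), recomputes the credit column of the table above and asserts that the credit never becomes negative when each operation is charged \(\$3\).

```python
def actual_cost(i):
    """Cost of the i-th operation in Exercise 17.1-3:
    i if i is an exact power of 2, and 1 otherwise."""
    return i if i & (i - 1) == 0 else 1

def check(n, amortized=3):
    """Charge `amortized` dollars per operation and verify that the credit
    (total amortized cost minus total actual cost) stays nonnegative."""
    credit = 0
    for i in range(1, n + 1):
        credit += amortized - actual_cost(i)
        assert credit >= 0, f"credit went negative at operation {i}"
    return credit

print(check(10))       # 9, matching the last row of the table above
print(check(1 << 20))  # the credit stays nonnegative for large n as well
```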

17.2-3

Suppose we wish not only to increment a counter but also to reset it to zero (i.e., make all bits in it \(0\)). Counting the time to examine or modify a bit as \(\Theta(1)\), show how to implement a counter as an array of bits so that any sequence of \(n\) \(\text{INCREMENT}\) and \(\text{RESET}\) operations takes time \(O(n)\) on an initially zero counter. (\(\textit{Hint:}\) Keep a pointer to the high-order \(1\).)

(Removed)
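
Although the solution is omitted here, the hint points toward the standard construction: store the index of the high-order \(1\) alongside the bit array, so that \(\text{RESET}\) only needs to zero the bits up to that index. The following is a minimal sketch of that idea; the class name `ResettableCounter`, the field `hi`, and the \(4n\) bound used in the test are illustrative choices, not from the text. It counts every bit modification and checks that a random mix of \(n\) \(\text{INCREMENT}\) and \(\text{RESET}\) operations costs \(O(n)\).

```python
import random

class ResettableCounter:
    """k-bit binary counter with INCREMENT and RESET, keeping a pointer
    `hi` to the position of the high-order 1 (hi == -1 means the value is 0)."""

    def __init__(self, k):
        self.A = [0] * k
        self.hi = -1       # index of the high-order 1
        self.cost = 0      # total number of bit modifications

    def increment(self):
        i = 0
        # Flip trailing 1s to 0, exactly as in the ordinary binary counter.
        while i < len(self.A) and self.A[i] == 1:
            self.A[i] = 0
            self.cost += 1
            i += 1
        if i < len(self.A):
            self.A[i] = 1
            self.cost += 1
            if i > self.hi:
                self.hi = i    # the high-order 1 moved up
        else:
            self.hi = -1       # overflow: the counter wrapped around to 0

    def reset(self):
        # Zero only the positions up to the high-order 1: cost hi + 1.
        for i in range(self.hi + 1):
            self.A[i] = 0
            self.cost += 1
        self.hi = -1

random.seed(0)
c = ResettableCounter(64)
n = 100_000
for _ in range(n):
    if random.random() < 0.1:
        c.reset()
    else:
        c.increment()

# With the potential 2 * (number of 1 bits) + (hi + 1), each operation has
# amortized cost at most 4, so the total cost is at most 4n.
assert c.cost <= 4 * n
print(f"{n} operations cost {c.cost} bit modifications (<= 4n = {4 * n})")
```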