Sans Pareil Technologies, Inc.

Key To Your Business

Lesson 16 - Stacks and Queues


A stack is a basic data structure that can be logically thought of as a linear structure, like a real physical stack or pile, in which insertion and deletion of items take place at one end, called the top of the stack. The basic concept can be illustrated by thinking of your data set as a stack of plates or books: you can only take the top item off the stack in order to remove things from it. This structure is used throughout programming.

A stack is also called a LIFO (Last In, First Out) structure, after the order in which it accesses its data; as we will see, there are several variations on the basic stack.

There are three main operations that can be performed on stacks. They are:
  1. inserting an item onto the stack (push).
  2. removing the top item from the stack (pop).
  3. examining the top item of the stack without removing it (peek).
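The three operations can be sketched as follows. This is a minimal illustration in Python (the lesson does not prescribe a language), using a built-in list as the underlying storage with the end of the list serving as the top:

```python
class Stack:
    """Minimal LIFO stack backed by a Python list."""

    def __init__(self):
        self._items = []

    def push(self, item):
        """Insert an item at the top of the stack."""
        self._items.append(item)

    def pop(self):
        """Remove and return the top item; raises IndexError if empty."""
        return self._items.pop()

    def peek(self):
        """Return the top item without removing it."""
        return self._items[-1]

    def is_empty(self):
        return not self._items


s = Stack()
s.push(1)
s.push(2)
s.push(3)
print(s.peek())  # 3 (top item, not removed)
print(s.pop())   # 3
print(s.pop())   # 2
```

Note that the last item pushed is always the first one popped, which is exactly the LIFO behaviour described above.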

Considered as a linear data structure, or more abstractly a sequential collection, the push and pop operations occur only at one end of the structure, referred to as the top of the stack. This makes it possible to implement a stack with a singly linked list and a pointer to the top element.

A stack may be implemented to have a bounded capacity. If the stack is full and does not contain enough space to accept an entity to be pushed, the stack is then considered to be in an overflow state. Conversely, attempting to pop from an empty stack is an underflow.

A stack can be easily implemented using an array, a vector, or a linked list. What identifies the data structure as a stack in either case is not the implementation but the interface: the user is only allowed to push items onto and pop items off the underlying array or linked list, with few other helper operations.
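As a sketch of the linked-list approach, here is a Python version that keeps only a pointer to the top node; push and pop both work at that end, so both run in O(1) time (the class and method names are illustrative, not prescribed by the lesson):

```python
class Node:
    """A singly linked list node."""

    def __init__(self, value, next=None):
        self.value = value
        self.next = next


class LinkedStack:
    """Stack as a singly linked list plus a pointer to the top element."""

    def __init__(self):
        self.top = None

    def push(self, value):
        # The new node becomes the top; it links to the previous top.
        self.top = Node(value, self.top)

    def pop(self):
        if self.top is None:
            raise IndexError("pop from empty stack")  # underflow
        value = self.top.value
        self.top = self.top.next
        return value
```

The interface is identical to the array-backed version; only the storage differs, which is the point made above.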


Stacks are commonly used for the following types of problems:
  • Converting a decimal number into a binary number - Repeatedly divide the number by 2, pushing each remainder onto a stack, until the number reaches zero. Pop the remainders off the stack to read the binary digits in correct order.
  • Replace recursive algorithms with iterative ones by using an explicit stack in place of the call stack.
  • Expression evaluation and syntax parsing - Many compilers use a stack for parsing the syntax of expressions, program blocks etc. before translating into low level code.
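The first use case above can be sketched directly. This illustrative Python function pushes the remainders of repeated division by 2 and then pops them to reverse their order:

```python
def to_binary(n):
    """Convert a non-negative integer to a binary string using a stack."""
    if n == 0:
        return "0"
    stack = []
    while n > 0:
        stack.append(n % 2)  # push the remainder of division by 2
        n //= 2
    digits = []
    while stack:
        # Popping reverses the remainders into the correct digit order.
        digits.append(str(stack.pop()))
    return "".join(digits)


print(to_binary(13))  # 1101
```

The remainders are produced least-significant digit first, so the stack's LIFO order is exactly what is needed to emit the digits most-significant first.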


In computer science, a queue is a particular kind of abstract data type or collection in which the entities in the collection are kept in order and the principal (or only) operations on the collection are the addition of entities to the rear terminal position, known as enqueue, and removal of entities from the front terminal position, known as dequeue. This makes the queue a First-In-First-Out (FIFO) data structure. In a FIFO data structure, the first element added to the queue will be the first one to be removed. This is equivalent to the requirement that once a new element is added, all elements that were added before have to be removed before the new element can be removed. Often a peek or front operation is also provided, returning the value of the front element without dequeuing it. A queue is an example of a linear data structure, or more abstractly a sequential collection.
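For illustration, Python's standard library happens to ship a ready-made double-ended queue, `collections.deque`, which can serve as a plain FIFO queue: enqueue at the rear with `append`, dequeue from the front with `popleft`:

```python
from collections import deque

queue = deque()
queue.append("a")       # enqueue at the rear
queue.append("b")
queue.append("c")
print(queue[0])         # peek at the front: a
print(queue.popleft())  # dequeue: a (first in, first out)
print(queue.popleft())  # dequeue: b
```

A plain Python list would also work, but `list.pop(0)` shifts every remaining element and therefore costs O(n) per dequeue, which motivates the efficient implementations discussed below.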

Queues provide services in computer science, transport, and operations research where various entities such as data, objects, persons, or events are stored and held to be processed later. In these contexts, the queue performs the function of a buffer.

A bounded queue is a queue limited to a fixed number of items.

There are several efficient implementations of FIFO queues. An efficient implementation is one that can perform the operations (enqueuing and dequeuing) in O(1) time.
• Linked list
  • A doubly linked list has O(1) insertion and deletion at both ends, so it is a natural choice for queues.
  • A regular singly linked list only has efficient insertion and deletion at one end. However, a small modification (keeping a pointer to the last node in addition to the first one) will enable it to implement an efficient queue.
• A deque implemented using a modified dynamic array
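The singly-linked-list modification mentioned above can be sketched as follows (an illustrative Python version; the names `head` and `tail` are this sketch's, not the lesson's): the extra tail pointer makes enqueue O(1), while dequeue at the head was already O(1).

```python
class _Node:
    def __init__(self, value):
        self.value = value
        self.next = None


class LinkedQueue:
    """FIFO queue: enqueue at the tail, dequeue at the head, both O(1)."""

    def __init__(self):
        self.head = None  # front of the queue
        self.tail = None  # pointer to the last node enables O(1) enqueue

    def enqueue(self, value):
        node = _Node(value)
        if self.tail is None:       # empty queue
            self.head = self.tail = node
        else:
            self.tail.next = node   # link after the current last node
            self.tail = node

    def dequeue(self):
        if self.head is None:
            raise IndexError("dequeue from empty queue")
        value = self.head.value
        self.head = self.head.next
        if self.head is None:       # queue became empty
            self.tail = None
        return value
```

Without the tail pointer, each enqueue would have to walk the whole list to find the last node, degrading it to O(n).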


In computer science, a double-ended queue (abbreviated to deque, and sometimes written dequeue, though that spelling clashes with the queue operation of the same name) is an abstract data type that generalises a queue, for which elements can be added to or removed from either the front (head) or back (tail). It is also often called a head-tail linked list, though properly this refers to a specific data structure implementation of a deque.
This general data class has some possible sub-types:
• An input-restricted deque is one where deletion can be made from both ends, but insertion can be made at one end only.
• An output-restricted deque is one where insertion can be made at both ends, but deletion can be made from one end only.

Queues and stacks, the two basic and most common list types in computing, can be considered specialisations of deques, and can be implemented using deques.
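To illustrate that last point with Python's `collections.deque`: restricting yourself to one end gives a stack, and restricting insertion to one end and removal to the other gives a queue.

```python
from collections import deque

# Stack: push and pop at the same end (LIFO).
stack = deque()
stack.append(1)
stack.append(2)
print(stack.pop())      # 2

# Queue: insert at one end, remove from the other (FIFO).
queue = deque()
queue.append(1)
queue.append(2)
print(queue.popleft())  # 1
```

In each case the deque supports more operations than are used; the discipline imposed by the caller is what makes it behave as a stack or a queue.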


There are at least two common ways to efficiently implement a deque: with a modified dynamic array or with a doubly linked list.

The dynamic array approach uses a variant of a dynamic array that can grow from both ends, sometimes called array deques. These array deques have all the properties of a dynamic array, such as constant-time random access, good locality of reference, and inefficient insertion/removal in the middle, with the addition of amortised constant-time insertion/removal at both ends, instead of just one end. Three common implementations include:
• Storing deque contents in a circular buffer, and only resizing when the buffer becomes full. This decreases the frequency of resize operations.
• Allocating deque contents from the centre of the underlying array, and resizing the underlying array when either end is reached. This approach may require more frequent resizing and waste more space, particularly when elements are only inserted at one end.
• Storing contents in multiple smaller arrays, allocating additional arrays at the beginning or end as needed. Indexing is implemented by keeping a dynamic array containing pointers to each of the smaller arrays.
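The circular-buffer idea in the first bullet can be sketched in Python. This is a simplified, fixed-capacity version (names and the omission of resizing are this sketch's simplifications; a real array deque would grow the buffer when full):

```python
class RingDeque:
    """Fixed-capacity deque over a circular buffer (no resizing)."""

    def __init__(self, capacity):
        self._buf = [None] * capacity
        self._front = 0  # index of the first element
        self._size = 0

    def push_back(self, value):
        if self._size == len(self._buf):
            raise OverflowError("deque full")  # a real deque would resize
        self._buf[(self._front + self._size) % len(self._buf)] = value
        self._size += 1

    def push_front(self, value):
        if self._size == len(self._buf):
            raise OverflowError("deque full")
        # Wrap the front index backwards around the buffer.
        self._front = (self._front - 1) % len(self._buf)
        self._buf[self._front] = value
        self._size += 1

    def pop_front(self):
        if self._size == 0:
            raise IndexError("deque empty")
        value = self._buf[self._front]
        self._front = (self._front + 1) % len(self._buf)
        self._size -= 1
        return value

    def pop_back(self):
        if self._size == 0:
            raise IndexError("deque empty")
        self._size -= 1
        return self._buf[(self._front + self._size) % len(self._buf)]
```

Because indices wrap around with the modulus, no elements are ever shifted; all four end operations are O(1), and a resize is needed only when the buffer actually fills.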


One example where a deque can be used is the A-Steal job scheduling algorithm. This algorithm implements task scheduling for several processors. A separate deque with threads to be executed is maintained for each processor. To execute the next thread, the processor gets the first element from its deque (using the "remove first element" deque operation). If the current thread forks, it is put back at the front of the deque ("insert element at front") and the new thread is executed. When one of the processors finishes execution of its own threads (i.e. its deque is empty), it can "steal" a thread from another processor: it gets the last element from the deque of another processor ("remove last element") and executes it. This work-stealing scheduling approach is used by Intel's Threading Building Blocks (TBB) library for parallel programming.
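The core of the scheme can be sketched in a few lines of Python. This toy simulation (the task names and `next_task` helper are invented for illustration) shows an owner taking from the front of its own deque while an idle processor steals from the back of another's:

```python
from collections import deque

# One deque of pending tasks per processor; processor 1 starts idle.
deques = [deque(["a1", "a2", "a3"]), deque()]


def next_task(pid):
    """Return the next task for processor pid, stealing if its deque is empty."""
    own = deques[pid]
    if own:
        return own.popleft()  # "remove first element" of own deque
    for other in deques:
        if other is not own and other:
            return other.pop()  # steal the "last element" from a victim
    return None  # no work anywhere


print(next_task(1))  # steals a3 from the back of processor 0's deque
```

Taking from opposite ends is the key design point: owners and thieves rarely contend for the same element, which is why the algorithm uses a deque rather than a plain queue or stack.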