What is Parallel Computing? – Types of Parallel Computing

Parallel computing evolved from serial computing in an attempt to emulate what has always been the state of affairs in the natural world, where many complex, interrelated events happen at the same time, concurrently. For instance: planetary movements, automobile assembly, galaxy formation, and weather and ocean patterns.

Historically, parallel computing has been considered "the high end of computing" and has been used to model difficult scientific, computational, and engineering problems.

In computing, the technique of solving a computational task by using multiple resources of different types simultaneously is called parallel computing. It breaks a large problem down into smaller ones, which are solved concurrently.
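A minimal sketch of the idea above: break one large problem (summing a big list) into smaller chunks, solve the chunks concurrently, then combine the partial results. The chunk size and the use of threads here are illustrative assumptions; CPU-bound work in Python would typically use processes instead of threads to get real speedup.

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    """Solve one small sub-problem: sum a slice of the data."""
    return sum(chunk)

data = list(range(1_000_000))
# Decompose the large problem into four smaller ones.
chunks = [data[i:i + 250_000] for i in range(0, len(data), 250_000)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(chunk_sum, chunks))  # sub-problems run concurrently

total = sum(partials)  # combine the partial results
```

The final combining step is serial; the next sections on pros and cons touch on why such serial portions and coordination matter.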


A parallel system is one in which two or more parts of a single program operate concurrently on multiple processors.

Parallel computing has become the dominant paradigm in computer architecture, and parallel computers can be classified according to the level at which their hardware supports parallelism. The computational resources may include:

  • A single computer with multiple processors.
  • A variable number of computers connected by a network.
  • A combination of both.

Types of Parallel Computing:

There are several types of parallel computing in use worldwide:

  1. Bit-level parallelism.
  2. Instruction-level parallelism.
  3. Task parallelism.

Bit-level Parallelism:

It is a form of parallelism based on increasing the processor's word size. A larger word size reduces the number of instructions the system must execute to perform an operation on variables that are larger than the word size.
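The effect can be sketched in software. Assuming a hypothetical 16-bit machine, adding two 32-bit values takes two add steps (low halves, then high halves plus the carry), whereas a 32-bit machine does the same work in a single instruction; the function below simulates the 16-bit case.

```python
MASK16 = 0xFFFF  # a 16-bit word

def add32_on_16bit(a, b):
    """Add two 32-bit integers using only 16-bit operations.

    Step 1: add the low 16-bit halves and note the carry.
    Step 2: add the high halves plus the carry.
    A 32-bit processor replaces both steps with one add.
    """
    lo = (a & MASK16) + (b & MASK16)
    carry = lo >> 16
    hi = ((a >> 16) + (b >> 16) + carry) & MASK16
    return (hi << 16) | (lo & MASK16)

print(hex(add32_on_16bit(0x0001FFFF, 0x00000001)))  # 0x20000
```

Doubling the word size halves the instruction count for such wide operations, which is exactly the gain bit-level parallelism describes.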

Instruction-level Parallelism:

It is a form of parallel computing that measures how many instructions a processor can carry out at the same time during a single clock cycle. Hardware techniques that exploit it include:

  1. Instruction pipelining.
  2. Out-of-order execution.
  3. Register renaming.
  4. Speculative execution.
  5. Branch prediction.

Task Parallelism:

Task parallelism is a form of parallelization in which different processors run different tasks of the same program concurrently, on the same or different data. It is also called function parallelism.
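A minimal sketch of task parallelism, assuming Python's standard thread pool: two *different* functions run concurrently over the same data, in contrast to the data decomposition shown earlier where the same function ran over different chunks.

```python
from concurrent.futures import ThreadPoolExecutor

def total(data):
    """Task 1: compute the sum."""
    return sum(data)

def largest(data):
    """Task 2: find the maximum."""
    return max(data)

data = list(range(1, 101))
with ThreadPoolExecutor(max_workers=2) as pool:
    f1 = pool.submit(total, data)    # task 1 runs on one worker
    f2 = pool.submit(largest, data)  # task 2 runs on another, concurrently
    print(f1.result(), f2.result())  # 5050 100
```

The same structure works with ProcessPoolExecutor when the tasks are CPU-bound and should run on separate processors.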

Applications of Parallel Computing:

This decomposition technique is used in applications that require processing large amounts of data in sophisticated ways. For example:

  1. Databases and data mining.
  2. Networked video and multimedia technologies.
  3. Medical imaging and diagnosis.
  4. Advanced graphics and virtual reality.
  5. Collaborative work environments.

Pros of Parallel Computing:

    • Save time/money: Using more resources in parallel shortens task completion time, with potential cost savings. Parallel clusters can be constructed from cheap commodity components.
    • Solve larger problems: Many problems are so complex and large that they are impractical to solve on a single computer, especially one with limited memory. For example: 1. Problems requiring petaFLOPS of compute and petabytes of data. 2. Web search engines and database systems processing millions of transactions per second.
    • Provide concurrency: Multiple computing resources can do many things simultaneously, compared with a single compute resource. For example, the Access Grid provides a virtual global collaboration network.
    • Use non-local resources: Computing resources across a network can be used when local resources are scarce.
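How much time parallelism can actually save is bounded by the fraction of the program that can be parallelized. The sketch below uses Amdahl's law, a standard result not discussed in this article, to make the "save time" claim concrete: the serial fraction limits speedup no matter how many processors are added.

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Amdahl's law: overall speedup when only parallel_fraction
    of the work can be spread across n_processors; the rest
    stays serial."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_processors)

# 95% parallelizable code on 8 processors: far less than 8x.
print(round(amdahl_speedup(0.95, 8), 2))
```

Even with thousands of processors, a program that is 95% parallelizable can never run more than 20x faster, which is why the cons below — communication, synchronization, and serial bottlenecks — matter so much in practice.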

Cons of Parallel Computing:

  • Transmission speed: Transmission speeds are relatively low, since they depend on how fast data can move through hardware. Transmission-medium limitations (for example, copper wire carries signals at roughly 9 cm/nanosecond) keep data transmission slow.
  • Difficult programming: Writing algorithms and computer programs that support parallel computing is difficult, as it requires coordinating many interacting instructions. Only programmers with sufficient expertise can code such programs well.
  • Communication and synchronization: Communication and synchronization between subtasks are typically among the greatest obstacles to good parallel program performance.
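The synchronization cost named above can be seen in even the smallest shared-state program. A sketch using Python's standard threading module: four threads increment one shared counter, and every lock acquire/release is coordination overhead that serial code never pays — yet without the lock the result would be unreliable.

```python
import threading

counter = 0
lock = threading.Lock()

def add_many(n):
    """Each thread adds to the shared counter n times."""
    global counter
    for _ in range(n):
        with lock:        # synchronization point: one thread at a time
            counter += 1

threads = [threading.Thread(target=add_many, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()              # wait for all subtasks to finish

print(counter)  # 40000 — correct only because of the lock
```

The 40,000 lock operations here buy correctness at the price of serializing the updates, which is exactly the communication/synchronization trade-off this con describes.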

Future of Parallel Computing:

Parallel computing is expected to lead to other major changes in the industry. Major companies like Intel Corp. and Advanced Micro Devices Inc. have already integrated four processors onto a single chip. What is needed now is a matching breakthrough in software technology, and the race for results in parallel computing is in full swing. Another great challenge is writing software that divides a program's work into chunks across processors; this may require new programming languages and revolutionize how every piece of software is written. Parallel computing may change the way computers work in the future, and how we use them for work and play.