Academic Integrity: tutoring, explanations, and feedback — we don’t complete graded work or submit on a student’s behalf.

Parallel computing: can you help me solve problems step by step? I am learning

ID: 3856031 • Letter: P

Question

Parallel computing: can you help me solve these problems step by step? I am learning as I go.

1-7. A cluster of 32 computers is being set up. The server node has two Ethernet connections, one to the Internet and one to the cluster. The Internet IP address is 216.123.0.0. Devise an IP address assignment for the cluster using Class C format.

1-8. A company is proposing an IPv8 format using 512 bits. Do you think this is justified? Explain.

1-9. It is possible to physically construct a system that is a hybrid of a message-passing multicomputer and a shared-memory multiprocessor. Write a report on how this might be achieved and on its relative advantages over a pure message-passing system and a pure shared-memory system.

Explanation / Answer

A Presentation on Parallel Computing - Ameya Waghmare (Rno 41, BE CSE). Guided by Dr. R. P. Adgaonkar (HOD), CSE Dept.
3.
  • Parallel computing is a form of computation in which many instructions are carried out simultaneously, operating on the principle that large problems can often be divided into smaller ones, which are then solved concurrently (in parallel).
  • Why is it required?
4.
  • With the increased use of computers in every sphere of human activity, computer scientists face two crucial issues today:
  • Processing has to be done faster than ever before.
  • Larger and more complex computational problems need to be solved.
5.
  • Simply increasing the number of transistors as per Moore’s Law isn’t a solution, as the accompanying frequency scaling drives up power consumption.
  • Power consumption has been a major issue recently, as it causes processor heating.
  • The perfect solution is PARALLELISM
  • In hardware as well as software.
6. Difference With Distributed Computing
  • When different processors/computers work on a single common goal, it is parallel computing.
  • E.g., ten men pulling one rope to lift a single rock; supercomputers implement parallel computing.
  • Distributed computing is where several different computers work separately on a multi-faceted computing workload.
  • E.g., ten men pulling ten ropes to lift ten different rocks; employees in an office each doing their own work.
7. Difference With Cluster Computing
  • A computer cluster is a group of linked computers, working together closely so that in many respects they form a single computer.
  • E.g., in an office of 50 employees, a group of 15 does one task, 25 another, and the remaining 10 something else.
  • Similarly, in a network of 20 computers, 16 work on a common goal, whereas 4 work on some other common goal.
  • Cluster Computing is a specific case of parallel computing.
8. Difference With Grid Computing
  • Grid Computing makes use of computers communicating over the Internet to work on a given problem.
  • E.g., when three people, one from the USA, another from Japan, and a third from Norway, work together online on a common project.
  • Sites such as Wikipedia, Yahoo! Answers, YouTube, and Flickr, or open-source projects like Linux, are often cited as examples of this kind of Internet-scale collaboration.
  • Again, it serves as an example of parallel computing.
9. The Concept Of Pipelining
  • In computing, a pipeline is a set of data processing elements connected in series, so that the output of one element is the input of the next one. The elements of a pipeline are often executed in parallel or in time-sliced fashion; in that case, some amount of buffer storage is often inserted between elements.
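The pipelining idea above can be sketched in a few lines of Python, with each stage consuming the previous stage's output the way hardware pipeline stages do. This is a minimal illustrative sketch; the stage names (produce, square, accumulate) are made up for the example and not from the slides:

```python
def produce(n):
    # Stage 1: emit the numbers 0..n-1, one at a time
    for i in range(n):
        yield i

def square(items):
    # Stage 2: transform each item as it arrives from the previous stage
    for x in items:
        yield x * x

def accumulate(items):
    # Stage 3: consume the stream and reduce it to a single value
    total = 0
    for x in items:
        total += x
    return total

# Chain the stages: output of one is the input of the next
result = accumulate(square(produce(5)))
print(result)  # 0 + 1 + 4 + 9 + 16 = 30
```

Because generators are lazy, each item flows through all three stages before the next one is produced, which mirrors how a hardware pipeline overlaps work on successive items.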
10. Approaches To Parallel Computing
  • Flynn’s Taxonomy
  • SISD (Single Instruction, Single Data)
  • SIMD (Single Instruction, Multiple Data)
  • MISD (Multiple Instruction, Single Data)
  • MIMD (Multiple Instruction, Multiple Data)
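As a rough illustration of the SISD vs. SIMD distinction in the taxonomy above, the sketch below contrasts processing one data pair per step with applying one logical operation to whole vectors. This is only a conceptual analogy in plain Python; real SIMD hardware executes the vector operation as a single instruction:

```python
a = [1, 2, 3, 4]
b = [10, 20, 30, 40]

# SISD style: one instruction stream, one data item handled per step
sisd = []
for i in range(len(a)):
    sisd.append(a[i] + b[i])

# SIMD style (conceptually): a single "add" applied across whole vectors;
# the comprehension stands in for a vector unit doing this in one instruction
simd = [x + y for x, y in zip(a, b)]

print(sisd == simd)  # True: same result, different execution model
```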
11. Approaches Based On Computation
  • Massively Parallel
  • Embarrassingly Parallel
  • Grand Challenge Problems
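An "embarrassingly parallel" workload from the list above is one whose tasks share no data, so they can be farmed out with no coordination between workers. A minimal sketch using Python's standard multiprocessing module; the work function is illustrative:

```python
from multiprocessing import Pool

def work(x):
    # Each task is fully independent of the others: no shared state,
    # no communication, which is what makes it "embarrassing" to parallelize
    return x * x

if __name__ == "__main__":
    with Pool(4) as pool:  # 4 worker processes
        results = pool.map(work, range(8))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49]
```

Because the tasks never interact, speedup scales almost linearly with the number of workers, unlike workloads that require synchronization or data exchange.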
12. Massively Parallel Systems
  • It signifies the presence of many independent units, or entire microprocessors, that run in parallel.
  • The term massive connotes hundreds if not thousands of such units.