Although the origins of parallel computing go back to the last century, it was only in the 1970s that parallel and vector computers became available to the scientific community. The first of these machines, the 64-processor Illiac IV and the vector computers built by Texas Instruments, Control Data Corporation, and then Cray Research, had a somewhat limited impact. They were few in number and available mostly to workers in a few government laboratories. By now, however, the trickle has become a flood. There are over 200 large-scale vector computers now installed, not only in government laboratories but also in universities and in an increasing diversity of industries. Moreover, the National Science Foundation's Supercomputing Centers have made large vector computers widely available to the academic community. In addition, smaller, very cost-effective vector computers are being manufactured by a number of companies. Parallelism in computers has also progressed rapidly. The largest supercomputers now consist of several vector processors working in parallel.
Although the number of processors in such machines is still relatively small (up to 8), it is expected that more processors will be added in the near future (to a total of 16 or 32). Moreover, there are myriad research projects to build machines with hundreds, thousands, or even more processors. Indeed, several companies are now selling parallel machines, some with hundreds or even tens of thousands of processors.
Contents:
1. Introduction
2. Direct Methods for Linear Equations
3. Iterative Methods for Linear Equations
Appendix 2. Convergence of Iterative Methods
Appendix 3. The Conjugate Gradient Algorithm
Appendix 4. Basic Linear Algebra
Series: Frontiers in Computer Science
Number of Pages: 305
Published: 30th April 1988
Publisher: Springer Science+Business Media
Country of Publication: US
Dimensions (cm): 23.4 x 15.6
Weight (kg): 1.39