Orders of magnitude (computing)
This list compares various amounts of computing power, in instructions per second, organized by order of magnitude.
Order of magnitude
An order of magnitude is a class of scale of any amount, where each class contains values at a fixed ratio to the class preceding it. In its most common usage the ratio is 10, so each class corresponds to a power of ten.
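As a quick illustration of the convention used below, the decimal order of magnitude of a rate is the integer part of its base-10 logarithm. The helper below is a hypothetical sketch written for this article, not part of any listed system:

    import math

    def order_of_magnitude(ops_per_second: float) -> int:
        """Return the decimal order of magnitude, e.g. 92,000 OP/s -> 4."""
        return math.floor(math.log10(ops_per_second))

    # The Intel 4004's roughly 92,000 OP/s falls in the 10^4 class:
    print(order_of_magnitude(92_000))   # 4
    # A 20-petaflop machine such as IBM Sequoia falls in the 10^16 class:
    print(order_of_magnitude(20e15))    # 16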


10⁻¹

Slowest single sentient computation
  • 5×10⁻¹ Speed of the average human mental calculation for multiplication using symbols, circa 2000 B.C.

10⁰

Computation at human real-time speed
  • 1 OP/s, the speed of the average human addition calculation using symbols, circa 2000 B.C.
  • 1 OP/s, the speed of the Zuse Z1, the first fully digital automated computer, 1936
  • 5 OP/s, world record speed for human addition


10¹

Computation faster than human mental reaction begins
  • 2×10¹ Zuse Z3, 1941
  • 6×10¹ Upper end of serialized human perception computation (light bulbs in the US do not flicker to the human observer)


10²

Computation faster than animal perception begins
  • 1.2×10² Estimated serial perception processing rate of a double dorsal brain
  • 2×10² Upper end of serialized human throughput. This is roughly expressed by the lower limit of accurate event placement on small scales of time (the swing of a conductor's arm, the reaction time to lights on a drag strip, etc.)
  • 2×10² IBM 602 electromechanical calculator, 1946

10³

Kilo scale computing
  • 9.2×10⁴ Intel 4004, the first commercially available full-function CPU on a chip, 1971
  • 5×10⁵ Colossus computer, vacuum tube supercomputer, 1943


10⁶

Mega scale computing
  • 1×10⁶ Motorola 68000 commercial computing, 1979
  • 1.2×10⁶ IBM 7030 "Stretch" transistorized supercomputer, 1961

10⁹

Giga scale computing
  • 1×10⁹ ILLIAC IV supercomputer does first computational fluid dynamics problems, 1972
  • 1.354×10⁹ Intel Pentium III commercial computing, 1999
  • 147.6×10⁹ Intel Core i7-980X Extreme Edition ("Gulftown") commercial computing, 2010


10¹²

Tera scale computing
  • 1.34×10¹² Intel ASCI Red supercomputer, 1997
  • 1.344×10¹² GeForce GTX 480 from NVIDIA, 2010
  • 4.64×10¹² Radeon HD 5970 from ATI
  • 5.152×10¹² Tesla S2050/S2070 1U GPU computing system from NVIDIA
  • 100×10¹² Estimated parallelized throughput of the human brain


10¹⁵

Peta scale computing
  • 1.026×10¹⁵ IBM Roadrunner supercomputer, 2008
  • 4.3×10¹⁵ Folding@home distributed computing network, the fastest computing system as of 2009
  • 20×10¹⁵ IBM Sequoia, circa 2011

10¹⁸

Exa scale computing
  • 1×10¹⁸ It is estimated that the need for exascale computing will become pressing around 2018

10²¹

Zetta scale computing
  • 1×10²¹ Accurate global weather estimation on the scale of approximately 2 weeks. Assuming Moore's law remains constant, such systems may be feasible around 2030 (see the extrapolation sketch below)
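As a rough check on these projected dates, one can extrapolate steady performance doubling from a recent machine. The sketch below assumes IBM Sequoia's roughly 20×10¹⁵ OP/s circa 2011 as the baseline and a doubling period of about 14 months (the historical TOP500 trend, somewhat faster than classic Moore's law); both figures are illustrative assumptions rather than values from this list.

    import math

    # Illustrative assumption: supercomputer performance doubles roughly
    # every 14 months, the historical TOP500 trend.
    DOUBLING_YEARS = 14 / 12

    def year_reached(target_ops, current_ops=20e15, current_year=2011):
        """Estimate the year a target OP/s level is reached by steady doubling."""
        doublings = math.log2(target_ops / current_ops)
        return current_year + doublings * DOUBLING_YEARS

    print(round(year_reached(1e18)))  # exascale: ~2018
    print(round(year_reached(1e21)))  # zettascale: ~2029

Both outputs are consistent with the dates quoted above.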

A zettascale computer system could generate more single-precision floating-point data in one second than was stored by any digital means on Earth in the first quarter of 2011.
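The arithmetic behind that claim can be checked under two illustrative assumptions: 4 bytes per single-precision value, and on the order of 10²¹ bytes (about one zettabyte) of worldwide digital storage in early 2011, a commonly cited estimate.

    BYTES_PER_SINGLE = 4                  # bytes in one single-precision float
    ZETTASCALE_OPS = 1e21                 # operations per second

    # Illustrative assumption: ~1 zettabyte stored worldwide in early 2011.
    GLOBAL_STORAGE_2011 = 1e21            # bytes

    bytes_per_second = ZETTASCALE_OPS * BYTES_PER_SINGLE  # 4e21 bytes/s
    print(bytes_per_second / GLOBAL_STORAGE_2011)         # ~4x global storage per second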