The statistical definition of entropy
Question
The statistical definition of entropy. Digital information in the form of bits (represented as 0s and 1s) is processed by a computer chip by rapidly switching the electronic charge in numerous transistors between two states representing “0” and “1”. Thanks to miniaturization, modern computer processors contain about 10⁹ (1 billion) transistors that operate at a rate of 3 GHz (3 billion state changes every second) while consuming only 15 W of power.
a. Assuming every transistor in this processor changes state every cycle, how much energy is consumed per transistor per operation? Express your answer in units of J/bit.
b. Using the value of the Boltzmann constant k_B = 1.38×10⁻²³ J/K, calculate the entropy S = k_B ln W of a system with W = 2 microstates. The quantity you will calculate is known as the dissipation associated with saving or erasing a single bit of information.
c. At T = 300 K, how much heat is associated with this dissipation? Compare to the energy you found in Part a.
d. An advanced civilization might be able to build an ideal computer processor with molecular-scale transistors that consume a minimum amount of energy per operation equal to the heat you found in Part c. How much power would this ideal computer consume if it had identical specs to the modern non-ideal computer (10⁹ transistors and 3 GHz)?
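As a study aid (not a substitute for working through the parts yourself), the arithmetic for parts a–d can be sketched in Python. Every number below comes directly from the problem statement; the variable names are our own:

```python
import math

# Given specs from the problem statement
N = 1e9         # number of transistors
f = 3e9         # state changes per second (3 GHz)
P = 15.0        # power consumption, W
k_B = 1.38e-23  # Boltzmann constant, J/K
T = 300.0       # temperature, K

# (a) Energy per transistor per operation: total power divided by
#     (transistors x operations per second)
e_per_bit = P / (N * f)   # = 5e-18 J/bit

# (b) Statistical entropy of a two-microstate system: S = k_B ln W with W = 2
S = k_B * math.log(2)     # ~ 9.57e-24 J/K

# (c) Heat dissipated at T = 300 K (the Landauer limit): Q = T * S
Q = T * S                 # ~ 2.87e-21 J

# (d) Power of an ideal processor with identical specs
P_ideal = Q * N * f       # ~ 8.6e-3 W, i.e. a few milliwatts
```

Comparing parts a and c: the real chip spends about 5×10⁻¹⁸ J per bit, roughly 1700 times the Landauer minimum of about 2.9×10⁻²¹ J, which is why the ideal processor in part d needs only milliwatts instead of 15 W.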
Explanation / Answer
Gordon Moore himself stated during an interview on September 18, 2007, at Intel’s twice-yearly technical conference, that we will soon be bumping against the laws of physics: “another decade, a decade and a half I think we’ll hit something fairly fundamental.”
Since this involves a physics limit (in his words), he went on to quote Stephen Hawking during his visit to Intel in 2005. “When Stephen Hawking was asked what are the fundamental limits to microelectronics, he said the speed of light and the atomic nature of matter” [9]. Determining an ultimate physics limit to Moore’s Law would mark out a future boundary to electronics miniaturization.
A calculation of the quantum limit to Moore’s Law was conducted by writing Moore’s Law in equation form as [5]

n(y) = n(y₀) · 2^((y − y₀)/T),   (1)

where n(y) is the transistor count in year y and T is the doubling period (roughly two years).
This equation predicts the number of transistors or equivalent computing power in any given year from the number of transistors in any other earlier year [5].
From the definition of Moore’s Law, we know that the characteristic dimension or length l of a transistor is inversely proportional to the number of transistors n on an IC. If n is measured in “number per meter,” then, from dimensional analysis, l = 1/n is measured in meters (m), consistent with (1).
We can then rewrite (1) as

l(y) = l(y₀) · 2^(−(y − y₀)/T).   (2)
The characteristic dimension of an electron from Heisenberg uncertainty is the Compton wavelength [10], λ_C = h/(m_e c) ≈ 2.43×10⁻¹² m, based on Planck’s constant h, the mass of the electron m_e, and the speed of light c.
The Compton wavelength of the electron is the fundamental limit to measuring its position based on quantum mechanics and special relativity, or the length scale where a relativistic quantum field theory is necessary for an adequate description [11]. The Compton wavelength is therefore the fundamental boundary to determining the position (or spin) of a particle, which satisfies Stephen Hawking’s prediction that this limit would be based on the speed of light and the atomic nature of matter, since λ_C is determined by h, m_e, and c [5]. Rewriting (2) using the year 2008, its available transistor feature size, and the Compton wavelength λ_C ≈ 2.43×10⁻¹² m (0.00243 nm), and then solving for the exponent using the natural logarithm, yields the limit year.
This is the quantum limit year predicted by Moore’s Law if electrons were implemented as the smallest quantum computing transistor elements [5].
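The calculation described above can be sketched numerically. Note that the 2008 feature size (45 nm) and the halving period of the feature length (2 years) used below are illustrative assumptions on our part, not values taken from [5]; only the Compton wavelength is a physical constant:

```python
import math

# Illustrative assumptions (not from [5]): 2008 feature size and halving period
l_2008 = 45e-9       # m, assumed transistor feature size in 2008
T_half = 2.0         # years per halving of the feature length (assumed)
lambda_C = 2.43e-12  # m, electron Compton wavelength (physical constant)

# Solve lambda_C = l_2008 * 2**(-(y - 2008)/T_half) for the year y
y_limit = 2008 + T_half * math.log(l_2008 / lambda_C) / math.log(2)
print(round(y_limit))  # -> 2036 under these assumptions
```

The result is sensitive to both assumed inputs; a different feature size or scaling period shifts the predicted year accordingly.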
3.5. The Economic Limit to Moore’s Law
Higher component density has led to a decrease in end-consumer prices. The costs for producers, however, follow the converse trend: research and development, manufacturing, and testing become more and more expensive with each new generation. This observation is known as Rock’s Law and is sometimes also referred to as Moore’s Second Law [12]; fabrication facility (fab) costs likewise grow exponentially. Despite this exponential growth of facility costs, the cost per shipped unit decreases at an exponential rate. Karl Rupp first investigated economic limitations to the semiconductor business; a summary of those results has already been published in [13]. Rupp then found that if costs for a single fab are at most 0.02% of the gross world product (GWP), a reduced growth of transistor counts per chip for economic reasons is likely to happen around 2020, as shown in Figure 3.
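Rock’s Law is commonly stated as fab costs doubling roughly every four years. With that rule and purely illustrative base numbers (the 1995 fab cost and the GWP figure below are assumptions, not data from [13], so the resulting year will differ from the ~2020 estimate in the text), the crossover against a 0.02%-of-GWP budget can be sketched as:

```python
import math

# Illustrative assumptions, not data from [13]
fab_cost_1995 = 1e9     # USD, assumed cost of a fab in 1995
gwp = 80e12             # USD, assumed gross world product
threshold = 2e-4 * gwp  # 0.02% of GWP

# Rock's Law: fab cost doubles roughly every 4 years.
# Solve fab_cost_1995 * 2**((y - 1995)/4) = threshold for y
y_cross = 1995 + 4 * math.log(threshold / fab_cost_1995) / math.log(2)
```

The actual year in [13] depends on the real fab-cost and GWP series, which this sketch does not reproduce.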