


Question

Spatial locality refers to the tendency of execution to involve a number of memory locations that are clustered. Temporal locality refers to the tendency for a processor to access memory locations that have been used recently. (20 pts) Consider the following code:

    for (i = 0; i < 20; i++)
        for (j = 0; j < 10; j++)
            a[i] = a[i] * j;

a. Give one example of spatial locality in the code.

b. Give one example of temporal locality in the code.

c. What are the strategies for exploiting spatial locality and temporal locality?

Explanation / Answer

a) One example of spatial locality in the code:

The array elements a[0] through a[19] are read and written in the sequential order determined by the outer loop, so successive iterations touch adjacent memory locations.
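
To make this concrete, here is a minimal sketch (it assumes an int array a[20], which the question never declares, and a plain C toolchain) that prints where each element touched by the outer loop lives; successive iterations land at adjacent addresses, which is exactly the clustering that spatial locality describes:

    #include <stdio.h>

    int main(void) {
        int a[20] = {0};                    /* assumed element type: int */
        for (int i = 0; i < 20; i++) {
            /* each outer-loop iteration moves to the next element,      */
            /* which sits sizeof(int) bytes after the previous one       */
            printf("a[%2d] is at address %p\n", i, (void *)&a[i]);
        }
        return 0;
    }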

b) One example of temporal locality in the code:

For a given i, the inner loop accesses the same array element a[i] ten times in a row (twenty memory references if the read and the write in each iteration are counted separately). The loop variables i and j are likewise accessed repeatedly on every iteration.
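
To make that reuse explicit, here is a hedged sketch of the same computation with the repeatedly used element hoisted into a local variable (the element type int and the temporary name tmp are illustrative assumptions, not part of the original code); keeping that value in a register or cache line is exactly how the hardware benefits from this temporal locality:

    #include <stdio.h>

    int main(void) {
        int a[20];
        for (int i = 0; i < 20; i++)
            a[i] = i + 1;                   /* arbitrary initial values          */

        for (int i = 0; i < 20; i++) {
            int tmp = a[i];                 /* a[i] is loaded once ...           */
            for (int j = 0; j < 10; j++)
                tmp = tmp * j;              /* ... reused 10 times in a row ...  */
            a[i] = tmp;                     /* ... and stored back once          */
        }

        printf("a[19] = %d\n", a[19]);
        return 0;
    }

Note that, faithful to the original code, multiplying by j = 0 on the first inner iteration zeroes every element; the sketch is about the access pattern, not the arithmetic.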

c) The strategies for exploiting spatial locality and temporal locality:

Spatial locality is generally exploited by using larger cache blocks and by incorporating prefetching mechanisms (fetching items of anticipated use) into the cache control logic (a compiler-level prefetch hint is sketched at the end of this answer). Temporal locality is exploited by keeping recently used instruction and data values in cache memory and by exploiting a cache hierarchy.

* Explain the write policy in cache memory and the possible approaches to cache coherency.

Before a block that is resident in the cache can be replaced, it is necessary to consider whether it has been altered in the cache but not in main memory. Basically, there are two problems with writing to the cache. The first arises because more than one device may have access to main memory: if a word has been altered only in the cache, then the corresponding memory word is invalid.

The simplest write technique is called write-through. With this technique, all write operations are made to main memory as well as to the cache, ensuring that main memory is always valid. The main disadvantage is that it generates substantial memory traffic and may create a bottleneck.

An alternative technique, known as write-back, minimizes memory writes. With write-back, updates are made only in the cache. When an update occurs, an UPDATE (dirty) bit associated with the line is set. Then, when a block is replaced, it is written back to main memory if and only if the UPDATE bit is set (a toy model of this dirty bit is sketched at the end of this answer).

The second problem arises when more than one device has its own cache: a word updated in one cache leaves stale copies in the others. Possible approaches to cache coherency are:

Bus watching: each cache controller monitors the address lines to detect write operations to memory by other bus masters. If another bus master writes to a location in shared memory that also resides in the cache, the cache controller invalidates that cache entry. This technique depends on the use of a write-through policy by all cache controllers.

Hardware transparency: additional hardware is used to ensure that all updates to main memory via a cache are reflected in all caches.

Non-cacheable memory: only a portion of main memory is shared by more than one processor, and this portion is designated as non-cacheable.
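
As a rough illustration of the prefetching point above, the sketch below uses the GCC/Clang-specific __builtin_prefetch hint to request the next array element ahead of its use. For a tiny contiguous int array like this one, larger cache blocks and the hardware prefetcher already cover the access pattern, so this is only meant to show the mechanism, not a recommended optimization:

    #include <stdio.h>

    #define N 20

    int main(void) {
        int a[N];
        for (int i = 0; i < N; i++)
            a[i] = i;

        for (int i = 0; i < N; i++) {
            if (i + 1 < N)
                __builtin_prefetch(&a[i + 1]);  /* hint: the next element will be needed soon */
            for (int j = 0; j < 10; j++)
                a[i] = a[i] * j;
        }

        printf("a[0] = %d\n", a[0]);
        return 0;
    }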
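
The write-back idea can also be shown in miniature. The toy single-line "cache" below is purely illustrative (the struct, the function names, and the dirty flag are assumptions, not any real controller's interface): a write-through store updates both the cache and memory on every write, while a write-back store updates only the cache, sets the UPDATE (dirty) bit, and copies the data back only on eviction:

    #include <stdbool.h>
    #include <stdio.h>

    /* toy model: one cache line caching one word of "main memory" */
    struct line {
        int  data;
        bool dirty;   /* the UPDATE bit: set when the cached copy differs from memory */
    };

    static int memory_word = 5;          /* stands in for main memory */

    static void write_back_store(struct line *l, int value) {
        l->data  = value;                /* write-back: update the cache only ...      */
        l->dirty = true;                 /* ... and remember that memory is now stale  */
    }

    static void write_through_store(struct line *l, int value) {
        l->data     = value;             /* write-through: update the cache ...        */
        memory_word = value;             /* ... and main memory on every store         */
    }

    static void evict(struct line *l) {
        if (l->dirty) {                  /* copy back only if the line was modified    */
            memory_word = l->data;
            l->dirty    = false;
        }
    }

    int main(void) {
        struct line l = { .data = memory_word, .dirty = false };

        write_back_store(&l, 42);
        printf("after write-back store : cache=%d memory=%d\n", l.data, memory_word);
        evict(&l);
        printf("after eviction         : cache=%d memory=%d\n", l.data, memory_word);

        write_through_store(&l, 7);
        printf("after write-through    : cache=%d memory=%d\n", l.data, memory_word);
        return 0;
    }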
