Locality of reference
In computer science, locality of reference, also known as the principle of locality, is the phenomenon of the same value or related storage locations being frequently accessed. There are two basic types of reference locality. Temporal locality refers to the reuse of specific data and/or resources within relatively small time durations. Spatial locality refers to the use of data elements within relatively close storage locations. Sequential locality, a special case of spatial locality, occurs when data elements are arranged and accessed linearly, e.g., traversing the elements in a one-dimensional array.

Locality is merely one type of predictable behavior that occurs in computer systems. Systems which exhibit strong locality of reference are good candidates for performance optimization through techniques such as caching and instruction prefetching for memory, or advanced branch prediction in the pipeline of a processor.
Locality of reference
The locality of reference, also known as the locality principle, is the phenomenon that the collection of data locations referenced in a short period of time in a running computer often consists of relatively well-predictable clusters. Important special cases of locality are temporal, spatial, equidistant and branch locality.
- Temporal locality: if at one point in time a particular memory location is referenced, then it is likely that the same location will be referenced again in the near future. There is temporal proximity between adjacent references to the same memory location. In this case it is common to make efforts to store a copy of the referenced data in special memory storage which can be accessed faster. Temporal locality is a very special case of spatial locality, namely when the prospective location is identical to the present location.
- Spatial locality: if a particular memory location is referenced at a particular time, then it is likely that nearby memory locations will be referenced in the near future. In this case it is common to attempt to guess the size and shape of the area around the current reference for which it is worthwhile to prepare faster access.
- Equidistant locality: halfway between spatial locality and branch locality. Consider a loop accessing locations in an equidistant pattern, i.e., the path in the spatial-temporal coordinate space is a dotted line. In this case, a simple linear function can predict which location will be accessed in the near future.
- Branch locality: if there are only a few possible alternatives for the prospective part of the path in the spatial-temporal coordinate space. This is the case when an instruction loop has a simple structure, or the possible outcome of a small system of conditional branching instructions is restricted to a small set of possibilities. Branch locality is typically not a spatial locality, since the few possibilities can be located far away from each other.
To benefit from the very frequently occurring temporal and spatial locality, most information storage systems are hierarchical (see below). Equidistant locality is usually supported by the diverse nontrivial increment instructions of processors. For branch locality, contemporary processors have sophisticated branch predictors, and on the basis of this prediction the memory manager of the processor tries to collect and preprocess the data of the plausible alternatives.
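The following minimal sketch (the array names and sizes are invented for the example) shows how a single loop can exhibit all four kinds of locality at once:

#include <stdio.h>

#define N 1024

int main(void) {
    static int a[N];
    static int b[2 * N];
    long sum = 0;                  /* reused on every iteration: temporal locality      */

    for (int i = 0; i < N; i++) {  /* the loop-closing branch behaves the same way on   */
                                   /* almost every iteration: branch locality           */
        sum += a[i];               /* consecutive addresses a[0], a[1], ...: spatial    */
                                   /* (sequential) locality                             */
        sum += b[2 * i];           /* constant stride of two elements: equidistant      */
                                   /* locality                                          */
    }

    printf("%ld\n", sum);
    return 0;
}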
Reasons for locality
There are several reasons for locality. These reasons are either goals to achieve or circumstances to accept, depending on the aspect. The reasons below are not disjoint; in fact, the list goes from the most general case to special cases.
- Predictability: Locality is merely one type of predictable behavior in computer systems. Fortunately, many practical problems are decidable, and hence the corresponding program can behave predictably, if it is well written.
- Structure of the program: Locality often occurs because of the way in which computer programs are created for handling decidable problems. Generally, related data is stored in nearby locations in storage. One common pattern in computing involves the processing of several items, one at a time. This means that if a lot of processing is done, a single item will be accessed more than once, leading to temporal locality of reference. Furthermore, moving to the next item implies that the next item will be read, hence spatial locality of reference, since memory locations are typically read in batches.
- Linear data structures: Locality often occurs because code contains loops that tend to reference arrays or other data structures by indices. Sequential locality, a special case of spatial locality, occurs when relevant data elements are arranged and accessed linearly. For example, the simple traversal of elements in a one-dimensional array, from the base address to the highest element, would exploit the sequential locality of the array in memory. The more general equidistant locality occurs when the linear traversal is over a longer area of adjacent data structures of identical layout and size, and, in addition, not the whole structures are accessed but only the mutually corresponding elements of each structure. This is the case when a matrix is represented as a sequential array of rows and the requirement is to access a single column of the matrix (see the sketch after this list).
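As an illustration of the linear-data-structures case (a minimal sketch; the matrix name and dimensions are invented), row-by-row traversal of a row-major C matrix touches consecutive addresses, while traversing a single column touches addresses a fixed row length apart:

#define ROWS 512
#define COLS 512

static double m[ROWS][COLS];      /* row-major storage, as in C */

/* Row traversal: stride of one element, sequential (spatial) locality. */
double sum_row(int r) {
    double s = 0.0;
    for (int c = 0; c < COLS; c++)
        s += m[r][c];
    return s;
}

/* Column traversal: stride of COLS elements, equidistant locality. */
double sum_column(int c) {
    double s = 0.0;
    for (int r = 0; r < ROWS; r++)
        s += m[r][c];
    return s;
}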
Use of locality in general
If most of the time the substantial portion of the references aggregate into clusters, and if the shape of this system of clusters can be well predicted, then it can be used for speed optimization. There are several ways to benefit from locality. The common techniques for optimization are:
- To increase the locality of references. This is usually achieved on the software side, for example by choosing data layouts and access orders that keep related data close together (see the data-layout sketch after this list).
- To exploit the locality of references. This is usually achieved on the hardware side. Temporal and spatial locality can be capitalized on by hierarchical storage hardware. Equidistant locality can be used by the appropriately specialized instructions of a processor; this possibility is not only the responsibility of the hardware but also of the software, whose structure must be suitable for compiling into a binary program that calls the specialized instructions in question. Branch locality is a more elaborate possibility, hence more development effort is needed, but there is a much larger reserve for future exploitation in this kind of locality than in all the remaining ones.
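One common software-side way to increase spatial locality is to store fields that are accessed together contiguously, for example a structure-of-arrays layout instead of an array of structures. The sketch below is only illustrative, and the record fields are invented:

#define N 10000

/* Array of structures: a sweep over x also drags the unused name bytes
   through the cache. */
struct particle {
    double x, y, z;
    char   name[48];
};
static struct particle aos[N];

/* Structure of arrays: the x coordinates of successive particles are
   contiguous, so the same sweep has much better spatial locality. */
struct particles {
    double x[N], y[N], z[N];
    char   name[N][48];
};
static struct particles soa;

void advance_aos(double dt, double v) {
    for (int i = 0; i < N; i++)
        aos[i].x += v * dt;       /* stride of sizeof(struct particle) bytes */
}

void advance_soa(double dt, double v) {
    for (int i = 0; i < N; i++)
        soa.x[i] += v * dt;       /* stride of sizeof(double) bytes */
}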
Use of spatial and temporal locality: hierarchical memory
Hierarchical memory is a hardware optimization that takes advantage of spatial and temporal locality and can be used on several levels of the memory hierarchy. Paging obviously benefits from temporal and spatial locality. A cache is a simple example of exploiting temporal locality, because it is a specially designed, faster but smaller memory area, generally used to keep recently referenced data and data near recently referenced data, which can lead to potential performance increases. Data in a cache does not necessarily correspond to data that is spatially close in main memory; however, data elements are brought into the cache one cache line at a time. This means that spatial locality is again important: if one element is referenced, a few neighboring elements will also be brought into the cache. Finally, temporal locality plays a role on the lowest level, since results that are referenced very closely together can be kept in the machine registers. Programming languages such as C allow the programmer to suggest that certain variables be kept in registers.
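For example, C's register storage-class specifier lets the programmer hint that a variable should live in a register; compilers are free to ignore the hint, and the fragment below is only a sketch:

/* "register" asks the compiler to keep the accumulator in a CPU register;
   taking the address of a register variable is not allowed. */
long sum_array(const int *a, long n) {
    register long s = 0;
    for (long i = 0; i < n; i++)
        s += a[i];
    return s;
}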
Data locality is a typical memory reference feature of regular programs (though many irregular memory access patterns exist). It makes the hierarchical memory layout profitable. In computers, memory is divided up into a hierarchy in order to speed up data accesses. The lower levels of the memory hierarchy tend to be slower, but larger. Thus, a program will achieve greater performance if it uses memory while it is cached in the upper levels of the memory hierarchy and avoids bringing other data into the upper levels of the hierarchy that will displace data that will be used shortly in the future. This is an ideal, and sometimes cannot be achieved.
Typical memory hierarchy (access times and cache sizes are approximations of typical values used for the purpose of discussion; actual values and actual numbers of levels in the hierarchy vary):
- CPU registers (8-32 registers) – immediate access
- L1 CPU caches (32 KiB to 128 KiB) – fast access
- L2 CPU caches (128 KiB to 12 MiB) – slightly slower access
- Main physical memory (RAM) (256 MiB to 24 GiB) – slow access
- Disk (file system) (100 GiB to 2 TiB) – very slow
- Remote memory (such as other computers or the Internet) (practically unlimited) – speed varies
Modern machines tend to read blocks of lower memory into the next level of the memory hierarchy. If this displaces used memory, the operating system tries to predict which data will be accessed least often (or needed latest) and move it down the memory hierarchy. Prediction algorithms tend to be simple to reduce hardware complexity, though they are becoming somewhat more complicated.
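As an illustration of how simple such a prediction can be, the toy sketch below (not how any particular operating system or cache actually implements it) applies a least-recently-used rule: the victim is the slot whose last access is oldest:

#define NUM_SLOTS 8

/* One timestamp per cached block; smaller means "touched longer ago". */
static unsigned long last_used[NUM_SLOTS];
static unsigned long now;

/* Record an access to a block. */
void touch(int slot) {
    last_used[slot] = ++now;
}

/* Choose the block to displace: the one accessed least recently. */
int choose_victim(void) {
    int victim = 0;
    for (int s = 1; s < NUM_SLOTS; s++)
        if (last_used[s] < last_used[victim])
            victim = s;
    return victim;
}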
Spatial and temporal locality example: matrix multiplication
A common example is matrix multiplication:

for (i = 0; i < n; i++)
    for (j = 0; j < m; j++)
        for (k = 0; k < p; k++)
            C[i][j] = C[i][j] + A[i][k] * B[k][j];
When dealing with large matrices, this algorithm tends to shuffle data around too much. Since memory is pulled up the hierarchy in consecutive address blocks, in the C programming language it would be advantageous to refer to several memory addresses that share the same row (spatial locality). By keeping the row number fixed, the second index changes more rapidly; in C and C++, this means the memory addresses are used more consecutively. One can see that, since j affects the column reference of both matrices C and B, it should be iterated in the innermost loop (this will fix the row iterators, i and k, while j moves across each column in the row). This will not change the mathematical result, but it improves efficiency. By switching the looping order for j and k, the speedup in large matrix multiplications becomes dramatic. (In this case, 'large' means, approximately, more than 100,000 elements in each matrix, or enough addressable memory such that the matrices will not fit in L1 and L2 caches.)
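A sketch of the interchanged (i, k, j) loop order follows; the array dimensions and element type are assumptions made for the example:

/* i-k-j order: the innermost loop walks C[i][...] and B[k][...] along rows,
   i.e. along consecutive addresses in row-major storage (spatial locality),
   while A[i][k] is reused for the whole inner loop (temporal locality). */
void matmul_ikj(int n, int m, int p,
                double A[n][p], double B[p][m], double C[n][m]) {
    for (int i = 0; i < n; i++)
        for (int k = 0; k < p; k++)
            for (int j = 0; j < m; j++)
                C[i][j] = C[i][j] + A[i][k] * B[k][j];
}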
Temporal locality can also be improved in the above example by using a technique called blocking. The larger matrix can be divided into evenly sized sub-matrices, so that the smaller blocks can be referenced (multiplied) several times while in memory:
for (ii = 0; ii < SIZE; ii += BLOCK_SIZE)
    for (kk = 0; kk < SIZE; kk += BLOCK_SIZE)
        for (jj = 0; jj < SIZE; jj += BLOCK_SIZE)
            for (i = ii; i < ii + BLOCK_SIZE && i < SIZE; i++)
                for (k = kk; k < kk + BLOCK_SIZE && k < SIZE; k++)
                    for (j = jj; j < jj + BLOCK_SIZE && j < SIZE; j++)
                        C[i][j] = C[i][j] + A[i][k] * B[k][j];
The temporal locality of the above solution is provided because a block can be used several times before moving on, so that it is moved in and out of memory less often. Spatial locality is improved because elements with consecutive memory addresses tend to be pulled up the memory hierarchy together.
See also
- Burst mode (computing)
- Row-major order
- File system fragmentation
- Cache-oblivious algorithm
- Partitioned global address space