Content-Addressable Memory: CAM Architecture

Introduction

In ordinary memory designs such as DRAMs and SRAMs, the memory device uses a write cycle to store data and a read cycle to retrieve stored data by addressing a specific memory location, identified by an address. In other words, each read or write cycle can access only one memory location, the one indicated by its address. As a result, data access in ordinary memory devices proceeds sequentially. In high-speed data search applications, for instance, Internet routers [1–5], image processing [6–8], and pattern recognition [9–11], the time required to find data stored in the memory array must be as short as possible. Because of this sequential access, the data search performance of ordinary memory devices is limited by memory bandwidth, which does not scale well for search-intensive applications. If a memory device instead provides a search function that compares the desired search data with all the data stored in the memory array simultaneously, its data search performance improves greatly. To realize such a parallel comparison, the stored data must be identified for access by their contents rather than by their addresses. Memory that is accessed in this way is called content-addressable memory (CAM).


CAM Architecture

A general CAM architecture usually consists of the data memory with a valid bit field, the address decoder, the word match circuits, and the address priority encoder, as shown in Figure 56.1 [12]. The memory organization of the CAM comprises the data memory and the valid bit field, where the valid bit indicates whether valid data are stored at the corresponding memory location. The address decoder selects the specific memory location for read and write operations. To realize the data search operation, each word match circuit compares the search data with the data stored in its memory location. The comparison is performed bit by bit: if every bit of the stored data matches the corresponding bit of the search data, the stored word matches the search data; otherwise it mismatches. In the CAM architecture, all word match circuits operate in parallel to speed up the data search, and this parallel comparison may find more than one matching entry in the CAM array during a single search operation. To output the address of the best-matched stored data, the address priority encoder selects the highest-priority address among the matched entries. In addition, a 'Match Found' output signal is required, since it is possible that none of the stored data matches the search data.
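As a rough behavioral illustration of the match-and-prioritize path described above, the following Python sketch models one search cycle: every valid stored word is compared bit for bit against the search data (the software loop stands in for the hardware's parallel word match circuits), and the priority encoder is modeled, under the common assumption that the lowest address has the highest priority, by returning the first matching address together with a 'Match Found' flag. The function and variable names (cam_search, stored_words, and so on) are illustrative, not taken from the source.

# Behavioral sketch of one CAM search cycle (hypothetical names; real
# hardware performs all word comparisons in parallel, not in a loop).
def cam_search(stored_words, valid_bits, search_data):
    """Return (match_found, matched_address) for one search operation.

    stored_words : list of equal-width bit strings, one per CAM location
    valid_bits   : list of booleans marking which locations hold valid data
    search_data  : bit string to compare against every valid stored word
    """
    for address, (word, valid) in enumerate(zip(stored_words, valid_bits)):
        # Word match circuit: every stored bit must equal the corresponding
        # search bit, and the entry must be marked valid.
        if valid and word == search_data:
            # Priority encoder (assumed lowest-address-wins convention):
            # the first matching address is driven out.
            return True, address
    # 'Match Found' deasserted: no valid entry matched the search data.
    return False, None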

Generally, CAMs have three operation modes: read, write, and search. The read and write modes access and manipulate data in the CAM array in the same way as an ordinary memory. The benefit of the CAM is realized in the search mode, which delivers its powerful data search performance. In this operation, the desired search data is sent into the CAM and compared with all valid data stored in the CAM array simultaneously, and the highest-priority address among the matched entries is driven onto the output port "Matched Address." Because of this parallel comparison architecture, the CAM performs a large number of comparisons in a single search operation to find the address of the best-matched stored data; this high-speed search capability makes the CAM an appealing memory architecture for search-intensive applications.
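Continuing the cam_search sketch above, the short driver below exercises the three operation modes on a small array: write fills a location by address, read returns the word at an address, and search compares against all valid entries at once. The four-entry, 4-bit configuration is an illustrative assumption, not a parameter from the source.

# Minimal driver for the three operation modes (illustrative sizes only;
# reuses cam_search from the sketch above).
WORDS, WIDTH = 4, 4
stored_words = ["0" * WIDTH] * WORDS      # data memory
valid_bits   = [False] * WORDS            # valid bit field

# Write mode: store data at an explicit address, as in an ordinary memory.
stored_words[2], valid_bits[2] = "1010", True

# Read mode: retrieve the word at a given address.
assert stored_words[2] == "1010"

# Search mode: compare "1010" against every valid entry simultaneously
# (conceptually); the matched address 2 appears on "Matched Address".
match_found, matched_address = cam_search(stored_words, valid_bits, "1010")
assert match_found and matched_address == 2

# A search for data not present deasserts 'Match Found'.
match_found, _ = cam_search(stored_words, valid_bits, "1111")
assert not match_found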
