COMPUTER-AIDED DESIGN (CAD) TOOLS FOR LAYOUT: SUPPORT TOOLS

SUPPORT TOOLS

We have covered the basic types of tools involved in layout entry, but the design process is not only layout entry. The layout needs to be verified against quality standards and manufacturing design rules, and layout can be migrated from other sources to save time and effort. This section covers the tools that perform these operations, followed by a discussion of the standard database formats used for layout today.

Layout Verification Tools

Figure 10.7 documents the layout design procedure discussed in Chapter 3. The highlighted steps correspond to the verification tools discussed in this section. It is important to remember that layout verification must be done on the entire chip and on the file that is to be sent to manufacturing.

DRC/LVS/ERC. Verifying a full-chip database file has challenged verification tools over the years as designs have grown from 1,000 to 10,000,000 transistors. As designs have grown, designers and tool providers have evolved methodologies capable of verifying them.

From a user’s point of view, the requirements for verification tools are different from those for layout entry or design tools. Ease of debugging or correcting problems, tool capacity, and run times are the key issues for these tools. The verification process in general is a feedback mechanism for the designer to validate the design as well as to identify problems or shortcomings. Historically, layout verification tools had limitations in addressing these key issues, as summarized in Table 10.1.

In all cases the accuracy of the checks depends on the values and algorithms that are coded into them. These values and algorithms are captured in files referred to as setup files, command files, or rule decks.
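To make this concrete, the following minimal sketch (in Python, purely illustrative — the rule values, the rectangle representation, and the function names are assumptions, not any vendor’s rule-deck language) shows the kind of geometric check a rule deck encodes: a minimum-width and a minimum-spacing rule on a single layer.

```python
# Minimal sketch of the kind of geometric checks a DRC rule deck encodes.
# Rectangles are (x1, y1, x2, y2) on one layer; the rule values are illustrative only.

MIN_WIDTH = 0.25     # assumed minimum feature width (um)
MIN_SPACING = 0.30   # assumed minimum spacing between shapes (um)

def width_violations(rects):
    """Flag rectangles narrower than MIN_WIDTH in either dimension."""
    return [("WIDTH", r) for r in rects
            if min(r[2] - r[0], r[3] - r[1]) < MIN_WIDTH]

def spacing_violations(rects):
    """Flag pairs of non-touching rectangles that are closer than MIN_SPACING."""
    errors = []
    for i, a in enumerate(rects):
        for b in rects[i + 1:]:
            dx = max(a[0] - b[2], b[0] - a[2], 0.0)   # horizontal gap
            dy = max(a[1] - b[3], b[1] - a[3], 0.0)   # vertical gap
            gap = (dx ** 2 + dy ** 2) ** 0.5
            if 0.0 < gap < MIN_SPACING:
                errors.append(("SPACING", a, b))
    return errors

metal1 = [(0.0, 0.0, 1.0, 0.2), (1.2, 0.0, 2.0, 0.5)]   # toy layout data
print(width_violations(metal1) + spacing_violations(metal1))
```

A production rule deck expresses the same intent in the verification tool’s own command language and covers far more rules and layer interactions.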

“Work” structures refer to the methodology of breaking a full chip layout into smaller structures for verification that together try to ensure that all potential problems are found. This methodology is very time consuming and potentially error prone, but was necessary because the capacity and speed of the verification tools could not handle an entire chip at once.

The user interface of these tools has evolved significantly over time and is approaching the ideal conditions just listed. For example, DRC error bars are standard; they show the location of polygons or edges that violate a particular rule. Cross-probing between layout and schematics is also standard for LVS debugging. Features of today’s debuggers include browsing errors by type, layer, or layer grouping, jumping to the critical errors first, and so on.

Many polygon pushers have an “online” version of the verification tools, so small jobs can be run almost interactively and the time required to export the database to a separate tool is eliminated. For small blocks, the capacity and run-time issues have been virtually eliminated.

Final verification of the file sent to manufacturing must be done with a stand-alone tool that can read the tape-out file. Historically, this was not possible because the capacity and run times of the verification tools were not sufficient to check the entire design.

Recently, “hierarchical” layout verification has been introduced specifically to address the capacity and run-time issues. In the past the limits of the verification tools were determined by the amount of data that the tool had to load and process, and this was a function of the size of the design in terms of polygons.

The hierarchy of the design was ignored and any hierarchy that existed in the design was essentially removed for the verification process. A “flat” database was created. This approach ensured that polygons that existed on top of cells were checked alongside polygons that were drawn inside the cells. The tools were required to store and manage the entire database this way.

Hierarchical layout verification is a different approach that takes advantage of the hierarchy built into the design. Cells that are repeated are checked only once and then discarded for the remainder of the design. The tool requires careful management of the effects of over-the-cell routing.
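As a toy illustration of why this pays off (the cell names, instance counts, and polygon counts below are invented), a flat run must process every polygon of every placement, while a hierarchical run processes each unique cell definition once and then only has to deal with the interactions created by placement and over-the-cell routing:

```python
# Toy comparison of flat vs. hierarchical verification workload (illustrative only).
# Each cell lists its own polygon count and the sub-cells it instantiates.
cells = {
    "INV":  {"polys": 12, "insts": []},
    "NAND": {"polys": 18, "insts": []},
    "ROW":  {"polys": 4,  "insts": ["INV"] * 16 + ["NAND"] * 8},
    "CHIP": {"polys": 50, "insts": ["ROW"] * 100},
}

def flat_polygon_count(name):
    """Flat verification: every instance is expanded, so repeated cells are re-checked."""
    cell = cells[name]
    return cell["polys"] + sum(flat_polygon_count(i) for i in cell["insts"])

def hierarchical_polygon_count():
    """Hierarchical verification: each unique cell definition is checked only once.
    (A real tool must still check cell-to-cell and over-the-cell interactions.)"""
    return sum(cell["polys"] for cell in cells.values())

print("flat polygons checked:        ", flat_polygon_count("CHIP"))
print("hierarchical polygons checked:", hierarchical_polygon_count())
```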

Note that layout designers can take advantage of the features of the hierarchical verification tools by building efficient hierarchy into their designs. Specifically, limiting over-the-cell routing and matching the layout hierarchy to the schematic hierarchy are good methodologies to accelerate the layout verification process.

The key issues within a hierarchical layout verification environment are reexamined as shown in Table 10.2. It may appear that layout verification issues have been completely solved!

One area where LVS debuggers can still improve is in locating and debugging power-to-power shorts. This type of error is pervasive and produces a large amount of output if left uncontrolled.

Extraction. Extraction is the hottest product today in the Deep Sub-Micron era. Layout extraction is another way of verifying that the layout performs as expected. If DRC checks the rules for mask making, and LVS checks that the connectivity and sizes of all devices are correct, then extraction of the layout checks that the performance of the layout in simulation meets the required goals.

Layout extraction produces data that feeds back the result of layout to the circuit design process. The format of the data can be simply a netlist of devices, resistors, and capacitors, or the extraction tool can simplify the network of parasitic components by calculating an equivalent delay or lumped RC model.
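As a rough sketch of the second option (all segment values, the driver resistance, and the load capacitance below are invented for illustration), the parasitic network of one net can be reduced to lumped R and C values and a first-order Elmore delay estimate:

```python
# Sketch: reduce extracted per-segment parasitics of a net into lumped R, lumped C,
# and a first-order (Elmore) delay estimate.  Segment values are illustrative only.
segments = [
    # (resistance_ohms, capacitance_farads) for each wire segment along the net
    (12.0, 2.0e-15),
    (30.0, 5.0e-15),
    (18.0, 3.5e-15),
]
driver_resistance = 500.0    # assumed driver output resistance (ohms)
load_capacitance = 4.0e-15   # assumed receiver input capacitance (farads)

lumped_r = sum(r for r, _ in segments)
lumped_c = sum(c for _, c in segments) + load_capacitance

# Elmore delay: each capacitor sees the total resistance between it and the driver.
elmore = 0.0
upstream_r = driver_resistance
for r, c in segments:
    upstream_r += r
    elmore += upstream_r * c
elmore += upstream_r * load_capacitance   # receiver load sees the full path resistance

print(f"lumped R = {lumped_r:.1f} ohm, lumped C = {lumped_c*1e15:.1f} fF")
print(f"Elmore delay ~ {elmore*1e12:.2f} ps")
```

A real extractor writes the detailed network into a parasitic netlist (SPICE- or SPEF-style) for back annotation rather than printing a summary like this.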

Extraction is nice to have for normal digital circuits but is a must for analog, RF, and microwave designs where each small capacitance can change circuit behavior.

Extraction methodologies and tools evolved much more quickly in parallel with the development of ASIC flows, because the increased level of automation in circuit design separated the designer from manually designing every aspect of the design. The extraction process gives the circuit designer the feedback required to evaluate the layout implementation of the circuit. In the case of the ASIC flow, extraction of the real layout from place-and-route is crucial to the size and timing of the design. The reasons are obvious:

• ASIC designers are not analog experts, as they concentrate on developing functionality

• The number of nets in a design is impossible for a human to digest

• In general, ASIC designers do not even see the layout

Back annotation is the term that describes the step of feeding layout information back to the circuit design. Final simulations should be run with the extracted values from layout. For a final extraction to be successful, there are a few minimum requirements:

• The layout is DRC and LVS clean without errors or warnings

• The extraction environment is set up with accurate process information and tested on a small circuit as a sanity check

• Critical signals are extracted with a higher degree of accuracy

The circuit design team should understand the accuracy of the extraction so that they can account for the limitations of the tool when modeling and designing their circuitry. Extraction tools trade off accuracy for run time, as shown in Table 10.3. The main difference in accuracy is how the extraction tool handles near-body effects. For example, a 1D extraction does not consider near-body effects (i.e., coupling to other lines) at all, while 3D field solvers not only take all near bodies into account but also solve complex sets of equations to calculate the parasitic values.
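A rough numerical sketch of why this matters is shown below; the wire dimensions and the simple parallel-plate formulas are assumptions for illustration only. An area-only (“1D”) estimate ignores coupling to neighboring lines, which can easily dominate the total capacitance when wires are tightly spaced — exactly the contribution that more accurate 2D/3D extraction modes are meant to capture.

```python
# Rough sketch: area-only ("1D") capacitance vs. adding sidewall coupling to neighbors.
# Parallel-plate approximations only; dimensions and constants are illustrative.
EPS0 = 8.854e-12          # F/m
EPS_OX = 3.9 * EPS0       # assumed oxide permittivity

length    = 100e-6        # wire length (m)
width     = 0.2e-6        # wire width (m)
thickness = 0.3e-6        # wire thickness (m)
height    = 0.5e-6        # dielectric height to the plane below (m)
spacing   = 0.2e-6        # spacing to each neighboring wire (m)

# 1D style: only the area capacitance to the plane below is counted.
c_area = EPS_OX * (width * length) / height

# Adding sidewall (coupling) capacitance to two neighbors, still with a crude
# parallel-plate model; a 3D field solver would also capture fringing fields.
c_coupling = 2 * EPS_OX * (thickness * length) / spacing

print(f"area-only estimate : {c_area*1e15:.2f} fF")
print(f"with coupling      : {(c_area + c_coupling)*1e15:.2f} fF")
```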

Ideally, the extraction flow is very fast and perfectly automated so that seeing the layout becomes unnecessary. In this case the setup of the flow is very important.

Note that visual audits should still be done for many specialized applications. People’s eyes and expertise are still useful to analyze the effects of different layout architectures. The only proven methodology today is plotting the cells/blocks and asking experts to audit the layout.

Plotting and Plotters. There are not too many kinds of plotting software available in the market. In general, layout designers use two kinds of plotting software. One is simple printing software that uses the drivers of a specific printer to print the layout. The more expensive but more extensive version is software written specifically for “plotting” VLSI. Let us look at the advantages of such software:

• The user can plot parts or “windows” from the big cell

• The user can choose only specific layers to be plotted

• The user can define fill patterns different from the ones shown on the screen—in general, designers use a black background for screen work and white for plotting, so what looks good on black may not look good on white, especially when there are 3 to 10 layers of metal on top of each other

• The user can define a scale for plotting, such as 1,000×, 5,000×, or 10,000×, so the picture will be greatly enlarged to analyze analog problems

• The user can choose a variety of options for plotting cell arrays—for example, doughnut shapes or corners only

Why do we need plotting at all? One reason is that not all of the design rules may be coded for DRC verification. The second is that some rules are very rare, and the DRC check for them may not be deterministic. The third important reason is that architectural improvements may only be understood from visual inspection of a large-scale plot.

In terms of plotters, there are two kinds available today for VLSI applications: electrostatic and inkjet. Electrostatic plotters require a climate-controlled room with a high level of humidity. This is one reason why inkjet plotters are becoming popular, as they work at room temperature. However, each of these plotters has its advantages and disadvantages:

• Electrostatic plotters can deliver perfect plots up to 10 meters in length.

• Inkjet plotters have a limit to the length of plot.

• Electrostatic plotters are more expensive to buy, but in terms of price per square meter of plotting area, the cost is the same in the long run because the ink and the paper are more expensive in the case of the inkjet.

• The widths of the plots that can be obtained are comparable because all of them have 36-, 44-, or 54-inch paper width capability.

• Electrostatic toner is delivered in gallon sizes, compared to ink that comes in 1.36-liter bottles.

• Both types can be connected to the network and organized in a queue for plot prioritizing.

In conclusion, if you do not need a color plotter to perform audits, you may not need a plotter at all. Black-and-white plotters are used mostly for mask/reticle checks, where they can be as wide as 64 inches but only two layers need to be checked against each other. When you have to deal with four layers of poly and two or three metals, as in a DRAM process, color is obligatory.

Migration Tools

Migration tools are most useful in three scenarios:

1. Second sourcing for added capacity or reliable supply

2. Design reuse

3. Manufacturing process evolution

Lately, silicon intellectual property has become “in” in the VLSI industry, and layout converters have really started to get global attention. There are two ways to deal with process migration. One is to design layout inside tools that can perform process retargeting; the other is to use GDSII converters after everything is silicon proven.

A converter can be used to change almost all levels of layout complexity. In the past, such tools were used to migrate low-level cells and/or full standard cell libraries very quickly and efficiently. Each cell was limited to roughly 10 to 40 transistors, and all the layers required in a VLSI process were converted. Such tools require extensive and very knowledgeable setup and maintenance.

Converters provide the best solution to migrate full chips from one process to another. The converters available today are not schematic or netlist driven; however, some transistor resizing is possible using tables or scaling factors. In general, the cell topology, pin positions and assignments, electromigration, and RC delay requirements are maintained.
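As a simplified picture of what a scaling-factor-based converter does (the scale factor, grid value, and data structures below are assumptions), the sketch shrinks every polygon coordinate linearly and snaps it back onto the target process grid while leaving cell and pin names untouched; real converters apply per-layer rules, sizing tables, and DRC-driven corrections rather than a single global scale:

```python
# Simplified sketch of linear-shrink migration: scale every coordinate and snap
# to the target manufacturing grid.  All values and structures are illustrative.
SCALE = 0.7          # assumed linear shrink factor between source and target process
GRID  = 0.005        # assumed target manufacturing grid (um)

def snap(value):
    """Snap a coordinate to the manufacturing grid."""
    return round(value / GRID) * GRID

def migrate_cell(cell):
    """Return a copy of a cell with every polygon scaled; names and pins are preserved."""
    return {
        "name": cell["name"],
        "pins": cell["pins"],                      # pin names and assignments are kept
        "polygons": [
            {"layer": p["layer"],
             "points": [(snap(x * SCALE), snap(y * SCALE)) for x, y in p["points"]]}
            for p in cell["polygons"]
        ],
    }

inv = {"name": "INV_X1", "pins": ["A", "Z", "VDD", "VSS"],
       "polygons": [{"layer": "metal1",
                     "points": [(0.0, 0.0), (1.2, 0.0), (1.2, 0.4), (0.0, 0.4)]}]}
print(migrate_cell(inv))
```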

These tools are used for purposes such as the following:

• Standard cell library migration, where standardization of the pin assignment, cell height, neighboring requirements, etc., is an important factor in layout design.

• Cells for datapaths, where tool and design requirements have to be guaranteed and tailored to specific designs.

• Full chip conversion. Converters are starting to work hierarchically, so the size of the source data is no longer a problem.

• Cases where the chip is in an advanced stage of layout but a process design rule that affects chip size changes. Running a converter in hierarchical mode will solve the problem in a matter of hours with almost 100 percent DRC-correct results.

Advantages and disadvantages include the following:

• They give the user the capability to migrate specific kinds of layout quickly, but they are expensive for a company that works in a single defined and proven process. Startup fabless companies will likely invest in migration tools, and this will fund further tool development.

• The user may require a minimum amount of training in the macro language, but advanced layout and design knowledge is key. It is important to understand the key characteristics of the source layout to ensure that the quality of the target layout is maintained.

• This type of tool may take some time to set up and to interface with other tools involved in the design flow.

• They are fast compared to any hand-crafted approach to migrating layout, which is why they have gained so much market share in the past 5 years.

• One drawback is that these tools cannot add layers—for example, they cannot migrate a two-layer metal chip to a three-layer metal chip.

• They fully respect the original topology; however, on their own they cannot take advantage of new and perhaps better architectures that may arise in the destination process. In the case of libraries, the easiest solution is to change the source with minimum effort and then run the conversion. In the case of full chips, the vendors of this kind of software have developed various levels of migration, such as cells only, cells and routing, or routing only.

• We should choose these tools based on their capabilities, but also on the user interface and setup simplicity. If the migration itself is efficient but setting up and debugging the constraints takes a long time, it is the total effort that counts.

• Interestingly enough, some silicon compiler vendors who totally ignored the migration market have started to provide GDSII input and output to and from their tools to grab a piece of this “hot” new pie.

Data Formats

Any designer who wants to use point tools instead of integrated tools within one framework has to learn how to deal with data transfer issues. Every layout tool starts from a different idea and serves a different purpose, so internally each may have a database format that is efficient for its own needs.

For example, a problem arises when a layout designer wants to transfer the data of a standard cell library to a different place-and-route tool. At the beginning of the IC design industry, there was only one company providing layout design tools for the entire market. This market was very small compared with today’s, and that company defined the format based on the limitations of the hardware and software of the time. Everybody who wanted to enter the VLSI layout market had to comply with this format; otherwise, no one would buy their tool.

That format was, and still is, GDSII, which was developed by Calma on Data General machines. So today, if you want to export data from the Mentor platform to the Cadence platform, the only guaranteed way is GDSII. There are other widely used standard data formats such as CIF, LEF, and DEF; LEF and DEF became popular for a similar reason: Cadence developed them and had the greatest market share for IC layout. GDSII is still the dominant format, so a discussion of it is warranted.

GDSII is a binary format that, from the user point of view, has the following qualities:

• Each stream file is limited to 64 layers, each with a subdivision of 64 DATATYPEs per layer. So in total, the stream is limited to 64 × 64 = 4,096 different layer/datatype combinations for defining polygons for manufacturing.

• Each polygon or path cannot have more than 199 vertices, so if the layout has a polygon larger than this, the output subroutine will break it into pieces of at most 199 vertices each. This limitation comes from the Calma software, which could handle only 199 coordinates per polygon!

• There is no logical or electrical information attached to a polygon. There are no pins, ports, nets, or signal recognition, and this is a big drawback for place-and-route. There are no pins; however, there is a simplified way of recognizing ports. When writing a GDSII file, each port becomes a TEXT record with a TEXTTYPE, attached to a small polygon on the layer specified in the export command file. When importing this GDSII into another tool, the user usually writes macros that select the text and regenerate the ports (a sketch of this idea follows this list). The problem is that there is no way to preserve net information.

• Device generator results, vias and contacts, and other automated layouts that are “not polygon level” (soft devices) are flattened to polygons. This is a big problem when transistor sizes change and the data comes out of a tool that relies on these features. Again, this is because, historically, Calma did not have device generation capabilities.

• GDSII recognizes the full hierarchy of objects, but it always takes the first referenced cell found in the design, regardless of the full path. GDSII uses unique names for each cell and does not recognize full path names, which is again historical: under UNIX, names are unique as long as something in the full path differs, whereas in Calma’s time, cell names were attached to a library that had a unique place on the disk.
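As mentioned in the list above, ports usually survive the trip through GDSII only as TEXT labels sitting on small polygons. The sketch below shows the kind of import macro a user might write to regenerate ports by matching each label to a shape on the same layer that encloses it; the layer numbers, record layout, and data structures are assumptions for illustration.

```python
# Sketch of an import macro: rebuild ports from GDSII TEXT labels by matching each
# label to a polygon on the same layer that encloses the label position.
# Layer numbers and data structures are illustrative assumptions.

labels = [   # (text, layer, (x, y)) as they might come from TEXT records
    ("VDD", 16, (0.5, 9.5)),
    ("A",   16, (0.2, 4.0)),
]
shapes = [   # (layer, (x1, y1, x2, y2)) small pin rectangles from BOUNDARY records
    (16, (0.4, 9.4, 0.9, 9.8)),
    (16, (0.1, 3.9, 0.4, 4.2)),
]

def encloses(rect, point):
    """True if the rectangle contains the label position."""
    x1, y1, x2, y2 = rect
    x, y = point
    return x1 <= x <= x2 and y1 <= y <= y2

ports = []
for text, layer, position in labels:
    for shape_layer, rect in shapes:
        if shape_layer == layer and encloses(rect, position):
            ports.append({"name": text, "layer": layer, "rect": rect})
            break

print(ports)   # net information, however, is still lost and must be rebuilt elsewhere
```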

Another format that is mostly used for place-and-route is LEF, which contains the layout information required for a library and for routing setup and, together with a DEF file, fully characterizes nets, pins, ports, and signals.
