Synopsis: computational modeling with Sentaurus TCAD

1. Computational Environment.

1.1. Sentaurus on Linux.

Sentaurus TCAD from Synopsys is an advanced commercial computational environment, a collection of tools for modeling electronic devices. Sentaurus can be run on Windows and Linux. Who, however, would use Windows for serious work? Linux, once mastered, offers 10,000 times more ways to solve problems efficiently.

1.2. Perl

It is possible to work in batch mode in Sentaurus TCAD. However, I find it more convenient to use Perl scripts to control batch processing and to change calculation parameters. Why? Well, because I know Perl. It is a very flexible programming language, particularly well suited to working with text files. And I am much more inclined to work from a terminal window than to use a GUI. I am convinced that this is also a much more productive approach to serious computation.

Besides, Perl is really good when one needs to do some manipulation on text data files.

1.3. TCL

Tcl (Tool Command Language) is a very powerful but easy to learn dynamic programming language, suitable for a very wide range of uses, including web and desktop applications, networking, administration, testing and many more. Open source and business-friendly, Tcl is a mature yet evolving language that is truly cross platform, easily deployed and highly extensible.

Sentaurus TCAD contains libraries designed to be used together with Tcl. They are run through the tdx interface and are used for manipulating (extracting) data from TDR files.

1.4. Nedit

A very important point when working with text data files is the right choice of a text editor. NEdit is a multi-purpose, light-weight text editor for the X Window System, which combines a standard, easy-to-use graphical user interface with the thorough functionality and stability required by users who edit text eight hours a day. It has its limitations, but it is excellent for editing data (the ability to use regular expressions, keystroke recording, macros, and copy/paste/search/replace on columns of text, not just lines, are a few of its outstanding features).

When working with other text files rather than data files, when for instance I need to use UTF-8 characters (that happens often, since I am in Russia and must often use Cyrillic, and I am Polish, so I sometimes need to write with Polish diacritics), or when I need to write a LaTeX publication, gedit on Linux is excellent for that.

1.5. Gnuplot

Encapsulated PostScript figures, convenient for use in LaTeX, can be created easily with Gnuplot.

Gnuplot is a portable command-line driven graphing utility for Linux, OS/2, MS Windows, OS X, VMS, and many other platforms. The source code is copyrighted but freely distributed (i.e., you don't have to pay for it). It was originally created to allow scientists and students to visualize mathematical functions and data interactively, but has grown to support many non-interactive uses such as web scripting. It is also used as a plotting engine by third-party applications like Octave. Gnuplot has been supported and under active development since 1986.

Gnuplot supports many types of plots in both 2D and 3D. It can draw using lines, points, boxes, contours, vector fields, surfaces, and various associated text. It also supports various specialized plot types.

Gnuplot supports many different types of output: interactive screen terminals (with mouse and hotkey input), direct output to pen plotters or modern printers, and output to many file formats (eps, fig, jpeg, LaTeX, metafont, pbm, pdf, png, postscript, svg, ...). Gnuplot is easily extensible to include new output modes. Recent additions include an interactive terminal based on wxWidgets and the creation of mousable graphs for web display using the HTML5 canvas element.

A nice feature of Gnuplot is that for every figure you create you may save the gnuplot scripting commands in a plain text file and run that file from the gnuplot terminal window. That way it is easy to change plotting parameters at a later time.
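For example, a minimal script of this kind (the file and data names here are invented) could be saved as lv.plot and re-run at any time with `load "lv.plot"` from the gnuplot prompt, or with `gnuplot lv.plot` from the shell:

```gnuplot
set terminal postscript eps enhanced    # EPS output, convenient for LaTeX
set output "lv.eps"
set xlabel "Current I [A]"
set ylabel "Light intensity L [W]"
plot "lv.dat" using 1:2 with lines title "L(I)"
```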

Gnuplot also allows performing some simple computations. And, what is very convenient, it can perform complex data fitting on large collections of data automatically, using batch processing.

It is even possible to run Gnuplot from within Perl scripts. This is an extremely powerful combination!

1.6. Grep

Grep is just one of many tools available on Linux that are excellent for working with text files (either data files or Sentaurus log files). Other examples: bash, sort, ls, find, etc.

1.7. LaTeX

The term LaTeX refers only to the language in which documents are written, not to the editor used to write those documents. LaTeX is most widely used by mathematicians, scientists, engineers, philosophers, lawyers, linguists, economists, researchers, and other scholars in academia. It is used because of the high quality of typesetting achievable by TeX.

LaTeX is based on the idea that authors should be able to focus on the content of what they are writing without being distracted by its visual presentation. In preparing a LaTeX document, the author specifies the logical structure using familiar concepts such as chapter, section, table, figure, etc., and lets the LaTeX system worry about the presentation of these structures. It therefore encourages the separation of layout from content while still allowing manual typesetting adjustments where needed. This is similar to the mechanism by which many word processors allow styles to be defined globally for an entire document or the use of Cascading Style Sheets to style HTML.

It is very convenient to use Encapsulated Postscript graphics files in LaTeX.
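A minimal LaTeX fragment (the figure file name is illustrative) that includes an EPS figure produced by gnuplot might look like this:

```latex
\documentclass{article}
\usepackage{graphicx}   % provides \includegraphics; compile with latex+dvips for EPS
\begin{document}
\begin{figure}
  \centering
  \includegraphics[width=0.8\textwidth]{lv.eps}
  \caption{Light intensity versus current, plotted with gnuplot.}
\end{figure}
\end{document}
```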

1.8. Summary of tools used:

2. Solving an Example Problem.

2.1. Problem description

This is an example related to modeling the properties of an AlGaAs laser. The laser structure already exists and will not be described this time (though the problem is interesting in itself; we also use Perl scripts to create the structure, through the sde interface available in Sentaurus TCAD).

Important laser parameters are, for instance, the threshold current Ith and the slope S = dL/dI, where L is the light intensity emitted by the laser and I is the electrical current flowing through the device. When the current I is small, there is no laser action and practically no light is emitted. Laser action starts at the current Ith, and just above that value the light intensity increases linearly with the current.

We have a large collection of computation results (sometimes these might be thousands of data files). The aim is to analyze these data to find Ith and the slope S = dL/dI.

2.2. Directory structure

Example directory structure and placement of scripts (just a convention we developed, but needed for the scripts to work properly):


The "logs" directory contains a collection of log files for every computation process performed. Sometimes these log files contain very valuable computational results and are analyzed as well (using Perl scripts, for instance).

The "parameters" directory must always exist; it is needed for performing calculations. It contains information about the material parameters of the device under study.

The "mesh" directory contains files that describe the geometrical structure of the device under study, its doping properties, and information about the mesh (the geometry of the set of data points used during the calculations, since the computation is performed on a discrete set of data points).

The "plots" directory contains the results of the computation. Usually, two types of data files are stored there. In this example, we will analyze files with the extension ".plt". These are plain text files with a somewhat complex structure. The program "inspect", which is part of the Synopsys Sentaurus TCAD collection, may be used for viewing these datasets.

Some results may also be stored in files with the extension ".tdr". These contain binary data that can be viewed using "tecplot_sv" from the Synopsys Sentaurus TCAD collection.

2.3. Using perl to control computation process

It is possible to work in batch mode in Sentaurus TCAD. However, we find it more convenient to use Perl scripts to control batch processing and to change calculation parameters. In this particular case, the main script we use to start the computation process is

Have a look at the contents of and 3D_des_template.txt. The script reads the file 3D_des_template.txt and stores its content in the variable $D. The array @myLengths contains a set of laser lengths (in μm) for which we want to perform calculations. We assume that one laser facet (the right one) has a reflection coefficient equal to 1, while for every laser length we want to compute the laser characteristics for a set of left-facet reflectivities, in steps of 0.03, starting from 0.02 up to 0.98.

When is run, it replaces in 3D_des_template.txt the word "_REFLECTIVITY_L_" with the desired reflectivity and "_CavityLength_" with the desired laser length. It also replaces the word "_FILE_" with a combination of numbers used for the names of the files where the plot data and log files are stored; this will let us identify later, during analysis of the data or log files, which parameters were used for a calculation, just by looking at the file names.

The resulting new command file for the calculations is, in this case, 3D_des_mirrors.cmd. Now we perform the computation from within the Perl script by issuing a system command:

	"/usr/synopsys/bin/sdevice 3D_des_mirrors.cmd"

2.4. Using Inspect and Perl to extract the data.

After calculations are done, plots/ directory contains results in a set of .plt and/or .tdr files. We add to plots/ directory the following files:


The first two, extract_template.cmd and extract_template_iv.cmd, are template files written in the inspect scripting language (actually, inspect uses Tcl as its scripting language, with its own additions in the form of procedures). These files will be read by the scripts and, respectively.

By running (on extract_template.cmd), we obtain a set of plain text files in the plots/dat/ directory that contain the dependence of optical power on current (two columns of data separated by a TAB delimiter: current in the first column and light intensity in the second column).

By running (on extract_template_iv.cmd), we obtain a set of plain text files in the plots/ivdat/ directory that contain the dependence of voltage on current: current in the first column and voltage in the second column.

We find it more convenient (also when working with gnuplot) to have the I-V and I-L data in one file instead of two. For that, we create the plots/ivpdat directory and place there the file

After running the above script from within the plots/ivpdat directory, we obtain in the same directory a set of files that contain three columns of data (TAB-separated), with I, V, and L in each column.

2.5. Using Perl to analyze the data.

One possible way to find the position of the lasing threshold, Ith, is to find the maximum of the derivative:

	d(log L)/dI

For that, in the first step, we create files that contain the derivatives, using the script , and after that we use to find the maximum of the derivative for each file.

2.6. Using Gnuplot, Perl and grep to perform linear fitting of the data.

Another way of finding the threshold current is by fitting a linear dependence to L(I) just above the threshold current. This is a somewhat more accurate method than the one described previously.

It is convenient and fast to use gnuplot for that.

However, we cannot do this right away for all the data in a file, since L(I) is linear only above the threshold (and not too far above it). Nor can we use data below the threshold in the fitting procedure.

There are two possible tricks here. The first one (thanks to Макс Русских for the idea) uses grep. Values of L much lower than 1 are saved in exponential decimal notation; for instance, 0.001 will be saved in the data file as 1e-3. Hence, we may use grep with the "-v" option on every file of interest and in that way eliminate all the data below 0.001. This method works well in our case.
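A sketch with an invented data file: lines whose L value is written in exponential notation (i.e. far below 1) are dropped with grep -v before fitting.

```shell
# Hypothetical I-L data; sub-threshold points are printed in e-notation
printf '0.10\t1e-05\n0.20\t1e-03\n0.30\t0.5\n0.40\t1.1\n' > 1000_32.dat
# Keep only the lines without exponential notation, i.e. above threshold
grep -v 'e-' 1000_32.dat > 1000_32_clean.dat
cat 1000_32_clean.dat
```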

Another way is to use gnuplot alone, with the following trick: set the Y-axis range so that it lies above a certain minimal value (0.001 in this example) and below a certain maximal value (2.0 in this example):

	set yrange [0.001:2.0]

Now, when gnuplot is used for fitting, it will perform the fitting procedure only on the data that are within the assigned y-range.

Gnuplot can fit all the files at once, and it will log the fitting details in the "fit.log" file. Therefore, we would like to create a suitable gnuplot file for that.

First, let us define the function f(x) and limit the yrange in fit.plot:

	a=0.2; b=1.0; f(x) = b*(x-a);
	set yrange [0.001:2.0]

The next part of the code may be written manually in a text editor (not such a big deal, really, when you have, say, a few tens of files). For a larger number of files we may use a Perl script like this (a so-called Perl one-liner, which in this case can be run from the shell):

	perl -e 'for ($i=2; $i<=98; $i+=3) { printf "fit f(x) \"1000_%02d.dat\" u 1:3 via a,b\n", $i }' >> fit.plot

That produces fit.plot file with content like this:

	fit f(x) "1000_02.dat" u 1:3 via a,b
	fit f(x) "1000_05.dat" u 1:3 via a,b
	fit f(x) "1000_08.dat" u 1:3 via a,b
	...
	fit f(x) "1000_98.dat" u 1:3 via a,b

And add an exit line at the end of fit.plot:

	exit


Now, we may run that file with gnuplot:

	gnuplot fit.plot

That creates the fit.log file, which contains our fitting parameters.

Now, we may use grep, or write one more Perl script, to parse the fit.log file and extract the desired information. Something like this may be used as a first step:

	grep '^FIT\|^a\|^b' fit.log | grep '+\|FIT' > fit.dat
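For illustration, with a synthetic fit.log excerpt in the style gnuplot writes (the numbers are invented), the pipeline keeps the file-name lines and the fitted parameter lines:

```shell
# A made-up excerpt in the style of gnuplot's fit.log
cat > fit.log <<'EOF'
FIT:    data read from "1000_02.dat" u 1:3
a               = 0.2052           +/- 0.0031       (1.51%)
b               = 1.0430           +/- 0.0122       (1.17%)
EOF
# Keep lines starting with FIT, a, or b; then keep only those with "+" or "FIT"
grep '^FIT\|^a\|^b' fit.log | grep '+\|FIT' > fit.dat
cat fit.dat
```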

3. A few more examples.

3.1. Running Gnuplot from within Perl.

Here are two simple examples of running gnuplot from within Perl scripts: and

And here is another example showing how to use gnuplot from within a Perl script. This time it does not save the script to a file; it communicates directly with gnuplot. The example is not related to lasers.

3.2. Perl script for parsing Sentaurus log files.

 is a simple example script for parsing the Sentaurus log files created during the computation process. The aim was to extract all energy levels in a quantum well from a large collection of files. A quite detailed description is available within the code.

3.3. TCL script for parsing TDR binary data files.

While ".plt" files contain plain text data such as I-V dependencies, the binary ".tdr" files contain spatial (volume) data, for instance the temperature distribution within the modeled device. These can be accessed, for instance, using tecplot_sv, which is a component of the Sentaurus TCAD package, or using "tdx" (also part of the Sentaurus TCAD package). It is, however, often very desirable to have programmatic access to these data. This file, extract.tcl, can be run from a terminal window by issuing the following command:

	tdx -b -tcl extract.tcl

It extracts, in this particular case, the lattice temperature from a large collection of ".tdr" files and saves the results in plain text files suitable for later analysis with gnuplot.

3.4. Using perl to create a device structure and meshing.

In this example, we use the Perl script

When is run, it reads the template SDE file, sde_waveguide_template.cmd, replaces there a set of parameters that determine the device dimensions, doping concentrations, and meshing information, and then runs "sde" to create the actual device. The results are stored in the "mesh/" directory for later calculations.

The sde_waveguide_template.cmd file contains a script in the language used by the sde program (Scheme). One could generate the structure "manually" (interactively) using the GUI. However, if you want to create tens or hundreds of variants of the structure (for instance, to change the waveguide dimensions), using the GUI does not make sense. The scripted method is fast and much less error-prone.

4. What next?

If time allows, I would like to create an interface between Sentaurus and an SQL engine (most likely, I would use the Postgres database server). It would allow converting the data from ".plt" and ".tdr" files into SQL data in a database, and later having powerful SQL access to these data.

There are also many questions about access to the computational process of the Sentaurus engine that I do not yet have answers to.

Join me and others with similar professional inclinations on the Yahoo! Groups discussion forum "Nanouse":

Description of "Nanouse": Discussions on software used in nanotechnology, commercial and open source. How to use it, examples of programming. Connecting with other users and developers: Synopsys Sentaurus TCAD, Comsol Multiphysics, Nextnano, Nemo5, QuantumEspresso, Archimedes, and more... Discussing physical and engineering modeling problems, sharing code and ideas, etc.

To subscribe, send a very short message with a few words about yourself to: