# MATLAB Scripting and LaTeX

MATLAB is a powerful environment for numerical computation. Consequently, it is often used in academia and industry to quickly run simulations, test models, perform matrix computations, and visualize data. It provides toolboxes for signal processing, neural networks, curve fitting, and more.

Because MATLAB is such an easy-to-use and powerful tool, it is no surprise that we want to use it for data processing and then export the results for plotting in LaTeX. This way, we do not need to hard-code values in our `pgfplots` plots (a boring and cumbersome task). Importing and exporting data in CSV form is very easy in both MATLAB and LaTeX, which is why we use that format. By saving our data in an intermediate format, we make our LaTeX plot dynamic: if the raw data changes (more measurement points, corrected measurements), the values in LaTeX update *automatically* after rerunning our script.

This is not a tutorial on the MATLAB environment or programming language. Readers are expected to have a basic understanding of MATLAB and LaTeX.

We will now illustrate how to import data into MATLAB, process it, and export it again for plotting in `pgfplots` (LaTeX). The entire thought process behind the experiment is discussed below.

## Collect the Data

The first and by far most important step is data collection. Good data is fundamental in research: without it, you can draw neither good conclusions nor good plots, no matter how pleasing the plots look. So first of all, we need to spend time collecting correct data and saving it in a consistent, structured way. In the following, we always assume data is saved in the CSV format: individual values are separated by commas and the rest is plain ASCII text. Such files are easily read by many programs and can be viewed (and manipulated) with a plain text editor such as nano on the CLI, Notepad, or TextEdit. You can even use MS Excel to enter the values and then save them as CSV.
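To make the format concrete, a hypothetical CSV file might look like the fragment below. The column names and all values are made up purely to illustrate the structure, not taken from the actual measurements:

```csv
frequency,v_in,v_out
100,1.00,0.99
1000,1.00,0.97
10000,1.00,0.80
```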

After collecting the data, we usually must process it before we can plot something useful. We will illustrate this process of reading, processing, and exporting values with a simple example that covers the most important aspects.

Consider the circuit depicted in the figure below (impedance values for reference only). We want to determine its transfer function (from the figure we immediately see it is a low-pass filter). Because we do not want to measure too many points and the frequency axis is typically divided logarithmically, we measure three points per decade. Thus, we apply a sinusoidal voltage at point 1 and measure the response at the output (point 2). The measurement data is displayed below.

| element | value |
|---|---|
| R_1 | 971Ω ± 1Ω |
| R_2 | 555kΩ ± 1kΩ |
| L | 4.71mH ± 0.01mH |
| C_1 | 1.467nF ± 0.001nF |
| C_2 | 4.55nF ± 0.01nF |
| C_3 | 19.2pF ± 0.1pF |

Data processing, in this case, is limited to the calculation of the transfer function (voltage ratio and phase). This is exactly what this script is doing:
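The original script is not reproduced here; the following is a minimal MATLAB sketch of the described processing. The file name, variable names, and the CSV column layout (frequency, input amplitude, output amplitude, phase) are all assumptions for illustration:

```matlab
% Sketch: read the measurements and compute the transfer function.
% Assumed columns: frequency, input amplitude, output amplitude, phase.
data = csvread('measurements.csv');   % plain numeric CSV, no header row

f     = data(:, 1);                   % frequency in Hz
v_in  = data(:, 2);                   % input amplitude in V
v_out = data(:, 3);                   % output amplitude in V
phase = data(:, 4);                   % measured phase shift in degrees

% Transfer function magnitude as a voltage ratio, converted to dB
H_dB = 20 * log10(v_out ./ v_in);

% Export the processed data for plotting (header row written manually)
fid = fopen('transfer_function.csv', 'w');
fprintf(fid, 'f,HdB,phase\n');
fprintf(fid, '%g,%g,%g\n', [f, H_dB, phase]');
fclose(fid);
```

Because the exported file again has a simple CSV layout, rerunning this script after adding measurement points regenerates all derived values in one step.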

The CSV file looks like this (this is just the first part of the measurement data):

It is obvious that an interesting phenomenon occurs between 50kHz and 100kHz. Now suppose we ignored the effect at 75kHz (e.g. dismissing it as a measurement error, not displayed in the tables) and only considered the low-pass effect. But later someone (read: a fellow student or colleague) notices the unusual phenomenon and decides it should be investigated further. We end up with additional measurement data that needs processing and plotting. Because we used an automated script, we do not need to recompute the transfer function manually; we just rerun the script.

If we write a script, we only have to enter the new measurement points in our CSV file and that's it. Scripting thus takes a bit more work to get going in the beginning, but can reduce the workload drastically later on (especially in more complicated setups). The result is depicted in the figure below.

We now have enough data to plot our final transfer function. This can be done with `pgfplots` from a CSV file. However, we will not go into detail about how to do this, since it is already described here.

## Plot and Export Data in MATLAB

Sometimes it is not possible to plot data directly in `pgfplots` in an easy way, for example when the number of data points becomes very large (more than 500). In that case the `pgfplots` buffer is too small (`pgfplots` typically needs more than 500MB) and compilation fails. One way to fix this is to increase the buffer size, but this is not trivial and is system dependent. As a consequence, we will not plot this data in `pgfplots`, but directly in MATLAB. The MATLAB plot is then exported to a common file format (PNG, PDF) and included in our LaTeX file.

Plotting data in MATLAB is very easy; the `plot` command (or a variant) is usually sufficient. An example is illustrated for the transfer function (Bode plot) below:
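A minimal sketch of such a plot, assuming the processed data is available in vectors `f` (frequency in Hz) and `H_dB` (magnitude in dB; both names are hypothetical), might look like:

```matlab
% Sketch: Bode magnitude plot on a logarithmic frequency axis.
% f and H_dB are assumed to come from the processing step.
tf_fig = figure;                 % keep an explicit handle for the export later
semilogx(f, H_dB, '-o');         % logarithmic frequency axis, marked data points
grid on;
xlabel('Frequency (Hz)');
ylabel('|H| (dB)');
title('Transfer function of the low-pass filter');
```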

The final thing to do is to write this plot to a file which can be used in LaTeX. Possible formats are PDF, PNG, and EPS. Personally, I prefer PDF because it is vector-based (in contrast to PNG, which is pixel-based and thus becomes blurry when zoomed in) and easily produced.

Writing the plot to a file is done by printing to a file. To capture only the plot, we first determine the plot size and set the size units. Next, we extract the plot's location on the screen. All this is needed because we want only the plot, without too much whitespace at the edges (there always is some). Finally, we print the plot to a file named `transfer_function_figure.pdf`. Notice that we explicitly use the figure handle `tf_fig`: this makes our code more robust (as opposed to using `gcf`) should we add an additional figure to the script.
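A sketch of these steps, assuming the figure handle `tf_fig` from the plotting code (the exact property values are illustrative):

```matlab
% Sketch: match the paper size to the on-screen figure size and print to PDF.
% tf_fig is the figure handle created when plotting.
set(tf_fig, 'Units', 'centimeters');
pos = get(tf_fig, 'Position');                 % [left bottom width height]
set(tf_fig, 'PaperUnits', 'centimeters', ...
            'PaperSize', pos(3:4), ...         % paper exactly as large as the plot
            'PaperPosition', [0 0 pos(3:4)]);  % no extra whitespace at the edges
print(tf_fig, '-dpdf', 'transfer_function_figure.pdf');
```

Setting `PaperSize` and `PaperPosition` together is what removes the default margins; without it, `print -dpdf` centers the plot on a full A4/letter page.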

## Source Code

All source code used to create this page, including measurement data, is contained in the MATLAB file (`.m`) and CSV file (`.csv`).