
I have a friend who is writing a rather long document. By now it has about 130 (PDF) pages. He uses TeXworks on Windows, and compiling the document sometimes takes 2 to 3 minutes.

I'm not asking whether this time is normal for such a long file. I know it depends on many things, and every PC will produce different times depending on the CPU etc.

But I was wondering which constructs in a TeX file take an especially long time if a document contains many of them.

For example, I could imagine that including many images takes a lot of time, since every external file has to be loaded into the document (assuming the draft option is not enabled).
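To make clear what I mean, here is a minimal sketch of the draft option I'm referring to (the file name image is just a placeholder):

```latex
% Sketch: with the global draft option, \includegraphics only draws
% a frame containing the file name instead of embedding the image,
% which can speed up test compilations.
\documentclass[draft]{article}
\usepackage{graphicx}
\begin{document}
\includegraphics[width=0.5\textwidth]{image}% placeholder file name
\end{document}
```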

He also uses \mbox very often in his document to prevent hyphenation. Would a lot of those increase the time the document takes to compile?

As a last example: he didn't know about the openright option and therefore used \newpage everywhere he wanted a chapter to open on a right-hand page. Does this cause extra compile time?
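To illustrate, a sketch of the option I mean (as far as I understand it, openright in two-sided book-like classes automatically pushes each chapter to the next odd page, which is what he emulates by hand with \newpage):

```latex
% Sketch: with twoside and openright, each \chapter automatically
% starts on the next right-hand (odd-numbered) page.
\documentclass[twoside,openright]{report}
\begin{document}
\chapter{One}
Some text.
\chapter{Two}% begins on a right-hand page without a manual \newpage
\end{document}
```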

I hope my question is clear enough.

Edit: One more thing I saw in his document. The very first line is \documentclass[all]{...}. I didn't know the option all existed. Does it do what I imagine and actually enable every possible option? But isn't that contradictory, for example for parskip and halfparskip? Would this hurt performance? I imagine it to be like import * in Java.

ap0
  • PNG images can slow down compilation. – egreg Oct 02 '14 at 11:16
  • I've found that a slow computer contributes significantly to long compilation times ;^) (I'm only half joking. Sometimes, upgrading a computer is the proper course of action) – Steven B. Segletes Oct 02 '14 at 11:17
  • egreg, compared to, for example, JPG images? – ap0 Oct 02 '14 at 11:17
  • Steven B., I realise that and I understand. But I am more interested in possibly misused constructs in LaTeX itself that cause an extra long compile time. – ap0 Oct 02 '14 at 11:19
  •
    Complicated commands that do a lot of behind the scenes stuff will slow down the compilation. For example, sorting data with datatool or similarly using TeX to internally sort the glossaries with the glossaries package (rather than using makeindex or xindy to do the sorting) or complicated picture-drawing code, such as tikz. Regarding all, that's either specific to the particular document class being used or it gets picked up by one of the loaded packages. – Nicola Talbot Oct 02 '14 at 11:44
  •
    Comparing the compilation time from TeXworks and from the command line may give a good hint as to what slows down the process. – Przemysław Scherwentke Oct 02 '14 at 12:03
  • Is it an image-intensive document, or does it have too many macros? A complicated header or a very special page layout often increases the overhead per page. You can compile a 10-page document, then a 20-page one, and see whether the compile time extrapolates to 130 pages; that tells you whether something is inherently slow or whether things get intensive later in the document, such as custom plots, drawings, or images. I don't know TeXworks, but in TeXnicCenter the log is updated on the fly, so you can see the complicated pages taking more time. – percusse Oct 02 '14 at 12:05
  • percusse, the document uses different margins, not the defaults. Also, the headings are to the left/right of the text, not above it. He also uses his own defined environments for definitions. I think they are colored boxes. – ap0 Oct 02 '14 at 12:19
  • the standard classes don't have an all option, the class you are using may have, but I wouldn't want to guess what it does. \def\z{\z}\z slows down things quite a bit. – David Carlisle Oct 02 '14 at 12:29
  • One thing I have encountered is the use of the microtype package as per Disable microtype for a portion of a document. So, if microtype is being used try commenting that package out and see if that changes the compile times. – Peter Grill Oct 02 '14 at 16:06
  •
    No computer is powerful enough for a user who doesn't think "keep it simple" until he ends up with a snail-paced compilation. The severity of the problem is often proportional to the kilometers of preamble. – Fran Oct 02 '14 at 23:41
  • @PrzemysławScherwentke Why would comparing the time to compile in TW with the time to compile at the command line be informative? Are you suggesting that the choice of editor is slowing compilation? – cfr Oct 03 '14 at 02:38
  • @cfr Certainly. Compiling from the command line on a Pentium III 500 MHz can be faster than from a graphical editor on an i3 2.1 GHz. – Przemysław Scherwentke Oct 03 '14 at 07:27
  • @PrzemysławScherwentke But all the editor does is execute the command you run on the command line. You might see a difference if different volumes of information are printed as output, but the editor is not doing anything different to compile. – cfr Oct 03 '14 at 14:34

1 Answer


I understand that math can take longer than straight text, due to the many additional considerations that are involved. Including images in formats other than PDF or MPS will also increase compile times, because these need to be processed rather than simply dropped into the output.

But TeX is a typesetting engine, and while it's Turing-complete, things that require a lot of processing aren't going to be particularly efficient when we ask TeX to do them. The example that springs most immediately to mind (due to its recent release) is the qrcode package, which uses TeX to convert strings of text into binary data and then builds graphical QR codes from them. TeX is capable of this sort of thing, but wasn't designed for it; so it takes a bit longer to process a QR code than it does simple text. (Not much longer in this case; qrcode appears to be quite excellent. But still.)

As also mentioned in the comments, extensive drawing in TeX will likewise take more time. tikz and friends suffer from this, though again, the efficiency of these packages is remarkable given how far they stretch TeX beyond its original bounds. shapepar, though it uses TeX for its intended purpose, coerces TeX into typesetting paragraphs in odd and creative shapes, and has to run repeated tests to determine how best to fit given texts into those shapes; as a result, it often takes longer to run than simply typesetting text in normal, rectangular paragraphs would.
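If drawing time dominates, one common mitigation (a sketch, not something from the question itself) is the TikZ external library, which compiles each picture to a standalone PDF once and reuses it on subsequent runs:

```latex
% Sketch: cache TikZ pictures as standalone PDFs. Requires running
% pdflatex with -shell-escape so the sub-compilations can be spawned;
% the tikzcache/ directory must exist.
\documentclass{article}
\usepackage{tikz}
\usetikzlibrary{external}
\tikzexternalize[prefix=tikzcache/]% cached PDFs land in ./tikzcache/
\begin{document}
\begin{tikzpicture}
  \draw[thick] (0,0) circle (1cm);
\end{tikzpicture}
\end{document}
```

On later runs the picture is only recompiled if its code changes, so documents heavy in drawings can see their per-run compile time drop substantially.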

The bottom line is that TeX was designed for breaking lines into paragraphs, paragraphs into pages, and formulas into nicely typeset units; anything other than that is going to be slower.

dgoodmaniii