High memory use while converting a file

I use LibreOffice to convert a PPTX file to PDF under CentOS.
The PPTX is 236 MB in size, and LibreOffice is 7.6.4.1.
While converting it to PDF, memory usage is about 3 GB.
Why does it use so much memory?

The command is:
libreoffice --headless --invisible --convert-to pdf:writer_pdf_Export /root/2023122613_bojy8wjch9wyy_lv0.pptx --outdir /root

A side remark.

Why is this strange thing so popular? Exporting to PDF uses a separate export filter for each component; Impress (which handles PPTX) uses impress_pdf_Export. But explicitly specifying the export filter is only needed when there is an ambiguity (different filters may export to the same format, like HTML, for which we have several filters), or when you need to pass the filter’s options. You have neither condition; why not simply use --convert-to pdf?
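For example (a sketch assuming the same file and output directory as in your command), the plain form would be just:

libreoffice --headless --convert-to pdf /root/2023122613_bojy8wjch9wyy_lv0.pptx --outdir /root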


If you are really interested, start with unpacking the file, as both office suites use packed (ZIP) files on disk. Then check the contents; a sketch of how to do that is below.
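For example (a minimal sketch, assuming unzip is installed and that the package is a PPTX, where embedded media usually ends up under ppt/media/):

# unpack the presentation into a scratch directory and see what dominates the size
mkdir /tmp/pptx-check && cd /tmp/pptx-check
unzip -q /root/2023122613_bojy8wjch9wyy_lv0.pptx
du -sh ppt/media          # total size of embedded media
ls -lS ppt/media | head   # largest individual files first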

Note that such a size most probably indicates images (and other media) in the document package (well, it’s a presentation, who would expect anything different). And usually the images are already compressed, like JPG/PNG. Inspecting the package will not reveal a significant difference between the package size and the size of its contents; only taking the expanded size of the packed images (e.g., JPEG → BMP) into account may give an adequate answer…
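As a rough sketch of that estimate (assuming ImageMagick’s identify is available, and assuming 4 bytes per pixel for an uncompressed RGBA bitmap; non-image files under ppt/media/ will simply be reported as errors and skipped):

# decompressed size per image ≈ width * height * 4 bytes
identify -format "%f: %wx%h -> %[fx:w*h*4/1048576] MB uncompressed\n" ppt/media/*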


Thank you.
I have tried both impress_pdf_Export and --convert-to pdf;
the situation is almost the same.

Do you expect a technical answer?
It uses that much memory because its algorithms are implemented in a way that uses that much memory. It is not even known whether this is something to improve, or whether it is OK; that simply can’t be said without a sample file. But anyway, unless someone is personally interested in minimizing memory use, there’s no point in focusing on that specifically in development; correctness and performance are much more important…

Thank you for your reply.