Excessive memory usage - command-line Linux

I keep receiving these errors telling me the conversion is using a massive amount of memory. Is my command wrong, or is there a setting I should use to lower the memory usage? I still want fast conversions, but 600 MB of memory for one conversion… That seems extremely high.

Resource: Virtual Memory Size
Exceeded: 626 > 512 (MB)
Executable: /usr/lib64/libreoffice/program/soffice.bin
Command Line: /usr/lib64/libreoffice/program/soffice.bin --headless --convert-to pdf file/path/here/file-name-here.docx --outdir newpath/pathway_docx

I changed the path and filename from the originals for security reasons but kept the format. The filename does have dashes in it.
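
Is wrapping the command in GNU time the right way to check what one conversion really peaks at? Something like this (paths are placeholders again, and I understand this reports peak resident memory rather than the virtual memory size shown in the alert):

/usr/bin/time -v /usr/lib64/libreoffice/program/soffice.bin --headless --convert-to pdf file/path/here/file-name-here.docx --outdir newpath/pathway_docx

and then read the "Maximum resident set size (kbytes)" line from the output.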

End of question

Start of frustration with this submission system

BTW, on an Android phone, your tags and verification system is extremely bad. It's way out of control. I cannot get past the stupid tags requirement and it's extremely frustrating. It keeps giving me errors: I must include one of the following: writer, Calc, impress, base, draw, math, meta. I DID INCLUDE IT!! This is at least 10 submission attempts. Then after each failed attempt it erases the CAPTCHA, which then asks again in a different language. I have been asked in Spanish, German, and English. I don't speak German, wtf! Right now it's asking me again in Spanish: "No soy un robot" ("I am not a robot"). I am in the United States! Then, when I finally passed it, I copied and pasted the entire error line into the tags and it errored out telling me to please use letters, numbers, and characters… Ahhhhhh (insert pulling-hair-out gif here).

The tag suggestions don't even offer the required tags, except meta. One of the tags I had in there was --convert-to, and that was marked as invalid. It was offered by your own system… I had to remove it to be able to submit.

I finally had to put in every tag instead of just one:
writer Calc impress base draw math meta
and remove all commas, and that let me submit.

Having all those tags makes the tag system completely useless because my post is not about those tags.

Virtual memory size of 626 MB, so what? Don't limit it to a ridiculously low 512 MB.

For tags, one tag, common, and maybe meta, would have been enough.

So a virtual memory size of 626 MB for one conversion is not a bad thing? The virtual memory limit is whatever it is by default; we haven't changed it. What would you suggest the limit be for this setting on the server?

I tried including just writer in the tags and it didn’t work. Error error error… Very frustrating.

I called the hosting company and they said that isn't a hard limit on virtual memory; it's lfd reporting that this process went over that amount of virtual memory. My question still stands: is that too much memory, virtual or not, for one process, one conversion, to use?

If I start modularizing the software and selling it, then we might have 60 conversions running at the same time. 626 MB * 60 is roughly 37 GB, which seems like an awful lot of virtual memory for a conversion process.

Do we need to optimize any settings, or is the command written properly? What could be causing what appears to be excessive usage for one conversion?
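
For what it's worth, the batching I have in mind would look roughly like this - just a sketch, with made-up paths, an arbitrary limit of 4 concurrent conversions, and a throwaway user profile per run, since I understand headless instances can't share one profile:

find /path/to/incoming -name '*.docx' -print0 | \
  xargs -0 -P 4 -I {} sh -c '
    profile="$(mktemp -d)"        # one scratch profile per process
    /usr/lib64/libreoffice/program/soffice.bin --headless \
      "-env:UserInstallation=file://$profile" \
      --convert-to pdf "$1" --outdir /path/to/out
    rm -rf "$profile"
  ' _ {}

Is that a reasonable shape for it, or is there a better way to batch these?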

A running Writer instance (empty document) needs about 240 MB. To that, add the memory needed with the document fully loaded (which it has to be in order to convert it to a different format). This of course entirely depends on the actual document content and the feature set it uses. I wouldn't limit it to less than 1 GB. If you also plan to convert spreadsheet documents, add even more, as they sometimes contain an awful lot of data and formula work.

Thank you for that answer. That helps a lot. We are converting both documents and spreadsheets. I will find out what the virtual memory limit is and then I will also increase the warning limit based on your response. Thank you again very much.
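
In case it helps anyone who finds this later, this is the change I'm planning to ask the host about. I'm assuming the alert comes from csf/lfd process tracking, so the exact option name is worth confirming with them first:

# /etc/csf/csf.conf - per-process memory alert threshold in MB, raised per the advice above
PT_USERMEM = "1024"

# or exempt the LibreOffice binary from process tracking in /etc/csf/csf.pignore
exe:/usr/lib64/libreoffice/program/soffice.bin

# then restart lfd (e.g. systemctl restart lfd) so the change is picked up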