Big spreadsheet - how much RAM?

I have a spreadsheet of two sheets, each with 400k rows, and around 10 columns. Around 75MB.

It started as a CSV so I’m confident there’s no formatting.

Copying and pasting a column of formulas uses 100% of my CPU. It takes 2-3 minutes to copy one column at a time, and I have to force-quit when I ask it to do more than that.

What sort of RAM/storage/etc do I need to work with something this size?
(I’m not techie so I don’t actually know what the limiting factor is likely to be)

macOS 12.7
1.6 GHz Dual-Core Intel Core i5
8 GB 1600 MHz DDR3

Thanks.

Which version of LibreOffice do you use?

Difficult to say in general.

Try with 4k, 40k, … rows to get your own idea of how it scales.

some other ideas …

What is the purpose of loading database data into a spreadsheet? If it is a one-off job, import, sort, filter, format, analyse and throw away the source, you can live with the shortcomings and delays.
If this data set is expanding and used on a daily basis, you should not store it on the grid of what is essentially an arithmetic calculator.

The problem is, I can’t: LO is literally unusable with a file this size.

Ok, I’ve had a similar comment before. Makes sense, I guess; I just don’t know what the alternative is.

It’s 24.8.2.1

So I look at the RAM requirements for 4000 rows and then multiply by 400?

You cannot calculate the memory requirement from the number of rows alone.
The decisive factor is the number of characters in each row.
1 character corresponds to 1 byte (8 bits).
Add an overhead of approx. 10%.
Now you can do the math.
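
A minimal sketch of that arithmetic in Python, using the row and sheet counts from the original post; the characters-per-row figure is only an assumption to make the example concrete:

```python
# Back-of-the-envelope estimate of the raw cell data:
# characters × 1 byte, plus ~10% overhead.
rows_per_sheet = 400_000
sheets = 2
chars_per_row = 100        # assumption: ~10 columns of ~10 characters each

data_bytes = rows_per_sheet * sheets * chars_per_row
estimate_bytes = data_bytes * 1.10   # add ~10% overhead

print(f"Estimated raw data: {estimate_bytes / 1024**2:.0f} MiB")
# With these assumed numbers: roughly 84 MiB, in the same range as the 75 MB file.
```

Keep in mind this only estimates the raw cell text; Calc's working memory while recalculating formulas over 400k rows will typically be considerably higher.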


Developing a small database is the one and only alternative. See Why does LibreOffice Calc take so long to save changes to a file?
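
If a small database is the route you take, one low-effort option is SQLite, which needs no server. A minimal sketch in Python, assuming your source is a file called data.csv with a header row (the file name, table name and column handling are placeholders, not anything from your setup):

```python
import csv
import sqlite3

# Import a large CSV into a SQLite database file instead of a spreadsheet.
con = sqlite3.connect("data.db")

with open("data.csv", newline="", encoding="utf-8") as f:
    reader = csv.reader(f)
    header = next(reader)                     # first row = column names
    cols = ", ".join(f'"{c}"' for c in header)
    marks = ", ".join("?" for _ in header)
    con.execute(f'CREATE TABLE IF NOT EXISTS data ({cols})')
    con.executemany(f'INSERT INTO data VALUES ({marks})', reader)

con.commit()

# Filtering and aggregation then happen in SQL instead of 400k rows of formulas.
for row in con.execute("SELECT COUNT(*) FROM data"):
    print(row)

con.close()
```

LibreOffice Base can also connect to external databases if you still want a familiar front end for browsing and querying the data.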