Manipulating Log Files

I know what I want to do but not how. I have a dump of logs by severity, source and code.
They are in a spreadsheet, and I want to simplify it by grouping like rows and counting how many of each there are, so I can decide how best to go through and investigate each group.

Can I do this in the spreadsheet or do I need to set up a database for something this small?

Thanks

Without seeing the structure of your log files, I’ll take a guess that pivot tables might be a solution.

It is hard to tell what you can do. From your question I’d assume you have no idea where to start.

Also, “something this small” can be misleading. I’ve seen logs with 100,000 entries and more. Simple counting is no problem, but complex filters over 10,000 rows can easily bring a spreadsheet to a halt, while the same things are blazingly fast in a database, if one manages the right SQL command.
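
To make that concrete, here is a rough sketch of the kind of grouping-and-counting query meant here. The table name log_entries and the column names severity, source and code are only placeholders for however your logs end up imported:

```sql
-- Count how many log entries share the same severity/source/code combination.
-- "log_entries" and the column names are placeholders; adapt them to your import.
SELECT "severity", "source", "code", COUNT(*) AS "entries"
FROM "log_entries"
GROUP BY "severity", "source", "code"
ORDER BY COUNT(*) DESC;
```
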
Some pointers to get started in a spreadsheet:

  • There is “Text to Columns” (in the Data menu) to split text at delimiters such as “:”, if it is easier to split the strings than to search inside them.
  • You can sort and filter your data. Take care to select complete rows (or the whole range) so the columns stay in sync.
  • Counting takes a bit of care, as you have to decide whether hidden (filtered-out) rows shall be counted or not. There is a set of functions, such as SUBTOTAL(), that count only the visible data, and you can build your own conditions with COUNTIF()/COUNTIFS().
  • A quick way can be a pivot table: put severity, source and code as row fields and count any column as the data field. Take care to allow drill-down to the underlying data.
  • Often forgotten: you may filter data to another sheet (Standard Filter with “Copy results to…”), for example to keep there only a copy of the “severe” log entries. Working on this smaller set of data can be much more effective, and a proper setup can even refresh it.
  • From the above you may conclude that you can use both: database and spreadsheet. I often drag the result of a query into Calc, so the filtering stays in the database and only the more visual work is done in Calc. (Calc/Base can remember this link, so a refresh is possible. Base can also handle text databases, i.e. csv files, but the SQL on those is quite limited.) A small example of such a query follows below the list.
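
For instance — again with made-up table and column names, and keeping in mind that the SQL on a csv/text database is limited — a simple query like the following could feed a Calc sheet with only the severe entries:

```sql
-- Pull only the "severe" entries; the result set can be dragged into Calc
-- and refreshed later. All names and the 'SEVERE' value are placeholders.
SELECT *
FROM "log_entries"
WHERE "severity" = 'SEVERE'
ORDER BY "source", "code";
```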