Import data from a URL into LibreOffice

The built-in function in LibreOffice doesn’t seem to work very well, so I wonder if there are any useful tools for this. I found something called Windmill, but unfortunately it isn’t free. Perhaps it’s possible to use the script on the page or something?

I am looking for something like this:

This would allow using the actual script on the page. I also need a document that can update along with the page.

In one cell, use WEBSERVICE to load an XML page, and in other cells use FILTERXML to extract specific data from that cell. For example:
A1: =WEBSERVICE("https://example.com/")
A2: =FILTERXML(A1;...)
Replace ... with your XPath expression.
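For anyone who wants to see what these two functions do conceptually, here is a rough Python sketch. The XML payload, element names, and XPath are made up for illustration; in Calc, WEBSERVICE fetches the text and FILTERXML applies the XPath to it.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML payload, standing in for what WEBSERVICE("https://example.com/")
# would return as text in cell A1.
xml_text = """<quotes>
  <quote symbol="FTSE"><last>7396.3</last></quote>
  <quote symbol="USA500"><last>3904.20</last></quote>
</quotes>"""

# FILTERXML applies an XPath expression to that text; ElementTree's
# limited XPath support does the same for simple paths like "quote/last".
root = ET.fromstring(xml_text)
last_prices = [q.find("last").text for q in root.findall("quote")]
print(last_prices)  # ['7396.3', '3904.20']
```

The Calc equivalent of the extraction step would be something like `=FILTERXML(A1;"//quote/last")`, with the actual XPath depending on the real page structure.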

Hi!

Please elaborate on that. So in A1 I put the web page URL?
And in A2 I put some element I can find in the script of the page?

Can this method pick up new information that might show up on the page, by the way?

This: A1: =WEBSERVICE("https://example.com/")

Returns “name” in cell A1, by the way.

At the end of “How to use WEBSERVICE and FILTERXML calc functions?” there is an example.


With your help I managed to fix this, so I can now import 3 columns from the web page I wanted data from. They also fit perfectly in my sheet. Thanks for the excellent documentation.

There is still a problem, however: the data on the web page comes from a table with a limited set of rows, where old data disappears as the table fills up from above. The same thing happens in my worksheet, and this is not good, as I would like to save all the data.

If there is a way to copy the data I import to other rows, where rows can be added indefinitely, that would be great. This could happen in the same worksheet, but it would be even better if the data could be transferred from one document to another.

Regardless of the method, this should happen in real time.

Finally, it would be great if the data could fill up from top to bottom. In the table I import data from, it is the other way around. This should also happen automatically in real time; reversing the rows after the data has been gathered into a document is not primarily what I am looking for (and I know how to do that).

Another thing I would like to bring up is that there is a 15-minute delay when the data is gathered. Does anyone know a common reason why this happens when importing data continuously?

Maybe it’s better to narrow the question down to this:

I managed to import a chart from an online source. There is still a problem, however: the data from the web page comes from a “chart” with a limited set of rows, where old data disappears as the chart fills up from above. This is normal for a lot of online charts displaying stock data.

Naturally, the chart looks the same in my worksheet as it does in the online source. This is not good, as I would like to save the data. So now I am looking for a solution that would let all the data be saved in a document.

If possible, consider editing your last post and including an upload of your current spreadsheet (ODS).
Also, you might use the Comment icon rather than Suggest a solution or Reply to avoid posting comments as solutions.

Yes, I can see how there could be some confusion here! It’s just a normal spreadsheet with a number of columns where the data is imported from an external source. It looks completely normal.

But because the URL links to a table where older data is deleted, the same thing happens in my spreadsheet as well. There should be some way to save the data? Maybe it is possible to copy all the data that arrives in a particular row to another part of the spreadsheet?

Normally a rule like this can be repeated, and that would mean the data could fill up from top to bottom in the new columns.

I am also considering importing data from one document to another. The premise here is that the source would be a document that is open and updating continuously. Is this possible?

This ability to import from an updating document would be extremely useful, as there is a lot of data from various sources, online as well as on my computer, that should go into one document.

To clarify, you would like to:

  1. Download information from the web service.
  2. Compare that information to existing local information. For example, compare values by date.
  3. If there is a new entry in the new data, then append that new data to the existing local data.
  4. Repeat this each time you use the spreadsheet.

Is that what you are looking for?

Also: Are data ever updated? Would there ever be a case where, say, you have the same date for the local data and the new data and the new data values need to replace the old data values?
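The steps above (download, compare by key, append only new entries) can be sketched in Python. The date keys and values here are invented for illustration; the real match key would depend on what uniquely identifies a row in the actual feed.

```python
# Existing local log and a freshly downloaded batch; the first element
# of each tuple is the assumed match key (a date string).
existing = [("2023-01-02", 1997.63), ("2023-01-03", 1999.13)]
incoming = [("2023-01-03", 1999.13), ("2023-01-04", 11.07)]

# Append only rows whose key is not already present locally.
seen = {row[0] for row in existing}
for row in incoming:
    if row[0] not in seen:
        existing.append(row)
        seen.add(row[0])

print(existing)
# [('2023-01-02', 1997.63), ('2023-01-03', 1999.13), ('2023-01-04', 11.07)]
```

In Calc this kind of logic would typically live in a Basic macro triggered on document open or on a timer, rather than in cell formulas.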

Thanks for your engagement

  1. Yes, that part is working and everything is formatted correctly.
  2. That could be an example (see below).
  3. In some cases, yes.
  4. Yes, and the import function from the web page runs automatically every time I open my document.

The data updates all the time, and it is time-specific as well. Hence, every row is treated as a unique piece of information and should go into the document. No information is ever replaced in this first “collection phase”.

This can describe the principle (although some data is “fixed”): data is displayed in rows like this:

1997.631999.1311.070.5611833.411838.42.80.024048.814055.880.30.57FTSE 7396.37400.39.20.131274.931284.736.10.12USA500 3904.203905.703.990.1
0.5617871301929292,…2232…232323
1997.631999.1311.070.5611833.411838.42.80.024048.814055.880.30.57FTSE 7396.37400.39.20.131274.931284.736.10.12USA50

Later, the data is displayed like this:

7396.37400.39.20.131274.931284.736.10.12USA500 3904.203905.703.990.1
0.5617871301929292,…2232…232323
1997.631999.1311.070.5611833.411838.42.80.024048.814055.880.30.57FTSE
7396.37400.39.20.131274.931284.736.10.12USA50
8989.0909.2323232…SE.I0909Q0W9

As you can see, the top row has disappeared, the second row has become the top row, and a new row is now displayed at the bottom.

What I would like to do is save all this data in a sheet. How can this be done? Perhaps Calc can save every piece of data that comes up in the first row to another row? Or perhaps the data can be continuously sent to another spreadsheet? I have no idea, really.
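One common way to keep data that a volatile range overwrites is an append-only log. A minimal Python sketch of the idea, with the file replaced by an in-memory buffer and the values taken from the example rows above:

```python
import csv
import io

# Stand-in for open("history.csv", "a", newline=""); the file name and
# the idea of logging each refresh are assumptions, not Calc behaviour.
log = io.StringIO()
writer = csv.writer(log)

# Each time the imported range refreshes, the current top row would be
# appended to the log before it scrolls out of the source table.
top_row = ["1997.63", "1999.13", "11.07", "0.56"]
writer.writerow(top_row)

print(log.getvalue().strip())  # 1997.63,1999.13,11.07,0.56
```

In LibreOffice itself, the equivalent would be a Basic macro that copies the top row of the imported range into the next free row of a second sheet whenever the range recalculates.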

Another barrage of questions:

  1. How much data is being delivered? 10 rows a day? 100? How much of that will typically be new? That is, how many rows are likely to repeat each time the web service is queried? Is there likely 1 new row at the end of 10, or 9 new rows out of 20?
  2. How many days need to be kept in the same place? Can one sheet or file store all the data for a calendar year, then archive that file and start fresh the next year? Or each month?
  3. Is it safe to assume that if a row repeats then it is the same row as before, or is there a reasonable chance that the same information might actually be repeated but represent the results of another point in time? Obviously, what I’m looking at is how a person can match rows.
  4. Will there ever be gaps…where your collection is missing chunks of data. For example, you don’t attend to this because you are on vacation. Do you need to indicate those gaps somehow?
  5. Do you have any experience with Base (databases)?

Finally, does the web service have an API (or RESTful API) that can be accessed via Google Scripts? I’ll be honest, if it does and you have a Google account and are up to some scripting, then dropping the information into a Google Sheets document via an automatic nightly batch job might be a better choice than LibreOffice.