
Basically, I have a project in mind that collects newly updated data from a website, takes its URL, and updates my site. For instance, when a new book chapter is published, it detects it, gets the URL, and adds that URL to a "chapters" category I have created on my site. Is it actually possible to do something like that? (I don't have access to the JSON files of the websites I'm taking the data from.)

  • Welcome to Stack Overflow. Please take the [tour] to learn how Stack Overflow works and read [ask] on how to improve the quality of your question. Then check the [help/on-topic] to see which questions are on-topic on this site. Please see: [Why is “Is it possible to…” a poorly worded question?](https://softwareengineering.meta.stackexchange.com/q/7273). Please show the attempts you have tried and the problems/error messages you get from them. – Progman May 28 '22 at 19:00
  • If you want all the links on a page you can get them using JS: `n=$$('a');for(u in n)console.log(n[u].href)` Store them as strings in an array and compare them at intervals to check if something has changed. See [How to get the difference between two arrays in JavaScript](https://stackoverflow.com/questions/1187518/how-to-get-the-difference-between-two-arrays-in-javascript). – tangel May 28 '22 at 20:27
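The comment above sketches the idea from the browser console; a minimal server-side version of the same approach (poll the page, extract links, diff against what was seen before) might look like the sketch below. It assumes Node 18+ for the built-in `fetch`; the page URL and the `addChapterToSite()` hook are hypothetical placeholders for the source site and your own site's API.

```javascript
// Minimal polling sketch (Node 18+, built-in fetch).
// CHAPTERS_PAGE and addChapterToSite() are placeholders -- adapt to your sites.
const CHAPTERS_PAGE = 'https://example.com/chapters'; // hypothetical source page
const seen = new Set();                               // URLs already processed

async function checkForNewChapters() {
  const html = await (await fetch(CHAPTERS_PAGE)).text();

  // Crude link extraction with a regex; a real HTML parser is more robust.
  const links = [...html.matchAll(/href="([^"]+)"/g)].map(m => m[1]);

  for (const url of links) {
    if (!seen.has(url)) {
      seen.add(url);
      console.log('New chapter found:', url);
      // addChapterToSite(url); // hypothetical call into your own site's API/CMS
    }
  }
}

// Compare at intervals, as suggested above (every 10 minutes here).
setInterval(checkForNewChapters, 10 * 60 * 1000);
checkForNewChapters();
```

Note that on the first run every link counts as "new"; persisting the `seen` set (to a file or database) between runs would avoid re-adding existing chapters after a restart.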

0 Answers