[R] getting data from a webpage
glennmschultz at me.com
Mon Dec 19 16:01:44 CET 2016
I was getting swap rate data from the St. Louis Fed FRED database via the FRED API. ICE stopped reporting to FRED, so now I must get the data from the ICE website. I would like to use httr to get the data, but I really don't know much about website design. I think the form redirects, but I am not sure that is the case, much less how to identify which page the form redirects to. I used the browser developer tools and "inspect element" to come up with the code below, which failed miserably. In addition, I purchased the book Automated Data Collection with R, which has not been too useful in helping me understand how to navigate pages that use forms and redirects.
Can anyone provide a good reference for understanding how to get data from websites that use forms and redirects? Specifically:
How to find the actual page to which one must submit the POST request.
How to find the redirected page that actually has the data.
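For the first question, one approach (just a sketch using rvest, and it assumes the form is present in the page HTML rather than built by JavaScript; the URL below is the one from my code and may not be where the form actually lives) is to parse the form and read off the URL it submits to:

```r
library(rvest)

# Read the page that contains the form and list the forms on it.
# Each form object records the submission URL (the form's "action")
# and the named fields the server expects in the POST body.
pg    <- read_html("https://www.theice.com/marketdata/reports/180")
forms <- html_form(pg)

# Printing a form shows its method, its submission URL, and its fields
forms[[1]]
```

If the form is constructed by JavaScript, it will not appear here, and the browser's network tab is the better tool for finding the real POST endpoint.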
library(httr)

# get initial cookies by visiting the site once with a shared handle,
# so later requests reuse the same session
h <- handle("https://www.theice.com/")
GET(handle = h)

# submit the report form (this is the request that failed)
POST(url = "https://www.theice.com/marketdata/reports/180",
     body = list(reportDate = "15-Dec-2016",
                 SeriesNameAnRunCode_chosen = "USD Rates 1100"),
     encode = "form",
     handle = h)

# fetch the page that should hold the historical rates
page <- GET(url = "https://www.theice.com/marketdata/reports/icebenchmarkadmin/ISDAFIXHistoricalRates.shtml",
            handle = h)
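For the second question, one way to see whether the POST is being redirected (a sketch only, not tested against the ICE site) is to tell curl not to follow redirects and then inspect the status code and Location header yourself:

```r
library(httr)

# Re-issue the POST, but with followlocation disabled so httr returns
# the redirect response itself instead of silently following it
resp <- POST(url = "https://www.theice.com/marketdata/reports/180",
             body = list(reportDate = "15-Dec-2016",
                         SeriesNameAnRunCode_chosen = "USD Rates 1100"),
             encode = "form",
             config = config(followlocation = 0L))

status_code(resp)        # 301, 302, or 303 means the server redirected
headers(resp)$location   # the page the form redirects to, if any
```

When redirects are followed normally, `resp$url` on the final response also shows the URL you actually ended up at.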