Has something changed in the data explorer? Can't get query via API

This stopped working two days ago for a hosted customer:


  HTTP_STATUS=$(curl -s -o /tmp/discourse_response.json -w "%{http_code}" \
    -X POST "https://HOSTNAME/admin/plugins/discourse-data-explorer/queries/3/run" \
    -H "Content-Type: multipart/form-data" \
    -H "Api-Key: $DISCOURSE_API_KEY" \
    -H "Api-Username: $DISCOURSE_API_USER" \
    -F "limit=ALL")

I see successful use of that key:

It has query-only permissions:

I double-checked that the URL of the query hadn’t changed again, using the approach from Reverse engineer the Discourse API (watching the request the browser makes).

Then I generated a new global key and I’m still getting a 400 error.

I don’t know what the problem could be.

400 or 406 ?

Try adding an Accept: application/json header.
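For reference, a sketch of the original call with that header added (hostname, query id, and credential variables as in the first post; untested against a live site):

```shell
# Same request as the original post, plus an explicit Accept header so the
# server returns a JSON error body rather than a content-negotiation 406.
HTTP_STATUS=$(curl -s -o /tmp/discourse_response.json -w "%{http_code}" \
  -X POST "https://HOSTNAME/admin/plugins/discourse-data-explorer/queries/3/run" \
  -H "Accept: application/json" \
  -H "Api-Key: $DISCOURSE_API_KEY" \
  -H "Api-Username: $DISCOURSE_API_USER" \
  -F "limit=ALL")
# /tmp/discourse_response.json now holds the body, including any error message.
echo "$HTTP_STATUS"
```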

Wait. I found it:

{"errors":["You supplied invalid parameters to the request: limit"],"error_type":"invalid_parameters"}

So the issue is -F "limit=ALL"

So now my question is: why did that stop working?


Thanks very much. But I was unable to use limit=ALL even as an admin with a global key. The client needs to download all of the data (I’m currently unclear what the limit is, or how large the set is likely to be).

That is exactly what the PR aimed to do!

Options are backups, or doing pagination with

select * from table where id > (:page * 5000)

so you get it in chunks of 5k.
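The pagination idea above could be scripted roughly like this (hostname, query id, and credential variables are taken from the original post; the :page parameter name, the params form field, and the jq-based stop condition are assumptions about how the saved query is set up):

```shell
# Download the query results in 5k chunks by passing an increasing :page
# parameter to the saved Data Explorer query. Assumes jq is installed and
# the response body contains a "rows" array.
page=0
while :; do
  status=$(curl -s -o "/tmp/page_${page}.json" -w "%{http_code}" \
    -X POST "https://HOSTNAME/admin/plugins/discourse-data-explorer/queries/3/run" \
    -H "Accept: application/json" \
    -H "Api-Key: $DISCOURSE_API_KEY" \
    -H "Api-Username: $DISCOURSE_API_USER" \
    -F "params={\"page\":\"$page\"}")
  [ "$status" = "200" ] || break
  # Stop once a page comes back empty.
  rows=$(jq '.rows | length' "/tmp/page_${page}.json")
  [ "$rows" -gt 0 ] || break
  page=$((page + 1))
done
```

Each iteration lands in its own /tmp/page_N.json file; concatenating the rows afterwards reassembles the full result set.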

It’s working!

Is the max for a normal query like the one I’m running 5000? If so, I’m definitely safe.

:laughing:

Looks like it is 10k


Thanks very much. Looks like I didn’t need that no-limit anyway. :slight_smile:

I’ve been very pleased that I figured out how to use GitHub Actions to pull this query daily and upload it to a vendor’s FTP site. It’s broken twice in the past couple of weeks. Hopefully this is the last time for a long while!
