I need a CSV export of data that goes beyond 10k rows, but the export is capped at 10k, so not all records end up in the file.
Assuming this line is responsible for the limit, what would be the right way to bump it to maybe 100k or 1M records?
I tried setting DISCOURSE_QUERY_RESULT_MAX_LIMIT = 1000000 in app.yml and that doesn’t seem to have any effect.
merefield (Robert):
Might have to resort to plugin-ing the plugin. Create a small one called x-discourse-data-explorer and it might get evaluated afterwards. Then simply override the module constant?
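A minimal sketch of what such a micro-plugin’s plugin.rb might look like, assuming the limit lives in a module constant named DataExplorer::QUERY_RESULT_MAX_LIMIT (a name inferred from the DISCOURSE_QUERY_RESULT_MAX_LIMIT env var above; verify it against the data-explorer source):

```ruby
# frozen_string_literal: true

# name: x-discourse-data-explorer
# about: Sketch of a tiny plugin that bumps the Data Explorer result row limit
# version: 0.0.1

after_initialize do
  # QUERY_RESULT_MAX_LIMIT is an assumed constant name; check the
  # data-explorer plugin source for the actual module and constant.
  if defined?(::DataExplorer) && ::DataExplorer.const_defined?(:QUERY_RESULT_MAX_LIMIT)
    # remove_const is private on Module, hence the send.
    ::DataExplorer.send(:remove_const, :QUERY_RESULT_MAX_LIMIT)
    ::DataExplorer.const_set(:QUERY_RESULT_MAX_LIMIT, 1_000_000)
  end
end
```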
That’ll be a lot of pluginception; I was hoping there was a simpler way out.
cc @riking
merefield (Robert):
Or fork the plugin, make the change, point to your fork and rebuild? (Why didn’t I respond with that first?)
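For the fork route, the change itself would be little more than bumping the number where the limit is defined; the constant name, default value, and location below are assumptions for illustration, again inferred from the env var mentioned earlier:

```ruby
# In the fork of discourse-data-explorer, edit the constant where it is
# defined (exact file assumed) so the cap reads something like:
module ::DataExplorer
  QUERY_RESULT_MAX_LIMIT = 1_000_000 # raised from the assumed 10_000 default
end
```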
That’s what I was thinking as well. If there is no other (reasonable) way out, forking is the last option.
It would, however, be good to understand the rationale for hard-limiting results to 10k. Maybe something breaks if more than 10k results are fetched?
merefield (Robert):
“discourse-data-explorer-MAX” … I’ll get my coat!
riking (Kane York):
Yes, that module constant is intentionally set up to be easy to override from another plugin.