Data explorer query to list the longest "estimated read time" topics?

Hi everyone,

Is it possible to create a data explorer query which can list the “top X” topics by “estimated read time”?

I’d love to know which topics on our Discourse are the most time consuming to read :smiley:

(And off topic, further to this post by @simon, I can’t seem to add a data-explorer tag to this post?)

2 Likes

I think the approach I suggested in that topic needs to be improved. One problem with it is that only TL3 and above users can tag posts on Meta. That means that the majority of the site’s users wouldn’t be able to follow my instructions. The other issue is that we’ll end up with both unanswered topics and answered topics having the data-explorer tag. That won’t help much with searching for queries.

1 Like

Sorry for the late answer. I got caught up in the question about how to organize Data Explorer queries on the site. Using the data-explorer tag seems like the ideal solution, but topics that contain a Data Explorer query will need to be tagged by a user with TL3 status.

I think that something like the following query will give you the information you’re looking for:

SELECT
    topic_id,
    category_id,
    SUM(total_msecs_viewed) / 60000 AS estimated_minutes_read
FROM topic_users tu
JOIN topics t ON t.id = tu.topic_id
WHERE t.deleted_at IS NULL
AND t.archetype = 'regular'
GROUP BY tu.topic_id, category_id
ORDER BY estimated_minutes_read DESC
LIMIT 100

The LIMIT 100 clause on the last line of the query could be adjusted or removed if you want more results to be returned.
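
If you’d rather not edit the query each time, the limit can also be set up as a Data Explorer parameter. Something like this should work (the same query, with LIMIT 100 swapped for a :limit parameter):

-- [params]
-- integer :limit = 100

SELECT
    topic_id,
    category_id,
    SUM(total_msecs_viewed) / 60000 AS estimated_minutes_read
FROM topic_users tu
JOIN topics t ON t.id = tu.topic_id
WHERE t.deleted_at IS NULL
AND t.archetype = 'regular'
GROUP BY tu.topic_id, category_id
ORDER BY estimated_minutes_read DESC
LIMIT :limit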

Interestingly, the topic with by far the most recorded read time on Meta is DiscourseConnect - Official Single-Sign-On for Discourse (sso). It’s currently at 126048 minutes.

3 Likes

Hi @simon

Is that formula correct?

If I pick four or five topics at random and compare the estimated read time from this query with the estimated read time in the topic itself, I’m getting two very different numbers? :thinking:

It looks like that query gives you the Topics that have been read for the most time, rather than the Topics that take the longest to read?

1 Like

Ah, that could explain the issue.

I’m guessing total_msecs_viewed is the wrong column to use here?

You can use the average time that users take to read the topic.
In that case, you can just change the SUM function to AVG, so it would look like this:

SELECT
    topic_id,
    category_id,
    AVG(total_msecs_viewed) / 60000  AS estimated_minutes_read
FROM topic_users tu
JOIN topics t ON t.id = tu.topic_id
WHERE t.deleted_at IS NULL
AND t.archetype = 'regular'
GROUP BY tu.topic_id, category_id
ORDER BY estimated_minutes_read DESC
LIMIT 100

3 Likes

Thanks for the suggestion @michebs but I’m afraid that one is way off the mark too.

A few examples:

| What the query says | What the topic says |
| --- | --- |
| 438 | 61 |
| 353 | 58 |
| 335 | 40 |
| 196 | 24 |

But that would mean, on average, a person takes 438 minutes to read that top topic? That seems unlikely. This may sound silly, but did you have enough 0s in 60,000?

Edit: Or maybe the AVG includes all the re-readings of a topic too? So once through would be 61 minutes, but actually users spend on average 438 minutes in there.
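
I suppose one way to check that would be to look at how the time is split across users for one of those topics. Something like this, maybe (1234 is just a placeholder topic id):

-- per-user viewing time for a single topic, to see whether a few heavy
-- re-readers are pulling the average up (1234 is a placeholder topic id)
SELECT
    user_id,
    total_msecs_viewed / 60000 AS minutes_viewed
FROM topic_users
WHERE topic_id = 1234
ORDER BY total_msecs_viewed DESC
LIMIT 20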

Though now I’m quite interested to know how the Estimated Read Time is worked out for the Summary, as ideally you’d want those to match. Even shrinking those by a factor of ten would only ballpark it. :thinking:

1 Like

Yes, exactly :blush:

1 Like

I had a little search, and found this "There are 84 replies with an estimated read time of 0 minutes." - #9 by nbianca.

I struggle with deciphering these things, but it seems it uses a word count x time figure (plus a minimum time to cover posts with no words, like images).

There was also this one which gave a hint as to what the final value may be called (though it’s old, so it may have changed?):

Not massively helpful, I’m sure, but thought I’d share just in case. :slightly_smiling_face:

I hope you get the answers you’re looking for. :crossed_fingers:

I’ve had another look at this, and it seems (in its simplest form) to be topic.word_count divided by the ‘read time word count’ admin setting (default 500 words/min). So I think this query would produce the top X ‘longest to read’ topics:

-- [params]
-- integer :limit = 10

SELECT t.id as topic_id, (t.word_count)/500+1 AS estimated_read_time
FROM topics t
WHERE t.word_count IS NOT NULL
AND t.archetype = 'regular'
ORDER BY t.word_count DESC 
LIMIT :limit

Though there’s also the alternative ‘4 second minimum’: (number of posts x 4)/60, which is there to account for photo topics with no word count. So it works out both and displays whichever is larger. But I haven’t quite worked out how to add that in yet. :slightly_smiling_face:
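
On its own, I think that fallback works out to roughly this (just a sketch):

-- the '4 seconds per post' fallback on its own (rough sketch),
-- converted to minutes, with +1 so short topics don't show 0
SELECT t.id AS topic_id, (t.posts_count * 4) / 60 + 1 AS post_count_minutes
FROM topics t
WHERE t.archetype = 'regular'
AND t.deleted_at IS NULL
ORDER BY t.posts_count DESC
LIMIT 10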

Unfortunately, I haven’t got a large enough site to test it out on properly. It seemed to work on a small test sample, but it may need tweaking. :slightly_smiling_face:

Edit: I added in a ‘limit’ parameter to make it closer to the OP spec. :+1:

1 Like

By George, I think he’s got it!

@JammyDodger I ran your query, here are a few screen grabs for reference.

First, the “top 10”:

And sure enough:

:scream: :clap:t2:

There are a couple of numbers that don’t quite match, but it’s really close!

1 Like

Seems like I definitely need to work out how to add the photo one in. :slightly_smiling_face: I’ve not given up yet. :crossed_fingers:

1 Like

I’ve had another go. :slightly_smiling_face: I’m not 100% on this one, as I don’t have a large enough sample to test it against, but it’s picked up my test topics. :+1:

-- [params]
-- integer :limit = 10

WITH read_time AS (
    SELECT t.id AS topic_id,
        (t.word_count) / 500 + 1 AS word_count_time,
        (t.posts_count * 4) / 60 + 1 AS post_count_time
    FROM topics t
    WHERE t.word_count IS NOT NULL
    AND t.archetype = 'regular'
    AND t.deleted_at IS NULL
)

SELECT topic_id,
    CONCAT(CASE WHEN word_count_time > post_count_time THEN word_count_time ELSE post_count_time END, ' min') AS estimated_reading_time
FROM read_time
-- sort on the numeric value rather than the ' min' text, so e.g. 9 doesn't rank above 80
ORDER BY GREATEST(word_count_time, post_count_time) DESC
LIMIT :limit

1 Like