
I'm trying to figure out the best way to compile all "post threads" (a Question plus all related Answers and Comments) where a specific user owns any part of the thread: the Question, an Answer, or a comment on either.

It seems like there's probably a more efficient way than the combination of methods I've been trying. For example:

 1. Get `question_id`s from `users/xxx/questions`.
 2. Get `question_id`s from `users/xxx/answers`.
 3. Get a combined list of `post_id` + `post_type` from `users/xxx/comments` and `users/xxx/mentioned`, then split it by post type:
    - if `post_type` is "question", the `post_id` is a `question_id`;
    - the remaining `post_id`s are `answer_id`s, so:
 4. Get `question_id`s from `answers/{ids}` for those.

Finally, combine all the `question_id`s into one unique list and query them again for selected fields from the questions/answers/comments (body, dates, counts, user ids, usernames, wrapper).
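In code, that multi-route approach might look like the following sketch (Python, assuming Stack Exchange API v2.3; `fetch_json` is a hypothetical helper that performs the GET, handles paging and backoff, and returns the decoded `items` list):

```python
# Sketch of the steps (1)-(4) described above.
from urllib.parse import urlencode

API = "https://api.stackexchange.com/2.3"

def route_url(path, **params):
    """Build an API URL with site and paging parameters."""
    params.setdefault("site", "stackoverflow")
    params.setdefault("pagesize", 100)
    return f"{API}{path}?{urlencode(params)}"

def collect_question_ids(user_id, fetch_json):
    """Gather every question_id the user has touched."""
    qids = set()
    # (1) questions the user asked
    qids.update(i["question_id"]
                for i in fetch_json(route_url(f"/users/{user_id}/questions")))
    # (2) questions the user answered
    qids.update(i["question_id"]
                for i in fetch_json(route_url(f"/users/{user_id}/answers")))
    # (3) posts the user commented on or was mentioned in, split by type
    answer_ids = set()
    for route in (f"/users/{user_id}/comments", f"/users/{user_id}/mentioned"):
        for c in fetch_json(route_url(route)):
            pid, ptype = c.get("post_id"), c.get("post_type")
            if ptype == "question":
                qids.add(pid)
            elif ptype == "answer":
                answer_ids.add(pid)
    # (4) map the remaining answer ids to their questions, 100 at a time
    ids = sorted(answer_ids)
    for i in range(0, len(ids), 100):
        batch = ";".join(map(str, ids[i:i + 100]))
        qids.update(a["question_id"]
                    for a in fetch_json(route_url(f"/answers/{batch}")))
    return qids
```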

Is there a better way to do this?

It doesn't help that I've been intermittently fighting with the "missing comments" problem.

Thanks!

1 Answer


For a set of data like this, some users could exhaust your quota, so this might be a better job for SEDE.


In your API approach, I'm not sure why you are getting question ids from answer ids. You can give `/answers/{ids}` a filter that makes it return the username, tags, and title for each answer's question -- which may be all you're after.
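For example, a minimal sketch of passing such a filter; the filter id below is a placeholder, not a real filter -- you would create one with the API's filter builder (`/filters/create`) and substitute its id:

```python
# Build an /answers/{ids} request that carries a custom filter, so the
# response can include fields from the answers' parent questions.
from urllib.parse import urlencode

def answers_url(answer_ids, filter_id="!PLACEHOLDER_FILTER"):
    """filter_id is a stand-in; generate a real one via /filters/create."""
    ids = ";".join(map(str, answer_ids))
    params = urlencode({"site": "stackoverflow", "filter": filter_id})
    return f"https://api.stackexchange.com/2.3/answers/{ids}?{params}"
```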


But in/with the API, I would:

  1. Use the `/users/{ids}/timeline` route to get question, answer, and comment ids in one swoop.
     As a bonus, the comment text is already included in the `detail` property for comments.

  2. Optionally, query `/users/{ids}/mentioned` too.

  3. Feed the answer post ids, 100 at a time, to `/answers/{ids}` to get further details if desired.

  4. Feed the question post ids, 100 at a time, to `/questions/{ids}` to get further details if desired.
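The steps above might be sketched like this (Python; `fetch_json` is again a hypothetical paging helper, and the exact `timeline_type` strings are assumptions to verify against a real timeline response):

```python
def chunks(ids, size=100):
    """Yield semicolon-joined id batches, at most `size` ids each."""
    ids = sorted(ids)
    for i in range(0, len(ids), size):
        yield ";".join(map(str, ids[i:i + size]))

def gather(user_id, fetch_json):
    """Classify timeline events, then batch-fetch further details."""
    question_ids, answer_ids, comments = set(), set(), []
    # 1. One pass over /users/{id}/timeline classifies everything.
    #    (The timeline_type strings below are assumptions.)
    for event in fetch_json(f"/users/{user_id}/timeline"):
        t = event.get("timeline_type")
        if t == "question":
            question_ids.add(event["post_id"])
        elif t == "answer":
            answer_ids.add(event["post_id"])
        elif t == "commented":
            comments.append(event.get("detail"))  # comment text comes free
    # 2. (Optional) /users/{id}/mentioned results would merge in the same way.
    # 3./4. Fetch further detail, 100 ids at a time.
    answers = [a for batch in chunks(answer_ids)
               for a in fetch_json(f"/answers/{batch}")]
    questions = [q for batch in chunks(question_ids)
                 for q in fetch_json(f"/questions/{batch}")]
    return questions, answers, comments
```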

  • I need to do this programmatically, hence the API, unless there's a way to do this from SEDE? I'd much rather retrieve 50,000 rows at a time instead of 100. Meanwhile, `/users/{ids}/timeline` may be a quicker way for me to retrieve my list of "every question a user has touched". For example, if "User X" came to this page and added a comment anywhere, I'd now want to include not only the comment in the dataset for "User X", but also the source Question, all of its Answers, and all comments on the question or any of its answers.
    – ashleedawg
    Commented May 27, 2018 at 13:32
  • I don't foresee a problem with API limits, because 10k calls should be enough to get 7 @JonSkeet's/day. (I hope he doesn't mind being a unit of measurement!) The average user has 10 posts and 26 comments.
    – ashleedawg
    Commented May 27, 2018 at 13:36
  • JonSkeet has over 2767 pages of data (based on wanting to get the parent question for every comment, answer, and mention). That's over a quarter of your bandwidth and would take 20 to 50 minutes to fetch if you somehow didn't get rate limited. Commented May 27, 2018 at 18:27
  • Data Dump data can be fetched programmatically from Google's BigQuery -- which has an API. Commented May 27, 2018 at 18:30
  • Also, to reiterate, the /timeline route solves your "intermittent missing comments problem", in this case. Commented May 27, 2018 at 18:33
  • "...BigQuery -- which has an API" -- very interesting... I'm going to check that out, although I eventually need to get this working with the real-time data.
    – ashleedawg
    Commented May 29, 2018 at 10:03
  • If you fetch the bulk with BigQuery, then you can limit the SE API calls to just the data that has occurred since the last data dump. Commented May 29, 2018 at 17:47
  • Great idea, thanks. Out of curiosity, on what did you base the estimate for fetch time? Is there a standard avg you use?
    – ashleedawg
    Commented May 29, 2018 at 18:45
  • For me, if I try a long series of sustained fetches at more than about 2 per second, I start getting backoffs. Limiting to 1 per second seldom sees problems. Theoretically, if you want to push it at 2 per second, you could get 2767 pages in about 23 minutes. Commented May 29, 2018 at 19:35

