2016 Literature Review


  1. #1
    Join Date
    Nov 2009

    Default 2016 Literature Review

    by Jonathon Sullivan

Jonathon Sullivan, physician and Starting Strength Coach, delivers his exercise science literature review to the Starting Strength Coaches Association Conference. In Part 1, Sully reviews how to critically read a science paper and begins his analysis of Wirth et al.
    Wirth et al. The impact of back squat and leg-press exercises on maximal strength and speed-strength parameters. J Strength Cond Res. 2016 May;30(5):1205-12. doi: 10.1519/JSC.0000000000001228
    Watch Part I

    Watch Part II

    Watch Part III

  2. #2
    Join Date
    May 2016
    Conroe, TX


    This is an awesome video. It really helps you learn how to analyse research articles, and it applies not only to S&C but to any type of research.

  3. #3
    Join Date
    May 2016


    This was neat. I was sad when it ended halfway through.

    I remember seeing that abstract and thinking, "yep, that makes sense", and having the group go through the red flags in the data is a helpful reminder that just because a paper matches my expectations doesn't mean I don't have to look at the experimental design. Confirmation bias is sneaky.

  4. #4
    Join Date
    Nov 2009


    I used to read these papers all the time as part of grad school but was never really taught how to do it.


    This paper and the recent video made me think about it again.

    Sully, how do you feel about the approach where you read the experimental design (methods) first and see if the results (no matter how beautiful) would actually answer the question? This is what struck me about Morton et al.

    I'm not suggesting that, if you disagree with the method, one should completely disregard the paper (unless you are strictly and hurriedly looking for the answer to the question).

    Instead, simply acknowledge that the conclusion cannot follow from the results, because it was the wrong experiment: one that, no matter how perfectly executed, gives you data that answers a different question.

    You can, of course, continue your analysis method to see if there are observations that could be used to ask better questions or answer questions other than the one they are attempting to answer.

    So my humble submission is:

    Title: Is this a question I want answered?
    Abstract: Can I quickly see if they designed it to answer the question?
    Method: If no, disregard the conclusion but read on (with a grain of salt) to grab tidbits for other work. If maybe or yes, deep dive and find out for sure.
    Results: What did they find?
    Self-reflection: What would I conclude?
    Conclusion: What was their answer? Can our differences be reconciled? What would I tell my clients about it?

