acoggan said:Actually, that is a very interesting spin on things: anyone who has knocked around the scientific world at all knows that it would be up to the journal editors to make any decisions re. a retraction, not the original authors (or in this case, author). IOW, only someone really inexperienced would approach another scientist and ask them (no matter how nicely) to retract a paper (and only someone really naive would ask Ed Coyle that!). I'm therefore guessing things didn't go down quite as portrayed in that interview...that is, either Ashenden et al. approached the journal first (or everything was run through the journal), or they first approached Coyle much more mildly, and things only went downhill after that, or something.
...when a research study is sent off for review, it is usually sent by the editor of the journal to a few reviewers - experts in the field, who cast their eye over it to make sure it is well-controlled and worthy of publication. The Coyle study of Armstrong was submitted on February 22nd. It was accepted three weeks later. Of course, short turnaround times are not unique, but this one is remarkable. Remember, this is a study so criticized that it inspired two SEPARATE letters in response, and trust us, letters like these are not all that common in the publication process. It became a case study here at UCT in how not to write a case report, and as mentioned in our first post, was criticized widely at conferences.
The first response and criticism was swift, and came from two sources. First, David Martin and colleagues from Australia wrote a letter titled "Has Armstrong's cycle efficiency improved?". This was accompanied by a letter from Yorck Olaf Schumacher and his colleagues titled "Scientific considerations for physiological evaluations of elite athletes".
Essentially, these letters criticized the study design, the method and the scientific process followed, including the conclusions.
AngusW said:The timings seem a bit strange. LA Confidential was published in 2004. Ed Coyle testifies for LA in November 2005. When exactly did he have the conversation with Greg Lemond in which Lemond enlightened Coyle about Michele Ferrari? If it was prior to the SCA hearing, why did Coyle testify for LA, being sickened as he was by the news that LA was with Ferrari?
Velodude said:Speculations. I do not believe the public was privy to the complaint of scientific misconduct against Dr. Edward Coyle that peer scientists lodged with the University.
Velodude said:We do know the complaint was dismissed.
Velodude said:However, we do know the background:
1. There was no intention to undertake a study on Armstrong 1992-1999. Armstrong was opportunistic in turning up at the lab for testing unrelated to any study. Hence the inconsistency of tests, data and times within a cycling season.
Velodude said:2. After these testing results had lain undisturbed in a filing cabinet for 5 years, Armstrong developed public image problems relating to doping, beginning with the publication of LA Confidentiel - Les secrets de Lance Armstrong by Walsh and Ballester in 2004.
Velodude said:3. In February 2005 Dr Edward Coyle cobbled together the loose data 1992-1999 into a "scientific" study to justify Armstrong's performance development.
Velodude said:The study was approved by the university in a lightning 3 weeks without being presented for peer and editorial review!
Velodude said:4. The study was controversial and came in for widespread scientific criticism.
Velodude said:5. Coyle admitted to a calculation error but continued to personally promote the study through the media.
Velodude said:6. Armstrong used Coyle as a paid witness at the SCA deposition hearing in November 2005, presenting the study as a defense against the doping allegations.
Velodude said:I understand the process is as follows, as presented by Science of Sport, who were one of many highly critical of the study:
It appears JAP did not send off the study to reviewers before publication. (Did JAP receive a donation from Livestrong?) Then the critical letters flowed back to JAP on March 17:
Stingray34 said:Mobilised, more like it
biokemguy said:That's not how I've seen it work, but maybe your field is different than mine.
Journal editors don't have time to keep track of material that has already passed through the peer review process to see if it should be retracted.
That's up to the authors and their colleagues in the same field.
acoggan said:<snip>... Not true: Ed pulled the data together in order to present it at a small scientific meeting held here in St. Louis in the spring of 2002... <snip>
acoggan said:As a member of the Editorial Board of JAP (although not back then), I can tell you that the Science of Sport guys are simply incorrect: there is nothing really remarkable about a study going from submission to acceptance in just 3 wk. All it would require is 1) reviewers who jumped right on their assigned task, and who had at most minimal comments for the author(s), and 2) an editor who took their responsibilities seriously.
Velodude said:Can you provide a link concerning this small scientific meeting?
Velodude said:Was he likewise ostracized by that small scientific fraternity for his study?
Neworld said:Sir,
You're joking, right? I'm referring to points 1) and 2).
Could you please generate some data from your experience with JAP that shows the average temporal 'turnaround' for a given submission...to include recognition of submission, allocation to reviewers, the reviewing process, reply to the primary author with suggestions and corrections, secondary submission with corrections, re-reviewing, acceptance and then publication.
3 weeks, really?
acoggan said:You could generate such statistics yourself simply by looking at the submitted/acceptance dates in a few issues of the journal. However, that would really only tell you the average turn-around time, not the minimum turn-around time for a paper that (presumably) was accepted with few, if any, revisions.
AngusW said:The timings seem a bit strange. LA Confidential was published in 2004. Ed Coyle testifies for LA in November 2005. When exactly did he have the conversation with Greg Lemond in which Lemond enlightened Coyle about Michele Ferrari? If it was prior to the SCA hearing, why did Coyle testify for LA, being sickened as he was by the news that LA was with Ferrari?
acoggan said:Babble.
acoggan said:So just for fun I thought I'd look at a few issues from the mid-2000s to get an idea of turn-around times...lo and behold, here's the 2nd paper I checked:
http://jap.physiology.org/content/98/1/40.full
(Submitted 26 August 2004; accepted in final form 16 September 2004, so only ~3 wk)
EDIT 1: And in the same issue (i.e., January 2005):
http://jap.physiology.org/content/98/1/100.full
(Submitted 2 July 2004; accepted in final form 8 August 2004, so ~5 wk)
EDIT 2: Another one from the same issue:
http://jap.physiology.org/content/98/1/160.full
(Submitted 26 June 2004; accepted in final form 8 August 2004, so ~6 wk)
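For anyone who wants to check the arithmetic, here is a minimal Python sketch (purely illustrative, not anything the posters ran) that computes the turn-around for the three papers cited above from their submitted/accepted dates, plus the min, max, and average. The labels are shorthand for the linked articles, and the third submission year is assumed to be 2004, consistent with the ~6 wk figure.
```python
# Turn-around arithmetic for the three JAP papers cited above.
# Dates are the submitted/accepted dates quoted in the thread.
from datetime import date
from statistics import mean

papers = {
    "JAP 98(1):40":  (date(2004, 8, 26), date(2004, 9, 16)),
    "JAP 98(1):100": (date(2004, 7, 2),  date(2004, 8, 8)),
    "JAP 98(1):160": (date(2004, 6, 26), date(2004, 8, 8)),
}

# Days from submission to acceptance for each paper.
turnaround = {name: (accepted - submitted).days
              for name, (submitted, accepted) in papers.items()}

for name, days in turnaround.items():
    print(f"{name}: {days} days (~{days / 7:.1f} wk)")

# The outliers (min and max) and the average.
days = list(turnaround.values())
print(f"min {min(days)} d, max {max(days)} d, mean {mean(days):.1f} d")
```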
Neworld said:In my field, your colleagues would question you at meetings, ask you further stern questions after your flawed presentation, and send you response letters in journals...and if you don't get the hint, phone calls are made and busy editors find out very quickly.
Neworld said:1. Small sample size, but at least you tried.
Neworld said:2. What about all the papers rejected? Don't forget to include those...
Neworld said:3. Yes, I would imagine most readers here would like to see the outliers (min and max turnaround) and the average.
Neworld said:4. Retrospective studies? Ahh...you'd agree those trials are NOT very useful, especially with limited subjects and discordant equipment. Maybe it should have been a case report; not. Yikes, JAP must have been thin on available work back then.
Well there you go: your field is no different, i.e., once a paper has been published the first "line of attack" is via the journal.
acoggan said:How quickly papers are rejected has little bearing on how quickly they are accepted.
Oh, I'd definitely agree that a retrospective study is weaker than a prospective one, just as I'd agree that a case report is weaker than a study with a larger sample size.
OTOH, the data set in question was obtained from an individual with a unique palmares, which makes it more interesting. These are the issues the reviewers had to weigh, and they decided that the paper warranted publication.
danjo007 said:the comments here are just getting more and more stupid by the number.. and by the same people no less!
Bonkstrong said:Lance was tested at 7am this morning by the WTC - well, according to his facebook page he was...
Race Radio said:Looks like Lance is reading the forum again