This website contains other cold fusion items.
Click to see the list of links
392) Unexpected rejection of our important paper
Montclair State University, New Jersey, USA
June 12, 2010
The Curie Project was described in unit 375. Participants were Jeff Driscoll, Mike Horton, Ludwik Kowalski, and Pete Lohstreter. Our independent attempts to verify Oriani's claim of reproducibility turned out to be negative.
On most experimental surfaces (31 out of 34) the mean track densities were essentially the same as on control surfaces. On the remaining three experimental surfaces the densities were much higher than those reported by Oriani. They were tentatively attributed to contamination.
The purpose of this unit is to describe our attempt to publish that paper.
Submitting the paper
On October 16, 2009, I received the following message from Dr. Jan Marwan, who was certainly familiar with our project. Dear Dr Kowalski, The American Institute of Physics (AIP) offered me to edit a book titled "Low
Energy Nuclear Reactions Sourcebook" based on the New Energy Technology Symposium that I am organizing.
I work as the only editor together with the AIP on this book, and we want to give many research groups working in the LENR field the opportunity to publish their results here. This covers on the one hand brand new results recently discovered, but may also be a brief review of the work
you performed over the last 20 years. The book we are planning is going to be a general sourcebook, covering most of the topics (experimental and theory) of LENR, with which the scientific audience worldwide, reading this book in the future, may get the general idea of what LENR is but
also may go into very detail. We think that with this work, based on your article among many others, we are going to publish a book of high demand to which scientists and students at Universities, Research Institutes, Industries etc worldwide will have access for a very long time.
This AIP book is officially a proceeding based on the New Energy Technology Symposium held in San Francisco, CA, March 2010 at the American Chemical Society (ACS). However, the AIP agrees that people who, due to the difficult funding situation at this time, may not be able to attend
the meeting, are given equal chances to publish in AIP, same as, of course, those who attend the meeting in San Francisco. This book, again, will cover new results but may also contain results already presented and published elsewhere (permission by the first publisher, of course,
needed, and if no verbatim used) and so to give LENR scientists the opportunity to briefly review their long term work in this field. Those who already published in ACS LENR Sourcebook volume 1 & 2 are given equal chances to publish again (preferably new discoveries), this time
in AIP. I think we all feel the same that this offer by the AIP does not come up very often in our life. . . .
Naturally, we wrote the paper and submitted it before the deadline. It was hard to believe that our establishment (American Institute of Physics) was going to publish a book with papers describing cold fusion. I am not inserting our paper here because it is too long (eight pages) and, more importantly, because its future remains undetermined. It was first accepted in May 2010, and then rejected in June.
The unexpected rejection
On June 11, 2010, I received the following message from Dr. Marwan
I am sorry to let you know that the other reviewer surprisingly rejected your AIP paper.
I therefore decided not to include your paper into this AIP book. I truly apologise for the inconvenience.
Inconvenience? It is more than that. A report from the "other reviewer," Y, was attached to the message. The report of the first reviewer, X, was sent to me on May 8, when the paper was accepted. Responding to this June 11 message I wrote:
a) I am also very sorry. On what basis was our paper rejected?
b) How many reviewers did you send the paper to?
c) What options do we have in this unexpected situation?
That was more than 24 hours ago. It is Saturday; I will probably not receive the answer before Monday. Meanwhile let me summarize the X and Y reports.
X ==> "Publish as received, or with minor corrections as indicated." All these corrections were minor and easy to make.
Y ==> I strongly recommend the editor to reject this paper for publication.
Another big difference was in the ranking of our paper. Five categories of ranking were:
a) Technical Quality
d) Originality (when applicable)
Three choices for each category were: "good," "average," and "poor."
*** Ranking by X ==> a, c and e are "good," d is "N/A," and b is "average."
*** Ranking by Y ==> all five categories, even d, which is not applicable, are "poor."
The rejection of our paper by Dr. Marwan, on June 11, was a sudden reversal; a previous message on June 10 asked for minor formatting corrections (elimination of blank lines between paragraphs).
What would I do if I were in Marwan's place?
It is not easy to be an editor; I am glad I am not in Jan's place. AIP ethical standards (Google found them for me) describe what is acceptable and what is not acceptable, as far as reviewing papers is concerned.
Suppose the manuscript is clear and comprehensive to me (in Marwan's place) while the referee's ratings for these categories are "poor." That would be an indication of possible bias. A "poor" in all categories, even the inapplicable one, would also be a warning that something is not right.
I would ask Ludwik to comment on what referee Y wrote. Then I would send the manuscript to at least one more referee, and would inform Ludwik about this. In choosing referees I would eliminate those who might have vested interests in defending the teams criticized in the submitted article. And I would be open-minded about what is recommended by referees. Their role is to suggest; decisions about rejections belong to editors.
Please note that we made no attempts to replicate SPAWAR experiments. Our comments (about that field) were based on results published by respected researchers (more specifically, authors of references 10 and 11). This was clearly stated in our manuscript. Ignoring this, reviewer Y wrote:
This paper specifically addresses the reproducibility issue of the SPAWAR experiments. Kowalski and his co workers, in attempting to replicate this experiment, failed, and from that they questioned the results obtained by the SPAWAR and the
Oriani group and concluded that the broad range of the SPAWAR research using CR 39 tracks to detect nuclear particle emission is wrong and totally misleading. Moreover, in his final conclusion Kowalski interpreted this issue as a failure of the whole LENR/cold fusion subject
and convincingly predicted this research topic to become out of interest.
However, does the fact that Kowalski and his co workers not able to replicate the SPAWAR results means that SPAWAR, and a few other research groups who successfully replicated these experiments, have been misinterpreting their results, or does it rather mean that the Kowalski
group failed because of lacking scientific background and experimental skill? On the one hand, the SPAWAR researchers working in this field are highly recognized and published their work in more than 20 peer reviewed papers. If Kowalski were right, it would mean that the
reviewers had been wrong. On the other hand, it is not very much convincing to claim irreproducibility of the SPAWAR results on the basis of the work carried out by high school teachers. Although everyone wishes to replicate other researchers' experiments it still requires
experimental expertise, and now one could argue that a high school teacher may not have the skill compared to a highly qualified research scientist.
To evidence the truth of LENR/cold fusion requires detailed experimental data that the SPAWAR group has provided, to prove this research wrong requires experimental data as well that I was missing reading this paper. Therefore, I strongly recommend the editor to reject this
paper for publication.
I do not think that we, as a group, were unqualified to replicate Oriani's experiments. In fact, we complemented each other. I am a nuclear physicist (Ph.D. in 1963), J.D. is an engineer, M.H. and P.L. are experienced chemistry and physics teachers. We were an ideal team for the selected task. Aware of our limitations, we did not attempt to perform SPAWAR-type experiments. What else can I say? I have also participated in numerous research projects; my list of publications in peer-reviewed journals is also impressive.
The paper we want to publish will eventually be appended to this unit. I am not doing this now because this might conflict with our attempts to publish it. Those who read it will see that no attempts were made to claim irreproducibility of the SPAWAR results on the basis
of the work carried out by high school teachers. We were comparing results published by distinguished CMNS researchers, more specifically, by authors of references (10) and (11). The readers will also see that over 90% of our manuscript is devoted to our own investigations
of the Oriani-type effect. Why were recent SPAWAR-type experiments marginally mentioned? Because they have something in common with Oriani-type experiments. In both cases thin Mylar films are used to protect CR-39 detectors from the electrolyte.
Conclusions and interpretations
Referring to our experimental results we wrote: The overall conclusion is that experimental data reported in , and summarized in Figure 2, are not reproducible. Track densities measured on our thirty-four CR-39 surfaces were either much lower or
much higher than those reported by Oriani. The origin of high track densities, in Experiments #10, #11 and #12, is not clear.
And here is our final observation: Experimental results supporting two recent claims of emission of charged nuclear projectiles due to electrolysis are not reproducible. Focusing on reproducibility is probably more important, at this stage, than
focusing on interpretation of experimental results. The future of the so-called Cold Fusion field--now called Condensed Matter Nuclear Science (CMNS)--remains uncertain. The attitude of those who control scientific research (editors of mainstream
journals, directors of granting agencies, etc.), toward the CMNS field remains highly negative. The field is in real danger of disappearing without producing clear yes-or-no answers about its extraordinary claims. It will probably be rediscovered later in this century.
Referring to this, Y wrote: Moreover, in his final conclusion Kowalski interpreted this issue as a failure of the whole LENR/cold fusion subject and convincingly predicted this research topic to become out of interest.
Where does the term "out of interest" come from? And we did not say failure of the whole LENR/cold fusion subject; we were referring to the uncertainty about the field's future. Furthermore, we were not convincingly predicting anything. Is it not true that the field is in real danger of disappearing? In any case, our last observation could easily be modified to avoid possible misrepresentations. In Marwan's place I would ask Ludwik to rewrite the final observation. That can easily be done. Rejecting the entire paper (with valuable experimental results) because of one questionable formulation (at the end of the last paragraph) is not reasonable. Each of us put a lot of work into this piece of scientific research; why should it be wasted? We are pleased that no technical mistakes were found in our paper.
A constructive suggestion
Suppose the potentially controversial ending sentences are removed. This would reduce our final observation to: "Experimental results supporting two recent claims of emission of charged nuclear projectiles due to electrolysis are not reproducible. Focusing on reproducibility is probably more important, at this stage, than focusing on interpretation of experimental results." Would such a personal observation, or something similar, be acceptable? I hope so. If not, then the entire last paragraph could be eliminated. This seems more reasonable than throwing away the data. Do you agree?