Laborious Process
Although the appraisal process is termed rapid, I encountered challenges that made it difficult, especially at the beginning. As a result, it took me approximately ten to fifteen minutes to identify and enter the information for each study in the evaluation table. First, the presentation of data varied from one study to another. Like Fineout-Overholt et al. (2010), I wondered why the authors of the studies had not presented the information uniformly.
The different presentations of the information made it difficult for me to identify some details. For instance, some studies stated bona fide research questions, while others implied them in the purpose of the study. It took me more time to locate information such as sample characteristics, including sample size and how the sample was obtained, in reports that were unclearly written than in those where the information was presented plainly. Some studies included almost all the information needed for the evaluation table in their abstracts, which made things easier for me.
Reading of Abstracts
Reading an abstract enabled me to make quick decisions about including or excluding a study, though this worked only for some studies. Since an abstract offers a short description of the entire research, I read it to determine the characteristics of the sample, the data collection and analysis methods used, and the respective findings, as well as vital implications or recommendations (Jeanfreau & Jack, 2010). If any crucial information in an abstract, for instance the main findings, deviated from my research question, I could rapidly exclude that study from my research sources.
However, some abstracts were not helpful for instantly determining whether to include or rule out a study: their information was consistent with my research but insufficient to tell whether the entire study was in line with my study question. In such cases, I read the discussion section for further information, such as the limitations of the research and its likely implications, which the abstract lacked. The introduction sections also aided my decision-making, since by reading them I was able to identify the key research concepts, objectives, and subjects, in addition to the themes.
What Worked Best?
After carefully appraising each study one by one, I was confident in my decision to include the most relevant and credible studies in the evaluation table. I was careful to use only studies with the most affirmative and thorough research. Though I realized that the evidence presented in some studies contradicted that in others, I was mostly concerned with the conclusions the researchers drew with respect to the entire body of research. I found that it is not the quantity of studies that establishes the dependability of the findings and the evaluation table but the consistency of the evidence coupled with the quality of the methods employed (Winters & Echeverri, 2012). Appraising the reliability of the studies before filling in the evaluation table served as a way to bring together the pieces of the evidence puzzle in a meticulous manner.
What Did Not Work?
While filling in the evaluation table one column at a time, I found it difficult to note the differences and similarities among the studies. Since each study established how it was carried out, how the results were assessed, and how the data were analyzed before the outcomes were interpreted, comparing the studies was complicated by the dissimilar methodologies the authors undertook.
References
Fineout-Overholt, E., Melnyk, B. M., Stillwell, S. B., & Williamson, K. M. (2010). Evidence-based practice step by step: Critical appraisal of the evidence. The American Journal of Nursing, 110(7), 47-52.
Jeanfreau, S. G., & Jack, L., Jr. (2010). Appraising qualitative research in health education: Guidelines for public health educators. Health Promotion Practice, 11(5), 612-617.
Winters, C. A., & Echeverri, R. (2012). Teaching strategies to support evidence-based practice. Critical Care Nurse, 32(3), 49-54.