Successful Research Planning in Education: Critical Writing Essay


Overview

In any setting, sound research is both valid and reliable. Validity is the extent to which the research measures what it purports to measure. Reliability, on the other hand, refers to how well others can replicate the research. This requires that the study be done so systematically and scientifically that others can duplicate the researcher’s methods and validate or contradict the findings.


The scientific rigor of the survey undertaken last term was diminished by inadequate methodology: vague objectives, a skimpy literature search, a deficient sample size, and misapplied tests of statistical significance.

Validity

For a study to be valid, it must set up testable hypotheses, research objectives and a study instrument that truly measures the attitudes, traits or behaviours it aims to research.

The research done last term started out on the right foot by using the title “How does (sic) online and face-to-face discussion affect postgraduate students’ collaborative learning?” As a presumed goal, this was a good start toward a S.M.A.R.T. goal statement because it is specific, potentially measurable, attainable and realistic. Whether the project was constructed and executed in a timely fashion is another matter.

Since there is no need to re-invent the wheel each time, a scientifically-inclined researcher will review the respectable body of literature, critically evaluating what others have already found, with what methods and in what settings, with a view to seeing how he might advance the body of knowledge. The subject of the study happened to be the Blackboard online education and discussion tool, fairly new but building on the discussion groups, e-groups and discussion forums that have been around since the advent of the Internet, and on the bulletin board systems before that. In that span of at least 15 years’ use, it is reasonable to expect that there have been a number of scientific evaluations of the application. However, the paper from last term drew on just three sources for the concept and two articles that bore on the benefits and drawbacks perceived in practice by the audience of interest, students. Perhaps five sources are adequate for parameter-setting, i.e., formulating study objectives and questionnaire content. But just two empirical studies do not seem a solid enough body of reportage from professional journals to constitute any kind of “trend”, or even an initial consensus on how another study might seek to advance the body of knowledge.

The next logical step was to have specific and testable objectives. “In this study we want to see the perspectives of the students concerning collaboration through online and face-to-face discussion” is a reasonable statement of the purpose of the study. But the statement of study objectives, “What is the students’ opinion on the effectiveness of online discussion on collaborative learning compared to face-to-face communication?” is general enough to amount to an exploratory survey, something one does when there is no prior work to fall back on. Note the implicit target audience: students who have already been exposed to online CMS.

Having had the benefit of at least two studies, the paper from last term could have articulated the null and alternative hypotheses and research objectives thus:

  • Ho1: There are no benefits to students for using Blackboard and other online CMS tools.
  • Ha1: There are clearly-perceived benefits for students who use Blackboard and other online CMS tools.
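To make the hypothesis pair operational, the sketch below shows one way the alternative hypothesis could be tested once responses were in hand. It is only an illustration: the yes/no item, the sample size and the counts are assumptions, not figures from last term’s study.

```python
# Hypothetical test of Ha1: does a majority of respondents perceive benefits
# from Blackboard and other online CMS tools? All numbers below are assumed.
from scipy.stats import binomtest

n_respondents = 40          # assumed sample size, comfortably above the minimum of 30
n_perceiving_benefits = 28  # assumed number answering "yes" to a benefits item

# H0: the true proportion perceiving benefits is at most 0.5 (no clear benefit)
# Ha: a majority of students perceive benefits (proportion > 0.5)
result = binomtest(n_perceiving_benefits, n_respondents, p=0.5, alternative="greater")
print(f"p-value = {result.pvalue:.4f}")  # reject H0 at alpha = 0.05 only if this is below 0.05
```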

Scanning the available literature again, one learns:

  • Incidence – Columbia College reports that more students now take online than traditional classes (Ganey, 2006). Susan Hockfield, president of MIT, estimated that no less than 40 million people all over the world have taken advantage of the institute’s Open CourseWare online classes (2008).
  • Change – Over the horizon, Stoloff (2008) reviews work by Solomon and Schrum suggesting that today’s students are going to make even more use of online learning tools such as podcasts, electronic portfolios, mapping software, aggregators, online tutorials, Internet telephony, and immersive environments. The authors also predict that such issues as second-language learning, accommodations for students with special needs, the digital divide, and assessment will also soon be addressed. On the other hand, Gulati (2008) takes the constructivist viewpoint to bemoan that online learning methods have yet to adapt to individualized learning needs.
  • Effectiveness – There are issues of online “communal health” and variable effectiveness for teacher-centred versus student-focused Blackboard implementations (Ashman, 2006). Having the tool is not enough. Instructors need specialized skills in teaching strategies, technology and pedagogy that increase the level of interaction with students in order to make best use of Blackboard (Alraway, 2006). De Arment (2002) affirmed that online communication integrated with a face-to-face class can provide ways for all students to participate in interactive dialogue and thus can promote learning by encouraging active engagement with content. However, faculty must take critical instructional design considerations and complex contextual factors into account to discern which types of instructional interactions are enhanced by online communication tools.
  • Benefits – Christensen and Horn of the Harvard Business School predict that 10 percent of all courses will be online within six years and 50 percent by 2019 (Education Next, 2008) because students get the twin benefits of low cost and customized learning.

Given all these, one might then define the study objectives as that of critically investigating:

  1. The perceived and objective effectiveness of online versus classroom teaching.
  2. Which interaction type, live or remote, is believed to be more beneficial.
  3. What the convenience of online tools means for the work-family balance, schedules and lifestyles of students.

To be as rigorous as possible, the study should also have defined important terms such as the scope of “online tools”.

Validity — Methodology

The first item under “Methods” concerned the sample size. Relying on a convenience sample of just 10 students is faulty: it violates the recommended minimum of 30 needed for parametric statistics. Such a puny base of respondents leads to unstable tallies and makes tests of significance unusable. Low dispersion, coupled with the fact that half the sample was ignorant of the subject matter of the study, is the most likely cause of such contradictions as the findings for questions 14 and 15, where time saving was reckoned the primary benefit of both face-to-face and online collaboration. No amount of sophistication in questionnaire formulation can overcome sampling errors due to flawed decisions about sample size and sampling method.
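As a rough illustration of how inadequate n = 10 is, the sketch below runs a standard power analysis for an independent-samples t-test, the kind of parametric comparison the study implies. The medium effect size and the conventional alpha and power levels are assumptions chosen only for illustration, not values from the study.

```python
# Power analysis sketch: how many respondents would a parametric comparison need?
# Effect size, alpha and target power are assumed conventions, not study values.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample size per group needed to detect a medium effect (Cohen's d = 0.5)
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Respondents needed per group: {n_per_group:.0f}")   # roughly 64 per group

# Power actually achieved with 5 respondents per group (10 in total)
power_with_ten = analysis.solve_power(effect_size=0.5, alpha=0.05, nobs1=5)
print(f"Power with 5 per group: {power_with_ten:.2f}")      # far below the 0.8 convention
```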

Hence, the study proponent could not promise to make “data analysis relevant to quantitative” information. There being no systematic sampling scheme (it was merely designated “random”), validity in terms of generalizing the results to an unknown population of interest is highly questionable. Nor can a budget constraint be used to justify a puny sample when the incremental cost of interviewing at least 20 more students is little more than the cost of photocopying questionnaires.
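To show what a defensible draw would have involved, here is a minimal sketch of a documented simple random sample, assuming the researchers could obtain a sampling frame (say, the registrar’s list of enrolled postgraduate students). The frame, names and sizes are hypothetical.

```python
# Simple random sample from a hypothetical sampling frame of eligible students.
import random

sampling_frame = [f"student_{i:03d}" for i in range(1, 201)]  # assumed frame of 200 eligible students

random.seed(42)                                # fixed seed so the draw is documented and replicable
sample = random.sample(sampling_frame, k=30)   # meets the minimum of 30 for parametric statistics
print(sample[:5])                              # first few selected respondents
```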

One also notes that the definition of target respondents did not specify whether they had to have experienced Internet-based CMS tools already. An opinion survey about an educational tool or indeed, any other phenomenon requires exposure to the material in question. Otherwise, one risks invalidating the entire study because it does not measure what it purports to.

Some more thought could have been given to study design. There is nothing inherently wrong with face-to-face interviews. However, the high educational level of the targeted respondents and the sheer simplicity of the questionnaire content could have opened up options for an online survey, telephone interviewing, central location interviewing, “quadrangle-intercept” methods, or a “mail survey” involving leaving a self-administered questionnaire in students’ pigeonholes. After all, sampling precision does not seem to have been an issue: any student was potentially exposed to, and knowledgeable about, Blackboard. Nor was there any need to expose the respondents to any stimuli, visual or otherwise, or to have them perform specialized tasks that the study proponent needed to observe. Finally, the quality of response obtained could have been addressed by giving a proper explanation of the study objectives and enlightening respondents on how the findings might help them in future.


Next, the research proposal did not specify whether the questionnaire was structured or not. First, the more structured the study instrument, the more data collection options the researcher could have looked at. Second, a structured questionnaire ensures that interviews are done uniformly, eliminating that source of non-sampling error. Third, a structured questionnaire is inherent to quantitative research, while less structured, free-flowing formats are more characteristic of qualitative and exploratory studies.

As to the questionnaire itself:

  • The study proponents decided to record respondent gender as token compliance with rules about measuring socio-demographic variables. This could have been implemented better. First of all, there was no basis for believing that men and women differ in acceptance of, and comfort level with, online CMS tools. Length of exposure to the Internet generally and to forums or discussion boards specifically, years of access to PCs, and being a working student always on the lookout for time-saving devices might have been more relevant confounding or explanatory variables.
  • The questionnaire started properly with a filter question about exposure to Internet-based CMS. Half of the respondents had no experience with the tools in question. Interviews should have terminated right there and then for these ignorant students. Instead, the rest of the findings were contaminated by answers from non-knowledgeable respondents. “Garbage in, garbage out” where half of the interviewees were concerned.
  • Here is where some thought to the relevant universe or population of interest would have helped. Specifying the sampling frame as exposed students would have alerted the study group to the need to end the interview right after the screener had yielded a negative answer.
  • The answer options to question 3 are incomplete. Meant to be an ordinal scale, the available answers did not include those for “less than several times a week” (presumably these should have been “zero” and “once or twice a week”). This is not to be facetious. After all, certain courses or tutors require students to log on just once a week, and that would have counted as “full use” for that course.

As well, the two existing options are unequal in size. “Several times a week” presumably stands for thrice to 20 times a week while the other choice is from 21 times to infinity.

One cannot conceive of a situation where even graduate students assiduously check the CMS facilities as frequently as, or more frequently than, email, the latest goings-on at LiveLeak or Facebook. The spuriously high frequency obtained for “thrice daily or more often” most likely reflects the pointless answers of students who did not know what CMS is and recklessly decided to give the interviewer an answer that seemed right or proper. Once again, failure to define the study baselines played havoc with the meaningfulness of the study results.

  • The variable “confidence with using Blackboard” (question 4) should have been preceded or replaced by “how knowledgeable are you in using Blackboard?” Skill at navigating around Blackboard is a prerequisite to confidence and more informed opinion about the tool. Quite frankly, confidence is a misplaced criterion variable in this study. It is a function solely of familiarity. And if the study team had wanted to treat confidence as a pointer for acceptance or lack of it, it would have been better to ask the question explicitly.
  • Since seven of ten respondents counted themselves “confident enough”, the cross-tabulation with frequency of use, while intuitively logical, did not yield any differential patterns. The researchers were limited in their “analysis” to describing the numbers. The cross-tabulation should not have been reported in the study findings.
  • The absence of a meaningful result in this cross-tabulation must be ascribed to the sheer absence of dispersion brought about by a hopelessly inadequate and non-knowledgeable sample.
  • Question 5, an attempt to discover the barriers to greater acceptance of Blackboard, is flawed in three respects. First, in questionnaire formulation, the research team neglected to translate the theoretical/managerial construct, “barriers”, to layman’s terms. “Difficulties” or “problems” would have aligned with the everyday language of students who are mostly non-technical.

Second, the question could have been more clearly re-phrased as “What ___ prevent you from using Blackboard as often as you want?” The researchers overlooked the fact that the final phrasing begged for a dichotomous, yes/no answer.

Third, the options given are both incomplete and vague. “Fluency in English” and “skill in computer use” would have been clearer. As to “Technical software problems” and “Not being in possession”, one has to wonder whether the respondents interpreted them in so many idiosyncratic ways that the results now defy any concept of validity.

  • The same consideration applies to one of the answer options to the question on how Blackboard promotes collaboration: “Flexibility because of the option to use both syn and asyn communication”. One hazards that only Communications and Networking Engineering majors are familiar with the terms “synchronous” and “asynchronous”. That only one student opted for this feature underlines how mystified the other respondents must have been.

This brings us to a requirement that many opinion and marketing researchers overlook: the distinction between service features and benefits.

  • On the other hand, questions 6 and 7 are formulated properly (except for the “minor” error that the latter is actually two questions in one) and have concrete answer options. The follow-up question to item 7, “Explain how you use each of them” is either immaterial for not being actionable or should have been tackled in an observational study.
  • “Preference” as a criterion (in question 8) is probably irrelevant. It would have been more meaningful to employ “more useful to you” (as a student or in coping with academic requirements).
  • The questionnaire recorded age, but the research team did not consider whether this demographic variable had anything to do with competence with PCs or online collaboration tools.

Lastly, there are analytical issues and flaws highlighted by the “statistically significant” correlation obtained for gender and question 11, the incidence of collaborative problems on Blackboard. First, SPSS is not equipped to block nonsensical attempts to run correlations or any other parametric statistics on inadequate samples. Second, the research team needed to brush up on their elementary statistics: they fell into the trap of concluding that an extremely low value like 0.089 represented good or “high” correlation and that a Z value of 0.807 for a two-tailed test of hypothesis was “significant”. As well, the study group accepted a plurality of 4 in question 14 as true “in most cases.”
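To make the correct reading concrete, the sketch below re-runs this kind of gender-by-question-11 analysis on fabricated responses (the study’s raw data are not available), showing how the coefficient and its significance should be interpreted, and why a 2 × 2 cross-tabulation on ten cases cannot support a chi-square test either.

```python
# Fabricated re-run of a gender x question 11 analysis on n = 10 cases.
# The response vectors below are invented for illustration only.
import pandas as pd
from scipy.stats import pearsonr, chi2_contingency

gender = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]   # 0 = female, 1 = male (hypothetical coding)
q11    = [1, 0, 1, 1, 0, 1, 0, 1, 1, 0]   # 1 = reported collaborative problems, 0 = did not

# For two binary variables, Pearson's r is the phi coefficient.
r, p = pearsonr(gender, q11)
print(f"r = {r:.3f}, p = {p:.3f}")
# Reading the study's own figures this way: r = 0.089 is a negligible association
# (well below the ~0.3 usually called even moderate), and the reported 0.807,
# whether a Z statistic (two-tailed p of roughly 0.42) or a significance value,
# is nowhere near significant at the 0.05 level.

# A chi-square test on the 2 x 2 cross-tabulation fares no better with n = 10:
table = pd.crosstab(pd.Series(gender, name="gender"), pd.Series(q11, name="q11_problems"))
chi2, p_chi, dof, expected = chi2_contingency(table)
print(expected)   # every expected cell count is below 5, the usual minimum for chi-square
```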


Bibliography

Alraway, N. (2006) The impact of video conferencing technology on students’ interactivity in gender segregated campuses. Ph.D. diss., Indiana State University.

Ashman, K. (2006) Online composition classes call for a pedagogical paradigm shift: Students as cartographers of their own knowledge maps. Ph.D. diss., The Florida State University.

DeArment, C. A. (2002) Instructional uses of computer-mediated text-based messaging tools: A case study of faculty and student experiences and perceptions. Ed.D. diss., University of Pittsburgh.

Education Next (2008) Computer-based learning could transform public education within a decade through “disruptive innovation,” experts say. Business Wire.

Ganey, T. (2006) College plugs in to online education. Knight Ridder Tribune Business News.

Gulati, S. (2008) Compulsory participation in online discussions: Is this constructivism or normalisation of learning? Innovations in Education and Teaching International 45, no. 2: 183-192.

Hockfield, S. (2008) Numbers that can change the world. Boston Globe.

Stoloff, D. L. (2008) Web 2.0: new tools, new schools. Choice, 1592-1593.
