Sultan Qaboos University
College of Education
Instructional & Learning Technology Department
TECH4102: Evaluation in Educational Technology/Spring 2009

Khadeejah & Amal Portfolio

Student Names & IDs:

Amal Al-Balushi (66755) & Khadeejah Al-Shidhani (61344)

Course instructor:

Dr. Alaa Sadiq (alaasadik@squ.edu.om)

Thank you for taking the time to view our portfolio. It is a collection of the work that resulted from our efforts and research in the course Evaluation in Educational Technology (TECH4102). This work has enriched our experience and background in the field of Educational Technology.

You can view our work by simply clicking the links in the menu on the right:

Wednesday 6 May 2009

Evaluating Instructional Technology Implementation in a Higher Education Environment



Summary of the research:
Overview:
The article first reviews the literature, describing the methods used in a wide range of evaluation studies in instructional technology. The review leads to three conclusions:
(1) Multiple data collection methods have been used to collect information from multiple sources;
(2) Various evaluation models or approaches have been followed; and
(3) There are a common set of problems and concerns about past evaluations of technology-enhanced instruction.
The article then provides a concrete example of evaluating a campus-wide learning technology effort (the SCALE Project) at the University of Illinois at Urbana-Champaign. The evaluation spanned three years and used multiple methods.

Evaluation methodology:
Purpose:
Because the evaluation served several clients, it had more than one objective. Three main client groups were defined: (1) the campus administration, (2) the instructors involved, and (3) the Alfred P. Sloan Foundation (the funding agency).
The objectives were:
• Evaluating the impact of ALN (Asynchronous Learning Networks) on professors and students.
• Understanding the economic implications of ALN.

Evaluation approach:
To answer the evaluation questions, the researchers adopted a mixed-methods approach. Following a document review, both qualitative and quantitative data were collected. Quantitative data took the form of survey results, records of use, achievement gain scores, and an extensive cost-benefit analysis (which is reported in a subsequent article). Qualitative data primarily came from interviews. Some data collection efforts had both qualitative and quantitative elements; for example, the evaluation of computer conferencing involved tallying interactions between course members as well as a qualitative content analysis of those interactions. Some impact and efficiency data were also collected during years one and two.
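To make the quantitative side of that conferencing analysis concrete, here is a minimal sketch of our own (not taken from the paper) of how postings might be tallied; the posting records and field names are hypothetical:

    from collections import Counter

    # Hypothetical posting records: who wrote each message and whom it replied to.
    postings = [
        {"author": "student_a", "reply_to": None},          # starts a new thread
        {"author": "instructor", "reply_to": "student_a"},  # instructor reply
        {"author": "student_b", "reply_to": "instructor"},
        {"author": "student_a", "reply_to": "student_b"},
    ]

    # Tally how actively each course member posted.
    posts_per_member = Counter(p["author"] for p in postings)

    # Tally who interacted with whom (author -> person replied to).
    interaction_pairs = Counter(
        (p["author"], p["reply_to"]) for p in postings if p["reply_to"]
    )

    print(posts_per_member)
    print(interaction_pairs)

Counts like these would feed the quantitative tallies, while the texts of the postings themselves would go to the qualitative content analysis.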

Instruments:
  • Student surveys: To assess student attitudes and perceptions about the use of ALN, students enrolled in the sponsored courses were surveyed. A “Conferencing” survey was administered to students in SCALE-sponsored courses where conferencing software was the primary application, and a “Web” survey was administered to students in courses primarily using the Web.
  • Post-course instructor surveys: Instructors were asked about the time commitment of, level of satisfaction with, and support required for teaching courses with ALN.
  • Computer support personnel surveys: Respondents were asked about the training they had received, the types of questions students and instructors asked them, and their satisfaction with the various learning technologies used in SCALE-sponsored courses.
  • Student and TA (teaching assistant) group interviews: The TA interviews were conducted without students or professors present and focused on issues of computer accessibility, ease of use, student satisfaction, and the perceived instructional benefits of using ALN.
  • Instructor interviews: Individual one-hour interviews focused on professors’ perceptions of ALN effects on certain quality indicators of education, including the quality and quantity of student-to-student interaction, the quantity and quality of student-to-instructor interaction, and the sense of community fostered within classrooms. During year three, the focus of all data collection (instructor interviews included) shifted to assessing instructional efficiencies. In these interviews, information was collected on instructors’ salary and time allocation for teaching, the cost per student for each course (with and without ALN), and the infrastructure cost of running an ALN course.
  • Gains in student achievement: The evaluation team undertook several quasi-experimental studies during the first two years. These looked at achievement score differences between (1) ALN and non-ALN sections of the same course; (2) semesters of the same course taught with and without ALN (i.e., historical comparisons); and (3) similar courses where one professor used ALN and the other did not. (A minimal analysis sketch of this kind of comparison follows the list.)
  • Content analysis of the SCALE instructors’ conference: A content analysis of the postings to the conference was conducted as part of the evaluation. Five categories emerged from the postings: (1) announcements and sharing of information; (2) specific technical assistance questions and answers; (3) sharing of best practices; (4) philosophical issues of interest; and (5) general suggestions to the SCALE project staff for improvement.
  • Course conferences: The evaluation team monitored the student computer conferences in several courses throughout the evaluation. The purpose of the monitoring was to tally student and instructor use as well as to determine the types of interactions that occurred.
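As an illustration of the first kind of achievement comparison above, the sketch below (our own, with made-up scores, not the authors' actual analysis) runs an independent-samples t-test on achievement scores from an ALN section and a non-ALN section of the same course:

    from scipy import stats

    # Made-up achievement scores for two sections of the same course.
    aln_scores = [78, 85, 92, 70, 88, 81, 76, 90]
    non_aln_scores = [72, 80, 75, 68, 84, 79, 71, 77]

    # Independent-samples t-test for a difference in mean achievement.
    t_stat, p_value = stats.ttest_ind(aln_scores, non_aln_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

    # Note: in a quasi-experimental design (no random assignment),
    # a significant difference still does not establish causation.

A small p-value would suggest a difference between the sections, but, as the researchers' quasi-experimental framing implies, such results describe associations rather than proven causal effects.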

Research source: Bullock, C., & Ory, J. (2000). Evaluating instructional technology implementation in a higher education environment. American Journal of Evaluation, 21, 315.
