Wednesday, February 2, 2011

Tuesday, February 1, 2011

Journey - Evaluation paper - Graduate Certificate in Applied eLearning

Hi Bronwyn and fellow class mates

Some comments on my journey through the Evaluation paper.

At first I thought this paper would be 'no sweat'.  How wrong I was.   First and foremost, I must confess that I am not of Generation X or Y, or the contemporary generations that have been brought up in front of a computer screen.  So for me, accessing and using online forms of communication is a struggle.  I am fine with everyday usage, but once I have to create blogs or access and use Elluminate, it becomes somewhat of a challenge. 

I recall one night when getting onto Elluminate (without Dana) took me nearly one whole hour of the two-hour session.  The reason was that I was using the wrong password.  I not only stayed at my work (MIT) until the required 6.30 p.m. to access it, and had the frustration of not being able to get into it when I knew that the discussion was occurring, but then had the three-quarter-hour drive home afterwards.  So for me, this aspect of the educational delivery was somewhat of a challenge.  

Working with Dana was a real treat.  He is amazing and has a great command of the technology.  We worked very well as a partnership and it was a real delight and a privilege to have such a partner.  We learnt much from each other.  

The plan - was a lot of work.  I eventually found that all the parameters were well identified in the Reeves and Hedberg (2003) book, but did not discover this until I had completed it.  I did not think it would be so much work, but even with the guidelines and the exemplars (which I didn't always have time to read), one still has to do the thinking and construction required to achieve this goal.   I found that instructions had not only to be clear but to be understood.  Comprehending each aspect as it related to my topic took many hours.  I did not appreciate at first how comprehensive the plan needed to be.  Having Dana as a partner was great in this process.

Elluminate
I did not always find the discussion helpful, for most members had their own perspectives they wished to explore, which was not necessarily what I wanted to know at the time; even so, it was useful.
It was good to hear from each member what they were doing and what approach they were taking to their evaluation.

Although e-learning is all about the 'constructivist' approach to knowledge development, I would have preferred a good guided reading list focused specifically on the evaluation process. 
I am of the generation that still likes to explore the literature rather than rely wholly on online material.

The Evaluation Report took hours and hours.   It did not help that I was tremendously busy at the conclusion of the year with my work and personal commitments.   This report was an extremely time-consuming piece of work to produce.   Reporting on every facet of the evaluation process, creating the data collection tools, refining them on the basis of Bronwyn's feedback and then putting them into operation, meant negotiating time with students and lecturers and, in my case, with the Programme Leader of the Bachelor of Nursing programme at MIT.  Catching the students and requesting their involvement is easy, but getting them to carry out what you intend becomes a secondary requirement when other more urgent matters are pressing for them.   Additionally, any instructional content, major modifications and teaching strategies must be approved by the Programme Committee before alterations can be made.  So time consumption is a very real factor when gathering data within and around everyday commitments.

The analysis and compilation of the report, including portraying the results in table and graphic form, was time-consuming and challenging.  Fortunately I had Dana to help me with the graphs, for I use these so infrequently that I had to relearn how to produce them.   Then receiving Bronwyn's detailed feedback, requiring work still to be done after what I thought was completion, was daunting. 

Bronwyn's teaching and support throughout the paper have been awesome, even if at times I have been somewhat disappointed with the evaluated outcome.   I have appreciated Bronwyn's obvious desire for students to do well and to cope with a predominantly online teaching method which, as we all know, is open to the students' interpretation. It is in the latter that I found much ambiguity and at times confusion, which in turn brought frustration.   I can say that it has been a challenging journey, but it has been good to share that journey with others, especially Dana, as we have paddled our waka together to conclude the paper.

Thank you Bronwyn for your commitment to your teaching and for the patience that is required for students to achieve their educational goals.    I have learnt a great deal and have appreciated your detailed comments, which give very clear direction for correction and improvement.

Thank you fellow class mates for sharing my journey - I trust that you have all achieved good outcomes and will go well in your future educational endeavours.

Louise

A theoretical model that I have learnt about and used during my journey on the evaluation paper

Hi everybody

I am sure that, like me, you have learned a lot of new ideas.  I did not know that so much had been written about the term 'evaluation' and now recognise that there are many interpretations of this term.
Using Reeves and Hedberg's (2003) six facets of instructional product design, I learnt that all of these require different questions and approaches.  The six facets are: needs analysis, formative evaluation, effectiveness evaluation, the overall impact of the learning package, maintenance, and finally a review of the total instructional process.   The most confusing terms for me were formative and summative evaluation, as I am very familiar with these terms within a particular context.   The 'effectiveness evaluation', which was the focus of my study and report, required me to look at the total learning package.  This I found extremely comprehensive and complex.  Reeves and Hedberg claim that evaluating online instructional designs is a complex and inexact science, and I would totally agree with them.  Perhaps what was most significant for me was to re-orient myself to the different approaches to the word 'evaluation' and recognise that there are different orientations, e.g. the orientation required for my 'effectiveness' evaluation was the methodological approach. 

During the theoretical exploration I looked anew at the research term 'triangulation', which I was familiar with from another context, but began to understand it differently in relation to the 'effectiveness' evaluation.   I read up on and explored the eclectic mixed methods pragmatic approach to data collection to understand the origins of this approach, and recognise that it has evolved principally from the logico-positivistic paradigm in the first instance, then the interpretive paradigm, and finally has a pragmatic dimension to it.  Reeves and Hedberg (2003) consider that using the eclectic mixed methods pragmatic approach to collect data for analysis on the topic at issue provides a practical orientation to the practical problems that instructional designers are confronted with, and provides useful information in order to move forward.  

My exploration into the eclectic mixed methods approach took me to Bronwyn's webpage (2003, (c) Otago Polytechnic) 'Experimental and Multiple Methods Evaluation Models', and my reading of this literature extended further.  The term 'triangulation' became clearer, as did the techniques, e.g. survey, interview and discussion board entries, as well as quantitative scales that show definitive outcomes.    The upside of this method is the breadth and depth one can achieve by examining a topic from a range of different perspectives.  For my 'effectiveness' evaluation I used a student survey, lecturer peers and a written comment to address the questions I had raised, to gain insight into the 'effectiveness' of my online 'Conception to Birth' lifespan content that new nurses need to learn.  But of course the downside of this method is that the results achieved depend on a large enough sample from which to gather the data and on the adequacy of the techniques to establish the depth and breadth of information required; much of it is probabilistic rather than predictable.  However, the triangulation approach does give comprehensive data from which to make decisions and to point the way to improving teaching/learning processes. 

So as far as I am concerned, I learnt a lot but there is still more to learn no doubt.  

I will be interested to hear others' points of view.

Congratulations on concluding the evaluation paper.  I am sure you too have had an interesting journey.

All the best for your future studies.

Louise

  

Sunday, December 12, 2010

Final Report is Finally Complete

Dear All
Whew what a marathon.  We are done. 
Please follow the link below to view my report in its PDF format.
http://electronic-learning.yolasite.com/final-report.php
Your comments are much appreciated.

Have a lovely Christmas and holiday.
It has been lovely working with you.

Louise.

Tuesday, November 30, 2010

Almost at the conclusion

Hi all
It is a while since I posted anything on my blog, but Dana and I have been working away at our final report.   He will not participate tonight as he is unwell, but we hope to post our work next week.   How is everybody else doing?  Both Dana and I have been heavily tied up with students sitting their examinations, either preparing them for the exams or marking the final results, which all have deadlines.  Hence the lack of participation on the blog.  I have looked up some material related to evaluation from overseas.  One article by Fetaji & Fetaji (2009) discusses a multidimensional evaluation of e-learning using a qualitative research design and a comparative analysis of factors that can affect student learning utilising two LMSs, ANGEL and Moodle.  Does anyone have any experience of the ANGEL LMS?   This is an interesting report identifying factors that most of us would know about, such as learner styles, culture, skill level, teacher effectiveness, intelligence variation, communication styles etc.  The report was located in the Electronic Journal of e-Learning, 7(1), 1-84, but I could not find a title except that it focused on the evaluation of e-learning.
Hope everybody is making good progress.  Louise   

Sunday, September 12, 2010

My evaluation plan

Hi everybody

I am posting my evaluation plan for assignment two.  I have done a literature review but cannot access it, so will post it later.  I am at present in London at a conference.   I would be pleased to get comments from Dana and from all the group.  Hope everybody is going OK with theirs. 

Dana I could not get our pictures to copy!!


Introduction
In 2009, the writer enrolled in the Graduate Certificate of e-learning, and in the Design paper of this qualification an online Conception to Birth component was developed and posted on the MIT Learning Management System for student learning. The first time this component ran was in Semester One, 2010, and it became evident that some modification was needed if the students were to use this component and learn from it. For Semester Two, with a new group of students, a modification has been carried out.

The purpose of this evaluation plan is to evaluate the modified e-learning component of the Lifespan development course, "Conception to Birth", to see if it is an effective learning tool.

Background

The design of the component is directed by the Blackboard Learning Management System (LMS) and uses the format the students have been introduced to through EMIT. Students are expected to access the LMS through their User Login and password that they receive on enrolment. They are also taught as part of their orientation how to navigate around the system. They are informed that all announcements, course information, course content, and assessment information will be accessed via the LMS. Students become familiar with the LMS set up and expect to see information presented in the design of the system. They can access the system asynchronously any time day or night.

From Conception to Birth is designed around a story of a family expecting a new baby. The family consists of father, mother, an elder son and baby daughter and an expected new baby entering the family (appendix i).

Students are directed to a course text for which the story of the family acts as a background for numerous questions that trace conception through the zygote, embryo and fetal phases of the pregnancy. They are given online resources that illustrate the development process, the timeline involved and the critical phases, with questions to assist the students to find out the information they need for their learning.

Additionally, a number of technological tools are used to assist them in their learning process. They have a number of ‘matching’ exercises to learn key terms, Quizlet is used for learning new terminology, a crossword is used to link concepts to definitions and a summative multi-choice test to see how they were learning. The summative multi-choice test was worth 10% of their total mark for the paper and comprised 20 multi-choice questions. From the total enrolment only four students failed to reach a pass mark and the highest mark was 19/20 and the lowest 9/20.

A discussion board was also set up for students to discuss the component as they learnt the material. In the original exposure there were 148 students enrolled in the course. To manage the large number of students, they were placed in 16 discussion groups. Instructions were given verbally, demonstrated in class and placed within the conception to birth component with an appropriate access button. Each group was asked to choose a leader who would take the week's discussion from the group and place the key response themes on a class wiki. The writer would then check each of the wikis to gain an overall impression of how students were learning and, more importantly, how students were accessing and using the systems available to assist them with their learning. The discussion board did not contribute to the overall course mark for students. The discussion activity was broken up into four weeks, with questions to drive the discussion each week. The writer kept a check on the discussion activity of the students.

A summative evaluation was undertaken at the conclusion of the component. Students were asked to evaluate their experience with the component and sadly only four students completed the evaluation. These four were positive.

From the writer's observation, the discussion area of the learning component was not well utilised, with very few students using it, and the leadership and class wiki were taken up by only one leader. On the basis of these observations, it was decided that the component would be modified for the new enrollees for semester two.

Modifications have included more frequent instructions to students through the EMIT LMS and checking in class to see how the students are managing. The leadership for the group activity has been scrapped, along with the wiki, but the discussion groups have continued, with the questions being announced and published weekly to guide the students. The writer checks on all groups twice per week to see how the activity is being managed. To date I have noticed that this monitoring is making a total difference to the uptake. The students are enthusiastic, and I monitor how frequently individual students access the discussion. The very new students are beginning to ask their own questions apart from the formulated ones, which is very encouraging. The summative multi-choice test has been changed to a formative test for their own monitoring of their learning, but the students have been informed that questions related to the component will appear in their summative test, contributing 20% of their total mark for the paper. Additionally, the students have been informed that the group with the most participation on the discussion board will get chocolate fish. This really has got the students going.

Aim:

The aim of the evaluation is to see whether the modified e-learning "Conception to Birth" component of the Lifespan development course is an effective learning tool for new students of nursing.

The purpose of the evaluation is threefold:

(a) to determine whether e-learning can satisfactorily replace the traditional lecture method of transmitting theoretical information to new students of nursing.

(b) to determine whether the conception to birth component was designed in an effective manner for student learning.

(c) to assess the effectiveness of the component through both formative and summative evaluation to demonstrate student learning.

Methodology:

Evaluation big questions:

Matched to purpose

1. How effective is the on-line segment of the lifespan curriculum component “conception to birth” as a tool for student learning?

2. How effective were the formative learning tools in assisting their learning?

3. What did their formative test results demonstrate in relation to the material to be learnt?

4. What did their summative test results demonstrate in relation to the material to be learnt?

Related questions:

1. Was the timing correct for the students to respond to the questions asked?

2. Did they find the reading material interesting?

3. Did they find the questions easy to answer from their text book reading?

4. How easy was it to access the discussion board?

5. Did they find the discussion board helpful to their learning?

6. Were the questions asked relevant to their key learning goals?

7. How did fellow students assist their learning through their participation in the discussion board?

Data collection methods:

Design of evaluation

It is proposed to use a range of data collection methods to evaluate the effectiveness of the on-line conception to birth component of the lifespan curriculum content. These are as follows:

1. An evaluation form has been developed as part of the component and students are asked to complete the evaluation form (appendix 1)

2. Lecturer assessing the discussion board regularly over a two week period to assess participation and to assess the quality of the responses to the questions asked

3. Lecturer assessing the formative test participation and results of the students who participated

4. Lecturer assessing the summative questions as part of a formal test of the lifespan component of the curriculum to include conception to birth (10 questions)

5. Students completing an evaluation of the conception to birth component of the lifespan content

Audiences

1. Students undertaking 722.520 Foundations of Nursing (Lifespan Conception to birth) component are the primary audience

2. Myself as the primary lecturer

3. Secondary sources other lecturers involved in the teaching of the Lifespan component

4. Bachelor of Nursing Teams One and Two Co-ordinator

5. Programme Leader of Bachelor of Nursing programme

Logistics and Timeline

Implementation

Instrumentation

1. Evaluation form

2. Multichoice questions – formative

3. Summative


Sample

1. 112 students

2. 6 lecturers

Timeline = one semester 20 weeks.

1. Discussion board open for four weeks – the month of August, 2010

2. Analysis of responses on discussion board for 12 groups of 10 students twice per week and responding to the students 8 hours per week = 40 hours

3. Monitoring hits and performance on formative test of participating students –August to September 3 - 4 hours per week

4. Summative test results = 8 hours

5. Evaluation of component per evaluation form yet to be determined – 4 hours

6. Approximately 56 hours estimated in the evaluation process

Budget – 56 hours at $45 per hour = $2,520
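As a quick sanity check, the budget arithmetic above can be sketched in a few lines of Python (a hypothetical snippet for illustration only, not part of the plan itself; the hours and rate are the figures stated above):

```python
# Sanity check of the budget line: 56 estimated evaluation hours at $45 per hour.
estimated_hours = 56
hourly_rate = 45  # dollars per hour, as stated in the plan

budget = estimated_hours * hourly_rate
print(f"Budget: ${budget:,}")  # prints: Budget: $2,520
```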

Decisions

Outcomes expected:

Positive outcomes would be:

1. Students participate fully in the discussion board

2. Students take advantage of the formative test for their own learning and there is at least a 50% participation rate as measured

3. Students demonstrate 100% achievement of the ten related questions to the conception to birth component of a forty question summative test worth 20% of their grade for the Foundation of Nursing paper

Negative Outcomes could be:

1. Poor participation (<30%) in the discussion board

2. Poor participation (<30%) in the 20 question formative test on the conception to birth component of the Life Span teaching

3. Poor pass rate (<30%) on the 10 question component of the lifespan 40 question summative test

From the total process: further modification of the conception to birth component of the lifespan teaching according to the demonstrated performance of the students and the feedback on their evaluation of the component.

It would be the same for the lecturers involved in the teaching and the summative test results.

Sunday, September 5, 2010

Working with Dana

I worked with Dana on our assignment on Friday, September 3.  We have had many useful discussions, one of which was to clarify the distinction between an aim and a purpose.  This seems a grey area, but I believe an aim is a broader approach to evaluation in relation to our topic, while the purpose is more specific.  That is the approach we have taken. 

Unfortunately, in the LSC interruptions occur with students wanting assistance and making appointments, for that is the focus of the centre, so we get quite a few interruptions.  Nevertheless we are making progress.
Dana's smiley-face approach to a difficult topic will, I am sure, be a winner with students, especially as they create their face and need to know the constituents of the cellular structure in order to do this.  Their learning will therefore be accelerated through their research, finding a solution to the problem confronting them, and drawing their smiley face.  This approach to learning is fun and should make it easier for them to remember the structure of a cell.

I am getting along with my assignment and hope to post it tomorrow.  I have done a literature search and have some interesting reading that Dana and I will share in discussion, selecting what is relevant for both of us. 
Hope everyone else is getting along OK. 

Dana is a great coffee maker!!!
Louise