The following address will take you to my new blog
http://blogforassessment.blogspot.com/
Tuesday, 27 May 2008
TMA03
[The O’Donnell et al (2006) paper will be referred to as the U-study; the Rekkedal and Qvist-Eriksen (2004) paper will be referred to as the NKI-study.]
Settings and Theoretical Contexts
While both evaluations considered online support for learners, substantially different perspectives were evident. The U-study was based within a European HEI, focusing on two in-house, on-line support packages available to all HE students. This study assumed ICT would enhance educational opportunities, with the content of the packages developed to assist new students through an induction programme and a support facility for those expected to use ICT. The induction package focused on orientation and study skills, including ‘learning to learn’ skills (such as time management), as well as confidence-building blocks; the support package featured ICT literacy components, including technical aspects and access issues. This was an evaluation study of the packages, but the purpose(s) of the evaluation remained unclear.
The NKI-study was set within an organisation driven by commercial needs to provide flexible learning solutions, allowing delivery to be taken at an individual pace; this was stated as their ‘philosophy’ for internet education. NKI’s previous research (for example, Rekkedal and Paulsen, 1998) confirmed flexibility was a key driver of their strategy, and this study intended to evaluate whether customer demands were being fulfilled. The intention was two-fold: to examine students’ support needs and to assess their satisfaction with the support available.
The U-study suggested on-line packages would support learners and that this could enhance their educational opportunities; reference is made to Salmon (2000) in relation to this. However, while there is evidence that new opportunities could exist to support learning through online means, the use of Salmon’s theory was perhaps irrelevant, as this model of on-line learning stages was aimed at the development of collaboration and moderation skills, not at packages to support the initial stages of student learning. The implications of students being unable to learn online effectively were of greater relevance, and perhaps the views of, for example, Tyler-Smith (2006) in relation to developing ‘learn to learn’ skills would have provided greater weight to the rationale; indeed, the importance of preparing students for online study appears to be crucial to this research. The study assumed that students would need to develop some ICT skills, and there was also an awareness that HEI providers may assume that a certain level of IT literacy exists when this is often not the case (Somekh, 2000).
The NKI-study suggested the student group could be an important social and academic support in learning; however, it was not clear how this was prompted, with the emphasis being on individual student learning. While forums are encouraged, these are not used to any large extent as part of the learning experience, so theories of collaboration cannot realistically be applied.
The products were originally developed along the lines of didactic teaching, where the learner is guided through input models; although literature on developing theories has been reviewed, particularly relating to the possible importance of collaboration and social learning, the solutions continue to be student-led, based on personal knowledge construction. Interaction on-line was not considered important by many, and one might concur with this if agreeing with the evidence of Davies and Graff (2005), who established that improved performance was not necessarily achieved by those who interacted more (although a greater volume of online activity could lead to higher grades and interaction may support ‘potential failers’). The NKI-study agreed that a social constructivist perspective may appear ideal as a social theory of learning; however, if online collaboration is not taking place and students have already stated that individual learning is of prime importance, the NKI-study’s perspective is more strongly rooted in Piagetian ideas.
Strengths of methodological approach
The NKI-study was easy to follow and logically structured. As a commercially aware organisation, NKI clearly based much of their study on their own previous research and are attempting to continually assess the viability of their internet learning packages. As there is access to paying clients as a sample population, their market research is a valuable method of assessing the product’s value; the study was targeted at the same audience, within the same context. A key strength is the clarity of the link between the need to review clients’ perceptions of learner support and satisfaction, and the method of data collection. The use of open-ended questions, for depth of response and to follow up previous market research, ensured that the data would be of relevance and current value to the organisation.
In relation to what the report uncovered, it declared that clients were overall satisfied with the support elements of the learning packages; however, students’ need for support services might have been more meaningfully measured through greater quantitative analysis.
The U-study provided a clear overview of the content and purpose of the packages (if not the intention of the study), and this was justified using a wide range of literature. The study was aided by access to the whole student population within the HEI and the technical ability to offer on-line questionnaires, which would encourage those already accessing on-line resources to respond. Some free response questions were available and both pre- and post-questionnaires were used, a method also used by Bos et al (2002), which allows comparative analysis.
Limitations and weaknesses
For the NKI-study, the intention was for the study support elements (classified as the parts of the course additional to the course content) to be analysed; however, might some students have difficulty in separating these components? A key criticism would be the sampling; the use of statements such as ‘…nearly at random…’, the lack of clarity in explaining the relevance of a ‘student number’ in the selection process and the variety of selection judgements allowed to the interviewers all bring the validity of the study into doubt. The quantitative data collated was not analysed in any detail and was only ‘revealed’ in the final summary; the qualitative statements were not summarised.
While NKI established that the first phase of study was important, there seemed to be little emphasis on research into developing trust in the learning technology and learner interface; Bos et al (2002) suggest that it is difficult to establish this online trust, so it may be crucial to explore this aspect and its possible links to the lack of study support use, as an assumption is being made of ‘trust’ in the NKI systems.
The main weakness of the U-study was the lack of clarity of purpose; the aim was not stated until the end of the report, under the discussion, and only at this point does it become known that the evaluation was intended to assess effectiveness and ‘…if it could be delivered appropriately online…’. There was no scale defined for measuring effectiveness, and one would also question the meaning of ‘appropriately’ here; this should be explicitly defined.
In the U-study, demographics were discussed, but not stated as a relevant issue at the outset; nor was it clear how this discussion was linked to the support packages. It may have aided the reader to have a clear classification of what the authors meant by a ‘Primer’, as familiarity with the terminology was assumed. Another weakness of this study is the amount of time spent on research, analysis and summary of the location of IT access and the connection type; this appeared irrelevant to the issue of support and was not stated as a main focus of the study. In fact, there was a volume of quantitative data which did not appear to ‘answer the question’; for example, satisfaction is not a measure of effectiveness, unless a scale defining this term is clearly stated. While it was acknowledged that the packages ‘improved confidence’, this again is not an indication of ‘being effective’. Additionally, the authors did not respond to negative free responses regarding time consumption and volume of information; this could have been linked with the issues Davies and Graff (2005) considered, in relation to time pressures in students’ studies being a major component of the on-line learning experience.
In relation to validity, those who accessed and completed the packages are more likely to be IT competent anyway and would also be more likely to complete on-line questionnaires, thus skewing the data.
Lastly, in relation to ethical issues, it is unclear how students could remain ‘anonymous’ when a logon to the system would clearly be required, and this could be traced back to an individual.
Proposals
The key proposals for the NKI-study would be:
Offer alternative feedback methods
Focus questions on specific elements to aid development of course components
Specify sampling selection methods
For the NKI-study, as learners are perceived as competent online users, providing online questionnaires may have allowed a higher response rate. It may also have been of higher value to the organisation if some focus had been placed on the usefulness which students attached to support elements; perhaps specific questions on tutor support and/or feedback are required. However, the recommendation would be for responses to be provided in a different format; the personal interaction of telephone interviews may lead to some respondents being uncomfortable in providing negative statements, particularly regarding tutors. The selection of questions at the personal choice of the interviewers could also be addressed in this focussing of content, and NKI could also have targeted questioning at the reasons for the lack of collaboration; if learners can be encouraged to ‘think by learning’ through online activities (as Nardi et al, 2004, discuss), and this is seen as a support mechanism of value, then learning as a social process may be encouraged.
While any students may provide reliable data, the selection of the sample should be stated; this would increase the validity of the study results.
The key proposals for the U-study would be:
Clarify reasons for evaluation
Control group completing package as compulsory
Analysis of free response questions
Collection of qualitative data through interviews
Greater clarity of graphical representations
Feedback from non-student agents
Aims must be stated at the outset and should be clearly defined, including the boundaries of the study and the terminology used; this would ensure the design matched the intended purpose, as well as allowing the reader to ‘see where it’s going’.
If a control group were required to complete the package, judgements could be made about which support components were most used and valued by students, rather than making assumptions from information provided by those who freely chose to participate (as these may well have higher ICT skills anyway). If all members of the control group participated, it may be possible to determine what additional support was actually required, much as was evidenced in the Cox (2007) study, where individual differences in learning were made more transparent. In relation to the use of the data, free response answers should be analysed and fully summarised; researchers cannot ‘ignore’ negative input or aspects brought to light which were not perhaps anticipated at the start of the study.
Qualitative data could be collected through interviews, to encourage depth of response; it would be particularly important to gain insight into why some students felt the support was ‘effective’. As Bell (2005) suggests, a guided interview could be used, so that questions would be focussed around support package issues, but with greater allowance made for issues not covered by the researchers’ initial ideas on support and its effectiveness. If data is collected only from those who freely access the packages, then applying the results to the whole HEI population would be unreliable.
Best use should be made of the data collated, and simpler graphical representations would support the discussions more strongly; for example, presenting the ‘attitudes’ data as a graph would have illustrated the differences between the ‘pre’ and ‘post’ results more clearly than the table format.
Feedback from non-student agents would be an integral part of a well balanced study where the ‘effectiveness’ of the packages is being measured; quantitative data may express grades, but showing a true correlation with use of support may be challenging. Insight into actual performance could be gained by asking tutors whether weaknesses were still apparent in those who had completed the packages; from experience, it is not the taking part in the learning that provides the results, but the ability to transfer the skills acquired.
Overall, the NKI-study is seen as ‘better’, due to the intentions being stated and, to a large extent, met. However, as Tolmie (2001) suggests, perhaps the complete online experience needs to be ‘…examined, not just the use of the technology in isolation…’ (p. 237).
(Word count 2112)
References
Bell, J. (2005) Doing Your Research Project: A guide for first-time researchers in education, health and social science, (4th edn), Buckingham, Open University Press.
Bos, N., Olson, J., Gergle, D., Olson, G. and Wright, Z. (2002) ‘Effects of four computer-mediated communications channels on trust development’ in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: Changing Our World, Changing Ourselves, Minneapolis, Minnesota, 2002, New York, NY, ACM; also available online at http://libezproxy.open.ac.uk/login?url=http://doi.acm.org/10.1145/503376.503401 (Accessed 3 May 2008).
Cox, R. (2007) ‘Technology-enhanced research: educational ICT systems as research instruments’, Technology, Pedagogy and Education, vol. 16, no. 3, pp. 337–56; also available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1080/14759390701614470 (Accessed 19 May 2008).
Davies, J. and Graff, M. (2005) ‘Performance in e-learning: online participation and student grades’, British Journal of Educational Technology, vol. 36, no. 4, pp. 657–63; also available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1111/j.1467-8535.2005.00542.x (Accessed 5 May 2008).
Nardi, B., Schiano, D., and Gumbrecht, M. (2004) ‘Blogging as social activity, or, would you let 900 million people read your diary?’ in Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work, Chicago, Illinois, New York, NY, ACM; also available online at http://libezproxy.open.ac.uk/login?url=http://doi.acm.org/10.1145/1031607.1031643 (Accessed 12 May 2008).
O’Donnell, C.M., Sloan, D.J. and Mulholland, C.W. (2006) ‘Evaluation of an online student induction and support package for online learners’, European Journal of Open, Distance and E-Learning [online], 2006/I, http://www.eurodl.org/materials/contrib/2006/Catherine_M_O_Donnell.htm (Accessed 3 May 2008).
Rekkedal, T. and Paulsen, M.F. (1998) NKI Internet Students – Experiences and Attitudes. The Third Generation NKI Electronic College: An Evaluation Report Written for the Leonardo Online Training Project [online], http://nettskolen.nki.no/forskning/33/evaluati.htm (Accessed 20 May 2008).
Rekkedal, T. and Qvist-Eriksen, S. (2004) ‘Support services in e-learning – an evaluation study of students’ needs and satisfaction’, European Journal of Open, Distance and E-Learning [online], 2004/I, http://www.eurodl.org/materials/contrib/2004/Rekkedal_Qvist-Eriksen.htm (Accessed 1 May 2008).
Tolmie, A. (2001) ‘Examining learning in relation to the contexts of use of ICT’, Journal of Computer Assisted Learning, vol. 17, no. 3, pp. 235–41; also available online at http://libezproxy.open.ac.uk/login?url=http://www.blackwell-synergy.com/links/doi/10.1046/j.0266-4909.2001.00178.x (Accessed 2 April 2008).
Salmon, G. (2000) E-moderating: The key to teaching and learning online, London, Kogan Page.
Somekh, B. (2000) ‘New Technology and Learning: Policy and Practice in the UK, 1980–2010’, Education and Information Technologies, vol. 5, no. 1, Netherlands, Springer; also available online at http://www.springerlink.com/content/m632j4315m210154/ (Accessed 10 May 2008).
Tyler-Smith, K. (2006) ‘Early Attrition among First Time eLearners’, Journal of Online Learning and Teaching [online], vol. 2, no. 2, http://jolt.merlot.org/Vol2_No2_TylerSmith.htm (Accessed 15 May 2008).
TMA02
Part 1
There was no personal experience of wikis prior to participating in the activity; a posting and a comment were made, but virtually no interaction took place. This would imply that learning was independent – learning as a personal construction of knowledge (Von Glasersfeld, 1995). This appeared to be at odds with my comprehension of a potential strength of the wiki, that it could enable learning from a socio-cultural perspective (Cobb, 1994), through knowledge sharing. Was there, then, a lack of motivation to complete this task?
If, as many suggest (Roschelle, 1992; Cole, 1996; Rogoff, 1999), learning is achieved through some form of social collaboration, a number of theoretical positions support this; for example, the observation of learning as a social activity is the foundation of Activity Theory (Vygotsky, 1974), with learning being created over time, through tasks. This theory is explicit in its use of cultural tools (whether language or physical tools), which may have implications for who has access to the learning space if it is technologically determined. However, it is made clear that it is the activity which contributes to learning, not merely the presence of others; more recent work has also suggested that activity and collaboration are key aspects of on-line learning (Salmon, 2000). There are, therefore, a range of issues to consider from an activity theory perspective.
Some may view the wiki as a Community of Practice (Lave and Wenger, 2002), but I would oppose this analysis, due to the lack of interaction, little evidence of ‘learning from masters’ and the fact that ‘situated opportunities’ did not occur; additionally, as Wenger suggests, Communities of Practice (CoP) cannot be created as such, but are developed from the practice itself. The wiki could, however, be seen as correlating more strongly with the Jones and Preece (2006) concept of ‘Communities of Interest’ (CoI). While I do not necessarily agree with their assumption of CoP being more organisationally-led, the view of the wiki as a CoI may prevail, as it has the valid component of a common interest (in carrying out the course task), whether or not the learning goal occurs. In opposition to this, though, the wiki did not fit other suggested criteria of a CoI, as it did not develop in an organic way and was not ‘open to all’, but was determined by an educational institution’s constraints. Jones and Preece (2006) also discussed ‘usability’ (p. 118) – the ‘features and functions’ of technological access – and this may have particular relevance to the wiki issues, where the level of IT cognition and accessibility issues may have determined the low level of participation, rather than the task itself.
The Tolmie (2001) research on software prompts was of interest; an initial view was taken that the software element of the technology tools was not part of the learning process. In the case of the wiki, participants may have had to negotiate a learning path with new technological tools; learning to use these would be very much a part of the learning experience as a whole. Some participants may have learned much from this, although these would have been unintended outcomes, more consistent with the usability of learning tools and collaboration through IT (Jones and Preece, 2006).
Another key matter is that of narratives, which are raised regularly in educational research and appear to be a crucial component of learning; from Bruner’s (1996) tenets, where narratives were a way of explaining things (how they happened), to more recent discussions of narratives in a non-linear form (Laurillard et al, 2000). This may be one component that is missing from the wiki and possibly why it was found difficult to ‘learn from’; there appeared to be no narrative thread, no order, nor any method to easily follow the narrative line of a particular topic. To use the idea of Crook and Dymott (2005), in comparing reading practices between page and screen text, the wiki does not afford the opportunity to easily reflect on the narrative thread; as Greeno et al (1999, as cited in McCormick and Paechter) suggested, readers use many ‘repair strategies’ to aid comprehension and these may not be as accessible in an on-line format. Additionally, these technological tools did not support a ‘narrative thread’ facility and, being a fairly visual learner, it was difficult to create an overall picture of a topic; perhaps diagrams such as a concept map would have aided knowledge construction.
If using the wiki for educational research purposes, one would need to be clear on which aspect was being examined; for example, viewing the wiki purely as a technological tool would involve very different research from viewing it as an activity system, as Jonassen and Rohrer-Murphy (1999) discuss. Using any of the socio-cultural theories in education would imply designing the learning structure to aid the construction of knowledge through interactions and activity, but this would also be dependent on the context in which the learning is set (Tolmie, 2001). Crook and Dymott (2005) state that ‘…computers are a technology for collaboration…’ (p. 106); linking this with activity theory, wikis could be identified as a meeting point of technological tools and theories of learning through collaboration.
To organise these ideas, two classifications from Conole et al (2004) have been used: ‘Activity based’ and ‘Socially situated’ theories (see Appendix 1 for a personal knowledge construction of this).
Activity based theories appear to imply that methodologies which analyse the social interactions during tasks and/or the resultant changes in independent knowledge would be appropriate, suggesting some statistical analysis. However, the activities are bound up with the context in which the learning takes place, as Jonassen and Rohrer-Murphy (1999) suggest, and therefore qualitative methods would be just as appropriate.
Socially situated theories are generally dependent on selected variables, specific not only to the learner but also to the artefacts, setting, culture, and so on; taking this theoretical stance would suggest that qualitative methods may be more appropriate, to gather the required depth of data.
Wiki research raises such questions as:
Are some learning subjects more fitting to wiki use?
How often are specific artefacts used; when and where?
How could usability features be assessed?
Did learners find it easy to negotiate the material/course?
Considering the first two questions, if quantitative methods were used, this could take the form of analysing statistical results from previous research, gathered from a range of faculties and institutions, or tracking the level of usage of different artefacts. While Jonassen and Rohrer-Murphy (1999) suggest a qualitative approach is required to analyse activity theory, numerical data on, for example, the number of interactions in a particular subject’s collaboration space would be of value in assessing trends of usage. One could also track how often the various parts of courses were used (such as case studies), and this could feed forward into future course design. A clear disadvantage of these methods would be the lack of context: a loss of any temporal significance and of the nuances of group dynamics, which may be part of the learning development. Using quantitative methods to research what is viewed as a collaborative learning theory would need to be taken from the perspective of addressing a stakeholder’s need for ‘real evidence’, in the form of numerical data.
From the perspective of socially situated theories, the most appropriate research methods would be qualitative, to analyse existing teaching and learning structures and to gain insights into the thoughts of learners; for example, Laurillard et al (2000) saw ‘real learning’ observations and transcripts of the dialogue as crucial to their investigation. Ethnographic observations and transcribing are, of course, time-consuming activities; however, a key benefit would be in using the results to identify what could be better supported, much as suggested by Conole et al (2004), when using a framework such as the learning design model to analyse current practices. There may be limitations to this approach, as the researcher has great influence over the assumptions made and the objectivity of the analysis; there may also be pressures of an institutional or organisational nature. A key weakness may be the narrowness of any conclusions drawn and the problem of trying to make theoretical ideas or models ‘fit’ the evidence; for example, in fitting the PGCE case to their framework, Conole et al did not appear to have considered how important external input to this course may be, and what perhaps needed to be changed were the tasks relating to this visit, not the entire component.
It would be appropriate to state that any theoretical perspective and any methodology could be used in research, provided that this was explicitly stated and clarified within the context of the investigation or evaluation being carried out. The wiki use overall raises too many questions; the research questions would, therefore, be ultimately determined by what answers were wanted and by whom.
Part 2
Laurillard’s (2002) Conversational Framework (CF) was a model developed to make the process of learning, through teacher and learner interactions, more transparent. These ‘narrative lines’ were seen as operating discursively and interactively, suggesting two levels; however, the process appears cyclical in nature, much like any system with a feedback loop mechanism (Fleming and Levie, 1993).
A strength of the CF is that it is a clear, simple model, applicable to almost any learning landscape; this may be one reason researchers have appropriated the framework as an introductory point, to emphasise the role dialogue has in building knowledge through interactions (for example, McAndrew et al, 2002). A weakness of the framework is the difficulty of viewing the quantity and quality of the interactions; a three-dimensional model would be necessary to analyse this depth. Another weakness, cited by researchers (such as Lee, 2006), was the incompleteness of the research; this was then used as a platform to extend the framework to their specific area of interest, particularly in the use of technological dialogue.
Conole et al (2004) introduced the CF as a model of learning and suggest that such models support specific theoretical views; this is used as a starting point to consider how most pedagogical theories are not designed to deal with e-learning. In an attempt to highlight theory features and match these to appropriate e-learning methods, a range of theories were categorised, with the CF allocated to the literature of ‘socially situated’ learning, which is dependent on language and culture; as these variables are very closely aligned with conversations, this seemed appropriate. However, Conole et al suggest that all aspects of their model are used except ‘non reflection’, so this appears to indicate that the CF is a ‘good’ theory for online e-learning.
McAndrew et al (2002) used the CF to suggest a change in the power dynamics of learning, but it was not clear if the statement of ‘…weakening the image of a teacher in the system…’ (p. 155) was a positive reinforcement of increasing facilitative skills, which are perhaps more greatly needed in online communities, or referring to a general move towards a greater level of collaboration in learning environments.
Weller et al (2005) appropriated the CF to analyse audio conferencing, and it was shown to be a useful model for clarifying the interactions of the narrative lines. However, it is debatable whether the discussions taking place were actually ‘…in Laurillard’s (2002) conversational framework…’ (p. 70); this is where the researchers have placed the dialogue, but it is not a known fact.
Lee (2006) is keen to use the CF to highlight the issues of both cyclical learning and the use of an activity-based theory framework. This is taken further to suggest the model also distinguishes between academic and experiential knowledge, but this analysis may only be valid in particular learning contexts.
It is accepted that the assessment of the CF here is very brief, due to time constraints; further research would have allowed much greater depth of evaluation.
(Word count 2000)
Appendix 1
(Unable to load the diagram here - apologies)
(Needs to have link with subject matter
Dependent on cultural context; changes over time
Which subject or discipline is most appropriate?
No of times different tools used; where and when?
Collect quantitative data; primary statistical methods, secondary references to surveys
Usability factors (such as access)
Requires clear structure and navigability
How could usability features be assessed? Was it easy?
Did learners make sense of it; was it logical?
Collect qualitative data; observational, dialogue analysis
Either could be an evaluation of previous research)
Reference list
Bruner, J. (1996) The culture of education, Cambridge, Harvard University Press.
Cobb, P. (1994) ‘Where Is the Mind? Constructivist and Sociocultural Perspectives on Mathematical Development’, Educational Researcher, vol. 23, no. 7, pp. 13–20.
Cole, M. (1996) Cultural Psychology: a once and future discipline, Cambridge, Harvard University Press.
Conole, G., Dyke, M., Oliver, M. and Seale, J. (2004) ‘Mapping pedagogy and tools for effective learning design’, Computers & Education, vol. 43, nos. 1–2, pp. 17–33. Available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1016/j.compedu.2003.12.018 (Accessed 5 February 2008).
Crook, C. and Dymott, R. (2005) ‘ICT and the literacy practices of student writing’ in Monteith, M. (ed.) Teaching Secondary School Literacies with ICT, Maidenhead, Open University Press.
Fleming, M. and Levie, W.H. (1993) Instructional Message Design: Principles from the Behavioral and Cognitive Sciences (2nd edn), Englewood Cliffs, NJ, Educational Technology Publications.
Greeno, J.G., Pearson, P.D. and Schoenfeld, A.H. (1999) ‘Achievement and Theories of Knowing and Learning’ in McCormick, R. and Paechter, C. (eds) Learning and Knowing, London, Sage Publishing.
Jonassen, D. and Rohrer-Murphy, L. (1999) ‘Activity theory as a framework for designing constructivist learning environments’, Educational Technology Research and Development, vol. 47, no. 1, pp. 61–79. Available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1007/BF02299477 (Accessed 5 February 2008).
Jones, A. and Preece, J. (2006) ‘Online communities for teachers and lifelong learners: a framework for comparing similarities and identifying differences in communities of practice and communities of interest’, International Journal of Learning Technology, vol. 2, no. 2–3, pp. 112–37.
Lave, J. and Wenger, E. (2002) ‘Legitimate peripheral participation in community of practice’ in Harrison, R., Reeve, F., Hanson, A. and Clarke, J. (eds) Supporting Lifelong Learning, Volume 1: Perspectives on Learning, London, RoutledgeFalmer in association with the Open University.
Laurillard, D. (2002) Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies (2nd edn), London, RoutledgeFalmer.
Laurillard, D., Stratfold, M., Luckin, R., Plowman, L. and Taylor, J. (2000) ‘Affordances for learning in a non-linear narrative medium’, Journal of Interactive Media in Education (JIME), vol. 2. Available online at http://www-jime.open.ac.uk/00/2/ (Accessed 1 April 2008).
Lee, N. (2006) ‘Design as a learning cycle: A conversational experience’, Studies in Learning, Evaluation, Innovation and Development [online], vol. 3, no. 2, pp. 12–22. Available at http://sleid.cqu.edu.au (Accessed 5 April 2008).
McAndrew, P., Mackinnon, L. and Rist, R. (2002) ‘A framework for Work-Based Networked Learning’, Journal of Interactive Learning Research, vol. 13, nos. 1–2, pp. 151–68. Available at http://kn.open.ac.uk/document.cfm?documentid=6503 (Accessed 9 April 2008).
Rogoff, B. (1999) ‘Cognitive Development through Social Interaction: Vygotsky and Piaget’ in Murphy, P. (ed.) Learners, Learning and Assessment, London, Sage Publishing.
Roschelle, J. (1992) ‘Learning by collaborating: convergent conceptual change’, Journal of the Learning Sciences, vol. 2, no. 3, pp. 235–76.
Salmon, G. (2000) E-moderating: the key to teaching and learning online, London, Kogan Page.
Tolmie, A. (2001) ‘Examining learning in relation to the contexts of use of ICT’, Journal of Computer Assisted Learning, vol. 17, no. 3, pp. 235–41.
Von Glasersfeld, E. (1995) Radical Constructivism: A Way of Knowing and Learning, London, Falmer Press.
Vygotsky, L.S. (1974) Mind in Society, Cambridge, Harvard University Press.
Weller, M., Pegler C. and Mason, R. (2005) ‘Use of innovative technologies on an e-learning course’, Internet and Higher Education, vol. 8, pp. 61–71.
There was no personal experience of wikis prior to participating in the activity; a posting and a comment were made, but virtually no interaction took place. This would imply that learning was independent – learning as a personal construction of knowledge (Von Glasersfeld, 1995). This appeared to be at odds with my comprehension of a potential strength of the wiki, that it could enable learning from a socio-cultural perspective (Cobb, 1994), through knowledge sharing. Was there, then, a lack of motivation to complete this task?
If, as many suggest (Roschelle, 1992; Cole, 1996; Rogoff, 1999), learning is achieved through some form of social collaboration, a number of theoretical positions support this; for example, the observation of learning as a social activity is the foundation of Activity Theory (Vygotsky, 1974), with the focus of learning being creating over time, through tasks. This theory is explicit in its use of cultural tools (whether through language or use of physical tools), which may have implications for who has access to the learning space, if it is technological determined. However, it is made clear that it is the activity, which contributes to learning, not merely the presence of others; more recent suggestions have also viewed that this is activity and collaboration is a key aspect of on-line learning (Salmon, 2000). There are, therefore, a range of issues to consider from an activity theory perspective.
Some may view the wiki as a Community of Practice (Lave and Wenger, 2002), but I would oppose this analysis, due to the lack of interaction, little evidence of ‘learning from masters’ and the fact that ‘situated opportunities’ did not occur; additionally, as Wenger suggests, Communities of Practice (CoP) cannot be created as such, but are developed from the practice itself. The wiki could, however, be seen as correlating more strongly with the Jones and Preece (2006) concept of ‘Communities of Interest’ (CoI). While I do not necessarily agree with their assumption of CoP being more organisationally-led, the view of the wiki being a CoI may prevail, as it has the valid component of a common interest (in carrying out the course task), whether or not the learning goal occurs. In opposition to this, though, the wiki did not fit other suggested criteria of a CoI, as it did not develop in an organic way and was not ‘open to all’, but determined by an educational institution’s constraints. Jones and Preece (2006) also discussed ‘usability’ (p118) - the ‘features and functions’ of technological access - and this may have particular relevance to the wiki issues, where the level of IT cognition and the accessibility issues may have determined the low level of participation, rather than the task itself.
The Tolmie (2001) research on software prompts was of interest; an initial view was taken that the software part of the technology tools was not part of the learning process. In the case of the wiki, participants may have had to negotiate a learning path with new technological tools; learning to use these would be very much a part of the learning experience as a whole. Some participants may have learned much from this, although these would have been unintended outcomes, more concurrent with the usability of learning tools and collaboration through IT (Jones and Preece, 2006).
Another key matter is narratives, which is raised regularly in educational research and appears to be a crucial component of learning; from Bruner’s tenets (1996), where narratives were a way of explaining things, (how they happened); to more recent discussions of narratives in a non-linear form (Laurilland et al, 2000). This may be one component that is missing from the wiki and possibly why it was found difficult to ‘learn from’; there appeared to be no narrative thread, no order, nor any method to easily follow a narrative line of a particular topic. To use the idea of Crook and Dymott (2005), in comparing reading practices between page and screen text, the wiki does not afford the opportunity to easily reflect on the narrative thread; as Greeno et al (1999, as cited in McCormick and Paechter) suggested, readers use many ‘repair strategies’ to aid comprehension and these may not be as accessible in an on-line format. Additionally, as these technological tools did not support this ‘narrative thread’ facility and, being a fairly visual learner, it was difficult to create an overall picture of a topic; perhaps diagrams such as a concept map would have aided knowledge construction.
If using the wiki for educational research purposes, one would need to be clear on which aspect was being presented; for example, if the wiki was viewed purely as a technological tool, then this would involve very different research to the view of a wiki as an activity system, as Jonassen and Rohrer-Murphy (1999) discuss. Using any of the socio-cultural theories in education would imply designing the learning structure to aid the construction of knowledge through interactions and activity, but this would also be dependent on the context (Tolmie, 2001), in which the learning is set. Crook and Dymott (2005), state that ‘…computers are a technology for collaboration…’ (p106); in linking this with activity theory, wikis could be identified as a meeting point of the technological tools and theories of learning through collaboration.
To organise these ideas, two classifications from Conole et al (2004) have been used; ‘Activity based’ and ‘Socially situated’ theories (see appendix 1 for a personal knowledge construction of this).
Activity based theories appear to imply that the use of methodologies which analyse the social interactions during tasks and/or the resultant changes in independent knowledge, would be appropriate, suggesting some statistical analysis. However, the activities are bound in with the context in which the learning takes place, as Jonassen and Rohrer-Murphy (1999) suggest, and therefore qualitative methods would be just as appropriate.
As socially situated theories are generally dependent on selected variables, specific not only to the learner but also to the artefacts, setting, culture and so on, taking this theoretical stance could suggest that qualitative methods may be more appropriate, to gather the necessary depth of data.
Wiki research raises such questions as:
Are some learning subjects more fitting to wiki use?
How often are specific artefacts used; when and where?
How could usability features be assessed?
Did learners find it easy to negotiate the material/course?
Considering the first two questions, if quantitative methods were used, this could be in the form of analysing statistical results from previous research, gathered from a range of faculties and institutions, or tracking the level of usage of different artefacts. While Jonassen and Rohrer-Murphy (1999) suggest a qualitative approach is required to analyse activity theory, numerical data on, for example, the number of interactions in a particular subject’s collaboration space would be of value in assessing trends of usage. One could also track how often the various parts of courses were used (such as case studies) and this could feed forward to future course design. A clear disadvantage in using these methods would be the lack of context: the loss of any temporal significance and of the nuances of group dynamics, which may be part of the learning development. Using quantitative methods to research what is viewed as a collaborative learning theory would need to be approached from the perspective of addressing a stakeholder’s need for ‘real evidence’, in the form of numerical data.
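To give a flavour of what such usage tracking might look like in practice, the minimal Python sketch below tallies interactions per collaboration space. It is illustrative only: the CSV export, its column names (‘space’, ‘author’, ‘timestamp’) and the file name are my own hypothetical assumptions and do not come from any of the studies discussed.

import csv
from collections import Counter

def count_interactions(path):
    # Tally wiki edits per collaboration space from a hypothetical CSV export.
    # Assumed columns: 'space', 'author', 'timestamp' (illustrative only).
    edits_per_space = Counter()
    authors_per_space = {}
    with open(path, newline='', encoding='utf-8') as f:
        for row in csv.DictReader(f):
            space = row['space']
            edits_per_space[space] += 1
            authors_per_space.setdefault(space, set()).add(row['author'])
    # Report simple usage trends: total edits and distinct contributors per space.
    for space, edits in edits_per_space.most_common():
        print(f"{space}: {edits} edits by {len(authors_per_space[space])} contributors")

# Example call (hypothetical file name):
# count_interactions('wiki_edit_log.csv')

Such counts would show trends of usage, but, as noted above, they would say nothing about the context or quality of the interactions.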
From the perspective of socially situated theories, the most appropriate research methods would be qualitative, to analyse existing teaching and learning structures and to gain insights into the thoughts of learners; for example, Laurillard et al. (2000) saw ‘real learning’ observations and transcripts of the dialogue as crucial to their investigation. Ethnographic observation and transcription are, of course, time-consuming activities; however, a key benefit would be in using results to identify what could be better supported, much as suggested by Conole et al (2004), when using a framework such as the learning design model to analyse current practices. There may be limitations to this approach, as the researcher has great influence over the assumptions made and the objectivity of the analysis; there may also be pressures of an institutional or organisational nature. A key weakness may be in the narrowness of any conclusions drawn and the problem of trying to make theoretical ideas or models ‘fit’ the evidence; for example, in using the PGCE case and fitting this to their framework, Conole et al did not appear to have considered the vital issue of how important external input to this course may be, and perhaps what needed to be changed were the tasks in relation to this visit, not the entire component.
It would be appropriate to state that any theoretical perspective and any methodology could be used in research, providing that this was explicitly stated and clarified within the context of the investigation or evaluation being carried out. The wiki use overall raises too many questions; the research questions would, therefore, be ultimately determined by what answers were wanted and by whom.
Part 2
Laurillard’s (2002) Conversational Framework (CF) was a model developed to make the process of learning, through teacher and learner interactions, more transparent. These ‘narrative lines’ were seen as operating discursively and interactively, suggesting two levels; however, the process appears cyclical in nature, much like any system with a feedback loop mechanism (Fleming and Levie, 1993).
A strength of the CF is that it is a clear, simple model, applicable to almost any learning landscape; this may be one reason researchers have appropriated the framework as an introductory point, to emphasise the role dialogue has in building knowledge through interactions (for example, McAndrew et al, 2002). A weakness of the framework is the difficulty in viewing the quantity and quality of the interactions; a three-dimensional model would be necessary to analyse this depth. Another weakness, cited by researchers (such as Lee, 2006), was the incompleteness of the research; this was then used as a platform to extend the framework to their specific area of interest, particularly in the use of technological dialogue.
Conole et al (2004) introduced the CF as a model of learning and suggest that such models support specific theoretical views; this is used as a starting point to consider how most pedagogical theories are not designed to deal with e-learning. In an attempt to highlight theory features and match these to appropriate e-learning methods, a range of theories was categorised, with the CF allocated to the literature of ‘socially situated’ learning, which is dependent on language and culture; as these variables are very closely aligned with conversations, this seemed appropriate. However, Conole et al suggest that all aspects of their model are used except ‘non-reflection’, so this appears to indicate that the CF is a ‘good’ theory for online e-learning.
McAndrew et al (2002) used the CF to suggest a change in the power dynamics of learning, but it was not clear if the statement of ‘…weakening the image of a teacher in the system…’ (p155) was a positive reinforcement of increasing facilitative skills, which are perhaps more greatly needed in online communities; or referring to a general move towards a greater level of collaboration in learning environments.
Weller et al (2005) appropriated the CF to analyse audio conferencing and it was shown that this was a useful model to clarify the interactions of the narrative lines. However, it would be debatable whether the discussions taking place were actually ‘……in Laurillard’s (2002) conversational framework….’ (p. 70); this is where the researchers have placed the dialogue, rather than a known fact.
Lee (2006) is keen to use the CF to highlight the issues of both cyclical learning and the use of activity-based theory framework. This is taken further to suggest the model also distinguishes between academic and experiential knowledge, but his analysis may only be valid against particular learning contexts.
It is accepted that the assessment of the CF here is very brief, due to time constraints; further research would have allowed much greater depth of evaluation.
(Word count 2000)
Appendix 1
(Unable to load the diagram here - apologies)
C:\Documents and Settings\Carole\Desktop\Educational technology research Management\H809 TMAs\H809 TMA02 diagram.mht
(· Needs to have link with subject matter
· Dependent on cultural context; changes over time
· Which subject or discipline is most appropriate?
· No. of times different tools used; where and when?
· Collect quantitative data; primary statistical methods, secondary references to surveys
· Usability factors (such as access)
· Requires clear structure and navigability
· How could usability features be assessed? Was it easy?
· Did learners make sense of it; was it logical?
· Collect qualitative data; observational, dialogue analysis
· Either could be an evaluation of previous research)
Reference list
Bruner, J. (1996) The culture of education, Cambridge, Harvard University Press.
Cobb, P. (1994) ‘Where Is the Mind? Constructivist and Sociocultural Perspectives on Mathematical Development’, Educational Researcher, vol. 23, no. 7, pp. 13-20 American Educational Research Association.
Cole, M. (1996) Cultural Psychology: a once and future discipline, Cambridge, Harvard University Press.
Conole, G., Dyke, M., Oliver, M. and Seale, J. (2004) ‘Mapping pedagogy and tools for effective learning design’, Computers & Education, vol. 43, nos. 1–2, pp. 17–33. Available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1016/j.compedu.2003.12.018 (Accessed 5 February 2008).
Crook, C. and Dymott, R. (2005) ‘ICT and the literacy practices of student writing’ in Monteith, M. (ed.) Teaching Secondary School Literacies with ICT, Maidenhead, Open University Press.
Fleming, M. and Levie, W.H. (1993) Instructional Message Design: Principles from the Behavioral and Cognitive Sciences (2nd edn), Englewood Cliffs, NJ, Educational Technology Publications.
Greeno, J.G., Pearson, P.D. and Schoenfeld, A.H. (1999) ‘Achievement and Theories of Knowing and Learning’ in McCormick, R. and Paechter, C. (eds) Learning and Knowing, London, Sage Publishing.
Jonassen, D. and Rohrer-Murphy, L. (1999) ‘Activity theory as a framework for designing constructivist learning environments’, Educational Technology Research and Development, vol. 47, no. 1, pp. 61–79. Available online at http://libezproxy.open.ac.uk/login?url=http://dx.doi.org/10.1007/BF02299477 (Accessed 5 February 2008).
Jones, A. and Preece, J. (2006) ‘Online communities for teachers and lifelong learners: a framework for comparing similarities and identifying differences in communities of practice and communities of interest’, International Journal of Learning Technology, vol. 2, no. 2–3, pp. 112–37.
Lave J. and Wenger E. (2002) ‘Legitimate peripheral participation in community of practice’ in Harrison, R., Reeve, F., Hanson, A., Clarke, J. (eds.) Supporting Lifelong Learning Volume 1 Perspectives on learning, London, RoutledgeFalmer in association with the Open University.
Laurillard, D. (2002) Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies (2nd edn), London, RoutledgeFalmer.
Laurillard, D., Stratfold, M., Luckin, R., Plowman, L. and Taylor, J. (2000) ‘Affordances for learning in a non-linear narrative medium’, Journal of Interactive Media in Education (JIME), vol. 2. Available online at http://www-jime.open.ac.uk/00/2/ (Accessed 1 April 2008).
Lee, N. (2006) ‘Design as a learning cycle: A conversational experience’, Studies in Learning, Evaluation, Innovation and Development [online], vol. 3, no.2, pp. 12-22. Available at http://sleid.cqu.edu.au (accessed 5th April 2008).
McAndrew, P., Mackinnon, L. and Rist, R. (2002) ‘A framework for Work-Based Networked Learning’, Journal of Interactive Learning Research, vol. 13, nos. 1–2, pp. 151–168. Available online at http://kn.open.ac.uk/document.cfm?documentid=6503 (Accessed 9 April 2008).
Rogoff, B. (1999) ‘Cognitive Development through Social Interaction: Vygotsky and Piaget’ in Murphy, P. (ed.) Learners, Learning and Assessment London, Sage Publishing.
Roschelle, J. (1992) ‘Learning by collaborating: convergent conceptual change’, Journal of the Learning Sciences, vol. 2, no. 3, pp. 235–76.
Salmon, G. (2000) E-moderating: the key to teaching and learning online, London, Kogan Page.
Tolmie, A. (2001) ‘Examining learning in relation to the contexts of use of ICT’, Journal of Computer Assisted Learning, vol. 17, no. 3, pp. 235–41.
Von Glasersfeld, E. (1995) Radical constructivism: A way of knowing and learning. London, Falmer Press.
Vygotsky, L.S. (1974) Mind in Society, Cambridge, Harvard University Press.
Weller, M., Pegler C. and Mason, R. (2005) ‘Use of innovative technologies on an e-learning course’, Internet and Higher Education, vol. 8, pp. 61–71.
The end of the blog....
Well, it's now some time since I posted anything to this blog. As far as the H809 course goes, I seemed to do so much for Block 2 of the course (readings, input to wikis and TGF discussions, own research, as well as this blog) that I felt quite exhausted by the time TMA02 was submitted. I didn't seem to have the same motivation and energy and was probably suffering from the 'blog burnout' that Nardi et al discuss.
I am going to post TMA02 and TMA03 here, so that this blog reflects the course up to this date. I am then planning to start a new blog for the ECA component, as I think this could be a 'working document' and be continued after the course end, as I further research blog use. I'll post the 'new' blog address here, as soon as I have started this and will, of course, be open to comments on the development of my project.
Thursday, 3 April 2008
Laurillard et al. (2000)
Affordances for learning in a non-linear narrative medium
The first thing I had to ‘get my head round’ was the idea of ‘affordances’; how did I perceive this concept? Well…as how we can use something; what we can ‘take’ from an available set of tools; the value or equity of the way we use something? The affordance could also be a suggestion of how something could be used or be the method most suitable to lend itself to the task. I also saw here one of the key issues of a previous reading I had reviewed (Lave and Wenger, a ‘teaching’ versus a ‘learning’ curriculum) – the issue of the teaching may be designed in a certain way, but does this mean it is accessible (in all dimensions) and of most use to the student?
In reading this paper, I was also struck by the fact that I was carrying out a reading task as discussed by Crook and Dymott; I printed the reading out and viewed the diagram on page 4 with the text on page 5. If I had read this online, I would have been distracted by the technology and used a lot of time in trying to have these pages side by side on the screen. The diagram certainly made more sense when viewed here - http://www2.smumn.edu/deptpages/~instructtech/lol/laurillard/ (interactive CF diagram).
1. What are the central arguments? How do the authors intend to link these to the framework described?
The central argument is that learning is a process carried out between teacher and learner and a ‘narrative line’ is required for the learning to take place. The process of interaction and feedback is shown diagrammatically, as a conversational framework (although this seems to be cyclical in nature – ‘cycles within a sequence’, p5).
2. What are the key strengths of the CF in terms of what it highlights about the learning process? Can you see any limitations in this approach?
The strength of the CF is that it is a clear, simple model and could be applicable to almost any learning situation. From figure 1 on p. 4, the limitations are that the quantity and depth of interactions cannot be seen (the interactive version assists this understanding).
3. Re the sequence of teacher–student interactions; can you think of an alternative sequence, perhaps from your own experience, which could be mapped using the CF?
There is, of course, the problem of lack of interaction, which can occur at times. The teacher may ask for help from ‘stronger’ students and peer interventions in feedback have also aided learning. Some subject matter lends itself to greater group discussions and peer feedback; also, problem solving activities create different lines of feedback and interaction.
Data collection and analytical techniques – limitations, alternative methods, benefits?
The data collected included dialogue of lesson interactions, observational notes and transcripts of dialogue (video), assignment reviews and feedback on work carried out. Short dialogue clips can be used to highlight project issues, but there may be a danger of bias, as the whole context is not available to the reader (e.g. prior experience of this type of learning, level of IT skills). It may have been useful to record the number of task/process interactions; does this have implications for learning taking place, or taking place at a deeper level?
The pros and cons of the two-stage design process?
If wider data is collected and analysed in the second design, how will it be possible to compare the ‘results’? The second design highlighted ‘testing’ as an integral component, which was not evident in the first design; however, the original design was used to see areas where improvement of the design could take place, so there must be an argument for why one would ‘waste time’ carrying out the same process when the potential design benefits have already been identified.
I had some other thoughts on this paper.
On page 8, it appears that the authors are almost ‘blaming the technology’ when the students did not seem to have goals or progress towards them; is this not just poor teaching practice, in that the students were not completely clear on the session objectives? This is apparent on page 11, where the new design included ‘…a clear statement of an overall goal…’. I was unclear here why the use of Notepad was included – was there not access to Word or a similar method of creating notes? Why use this? Did students already have the skills to use this or not? Would this impact on how students used this method of recording the task? I considered the use of an audio summary (p. 12) a much greater affordance for constructing their conceptual knowledge than ‘writing’ tasks. Will ‘creating affordances’ mean that greater interaction, and therefore learning, will actually occur?
Tolmie, Crook and Dymott
While Tolmie seemed to take a psychological stand on learning taking place through a constructivist perspective ('in the head'), looking at how this learning was achieved through technology, the issues raised appeared to lie more strongly with a socio-cultural perspective. I saw this through the issues of differences raised, which were culturally determined, such as gender and background, as well as being set within a pre-existing educational framework. Crook and Dymott appeared to be more transparent in their context setting and stated that the research has its roots in activity theory, where the complex relationship of person, system and culture determines the learning and how it may be achieved. It was clear that one could not view the technology, and the various ways in which it might be used, as separate from the society and culture of which it is a part.
The Tolmie reading certainly made allowances for differences and while this may be seen as 'good practice', it led to research evidence which was 'unseen' and this meant it was difficult to draw any solid conclusions - just build awareness of differences. Crook and Dymott appeared to have more visible and recordable evidence, but there still appeared to be an assumption that technology was changing the writing activities, with little input from other aspects of changes in society.
The style of the Tolmie research would be appropriate when there may be identifiable differences between groups to be included in a project, but the measuring tools would need to be reviewed to clarify exactly what is being measured. Crook and Dymott's methods could be realistically used to review other applications, such as the use of software in the financial sector of learning; their narrative style seemed very appropriate for this type of observational study.
Wednesday, 26 March 2008
Crook and Dymott (2005)
Crook and Dymott (2005) ICT and the literacy practices of student writing
Crook and Dymott adopt a different theoretical approach to learning and context and propose that writing and technology are mutually constitutive; the context is not one which is a ‘surround’, as discussed by Tolmie, but a part of the whole learning.
The current theoretical school of Activity Theory is a modern descendant of Vygotsky’s approach to learning, who suggested things are ‘done through’ (mediated by) cultural artefacts, such as language, manmade tools, socially categorizing and institutionalising. Cultural psychologists define learning as becoming expert in the use of these cultural artefacts and therefore learning is classified as a social activity (even if this involves reading in physical isolation, as the reading and text and the book are culturally determined). Vygotsky (1981) wrote that ‘by being included in the process of behavior, the tool [artefact] alters the entire flow and structure of mental functions’ (p. 137). This is taken further by those who see learning as participation, using artefacts, which are a fundamental part of the learning (‘distributed cognition’ where the team and artefacts work together). If this is the view, then situated learning is specifically dependent on the context (culture and artefacts) within which the setting is taking place (Rogoff, for example); our learning is dependent on the situation we are present in and the ‘way of doing things round here’, in a social world.
The seminal text in situated cognition was Brown, Collins and Duguid’s article in Educational Researcher (1989). The authors described an apprenticeship model of learning closely related to the work of Lave (1988) and the idea of ‘legitimate peripheral participation’. Socio-cultural theory does not focus on measuring learning with ICT in terms of what is in an individual’s head or in terms of learning outcomes. Instead the focus is on individuals-using-technology-in-settings (Crook, 1994). Learning is conceptualised as a social endeavour and qualitative research methodologies have been adopted to investigate this area because the focus of inquiry is on the processes of learning and on meaning making in social settings.
As much of the above is very relevant to the reading, this (from the H809 course text) has been included, with some adaptations.
Thoughts on the reading:
The five aspects of writing play very specific parts in describing the activity of writing, to ensure the reader can create a rich picture of the relationship between the technology and how the processes related to writing are carried out. I think it would be fair to say that I had not considered the different ways in which we use text on the screen as opposed to text on a page. The issue of ‘…pages …could be turned without distracting visual attention from reading…’ (p102) is very relevant and may be an insight as to the difficulty I have in ‘screen reading’; there are too many potential distractions and the ease of moving to other tasks is not one which necessarily supports a physical book reading task. The other issue I considered of high importance was in relation to the socialising aspect of using the PC; the use of technology in collaborative activity, but, interestingly, not as in creating essays collaboratively. Perhaps this is still an ‘in the head’ task, whether through exam, personal task or within the classroom context. Overall, the five faces of writing as stated here offer opportunities to develop ideas of writing as a shared, culture-based activity; activity theory that is technologically driven?
The different aspects will necessarily have an ‘effect’ on the writing; this may be to some extent linked with the skills of the learner and the range of tools (technological artefacts) that the individual engages with. The processes carried out, however, may not ‘constitute’ writing; this will depend on the end product and whether the text is formatted in a coherent and logical fashion (as dictated by the cultural setting, perhaps?). It appears that, while the information gathering tasks can be viewed as situated, possibly collaborative (distributive, if consensus agreed?), the actual writing is a task carried out ‘from the head’ to the ‘paper’. As students used the resources in such diverse ways, we should firstly ask what learning it is that we are considering. Is it the production of a piece of work or is it learning the cultural construction of this (a ‘learning to learn’ task)? The assignment produced will be mediated through a selected range of artefacts; the prescribed marking scheme for this piece of work may dictate that some marks are achieved by specifically referring to use of some of these or using these explicitly (for example, on line research references or a document formatting expectation), so in this sense the learning is mediated. If I were to assess (I am assuming ‘mark’ and not ‘assess for research’?) some of the students’ assignments described in this chapter, I would be guided by what the assessment criteria were, so necessarily on the end product and not the process.
Tuesday, 25 March 2008
Tolmie (2001)
Tolmie (2001) Examining learning in relation to the contexts of use of ICT
If, as an educationalist, you think of learning as ‘in the head’ (Bredo) and the surroundings as external, you cannot possibly ‘know’ what is being learned without using external measurement tools (some sort of assessment method). This ‘symbol processing’ means that we each have our own interpretations and representations; but these must be influenced by the culture we are situated in, as we need common understanding, or at least communication (such as language), to share this knowledge. But what, then, is ‘tested’: the knowledge or the skills of communicating it? The context within which the learner is learning – the surroundings and the paraphernalia – and the learner’s ability to use these, may determine the level or ‘grade’ of the knowledge, once it is no longer ‘in the head’.
This article was interesting from the perspective of assuming that a tool, such as a computer, will affect how a learner learns, as well as making note of the likely impact of other contextual aspects, such as institution and gender groupings. Tolmie’s paper, from the outset, makes an assumption that ‘something is happening in the head’, as opposed to purely mechanical occurrences (in the abstract, the use of ‘interplay’ suggests correlation between a variable and the context). There is, however, an awareness of the ‘…complexities of the educational process…’ and this may have determined why various contexts were assessed, specifically because they were variable.
In the study about gender effects, the learning was physically located within a secondary school; but was this the intended question, or are we suggesting that the location of the learning was situated, due to the interactions, and not ‘in the head’? The focus was on what learning took place as a result of pair work, using physics software (and its associated dialogue?); the software prompts were mentioned as part of the technological toolbox, and were not originally considered as part of the learning process. The issue of gender seemed to be assumed as a ‘female’ culture, as it was generalised, rather than being viewed in terms of individual, personal characteristics, inputs and/or effects. There are numerous studies related to gender and learning and this view is consistent with that of Murphy, where the females ‘helped each other’.
In taking some sort of sociological stance, it is assumed that a range of variables, not just gender, will ‘be dependent on’, ‘be a function of’ or be ‘affected by’ the context; the context is inextricably linked to the learning.
Saturday, 22 March 2008
Jones and Preece (2006)
Online communities for teachers and lifelong learners
1. How do the authors define online communities? Is this different from the way you have encountered the term being used elsewhere?
The term is defined from one stated by Preece (2000): a group of people with aims, guided by policies and supported by computer technology. The key aspects highlighted are ‘people, purpose, policies, computing technology’ and it is suggested that these components should be focused on to create a successful online community. I was interested to note the authors’ distinction between CoPs and CoIs and, while I don’t necessarily agree with the assumption of CoPs being substantially organisationally bound, I do see the necessity to differentiate here.
This definition fits with my understanding of an online community: the flexibility to survive as a solely online entity, but with the capability to exist in a ‘real world’ set-up or to survive in any format between these extremes.
2. Note that the Dublin group met in the pub to maintain social cohesion. What challenges does this raise for research methods that might be adopted to track this behaviour?
The research was to analyse online communities; by meeting and being an observer in the ‘real world’ activities of the teachers’ group, the researchers will gain insights into the interactions of the group, which may make it impossible to compare the group’s online interactions to those of a ‘virtual’ group. In tracking online behaviour, the researcher can remain ‘invisible’; not so in the ‘real world’, where the physical presence of ‘others’ could alter the dynamics, interactions and actions of the group.
Additionally, the researchers will have put ‘names to faces’ and interpretations of future contributions of individuals could be altered due to this additional knowledge of an individual. As stated in the report, the teachers’ group created the collaboration by meeting face-to-face; it is very difficult to compare this to groups where collaboration is negotiated online.
What is meant by the phrase ‘…healthy…community…’? This is used on a couple of occasions in the paper.
Characteristics of the framework;
Sociability – interactions; personal characteristics; attitude; recognition; reciprocity; social presence
Usability – features and functions of software; information resources; navigation tools; access; format of dialogue
The Conole et al model would not really be useful in relation to this reading, as the emphasis here is not on learning. The Jones and Preece framework was built up to analyse, develop and maintain existing online (and/or blended) groups and their interactions, regardless of any learning outcomes or intentions. The Conole et al. framework looked at pedagogical processes, tools and techniques, and appeared to be aimed at developing education and learning online in a formal manner. The Jones and Preece framework fits into a socially situated group, though possibly activity based for the Dublin teachers. This appears to highlight the individual, information and non-reflective components for the knee injuries group and the social, experience and reflection components for the teaching group.
Jones, A. and Preece, J. (2006) ‘Online communities for teachers and lifelong learners: a framework for comparing similarities and identifying differences in communities of practice and communities of interest’, International Journal of Learning Technology, vol. 2, no. 2–3, pp. 112–37.
Friday, 21 March 2008
Conole et al (2004)
Conole et al. (2004) Mapping pedagogy and tools for effective learning design
1. Who do the authors see as the main audience for this paper?
The main audience appears to be practitioners.
2. What is the main aim of the paper?
To make more transparent the links between pedagogical theories and e-learning methods; to suggest a model for assessing how current teaching and learning practice could be adapted to make best use of available technology for relevant theoretical perspectives.
3. Fit the readings from the course into Table 1 of the Conole et al. paper?
Mmmm…do they ‘fit’? I could only see two obvious links.
Wegerif and Mercer – socially situated learning – ‘…the joint construction of knowledge in the classroom…’
Roschelle – Activity based (with the focus on ‘…convergence…’) or perhaps ‘…socially situated…’ (re the use of language) or experiential (re experience as a foundation for learning).
This may be a learning point – yes, we can use any theoretical perspective, as long as it is justified.
The model maps the six components - Individual; Social; Reflection; Non-reflection; Information; Experience - or the three dimensions: information/experience, reflective/non-reflective and individual/social.
I was concerned about ‘skills learning’ being classified as ‘non-reflection’; maybe unconscious reflection would be fairer, as I see a lot of vocational learning and the passing on of tacit knowledge under the guise of ‘skills’.
I like the ideas of these dimensions and a form of classification; but I found the idea of a ‘cube’ difficult to grasp and an octahedron even more challenging. Perhaps I am used to seeing models of a more linear or 2-D nature. I would see a ‘shape’ of the categorisation more along the lines of
So, theory A might be plotted where this symbol (?) is and have the shape of an uneven arrow.
Not sure what makes me see this more clearly like this, but I am sure there will be a theory about it!
As I found it difficult to see specific theories of learning in each reading, it was again challenging to try to fit readings to the theories. For Wegerif and Mercer, I looked at collaboration, with the characteristics of teacher explaining, exploratory talk, problem solving, sharing information and whole-class discussions. I saw these as highlighting the social and experience aspects most strongly, but it could be possible to take a pre- and post-activity view of these, which may highlight the other aspects more strongly, when information is given. Perhaps many courses of teaching and learning would be seen in this light and that may be part of the learning journey.
I do agree that trying to incorporate or expand technology use in any course will be dependent on the design of particular activities, students’ prior knowledge, tutor personality (and skills mix?), group dynamics, learning styles and preferences. I also note the authors’ comments on the weakness of the model in terms of it being very subjective; however, surely much in teaching is?
As far as developing my own ‘taxonomy’ of learning theories, my first thought is that ‘It’s all been done before’; any phrases I come up with would necessarily be taken from some previous description of types. It would be along the lines of continuums of ‘in the head’ to ‘out there’, ‘passive’ to ‘active’ and perhaps ‘linguistically determined’ to ‘multi-sensory’. This would be a fascinating activity to spend more time on, but – guess what? – no time! It is the last day of week 7 and I feel I have worked non-stop on this for two days and I still have another reading to review and analyse. Wish the Easter bunny would come and take me away….
Thursday, 20 March 2008
Theories of learning
· Why are some theoretical approaches more popular than others?
This could be because some have been given greater status than others, either in the public domain or by decision-making bodies. In a more positive vein, the more popular approaches are likely to be those which have been ‘tried and tested’ by practitioners.
· Is it simply, as some practitioners suspect, changing fashion in theories?
Possibly (with links to status, as above); however, are the practitioners who purport that these ‘new approaches’ are a ‘fashion’ the same ones who seem unprepared to embrace new ideas (particularly if these are related to technology and often because they may be required to learn a new set of skills)?
· Does it depend on what you want to do with the theory?
Yes – definitely. As in politics, one can create a ‘case’ for either side of an argument, depending on what you would like to emphasise.
· Or on what types of learner you are dealing with and in what context?
Yes, again. Age of learner is one variable that often makes you focus on the theoretical aspects; for example, the reflections of a 16 year old with no experience of work are very different to those of a 35 year old. A learning task would need to be constructed with a different set and volume of questions to stimulate the responses of the learner with less ‘life’ experience. The context is also pertinent: if the learners are generally highly motivated (such as we OU learners), some theoretical approaches will carry with them assumptions that learners are able and willing to learn and that access is assured; many settings cannot assume this and, as a consequence, the theoretical approach is perhaps linked more closely with the social and cultural aspects.
Activity 7.2: Building a wiki of key concepts in learning theories
The following resources were accessed to gain an overview of the intended material.
· Greg Kearsley: ‘Theory into Practice Database’
· James Atherton: ‘Learning and Teaching’ website
· JISC website
· Wikipedia
There are, as I often say to my own students, ‘Too many TOES’ (Theories of everything and something). I was not sure what to post on the wiki; should I concentrate on theories I already know a little about (such as communities of practice) and explore the critics of these in more detail? Or should I look at theories I have never heard of (such as Functional Context theory, Sticht), to widen my knowledge? There is obviously little time to look at a range in any great depth at this stage and so I think I will revisit the theme of social and situated learning and post some thoughts on that topic.
Social and situated learning
A strong theoretical point was made by Lave and Wenger in relation to seeing the community as the learning curriculum and placing the learner as (initially) a peripheral newcomer. The learner finds out about all aspects of ‘the job’ by gradually becoming a more central part of the group, as the ‘old timers’ deliver both formal and informal instruction. This was seen by many as an apprenticeship model, but it is much more; the curriculum is seen not as a teaching one, constructed for the learner, but as a learning curriculum of ‘situated activities’. This idea rang strongly with me when I first came across it and had implications for the teaching of HND students; their association with work-based modules clearly reflected its application to classroom tasks. It also made me question the curriculum that is set (this certainly linked to policy and power; awarding bodies and industry/sector skills) and whether it needs to be more learner-driven.
One paper critical of some of the ideas feeding this perspective was from Fuller and Unwin (2005), who argue that much of the research was in relation to very limited populations (such as the AA and remote tailors) and that the associations that could be made with the contemporary workplace were limited. Another paper (Hodkinson and Hodkinson, 2004) also points out that Lave and Wenger ignore the crucial element of what the learner brings to the community as an individual and the implications this has for changing the community.
The OU library, and specifically journal articles, is very useful in terms of finding specific information relating to authors and/or concepts; however, I was also intrigued to come across http://www.learning-theories.com/communities-of-practice-lave-and-wenger.html, to which I am sure I will be returning when I am unclear on any theoretical perspective, as I found the site easy to view and navigate.
Fuller, A. and Unwin, L. (2005) ‘Older and wiser? Workplace learning from the perspective of experienced employees’, International Journal of Lifelong Education, vol. 24, no. 1, pp. 21–39.
Hodkinson, H. and Hodkinson, P.M. (2004) ‘Rethinking the concept of community of practice in relation to schoolteachers’ workplace learning’, International Journal of Training and Development, vol. 8, no. 1, pp. 21–31.
Lave, J. and Wenger, E. (2002) ‘Legitimate peripheral participation in communities of practice’ in Harrison, R., Reeve, F., Hanson, A. and Clarke, J. (eds) Supporting Lifelong Learning, Volume 1: Perspectives on Learning, London, RoutledgeFalmer in association with the Open University.
Wednesday, 19 March 2008
Timelines, theories and technologies
I was a little unclear on exactly what this timeline was about: available technology and learning theories, or political agendas? Trying to construct a timeline that takes into account technological developments; theoretical developments and paradigm shifts (such as in psychology, sociology and cultural studies); and changes in political and policy direction was a bit of a challenge. I could do with some sort of 3-D model to show how I view things, but this will have to do…
I have spent quite a lot of time on this activity already and have found it very interesting, but I need to draw it to an end for the time being. There are still many theories I would like to bring in, but space and time forbid!
However, there are obviously limits to this type of ‘timeline’ model, making it difficult to ‘fit things in boxes’. There also appears to be a plethora of published research carried out from the mid-eighties onwards, and much of this may be linked to the knowledge being available and easily shared through the use of technology. It is also difficult to say which research feeds which policies, as the time delay between ‘gaining knowledge’ and ‘actual implementation’ could vary substantially, depending on whatever other policies are on the drawing board at the time.
Just as an aside, while I was checking a date for this, I came across this
http://www.eric.ed.gov/ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/14/16/ee.pdf
Anyone who thinks they are being radical and ‘new’ in how they would like to change education should note that this is what Dewey was doing such a long time ago. A very interesting article, but not, I realise, one with much to link it to the technology themes being discussed.
Thinking about a project...
‘Are there likely to be any benefits to HE students, of using blogs as a tool to improve learning through personal reflection?’
This is set within the context of a business and management centre, which is part of an FE college and in relation to delivery of professional courses (CMI and ILM) and HE within FE.
While a blog facility has recently been installed within the current VLE (Blackboard), this is largely unused; why is this? Is it promoted by teaching staff? Could this facility be integrated into the courses? Would it benefit learners?
· How will participants be able to give their informed consent?
Information about the project would be available to all students in the form of a written summary of the project, on commencement of their course, with an ‘opt-in’ statement, for those interested in taking part.
· How can I guarantee confidentiality and anonymity?
In the broader context of not using specific or ‘real’ names, this could be guaranteed; however, due to the small cohorts, if the students were to be allowed access to the results of the project, it would be likely that individuals could be identified. Additionally, in such a small department of the institution, any comments used from interviews with staff may also be problematic.
· Are there any potential conflicts of interest in my research?
I am aware that there will be conflict in relation to the traditional ‘teaching hours’ culture and the view taken by myself and (some) others that new models of blended and/or distance learning need to be developed, in a strategy to increase take-up on the course. I also need to be aware of possible personal bias towards drawing conclusions for implementing greater use of technology in the course, which may place me in a ‘stronger’ position within the department.
· Could any aspect of my research cause distress or psychological harm?
This would be unlikely.
· Do I have the skills to analyse the results in an appropriate manner?
I think I currently have adequate skills, from previous experience of small scale projects, which have involved interviewing, analysing transcripts, coding qualitative data and critically assessing related papers. A weaker area may be in the presentation of any appropriate numerical data.
· What unintended consequences could result from publication or other publicity?
If the project finds that the existing blog facility is not in the most acceptable format for learners, is it likely that this can be changed, as financial resources have already been invested in this facility? It may be that students unanimously find that the use of blogging is not something they want to do as part of educational or professional development.
Synopsis of the information that relevant ethics committees or gatekeepers might need for a project
Gatekeepers would need to be aware that students would be able to give informed consent and that they may opt out of participating in the project. A key concern at this stage is the lack of any institutional guidance on research ethics: initial searches have not found any policies relating to this, and there seemed to be a general lack of knowledge on this among colleagues.
The gatekeepers may need to be aware of the underpinning ‘minefield’; that the project may show how it is lack of enthusiasm from teaching staff that is holding back the use of this (and likely other) technological tools; or it may be that the selected user interface is not ‘friendly’.
While there are no additional financial resources required to complete the project, the allocation of time resources for the researcher needs to be considered.
Thursday, 13 March 2008
Week 6; Audiences and ethics
In relation to the audience, there are some issues which relate to the Block 1 readings. For example, Reading 1 was based within a sociology context and this resulted in a number of ICT explanations, as this was not necessarily expected existing knowledge for the audience; the scope appeared to be determined by the funder, who was promoting VC use. The issue raised in the podcast, relating to the written word sometimes being a poor conduit of what has actually occurred within the reported situations, was of relevance here - much of the transcript would not have captured the richness of detail required to validate some points.
In relation to ethical issues on Reading 1, there was likely bias, as could be seen through the assumption of the 'enhancement' of programmes and the promotion of VC; the participants were not protected in terms of their names or the content of their personal discussions (or, at least, we are not made aware of this).
For the TMA research question I proposed, I appear to be aiming at a range of audiences: deliverers of management courses, policy makers within an institution and decision makers within course programmes. The emphasis would be expected to be on the 'hard evidence' for the decision makers and policy makers, but practitioners would likely be more interested in the 'what works' element and 'how do you do that?' This thinking mirrors some of the views heard on the podcast. If the research is aimed at this diverse group, it may be that it allows substantial freedom to focus on particular aspects; however, does this in turn suggest that the lack of depth would not really 'answer any questions' or provide adequate evidence on which to base strategic planning decisions?
Considering the ethical aspects of research in this field, it is not known if there is an ethical research policy within the institution; delivery of professional management courses at level 5 and above, as opposed to more academic courses, such as HNDs, sits uncomfortably within the HE in FE framework, so it is not clear who should be the key point of contact for this.
However, generic issues of confidentiality and anonymity can be addressed by providing the assurance of these when first requesting participation and making a statement on how this will be achieved. An unintended consequence of the research could be that one model of delivery is selected, when it is flexibility and choice that could dictate future demand.
Wednesday, 12 March 2008
Some thoughts on blogs
Thought I should reflect on blogs in general, before I continue with my studies, as this is the tool I was most interested in learning about. Is this practice of blogging, at the intersection of technology and learning, going to be a useful tool for a practising manager, in the context of developing their skills and knowledge? My background (in management, then teaching) and my current practice (in education) will no doubt influence how I view the tool and its likely benefits and will also influence how I may 'sell' the idea to others. From a teaching perspective, it is expected that one should be a 'reflective practitioner' and that through this reflection one will learn to 'do better'; however, can you learn by reflection only, or does it require the visibility of being in text, to be viewed as valid? This could be a benefit of blogging - making thoughts more accessible and ideas more transparent to others; or does this bring its own problems?
How would blogging fit into a 1. learners life and 2. manager's life?
Using this blog tool for the first time myself, on this course, I can appreciate the benefits as a learner; I have had to construct my meaning and build my understandings in a coherent way, 'in case' someone reads my blog! But is this not a step back to a constructivist view of learning, where 'in the head' (Bredo?) learning is at the top of the hierarchy? Does this imply that blogging has to be viewed as only one part of a social network, to be of benefit?
I have seen as a disadvantage the amount of time that it takes to deal with course materials and then blog as well; or am I 'hiding' by using lack of time as an excuse for not using the blog effectively, to consolidate the ideas? Thinking along this thread, a blog may be of use to a learner if it is in relation to a small component - a concept or group of theories on a particular topic - where the topic may be read and researched and then 'unpicked' by the learner, to gain deeper learning than a surface glance over a text. This may be beneficial for some parts of a course, then, rather than for a course as a whole; this strikes me as a 'good idea'.
From the practising manager's point of view, is using a blog as a reflective learning tool realistic? In some sectors - notably education and welfare - the recording of the narratives of 'critical incidents' and 'informal learning situations' is positively encouraged; in others, reflective practice (at least in a written format) still appears to be viewed as 'a waste of time'.
There are other issues on the practicing manager theme; who is the blog 'for' and who will have access to this? The issue of confidentiality rears its ugly head; although, in a teacher-learner relationship, much sensitive organisational information is passed and dissected in the current learning model.
A couple of thoughts to ponder
1. If Durkheim saw text as a veil between the real world and the self, is this use of blogging making both more transparent?
2. Can a blog increase the range of a subject being discussed and contribute to 'unassessed outcomes' or 'unintended' learning outcomes (Newman, Griffin and Cole, 1989, as in '...doing something else...')?
I feel at the moment that a blog may be seen as a chore by most learners, unless linked to a score or grade; a little like the first block of this course??
I'm now looking to the idea of having to have interactions to make the blog 'worthwhile'; surely some social construction is required via input from others? Is this where wikis come in?
The TMA01
The last week has been spent preparing and submitting the first assignment and then having a look round the materials for Block 2. Thought I would post TMA01 here, so that I can refer back to this course and my progress on it, in one place. Hoping to get started on the new Block tomorrow, when I have a couple of days allocated for much of my study.
Part 1
Empirical research question
This is based on an aspect of my current practice;
‘Are there likely to be any benefits to HE students, of using blogs as a tool to improve learning through personal reflection?’
A theory of ‘reflective learning’ (Kolb, 1984 and developed by others) has been held as ‘true’ and been used constantly in the field of education. The concept of reflection has also been perceived as an important aspect of developing in a professional capacity in the workplace - the ‘reflective practitioner’, as discussed by Schön, (1987) - which may be of particular relevance to HE management courses, in the important bridge between integrating the study of management theories and the ‘world of work’.
The promotion of reflective learning, for example by the Higher Education Academy (HEA, 2002), in conjunction with government employability strategies, has resulted in a move within the HE sector to greater emphasis being placed on personal development skills, including those of being a reflective learner. For many management and/or business courses, the awarding body looks for evidence of being able to reflect and to discuss the application of the theoretical learning, in the form of learning logs and/or personal journals. This is emphasised by the Quality Assurance Agency (QAA) in a policy statement (2001), which refers to the need for HE institutions to have a ‘…means by which students can monitor, build and reflect upon their personal development…’
Within an institution I am currently involved with, a blog facility was set up last year within the existing VLE (‘Blackboard’). The current use made of this varies tremendously between departments, with the main purpose being as a repository for session resources.
This deserves to be researched because the technology is currently in place and readily accessible. As we move towards using more on-line resources and there is wider access to IT for the HE management population, it has been suggested that we should use a blog tool as an integral component of some courses. However, it must be considered what would be the likely benefits to the student (if any); would incentives be required to assist this change? It may be viewed as ‘not worth the effort’ by the student community.
This needs to be researched in order to identify if this would be the case, as the conclusions would impact on the allocation of a range of resources in the future (in relation to technology, teaching development or/and student awareness).
Initial thoughts are that
· There may be opportunities to use the blog facility as part of evidence for assessment
· The use should not be viewed just as a ‘tick box’, administrative task (Black, 1999)
· The blog may be an important building block for developing future, shared, on-line learning
There may be ethical implications, as there would be with any ‘human involvement’ project. Students may be reluctant to discuss some aspects of the course and specifically state their thoughts on, say, conflict or motivation in the workplace, if these thoughts were to become part of a visible, ‘permanent’ record, which others could view, as this relates to, and is integrated with, their current working environment.
Part 2
Methods - strengths and weaknesses
A wide range of methods is available as part of the researcher’s toolkit. Personal interviews can provide a richness of information and a depth of insight inaccessible by other forms of data collection. Time may need to be spent in the preparation, however, in relation to ensuring confidentiality and trying to encourage ‘free speech’. In some settings, it may be important to ensure anonymity, but it would need to be considered whether this was viable or would detract from the validity of the project. These face-to-face methods of data collection may mean that some nuances, such as body language, go largely unrecorded, yet some interpretation may be offered for which there is no tangible evidence; or the evidence could be used out of its original context, which may lead to questioning the validity of the data. Additionally, the transcription of any interview, however brief, takes time, as would a manual analysis of the transcriptions.
Questionnaires allow specific questions to be prepared and asked in exactly the same format to all participants and may provide benefits in ease of comparison; for example, Hiltz and Meinke used a pre- and post-questionnaire method. The negative implications of questionnaires can be the high time and cost resource usage.
Statistical data and analysis, whether small scale (Hiltz and Meinke) or much larger scale (OECD), can provide measures by which to easily compare variables and present the information in a suitable visual format for the audience; detailed and complex information could be analysed (as Wegerif and Mercer achieved). However, analysis may be limited by the researcher’s manipulation skills or, on a larger scale, be dependent on the software, processing technology and financial commitment available. There may be ethical implications related to the manipulation of data; while quantitative data may appear on the surface to be easier to validate than qualitative data, it may be more challenging to validate the methods used in the analysis.
The use of computer-based text analysis highlights the benefit of capturing context and temporal issues - a blend of data which can be linked - but again this may be dependent on available resources, particularly the investment in technology and skills.
Wegerif and Mercer also used coding schemes, to create the ‘big picture’ from a range of variables, with ‘publicly verifiable’ categories, which indicates that the research may be more transparent and accessible than other methods. However, the coding may limit the interpretation of the text, if researchers make suppositions and assumptions. This is where the ethical dilemma of a researcher may be of concern; if the project is intending to confirm a hypothesis, the selection of categories and the analysis and interpretation of these may be skewed.
Systematic observation can provide a richness of detail, but may be limited in value, as much may depend on the observer’s perception and focus of topic. In the case of Laurillard, the observations included extensive explanatory notes, transcriptions and diagrams, in an effort to clarify the situations; this level of detail would indicate greater validity than a purely note-taking style of observation. A disadvantage of this method, in some situations, would be that the observer may alter the normal course of the activity by their presence in the situation.
Methods – selection
The preferred methods would be those which necessarily fit my existing skill set and experience as a researcher, as well as those methods which would be most suitable for the type of data to be collected, which would mainly be qualitative in nature. It is also important to consider the participants, who would be more familiar with questionnaires than other methods, but who may not feel comfortable with one-to-one interviews.
From the methods met so far, many could contribute; however, due to word limitations, the focus will be on
· Interviews
· Coding schemes
· Statistical data analysis
Interviews could be arranged, with anonymity protected, as it would be the content of the qualitative statement that would be used to highlight key issues, not the respondent’s relationship with the statement; a quote or phrase from a particular student would be used to sum up a general consensus on a particular aspect. Students would perhaps feel more comfortable in a focus group setting than being interviewed individually, but this would have its own set of advantages and disadvantages, particularly on management courses, where there may be vying for leadership.
The interview transcriptions could be analysed by the use of coding schemes, where a range of categories are created from the overview of the material and each input is assigned to a specific category to indicate the volume of subject discussion or the range of topics stated.
Statistical data analysis could also be developed from this coding analysis, such as recording the number of variables linked to the students’ perceptions of different topics. However, the main use of this type of analysis would be in relation to the physical counting of participation variables, such as times of access, duration of logon and word count of the posting. The analysis could be carried out manually or by the use of simple Excel formulas; a rough sketch of how this collation might look is included below. Any statistical collation would allow for the use of graphs and/or charts and this is useful in the presentation of data and may appear to contribute to the validity of the project conclusions (whether or not this is actually the case).
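As a minimal sketch of what such a collation could look like (the field names and figures are entirely hypothetical, and Python simply stands in for the Excel formulas mentioned above):

from collections import Counter
from statistics import mean

# Each record represents one blog posting: who posted, minutes logged on,
# word count of the posting, and the category assigned to it during coding.
# (All names and numbers below are made up for illustration.)
postings = [
    {"student": "A", "logon_minutes": 25, "words": 180, "category": "reflection"},
    {"student": "A", "logon_minutes": 10, "words": 60, "category": "course admin"},
    {"student": "B", "logon_minutes": 40, "words": 320, "category": "reflection"},
    {"student": "C", "logon_minutes": 5, "words": 45, "category": "technical query"},
]

# Participation variables: postings per student and average word count.
postings_per_student = Counter(p["student"] for p in postings)
average_words = mean(p["words"] for p in postings)

# Volume of discussion per coded category.
postings_per_category = Counter(p["category"] for p in postings)

print("Postings per student:", dict(postings_per_student))
print("Average word count:", round(average_words, 1))
print("Postings per category:", dict(postings_per_category))

The same counts could of course be produced with a spreadsheet; the point is only that the participation variables and the coded categories end up in one simple structure from which graphs and charts can be drawn.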
These methods would be preferable because they are in keeping with a small-scale project, where the researcher’s skills are limited and time resources constrained. One other ethical consideration would be any personal bias; while it is expected that integrity and high professional standards are maintained overall, if it is in one’s own interests to achieve a particular recommendation in any study, this may impact on the methodology and therefore the conclusions reached.
(1502 words)
References
Black, P. (1999) ‘Assessment, learning theories and testing systems’ in Murphy, P. (ed.) Learners, Learning and Assessment, London, Sage Publishing.
Brockbank, A and McGill, I. (1998) Facilitating Reflective Learning in Higher Education, Bristol, Open University Press
Higher Education Academy (2002) Personal Development Planning: a Tool for Reflective Learning available at http://www.heacademy.ac.uk/resources/detail/resources/casestudies/cs_080 Accessed 6th March 2008
Hiltz, S.R. and Meinke, R. (1989) ‘Teaching sociology in a virtual classroom’, Teaching Sociology, vol. 17, no. 4, p. 431–46
Kolb, D.A. (1984) Experiential Learning: Experience as the Source of Learning and Development, Englewood Cliffs, NJ, Prentice-Hall.
Laurillard, D. (1994) ‘How can learning technologies improve learning?’, Law Technology Journal, vol. 3, no. 2
Oliver, M., Roberts, G., Beetham, H., Ingraham, B. and Dyke, M. (2007) ‘Knowledge, society and perspectives on learning technology’ in Conole, G. and Oliver, M. (eds) Contemporary Perspectives on E-learning Research, London, RoutledgeFalmer.
Organisation for Economic Co-operation and Development (OECD) (2005) E-learning in Tertiary Education: Where do we stand?, Paris, OECD.
Quality Assurance Agency (2001) http://www.qaa.ac.uk/academicinfrastructure/progressFiles/guidelines/progfile2001.asp accessed 4th March 2008
Roschelle, J. (1992) ‘Learning by collaborating: convergent conceptual change’, Journal of the Learning Sciences, vol. 2, no. 3, pp. 235–76.
Schön, D. (1987) The Reflective Practitioner: How Professionals Think in Action, San Francisco, Jossey-Bass.
Wegerif, R. and Mercer, N. (1997) ‘Using computer-based text analysis to integrate qualitative and quantitative methods in research on collaborative learning’, Language and Education, vol. 11, no. 4, p. 271–86
Part 3
Project selection
It is anticipated that I would select option (a), plan the design of a study. Currently, I am not aware of how this should be constructed and presented in the ‘…form of a research bid…’, but I am sure this will become clear as I continue through the course.
While still considering the main theme of my project, I am aware of the need to consolidate the idea as soon as possible. My situation is that I am working part-time in a business and management centre, which is part of an FE college, and my role is delivery of both FE and HE courses.
My brief has been to start with one course (the CMI Level 5 Diploma in Management) and to look at devising a blended learning model, which will necessarily require increased use of technology, in order to develop and make best use of distance learning components. An additional need is to integrate technology as part of developing professional practice, as many of the candidates use e-mail, conferencing and numerous software programmes in their daily lives – being part of a college community should not appear to be taking a step back in time!
In the light of these thoughts, I am considering a project along the lines of
‘How could technology components be embedded into existing management courses?’
The intention would be to look, on a practical level, at how specific new components (such as, perhaps, wikis and blogs) could be embedded and what the technical and technological implications might be.
It is likely that interviews would be relevant in relation to assessing student perceptions of the likely changes and the perceived benefits; discussions with existing staff and IT support would also be required, but there are no issues of access and there is support in the wider college politics for this type of project. There would also be opportunities to ground the research in the concepts of collaboration (Murphy), reflection (as discussed briefly in part 1) and assessment methods (Gardiner, Black). I have yet to access more recent research on changes in management courses and to review HE-based research papers in relation to awarding bodies’ thoughts on evidence for assessment (for example, could a professional discussion – still a relatively new method in this context, anyway – also be linked with using webcams?).
This should be a doable project in relation to existing skills and knowledge; although assistance with technical specifications of using some components may need to be accessed.
The limitations of the project are that it will be a ‘local story’ and may contribute little to ‘new knowledge’ outside the current setting. However, there may be valuable information uncovered, which would assist with the implementation of the new components, which is already being driven by reduced demand for the traditional college attendance course and for economic reasons.
I await my feedback on this....
Part 1
Empirical research question
This is based on an aspect of my current practice;
‘Are there likely to be any benefits to HE students, of using blogs as a tool to improve learning through personal reflection?’
A theory of ‘reflective learning’ (Kolb, 1984 and developed by others) has been held as ‘true’ and been used constantly in the field of education. The concept of reflection has also been perceived as an important aspect of developing in a professional capacity in the workplace - the ‘reflective practitioner’, as discussed by Schön, (1987) - which may be of particular relevance to HE management courses, in the important bridge between integrating the study of management theories and the ‘world of work’.
The promotion of reflective learning, for example, by the Higher Education Academy (HEA, 2002), in conjunction with government employability strategies, has resulted in a move within the HE sector to greater emphasis being placed on personal development skills, including those of being a reflective learner. For many management and/or business courses, the awarding body looks to evidence of being able to reflect and to discuss the application of the theoretical learning, in the form of learning logs and/or personal journals. This is emphasised by the Quality Assurance Agency (QAA) in a policy statement (2001), which includes refers to the need of HE institutions to have a ‘…means by which students can monitor, build and reflect upon their personal development…’
Within an institution I am currently involved with, a blog facility was set up last year, within the existing VLE, (‘Blackboard’). The current use made this varies tremendously between departments, with the main purpose being as a repository for session resources.
This deserves to be researched because the technology is currently in place and readily accessible. As we move towards using more on-line resources and there is wider access to IT for the HE management population, it has been suggested that we should use a blog tool as an integral component of some courses. However, it must be considered what would be the likely benefits to the student (if any); would incentives be required to assist this change? It may be viewed as ‘not worth the effort’ by the student community.
This needs to be researched in order to identify if this would be the case, as the conclusions would impact on the allocation of a range of resources in the future (in relation to technology, teaching development or/and student awareness).
Initial thoughts are that
There may be opportunities to use the blog facility as part of evidence for assessment
The use should not be viewed just as a ‘tick box’, administrative task (Black, 1999)
The blog may be an important building block for developing future, shared, on-line learning
There may be ethical implications, as there would be with any ‘human involvement’ project. Students may be reluctant to discuss some aspects of the course and specifically state their thoughts on, say, conflict or motivation in the workplace, if these thoughts were to become part of a visible, ‘permanent’ record, which others could view, as this relates to and is integrated with, their current working environment.
Part 2
Methods - strengths and weaknesses
A wide range of methods are available as part of the researcher’s toolkit. Personal interviews can provide a richness of information and a depth of insight, inaccessible by other forms of data collection. Time may need to be spent in the preparation, however, in relation to ensuring confidentiality and trying to encourage ‘free speech’. In some settings, it may be important to ensure anonymity, but it would need to be considered if this was viable or would detract from the validity of the project. These types of face to face methods of data collection may mean that some nuances, such as body language, go largely unrecorded, but some interpretation may be offered, for which there is no tangible evidence; or the evidence could be used out of its original context, which may lead to questioning the validity of the data. Additionally, the transcription of any interview, however, brief, takes time, as would a manual analysis of the transcriptions.
Questionnaires can allow preparation of specific questions to be asked in exactly the same format to all participants and may provide benefits ease of comparison; for example, Hiltz and Meinke used a pre and post questionnaire method. The negative implications of questionnaires can be the high time and cost resource usage.
Statistical data and analysis, whether small scale (Hiltz and Meinke), or much larger scale (OECD), can provide measures by which to easily compare variables and present the information in a suitable visual format for the audience; detailed and complex information could be analysed (as Wegerif and Mercer achieved). However, analysis may be limited to the researcher’s manipulation skills or, on a larger scale, be dependent on the software and processing technology and financial commitment available. There may be ethical implications related to manipulation of data; while, qualitative data may appear on the surface to be easier to validate than qualitative data, it may be more challenging to validate the methods used in the analysis.
The use of computer-based text analysis highlights the benefit of capturing context and temporal issues; a blend of data, which can be linked; but again this may be dependent on available resources, particularly the investment of technology and skills.
Wegerif and Mercer also used coding schemes, to create the ‘big picture’ of using a range of variables, with ‘publicly verifiable’ categories, which indicates that the research may be more transparent and accessible then other methods. However, the coding may be limiting to the text interpretation, if researchers make suppositions and assumptions. This is where the ethical dilemma of a researcher may be of concern; if the project is intending to confirm a hypothesis, the selection of categories and the analysis and interpretation of these may be skewed.
Systematic observation can provide a richness of detail, but may be limited in value, as much may depend on the observer’s perception and focus of topic. In the case of Laurilland, the observations included extensive explanatory notes, transcriptions and diagrams, in an effort to clarify the situations; this level of detail would indicate greater validity than a purely note-taking style of observation. A disadvantage of this method, in some situation, would be that the observer may alter the normal course of the activity, by their presence in the situation.
Methods – selection
The preferred methods would be those which necessarily fit my existing skills set and experience as a researcher, as well as those methods which would be most suitable for the type of data to be collected, which would mainly be qualitative in nature. It is also important to consider the participants, who would be more familiar with questionnaires than other methods, but who may not feel comfortable with one-to-one interviews.
From the methods met so far, many could contribute; however, due to word limitations, the focus will be on
· Interviews
· Coding schemes
· Statistical data analysis
Interviews could be arranged, with anonymity protected, as it would be the content of the qualitative statement that would be used to highlight key issues, not the respondent’s relationship with the statement; the use of a quote or phrase from a particular student would be used to sum up a general consensus for a particular aspect. Students would perhaps feel more comfortable in a focus group setting, than being interviewed individually; but this would have its own set of advantages and disadvantages, particularly in management course, where there may be vying for leadership.
The interview transcriptions could be analysed by the use of coding schemes, where a range of categories are created from the overview of the material and each input is assigned to a specific category to indicate the volume of subject discussion or the range of topics stated.
Statistical data analysis could also be developed from this coding analysis, such as recording the number of variables linked to the student’s perception on different topics. However, the main use of using this type of analysis would be in relation to the physical counting of participation variables, such as times of access, duration of logon and word quantity of the posting. The analysis could be carried out manually or by the use of simple Excel formulas. Any statistical collation would allow for the use of graphs and/or charts and this is useful in the presentation of data and may appear to contribute to the validity of the project conclusions (whether or not this is actually the case).
These methods would be preferable because they are in keeping with a small scale project, where the researcher’s skills are limited and time resources constrained. One other ethical consideration would be any personal bias; while is it expected that integrity and high professional standards overall are maintained, if it is in ones own interests to achieve a particular recommendation in any study, this may impact on the methodology and therefore the conclusions reached.
(1502 words)
References
Black P. (1999) Assessment, Learning Theories and Testing Systems in Murphy, P. (ed.) Learners, Learning and Assessment London, Sage Publishing
Brockbank, A and McGill, I. (1998) Facilitating Reflective Learning in Higher Education, Bristol, Open University Press
Higher Education Academy (2002) Personal Development Planning: a Tool for Reflective Learning available at http://www.heacademy.ac.uk/resources/detail/resources/casestudies/cs_080 Accessed 6th March 2008
Hiltz, S.R. and Meinke, R. (1989) ‘Teaching sociology in a virtual classroom’, Teaching Sociology, vol. 17, no. 4, p. 431–46
Kolb, D. A., (1984) The experiential learning: Experience as the source of learning and development. NJ: Prentice-Hall
Laurillard, D. (1994) ‘How can learning technologies improve learning?’, Law Technology Journal, vol. 3, no. 2
Oliver, M., Roberts, G., Beetham, H., Ingraham, B. and Dyke, M. (2007) ‘Knowledge, society and perspectives on learning technology’ in Conole, G. and Oliver, M. (eds) Contemporary Perspectives on E-learning Research, London, RoutledgeFalmer.
Organisation for Economic Co-operation and Development (OECD) (2005) E-learning in Tertiary Education: Where do we stand?, Paris, OECD.
Quality Assurance Agency (2001) http://www.qaa.ac.uk/academicinfrastructure/progressFiles/guidelines/progfile2001.asp accessed 4th March 2008
Roschelle, J. (1992) ‘Learning by collaborating: convergent conceptual change’, Journal of the Learning Sciences, vol. 2, no. 3, pp. 235–76.
Schön D. (1987) The Reflective Practitioner: How Professionals Think in Action (San Francisco, Jossey-Bass
Wegerif, R. and Mercer, N. (1997) ‘Using computer-based text analysis to integrate qualitative and quantitative methods in research on collaborative learning’, Language and Education, vol. 11, no. 4, p. 271–86
Part 3
Project selection
It is anticipated that I would select option (a), plan the design of a study. Currently, I am not aware of how this should be constructed and presented in the ‘…form of a research bid…’, but I am sure this will become clear as I continue through the course.
While still considering the main theme of my project, I am aware of the need to consolidate the idea as soon as possible. My situation is that I am working part time in a business and management centre, which is part of an FE college and my role, is delivery of both FE and HE courses.
My brief has been to start with one course (the CMI Level 5 Diploma in Management) and to look at devising a blended learning model, which will necessarily require increased use of technology, in order to develop and make best use of distance learning components. The other additional need is to integrate technology as part of developing professional practice, as many of the candidates use e-mailing, conferencing and numerous software programmes in their daily lives – being part of a college community should not appear to be taking a step back in time!
In the light of these thoughts, I am considering a project along the lines of
‘How could technology components be embedded into existing management courses?’
The intention would be to look at, on a practical level, how specific new components (such as, perhaps, wikis and blogs) could be of embedding and what the technical and technological implications might be.
It is likely that interviews would be relevant in relation to assessing student perceptions of the likely changes and the perceived benefits; discussions with existing staff and IT support would also be required, but there are no issues of access and there is support in the wider college politics for this type of project. There would also be opportunities to ground the research in the concepts of collaboration (Murphy), reflection (as discussed briefly in part 1) and assessment methods (Gardiner, Black). I have yet to access more recent research on changes in management course and review HE-based research papers in relation to awarding bodies thoughts on evidence for assessment (for example, could a professional discussion – still a relatively new method in this context, anyway – also be linked with using webcams?).
This should be a doable project in relation to existing skills and knowledge, although assistance with the technical specifications of some components may need to be sought.
The limitations of the project are that it will be a ‘local story’ and may contribute little to ‘new knowledge’ outside the current setting. However, valuable information may be uncovered that would assist with the implementation of the new components, which is already being driven by reduced demand for the traditional college attendance course and by economic pressures.
I await my feedback on this....
Wednesday, 5 March 2008
And in to week 5...
And into week 5… never have the week numbers seemed so prominent in a course; if you don’t need to be active in online activities and instead use the traditional distance method, it appears much easier to fit tasks and activities into your own timescale.
These few weeks have been really tough on the study front, and that hasn’t included carrying out all of the on-line tasks; however, the readings and reflections have been completed. Due to time constraints, I can’t see that I’ll be able to read and comment on others’ blogs, but I have noticed that there are few (or no) postings so far in the TGF on the last 4 papers, so perhaps I am not as far behind as I thought. Time this week must be spent on constructing the first TMA and, to this end, I have tried to take a step back from the theory and consider again what issues I would like to research. I have also looked over what I think I will be taking away as key thoughts from this block. These may look a little disorganised here (I should find out how to insert a mind map image onto this blog), but they are a summary of important points that I would like to bear in mind when constructing my TMA.
Many methodologies and tools –
Interviews; questionnaires; statistical data analysis; computer based text analysis; coding schemes; text (or transcript) analysis; discourse analysis; observation
And lots of dichotomies (if that is the correct expression) in considering research –
Knowledge – positivism/‘out there’ and constructivism/‘in your head’
Tools – critical and analytical
Qualitative and quantitative
Methods and methodologies
Pedagogies and social/political perspectives
Action research and activity theory
Investigations and evaluations
In reviewing the readings, I want to capture some of the key thoughts that may be relevant to where I end up going with my research:
Oliver, M., Roberts, G., Beetham, H., Ingraham, B. and Dyke, M. (2007); p22 ‘…whether these multiple perspectives are…a sign of an immature discipline…’; p23 ‘…what is or might be ‘forgotten’ when knowledge is digitised…’ re Baker; the concept of communitarianism; theme of learning to understand rather than gaining ‘truth’; ‘petits récits’ versus ‘grand narratives’ (Lyotard, 1979).
Hiltz, S.R. and Meinke, R. (1989); ‘…qualitative analysis can be effective for generating theories but not so effective for rigorously testing them…’ (Hammersley, 1992).
Still to research the concept of a ‘deep feature situation’ in learning.
I have basically used the review to create a table, with headings of methodology; link to research paper; strengths; weaknesses; and relevant to use in answering the research question or not. (A rough sketch of how such a table might be captured follows after the list below.) This should allow me to focus on what I consider to be the most relevant aspects and to follow a logical thought process of
1. consider research question
2. select particular methods/tools
3. interpret using a theoretical position
….which should be what the methodology is as a whole.
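As an aside, here is a minimal sketch (entirely my own illustration, not part of the course materials) of how such a review table might be captured electronically, written as a small piece of Python; the entries are hypothetical placeholders rather than my actual notes, and the field names simply mirror the table headings above.

# Review table held as a list of dictionaries; entries are illustrative placeholders only.
review_table = [
    {
        "methodology": "Interviews",
        "paper": "(reference to the relevant research paper)",
        "strengths": "Rich, detailed accounts of participants' perceptions",
        "weaknesses": "Time-consuming; small samples; harder to generalise",
        "relevant_to_question": True,
    },
    {
        "methodology": "Statistical data analysis",
        "paper": "(reference to the relevant research paper)",
        "strengths": "Allows comparison across larger groups of students",
        "weaknesses": "May not reveal the 'why' behind the numbers",
        "relevant_to_question": False,
    },
]

# Step 2 of the thought process above: keep only the methods/tools judged
# relevant to the research question.
shortlist = [row["methodology"] for row in review_table if row["relevant_to_question"]]
print(shortlist)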
I hope I have got this right; for example, am I right in understanding that a research instrument is the same as a tool? And that a tool may also be the method?