Talking Teaching

December 6, 2018

the sad state of science learning in primary school

This post was first published on my ‘other’ blog. It’s not intended to diss primary school teachers – quite the reverse! They need all the help & support they can get to deliver the science curriculum.

In 2011, Sir Peter Gluckman released his report, Looking ahead: science education for the 21st century. In it, he noted the need to improve science teaching in primary schools, commenting that

there should be an attempt to improve the confidence [my emphasis] of all teachers within primary schools to assist in science and that all primary schools should be encouraged to develop a science champion.

And in 2012, David Vannier pointed out that

there is growing evidence that too many children are not doing well in science and do not have access to effective instruction, especially at the primary level.

and that

[at] the same time that the New Zealand government is seeking to spur innovation in science as a means to improve the economy, less and less emphasis is being placed on science instruction in primary schools.

Fast forward to Monday this week, when Radio NZ reported on the findings of the National Monitoring Study of Student Achievement (NMSSA): just 20 percent of year 8 children last year reached the expected level of achievement in science – the lowest figure of any learning area in the curriculum. Most children liked learning about science at school – 82% of those in year 4 and 65% in year 8 – but those figures haven’t changed significantly since the previous survey in 2010, and the decline between years 4 and 8 should be a concern. Overall, these results don’t augur well for science literacy and engagement with science amongst our young people.

You may be tempted to lay this result at the feet of National Standards. Don’t. Looking Ahead was published in 2011. National Standards were first implemented in 2010, just a year earlier. The issues identified by Sir Peter Gluckman have had a longer gestation than that.

I wrote about Sir Peter’s report at the time, highlighting his statement that

science education is not just for those who see their careers involving science but is an essential component of core knowledge that every member of our society requires.

Thus, science education needs to deliver on what Sir Peter characterised as ‘citizen-focused objectives’, where all children need to have:

  • a practical knowledge at some level of how things work;
  • some knowledge of how the scientific process operates and some level of scientific literacy; and
  • enough knowledge of scientific thinking as part of their development of general intellectual skills so that they are able to distinguish reliable information from less reliable information.

But can it deliver? His report also notes that

[a] well prepared primary school teacher will integrate excitement about the natural world and scientific forms of thinking into literacy and numeracy teaching, and into general educational processes. The challenge is how to provide primary teachers with the skills to do so. [My emphasis]

I believe that meeting this challenge will require changes to at least two things: teacher-training curricula, and professional development (PD) and support.

Just 25% of primary school teachers hold another qualification, in addition to their teaching degree, and it’s probably fair to say that BSc graduates are in a minority. Intending primary school teachers usually study for a 3-year Bachelor of Teaching degree, and take a range of papers in their first year – including one on science teaching. This one paper, plus learning opportunities while on practicum in schools, may well be their sole exposure to science (Campbell, 2018).

Which is where the PD and support come in. Ally Bull (2016) found that science was “marginalised” in the primary curriculum, and that teachers – lacking confidence to teach the subject – often had little in-school support and only limited access to PD opportunities. The majority of those providing the PD (51%) aimed to enhance teachers’ confidence to teach science, and just 5% felt that developing teachers’ knowledge of science was important. Bull also cited other research finding that “primary teachers’ low confidence in teaching science reflected their lesser degree of content knowledge.”

There are ways to address this. Anne Hume & Cathy Buntting (2014) developed resources and shared these with primary teacher trainees, encouraging them to think about what science ideas they could teach (plus the why, when & how) while using those resources. Their results? Really encouraging:

Even student teachers who had previously felt very apprehensive about teaching science reported feeling far more confident about the prospect after completing the CoRe assignment.

Programs like theirs, changes in teacher education, and the commitment to provide ongoing mentoring and support, should raise teachers’ confidence in teaching science and see them reach their full potential as ‘science champions’. Our teachers and our children deserve no less.

 

 


December 2, 2018

teachers’ reactions to this year’s year 13 bio exam

Today I’ve been hearing from some very unhappy teachers. As in, teachers who are upset to the point of tears on behalf of their students. Excellent, very experienced teachers. The reason for their unhappiness? This year’s NCEA Level 3 (year 13) biology exam, sat by their students just a few days ago.

And at this point I should emphasise that the teachers’ concerns were directed at the New Zealand Qualifications Authority, not at the individual examiner(s) who, after all, prepare these documents with advice and guidance from NZQA staff. Their concerns were focused on the system.

Now, it’s several years since I was an examiner at this level, and I know that the nature of the exam has changed. And of course the teachers themselves are well aware of what’s been expected in the past; they’re just taken aback by the nature of this year’s papers**.

Thing is, I was also involved in developing Scholarship-level exams, and to me – while some questions in this year’s L3 exam have quite a bit of resource material to get through – the amount of writing required of students seems an awful lot for Level 3. The question-books for the exam (you’ll find them here) contain multiple blank pages for students to write their answers: the implicit message is that a lot of writing is needed. There are three questions like this in each book, so three essay-type answers for students hoping to achieve an excellence for the paper.

This may not sound like much – but the actual exam covers 3 separate achievement standards. So a student who’d prepared for all three (and most schools encourage this) would find themselves faced with writing up to 9 (yes, nine) extended answers over the space of three hours. (For comparison, a Schol Bio candidate would write just 3 essays in the same time frame.) In other words, the demands of this exam quite likely preclude students from doing justice to all three papers.
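To make the time pressure concrete, here’s the rough arithmetic (a back-of-the-envelope estimate that assumes a candidate attempts all three papers and splits the three-hour session evenly, with nothing set aside for reading the resource material or planning answers):

\[
\frac{3 \text{ hours} \times 60 \text{ min}}{3 \text{ papers} \times 3 \text{ extended answers per paper}} = \frac{180 \text{ min}}{9 \text{ answers}} = 20 \text{ min per answer}
\]

That 20-minute figure is the starting point for the first of the teachers’ comments below.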

So, here are some of the teachers’ concerns (I’m quoting with their permission):

  • for an ‘excellence’ response, a student had to demonstrate high-order analysis & evaluation skills, in an answer generated in just 20 minutes (less, really, because of the requirement to read the question & plan an answer first). This is a big ask.
  • students who were slower writers, or who had lower (but still OK) literacy levels, would struggle to complete their answers in a way that still allowed them to demonstrate their knowledge of biology and so gain an ‘achieved’. [And let’s remember that there’s a lot more to knowledge than simply being able to write a bunch of definitions.]
  • “all my students felt let down by this examination. All their hard work, dedication, and love for the subject were lost in those 3 hours.”
  • “I have seen a fair few of my students this week and they are so demoralised! Many are gutted they tried to do all 3 papers [because they] couldn’t do them justice- am gutted for them!”
  • “A Facebook comment stated, “OK, I will play their game and only do two externals next year”. What is happening to the integrity of our subject when the assessment is driving the whole course structure of Biology in schools throughout the country?”
  • “We are losing students because the subject is deemed too difficult. A colleague informed me that her daughter would not be taking Biology next year because it was too hard so she is taking Physics and Chemistry instead.”

Yes, this is anecdotal. But if these comments reflect a widespread reality, then science education in this country will be the loser.

 

** I was rather concerned about a particular question, too – but that’s best left to another post, on my ‘other’ blog.

May 10, 2018

talking about what we should teach

This is a cross-post of something I originally wrote for my ‘other’ blog.

While I was on holiday (Japan – it was wonderful!) I read Tom Haig’s interesting article about ‘curriculum wars’ over on Education Central, and it reminded me of concerns I’ve held for some time: that we don’t really talk enough about what to teach in our classrooms, be they university-level or in the secondary sector.

Several years back (how time flies!) I was involved in developing the ‘Living World’ component of the New Zealand Curriculum document, as well as entering into the discussions around what the science component of that document should deliver. (Right down to a discussion of what it actually is to ‘do’ science.) At the time I was somewhat taken aback to discover that the panel was not required to give any exemplars for teachers, or any indication of what they might do to help students master particular concepts – something that’s noted by Tom. Yes, I totally get it that schools are free to set their own curricula, but at the same time I couldn’t help thinking that the occasional ‘starter for 10’ might be useful.

Layered on top of that – & amplified by my experiences in developing and assessing Achievement Standards for NCEA – was the way that, while new content or concepts might be loaded on up-front, we didn’t seem to remove stuff at the other end. The result was that the amount of information associated with a standard might just grow & grow (CRISPR, anyone?). Pretty much the same thing tends to happen at university – if you look at one of the standard first-year biology textbooks, Campbell Biology*, you’ll see that it’s become steadily thicker over time as new material’s added. (In my experience, at least some first-year uni lecturers argue that all the basic stuff should be delivered at school; they shouldn’t have to teach it. However, this sits poorly against the fact that no NZ universities have any prerequisites for their first-year biology papers, and it also suggests that those making the statement don’t really recognise that not all year 13 students are heading for university. Remember, schools have the ability to shape their curricula to suit the needs and requirements of their individual communities.)

In other words, we didn’t seem to be having any discussion around what should be taught, and why. And we still don’t, although hopefully such issues will be addressed in the review of NCEA. For, as Tom Haig says:

Working out what we should be teaching, and why, is something that we should be discussing together and taking much more seriously as teachers than the second place it’s taken to discussions of technique. Hattie, ERO, the Best Evidence Synthesis and so forth are filled with advice about ‘how’, but shouldn’t we be thinking just as hard about ‘what’?

* No relation! I was privileged, though, to meet the late Neil Campbell when he visited New Zealand, and was struck by what a wonderful educator he was.

February 13, 2018

engagement & experiences in undergraduate science education

This post is based on a presentation at the 2017 First-Year Science Educators’ Colloquium (FYSEC), and is also published on the Bioblog. 

At FYSEC2017, Gerry Rayner led a session called “Undergraduate science education in the 21st century: issues, needs, opportunities”.

Gerry kicked off by commenting that education has a greater impact – on students, teachers, and the wider society in which education systems are embedded – when people work together across a range of disciplines. What are the issues currently facing undergraduate science in NZ & Australia, he asked, and how do we address them? This was something that generated quite a bit of subsequent discussion. On the list:

  • rising enrolments: Gerry commented that in Australia, the removal of caps on enrolment, together with international demand, meant that some predictions saw student numbers growing by perhaps 30% over the next few years;
  • increased diversity – not only cultural and ethnic diversity, but also a wider range of prior knowledge and academic achievement on entry;
  • as fees increase – and with them, student debt – we’re already seeing a change in attitude: students see themselves as customers, paying for a product and expecting particular outcomes;
  • lower on-campus attendance may well have an effect on student engagement (and comments from attendees showed that this is something we all face) – but, to support increased numbers, we are pushed to provide more on-line delivery;
  • this means that educators need to provide not only more on-line content and assessment, but also the sort of meaningful interactions that enhance student engagement;
  • the need – Gerry described it as a moral obligation, & I agree that the obligation is there – to provide meaningful opportunities for students to enhance their employability. That is, it’s not all about mastery of content: students also need to gain a whole range of work-related competencies and capabilities.

Gerry then introduced some data from a report on student engagement in New Zealand universities (Radloff, 2011), which defines this thing called ‘engagement’ as

students’ involvement with activities and conditions that are likely to generate high-quality learning, [something that] is increasingly seen as important for positive learning outcomes

and comments that

measures of student engagement provide information about individuals’ intrinsic involvement with their learning, and the extent to which they are making use of available educational opportunities. Such information enhances knowledge about learning processes, can be a reliable proxy for understanding students’ learning outcomes and provides excellent diagnostic measures for learning enhancement activities.

This wide-ranging report is based on data from the AUSSE* survey of student engagement, & includes chapters on Māori and Pasifika student engagement; engagement in relation to field of study; the experiences of international students; relationships between engagement, preparation for study, and employment; students’ departure intentions; differences between part-time & full-time students; and the impact of distance education compared with on-campus learning on student engagement. The survey has 6 engagement scales (academic challenge, active learning, student/staff interactions, enriching educational experiences, supportive learning environment, & work-integrated learning), & 7 outcome scales (higher-order thinking, general learning outcomes, general development outcomes, career readiness, average overall grade, departure intention, and overall satisfaction). In Radloff’s report the AUSSE data from NZ were also benchmarked against responses from Australian, South African, and US undergraduate students.

The results, said Gerry, were generally good but (& the report also makes this clear) not entirely comforting. In measures of engagement, for example, NZ students rated the quality of staff-student interactions quite poorly (an average score of 18, compared to 35 in the US); and a low proportion (across all countries) felt that they had enriching educational experiences – while at the same time strongly agreeing that they had quite a supportive learning environment!

And on the ‘outcomes’ scales, only about a third of NZ first-year students felt that they had gained some level of career readiness through their uni studies. At the same time, around 30% of them had considered leaving university (yes, there were a range of reasons underlying this). Even by the end of the degree only 35% felt that they were really career-ready, & 29% had considered leaving during the year. This is not particularly positive.

Overall, for the natural & physical sciences, NZ students: felt that they didn’t get a lot of support from their university; were less likely to answer questions or get involved in discussions; had low levels of interaction with others in their class; felt they had lower career readiness, and lower levels of work-integrated learning experiences, than students from other disciplines (in fact, in this 2011 report only 9% reported involvement in some sort of placement or work experience); tended to have jobs unrelated to their future study/career hopes; and were less likely than those from other disciplines to feel that their study at uni helped prepare them for the workplace.

And again, there’s that 30% of them who either considered leaving, or planned to leave, before completing their studies (but those reporting working regularly with others in class were much less likely to be in this group). However, it’s not all doom & gloom on that front:

while nearly one-third of New Zealand’s university students have seriously considered leaving their university before completing their study, students are generally very satisfied with their experience at university. [Around 75%] rated the quality of academic advice received as ‘good’ or ‘excellent’. [And more than 80%] were satisfied with their overall educational experience… The vast majority … indicated that given the chance to start over, they would attend the same university again.

Nonetheless, Gerry argued (& I agree), it appears that as a country we don’t prepare science students particularly well for the workplace – despite the fact that we’d hope they will be contributing to the ‘knowledge economy’. So the delivery of work-integrated learning (WIL) becomes something that STEM faculties need to look at more closely. We also need to work on improving student perceptions of the nature of their learning experiences & outcomes. Here, Gerry suggested that experiential learning that helps develop skills as well as content knowledge, peer tutoring, innovative use of technology, case studies, group work, and role playing can all help – and can also be part of preparing students for the WIL component of their learning, and for the workplace after university. (Of course, this means that institutions also need to provide ongoing PD for their teaching staff, to support them in using new means of delivery.)

Students benefit from WIL, as they gain a better understanding of the world beyond the university. This is true even for projects run on campus, so long as there are industry links of some sort and the students are working on authentic problems that let them apply their content knowledge in real-world contexts. But WIL has benefits for academics as well, as the improved connections with employers can deliver research opportunities. It requires effort (& investment) to set up, but the outcomes for institutions and students would make this worthwhile.

* AUSSE: the Australasian Survey of Student Engagement

A. Radloff (ed.) (2011) Student engagement in New Zealand’s universities. ACER & Ako Aotearoa. ISBN 978-0-473-19590-8

November 28, 2017

what’s feedback – and do universities do it well?

This is something that I’ve also posted on my ‘other’ blog, based on a most excellent article that a colleague has just sent me.

I’ve just received a reminder that I need to set up the paper & teaching appraisal for my summer school paper. This is a series of items that students can answer on a 1-5 scale (depending on how much or how little they agree with each statement), plus opportunities to give open-ended responses to a few questions. These last are the ones where I might want to find out how the students think I might improve my teaching, or the aspects of the paper that they did & didn’t like.

Among the first set of items is usually a stem along the lines of “this teacher provides useful feedback on my work”, where responses range from ‘always’ (1) to ‘never’ (5). It’s the one where I get my lowest scores – and this is despite the fact that I provide general feedback to the class, written individual feedback on essays etc (& when I was teaching first-year, the opportunity to get feedback on drafts), and verbal feedback when the opportunity is there. Digging into that a bit, it appeared that most students recognised only the written comments as feedback, and since a substantial minority didn’t collect their essays afterwards, they felt they weren’t getting feedback at all. Bit of a catch-22, and one that marking & giving feedback online might perhaps ameliorate? I hope so.

But you can understand why students might not participate in an appraisal of the paper and the teaching in it: if they feel that the teachers aren’t providing them with feedback, why bother? And – just as important – if we don’t close the loop & tell students how we use their feedback, then why would they bother?

So, are universities good at providing feedback to students? I don’t think so, and I suspect quite a few students would say no – and according to this excellent article in The Conversation, academic researchers, Australia’s 2015 Graduate Course Experience survey, and the Australian government’s “Feedback for Learning” project agree with them. For example:

The 2015 Graduate Course Experience surveyed over 93,000 students within four months of their graduation. It reported that while close to three quarters of graduates felt the feedback they received was helpful, 16.3% could not decide if the feedback was helpful, while a further 9.7% found the feedback unhelpful. Clearly something is wrong when a quarter of our graduates indicate feedback is not working.

The findings from the Feedback for Learning survey of more than 4,000 students are particularly interesting – & saddening. Of all those surveyed, 37% said that the feedback they received was discouraging. Thirty-seven percent!!! There were few instances where students felt they’d had the opportunity to benefit from any formative feedback. 15% of all respondents found the feedback upsetting – a proportion that rose for international students, students with poorer English skills (the first two are not necessarily one & the same), and students with a learning disability. And a majority of both staff & students felt that the feedback was impersonal.

You can see why I found the article saddening. But why is there such a problem? Perhaps, suggests The Conversation, it’s partly (largely?) because in many cases neither academics nor students really understand what ‘feedback’ is.

For example, many academics and students assume that feedback is a one-way flow of information, which happens after assessment submission and is isolated from any other event. In addition, academics and students often feel that the role of feedback is merely to justify the grade. A further misunderstanding is that feedback is something that is done by academics and given to students. These beliefs are deeply held in academic culture.

Luckily there are things that we can do about it. The article describes four things that educators should bear in mind that would significantly improve both the quality of feedback that we provide, and the nature of students’ learning experiences arising from that feedback. I strongly recommend reading those recommendations – and acting on them.

October 2, 2016

unplugging a flipped classroom

The always-excellent Faculty Focus has been running a series on techniques for developing and running flipped classrooms. I’ve been reading them with interest, because – as some of you might remember – I’ve ‘flipped’ some (but not all) of my own teaching sessions.

Now, my own classes have been pretty low-tech; with the ‘design-an-organism’ classes (an idea that I learned from my colleague Kevin Gould), students are expected to do a bit of revision of their notes, but the actual lecture-room experience involves nothing more than group work + pens & paper (& a projector to share the results).  So the topic of a recent post naturally caught my eye: The Flipped Classroom Unplugged: three tech-free strategies for engaging students – not least because at my workplace there’s an increasing amount of discussion around ‘going digital’, and we need to take care not to throw the baby out with the bathwater.

Dr Barbi Honeycutt’s list includes: adapting the ‘muddiest point’ feedback technique (except now it’s the students who analyse the comments for commonalities and patterns); mind-mapping; and a brain-storming challenge.

I use mind-mapping quite a bit in class, & also in my own thinking & planning. Barbi’s post reminded me of the PhD research of my friend (& then-student), Cathy Buntting, in which she had me teaching students in tutorial classes how to develop mind-maps using the same tools Barbi describes: post-it notes, pens, & large sheets of paper (in place of whiteboards). (Writing concepts on the sticky notes lets students move them around, revising their maps as their understanding changes.) We also encouraged the students to use concept maps in their revision & to plan essays in exams – and Cathy found that the sort of deep learning encouraged by this technique really paid off in those examinations: students who’d learned complex information using concept maps did much better on questions testing complex understandings than those who tended to use shallow, rote learning methods. (There was no difference between the groups when it came to rote-learning tasks.) She also found that a large majority of students thoroughly enjoyed these tutorial sessions and found the mind-mapping technique both enjoyable and helpful.

I’ve used concept mapping widely (though not exclusively; a range of tools is much better) ever since. However, in lecture classes it’s usually been as a means to show how to review knowledge of a topic & plan out an exam answer, after students have spent time in discussion. In future, I really must be a lot more active in encouraging their own use of this tool in lectures, & not just in the smaller, more manageable tutorial sessions.

Thank you, Faculty Focus!

 

September 15, 2016

helping first-year students cope with the reality of university study

For many students, making the transition from secondary school to university can be a difficult experience. Their teachers have probably told them that they can expect to learn more & work harder, but the students don’t really know what that entails beyond doing ‘more of the same’. (They may also have been told that at uni it’s ‘sink or swim’, & that they’ll be left pretty much to their own devices – it was nice to hear from a group of our class reps that they hadn’t found this to be the case and that they felt their learning was well supported.)

Unfortunately, doing more of the same, and just doing it harder, may not be a good coping strategy when it comes to self-directed learning. Certainly our experience in first-year biology this year was that many students seemed unaware of, or unprepared for, the need to do more than simply attend lectures (or watch them on Panopto). And as Maryellen Weimer points out in her excellent blog on The Teaching Professor, there are an awful lot of distractions: new friends, new social opportunities, new jobs… Plus students can find it hard to recognise when they do or don’t understand something, equating familiarity with knowledge, and they’re used to a lot more teacher guidance.

And as Maryellen points out,

Additionally, there’s the reluctance of students to change their approaches. When asked what they plan to do differently for the next exam, students often respond that they’ll do what they did for the previous one, only they’ll do it more. Dembo and Seli’s research shows that even after successfully completing developmental courses that teach learning strategies, students didn’t change their approaches. Finally, and even more fundamentally, strategies may be known and understood, but unless they’re applied, they’re worthless.

This is something I hear quite often from students who’ve been asked to see me because their teachers have identified that they’re struggling. The idea that they should just keep doing the same thing, only more & harder, is sometimes a hard one to shift.

But over on The Teaching Professor, you’ll find some useful suggestions for turning this around. Some of these, such as moving to earlier assessment, are changes we’re already making, but there are clearly other tools to use as well. And as our student cohorts’ demographics continue to change (we’re seeing an increasing number of first-in-family enrolments, for example), there’s an urgent need for universities to adapt in turn. Expert teachers such as Dr Weimer can help us with this.

September 21, 2015

does it help when we give handouts to students?

I seem to be asking a lot of questions in my posts lately.

Recently a teaching colleague pointed me at a post entitled 5 common teaching practices I’m kicking to the curb. While I’ve never used 4 of them, it’s common practice (at my uni, anyway) to provide students with a printed study guide** that contains much of the course content, and most of us also make the powerpoints available ahead of class. (In my case, they’re incomplete; I see no real value in providing absolutely everything that I’m intending to cover during a class.) The intention is that students will read through them ahead of coming to class, & identify the bits where they need to pay particular attention &/or ask questions. The reality, of course, is that some do and some don’t.

But the post my friend shared, plus my own recent learning around the effects of laptops in class, have made me think carefully about this practice. Like the author of that post,

[this] system, I felt, made it easy to catch up students who had missed class, and it prevented students from missing important points made during the lecture.

However, handing out a complete set of everything probably does send a message to some students that they don’t need to do anything further. And if they don’t engage with the material covered – in print & in class – then any learning is likely to be transient at best :( This leads me to think that I should be doing more to help students learn how to take useful notes, so that they aren’t simply focusing on writing everything down & failing to pay attention to what’s actually being said. This is important, as I’ve noticed that students will do exactly that (write it all down) for each new slide. Fortunately help is out there: the author of 5 common teaching practices includes some useful links, including this one to a University of Nebraska (Lincoln) page advising staff on how to help students learn to write good notes.

Perhaps it should be required reading for all of us teaching in first-year papers?

 

** or, in the digital era, a downloadable pdf :)

August 31, 2015

should we stop students using laptops during lectures?

I guess it depends on what they’re using their laptops for.

Most days when I come in at the back of the lecture room & walk down to the front, I’ll see a lot of laptops open & in use. Quite a few students will actually have the (incomplete*) powerpoint for the day’s class open on their screens, but quite a few others are on Facebook (or some arcane form of social media that I haven’t caught up with yet) or just surfing. So when a friend shared an article titled Professors push back against laptops in the lecture hall, I read it with interest & also shared it with one of our big FB student pages for some consumer opinion. (There’s some interesting commentary here, too.)

One of the major reasons many oppose laptop use is their potential to distract students from what’s going on in the classroom, and judging from the ‘consumer feedback’ I received, that can be quite a big issue:

I don’t begrudge others using them except when they are watching videos or checking facebook etc during lectures. That’s very distracting.

It’s only annoying and distracting when people take their laptops and play games or scroll Facebook. Which a lot of people do…

Somewhat surprisingly, that distraction effect extends to students putting their devices to what many of us would regard as ‘legitimate’ use, i.e. searching for information directly related to the class. And I’ll admit, sometimes I’ll ask a student to look something up, especially if I think they’re doing something other than class-related work! For example, this brief report cites a study showing that

students who spent a greater proportion of time seeking course-related sites recalled significantly less than those who were more often browsing sites unrelated to the course (r = -.516, p < .02).

And worse:

the more students used their laptops, the lower their class performance (β = -.179, t(115) = -2.286, p = .024), the less attention they paid to lectures (p = .049), the less clear lectures seemed to them (p = .049), and the less they felt they understood the course material (p = .024)

Yikes! This really piqued my interest, & led me to a 2014 paper by Mueller & Oppenheimer, which has the wonderful title, The Pen is Mightier than the Keyboard. Here’s the abstract:

Taking notes on laptops rather than in longhand is increasingly common. Many researchers have suggested that laptop note taking is less effective than longhand note taking for learning. Prior studies have primarily focused on students’ capacity for multitasking and distraction when using laptops. The present research suggests that even when laptops are used solely to take notes, they may still be impairing learning because their use results in shallower processing. In three studies, we found that students who took notes on laptops performed worse on conceptual questions than students who took notes longhand. We show that whereas taking more notes can be beneficial, laptop note takers’ tendency to transcribe lectures verbatim rather than processing information and reframing it in their own words is detrimental to learning.

I’ve certainly observed that many students struggle with long-hand note-taking, to the extent that I’ll get the occasional complaint that “she moves on to the next slide before I’ve copied it all down” in my teaching appraisals. (I do explain that they shouldn’t be ‘copying it all down’…**) And I type much faster than I write, so I can sympathise with students who want to use their laptops for note-taking in class. So did some of my students, commenting that

I actually find typing notes better for me, because my typing speed is so much faster than my writing speed.

and

I would hate it if we were not allowed laptops in lectures anymore! I’d miss half the notes and then have to go home and panopto lectures (or die if they weren’t panoptoed) which just takes up time that i could use studying all my notes properly.

Mueller & Oppenheimer’s paper has really got me thinking. They point out that there is a considerable body of evidence around the efficacy of note-taking, commenting that even without the distraction effect,

laptop use might impair performance by affecting the manner and quality of in-class note taking.

Laptop use could have this negative impact on learning via two routes: ‘encoding’ and ‘external storage’. ‘Encoding’ is valuable because – ideally! – students process information as they make their notes, and doing so enhances both their learning & their ability to retain information. ‘External storage’ refers to the ability to review and learn from notes at some later point, including notes taken by others: indeed, we employ note-takers to do this for students who are unable (for a variety of reasons) to take notes themselves.

An important question here is: what are students actually doing when they take those in-class notes? Are they actively summarising what’s been discussed, e.g. by drawing a concept map or writing a paraphrase? Or are they simply copying, word for word, every single thing I say & show in class?*** While some could argue, “but it doesn’t matter ’cos I’ll write a summary later”, Mueller & Oppenheimer observe that

verbatim note taking predicts poorer performance than nonverbatim note taking, especially on integrative and conceptual items.

This underlies their suggestion that while laptops allow more rapid note-taking, if those notes are verbatim, then learning and understanding may actually suffer. In fact, they observe that

One might think that the detriments to encoding would be partially offset by the fact that verbatim transcription would leave a more complete record for external storage, which would allow for better studying from those notes. However, we found the opposite—even when allowed to review notes after a week’s delay, participants who had taken notes with laptops performed worse on tests of both factual content and conceptual understanding, relative to participants who had taken notes longhand.

So where do we go from here? I must admit to being a tad flummoxed at the moment – with the need to offer more flexible learning opportunities and  the current trend to ‘paperless offices’, we’re moving into a more highly digitised world and those laptops aren’t going to go away any time soon. How, then, to overcome the apparent negative effects they may have on student learning? If part of the problem lies with the ability to take appropriate notes, do we need to somehow teach this skill to all our incoming first-years?

 

* I mean, why would I give them the whole lot up front (including the answers to my in-class quizzes)?

** no, seriously! What I’d much prefer is that they read through the material I provide ahead of class, identify the bits where they have no idea what I’m talking about, & then that’s where they should focus any note-taking during class.

*** and if they are taking such copious notes – how much attention is being paid to everything else that’s going on in class: the questions, discussion, extra explanations?

 

August 28, 2015

does powerpoint make students stupid & professors boring?

The author of this article certainly thinks so. He kicks off with this:

Do you really believe that watching a lecturer read hundreds of PowerPoint slides is making you smarter?

I asked this of a class of 105 computer science and software engineering students last semester.

Well, first up, that’s a leading, and loaded, question. And secondly, I’d be surprised if anyone really believed that. Yes, I’m sure that there are lecturers who simply read off their powerpoint slides (which really is a no-no!). And what did we use in the days Before Powerpoint (BP)? Quite likely overhead transparencies, either printed or handwritten, and yes, some of us almost certainly had lecturers who simply read all the information off the transparency. (I know I did!)

In other words, the headline ignores the fact that Powerpoint is simply a tool. Nothing more, and nothing less. It cannot make anyone boring. That’s done by the person using it; similarly, the way the tool is used will have a flow-on effect on learners. Indeed, this was the focus of a post I wrote some time ago, and if you haven’t already read the 2008 paper by Yiannis Gabriel that I discussed therein, you should do so now.

A better question would be: how do we help professors to use powerpoint (& other technologies) in ways that better support student learning?

That, of course, requires that we are able to measure student learning in meaningful ways. And here I definitely agree with the author of the article:

Any university can deploy similar testing to measure student learning. Doing so would facilitate rigorous evaluations of different teaching methods. We would be able to quantify the relationship between PowerPoint use and learning. We would be able to investigate dozens of learning correlates and eventually establish what works and what doesn’t.

Isn’t it time that we got serious about doing this?

 

 
