Talking Teaching

July 7, 2010

the first-year experience – from the academic’s point of view

Yesterday we had another meeting of the uni’s Teaching Network – an informal gathering of staff from across the campus to talk about some teaching-related issue that interests us. (Lunch is provided – an added bonus!) The topic this time was ‘the first-year experience’, beginning with a discussion of the NCEA. The ‘National Certificate of Educational Achievement’ is the qualification gained by most state (public) school students in New Zealand, although some may opt for the Cambridge International Exams or the International Baccalaureate. It was a good choice of topic as many (most?) university lecturers are pretty much unaware of the intricacies of the NCEA, & most people there probably went away with a better understanding of the whole system.

Then the discussion got really interesting. How much attention should we pay to the NCEA, ie to the systems of assessment used in schools – & also to their curriculum? How helpful is it for our students if we use those systems (or something similar)? Can we change the way we teach & assess our students, given that there’s likely to be more experiential learning – & assessment of that learning – going on in your average secondary school classroom than you’d see in many first-year lecture rooms & laboratories? And should we? Shouldn’t students simply ‘harden up’ & get used to the fact that universities do things differently? Are we doing our students a disservice if we change what we do to accommodate their different prior experiences? Would this constitute ‘dumbing down’ the university experience? Can we improve our use of learning objectives? How good are we at teaching our students the skills & attributes of critical thinking?

I can’t pretend that I know all the answers to those questions, but I’ll certainly do my best to give you my perspective!

On the issue of curriculum awareness, I would argue that all first-year university lecturers (at the very least) should be aware of the curriculum that students taking their subjects would have followed in their senior secondary school years. How else can we be sure that we’re linking in with students’ prior learning; that we’re not hitting them with ideas they’ve never heard of (& so have nothing to build that new information onto), or with material that has already been fairly thoroughly covered at school? Neither of those possibilities strikes me as a good thing as there’s the potential for students to turn off from what’s going on in our classrooms – either they feel that they’ve heard it before & get bored (& maybe don’t recognise how what we’re doing might be building on that earlier knowledge), or they don’t understand & give up.

I’ve already written a bit on the nature of assessment; here, for example. Personally I believe that universities can learn a bit from looking at assessment systems like the one used in NCEA – not to say that it’s all sweetness & light, but it is definitely a more transparent system than is typical in many university papers. And I like the fact that students are scaffolded into the question & that examiners must think carefully about exactly what they’re testing (& I speak from experience here). For example, if the intention is to examine students’ ability to interpret information, then in NCEA you give them that information, rather than expect them to regurgitate the facts before interpreting them. I’m not arguing that we shouldn’t expect students to learn key facts & concepts! Only that perhaps we need to think more carefully about what & how we’re assessing.

At yesterday’s discussion someone raised the point that we don’t have funding to change what we do, that in a financially tight environment exams & multichoice tests are the best way to handle the demands placed on the lecturer by the need to assess learning – in large classes in particular – & that change would require money we don’t have. I’m not sure that I agree with that. Some of the changes we’re instituting in first-year biology – peer-marking of essay drafts, for example – certainly take time to set up & you need to educate the students about what you’re doing & why. And of course, there are no actual marks in the mark sheet. But once these changes are in place then – hopefully! – student learning around what’s actually expected of them in tasks such as writing an essay should improve, & from my point of view the end-product should be of higher quality & perhaps even require less actual time from me in marking & giving feedback :-) So I would argue that it is possible to change the way we teach & assess even in large university classes, in a way that links with students’ earlier experiences of these processes & enhances their enculturation into the academic world of the university.

The counter-argument to this is that we would simply be perpetuating the ‘spoon-feeding’ that may have gone on in schools. Should students just ‘get on with it’, harden up, & accept that universities do things differently? (Secondary teachers: please don’t shoot me! I don’t actually agree with that; I’m being rhetorical here :-) Well, not entirely rhetorical – this argument really was raised yesterday.) This would perhaps be true – if universities can demonstrate that their way of ‘doing things’ produces better outcomes, in terms of student learning, than the alternatives I’m discussing. But I’m not 100% convinced that we can. For example, if you look at your average ‘graduate profile’ – the statement of what graduates are capable of – you’ll find it says things like ‘will demonstrate critical thinking skills, be an independent learner, be capable of communicating in a range of different modalities’, & so on… But – how do we know this? Do we give students the opportunity to learn & practise these skills, & how do we know that they possess them at the end? Traditional, teacher-down, lecture- & ‘cookbook’ lab-based teaching isn’t going to deliver on these things.

The other thing here, of course, is that the nature of the student body has changed over time. When I was a student, mumblety-years ago, only a rather small proportion of senior school students went on to university. We were taught by that traditional method, by & large we survived it, & (mostly) we did well. It worked for us, why won’t it work for ‘them’… But this ignores the fact that – whether we like it or not – incoming students are far more diverse, with a wider range of learning experiences, language abilities, cultural backgrounds, & so on. Universities simply have to take this into consideration in their practices. On a purely pragmatic basis, having those who can’t cope (read, haven’t ‘hardened up’) simply drop out won’t serve us well when a proportion of our funding rests on completions & retention rates…

And no, taking all the above into consideration does not equate to dumbing down our curriculum! Doing things differently doesn’t mean doing them less well & may in fact improve the effectiveness of our teaching.

Learning objectives… We all have them. Often they’re implicit, unstated; when you go into a lecture you’ll have at the back of your mind the things you hope your students will learn from it. But all the course outlines I’m familiar with have learning objectives/outcomes written in there somewhere. So what’s not to like?

The trouble is that they are often phrased in such a way that students aren’t actually much wiser about what they’re expected to be able to do. ‘Students will understand/demonstrate an understanding of…’ is a common one, & it provokes the question, how will they (or the teacher) know that they understand? Will they be able to use a piece of equipment in the right way & in the right context? Or show their understanding by explaining a concept or correctly interpreting a particular data set? If the outcome is ‘have the attributes of a critical thinker’ then again, how will both parties know that this has been achieved?

So spelling them out has benefits to both parties, & for those of us on the lecturing side of the partnership, it helps us to focus on what we actually intend our students to gain from our teaching. And from our assessment – assessment drives student learning much more strongly than teaching does. So it’s no use having clearly stated learning objectives if we then go on to ignore them in setting assessment items. If the former expect higher-order skills & the latter test mostly lower-order attributes, then which do you think the students will focus on in their revision & exam preparation? (I have to say, having good clear learning objectives makes it much easier to develop good assessment tools!)

And critical thinking? I think I will save that for a whole ‘nother post :-)


  1. I agree that changing the way one teaches is not “dumbing down” the curriculum. I have heard this sentiment expressed a lot, that if one doesn’t teach the same way they’ve been teaching, that equates to watering down the course. Expectations and method of delivery are two completely different subjects. I really don’t understand why people put them together. First, you decide what you want them to learn and what requirements you expect them to achieve in the course. Then you decide what is the best way to teach that information. They are not the same discussion at all. It may be that certain methods of teaching do not let one cover as much in class. That just means you demand more of them outside of class. To digress a bit, it seems common to demand the students only learn what was said in class. I find this extremely odd because that is not at all expected in most English and Poly Sci courses, in which students are expected to do considerable reading outside of class and to be familiar with that material whether or not it is mentioned in class. Why it is that science courses cannot seem to be as rigorous as an English course is beyond me. But back to the main point, simply covering material in class is rather pointless if the students aren’t learning it, and everyone is well aware of just how ineffective a traditional lecture is. At best they serve to inform the students of what they are expected to know. They do little actual teaching. It seems to me that more information could be covered, and more effectively, by requiring them to read outside of class and using lectures to tie the material together, not teach them the basics which they can easily pick up from the book.

    In short, the “dumbing down” idea of altered teaching methods comes from trying to discuss two very different subjects at the same time and is a mistake in logic. We should not change the bar just because we change the method.

    Comment by jdmimic — July 8, 2010 @ 5:06 am

    • You know it, I know it… The trouble is, I’m not sure that all teaching staff really are aware of how ineffective the traditional lecture format is at promoting meaningful long-term learning. (Maybe because said lecturers did OK in the ‘old’ system…) The difficulty is to change a whole teaching culture, when up until now the institutional emphasis has been primarily on research – that’s what’s tended to attract the kudos & support the promotion applications. It’s not something that can be done from the top down. So a change to overtly recognising good teaching would be a good first step. As would giving adequate support to staff development units (which at most institutions seem to run on the smell of an oily rag!) It’s carrot, rather than stick, material.

      Comment by alison — July 8, 2010 @ 4:05 pm

      • Why do you think it can’t be done from the top down? It seems to me that the overt recognition of good teaching and the adequate support for staff development are things that are only meaningful if they are from the top. How do we spur such change in our departments if not supported by the upper administration? I guess my question is how do we initiate the changes that we see should be made? What can we do in our departments, particularly those of us that are junior faculty?

        You mentioned curriculum awareness. While I agree it would be optimal, I don’t think it is possible in the United States. We have a locally controlled educational system. Standards vary enormously from one community to the next, so it is nigh impossible for a university professor to have any good idea what the students have been exposed to in high school.

        I have also tried to talk to my department some about how the students have changed in the last decade. Many of them seem to view the student body as rather stagnant, each new student body as pretty much the same as the year before. If anything, there is the standard “they just aren’t as smart or as hardworking as they used to be” attitude. The students of today though are much more comfortable with electronic media and expect it in everything. This greatly changes the thought processes of the students compared with students a decade or even five years ago. Teaching the same way to them simply is not going to work.
        And as you mentioned, we are no longer dealing with the highest level students that made it to college, we are dealing more and more with the standard high school student who previously would not have gone to college, which is a whole new complication.

        Between the student composition changes and the varying curriculum issues, it seems that we need some way of continuing education for the faculty on the incoming students. I have no idea how to do that, though, in a way that the faculty would go along with and not consider a waste of time.

        As far as the learning objectives go, I completely agree that most of them are written in a rather vague manner, aren’t really helpful, and are then paid no attention come assessment time. Part of the problem is that in many instances, it seems as if the learning objectives are written simply to check off a requirement imposed by the administration. There seems to be little real effort to convince people that the learning objectives might actually be useful, or to show them how to use them.

        But honestly, I don’t see any of this changing until the administration and tenure committees see teaching as at least as important as bringing in grant money for research; and as long as the schools are deriving a substantial amount of their income from the grants, that isn’t going to change.

        Speaking of which, I don’t know anything of how other countries support their universities. I am only familiar with the United States. Here, all the major universities derive a large amount of income by skimming “indirect costs” off of every grant that comes in to any of the professors. Is that done elsewhere?

        Comment by jdmimic — July 9, 2010 @ 3:00 am

  2. I guess I wasn’t as clear as I might have been. It’s just that staff often don’t feel ‘ownership’ of things like policy statements & can be resistant to what they perceive as ‘top-down directives’. Unless staff are also given support, encouragement & advice on the ‘hows’ & the ‘whys’, then they’re likely to be resentful & the changes, if not actually resisted, will be slow. But yes, having an institution actively promoting teaching excellence & rewarding good teaching in a meaningful way (eg having teaching count as much as research in the promotion round) is the only way to get academic staff to take it all seriously. In NZ we have an organisation called Ako Aotearoa which actively promotes & rewards this sort of thing, eg through national ‘Tertiary Teaching Excellence’ awards, & through the provision of a whole heap of on-line resources; they also offer funding for teaching-focused research initiatives.

    On the issue of what more junior staff can do, I can only make suggestions based on my own experience. The first is, if your institution has a unit that supports teachers in developing good teaching practice (something like our own Teaching Development Unit), then seek them out. They probably run courses/workshops on aspects of teaching & learning & I’ve always found these really worthwhile. They’ll also be aware of people in other Schools/Faculties who are interested in improving teaching & learning, & can put you in touch with them. (Sorry, I’ve just realised you might read this as me telling ‘you’ [jdmimic] personally, but really I’m just addressing anyone reading this.) Once you’re part of a community of practice, it’s easier to feel confident that you’re on the right track :-)

    Learning objectives – I think the general failure to use these properly may well reflect the fact that (here anyway) most uni lecturers aren’t trained teachers, so they’re not exposed to any of the pedagogical stuff. At this institution part of our Teaching & Learning Plan requires faculty to take a much closer look at their use of objectives & rewrite them in a meaningful way – which of course is going to require a lot of support & advice from both those of us who already do this & also the TDU (goodness knows where their staff will find the time for all this ;-))

    And maybe just talk with other people in a similar role. The tutors in the various departments in our Faculty have started meeting on an occasional basis (another ‘community of practice’ if you like) & sharing ideas about how they work with their lab & tutorial classes. They’re all good at what they do & I think this is going to be a very productive relationship, for them & for their students.

    Support for universities… Here in NZ, all the universities receive government funding (although these days it’s not sufficient to cover all the costs of running the institution & so a proportion of external research income does get taken to help pay for ‘central’ services). The government component is in 2 parts, related to teaching & research. The teaching part makes up about 80% of our government funding & shortly a portion of that will be allocated on the basis of measures of teaching quality. (Fairly blunt measures, as they’re based on retention & completion, but anyway…) The research component is allocated on the basis of research quality: it’s called the Performance-Based Research Fund. All staff receive a grade as a result of an external review of the quality of what they do (A, B, C & R [= research-inactive]) & the summative grade for the staff working in a research ‘area’ (eg cellular & molecular biology) allows the government to rank institutions on how they are performing in the various research areas &, of course, to get an overall ranking. And that’s tied to the funding.

    Comment by alison — July 11, 2010 @ 11:51 am

  3. […] the end of my last Talking Teaching post I mentioned critical thinking – & said I’d leave that topic till later. This is ‘later’ […]

    Pingback by the skills of critical thinking | BioBlog — July 11, 2010 @ 1:39 pm

