I'm still thinking about Steve Greenlaw's post about why some faculty are more willing than others to re-think and re-design their courses. As Steve points out in his comment to my previous post, the costs of fundamentally rethinking a course are really large. I'm particularly aware of this because for the last couple of months, I have spent an absurd amount of time prepping my fall courses, and I know I'll spend an even more absurd amount this summer. Having this amount of time is a luxury that most of my colleagues do not have, and it's really only available to me because a) I have tenure and b) I have established myself in my field of research in such a way that my colleagues are unlikely to consider me a 'slacker' even if I don't publish anything for a while (which, at the rate I'm going, is entirely possible!).* On the other hand, there are lots of things I could have chosen to do with this time and I chose to spend it on my classes.
But I digress... The point I wanted to make is that while re-thinking courses has always been a time-consuming undertaking, re-designing a course to incorporate new technology can be particularly time-consuming, and I wonder if the necessary time investment might be growing. It could just be me - I have spent much of my time exploring Web 2.0 tools (for example, setting up this blog!) and I've found this fascinating new world of blogs, wikis, social bookmarking, etc. to be both addictive and somewhat overwhelming. Every new blog I read leads me to others and every new tool I discover opens possibilities for my courses that I want to explore. I know that I'm probably biting off more than I should for my fall courses but I can see so much potential for using technology to make my 500-student class actually interactive that it's hard to hold back. I realize that other faculty, even those who are just as eager to improve their classes, may not dive in quite as deeply, nor would they need to.
But it also seems like every time I learn what one thing is, I see a reference to some new thing I haven't heard of before (Plurk? are you kidding? I just got onto Twitter!) and I sometimes feel completely overwhelmed by all the stuff I still don't know (is it just me or does Second Life intimidate the hell out of anyone else?). And that's just understanding what things are, let alone figuring out which tools might be most useful for improving my teaching. Yet I know I'm way ahead of most of my faculty colleagues, some of whom can hardly find their way around Blackboard. So I wonder what will happen as the frontiers of technology pull farther ahead - will it become harder and harder for late adopters to catch up? Or might they actually have some sort of advantage because they'll be able to dive into Web 3.0 (or whatever it will be called) without having to unlearn the previous version? Or is adopting and adapting to new technology a skill in itself, so that even if you get a late start, you can learn how to adapt as you go?
* For those who are curious, I do research in education policy and am currently working in Sacramento on legislation that will hopefully lead to pretty major reform of California's school finance system. Since the legislative process is anything but smooth, I've had lots of time in-between meetings with legislators to work on other things, like my classes.