Monday, August 31, 2009

Link round-up

- Greg Mankiw is teaching a freshman seminar this fall and shares his reading list here.*

- J.D. on Get Rich Slowly has a really nice overview of federal taxes, including historical marginal and average tax rates, an international comparison of tax burdens, and how much we pay per billion dollars of government spending (along with links to the source data for all of it).

- That last point, about how much we pay, is from the guy who does the death and taxes poster, which I had heard about but never actually seen. It's pretty amazing, showing "...over 500 programs and departments and almost every program that receives over 200 million dollars annually. The data is straight from the president's 2010 budget request and will be debated, amended, and approved by Congress to begin the fiscal year."

- And this one isn't really about teaching economics, but I can't understand why more people aren't talking about the 'public option' that already exists (and seems to work pretty darn well) for workers' comp insurance. Maybe there is something fundamentally different about general health insurance that I'm missing, but it seems worth talking about.

* Update: Mankiw also mentions that he is having a hard time choosing the lucky 15 students (out of 200 applicants) who get to take his seminar. Chad Aldeman on The Quick and the Ed has a great idea for him: randomly select half and see if they perform any differently than the half that is hand-picked.

Saturday, August 29, 2009

Marketplace Fun-oh-One

Public radio's Marketplace ran a feature this past week in which they talked with economists about some lighter topics. A couple might be interesting to principles students, particularly Justin Wolfers talking about the opportunity costs of exercise and Betsey Stevenson talking about searching for a mate. Friday's segment with Paul Kedrosky is a good example of how economists think (and how we can't really turn off that mode of thinking, even when we're doing pretty mundane stuff).

Tuesday, August 18, 2009

How I teach Principles: Aplia

I find myself working on several projects this summer that involve writing about my teaching approach in the 500-seat Micro Principles class and I thought that readers here might be interested as well. I previously posted about how I use clickers and podcasts.

Most economics professors have heard of Aplia by now, but for anyone who hasn't: it is a company, founded by Paul Romer, that provides online assignments. Aplia works with several publishers, and if you use a textbook they partner with, you can get problem sets customized to that text as well as an online version of the book.

The first semester I used Aplia, I assigned several of the problem sets that corresponded to the Mankiw text I use. Students tended to hate them, I think largely because I did not edit the questions carefully enough to make them match what I do in class and the questions I ask on exams (I don't use the publisher-provided test bank). In subsequent semesters, I have assigned fewer problem sets, and the ones I do assign have been edited carefully.

Instead, I use Aplia primarily because it allows me to run 'experiments' that I could not otherwise do with 500 students (by experiments, I mean activities in which students are assigned roles as participants in a market, trade in an auction environment, and see firsthand what the market does). Aplia has five experiments that are appropriate for micro principles and I use four of them: basic supply and demand, taxes, tragedy of the commons, and asymmetric information (the one I omit covers price controls). Each experiment has a preparatory problem set that walks students through how the experiment works, and a follow-up problem set that helps them process what they have done (because the follow-up problem set provides some made-up data, students can complete it even if they did not do the experiment).

As an incentive for the students (and because attendance is typically lower on Fridays anyway), I schedule each of the experiments in place of a regular Friday class meeting; that is, students log in from home or a computer lab instead of coming to class. I also schedule at least one other time slot for any students who have technical problems.

Side note: the experiment screen has a 'chat' area where students can talk to one another. In theory, they could use this to ask questions or get clarification about the activity, but for the most part students just chat to kill time while waiting for the experiment to start or between rounds, and they tend to have the sort of random social exchanges you might expect. There is a disclaimer along the bottom that the chat room is monitored and saved, but I'm always amused at what students will say before they realize that I am there as well (so far, nothing illegal, but there's always someone using a lot of profanity). When I type something, my name shows up in red (everyone else's names are in regular black) and, almost without fail, the first time I pop in, someone says, "OMG, I didn't realize Imazeki was seeing all this," which is even funnier to me because I see that too!

Because I use Mankiw’s textbook, Aplia has the added advantage of giving students access to an online version of the text. I do not follow the text super-closely, and I always tell students at the beginning of the semester that if they are consistent about listening to the podcasts, coming to class and taking good notes, they may not even need the book. However, if they decide not to buy the book, they can still access the online version through Aplia.

Students' reaction to Aplia is not quite as positive as their reaction to clickers, though a majority (67%) agree that the experiments help them understand and remember course content (versus 18% who disagree), and 72% believe the experiments are a worthwhile use of class time (versus 15.5% who don't).

Tuesday, August 11, 2009

How I teach Principles: Podcasts

I find myself working on several projects this summer that involve writing about my teaching approach in the 500-seat Micro Principles class and I thought that readers here might be interested as well. I previously posted about how I use clickers.

One of the challenges for faculty who want to make their classes more interactive is that these activities generally take more time than simply lecturing on the same material. I absolutely believe that using clickers and other in-class activities leads students to a deeper understanding of ideas, and I have always taken more of a 'depth over breadth' approach anyway. Still, when I started using clickers, I knew that I would have to make some adjustments and cover even less material. One way I have made time is to stop using class time for basic definitions. Instead, I require that students listen to short podcasts (no more than five minutes) that I record using Audacity, a free sound editor. The podcasts give a basic introduction to new terms and concepts, and the presentation is actually quite similar to what I used to say in class. I found that recording the podcasts was smoother if I wrote out a script first; this has the added advantage that I can post the script on the class website along with the audio file (an example of a podcast script can be found here).

To make sure that students really are listening to the podcast and are ready to dive into applications, I usually ask a clicker question at the beginning of class that tests their knowledge of the terms and concepts I expect them to know (these are extremely easy if the students listened to the podcast, and students must get the answer correct to get full clicker credit). If too many students get those questions wrong, I will spend a minute or two reviewing the material (which often is accomplished simply by explaining the answer to the clicker question itself); however, I purposely don't spend too much time, instead telling students that they really need to come to class prepared. If I spend too much time in class reviewing what they are supposed to already know, a) they have no incentive to do the work beforehand and b) it defeats the whole purpose of saving class time for other things. I explain this to the class explicitly on the first day and have generally had few problems (that is, more than 90% of the class usually answers the review questions correctly, so I can move right on).

Like a lot of things with the large lecture, there is a big upfront fixed cost but now that I have all the files, I can re-use the same podcasts every semester. One advantage of using the Audacity software is that I can easily cut out and paste in selected parts of any podcast. In particular, I can tailor the introduction each semester (for example, including reminders to students about upcoming assignments or exams), without having to re-record the whole thing. And student reaction to the podcasts has been extremely positive. Students have told me that they like that they can listen to the podcasts anywhere, and repeatedly, and many read the scripts as well. Because I do the podcasts myself, they are closely tied to what I cover in class and students recognize that the podcasts are pointing them to the concepts I consider most important.

Saturday, August 8, 2009

Student reaction to clickers

This is a follow-up to my previous post about how I use clickers in the 500-seat Micro Principles class.

Although I do not have direct evidence of how clickers impact student learning, I have survey responses to several questions about clickers (SDSU’s Instructional Technology Services provides a survey that they ask all clicker-using faculty to administer at the end of each semester). Responses to these questions suggest that students believe clickers help them learn and make them feel more involved:
  • Class clicker usage helps me to remember important course content: 80.6% strongly or somewhat agree; 7.3% strongly or somewhat disagree
  • Class clicker usage helps me focus on course content I should study outside of class: 70.9% strongly or somewhat agree; 9.7% strongly or somewhat disagree
  • Class clicker usage makes me more likely to attend class: 85% strongly or somewhat agree; 5.3% strongly or somewhat disagree
  • Class clicker usage helps me to feel more involved in class: 83.5% strongly or somewhat agree; 6.3% strongly or somewhat disagree
  • I understand why my professor is using clickers in this course: 94.7% strongly or somewhat agree; 0.97% strongly or somewhat disagree
  • My professor asks clicker questions which are important to my learning: 92.2% strongly or somewhat agree; 1.5% strongly or somewhat disagree
  • Buying the clicker and getting it working was worthwhile: 68% strongly or somewhat agree; 12.6% strongly or somewhat disagree
[These percentages are from Spring 2009 (n=206, 56% of enrollment). The percentage agreeing with these statements has risen each of the three semesters I’ve taught the large lecture and the percentage disagreeing has fallen.]

This doesn't mean students love clickers; the percentages that "would select a course section which uses clickers over another section of the same course which did not use clickers" or that "would like more professors to use clickers in their courses" are far smaller (41.7% and 53.4%, respectively). One question where I think the responses are quite telling is: "Class clicker usage makes the class feel smaller to me (less crowded, more intimate)": only 37.4% strongly or somewhat agree and 20.4% strongly or somewhat disagree (42.2% neither agree nor disagree). To me, this reinforces the difficulty of making a big class "seem small". I've come to believe that it's pointless to try - a room that seats 500 students is never going to feel 'intimate', even if there are way fewer than 500 bodies sitting there. However, that doesn't mean that one can't use interactive techniques. The way I'd put it is that clickers have allowed me to continue teaching in a 'small-class style' even though it's a much larger section.

Thursday, August 6, 2009

How I teach Principles: Clickers

I find myself working on several projects this summer that involve writing about my teaching approach in the 500-seat Micro Principles class and I thought that readers here might be interested as well. Over the next several days, I'll be posting about how I use clickers, Aplia and podcasts.

I use clickers from eInstruction; San Diego State decided a few years ago to standardize on one company across campus and I think it was a really good move (more information about clicker use at SDSU, including faculty and student feedback, and links to research on their effectiveness, can be found here). As more and more faculty have adopted clickers, it has become easier for me to explain them to my students and to justify their cost. I embed clicker questions in the PowerPoint slides using eInstruction’s PowerPoint plug-in, so the transition to questions is seamless during lectures.

My policy is to make every class worth the same number of points (last semester, it was 3 points; in previous semesters, it was 5), with each question worth one point; if I happen to ask more questions, I just randomly select three (other colleagues adjust the points on each question, or make every question worth the same so the points per day can vary). At the end of the semester, I keep the top 25 daily scores; dropping at least a few scores means I can avoid issues with students who forgot their clickers or have dead batteries, etc. (Note: I teach MWF and there are always several days without scores for various reasons; I found that it is better to tell students that I will KEEP the top 25 scores, rather than that I will DROP the lowest X scores, because X may have to change over the course of the semester.)

Last semester, I also made a quiz available on Blackboard that students could take if they missed class; I take the higher of their clicker score or quiz score for a given day. It is easier for students to get full credit if they come to class, but by offering the quiz, a) students who attend class get a little extra practice if they want it and b) I believe there were fewer disruptive students in class (i.e., students who were only coming to class to get the points but really did not want to be there tended to talk more, especially given that the size of the class allowed them a lot of anonymity; with the online quiz, they were less likely to come to class, which I feel is ultimately better for the other students who do attend, but I still felt reassured that the absent students were staying on top of the material).
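For readers who like to see the bookkeeping spelled out, here is a minimal sketch, in Python, of how a semester's clicker credit could be computed under this policy: take the higher of the clicker score or the make-up quiz score each day, then keep the top 25 daily scores. The function names and sample data are just an illustration I made up; this is not code from eInstruction or Blackboard.

# Hypothetical illustration of the grading policy described above;
# not code from eInstruction or Blackboard.
def daily_score(clicker_points, quiz_points):
    # A student's credit for one day is the higher of the in-class
    # clicker score and the online make-up quiz score.
    return max(clicker_points, quiz_points)

def semester_clicker_grade(daily_records, points_per_day=3, days_kept=25):
    # Keep the top `days_kept` daily scores (framed as "keep the top 25"
    # rather than "drop the lowest X", since X can change over the semester).
    scores = [daily_score(c, q) for c, q in daily_records]
    top = sorted(scores, reverse=True)[:days_kept]
    return sum(top), days_kept * points_per_day  # (points earned, points possible)

# Example: a student with mostly full-credit days, one partial day,
# one make-up quiz, and one day with no credit at all.
records = [(3, 0), (1, 0), (0, 2), (0, 0)] + [(3, 0)] * 23
earned, possible = semester_clicker_grade(records)
print(f"Clicker credit: {earned}/{possible}")  # Clicker credit: 74/75

Framing the calculation as "keep the top 25" rather than "drop the lowest X" is what lets the number of scored days float over the semester without anyone having to re-explain the rules.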

One feature of the eInstruction system that I use occasionally is “pick-a-student”, which randomly draws a name from the roster (a box shows up on the screen with the name and clicker ID). I tend to use this when I have asked the class to brainstorm examples or asked them a question that doesn’t really have a ‘wrong’ answer. Although students don’t love it, they don't seem to hate it either. On a mid-semester evaluation, I asked, "How do you feel about my calling on students in class (check all that apply)?" with the following response options (about 2/3 of the class responded to this question):
  • I hate it and really wish you wouldn't do it (17%)
  • It's not helpful because most of the time I can't hear people's responses. (13%)
  • I'm not crazy about it but I understand why you do it. (65%)
  • It's not helpful if people give wrong answers; I'd rather you just tell us the answer. (11%)
  • It makes me more likely to pay attention in case you call on me. (35%)
  • I like it because it breaks up the lecture. (23%)
  • It's fine but you spend too much time letting students talk. (8%)
In general, student feedback about clickers has been largely positive (I'm compiling some stats from end-of-semester surveys that I will discuss in a separate post). They recognize that the clickers keep them more engaged; for example, students have made comments like, “I pay more attention because I know a clicker question is going to be coming up” and “I like that I can see right away if I get the answer right.” The clickers give the students (and me) immediate feedback on how they are doing, feedback that would not otherwise be possible in that large a class. I think they also appreciate that the clicker questions are similar to what they will see on exams; in fact, now that I have been using them for a few semesters, I have started using old exam questions as clicker questions.

When I started, I was concerned that I could only ask multiple-choice (or numeric-answer) questions, but I have found ways to use the multiple-choice clicker questions to motivate working on more open-ended questions: I pose an open-ended question (e.g., “Use a supply and demand graph to show what happens to price and quantity if X happens”) and then follow that with a multiple-choice clicker question that can be easily answered if they did the graph first (and I give them less time to answer, since they were already given time to draw the graph).