Welcome new readers!

The "New to the blog? Start here" page will give you an overview of the blog and point you to some posts you might be interested in. You can also subscribe to receive future posts via RSS, Facebook or Twitter using the links on the right-hand side of the page, or via email by entering your address in the box. Thanks for reading!

Thursday, June 30, 2011

PollEverywhere: Summing up

[This post wraps up my reflections on my pilot of PollEverywhere this past spring. If you missed my last two posts, I discussed how I used PollEverywhere in my data analysis course, and student reaction to it.]

So here are my general thoughts on PollEverywhere, particularly relative to standard clickers:

Advantages
  • Convenience for students. They all have cell phones so they never 'forget' their device the way they do with clickers.
  • Easy to ask open-ended questions. Even on clicker systems that have this feature, it is generally easier with cell phones/laptops.
  • Relatively low-cost. If you have fewer than 30 students, the service is completely free; if you need to track more responses, there is a cost for a PE account that someone (you, your institution, or the students) will have to bear. For students who do not have unlimited texting, there may be costs for sending/receiving messages; the total cost will depend on how many questions you ask (in a previous post, I calculated that for my class, assuming 10 cents per text, it would still be cheaper for students than buying a clicker).
  • Relatively low-commitment. Even if clickers are used a lot on your campus (as is the case at SDSU), PE can be a great complement for faculty who only want to use the technology occasionally. For example, in my spring writing course, there was one class meeting where I wanted to survey the class about some plagiarism issues, but since it was only that one day, it really didn't seem worth having them buy and register clickers. In the past, I probably would have either skipped the surveying or I would have asked students to answer the questions before class; instead, I used PE (note that in this situation, I did not need to track responses so students did not need to have accounts or register their phones).
  • Few problems with software. This is a relative thing - many faculty on my campus have had major issues with the clicker software we've been using. In comparison, PE was really easy and almost always worked well. Not perfect but glitches were pretty rare.
Disadvantages [with the caveat that some of these things seem like a pain to me just because I'm used to clickers; if you've never used clickers, you may not find most of these issues particularly problematic. Also, according to a comment on my first post, the PollEverywhere folks are working on several of these concerns!]
  • Lack of integration with University systems. Students have to register on a separate site and getting grades into the course management system requires more work than most clicker systems.
  • Multiple submissions from same person recorded as separate responses. If you allow multiple responses (which allows students to change their answer after submitting something), you'll have to deal with sorting those out afterwards.
  • Integration with PowerPoint is kind of clunky. I do everything in PowerPoint and prefer not to switch to a separate site to ask questions, so I embedded all my PE questions into PowerPoint. To do this, you create all the questions on the PE site and then export a PowerPoint file. The questions themselves are Shockwave Flash objects embedded in the slides, which I would then cut and paste into my class PowerPoint presentation (where I also created a timer, which I'm used to having for clicker questions). It's not difficult; it's just more time-consuming and not nearly as seamless as with clicker systems.
  • Difficult to ask questions on the fly. Many clicker systems have a way to get responses to questions that have not been prepared ahead of time (e.g., asking a question verbally or creating a new question on the fly). It isn't hard to create questions in PollEverywhere but in order to generate the codes for responses, you do have to go into the system and create the question with answer responses. I think trying to ask something that you didn't prepare ahead of time would be too time-consuming to do spontaneously in class. It occurs to me as I write this that one possible workaround (if you anticipate needing this option) is to create a generic question so the codes are already created...
  • Need consistent cell service. Most of my students do not have laptops, so if cell service doesn't work for someone, there is no alternative way for them to submit responses to the system. I really am not sure what I would do if I had more than two or three of these students. In my opinion, this is the biggest reason not to have PollEverywhere completely replace clickers (e.g., if a campus is trying to decide on one standard, as SDSU is doing).
Other issues [Not necessarily good or bad but stuff you should think about if you're considering using PE]
  • Students have cell phones out during class. Duh. This is not a problem for me but I know it is a huge issue for other people. Aside from issues related to students cheating (which I didn't worry about because all my questions were low-stakes) and students texting their friends (which I don't worry about in general), the one thing I was a bit concerned about was whether students would remember to turn off their ringers. I put a reminder on the first slide to silence phones (there's an image in my previous post) and I have to say, I don't think I had any more ringers go off than in a typical semester; if anything, I think there may have been fewer.
  • Monitoring answer distribution is kind of clunky. This is true with the eInstruction clicker system as well - if you want to see what the answer distribution looks like but don't want students to see it (e.g., if you think the distribution is likely to be mixed and you plan to re-ask the question), the system itself doesn't really let you do this since whatever you see on the computer is what is shown on the projector screen. My workaround is that I usually 'freeze' the projector (so the image on the screen doesn't change even if what I see on the computer monitor changes). This is only possible in some smart classrooms on my campus, not in others.
  • No technology is 100% perfect. You still need to have a plan for how you will handle things like the system being slow, or students who insist they submitted responses but the system didn't register them. As with any technology, it's best if you go in with a flexible attitude. One of my students commented, "I loved polleverywhere... my only concern would be if other professors would be able to utilize the program as efficiently... Sometimes when professors who are not tech savvy attempt to use such programs, it sometimes eats up more class time just for them to figure out how to use it." PollEverywhere is pretty user-friendly but if you're the type of person who gets flustered easily when things don't go entirely according to plan, you may want to do a lot of practice runs before integrating into a live class.
Overall, I think PollEverywhere is a great service, particularly for faculty who want to incorporate clicker-type activities but don't want to make their students buy/register a clicker, or if you really want to ask open-ended questions. I do think it is best for low-stakes activities, particularly given the issues I encountered with cell reception, but it would be great as a back-channel or in classes where the technology is only needed once in a while.

Monday, June 27, 2011

Student response to PollEverywhere

In my last post, I described how I used PollEverywhere in my data analysis course this spring. In this post, I'll discuss student reaction to PollEverywhere; in my next post, I'll wrap up with my own impressions and thoughts/suggestions for others who are considering using it in their classes.

At the end of the semester, I surveyed my students about a number of aspects of using PollEverywhere. SDSU's Instructional Technology Services asks all clicker-using faculty to administer the same survey every semester and I adapted most of those questions for PE. Many of those questions aren't really about the specific technology but are about using any kind of response system (e.g., "Clickers/PollEverywhere usage helps me to remember course content"). I discussed student responses to clickers a couple years ago and reactions haven't changed much (if anything, the percentages of students agreeing with most of the statements have increased); reactions to PE on those questions look really similar. More relevant to PE specifically, I did ask a couple questions about PollEverywhere versus other options, and some questions about technical problems (percentages based on both sections combined, n=122 for most questions):
  • In the future, I would select a course section which uses PollEverywhere over another section of the same course which did not use PollEverywhere (note that the comparison here is really another course that does not use any response technology, not comparing to a course that uses clickers instead of PE): 60% strongly or somewhat agree; 6% strongly or somewhat disagree (that's 33% who neither agree nor disagree)
  • In the future, I would select a course section which uses PollEverywhere over another section of the same course which uses traditional clickers: 75% strongly or somewhat agree; 6% strongly or somewhat disagree (19% who neither agree nor disagree)
  • I would like more professors to use PollEverywhere in their courses: 75% strongly or somewhat agree; 6% strongly or somewhat disagree (19% who neither agree nor disagree)
  • I was able to set up my PollEverywhere account and register my phone (if applicable) with few or no problems: 92% strongly or somewhat agree; 6% strongly or somewhat disagree
  • Once everything was set up and registered, PollEverywhere has worked well for me: 90% strongly or somewhat agree; 5% somewhat disagree
Preferred over clickers: I should point out that for the first and third questions (about choosing a section that uses the technology over a section that does not, and wanting more professors to use the technology), the percentages agreeing about PollEverywhere are much higher and percentages disagreeing are much lower than when I asked the same questions about clickers in my fall data classes, where roughly 37% agreed and about 20% disagreed (remember, I pretty much just replaced clickers with PollEverywhere for the spring classes; the questions themselves, and how the technology was used, were the same). Students also seemed to prefer PE over clickers specifically (about 80% of the students had used clickers in other classes). Of course, I could be really cynical and suggest that maybe students prefer PE because it meant they could have their cell phones out in class, and I don't doubt that actually does make students more receptive to PE. But in their open-ended comments, students pointed out that it was more convenient because they always have their phones with them (versus having to remember to bring clickers), they liked having multiple options for submitting responses, and they liked that it was free, for those with unlimited texting. On the other hand, a couple students complained that it wasn't free for them because they pay per text. Other student complaints were similar to complaints I've heard about clickers: phone died and had to miss points (I drop 10% but that doesn't stop students from complaining), need more time to submit responses, and sometimes responses wouldn't go through.

Reception issues: That last comment deserves more discussion. One uncontrollable factor that has a huge impact on the success of PollEverywhere is cell phone service. The vast majority of my students (89%) submitted their responses via text message; only a handful submitted via web browser either using their smartphone (3%) or laptop (5%). But there were also a couple of students in each class (3% total) who were never able to submit via phone because of bad cell reception and they did not have laptops; my understanding is that pre-paid phones and Nextel phones were the worst. I allowed those students to submit their responses on paper and I manually recorded their scores. Note that this policy only made sense for me because a) almost all the PE questions were low-stakes and b) I only had a couple in each section (out of 75). There were also a number of students who could submit via text most of the time but would have a problem every once in a while; I told those students they could either write their responses on a piece of paper or just let that be part of the 10% of points that would be dropped.

There were a few times when the glitches did seem to be with the PollEverywhere site (e.g., it took a long time to get a confirmation text, or some students didn't get a confirmation text at all but the system did record their response) but I don't expect perfection from any technology (and the problems were definitely rarer than the last time I used eInstruction clickers in the 500-seater). Just as with clickers, I think it's imperative to have a flexible policy (like dropping some points or having alternative ways to get the points) so students don't freak out. More about that in my next post...

Friday, June 24, 2011

Using PollEverywhere instead of clickers

Months ago, I mentioned that I was part of an ITS pilot of PollEverywhere this past spring. Quick reminder: PollEverywhere is a web-based service where anyone can create a multiple-choice or open-ended question and people can respond via text, Twitter or website. I first used PollEverywhere in the fall when I wanted a way for my teams to submit open-ended responses. The free version only allows up to 30 responses per poll, which was fine for 13 team responses but wouldn't work for individual responses (since I have 75 students in each of my sections), so I used clickers for any individual responses. In the spring, the University bought a PE account subscription so there could be unlimited responses. It also meant that students could register and their responses were recorded, so I could use PollEverywhere as a replacement for clickers. In this post, I'll explain the mechanics of how I used PollEverywhere and some of the associated pluses and minuses. In my next couple posts, I'll talk about what the students thought and my overall impressions.

Low-stakes assessments: I mostly used PE to have students submit their individual responses to multiple-choice application problems before they discussed those same problems in their teams. My main concern was making sure that students had to think about the problem individually a little and commit to an answer before discussion. I didn't care so much what specific answer they chose so students received credit just for answering anything (i.e., participating); there were only a couple times when I made credit dependent on selecting the 'right' answer. PE allows you to embed polls in PowerPoint and that is what I did, rather than switching over to the website each time. One downside of PE, relative to clickers, is that there is no timer so I created one using animation in PowerPoint. It's a clunky workaround but if you want to give students a visual indication of how much time they have to answer a question, I'm not sure what the alternatives are.

Grouping questions: One thing I had to decide was if I wanted to 'group' my questions together or not. The way PE normally works, each answer choice has a randomly-generated unique keyword; for example, if you want to choose answer A, then you send in '70101' and if you want to choose answer B, then you send in '70103', etc. With a paid account, you can also create your own keywords to replace the random numbers (but they still have to be unique since the keyword identifies both the question and the specific answer choice). An alternative is grouping multiple questions together and assigning a keyword to the group. Once you do that, respondents send in the group keyword before any questions are asked; they get a response that says they are enrolled in that 'event'. Then, the codes for individual answer choices within the group are numbered 1-9 and then alphabetically. That is, say the first question in the group has five answer choices; they would be numbered 1, 2, 3, 4, and 5, so students who wanted to select the fifth answer would only have to text in the number '5'. If the next question in the group also has five answer choices, they would be numbered 6, 7, 8, 9 and A, so students who wanted to select the fifth answer there would text in the letter 'A'. This can get a little bit confusing since most of us are used to talking about answer choices as A through E but I got used to it. For me, grouping questions together made it easier to keep my poll questions organized. I created one group for each class session, for each section, and named the group with the date and section time; for example, May11AM for my 11am class and May11PM for my 2pm class. As soon as students got to class, they knew they should text in the day's keyword so they would be ready to go when the first question came up. Here's what the first slide of every class looked like (this was up as students walked in):

And here's what a typical question looked like (note that the strip along the side was my 'timer'):
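If the 1-9-then-letters numbering sounds confusing, here's a small sketch that makes it concrete. This is purely illustrative (it's my own hypothetical code, not anything PollEverywhere provides): given the number of answer choices in each question of a group, it assigns response codes in sequence the way PE does.

```python
# Illustrative sketch (hypothetical, not PollEverywhere's software):
# answer choices across a grouped set of questions are labeled
# 1-9 first, then A, B, C, ... in order.
import string

LABELS = "123456789" + string.ascii_uppercase

def assign_codes(num_choices_per_question):
    """Given a list like [5, 5] (two five-choice questions),
    return the response codes for each question in the group."""
    codes, i = [], 0
    for n in num_choices_per_question:
        codes.append(list(LABELS[i:i + n]))
        i += n
    return codes

# Two five-choice questions: the fifth answer of the first question
# is '5', but the fifth answer of the second question is 'A'.
print(assign_codes([5, 5]))
```

So a student choosing "answer E" on the second question of the day texts in 'A', which is exactly the mismatch with the usual A-through-E labels that takes some getting used to.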

No integration with Blackboard: In order to give students credit for their PE responses, they first have to create accounts in PE and if they want, they can register their cell phones (so any response sent from that phone number is automatically connected to their account). If they don't register their phones, they have to log in each time and submit responses using a browser. Another drawback of PE, relative to clickers, is that it is not integrated with Blackboard, the course management system. This means there are extra steps for students (registering on a separate site) and extra steps for me. To get their daily points into Blackboard, I had to create a 'report' in PE, download that to Excel, make any necessary adjustments in Excel (such as giving credit for right answers versus just participation, or just summing up the points for the day), then upload to Blackboard. A colleague in the business school who also piloted PE this spring has apparently developed an Excel macro that can take care of some of the Excel manipulations but I just did things manually. For me, the extra work wasn't a huge issue but one thing that was frustrating was that in order to upload to Blackboard easily, I asked students to change their identifier to their University ID number (the default when they create their accounts is their email address). By the third week (and after multiple reminders), almost all students had done this but I had two students (one in each section) who never made the change; since I stopped making the adjustment for them after Week 3, this meant that their PE points were zero for every single class and they STILL didn't figure it out! [Note: if PE were used a lot more across campus so this happened in all their classes, I have to assume they would eventually fix it but I'm still amazed...]
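For anyone scripting-inclined, the manual Excel step above is easy to automate. Here's a minimal sketch of the idea: it takes PollEverywhere-style response rows and tallies one participation score per student ID, ready for a gradebook upload. The field names and scoring rule (one point per response, capped per day) are invented for illustration; PE's actual report columns will differ.

```python
# Hypothetical sketch of the report-massaging step: turn raw response
# rows into one participation score per student ID. Field names and
# the cap are made up for illustration.

def daily_scores(rows, cap=3):
    """rows: list of dicts with a 'student_id' key.
    Returns {student_id: points}, one point per response, capped."""
    scores = {}
    for row in rows:
        sid = row["student_id"]
        scores[sid] = min(scores.get(sid, 0) + 1, cap)
    return scores

rows = [
    {"student_id": "800123456", "answer": "B"},
    {"student_id": "800123456", "answer": "D"},
    {"student_id": "800987654", "answer": "A"},
]
print(daily_scores(rows))  # {'800123456': 2, '800987654': 1}
```

Something like this (or my colleague's Excel macro) takes care of the "summing up the points for the day" adjustment; the right-answer-versus-participation rule would just be a different scoring function.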

Dealing with multiple responses: Another issue I had to consider was how to handle multiple responses. With most clicker systems, students can change their responses as long as the question is open and the system will simply retain the last answer submitted. With PE, you can set an option to only allow up to X responses per person or unlimited responses; if you choose to allow multiple responses, PE records every response separately (every response is time-stamped). PE also can send a confirmation text so students can verify their response was received (this is an option you can turn on or off). In my case, since it usually didn't matter which specific answer a student selected, I set things up so they could only submit one response; on the few occasions where their specific choice 'mattered', I made sure to tell students that they needed to be extra careful before sending in their responses since they would only get one shot at it. My colleague in the business school allowed multiple responses and then used his Excel macro to only count the last submission for each student.
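Since every PE response is time-stamped, the "keep only the last submission" cleanup my colleague's macro does is straightforward to express. Here's a hypothetical sketch (field names are invented, and I haven't seen his macro; this is just the general idea): for each student, keep whichever response has the latest timestamp.

```python
# Hypothetical sketch of the "count only the last submission" cleanup:
# every response is time-stamped, so keep each student's latest one.
# Field names are invented for illustration.

def last_response_per_student(rows):
    """rows: dicts with 'student_id', 'timestamp', 'answer'.
    Returns {student_id: answer} using each student's latest response."""
    latest = {}
    for row in rows:
        sid = row["student_id"]
        if sid not in latest or row["timestamp"] > latest[sid]["timestamp"]:
            latest[sid] = row
    return {sid: r["answer"] for sid, r in latest.items()}

rows = [
    {"student_id": "s1", "timestamp": "10:02:11", "answer": "A"},
    {"student_id": "s1", "timestamp": "10:02:45", "answer": "C"},  # changed answer
    {"student_id": "s2", "timestamp": "10:02:30", "answer": "B"},
]
print(last_response_per_student(rows))  # {'s1': 'C', 's2': 'B'}
```

This replicates the clicker behavior of letting students change their answer while the question is open, without having to sort out duplicates by hand afterwards.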

Next time, I'll share some of the feedback I got from students...

p.s. While I was working on writing this post, InsideHigherEd had an interesting article on standardization of clickers that mentions cell phones replacing clickers. And if you're more old-school, ProfHacker just posted an article about low-tech alternatives to clickers.

Related posts:
Texting in response to open-ended questions
Student response to PollEverywhere

Thursday, June 23, 2011

Econ Ed sessions at the Westerns

For anyone attending the Westerns next week, it looks like there are only a few econ ed sessions; here's what I could find. Also note that CSWEP is sponsoring a panel on Thursday morning (8:15-10am) called 'Striking a Balance: Getting Tenure and Having a Life' - grad students and junior faculty are particularly encouraged to attend!

Friday, July 1, 8:15-10:00am
Chair: Robert L. Sexton, Pepperdine University
Papers: Satyajit Ghosh, University of Scranton, and Sarah Ghosh, University of Scranton
Beyond ‘Chalk and Talk’: Teaching Macroeconomic Policy with Spreadsheet Simulation
Denise L. Stanley, California State University, Fullerton, and Morteza Rahmatian, California State University, Fullerton
Can Technology Make Large Classrooms Neutral for Learning? The Case of an Upper-Division CBE Core Class
David M. Switzer, St. Cloud State University, and Kenneth Rebeck, St. Cloud State University
Using Online Tools to Improve the Quantity and Quality of Student Evaluations
Gandhi Veluri, Andhra University
Usage of Computer Techniques in Understanding Economics
Discussants: Denise L. Stanley, California State University, Fullerton
Frank M. Machovec, Wofford College
Gandhi Veluri, Andhra University
Sheena L. Murray, University of Colorado, Boulder

Saturday, July 2, 8:15-10:00am
Chair: A. Wahhab Khandker, University of Wisconsin, La Crosse
Papers: Manfred Gaertner, University of St. Gallen, Bjorn Griesbach, University of St. Gallen, and Florian Jung, University of St. Gallen
The Financial Crisis and the Macroeconomics Curriculum: A Survey among Undergraduate Instructors in Europe and the United States
Susan Jacobson, Regis College
Community-Based Learning—Making It Stick
David F. Kauper, University of the Pacific
Cost-Minimizing Students
A. Wahhab Khandker, University of Wisconsin, La Crosse, and Amena Khandker, University of Wisconsin, La Crosse
What Should We Teach Our Students About Interest Rate Determination
Discussants: Denise A. Robson, University of Wisconsin, Oshkosh
David F. Kauper, University of the Pacific
David E. Chaplin, Northwest Nazarene University
Manfred Gaertner, University of St. Gallen

Saturday, July 2, 2:30-4:15pm
Organizer and Moderator: Elia Kacapyr, Ithaca College
Panelists: Pierangelo De Pace, Pomona College
Lesley Chiou, Occidental College
Christiana E. Hilmer, San Diego State University
Nicholas Shunda, University of Redlands

Monday, June 20, 2011

Early adoption

I've always been a semi-early adopter of technology. I'm not a fanatic about it; I'm just sort of fascinated by the internet, by the ability we now have to reach people we never could in the past, and I like playing around with stuff. Back in college, I would 'chat' with friends via the VAX (I think that's what it was called), and in grad school I learned some HTML so I could create a personal webpage with lots of random stuff on it. I was actually excited when our campus started using Blackboard because it was easier to post my class stuff there than on the webpages I created on my own. And as the number of tech and web-based communication tools has exploded, I've explored a bunch of them, as I've written about here a lot.

But even though I think technology is a wonderful thing, when it comes to teaching, I don't think I use technology just for technology's sake. Rather, I'd say that when I'm faced with a problem, I tend to look to technology as part of the solution. Lisa Lane points out that many faculty are OK with using technology for non-pedagogical problems, like recording grades, and technology is clearly great for simplifying things like distributing course materials. But my interest in using technology for teaching really kicked into high gear when I started teaching the 500-seat class. I certainly can't imagine teaching that class without clickers but once I started using them, I quickly realized the opportunities they create for student engagement so that now, I wouldn't teach a class of any size without them. And that experience has led me to look for other ways that technology can increase interaction both inside and outside the classroom.

Given my own inclinations, I have to admit that I find it a bit odd when I encounter people who seem to be anti-technology. On the one hand, I do understand why some people think Twitter, Facebook, blogs, etc. are a waste of time (because goodness knows they can be!), and I certainly understand the frustration many teachers have with their students' texting all the time and all the associated issues that we could blame on the 'net gen' connection to technology. But on the other hand, I can't help but think that people who make those kinds of comments are, well, big fuddy-duddies, particularly since these comments often come from people who don't actually know anything about the technologies they are disparaging. To me, it sounds a lot like the latest version of, "Eh, kids these days!" And when I hear those comments from teachers, I can't help but wonder: do they not understand that at least some of these tools have the potential to help them reach students, to increase student interaction and engagement? Or is it that they don't care about reaching students? Or, to put that more nicely: why is it that some people perceive the costs of learning about technology to be so much greater than the benefits?

Tuesday, June 14, 2011

Turtle steps

I am obsessive about lists. I make lists for what I need to do on a given day, in a given week, for different projects, for work and for my personal life, you name it. I make lists of stuff I need to buy at Target or at Trader Joe's, what I need to pack for an upcoming trip, and stuff I want to blog about sometime in the future. I've also started writing down pretty much anything that I might want to remember later, as soon as it comes into my head, because I swear, my memory just doesn't work the way it used to (I'd like to blame social media multi-tasking rather than old age but I may be deluding myself with that one). My point is, I make a lot of lists.

Some might see list-making as a procrastination device - i.e., time spent making my To Do list is time NOT spent actually doing the things on my list. But while that is technically true, I've realized that, for me, list-making is a way of making sure that I do things efficiently and it actually makes me less likely to procrastinate. The efficiency part comes from thinking through the different aspects of a project before I begin, writing down exactly what needs to be done, in what order. But the real key for me is that my To Do lists are masterpieces of what life coach Martha Beck calls 'turtle steps'. In her book, Finding Your Own North Star, Beck talks about how she got stuck writing her Ph.D. dissertation because she felt so overwhelmed by the enormity of the task that she couldn't make herself even begin. She didn't make any progress until she stopped thinking about the task as "Write dissertation" and instead broke the task down into steps that she considered manageable; in her case, that meant she told herself she only needed to work for 15 minutes each day. While that doesn't sound like much, it was 15 minutes more than she had been doing before and she ultimately finished.

I'm a huge fan of turtle steps - most people would probably find it ridiculous how much I can break down a project! But the great thing about those tiny steps is that when I'm having one of my blah days, when I just can't seem to get motivated and all I want to do is play Bejeweled or Angry Birds, I can almost always find something on my list that is so small that I think to myself, "OK, I can totally do THAT and then I'll at least have done something productive today." About half the time, doing that one little thing pushes me enough out of my lethargy that I end up finishing off at least a few more. But even on really bad days, when that one thing is the only thing I do, I figure it's still more than I would have otherwise done (given that I am honest about the fact that the alternative is that I would have done nothing). And of course, on good days, there's nothing to stop me from plowing through several items.

To give you an idea of what I mean by turtle steps (and why I'm thinking about them today), one of my summer projects is writing a new edition of a book (with the book's original author). I was terrible about working on it during the spring semester and have promised myself and my co-author that it is my top priority this summer. So of course, today, my first real day of summer work (I actually took a vacation last week), I could not find any motivation for it. I had to catch up on a bunch of stuff that I missed while away so by the time I was ready to get to work on the book, I had run out of steam. But on my project To Do list, one of the items is "create a Word file for each chapter (to record notes and links)", followed by a list of the individual chapters I'm supposed to focus on first. So I opened Word, typed "Demography" at the top (that's one of the chapters), added a couple section headers as reminders of what I need to do, and saved the file. I opened another document, typed "Education" and the section headers, and saved the file. I made three other files for the three other chapters, and then crossed all those items off my list. That's it. I know it doesn't sound like much but it's more than nothing and it's five minutes I won't have to spend tomorrow so I can start right in on the next item: "Go through 3 papers and decide if ideas are worth keeping". I have a stack of about fifty papers to go through so that item will be repeated multiple times but if I just wrote "Go through stack of papers" I know I would be much more likely to find some way of putting it off (note this is also how I get through grading - if I thought "I have 150 papers to grade," I would suddenly find Top Chef reruns the most interesting thing in the world but if I tell myself "Just grade five papers before you watch TV," I'll sit down and do it).
(oh, and for my co-author, who I'm pretty sure will read this: don't worry, I promise that there will be plenty of days when I accomplish a lot more than I did today!!!).

Maybe this sounds ridiculous but it works for me. How do you stay productive when you don't feel particularly motivated?

Tuesday, June 7, 2011

Work in progress...

Don't be alarmed - it's still the same site, just looks a little different. After three years, I figured the design could stand to be updated (if you usually get my posts via email or in a reader, ignore this). Over the next few weeks, I'm planning to clean up my labels so it will be easier to find old posts on related topics and just get things a little more organized around here. Yay for summer!