
Student response to PollEverywhere

In my last post, I described how I used PollEverywhere in my data analysis course this spring. In this post, I'll discuss student reaction to PollEverywhere; in my next post, I'll wrap up with my own impressions and thoughts/suggestions for others who are considering using it in their classes.

At the end of the semester, I surveyed my students about a number of aspects of using PollEverywhere. SDSU's Instructional Technology Services asks all clicker-using faculty to administer the same survey every semester, and I adapted most of those questions for PE. Many of those questions aren't really about the specific technology but about using any kind of response system (e.g., "Clickers/PollEverywhere usage helps me to remember course content"). I discussed student responses to clickers a couple of years ago and reactions haven't changed much (if anything, the percentages of students agreeing with most of the statements have increased); reactions to PE on those questions look really similar. More relevant to PE specifically, I did ask a couple of questions about PollEverywhere versus other options, and some questions about technical problems (percentages are based on both sections combined, n=122 for most questions):
  • In the future, I would select a course section which uses PollEverywhere over another section of the same course which did not use PollEverywhere (note that the comparison here is really another course that does not use any response technology, not comparing to a course that uses clickers instead of PE): 60% strongly or somewhat agree; 6% strongly or somewhat disagree (that's 33% who neither agree nor disagree)
  • In the future, I would select a course section which uses PollEverywhere over another section of the same course which uses traditional clickers: 75% strongly or somewhat agree; 6% strongly or somewhat disagree (19% who neither agree nor disagree)
  • I would like more professors to use PollEverywhere in their courses: 75% strongly or somewhat agree; 6% strongly or somewhat disagree (19% who neither agree nor disagree)
  • I was able to set up my PollEverywhere account and register my phone (if applicable) with few or no problems: 92% strongly or somewhat agree; 6% strongly or somewhat disagree
  • Once everything was set up and registered, PollEverywhere has worked well for me: 90% strongly or somewhat agree; 5% somewhat disagree
Preferred over clickers: I should point out that for the first and third questions (about choosing a section that uses the technology over a section that does not, and wanting more professors to use the technology), the percentages agreeing about PollEverywhere are much higher, and the percentages disagreeing much lower, than when I asked the same questions about clickers in my fall data classes, where roughly 37% agreed and about 20% disagreed (remember, I pretty much just replaced clickers with PollEverywhere for the spring classes; the questions themselves, and how the technology was used, were the same). Students also seemed to prefer PE over clickers specifically (about 80% of the students had used clickers in other classes). Of course, I could be really cynical and suggest that maybe students prefer PE because it means they can have their cell phones out in class, and I don't doubt that actually does make students more receptive to PE. But in their open-ended comments, students pointed out that PE was more convenient because they always have their phones with them (versus having to remember to bring clickers), they liked having multiple options for submitting responses, and, for those with unlimited texting plans, they liked that it was free. On the other hand, a couple of students complained that it wasn't free for them because they pay per text. Other student complaints were similar to complaints I've heard about clickers: a phone died and the student had to miss points (I drop 10% of the points, but that doesn't stop students from complaining), students wanted more time to submit responses, and sometimes responses wouldn't go through.

Reception issues: That last complaint deserves more discussion. One uncontrollable factor that has a huge impact on the success of PollEverywhere is cell phone service. The vast majority of my students (89%) submitted their responses via text message; only a handful submitted via web browser, either on a smartphone (3%) or a laptop (5%). But there were also a couple of students in each class (3% total) who were never able to submit via phone because of bad cell reception and who did not have laptops; my understanding is that pre-paid phones and Nextel phones were the worst. I allowed those students to submit their responses on paper and I manually recorded their scores. Note that this policy only made sense for me because a) almost all the PE questions were low-stakes and b) I only had a couple of such students in each section (out of 75). There were also a number of students who could submit via text most of the time but would have a problem every once in a while; I told those students they could either write their responses on a piece of paper or just let that be part of the 10% of points that would be dropped.

There were a few times when the glitches did seem to be with the PollEverywhere site (e.g., it took a long time to get a confirmation text, or some students didn't get a confirmation text at all but the system did record their response) but I don't expect perfection from any technology (and the problems were definitely rarer than the last time I used eInstruction clickers in the 500-seater). Just as with clickers, I think it's imperative to have a flexible policy (like dropping some points or having alternative ways to get the points) so students don't freak out. More about that in my next post...
