In my last post, I described how I used PollEverywhere in my data analysis course this spring. In this post, I'll discuss student reaction to PollEverywhere; in my next post, I'll wrap up with my own impressions and thoughts/suggestions for others who are considering using it in their classes.
At the end of the semester, I surveyed my students about a number of aspects of using PollEverywhere. SDSU's Instructional Technology Services asks all clicker-using faculty to administer the same survey every semester, and I adapted most of those questions for PE. Many of those questions aren't really about the specific technology but are about using any kind of response system (e.g., "Clickers/PollEverywhere usage helps me to remember course content"). I discussed student responses to clickers a couple of years ago and reactions haven't changed much (if anything, the percentages of students agreeing with most of the statements have increased); reactions to PE on those questions look really similar. More relevant to PE specifically, I did ask a couple of questions about PollEverywhere versus other options, and some questions about technical problems (percentages based on both sections combined, n=122 for most questions):
- In the future, I would select a course section which uses PollEverywhere over another section of the same course which did not use PollEverywhere (note that the comparison here is really another course that does not use any response technology, not comparing to a course that uses clickers instead of PE): 60% strongly or somewhat agree; 6% strongly or somewhat disagree (33% who neither agree nor disagree)
- In the future, I would select a course section which uses PollEverywhere over another section of the same course which uses traditional clickers: 75% strongly or somewhat agree; 6% strongly or somewhat disagree (19% who neither agree nor disagree)
- I would like more professors to use PollEverywhere in their courses: 75% strongly or somewhat agree; 6% strongly or somewhat disagree (19% who neither agree nor disagree)
- I was able to set up my PollEverywhere account and register my phone (if applicable) with few or no problems: 92% strongly or somewhat agree; 6% strongly or somewhat disagree
- Once everything was set up and registered, PollEverywhere has worked well for me: 90% strongly or somewhat agree; 5% somewhat disagree
Reception issues: That last comment deserves more discussion. One uncontrollable factor that has a huge impact on the success of PollEverywhere is cell phone service. The vast majority of my students (89%) submitted their responses via text message; only a handful submitted via web browser either using their smartphone (3%) or laptop (5%). But there were also a couple of students in each class (3% total) who were never able to submit via phone because of bad cell reception and they did not have laptops; my understanding is that pre-paid phones and Nextel phones were the worst. I allowed those students to submit their responses on paper and I manually recorded their scores. Note that this policy only made sense for me because a) almost all the PE questions were low-stakes and b) I only had a couple in each section (out of 75). There were also a number of students who could submit via text most of the time but would have a problem every once in a while; I told those students they could either write their responses on a piece of paper or just let that be part of the 10% of points that would be dropped.
There were a few times when the glitches did seem to be with the PollEverywhere site (e.g., it took a long time to get a confirmation text, or some students didn't get a confirmation text at all but the system did record their response) but I don't expect perfection from any technology (and the problems were definitely rarer than the last time I used eInstruction clickers in the 500-seater). Just as with clickers, I think it's imperative to have a flexible policy (like dropping some points or having alternative ways to get the points) so students don't freak out. More about that in my next post...