Friday, April 27, 2012
Back to the Clicker
1. Optimal Acquisition and Distribution on Campus: Which Approach Works Best? - Ben Harwood - Skidmore College
2. Peer Instruction - Do Students Really Learn from Peer Discussion in Computing? - Leo Porter - Skidmore College
3. Strategies to Assess the Impact of Clicker Use - Michael Krikonis - Clark University
Murray Aikins Ballroom, 815 North Broadway Ave.
Saratoga Springs, New York
When: 9:00 A.M. - 3:00 P.M.
Note: Registration begins at 8:00 A.M.
Workshop Organizer: Ben Harwood of Skidmore College
Clickers and polling devices may have reached maturity with faculty and students on campuses. Because clickers are now easily acquired, institutions have the flexibility to take innovative approaches to integrating them with teaching and learning. Experience also shows that clickers are extending into co-curricular areas such as research, faculty meetings and campus social events. Even as clickers are widely adopted on campuses, challenges remain:
1. What is the optimal acquisition and distribution model for my campus?
2. What are best practices for pedagogy and support?
3. What strategies are available to assess the impact of clicker use?
During this SIG, participants will examine the progress of personal response systems and imagine the future as feedback becomes increasingly possible via web-enabled devices, collaborative work platforms and LMS integrations. The program will consist of scheduled individual presentations and open group conversations shaped by participants. Technologists, faculty and students are invited to participate and share experiences and data from their campuses.
8:00 A.M. - 9:00 A.M. Registration and coffee
Speakers for the Day:
Derek Bruff, Director, Center for Teaching, Vanderbilt University.
Ben Harwood, Instructional Technologist. Skidmore College.
Michael Krikonis, Academic Technologist, Clark University.
Leo Porter, Assistant Professor, Mathematics and Computer Science, Skidmore College.
Lucas Wright, Educational Technologist, St. Lawrence University.
9:00 A.M. - 9:25 A.M. Goals of the day and introductions with Derek Bruff, author of Teaching with Classroom Response Systems: Creating Active Learning Environments. Derek joins the meeting via videoconference.
9:25 A.M. - 10:00 A.M. Group conversation and activity.
During this time, participants are encouraged to suggest topics they would like to learn more about during the SIG. Each participant can shape parts of the day, especially the final hour of programming, by focusing conversation on specific problems or concerns.
10:00 A.M. - 10:30 A.M. Optimal Acquisition and Distribution on Campus: Which Approach Works Best?
Speaker: Ben Harwood, Instructional Technologist, Skidmore College.
Clicker acquisition and distribution models vary from campus to campus. While each approach has its pros and cons, at Skidmore we’ve found that it makes the most sense for IT to purchase clicker kits for faculty and students. Web polling solutions are having little impact on our current planning. While there’s some interest within IT in exploring web polling from student mobile devices, the physical handheld clicker will likely remain the device of choice among faculty and students. Which approach is being used on your campus? Is it likely to change in the future?
10:30 A.M. - 11:00 A.M. Best practices for use and support
Speaker: Lucas Wright, Educational Technologist, St. Lawrence University.
Classroom response systems have found their niche in formative assessment, but it doesn't end there. We will examine several practices in use across disciplines and showcase how this technology is a small but essential catalyst for participation, critical thinking and discussion, and can make for interesting additions to class projects. We will also discuss how to support multiple methods of deployment and use.
11:00 A.M. - 11:15 A.M. Break
11:15 A.M. - 12:00 P.M. Peer Instruction in Computer Science – The Impact of Group Discussion on Learning
Speaker: Leo Porter, Assistant Professor, Mathematics and Computer Science, Skidmore College
Peer Instruction (PI), originally developed by Eric Mazur for physics classes at Harvard University, is an instructional approach that engages students in constructing their own understanding of concepts. The foundation of PI is having students respond to a question individually, discuss it with their peers, and then respond to the same question again. Although not required, this process is made easier by electronic voting systems (clickers). Students benefit from PI by actively engaging with questions during class and by voicing their confusions to their classmates and the instructor. Instructors benefit from real-time data on which concepts are difficult for their students. The effectiveness of PI in physics was demonstrated in a multi-institutional study of six thousand students and has since gained attention in other sciences. In computer science, PI has not yet been widely adopted, but it has been well received by students in introductory classes and by the faculty who have adopted the practice.
In general, the peer discussion portion of PI leads to an increase in the number of students answering a question correctly. In this talk, I will focus on answering the question: Are these students really learning, or are they just "copying" the right answer from someone in their group? In an article in the journal Science, Smith et al. affirm that genetics students individually learn from discussion: having discussed a first question with their peers, students are better able to correctly answer, on their own, a second, conceptually related question. We replicate their study, finding that students in upper-division computing courses (computer architecture and theory of computation) also learn from peer discussions, and explore differences between our results and those of Smith et al.
12:15 P.M. - 1:00 P.M. Lunch, Murray Aikins Dining Hall
1:00 P.M. - 2:00 P.M. Strategies to Assess the Impact of Clicker Use
Speaker: Michael J. Krikonis, Academic Technologist, Clark University
Examining the impact of classroom response systems (CRS) requires a holistic approach when fine-tuning a clicker initiative. Assessment is as necessary to ensure clickers pair well with a teaching style as it is to measure their impact as a learning tool. Assessment is also critical to crafting successful user support models, scalable distribution models, and reliable system integrations. During the session you will explore strategies for measuring both the practical and learning capacities of clicker implementations and reflect on successes and failures through examination of student and faculty feedback.
2:00 P.M - 3:00 P.M. Group activity and Summary with Derek Bruff
During this time, participants will determine how this hour is best spent for the group. One possibility is a lightning round in which individuals or small groups take up to 5 minutes to share stories, problems, eureka moments, etc. This time is ultimately yours.
3:00 P.M. End
NERCOMP MEMBERS - LOGIN TO RECEIVE MEMBER PRICING OF $130
Registration Cancellation Policy
By clicking on the "Order Now" button, you are indicating a commitment to attend and will be held responsible for the registration fee. Your fee can be refunded if you notify us of a cancellation at least 8 days prior to the event via email to nercomp@nercomp
NERCOMP reserves the right to use any photographs or other mechanical recordings taken at NERCOMP events in promotional materials. No mechanical recordings of any kind may be used at NERCOMP events without the prior written consent of NERCOMP organizers and presenters. The views and opinions expressed at NERCOMP events do not necessarily reflect those of NERCOMP, nor does NERCOMP make any representation regarding the information presented at NERCOMP events.