Position vs. Time Graphical User Interface

Giving Robots (and Students) a Head Start

“So we should just play with this, then!”

This summer, I reprogrammed the Scribbler 2 Robot from Parallax Corporation to be a physics apparatus. (I intend the robot project itself to be the subject of a future blog post.) This lab practicum is the students’ first opportunity to program a robot themselves. Each group is given two robots: one programmed in advance by me, which students studied in their first robot lab, and a second that they must program themselves through a position vs. time graphical user interface. Students must understand how a position vs. time graph describes motion in order to get the robot to complete the assigned task.

When asked what they can change and measure to describe the robot’s motion, students with limited prior background typically decide to graph distance and time and calculate speed, so this is what happens in robot lab 1. The purpose of this second lab is to present the students with a task that requires the more sophisticated concept of position. Following the precept “idea first, name afterwards,” which modeling instruction adopted from the research of Arnold Arons, students draw a position vs. time graph because a distance vs. time graph does not adequately describe the situation, then compare and contrast their new representation with the distance vs. time graph. Only when students can clearly articulate these differences does the instructor provide the name “position” for the new concept.

In robot lab 1, students created a distance vs. time graph by measuring the robot’s motion and graphing the individual points. The Position vs. Time Graphical User Interface (PvsTGUI) allows students to do the same thing in reverse: draw a position vs. time graph on their computer and send it to a robot, which then performs the motion described by the graph. The screenshot below shows the GUI. To operate it, students grab a line segment from the blue bar at the top and drag it onto the graph. By grabbing and dragging, they can move the endpoints of the segment to a different location. Clicking the robot icon in the upper right corner sends the program to a robot, if one is attached to the computer through a USB port. The PvsTGUI allows multiple segments of motion and parabolic curved segments, but these features are unnecessary for the task at hand, so I demonstrate only the basics at this time.

Position vs. Time Graphical User Interface
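
The PvsTGUI’s internals are beyond the scope of this post, but the core idea behind “send the graph to the robot” is easy to sketch. In the rough Python below (my own illustration – the function names, the assumed top speed, and the print-out standing in for the robot interface are all placeholders, not the actual PvsTGUI or Scribbler 2 code), each drawn segment becomes one command: the slope of the segment is the velocity the robot should hold, and the width of the segment is how long to hold it.

```python
# Illustrative sketch only, not the PvsTGUI source. A segment drawn on the
# graph is a pair of (time in s, position in cm) endpoints; its slope is the
# velocity the robot should hold over that stretch of time.

MAX_SPEED_CM_S = 10.0  # assumed top speed; the real robot's value will differ

def segment_to_command(segment, max_speed=MAX_SPEED_CM_S):
    """Turn one graph segment into a (fractional speed, duration) command."""
    (t0, x0), (t1, x1) = segment
    duration = t1 - t0                     # width of the segment, in seconds
    velocity = (x1 - x0) / duration        # slope of the segment, in cm/s
    return velocity / max_speed, duration  # sign of the fraction gives direction

def send_motion(segments):
    """Hypothetical stand-in for clicking the robot icon."""
    for seg in segments:
        fraction, duration = segment_to_command(seg)
        print(f"drive at {fraction:+.2f} of full speed for {duration:.1f} s")

# One segment: start at 0 cm and reach 120 cm at t = 20 s.
send_motion([((0.0, 0.0), (20.0, 120.0))])
```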

I demonstrated the PvsTGUI to individual groups as they completed robot lab 1. In every group, at least one student commented, “Cool!” and one student said, “So we should just play with this, then!” Getting the robot to follow their own commands – with ease – trumped their initially disparaging reaction to the robots’ slow speed. Now the robots were cool, and this engagement was one of the goals of the project. It’s implicit in the activity that you can use physics knowledge to manipulate your environment by controlling a robot. No heavy-handed discussion about “real world applications” is necessary. The students are doing it themselves from day 3 of their physics class.

The task was to program a second robot so that it had a head start of 60 cm but reached the same finish line at the same time as their first robot, and then to display the motion of both robots on a single “distance vs. time” graph, so that the head start, the speeds, and the fact that the robots reached the same finish line at the same time were all clearly visible on the graph. Although I used the words “distance vs. time” in my instructions to the students – because that is the only graph of motion they have encountered so far – the quotation marks are there to emphasize that they will really be drawing a position vs. time graph. The new name will not be introduced until the post-lab discussion.
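
To make the arithmetic behind a successful solution concrete (the numbers here are my own illustration; each group chose its own finish line and timing), suppose robot 1 covers 120 cm in 20 s. Robot 2 starts 60 cm down the track, so in those same 20 s it has only 60 cm left to cover and must therefore move at half the speed:

```python
# Illustrative numbers only; each group's finish line and timing differed.
finish_line_cm = 120.0   # position of the shared finish line
trip_time_s = 20.0       # time robot 1 takes to reach the finish line
head_start_cm = 60.0     # robot 2 begins its run here

speed_robot1 = finish_line_cm / trip_time_s                    # 6.0 cm/s
speed_robot2 = (finish_line_cm - head_start_cm) / trip_time_s  # 3.0 cm/s

print(speed_robot1, speed_robot2)  # 6.0 3.0 -- robot 2 moves half as fast
```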

Once students successfully programmed their robot and tested the result, every group correctly showed a 60 cm y-intercept for the line representing robot 2’s motion. It took some pointed questions from me, however, for any of these groups to recognize that their robot could not have traveled a distance of 60 cm in 0 seconds. This, along with examination of additional points as needed, helped them realize that their graph’s y-axis label of “distance” was inaccurate. I asked them to write a sentence on their whiteboards explaining what their graph did show, and got various descriptions generally stating “where the robot is at, not how far it traveled,” as well as the word “location” and, in one case, “position” – the correct technical term.

Once the class discussion indicated that this meaning had penetrated to a majority of the students, I finally introduced the word “position,” asking them to change the label on the y-axis of their graph and also to draw what the distance vs. time graph would look like. Comparing the information that could be read off the two graphs below clearly showed the students why physicists prefer the more informative position vs. time graph. A good follow-up question asked students to show how the distance traveled shows up on their position vs. time graphs.

Position and Distance Graphs

I was pleased with this as an introduction to position. The head start task was less complicated than the task I had given students in previous years, which involved a slow buggy and a fast buggy moving in opposite directions from opposite ends of a track. Because I wanted students to develop the concept of position from the buggy task, my instructions became complicated and convoluted, and I could not figure out how to make them simpler without giving away the answer. By contrast, I was able to provide simple, straightforward instructions for the head start robot, and students developed the idea of position spontaneously and with much less direction from me. Rather than provide hints about how to solve the problem, I only had to repeat the instructions: the robots must reach the same end point at the same time, and the graphs on their whiteboards must show not only that, but also the head start.

The students definitely had an easier time figuring out how to complete the robot task, but they still encountered some difficulties along the way. Some groups assumed that their second robot would have to have the same speed as the first and calculated the appropriate time for it to reach the same end point. Some even correctly graphed the two robots, showing that they would not reach the end at the same time, yet were still surprised when their actual test failed and the robots did not arrive simultaneously.

One of my goals in developing the robots was to enable students to check their understanding independently, without the need for a teacher to check or grade their work, so I was pleased with the students’ response to “failing” their first test. They went right back to work trying to figure out the error in their understanding. They did not view it as a failure, but rather as a sign that they had more work to do. I believe that having your understanding impersonally checked in the lab helps set up a growth mindset, where the students focus on succeeding despite the setback. A reaction to getting an answer marked wrong by a teacher is more likely to be taken personally, as an evaluation of ability: “I’m just not good at this.” (Self-checking materials, an idea borrowed from Montessori, will be the subject of a later blog post.)

Many of the other groups struggled to figure out that the time for the two robots had to be the same, even if they did not suffer from the assumption that the speeds would be equal. I speculate that they had some intuitive feeling that both distance and time would have to be different in order for speed to be different. To back this up, I overheard many discussions about the appropriate distance and time for the second robot, and whether these choices would make the second robot faster or slower. Even when they generally knew it would have to be slower, they still had to work through the idea that a shorter distance with the same time would make the speed slower, but a shorter time would actually make the speed faster.
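
A quick numerical check of those relationships (again with made-up values; the point is the direction of each change, not the particular numbers):

```python
def speed(distance_cm, time_s):
    return distance_cm / time_s

print(speed(120, 20))  # 6.0 cm/s  -- the first robot, for reference
print(speed(60, 20))   # 3.0 cm/s  -- shorter distance, same time: slower
print(speed(60, 5))    # 12.0 cm/s -- shorter distance, much shorter time: faster
```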

None of these productive and revealing discussions occurred in previous years with the buggies, because the speed of the second buggy was set, and the students did not have to determine it themselves. The more I can give students a task to work on independently, the more I am able to learn by listening in on their discussions. Had I just told students about the concept of position, I would never have imagined the number of different issues the students would have to think through in order to really get it.

For example, I’m sure that in my pre-modeling days I just assumed that students could see the equation speed = distance/time as a relationship among dynamic quantities, perceiving how one would increase or decrease due to changes in another. In fact, I depended on this background understanding as a lever to help them understand the bigger picture. The many students who lacked this background could not have hoped to understand my lecture, and instead would have fallen back on mechanically plugging numbers into the equation as a survival strategy.

Because I was able to direct the students less and listen more, the robot lab revealed several more dimensions to the concepts of position and speed. The robot-head-start task has given many of my students a head start in conceptual development compared to previous years.
