By Jason Oswald
Upper School Computer Science Department Chair
GA celebrated Computer Science Education Week during December 9-13 with a series of events called ga.codes(). The core concept that drove the planning of this event was that older students would teach younger students. When I explained to my AP Computer Science Principles students that I wanted them to teach first graders the chicken dance, they, to put it lightly, had questions. But the truth is that, at a certain level, computer science is the reorganization of certain skills and abilities into one field. We want first graders (and everyone, really) to practice breaking problems down into smaller tasks, grouping related things, and doing things in a particular order. If you can teach someone the chicken dance in that way, you’re also teaching them to think like a computer scientist. One of my favorite moments from the week came after teaching a class the chicken dance: we asked the students why we were there doing this at all, and they told us it was like the coding they’d been doing in class. We teachers live for such “a-ha!” moments.
Once students can use code to create the things they imagine, the next step on the continuum is to have them interact with the world in some way. In the Lower School, students take the computer science ideas they’ve been learning and apply them to robots, which is a fantastic transition in so many ways. The robots are equipped with a variety of sensors and abilities, so there is built-in scaling in these activities. But with ga.codes(), we wanted to shake things up a bit, so for the 3rd-5th graders we planned a Machine Learning lesson.
The lesson was centered on a tool developed by Google called the Teachable Machine (version 1). It is elegant in its simplicity. You train the machine on three sets of visual data, interpret what those things mean for the machine, and define your outputs. Then you feed the machine new inputs and hope it interprets them correctly. If it doesn’t, you think about why, adjust your training data, and try again. We started by training the machine with me giving “thumbs up,” “thumbs down,” and “OK” hand gestures; then I asked a student to come try the machine out. When a particular gesture didn’t work, it took the students about half a second to come up with possible explanations: my hand is bigger than theirs, I used my right hand and they used their left, and so on. The big breakthrough of the lesson was asking them how to solve this problem. Once they realized they needed a wider variety of data to solve more complicated problems, they were off to the races. Perhaps teaching Machine Learning at that age seems a little bold, but when a 3rd grader can start to think about bias in training data, that seems like a win.
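For readers who want to see the train-classify-adjust loop in miniature, here is a toy sketch of the same idea. This is not how the Teachable Machine works internally; it is a simple nearest-centroid classifier, and the two-number “features” are invented stand-ins for real image data. The point is only to show how narrow training data causes a miss, and how widening the data fixes it.

```python
# Toy sketch of the Teachable Machine loop: train on labeled examples,
# classify a new input, then widen the training data when it misses.
# The 2-number "features" below are invented stand-ins for image data.

def centroid(samples):
    """Average each feature across one class's training samples."""
    return [sum(vals) / len(vals) for vals in zip(*samples)]

def classify(example, training):
    """Return the label whose class centroid is nearest the example."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    centroids = {label: centroid(s) for label, s in training.items()}
    return min(centroids, key=lambda label: dist(example, centroids[label]))

# Train on one person's gestures only (like the teacher's right hand)...
training = {
    "thumbs_up":   [[0.9, 0.1], [0.8, 0.2]],
    "thumbs_down": [[0.1, 0.9], [0.2, 0.8]],
    "ok":          [[0.0, 0.0], [0.1, 0.1]],
}

# ...so a different person's "thumbs up" (left hand, smaller) misses.
student_gesture = [0.4, 0.3]
print(classify(student_gesture, training))  # → ok (misclassified)

# The fix the students found: add a wider variety of training data.
training["thumbs_up"].extend([[0.5, 0.3], [0.45, 0.25]])
print(classify(student_gesture, training))  # → thumbs_up
```

The same loop the 3rd graders worked through is visible here: the classifier is only as good as the variety of its training data, which is exactly where the conversation about bias begins.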
In 6th grade, we introduced a tool called CoSpaces Edu, which allows students to work in 3D to create scenes, models, and other constructions that can be programmed and viewed in AR (Augmented Reality) and VR (Virtual Reality). It was a brief introduction to the tool, but an initial goal is for students to use it to build an interactive model of the cell. Imagine putting on a pair of virtual reality goggles to find yourself floating in a cell, then poking the endoplasmic reticulum and watching ribosomes be released. Now imagine being the student who put that together: what level of understanding of the material is behind the construction of such a model? We’re excited to see!
At the beginning of December, The New York Times published a visualization of air quality data from the past year. In the regular version of the report, partially opaque black dots float on the page. It shows you what a normal amount of dots looks like, then the worst day in the past year for your area, then San Francisco last year during the Camp Fire, and then New Delhi on its worst day of the year. The density of the dots increases in two dimensions as you scroll through the report, and it is a good visualization. But there is also an AR feature that makes the dots float around in your personal space as you move your phone. It isn’t hard to see which is more powerful.
I recently tasked my AP Computer Science Principles students with seeking out a data set they were interested in and creating a visualization of it. They shared those visualizations with 7th graders during CS Education Week, and while some were merely informative or insightful, others presented troubling information about obesity rates, heart disease, and overdose rates. Uncovering a problem through exploration is a doorway to thinking about solutions. Armed with self-generated insights, might we not engage students to work toward developing solutions to these problems? And if they found these problems using code, might it not occur to them to wield those same tools in seeking out solutions?
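A minimal version of that exercise can fit in a few lines of code. The numbers below are invented placeholders, not any student’s real data, and the plain-text bar chart stands in for the charting libraries or spreadsheets students would actually use; the idea is just that a little code turns raw numbers into something you can see and question.

```python
# A minimal "find a data set, visualize it" exercise.
# The values are invented placeholders, not real health data;
# in class, students would load a real data set (e.g. from a CSV).

def text_bar_chart(data, width=40):
    """Render label/value pairs as a scaled, plain-text bar chart."""
    longest_label = max(len(label) for label in data)
    biggest_value = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(width * value / biggest_value)
        lines.append(f"{label.ljust(longest_label)} | {bar} {value}")
    return "\n".join(lines)

# Hypothetical counts, for illustration only.
sample_data = {"Region A": 12, "Region B": 30, "Region C": 21}
print(text_bar_chart(sample_data))
```

Even a chart this simple invites the question a spreadsheet of raw numbers rarely does: why is one bar so much longer than the others?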
The ability to solve authentic problems is the final step in our continuum of understanding. I spoke to a student recently who wants to use Machine Learning to develop a screening process for a particular disease based only on image data. Imagine that. In the amount of time it takes you to unlock your iPhone with your face, that same iPhone could tell you that you are at risk. I’ve had a number of similar conversations. Students want to create programs to work on writing Chinese characters, learn about the solar system, analyze the stock market– whatever their interests are, they are finding ways to connect their learning to those interests and to develop solutions in those fields.
Our students today are sitting in class with unfettered access to the greatest repository of information ever created, with hardware and software unfathomable when they first started school, and with the desire to create, connect, and help. And as much as I would love for them to program games for their calculators, they can do so much more, and we need them to. We need them to look at something that seems complicated at first, like the chicken dance, and know they have the ability to break that problem down into parts. We need them to leverage machines and technology to solve complex problems using high-level tools, and to understand when those machines might behave in unexpected ways and how to correct them. We need them thinking, literally, in higher dimensions. We need them ready to solve the problems in front of them with the tools available and the tools yet to be created.
During World War II, and after earning her Ph.D. in mathematics from Yale, Grace Hopper tried to enlist in the Navy, but was rejected because of her age (34). She joined the Navy Reserves, and eventually began work on some of the very early computer systems. She had the crazy notion that we should be able to program computers using natural language and that they were capable of more than just arithmetic. Her ideas and working examples were rejected until a few years later when they became the foundation of COBOL, a language that became so ubiquitous that it is still used in production environments today– sixty years after its inception. That is mind-boggling. How many pieces of technology do you use that are sixty years old? She eventually became a Rear Admiral and was posthumously awarded the Presidential Medal of Freedom. Computer Science Education Week happens at the time it does in recognition of Grace Hopper’s birthday, December 9, 1909.
“The most important thing I’ve accomplished, other than building the compiler, is training young people. They come to me, you know, and say, ‘Do you think we can do this?’ I say, ‘Try it.’ And I back ’em up. They need that. I keep track of them as they get older and I stir ’em up at intervals so they don’t forget to take chances.” – Grace Hopper