You get ready to go to work in the morning, go into your garage, open the door, pick up the newspaper from the driveway, get in the car, and say:
"Take me to work."
The car backs out of your driveway, pulls onto the road, and heads toward work. You're sitting in the back seat (no driver up front!) reading the paper.
At the STOP sign near your block, you say:
"Stop by the Starbucks on the way"
The car pulls into a Starbucks; you go in, get your favorite latte, hop back in, and off you go!
A couple of weeks back I went over to UPenn to watch the DARPA Urban Challenge race being webcast live from Victorville, CA. There were about 60-70 people in the room (we had to move to a bigger room!) watching the events unfold. There was loud cheering each time Penn's entry (Little Ben) appeared on screen. Five to ten minutes into the race, Little Ben ran into a snag: it simply came to a halt while trying to take a left turn onto a major road, and DARPA had to stop all the other cars for 5-10 minutes. The commentators wondered out loud whether that was the end of Little Ben. Half the people in the room walked out dejected.
Among the many changes in U.S. policy after 9/11 was one that went unnoticed by everyone except a few geeks: the military quietly reversed its longstanding position on the role of robots on the battlefield, and now embraces the idea of autonomous killing machines. There was no outcry from the academics who study robotics—indeed, with few exceptions they lined up to help, developing new technologies for intelligent navigation, locomotion, and coordination. At my own institute, an enormous space is being outfitted to coordinate robotic flying, swimming, and marching units in preparation for some future Normandy.
Two computer scientists have found an interesting difference between how men and women use software. From an MSNBC report:
Laura Beckwith, a new computer science Ph.D. from Oregon State University, and her adviser, Margaret Burnett, specialize in studying the way people use computers to solve everyday problems — like adding formulas to spreadsheets, animation to Web sites and styles to word processing documents.
A couple of years ago, they stumbled upon an intriguing tidbit: Men, it seemed, were more likely than women to use advanced software features, specifically ones that help users find and fix errors. Programmers call this "debugging," and it's a crucial step in building programs that work.
In the past, the Franklin Institute has invited us here at Bryn Mawr College to participate in demonstrations of our interesting robotics projects. We have always been very happy to take a group of robots on a nice Saturday morning in the Fall and have some fun showing kids of all ages our toys, er, I mean, "research opportunities."
However, this year I am hesitating. This year, the FI is bundling their robot demonstrations with an event called Robot Conflict. They describe it this way:
Bryn Mawr College Department of Philosophy, Department of Computer Science, The Center for Science in Society, and the Delaware Valley Distinguished Lecture Series in Computer Science present:
William J. Rapaport
University at Buffalo
Title: Philosophy of Computer Science
William J. Rapaport is an Associate Professor in the Department of Computer Science and Engineering, an affiliated faculty member in the Departments of Philosophy and of Linguistics, and a member of the Center for Cognitive Science, all at the State University of New York at Buffalo. His research interests are in cognitive science, artificial intelligence, computational linguistics, knowledge representation and reasoning, contextual vocabulary acquisition, philosophy of mind, philosophy of language, critical thinking, and cognitive development. His research has been supported by the National Science Foundation and the National Endowment for the Humanities.
Here at the IPRE, we are working on developing hardware, software, and course materials based on robots for use in teaching introductory computing courses.
One aspect of the project that we are very conscious of is how the students might perceive robots in the classroom. One of our goals is to develop materials that will attract students into computing. If we use a device that some students find alienating, then we will, of course, have failed. So, we are sensitive to such perceptions, specifically those that have gender correlations.
How can one develop materials that are sensitive to gender biases? The same way you write good software: you test. Feedback is the only way to know for certain; then you revise and test again. We all have biases, and even when we are aware of that fact, these biases can still pop up and have adverse effects.
Bryn Mawr College and the Georgia Institute of Technology were awarded a grant by Microsoft to develop a curriculum for computer science using robots as a learning tool. Both colleges designed and implemented an introductory computer science course based on this premise. Georgia Tech continued to offer its normal introductory class as well. At the conclusion of all Bryn Mawr and Georgia Tech introductory classes, students completed a survey about their experiences in the course, as well as some basic personal information (e.g., gender, ethnicity) and background in programming and robotics. The survey included fifty-two questions about the course, although the non-robot classes (all at Georgia Tech) received a twenty-one-question subset of the survey, excluding all questions referring to robots in the class.
I’ve just read two light-hearted, comical articles. One (“Elvis Lives On as a Robotic Head…”) concerns a newly released product: a robotic bust of Elvis, capable of singing and speaking in his trademark voice. The other (“I am rubbish at Scrabble - but playing it online has taught me how to be really good at cheating”) describes the author’s use of a Scrabble-solving engine when playing online Scrabble (though he comes to realize that it’s more enjoyable to play using his own insights than to act as a slave to a computer program).
On August 1st, my much-worked-on survey was finally closed, and it was time to analyze the results. I must say that many of the survey results paralleled my hypotheses: respondents agreed that robots do, in fact, make computer science more interesting; only a small fraction of their class would have enrolled in computer science when it was first offered; over 90% of their classes were male; and one of the major reasons women tend not to enroll in computer science courses is that they find the environment uncomfortable. Sadly, I am still searching for the reason women find this environment uncomfortable. Many would say it is because computer science is a male-dominated field, but why is the field male-dominated in the first place? The search continues...