RoadRider is a training gadget for road cyclists. The system consists of visual and audio displays mounted on the bicycle handlebars and on the rider’s helmet.
Using the system, the rider can compete against his previous sessions or against sessions of other riders. During the ride, the rider’s speed is continuously sampled and recorded. The recorded sessions are available for future use and are listed in the system’s graphical interface. At the beginning of each session, the rider selects the session he wishes to compete against.
The rider’s current speed is constantly compared with the speed of his competitor (e.g. his previous session). Based on his performance, the rider receives real-time audio-visual feedback, giving him an accurate indication of his position relative to the competitor and to the length of the track.
The main aspects RoadRider aims to provide for the rider are information and motivation. Due to the demanding nature of competitive cycling, it was important that the system provide the rider with ongoing information while keeping his undivided attention on the road ahead. For that reason, we decided to keep the visual display to a minimum and to deliver most of the information as 3D audio. We explored different audio tracks, manipulations and combinations, hoping to achieve an informative and motivating result that gives the rider the sense of a real race environment.
How does it work?
The application was constructed of several parts:
* The Phidget – received an input from the bicycle wheel every time the wheel completed a full turn.
* The Flash – collected the data from the Phidget and from the saved files, performed calculations on the data, created the visual output and sent a string of values to the MAX/MSP component.
* The MAX/MSP – played the different sounds according to the values it received from the Flash component, and was also responsible for saving the data.
The communication between the Phidget component and the Flash component was done through the Phidget’s webservice, with the Flash listening on the webservice port. For the Flash-to-MAX/MSP connection, we used a custom object called flashserver. This object listened on the port that the Flash broadcast on and passed the values to the MAX/MSP.
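The actual relay was the flashserver external inside MAX/MSP, but the idea – a server listening on a TCP port, accepting the Flash connection, and handing each received value string on – can be illustrated with a minimal Python sketch. The function name, port handling and message framing here are our own assumptions, not part of the project code:

```python
import socket

def relay_values(port, handler, max_messages=1):
    """Accept one client connection on `port` and pass each
    newline-terminated value string to `handler` (standing in for
    the MAX/MSP side of the original flashserver link)."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(1)
    conn, _ = srv.accept()
    buf = b""
    received = 0
    while received < max_messages:
        chunk = conn.recv(1024)
        if not chunk:
            break  # client closed the connection
        buf += chunk
        # split the stream into complete messages
        while b"\n" in buf and received < max_messages:
            line, buf = buf.split(b"\n", 1)
            handler(line.decode())
            received += 1
    conn.close()
    srv.close()
```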
The Flash component first opened a names file, which contained the names of all the files currently saved. Each file contained a list of numbers representing the different speeds of a previous ride. For the purpose of the demo, we collected about 30 seconds of data, resulting in about 30 values saved in each file. The user chose a rider to compete against, and the Flash would open the corresponding file, read its contents and create a competitor speed array from the data.
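In pseudocode terms (the original was ActionScript), the session-file handling amounts to something like the following sketch. The exact file format is an assumption: a names index with one saved-session filename per line, and each session file holding one speed value per line:

```python
def load_session_names(names_path):
    """Read the index of saved sessions shown in the selection screen.
    Assumed format: one filename per line."""
    with open(names_path) as f:
        return [line.strip() for line in f if line.strip()]

def load_competitor_speeds(session_path):
    """Build the competitor speed array (one reading per second).
    Assumed format: one numeric speed value per line."""
    with open(session_path) as f:
        return [float(line) for line in f if line.strip()]
```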
Once the race started, the Flash began receiving sensor data from the Phidget, counting every full wheel turn, each of which represented the rider advancing about 2 metres. Once every second, the number of wheel turns was collected, yielding the current speed in metres per second. From this we derived the current and average speed, distance covered, acceleration etc., which were compared to the competitor’s values.
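The per-second bookkeeping can be sketched as follows. The ~2 m wheel circumference and the one-second sampling window come from the text; the class and method names are illustrative, not taken from the project:

```python
WHEEL_CIRCUMFERENCE_M = 2.0  # one full wheel turn ≈ 2 metres (from the text)

class SpeedTracker:
    def __init__(self):
        self.turns_this_second = 0
        self.speeds = []           # one m/s reading per elapsed second
        self.total_distance = 0.0  # metres

    def on_wheel_turn(self):
        """Called once per Phidget sensor event (one full wheel turn)."""
        self.turns_this_second += 1

    def on_second_elapsed(self):
        """Called once per second: convert the turn count to a speed sample."""
        speed = self.turns_this_second * WHEEL_CIRCUMFERENCE_M  # m/s
        self.speeds.append(speed)
        self.total_distance += speed  # metres covered in this one second
        self.turns_this_second = 0
        return speed

    @property
    def average_speed(self):
        return sum(self.speeds) / len(self.speeds) if self.speeds else 0.0
```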
We compiled a set of scenarios that dictated which sound would be played, and at what volume, when the rider was ahead of, behind, or next to the competitor. About a dozen different sounds were used, and to control them, a string of values was aggregated according to the scenario. If a sound wasn’t used, a ‘0’ was sent as that sound’s volume; if a sound was to be played at full volume, a value of ‘100’ was sent, and so on. Most sounds had up to four channels – front right, rear right, front left and rear left – to create a 3D sound experience. Therefore, if a sound had four channels, the string contained four separate volume commands. The value string was then sent to the MAX/MSP component via serial output.
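As an illustration of the aggregation step, the sketch below maps a rider position to per-channel volumes for a single competitor sound and flattens them into the space-separated value string. The channel order, the specific volume levels and the scenario names are our own assumptions; the original used about a dozen sounds, each contributing its own group of values:

```python
# Assumed channel order: front-left, rear-left, front-right, rear-right.
SCENARIOS = {
    # rider position relative to competitor -> channel volumes (0-100)
    "ahead":   [0, 100, 0, 100],   # competitor heard behind the rider
    "behind":  [100, 0, 100, 0],   # competitor heard ahead of the rider
    "next_to": [50, 50, 50, 50],   # competitor heard alongside
}

def build_value_string(position):
    """Flatten the four channel volumes into the space-separated
    string sent on to the MAX/MSP component."""
    volumes = SCENARIOS[position]
    return " ".join(str(v) for v in volumes)
```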
Figure: scenario-to-volume-and-channel mapping.
In addition, a visual display was created, showing both riders’ average speed and total distance, as well as an animation of their relative positions, in which the leading rider was shown above the other. This display was also updated every second.
Once the race was over, the Flash collected the rider’s per-second speed readings into another string and sent it to the MAX/MSP to be saved. The MAX/MSP component had opened all the sound files in advance and played each file according to the value it was sent. For example, the values “75 25 0 0” meant that the front left channel was to be played at 75% volume, the rear left channel at 25%, and both right channels were to be silent, giving the impression that the sound came from the left and slightly ahead. As mentioned before, the MAX/MSP component also handled the file saving process.
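Decoding on the receiving side is the mirror image of the aggregation: split the string and scale each value to a gain. A small sketch, with channel names and ordering assumed rather than taken from the original patch:

```python
def parse_value_string(values):
    """Interpret a string like '75 25 0 0' as per-channel gains.
    Assumed channel order: front-left, rear-left, front-right, rear-right."""
    names = ["front_left", "rear_left", "front_right", "rear_right"]
    return dict(zip(names, (int(v) / 100.0 for v in values.split())))
```

For the example in the text, “75 25 0 0” yields a 0.75 gain front-left, 0.25 rear-left and silence on both right channels.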
All three components ran on a single Windows XP machine through their desktop clients, downloaded from the developers’ sites. All the files were saved under one folder: the MAX/MSP .maxpat file, the Phidget Flash drivers, the Flash .fla and .swf files, the media files (sounds and images) and the text files that were read by the Flash component and written to by the MAX/MSP component. The MAX/MSP custom objects were placed in the Program Files folder for faster integration. The .maxpat file was placed inside the media and text files folder, and the main folder’s path was added to the MAX/MSP client’s File Preferences list. A crossdomain.xml file was also placed inside the main folder to bypass Adobe Flash’s security restrictions.
Version issues: During the integration of the different components, we discovered several issues caused by the platforms themselves. The MAX/MSP version we originally used, 5.1, had a bug that caused it to lock the entire folder it was writing to, blocking any attempt to read from the files it used, to save into the folder, or to re-compile the .fla file. Updating to MAX/MSP 5.1.3 resolved that issue.
Similarly, the Flash component refused to read values from the Phidget, due to security features introduced in the Adobe Flash CS4 release. Downgrading to Adobe Flash CS3 resolved that. In both cases, debugging was an arduous task, since no faults could be found in the code itself.
Installation/Configuration: The project required several unusual elements, especially in the connectivity of the various hardware components. The main machine had to have a USB port for the Phidget, a 3D-capable sound card with 3-point outlets, and a way to communicate with the hand-held machine. Another issue was that to properly test the project, it had to be fully constructed, so many of the errors only surfaced in the last days, once all the components were assembled. This also made those errors hard to fix, as we couldn’t disassemble the parts to test them in isolation, forcing us to work on fixes while the parts were in “display mode”.
Interoperability: Another major debugging hurdle was that, while each component worked by itself, most of the issues arose at the interfaces between them – a component would receive a message that wasn’t what it expected, and would fail to parse it or parse it incorrectly. However, the component reported the error as if it had originated locally, and as a result it took us longer to realise that the fault lay with another device.
IO Security: Due to design restrictions, Adobe Flash can read information from files but cannot write to them. MAX/MSP, on the other hand, can write to files but has only a cumbersome read capability. We resorted to reading with the Flash component and writing with the MAX/MSP component, which is considered bad practice and created several bugs.
Connectivity: The entire system was connected by various cables, some of them extended, and wifi connections, all of which were prone to faults. We had a major issue with the 3D earphones, which didn’t map correctly to the 3D channels in the sound, and the connection between the main machine and the hand-held was disrupted by a wifi outage during the show.