Search, Actuate & Navigate

Ambient Plotter

Our goal is to use a robot arm as a plotter that draws images based on textual input. The images are generated through interpretation filters, which convert emotions found in the text into colours and shapes in the drawing.

Most of the technical problems had been solved beforehand. In the preceding three weeks we were taught how to control the robot arm. Some of the code we used was written the previous year on the same premise, and a GUI for testing was already available, as was the hardware.

Technical problems

There were some problems with the existing hardware and software, though. The old code was somewhat unfinished: it could not switch marker colours with the robot arm. We fixed this on Wednesday and refined it on Thursday. The list containing the gripper positions (abstract arm movements) was not updated correctly, and there was no check for whether the gripper was already holding a marker (as is the case right after initialization). See the code for documentation.
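The missing marker check could be handled with a small piece of state. The sketch below is hypothetical (the class and method names are not from the actual code); it only illustrates the idea of tracking whether a marker is held, so a colour switch knows whether it must return the current marker first.

```java
// Hypothetical sketch of the gripper-state check described above;
// names are illustrative, not taken from the real code.
public class GripperState {
    private boolean holdingMarker = false; // false right after initialization

    // Record pick-up and put-down of a marker.
    public void pickUp() { holdingMarker = true; }
    public void putDown() { holdingMarker = false; }

    // The check that was missing: before switching colours, only
    // return a marker to the board if we are actually holding one.
    public boolean mustReturnMarkerBeforeSwitch() {
        return holdingMarker;
    }
}
```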

There were also some real-world problems with the wooden board holding the markers, and with the paper. We solved these the easy way: by holding them in place by hand.

Our addition was to incorporate the text parser and map words to shapes and colours. Understanding the existing code, where to put our additions, and how to call the existing functions was mostly a matter of patient reading. The real difficulty lies in how to represent meanings and how these relate to shapes and colours. The easy solution is if-statements in Java, for example `if (bevat("love")) drawHeart();`. We did not get to incorporating a real parser with an accompanying meaning ontology, which probably would have been the best solution. We did write some helper functions to make it easier to add new shapes and keep the code readable, including bevat(), addLine(), and drawHeart()/drawHappy()/etc.
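The keyword-to-shape mapping can be sketched as follows. This is a minimal illustration of the if-statement approach, not the actual project code: the class name, the shape labels, and the interpret() wrapper are assumptions, and bevat() (Dutch for "contains") is reimplemented here as a plain substring check.

```java
// Minimal sketch of the keyword-to-shape mapping described above.
// Class name, shape labels, and interpret() are hypothetical.
public class TextToShape {
    // bevat() ("contains"): does the input text mention the keyword?
    static boolean bevat(String text, String keyword) {
        return text.toLowerCase().contains(keyword);
    }

    // The if-statement mapping: first matching emotion word wins.
    // Returns a shape label instead of driving the arm directly.
    static String interpret(String text) {
        if (bevat(text, "love")) return "heart";   // would call drawHeart()
        if (bevat(text, "happy")) return "sun";    // would call drawHappy()
        return "line";                             // fallback: would call addLine()
    }

    public static void main(String[] args) {
        System.out.println(interpret("I love robots"));
    }
}
```

A real parser with a meaning ontology would replace interpret() while the drawing helpers stay the same, which is why keeping the mapping in one place helps readability.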