Inspected the whistle detection dataset as described in the RoboCup Symposium paper. Had to install g++ multilib with the command sudo apt-get install gcc-4.8-multilib g++-4.8-multilib to be able to compile the code (it reads 32-bit data on a 64-bit machine). The main program reads the file freqData_ideal, which contains 2005 samples, half of them identified as target (to be learned), the other half as non-target (resp. 1005/1000 samples).
Inspected the code from Niels; the accuracy (and F1-score) is calculated with the scikit-learn metrics module, which does not provide a standard deviation.
This article describes the relation between different statistical measures (and why retrieval scientists do not use standard deviations).
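For reference, a spread can still be reported by computing the F1-score per cross-validation fold instead of once on the whole set. A minimal sketch against the current scikit-learn API (the classifier and the number of folds are placeholders, not the setup from Niels' code):
    import numpy as np
    from sklearn.model_selection import StratifiedKFold
    from sklearn.metrics import f1_score
    from sklearn.linear_model import LogisticRegression  # placeholder classifier

    def f1_mean_std(X, y, n_splits=10):
        """Mean and standard deviation of the F1-score over the folds (X, y as numpy arrays)."""
        scores = []
        for train_idx, test_idx in StratifiedKFold(n_splits=n_splits).split(X, y):
            clf = LogisticRegression().fit(X[train_idx], y[train_idx])
            scores.append(f1_score(y[test_idx], clf.predict(X[test_idx])))
        return np.mean(scores), np.std(scores)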
Tried the alredballtracker_start.py example with a real ball (on a green field), but still no response. Added v = tracker.getTargetPosition(0) # FRAME_TORSO to the loop, but when I print this vector it is empty.
Added targets = tracker.getSupportedTargets() to the initialization; this nicely reported ['RedBall', 'Face', 'LandMark', 'LandMarks', 'People', 'Sound']
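For completeness, a minimal sketch of the loop described above (robot IP, port and the 6 cm ball diameter are my own assumptions). Note that getTargetPosition() returns an empty list until the target has actually been detected, which would explain the empty vector:
    import time
    from naoqi import ALProxy

    tracker = ALProxy("ALTracker", "192.168.0.99", 9559)  # IP/port are placeholders

    print(tracker.getSupportedTargets())     # ['RedBall', 'Face', 'LandMark', ...]
    tracker.registerTarget("RedBall", 0.06)  # target diameter in meters (a guess)
    tracker.track("RedBall")

    for _ in range(20):
        v = tracker.getTargetPosition(0)     # 0 = FRAME_TORSO; [] until the ball is seen
        print(v)
        time.sleep(0.5)

    tracker.stopTracker()
    tracker.unregisterAllTargets()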
October 4, 2014
Tried the walk to a goal, but the robot gives up its turn very quickly. Set collision to 'All disabled' and added a StandInit pose to the workflow.
Tried a Walk Towards, but now the robot keeps on turning. Tried to stop it with a Get Parameter-box combined with an If-box, but this didn't fire.
Also read the documentation of ALRedBallDetection. The documentation suggests looking at the code of the Walk Tracker Choregraphe box. Yet, this box is no longer available (just the Red Ball Tracker).
Testing the script, but it fails on motionProxy = ALProxy("ALMotion", robotIP, PORT), while the IP and PORT are correctly set. Also the old /tmp/say.py doesn't work.
Tried the code from the hello world tutorial, but also here the ALProxy fails on ALNetwork::getModuleByName: failed to get module ALTextToSpeech http://192.168.0.99:9559, while the advanced NaoQi page shows that both modules are active.
Installed the new 2.1.0.19 Python SDK and the problem is solved (python say.py works). Also python almotion_moveTo1.py works.
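The working calls boil down to something like the sketch below (the IP and the 0.3 m distance are my own placeholders, not the values from the SDK examples):
    from naoqi import ALProxy

    ROBOT_IP, PORT = "192.168.0.99", 9559   # placeholders

    tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
    motion = ALProxy("ALMotion", ROBOT_IP, PORT)
    posture = ALProxy("ALRobotPosture", ROBOT_IP, PORT)

    tts.say("Hello")                        # the say.py test
    motion.wakeUp()                         # stiffness on
    posture.goToPosture("StandInit", 0.5)
    motion.moveTo(0.3, 0.0, 0.0)            # walk 0.3 m straight ahead
    motion.rest()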
Created almotion_moveToBall.py, but GetBallPositionX is no longer a function. The release notes indicate:
ALFaceTracker and ALRedBallTracker are now deprecated. They are replaced by a new and more generic module: ALTracker.
Read the documentation of ALTracker. It contains an example alredballtracker_start.py, but the robot doesn't move (switch off the external anti-collision first?).
Added the collision disable to the follow-redball script, but still no reaction. Maybe tomorrow with better light / a better ball. A good replacement for GetBallPosition is probably ALTrackerProxy::getTargetPosition()
The robot had problems standing up, but finally it saw the ball:
October 3, 2014
Tried to walk, but no movement! Strange, the behavior seems to finish before any step is made. The second output indicates that the distance is still 1 m (but it fires). With obstacle avoidance the robot walks (although in place).
Anyway, the stiffness is no longer relevant; this probably has something to do with autonomous life.
The not-walking is a known issue. The solution is to add a Set External Anti-Collision box (Body Part = Move, Action = Disable). This solution seems to work.
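For the record, the same thing can presumably be done from a script (assumption: the Choregraphe box wraps this ALMotion call; the IP is a placeholder):
    from naoqi import ALProxy

    motion = ALProxy("ALMotion", "192.168.0.99", 9559)
    # Body Part = Move, Action = Disable
    motion.setExternalCollisionProtectionEnabled("Move", False)
    print(motion.getExternalCollisionProtectionEnabled("Move"))  # should print False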
October 1, 2014
Trying to connect Choregraphe 2.1 to Bleu running NaoQi 1.14.5, but got no connection. Started Choregraphe 1.14.5 and made the connection. Tried to download nao-yoga. Unfortunately, I was disconnected when the download was at 63%. The second try (after copying the install to the Tough Drive) worked.
Installed Choregraphe 2.1 in Programs/v2.1 instead of Programs/Aldebaran/v2.1.
Downloading nao-life and nao-life-channel. According to release notes, the behaviors are now managed by the package manager.
Via the option at the startup screen I imported the nao-yoga.crg into a project, and could upload that to Emerald. Yet, when I executed the first behavior (nao-yoga), I got a warning that the function ALAudioPlayer::loadFile could not be found.
Received the 4th version of the script for the opening of the Network Institute. Tried to play it on Rouge, but NaoQi v1.14 contains no Dutch.
Updated Rouge to v1.14.5 (equal to my version of Choregraphe). The update was successful, but the language choice remained English & French only. Connecting via the web interface to WebServices gave the message "Now able to speak Dutch", yet I still got the same error message (does not support the following language). Maybe I should try a reboot. A reboot was not needed; I was able to select Dutch from the web interface.
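The language can also be checked and switched from a script (assumption: this mirrors what the web interface does; the IP is a placeholder):
    from naoqi import ALProxy

    tts = ALProxy("ALTextToSpeech", "192.168.0.99", 9559)
    print(tts.getAvailableLanguages())   # should now include 'Dutch'
    tts.setLanguage("Dutch")
    tts.say("Hallo, ik spreek nu Nederlands")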
Speech goes well, yet Rouge does not perform the moves. Tried to download nao-life-channel from Bleu, but got a complaint that the project doesn't contain a behavior file. On Bleu the behavior Hungry1 functions fine. Hungry is only performed after the sentence (and also contains an 'mmm').
The second sentence is synchronized; only Laugh is quite wild (and contains a laugh sound). Removed the sound (the length of both sequences is the same).
Hurt 1 is also a bit too busy.
September 13, 2014
Added movements to the Network Institute script by opening one of the behavior files in the project content area. A new tab with an AnimationLibrary becomes available. Most movements take longer than the sentence, so I put a stop on the behaviors.
September 11, 2014
Made several Pickup requests. Tested Rouge, Bleu and Cyan; Tai Chi is performed fine. Could not find the GitHub repository with the hardware status.
Tried to reach the head Nao1 with the BHuman code. Connected the head directly with a cable to the server; the Ethernet link is green on both sides. Still, the device doesn't show up in the Netgear wizard.
Made the different sentences by creating a new project, copying the sentence into that project, saving the project and uploading it to the robot (which makes all 18 sentences playable). Named them starting with the strings 01-18, so that they are available in the right order.
September 9, 2014
Preparing the opening of the Network Institute. Bleu has NaoQi 1.14.1; my Choregraphe is version 1.14.5. Could set the language to Dutch via the web interface.
First attempt worked partly. First words are recognized ("Oh nee"), but those are not specific enough (better add "interview"). Another problem is that a second VoiceRecognizeBox doesn't work, because the ASR module is still running.
The ROS nao driver seems to have a solution, with stop_recognition / reconfigure.
Added self.onUnload() when a word is recognized, but the ASR is still running.
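A possible workaround, sketched outside Choregraphe (assumption: the box keeps an ALSpeechRecognition subscription open; the subscriber name, IP and the second vocabulary are placeholders):
    import time
    from naoqi import ALProxy

    ROBOT_IP, PORT = "192.168.0.99", 9559

    asr = ALProxy("ALSpeechRecognition", ROBOT_IP, PORT)
    memory = ALProxy("ALMemory", ROBOT_IP, PORT)

    asr.setLanguage("Dutch")
    asr.setVocabulary(["oh nee", "interview"], False)
    asr.subscribe("OpeningScript")           # start recognition
    time.sleep(5)
    print(memory.getData("WordRecognized"))
    asr.unsubscribe("OpeningScript")         # stop the ASR before reusing it

    asr.setVocabulary(["ja", "nee"], False)  # second recognition now starts cleanly
    asr.subscribe("OpeningScript")
    time.sleep(5)
    print(memory.getData("WordRecognized"))
    asr.unsubscribe("OpeningScript")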
The tutorial suggests using the Choice box. Yet, this box explains the options you have.
July 9, 2014
Tried to install Bleu's head on the new Nao body from Maastricht, but the software complained that the version (1.14.5) was not recent enough for the body.
July 7, 2014
Tested the say.py script which was generated by the interactive site. When I modified the code to ttsProxy = ALProxy("ALTextToSpeech","192.168.1.4",9559) the script worked, although according to the documentation the same call with one argument should also work.
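A sketch of the two forms (assumption: the one-argument form only works once a local ALBroker is connected to the robot, which the three-argument form doesn't need):
    from naoqi import ALBroker, ALProxy

    ROBOT_IP, PORT = "192.168.1.4", 9559

    # Three-argument form: connects directly to the robot.
    tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
    tts.say("Hello")

    # One-argument form: needs a local broker that knows the robot first.
    broker = ALBroker("myBroker", "0.0.0.0", 0, ROBOT_IP, PORT)
    tts2 = ALProxy("ALTextToSpeech")
    tts2.say("Hello again")
    broker.shutdown()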
Tried to start the same script from the commandline on nb-udk. Changed PYTHONHOME from v1.14.1\simulator-sdk-1.14.1-win32-vs2010 to C:\Programs\Aldebaran\v1.14.5\Choregraphe\lib. Still ImportError: dynamic module does not define init function.
My hypothesis is that the debug dll is not available in the Choregraphe directories, so I installed naoqi-sdk-1.14.5 on nb-udk.
The trick was not to install the C++ SDK, but the Python SDK. This installer puts the naoqi libraries in C:\Programs\Python2.7\Lib\site-packages.
Tried to reproduce this in VirtualBox. I had an Oracle VirtualBox with a NaoQi image of v1.14.5 already configured. I could connect to the VirtualNao, but got the error that it failed to get module ALTextToSpeech on 192.168.1.53.
Downloaded Webots 7.4.3 (90-day trial) from Cyberbotics. The license costs 320 CHF (edu) / 2300 CHF (research). Yet, it has an interface to both the Nao and the youBot.
Copied the project BHuman2013 from Eugenio. Issued in Make/Linux the command make SimRobot. The resulting executable is Build/SimRobot/Linux/Develop/SimRobot. Opened the scene Game2013Fast. Could inspect the behaviorControl of robot1.
May 7, 2014
Should ask to include the robolab in the Network Institute reservation form. Maybe mail news@networkinstitute.org.
April 15, 2014
Read an article about the HUMAVIPS project, which used a stereo head for the Nao. The stereo software is available at INRIA. The Visual SLAM code is behind a password.
February 17, 2014
Bleu doesn't want to start up. The chest button only lights up green briefly.
Tested Rouge: it has NaoQi 1.14.0 but has no problem performing Tai Chi (except for the speaker; the music failed a few times).
Opened Bleu's head and loosened the head connection a bit. Now the Nao starts again. Closed the head and made NaoLife the default behavior.
Bleu reported that it updated NaoLife, but it doesn't start automatically.
Bleu has NaoQi 1.14.1. The debug window gives some warnings ('NaosLife/Tracker/Stop' does not exist in ALMemory!; the event '/nao-life/language' does not exist in ALMemory!).
After a second restart Bleu starts in NaoLife. Made a direct connection. Both nb-udk and Bleu got an address in the 169.254 range with subnet mask 255.255.0.0, so Bleu was visible in Choregraphe.