Get the source

GIT repositories

HBot repositories are hosted on gitorious. To get anonymous read access to the current version, run:

git clone git://

If you got the code from the git repository rather than from a release, do not forget to run autoreconf to generate the configure script.
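For a fresh git checkout, the bootstrap typically looks like this (a sketch; the exact flags depend on your autotools setup):

```shell
# regenerate the configure script and Makefile.in files from the autotools sources
autoreconf --install
# then configure and build as usual
./configure
make
```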



Latest release:

Tarball     | Date     | Main features                         | Release Note
HBot v0.8.0 | 22/08/13 | valgrind, lego nxt2, opencv, rootooth | README.0.8.0.txt


On host PC (simulation)

HBot can be compiled with some optional libraries:

  • CGAL and potrace are used for vectorial map support (in a debian-based distribution, the package name is libcgal-dev; potrace needs to be compiled from source to obtain the development files)
  • libpng for creating images from maps (package name: libpng3-dev)
  • Player/Stage for the robotics HAL; without it, you will not have access to the Stage simulator (robot-player and stage packages).
  • Boost Graph Library for A* pathfinding in vectorial maps (libboost-graph-dev).
  • Boost for signals/slots and networking (boost::asio)

The HBot package also embeds various third-party projects such as lua, potrace, OpenSteer, libmindstorms and fann; they can be found in the “3rdparty” directory of the hbot source packages.

Prior to version 0.7.7, the project was using autotools. Starting with v0.7.7, you will need cmake.

Using autotools: if you got your code from the git repository, run autoreconf before anything else. The project is autotooled, so it should compile smoothly with “./configure && make”. Try “./configure --help” for option flags.

Using cmake: you should build out of tree, so create a build directory somewhere, then run “cmake <path_to_hbot> && make”
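The out-of-tree build boils down to something like this (a sketch; <path_to_hbot> stands for your source checkout):

```shell
# build outside the source tree to keep the checkout clean
mkdir build
cd build
cmake <path_to_hbot>
make
```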


Starting with v0.7.7, once you have compiled hbot for your machine, and provided you have the google test library, you can run the unit tests. Go to build/gtest and run gtesthbot.
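From the build directory, that is (assuming the binary name given above):

```shell
cd build/gtest
./gtesthbot
```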

Cross-compilation with OpenEmbedded

For cross-compilation and other nice things, I use OpenEmbedded. It helps a lot with preparing a kernel/rootfs, supports a wealth of platforms (including EZX and BeagleBoard) and has a growing user base. If you don't know anything about OpenEmbedded, I suggest that you first learn how to use it and install a basic console image on your device before starting cross-compilation of hbot and its dependencies (which may or may not compile, OE being grumpy from time to time).

I have made a git tree with an hbot overlay for OpenEmbedded. It contains recipes for hbot and for its dependencies that are not in OE (e.g. player, CGAL, potrace), and will eventually have a recipe for building an image with all the tools and configuration files needed to run the prototype. You can clone the tree with:

git clone git://

In order for OE to take the hbot recipes into account, you must edit your local.conf file and add something like:

# HBot files
BBFILES += "${HOME}/openembedded/oe-hbot/*/*.bb"

Then in your OE directory you can run

bitbake hbot

If OE bothers you with missing checksums, try this. Once a functional image is installed on your device, and your host PC's bluetooth interface is correctly configured (e.g. pand connected to your device, BNEP interface configured in the same subnet), connect to your robot and launch luabot.
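On the host side, the Bluetooth setup sketched above might look like this (a sketch assuming BlueZ's pand; the Bluetooth address and subnet are placeholders):

```shell
# connect to the robot's Bluetooth PAN (the address is a placeholder)
pand --connect 00:11:22:33:44:55 --role PANU
# give the resulting BNEP interface an address in the robot's subnet
ifconfig bnep0 192.168.2.1 netmask 255.255.255.0
```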

Play in simulator

You need hbot compiled for your computer, plus player/stage. Start player on a stage configuration file (I use “simple.cfg” from the stage world files), then start luabot on the playerconfig.lua file, and it should work!
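In two terminals, that might look like this (a sketch; the paths to simple.cfg and playerconfig.lua depend on your install):

```shell
# terminal 1: start the Player server with a Stage world
player simple.cfg

# terminal 2: start the hbot lua front-end against it
luabot playerconfig.lua
```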

  1. NB.1 You may have to remove the bumpers from playerconfig.lua, since bumpers are not supported in stage 3: I added them back and am waiting for upstream to merge my patch, but it may take some more time.
  2. NB.2 If you don't see the maps, try changing the repository path in playerconfig.lua, or create the path and put an empty “index.htm” file in it (by default, create a ./www directory and an empty ./www/index.htm).
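The default workaround from NB.2 can be set up with (assuming the default ./www path):

```shell
# create the default map repository path with an empty index file
mkdir -p ./www
touch ./www/index.htm
```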

With HBot-Eyes, you will have the opportunity to play with the simulated robot (either with UI buttons or with a gamepad), to construct a map of the simulated environment, and to try the pathfinder.


You can either run hbot on a PC (as explained above) or on-board (I am doing this on EZX). You can then control the robot from your host PC thanks to HBot-Eyes (using the IP address of the robot's bluetooth interface). When using luabot, you will have to give a script file as a parameter; when plugged into a roomba, use roomba.lua for example. Provided a web server runs on the robot's AI unit, the maps will be available over HTTP. Tweak the lua files to adapt them to your configuration.

### on the robot EZX, plugged to a roomba
$ luabot roomba.lua

### on host PC
$ cd hbot-eyes
$ ./hbEyes

### with your favorite browser, open URL 

You can also run hbot on your PC with a serial link to the hbot hardware controller (but beware: the serial cable will limit the robot's freedom of movement). You will have to change the serial-device line in roomba.lua, replacing the device with the host serial interface (e.g. /dev/ttyUSB0).


There is certainly a good idea that you would like to see in hbot. You can either ask the developers to do it, or better, do it yourself and propose your improvements back to the developers. I look forward to your improvements!

sources.txt · Last modified: 2013/08/22 15:43 by hadrien