Low-cost LIDARs! (4) Indoor measurement

Data comparison (indoor)

The video below shows a measurement taken in the living room at home. With use on a robot in mind, the sensing height was set to 15 cm. I am walking around the sensor.


The shape of the room is beautifully reproduced, and my walking feet appear as small semicircles.

Sweep (10Hz)

We can roughly make out the shape of the room. The feet are reflected, but their shape is unclear.

Sweep (3Hz)

This produces roughly the same resolution as RPLIDAR A2, so the shape of the room is recognizable. The shape of the feet is vague and hard to follow because the update is slow.

Indoor summary

For indoor use, RPLIDAR A2 seems to be the better choice in terms of angular resolution and distance measurement accuracy. In our experience so far, RPLIDAR A2 has worked perfectly for indoor SLAM in ordinary living rooms and offices.

Next time, we will look at outdoor measurement results.

Low-cost LIDARs! (3) ROS package

ROS compatibility

RPLIDAR A2 and Sweep each have a released SDK, which makes it possible to use their data from your own software. However, if you are using them on a robot, there is no reason not to use ROS. Here we will look at their ROS compatibility.

ROS driver code has already been created for both RPLIDAR and Sweep.

The Sweep driver has not been released as a binary package yet, so it needs to be built from source.

The RPLIDAR driver has already been released and can be installed with the apt command:

$ sudo apt install ros-kinetic-rplidar-ros

Starting the sensors

With the included launch files, both can easily publish their data for viewing in rviz.


$ roslaunch rplidar_ros view_rplidar.launch


$ roslaunch sweep_ros view_sweep_pc2.launch

Differences in messages

rplidar_ros outputs sensor_msgs/LaserScan messages.

sweep_ros, on the other hand, outputs sensor_msgs/PointCloud2. This is probably because Sweep’s measurement is not synchronized with rotation, as mentioned before. However, by using scan_tools/pointcloud_to_laserscan, PointCloud2 messages can be converted to LaserScan messages.
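The idea behind such a conversion can be sketched in a few lines of plain Python. This is a toy illustration, not the actual pointcloud_to_laserscan implementation: unsynchronized (angle, range) samples are binned into a fixed array of angular slots, which is essentially what a LaserScan message holds.

```python
def to_laserscan(samples, num_bins=360):
    """Bin unsynchronized (angle_deg, range_m) samples into fixed
    angular slots, keeping the nearest return per slot -- a toy
    version of a pointcloud-to-laserscan conversion."""
    ranges = [float("inf")] * num_bins
    bin_width = 360.0 / num_bins
    for angle_deg, range_m in samples:
        i = int(angle_deg % 360.0 / bin_width) % num_bins
        ranges[i] = min(ranges[i], range_m)
    return ranges

# Three samples near 0, 90.4 and 90.6 degrees; the two near 90
# degrees fall into the same 1-degree slot and the nearer one wins.
scan = to_laserscan([(0.2, 1.5), (90.4, 2.0), (90.6, 1.8)])
print(scan[0], scan[90])  # 1.5 1.8
```

Slots that receive no sample stay at infinity, which is how real LaserScan messages mark missing returns.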

Displayed in Rviz

I tried visualizing data in the same place.


With its default settings, RPLIDAR A2 gives data from which you can understand the shape of the room. We have also confirmed that it is good enough for map generation and autonomous robot navigation.

Here is Sweep rotating at the same 10 Hz in its default state.

Sweep(10Hz, sample_rate=500Hz)

Well, at the same rotation speed, the roughness of Sweep stands out. The resolution seems too low.

I noticed that the default sample rate of the sweep_ros node is 500 Hz, half of the maximum specification (1 kHz). I would prefer maximum performance by default; we set the sample_rate parameter to 1000 and tried again.

Sweep (10Hz, sample_rate=1kHz)

The level of detail has improved considerably. However, compared to RPLIDAR A2, the shape of the room is still hard to make out.

Next, let’s lower the scan (rotation) speed to 3 Hz.

Sweep (3Hz, sample_rate=1kHz)

This time I can understand the shape of the room. The update cycle is about three times longer than RPLIDAR A2’s, since the rotation speed has decreased.

Next, let’s compare the data with the video.

9/14 ROSCon Japan is coming!

ROS Developer Conference ROSCon will be held in Japan this year!

It will be held in Akihabara, Tokyo on September 14 (Friday), one week before ROSCon 2018 in Madrid, Spain. Mr. Brian Gerkey, founder of ROS, will also visit Japan and give a keynote address.

Applications for general lectures will begin in mid-April. Why not present your own ROS project on this occasion?

We are looking forward to it and would like to contribute to its success!

ROS Workshop Schedule in April 2018

Here is the schedule of our ROS workshop series in April 2018!

Apr. 05 Thu. 13:30- Introductory
Apr. 07 Tue. 13:30- Introductory
Apr. 26 Thu. 13:30- Introductory
Venue: Yurakucho, Tokyo

Apr. 10 Tue. 13:30- Introductory
Apr. 23 Mon. 13:30- Introductory
Venue: Nagoya, Aichi

Inquiries: info[at]opensource-robotics.tokyo.jp


Low-cost LIDARs! (2) Specification

Product Specifications

Let’s compare the specifications of the sensors.

Important precautions!

  • Be sure to check the specifications yourself when you actually use these sensors!
  • TORK cannot take responsibility if your project does not work out due to errors in this article!


Industrial LIDARs often measure distance with the ToF (Time of Flight) method, which calculates distance from the time until the laser light is reflected by the object and returns to the sensor. Sweep uses the ToF method. RPLIDAR A2, on the other hand, uses the principle of triangulation: the distance to the reflection point is obtained by measuring the displacement of the light receiving point on a sensor. The light emitting point, the reflection point and the light receiving point form a triangle, as you may already know.

Triangulation is generally simpler, but as the distance to the reflection point increases, the distance resolution decreases even while the reflected light can still be measured. ToF, by contrast, can achieve a larger maximum measuring distance, because its resolution does not decrease with distance (as long as the reflected light can be measured).
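The two principles can be written down as simple formulas. The following is a toy illustration with made-up optics parameters, not the actual sensor internals: ToF gives d = c·t/2, while triangulation gives d = f·b/x for focal length f, baseline b and spot displacement x on the detector, so at long range a tiny displacement error translates into a large distance error.

```python
C = 299_792_458.0  # speed of light [m/s]

def tof_distance(round_trip_time_s):
    """ToF: light travels to the target and back, so halve the path."""
    return C * round_trip_time_s / 2.0

def triangulation_distance(focal_len_m, baseline_m, displacement_m):
    """Triangulation: distance is inversely proportional to the
    displacement of the received spot on the detector."""
    return focal_len_m * baseline_m / displacement_m

# A 40 m target returns light after roughly 267 nanoseconds:
t = 2 * 40.0 / C
print(tof_distance(t))  # 40.0

# With (made-up) f = 20 mm and b = 50 mm, a 10 m target displaces the
# spot by only 0.1 mm -- the measurement gets very sensitive far away.
print(triangulation_distance(0.02, 0.05, 1e-4))  # 10.0
```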

There is another big difference. In RPLIDAR A2, sampling is synchronized with the angle; in other words, the angles of the laser measurements are the same for every rotation (even if there are subtle variations in practice). Sweep, on the other hand, is synchronized with the scan start at the 0-degree position, but each individual distance measurement is not synchronized with the angle. According to the Sweep specification, the data obtained from the sensor is a set of (angle, distance) pairs. The laser Sweep emits is a pattern rather than a single pulse, and the sample rate is not constant (500 to 600 Hz by default), which seems to be the reason.

High-end two-dimensional LIDARs such as the URG and SICK series acquire distances synchronized with angle, so Sweep’s method seems a bit harder to use.

Angle resolution, measurement cycle, sample rate

Angular resolution and measurement cycle are important LIDAR performance figures. Angular resolution is the angular spacing of the measurement points; when it is low, there are too few points on objects and the environment, and measured shapes are hard to recognize. The measurement cycle is the time taken for one scan (one rotation); when the cycle is long, accurate measurement becomes difficult for moving objects or when the robot itself moves.

Both sensors measure their surroundings by rotating a unit that integrates the light source and the detector. RPLIDAR A2’s rotation speed is variable from 5 Hz to 15 Hz, but the number of measurement points per rotation is the same (400 points) regardless of the rotation speed.

Sweep’s rotation speed is variable from 1 Hz to 10 Hz, but since the sample rate is at most 1000 samples/sec (precisely 1000 to 1025), the faster it rotates, the fewer measurement points it gets per revolution.

  • RPLIDAR A2: 400 samples / rotation
  • Sweep: 1000 samples / sec

If both are rotating at 10 Hz,

  • RPLIDAR A2: 400 samples / rotation
  • Sweep: 100 samples / rotation

Therefore, RPLIDAR A2 has four times the angular resolution. To obtain the same resolution with Sweep, you need to rotate at 1/4 the speed (2.5 Hz).

Maximum measurement distance and distance resolution

Maximum measurement distance is also an important performance figure. Longer seems better; however, in our experience 10 m is enough for office navigation, while even 100 m may not be enough for applications like autonomous driving. It depends on the application.

For RPLIDAR A2, confusingly, the measurement distance differs depending on the model, so pay attention to the model number when purchasing. This time I used the RPLIDAR A2M8, with a maximum measuring distance of 8 m.

  • RPLIDAR A2M4 Maximum measurement distance 6 m
  • RPLIDAR A2M8 Maximum measurement distance 8 m
  • RPLIDAR A2M6 Maximum measurement distance 16 m

On the other hand, Sweep has a maximum measurement distance of 40 m.

By the way, the minimum measurement distance, the limit on the near side, is about the same for both.

  • RPLIDAR A2M8 minimum measurement distance 0.15 m
  • Sweep Minimum measurement distance 0.15 m

How about distance resolution? The distance resolution of RPLIDAR A2M8 is:

  • 0.5 mm or less at distances of 1.5 m or less
  • 1% of the distance or less at distances over 1.5 m

The closer the target, the higher the resolution, so the shape of nearby objects can be captured properly. Conversely, we can expect both accuracy and resolution to fall as the distance increases (the specification gives no error data).

On the other hand, for Sweep, 

  • Sweep distance resolution 1cm or less

regardless of the measurement distance. The Sweep specification also includes a graph of distance measurement error, which shows that the error does not increase as the measurement distance gets longer. This is an advantage of the ToF method.
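The two resolution models can be summarized in a small sketch. This is my reading of the spec figures quoted above, so verify it against the datasheets yourself:

```python
def rplidar_a2m8_resolution_m(distance_m):
    """Spec model: <= 0.5 mm up to 1.5 m, then 1% of the distance."""
    if distance_m <= 1.5:
        return 0.0005
    return 0.01 * distance_m

def sweep_resolution_m(distance_m):
    """Spec model: about 1 cm regardless of distance (ToF)."""
    return 0.01

for d in (1.0, 4.0, 8.0):
    print(d, rplidar_a2m8_resolution_m(d), sweep_resolution_m(d))
# RPLIDAR is much finer up close (0.5 mm); beyond 1.5 m its 1%-of-
# distance figure grows past Sweep's fixed 1 cm, as the article notes.
```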

Sweep measurement error (excerpted from Sweep manual)


Both are fairly small, but since their top parts rotate, they must be installed so that nothing touches them.

  • RPLIDAR A2: diameter 70 mm
  • Sweep: Diameter 65 mm

On Sweep, the cable connector can be oriented in two directions, to the side or downward, making wiring easy. On RPLIDAR, the cable comes straight out, which seems to make it difficult to keep the cable run short.

Connector on the back side of Sweep

When connected by USB, both operate on USB bus power. You can also communicate over UART with, for example, a microcontroller, but in that case you need a separate power supply.


Although the two sensors look similar in outline, their design philosophies are actually somewhat different. RPLIDAR A2 is specified for short-distance, indoor use, while Sweep is specified for long-distance, outdoor use. I am afraid Sweep’s angular resolution might be too low for some uses.

Next we will look at ROS support for both.

adi_driver: ROS package for ADI sensors is released!

We are happy to announce the release of  a new ROS package, adi_driver!

This is a driver package for ADI (Analog Devices Inc.) sensors. Currently, it supports ADIS16470 and ADXL345 (experimental).

It is a compact, high-performance IMU breakout board.

The ADIS16470 is a brand-new IMU incorporating a 3-axis gyro and a 3-axis accelerometer, with a remarkably wide measuring range: ±2000 deg/sec for angular velocity and ±40 g for acceleration. This is enough performance for various robots such as wheeled mobile robots, drones and manipulators.
The sensor alone costs $199 and the breakout board $299. For details, please see the product page.

If you prepare the sensor breakout board and a USB-SPI converter (USB-ISS), you can easily obtain 3-dimensional attitude information with this package. For detailed instructions, please refer to the documentation on GitHub. If you encounter problems, please report them on Issues.


Low-cost LIDARs! (1) RPLIDAR A2 vs Sweep

Key device for robots – LIDAR

Speaking of the key device of autonomous mobile robots, it is LIDAR (Light Detection and Ranging). This is a sensor that measures the distance to objects by emitting a laser beam (usually an infrared laser) into the surroundings and detecting the light reflected from the objects.

Commonly used are two-dimensional products that measure distances on a plane around the sensor by rotating a single laser beam, for example the products of SICK and Hokuyo Electric.

SICK’s LIDAR products

URG series (Hokuyo Denki)

Originally, in mobile robot research, LIDAR has been used to measure the environment for map building and autonomous movement. At the 2007 DARPA Urban Challenge, Velodyne provided a breakthrough 3D LIDAR solution, marking a breakthrough in automated driving vehicles as well.

Disruptive innovation of LIDAR?

These LIDAR products were originally for industrial use, for example in factory production equipment and safety equipment. They therefore offer ruggedness and waterproofing, but on the other hand they are quite expensive. Research and development work aside, they are not something an individual can easily buy.

Recently, however, low-priced LIDARs have gone on general sale. Although insufficient in performance and reliability for industrial use, they provide sufficient, simple functionality for hobby and research use. Personally, I think they are potentially “disruptive innovation”-like products that will be used in applications we do not yet expect.

This time, two of them,

RPLIDAR A2 (Slamtec)

Sweep (Scanse)

I will introduce these two sensors while comparing them.


Say hello to Hiro at Shinshu Univ.!

Yamazaki Laboratory students

One day in February, we visited the Yamazaki laboratory at Shinshu University. We introduced TORK’s recent activities and the latest ROS information, and exchanged opinions. The students listened diligently, and many questions came up.

Hiro with a trolley

There are two Hiros in the Yamazaki laboratory. Together we checked the recently added MoveIt! functions and answered usage questions as much as possible.

In the Yamazaki laboratory, they research the handling of flexible cloth materials like towels and shirts. We look forward to more dexterous robots.

Dr. Yamazaki and all the students in the laboratory, thank you!

New aibo has arrived!

Sony’s pet robot “aibo” has arrived in TORK!

We named it “TORK-chan”.

They announced that “ROS is on board aibo”, and it seems to be true. Cool!

You can see the list and licenses of the installed open source software on the following pages.

In addition, the open source code can now be downloaded from here.

Looking into it a little, we could not find a way for users to access the aibo system. We expect some SDK will be published in the near future. Until then, let’s enjoy this cute pet robot!

We’ve visited THK Co., Ltd.!

We visited THK CO., LTD. and talked about ROS support for robots.

This robot also works with ROS!
It can be purchased on a build-to-order basis.
Please refer to the following link if you are interested.
Contact: SEED Solutions

We also accept ROS introduction, ROS package development and releases method, and OSS consultations.
Please do not hesitate to contact us!
info [at] opensource-robotics.tokyo.jp

ROS Workshop on Navigation was held in Nagoya

in Nagoya Fushimi

The intermediate ROS workshop on autonomous navigation was held in Nagoya. The participants had already mastered the basics of ROS, so we were able to advance through the curriculum smoothly.

Experience of the autonomous navigation

In the autonomous navigation workshop, we first tried out the navigation functions in simulation. Next, the Roomba-based teaching-material robot Roomblock was operated in the rooms and corridor of the office, building a map and navigating autonomously. Through this, participants learn how the navigation packages are configured in a ROS system. Questions about ROS and robotics beyond the curriculum were answered as much as possible.

Autonomous mobile robot Roomblock

The public workshop schedule for November has been published!

We have also published a blog post for people studying ROS for the first time. Please see it for reference.

We also offer private workshops and consultation on other OSS.
Please feel free to contact us!
info [at] opensource-robotics.tokyo.jp

ROS Workshop Schedule in November 2017

Here is the schedule of our ROS workshop series in November 2017!

Nov. 07 Tue. 13:30- Introductory
Nov. 15 Wed. 13:30- Introductory
Nov. 30 Thu. 13:30- Introductory
Venue: Yurakucho, Tokyo

Nov. 14 Tue. 13:30- Introductory
Nov. 20 Mon. 13:30- Introductory
Nov. 28 Tue. 13:30- Introductory
Venue: Nagoya, Aichi

Inquiries: info[at]opensource-robotics.tokyo.jp


Interactive ROS Indigo environment with Docker (1)

Indigo? Kinetic?

ROS continues to get new releases. The latest version is Lunar Loggerhead (Lunar); before it came Kinetic Kame (Kinetic) and Jade Turtle (Jade). Traditionally, releases are named after turtles.

However, I think many people are actually still using the previous version, Indigo Igloo (Indigo). There are many Indigo packages that have not yet been released for Kinetic. For the same reason, TORK also currently uses Indigo in our workshop materials and contract development.

Indigo is based on Ubuntu 14.04, while Kinetic and Lunar are based on Ubuntu 16.04 or later. Installing Ubuntu 14.04 on a PC just to use Indigo feels a bit outdated.

Using a virtual machine (VM) such as VMWare or VirtualBox could be a solution. However, hardware access can be a problem: there are often device driver issues such as poor graphics performance, USB camera incompatibility, no USB 3.0 support, and so forth.

Docker can be a remedy for this issue. With Ubuntu 16.04 installed on your PC, you can still work comfortably in an interactive Indigo development environment with devices and GUI.

Installation of Docker

First, install Docker. Follow the instructions on the Docker pages to install the latest version, using the Community Edition (CE).

Preparing the Indigo Docker image

OSRF provides Docker images of Indigo on DockerHub.

Pull the Indigo image with the following command:

$ docker pull osrf/trusty-ros-indigo:desktop

By the way, TORK also publishes an Indigo Docker image. It contains additional packages for the TORK workshops, Nextage Open and other useful tools. You may also find it here.

For example, to check the version with the ‘lsb_release’ command:

$ docker run osrf/trusty-ros-indigo:desktop lsb_release -a

No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 14.04 LTS
Release:        14.04
Codename:       trusty

By default, commands are executed with root permissions. To change the user rights and the access rights to devices, you need to pass options to each docker run. For more information, please refer to the Docker Run Reference.

Interactive start-up sharing the user and home directory

Here we want to run a shell with our own user rights and work interactively in our home directory. We created a script that makes the settings required for that.

$ ./docker_shell osrf/trusty-ros-indigo:desktop

After that, you are in a shell as if its OS were Trusty + ROS Indigo.

$ . /opt/ros/indigo/setup.bash

You can use the shell like a normal Indigo environment. Be careful that the home directory is shared: if you use Kinetic on the base system, you can easily get confused about whether a workspace is for Kinetic or Indigo.

If you are using the NVidia driver

If the PC is not using a special X graphics driver, you can also use GUIs such as rviz. But if the PC uses the NVidia driver, apps that use OpenGL, such as rviz, do not work. We will introduce a solution next time.

Successful World MoveIt! Day Tokyo!

World MoveIt! Day Tokyo ended successfully. On the day we had about 30 participants. We would like to thank everyone who participated, sponsored, and supported it.


Nihon Binary kindly provided the robot Baxter, which stood on the floor of the fashionable LEAGUE Yurakucho.


In addition, Nihon Binary exhibited the Jaco2 arm and the HEBI arm. Some experts tackled making them move with MoveIt! and Gazebo.

ARTC in Singapore, another WMD host, stayed connected over video chat.


Everyone tried the MoveIt! simulators and robots while having snacks. The staff supported installation and operation troubles as much as possible.


Some people actually moved Baxter and tried programming with moveit_commander. Meaningful results were obtained according to each person’s level and interest, and at the end some participants presented the day’s results.

Thank you everyone!

We found that many people in Japan are interested in MoveIt!. Let’s keep using MoveIt! and ROS together and improve them with feedback and commits.

World MoveIt! Day 2017 Tokyo is coming!

MoveIt! is one of the major ROS packages, intended for robot manipulators. A local event of “World MoveIt! Day 2017”, for all developers and users, will be held in Tokyo this year.

  • Title: World MoveIt! Day Tokyo
  • Date and time: October 19 (Thursday) 11:00 to 18:00 (the date differs from the Western events)
  • Location: [ LEAGUE Yurakucho ] event space
  • Admission: Free

For the detailed program and registration, please visit the following site. The schedule is subject to change without notice.

It is going to be a very casual event. Anyone can join: those already using MoveIt!, those who do not yet use it, and those who have never heard of it at all.

If you can bring your own notebook PC with ROS installed, you can experience MoveIt! with the simulator or an actual robot, and learn how to use it with our staff and other participants teaching each other.

In addition, we will exchange information on additional features and bug fixes for MoveIt! itself. Let’s all give MoveIt! a boost!

New tutorial for Nextage Open published in Japanese!

Nextage Open

A comprehensive tutorial on the ROS packages for Nextage Open is available, but it was provided in English for users around the world. We have now published a Japanese version of the tutorial, requested by many Japanese users.

Nextage Open Japanese version tutorial

We hope it serves you in research and development. We also think it can be helpful for those considering introducing a Nextage Open system.

Of course, the English version of the tutorial ( http://wiki.ros.org/rtmros_nextage/Tutorials ) is also available as before (a link to the Japanese version has been added).

ROS Workshop Schedule in October 2017

Here is the schedule of our ROS workshop series in October 2017!

Oct. 04 Wed. 13:30- Introductory
Oct. 18 Wed. 13:30- Introductory
Oct. 24 Tue. 13:30- Introductory
Venue: Yurakucho, Tokyo

Oct. 10 Thu. 13:30- Introductory
Oct. 23 Mon. 13:30- Introductory
Venue: Nagoya, Aichi

Inquiries: info[at]opensource-robotics.tokyo.jp


ROS Workshop for beginners on 27th September!

We held a ROS workshop at Yurakucho, Tokyo.
We used a custom ROS-preinstalled LiveUSB, so attendees could experience ROS without changing their PC environments.
All attendees worked through all the topics.

Great work, everyone!!

The workshop schedule for October is now open!

In October, the ROS Workshop Intermediate Course (Navigation) will also be held!

There is also a blog post for beginners studying ROS for the first time. Please refer to it.

Private workshops and consultations on other OSS are also accepted.
Please do not hesitate to contact us!
info [at] opensource-robotics.tokyo.jp

We participated in ROSCon2017, Vancouver!

We took part in ROSCon 2017, held in Vancouver on September 21 and 22!

Scenes from the Vancouver Convention Center

ROSCon is a once-a-year meeting for exchanging information among people involved in ROS in a variety of positions: users, developers, universities, and companies.

Dr. Brian Gerkey opened the session

TORK did not sign up for a presentation in advance; however, we were able to report our activities in a three-minute lightning talk.

TORK reported on our new packages

There were many presentations from a wide variety of countries and organizations on new robots, new devices, and robotics research and development. This time we could also hear many ROS2 topics.

It is in any case a very fun conference for ROS developers and users, so why not consider participating in the future? The venue and schedule for next year have not been determined yet, but should be announced soon.

ROS + Snappy Ubuntu Core (4): Installing Ubuntu Core on Raspberry Pi 3

Following the instructions on this page, install Ubuntu Core on a Raspberry Pi 3.

The following work was done in an Ubuntu 16.04 environment. First, download the ‘Ubuntu Core 16 image for Raspberry Pi 3’ image file from the Ubuntu Core image page. It is 320 MB.

Next, copy this to a MicroSD card. In my environment, the MicroSD card was recognized as /dev/sda.

$ xzcat ubuntu-core-16-pi3.img.xz | sudo dd of=/dev/sda bs=32M
$ sync

Launch the Raspberry Pi 3 with this MicroSD card.

Raspberry Pi 3 start-up screen

At first start-up you need to log in from the console, so connect a keyboard and a display.

You must have an ubuntu.com account in advance

Oops. It tells me to enter the email address of a login.ubuntu.com account; an ubuntu.com account is required to use Ubuntu Core. An Internet connection is a prerequisite for installation, which seems to be a problem in proxy environments.

Please access login.ubuntu.com from another PC and create an account.

Account creation screen of ubuntu.com

Furthermore, it says there is no ssh key. You need to register an ssh public key on the SSH Keys page of login.ubuntu.com. By authenticating with the private key corresponding to this public key, you can log in to the Raspberry Pi 3 with ssh.

Setting SSH Keys

If your machine is Ubuntu, type:

$ ssh-keygen

~/.ssh/id_rsa.pub should be created; copy its contents and paste it into the SSH Keys form.

From the machine on which you set the ssh key, you can now log in to the Raspberry Pi with ssh. The username is the one set at ubuntu.com. The Raspberry Pi 3’s IP address, obtained via DHCP, can be checked on the console.

$ ssh username@

In the initial state, you cannot use the familiar apt and dpkg commands of Ubuntu. The system is extended by installing snappy packages; first, let’s install the snapweb package to see the package list in a web browser.

$ sudo snap login username@your_address.com
$ sudo snap install snapweb
$ snap list
Name        Version       Rev   Developer  Notes
core        16-2          1443  canonical  -
pi2-kernel  4.4.0-1030-3  22    canonical  -
pi3         16.04-0.5     6     canonical  -
snapweb     0.26-10       305   canonical  -

In the Web browser,  try to access “”.

snapweb screen

It shows the list of packages that are currently installed. There also seems to be a store.

Screen of the App store

Well, I do not know the details yet, but there seem to be only a few apps.

ROS Workshop for Beginners on 30th August

We held a ROS workshop at Yurakucho, Tokyo.
We used a custom ROS-preinstalled LiveUSB, so attendees could experience ROS without changing their PC environments.
All attendees worked through all the topics.

Great work, everyone!!

ROS Workshop in Nagoya has started

Here is the schedule of our ROS workshop in Nagoya!

September 5 Tue. 13:30- Introductory
September 15 Fri. 13:30- Introductory

Venue: Fushimi, Nagoya

Inquiries: info[at]opensource-robotics.tokyo.jp

ROS + Snappy Ubuntu Core (3): Looking into the Snappy package

Snappy + ROS, https://www.crowdsupply.com/krtkl/snickerdoodle/updates/1890

How does the Snappy package work?

$ which talker-listener.listener 
$ ls -l /snap/bin/talker-listener.listener 
lrwxrwxrwx 1 root root 13 Jul 21 16:04 /snap/bin/talker-listener.listener -> /usr/bin/snap

This seems to be equivalent to doing the following.

$ snap run talker-listener.listener

The snap file is simply unpacked under /snap/. The directory ‘current’ points to the latest version as a symbolic link to ‘x1’, which makes rolling back easy.

$ ls -l /snap/talker-listener/
total 0
lrwxrwxrwx  1 root root   2 Jul 21 16:04 current -> x1
drwxrwxr-x 10 root root 222 Jul 21 16:00 x1
$ ls /snap/talker-listener/current
bin                       command-roscore.wrapper  etc  meta  snap  var
command-listener.wrapper  command-talker.wrapper   lib  opt   usr

I see the command-listener.wrapper. Looking inside,

export PATH="$SNAP/usr/sbin:$SNAP/usr/bin:$SNAP/sbin:$SNAP/bin:$PATH"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$SNAP/lib:$SNAP/usr/lib:$SNAP/lib/x86_64-linux-gnu:$SNAP/usr/lib/x86_64-linux-gnu"
export ROS_MASTER_URI=http://localhost:11311
export ROS_HOME=${SNAP_USER_DATA:-/tmp}/ros
export LC_ALL=C.UTF-8
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$SNAP/lib:$SNAP/usr/lib:$SNAP/lib/x86_64-linux-gnu:$SNAP/usr/lib/x86_64-linux-gnu
export PYTHONPATH=$SNAP/usr/lib/python2.7/dist-packages:$PYTHONPATH
export PATH=$PATH:$SNAP/usr/bin

# Shell quote arbitrary string by replacing every occurrence of '
# with '\'', then put ' at the beginning and end of the string.
# Prepare yourself, fun regex ahead.
quote() {
    for i; do
        printf %s\\n "$i" | sed "s/'/'\\''/g;1s/^/'/;\$s/\$/' \\/"
    done
    echo " "
}
BACKUP_ARGS=$(quote "$@")
set --

if [ -f $SNAP/opt/ros/kinetic/setup.sh ]; then
    _CATKIN_SETUP_DIR=$SNAP/opt/ros/kinetic . $SNAP/opt/ros/kinetic/setup.sh
fi

eval "set -- $BACKUP_ARGS"

export LD_LIBRARY_PATH="$SNAP/opt/ros/kinetic/lib:$SNAP/usr/lib:$SNAP/usr/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH"
exec "rosrun" roscpp_tutorials listener "$@"

So the command is executed after the ROS environment is set up by this wrapper file. In other words, you do not need the “source setup.bash” that you always do with ROS. That seems easier.
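The wrapper pattern itself is easy to reproduce: build up the environment, then exec the real command so it replaces the wrapper process. A minimal Python sketch of the same idea (the paths and variable values mirror the wrapper above; the function names are my own):

```python
import os

def wrapped_env(snap_root="/snap/talker-listener/current"):
    """Build the environment the wrapper sets up: prepend the bundled
    binaries and libraries, and fix the ROS master URI."""
    env = dict(os.environ)
    env["PATH"] = snap_root + "/usr/bin:" + env.get("PATH", "")
    env["LD_LIBRARY_PATH"] = (snap_root + "/opt/ros/kinetic/lib:"
                              + env.get("LD_LIBRARY_PATH", ""))
    env["ROS_MASTER_URI"] = "http://localhost:11311"
    return env

def run_wrapped(command):
    """Replace the current process with the command, wrapper-style."""
    os.execvpe(command[0], command, wrapped_env())

print(wrapped_env()["ROS_MASTER_URI"])  # http://localhost:11311
```

The real wrapper does the same thing in shell, with the snap-specific `$SNAP` root injected at run time.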

What else is included in the snap package?

$ tree -L 3 -d /snap/talker-listener/current/
/snap/talker-listener/current/
├── bin
├── etc
│   ├── ca-certificates
│   │   └── update.d
│   ├── dpkg
│   │   └── dpkg.cfg.d
│   ├── emacs
│   │   └── site-start.d
│   ├── gss
│   │   └── mech.d
│   ├── ldap
│   ├── openmpi
│   ├── perl
│   │   ├── CPAN
│   │   └── Net
│   ├── python2.7
│   ├── python3.5
│   ├── sgml
│   ├── ssl
│   │   ├── certs
│   │   └── private
│   ├── systemd
│   │   └── system
│   └── xml
├── lib
│   └── x86_64-linux-gnu
├── meta
│   └── gui
├── opt
│   └── ros
│       └── kinetic
├── snap
├── usr
│   ├── bin
│   ├── include
│   │   ├── apr-1.0
│   │   ├── arpa
│   │   ├── asm-generic
│   │   ├── boost
│   │   ├── c++
│   │   ├── console_bridge
│   │   ├── drm
│   │   ├── gtest
│   │   ├── hwloc
│   │   ├── infiniband
│   │   ├── libltdl
│   │   ├── linux
│   │   ├── log4cxx
│   │   ├── misc
│   │   ├── mtd
│   │   ├── net
│   │   ├── netash
│   │   ├── netatalk
│   │   ├── netax25
│   │   ├── neteconet
│   │   ├── netinet
│   │   ├── netipx
│   │   ├── netiucv
│   │   ├── netpacket
│   │   ├── netrom
│   │   ├── netrose
│   │   ├── nfs
│   │   ├── numpy -> ../lib/python2.7/dist-packages/numpy/core/include/numpy
│   │   ├── openmpi -> ../lib/openmpi/include
│   │   ├── protocols
│   │   ├── python2.7
│   │   ├── rdma
│   │   ├── rpc
│   │   ├── rpcsvc
│   │   ├── scsi
│   │   ├── sound
│   │   ├── uapi
│   │   ├── uuid
│   │   ├── video
│   │   ├── x86_64-linux-gnu
│   │   └── xen
│   ├── lib
│   │   ├── compat-ld
│   │   ├── dpkg
│   │   ├── emacsen-common
│   │   ├── gcc
│   │   ├── gold-ld
│   │   ├── lapack
│   │   ├── ldscripts
│   │   ├── libblas
│   │   ├── mime
│   │   ├── openmpi
│   │   ├── pkgconfig
│   │   ├── python2.7
│   │   ├── python3
│   │   ├── python3.5
│   │   ├── sasl2
│   │   ├── sbcl
│   │   ├── ssl
│   │   ├── valgrind
│   │   └── x86_64-linux-gnu
│   ├── sbin
│   ├── share
│   │   ├── aclocal
│   │   ├── applications
│   │   ├── apps
│   │   ├── apr-1.0
│   │   ├── bash-completion
│   │   ├── binfmts
│   │   ├── boostbook
│   │   ├── boost-build
│   │   ├── bug
│   │   ├── ca-certificates
│   │   ├── cmake-3.5
│   │   ├── debhelper
│   │   ├── dh-python
│   │   ├── distro-info
│   │   ├── doc
│   │   ├── doc-base
│   │   ├── docutils
│   │   ├── dpkg
│   │   ├── emacs
│   │   ├── glib-2.0
│   │   ├── icu
│   │   ├── libtool
│   │   ├── lintian
│   │   ├── man
│   │   ├── mime
│   │   ├── mpi-default-dev
│   │   ├── numpy
│   │   ├── openmpi
│   │   ├── perl
│   │   ├── perl5
│   │   ├── pixmaps
│   │   ├── pkgconfig
│   │   ├── pyshared
│   │   ├── python
│   │   ├── python3
│   │   ├── sgml
│   │   ├── sgml-base
│   │   ├── xml
│   │   └── xml-core
│   └── src
│       └── gtest
└── var
    └── lib
        ├── sgml-base
        ├── systemd
        └── xml-core

144 directories


The talker-listener snap package contains the entire directory structure of a minimal Linux system as well as ROS. It feels a little like overkill… but this is the now-popular approach of so-called container virtualization, as used by Docker and others.
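The tree above comes from the unpacked snap. As a quick sanity check you can count the directories of an unpacked snap yourself; this helper is just a sketch (note that `unsquashfs` unpacks a snap into a `squashfs-root` directory by default):

```shell
# count_dirs: count the directories under a given path.
# Useful for inspecting an unpacked snap (e.g. the "squashfs-root"
# directory produced by unsquashfs).
count_dirs() {
  find "$1" -type d | wc -l
}
```

Running `count_dirs squashfs-root` after unpacking the snap with `unsquashfs talker-listener_0.1_amd64.snap` should report a directory count similar to the 144 shown above.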

By the way, the generated snap file,

$ ls -sh talker-listener_0.1_amd64.snap 
154M talker-listener_0.1_amd64.snap

it is a hefty 154 MB! Will that be acceptable as the number of packages grows?
Since the package contains a lot of unnecessary files, there are probably ways to reduce the size.

Roomblock(5) : 3D Printable Frame Structure

Roomblock has a Raspberry Pi, a mobile battery, and an RPLIDAR on a Roomba. We need a frame structure to hold them together.

Battery stage

The frame is an extendable shelf-like structure. It consists of the battery stage at the bottom, the Raspberry Pi stage in the middle, and the RPLIDAR stage on top.

Raspberry Pi stage

We named this robot Roomblock, because it is extendable like “blocks”.

Roomblock’s frame

We publish the STL file for 3D printing, so that you can print it out with your own 3D printer. Please check the Thingiverse site.

We share how to build Roomblock at Instructables. Please try it at your own risk.

ROS Workshop for Beginners on 8th August

TORK passed its 4th anniversary on August 8th, 2017.
Thank you all for your business with us and for your understanding of open source robotics.

We held a ROS workshop at Yurakucho, Tokyo.
We used a custom ROS-preinstalled LiveUSB, so attendees could experience ROS without changing their PC environment.
All of the attendees worked through all the topics.

Great work, everyone!!

ROS Workshop for Beginners on 2nd August

We held a ROS workshop at Yurakucho, Tokyo.
We used a custom ROS-preinstalled LiveUSB, so attendees could experience ROS without changing their PC environment.
All of the attendees worked through all the topics.

Great work, everyone!!

ROS + Snappy Ubuntu Core (2) : Let’s make it Snappy!

Last time I introduced Ubuntu Core. Ubuntu Core uses Snappy as its package system; however, the so-called “Snappy” packaging system can be tried without installing Ubuntu Core. This time, I will try to make a “Snappy” ROS package on ordinary Ubuntu.

Snappy + ROS, https://www.crowdsupply.com/krtkl/snickerdoodle/updates/1890

Here is a tutorial on how to make a Snappy ROS package. Let’s follow it for now.

Before doing this, we recommend you read through the Snapcraft Tour tutorials.

The following procedure is what I tried with Ubuntu 16.04 + ROS Kinetic. First, install Snapcraft, which is the Snappy packaging tool.

$ sudo apt install snapcraft

I need a ROS package to turn into a Snappy package, but it is a pain to type in my own code, so I clone the ros_tutorials source code from GitHub.

$ mkdir -p ~/catkin_ws/src
$ cd ~/catkin_ws/src
$ git clone https://github.com/ros/ros_tutorials.git

Initialize with snapcraft.

$ cd ~/catkin_ws 
$ snapcraft init

A directory named snap is created, with a file named snapcraft.yaml below it. Rewrite this file to include the binaries in roscpp_tutorials. Refer to the tutorial for an explanation of how to rewrite it; I think you can get the idea from the example.
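For reference, a minimal snapcraft.yaml along the lines of the tutorial might look like the sketch below. The part name and the app commands here are assumptions; the exact keys and wrapper commands differ between snapcraft versions, so please check the tutorial for the authoritative form.

```yaml
name: talker-listener
version: '0.1'
summary: ROS talker/listener example
description: |
  The roscpp_tutorials talker and listener, packaged as a snap.
grade: devel        # development quality, not for stable channels
confinement: devmode  # relaxed sandboxing while experimenting

parts:
  ros-tutorials:          # part name is an assumption
    plugin: catkin        # snapcraft's catkin plugin builds the workspace
    rosdistro: kinetic
    catkin-packages: [roscpp_tutorials]

apps:                     # exposed as talker-listener.roscore, etc.
  roscore:
    command: roscore
  talker:
    command: rosrun roscpp_tutorials talker
  listener:
    command: rosrun roscpp_tutorials listener
```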

Now, let’s finally create the package.

$ cd ~/catkin_ws
$ snapcraft

Snapcraft starts downloading various things from the network… snapcraft understands the catkin workspace, so it looks at package.xml and downloads the necessary packages with rosdep.

Perhaps you think, “All the necessary ROS packages are already on the system!” But Snappy puts everything the application needs into the snap’s own container. It downloads the packages, compiles them from source, and installs them into the container every time it builds.

After a while, the process finishes,

Snapped talker-listener_0.1_amd64.snap

and a snap file is generated.

To install it,

$ sudo snap install --dangerous talker-listener_0.1_amd64.snap

That will do it. Let’s check that it is installed.

$ snap list
Name             Version    Rev   Developer  Notes
core             16-2.26.9  2381  canonical  -
talker-listener  0.1        x1               -

Let’s start the nodes in turn.

$ talker-listener.roscore
$ talker-listener.talker
$ talker-listener.listener

You can see the familiar talker and listener nodes start running.

Amazing montage video incl. NEXTAGE OPEN celebrates MoveIt! 5-year

MoveIt!, the de-facto standard motion planning library for ROS, now celebrates the 5th year since its initial release with an amazing compilation of application videos.

This is the 2nd time the MoveIt! maintenance team has made such a montage. Compared with the one from 4 years ago, made in 2013 soon after the software was first released, we can see many more pick-and-place applications this time.

What also caught my personal interest is that there are some mobile-base and subsea rover manipulation apps, which address one of the future improvement items of MoveIt! (see this page, “Mobile base integration”). It would be a great contribution if the developers of those apps gave their work back to the upstream MoveIt! software.

As always, NEXTAGE Open, a dual-arm robot whose maintenance TORK actively contributes to and provides support services for, appears in the video as well, thanks to the Spanish system integrator Tecnalia, presumably for their work with Airbus.

Hironx in motion from the MoveIt! 5-year montage. Image courtesy of Tecnalia

TORK has been a motivated, skillful supporter of ROS and MoveIt! since our launch in 2013. If you are wondering how you could apply MoveIt! to your robot, please consider our hands-on workshop series too.

P.S. A list of all the applications’ developers is also available:

(0:06) Delft Robotics and TU Delft Robotics Institute
(0:09) Techman Robot Inc.
(0:13) Correll Lab, CU Boulder
(0:37) Nuclear & Applied Robotics Group, Unv Texas
(0:50) Beta Robots
(1:03) Team VIGIR
(1:34) Honeybee Robotics
(1:49) ROBOTIS
(2:05) Correll Lab, CU Boulder
(2:26) TODO Driving under green blocks
(2:38) ROBOTIS
(2:54) Fetch Robotics
(3:05) Hochschule Ravensburg-Weingarten
(3:12) TU Darmstadt and Taurob GmbH – Team ARGONAUTS
(3:20) isys vision
(3:27) Technical Aspects of Multimodal System Group / Hamburg University
(3:33) Clearpath Robotics
(3:43) Shadow Robot

ROS + Snappy Ubuntu Core (1) : What is it?

What are the challenges in bringing an application marketplace such as the iPhone’s App Store or Android’s Google Play into the world of robots? ROS has been looking into a “Robot App Store” since its inception, but it has not yet been realized.

In this context, the recent appearance of a mechanism called Ubuntu Snappy Core seems likely to contribute greatly to the opening of a Robot App Store.

Snappy + ROS, https://www.crowdsupply.com/krtkl/snickerdoodle/updates/1890

I’d like to write about Ubuntu Snappy Core and ROS over the next several entries.

The circumstances of Ubuntu

Ubuntu, on which ROS depends as its main base operating system, has regular releases twice a year, each with a support period of nine months. There is also an LTS (Long Term Support) release every two years, with a support period of five years. Those seeking practical stability keep using the stable LTS, while those adopting the latest technology use the regular releases for new development.

However, the devices targeted by Ubuntu are spreading beyond desktop PCs and servers to edge devices such as IoT appliances and routers. Unfortunately, Ubuntu Phone never came out…

For these devices, from a security point of view, synchronized releases like Ubuntu’s are not enough; continuous updates at finer, more irregular intervals are indispensable. In addition, fault tolerance is required, such as rolling back when software containing defects is delivered.

Snappy Ubuntu Core

In response to these demands, Ubuntu developed a mechanism called (Snappy) Ubuntu Core for IoT and edge devices.

It separates the OS from the device drivers and the kernel from the applications, making it possible to update each independently, on a fine-grained cycle.

The circumstances of ROS

Robots can also be seen as a kind of IoT edge device, so in the future this Snappy package system may become mainstream in ROS. Moreover, the release system of ROS is showing its limits.

Until now, ROS has done synchronized releases like Ubuntu. However, with a release only once a year, it seems too slow to adopt rapidly moving technologies. On the other hand, when using ROS for business, priority is given to stable operation, and users tend to become conservative, avoiding frequent updates.

Also, ROS packages depend on many external libraries. Every time the API of an external library changes, the corresponding ROS package needs to be updated, and when a specification changes, the package has to be re-verified against it.

As a result, more packages drop out of each release. Even necessary and commonly used packages are sometimes not released, or released late, simply because no maintainer is available to make the fixes required for the release.

If you plan to sell robot products built on ROS, you never know what specification changes or defects will be introduced at each Ubuntu or ROS update, and it takes huge resources to deal with them.

Based on the above, we anticipate that Snappy ROS systems will become mainstream in the future.

Reassuringly, a robotics engineer working for Canonical (Kyle Fazzari) is vigorously publishing information. The series of blog posts and videos released in April is also a must-see.

Roomblock(4): Low Cost LIDAR: RPLIDAR A2

Roomblock uses the low-cost RPLIDAR A2.


It is very easy to use and inexpensive.

Range data in rviz

This sensor is ROS-ready. You can view the data by typing one command line.

Scan matching

The laser_scan_matcher package helps you estimate the sensor’s motion from the laser scan data alone, using scan matching.

We share how to build Roomblock at Instructables. Please try it at your own risk.

Roomblock: Autonomous Robot using Roomba, Raspberry Pi, and RPLIDAR(3)

Roomblock is controlled by a Raspberry Pi. As you know, the Raspberry Pi is a low-cost, ARM-based single-board computer. We use a Raspberry Pi 2, with Ubuntu and ROS installed on it.


Raspberry Pi 2

The ROI connector on the Roomba and the Raspberry Pi are connected with a USB-serial adapter cable.

USB-Serial adapter cable
The power source of the Raspberry Pi is a mobile battery.


Mobile battery

We share how to build Roomblock at Instructables. Please try it at your own risk.

Roomblock: Autonomous Robot using Roomba, Raspberry Pi, and RPLIDAR

You can use the Roomba 500, 600, 700 and 800 series as the base of a Roomblock. They have a serial port for communicating with external computers. Note that the flagship Roomba 900 series has no serial port, so you cannot use it.

You can see the serial port in the pictures below. Be careful not to cut your fingers or break your Roomba!

Roomba 500 series ROI connector

Roomba 700 series ROI connector
We share how to build Roomblock at Instructables. Please try it at your own risk.

Mapping Experiment in a Large Building

We did a very loose evaluation of ROS mapping packages in a relatively large-scale building, including loop closure. Loop closure is a difficult problem for mapping, and the results show that some packages can close the loop even with their default parameters.


Generated maps (from left: gmapping, slam_karto, hector_slam, cartographer)


Note that this result does not compare the true performance of each algorithm; we just used nearly default parameters for each package. We recommend that you select the mapping package suited to your own problem yourself. ROS makes it easy to do so.

Roomblock: Autonomous Robot using Roomba, Raspberry Pi, and RPLIDAR(1)

We are using a robot named “Roomblock” as the material for our ROS workshops. The Roomba is wonderful not only for room cleaning but also for learning robotics: it has a serial port to communicate with a PC and, of course, a ROS system.


You can convince your family to buy a Roomba, of course for cleaning 🙂


The first Roomba I bought in 2007, broken now 🙁

We share how to build Roomblock at Instructables. Please try it at your own risk.

ROS USB camera driver for Kinetic

A ROS USB camera driver for Kinetic, a long-pending problem, has been released!

Although these packages were used a lot in Indigo, the maintainer was recently absent, and they could not be used in Kinetic.
A voluntary group named Orphaned Package Maintainers was launched, and a system has been established to release them for the transition to Kinetic.

In addition to this, it has become possible to freely select from the following six additional drivers. Enjoy ROS-CV programming.


Autonomous Navigation Demo and Experiments at Meijo Univ.

We had a chance to perform a demo and experiments with our autonomous navigation robot at Meijo University in Nagoya, Aichi. Our Roomba-based autonomous robot “Roomblock” impressed the students. ROS should help them accelerate their studies.


We also did mapping experiments with the robot in a beautiful campus building. We will report the results in upcoming posts.

We truly appreciate the staff and students for their kind cooperation. Thank you!

Updated MoveIt! Now Comes with Trajectory Introspection

With the package update in early June 2017, MoveIt! now allows you to introspect a planned trajectory pose by pose.
Using the newly added trajectory slider in RViz, you can visually evaluate the waypoints of a planned trajectory in MoveIt!.

Introspect waypoints in a planned trajectory on MoveIt! on NEXTAGE Open.

As the slider on the left side shows, you can now introspect each waypoint of a planned trajectory on MoveIt! on NEXTAGE Open.

See the MoveIt! tutorial for how to activate this feature.

New Generation Mobile Robot,TurtleBot3!

We had a meeting with Mr. Shibata and Ms. Morinaga of ROBOTIS Japan at Yurakucho, Tokyo.

Have you checked out the TurtleBot3 yet?
TB3 will be released this summer. We can’t wait for it!

Please visit the following website for details!

Thank you for visiting our office today!



TORK Adds Another ROS wiki Mirror

As we mentioned a few months ago, mirror sites for the ROS documentation are in much demand, not just for when the original web site is inaccessible but for a number of other reasons.

Recently there was an announcement on the ros-users forum about the ROS wiki mirroring status and an improved maintenance method for mirrors. Around the time of that announcement, TORK started mirroring the wiki and API docs as well, which is now noted in the list of mirrors on wiki.ros.org.

Needless to say, this list of mirror sites is also inaccessible when the web site itself is down… so we have published a “mirror” of the list of mirrors.

Announcement for starting support service of ROS

Tokyo Opensource Robotics Kyokai Association (TORK) was established in August 2013 as an incorporated association aimed to form and develop the robotics discipline based in open source software. Until today, we have been supporting user communities of the open source software such as OpenRTM and ROS, etc. This time, we would like to announce that, from January 2014, we will be inviting corporate members, and at the same time, will start the support services for the corporate members to specifically help them solve various problems related to ROS implementations.