ROS Workshop Intermediate Course (Navigation) is coming soon

We have been coordinating three workshops for ROS learners:

  • ROS workshop basic course
  • ROS workshop intermediate course (Manipulation)
  • ROS workshop with Baxter

Now, we are pleased to announce that a new course will be added soon:

  • ROS workshop intermediate course (Navigation)

This course focuses on the ROS navigation system. You can learn the advantages of ROS navigation using simulation. Furthermore, we have prepared a mobile robot named “Roomblock”, which consists of a Roomba, a Raspberry Pi and an RPLIDAR. An overview of the navigation system and its packages is also lectured.

As before, the course is conducted in Japanese, but inquiries from anyone interested are welcome at info[at]opensource-robotics.tokyo.jp.


ROS Workshop for Beginners on 10th May

We held a ROS Workshop at Yurakucho, Tokyo.
We used a custom ROS-preinstalled LiveUSB, so attendees could experience ROS without changing their PC environment.
All of the attendees ran through all the topics.

Great work, everyone!!



Head-mounted stereo camera calibration for NEXTAGE OPEN

Recently we’ve updated the instructions for calibrating the NEXTAGE OPEN using its head-mounted stereo cameras.

There have been multiple camera/robot calibration packages in ROS. The calibration package worked well for the PR2 robot, a highly functional reference model of ROS from the early era until very recently. Then, for industrial usage, industrial_extrinsic_cal was developed with abundant functionality to cope with complex needs. The relatively newer robot_calibration package, which is maintained by the warehouse robot manufacturer Fetch, even updates the robot’s kinematic parameters using the calibration result. Almost all of these practical features come at a cost of labor in hardware and/or software, however.

The approach we’re presenting is simple, since all you need is a checkerboard: not even a board, as a plain piece of paper will do as long as the checker pattern is printed on it. Detailed steps are available on the ROS wiki. Basically, you obtain the pose of the camera, mounted on top of the head or actually anywhere on the robot, by recognizing the checkerboard. You can also carry out the calibration in a Gazebo simulation, as long as you know where the camera is attached so that you can spawn the camera’s 3D model in the virtual world.


Detecting a checkerboard at the chest.


Recognizing the checker pattern, which is in a fixed pose w.r.t. the robot, yields the camera’s relative pose w.r.t. the robot as a result of a “tf” computation.
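The transform composition behind this step can be sketched in a few lines. This is a minimal illustration using plain NumPy homogeneous matrices, not the actual tf frames or APIs of the NEXTAGE OPEN packages; the frame names and the example numbers are assumptions made up for the sketch.

```python
# Sketch: recovering the camera pose w.r.t. the robot from a checkerboard
# whose pose w.r.t. the robot is known and fixed. All transforms are 4x4
# homogeneous matrices; values below are illustrative only.
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# T_robot_board: checkerboard pose w.r.t. the robot base (known, fixed).
T_robot_board = make_transform(np.eye(3), [0.3, 0.0, 0.5])

# T_cam_board: checkerboard pose w.r.t. the camera, as reported by the
# pattern detection. Made-up example: the camera faces the board from
# 0.8 m away (rotation is 180 degrees about the y axis).
R_flip = np.array([[-1.0, 0.0,  0.0],
                   [ 0.0, 1.0,  0.0],
                   [ 0.0, 0.0, -1.0]])
T_cam_board = make_transform(R_flip, [0.0, 0.0, 0.8])

# Composing the two yields the camera pose w.r.t. the robot:
#   T_robot_cam = T_robot_board @ inv(T_cam_board)
T_robot_cam = T_robot_board @ np.linalg.inv(T_cam_board)

print(np.round(T_robot_cam, 3))
```

In the real system, tf performs exactly this kind of composition across the whole transform tree, so publishing the detected board pose is enough to make the camera frame available.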


Seeing checkerboard through a simulated Kinect on Gazebo.

We’ve also made the design of the checkerboard kit publicly available on NEXTAGE OPEN’s GitHub repository so that any handyperson can go through these steps.

As of now, the instructions in the wiki above only cover the Kinect and Xtion; operating the Ueye cameras that NEXTAGE OPEN comes with by default is a bit different and is not documented yet. The calibration part, however, is intended for any ROS-based stereo camera, so it should work for the Ueye too. Everyone is welcome to edit the ROS wiki and add your findings in this regard. The power of open source!


ROS Workshop Schedule from April to June 2017

Here is the schedule of our ROS workshop series for the second quarter of 2017!

Apr. 28 Fri. 13:30- Introductory
May 10 Wed. 13:30- Introductory
May 18 Thu. 13:30- Introductory
May 25 Thu. 13:30- Introductory
June 08 Thu. 13:30- Introductory
June 14 Wed. 13:30- Introductory
Venue: Yurakucho, Tokyo

Inquiries: info[at]opensource-robotics.tokyo.jp



We have moved!

We are pleased to announce that our office will be relocated to the following address as of April 1, 2017.

Location:
the 6th floor of the Tokyo Kotsu Kaikan building
2-10-1 Yurakucho
Chiyoda-ku, Tokyo, 100-0006, Japan

Our phone number is unchanged.
