How to Make a Vision System for the Dobot Magician with the Pixy 2 Camera

Vision systems and robot arms are used in many modern production lines. With the availability of cheap vision cameras for consumers, it has now become possible to make your own vision-controlled robot arm at home.

This article describes how to add a vision system to the Dobot Magician robotic arm, allowing it to identify parts automatically and place them at a designated drop off location.

In the following sections the project is described from idea to a functioning pick and place system:

  • Connect the Pixy2 and Dobot to the Arduino
  • Camera position and mounting bracket design
  • Camera Settings and lighting
  • Camera calibration routine
  • Arduino programming and menu system
  • Pick and Place routines

Connect the Pixy2 and Dobot to the Arduino

In this project both the Pixy2 and the Dobot Magician are connected to an Arduino Mega, which contains the program controlling the pick and place routines. The Pixy2 is connected to the I2C bus in the center of the Arduino board. The Pixy can be connected using the supplied flat cable; however, since it is too short for this application, I made a new cable of the desired length. The flat cable connectors and the cable itself were purchased from a local electronics store.

Crimping the flat cable connectors (caveman style)

Since I do not have a crimping tool I used a pair of pipe wrench pliers to crimp the connectors, which is not advisable, but it works.

To verify that all the wires made a proper connection, I checked the resistance between the connectors on each socket with a multimeter. An LCD shield is placed onto the Arduino board with some female sockets in between to clear the serial connector below the board.

Placing the LCD shield

A serial connection is used to communicate between the Dobot and the Arduino. Please note that an Arduino Uno cannot be used for this since it only has a single serial connection which is used to communicate to the PC. The Arduino Mega has 3 spare serial ports, on pins 14(TX)-15(RX), 16(TX)-17(RX) and 18(TX)-19(RX).
The Rx and Tx pins from the Dobot interface port are connected to Tx and Rx pins on the Arduino. In this case pins 18 and 19 are used.
The Ground pin on the Dobot is connected to one of the ground pins on the Arduino.

Wire connections

Camera Position and Mounting Bracket

I considered two options for where to position the camera: either a stationary setup, with the camera mounted on a separate stand, or with the camera fixed near the end effector on the arm itself. Both options have their pros and cons. The benefit of a stationary setup is that the camera can register parts while the robot is performing pick and place operations. This option leads to the quickest cycle times and may therefore be the best option for industrial use in a fixed production line. The drawback, however, is that the camera is separated from the robot arm and therefore needs to be recalibrated whenever either of the two is moved to a different position. This is the reason why I chose to mount the camera directly onto the robot arm.

The position of the camera now only needs to be calibrated once and is still valid when the arm is picked up and moved to a different location.

I have designed a camera mounting fixture which is used to fix the Pixy to the front of the robot arm. The part was designed in Fusion 360.

Modeling the Pixy2 holder in Fusion360

The STL file for the Pixy2 holder can be downloaded here:

There are a couple of requirements which led to the current design. The camera should be close to the end effector, while positioned so that the gripper or suction cup does not block its view. In order to align the field of view of the camera as well as possible with the movement range of the arm, I decided to place the camera on the side of the end effector instead of mounting it directly in front. The camera is also slightly rotated, since the arm itself is rotated at a certain angle when the camera is directly above the area where the parts are located. This angle does not need to be exact; it is only intended to get as much of the picking area as possible into the image, which would not be possible if the image was tilted. Any residual angle of the image will be taken care of automatically in the calibration procedure.
A picture of the Pixy2 module was imported into the Fusion360 model for visualization purposes and more importantly, to transfer the location of the mounting holes to the fixture.

Since the module is very light it is sufficient to use only 2 of the available mounting holes. Standard M3 screws are used to fix the module and are screwed directly into the non-threaded, undersized holes.

Camera Settings and Lighting

The Pixy2 is a small camera module with built-in part detection, which can track objects at 60 frames per second. It does not have the resolution, feature recognition or any of the other fancy features of professional vision systems, but with the features it does have it is perfectly capable of handling less demanding object detection tasks. Object detection on the Pixy2 is based on blob detection, where something is recognized as an object when enough pixels of a similar predefined colour are found in the image.

Pixymon software showing recognized signatures

Objects need to have a more or less uniform colour across the part, which differs significantly from the surroundings. The Pixy2 does not recognize the shape of an object, so part identification based on geometry, or finding the angle at which a part is rotated, is not possible. To be precise, the Pixy2 can find the rotation angle of a part, but only if the part has 2 adjacent areas of different colours, so such a colour code would have to be added to the part to make this work. Functions are also available for tracking lines and reading barcodes.
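To give a feel for what blob detection results look like, here is a standalone C++ sketch. The `Block` field names mirror the block struct of the Pixy2 Arduino library, but this struct and the `largestBlob` helper are illustrations of the concept, not the library itself. Preferring the largest matching blob is one simple way to ignore small false positives:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Illustration of Pixy2-style blob results; field names mirror the
// Pixy2 Arduino library's block struct, but this is a standalone sketch.
struct Block {
    uint8_t  m_signature;       // trained colour signature (1-7)
    uint16_t m_x, m_y;          // blob centre in camera pixels
    uint16_t m_width, m_height; // blob size in pixels
};

// Return the index of the largest blob with the given signature,
// or -1 if none was found.
int largestBlob(const std::vector<Block>& blocks, uint8_t signature) {
    int best = -1;
    long bestArea = 0;
    for (std::size_t i = 0; i < blocks.size(); ++i) {
        if (blocks[i].m_signature != signature) continue;
        long area = (long)blocks[i].m_width * blocks[i].m_height;
        if (area > bestArea) { bestArea = area; best = (int)i; }
    }
    return best;
}
```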

The method of lighting has a large effect on whether the camera is able to find part signatures. Parts may become overexposed when using too much light, or the opposite: the camera may not be able to recognize a colour when the lighting conditions are too dim.
I have found that there is not a lot of leeway in the lighting setup, which means I have to regularly change the camera settings when the amount of ambient light changes, for example at a different time of day.
In order to get a correct exposure and good visibility of the parts you can play around with the background colour of the table, the type of light source, and the position and angle of the light source. The Pixy itself also has a built-in LED to illuminate the scene, which could be another option. I have obtained the best results during the day, using just incoming sunlight as a light source, but your mileage may vary. Linked below is a helpful article on lighting for machine vision systems:

There are a couple of settings in the Pixy monitor software that can also greatly help to improve part detection.
  • Camera brightness: sets the overall brightness level, which can help to better identify differently coloured parts.
  • Signature range: set this for each color signature individually to prevent false positives, while still picking up as much surface area of the part as possible.
  • Settings -> Camera: choose whether the camera applies auto exposure and auto white balance.
  • Settings -> Camera -> Flicker Avoidance: enable this when using indoor lighting to eliminate flickering. However, it makes the picture much brighter and can cause overexposure. To counteract this, adjust the light intensity until a good exposure is achieved.

Camera Calibration Routine

The calibration routine makes it possible to transform XY locations in the coordinate system of the Pixy camera to the coordinate system of the Dobot. Each time a part is found by the camera, its location is transformed to the coordinate system of the robot arm so that it can pick up the part. Several factors need to be taken into account for this transformation. The routine I am about to describe takes care of scaling in the X and Y directions, rotation, and translation. To keep things simple I did not take skewed coordinates or lens distortion into account.
Basic trigonometry was used to perform the transformations.

First the camera is moved to a position at a height where the entire work area is in view. A routine is performed where the camera searches for 3 calibration dots: one representing the origin of the coordinate system, and the other 2 at arbitrary locations on the X axis and Y axis.

A calibration sheet was made using Fusion 360 and printed on an A4 piece of paper. The calibration sheet is basically a simple sketch with 3 dots of any given colour. Black dots do not work since they are not recognized by the camera.

Calibration sheet

The distance between the points is not relevant; they only need to describe 2 intersecting lines which are perpendicular to each other.

When the calibration routine is started the Pixy will register the coordinates of the dots and communicate them to the Arduino.

The locations of the calibration dots are stored in the EEPROM of the Arduino controller so they can be used later, even after the system is turned off.
The second part of the calibration procedure is to move the robot arm physically to each of the calibration points as accurately as possible. These locations are also recorded in the EEPROM.

We now have the data we need to transform coordinates in the first coordinate system to a corresponding location in the second coordinate system. This is performed as follows:

  • The angle of each system is calculated as well as the scale of the X and Y axes and the distance between the origins of both systems.
  • The coordinate is translated in X and Y by the same distance as the origin location of system 1.
  • Coordinate system 1 is rotated to make the X axis horizontal and the Y axis vertical.
  • The X and Y axes are scaled to match system 2.
  • Another rotation is made matching the angle of system 2.
  • The coordinate is translated in X and Y by the same distance as the origin location of system 2.

When all of this is done we have the location of the part in the coordinate system of the Dobot, which is required to send the robot arm to the right location.
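The steps above can be sketched as a standalone C++ function. The names (`Pt`, `cameraToRobot`) are my own and this is a plain-C++ illustration of the same math, not the actual Arduino code; it assumes both coordinate systems have perpendicular axes and the same handedness:

```cpp
#include <cmath>

struct Pt { double x, y; };

// Hypothetical standalone version of the calibration transform.
// A = origin, B = a point on the X axis, C = a point on the Y axis,
// measured once in camera coordinates (cam*) and once in robot
// coordinates (rob*).
Pt cameraToRobot(Pt p, Pt camA, Pt camB, Pt camC,
                 Pt robA, Pt robB, Pt robC) {
    // Angle of each system's X axis, and the scale of both axes
    double angCam = std::atan2(camB.y - camA.y, camB.x - camA.x);
    double angRob = std::atan2(robB.y - robA.y, robB.x - robA.x);
    double sx = std::hypot(robB.x - robA.x, robB.y - robA.y) /
                std::hypot(camB.x - camA.x, camB.y - camA.y);
    double sy = std::hypot(robC.x - robA.x, robC.y - robA.y) /
                std::hypot(camC.x - camA.x, camC.y - camA.y);
    // 1. translate so the camera origin lands on (0,0)
    double x = p.x - camA.x, y = p.y - camA.y;
    // 2. rotate so the camera X axis is horizontal
    double c = std::cos(-angCam), s = std::sin(-angCam);
    double xr = x * c - y * s, yr = x * s + y * c;
    // 3. scale both axes into robot units
    xr *= sx; yr *= sy;
    // 4. rotate to the robot frame's orientation
    c = std::cos(angRob); s = std::sin(angRob);
    double xo = xr * c - yr * s, yo = xr * s + yr * c;
    // 5. translate by the robot origin
    return { xo + robA.x, yo + robA.y };
}
```

For example, a camera frame with origin (0,0) and axis points (10,0) and (0,10), mapped to a robot frame with origin (100,50) and axis points (100,70) and (80,50), corresponds to a 90 degree rotation and a scale factor of 2.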

The calculations used in this project can be found in the Arduino code.
Note that the calibration procedure only finds the angle and XY scales between the Pixy and Dobot coordinate systems. The actual transformation takes place in the pick and place routine, each time a part is found by the camera.
Visit the link below for a useful page on 2D transformations, which are typically used for computer graphics but can also be applied to the 2D transformations in a vision system:

Arduino programming and Menu System

The Arduino program for this project can be downloaded from my Github page under the following repository:

An Arduino library is available for the Pixy2, allowing it to be integrated easily into Arduino projects.
The library for the Pixy2 can be downloaded from: and installed in the Arduino IDE by selecting Sketch -> Include Library -> Add .ZIP Library. Doing this also makes the Pixy demo sketches available under the File -> Examples menu.
For the Dobot no library needs to be installed.

The Arduino IDE generated an error message when including the Pixy library.
The error indicated a multiple definition of `__vector_15` between ZumoBuzzer.cpp (used by the Pixy library) and FlexiTimer2.cpp (used by the Dobot code), basically meaning the same interrupt vector name was used by both libraries. Since I did not need the Zumo library, I deleted all ZumoBuzzer and ZumoMotor .cpp and .h files from my Pixy library folder, which fixed the issue.

The Arduino program for this project consists of 3 main sections, which are also covered under 3 menu items on the Arduino LCD shield.
This article covers the menu items at a high level. If you would like more details on the underlying Arduino code, you can download the program through the link at the top of this section. Explanatory comments are added to many of the code lines, but if you have any questions, please leave a comment below.

During startup all previously stored user data is read from the EEPROM after which 3 menu items are available to the user:

1 – Jog Menu: used for manually operating the robot arm.
The Jog menu allows the user to set the jog increment in several steps from 0.1 mm to 20 mm. The robot arm is moved by the selected increment by pressing the up or down buttons.
All four axes can be controlled: X, Y and Z for the movement of the arm, and R for the rotation of the gripper. The “Vac” option allows the user to manually open and close the gripper, or activate the vacuum for the suction cup, depending on which end effector is selected.

2 – Run Menu: From this menu the automatic cycle can be activated, in which parts are picked from the table and dropped off at a predefined location.

3 – Settings Menu: here the user can set the following parameters, move the arm to the stored coordinates, or store new coordinates in the EEPROM of the Arduino:

  • Calibration position: this is the position where the Pixy2 camera has a full view of the area where the parts are positioned on the table.
  • Z-down position: this is the Z-position at which parts are picked from the table. This differs between the vacuum cup and the gripper and may also differ depending on the part size.
  • Calibration locations A, B and C: these are the locations used in the calibration procedure to determine the angle and scale of the axes between the Pixy2 and Dobot coordinate systems.
  • Start position: Initial pose of the arm after startup.
  • Drop off locations 1, 2 and 3: the XYZ positions for the drop off locations of the three different part signatures.
  • Start the calibration routine: covered under the calibration section above.
  • End effector type: Select if the end effector is a gripper or vacuum cup. For the gripper both vacuum and compressed air are used (open and close), while for the vacuum cup only vacuum is needed.

Pick and Place Routines

The program contains a simple pick and place routine, which can be started from menu item 2. The routine consists of a loop, designed to pick parts from the table and drop them off at the correct location:

  • The arm moves to the calibration position, which is also used to find the parts on the table in the automatic pick and place cycle.
  • The Pixy2 camera grabs a single frame and communicates the coordinates of all recognized signatures to the Arduino over the I2C connection. The coordinates are placed into an array.
  • Coordinates of the first available signature are read from the array.
  • The arm performs a series of moves where the part is picked from the table and dropped off at the correct location (the drop off locations were previously set in the Settings menu).
  • The arm moves back to the calibration position. If any other signatures are found the cycle repeats. If not, the loop is aborted and a message “zero parts found” is displayed on the LCD screen.
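The cycle above can be illustrated with a small standalone C++ simulation. The `Part` struct, the `runCycle` function and the motion steps named in the comments are hypothetical stand-ins for the actual Arduino program:

```cpp
#include <vector>

// Hypothetical simulation of the pick and place loop: each pass "scans" the
// table and removes the first available part, until no parts remain.
struct Part { int signature; double x, y; };

int runCycle(std::vector<Part>& table, int maxCycles = 100) {
    int picked = 0;
    while (!table.empty() && picked < maxCycles) {
        // The real program would move the arm to table.front().x/y, pick the
        // part, move it to the drop off location for its signature, release
        // it, and return to the calibration position for the next scan.
        table.erase(table.begin()); // part is now off the table
        ++picked;
    }
    return picked; // the loop ends when a scan finds zero parts
}
```

The `maxCycles` guard is only there to keep the simulation from looping forever if the table were never emptied.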

Final Thoughts

I have found that integrating the Pixy2 into a project seems quite daunting at first, but is actually not that hard. Most of the heavy lifting (image processing, serial communication) is done by the Pixy itself. The only thing you have to do in your Arduino project is to read the coordinates from an array and apply appropriate actions to it in your project, in this case: move a robot arm to perform pick and place tasks.

It might be good to note that the Pixy2 module is very cheap at only €60 at the time of writing, while the Dobot is quite expensive at well over €1000. Not everyone is willing to make this kind of investment to play around with robotics at home. However, the same vision system could be applied to much cheaper servo-based robot arms.

If you have any questions or comments, please leave them in the comment section below.



7 Replies to “How to Make a Vision System for the Dobot Magician with the Pixy 2 Camera”

  1. Hi Robin,
    Thanks for your amazing work.
    I made a small enhancement in your code replacing prints with the F macro to free some memory. I tried to change it via Github.
    Let me know if it worked

    1. Hi Xukyo, thanks for updating the code, the F macro is indeed very useful. The Arduino memory is tiny and easily fills up when using a lot of print commands (which I do).
      I am not able to test the code at the moment since I am on vacation, but I have seen your fork on GitHub and it looks great. Thanks again, nice to hear back from someone who is experimenting with this program.

  2. Hi Robin,
    Thanks for all the information, but can you tell me the resolution of the Pixy2 cam? Also the center coordinates x and y?

    1. Sure, the resolution is 320 by 200. The actual coordinates run from 0 to 319 and from 0 to 199. Since these are even numbers, technically there is not a single pixel representing the center. For most applications this is not an issue and you can just pick one of the 4 pixels surrounding the center, for example 160, 100, and call that your center coordinate.

  3. Hi Craig,
    Thanks for visiting my website. No problem, it might indeed be difficult to find the basic commands within the code I added. Note that I actually also started off from the Dobot example program, which is unfortunately quite limited.
    It does however include code for moving the arm when you supply the various axis values. Please have a look at the “void moveArm” in my program. I will leave some comments on it below:

    //This sets the gPTPCmd.x/y/z/r values to the values you used to call the function
    gPTPCmd.x = x;
    gPTPCmd.y = y;
    gPTPCmd.z = z;
    gPTPCmd.r = r;

    // this is the actual command for the Dobot to move
    SetPTPCmd(&gPTPCmd, true, &gQueuedCmdIndex);

    //I have added a section of code for opening and closing the gripper or activating the vacuum cup based on the end effector you are using. Note that this assumes you have declared an int variable with the name “endeffectorGripper” at the start of your program (of course you can choose any other variable name)

    if (endeffectorGripper == 1) {
      if (vacuumOn == false && vacuumOn != currentVac) {
        Serial.println("Open GRIPPER");
        SetEndEffectorSuctionCup(false, true, &gQueuedCmdIndex);
        ProtocolProcess(); //have command(s) executed by dobot
        SetEndEffectorGripper(true, true, &gQueuedCmdIndex); // open gripper (compressed air on)
        ProtocolProcess(); //have command(s) executed by dobot
        SetEndEffectorGripper(false, true, &gQueuedCmdIndex); // stop activating gripper when it is opened (compressed air off)
      }
    } else {
      if (vacuumOn == false) SetEndEffectorSuctionCup(false, true, &gQueuedCmdIndex);
      if (vacuumOn == true && vacuumOn != currentVac) SetEndEffectorSuctionCup(true, true, &gQueuedCmdIndex);
    }

    //the following command needs to be performed last to have the Dobot “process” and execute the commands:
    ProtocolProcess();

    Finally I copy the values of x/y/z and r to other variables which I use to determine the current position in other sections of the program, so this is optional for you:
    currentX = x;
    currentY = y;
    currentZ = z;
    currentR = r;
    currentVac = vacuumOn;

    The easiest way to start is probably to copy the “void moveArm” to your program above the “void loop” section, declare the necessary variables at the beginning of your program and then modify the code to your needs.

    I hope this helps.

    Happy holidays!


    Edit: GitHub code showed “if endeffectorGripper = 1”, this should be “if endeffectorGripper == 1”. (I forgot one of the equal signs)
    Let me know if you have any other questions, I will help if I can.

  4. Hi Robin,
    Great work with the Dobot and Pixy2. A couple of questions. First, I have an original PIXY, will this work with your code or did they make a lot of changes between the two Pixy versions?
    Second, As I am a teacher I have purchased a Dobot to use for demonstrations in my digital technologies classes. I had hoped to be able to program the Dobot using an arduino (rather than Python). Unfortunately there appears to be few examples of arduino control of a Dobot. I have the example sketch from and your sketch. I am wondering if you might have a basic sketch or similar of controlling movement (x, y, z) and gripper using arduino. Sorry to ask but the additional code for the LCD and PIXY2 has made your code somewhat complex for the task I had hoped to complete.
    Thanks for the reply. Craig (Brisbane, Australia)
