How to Make a Vision System for the Dobot Magician with the Pixy 2 Camera

Vision systems and robot arms are used in many modern production lines. With cheap vision cameras now available to consumers, it has become possible to build your own vision-controlled robot arm at home.

This article describes how to add a vision system to the Dobot Magician robot arm, allowing it to identify parts automatically and place them in a designated drop-off location.

In the following sections the project is described from idea to a functioning pick and place system:

  • Connect the Pixy2 and Dobot to the Arduino
  • Camera position and mounting bracket design
  • Camera Settings and lighting
  • Camera calibration routine
  • Arduino programming and menu system
  • Pick and Place routines

Connect the Pixy2 and Dobot to the Arduino

In this project both the Pixy2 and the Dobot Magician are connected to an Arduino Mega, which contains the program for controlling the pick and place routines. The Pixy2 is connected to the I2C bus in the center of the Arduino board. The Pixy can be connected using the supplied flat cable. However, since it is too short for this application, I made a new cable of the desired length. The flat cable connectors and the cable itself were purchased from a local electronics store.

Crimping the flat cable connectors (caveman style)

Since I do not have a crimping tool I used a pair of pipe wrench pliers to crimp the connectors, which is not advisable, but it works.

To verify that all the wires made a proper connection, I checked the resistance between the connectors on each socket with a multimeter. An LCD shield is placed onto the Arduino board, with some female sockets in between to clear the serial connector below the board.

Placing the LCD shield

A serial connection is used to communicate between the Dobot and the Arduino. Please note that an Arduino Uno cannot be used for this, since it only has a single serial connection, which is used to communicate with the PC. The Arduino Mega has 3 spare serial ports, on pins 14(TX)-15(RX), 16(TX)-17(RX) and 18(TX)-19(RX).
The RX and TX pins of the Dobot interface port are connected to the TX and RX pins on the Arduino (crossed over). In this case pins 18 and 19 are used.
The Ground pin on the Dobot is connected to one of the ground pins on the Arduino.

Wire connections

Camera Position and Mounting Bracket

I have considered 2 options for where to position the camera: either a stationary setup, with the camera mounted on a separate stand, or with the camera fixed near the end effector on the arm itself. Both options have their pros and cons. The benefit of a stationary setup is that the camera can register parts while the robot is performing pick and place operations. This option leads to the quickest cycle times and may therefore be the best option for industrial use in a fixed production line. The drawback, however, is that the camera is separated from the robot arm and therefore needs to be recalibrated whenever either of the two is moved to a different position. This is why I chose to mount the camera directly onto the robot arm.

The position of the camera now only needs to be calibrated once and is still valid when the arm is picked up and moved to a different location.

I have designed a camera mounting fixture which is used to fix the Pixy to the front of the robot arm. The part was designed in Fusion 360.

Modeling the Pixy2 holder in Fusion 360

The STL file for the Pixy2 holder can be downloaded here:

There are a couple of requirements which led to the current design. The camera should be close to the end effector, while making sure it is at a position where the gripper or suction cup does not block the view of the camera. In order to align the field of view of the camera as well as possible with the movement range of the arm, I decided to place the camera on the side of the end effector instead of mounting it directly in front. The camera is also slightly rotated, since the arm is rotated at a certain angle when the camera is directly above the area where the parts are located. This angle does not need to be exact. It is only intended to get as much of the picking area as possible into the image, which would not be possible if the image was tilted. Any residual angle of the image is taken care of automatically in the calibration procedure.
A picture of the Pixy2 module was imported into the Fusion360 model for visualization purposes and more importantly, to transfer the location of the mounting holes to the fixture.

Since the module is very light, it is sufficient to use only 2 of the available mounting holes. Standard M3 screws are used to fix the module and are screwed directly into the non-threaded, undersized holes.

Camera Settings and Lighting

The Pixy2 is a small camera module with built-in part detection, which can track objects at 60 frames per second. It does not have the resolution, feature recognition or any of the other fancy features of professional vision systems, but with the features it does have it is perfectly capable of handling less demanding object detection tasks. Object detection on the Pixy2 is based on blob detection: something is recognized as an object when enough pixels of a similar, predefined colour are found in the image.

PixyMon software showing recognized signatures

Objects need to have a more or less uniform colour across the part, which differs significantly from the surroundings. The Pixy2 does not recognize the shape of an object, so part identification based on geometry, or finding the angle at which a part is rotated, is not possible. It should be noted that the Pixy can find the rotation angle of parts, but this only works if the part has 2 adjacent areas of different colors, so a part would have to carry such a color code for this to work. Functions are also available for tracking lines and reading barcodes.
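To make the blob detection idea concrete, here is a simplified standalone sketch. This is not the Pixy2's actual algorithm (which is considerably more sophisticated and runs at frame rate on the module itself); the names, tolerance and threshold are all made up for illustration. A pixel "matches" a signature when its colour is within a tolerance of a reference colour, and a frame contains an object when enough pixels match:

```cpp
#include <cstdlib>
#include <vector>

// One pixel in RGB; a "signature" is a reference colour plus a tolerance.
struct Rgb { int r, g, b; };

// A pixel matches a signature when each channel is within the tolerance.
bool matchesSignature(const Rgb& px, const Rgb& sig, int tolerance) {
    return std::abs(px.r - sig.r) <= tolerance &&
           std::abs(px.g - sig.g) <= tolerance &&
           std::abs(px.b - sig.b) <= tolerance;
}

// A frame "contains an object" when enough pixels match the signature.
bool containsObject(const std::vector<Rgb>& frame, const Rgb& sig,
                    int tolerance, int minPixels) {
    int count = 0;
    for (const Rgb& px : frame)
        if (matchesSignature(px, sig, tolerance)) ++count;
    return count >= minPixels;
}
```

This also shows why lighting matters so much: a change in exposure shifts all the pixel values, and a colour that matched its signature before may suddenly fall outside the tolerance.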

The method of lighting has a large effect on whether the camera is able to find part signatures. Parts may become overexposed when using too much light, or the opposite: the camera may not be able to recognize a color when the lighting conditions are too dim.
I have found that there is not a lot of leeway in the lighting setup, which means I have to regularly change the camera settings when the amount of ambient light changes, for example at a different time of day.
In order to get a correct exposure and good visibility of the parts, you can play around with the background color of the table, the type of light source, and the position and angle of the light source. The Pixy itself also has a built-in LED to illuminate the scene, which could be another option. I have obtained the best results during the day, just using incoming sunlight as a light source, but your mileage may vary. Linked below is a helpful article on lighting for machine vision systems:

There are a couple of settings in the PixyMon software that can also greatly help to improve part detection:

  • Camera brightness: sets the overall brightness level, which can help to better identify differently colored parts.
  • Signature range: set this for each color signature individually to prevent false positives, while still picking up as much surface area of the part as possible.
  • Settings->Camera: here you can choose whether the camera applies auto exposure and auto white balance.
  • Settings->Camera->Flicker Avoidance: enable this when using indoor lighting to eliminate flickering. However, it makes the picture much brighter and can cause overexposure; to counteract this, adjust the light intensity until a good exposure is achieved.

Camera Calibration Routine

The calibration routine makes it possible to transform XY locations in the coordinate system of the Pixy camera to the coordinate system of the Dobot. Each time a part is found by the camera, its location is transformed to the coordinate system of the robot arm so that the arm can pick up the part. Several factors need to be taken into account in this transformation. The routine I am about to describe takes care of scaling in the X and Y directions, rotation and translation. To keep things simple, I did not take into account skewed coordinates or lens distortion.
Basic trigonometry was used to perform the transformations.

First the camera is moved to a position at a height where the entire work area is in view. A routine is performed in which the camera searches for 3 calibration dots: one representing the origin of the coordinate system, and the other 2 at arbitrary locations on the X axis and Y axis.

A calibration sheet was made using Fusion 360 and printed on an A4 piece of paper. The calibration sheet is basically a simple sketch with 3 dots of any given color. Black dots do not work, since they are not recognized by the camera.

Calibration sheet

The distance between the points is not relevant; they only need to describe 2 intersecting lines which are perpendicular to each other. An example can be downloaded below:

When the calibration routine is started, the Pixy registers the coordinates of the dots and communicates them to the Arduino.

The locations of the calibration dots are stored in the EEPROM of the Arduino controller so they can be used later, even after the system is turned off.
The second part of the calibration procedure is to move the robot arm physically to each of the calibration points as accurately as possible. These locations are also recorded in the EEPROM.
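On the Arduino, `EEPROM.put()` and `EEPROM.get()` store and retrieve the raw bytes of a value at a given address, which is what allows the calibration to survive a power cycle. The standalone sketch below mimics that idea with an ordinary byte buffer; the `CalPoint` layout and function names are hypothetical, not the identifiers used in the project code:

```cpp
#include <cstdint>
#include <cstring>

// One calibration point: where the dot appears in the camera image and
// where the arm was when it touched that same dot. Layout is hypothetical.
struct CalPoint {
    float pixyX, pixyY;    // camera coordinates of the dot
    float dobotX, dobotY;  // robot coordinates of the same dot
};

// Simulated EEPROM; on the Arduino these would be EEPROM.put()/EEPROM.get().
uint8_t eeprom[64];

void eepromPut(int addr, const CalPoint& p) {
    std::memcpy(eeprom + addr, &p, sizeof(CalPoint));
}

CalPoint eepromGet(int addr) {
    CalPoint p;
    std::memcpy(&p, eeprom + addr, sizeof(CalPoint));
    return p;
}
```

With three such points (A, B and C) stored back to back, the whole calibration fits in a few dozen bytes and can be read back at startup.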

We now have the data we need to transform coordinates in the first coordinate system to a corresponding location in the second coordinate system. This is performed as follows:

  • The angle of each system is calculated, as well as the scale of the X and Y axes and the distance between the origins of both systems.
  • The coordinate is translated in X and Y by the origin location of system 1.
  • Coordinate system 1 is rotated to make the X axis horizontal and the Y axis vertical.
  • The X and Y axes are scaled to match system 2.
  • Another rotation is made to match the angle of system 2.
  • The coordinate is translated in X and Y by the origin location of system 2.

When all of this is done, we have found the location of the part in the coordinate system of the Dobot, which is required to send the robot arm to the right location.
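The steps above can be sketched in plain C++. This is a minimal illustration under the same simplifications (no skew, no lens distortion); the function name and parameters are my own, not the identifiers used in the project's Arduino code:

```cpp
#include <cmath>

struct Point { double x, y; };

// Transform a point from camera (system 1) to robot (system 2) coordinates,
// following the steps above: translate, rotate, scale, rotate, translate.
// origin1/angle1 describe the calibration dots as seen by the camera,
// origin2/angle2 as reached by the robot; scaleX/scaleY convert camera
// units to robot units.
Point cameraToRobot(Point p,
                    Point origin1, double angle1,
                    Point origin2, double angle2,
                    double scaleX, double scaleY) {
    // 1. Translate so the camera-frame origin sits at (0, 0).
    double x = p.x - origin1.x;
    double y = p.y - origin1.y;

    // 2. Rotate system 1 back so its axes are horizontal/vertical.
    double c = std::cos(-angle1), s = std::sin(-angle1);
    double xr = x * c - y * s;
    double yr = x * s + y * c;

    // 3. Scale the axes from camera units to robot units.
    xr *= scaleX;
    yr *= scaleY;

    // 4. Rotate into the orientation of system 2.
    c = std::cos(angle2); s = std::sin(angle2);
    double xr2 = xr * c - yr * s;
    double yr2 = xr * s + yr * c;

    // 5. Translate to the origin of system 2.
    return { xr2 + origin2.x, yr2 + origin2.y };
}
```

For example, with both systems unrotated, a scale of 2 and a robot origin of (10, 20), the camera point (3, 4) ends up at (16, 28).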

The calculations used in this project can be found in the Arduino code.
Note that the calibration procedure only finds the angle and XY scales between the Pixy and Dobot coordinate systems. The actual transformation takes place in the pick and place routine, each time a part is found by the camera.
Visit the link below for a useful page on 2D transformations, which are typically used for computer graphics, but can also be applied in a vision system:

Arduino programming and Menu System

The Arduino program for this project can be downloaded from my Github page under the following repository:

An Arduino library has been made available for the Pixy2, allowing it to be integrated easily into Arduino projects.
The library for the Pixy2 can be downloaded here: and installed in the Arduino IDE by selecting Sketch -> Include Library -> Add .ZIP Library. Doing this also makes the Pixy demo sketches available under the File -> Examples menu.
For the Dobot no library needs to be installed.

The Arduino IDE generated an error message when including the Pixy library.
The error indicated a multiple definition of `__vector_15` between ZumoBuzzer.cpp (used by the Pixy library) and FlexiTimer2.cpp (used by the Dobot code), basically meaning the same interrupt vector name was used by both libraries. Since I did not need the Zumo library, I deleted all ZumoBuzzer and ZumoMotor .cpp and .h files from my Pixy library folder, which fixed the issue.

The Arduino program for this project consists of 3 main sections, which are also covered under 3 menu items on the Arduino LCD shield.
This article covers the menu items at a high level. If you would like to have more details on the underlying Arduino code, you can download the program through the link at the top of this section. Explanations are added behind many of the code lines, but if you have any questions, please leave a comment below.

During startup all previously stored user data is read from the EEPROM after which 3 menu items are available to the user:

1 – Jog Menu: used for manually operating the robot arm.
The Jog menu allows the user to set the jog increment in several steps from 0.1 mm to 20 mm. The robot arm is moved with the selected increment by pressing the up or down buttons.
All four axes can be controlled: X, Y and Z for the movement of the arm, and R for the rotation of the gripper. The “Vac” option allows the user to manually open and close the gripper, or activate the vacuum for the suction cup, depending on which end effector is selected.

2 – Run Menu: From this menu the automatic cycle can be activated, in which parts are picked from the table and dropped off at a predefined location.

3 – Settings Menu: here the user can set the following parameters, move the arm to the coordinates below, or store the coordinates in the EEPROM of the Arduino:

  • Calibration position: this is the position where the Pixy2 camera has a full view of the area where the parts are positioned on the table.
  • Z-down position: this is the Z-position at which parts are picked from the table. This is different for the vacuum cup and the gripper and may also differ depending on the part size.
  • Calibration locations A, B and C: these are the locations used in the calibration procedure to determine the angle and scale of the axes between the Pixy2 and Dobot coordinate systems.
  • Start position: Initial pose of the arm after startup.
  • Drop off locations 1, 2 and 3: the XYZ positions for the drop off locations of the three different part signatures.
  • Start the calibration routine: covered under the calibration section above.
  • End effector type: select whether the end effector is a gripper or a vacuum cup. For the gripper both vacuum and compressed air are used (to close and open it), while for the vacuum cup only vacuum is needed.

Pick and Place Routines

The program contains a simple pick and place routine, which can be started from menu item 2. The routine consists of a loop, designed to pick parts from the table and drop them off at the correct location:

  • The arm moves to the calibration position, which is also used to find the parts on the table in the automatic pick and place cycle.
  • The Pixy2 camera grabs a single frame and communicates the coordinates of all recognized signatures to the Arduino through the serial connection. The coordinates are placed into an array.
  • Coordinates of the first available signature are read from the array.
  • The arm performs a series of moves in which the part is picked from the table and dropped off at the correct location (the drop-off locations are set beforehand in the Settings menu).
  • The arm moves back to the calibration position. If any other signatures are found the cycle repeats. If not, the loop is aborted and a message “zero parts found” is displayed on the LCD screen.
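The loop above can be sketched as follows. Everything here is a simplified stand-in: the coordinate transform is a placeholder and the function names are invented for illustration, but the control flow (grab a frame, pick the first signature, stop when nothing is found) mirrors the routine described above:

```cpp
#include <vector>

// One detected part: the Pixy signature number and its camera coordinates.
struct Block { int signature; double x, y; };
struct Point2 { double x, y; };

// Placeholder stand-ins for the real routines (names are invented):
Point2 toRobotCoords(double x, double y) { return { x * 0.5, y * 0.5 }; }
std::vector<Point2> pickLog;  // records where this sketch "picked" a part

void pickAndPlace(const Block& b) { pickLog.push_back(toRobotCoords(b.x, b.y)); }

// The automatic cycle: each iteration is one "frame" grabbed at the
// calibration position; stop as soon as a frame contains no signatures.
void runCycle(const std::vector<std::vector<Block>>& frames) {
    for (const auto& frame : frames) {
        if (frame.empty()) break;     // "zero parts found" -> abort the loop
        pickAndPlace(frame.front());  // handle the first signature in the array
        // ...the real program then moves back to the calibration position
    }
}
```

In the actual Arduino program the frames come from the Pixy over I2C rather than from a prepared list, and each pick ends at the drop-off location stored for that signature.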

Final Thoughts

I have found that integrating the Pixy2 into a project seems quite daunting at first, but is actually not that hard. Most of the heavy lifting (image processing, serial communication) is done by the Pixy itself. The only thing you have to do in your Arduino project is read the coordinates from an array and act on them, in this case by moving a robot arm to perform pick and place tasks.

It might be good to note that the Pixy2 module is very cheap at only €60 at the time of writing, while the Dobot is quite expensive at well over €1000. Not everyone is willing to make this kind of investment to play around with robotics at home. However, the same vision system could be applied to much cheaper servo-based robot arms.

If you have any questions or comments, please leave them in the comment section below.



68 Replies to “How to Make a Vision System for the Dobot Magician with the Pixy 2 Camera”

  1. Hi Robin, I want to ask about my suction cup which is not working 100%. Is it a coding error or another factor? I have followed the instructions as above. Can we discuss via email?
    Thank You

    1. Hi, there have been other users with vacuum activation issues. I am not sure if I can resolve this via chat or email. The best thing to do is to check if the vacuum cup works with manual activation and then with a simple program, to isolate the cause of the issue. If a simple program works (for example turning it on and off at 1 second intervals) but your regular program does not, at least you will know that the hardware is not the issue.

  2. Hello, sir robin, I already set all calibration. The camera also can detect the color that I set. But when it runs, it picks somewhere else. And may I know where the point b_calib and C_calibrate? Any ideas? Thanks sir.

    1. After calibration, go to each of the calibration points by selecting the goto option to make sure calibration was successful. Also check the live feed from the Pixy during the automatic cycle to make sure it is not reading random noise as a signature. If that is the case, increase the light in your room. Hope this helps, Robin

  3. Hi, robin. For the a, b, and c positions, are we need to go to the z position 1st? Because I already set a, b, and c positions, but when I go to cal a_xy, it is just above the green point A but when I go to z position first, then the dobot goes to the exact position that has been set. Any idea?

    1. Hi, the calibration for a, b and c is only done on X and Y. The Z when you are calibrating is completely ignored because this is not needed. The only Z position that is stored is the Z-down position. When you press the button to go to a_xy it stays at the Z position where it currently is. If you press goto Z-down first, the Z-down will be the current z position. I wanted the user to select the z position they want for evaluating the calibration positions, without the robot going to a fixed position each time. Hope this helps.

  4. Hi, robin. I already set all positions, but when I try to start a program, the dobot cannot detect the colors block that I already set at pixymon software. Its move randomly. Anything that is missed?

    1. Hi, a few ideas: if you go back to the calibration menu and press the buttons for going to positions a, b and c, does the arm actually go to the correct points? This is to check if the calibration went OK. Also, please check the live view in the Pixy software when the robot is in automatic cycle. Does it find a part, or is it identifying some random noise as a signature? In that case, make sure the lighting is sufficient and constant. Hope this helps to solve the issues.

  5. Hi, robin. Have a question.
    Are we need to set the cal pos?
    Are we need to position the pixy for the calibration part?
    Are we need to set a signature (pixy) when we do the calibration?

    I have failed this part.


    1. Hi Ril, I am not sure what specifically is going wrong in your setup, but yes, all of the steps you mentioned are needed to find the part and to translate the camera coordinates to the robot coordinates.

  6. hi robin, thanks for the response. I try to delete the zemobuzzer file, but when I try to verify the code after deleting the zemobuzz, the file keeps coming. Is there anything that I miss? thanks.

    1. Hi, another suggestion: search for “zumobuzzer” in all open tabs in your arduino IDE when you have the project open. Then comment out or delete that line.

  7. Hi, robin. Thanks for the instruction about dobot and pixy. I have a problem running the Arduino code. This happens when I try to verify the code. if there something that I miss?

    C:\Users\shahril\AppData\Local\Temp\arduino-sketch-96D5972E4BE914E3BC8A8A43AE9803F5\libraries\Arduino-Pixy2-Dobot-Magician-Pick-and-Place-master\FlexiTimer2.cpp.o (symbol from plugin): In function `FlexiTimer2::set(unsigned long, double, void (*)())':
    (.text+0x0): multiple definition of `__vector_15'
    C:\Users\shahril\AppData\Local\Temp\arduino-sketch-96D5972E4BE914E3BC8A8A43AE9803F5\libraries\Pixy2\ZumoBuzzer.cpp.o (symbol from plugin):(.text+0x0): first defined here
    collect2.exe: error: ld returned 1 exit status

    exit status 1

    Compilation error: exit status 1

    1. Hi Shahil,
      I recall having seen a similar issue. There is probably a conflict between identical symbol names used by different libraries. If I recall correctly, you can delete the ZumoBuzzer file shown in the error message, since it is not needed for the project. Make sure you make a backup copy of the file just in case. Reload the project and see if the error is still there. Hope this works. Best, Robin

  8. Please can I ask, where is the coordinate system 2 (of dobot) obtained. Used to convert between dobot coordinate system and pixy2 . coordinate system

    1. The robot has its own coordinate system, which is fixed for the robot arm and defines its coordinates at any specific location. So I am not changing anything in the coordinate system of the robot, nor in that of the Pixy. The only thing I am doing is a conversion between the 2 coordinate systems. Hope that makes sense.

        1. The A4 paper creates a common reference between the coordinate system of the Pixy and that of the robot. For the Dobot the reference is made by going to each of the green dots; for the Pixy it is made by image recognition of the 3 dots. For both systems the A4 sheet does not have to be aligned. Some math is used to make the translation between the Pixy and Dobot coordinate systems. Once the factors for scale, translation and rotation are known, all points within the field of view of the camera can be converted to coordinates for the Dobot. I hope this provides some more insight.

    1. Hi, I am not sure what you mean by each step. If you mean directly controlling the steppers I am not sure if that is possible. What I did was control the coordinates and let the robot figure out how many steps are needed for each joint.

  9. Hi Robin, I’m having a problem doing this project, I have calibrated it correctly, the image processing is clear, but the robot still can’t attract the right center of the object? What should I do now? Can you help me?

    1. Did you also calibrate the signatures for the colors of the actual parts that you are using in the pixy software? Also make sure that the signature numbers in the pixy software match the signature numbers in the Arduino code. In principle, if the pixy software can detect the signature it has to come up with a coordinate for the Arduino. The signature should be a nice stable box around your part and you should not see any other signatures jumping around on the screen. If so, adjusting the lighting should help. I hope retraining signatures resolves the issue.

  10. Hi Robin, I followed you and I’m having a problem with the dobot going down to pick up the missing object. Can you help me!

    1. Hi, sure I can try to help you. Can you give me more details on the issue? Am I understanding correctly that it is finding an object that is not actually there? In that case you can tweak the settings in the pixy software to only recognize parts with a minimum size. Anything smaller will be considered noise and therefore ignored. If there is another issue please let me know. Robin

      1. During calibration is it mandatory for pixy to receive green squares with squares around them. When I calibrate the squares surrounding the green box are blinking. Does that have any effect?

        1. The square indicating the found signature should be steady and about as large as the part you are trying to find. If it is not stable, make sure you have enough light. Otherwise it is difficult for the camera to distinguish colors. After adjusting any light source teach the signature again until it is stable. If it is blinking you will not get good results either in calibration or in actual use with other color signatures. There are a lot of settings in the pixy software, play with them to see what gives you a stable result.

          1. I made many attempts to read the coordinates, but was not successful. Also the dobot forum could not help with this issue. I finally gave up on this feature. It should be possible though, but sorry I have no solution for this.

          2. Hi Robin. I have a problem that naturally, dobot running in automatic mode MENU 2 outputs DobotX, DobotY coordinates, which are very large, about 16000; even though i did the same calibration steps as you

          3. When you run the calibration and leave the Arduino connected to the PC, the Arduino IDE debug window should show the x and y for the calibration dot after you just saved it. Are those values normal? If you select “goto calibration point” from the menu on the Arduino, does it go back to the correct point or is it then also going to the wrong place?

          4. I ran at first it on benchmarks. But after the error it does not return to the standard point, the x and y coordinates are changed

          5. This looks odd. If I understand correctly, the robot does not return to the same point, but is that in coordinates or in the physical world? It would be strange if the coordinates change. Do you get any errors when compiling? Did you change anything in the program? It is hard for me to troubleshoot this without seeing what happens. If you go through the steps one by one and see where it starts to fail, that might help. For example, go back and forth between 2 (calibration) points to see if it keeps returning to the same point. Go back and forth between a point and the home position. Keep watching the display and see when it starts to deviate. Is it only one axis or all axes? If nothing helps, try to restart with a fresh copy of the program from GitHub. Try to find out if it is a scaling error or if the error keeps going in the same direction. Hope this helps. If you have more details I might be able to help further.

          6. Hi, I had to look this up, it has been a while, but this should be it: C is top left, A is bottom left and B is bottom right. So, AB should be 180 and AC should be 120. Note that the image is also rotated 90 degrees between camera and actual world, so this might be confusing if you look it up in the code. If the calibration procedure worked you should see the distances mentioned above (of course with some deviation due to various factors, but roughly these values).

  11. Hi

    Is it possible to set up 2 Pixy2 cameras on one Dobot arm? I want to have one mounted on the robot itself, the other on a stand, both doing separate inspection tasks.

    1. Good question. I think it is possible, but the Arduino Mega only has a single I2C bus. Apparently these devices can be daisy-chained, but I am not sure if you will be able to run the input of 2 cameras on a single Arduino simultaneously. If not, you can always use a separate Arduino for the other camera and let the Arduinos talk to each other.

  12. I have a problem with the calibration. The LCD keypad indicates that the calibration has failed. I can’t get the calibration sheet to recognize the green circles from the camera. Do you have a tip for me?

    1. Hi, stable lighting is very critical with the Pixy. Make sure that when you view the image with the PixyMon software, most of the green circle is identified as being part of the signature. Otherwise, re-teach the signature. It also helps to play with the various camera settings in the PixyMon software to see which setting improves image detection with your lighting setup. The green circles are very even in color and flat, so it should pick up the whole circle.

    2. Hello Robin,
      I’m Neng Ayu from Indonesia.
      Thank you for making this tutorial, it really helped me to develop my dobot.

      I have many problems when I try to follow your tutorial but I never give up. I always try again and again. After 2 weeks I can actually connect dobot with pixycam. Then for those who have problems while following this tutorial, please just follow all the rules that Robin gave us, especially in calibration. this is very important.
      When you try to calibrate you have to calculate the distance between the green dots, i.e. 180, and 120. While not exact, it may be close to the sum of the distances. And it worked for me.

      But I have another problem that I can’t solve. The problem is in the gripper. sometimes the grip works well and then it doesn’t.
      the gripper only works for 3 times, in 4 times the gripper can’t grip the object. then in 5 times, the gripper works fine again.
      What should I do?
      Robin please help me to solve my problem.

      Thank You

      1. Hi Robin,
        we have the same problem. The pump does not start in the program. It runs 2-3 times and then again for a long time not or not at all. But I can control the pump via the robot, where the pump reacts. When I do that, the pump switches off appropriately. Can you tell me something about that? Robin do you have an idea? Many greetings

        1. Hi Nicole,

          Looking in the comments I can’t find a reply to Neng, so Neng sorry about that, I think I missed your comment.
          I recall sometimes having some issues with the vacuum pump being unresponsive, but later on in the project I never had issues again, so that would indicate the program itself is OK. This leaves specific settings, firmware or hardware issues as options. Do you have the Arduino IDE serial monitor set to the same baud rate as indicated in the program? I have set it to 115200 baud, to accommodate the large amount of data sent to the serial monitor and the communication with the Pixy. You mention the vacuum does work when operating the robot; I assume this is through the Dobot software? Does the vacuum also respond properly when operating it manually from the Arduino jog menu?
          Do you still have the Dobot connected to the PC when running the Arduino program? In that case, try running the Dobot without the USB connection to the PC and only connected to the Arduino; this would exclude any interfering commands. If you set the end effector to gripper, does it operate normally, with a sequence of compressed air on for opening (then stop) and vacuum on for closing (then stop)? Another option is firmware: have you updated the Dobot Magician to the latest firmware? Also, try to home the Dobot each time you turn it on. This defines the home position, but might also help to reset it to some default settings (not sure, but it does not hurt to home it). I always use the key on the back to home it. If I think of anything else I will add it here.
          I hope some of this does the trick and you end up getting it to work. Let me know if you have any other questions.

          1. Hi Nicole, I just thought of something else, which is very important. (I think this is one of the most likely causes for your issue)
            I was not able to get a feedback signal from the dobot, so I used delays in the automated cycle. If for some reason the arm does not reach its destination before the delay expires it might actually miss the next command, which would be to open or close the vacuum pump for the gripper/vacuum cup. So, there are 2 options to solve this:
            1: reduce speed of the arm. Find the following lines in the code
            gPTPCommonParams.velocityRatio = 50;
            gPTPCommonParams.accelerationRatio = 50;
            Lower the values to slow down the arm, then recompile and upload the program again.
            2: increase the various delays in the program. If you are working with the older “arduino-shield” program you would have to manually change the delays under the “menu 2” section of the arduino program. Just double all delays to see if it helps, you can always dial them back again.
            If you are using the newer Arduino program from the controller with touchscreen then it is easier. In that case just increase the delay time or reduce speed in the “speed” menu on the touchscreen interface.
            In both cases, start with a conservative setting and optimize from there. Check whether the arm has come to a complete stop above the part it needs to pick up; if it has, the delay time is long enough.
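            To make the two options concrete, here is a small stand-alone sketch (a hypothetical helper of mine, not part of the original program) of how the delay time could be tied to the speed setting, so a slower arm automatically gets a longer wait:

```cpp
// Hypothetical helper, not in the original sketch: scale a base travel
// delay when the arm speed is lowered. baseDelayMs is assumed to be tuned
// for the sketch default of velocityRatio = 50, so halving the speed
// roughly doubles the wait before the next gripper/vacuum command fires.
unsigned long scaledDelayMs(unsigned long baseDelayMs, int velocityRatio) {
    if (velocityRatio < 1) velocityRatio = 1;  // guard against divide-by-zero
    return baseDelayMs * 50UL / (unsigned long)velocityRatio;
}
```

            With this in place, lowering gPTPCommonParams.velocityRatio from 50 to 25 would turn a 1000 ms wait into a 2000 ms wait. The 1/speed scaling is only a rough assumption, so still verify visually that the arm has stopped before the pick command executes.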

    1. Hi, I have just added the Calibration sheet in PDF format to the article, in the paragraph under the picture where the sheet is shown.

  13. Hi Robin,

    My friends and I are trying to build the project from your article. We got stuck at the calibration step even though we have all the wires connected. We have prepared the calibration sheet and we don’t know what to do next.
    We have tried to use the Start Calibration option after setting the Calibration position to view the whole sheet.
    The result was FAILED. We also tried to manually set the Start Position, A_xy position, B_xy position, C_xy position and Z-axis position, and then tried Start Calibration again, but again got a failed result.
    What are we doing wrong?
    Could you please give us some tutorial about calibration process, because we think we missed something.
    Thanks in advance for your help.

    1. Hi Sebastian,
      The calibration routine is looking for signature 7 as found by the Pixy. In your case it looks like this signature is not found (or actually, that the 3 signatures are not found). Did you check in the PixyMon software whether the camera sees the 3 calibration dots and indicates them as signature #7?
      If you can see them in the PixyMon software it should work. If you are still having issues when everything is set correctly, please check the serial monitor when running the program and verify whether the signatures are found and whether their coordinates make sense.
      Hope this helps,

        Hi Robin. “In your case it looks like this signature is not found (or actually the 3 signatures are not found). Did you check in the PixyMon software if the camera sees the 3 calibration dots and indicates them as signature #7?” (which you mention in the comment). Do these 3 signatures need to be set in the PixyMon software? Can they have any name?

          Hi Wan, yes, you need to train the calibration dots in the Pixy software. They also need to be signature #7, because the Arduino is looking for this specific signature number during the calibration routine. It also checks whether 3 of these signatures are found, otherwise the calibration aborts. The signature number is included in the Arduino code and can only be changed there if needed. Note that the blue dot in the center is not needed for anything and does not need to be trained. And yes, if I recall correctly they can have any name, as long as the signature is #7. Search for the calibration routine in the Arduino code, which will hopefully provide more details on how it works. Robin

          1. Hi Robin. What do you mean by signature #7? Do we need to set the signature, for example
            (calibration_circle_green #7)? Sorry Robin, I’m still confused, but I’m almost there with completing the calibration. Thanks again

          2. Hi, no problem. I did have to look it up in my own code since it has been a while since I have seen it.
            In the beginning of the Arduino code there is a line that says: const int signatureCalib = 7;
            This assigns a variable to signature 7 from the Pixy. This is also the 7th signature in the PixyMon software (the name does not matter, but it makes sense to call it something like “calibration_circle_green”). So if you want to change this for any reason, change the 7 in this line to a different number. I used 7 because it is far away from the lower numbers, which are a more logical choice for the parts you want to pick (and this prevents you from mistakenly assigning a part signature to the same number as a calibration signature).

            Further down in the code you will see
            “pixy.ccc.getBlocks();” in a couple of places. This makes the camera capture an image and put all found signatures in an array. The calibration routine then looks for signature #7 (the signatureCalib variable) and does some math to calibrate the coordinate system between the Pixy and the real world. Hope this answers your question. I can understand some of these things might be a bit confusing; I was a bit confused as well many times during the project 🙂 Hope this helps, Robin
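            As an illustration of what that check amounts to, here is a minimal stand-alone sketch. The Block struct mirrors only the fields used here and the helper name is mine, not the Pixy2 library's:

```cpp
// Stand-alone illustration of the calibration check: count how many of the
// blocks reported by the Pixy carry the calibration signature. The real
// routine aborts unless exactly 3 are found.
struct Block {     // simplified stand-in for the Pixy2 block data
    int signature; // trained signature number (calibration dots use 7)
    int x;         // pixel coordinates of the block center
    int y;
};

const int signatureCalib = 7;  // same constant as in the Arduino code

int countCalibDots(const Block blocks[], int numBlocks) {
    int found = 0;
    for (int i = 0; i < numBlocks; i++) {
        if (blocks[i].signature == signatureCalib) {
            found++;
        }
    }
    return found;  // calibration should proceed only when this equals 3
}
```

            If this count is anything other than 3 (a dot is not detected, or a part on the table happens to share the calibration signature), the calibration fails, which matches the FAILED result described above.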

  14. Sorry friend, I cannot upload the sketch to my Arduino Mega board.
    It says “error compiling for board Arduino Mega or Mega 2560, exit status 1”.

  15. Hello Robin,
    so the software is on the Arduino and running. Thanks for your help, but I still have two questions:
    1. Could you possibly send me the calibration sheet for printing?
    2. I use the suction cup, not the gripper. If I now switch Vac from 0 to 1 in the manual jog, nothing happens; when I switch back to 0, the vacuum motor runs for a moment and then switches off again. I would like to pick up Smarties, sort them and put them into vessels by color.
    What do I have to change in the software for the suction cup?
    I have previously changed the code so that it reads “if (endeffectorGripper==false)”, otherwise the vacuum motor did not start at all!

  16. Hello Robin,
    I downloaded your software from here “”
    and also the library for the Pixy2 cam here “”.
    But when I want to upload the Arduino code to the Mega, it always fails with the following error message:
    “redefinition of ‘class PIDLoop’”
    What am I doing wrong?
    I would be happy if you could help me further.
    Greetings Reiner

    Translated with (free version)

    1. Hi, I believe this indicates the PIDLoop class is defined in 2 different places in your project, which generates this error. If you do a search across all open tabs you should find the duplicate definition. See if you can delete one of them.

  17. Hi Robin,
    Thanks for your amazing work.
    I made a small enhancement in your code replacing prints with the F macro to free some memory. I tried to change it via Github.
    Let me know if it worked

    1. Hi Xukyo, thanks for updating the code, the F() macro is indeed very useful. The Arduino memory is tiny and easily fills up when using a lot of print commands (which I do).
      I am not able to test the code at the moment since I am on vacation, but I have seen your fork on GitHub and it looks great. Thanks again, it is nice to hear back from someone who is experimenting with this program.

  18. Hi Robin,
    Thanks for all the information, but can you tell me the resolution of the Pixy2 cam? Also the center coordinates x and y.

    1. Sure, the resolution is 320 by 200. The actual coordinates run from 0 to 319 and from 0 to 199. Since these are even numbers, technically there is not a single pixel representing the center. For most applications this is not an issue and you can just pick one of the 4 pixels surrounding the center, for example (160, 100), and call that your center coordinate.
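    For anyone who wants to work relative to that chosen center, here is a tiny sketch (the helper names are mine, not from the Pixy2 library) that expresses a detected pixel position as an offset from (160, 100):

```cpp
// Express a Pixy pixel coordinate as an offset from the chosen nominal
// center (160, 100) of the 320x200 frame. Negative values mean left/above
// the center, positive values right/below. Helper names are hypothetical.
const int CENTER_X = 160;
const int CENTER_Y = 100;

int offsetFromCenterX(int pixelX) { return pixelX - CENTER_X; }
int offsetFromCenterY(int pixelY) { return pixelY - CENTER_Y; }
```

    Offsets like these are a convenient intermediate step before converting camera pixels to real-world robot coordinates in a calibration routine.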

  19. Hi Craig,
    Thanks for visiting my website. No problem, it might indeed be difficult to find the basic commands within the code I added. Note that I actually also started off from the Dobot example program, which is unfortunately quite limited.
    It does, however, include code for moving the arm along the various axes. Please have a look at the “void moveArm” function in my program. I will add some comments to it below:

    //This sets the PTPCmd.x/y/z/r/ values to the values you used to call the function
    gPTPCmd.x = x;
    gPTPCmd.y = y;
    gPTPCmd.z = z;
    gPTPCmd.r = r;

    // this is the actual command for the Dobot to move
    SetPTPCmd(&gPTPCmd, true, &gQueuedCmdIndex);

    //I have added a section of code for opening and closing the gripper or activating the vacuum cup based on the end effector you are using. Note that this assumes you have declared an int variable named “endeffectorGripper” at the start of your program (of course you can choose any other variable name)

    if (endeffectorGripper == 1) {
      if (vacuumOn == false && vacuumOn != currentVac) {
        Serial.println("Open GRIPPER");
        SetEndEffectorSuctionCup(false, true, &gQueuedCmdIndex);
        ProtocolProcess(); //have command(s) executed by dobot
        SetEndEffectorGripper(true, true, &gQueuedCmdIndex); // open gripper (compressed air on)
        ProtocolProcess(); //have command(s) executed by dobot
        SetEndEffectorGripper(false, true, &gQueuedCmdIndex); // stop activating gripper when it is opened (compressed air off)
      }
    }

    // for the vacuum cup end effector, switch the pump off or on:
    if (vacuumOn == false) SetEndEffectorSuctionCup(false, true, &gQueuedCmdIndex);

    if (vacuumOn == true && vacuumOn != currentVac) SetEndEffectorSuctionCup(true, true, &gQueuedCmdIndex);

    //the following command needs to be performed last to have the Dobot “process” and execute the commands:
    ProtocolProcess();

    Finally, I copy the values of x/y/z and r to other variables, which I use to determine the current position in other sections of the program, so this part is optional for you:
    currentX = x;
    currentY = y;
    currentZ = z;
    currentR = r;
    currentVac = vacuumOn;

    The easiest way to start is probably to copy the “void moveArm” to your program above the “void loop” section, declare the necessary variables at the beginning of your program and then modify the code to your needs.
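    As a hedged illustration of that last step, a basic pick sequence could look like the sketch below. I am assuming a moveArm(x, y, z, r, vacuumOn) signature matching the snippet above, and the Dobot calls are stubbed out so the sequence can be checked without hardware:

```cpp
// Stubbed illustration of calling moveArm() in a pick sequence. The real
// function would fill gPTPCmd and call SetPTPCmd()/ProtocolProcess();
// this stub only records the commanded position and vacuum state.
float currentX, currentY, currentZ, currentR;
bool currentVac = false;

void moveArm(float x, float y, float z, float r, bool vacuumOn) {
    // real sketch: gPTPCmd.x = x; ... SetPTPCmd(&gPTPCmd, true, &gQueuedCmdIndex);
    currentX = x; currentY = y; currentZ = z; currentR = r;
    currentVac = vacuumOn;
}

// Approach above the part, descend, switch the vacuum on, and lift.
void pickAndLift(float x, float y, float zPick, float zSafe) {
    moveArm(x, y, zSafe, 0, false);
    moveArm(x, y, zPick, 0, false);
    moveArm(x, y, zPick, 0, true);
    moveArm(x, y, zSafe, 0, true);
}
```

    On real hardware you would add a delay after each move so the arm reaches its destination before the next command, as discussed in the earlier comments.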

    I hope this helps.

    Happy holidays!


    Edit: the GitHub code showed “if endeffectorGripper = 1”; this should be “if endeffectorGripper == 1” (I forgot one of the equal signs).
    Let me know if you have any other questions, I will help if I can.

  20. Hi Robin,
    Great work with the Dobot and Pixy2. A couple of questions. First, I have an original Pixy; will this work with your code, or did they make a lot of changes between the two Pixy versions?
    Second, as I am a teacher, I have purchased a Dobot to use for demonstrations in my digital technologies classes. I had hoped to be able to program the Dobot using an Arduino (rather than Python). Unfortunately there appear to be few examples of Arduino control of a Dobot. I have the example sketch from and your sketch. I am wondering if you might have a basic sketch or similar for controlling movement (x, y, z) and the gripper using an Arduino. Sorry to ask, but the additional code for the LCD and Pixy2 has made your code somewhat complex for the task I had hoped to complete.
    Thanks for the reply. Craig (Brisbane, Australia)

Leave a Reply

Your email address will not be published. Required fields are marked *