Thursday 17 September 2015

Augmented Reality Library for MATLAB


I have developed a simple Augmented Reality library in MATLAB. Using this library, anyone with no prior knowledge of AR can implement AR functionality in MATLAB easily, without writing any code.

The only requirements are:

  • MATLAB R2014a or later
  • Target / trigger image
  • Overlay image / video
  • Image acquisition device
A video demonstration of the library can be seen below.




The library file can be downloaded from this link.



Implementation:

  • Download the file and set the MATLAB current folder to the file's location
  • Call the function "ARinvoke" with appropriate arguments to render AR
  • Syntax (type in MATLAB command line):
    • ARinvoke('TargetImagePath', 1/2, 'OverlayObjectPath', 'ImageAcquisitionDeviceName', Device number);
  • Here, the second argument (1 or 2) selects the overlay type:
      • 1 for image overlay over the target image
      • 2 for video overlay over the target image

  • Example:
      • ARinvoke('C:\targetimage.jpg',1,'C:\overlayimage.jpg','winvideo',1);  % second argument 1 -> image overlay
      • ARinvoke('C:\targetimage.jpg',2,'C:\video.avi','winvideo',1);   % second argument 2 -> video overlay

  • The device name ('winvideo') is the webcam adaptor name in Windows. You can use any camera of your choice; use "imaqhwinfo" to list the cameras available on your device.
  • Device number 1 typically refers to a laptop's internal camera, and 2 to an external USB webcam.
Currently, only 2D object rendering is supported. I am about to start work on 3D object rendering and will update the library once it is finished.

For now, only one image can be tracked. If anyone is interested in tracking more than one image, feel free to contact me.

Note:
The rendered video overlay might play slowly depending on the processing speed of your computer and MATLAB. I have increased the video frame rate to 100 fps to compensate.

Try the library and have fun with your own custom images and videos.

Monday 17 August 2015

Gesture Controlled Robotic ARM using Microsoft Kinect and MATLAB


The Kinect sensor is one of the most amazing products from Microsoft, and it enabled me to implement my gesture-controlled tasks very easily. I have built a 3-DOF robotic arm that is controlled by human hand gestures. I did this by obtaining my wrist coordinates from the skeletal structure acquired from the Kinect sensor in MATLAB, by calling Kinect SDK functions from MATLAB. Once my arm coordinates were tracked, the coordinate system of the robotic arm could be calibrated to the X and Y axes of the wrist coordinates. The control signal from MATLAB is sent to the servo motors of the robotic arm through the Arduino-MATLAB interface.
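The calibration idea above can be sketched as a simple linear mapping from tracked wrist coordinates to servo angles. This is an illustrative Python sketch, not the actual MATLAB code; the wrist coordinate range and the servo angle range below are assumptions.

```python
# Illustrative sketch (not the actual MATLAB code): mapping a tracked
# wrist coordinate onto a servo angle by linear calibration.
# The coordinate and angle ranges are hypothetical.

def calibrate(src_min, src_max, dst_min, dst_max):
    """Return a function that linearly maps [src_min, src_max] to [dst_min, dst_max]."""
    scale = (dst_max - dst_min) / (src_max - src_min)
    def mapper(value):
        # Clamp to the calibrated range, then rescale.
        value = max(src_min, min(src_max, value))
        return dst_min + (value - src_min) * scale
    return mapper

# Suppose the Kinect reports wrist X in [-0.6, 0.6] metres and the base
# servo accepts angles in [0, 180] degrees (assumed ranges).
x_to_angle = calibrate(-0.6, 0.6, 0, 180)

print(x_to_angle(0.0))   # wrist centred -> servo near mid-position (~90 degrees)
print(x_to_angle(0.6))   # wrist at far right -> servo near 180 degrees
```

One mapper per axis (X for the base servo, Y for the elbow servo) is enough for a 2-axis calibration like the one described above.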


A demonstration of the gesture-controlled robotic arm using the Kinect can be seen below.






Saturday 4 April 2015

Vision Guided Robot- Video Processing with BeagleBone Black and OpenCV C++

This is what kept me really busy for two months! As I was new to the BeagleBone Black and to the Linux operating system, I struggled a bit during the early stage of the project, but the excellent tutorials of Dr. Derek Molloy helped me a lot with the BeagleBone Black. (This project was done as part of the Texas Instruments Innovation Challenge.)

I was aiming to develop a simple robot with the ability to bring a required object to the user. For this purpose, the robot was fitted with a robotic arm to grasp the object and a camera to sense the objects in its surroundings.

A detailed description of the various parts of the robot, along with an operational demo, can be seen in the following video.




If you are curious about the programming side of the robot, you can see the BeagleBone Black code in my GitHub repository. I have tested the code with Ubuntu 14.04 LTS and the latest version of the Debian operating system on the BeagleBone Black (BBB), and it worked fine. Object recognition with OpenCV was faster on Ubuntu than on Debian (I have no idea why recognition is slower on Debian... trying to figure it out!).


For those who would like to try this project, here are a few tips:

1. Boot your BBB with any OS of your choice following the instructions from here (I've installed Ubuntu).

2. Install OpenCV on the BBB (some OS images come with OpenCV pre-installed) using the following command,

                  sudo apt-get install libopencv-dev

3. Download my code from here to the BBB.

4. I have used simple template matching to identify objects, so choose your object of interest and specify the path to its image in the program.

5. Connect the webcam to the BBB, and make sure your webcam works with the BBB first. I was initially testing with a webcam that worked fine with my computer but not with the BBB, so I bought a new Logitech C70 webcam, which worked fine.

6. That's it! Compile the code and run it. I have coded it so that the GPIO pins (14, 15, 16, 17, 18) of the BBB's P8 header respond to the object based on its coordinates, and a log of the processed video output is stored as a media file in .avi format. If you are using Windows and PuTTY to develop the code, you can use the WinSCP software to access the logged file from your Windows machine. Also, if you are using an LXDE session on Debian, you can use OpenCV's imshow() function to watch the real-time object recognition.
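The "GPIO pins respond to the object based on its coordinates" idea in step 6 can be sketched as follows. This is an illustrative Python sketch, not the actual C++ code from the repository; the five-zone split and the 640-pixel frame width are assumptions.

```python
# Illustrative sketch of step 6: choosing a GPIO pin from the detected
# object's x-coordinate. The five-zone mapping and 640-pixel frame width
# are assumptions, not the actual logic in the repository.

GPIO_PINS = [14, 15, 16, 17, 18]   # P8 header pins mentioned in the post
FRAME_WIDTH = 640                  # assumed capture width in pixels

def pin_for_object(x):
    """Map an object's x-coordinate to one of five GPIO pins, left to right."""
    if not 0 <= x < FRAME_WIDTH:
        raise ValueError("x-coordinate outside the frame")
    zone = x * len(GPIO_PINS) // FRAME_WIDTH   # integer zone index 0..4
    return GPIO_PINS[zone]

print(pin_for_object(0))     # object at far left  -> pin 14
print(pin_for_object(320))   # object at centre    -> pin 16
print(pin_for_object(639))   # object at far right -> pin 18
```

On the BBB itself, the selected pin would then be driven high through the sysfs GPIO interface (or a GPIO library), which the sketch deliberately leaves out.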
       





Thursday 8 January 2015

Face and Eye detection, Cornea(eye center) tracking using OpenCV



I have been thinking about working on eye-gaze detection to estimate where a person is looking. So, as a first step, I need to detect the eyes and then the cornea/pupil. Since I plan to run the project on Android, I have done the coding in OpenCV so that I can reuse the same functions on Android.

Algorithm:

First, I detected the face using a Haar cascade and extracted the region of interest (ROI). I then segmented the left and right eyes using rough proportional calculations and, within each segmented region of interest, used image gradients to locate and track the cornea/pupil.
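The "rough calculations" step can be illustrated like this: carving left and right eye boxes out of a detected face rectangle using fixed facial proportions. This is a Python sketch for illustration only; the percentages are assumptions, and the actual OpenCV code may use different ratios.

```python
# Illustrative sketch of rough eye segmentation: given a face rectangle
# from a Haar cascade, estimate left/right eye regions using fixed facial
# proportions. The percentages below are assumptions for illustration.

def eye_regions(face_x, face_y, face_w, face_h):
    """Return (left_eye, right_eye) rectangles as (x, y, w, h) tuples."""
    eye_w = round(face_w * 0.30)           # each eye box ~30% of face width
    eye_h = round(face_h * 0.25)           # ~25% of face height
    eye_y = face_y + round(face_h * 0.25)  # eyes sit in the upper half of the face
    left  = (face_x + round(face_w * 0.13), eye_y, eye_w, eye_h)
    right = (face_x + round(face_w * 0.57), eye_y, eye_w, eye_h)
    return left, right

# Example: a 200x200 face rectangle detected at the image origin.
left, right = eye_regions(0, 0, 200, 200)
print(left)    # (26, 50, 60, 50)
print(right)   # (114, 50, 60, 50)
```

Gradient-based pupil localisation would then run only inside these two small boxes, which is much cheaper than searching the whole frame.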

A demonstration of the above algorithm can be seen in the video. The work is still in progress, and I will share more about the techniques used later.