This section describes projects I completed during my MS and during my four years of undergraduate study.
This project was part of the Mobile Robotics course (CSE 483, Monsoon 2012). The objective was to demonstrate the concept of the velocity obstacle for collision avoidance between multiple mobile robots. If a robot detects that its velocity vector lies inside the collision cone with respect to another robot, it modifies its velocity to move out of the cone and avoid a potential collision.
The code was tested in simulation using MobileSim for multiple cases and was implemented on a couple of AmigoBots. This video shows a MobileSim simulation of the velocity obstacle; the simulated robot is a model of an AmigoBot. Two obstacles (the robots at the bottom and at the right) move towards the robot travelling in the forward direction.
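The collision-cone check described above can be sketched in a few lines. This is a minimal Python illustration (the original project used MobileSim and AmigoBots); the function name and the circular-robot assumption are mine:

```python
import math

def in_collision_cone(p_a, p_b, v_a, v_b, r_combined):
    """Check whether robot A's velocity lies inside the velocity-obstacle
    cone induced by robot B, modelling both robots as discs whose radii
    sum to r_combined."""
    # Relative position of B w.r.t. A, and relative velocity of A w.r.t. B
    dx, dy = p_b[0] - p_a[0], p_b[1] - p_a[1]
    rvx, rvy = v_a[0] - v_b[0], v_a[1] - v_b[1]
    dist = math.hypot(dx, dy)
    if dist <= r_combined:
        return True  # already overlapping
    # Half-angle of the collision cone subtended by the combined disc
    half_angle = math.asin(r_combined / dist)
    # Angle between the relative velocity and the line of centres
    angle = math.atan2(rvy, rvx) - math.atan2(dy, dx)
    angle = math.atan2(math.sin(angle), math.cos(angle))  # wrap to [-pi, pi]
    # A collision is imminent only if A moves towards B inside the cone
    return abs(angle) < half_angle and (rvx * dx + rvy * dy) > 0
```

When this test returns true, the robot picks a new velocity just outside the cone boundary, which is the avoidance step the demo shows.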
This project was also part of the Mobile Robotics course. The objective was to localize a robot with respect to a known map of point features.
The state of the robot is parameterized as a Gaussian. By combining two noisy sources of measurement (odometry and sensor readings) with a Kalman filter, the robot localizes itself against the known map features. The simulation was done in MATLAB, and the hardware implementation used a single AmigoBot.
This video shows the MATLAB simulation of robot localization using an Extended Kalman Filter. Blue circles represent the map feature points. Two moving circles represent the predicted (red) and actual (green) positions of the robot, and the red ellipse represents the uncertainty of the robot's position (the covariance matrix). As the robot observes a map feature point, it updates its predicted position, which then stays very close to the actual one.
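The predict/update cycle that fuses odometry with sensor readings can be illustrated in one dimension (the actual project ran a full EKF over the robot's 2-D pose in MATLAB; this Python sketch only shows the underlying Gaussian fusion, with function names of my own choosing):

```python
def kalman_predict(mean, var, motion, motion_var):
    """Propagate the Gaussian estimate through a noisy odometry step:
    the mean moves, and the variance (uncertainty) grows."""
    return mean + motion, var + motion_var

def kalman_update(mean, var, meas, meas_var):
    """Fuse the Gaussian estimate with a noisy sensor measurement:
    the mean shifts towards the measurement, and the variance shrinks."""
    k = var / (var + meas_var)           # Kalman gain
    new_mean = mean + k * (meas - mean)  # corrected estimate
    new_var = (1 - k) * var              # uncertainty always decreases
    return new_mean, new_var
```

This is exactly why the red uncertainty ellipse in the video shrinks each time a map feature is observed: the update step contracts the covariance.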
This project was also part of the Mobile Robotics course. FastSLAM is a particle-filter-based SLAM technique; unlike EKF-based SLAM, it can handle a large number of map features. The project first required implementing particle-based Monte Carlo localization, which was then extended to FastSLAM. As in the previous project, map features were treated as points. I demonstrated this project as a MATLAB simulation.
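One cycle of the Monte Carlo localization step that FastSLAM builds on can be sketched as follows. This is a simplified Python illustration (the project itself was in MATLAB), assuming a single known landmark and a range-only sensor; the noise constants are placeholders:

```python
import math
import random

def mcl_step(particles, motion, landmark, meas_range, meas_std):
    """One Monte Carlo localization cycle: move, weight, resample.
    particles: list of (x, y) hypotheses; motion: (dx, dy) from odometry;
    landmark: known feature position; meas_range: sensed distance to it."""
    # 1. Motion update: move each particle, adding odometry noise
    moved = [(x + motion[0] + random.gauss(0, 0.05),
              y + motion[1] + random.gauss(0, 0.05)) for x, y in particles]
    # 2. Measurement update: weight each particle by the Gaussian
    #    likelihood of the observed range given its hypothesized position
    weights = []
    for x, y in moved:
        expected = math.hypot(landmark[0] - x, landmark[1] - y)
        err = meas_range - expected
        weights.append(math.exp(-err * err / (2 * meas_std ** 2)))
    # 3. Resampling: draw a new particle set proportional to the weights
    return random.choices(moved, weights=weights, k=len(moved))
```

FastSLAM extends this by attaching to each particle its own small EKF per map feature, which is what lets it scale to many features.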
This project was part of the Computer Vision course (CS 5765, Spring 2012). The goal was to build an image search interface where the user provides an input image and similar images are retrieved from the available database.
The main technique used was bag-of-words: using feature detectors and descriptors such as SIFT and PHOG, a visual vocabulary was built to perform image retrieval. Besides bag-of-words, a learning-based retrieval interface was also implemented, with SVM and ANN classifiers trained on a subset of the dataset and retrieval tested on the remainder. The project was implemented in MATLAB by a group of two students.
This project was also part of the Computer Vision course. The goal was to stitch together five images to build a panorama.
Common features among the images were detected using SIFT. Homography matrices between image pairs were then estimated using a RANSAC-based algorithm, and the images were aligned to the plane of the central image using the appropriate homography transformations.
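Two small pieces make this pipeline concrete: mapping a point through a 3x3 homography, and the RANSAC scoring step that counts how many SIFT matches agree with a candidate homography. A hedged Python sketch (the project was in MATLAB; these helper names are mine):

```python
import math

def apply_homography(H, point):
    """Map an image point through a 3x3 homography (projective transform),
    dividing by the homogeneous coordinate w."""
    x, y = point
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
    v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
    return u, v

def count_inliers(H, matches, tol=2.0):
    """RANSAC scoring: a match (p, q) is an inlier if H maps p to within
    tol pixels of q. The H with the most inliers wins."""
    inliers = 0
    for p, q in matches:
        u, v = apply_homography(H, p)
        if math.hypot(u - q[0], v - q[1]) <= tol:
            inliers += 1
    return inliers
```

In the full algorithm, each RANSAC iteration fits a candidate H from four random matches, scores it with `count_inliers`, and the best H is refit on all its inliers before warping the image onto the central plane.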
For our final-year project, three other students and I built a 4-DOF robotic arm with four revolute joints and a gripper as the end effector.
The idea was to move a simulated arm in LabVIEW and have the physical arm replicate that movement. Commands generated for the motors (the joint movements of the simulation model) were sent to a microcontroller over a serial link, and the microcontroller relayed those commands to the servo motor at each joint.
My role was to design the LabVIEW interface, get serial communication working between LabVIEW and the microcontroller, and program the microcontroller to receive serial commands from the laptop, process them, and send the appropriate command to each motor. The kinematic design and fabrication of the arm were handled by my teammates. I also helped design the circuit board. The complete details of the arm have been open-sourced and can be accessed here, details of the serial communication between LabVIEW and the microcontroller can be accessed here, and a video of the arm can be seen here.
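The serial exchange between the PC and the microcontroller boils down to framing the four joint angles into bytes and validating them on the receiving side. The sketch below is a hypothetical Python illustration of such a framing scheme (start byte plus checksum); the actual frame format used in the project is in the open-sourced serial communication details linked above:

```python
def encode_frame(angles):
    """PC side: pack four joint angles (0-180 degrees) into a frame of
    start byte 0xFF, four angle bytes, and a mod-256 checksum."""
    assert len(angles) == 4 and all(0 <= a <= 180 for a in angles)
    payload = bytes(angles)
    checksum = sum(payload) % 256
    return bytes([0xFF]) + payload + bytes([checksum])

def decode_frame(frame):
    """Microcontroller side: validate the start byte and checksum, then
    return the four joint angles, or None if the frame is corrupt."""
    if len(frame) != 6 or frame[0] != 0xFF:
        return None
    payload, checksum = frame[1:5], frame[5]
    if sum(payload) % 256 != checksum:
        return None
    return list(payload)
```

The checksum lets the microcontroller silently drop frames mangled on the wire instead of twitching a servo to a garbage angle.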
I completed this project during my training at KUKA Robotics, Pune. KUKA Roboter GmbH is an industrial robotics and automation giant, and I had a great opportunity to work at their training center in Pune for six weeks.
In this project, a joystick is connected to a KUKA robot to control its movement. Communication between the robot and the joystick controller is established over Ethernet using TCP/IP. The basic concept is to exchange XML strings: the robot sends an XML string with its current coordinates, a correction is applied to those coordinates using the joystick, and the result is sent back to the robot.
Communication on the robot side is handled by EthernetKrlXml, a utility provided by KUKA that also defines the XML structure for the data exchange; on the external system, LabVIEW is used to receive the robot's coordinates, apply the joystick correction, and send the corrected values back.
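The external-system side of the exchange, which the project implemented in LabVIEW, amounts to parsing the robot's XML, adding the joystick offsets, and serializing the reply. A Python sketch of that step, with illustrative tag names (the real XML structure is defined by the EthernetKrlXml configuration):

```python
import xml.etree.ElementTree as ET

def apply_joystick_correction(robot_xml, dx, dy, dz):
    """Parse the robot's position XML, add the joystick offsets to the
    X/Y/Z coordinates, and return the corrected XML string to send back
    over the TCP/IP link. Tag and attribute names here are hypothetical."""
    root = ET.fromstring(robot_xml)
    pos = root.find("Pos")
    for axis, delta in zip("XYZ", (dx, dy, dz)):
        pos.set(axis, str(float(pos.get(axis)) + delta))
    return ET.tostring(root, encoding="unicode")
```

Each cycle the robot sends its current pose, this correction is applied from the live joystick reading, and the robot jogs towards the returned coordinates.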
I was part of the team that won the Micromouse competition held at NIT Trichy in 2010. My role was to design and fabricate the circuit board and to help program the microcontroller.