About Me
I am an Erasmus Mundus Joint Master's Scholar in Intelligent Field Robotic Systems at Universitat de Girona, Spain, and the University of Zagreb, Croatia. My research interests span a diverse range of cutting-edge disciplines, including robotics, computer vision, and deep/machine learning.
I am passionate about exploring the intersection of these fields to push the boundaries of technology and create innovative solutions that address complex real-world challenges.
Additionally, I have proven leadership qualities demonstrated in global communities including Google Developer Student Clubs, TEDx, Google Developer Groups Live, U.S. Embassy programs, and 10Pearls.
Education
Erasmus Mundus Joint Master’s in Intelligent Field Robotic Systems | Universitat de Girona
Semester I & II in Girona: Autonomous Systems, Machine Learning, Multiview Geometry, Probabilistic Robotics (Kalman Filtering), Robot Manipulation, Localization (SLAM), Planning, Perception (Computer Vision), and Intervention.
Semester III in Zagreb: Aerial Robotics, Multi-Robot Systems, Human-Robot Interaction, Robotic Sensing, Perception, & Actuation, Deep Learning, and Ethics & Technology.
B.E. in Software Engineering | Mehran University of Engineering and Technology
Agent Based Intelligent Systems, Data Science & Analytics, Simulation & Modeling, Cloud Computing, Statistics and Probability
CGPA 3.96 / 4.00 - Silver Medal Distinction & First Position
Experiences
Master's Thesis Intern | Saxion Smart Mechatronics and Robotics Research Group
Currently working on my Master's thesis under the KIEM project "Avoiding the Invisible". My research focuses on algorithms for multimodal target tracking and drone-based target following, with three key objectives: multimodal (visual and thermal) target detection and tracking, drone control for target following, and integration of both into a unified software pipeline.
ROSCon 2024 Diversity Scholar | Open Robotics, Denmark
I was awarded a diversity scholarship to attend ROSCon 2024 in Denmark, where I networked with companies and ROS contributors from around the world. I also gained extensive hands-on experience in two workshops: "Open source, open hardware hand-held mobile mapping system for large scale surveys", which covered LiDAR odometry and multi-session refinement for large-scale mapping, and "ros2_control", which covered controller chaining, fallback controllers, and async controllers.
Robotics Intern | Paltech Robotics GmbH
Tested and compared two new ultrasonic sensor systems (Bosch and Valeo) for obstacle avoidance, adding a safety-braking feature that sets distance thresholds to slow down or stop the robot in ROS 2. The work involved multiple field tests in grass of different heights.
My Projects
Sim2Real: Controlling a Swarm of Crazyflies using Reynolds Rules and Consensus Protocol
This project implements swarm control for Crazyflie UAVs using Reynolds Rules for flocking and a Consensus Protocol for coordinated movement. It integrates rendezvous and formation control in ROS2 and Gazebo, enabling agents to converge and maintain geometric formations. Tested in both simulation and real-world environments, the system demonstrates adaptability and scalability, with results highlighting the impact of communication topologies on swarm dynamics.
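As a rough sketch of the two building blocks (the weights, neighbor model, and 2D state here are chosen purely for illustration; the actual system runs in ROS2/Gazebo on Crazyflie hardware):

```python
import numpy as np

def reynolds_velocity(positions, velocities, i, r_sep=0.5, w_sep=1.5, w_coh=1.0, w_ali=1.0):
    """Combine separation, cohesion, and alignment for agent i (all other agents treated as neighbors)."""
    others = np.delete(np.arange(len(positions)), i)
    offsets = positions[others] - positions[i]
    dists = np.linalg.norm(offsets, axis=1) + 1e-9

    # Separation: push away from neighbors closer than r_sep.
    close = dists < r_sep
    sep = -np.sum(offsets[close] / dists[close, None] ** 2, axis=0) if close.any() else np.zeros(2)
    # Cohesion: steer toward the neighbors' centroid.
    coh = offsets.mean(axis=0)
    # Alignment: match the neighbors' average velocity.
    ali = velocities[others].mean(axis=0) - velocities[i]
    return w_sep * sep + w_coh * coh + w_ali * ali

def consensus_step(positions, adjacency, formation_offsets, gain=0.5, dt=0.1):
    """One discrete consensus update: agents agree on (position - offset), giving rendezvous
    (zero offsets) or a geometric formation (nonzero offsets)."""
    centered = positions - formation_offsets
    laplacian_term = adjacency @ centered - adjacency.sum(axis=1, keepdims=True) * centered
    return positions + gain * dt * laplacian_term
```

The communication topology enters through the adjacency matrix, which is why different topologies change convergence behavior.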
Stereo Visual-Odometry (VO) on the KITTI Dataset
This project implements a stereo VO pipeline in Python on the KITTI dataset. It extracts SIFT features, matches them with BFMatcher, triangulates 3D points, and estimates the camera's motion in 3D space (relative to its starting position) by minimizing the 3D-to-2D reprojection error with PnP and RANSAC.
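A compressed sketch of one frame-to-frame step of such a pipeline, assuming rectified KITTI stereo images, a shared intrinsic matrix K, and a known baseline (the full project adds outlier filtering and trajectory accumulation):

```python
import cv2
import numpy as np

def stereo_vo_step(img_l_prev, img_r_prev, img_l_curr, K, baseline):
    """SIFT + BFMatcher, triangulate the previous stereo pair, then PnP + RANSAC against the current left image."""
    sift = cv2.SIFT_create()
    bf = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)

    kp_l, des_l = sift.detectAndCompute(img_l_prev, None)
    kp_r, des_r = sift.detectAndCompute(img_r_prev, None)
    kp_c, des_c = sift.detectAndCompute(img_l_curr, None)

    # Match previous left-right pair and triangulate 3D points (rectified geometry assumed).
    lr = sorted(bf.match(des_l, des_r), key=lambda m: m.distance)[:500]
    pts_l = np.float32([kp_l[m.queryIdx].pt for m in lr]).T
    pts_r = np.float32([kp_r[m.trainIdx].pt for m in lr]).T
    P_l = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P_r = K @ np.hstack([np.eye(3), np.array([[-baseline], [0.0], [0.0]])])
    pts4d = cv2.triangulatePoints(P_l, P_r, pts_l, pts_r)
    pts3d = (pts4d[:3] / pts4d[3]).T

    # Match previous-left to current-left and solve PnP with RANSAC (3D-to-2D reprojection error).
    idx = {m.queryIdx: m for m in bf.match(des_l, des_c)}
    obj, img = [], []
    for i, m in enumerate(lr):
        if m.queryIdx in idx:
            obj.append(pts3d[i])
            img.append(kp_c[idx[m.queryIdx].trainIdx].pt)
    _, rvec, tvec, _ = cv2.solvePnPRansac(np.float32(obj), np.float32(img), K, None)
    R, _ = cv2.Rodrigues(rvec)
    return R, tvec  # relative pose between the two frames
```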
Deep Learning
The lab work for this Deep Learning course consisted of PyTorch implementations of logistic regression and gradient descent, fully connected models on the MNIST dataset, convolutional models for image classification on MNIST and CIFAR, recurrent models for sentiment classification on the Stanford Sentiment Treebank (SST) dataset, and a detailed implementation of metric embeddings.
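A minimal PyTorch training step in the spirit of the MNIST fully connected lab (layer sizes and learning rate are illustrative, not the course's exact settings):

```python
import torch
import torch.nn as nn

# Small fully connected classifier for 28x28 MNIST images.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 256), nn.ReLU(), nn.Linear(256, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One gradient-descent step: forward pass, cross-entropy loss, backprop, parameter update."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```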
Human Detection and Tracking
This project focuses on human detection and tracking using the state-of-the-art YOLOv9 object detection model and the DeepSORT multi-object tracking algorithm. The methodology integrates Kalman filtering for motion prediction and deep learning-based appearance matching. The system is tested under various conditions, addressing challenges such as occlusions, identity switches, and tracking interruptions.
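At the heart of DeepSORT's motion prediction is a Kalman filter; a simplified constant-velocity version over a bounding-box center gives the flavor (the full tracker also estimates box shape and adds appearance matching, so this is an illustration rather than the project code):

```python
import numpy as np

class BoxKalman:
    """Constant-velocity Kalman filter over a bounding-box center (cx, cy)."""
    def __init__(self, cx, cy, dt=1.0, q=1.0, r=10.0):
        self.x = np.array([cx, cy, 0.0, 0.0])      # state: position and velocity
        self.P = np.eye(4) * 100.0                 # state covariance
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.Q = np.eye(4) * q                     # process noise
        self.R = np.eye(2) * r                     # measurement noise

    def predict(self):
        """Propagate the track to the next frame (used to bridge occlusions)."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, cx, cy):
        """Correct the track with a new YOLO detection."""
        y = np.array([cx, cy]) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
```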
Frontier Based Exploration Using Kobuki Turtlebot
Frontier-based exploration using an RGB-D camera mounted on a Kobuki Turtlebot. The project integrates advanced path-planning techniques, combining the RRT* algorithm with Dubins paths to map unknown environments. Additionally, a hybrid control system that merges PID control with principles of the Pure Pursuit controller is used to optimize the robot's velocity profiles. The implementation is done in Python within the ROS framework, with simulation testing performed in the Stonefish simulator before real-world testing.
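Exploration targets are frontier cells: free cells that border unknown space. A minimal numpy frontier detector over a ROS-style occupancy grid (a rough illustration, not the project code) could look like this:

```python
import numpy as np

def find_frontiers(grid, free_max=20):
    """Return (row, col) cells that are free (value <= free_max) and touch at least one unknown cell,
    following the ROS OccupancyGrid convention of -1 = unknown, 0-100 = occupancy probability."""
    frontiers = []
    rows, cols = grid.shape
    for r in range(rows):
        for c in range(cols):
            if grid[r, c] < 0 or grid[r, c] > free_max:
                continue                                         # skip unknown and occupied cells
            neigh = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
            if (neigh == -1).any():                              # borders unknown space
                frontiers.append((r, c))
    return frontiers
```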
Monocular Visual Odometry for an Underwater Vehicle
Monocular visual odometry (VO) for an autonomous underwater vehicle (AUV), combined with extended Kalman filter (EKF) based navigation. The methodology employs SIFT feature detection and FLANN matching to process images from a ROSBag. A key contribution of this work is the incorporation of an EKF to provide a refined estimate of the vehicle's motion and trajectory.
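The feature front end can be sketched as SIFT detection plus FLANN matching with Lowe's ratio test (a minimal illustration under those assumptions, not the project code):

```python
import cv2

def match_frames(img_prev, img_curr, ratio=0.7):
    """Detect SIFT features in two frames and return matched pixel coordinates."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img_prev, None)
    kp2, des2 = sift.detectAndCompute(img_curr, None)

    # FLANN with a KD-tree index (algorithm=1); keep matches passing Lowe's ratio test.
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
    good = [m for m, n in flann.knnMatch(des1, des2, k=2) if m.distance < ratio * n.distance]

    pts_prev = [kp1[m.queryIdx].pt for m in good]
    pts_curr = [kp2[m.trainIdx].pt for m in good]
    return pts_prev, pts_curr
```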
Pose Based SLAM using the Extended Kalman Filter on a Kobuki Turtlebot
Pose-based SLAM using the Extended Kalman Filter (EKF), in which the viewpoint poses associated with environmental scans are added to the state vector. The algorithm was evaluated through both simulation and real-world testing.
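The defining step of pose-based EKF SLAM is augmenting the state with a copy of the current robot pose whenever a new scan is stored. A minimal sketch, assuming a planar (x, y, yaw) pose at the head of the state vector:

```python
import numpy as np

def augment_state(x, P):
    """Append a copy of the current robot pose (first 3 entries) as a new stored viewpoint,
    expanding the covariance with the corresponding Jacobian."""
    n = len(x)
    x_aug = np.concatenate([x, x[:3]])

    # Augmentation Jacobian: identity over the old state plus a copy of the pose block.
    J = np.zeros((n + 3, n))
    J[:n, :n] = np.eye(n)
    J[n:, :3] = np.eye(3)
    P_aug = J @ P @ J.T
    return x_aug, P_aug
```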
Kinematic Control System for a Mobile Manipulator, based on the Task-Priority Redundancy Resolution Algorithm
Kinematic control system derived and implemented on a differential-drive robot (Kobuki Turtlebot 2), fitted with a 4 DOF manipulator (uFactory uArm Swift Pro). The system is predicated on the task-priority redundancy resolution algorithm. The implementation is done using ROS and the Stonefish simulator.
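The task-priority redundancy resolution algorithm solves each lower-priority task in the null space of the higher-priority ones. A compact numpy sketch of the standard recursive formulation (the task list and DOF count are illustrative; the project runs this through ROS on the real platform):

```python
import numpy as np

def task_priority_dq(tasks, n_dof):
    """Compute joint velocities for prioritized tasks.
    `tasks` is a list of (J, xdot_desired) pairs ordered from highest to lowest priority."""
    dq = np.zeros(n_dof)
    P = np.eye(n_dof)                     # null-space projector of all tasks processed so far
    for J, xdot in tasks:
        J_bar = J @ P                     # task Jacobian restricted to the remaining null space
        J_bar_pinv = np.linalg.pinv(J_bar)
        dq = dq + J_bar_pinv @ (xdot - J @ dq)
        P = P - J_bar_pinv @ J_bar        # shrink the null space for the next task
    return dq
```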
ROS2 Collision Avoidance Using Cross and Direct Echo of Bosch Ultrasonic Sensor Systems
Tested and compared two ultrasonic sensor systems (Bosch and Valeo) for obstacle avoidance, implementing a safety-braking feature in ROS2 that sets distance thresholds to slow down or stop the robot, and carried out multiple field tests in grass of different heights.
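A minimal ROS2 sketch of such threshold-based safety braking (the topic names, thresholds, and speeds are assumptions for illustration, not the actual configuration):

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Range
from geometry_msgs.msg import Twist

SLOW_DIST, STOP_DIST = 1.0, 0.4          # metres (illustrative thresholds)
CRUISE_SPEED, SLOW_SPEED = 0.5, 0.15     # m/s

class SafetyBrake(Node):
    """Scale the forward command based on the latest ultrasonic range reading."""
    def __init__(self):
        super().__init__('safety_brake')
        self.create_subscription(Range, '/ultrasonic/range', self.on_range, 10)
        self.pub = self.create_publisher(Twist, '/cmd_vel', 10)

    def on_range(self, msg: Range):
        cmd = Twist()
        if msg.range < STOP_DIST:
            cmd.linear.x = 0.0           # emergency stop
        elif msg.range < SLOW_DIST:
            cmd.linear.x = SLOW_SPEED    # slow down near obstacles
        else:
            cmd.linear.x = CRUISE_SPEED
        self.pub.publish(cmd)

def main():
    rclpy.init()
    rclpy.spin(SafetyBrake())

if __name__ == '__main__':
    main()
```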
Stereo Visual Odometry (VO) for Grizzly Robotic Utility Vehicle
Developed a stereo VO pipeline covering camera calibration, SURF feature extraction and matching with bucketing strategies and circular matching for accurate apparent-motion computation and effective noise/outlier rejection, and structure from motion (2D-to-2D, 3D-to-2D, and 3D-to-3D) for triangulation with bundle-adjustment refinement. The final VO trajectory was also extensively compared with GPS-generated ground-truth data.