This thesis presents a new algorithm for tracking humans who follow a mobile robot through a 3D environment, for applications such as a robotic guide. Existing approaches rely on sophisticated robots equipped with large field-of-view sensors, which is often impractical because such robots are not affordable. The proposed algorithm uses a multi-behavioural social-force-based particle filter with a 'follow the robot' motion model that predicts each human's velocity, rotation, and 3D position based on social etiquette. The motion model takes the robot's position into account and tracks a group of moving humans from a moving robot. The filter's observation model is built from a limited field-of-view monocular camera. The proposed algorithm is particularly effective when the robot makes sharp turns, which cause abrupt changes in a person's position in the image that conventional filtering techniques erroneously smooth out. The model accounts for attraction and repulsion between members of the group, and between each member and the robot, so that a comfortable social distance is maintained at equilibrium. When a person leaves the group, the corresponding track is deleted; when the person rejoins, the track is automatically reinitialised. The tracked trajectories are compared against ground truth, and the proposed system yields substantially lower errors than several baseline approaches. Across several scenarios, the proposed model also reduces false positives and increases accuracy compared with the baseline methods.
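The attraction-repulsion idea behind the social-force motion model can be sketched as follows. This is a minimal illustrative sketch, not the thesis's actual implementation: the function names, the 2D simplification, and all parameters (`k_att`, `k_rep`, `d_comf`, the noise scale) are assumptions introduced here for clarity.

```python
import numpy as np

def social_force(pos, neighbours, robot_pos, k_att=1.0, k_rep=2.0, d_comf=1.0):
    """Net force on one person: attraction toward the robot plus
    repulsion from group members closer than the comfort distance.
    (Hypothetical parameter names; 2D for simplicity.)"""
    force = np.zeros(2)
    to_robot = robot_pos - pos
    dist = np.linalg.norm(to_robot)
    if dist > 1e-6:
        # Attraction pulls the follower toward the robot; it vanishes
        # once the comfortable following distance d_comf is reached.
        force += k_att * (dist - d_comf) * to_robot / dist
    for other in neighbours:
        away = pos - other
        d = np.linalg.norm(away)
        if 1e-6 < d < d_comf:
            # Repulsion keeps group members from crowding each other,
            # so the group settles at an equilibrium spacing.
            force += k_rep * (d_comf - d) * away / d
    return force

def predict(particles, velocities, neighbours, robot_pos, dt=0.1, noise=0.05):
    """Particle-filter predict step: each particle's velocity is nudged
    by the social force, then its position is propagated with noise."""
    rng = np.random.default_rng(0)
    for i in range(len(particles)):
        f = social_force(particles[i], neighbours, robot_pos)
        velocities[i] += f * dt
        particles[i] += velocities[i] * dt + rng.normal(0.0, noise, 2)
    return particles, velocities
```

In a full filter, this predict step would be followed by weighting each particle against the monocular-camera observation and resampling; the sketch shows only how a 'follow the robot' force can drive the prediction.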