In this paper, a robust visual tracking method is proposed to track an object under dynamic conditions that include motion blur, illumination changes, pose variations, and occlusions. To cope with these challenges, multiple trackers with different features are employed, where each feature offers different robustness to particular object appearance changes. To fuse these independent trackers, we propose two configurations, i.e., tracker selection and tracker interaction. Tracker interaction is achieved in a probabilistic manner based on a transition probability matrix (TPM). Tracker selection extracts a single tracking result from the multiple tracker outputs by choosing the tracker with the highest tracker probability. As the object appearance changes, the TPM and tracker probabilities are updated in a recursive Bayesian form by evaluating the reliability of each tracker, measured by a robust tracker likelihood function (TLF). When tracking is finished at each frame, the estimated object motion is obtained and fed into the reference update via the proposed learning strategy, which keeps the TLF robust and adaptive. The experimental results demonstrate that the proposed method is robust in various benchmark scenarios.
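The recursive Bayesian fusion described above can be sketched in a minimal form. This is an illustrative assumption, not the authors' implementation: the function names, the three-tracker setup, and the TLF values are hypothetical, and the update follows a standard HMM-style forward recursion (TPM-based prediction followed by likelihood weighting).

```python
import numpy as np

# Illustrative sketch (not the paper's actual implementation): fusing K
# trackers via a transition probability matrix (TPM) and per-tracker
# likelihoods, updating tracker probabilities in recursive Bayesian form.

def update_tracker_probs(prev_probs, tpm, likelihoods):
    """One recursive Bayesian update of the tracker probabilities.

    prev_probs:  (K,) tracker probabilities from the previous frame
    tpm:         (K, K) transition probability matrix, rows sum to 1
    likelihoods: (K,) tracker likelihood function (TLF) values at this frame
    """
    predicted = tpm.T @ prev_probs          # tracker interaction via the TPM
    posterior = predicted * likelihoods     # weight by each tracker's reliability
    return posterior / posterior.sum()      # normalize to a distribution

def select_tracker(probs):
    """Tracker selection: pick the tracker with the highest probability."""
    return int(np.argmax(probs))

# Toy example with three trackers (e.g., built on different features).
probs = np.full(3, 1.0 / 3.0)               # uniform prior over trackers
tpm = np.array([[0.8, 0.1, 0.1],
                [0.1, 0.8, 0.1],
                [0.1, 0.1, 0.8]])           # self-transitions favored
likelihoods = np.array([0.2, 0.7, 0.1])     # hypothetical TLF values, one frame
probs = update_tracker_probs(probs, tpm, likelihoods)
best = select_tracker(probs)                # index of the most reliable tracker
```

In this toy run the second tracker, having the highest TLF value, receives the highest probability and is selected; over many frames the TPM smooths switching between trackers so that a single noisy likelihood does not cause abrupt changes.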