Jitter, i.e., small fluctuations in a signal, is a major source of decreased motor performance and negative user experience in virtual reality (VR) systems. Current technologies still cannot eliminate jitter, especially in the eye-gaze tracking systems embedded in many head-mounted displays. In this work, we used an HTC Vive Pro Eye, artificially added 0.5°, 1°, and 1.5° of jitter to the eye-tracking data, and analyzed user performance in an ISO 9241-411 pointing task with targets at 1 or 2 m visual distance using angular Fitts’ law. The results showed that the user error rate increases significantly with increasing jitter levels, while no significant differences were observed for movement time and throughput. Additionally, we observed a significant decrease in time, error rate, and throughput for the more distant targets. We hope that our results guide researchers, practitioners, and developers towards better gaze-tracking-based VR applications.
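
For reference, the angular Fitts’ law analysis mentioned above can be sketched with the standard ISO 9241-411-style formulation below; the symbols are illustrative and not taken from this paper:

\[
ID = \log_2\!\left(\frac{\alpha}{\omega} + 1\right), \qquad
MT = a + b \cdot ID, \qquad
TP = \frac{ID_e}{MT}
\]

where \(\alpha\) is the angular distance to the target, \(\omega\) its angular width, \(MT\) the movement time with empirical regression coefficients \(a\) and \(b\), and \(TP\) the throughput computed from the effective index of difficulty \(ID_e\) (adjusted for the observed endpoint spread).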