Camera-based visual navigation techniques can provide six-degree-of-freedom estimates of position and orientation (pose) and can be implemented at low cost in applications including autonomous driving, indoor positioning, and drone landing. However, feature matching errors may occur when associating measured features in camera images with mapped features in a landmark database, especially when repetitive patterns are in view; a typical example is regularly spaced windows on building walls. Quantifying the data association risk and its impact on navigation system integrity is essential in safety-critical applications, yet the literature on vision-based navigation integrity is sparse. This work aims to quantify and bound the integrity risk caused by incorrect associations in visual navigation using extended Kalman filters.