Without some kind of rigorous system, it's easy to come to conclusions that suit your own ideas.
Let's say that early on you run into an example where fruit causes you problems. You get the idea that "fruit may be bad," so you ask yourself whether it is and keep an eye out for more information. When you see further examples of fruit causing problems, you answer your own question of "is fruit bad?" with "yes." Soon you're thinking "fruit's bad, isn't it?" and seeking examples that support your idea, both intentionally and without thinking. The evidence out in the world could be just as strong against the idea, but you have gone down a particular path of belief, and going down a belief path is self-reinforcing.
Do you reinforce your own ideas?
If your response to the idea of self-reinforcing beliefs is "that's bad -- I'm not biased like that," that response itself shows that you, like just about everyone else, have emotional reactions that guide how you assess information. Those reactions are the root of self-reinforcing beliefs.
If your response is instead "Huh. Am I like that?" then you're one of the very few who, for whatever reason, feel less of that emotion-driven pull.
I think it's useful to first look at our own tendencies in how we take in information. That makes us more accurate when we then get to doing research, which means we stand a better chance of getting at truth rather than just what comforts us.
This is all just a side note. If you'd like to discuss it in more detail, I'd be glad to. Meanwhile, I leave you to return to your actual research.