Who cares what they think of you? It's their job to help you take care of your health.
I always figure that no matter what I say to a doctor and no matter what I present with, they've seen it before, and probably worse. In fact, based on the stories I've heard since my sister became a nurse last year, I'm sure of it.