Most people go to the doctor when they are sick to get a diagnosis, medication, or simply to feel better. We all know how it goes…
You tell them how you feel, describing your symptoms as best you can. Then they tell you how they think you feel. When did going to the doctor become a way for someone to tell YOU how YOUR body feels?
Recently I grappled with this question. I went to see a doctor because I was experiencing flu- and cold-like symptoms. I explained my symptoms while my doctor paraphrased what they thought I had said. A test was run, and it came back negative (it always does!). So I was told that nothing was wrong with me and that I would feel better in a few days. WRONG. Days passed, my symptoms worsened, and contrary to what I had been told, I did not feel better. Lying there sick, I thought how ridiculous it was for someone to tell me that my body was fine when I knew otherwise. I felt the symptoms and the discomfort, not the doctor. I know my own body better than anyone.
Be confident in how you are feeling and never discount what your body is telling you.