I don't know how common this is, but I have a weird need to take care of all my injuries and physical discomfort on my own, without any input from anyone, medical professional or otherwise. It's not that I dislike doctors; I actually don't mind going to them, and I don't have any special aversion to needles or medical stuff either. I just don't like people dealing with something I can take care of myself. I rarely even tell anyone when I'm hurt, even if it's something serious like a neck injury, because I cannot stand people telling me that I need to get it checked out.

A lot of the time, I feel like everyone is making a much bigger deal out of things than they need to. I mean, yeah, I flipped over at a party and couldn't tilt my head past a 40-degree angle for two months without stabbing pain, but it's not like it's a big deal? Yeah, a kid hit me with a shovel and left a big bloody spot on the back of my head, but there are more important things to worry about, like horses and how cool they are. So what if my hip sometimes sends sharp pains down my leg when I take a step? It's not that bad, and it only happens every once in a while. But apparently people seem to think these things are 'important' and I 'need to see a doctor before things get worse'. Pffhh, whatever.

Maybe it's some primitive-brain impulse to hide any weakness? Maybe it's a fucked up display of independence or self-sufficiency? Who knows! I do know that I am 100% willing to stitch my own wounds with a needle and fishing line before I even tell anyone that I got hurt, and if anyone tries to get me to go to the doctor about it, there's no guarantee that I won't scream you out of the room.

So I guess my question is: how serious does something have to be before you'd go to the doctor for it? I really don't have any frame of reference to work with, so it'd be cool to finally have one.