So the other day I went to the doctor for a physical. (Fortunately, thank god, everything checked out A-OK, knock wood, spit on the floor, tuh!) The doctor looked in my ears. We talked about the weather. He listened to my breathing and we talked about the photographs on his wall. He looked at my throat and I said, 'Ahhh...' It was ok. We talked more about the photographs. Then it was time to drop my pants and turn around for the finger in the butt. He put his finger in my butt. Afterwards, I pulled my pants up feeling a little humiliated and ashamed. And right afterwards he asked me my thoughts about the Yankees.
It's a weird moment. But the almost weirder part is hanging out afterwards with the doctor pretending like nothing happened. Continuing a conversation as if he didn't have his finger in my butt! Like just now! Doctors have weird jobs. It's like normal conversation normal conversation normal conversation finger in butt normal conversation normal conversation. I guess they're the only people who can really do stuff like that. I mean if you were having a conversation with someone you don't know that well and all of a sudden they stuck their finger in your mouth you'd freak out. And that's just your mouth!
I dunno. Maybe I'm immature about the whole thing, but it still strikes me as super strange to go to some 'office' and have some dude stick his finger in your butt and then you pay him and then you leave. I know chicks get it like 10x weirder or whatever. I get that. But I get finger jinger like every couple of years and every time it happens I get the same thought afterwards as I walk down the street...
Yarg! Some guy (who didn't even buy me dinner or nuttin!) just stuck his finger in my butt!
(ok this post is dumb. I know!)