Anyhow, last night I went to see The King's Speech (awesome movie, btw). One of the previews, for the movie Company Men, had a tag line that caught my eye and has been rattling around in the back of my mind all night. It went something like this: "In America, we give our lives to our job. It's time we took it back." (Okay, something like that--it's not exactly right, but you get the idea :)
Anyhow, it got me thinking about how true this is. In this country (and I'm sure elsewhere, too), we ARE our jobs. From a young age we're asked "what do you want to be when you grow up?" Like our job is going to define who we are completely. One of the first things you're asked when you meet someone new is what you do for a living. Like that means something. Like it defines your life totally and completely and you can tell something intrinsic about who someone is because of what they do to pay the bills.
Now, the thing I find most disturbing about this is that I have totally and completely fallen into this trap. I know I define myself by my role as a scientist, a teacher, a graduate student. In some way it's a part of who I am. And to some extent, I get that: it's what I spend the bulk of my time doing, so therefore it is part of me. Sure. But does it define me completely? Heck no! And part of me really hates that people lump me in by that. Still, when I introduce myself I find myself giving the same old answers about what I do for a living, like it's the ultimate answer to who I am, like it might influence someone into thinking one way or another about me. For some reason that just irks me.
Anyhow, does anyone have any thoughts on this? Are we our jobs? Or is there any other way to define who we are? Just something to think about at least!
And just for fun:
