Jobs like nursing and teaching at day-care centers are sometimes referred to as "pink collar" because they've traditionally been associated with women. More men these days are going into such fields, particularly nursing, than ever before. Do you think there's still a stigma attached to such jobs for men? If you're male, how would/do you feel about telling others what you do for a living? If you're female, how would/do you feel if a male relative or partner did/does such work?
