We have all heard it many times: women think they are always right. But do you honestly believe it?
Do you believe we are better judges of character, better with directions, that we always give the right advice, that we are always right about our children, about dimensions, about real estate, jobs…? What do you think, ladies?
I, for one, know that I haven't always been right. Don't get me wrong, I would have loved to be right, just because I always want to be! I think it's a generalisation that grew over the years, passed down from generation to generation, coming to us from former matriarchs who were so strong that everyone was perhaps too scared to tell them they weren't right!
I don't think anyone is always right, because no one is perfect, and no one knows everything (my mom would disagree with that!). So give me your opinion: do you think women are always right?