For a while now I have been noticing something going on in relationships between men and women: the dynamics have been shifting more and more in women's favor.
Women are taking over; that's a fact. Just look out the window, check the media, the magazines, and your friends, and you will surely notice that things are different. Not only that, men are becoming different too. Let me explain.
If you grab some men's magazines, most articles are now oriented toward how to look better; they are even selling men moisturizers, masks, and other products that were exclusive to women just a decade ago. You can also see articles on how to please women in all sorts of ways, from sexual to social. Generally, men are becoming more docile and more submissive.

Just look around at your friends, especially your male friends in long relationships: who wears the pants? How many times have you heard a guy say he wants to "consult" with his girlfriend (which in fact is just getting her approval)? How many guys do you know who are now turned on by sexually dominant women? See!
On the other hand, if you check some women's magazines, aside from beauty tips that are more or less the same (with added improvements), the other articles are oriented toward empowering them sexually, politically, socially, and so on. Women are more independent than ever; they are wilder and more confident, and generally in control.
Women are taking over the world. Prepare yourselves, men!