Complementarianism
Wiktionary, Creative Commons Attribution/Share-Alike License
noun.
The doctrine that genders in a society should have complementary roles.
Word Usage
"The Council on Biblical Manhood and Womanhood exist only to peddle their brand of patriarchy called "complementarianism.""
Etymologically Related
egalitarianism