A lot of Democrats believe that a powerful patriarchy has caused pain, damage and social tension in America. A lot of hard-core feminists in the Democratic Party and other left-wing groups have embraced the idea that "all men are rapists," especially since the Trump Access Hollywood tape, the Women's March and the #MeToo anti-sexual harassment movement. But if men are bad humans, what about the men who marry, have children and become fathers? What role should fathers play in their children's lives? We know that the mother is the most important figure in a child's life, the nurturer and adviser. What's the role of the dad?
There is no role. He doesn't have to exist. Neither does a mom. What a child needs most of all is love - and it doesn't matter if the person or people giving that to them have a vagina or a penis or neither or both.