Although I am not American, we see and hear about Trump all the time, so everyone has an opinion on him. The only people I know who like him are notoriously hard right, openly and proudly racist, and always whining about "The Woke". I have also noticed that women, overwhelmingly, can't stand him. I can only imagine how American women must feel, given that he has helped take away their rights. Surely there comes a time when having your rights taken away matters far more than any misplaced ideology that no longer represents what people actually think.