Whether homosexist or heterosexist, it's all sexist, and everyone needs to get over themselves. These sexist political agendas are ruining our culture. Parents should teach their children about sexuality THEMSELVES; since it's mostly a religious vs. anti-religious argument, a teacher shouldn't be indoctrinating kids into either side of it in a classroom. If sex ed has to be taught, teach REAL health: what happens to the body chemically and physically during puberty, where a baby comes from, and STDs/STIs.