Interested in the thoughts of other women


I tried asking this question elsewhere, but received a lot of what I considered to be absurd answers. While I’m not looking for people to simply agree with me, I do expect some degree of thought, regardless of your opinion.

Anyway, here goes.

I was recently reading a study which stated that between 2005 and 2010, some 127 men (and a few boys) were permanently disfigured as a result of attempting sexual activity with animals.
In every one of these cases, the male had part or all of his penis bitten off, and sometimes his testicles as well.

Do you think that what the men were doing in these cases should be considered rape? I'm not saying they should be charged with rape, but do you see it as a form of rape?

Also, do you think the men deserved what happened to them for forcing animals into sexual acts?