Naturism

Noun

  • The belief in or practice of going nude or unclad in social and usually mixed-gender groups, specifically either in cultures where this is not the norm or for health reasons.
  • The belief or doctrine that attributes everything to nature as a sanative agent.
