The term “naturism” refers to a variety of movements, taking different forms, that advocate a return to nature. It originated as a medical term, initially designating alternative therapeutic practices based on the use of natural elements. In the late nineteenth century it gradually became associated with calls for lifestyles more in keeping with nature, and then, at the dawn of the twentieth century, with collective and mixed-sex nudity during leisure time. The practice, which began in Germany within relatively narrow circles, spread across Western Europe during the interwar period. It was popularized after World War Two and integrated into the economy of mass tourism. Although the hedonistic dimension of nudity often overshadows the original project of a naturist reform of lifestyles, nudism remains, for most of its followers, associated with the ideal of a return to nature.
Naturism: the body as a central element in the return to nature