Organicism
Noun
- (philosophy) The treatment of society or the universe as if it were an organism
- The theory that the total organization of an organism is more important than the functioning of its individual organs
- (dated, medicine) The theory that disease is a result of structural alteration of organs
© Wiktionary