• Organicism

    Noun

    organicism

    (uncountable)
    1. (philosophy) The treatment of society or the universe as if it were an organism.
    2. The theory that the total organization of an organism is more important than the functioning of its individual organs.
    3. (dated, medicine) The theory that disease is a result of structural alteration of organs.

    © Wiktionary