Do you know of any good books from a right-wing, religious perspective that address the use of art to promote things you may be opposed to? Like anti-religious sentiments in art, gay lifestyles, or the degradation of the Christian values that America was founded upon? I would be interested in reading any you could recommend. If they addressed the years between the late '70s and just prior to the '90s, that would be ideal.
Art used to support Christianity to a great extent. Today it seems to have turned its back on it; sometimes, when Christianity has turned its back on art, art has left a knife in it.
Also, anything from your own perspective would be appreciated. Maybe an article you read, or some other resource you could suggest, such as a website; I would be interested in reading that too.