I have read a couple of popular Christian books lately that have left me feeling frustrated and discouraged. I expected to enjoy these books very much, considering that I believe loving and caring for other people is an integral part of living out a Christian faith. And, on the surface anyway, that seemed to be exactly what these books would be about.
But these books, rather than simply encouraging this approach to faith, spend a huge amount of time tearing down what they are against. And, to hear them tell it, what they are against is basically Christianity in America as a whole.
For one thing, I feel that negatively characterizing the whole of Christianity in America is painting with a broad brush. Are there American churches so integrated with the American dream that they are completely self-focused and self-dependent? Well, yeah. And it's true that this is not the gospel. But I honestly do not believe that the majority of Christians in America have this attitude. I think most of us are doing our best to live an authentic faith. Instead of hearing that we are probably living unbiblically, perhaps we should be encouraged to seek the heart of God even more. In the end, books are not going to convince me of the heart of God or his plan for me. God can and will do that as I continue to seek him.
But this approach bothers me for another reason, and it may take a few posts to completely address my thoughts about it. So please bear with me.
The first question I have been asking myself is, "When we want to see change, is it best to tear down what we are against, or to encourage what we are for?" It may sound like semantics, a fine line, and some people would say our job is to do both. After all, when fighting for a particular change, aren't you naturally fighting against something else?
I have encountered this struggle in my own life regarding something I care very much about: the discipline of children. I will be very honest; I would love to see the day when all parents are able to effectively discipline their children without feeling the need to use spanking as an option. And that applies to both ends of the spectrum, from the spankings that border on abuse all the way down to the light swat on a diapered bottom. But does that mean I must fight against spanking, tear it down, and characterize people who do it as bad parents? Or does it mean I should encourage the kind of discipline I believe in so strongly? (Please note that this is not what this post is actually about, so if you disagree with me here, that's okay; this is just a personal example of the issue I am addressing.) I have become more and more convinced that my role is that of an encourager, not one who tears down what I am against.
But the next natural question is, "Is there ever a time for specifically speaking out against something?" That is my next question, anyway. I am still thinking on that one, so I will have to address it later. ;)
But for today, I will leave you with this: When you are passionate about seeing change, do you think it is more effective to fight against the thing you don't like, or to fight for the thing you want to see? Do you see a distinction between the two, or do you feel they are the same thing?