Sometimes I read a book, and that book makes my skin crawl with its (how should I put this?) message. Its blatant self-importance is like ice water to the face. Sometimes this type of book is full of anti-religious material, and sometimes it's over-the-top church. In either case, I usually CAN'T finish it.
Not that there is anything wrong with church or with writing it into stories (after all, spirituality can be a huge part of a person's life), but turning something I believe in into sappy, corny, smash-the-wedding-cake-in-your-face preaching/judging/narrow-world-view material just gets me all riled up. Yes, you may be happier if you marry someone with shared beliefs, but to insinuate that anyone who doesn't subscribe to your religion will make you unhappy or will doom you to a life of heartache? That anyone who looks different from a strait-laced, high-collared churchgoer is EVIL? That I have a hard time with. Or maybe it's just the generalizing and throwing everyone into the same category that gets me.
I just read one such train wreck of a book, and I couldn't look away. I had to know just how much worse it was going to get. And worse it got. It is exactly the sort of book that sets young girls up to think Twilight is great literature (not that Twilight doesn't have its place, but still).
Is it possible for an author to write religion into a story without it becoming a battering ram and its own in-your-face character? I want to know: have YOU read a book where it's obvious (okay, besides Les Mis) that God is a big part of a main character's life, without feeling like the author was giving you a song and dance routine? Maybe I just appreciate a little more subtlety.