There are two questions here. The first is whether Christianity is declining. The second is why Christianity seems to be declining in the Western world.
There are likely more Christians in the world now than ever before. It is just that the Western world is in a different situation than Africa, Asia, or South America. The Western world, especially Europe, is becoming a mission field itself, receiving missionaries from Africa and elsewhere.
So what is wrong in the Western world? I believe that Lesslie Newbigin (Foolishness to the Greeks: The Gospel and Western Culture; The Gospel in a Pluralist Society) and Tim Keller (How to Reach the West Again, among other writings) noted something important in their books. I have only read parts of them, but I recommend them to anyone interested in the topic.
One observation is that mainline Christianity in the Western world is in a state of syncretism, tangled with Western culture. Western culture currently holds the dominant position in this relationship, while Western Christianity has become a powerless version of the original movement. Western Christianity no longer challenges the prevailing culture in a healthy way, as it once did. Where some kind of challenge does exist, it misses the mark: young-earth creationism (YEC) and comparable ideas are not healthy ways to challenge the ungodly parts of Western culture.