I've always been a Christian. I've always believed in God. Why? Is it only because I've always been taught to believe in God?
This topic might be a bit more basic than many of the others discussed on this site, but recently I've been having a crisis of faith.
How do we really know that God exists? I've never personally seen a miracle (if that's even what we should be looking for; probably not). How do we REALLY know that Christianity is true? Why should I believe the Bible? Why did God create the world in such a way that some people must go to Hell? Why do some people die without ever hearing about Jesus or God at all?
I know there are many questions here. I don't expect (though I do hope for) clear answers that will save and strengthen my faith in God, and I don't expect that every question will be answered to my satisfaction. But I'm losing my faith and I'm upset, so this is just an outpouring of my frustration, sadness, and confusion.
Help if you can, and pray for me.