I’ve always believed that Christians go to heaven when they die. It’s a staple of many people’s faith. But where did that doctrine come from, and what does the Bible actually say about heaven? Join me as I survey the various verses that talk about it.
What’s more important: evangelicalism, or Christianity? Lately I’ve seen a lot of people defending the former at the expense of what I believe is Jesus’ example of how we should live as a community of faith.
Jesus said we’d know a tree by its fruit, and perhaps we can also know a doctrine the same way. I grew up with an end-times doctrine that is widespread in the church, but which I now recognize is bearing a lot of bad fruit. Maybe it’s time to recognize the harm.