Evangelism Trends are Usually Rooted in Bad Theology
Christians in most American churches are encouraged to evangelize – that is, to share the gospel with unbelievers and call them to faith and repentance in Jesus Christ. This fitting activity is rooted in the words of the New Testament and in obedience to our Lord, who commands us to make disciples. Yet while we see many circles and pockets of Christendom “...