How do I tell my new co-workers that I am a Christian? Do I need to? I kind of assume they are going to figure it out without me introducing myself like this: “Hi. My name is Ed. I’m a Christian.” I have never done this, yet all of the people I directly work with know that I am a Christian, and most of them know I plan to go into full-time ministry in the future. How is this possible? They see me reading my Bible or referring to my faith in a natural way, and then we have a discussion when they bring it up. I sure didn’t walk in on the first day and proclaim, “Repent, for the Kingdom of Heaven is at hand!” Nor did I refer to them all as “sinners,” “heathens,” or “sons of the Devil” – that’s just not going to work in the culture of the twenty-first century.
Paul was a master of this. He walked into Athens, saw their idols, and instead of condemning them, told them he would like to introduce them to the “unknown god” they had an altar set up for. He talked and reasoned with the religious elite and the philosophers. He didn’t condemn or lecture unbelievers (except the Jews who refused to see that Jesus was the Messiah).
Honestly, the question shouldn’t be “How is this possible (for them to know without me telling them)?” but “Is it true (am I really a Christian) if I have to tell them?” I’ve read in leadership books that the one going around telling everyone he is the leader isn’t the real leader (even if he has the title).
My plan is simple: infiltrate the culture and change the atmosphere from the inside out by serving those around me through love in the name of Jesus Christ, making introductions for those who wish to know Him and making disciples of those who want to be like Him. If I can do this, I believe with everything in me that opportunities to share my faith will find me.