Leadership
Jonathan Falwell on ‘ministry measurements’. In a post over at the Christian Post, here is some of what he had to say:

“I believe that we have self-imposed measurements of success that are skewed, that are wrong… The measurements of success are all messed up.”

While there is nothing wrong with the “Top 25” or “Top 100” largest churches or most influential lists, trying to make it to those lists has forced many pastors to focus on the masses rather than “the one.”

“Stop focusing on the ‘big ministry’ and the ‘big outreach,’” he urged, noting that ministers place too much pressure on themselves. “Start focusing on one person, one hurting person, who’s lost, … who’s desperate to hear the Gospel.” “We have a responsibility to minister to the one.”

You can read more here…

Your thoughts?

Todd

Trends
My friend Kent Shaffer hits a great point over at the Church Relevance blog: what is most interesting about these lists is no longer the data itself but rather how many churches are choosing to no longer take part in these studies.

Kent continues: “From a research perspective, this nonparticipation is sad. But theologically speaking, the reasons many churches choose to not broadcast their numbers are quite noble. Many nonparticipating churches just don’t want to negatively affect other churches. And, of course, others just forget to report their numbers to the researchers.”

I agree. But I think there’s one other possible reason that some churches aren’t reporting their numbers this year. I think… just possibly… that some churches may not have reported their numbers because their numbers are down. I mean, who wants to go from the 5th largest church in America to 8th?

When this whole list thing started a few years back, it was fun. It was interesting. And the first list ever presented was probably the most honest. Human tendency says that the next year, the pressure was on to put up a better number for the list than the year before. Now, years into it, it’s more and more difficult to produce an honest number that looks better than the year before’s number. After all, you have people out there (like Kent!) who actually look at the numbers and compare them to other lists and other years.

I’m not saying that churches knowingly fudge their numbers. Not at all. I’m just saying that there is an inherent pressure to make your numbers look better than last year’s numbers. And if you can’t, maybe you don’t participate in the top 100 list.

Of course, this is just a theory. It could be that some of these churches are simply taking the more noble approach, as Kent suggests, or that they just forgot to report their numbers.

I respect the people who do the research for the list. I know them, and they are honest researchers and publishers. Unfortunately for them, though, having a top 100 list where a growing number of churches refuse to participate does not help their cause. You simply can’t have a top 100 list when part of the 100 is not included.

What do you think? Do you look at the top 100 lists? Are they helpful? Do you think they’ve run their course? And do they lose any credibility when not everyone is included?

I’d love to hear your thoughts.

Todd