
Big Data, Data Quality And the GIGO Rule



Big Data is important for nonprofits too!

In the association and nonprofit world, we are very big on acronyms, aren't we? The software world, though, has carved out quite a few of its own. The acronym GIGO (Garbage In, Garbage Out) was the purported brainchild of an IBM instructor, George Fuechsel. According to a piece published on the Princeton University website:

“[The term is] used primarily to call attention to the fact that computers will unquestioningly process the most nonsensical of input data (garbage in) and produce nonsensical output (garbage out).”

To be sure, it’s a phrase, and an acronym, that’s likely as old as the computer industry itself. You would think that we would have eliminated it completely by now, after all of these decades of experience. And yet there still seems to be some fairly smelly trash out there.

However, the caution is even more pertinent today, in the age of powerful computers, vast database processing capability, and the term "Big Data." If you want your member or donor database to be worth the effort, pay attention to these five precepts for ensuring good data quality, regardless of how big that data may be:

1. Set up data validation rules. You would not want a donor's dollar amount entered into a date field. Setting up a validation rule for each field prevents many data entry errors and omissions (see the sketch after this list).

2. Use a data entry screen. The best way to ensure consistency is to devise a user-friendly data entry form that accommodates one record at a time.

3. Devise detailed data labels to avoid confusion. A data label is your field name. An ambiguous or confusing data label would be Date. A detailed data label would be Date Donation Received.

4. Control the vocabulary by programming a choice list that limits what can be entered in a field. Most database programs offer this as a field-level setting, and restricting the values your database will accept blocks unwanted or erroneous entries.

5. Always document your database design. The object is to make sure your data is immediately understandable to others who use it in future data collection projects — as well as a memory jog to yourself later on.
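To make precepts 1, 3, 4 and 5 a little more concrete, here is a minimal sketch of what they can look like in practice. It uses Python's built-in sqlite3 module; the donations table, the donors.db file, the field names and the allowed payment methods are all illustrative assumptions, not a prescription for your own schema.

```python
import sqlite3

# Illustrative schema -- field names and rules are assumptions, not a prescription.
SCHEMA = """
CREATE TABLE IF NOT EXISTS donations (
    donation_id            INTEGER PRIMARY KEY,
    donor_full_name        TEXT NOT NULL,                          -- precept 3: detailed label
    donor_email            TEXT CHECK (donor_email LIKE '%_@_%'),  -- precept 1: validation rule
    donation_amount_usd    REAL NOT NULL CHECK (donation_amount_usd > 0),
    date_donation_received TEXT NOT NULL
        CHECK (date_donation_received GLOB '[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9]'),
    payment_method         TEXT NOT NULL
        CHECK (payment_method IN ('check', 'credit card', 'cash', 'online'))  -- precept 4: choice list
);
"""

conn = sqlite3.connect("donors.db")
conn.executescript(SCHEMA)

# A dollar amount in the date field, or an unknown payment method,
# is rejected at entry time instead of polluting the database.
try:
    conn.execute(
        "INSERT INTO donations "
        "(donor_full_name, donor_email, donation_amount_usd, date_donation_received, payment_method) "
        "VALUES (?, ?, ?, ?, ?)",
        ("Pat Donor", "pat@example.org", 50.0, "100.00", "wire"),
    )
except sqlite3.IntegrityError as err:
    print("Rejected bad entry:", err)

conn.close()
```

The comments beside each field also double as the start of the design documentation precept 5 asks for; a fuller data dictionary would live alongside the schema.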

The impact of bad data processes can be dramatic. Here is one quick illustration: in a year-old donor or member database of 1,100 names, addresses, telephone numbers and e-mail addresses, roughly 11 percent of the entries will typically have gone stale.

That works out to over 120 returned fund-drive letters or bounced e-mails (0.11 × 1,100 ≈ 121), plus wasted time for your telephone marketers chasing contributions from people who can no longer be reached. If your donor database is the flagship of your fund drives or annual event, obsolete data is the barnacles on its hull.

Since we're into metaphors here: when it comes to avoiding data rot, you eat that 72-ounce steak one bite at a time. You do that through ongoing data verification and by smoking out errors as data is transcribed. There are also database analysis applications, like our partner Updentity, that will run quality checks to keep your database current.
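As a rough idea of what "one bite at a time" can look like, here is a small sketch of a routine quality check you could run on a schedule. It assumes the same hypothetical donations table from the earlier sketch and uses only Python's standard library; it is not how Updentity works, just the kind of check the metaphor has in mind.

```python
import re
import sqlite3

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # rough sanity check, not full RFC validation

def quality_report(db_path: str = "donors.db") -> dict:
    """Count suspect rows in the (hypothetical) donations table."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT donation_id, donor_full_name, donor_email FROM donations"
    ).fetchall()
    conn.close()

    # Missing or malformed e-mail addresses mean bounced appeals.
    bad_email = [r for r in rows if not r[2] or not EMAIL_RE.match(r[2])]

    # Duplicate (name, email) pairs are a common sign of double entry.
    seen, dupes = set(), []
    for r in rows:
        key = (r[1].strip().lower(), (r[2] or "").strip().lower())
        if key in seen:
            dupes.append(r)
        seen.add(key)

    return {
        "total_records": len(rows),
        "missing_or_bad_email": len(bad_email),
        "possible_duplicates": len(dupes),
    }

if __name__ == "__main__":
    print(quality_report())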

It may be time to move up

Your member or donor spreadsheet may have grown too wide and deep, or your database may have meandered into the murky realm of undocumented confusion. Nonprofit organizations have unique software requirements, and these barebones applications eventually buckle under the sheer weight of the data they support.

We discuss the unique needs of nonprofit software in our blog post on Engagement Management Software. If it is time for your nonprofit organization to bring all of your data together, or if we can answer any questions about data quality or the value of an integrated database for making better decisions, please contact us.
