GovNews Blog


Saving Time, Money and Preventing Identity Theft – Part 3: Data Entry Best Practices

Posted by Troy Burke on Oct 11, 2016 8:00:00 AM


The ultimate goal of your data entry process is to create an organized set of data, in a specified format, that can be easily accessed and utilized.  The success of your organization in achieving this goal depends on the quality of your data entry process.  Here is a list from Invensis.net of Top 10 Best Practices that, if followed, can deliver an accuracy rate of 99% or higher.

Ensure Data Standards Are in Place

At a minimum, you must have a set of standards that all operators adhere to for every project.  Ideally, these standards are well documented and not open to interpretation.  Many organizations are moving to a “key it as you see it” approach to eliminate abbreviations, translation tables and user inconsistencies.

Data Validation

Once data entry is complete, you need a validation process for quality control.  Many data entry applications can access existing databases to validate entered information, offer choices, and restrict entries. Another common form of validation is to compare two double-blind data entry passes and present the exceptions to an independent third person for review.
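
Both techniques can be sketched in a few lines of Python. This is only an illustration: the field names and the set of valid choices are made-up examples, and real data entry applications implement these checks internally.

```python
# Sketch of two validation techniques: restricting a field to known
# choices, and comparing two double-blind data entry passes.
# Field names and the choice set are hypothetical examples.

VALID_STATES = {"WI", "IL", "MN"}  # restrict a field to known choices

def validate_record(record):
    """Return a list of validation errors for one record."""
    errors = []
    if record.get("state") not in VALID_STATES:
        errors.append(f"invalid state: {record.get('state')!r}")
    return errors

def double_blind_exceptions(pass_a, pass_b):
    """Compare two independent entry passes and return the fields that
    disagree, so an independent third person can adjudicate them."""
    return {field: (pass_a.get(field), pass_b.get(field))
            for field in pass_a
            if pass_a.get(field) != pass_b.get(field)}

first = {"name": "Smith, John", "state": "WI"}
second = {"name": "Smith, Jon", "state": "WI"}
print(double_blind_exceptions(first, second))
# {'name': ('Smith, John', 'Smith, Jon')}
```

Only the fields where the two passes disagree are surfaced, which keeps the third person's adjudication workload small.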

Post Data Entry Storage

If possible, store the data in a non-proprietary, unencrypted and uncompressed format.  The objective is to have data that can be accessed by any application at any time. Here are some preferred file formats from Stanford University:

    • Containers: TAR, GZIP, ZIP
    • Databases: XML, CSV
    • Statistics: ASCII, DTA, POR, SAS, SAV
    • Still Images: TIFF, JPEG 2000, PDF, PNG, GIF, BMP
    • Text: XML, PDF/A, HTML, ASCII, UTF-8

Familiarization with Poor Data Entry Practices

Even with data entry standards in place, different people index differently.  Whether it is inconsistent formats for names or different information entered in a particular column or field, these mistakes can make it difficult for users to find the data they need.  It is important to train operators to recognize, identify and avoid both common and uncommon errors.  When mistakes are made, they should be addressed immediately so they are not repeated.

Descriptive Names

Create descriptive field names that do not contain spaces or special characters; both can cause problems if the data is used for future analysis.
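
As an illustration, here is a small Python helper (hypothetical, not part of any particular product) that normalizes a raw column heading into an analysis-safe field name:

```python
import re

def clean_field_name(raw):
    """Turn a raw column heading into a descriptive name without
    spaces or special characters, safe for databases and analysis."""
    name = raw.strip().lower()
    name = re.sub(r"[^\w]+", "_", name)   # runs of non-word chars -> _
    return name.strip("_")

print(clean_field_name("Date of Birth (MM/DD/YYYY)"))
# date_of_birth_mm_dd_yyyy
```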

Consistency in Column and Row Filing

Data should be entered consistently, not in chunks or blocks stored in different locations. Tagging each column or field with a label that indicates whether it holds letters or numbers, and filling it consistently, makes the data easier to understand.  It also simplifies any future analysis, transfer, or conversion to another format.

Missing Data

Establishing guidelines for handling missing data in your Data Standards will save time and prevent questions down the road.  There are several ways to handle missing data:

    • Leave the field empty and assign a NULL value or NO value to it
    • Enter a distinct value like 9999 to indicate a missing number in a numeric field if you are unable to assign other values
    • Use NA, Not Available or Not Applicable for text fields
    • Data flags can be placed in a separate column to define the missing value
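
Put together, those conventions might look like the sketch below. The field names and the simple list-of-dicts dataset are made-up examples, not a prescribed implementation.

```python
# Sketch of the missing-data conventions above, applied to one record.
# Field names ("age", "county") are hypothetical examples.

MISSING_NUMERIC = 9999   # distinct value for a missing numeric field
MISSING_TEXT = "NA"      # sentinel for a missing text field

def encode_missing(record):
    """Apply the missing-data rules and set a flag in a separate column."""
    out = dict(record)
    out["age_missing"] = False
    if out.get("age") is None:
        out["age"] = MISSING_NUMERIC
        out["age_missing"] = True     # flag column defines the missing value
    if not out.get("county"):
        out["county"] = MISSING_TEXT
    return out

print(encode_missing({"age": None, "county": ""}))
# {'age': 9999, 'county': 'NA', 'age_missing': True}
```

Whichever convention you choose, the point is that it is written down in the Data Standards so every operator handles a blank the same way.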

Complete Lines of Data

When using a spreadsheet, missing data can lead to issues when sorting.  It is a good practice to ensure all cells in a column are complete.
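
The sorting problem is easy to demonstrate in Python, where a column containing an empty (None) cell cannot be sorted at all until every cell is filled:

```python
ages = [34, None, 27]            # one incomplete cell in the column
try:
    sorted(ages)                 # Python 3 refuses to compare None to int
except TypeError:
    print("cannot sort: column has an empty cell")

filled = [a if a is not None else 9999 for a in ages]  # fill every cell
print(sorted(filled))
# [27, 34, 9999]
```

Spreadsheets fail less loudly, silently pushing blank cells to one end of the sort, but the cure is the same: complete every cell before sorting.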

Keeping a Log

Track errors and difficulties encountered during data entry: missing information, wrong or inaccurate data, fields requiring clarification, and the action taken.  Many data entry applications let users apply specific tags indicating the issue with a particular record, or can flag it for a supervisor to review.  This information can be used to fine-tune the data entry process.
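
Even without built-in tagging, a log can be as simple as an append-only CSV file. This is a minimal sketch with made-up column names and record IDs; real data entry applications provide equivalent tagging out of the box.

```python
import csv
import datetime

def log_issue(path, record_id, field, issue, action):
    """Append one data entry issue to a CSV log for later review."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.date.today().isoformat(),
            record_id, field, issue, action,
        ])

log_issue("entry_log.csv", "R-1042", "dob",
          "illegible on source document", "flagged for supervisor")
```

Reviewing the log periodically shows which fields and document types cause the most trouble, which is exactly the input needed to fine-tune the process.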

Automate

Last but not least, use automation to carry out large volumes of data entry.  Automated indexing software can accurately capture 70-90 percent of the desired data, and combined with manual validation it can raise the accuracy rate to 99 percent. By developing rules that focus on the most common documents, the technology shifts staff from data entry to validation and frees time for other critical office functions.

To learn more about our automated indexing software FlexIndex, or to schedule a demo, please contact sales@extractsystems.com or complete the form by clicking the button below.

