Eliminating two very obscure, extremely redundant costs

How accurate does your data have to be to provide you with good information? Have you ever considered just how accurate your data really is? Have you ever thought about the amount of time your company spends correcting data errors? Sometimes even more alarming: do you ever consider how much time is lost looking for information which really should have been instantly available?

In many companies there are individuals, sometimes even large teams of accounting people, who are assigned to do nothing but correct errors. Then there are organizations whose decision makers spend almost 100 percent of their time looking for the information they need to make a decision, not actually making decisions, the job they were hired to do. Sound familiar? Almost all company executives, middle managers and support staff, regardless of their industry or the size of their organization, spend far too much time doing both.

Unfortunately, most IT and administrative managers either do not, or choose not to, recognize that the problem exists, or they have lived with it so long and become so accustomed to it that they honestly believe continually looking for things and correcting mistakes is simply a natural and unavoidable cost of doing business.

On the contrary, the truth is: mistakes can be avoided! The information you require can always be at your fingertips! In many organizations, 50 percent or more of administration costs could be avoided by implementing information systems specifically designed to address these two very costly issues.

Were you to ask some managers about the accuracy of their corporate data, they might say they believe their information is 90 or maybe 95 percent accurate. Others might say theirs is 97 or 98 percent accurate, but that some parts of it are probably only 85 percent. The truth is, in most cases, they really don't know, and almost all will have fallen short of the degree of accuracy they would like to achieve. Would it not be much better to simply know the data was genuinely good and be able to say so with a high degree of confidence?

Asserting what one feels the degree of accuracy of one's data to be is dangerous, because that assertion may go on to influence decisions made by others who do not share the same point of view. The probability that anyone else understands exactly where the data is prone to error is low, and the 15 percent you considered error-prone might unknowingly constitute 75 percent of the details analyzed in a report someone else requested and treated as valid information. Not a good situation to find oneself in when one considers that decisions are based on what is supposed to be reliable information. Have you ever wondered just how bad a good decision can really be when it is based on bad or incomplete information?

Now, having said all that, we can safely and justifiably say that information which is not 100 percent accurate is also 100 percent wrong. Management at all levels requires virtually instant information along with a high level of confidence that it is also extremely accurate. There are several steps systems designers need to take in order to make information systems truly efficient.

The timing of data collection is critical to ensuring data integrity, because timing dictates who is actually doing the recording. The absolute worst time to capture data, which is how most systems work, is long after the actual event has taken place. This is especially true when the recording of the event is no longer critical to the event actually taking place. Typically, if and when errors are made at that point, they are, by default or for lack of time, considered unimportant and no longer worth fixing.

The best time to collect data is in one of the following three stages. Often the recording of an event requires a combination of all three. The more data captured in the first and/or second stages, the better.

  • As part of setting up or preparing the activity to be executed.

 

  • As the activity is actually taking place; in effect, the recording may simultaneously be directing the activity and recording the event, both equally important.

 

  • The instant the activity matures, and/or as the activity becomes the basis of a successive activity in the continuation of a workflow process.

 

Any piece of data, even a tiny piece, that is not 100 percent correct is also 100 percent wrong. We can therefore only conclude that every piece of data collected is as significant as the whole of that particular data set, for if it is wrong, the entire set is also incorrect, because information is only as good as the data from which it is derived. How we should collect data, and what we should or should not do, is the essence of the following points if a high level of accuracy is to be attained.

  • Systems engineers and designers need to work with a mindset of creating information systems, not developing accounting tools. Accounting systems in themselves do not create revenues, nor do they make positive additions to the bottom line. However, information systems which assist people in getting the job done effectively will always contribute to the bottom line through increased revenues, reduced costs, or both. In good information systems, accounting is simply a by-product of the process of getting the job done.

 

  • Always make the recording of the activity an integral step in the execution of the activity itself. For example, if we want to purchase something, create a purchase order as part of the activity, so that creating the purchase order becomes a defining part of the process itself. It must be correct or we will not receive what we want to purchase.

 

  • Always implement fully integrated, totally seamless database file techniques supported by linked and/or moving workflow processes.

 

  • Implement logical verification checks at every possible opportunity, so that every piece of data is as logically correct as humanly possible. For example, if a unit of measure code must be on every purchase transaction, set up a unit of measure table in a verification file. In like manner, edit all data for everything editable, including the unit of measure code. Never allow data, in any record, to be written to disk until every field in the record passes all edit tests (see the first sketch following this list).

 

  • Never allow changes to any previously posted (updated) data. Always reverse the original transaction in full, then copy it, revise the copy to what the original should have been in the first place, and repost it (see the second sketch following this list). This provides a perfect audit trail of everything you have done, totally removes erroneous data, eliminates questionable adjustment calculations, and the net result is perfect data.

 

  • Always move the recording of the activity as far back in the process as possible, preferably back to the person actually doing the job. No one can be as responsible for the transaction, or know its details as well, as the person doing the task.

 

  • Always forward existing data to successive job steps so there is minimal keying at each successive step. This not only removes a tremendous amount of duplicated effort and opportunity for error but, even more importantly, it forces the succeeding processes to become verification processes rather than just another opportunity for keying errors. The next person will not only add further details but will also, quite unwittingly, verify previously entered data and make any required corrections. If a purchase order is succeeded by a receiving report in a workflow receiving process, followed by a workflow payables invoice entry process, then by the time the transaction is posted for payment it has a very high probability of being 100 percent correct (see the third sketch following this list).

 

  • Use new high speed, direct connect imaging techniques to record all documents created and supplied by third parties. Take the stand that no information is accurate if any of the facts are missing, even externally created facts. Handle all third party documents only once, using the computer to automatically create all cross reference and link values to the image repository. Doing so promotes consistency and a high level of employee performance as well as ensuring accuracy. Pressing a function key to capture the image and selecting a record type to define the image type should be the user's only responsibilities in recording the image. Due diligence in recording third party documents means making sure you have all the facts at your immediate disposal. Remember, when problems do arise and a story emerges, there are always two sides to the story. The more you know about both, the better. Better still, when you have it all at your instant disposal, you tend to hold the winning hand. You will then know with confidence that due diligence has truly been served.

 

  • Make certain that correcting mistakes is easy and swift. We all can, and occasionally do, make mistakes. From time to time the mistakes we find are not our mistakes at all but someone else's, or they may occur as a result of circumstances entirely beyond our control. Even the best of logic can be defied by reality (we ordered 24 color crayon boxes but received 48 color boxes instead), so make sure it is extremely easy to recover. The easier it is to recover, the greater the likelihood that the recovery process will actually be executed, and the higher the probability that it will be done correctly.

 

  • Make sure all data is available to users the instant it is recorded, and also make certain it is not removed for a very long time; consider keeping it for as long as 10 years, or even more if possible.
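To make the verification-check point concrete, here is a minimal sketch in Python. The table, field names and the `validate_purchase_line` and `write_record` functions are hypothetical, invented only to illustrate the principle of refusing to write a record until every field passes its edit tests.

```python
# Minimal sketch of field-level edit tests before a record is written.
# The codes, field names and rules below are illustrative assumptions only.

UNIT_OF_MEASURE_CODES = {"EA", "BOX", "KG", "LTR"}   # verification table

def validate_purchase_line(line: dict) -> list[str]:
    """Return a list of edit failures; an empty list means the record is clean."""
    errors = []
    if line.get("unit_of_measure") not in UNIT_OF_MEASURE_CODES:
        errors.append("unknown unit of measure code")
    if not isinstance(line.get("quantity"), (int, float)) or line["quantity"] <= 0:
        errors.append("quantity must be a positive number")
    if not line.get("item_number"):
        errors.append("item number is required")
    return errors

def write_record(line: dict, database: list) -> None:
    """Never write a record until every field passes every edit test."""
    errors = validate_purchase_line(line)
    if errors:
        raise ValueError("record rejected: " + ", ".join(errors))
    database.append(line)   # stands in for the real write to disk
```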
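The reverse-and-repost rule can be sketched the same way. The transaction layout, the `posted_ledger` list and the `correct` helper are again assumptions made for illustration: rather than editing posted data in place, we post an exact reversal and then post a corrected copy, so the ledger keeps a complete audit trail.

```python
from copy import deepcopy

# Sketch of correcting posted data without ever changing it in place.
posted_ledger = []

def post(transaction: dict) -> None:
    """Posting appends; nothing already in the ledger is ever modified."""
    posted_ledger.append(transaction)

def correct(original: dict, **fixes) -> dict:
    """Reverse the original, copy it, apply the fixes, and repost the copy."""
    reversal = deepcopy(original)
    reversal["amount"] = -original["amount"]          # exact reversal
    reversal["note"] = "reversal of " + original["id"]
    post(reversal)

    corrected = deepcopy(original)
    corrected.update(fixes)                           # what it should have been
    corrected["id"] = original["id"] + "-corrected"
    post(corrected)
    return corrected

post({"id": "INV-100", "amount": 240.0, "unit_of_measure": "BX"})
correct(posted_ledger[0], unit_of_measure="BOX")
# posted_ledger now holds the original, its reversal, and the corrected copy.
```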
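Finally, a sketch of forwarding data through successive workflow steps, purchase order to receiving to payables. The record shapes and function names are hypothetical; the point is simply that each step starts from the previous step's data, so the person at each step verifies and extends rather than rekeys.

```python
# Sketch of carrying data forward through a purchasing workflow so each
# step verifies and extends the previous one instead of rekeying it.

def create_purchase_order(po_number: str, item: str, qty: int, unit: str) -> dict:
    return {"po_number": po_number, "item": item, "qty_ordered": qty, "unit": unit}

def record_receipt(po: dict, qty_received: int) -> dict:
    # The receiver sees the ordered data and only adds what is new.
    receipt = dict(po)                      # forwarded, not rekeyed
    receipt["qty_received"] = qty_received
    return receipt

def enter_payables_invoice(receipt: dict, invoice_number: str, amount: float) -> dict:
    # By this step the core details have been seen, and implicitly verified,
    # by two previous people; the clerk only adds the invoice facts.
    invoice = dict(receipt)
    invoice.update({"invoice_number": invoice_number, "amount": amount})
    return invoice

po = create_purchase_order("PO-1001", "crayons, 24 color", 10, "BOX")
receipt = record_receipt(po, qty_received=10)
invoice = enter_payables_invoice(receipt, "INV-55821", amount=240.00)
```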

 

Most importantly, we must consider why: why we are collecting the data in the first place, and why it is important to ensure all data is pristine. Of course, all the normal reasons prevail: paying the bills, collecting receivables, protecting assets, satisfying shareholders, and an extensive list of reports which need to be run on a routine basis to provide all manner of analysis. To most, the reason why appears routine and mundane, if not obvious.

More importantly, however, data must be accurate for reasons we have not yet discovered. It must be both accurate and complete when we need it, perhaps now, a year from now, five years from now, even ten years from now, or whenever. We will want the assurance of absolute integrity regardless of the age of the data we select for the compilation of our report, whatever that report might be. Confidence in our decision making can only be assured by knowing that our data has always had a very high level of integrity. If we fail on data integrity we fail ourselves, and there is no recovery; we cannot return to our past to fix today. If we fail on data integrity, we fail on tomorrow.

Attaining something near 100 percent accuracy can be achieved by using every means at our disposal. It takes good programming, good administration and strong commitment, but it can be done. Will it ever be perfect? No, not likely. But getting to the point of knowing that your data is genuinely good is definitely achievable, and several very significant things will happen when you know it is good.

  • No one will be spending countless hours correcting mistakes, theirs or anyone else's.

 

  • No one will be consuming endless hours seeking buried details to produce the information needed to make and justify a decision which should have taken something more like 30 seconds.

 

  • Perhaps you will never know the exact figure, but there is a profound satisfaction in feeling, perhaps even knowing, that you cut your administrative costs, at least in some departments, by as much as 50 percent or perhaps even more.

 

  • There is also a certain sense of peace and comfort in knowing where you stand at all times. Almost everything you want or need to know is virtually instantly at your fingertips. As a bonus, you will have an exciting newfound sense of knowing that when you do run a report, it will probably be quite correct, and all is well.

 

Unfortunately, there is no quick fix for either of these issues. Moving systems engineering and development from an accounting oriented mindset to an information oriented mindset is going to take a new style of systems engineer. People developing and writing these new systems will require an extremely good understanding of both the requirements of accounting and of what are, and are not, powerful, reliable computer technologies. These objectives cannot be attained by patching our existing mainstream software systems. The entire school of thought on systems design will evolve entirely new concepts in virtually every aspect of what an information system can and will be. Entirely new systems will emerge. Disk storage capacities will become much larger and information retrieval methodologies will become extremely fast. The requirement for both exists today, and that requirement will not go away. The challenge is to meet it.

Obviously, opportunities abound to address these issues, and the benefits to be obtained extend far beyond the direct cost of fixing our mistakes or finding documents in question. Today's decisions really need to be, and can be, made today, not tomorrow, with a much higher probability of being correct, because they rest on accurate and timely data. As a bonus, huge amounts of human resources will be channeled into honest productivity, thereby making corporations far more competitive. Personal satisfaction levels of everyone, both internal and external to the organization, will also improve.
