How Data Modeling Can Save Money

It’s not uncommon to discover databases that have simply evolved over time rather than being built according to an overall design or “master plan”. And there seems to be no shortage of explanations for why this occurs. Sometimes DBAs are just far too busy with a myriad of other tasks. Sometimes developers are under such pressure to deliver that there’s simply no time. Or, my favorite: the project started as just a couple of new tables and then ballooned into a monster. Whatever the reason, the situation exists far too often in the real world of information systems.

Let’s put this into perspective. Would you purchase a brand-new house if the builder simply showed you a high-level “floor plan” and then said, “Trust us, we’ll just hammer the lumber and stuff together until we’re done. Don’t worry”? Not many of us would so nonchalantly encumber ourselves with a 30-year loan on such vague promises. We automatically expect the home builder to use that rough diagram as the precursor to an architectural blueprint that meets local codes, standards, and ordinances. We expect the city zoning commission or code enforcement division to sign off on key items during construction. We expect to spend the week before taking ownership inspecting the home and marking things to be fixed before the closing. In short, when it’s our money and/or financial future at stake, we expect things to simply work right.

The same is true of many other disciplines we rely on every day. We get on elevators knowing that they were well designed and recently inspected. We readily fly on airplanes knowing that the FAA and other government agencies are making sure they’re safe and well regulated. We undergo surgery at hospitals knowing that doctors and administrators have jumped through major hoops to guarantee a reasonable level of safety. Yet we still build computerized systems with all the skill of a MASH surgeon: just get it stable enough to pass on to the next guy. No wonder, then, that both database administrators and application developers are quickly catching up to lawyers and politicians in terms of likeability.

So let’s start this week by just looking at the two most critical database success factors: effectiveness (doing the right thing) and efficiency (doing it expediently).

How can anything be effective if it’s just cobbled together haphazardly over time? I’m not saying that everyone must build a logical data model and transform it into a physical data model; that may not be everyone’s “cup of tea”. But slowing down a wee bit to think through the overall physical design should yield a more consistent business solution. That should translate into higher effectiveness.

The same can be said for efficiency. Not simply adding tables and columns in a seemingly random, as-needed fashion can easily result in a more fluid design. It’s also much easier to code against such a design, which can mean lower development costs (i.e., it’s less complicated to work with). Furthermore, this generally translates into lower ongoing maintenance costs as well, and often far fewer unplanned crises too.
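To make that concrete, here is a toy sketch (using Python’s built-in sqlite3 module; every table and column name here is invented purely for illustration) contrasting an ad-hoc design, where customer details were bolted onto the order table as needs arose, with a modeled design that factors customers into their own table. When a customer’s phone number changes, the ad-hoc design must update every one of that customer’s order rows, while the modeled design updates exactly one:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Ad-hoc design: columns were added to one table as needs arose,
# so customer details repeat on every order row.
conn.execute("""
    CREATE TABLE orders_adhoc (
        id             INTEGER PRIMARY KEY,
        customer_name  TEXT,
        customer_phone TEXT,
        item           TEXT,
        qty            INTEGER
    )
""")

# Modeled design: customers factored into their own table,
# with orders referencing them by key.
conn.execute("""
    CREATE TABLE customer (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        phone TEXT
    )
""")
conn.execute("""
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(id),
        item        TEXT NOT NULL,
        qty         INTEGER NOT NULL
    )
""")

# The same three orders loaded into both designs.
conn.executemany(
    "INSERT INTO orders_adhoc (customer_name, customer_phone, item, qty) "
    "VALUES (?, ?, ?, ?)",
    [("Acme", "555-0100", "widget", 2),
     ("Acme", "555-0100", "gadget", 1),
     ("Bolt", "555-0199", "widget", 5)],
)
conn.execute(
    "INSERT INTO customer (id, name, phone) "
    "VALUES (1, 'Acme', '555-0100'), (2, 'Bolt', '555-0199')"
)
conn.executemany(
    "INSERT INTO orders (customer_id, item, qty) VALUES (?, ?, ?)",
    [(1, "widget", 2), (1, "gadget", 1), (2, "widget", 5)],
)

# Acme changes its phone number: count how many rows each design touches.
adhoc_rows = conn.execute(
    "UPDATE orders_adhoc SET customer_phone = '555-0142' "
    "WHERE customer_name = 'Acme'"
).rowcount
modeled_rows = conn.execute(
    "UPDATE customer SET phone = '555-0142' WHERE name = 'Acme'"
).rowcount

print(adhoc_rows, modeled_rows)  # ad-hoc touches every Acme order row; modeled touches one
```

The row counts are tiny here, but the pattern scales: redundant columns multiply both the cost of every change and the odds that the copies drift out of sync, which is exactly the kind of ongoing maintenance cost a planned design avoids.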