This post on Dratz's Confessions of an IT Hitman isn't the first place I've heard this said. While I wouldn't go so far as to say that Chet is completely biased towards a data-centric thought process and databases, his letter to the CIO was a little one-sided; probably because he was trying to drive his point home. I'm more inclined to side with Ralph Wilson when he reiterates the old adage that if a hammer is your only tool, every problem looks like a nail.
I'll admit: data is the most important artifact that a business generates. But, IMHO, the most important output of the software development activity is, by far, the simulation of a business process.
This follows from the analogy that every program is a process:
a systematic series of actions directed to some end.
A program/system 'does' things; most evidently, it facilitates, automates, or accelerates a given business function. Well, at least good software does.
Data generated by a system changes as much as the business processes that generate it; in fact, data goes out of date faster. Today's data will be archived and stored for 'future reference', but it will rarely stay in 'active duty' as long as the processes that manipulate it. For all the doubting Thomases questioning my experience in this matter: I work on a team that maintains a Mortgage Origination System used in North America by an MNC bank. We recently wrote a job that archives data pertaining to about 8,000 mortgages every month; and that job only archives unprocessed loans. Add closed loans to that, and the monthly figure would skyrocket.
Anyway, getting back to the point. I believe there is some merit in Chet's argument for putting logic in the database, but...
you should put as much of the logic in the database as humanly possible
is a little extreme! I mean, I don't want to run an INSERT and have an ORA-06502: PL/SQL: numeric or value error thrown at me just to learn that I typed one zero too many or entered a date in the wrong format! Trust me, there are dudes who stretch the idea this far.
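To make that concrete, here's a minimal sketch in Java of the kind of up-front validation I'd rather have in the application layer. The field names and limits are hypothetical; the point is that the user sees "one zero too many" in plain English instead of a raw ORA-06502 bubbling up from the INSERT:

```java
// Hypothetical validator for a loan-entry form. Field names and the
// NUMBER(10,2)-style limit are illustrative assumptions, not a real schema.
import java.math.BigDecimal;
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;
import java.util.ArrayList;
import java.util.List;

public class LoanInputValidator {
    private static final BigDecimal MAX_AMOUNT = new BigDecimal("99999999.99");
    private static final DateTimeFormatter DATE_FMT = DateTimeFormatter.ofPattern("MM/dd/yyyy");

    /** Returns human-readable errors instead of letting the database throw ORA-06502. */
    public List<String> validate(String amountText, String closingDateText) {
        List<String> errors = new ArrayList<>();
        try {
            BigDecimal amount = new BigDecimal(amountText);
            if (amount.compareTo(MAX_AMOUNT) > 0) {
                errors.add("Loan amount exceeds " + MAX_AMOUNT + " - did you type one zero too many?");
            }
        } catch (NumberFormatException e) {
            errors.add("Loan amount must be a number, e.g. 250000.00");
        }
        try {
            LocalDate.parse(closingDateText, DATE_FMT);
        } catch (DateTimeParseException e) {
            errors.add("Closing date must be in MM/dd/yyyy format");
        }
        return errors; // an empty list means the row is safe to INSERT
    }
}
```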
Sidenote: Several situations warrant the temporary storage of data that doesn't meet all the integrity constraints. Well-designed UIs would allow users to set arbitrary savepoints, even when ALL the required data has not been entered. Think of all the times you were filling up a 53-field web form and realized, at field 24, that you need to run down to the drawing room to pull out the referral code for a vendor discount, printed on a coupon you got in the mail 3 months ago; but you don't want the form to time out while you go fish the coupon out of the bowl in which you keep all of your discount coupons. So you're sitting there gritting your teeth, seething away, wondering why the fr*cking developer didn't think of putting a SAVE button there! And when you do put one in, if the application is tied into the database, you either have data in an inconsistent state or you end up designing a schema that doesn't enforce all the integrity constraints declaratively. That's where a Domain Model (sprinkled with a healthy dose of the NullObject pattern, as sketched below) comes into the picture.
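Here's a rough sketch of what I mean, sticking with the coupon example. The ReferralCode type and DraftApplication class are made up for illustration: a NullObject stands in for the code you haven't fished out of the bowl yet, so the domain model can hold (and save) a draft without null checks or relaxed database constraints:

```java
// Hypothetical domain model for a draft form; names are illustrative.
// The NullObject stands in for fields the user hasn't filled in yet,
// so a partial save never trips over a null or a NOT NULL constraint.
public interface ReferralCode {
    boolean isPresent();
    String value();

    // Null Object: a safe, inert stand-in for "not entered yet".
    ReferralCode NOT_ENTERED = new ReferralCode() {
        public boolean isPresent() { return false; }
        public String value() { return ""; }
    };

    static ReferralCode of(String code) {
        return new ReferralCode() {
            public boolean isPresent() { return true; }
            public String value() { return code; }
        };
    }
}

class DraftApplication {
    private ReferralCode referral = ReferralCode.NOT_ENTERED;

    void enterReferral(String code) { this.referral = ReferralCode.of(code); }

    /** A draft is always saveable; completeness is only checked at submit time. */
    boolean isReadyToSubmit() { return referral.isPresent(); }

    String referralForDisplay() { return referral.value(); } // never a NullPointerException
}
```

The integrity rules don't disappear; they just move from the column definition to the submit-time check, where a half-filled draft is a legal state instead of an error.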
Good design is about tradeoffs. I'm not promoting 'Rail'ed webbers who think ActiveRecord is a Silver Bullet, but I'm also not saying that the ActiveRecord pattern has no merit. IMHO, it can be implemented just as well in PL/SQL as it can in Ruby/Java/C#. The point is: where does it make the most sense, given the constraints (the environment) you are working with?
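For illustration, here's the bare bones of the ActiveRecord pattern in Java over plain JDBC; the Loan class, the LOAN table, and its columns are all hypothetical. The same insert/find pair could just as easily live in a PL/SQL package; which home makes sense depends on your environment:

```java
// Bare-bones Active Record sketch over plain JDBC. The LOAN table and its
// columns are assumptions for illustration; the essence of the pattern is
// that the object carries both its data and its own persistence logic.
import java.math.BigDecimal;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class Loan {
    private final long id;
    private final BigDecimal amount;

    public Loan(long id, BigDecimal amount) {
        this.id = id;
        this.amount = amount;
    }

    /** INSERT this object's state as a row. */
    public void insert(Connection conn) throws SQLException {
        try (PreparedStatement ps =
                 conn.prepareStatement("INSERT INTO LOAN (ID, AMOUNT) VALUES (?, ?)")) {
            ps.setLong(1, id);
            ps.setBigDecimal(2, amount);
            ps.executeUpdate();
        }
    }

    /** SELECT a row by primary key and wrap it in a Loan; null if absent. */
    public static Loan find(Connection conn, long id) throws SQLException {
        try (PreparedStatement ps =
                 conn.prepareStatement("SELECT AMOUNT FROM LOAN WHERE ID = ?")) {
            ps.setLong(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) return null;
                return new Loan(id, rs.getBigDecimal("AMOUNT"));
            }
        }
    }
}
```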
The notion that data is the MOST important thing in an application leads to the development of webapps that babysit single-dimensional databases.
That's just saaad! Where is the CREATIVITY?
Update (04-Oct-2004): This just hit me: given that replacing an application (a medium-to-large-scale purpose-built one, not some COTS package) almost always involves a data migration effort, the representation of data (the database) is as transient as the process (the application)!