In most organizations, the data is significantly more important than the applications that use it - data is king. The fundamental reason is simply that data almost always outlives the applications that create it. Data is quasi-permanent, whereas the applications that use the data usually change over time. For example, it's common for an end-user application to be completely replaced after a certain number of years, rebuilt with a more modern set of tools. In this case, the database may:
- remain untouched
- be upgraded to a more modern version of the same database
- be changed to improve its design
- be converted to an entirely different database
Persistent data is just that - persistent - it stays around for a long, long time.
If data is king, then the most important tasks for a developer are those involving the database. Any effort put into improving the database, or how it's used, will have a lasting, long-term benefit. In particular, the application programmer should make sure that:
- the database structure is robust, well-defined and (usually) normalized. If the application programmer is not experienced in database design, then they should seek help.
- the data is always validated before being entered into the database (this cannot be stressed enough)
- actions that consist of more than one SQL statement are properly encapsulated in a transaction, and rolled back correctly when a failure occurs. In general, the integrity of the data should always be preserved.
- applications should never assume that they "own" the data. Databases are independent processes, and are built to interact with many client applications, not just one. For example, many databases are loaded initially using a data load tool. When data is loaded with a script, it bypasses all validation logic implemented in application code. Thus, it's incorrect for a calling application to make assumptions regarding the validity of the data.
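The points about validation and transactions can be sketched in code. The following is a minimal illustration in Python using the standard-library sqlite3 module; the `account` table, the validation rules, and the `transfer` function are all illustrative assumptions, not a prescribed design. It shows data being validated before it reaches the database, and a two-statement action that either commits as a whole or rolls back on failure:

```python
# Illustrative sketch: validate before insert, and wrap multi-statement
# actions in a transaction. Table and rules are hypothetical.
import sqlite3

def validate_account(name: str, balance: float) -> None:
    """Validate data BEFORE it ever reaches the database."""
    if not name or not name.strip():
        raise ValueError("account name must be non-empty")
    if balance < 0:
        raise ValueError("initial balance must be non-negative")

def transfer(conn: sqlite3.Connection, src: int, dst: int, amount: float) -> None:
    """Two UPDATEs that must succeed or fail together - one transaction."""
    if amount <= 0:
        raise ValueError("transfer amount must be positive")
    # Using the connection as a context manager: commits on success,
    # rolls back automatically if any exception is raised inside.
    with conn:
        conn.execute(
            "UPDATE account SET balance = balance - ? WHERE id = ?",
            (amount, src),
        )
        conn.execute(
            "UPDATE account SET balance = balance + ? WHERE id = ?",
            (amount, dst),
        )
        # Preserve integrity: no overdrafts allowed.
        (balance,) = conn.execute(
            "SELECT balance FROM account WHERE id = ?", (src,)
        ).fetchone()
        if balance < 0:
            raise ValueError("insufficient funds")  # triggers rollback

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (id INTEGER PRIMARY KEY, name TEXT, balance REAL)")
validate_account("alice", 100.0)
validate_account("bob", 50.0)
conn.execute("INSERT INTO account VALUES (1, 'alice', 100.0)")
conn.execute("INSERT INTO account VALUES (2, 'bob', 50.0)")

transfer(conn, 1, 2, 30.0)   # succeeds: balances become 70 and 80
try:
    transfer(conn, 1, 2, 500.0)  # would overdraw: both UPDATEs rolled back
except ValueError:
    pass
```

Note that the rollback is handled in one place, so a failure partway through never leaves the data half-updated. The same discipline applies whatever the database or language: the transaction boundary, not the individual statements, is the unit of integrity.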