To ensure data integrity, use foreign keys and constraints, but be careful not to overuse or underuse these checks. Domain (lookup) tables work well when there are many valid values, or when the values change frequently. One pitfall is deciding that the application will check integrity. The issue is that a central database may be accessed by many applications, and you generally want to protect the data where it lives: in the database. If the possible values are limited or fall in a range, a check constraint may be preferable. Say messages are defined as either Incoming or Outgoing; in that case there is no need for a foreign key to a domain table. But for something like valid currencies, which may seem static, the values actually change from time to time: countries join a currency union and currencies come and go. Applications should also perform integrity checks, but don’t rely on the application alone. Defining integrity rules in the database ensures that those rules are never violated, so the data satisfies them at all times.
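As a minimal sketch of the two approaches above (using SQLite via Python’s sqlite3 module; the table and column names are illustrative, not from any real schema): a CHECK constraint handles the small, fixed set of message directions, while a Currency domain table backs a foreign key for values that change over time.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled

# Domain table: the set of valid currencies can grow or shrink over time.
conn.execute("""
    CREATE TABLE Currency (
        CurrencyCode TEXT PRIMARY KEY   -- e.g. 'EUR'
    )""")

# Check constraint: the two message directions are fixed, so no lookup
# table is needed for them.
conn.execute("""
    CREATE TABLE Message (
        MessageID    INTEGER PRIMARY KEY,
        Direction    TEXT NOT NULL
                     CHECK (Direction IN ('Incoming', 'Outgoing')),
        CurrencyCode TEXT NOT NULL REFERENCES Currency (CurrencyCode)
    )""")

conn.execute("INSERT INTO Currency VALUES ('EUR')")

# A valid row passes both rules.
conn.execute("INSERT INTO Message VALUES (1, 'Incoming', 'EUR')")

# An invalid direction is rejected by the CHECK constraint...
try:
    conn.execute("INSERT INTO Message VALUES (2, 'Sideways', 'EUR')")
except sqlite3.IntegrityError as e:
    print("rejected:", e)

# ...and an unknown currency is rejected by the foreign key.
try:
    conn.execute("INSERT INTO Message VALUES (3, 'Outgoing', 'XXX')")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

Because the rules live in the schema, every application that writes to this database gets the same protection, which is the point of enforcing integrity in the database rather than in application code.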
Keys often generate controversy: primary keys, foreign keys, and artificial keys. Every table needs a primary key that identifies each row; the art is deciding which columns should be part of it and what values to include. For proper normalization, each table needs an identifying key, and uniqueness must be guaranteed. Yet the natural key and the primary key don’t have to be the same: an artificial key can serve as the primary key, as long as the table still has a natural key. Some data modelers prefer an artificial key for uniqueness; others prefer a natural key to ensure data integrity. So, should we use a natural key as the primary key? One challenge arises if the natural key must be changed: if it consists of many columns, you may need to make changes in many places. Another challenge is using an artificial key as the only key for a table. As an example, consider a table storing product information, defined with an artificial key such as a sequence, a short alphabetic product code, and the product description. If uniqueness is ensured only by the artificial key, there may be two rows with the same product code. Are these the same product entered twice? A unique key on the product code may be more appropriate.
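The product example can be sketched like this (again with SQLite through Python’s sqlite3; the names ProductID, ProductCode, and Description are illustrative): a surrogate ProductID serves as the primary key, while a UNIQUE constraint on the natural key, ProductCode, keeps the same product from being entered twice.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Product (
        ProductID   INTEGER PRIMARY KEY,   -- artificial (surrogate) key
        ProductCode TEXT NOT NULL UNIQUE,  -- natural key: short alphabetic code
        Description TEXT NOT NULL
    )""")

conn.execute(
    "INSERT INTO Product (ProductCode, Description) VALUES ('WID', 'Widget')")

# A second row would get a different ProductID, so the artificial key alone
# would happily accept it; the UNIQUE constraint on the natural key is what
# rejects the duplicate.
try:
    conn.execute(
        "INSERT INTO Product (ProductCode, Description) "
        "VALUES ('WID', 'Widget, entered again')")
    duplicate_allowed = True
except sqlite3.IntegrityError:
    duplicate_allowed = False

print("duplicate allowed:", duplicate_allowed)
```

Without the UNIQUE constraint, both rows would be stored and the question "are these the same product entered twice?" would have no answer the database could enforce.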
Naming conventions might not appear important during design, but in reality names provide the insight needed to understand a model. They are an introduction and should be logical. Inconsistent naming serves no purpose; it only frustrates the developers who must access the data, the administrators of the database, and the modelers who must make changes in the future. When “ID” is used for some artificial keys but other tables follow a different convention (such as Number), developers, analysts, and DBAs waste time understanding the exceptions. Weak naming conventions also lead to errors in development, precisely because the naming is not consistent. Hand-in-hand with documentation, a naming convention makes it easier for someone in the future to understand the model. Do not switch randomly between “ID” (like CustomerID) and “Number” (like AccountNumber) as the keys for tables. Make exceptions to the conventions only when they are justified, and document what the exception is and why the convention is not respected. The same applies to cryptic names like “XRT1” – is that the extended reference tables? Your guess is as good as mine. I hope the designer knew why he chose such a cryptic name, but I doubt the next person to access the database can guess the reason. Naming conventions are a matter of personal choice; just make sure your decisions are consistent and documented. If I have succeeded in convincing you to apply a naming convention in your database design, feel free to read my next article, which is entirely devoted to this subject.
When making a data model, everything seems obvious. You name the objects so that their purpose is evident, expecting everyone to understand the meaning just by reading the name. This may be true today, but it isn’t as obvious as you might think: over time, the meaning of objects becomes unclear without documentation. Using a naming convention is one step towards effective documentation. When you have to make changes in the future, you will appreciate any documentation that already exists. A short, simple document that describes the decisions you made and the resulting design will help explain the design choices later. You want enough documentation that a new administrator can manage the database and understand the meaning without having to come back to you for explanations. If the data model and its environment are not documented, it is difficult to maintain or change them as requirements change. To some extent, documentation has little to do with data modeling itself; it is about communicating the design and making it understandable in the future. Documentation is often an afterthought, and when schedules are short it gets ignored. Yet this is technical debt with a high cost: cutting corners during the development cycle accrues costs later, for database changes, problem identification, bug tracking, and understanding the data model and the nature of the data. As an example, data models often use an “ID” field as the primary key of a table, or as a portion of a key’s name, such as TransactionID on the Transaction table. If some tables use “Number” as part of a key’s name instead, it is good to document why. Perhaps ReferenceNumber is used as the name of the primary key on the Message table because that is what the reference is called in the business area.
For example, in financial services, financial messages typically include a reference number. Document the definitions of tables, columns, and relationships so that programmers can access the information. The documentation must describe what is expected of the database structure. In the Vertabelo tool, I can immediately add comments to any item: tables, columns, references, alternate keys. This way the documentation is stored with my model rather than in some extra document maintained separately. Poor or absent documentation is often due to shortsighted thinking, but do not ignore its importance: it is an issue that must be addressed.
One problem I have seen is data modeling happening at the same time as software development. This is like building the foundation before completing the blueprints. In the past, planning seemed an obvious step before starting development: teams would not create databases without planning, just as architects would not construct buildings without blueprints. In application development, it is critical to understand the nature of the data. Yet planning is often skipped so that developers can just “start coding”. The project starts, and when issues come up there is no slack in the schedule to address them, so developers take shortcuts with the intent to fix them later, which rarely, if ever, happens. Careful planning is how you ensure that you end up with a proper database rather than one hacked together. If you don’t spend the time and effort upfront on the data required by your processes, you’ll pay for it later with a database that must be reworked, replaced, or scrapped. Even if planning isn’t always done, many data modelers still follow these guidelines. That’s not to say we can predict every design need in advance, but most modelers believe it is worth the effort to understand the data and its usage; you would not want a design built for transaction processing when the need is analytic reporting. Times have changed: Agile methodologies are more prevalent, so database teams must rethink their approach to data modeling. In Agile, the Domain Model from use cases is used instead of Entity Relationship Diagrams, but the need for planning has not diminished. We still need to understand the data and what it is supposed to do, and in general the first few sprints must focus on data design. So it is not Agile that is the issue for database modelers, but individuals who do not grasp the nature of data. Some see database development as the same as application development; they are different disciplines, and each needs appropriate focus.
The database is the core of most software applications, so you must take the time to analyze the requirements and how the data model will meet them. This decreases the chance that the development will lose direction. Developers must understand the importance of data and its contribution to the development process. We live in the information age: applications display and manipulate data, and it is the information contained in that data that gives meaning to the application. It is not possible to foresee every requirement or every issue, but it is important to prepare for problems through careful planning.
When you look at the document on your screen, all you see are frame edges, clutter, and guides scattered across your pasteboard. But that is not how the document will look in its final form. If you want a better look at your document without wasting paper or going through the process of creating a PDF file, InDesign has you covered. Press the W key and InDesign switches to Preview mode, hiding the grayed-out guides, frame edges, and every other element that won’t be present in the final document. Another way to view the final document is to press and hold the Preview icon in the toolbar, which displays multiple preview modes, including Slug, Bleed, and the newer Presentation mode. InDesign is a huge application with numerous options that come in handy for designers and layout artists across diverse industries. This article presents some of the easiest and most effective tools, useful to as many professionals as possible. We hope the set of features discussed will let you explore more possibilities in InDesign.
If your document is crowded with a heavy load of images, text, frames, and all sorts of objects on top of each other, managing them all while designing is surely a tough job. The stacked frames feature lets you move through the whole pile of objects one at a time. To use it, hold Ctrl (Windows) or Cmd (Mac) and click the topmost object where you want the selection to start, then keep clicking: with each click, InDesign digs deeper through the stack until it selects the object you need.
Now, mimicking the settings of one object on another is just a click away. The eyedropper tool, first introduced in Adobe Illustrator to copy the color, fill, and stroke from one object to another, was an overnight success, and the same feature soon found its way into Adobe InDesign. To copy and apply a style from one source object to others: select the objects you want to affect, choose the Eyedropper tool, and click the source object. Double-click the Eyedropper tool icon in InDesign to open an options box where you can select and deselect the attributes to be picked up and applied.
Designing is a painstaking craft. The more you work on a piece, the more beautiful it becomes, and as in every other art, the finishing demands more and more time. Small jobs like aligning text, applying styles and colors, or resizing objects take as much time as designing the layout, and in many cases longer than the main job itself. InDesign cuts down this time with the Quick Apply feature. As the name suggests, it gives fast access to menu commands and styles: press Ctrl+Enter (Windows) or Cmd+Enter (Mac) to launch Quick Apply, type the name of the menu command or style you want, and press Enter to apply it to the selected object.
A long-standing worry of the design community is the inability to design with the help of code. Most designers are not scripting experts, but many of them can use code and tags, which lets them get routine tasks done quickly and finish a design job in less time. With Adobe InDesign, almost anything that can be done manually can also be scripted, and the best part is that you need not be a hardcore coder to use InDesign scripts. By simply copying a script into the appropriate Scripts folder inside the InDesign folder, designers can quickly achieve the intended results. InDesign also ships with numerous scripts already installed; you can access them through the Window > Utilities > Scripts panel and look through the sample scripts.