Look, it's no secret. We have a very strong product, erwin Data Modeler. Can you give me a rundown of how erwin Data Modeler can help with database migration?
You absolutely want to take advantage of logical data modeling. Make sure that in the modeling process you use the model to engage with your stakeholders, understand what the objectives of the migration are, and make sure they're reflected in the logical data model so that the data is represented the way you want. Make sure you understand all of the business data requirements and turn them into understandable data structures that will meet those requirements.
Always normalize your data in a logical data model. A lot of databases have undergone some level of denormalization for performance purposes. You want a clear picture of what the true normalized data is, because that's the business requirement. That's what people understand, and that's what you need to stay true to, no matter what database system you're using, to make it perform with the application or at the speed the business needs.
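As a rough illustration of that idea (a Python sketch with hypothetical entity names, not erwin output), a denormalized order record can be teased back apart into the normalized entities the business actually talks about:

```python
from dataclasses import dataclass
from decimal import Decimal

# Denormalized source row, as it might exist today:
# (order_id, customer_name, customer_email, product_name, unit_price, qty)

# Normalized logical entities recovered from that row. Each one maps to
# a single business concept, which is what the logical model captures.

@dataclass
class Customer:
    customer_id: int
    name: str
    email: str

@dataclass
class Product:
    product_id: int
    name: str
    unit_price: Decimal

@dataclass
class OrderLine:
    order_id: int
    customer_id: int  # relationship back to Customer
    product_id: int   # relationship back to Product
    quantity: int
```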
From that, make sure you establish those relationships even if they're not explicitly supported or defined in the database, because they can also be supported and defined in an application. Make sure you understand the business rules behind those relationships so that you're not breaking them, and everybody's clear on that. And then make sure you can articulate a clear data dictionary of what data is in this system and what good means for that data: what it should be, not necessarily what it is in the database, but what it should be from a business perspective, because that will always be your true north.
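A minimal sketch of what a data dictionary entry might hold (the attribute names and fields here are hypothetical, purely for illustration):

```python
# Each attribute gets a business definition describing what "good" means,
# independent of how the value happens to be stored today.
data_dictionary = {
    "customer.email": {
        "definition": "Primary contact email for the customer",
        "expected_format": "valid email address, stored lowercase",
        "required": True,
    },
    "order_line.quantity": {
        "definition": "Units of the product on this order line",
        "expected_format": "positive integer",
        "required": True,
    },
}
```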
So logical data modeling is hugely important in a migration, because that's where you're really going to understand what the business has today, what the business wants, and how you're going to get there. Now, from a physical data modeling perspective, there are some really key pieces in terms of understanding what data is at risk and what data is sensitive in your organization. That means being able to identify PII, health care information, anything that might be deemed sensitive, so that from a governance and regulatory perspective you're not putting something out there and putting your organization at risk. It also gives you a framework and blueprint for how you address security around the system, so there are no gaps and no back doors for people to get in, because you understand the nuances of the data.
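One way to picture that classification work is a simple tagging pass over the model. This is a hypothetical Python sketch (column names and tags invented for illustration), not how erwin stores classifications:

```python
# Tag each attribute with a sensitivity classification so sensitive
# columns can be flagged before any data lands in the new platform.
SENSITIVITY = {
    "customer.name": "PII",
    "customer.email": "PII",
    "patient.diagnosis_code": "PHI",
    "order_line.quantity": "PUBLIC",
}

def sensitive_columns(model: dict[str, str]) -> list[str]:
    """Return every column classified as PII or PHI."""
    return [col for col, tag in model.items() if tag in {"PII", "PHI"}]

print(sensitive_columns(SENSITIVITY))
# ['customer.name', 'customer.email', 'patient.diagnosis_code']
```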
Optimize for performance and scalability. Use the model to understand where the system is being queried, how it's being queried, and what those queries will do to performance. Make sure you have the right indexes, partitioning, and storage strategies. You can play with all of that in a model without having to build and rebuild a full system at great expense.
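To make the indexing point concrete, here's a toy Python illustration (not a database engine) of the difference the right index makes: a lookup structure turns a full scan of every row into a direct hit.

```python
# 100,000 rows; we want every row for customer_id 42.
rows = [{"order_id": i, "customer_id": i % 1000} for i in range(100_000)]

# Without an index: a full scan touches every row on every query.
scan_hits = [r for r in rows if r["customer_id"] == 42]

# With an "index" on customer_id: built once, then each lookup is
# a single dictionary hit instead of a 100,000-row scan.
index: dict[int, list[dict]] = {}
for r in rows:
    index.setdefault(r["customer_id"], []).append(r)

indexed_hits = index.get(42, [])
assert scan_hits == indexed_hits
```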
And then make sure you've got the right data types for the data that you have, because a lot of the time people will use the lowest common denominator. You can often use a much more precise data type to get a higher level of quality in the data you want to see in that system. And as you go to the new system, make sure you have a clear understanding of all of the referential integrity between the entities, attributes, and relationships, so that you can put the constraints in place and maintain the integrity of that data no matter what type of system you're moving to.
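Here's a small Python example of the data type point; the same reasoning applies to choosing, say, a fixed-point decimal column over a float in the target database:

```python
from decimal import Decimal

# Lowest-common-denominator type: a binary float quietly accumulates
# rounding error over many operations.
float_total = sum(0.10 for _ in range(1_000))
print(float_total)    # roughly 99.9999999999986, not 100.0

# Precise type: exact decimal arithmetic, the right fit for currency.
decimal_total = sum(Decimal("0.10") for _ in range(1_000))
print(decimal_total)  # 100.00 exactly
```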
And then, of course, finally you've got a working document of your schema, where you can start to model changes, enhancements, and new requirements in a much more managed and controlled way, without introducing problems or potential risk into your system, because you have a much clearer picture of what the business needs, what the impact will be on the system you have today, and how to deploy it in the best possible way, so the business gets what it wants from the new platform you've migrated to.
OK, good. Are there any other features or capabilities of Data Modeler that can help with respect to database migration? I think you've touched on it quite a lot there.
Well, there's so much. On the logical side, you can start breaking sections of the database down into what we would call subject areas, where you understand what part of the business is being addressed and what part of the application addresses it. That strips away some of the complexity, so that in a large system people can see: this is order entry, this is marketing, this is a sales opportunity. It depends on what's there, but you can break it up. So subject areas are very important.
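A subject area can be as simple as a named grouping of entities. A quick Python sketch (the entity names are hypothetical):

```python
# Subject areas: business-facing groupings that carve a large model
# into pieces people can actually navigate.
subject_areas = {
    "Order Entry": ["Customer", "Order", "OrderLine"],
    "Marketing": ["Campaign", "Segment"],
    "Sales": ["Opportunity", "Quote"],
}

for area, entities in subject_areas.items():
    print(f"{area}: {', '.join(entities)}")
```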
Then there are domains, which give you commonality across data types in your organization, so that date fields are all implemented in the same way and any sort of currency fields are all implemented in the same way, because any time you don't have standardization across those things, you add cost and risk. Data Modeler is a great tool to standardize how you manage data in your organization. It lowers the cost, the time, and the risk it takes to integrate data over time.
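In code terms, a domain is one shared definition reused everywhere rather than reinvented per table. A hedged Python sketch (the aliases are illustrative, not an erwin construct):

```python
from dataclasses import dataclass
from decimal import Decimal
import datetime

# One shared "domain" per organizational data type, reused everywhere.
MoneyUSD = Decimal            # always fixed-point decimal, in USD
BusinessDate = datetime.date  # always a calendar date, no time-of-day

@dataclass
class Invoice:
    invoice_date: BusinessDate
    amount_due: MoneyUSD

@dataclass
class Payment:
    payment_date: BusinessDate
    amount_paid: MoneyUSD
```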
It also makes a migration easier because, again, you're going from apples to apples. You're not taking a date that's been used in a certain way, or a currency field that's been used in a certain way, and then implementing something completely different, because you've standardized it.