I must be getting old. At least Kimball abides.
He's a good dude with good ideas. Just like plenty of people in this field.
Alt-relational data relies on Mixed Model Arts for effective structuring.
With OLTP-style models, there is a logical data model to tease out in order to understand the business. The physical implementation is then shaped by web/mobile UI application needs and by optimization needs: fast reads and writes of small chunks of data driven by API calls. Some of that is mixed-model denormalization for speed; some structures are pinned in memory. It's not your typical CRUD screen from the past; it's more likely thousands of customers reading from and writing to carts simultaneously.
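To make that concrete, here's a rough sketch (hypothetical cart/customer entities, Python only for illustration - nothing from the chapter): the logical model stays normalized, while the physical shape handed to the cart API is one denormalized chunk per call.

```python
from dataclasses import dataclass, field

# --- Logical model: normalized, one fact in one place ---
@dataclass
class Customer:
    customer_id: int
    name: str

@dataclass
class CartItem:
    cart_id: int
    sku: str
    quantity: int
    unit_price: float

@dataclass
class Cart:
    cart_id: int
    customer_id: int
    items: list[CartItem] = field(default_factory=list)

# --- Physical shape for the API: one denormalized chunk per call ---
def to_cart_document(customer: Customer, cart: Cart) -> dict:
    """Collapse customer + cart + items into the small JSON-ish blob a
    web/mobile client reads or writes in a single round trip."""
    return {
        "cart_id": cart.cart_id,
        "customer_name": customer.name,  # duplicated onto the cart for read speed
        "items": [
            {"sku": i.sku, "qty": i.quantity, "price": i.unit_price}
            for i in cart.items
        ],
        "total": sum(i.quantity * i.unit_price for i in cart.items),  # precomputed
    }

# "Pinned in memory" stand-in: a process-local cache of hot carts keyed by cart_id.
hot_carts: dict[int, dict] = {}
```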
To be honest... the last few sentences caught my eye... had to re-read them, and I'm not sure the analogy works for me from a technical-book viewpoint. It would be the first technical book I have read that mentioned pre-marital sex. LOL. Could be the drinking that leads in that direction! Which can lead to operational debt and technical debt, non-identifying relationships, and orphaned children :). Not going into the possibility of a deadly embrace! Slippery slope.... LOL.
"Dealing with OLTP type of models, there is a logical data model tease out to understand the business" Interestingly, a challenge of this chapter is bringing the performance parts in line with CDM/LDM. Plenty of SWE's I know don't know what a LDM is. I need to publish the chapter on CDM, LDM, PDM at some point soon.
Removed the last few sentences. Quite a few people liked them, some hated them. I decided to keep them out for now. They definitely drive home the point, but I like ending the intro w/ the dogma part. That's a more powerful ending.
Thanks as always, Maury.
True, LDMs are always the challenge in this day and age; we live in a physical world. And logical modeling is not taught very well as part of a college curriculum. A good tool makes it easier to understand and reuse the logical model and to build the big picture with the conceptual. As I complained on a previous post, the top tools are expensive (ER/Studio, ERWin), have no subscription models, and are slow to adapt to the increasing number of databases (or even files) - still on-premises, I believe. Keeping an eye out for a disruptor!
Ending it with the dogma statement has some bite; I'm still thinking of another example. Looking forward to the CDM, LDM, PDM discussions.
Thanks - good stuff - can't wait for the movie! ;)
It doesn’t help that even Date calls conceptual models “logical” in some of his writings. Definitely gets confusing.
Great job bringing up this topic. I try first to create business data documentation (i.e., an LDM design) of the business data, using whatever level of normalization is decided upon. I can then choose transformations (de-normalization) and the physical implementation endpoint, along with the characteristics to consider (SQL, Oracle, Databricks, MongoDB, etc.). I sometimes see people treat the actual physical implementation as the design and try to reverse-engineer it into the “as is” LDM. That may or may not reflect how the business data actually works. The other thing to remember is that you still need to document business names, business definitions, plus any other business-technical metadata you require. I love using a data modeling tool that gives me flexibility in designing and in choosing the appropriate implementation and endpoint target.
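As a purely illustrative sketch of that separation (hypothetical names, not any particular tool's output): the logical entity carries the business names and definitions, and each physical endpoint is just a transformation of it.

```python
from dataclasses import dataclass

@dataclass
class Attribute:
    business_name: str   # e.g. "Customer Last Name"
    definition: str      # the business definition travels with the model
    physical_name: str
    datatype: str

@dataclass
class Entity:
    business_name: str
    definition: str
    table_name: str
    attributes: list[Attribute]

customer = Entity(
    business_name="Customer",
    definition="A party that has purchased, or may purchase, from us.",
    table_name="customer",
    attributes=[
        Attribute("Customer Id", "Surrogate identifier.", "customer_id", "BIGINT"),
        Attribute("Customer Last Name", "Legal last name.", "last_name", "VARCHAR(100)"),
    ],
)

def to_relational_ddl(e: Entity) -> str:
    """One possible endpoint: a normalized relational table."""
    cols = ",\n  ".join(f"{a.physical_name} {a.datatype}" for a in e.attributes)
    return f"CREATE TABLE {e.table_name} (\n  {cols}\n);"

def to_document_shape(e: Entity) -> dict:
    """Another endpoint: a document skeleton for a store like MongoDB."""
    return {a.physical_name: None for a in e.attributes}

print(to_relational_ddl(customer))
print(to_document_shape(customer))
```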
I'm sure this must have been brought up before, but the complaint “What do you mean the data doesn’t relate?!” shows a fundamental misunderstanding. The relational model isn't called that because entities relate to one another; it's because the formal term for a table (a set of tuples) is a relation. So star schemas, OBT, etc. are still relational, just denormalised.
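A tiny way to see it (made-up rows, purely to illustrate the terminology): a relation is a set of tuples, and a denormalised one-big-table is still exactly that.

```python
from collections import namedtuple

# Normalized relations: each is just a set of tuples over fixed attributes.
Order = namedtuple("Order", ["order_id", "customer_id"])
Customer = namedtuple("Customer", ["customer_id", "name"])
orders = {Order(1, 10), Order(2, 10)}
customers = {Customer(10, "Ada")}

# "One big table" style: still a set of tuples, i.e. still a relation,
# just with the customer name repeated on every order row.
OrderOBT = namedtuple("OrderOBT", ["order_id", "customer_id", "customer_name"])
orders_obt = {
    OrderOBT(o.order_id, c.customer_id, c.name)
    for o in orders
    for c in customers
    if o.customer_id == c.customer_id
}
print(orders_obt)
```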
You read the last chapter on relational modeling, right?