How to quickly model SQL data warehouses based on transactional systems
by Mountain Computers Inc, Publication Date: Saturday, May 6, 2023
View Count: 247, Keywords: OLTP, OLAP, Data Modeling, Hashtags: #OLTP #OLAP #DataModeling
I was just working on some OLTP re-configurations for a FinTech and 3PL business and said to myself, let's see what others are doing in my space. The link below gave me a nice smile at how the heck some do it; others do it a little differently, like me. It was enlightening, and I learned some new tricks as well.
My secret? Ensure normalization; micro-batch your loads; keep your indexing and execution plans nearly, if not, perfect; parameterize everything; keep your functions tight and fast; leverage temp tables en masse; skip views where you can and use stored procedures instead; and create and maintain highly optimized SQL pivot tables that keep your dimension-table calculated summaries current from your OLTP fact pseudo-table source columns. Try it. It works without the hassle of going full OLAP.
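To make a few of those points concrete, here is a minimal sketch of micro-batched, parameterized loads feeding an indexed fact table and an incrementally maintained summary (pivot-style) table. It uses Python's sqlite3 as a stand-in for a real OLTP engine, and all table, column, and batch-size names are hypothetical, not from any particular system:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# OLTP-style fact source: a normalized transactions table.
cur.execute("""
    CREATE TABLE transactions (
        txn_id  INTEGER PRIMARY KEY,
        account TEXT NOT NULL,
        amount  REAL NOT NULL,
        txn_day TEXT NOT NULL
    )
""")
# Index exactly the columns the summary rollup groups on.
cur.execute("CREATE INDEX idx_txn_acct_day ON transactions(account, txn_day)")

# Dimension-style summary table maintained from the fact source,
# so reporting queries never scan the raw fact rows.
cur.execute("""
    CREATE TABLE daily_summary (
        account TEXT,
        txn_day TEXT,
        total   REAL,
        n_txns  INTEGER,
        PRIMARY KEY (account, txn_day)
    )
""")

rows = [
    (1, "acct-1", 10.0, "2023-05-01"),
    (2, "acct-1", 5.0,  "2023-05-01"),
    (3, "acct-2", 7.5,  "2023-05-01"),
    (4, "acct-1", 2.5,  "2023-05-02"),
]

BATCH = 2  # micro-batch size; tiny here purely for illustration
for i in range(0, len(rows), BATCH):
    batch = rows[i:i + BATCH]
    # Parameterized insert: never build SQL strings from values.
    cur.executemany("INSERT INTO transactions VALUES (?, ?, ?, ?)", batch)
    # Upsert the rollup for only the keys this batch touched,
    # keeping the summary in sync without a full recompute.
    cur.executemany("""
        INSERT INTO daily_summary (account, txn_day, total, n_txns)
        VALUES (?, ?, ?, 1)
        ON CONFLICT(account, txn_day)
        DO UPDATE SET total  = total + excluded.total,
                      n_txns = n_txns + 1
    """, [(r[1], r[3], r[2]) for r in batch])
    conn.commit()  # one commit per micro-batch, not per row

print(cur.execute(
    "SELECT account, txn_day, total, n_txns FROM daily_summary "
    "ORDER BY account, txn_day").fetchall())
```

The per-batch commit is the key trade-off: small enough to keep locks and log growth bounded, large enough to amortize transaction overhead. The same upsert pattern carries over to most engines (SQL Server's `MERGE`, Postgres's `ON CONFLICT`), though the exact syntax differs.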
Always keep learning, my friends, and share what you learn -- Enjoy (see REF link). Good Karma!!
P.S. I did see a note in this reference about more than 10 million versus fewer than 10 million rows/records. My crossing line is much higher, more like 100 million records, and the largest record set I maintain today is about 800 million to 1 billion records in just one database alone, plus several staging databases on both the front end and the back end.
more to come...
If you found this article helpful, consider contributing $10, $20, an Andrew Jackson or so, to the author. More authors coming soon.
FYI, we use PayPal or Patreon; Patreon has 3x the transaction fees, so we don't use it, not yet.
© 2024 myBlog™ v1.1 All rights reserved. We count views as reads, so let's not over think it.