We run a DWH with three databases. The first, called Staging, is the input buffer: it copies the operational data from seven source systems via 12 parallel extraction jobs running SSIS packages, mostly as a full refresh of about 40 GB of data every night. The second database, called Catalog, is where the data is transformed, using about 1,500 views and tables (120 GB). The third is the Datamart database: 80 fact tables and 120 dimension tables in 70 data marts (60 GB). We read and write about 160 million rows each night, which currently takes about 5:15 hours. The server has four CPUs and 8 GB RAM, running 32-bit Enterprise Edition.

Looking forward, we have decided to upgrade to a 64-bit environment. A test run without any optimization already showed a runtime of 4:15. Now my question: I am looking for CPU and RAM recommendations, or guidance on how to find the right sizing for these two components. What should I measure? Network and SAN are fixed and fast enough.
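Since the question is about measures: one common starting point (a minimal sketch, not tuned to any particular environment) is to let SQL Server itself report where the nightly load spends its time, using the standard DMVs sys.dm_os_wait_stats and sys.dm_os_performance_counters. The excluded wait types below are only an illustrative subset of the benign waits, not a complete list; run the queries right after a load window, keeping in mind that wait stats are cumulative since the last restart.

-- Top waits during the load window.
-- SOS_SCHEDULER_YIELD / CXPACKET dominance points at CPU pressure;
-- PAGEIOLATCH_* dominance points at a buffer pool (RAM) that is too small.
SELECT TOP (10)
       wait_type,
       wait_time_ms / 1000.0        AS wait_time_s,
       signal_wait_time_ms / 1000.0 AS signal_wait_s,  -- time spent waiting for a CPU
       waiting_tasks_count
FROM   sys.dm_os_wait_stats
WHERE  wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP',
                         'BROKER_TASK_STOP', 'REQUEST_FOR_DEADLOCK_SEARCH')
ORDER BY wait_time_ms DESC;

-- Page life expectancy: persistently low values (a few hundred seconds
-- or less) during the load mean pages are evicted from the buffer pool
-- too quickly, i.e. more RAM would help.
SELECT counter_name, cntr_value AS seconds
FROM   sys.dm_os_performance_counters
WHERE  object_name LIKE '%Buffer Manager%'
  AND  counter_name = 'Page life expectancy';

Read together, the two results give a rough CPU-versus-RAM answer for sizing: if signal waits make up a large share of total waits, the box is starved for CPU; if PAGEIOLATCH waits dominate and page life expectancy stays low through the load, RAM is the first thing to grow after the 64-bit move.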