I've got a process that loads 56 million rows daily (via bcp) into a table, and we retain that data for 30 days. Our company won't pay for the semantic partitioning option, so I'm wondering if anyone has ideas on how to make queries respond in a semi-timely manner. There's one date column, and two other columns that we routinely sum() by date. Because of the table's size we can't apply indexes (an index build ran for several hours without completing, so we scrapped that idea). Does anyone have suggestions on configuration so that we can run a daily query in under two hours? I've partitioned the table into 31 partitions and load each day's data into a separate partition. I'm not sure whether this will help, but I'm experimenting with it for now. Here's our environment:
Sun Fire X4250 with two quad-core 3.16 GHz Xeon CPUs and 32 GB RAM, running Solaris 10 x86_64 - ASE 15.0.3 ESD#1
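For anyone trying to picture the layout: since the semantic-partitioning license is off the table, the 31-way split would be round-robin partitioning (which is free in ASE 15), with one partition per day of the month. A minimal sketch, where the table, column, and partition names are all hypothetical, and the bcp partition clause is from memory, so verify it against the ASE 15 Utility Guide:

```sql
-- Hypothetical table matching the description: one date column plus
-- two columns that get sum()'d by date. 31 round-robin partitions,
-- one per day-of-month slot (p04..p30 elided for brevity).
create table daily_stats (
    load_date  datetime      not null,
    metric_a   numeric(18,2) not null,
    metric_b   numeric(18,2) not null
)
partition by roundrobin
    (p01, p02, p03, /* ... */ p31)

-- Daily load into the partition for that day (syntax from memory):
-- bcp mydb..daily_stats partition p05 in day05.dat -c -Uuser -Sserver

-- The routine aggregate query:
select load_date, sum(metric_a), sum(metric_b)
from daily_stats
group by load_date
```

Note that round-robin partitions don't give the optimizer partition elimination the way range (semantic) partitions would, so a date-filtered query still scans every partition; the win here is mostly parallel scan and isolating each day's load.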