Database Project Definition

Tags:
Database architectures
Database design
A corporation has decided to move its production data off the OS/390 platform to a Windows platform running the Windows Datacenter operating system.

* There are 18 terabytes of mission-critical data that must be available 24 hours a day.
* Because of its size, the customer database needs to be partitioned.
* The database must be able to support 500+ concurrent users, with the potential to expand.
* Not all users are in the same location as the data, so remote connectivity is necessary.
* The order entry application is part of a suite of desktop applications, written in Microsoft Visual Basic 5.0, that automates business processes. This application provides order entry and tracking functionality, including related interfaces to manage customers, employees, and suppliers.
* There is a hot site for disaster recovery approximately 50 miles from the primary location. The hot site is equipped with a tape silo, tapes, and adequate DASD to perform a complete recovery. A T3 line runs between the corporate office and the hot site.

For this proposal: Which database will you use, and why? How will you move the data to the new platform? How will you restore your data at the hot site? What will be required to connect the database to the business application?
ASKED: November 26, 2004  10:40 PM
UPDATED: August 22, 2008  7:49 PM
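As context for the data-movement and recovery questions, a quick back-of-envelope calculation shows why the T3 line alone is unlikely to carry an initial 18 TB copy in a reasonable window (the 80% usable-throughput figure is an assumption, not a measurement):

```python
# Back-of-envelope: moving 18 TB over a T3 line.
T3_MBPS = 45        # nominal T3 rate in megabits per second
EFFICIENCY = 0.8    # assumed usable fraction after protocol overhead
DATA_TB = 18

bits = DATA_TB * 1e12 * 8                      # total bits to move
seconds = bits / (T3_MBPS * 1e6 * EFFICIENCY)  # sustained transfer time
days = seconds / 86400

print(f"about {days:.0f} days")  # roughly a month and a half of continuous transfer
```

At that rate the T3 is suited to ongoing replication of changes, not to shipping the full data set; the initial copy would more plausibly go by tape or a phased migration.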

Answer Wiki


With a database of that size and the access required, go with the Teradata database.

Discuss This Question: 4  Replies

 
  • CWSage
    First of all, I wouldn't want to run that kind of data volume on a Windows platform. I will freely admit that having spent twenty-odd years mostly on various Unix platforms, with several successful multi-terabyte projects under my belt, makes me more than a little prejudiced in that respect, though.

    With that kind of data volume I would tend to go with Oracle. The simple reason is that I know from first-hand experience that slightly smaller databases (~5 TB) with massively partitioned data work in a 24x7 setting, provided they are designed carefully and maintained diligently.

    How the data could best be moved to the new platform is difficult to say without detailed knowledge of the application. If at all possible I would try to set up a "soft migration": historic data (which no longer changes) is moved to the new site step by step, until only truly moving data and the part of the standing data that regularly changes (and is therefore not really aptly named) remain. For this remainder you will then probably need an extended downtime. If you are lucky, you might be able to use Christmas next year for that.

    The application should also be updated somewhat, I feel. As it is all in VB5, you might think about staying with VB but moving at least part of the application to ASP. The aim would be to reduce bandwidth requirements for remote users.

    All this is just off the top of my head. Without a lot more detail about the setting and the application (and a lot more time and effort besides), it is impossible, at least for me, to give you a professional opinion. So what you got instead is what I would have answered had you asked me over a beer in the pub round the corner ...
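The "soft migration" idea above can be sketched in a few lines. This is a minimal illustration only: the table and column names are invented, and SQLite in-memory databases stand in for the real source and target systems.

```python
# Hypothetical sketch of a "soft migration": copy closed (historic)
# rows in date-ranged batches while the old system stays live.
# Table/column names are invented; SQLite stands in for the real
# source and target databases.
import sqlite3
from datetime import date, timedelta

def migrate_history(src, dst, cutoff, batch_days=30):
    """Copy orders placed before `cutoff`, one date window at a time."""
    start = date(1990, 1, 1)                    # assumed earliest data
    while start < cutoff:
        end = min(start + timedelta(days=batch_days), cutoff)
        rows = src.execute(
            "SELECT id, placed_on, amount FROM orders "
            "WHERE placed_on >= ? AND placed_on < ?",
            (start.isoformat(), end.isoformat())).fetchall()
        if rows:
            dst.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
            dst.commit()                        # checkpoint per batch
        start = end
```

Because each batch is committed independently, the copy can be paused and resumed at will, and only the final slice of still-changing data needs a downtime window.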
  • Geoffreyhalleds
    For a database of this importance and size I would need a track record and success stories for the whole picture. I wouldn't recommend anything other than SQL Server on Windows, and I wouldn't recommend Windows. A 24x7 solution also wants a tried and trusted replication and failover solution, both hardware and software.

    Is there sufficient bandwidth for the remote users? It's useless having 24x7 availability if the bottleneck is the network, because your SLA to the customer is measured on availability to the user wherever he is.

    The hardware vendors would offer the correct spec for 500 users and 18 terabytes. The memory would also be a function of the type of use the users make and whether they connect through an app server where most of the processing occurs. With growth in mind, and the fact that the larger the spec the less reliable it is, I would recommend a grid-style system where you can slot memory and CPU modules in and out.

    The location and level of integration of the related interfaces is also important in the overall strategic plan and could affect availability to the remote user (depending on how integrated they are). You need to be aware that however hot your backup site is, you cannot offer 24x7 if you cannot switch to that site quickly and point your users or app server at it immediately. Tapes do not equal 24x7. (24x7 costs.)

    This sounds like a situation where plenty of redundancy is necessary and must not be built piecemeal. I see several app servers connecting to several instances of the database, which would sit on a SAN. The SAN would be replicated to the hot site on a continuous basis and would be constantly used and tested for immediate switchover. I don't know of reliable Microsoft solutions here. On a Unix platform Oracle has solutions in this style, as other big vendors may (DB2 UDB and Sybase).

    Migrating the data depends on which database it comes from. DB2 to Oracle and vice versa is fairly standard, and this procedure needs to be tested and timed before the big day, unless there is a logical separation between the data and its access, which is doubtful. These are just some thoughts. I hope I have been useful to you.
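The switchover requirement above (pointing users or app servers at the hot site immediately) can be illustrated with a small connection-probing helper. The hostnames and the Oracle-style port are invented for the example; a real deployment would rely on the cluster or SAN vendor's failover machinery, not a script like this:

```python
# Illustrative failover helper: probe the primary site first, then the
# hot site, and return the first one reachable. Hostnames and port are
# invented for the example.
import socket

SITES = [("db.primary.example.com", 1521),
         ("db.hotsite.example.com", 1521)]

def pick_site(sites=SITES, timeout=2.0):
    """Return the first (host, port) accepting a TCP connection, else None."""
    for host, port in sites:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return (host, port)
        except OSError:
            continue
    return None
```

The point of the sketch is the ordering: the application layer, not the end user, decides which site to talk to, so a switchover needs no client reconfiguration.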
  • Grokrc
    Your questions go far beyond a simple tip or even a point answer. The only answer that stands out clearly to me is peer-to-peer remote mirroring; at just 50 miles it could possibly even be synchronous, but I would not recommend that without investing in a detailed performance model first. Any high-end SAN product from the OEMs IBM, EMC, or HDS could do this (except maybe IBM for async; I am not sure on that). Sun and HP are not OEMs for high-end SAN, but they are VARs with their own trademarks and brands; HDS makes HP's high-end storage.

    The mix of 18 terabytes of data and only 500 interactive users tells me that the details of the application could be important. Possibly it is just retail accounts with a lot of purchase history, in which case you could move the history first. At any rate, the data model and the application are the keys to what to do, and even to whether you can fit it onto Wintel with the performance that you need.

    I would recommend that you hire a performance consultant. An accurate answer to your questions would require not only more input information but also about six man-months of analysis and simulation-model construction. The best that I know of is Tom Bell at Rivendel Consultants (310-377-5441, from a 1999 CMG resource manual). If you think they charge too much, do a little research on re-platform initiatives that failed to scale; I know of several, ranging from $1M to over $100M, that were taken as a 100% business loss. Connie Smith of Performance Engineering Services (505-988-3811) is also one of the best in this line of work. If I were available I would offer to take it on myself for $100K, but I don't have a PhD and my own well-established business like the references I gave you.

    Also, any time a company decides to make a big commitment like re-platforming, it is good to first assess where you think the company will be going over the next five to ten years (e.g., successful companies always plan to broaden their marketing channels).
  • Northcarolinaken
    The question begs several others. The 500 clients: are they Windows PCs? If so, do they access the data through existing NAS servers, or is that part of the question? The remote accesses suggest iSCSI; are these remote servers? The disaster recovery site suggests a storage array with remote replication abilities. Does the data now exist on SAN arrays? Is it direct-attached, so that the customer needs a consolidated solution?

    I have to quarrel with the assertion of Teradata as a good fit for this project. I love the technology and I am an old NCR guy, but Teradata is specifically designed for data warehouse applications, and it is the best in the world at that. Characteristic of data warehouse transactions are complex SQL queries. Teradata breaks these up by areas of the database managed by the servers in the system; the data is shared-nothing, that is, every server has its own unique address space that it manages. The queries are then processed MP (multiprocessing) style, in parallel.

    This application is a combination of batch and random access. The best interactive database by far is Oracle, and Oracle runs on Microsoft servers and just about anything else. If you can partition the address space by application, rather than by backup time or some other parameter, Microsoft SQL Server could be a good solution for the customer.

    18 terabytes a couple of years ago was huge. Today it is midrange, so one shouldn't get too excited about it. It is well within range of most solutions; the big databases these days are into the petabytes.

    If NAS systems are not yet installed, Sun may well have the best solution in the second calendar quarter of next year. The 5310 NAS machine will be available in clustered form and will handle up to 32 terabytes with 146 GB 10K drives. If the customer can't wait, I would reluctantly suggest a Network Appliance clustered solution like the FAS960c. Ken
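The shared-nothing partitioning described above (every server owning a unique, disjoint slice of the key space) can be sketched in a few lines. The node names are hypothetical, and real systems add machinery such as consistent hashing so that partitions can be moved without rehashing everything:

```python
# Sketch of shared-nothing routing: each node owns a disjoint slice of
# the key space, chosen by hashing the customer key. Node names are
# hypothetical.
import hashlib

NODES = ["node0", "node1", "node2", "node3"]

def owner(customer_id: str, nodes=NODES) -> str:
    """Deterministically map a customer key to exactly one owning node."""
    digest = hashlib.sha256(customer_id.encode()).digest()
    return nodes[int.from_bytes(digest[:4], "big") % len(nodes)]
```

Because the mapping is deterministic, any query router can compute the owning node locally, and a complex query can be split so that each node scans only its own slice in parallel.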
