IBM suggests that your batch processing time will increase by 3-5% with the right hardware and tuning. A great deal depends on whether you are journaling the entire system for high availability (HA) or doing before- and after-image journaling on a DB2 database, especially under commitment control. If your business can't afford longer run times, I would recommend jumping one processor size, say from a 550 to a 570/MMA.
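For reference, before- and after-image journaling on a single file boils down to a few CL commands. This is a minimal sketch; the library, receiver, journal, and file names (MYLIB, MYRCV0001, MYJRN, MYFILE) are hypothetical placeholders, not names from the original discussion:

```
/* Create a journal receiver, then a journal attached to it        */
/* (hypothetical names MYLIB/MYRCV0001 and MYLIB/MYJRN)            */
CRTJRNRCV JRNRCV(MYLIB/MYRCV0001)
CRTJRN    JRN(MYLIB/MYJRN) JRNRCV(MYLIB/MYRCV0001)

/* Start journaling the physical file with both before and after   */
/* images -- IMAGES(*BOTH) is what commitment control relies on    */
STRJRNPF  FILE(MYLIB/MYFILE) JRN(MYLIB/MYJRN) IMAGES(*BOTH)
```

IMAGES(*BOTH) roughly doubles the journal entries written per update compared with after-images only, which is one reason the overhead varies so much between environments.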
Every process that uses a CPU cycle or a byte of storage will impact system performance. Why would you think journaling wouldn’t?
Does compiling a program affect performance? Yes. Does starting a display session and letting it sit idle affect performance? Yes. Does leaving spooled files sitting on output queues affect performance? Yes, but people do it for months without a second thought.
It’s not a question of “if” performance is affected. The questions are “how much?” and “will the impact be positive or negative?” and “does the impact matter to me?” Most likely, journaling will use CPU and storage resources that could be available for something else. But we have no way to know if it will make any difference for you.
Is your system running continually at 95+% CPU utilization? If it is, then journaling will probably have a noticeable negative impact if you journal enough transactions against enough files. Is your disk status continually showing 40% or more busy across your drives? If so, then journaling will probably have a noticeable negative impact.
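If you want to check those numbers before deciding, the standard system commands will show them. A quick sketch, assuming an interactive session:

```
/* System status: shows % CPU used, % of system ASP in use,  */
/* and active jobs over the measurement interval             */
WRKSYSSTS

/* Disk status: shows % busy for each disk unit, so you can  */
/* see whether any arm is consistently at 40% or above       */
WRKDSKSTS
```

Let each display accumulate statistics for a few minutes (refresh with F5) during your peak batch window rather than trusting a single snapshot.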
If you aren’t using more than 90% of your CPU capacity, why are you wasting so much processor capacity? Why don’t you put it to use? You paid for it. Why be concerned about using more of it?
Note that there are good and bad journaling practices, just as there are good and bad backup practices. Without knowing your system and business environment, there's no way to say much beyond "Probably."