Working alongside SAP Business Suite


February 20, 2015  3:20 PM

Getting ready for SAP S/4 HANA

Clinton Jones
ERP, HANA, SAP, SAP Fiori

The SAP press engine wheeled out a big gun this month with the announcement of S/4 HANA – pitched as the successor to SAP ERP – and Hasso Plattner and Bill McDermott both made statements about the importance of the announcement. While there still seem to be a great many unanswered questions about how S/4 will work and run, one thing is clear: SAP has been preparing for this announcement for some time. While the specifics may still be very murky, we know a few very specific things.

The first fact that we know is that S/4 HANA will run on the SAP HANA platform and that the primary user experience will be delivered through the Fiori application layer. For some companies this will be a welcome announcement as they toss coins to decide whether to adopt, stay with or abandon SAP as an ERP platform provider. There can be little doubt that for companies that have made a significant investment in SAP’s flagship ERP product, this announcement holds the promise of a new lease of life for SAP’s aging applications.

The HANA platform was the first horse to be presented from the new SAP stable, around 2010, with the SAP Fiori UX formally released a short time later; Fiori now boasts some 400+ role-based apps that SAP pitches as ‘solutions’ providing a personalized, responsive and simple user experience.

Certainly the customers that I speak to are all over the place in terms of their hopes and aspirations for HANA. Some are great advocates, having already moved off a dependency on one of the traditional RDBMSs for the underlying database, but few if any talk about applications of their own that they have developed on the HANA platform. Internally we have played around with the standard APIs and demonstrated an ability to write directly to the tables, but without clear use cases it seems a little premature to put much more effort into this.

What does hold promise is the announcement that SAP S/4 HANA will be offered in the cloud, on-premise and as a hybrid deployment to provide maximum choice to customers. The SAP Simple Finance solution, announced in June 2014, was the first SAP S/4 HANA solution to be offered. While the specifics of the next generation solutions seem a little vague, I did reach out to some of the more prominent advocates of HANA to determine just how customers could transition to an S/4 HANA experience today, even with only Simple Finance 1.0 on offer.

An important component that has to be considered is the SAP Landscape Transformation Replication Server, which you can learn more about through the Application Operations Guide targeted for customer consumption at help.sap.com.

SAP Landscape Transformation (SAP LT) Replication Server is technology that allows you to load and replicate data in real-time from ABAP or non-ABAP source systems to SAP HANA environments. LT Replication Server uses a trigger-based replication approach to pass data from the source system to the target system.
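Conceptually, the trigger-based approach can be sketched in a few lines of Python. This is an illustration of the idea only – the names and structures below are my own assumptions, not the actual SLT objects: a database trigger appends changed keys to a logging table, and a replication loop drains that log and applies the rows to the target.

    logging_table = []  # stands in for the SLT logging table in the source system

    def on_change_trigger(table, key, operation):
        # Fired by the source database on every INSERT/UPDATE/DELETE.
        logging_table.append({"table": table, "key": key, "op": operation})

    def replicate(read_source_row, write_target_row):
        # Drain the log and push each changed row to the target schema.
        while logging_table:
            entry = logging_table.pop(0)
            row = read_source_row(entry["table"], entry["key"])
            write_target_row(entry["table"], entry["key"], entry["op"], row)

    # Example wiring with stub I/O functions:
    source = {("BKPF", "0100000001"): {"BUKRS": "1000", "GJAHR": "2015"}}
    target = {}

    on_change_trigger("BKPF", "0100000001", "U")
    replicate(lambda t, k: source.get((t, k)),
              lambda t, k, op, row: target.__setitem__((t, k), row))
    print(target)  # the changed document header has arrived in the target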

The Replication Server can be installed either as a separate SAP system or, if you have enough iron, on any ABAP source system.

The following graphic outlines the basic concept and the typical landscape (for an ABAP source system) using the trigger-based data replication approach of the SAP LT Replication Server.

[Graphic: typical SAP LT Replication Server landscape for an ABAP source system, using trigger-based replication]

In order to replicate data, the Replication Server needs to be configured to load and replicate data from one source system to up to four target database schemas of HANA systems. Data can be replicated either in real time or on a schedule. With trigger-based data replication, data is retrieved table by table from the application tables in the source system (or source systems). Transformation rules can also be defined for the data replication process.

As you can see, the integration from the application tables to the Replication Engine is by way of RFC, but between the Replication Server and a HANA target system the connection is a direct DB connection, and so is likely to perform well.

Using RFC

Typically on ERP systems, each time a connection is established, logon data and other system parameters, such as character code pages, are exchanged. This causes a load of approximately 2.5 – 3 KB on the network each time a user logs on to the SAP system, which is why, for example, the initial logon tends to take some time when you use a BDC session with Winshuttle Transaction to send data from Excel.

The largest amount of overhead for an RFC transmission occurs when making the connection (or calling a function such as a transaction). The size of the data blocks is configurable, but there is a whole art to choosing the optimal block size for the payload that you intend to exchange over RFC – you often need to compromise between the size of the data block and the manageability of the data. SAP’s own tests have shown that for data of approximately 100 KB and greater, the overhead of making the connection or calling the function is negligible.
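To make the 100 KB figure concrete, here is a back-of-the-envelope calculation (my own arithmetic, using the roughly 3 KB connection overhead quoted above) showing how quickly the fixed cost is amortized as the payload grows:

    # Rough amortization of the ~3 KB per-connection overhead quoted above.
    CONNECTION_OVERHEAD_KB = 3.0

    for payload_kb in (1, 10, 100, 1000):
        overhead_pct = CONNECTION_OVERHEAD_KB / (payload_kb + CONNECTION_OVERHEAD_KB) * 100
        print(f"{payload_kb:>5} KB payload -> {overhead_pct:4.1f}% of traffic is overhead")

At 100 KB the connection overhead is already under 3% of the traffic, which squares with SAP’s observation that it becomes negligible at that size.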

SAP says large amounts of data only produce additional improvements in performance when ‘very good data compression programs’ are used; one can only assume that the mechanisms used for LT replication fall into this category.

Certainly, returning to the considerations of finance: if your intent is to continue with the status quo of running your financials on ERP and you plan to do more analytics and reconciliation on an S/4 system today, then you will need to set up replication of BSEG, BKPF, BSIS and BSAS to the S/4 system as a minimum. If you have big data volumes in these tables, the replication may not perform very well unless perhaps you are running on a HANA database.

I have been told that in Simple Finance 1.0 the standard Accounting interface BAPIs are used, but in 2.0 there are dedicated posting programs, so for existing SAP customers, trying out S/4 today on Simple Finance 1.0 may not be as difficult as one might think. You can read more about the possibilities for integrating ERP finance with S/4 in the blog post by Jens Krueger, and more on how to optimize RFC at help.sap.com.

January 14, 2015  2:55 PM

I said you could save time and money – here’s an explanation of how

Clinton Jones
Data, ERP, IT Automation, ROI, SAP R/3

Over the years I have gone to great pains to explain how businesses can save time and money with automation of business processes in systems of record like SAP, but despite this, there are some who doubt that real time and money savings can be obtained through automation of things like transactions.

The power of preparation

The approach is actually quite straightforward: if you already have your data in a spreadsheet, then the next question is how you get that data into the system.

Many respected experts, industry productivity gurus in particular, will argue that the data should never be in a spreadsheet and should never have found its way to a spreadsheet. This statement is made on the assumption that data processing tasks are all very hygienic and straightforward.

As any business user will tell you, placing orders, making adjustments and mass-changing data are anything but straightforward, and a spreadsheet is often the most straightforward way of collating and compiling data before taking any action.

If you’re taking the output of the spreadsheet and either copying and pasting the data into entry screens or manually transcribing it, consider how much time you are spending on that task. In fact, if you are even converting it into comma separated values for loading through some tool, consider the amount of time you spend on that, the frequency, and the relative reliability of that approach.

Injecting data into systems is a prescription for improved productivity

Every minute you spend on transcription, copying and pasting, reformatting, checking and rechecking is a precious moment that you will never recover.

If you eliminate those steps, accelerate the data injection into the system of record and avoid the likelihood of duplicate entries, omissions and transcription faults then you will reap the reward of time savings.

Multiply the time saved by the value of your time and that’s a part of your ROI. You can test this out with any ROI calculator that allows you to define a given process.
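As a worked example, with numbers invented purely for illustration: suppose manual entry takes three minutes per record, automation cuts that to fifteen seconds, and you process five hundred records a week.

    # Illustrative ROI arithmetic with made-up numbers.
    records_per_week = 500
    manual_seconds_per_record = 180      # 3 minutes of copy/paste and checking
    automated_seconds_per_record = 15
    hourly_rate = 40.0                   # the value of the user's time

    saved_hours_per_week = records_per_week * (
        manual_seconds_per_record - automated_seconds_per_record) / 3600
    weekly_saving = saved_hours_per_week * hourly_rate

    print(f"Time saved: {saved_hours_per_week:.1f} hours/week")
    print(f"Value: {weekly_saving:.2f} per week, {weekly_saving * 52:.2f} per year")

That is roughly 23 hours a week recovered – before you even count the reduction in rework from eliminated transcription faults.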

The great thing about using a standardized method for interfacing with your system of record is that it can be used for small quantities and big quantities of data and depending on the nature of the tool that you use to create the automation, you may have many automation scenarios for different tasks that you need to perform.

While continuing to use spreadsheets to stage data may seem old fashioned to some users, the reality is that this is one of the most powerful and most flexible ways available for preparing data, and while automation doesn’t necessarily save you time in the preparation stage, it will save you time when you finally have to apply the prepared data to your system of record.



December 3, 2014  11:29 AM

Putting holes in the suit of armour that protects ERP systems

Clinton Jones
HANA, SAP ERP, SAP HANA, SAP SuccessFactors, SuccessFactors

I recently attended a webinar by SAP and SuccessFactors that attempted to argue the merits of cloud solutions primarily as addressing the lowered sustainment costs associated with using SaaS by eliminating the opportunity for customers to customize their cloud application – what SAP mentor Chris Paine promotes is the idea of enhancing functionality through extensions, using vendor guaranteed APIs.

On the one hand it was argued that SaaS solutions aren’t customizable but are extensible but the specific nuances of the difference between these two concepts get very blurry very quickly.

As the speaker rightly pointed out, some organizations really do differentiate themselves from their competitors by virtue of the fact that they have unique business processes and unique business process solution requirements.
This opportunity of SaaS cloud solutions unfortunately does seem unattainable to those organizations that already have a solution that addresses their needs, but for which they pay a heavy price not only in maintenance but also in terms of flexibility.

Creating safe connections

What SAP is now pushing, therefore, is the notion that you can augment your cloud based solutions by making use of the SAP HANA cloud platform. For those who have been tracking HANA for some time, this is little or no surprise. Where HANA really seems to differentiate itself, or appears to be trying to, is in the methods that it uses to allow safe and secure integration between your SaaS solution and any applications you might develop as custom functionality in the cloud.

At Winshuttle I have seen similar challenges for years with clients at the desktop, but it has also started to become a challenge at the web form level with low code solutions, with an increasing number of companies wanting to enable collaboration with their business partners, whether as suppliers, vendors or joint ventures. The idea of cross collaboration between companies is nothing new; it helps to reduce business friction and accelerate the speed with which transaction processing can be completed, but it comes with a cost.

Aside from the tax associated with customizing ERP there is an additional cost which is most often manifest in the need for the business to create an access path in the suit of armour that it presents to the outside world to facilitate collaboration.

Sending email outside of your business is one thing but exposing your ERP and file systems is something entirely different especially if you are considering doing something mobile.

Systems of record exposed to the world

Interface bus and message based systems get the business down this path very effectively; they are made robust with encryption, passwords and user identification, but tend to be inflexible and rigid. If the requirements change, the interface and the message structure need to be changed, and this takes time. Backwards and forwards compatibility may also represent a challenge, especially if you are dealing with many business partners.

While the new promise of the HANA Cloud platform will not solve all the difficulties and resolve all the anxieties of companies pushing to open systems and the cloud, it seems to represent a good option for companies that are committed to a heavy investment in redeveloping existing custom solutions or embarking on a new way of thinking about how they use cloud applications and extend them with unique business differentiating features.

For those watching the budget, it seems that the pricing model is heavily dependent on the physical number of extensions that a company decides it needs. Extension packages are offered as Standard, Enterprise and Supplemental editions based on the size of the usage profile per named user. It is conceivable, therefore, that the cost can become significant when you have a large number of users.


November 10, 2014  5:45 PM

Where mobile fits in the data movement challenge with ERPs like SAP

Clinton Jones
Automation, BPM, BPR, Business Process Automation, ERP, IT Process Automation, Mobile, Online transaction processing, Process Redesign, SAP

I have been going through an exercise with colleagues evaluating stories around Business Process Automation solutions and where they fit into the classic data movement challenge with SAP. For some years now we have taken the position that there are three areas of interaction with ERP systems like SAP. At the lowest level there is the grunt work associated with manual entry.

You take data from a form or a phone call, go to the ERP data entry screen and enter the leave request, material definition, purchase requisition, sales order or manual journal. This approach to manual entry works perfectly well for probably up to a dozen or so entries of low to moderate complexity.

The manual entry process gets a little hairier when the entry requires multiple levels of data entry and goes beyond a few simple entry screens, and in some instances we find that companies prefer a template based approach even on single records when the number of fields goes beyond about twenty to thirty. The rationale for using a template even on a single record is all about data quality.

By using a template they are able to positively influence the data quality metric by defining certain characteristics as mandatory or bound by certain rules, even when the SAP system itself doesn’t have rules around those fields.
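A minimal sketch of what such a template rule might look like, with hypothetical field names and thresholds; the point is that the rule lives in the template, not in SAP itself:

    # Hypothetical template-side validation: fields SAP doesn't enforce
    # can still be made mandatory or rule-bound before upload.
    RULES = {
        "MATNR": {"required": True},
        "GROSS_WEIGHT": {"required": True, "min": 0.001},
        "STORAGE_LOC": {"required": True, "forbidden": {"TBD"}},
    }

    def validate_row(row):
        errors = []
        for field, rule in RULES.items():
            value = row.get(field)
            if rule.get("required") and not value:
                errors.append(f"{field} is mandatory")
            if value in rule.get("forbidden", set()):
                errors.append(f"{field} may not be '{value}'")
            if "min" in rule and isinstance(value, (int, float)) and value < rule["min"]:
                errors.append(f"{field} below minimum {rule['min']}")
        return errors

    print(validate_row({"MATNR": "MAT-001", "GROSS_WEIGHT": 12.5, "STORAGE_LOC": "TBD"}))
    # -> ["STORAGE_LOC may not be 'TBD'"]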

This is a classic business process automation sweet-spot for Business User Application Development. As is inevitable in such circumstances, the question that then arises is what one does with exceedingly large volumes of data, or terribly complex scenarios where a template based approach is simply not good enough.

What we have noticed is that for relatively stable end-to-end processes, where say purchase orders, sales orders or similar activities occur with particular regularity in high volumes, a more IT delivered, system-based approach to data capture is preferable.

Minimizing human interaction

Often there is a requirement for a solution with as little human interaction as possible. Examples of this might be the integration of a Warehouse Management System, a shopping cart solution or a point-of-sale system.

Where these are in use, IDocs, EDI or something like PI/XI or an integration server are by far the most preferred approach. The reason for this is largely about scale, but sometimes it is also about the asynchronous way in which many of these kinds of solutions acceptably work.

The challenge then is in addressing that mid-range of data movement challenges: the one where you have enough data to create or maintain that manual entry is awkward or a nuisance, but not enough to justify the building of a formal IT ‘big solution’.

Part of the issue with this area of ERP use is that IT is nowadays expected to do much more with less, and as a consequence IT budgets are not being allocated to smaller business process improvement initiatives such as keystroke minimization, UI simplification or transcription avoidance. If the business wants these attended to, then the business needs to fund such initiatives as a project from operational budgets.

Mobile options

Where IT will step into the fray with a large solution for the business is on extending accessibility to the ERP platform via mobile devices. The thinking behind this is more broad brush than specifically addressing individual use cases or providing point solutions.

Solutions like SAP Personas and SAP Fiori speak to making the functionality and data within SAP ERP accessible to a larger community in a more digestible way. Mobile solutions in particular are an interesting area for business process improvement, not only because they provide access but also because they accelerate the end-to-end process by enabling participants to be more responsive. What they don’t address is the challenge of working with a number of records simultaneously.

Additionally, mobile isn’t very useful for dealing with large quantities of discrete data entry, even when dealing only with a single record. While I could see a sales person, for example, entering a one line sales order into a mobile device, the idea that they might enter a fifty line sales order seems a little less likely.

If this challenge with using a mobile device is well understood for a sales order, then extend it further to the creation of a material master that may have as many as 100+ fields: the use of a mobile device to do this seems even less acceptable.

When considering mobile solutions it should be clear, then, that while a mobile application does often bring improved usability, the challenge that the business has may not be related only to usability; it may also be related to data volumes, and a mobile solution may not adequately address that challenge.


October 23, 2014  11:42 AM

I see the future and the future is gold

Clinton Jones
MySAP, SAP ERP

The forty-something company SAP has changed its brand logo and now it is all gold.

Just like the messages that are coming from CEO Bill McDermott and the User Experience labs in the form of products like Fiori and Personas, everything is much simpler, and changing the way the brand is presented seems to be a part of that.

SAP does seem to give considered thought to topics like colours and branding at many levels.

Part of it seems to be intent on tugging at emotions, if the Forbes post by Ryan Somers of SAP is to be believed. Even Dennis Howlett opined that customers may actually care about the new branding; after all, SAP is not the only major software vendor engaged in brand transformation, Salesforce recently changed theirs too.

A few days ago we heard that Apple was readying a push out of brands like Bose that might compete with Beats and today we hear that Microsoft will retain the Lumia brand and discard the Nokia brand as part of the integration of the Windows phone division.

The iconic blue trapezium that SAP employees, consultants and customers have been familiar with since the 1970s has now been replaced with the bold, bright ‘gold’ letters S, A and P.

Angular lines are old news

It seemed only yesterday that the navy blue of the trapezoid was changed to light blue but actually it was three years ago – my how time flies.

The new gold colour is not a new one in the SAP palette. Gold, or that orangey yellow, has been used on numerous occasions with SAP content: the partner network of which companies like Winshuttle are members, the BusinessObjects (BOBJ) product line, SAP baby product Business One (B1) and the seemingly failed cloud ERP Business ByDesign (ByD). The colour is most commonly found associated with the analytics side of the branding house.

While the loss of the trapezium and comforting blue will likely cause a brief stir among the dyed-in-the-wool SAP fans, the only ones who are likely to have severe regret may be those with tattoos, but hopefully there aren’t too many of those around. I think, too, that despite the news of the new alliance between SAP and IBM on hosting, the rebranding really does demonstrate the final departure from the original root that inspired the SAP product – IBM.

While the significance of the letters has morphed over the years, the message of what SAP is all about hasn’t really changed; there are a lot more hounds in the kennels, but the objective and mission largely remain the same. What has really changed is the technical components that now make up the offerings.

Over the coming months and years we will likely see more convergence of the technology and efforts to make the experience and look and feel of how users work with the applications more consistent and seamless.

There’s an acknowledgment that getting more business advocacy means appealing not just to the back office paper shufflers but also to the front of house and the executive, and the best way to do that is to put an easy to access and usable face on what is undeniably the boldest player in the ERP market, irrespective of what colour it chooses to brand itself.

Perhaps a new era of SAP user experience and the cloud along with this new branding will genuinely come with a little bit of a Midas touch for everyone.


October 10, 2014  4:23 PM

That bad data – why haven’t you fixed it by now?

Clinton Jones
Data governance, Data Management, Master data management, Master Data Services

‘Dirty Data’ is a business problem, not an IT Problem, says analyst firm Gartner. More than 25% of critical data in leading companies has defects but if you’re relying on IT to solve the problem you may have it all wrong…

As you can imagine, with possible data growth rates of anything between 10 and 50% year-on-year, roughly a quarter of all new data will be faulty in some way, and unless you address matters, your ratio of bad data may remain constant but the absolute number of faulty items will grow just as fast as your data.
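To put rough numbers on that (an illustrative projection of my own, not Gartner’s figures): assume a starting base of a million records, 30% annual growth and a quarter of new records arriving faulty.

    # Illustrative projection: constant defect ratio, compounding volume.
    records = 1_000_000     # starting record count (assumed)
    growth_rate = 0.30      # mid-range of the 10-50% year-on-year growth
    defect_ratio = 0.25     # roughly a quarter of new data is faulty

    for year in range(1, 6):
        new_records = records * growth_rate
        records += new_records
        print(f"Year {year}: {records:,.0f} records, "
              f"~{new_records * defect_ratio:,.0f} new faulty records added")

The ratio never changes, but the yearly influx of bad records keeps compounding.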

Organizations have two ways that they can tackle the challenge. They can deal with the data quality issue passively or actively – relying on IT to ‘fix’ the data will likely never get your data to a level of quality that satisfies the needs of the business.

Closing the stable door after the horse has bolted

Passive data governance means addressing data issues through identification and remediation after the records have already been created.

This is an approach that is pretty commonplace especially when you are relying on IT and it is generally accepted as a traditional way to improve data quality and clean up faulty data in systems.

This approach typically batches the data hygiene activity and involves data extraction, profiling, transformation and update.

There are many products on the market that offer this capability with varying degrees of automation – you can also handle this issue manually and simply throw bodies at cleaning up the data, but in many instances, if you already have faulty data in your system and are not able to prevent more faulty data from being created, you’re continually in ‘catch-up’ mode.

Corral your data

A second approach is a little more systematic in the data creation and maintenance process, and is typically considered an Active Data Governance approach.

In active data governance, systems are configured in such a way that data cannot physically be entered into a system if it does not conform to a basic set of rules. The rules could be based on a rules engine or the configuration of a given application. Whether the data is of a transactional, referential or master record nature determines the kinds of data validation that one might establish. For example, the entry of sales orders into a system may only be accepted where inventory is available for dispatch or where an existing customer identifier is provided.
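A minimal sketch of that sales order rule (hypothetical data and function names): the order is rejected at entry time, before it ever reaches the system of record.

    # Hypothetical active-governance check: block the order at creation time.
    inventory = {"WIDGET-01": 40}
    customers = {"C-1001"}

    def accept_sales_order(customer_id, material, quantity):
        if customer_id not in customers:
            raise ValueError("unknown customer identifier")
        if inventory.get(material, 0) < quantity:
            raise ValueError("insufficient inventory for dispatch")
        inventory[material] -= quantity
        return "order accepted"

    print(accept_sales_order("C-1001", "WIDGET-01", 10))  # passes both rules
    # accept_sales_order("C-9999", "WIDGET-01", 10) would raise before
    # any record is created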

Other kinds of order entry systems might allow you to create the order right up to the point where you define the delivery address and the contents of the order, allowing anonymous sales or sales that are tied only to your email address. In such instances there is no existing requirement for a customer account.

How you choose to configure your systems really hinges on the nature of your business and your business rules, but if you have loose rules and light configuration, the risk of faulty data getting into your systems is greater.

Embedded rules and tribal knowledge

The characteristics of your rules are often implicitly embedded in the very systems that you use to capture the data, and this is one of the reasons why you may be struggling to get on top of that pile of faulty data and unable to easily prevent additional faulty data from being gathered. Changing the rules may be difficult or impractical, and sometimes you simply have to work around them or live with their inadequacies. Things may be further exacerbated by the fact that no single person actually knows what all the rules are or where they are maintained or held.

In my conversations with SAP customers, for example, I often hear that the reason bad data is gathered and entered is that the roles of the people in the data management process are often defined without good consideration given to the actual capabilities of the individuals who fulfil those roles.

As a consequence, inventory masters may be created with data attributes like storage location = TBD (to be determined) or a gross weight of 1 gram instead of the actual weight of the inventory item. There is actually a rule here, but the rule is: if you don’t know the answer, use TBD or 1 gram, depending on the field. These data management rules are therefore tribal knowledge, and probably unknown to the rest of the business.

The only way this data will be corrected is if you pick it up in a passive data governance pass after the fact: an approach where you extract all items with storage location equal to ‘TBD’, for example, and then sit down, assign storage locations and update the records accordingly. This may be quite an acceptable approach if you have the resources and discipline and can afford the time delay.
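Sketched in code, the passive pass is just a filter-and-fix loop over records that are already in the system (field names are illustrative):

    # Illustrative passive-governance pass: find and remediate after the fact.
    materials = [
        {"id": "MAT-001", "storage_loc": "TBD", "gross_weight": 0.001},
        {"id": "MAT-002", "storage_loc": "WH01", "gross_weight": 12.5},
    ]

    needs_rework = [m for m in materials
                    if m["storage_loc"] == "TBD" or m["gross_weight"] <= 0.001]

    for material in needs_rework:
        # This is where someone sits down, determines the real values,
        # and mass-updates the records in the system of record.
        print(f"{material['id']}: assign storage location / correct weight")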

When there’s no urgency or real concern about this data, then this approach may be acceptable, but in the longer term it needs to be recognized that slow reaction to poor data simply means more bad data gets added to a growing pile.

ConAgra Foods, an American packaged foods company that makes and sells supermarket, restaurant and food service brands, chose active data governance over passive data governance as the preferred approach to data management in their SAP system of record, and this approach included improved handling of hundreds of rules and fields to get products to market faster.

Taking stock

Understanding whether your way of dealing with data management is a passive or an active one is the first thing to get clear.

Working out where your business rules live, and how you assign roles and responsibilities in terms of data management, is another. Sales people, for example, often don’t care about logistics data or even accounting data unless it impacts them personally. Similarly, accounting folks typically don’t care about the purchasing contact within a customer, but they do care about the accounts payable contact when the account becomes overdue.

Only when you have a clear understanding of what your current data management approach is and what your data quality is like can you really apply appropriate energy to improving it. You need to be able to understand whether you need additional tools or data management reorganization in order to manage your data better but these won’t miraculously change the state of your data.

Getting your organization right is often a first step but it’s not the only one – start by recognizing data quality as a serious problem. Measure it, report it and determine how you want it to look in the future then choose a path that is most appropriate to your business needs.


October 4, 2014  11:19 AM

Front and center – Concur sings to the hearts of middle management and exec PAs

Clinton Jones
Data Management, Data management and storage, expense management software, Mobile expense management, SAP, SAP and data management, User experience

The Twittersphere, LinkedIn and the wires have been ablaze with the news of SAP’s plan to purchase Concur for $129 per share, or $8.3 billion. Although it is alleged that dialog has been going on for some time, it seems SAP was the only real suitor.

Of course the media fallout quickly began, and Safra Catz, recently announced co-CEO of Oracle, was among the first to lay into the announcement, saying SAP basically paid too much. No comment was made about whether this was a smart move to add to the SAP litter; it seems that may have been unnecessary.

How much is enough, ultimately? In reality SAP still labours under a mantle of usability challenges – Fiori will address some of these, but that’s a long term vision. Concur becomes yet another example of how SAP is trying to address the balance between usability, functionality and executive advocacy. In reality usability and adoption matter immensely; they are application traits that sing to the hearts and minds of the middle tier of corporate management, and the personal assistant of the executive has a very special place in solution advocacy.

In many respects, the Concur acquisition will be not dissimilar to SuccessFactors, except that it closes the loop on an even larger audience of organizational players. Any organization that has personnel who incur expenses will be interested in a solution that provides a simple, easy to use interface and an available everywhere, anywhere, on anything experience.

My first brush with Concur

My first brush with Concur was back in around 2005/2006, when we adopted Concur for expense reimbursement during my work at one of the mobile carriers in the US. The Concur implementation was well managed internally, piloted and then rolled out, and for the most part went off without a hitch. On the SAP side, the integration steps were rapidly put in place and accounting quickly adopted and accelerated the deployment. It was a resounding success despite the fact that the whole user experience involved little or no SAP interaction. You filled out a web template, printed it, scanned receipts and faxed them to a number, or attached scans of your receipts, or scanned everything into a PDF and sent that to an email address. The whole experience was so simple. No internal envelopes with paperwork to get lost, minimal delays in processing, a simple workflow and bam! You were reimbursed for out of pocket expenses.

When I moved to working for SAP, expense submissions were excruciating. They involved entering your expenses into a browser based NetWeaver session, doing a posting simulation, printing a hardcopy of the simulation and sending that to accounting for reimbursement. Not only were thousands of SAP employees adding to the mountain of filed paper, we were also contributing to the incredibly expensive environmental costs associated with using paper. It was such an incredible step backwards that I think I was somewhat dumbfounded, but of course if you’re relatively low down the totem pole you don’t rock the boat unnecessarily. While the process was pretty well oiled, the turnaround time on expense reimbursement was about two weeks, though this may have had more to do with ludicrous batching of accounting cycles – who really knows. Working with SAP involved anything except simplicity.

Roll around to 2014, and my employer Winshuttle has been using Concur for probably a year or more now, and the end-to-end process is so incredibly simple. I may have thought my first brush with Concur was relatively painless, but it is even more so today. Receipts are captured with my mobile phone; there’s an app for Windows and iOS and undoubtedly for Android too. There’s the web interface, and supported attachments include PDFs and the usual image files. We use a basic workflow, and once I have submitted my claim I get notifications about its status.

There are some things that I don’t like about the way we specifically handle processing, but that’s a different matter. The reality is that the process is painless for me, and certainly for accounting; they really do applaud the decision to choose Concur over the previous method we used with spreadsheets and attached receipts. We don’t use SAP for our back end accounting, but I know our corporate cards are linked to Concur, so when an expense appears on the corporate credit card it shows up as an un-expensed charge in your inbox.

At Winshuttle we really appreciate the way that the application works because it screams usability, and it really makes this whole data management activity straightforward for everyone. It improves data quality and posting accuracy and promotes data capture at the source.

Again, I want to emphasize that when you consider this acquisition, the curiosity for some may be that SAP already had an expense and travel management solution of its own, just as it had a full-blown, full-faceted HCM solution before the acquisition of SuccessFactors. But if you’re thinking that all this is an alternative lipstick on the pig and a lazy way to do what SAP is doing with solutions like Fiori, then you’re missing important aspects of the direction SAP is going – this acquisition was seemingly well considered.

Cool, adaptable and functional

SAP battles on multiple fronts. The high end ERPs have saturated the big corporates, and all that is left is a few holding out with their custom systems or the legacy solutions that they remain addicted to. Eventually those systems may get displaced, but those are long sell cycles and career changers. There’ll be a few upgrades from middle tier ERP users to SAP along the way too.

On another front, there are the proverbial trench wars going on between Oracle and SAP, where customers with deep pockets and a radical palate for change will rip and replace as part of renewal or transformation initiatives. At the time of the centenary of WWI, it is easy to see strong similarities between the high end ERP market and the dig-in mentality of the opposing forces in France and Belgium in 1914.

The front where the system integrators only now seem to have realized the real new money may exist is the SMB market. SAP knows this is a burgeoning and underserviced market – it is positioned for the small ERP customers along with the likes of Sage and others with solutions like B1, but this is a market where keenly priced, full function ERP solutions that position for aspirational growth can be incredibly successful. B1 is akin to DUPLO (clumsy and basic) and the ERP suite is the CREATOR series (old school but infinitely flexible). Continuing this LEGO analogy, Concur and SuccessFactors may be the MIXELS – cool and adaptable. Customers want what’s cool and adaptable as well as functional.

For these prospects though, a crappy UI is not going to get you the attention and advocacy that you want and desperately need. When you’re signing a seven or eight figure deal for a new system, it has to sing to your heart – Concur and SuccessFactors will do that!

SAP seems to have come to the realization that it can buy faster than it can build. The real question is: what next?

Picture: President Barack Obama, right, and first lady Michelle Obama, second from right, with recipients of the 2010 Kennedy Center Honors, from left, producer, television host and actress Oprah Winfrey, songwriter and musician Paul McCartney, sing the National Anthem during the 2010 Kennedy Center Honors Gala at the Kennedy Center in Washington. At far left in the background is Stedman Graham. (AP)


September 29, 2014  1:55 PM

Losing the kingdom for want of a nail

Clinton Jones
Uncategorized

The phrase relates to an ancient proverb, references to which first appear around the death of King Richard III of England at the Battle of Bosworth Field.

Richard III was the last king of the House of York and the last of the Plantagenet dynasty. Most recently he has featured in the news when his remains were reported to have been found under a parking lot in Leicester!

I was reminded of this proverbial rhyme, which describes how small events can result in much larger consequences, in a recent meeting during which we were analysing the results of a finance automation survey that we conducted with the finance community.

Though the context of the reference to the proverb was not directly associated with my thoughts in this post, it does lead into the great message that this proverb hopes to convey: namely that small things can have large consequences, and that the failure to address issues as they are identified can ultimately lead to a more pervasive or more profound problem elsewhere.

This is important in the way we consider how things work and the way we conduct business.

History is littered with other examples

Ancient narratives and the press are littered with examples of seemingly insignificant things that have had much more far reaching and sometimes catastrophic implications.

One of these was the sad outcome of NASA’s Challenger. On January 28, 1986, the Challenger and its seven crew members were lost 73 seconds after launch, when the failure of an apparently less significant component, a booster seal, resulted in the break-up of the vehicle.
Another such small issue could be considered in the sinking of the Titanic and the suggestion that the metal rivets that held the ship together became brittle in the frigid waters and broke apart on impact with the iceberg, resulting in more damage to the vessel than the iceberg would normally have caused.

The failures that relate to both of these disasters were ultimately attributed to unsound decisions about small things, which in the end is directly attributable to the quality of information: information about the condition, the background, the composition and the characteristics of the parts that make up the whole. In vehicles this can be crucial, but we shouldn’t underestimate the importance of information about other small things and how these can compound into bigger issues.

For one Winshuttle customer that spoke at the Winshuttle User Group meeting in Brussels, this was the difference between a GR/IR account that amounted to more than £9,000,000 and a more manageable account of less than £500,000 after they had used Winshuttle to make adjustments and corrections.

Daily decisions are made on data

Decisions are made daily by businesses about whether to pivot on a course of action, change direction or do something completely different, but such decisions are generally made based on information and not on sentiment or gut instinct. Certainly some decisions are made using instinct or sentiment, but for businesses many of them are based on things like sales volumes, market projections, competitor behaviour, the state of the economy and even how much money we owe and how much we have in the bank – all derived from business data.

Arriving at good decision making based on data requires that data to be as good as it possibly can be, and that can really only be achieved in a couple of ways – fixing existing data and ensuring that new data is created in a proper and complete way.

SAP customers work with data every day. More interesting is the fact that they work with data across a broad swathe of ERP functionality – everything from master data to sales orders, financial postings, personnel records, projects and plant maintenance.

In the realm of projects and plant maintenance many industrial manufacturers and service organizations run thousands of data objects through automated processes to move their decision support systems closer to perfect information in order to avoid the loss of the nail which ultimately could lead to the loss of the Kingdom.

My advice is to get online and find out what’s available in the market – there are many SAP solutions and SAP partner solutions addressing a broad swathe of industries and functional areas.

Continuously challenge the status quo and keep learning – don’t assume that custom development and custom applications are the only choices available, even when working with SAP ERP.

Every day I discover something new that I hadn’t thought of before or even conceived of. There are some pretty cool technologies and approaches out there that could really be taking some of the pressure off both IT and the business in automation and business process improvement – many of them also won’t cost you a King’s ransom!


September 9, 2014  2:47 PM

A rule based business model for data governance

Clinton Jones
Approval Workflow, SAP, Workflow

Rules are an integral part of daily life; we’re governed by them continuously, and yet they are an aspect of the way businesses function that is becoming increasingly automated these days.

Implementing business rules effectively hinges largely on how a given organization chooses to tackle rules management. Many companies manage rules from an endpoint, or ‘passively’, through analytics and audit rather than actively and in real time.

Rules can easily be implemented around data creation and maintenance scenarios, but are often deployed in highly constrained and relatively inflexible ways where one relies on the application configuration to determine whether data conforms to business rules or not. In some cases even configuration is not possible, and the rules are embedded in the application logic.

Workflow engines

Using tools like workflow engines and forms designers to build rules based data management and creation processes does have its place in improving the adherence of data to business rules, but they may not always be the best fit. A lot of the decision around whether or not to adopt a business rules engine (BRE) therefore depends not only on the entire systems landscape but also on the complexity and relative importance of the data and rules being gathered and respected, as well as the degree to which the business rules change.

A workflow engine with a form, for example, has the distinct advantage over a pre-built solution that it can be actively applied to data management scenarios and deal with data creation and maintenance issues before they become fixed in the system of record. Additionally, such tools tend to be highly customisable; in other words, you get to decide what and how you want the data rules to be applied, and to which data elements.

Of course the key disadvantage of using such tools is that synchronization between systems and consistency matching of the rules is sometimes a cumbersome process especially if rules change frequently. Building such a solution and having it perform rule consistency checks also requires a fair degree of hand stitching of verification and synchronization methods.

In a nutshell, an active business rules engine (BRE) integrated with enterprise applications, helps manage and enforce business rules.

Rules engines as a service

Typically, modern systems are architected in such a way that they use BREs as service components that separate the business rules from the business applications themselves, to provide more flexibility and broader applicability. Having the BRE outside the application reduces the time, effort and costs of application maintenance by allowing business users to modify the rules as necessary without the need for core application changes. The most sophisticated of these detect inconsistencies within individual business rules as well as within rule-sets: collections of rules that apply to a particular event and must all be evaluated simultaneously.

In terms of components, a BRE often has the following characteristics (a minimal sketch of how they fit together follows the list):

  • Rule Repository – a database that stores the business rules and rule-sets defined by the business
  • Rule Designer/Editor – front-end application and a user interface that allows users to define, design, document, and edit business rules
  • Reporting – allows users and rules administrators to query and report existing rules
  • Engine – the actual application code that enforces the rules or provides flags that indicate whether data entered is consistent with the rules and rule-sets
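As a toy illustration of those four parts working together – stripped to a few lines of Python, whereas a real BRE adds persistence, rule-set consistency checking and a proper design UI:

    # Toy business rules engine: repository + editor + reporting + engine.
    class RuleEngine:
        def __init__(self):
            self.repository = {}                    # rule repository

        def define_rule(self, name, predicate, message):
            # Editor: define or replace a named rule.
            self.repository[name] = (predicate, message)

        def report(self):
            # Reporting: list the rules currently in force.
            return sorted(self.repository)

        def evaluate(self, record):
            # Engine: flag every rule the record violates.
            return [msg for pred, msg in self.repository.values() if not pred(record)]

    engine = RuleEngine()
    engine.define_rule("has_customer", lambda r: bool(r.get("customer")),
                       "customer identifier is required")
    engine.define_rule("positive_qty", lambda r: r.get("quantity", 0) > 0,
                       "quantity must be positive")

    print(engine.report())
    print(engine.evaluate({"customer": "", "quantity": 5}))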

When shopping for a BRE it should be clear, then, that not all BREs are made equal. Some products that are positioned as BRE tools are nothing more than analytics and reporting tools; they therefore constitute passive data governance tools.

Passive governance tools use an end-point approach in that they apply rules after transactions and master data have been created in systems like SAP. In particular they add operational overhead because the resulting actions are often data rework rather than blocking actions at data creation or change time. If action based on certain data is time critical then a passive approach may not meet the real business objectives.

In a scenario where active governance is in play, data cannot be created or changed if it is not aligned with the defined business rules. For some organizations this is not an acceptable approach, because active governance can sometimes impair the smooth flow of business transactions, especially when data rules are ambiguous or poorly defined, or the rules are in a state of continuous evolution.

Business rules in SAP

SAP’s ERP has a configurable BRE around certain defined data objects and activities, but it is not an exhaustive one – implementing an exhaustive one usually requires customization or the addition of third party solutions. In addition, certain data validation may be pared back in terms of its precision in order to accommodate the special circumstances of a particular business, such as vendor or customer creation, because of either a legacy of data challenges or some special characteristics around the way data needs to be created or maintained.

An example of this might be duplicate address checking in SAP at the time of customer or vendor creation. Some companies have this switched on and some have it switched off because they have had issues with it in the past. In reality the duplicate address check is a brute force mechanism that doesn’t have a terribly elegant way of working.

For companies that have multiple business units billing to/from the same address, but which need to be defined separately for accounting purposes, the duplicate address check as defined is not a great approach, and something more sophisticated needs to be in place. Address validation is an example of a narrow, limited functionality BRE; some companies will use an external address validation call to maintain addresses in a way that is consistent with achieving proper deliveries.

A credit application is also often considered a good example of a situation where a basic BRE can be designed in a form for first level clearing of an application or credit request. Here you might enter name, address, identity number and annual income. While this may give you an idea of how much credit you might initially be considered for, a credit check is the next step, which means a sophisticated algorithm evaluates your credit profile and gives you a score. At this point a company would often lean on a third party solution to do the credit checking and then apply some sort of blocking value if the record needs further consideration.

A workflow solution with a form could make a recommended credit limit or approval based on the credit score, but in the end it is doing nothing more than comparing the score read from the scoring system against a range of possible credit amounts pre-configured in the form or workflow. This conveys the impression of something sophisticated to the end-user, but ultimately it is a very simple approach to applying business rules. For 80% of applications this approach may be more than adequate, but it is likely that it adds very little to the end-to-end process.
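In code, that very simple approach really is just a lookup against pre-configured ranges (score thresholds and limits invented for illustration):

    # The form/workflow version of credit rules: a plain range lookup.
    CREDIT_BANDS = [(750, 50_000), (650, 20_000), (550, 5_000)]

    def recommended_limit(score):
        for threshold, limit in CREDIT_BANDS:
            if score >= threshold:
                return limit
        return 0  # refer for manual review or apply a blocking value

    print(recommended_limit(710))  # -> 20000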

If little added value is derived from this approach then this shouldn’t be considered an ideal solution. A call to an appropriately configured service or BRE as part of the form and workflow design would add considerable value to the process.

A BRE will likely make the calculations and reach conclusions based on a complex algorithm, and would likely not be dependent simply on what is entered by the business users alone – behind the scenes it will collect other data and make inferences that are not necessarily clearly tied to the data entered. A good active BRE may actually even push for more inputs to be provided by the input method, something a simple form design with workflow would not do. A BRE could drive your solution design.

A rules engine has one core focus, and that has to be the value of consistent and correct data. Everyone understands why an address, contact information and the like need to be consistent and correct, but when you move into the realm of ‘other’ master data, like pricelists, payroll, inventory data and specifications, clarity and understanding may become diluted.

In certain industry segments the need for robust rules engines is of much higher value than in others. In the financial services industry, for example, credit checking may be considered far more critical than in a retail or wholesale context. Value translates as opportunity cost in many cases, but it can also be derived from penalties, fines and other compliance or regulatory strictures avoided or adhered to, which may pressure certain kinds of decision making around data inputs.

Following the rules

Workflows can be built to work in a similar fashion to a BRE for a very specific set of data or a scenario, but the challenge is that the rules have to be designed into the workflow or form itself. While this approach can achieve the same result, it is not as flexible, adaptable and sophisticated as what you might do with a BRE.

Companies can use generic rules engines like Oracle Enterprise Data Quality Services with workflow and forms solutions like Winshuttle to give a more complete solution to the business up front, but this requires investing in two components: the generic rules engine and the workflow and forms tool. With this approach you can achieve active governance.
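Schematically the layering looks like this – a sketch only, with the rules-service call stubbed out rather than showing any particular vendor’s API:

    # Sketch: a form/workflow layer consults an external rules engine
    # before the data is ever posted to the ERP.

    def check_with_rules_engine(record):
        # Stand-in for a call to an external BRE; returns a list of violations.
        violations = []
        if not record.get("tax_id"):
            violations.append("tax_id missing")
        return violations

    def submit_to_erp(record):
        print(f"posted {record['name']} to ERP")

    def form_submit(record):
        violations = check_with_rules_engine(record)  # the active governance gate
        if violations:
            return f"rejected: {violations}"          # fix before posting
        submit_to_erp(record)
        return "posted"

    print(form_submit({"name": "ACME GmbH", "tax_id": "DE123456789"}))
    print(form_submit({"name": "No-Tax Ltd"}))        # never reaches the ERP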

Conclusions

Building a BRE in a workflow tool would likely require very complex workflow design and make use of a great deal of embedded rules in the forms and workflows, with some flexibility gained by making use of data stores like SharePoint lists. Workflow and forms in this instance become a black box whose rules can only be tested with automated testing tools.

Because of the way a BRE is designed and implemented, its rules are more accessible for scrutiny, but comparing them with what you have in SAP or any ERP is largely impossible, because the alignment of the rules repository and the ERP configuration approach are incompatible.

At the very best you could take a sampling of transactions or data that has passed through the ERP and then compare the end results with similar inputs passed through the BRE. This is effectively just an analysis task, probably indicative but certainly not exhaustive. Another approach to consider is one that layers the front end data collection or data manipulation process of the ERP with a forms and workflow solution that invokes checks against a BRE at certain key points before effecting changes in the ERP. Such an approach adds considerably to data quality and control for ERP systems without having to make changes to core ERP.

Which solutions you ultimately choose really depends on what it is that you want or need to do around managing data or business data events. Start with defining the value of your data and then evaluate whether you can afford a passive or active data governance approach.


June 30, 2014  12:58 PM

Simplicity is the ultimate sophistication

Clinton Jones
FICO, Financials, SAP, Uncategorized

This quote, reportedly attributed to Leonardo da Vinci, actually tells us a lot more about the amount of effort involved in delivering a simple experience to users than we probably realise.

In fact simple interactions, particularly with systems can rapidly become complex in their nature quite simply through variability.

Interestingly, we see that complexity can arise even when simple components and simple interactions are combined in various combinations and permutations. We often see this at Winshuttle, when a seemingly simple process that one thinks can be easily improved through a form becomes extraordinarily complex once all the variations in the process are incorporated. Automating finance processes in SAP, in particular, then becomes more easily attainable with Excel than with a web form.

Run SAP simpler

SAP’s apparent new mantra of making interaction with their leading ERP ‘Simple’ presents an interesting opportunity for finance professionals to rethink the way that they work with SAP ERP in the finance context, but the question will be whether this simplicity can really be as digestible, effective and appropriate as diverse businesses require.

On face value, financial accounting is nothing more than debits and credits, but any finance professional will tell you that the way you present those debits and credits ultimately determines how financial accounts are interpreted and relied upon for current and future decisions. The level of detail also helps in arriving at more qualified conclusions.

A particular problem with the current model of SAP Finance solutions is the performance of the accounting system. Performance is often variable relative to the levels of automation, integration and transaction volumes, but the biggest challenge seems to be the way the SAP financial system has been engineered and is used relative to contemporary expectations. Synchronization with a reporting system may itself take many hours, hours that are particularly precious at period close.

Many companies complain that periodic processing is still heavily dependent on manual processing of certain transactions, particularly when systems are not fully unified, and while this could be addressed with systematic automation, there are many variants in use cases that don’t justify it because of relatively low value or low transaction volumes.

Those that have been using SAP financials for a good number of years know that the core SAP financial tables (excluding the CO tables) of BSEG and BKPF can contain very large volumes of data, and that efforts to accelerate the way reports are delivered can be very time consuming and not easy to iterate for an accelerated period close. Reports are long running, resource intensive and problematic.

SAP’s promise with SAP Simple Finance is that the data processing model can return to a single, simple finance document, without the need for diversified views of the same data.

Some of the legacy constraints of deficient technologies inherent in recording financial transactions in multiple tables of a relational database have been the reason for this slowness, and these will now be avoided with Simple Finance.

Simplification with fewer tables

Of course, simplifying the experience for those who use the data comes at a price. There are prerequisites, and among them a wholesale rip and replace of the underlying infrastructure: switching to HANA as the database and applying Enhancement Package 7 (EhP7).

SAP has engineered an application layer with more technical complexity in order to support an enhanced on-the-fly reporting experience. In order to get to a single version of the truth, the ultimate objective for financial accounting is to logically connect the FI line items contained in BSEG to the controlling line items in COEP to create single document views.

Focusing on the postings into financial and management accounting in the controlling part of SAP is important because the costs and revenue postings determine the value of individual transactions in any given accounting data recording system. In the past companies would work with just summary information in order to cope with the transaction volumes, but you lose some of the advantages of granularity through this approach and detract from good analytics. The new model also addresses some of the data redundancy issues associated with SAP data in the FI-CO modules.

The new approach will harmonize internal and external reporting, which should alleviate some of the problems associated with data intensive processing and analytics. While this will not necessarily be interesting to all companies, it will accelerate reconciliation and tighten the period close activities for many. The end result will be four main tables holding document headers and line items; what used to be separate physical tables will now be handled with Core Data Service (CDS) views leveraging HANA. Those who have highly customized financials will need to give this careful consideration if SAP Simple Finance is on their radar as an objective. Reporting will be faster, particularly for insurance and banking institutions.
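Reduced to a toy contrast (my own illustration, not SAP’s implementation): instead of maintaining pre-computed totals tables alongside the line items, totals are derived from the line items on demand – practical only when the underlying store is fast enough, which is where HANA comes in.

    # Toy contrast: stored totals vs. totals computed on the fly from line items.
    line_items = [
        {"account": "400000", "amount": 120.0},
        {"account": "400000", "amount": -20.0},
        {"account": "410000", "amount": 75.0},
    ]

    # Old style: a separate totals table kept in sync on every posting.
    totals_table = {"400000": 100.0, "410000": 75.0}

    # Simple Finance style: derive the total when asked, from one line-item store.
    def balance(account):
        return sum(li["amount"] for li in line_items if li["account"] == account)

    assert balance("400000") == totals_table["400000"]  # same answer, one source of truth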

If da Vinci’s statement is true, then SAP’s new financials model really will bring sophistication, at least for those focused on proper accounting and controlling, but it will have achieved it only through an extraordinary amount of re-engineering at the application and platform logic level.

For consultants and developers this new approach will require re-education and the understanding that they will need to move to account based CO-PA to reap the best advantages of the new design.


