Working alongside SAP Business Suite

October 10, 2014  4:23 PM

That bad data – why haven’t you fixed it by now?

Clinton Jones
Data governance, Data Management, Master data management, Master Data Services

‘Dirty Data’ is a business problem, not an IT problem, says analyst firm Gartner. More than 25% of critical data in leading companies has defects, but if you’re relying on IT to solve the problem you may have it all wrong…

As you can imagine, with data growth rates of anywhere between 10 and 50% year-on-year, roughly a quarter of all new data will be faulty in some way. Unless you address matters, your ratio of bad data may remain constant, but the absolute number of faulty records will grow just as fast as your data.
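
To make the arithmetic concrete, here is a minimal sketch; the starting volume, growth rate and defect rate are illustrative assumptions, not Gartner's figures:

```python
# Illustrative only: 1M records, 20% annual growth, constant 25% defect ratio.
records, defect_rate, growth = 1_000_000, 0.25, 0.20

for year in range(1, 4):
    records = int(records * (1 + growth))
    # The ratio stays at 25%, but the count of faulty records keeps climbing.
    print(f"Year {year}: {records:,} records, ~{int(records * defect_rate):,} defective")
```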

Organizations have two ways to tackle the challenge: they can deal with the data quality issue passively or actively. Relying on IT to ‘fix’ the data will likely never get your data to a level of quality that satisfies the needs of the business.

Closing the stable door after the horse has bolted

Passive data governance means addressing data issues through identification and remediation after the records have already been created.

This approach is pretty commonplace, especially when you are relying on IT, and it is generally accepted as the traditional way to improve data quality and clean up faulty data in systems.

This approach typically batches the data hygiene activity and involves data extraction, profiling, transformation and update.

There are many products on the market that offer this capability with varying degrees of automation. You can also handle the issue manually and simply throw bodies at cleaning up the data. In many instances, though, if you already have faulty data in your system and are not able to prevent more from being created, you’re continually in ‘catch-up’ mode.
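
As a rough illustration of that extract–profile–transform–update cycle, here is a minimal Python sketch. The record layout and the hygiene rules are hypothetical, and a real pass would read from and write back to the ERP through its supported interfaces rather than plain lists:

```python
# Hypothetical passive-governance pass: extract, profile, transform, update.
def profile(records):
    """Flag records that fail basic hygiene checks (rules are illustrative)."""
    return [r for r in records
            if not r["storage_location"] or r["gross_weight_g"] <= 1]

# Extract: in reality these rows come from the system of record.
extracted = [
    {"material_id": "M-001", "storage_location": "", "gross_weight_g": 1},
    {"material_id": "M-002", "storage_location": "A01", "gross_weight_g": 450},
]
# Corrections compiled offline by data stewards.
corrections = {"M-001": {"storage_location": "B07", "gross_weight_g": 320}}

for record in profile(extracted):                               # profile
    record.update(corrections.get(record["material_id"], {}))   # transform
# Update: in reality the corrected records are posted back to the ERP.
print(extracted)
```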

Corral your data

A second approach is a little more systematic about the data creation and maintenance process and is typically considered an Active Data Governance approach.

In active data governance, systems are configured in such a way that data cannot physically be entered into a system if it does not conform to a basic set of rules. The rules could live in a rules engine or in the configuration of a given application. Whether the data is transactional, referential or master data determines the kinds of validation one might establish. For example, an order entry system might only accept sales orders where inventory is available for dispatch and an existing customer identifier is provided.

Other kinds of order entry systems might allow you to create the order right up to the point where you define the delivery address and the contents of the order, permitting anonymous sales or sales tied only to an email address. In such instances there is no requirement for an existing customer account.

How you choose to configure your systems really hinges on the nature of your business and your business rules, but if you have loose rules and light configuration, the risk of faulty data getting into your systems is greater.
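
A minimal sketch of the kind of entry-time rule described above, using the sales order example; the field names and the in-memory inventory and customer sets are assumptions for illustration, not SAP configuration:

```python
# Hypothetical active-governance check: reject the order before it is saved.
inventory = {"SKU-100": 12}   # on-hand stock, illustrative
customers = {"CUST-001"}      # known customer identifiers, illustrative

def validate_sales_order(order):
    """Return rule violations; an empty list means the order may be saved."""
    errors = []
    if order["customer_id"] not in customers:
        errors.append("unknown customer identifier")
    if inventory.get(order["sku"], 0) < order["quantity"]:
        errors.append("insufficient inventory for dispatch")
    return errors

order = {"customer_id": "CUST-001", "sku": "SKU-100", "quantity": 50}
violations = validate_sales_order(order)
if violations:
    raise ValueError(f"Order rejected: {violations}")  # bad data never lands
```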

Embedded rules and tribal knowledge

The rules for your systems are often implicitly embedded in the very systems that you use to capture the data, and this is one of the reasons why you may be struggling to get on top of that pile of faulty data and unable to easily prevent additional faulty data being gathered. Changing the rules may be difficult or impractical, and sometimes you simply have to work around them or live with their inadequacies. Things may be further exacerbated by the fact that no single person actually knows what all the rules are, or where they are maintained or held.

In my conversations with SAP customers, for example, I often hear that bad data is gathered and entered because the roles of the people in the data management process are defined without good consideration given to the actual capabilities of the individuals who fulfil those roles.

As a consequence, inventory masters may be created with data attributes like storage location = TBD (to be determined) or a gross weight of 1 gram instead of the actual weight of the inventory item. There’s actually a rule here, but the rule is: if you don’t know the answer, use TBD or 1 gram, depending on the field. These data management rules are therefore tribal knowledge, and probably unknown to the rest of the business.

The only way this data will be corrected is if you pick it up in a passive data governance pass after the fact – extracting all items with storage location equal to ‘TBD’, for example, then sitting down to assign storage locations and update the records accordingly. This may be quite an acceptable approach if you have the resources and discipline and can afford the time delay.
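
In code, such a remediation pass might look like the sketch below. The field names are hypothetical, and in an SAP context the extract and update steps would go through supported interfaces (queries, BAPIs or a tool) rather than direct table access:

```python
# Hypothetical after-the-fact cleanup of 'TBD' storage locations.
masters = [{"material_id": "M-010", "storage_location": "TBD"},
           {"material_id": "M-011", "storage_location": "TBD"}]

# Assignments decided offline by whoever actually owns the data.
assignments = {"M-010": "C12"}

still_open = []
for item in (i for i in masters if i["storage_location"] == "TBD"):
    location = assignments.get(item["material_id"])
    if location:
        item["storage_location"] = location  # in reality: update the ERP record
    else:
        still_open.append(item["material_id"])

print("Still awaiting an owner:", still_open)  # -> ['M-011']
```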

When there’s no urgency or real concern about this data, the approach may be acceptable, but in the longer term it needs to be recognized that a slow reaction to poor data simply means more bad data gets added to a growing pile.

ConAgra Foods, an American packaged foods company that makes and sells supermarket, restaurant and food service brands, chose active data governance over passive data governance as the preferred approach to data management in its SAP system of record, an approach that included improved handling of hundreds of rules and fields to get products to market faster.

Taking stock

The first thing to get clear is whether your way of dealing with data management is passive or active.

Working out where your business rules live, and how you assign roles and responsibilities for data management, is another. Sales people, for example, often don’t care about logistics data or even accounting data unless it impacts them personally. Similarly, accounting folks typically don’t care about the purchasing contact within a customer, but they do care about the accounts payable contact when the account becomes overdue.

Only when you have a clear understanding of your current data management approach and of what your data quality is like can you really apply appropriate energy to improving it. You need to understand whether you need additional tools or a data management reorganization in order to manage your data better, but these won’t miraculously change the state of your data.

Getting your organization right is often a first step, but it’s not the only one. Start by recognizing data quality as a serious problem: measure it, report it, and determine how you want it to look in the future, then choose the path most appropriate to your business needs.

October 4, 2014  11:19 AM

Front and center – Concur sings to the hearts of middle management and exec PA’s

Clinton Jones
Data Management, Data management and storage, expense management software, Mobile expense management, SAP, SAP and data management, User experience

The Twittersphere, LinkedIn and the wires have been ablaze with the news of SAP’s plan to purchase Concur for $129 per share, or $8.3 billion. Although it is alleged that dialog had been going on for some time, it seems SAP was the only real suitor.

Of course the media fallout quickly began, and Safra Catz, recently announced as CEO of Oracle, was among the first to lay into the announcement, saying SAP basically paid too much. No comment was made about whether or not this was a smart addition to the SAP litter; it seems that may have been unnecessary.

How much is enough, ultimately? In reality SAP still labours under a mantle of usability challenges – Fiori will address some of these, but that’s a long term vision. Concur becomes yet another example of how SAP is trying to address the balance between usability, functionality and executive advocacy. In reality usability and adoption matter immensely; they are application traits that sing to the hearts and minds of the middle tier of corporate management, and the personal assistants of executives have a very special place in solution advocacy.

In many respects, the Concur acquisition will be not dissimilar to SuccessFactors, except that it closes the loop on an even larger audience of organizational players. All organizations that have personnel who incur expenses will be interested in a solution that provides a simple, easy to use interface and an available anywhere, on anything experience.

My first brush with Concur

My first brush with Concur was back in around 2005/2006, when we adopted Concur for expense reimbursement during my work at one of the mobile carriers in the US. The Concur implementation was well managed internally, piloted and then rolled out, and for the most part went off without a hitch. On the SAP side, the integration steps were rapidly put in place, and accounting quickly adopted and accelerated the deployment. It was a resounding success despite the fact that the whole user experience involved little or no SAP interaction. You filled in a web template, printed it, scanned receipts and faxed them to a number, or attached scans of your receipts, or scanned everything into a PDF and sent that to an email address. The whole experience was so simple. No internal envelopes with paperwork to get lost, minimal delays in processing, a simple workflow and bam! You were reimbursed for out of pocket expenses.

When I moved to working for SAP, expense submissions were excruciating. They involved entering your expenses into a browser-based NetWeaver session, running a posting simulation, printing a hardcopy of the simulation and sending that to accounting for reimbursement. Not only were thousands of SAP employees adding to the mountain of filed paper, we were also contributing to the incredibly expensive environmental costs associated with using paper. It was such an incredible step backwards that I think I was somewhat dumbfounded, but of course if you’re relatively low down the totem pole you don’t rock the boat unnecessarily. While the process was pretty well oiled, the turnaround time on expense reimbursement was about two weeks, though this may have had more to do with ludicrous batching of accounting cycles – who really knows. Working with SAP involved anything except simplicity.

Roll around to 2014, and my employer Winshuttle has been using Concur for probably a year or more now, and the end-to-end process is incredibly simple. I may have thought my first brush with Concur was relatively painless, but it is even more so today. Receipts are captured with my mobile phone; there’s an app for Windows and iOS, and undoubtedly for Android too. There’s the web interface, and supported attachments include PDFs and the usual image files. We use a basic workflow, and once I have submitted my claim I get notifications about its status.

There are some things that I don’t like about the way we specifically handle the processing, but that’s a different matter. The reality is that the process is painless for me, and certainly for accounting; they really do applaud the decision to choose Concur over the previous method we used with spreadsheets and attached receipts. We don’t use SAP for our back end accounting, but I know our corporate cards are linked to Concur, so when an expense appears on the corporate credit card it shows up as an un-expensed charge in your inbox.

At Winshuttle we really appreciate the way the application works because it screams usability, and it really makes this whole data management activity straightforward for everyone. It improves data quality and posting accuracy and promotes data capture at the source.

Again, I want to emphasize that when you consider this acquisition, the curiosity for some may be that SAP already had an expense and travel management solution of its own, just as it had a full-blown, full-faceted HCM solution before the acquisition of SuccessFactors. But if you’re thinking that all this is alternative lipstick on the pig, or a lazy way to do what SAP is doing with solutions like Fiori, then you’re missing important aspects of the direction SAP is going – this acquisition was seemingly well considered.

Cool, adaptable and functional

SAP battles on multiple fronts. The high end ERPs have saturated the big corporates, and all that is left is a few holdouts with their custom systems or legacy solutions that they remain addicted to. Eventually those systems may get displaced, but those are long sell cycles and career changers. There’ll be a few upgrades from middle tier ERP users to SAP along the way too.

On another front, there’s the proverbial trench war going on between Oracle and SAP, where customers with deep pockets and a radical palate for change will rip and replace as part of renewal or transformation initiatives. At the centenary of WWI, it is easy to see strong similarities between the high end ERP market and the dig-in mentality of the opposing forces in France and Belgium in 1914.

The front that the system integrators only now seem to have realized holds the real new money is the SMB market. SAP knows this is a burgeoning and under-serviced market. It is positioned for the small ERP customers, along with the likes of SAGE and others, with solutions like B1, but this is a market where keenly priced, full function ERP solutions that position for aspirational growth can be incredibly successful. In LEGO terms, B1 is akin to DUPLO (clumsy and basic) and the ERP suite is the CREATOR series (old school but infinitely flexible). Continuing this LEGO analogy, Concur and SuccessFactors may be the MIXELS – cool and adaptable. Customers want what’s cool and adaptable as well as functional.

For these prospects though, a crappy UI is not going to get you the attention and advocacy that you want and desperately need. When you’re signing a seven or eight figure deal for a new system it has to sing to your heart – Concur and SuccessFactors will do that!

SAP seems to have come to the realization that it can buy faster than it can build. The real question is: what next?

Picture: President Barack Obama, right, and first lady Michelle Obama, second from right, with recipients of the 2010 Kennedy Center Honors, from left, producer, television host and actress Oprah Winfrey, songwriter and musician Paul McCartney, sing the National Anthem during the 2010 Kennedy Center Honors Gala at the Kennedy Center in Washington. At far left in the background is Stedman Graham. (AP)


September 29, 2014  1:55 PM

Losing the kingdom for want of a nail

Clinton Jones
Uncategorized

The phrase relates to an ancient proverb, first referenced around the death of King Richard III of England at the Battle of Bosworth Field.

Richard III was the last king of the House of York and the last of the Plantagenet dynasty. More recently he featured in the news when his remains were reported to have been found under a parking lot in Leicester!

I was reminded of this proverbial rhyme, which describes how small events can have much larger consequences, in a recent meeting during which we were analysing the results of a finance automation survey conducted with the finance community.

Though the context of the reference to the proverb was not directly associated with my thoughts in this post, it does lead into the great message the proverb hopes to convey: that small things can have large consequences, and that the failure to address issues as they are identified can ultimately lead to a more pervasive or more profound problem elsewhere.

This is important in the way we consider how things work and the way we conduct business.

History is littered with other examples

Ancient narratives and the press are littered with examples of seemingly insignificant things that have had far-reaching and sometimes catastrophic implications.

One of these was the sad outcome of NASA’s Challenger. On January 28, 1986, the Challenger and its 7-member crew were lost 73 seconds after launch when the failure of an apparently insignificant component, a booster seal, resulted in the break-up of the vehicle.

Another such small issue could be considered in the sinking of the Titanic: the suggestion that the metal rivets holding the ship together became brittle in the frigid waters and broke apart on impact with the iceberg, resulting in more damage to the vessel than the iceberg would otherwise have caused.

Both disasters were ultimately attributed to failures to make sound decisions about small things, which in the end comes down to the quality of information – information about the condition, background, composition and characteristics of the parts that make up the whole. In vehicles this can be crucial, but we shouldn’t underestimate the importance of information about other small things and how these can compound into bigger issues.

For one Winshuttle customer that spoke at the Winshuttle User Group meeting in Brussels, this was the difference between a GR/IR account that amounted to more than £9,000,000 and a more manageable balance of less than £500,000 after they had used Winshuttle to make adjustments and corrections.

Daily decisions are made on data

Businesses decide daily whether to pivot on a course of action, change direction or do something completely different, and such decisions are generally made based on information, not on sentiment or gut instinct. Certainly some decisions are made using instinct or sentiment, but for businesses many are based on things like sales volumes, market projections, competitor behaviour, the state of the economy and even how much money we owe and how much we have in the bank – all derived from business data.

Arriving at good decisions based on data requires that data to be as good as it possibly can be, and that can really only be achieved in a couple of ways: fixing existing data and ensuring that new data is created in a proper and complete way.

SAP customers work with data every day. More interesting is the fact that they work with data across a broad swathe of ERP functionality – everything from master data to sales orders, financial postings, personnel records, projects and plant maintenance.

In the realm of projects and plant maintenance many industrial manufacturers and service organizations run thousands of data objects through automated processes to move their decision support systems closer to perfect information in order to avoid the loss of the nail which ultimately could lead to the loss of the Kingdom.

My advice is to get online and find out what’s available in the market – there are many SAP solutions and SAP partner solutions addressing a broad swathe of industry solutions and functional areas.

Continuously challenge the status quo and keep learning – don’t assume that custom development and custom applications are the only choices available, even when working with SAP ERP.

Every day I discover something new that I hadn’t thought of before or even conceived of. There are some pretty cool technologies and approaches out there that could really be taking some of the pressure off both IT and the business in automation and business process improvement – many of them also won’t cost you a King’s ransom!


September 9, 2014  2:47 PM

A rule based business model for data governance

Clinton Jones
Approval Workflow, SAP, Workflow

Rules are an integral part of daily life – we’re governed by them continuously – and yet they are an aspect of the way businesses function that is becoming increasingly automated.

Implementing business rules effectively hinges largely on how a given organization chooses to tackle rules management. Many companies manage rules from an endpoint or ‘passively’  through analytics and audit rather than actively and in real-time.

Rules can easily be implemented around data creation and maintenance scenarios, but they are often deployed in highly constrained and relatively inflexible ways, where one relies on the application configuration to determine whether data conforms to business rules or not. In some cases even configuration is not possible, and the rules are embedded in the application logic.

Workflow engines

Using tools like workflow engines and forms designers to build rules-based data management and creation processes does have its place in augmenting adherence of data to business rules, but it may not always be the best fit. A lot of the decision around whether or not to adopt a business rules engine (BRE) therefore depends not only on the entire systems landscape but also on the complexity and relative importance of the data and rules being gathered and respected, as well as the degree to which the business rules change.

A workflow engine with a form, for example, has a distinct advantage over pre-built solutions in that it can be actively applied to data management scenarios and deal with data creation and maintenance issues before they become fixed in the system of record. Additionally, such tools tend to be highly customisable; in other words, you get to decide what data rules you want applied, how, and to which data elements.

Of course the key disadvantage of using such tools is that synchronization between systems and consistency matching of the rules is sometimes a cumbersome process especially if rules change frequently. Building such a solution and having it perform rule consistency checks also requires a fair degree of hand stitching of verification and synchronization methods.

In a nutshell, an active business rules engine (BRE) integrated with enterprise applications, helps manage and enforce business rules.

Rules engines as a service

Typically, modern systems are architected to use BREs as service components that separate the business rules from the business applications themselves, providing more flexibility and broader applicability. Having the BRE outside the application reduces the time, effort and costs of application maintenance by allowing business users to modify the rules as necessary without the need for core application changes. The most sophisticated of these detect inconsistencies within individual business rules as well as within rule-sets. Rule-sets in this case are collections of rules that apply to a particular event and require that all rules in the set be evaluated simultaneously.

In terms of components, a BRE often has the following characteristics (a minimal sketch follows the list):

  • Rule Repository – a database that stores the business rules and rule-sets defined by the business
  • Rule Designer/Editor – front-end application and a user interface that allows users to define, design, document, and edit business rules
  • Reporting – allows users and rules administrators to query and report existing rules
  • Engine – the actual application code that enforces the rules or provides flags that indicate whether data entered is consistent with the rules and rule-sets
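
As a very small illustration of how those pieces fit together, here is a hedged Python sketch of a rule repository plus engine. Real products add persistence, a designer UI, versioning and reporting, none of which is shown, and the vendor-creation rules below are invented:

```python
# Minimal rule repository + engine sketch; each rule is a named predicate.
RULE_REPOSITORY = {
    "vendor_create": [  # a rule-set: all rules evaluated together for one event
        ("name required", lambda d: bool(d.get("name"))),
        ("country code is 2 chars", lambda d: len(d.get("country", "")) == 2),
        ("IBAN present for EU", lambda d: d.get("country") not in {"DE", "FR"}
                                          or bool(d.get("iban"))),
    ],
}

def evaluate(event, data):
    """The 'engine': run every rule in the event's rule-set, return violations."""
    return [name for name, rule in RULE_REPOSITORY.get(event, []) if not rule(data)]

print(evaluate("vendor_create", {"name": "Acme GmbH", "country": "DE"}))
# -> ['IBAN present for EU']
```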

When shopping for a BRE, it should be clear then that not all BREs are made equal. Some products positioned as BRE tools are nothing more than analytics and reporting tools; they therefore constitute passive data governance tools.

Passive governance tools use an end-point approach in that they apply rules after transactions and master data have been created in systems like SAP. In particular they add operational overhead because the resulting actions are often data rework rather than blocking actions at data creation or change time. If action based on certain data is time critical then a passive approach may not meet the real business objectives.

In a scenario where active governance is in play, data cannot be created or changed if it is not aligned with the defined business rules. For some organizations this is not an acceptable approach, because active governance can sometimes impair the smooth flow of business transactions, especially when data rules are ambiguous or poorly defined, or the rules are in a state of continuous evolution.

Business rules in SAP

SAP’s ERP has a configurable BRE around certain defined data objects and activities, but it is not an exhaustive one – implementing an exhaustive one usually requires customization or the addition of third party solutions. In addition, certain data validation may be pared back in terms of its precision in order to accommodate the special circumstances of a particular business, such as vendor or customer creation, because of either a legacy of data challenges or some special characteristics around the way data needs to be created or maintained.

An example of this might be duplicate address checking in SAP at the time of customer or vendor creation. Some companies have this switched on, and some have it switched off because they have had issues with it in the past. In reality the duplicate address check is a brute force mechanism that doesn’t have a terribly elegant way of working.

For companies that have multiple business units billing to or from the same address, but which need to be defined separately for accounting purposes, the duplicate address check as defined is not a great approach, and something more sophisticated needs to be in place. Address validation is an example of a narrow, limited-functionality BRE; some companies will use an external address validation call to maintain addresses in a way that is consistent with achieving proper deliveries.
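
To see why a brute force duplicate check misfires in that multi-business-unit case, consider this sketch. Naive normalization flags the two records below as duplicates even though they are legitimately separate accounts; the normalization rules are illustrative only:

```python
# Naive duplicate-address check: normalize the strings, then compare exactly.
def normalize(addr):
    return " ".join(addr.lower().replace(".", "").replace("street", "st").split())

a = "12 Main Street, Springfield"   # business unit A
b = "12 MAIN St., Springfield"      # business unit B, same building
print(normalize(a) == normalize(b))  # True -> flagged (or blocked) as a duplicate

# A smarter rule would also compare business unit, tax number or partner role
# before blocking creation - exactly where the simple check falls short.
```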

A credit application is also often considered a good example of a situation where a basic BRE can be designed in a form for first level clearing of an application or credit request. Here you might enter name, address, identity number and annual income. While this may give you an idea of how much credit you might initially be considered for, a credit check is the next step, which means a sophisticated algorithm evaluates your credit profile and gives you a score. At this point a company would often lean on a third party solution to do the credit checking and then apply some sort of blocking value if the record needs further consideration.

A workflow solution with a form could recommend a credit limit or approval based on the credit score, but in the end it is doing nothing more than comparing the score read from the scoring system against a range of possible credit amounts pre-configured in the form or workflow. This conveys the impression of something sophisticated to the end-user, but ultimately it is a very simple approach to applying business rules. For 80% of applications this approach may be more than adequate, but it likely adds very little to the end-to-end process.
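
Stripped of the form and workflow dressing, that ‘simple approach’ reduces to a table lookup, roughly as below; the score bands and limits are invented for illustration:

```python
# The whole 'rule': map an externally supplied credit score to a fixed limit.
CREDIT_BANDS = [(750, 50_000), (650, 20_000), (550, 5_000)]  # (min score, limit)

def recommended_limit(score):
    for min_score, limit in CREDIT_BANDS:
        if score >= min_score:
            return limit
    return 0  # below all bands: refer for manual review

print(recommended_limit(680))  # -> 20000
```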

If little added value is derived from this approach then this shouldn’t be considered an ideal solution. A call to an appropriately configured service or BRE as part of the form and workflow design would add considerable value to the process.

A BRE will likely make the calculations and reach conclusions based on a complex algorithm, and would likely not depend simply on what is entered by the business users alone – behind the scenes it will collect other data and make inferences that are not necessarily clearly tied to the data entered. A good active BRE may even push for more inputs to be provided via the input method, something a simple form design with workflow would not do. A BRE could drive your solution design.

A rules engine has one core focus, and that has to be the value of consistent and correct data. Everyone understands why an address, contact information and the like need to be consistent and correct, but when you move into the realm of ‘other’ master data like price lists, payroll, inventory data and specifications, clarity and understanding may become diluted.

In certain industry segments the need for robust rules engines is of much higher value than in others. In the financial services industry, for example, credit checking may be considered far more critical than in a retail or wholesale context. Value translates as opportunity cost in many cases, but it can also be derived from avoiding penalties, fines and other compliance or regulatory strictures that may pressure certain kinds of decision making around data inputs.

Following the rules

Workflows can be built to work in a similar fashion to a BRE for a very specific set of data or a scenario, but the challenge is that the rules have to be designed into the workflow or form itself. While this approach can achieve the same result, it is not as flexible, adaptable and sophisticated as what you might do with a BRE.

Companies can use generic rules engines like Oracle Enterprise Data Quality Services with workflow and forms solutions like Winshuttle to give a more complete solution to the business up front but this requires investing in two components, the generic rules engine and the workflow and forms tool.  With this approach you can achieve active governance.

Conclusions

Building a BRE in a workflow tool would likely require very complex workflow design and make use of a great deal of embedded rules in the forms and workflows, though it could gain some flexibility by making use of data stores like SharePoint lists. Workflow and forms in this instance become a black box whose rules can only be tested with automated testing tools.

Because of the way a BRE works, its rules are more accessible for scrutiny because of the way they are designed and implemented, but comparing them with what you have in SAP or any ERP is largely impossible, because the rules repository and the ERP configuration approach don’t align.

At best you could take a sampling of transactions or data that has passed through the ERP and compare the end results with similar inputs passed through the BRE. This is effectively just an analysis task – probably indicative but certainly not exhaustive. Another approach to consider is one that layers the front end data collection or data manipulation process of the ERP with a forms and workflow solution that invokes checks against a BRE at certain key points before effecting changes in the ERP. Such an approach adds considerably to data quality and control for ERP systems without having to make changes to core ERP.

Which solutions you ultimately choose really depends on what it is that you want or need to do around managing data or business data events. Start with defining the value of your data and then evaluate whether you can afford a passive or active data governance approach.


June 30, 2014  12:58 PM

Simplicity is the ultimate sophistication

Clinton Jones
FICO, Financials, SAP, Uncategorized

This quote, reportedly attributable to Leonardo da Vinci, actually tells us a lot more about the amount of effort involved in delivering a simple experience to users than we probably realise.

In fact simple interactions, particularly with systems, can rapidly become complex quite simply through variability.

Interestingly, complexity can arise even when simple components and simple interactions are combined in various combinations and permutations. We often see this at Winshuttle: a seemingly simple process that one thinks can be easily improved through a form becomes extraordinarily complex when all the variations in the process are incorporated. Automating finance processes in SAP, in particular, then becomes more easily attainable with Excel than with a web form.

Run SAP simpler

SAP’s apparent new mantra of making interaction with their leading ERP ‘Simple’ presents an interesting opportunity for Finance professionals to rethink the way that they work with the SAP ERP in the finance context but the question will be whether this simplicity can really be as digestible, effective and appropriate as diverse businesses require.

On face value, financial accounting is nothing more than debits and credits but any finance professional will tell you that the way you present those debits and credits ultimately determines how financial accounts are interpreted and relied upon for current and future decisions. The level of detailing also helps in arriving at more qualified conclusions.

A particular problem with working with the current model of SAP Finance Solutions is performance of the accounting system. Performance is often variable relative to the levels of automation, integration and transaction volumes but the biggest challenge seems to be the way the SAP financial system has been engineered and is used relative to contemporary expectations. Synchronization with a reporting system itself may also take many hours, hours that are particularly precious at period close.

Many companies complain that periodic processing is still heavily dependent on manual processing of certain transactions, particularly when systems are not fully unified, and while this could be addressed with systematic automation, there are many variants in use cases that don’t justify it because of relatively low value or low transaction volumes.

Those that have been using SAP financials for a good number of years know that the core SAP financial tables (excluding the CO tables), BSEG and BKPF, can contain very large volumes of data, and that efforts to accelerate the delivery of reports can be very time consuming and not easy to iterate for an accelerated period close. Reports are long running, resource intensive and problematic.

SAP’s promise with SAP Simple Finance suggests a return to a single finance document in the data processing model, without the need for diversified views of the same data.

Some of the legacy constraints of deficient technologies inherent in recording financial transactions across multiple tables in a relational database have been the reason for this slowness, and will now be avoided with Simple Finance.

Simplification with fewer tables

Of course, simplifying the experience for those who use the data comes at a price. There are prerequisites, and among them is a wholesale rip and replace of the underlying infrastructure: switching to HANA as the database and applying Enhancement Package 7 (EHP7).

SAP has engineered an application layer with more technical complexity in order to support an enhanced on-the-fly reporting experience. To get to a single version of the truth, the ultimate objective for financial accounting is to logically connect the FI line items contained in BSEG to the controlling line items in COEP to create single document views.
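
Conceptually – and only conceptually – that single document view is a join of FI and CO line items, roughly like the pandas sketch below. BSEG, BKPF and COEP are the real table names, but the join keys and columns shown here are a simplification of how the documents actually relate, so treat this purely as an illustration of the idea:

```python
import pandas as pd

# Toy FI line items (BSEG-like) and CO line items (COEP-like); columns simplified.
fi = pd.DataFrame({"BELNR": ["100001"], "BUKRS": ["1000"], "GJAHR": [2014],
                   "HKONT": ["400000"], "amount": [250.0]})
co = pd.DataFrame({"BELNR": ["100001"], "BUKRS": ["1000"], "GJAHR": [2014],
                   "cost_center": ["CC-42"]})

# One logical document view instead of separately reconciled FI and CO records.
single_view = fi.merge(co, on=["BELNR", "BUKRS", "GJAHR"], how="inner")
print(single_view[["BELNR", "HKONT", "cost_center", "amount"]])
```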

Focusing on the postings into financial and management accounting in the controlling part of SAP is important because the cost and revenue postings determine the value of individual transactions in any given accounting system. In the past, companies would work with just summary information in order to cope with the transaction volumes, but that approach sacrifices some of the advantages of granularity, detracts from good analytics and leaves unaddressed some of the data redundancy issues associated with SAP data in the FI-CO modules.

The new approach will harmonize internal and external reporting, which should alleviate some of the problems associated with data intensive processing and analytics. While this will not necessarily be interesting to all companies, it will accelerate reconciliation and tighten the period close activities for many. The end result is four main tables for document headers and line items. What used to be separate tables will now be handled with Core Data Services (CDS) views leveraging HANA. Those who have highly customized financials will need to consider this carefully if SAP Simple Finance is on their radar as an objective. Reporting will be faster, particularly for insurance and banking institutions.

If da Vinci’s statement is true, then SAP’s new financials model really will bring sophistication, at least for those focused on proper accounting and controlling, but it will have achieved it only through an extraordinary amount of re-engineering at the application and platform logic level.

For consultants and developers this new approach will require re-education and the understanding that they will need to move to account-based CO-PA to reap the best advantages of the new design.


June 6, 2014  8:34 PM

SapphireNow 2014 and the year of industry confusion

Clinton Jones
SAP

SAP and ASUG’s annual love fest is over now, though undoubtedly the blogs, press releases and Twittersphere will continue to trail with summaries and tweets, and of course I do look forward to some of the show floor footage appearing on YouTube.

As with all shows of this nature, views on its success seem to be mixed depending on the sphere. As an employee of Winshuttle I was fortunate to be able to attend and participate in some of those conversations with partners, SAP employees and customers.

2014, from my vantage point, has to have been the most confusing year imaginable for delegates to SapphireNow. In the past three or four years there has been some confusion, but this year seems to have taken the cake.

In prior years we would have conversations with BusinessOne customers and sadly inform them that we had nothing to help them with their challenges in the SAP space.

Those focused purely on enterprise reporting didn’t seem to grasp the nuance between tactical and strategic reporting and so were left with their BOBJ, BI/BW or Crystal Reports stuff.

The SuccessFactors set had their own challenges just getting core ERP to talk to SuccessFactors in a synchronous, seamless and reliable way without having to make gigantic investments, and we didn’t really have much to say to them either.

This year many people came up saying, “We’re running HANA and we want to see how you can help us with HANA data and Excel.”

When you probed them further on this, it seemed they were somewhat committed to using HANA as a platform – but their core back office operational processes were still happening elsewhere, often not even in SAP ERP.

What they were looking for was a way to feed HANA with data from disparate sources like Excel without having to dive into developing static interfaces for every data source and without having to work with text files.

The elephant in the room

Just as in 2013, SapphireNow 2014 was about HANA and the cloud, and SAP’s big play to be a major cloud player.

However, when you scrape aside all the fluffy marketing stuff, my sense is that at the core the fundamental problems still remain. There is an elephant in the room, and marketing – and to some extent engineering – aren’t sure where to start dealing with it. How can any SAP ERP customer really leverage this new stuff in a meaningful way with what they are using?

For many, the central nervous system is core ERP and with the exception of a few newcomers to SAP, this remains an on premise behemoth probably running on a database other than HANA – it seems then that it is almost impossible to shift without a major reinvestment.  Such a reinvestment has many aspects including consulting services around data migration, buying new hardware, rejigging application logic and in some instances complete business process reengineering.

What I heard from attendees was that to move from an existing installation to a ‘better’ one that reaps all the benefits of HANA and Fiori with mobility and analytics, they may as well treat their existing SAP installation as a ‘legacy’ system and consider the shift a brand new, full blown SAP implementation project.

Getting funding for SAP renewal initiatives, even in a business climate that seems fairly buoyant, strikes me as a hard sell to the board – particularly since the board likely understood that they were investing in SAP in the first place to get a scalable, flexible platform for years to come.

Feeding the beast

While the declaration is now that new face-paint like Fiori is available at no extra cost to existing licensed application users, for many delegates this was a bit of a “huh?”. In part that is because there are some critical application and BASIS layer prerequisites, and in part because Fiori doesn’t yet cover all the scenarios that a business might want to beautify or simplify – so frustrations with initial low adoption rates may continue for some time to come. More importantly, how this relates to everything else may not be very obvious.

For executive management it will certainly hold some appeal, but for the backoffice it will likely be as successful (useful) as Personas, ESS, MSS, the NWBC and Portal transactions. The backoffice is data-centric – Fiori and usability approaches are about finessing the user experience to make things intuitive and easy to navigate and use, not about feeding the ERP beast.

Fiori is more contemporary, more consumable and more UX-centric, but to be a broadly usable reality there needs to be a catch-up in the infrastructure space. The real question will be whether new revenue generating or cost reducing initiatives will be viewed as more valuable avenues for IT spend than an investment in making SAP more accessible and consumable to business leaders.

All Your Base Are Belong to Us

The SAP portfolio of product offerings in 2014 is substantial but the problem is that while a lot of ground is covered, not many of the solutions play nicely with one another.

For the red-headed stepchildren that form part of the SAP product family, Fiori is not really that interesting – except perhaps for SuccessFactors and some aspects of the analytics space (alternative ways to access BI/BW/BOBJ data), though this will undoubtedly all change in the future. This seems in part due to the continued poor integration and unification of the disparate technologies.

Given all the other mobility offerings out there, it is difficult to understand which ones will be retained and developed, which will continue to be supported, and which will be killed off altogether by SAP.

With so many different kinds of customers with different interests in the fold, so to speak, SAP has become the IBM of the new millennium, with a burgeoning portfolio of disparate offerings that cover a broad swathe of industry segments and functionality – a portfolio of solutions that is even larger when seen with the funding by SAP Ventures.

John Appleby actually gave a pretty good summary of the current play in his May 2014 assessment on SAP’s SCN of the implications of CTO Vishal Sikka’s departure for SAP AG and its focus and innovation.

It is hardly surprising then that partners and customers alike struggle to sift through one another’s exact positions in order to find common ground – I for one know that getting to a mutual understanding of need and solution, and clarity in that understanding, is becoming increasingly difficult.


March 31, 2014  12:13 PM

Finding a clear path through consistency

Clinton Jones

Traveling abroad brings with it a series of experiences that can be both frustrating and surprising. This past weekend I had the opportunity to visit one of the lands of the Vikings, Norway. Although I only stayed in Oslo, I did have the opportunity to travel on the three principal local public transport systems: buses, trams and the metro.

Just as in other large metropolitan cities, Ruter, as it is known, coordinates, orders and markets public transport in Oslo and Akershus. All services are performed by various operating companies that work by contract for Ruter, and by NSB for local trains – all within the same ticket and price system.

This consistency in the ticket and price system is great! From an end user experience perspective it works for the most part, but for out-of-towners there can be some challenges. The first of these is that the buses come in a variety of colours: red, neon green, and bicolored red and yellow. The red ones are local buses, criss-crossing Oslo and providing links to all areas not served by one of the other forms of transport.

The green buses are regional buses, travelling further afield and generally starting or finishing at the Oslo bus terminal. An important distinction to note is that on the green regional buses you must enter at the front and show your ticket, stating if you will be travelling further than the city limits.

On the red local buses you can enter using any door, and you need only scan your ticket to validate your period pass or pay for a single journey. So here we hit the first inconsistency: if you don’t know this, you could have an embarrassing time.

Like many of the transit systems these days, Ruter has upgraded from the old paper ticket to the impulskort, which contains an RFID loop. Tagging cards is cheap enough these days that this is the best way to provide a semi-durable ticket. Another issue that arises with the impulskort is that on some buses the LCD display houses the reader, and on others the reader is housed in a separate scanning pad.

For those not familiar with both technology environments this too can be embarrassing or downright awkward and can compound delays on getting on the bus.

Types of readers on the Ruter system

Much has already been said regarding the UI issues associated with working with ERPs, and a new generation of applications is being churned out in the form of Fiori applications that essentially enable business users to leverage, on a mobile device or in a browser, the same functionality that historically has been largely available only via the SAP GUI, the NWBC or the SAP Portal.

This mostly tells us that legacy methods are acknowledged as less than ideal, and there is an expectation that something fresher and more contemporary should be available for use – and here’s where the rub sets in.

This is not to say that those old paper tickets didn’t get the job done, just that they were vulnerable to forgery and inappropriate reuse, inflexible, and not very useful for statistical reporting. In many respects these same deficiencies of old paper tickets apply to modern day systems.

We want more control, more flexibility and better reporting; however, the way we make all of those mechanisms possible and facilitate those functions must still be inherently usable.

This is a balance that is not easy to find – I refer to the overly complex solutions as solutions that get you stuck in a deadwood forest. If you fail to address the concerns of an aging UI though, you land in a wasteland like ‘death valley’ where systems don’t keep up with the complexity and demands of business and processes.

There is a clear path for automation in the absence of UI redesign, though not necessarily one that is always obvious – a path that requires understanding the dimensions and maturity of people, process and technology, but also the context in which your business processes function.

For finance operations using SAP this means leveraging the skill sets of the people, tenured or immature, along with the most commonly used tools like workflow, Microsoft Excel and web forms. Products like those from Winshuttle can support you in this. Most particularly it means endorsing the position that your SAP ERP is your system of record – a hungry and demanding system with high expectations in terms of data inputs.

Much has been made of the expectations of the millennial workforce, and of course they will be the knowledge workers of tomorrow – they will expect better systems than those of the past, and they will want high performance, a slick UI and useful, responsive reporting. This can only be achieved by re-architecting the user experience, and undoubtedly Fiori is on that path.

In parallel we have to think of all the other aspects of working with systems, the need to move away from manual data entry and closer to integrated systems. This is a hard expectation to meet when there’s a counter-pull suggesting that best-in-class solutions are more effective and less costly.

In the evaluation of an approach to finding the clear path, you need vision and an idea of the lie of the land as it is today, but you also have to accept that no single solution is likely to fit all needs.

Challenge your assumptions regarding the solutions that you think are most appropriate, see what your competitors and peers are doing and don’t be shy to take on something new especially if it is low risk.


March 26, 2014  5:27 PM

The SAP General Ledger : moving from Classic to New

Clinton Jones

I just returned from SAP Insider Financials 2014 in Orlando, and apart from attending a number of SAP customer speaking sessions I also attended several sessions conducted by SAP. Part of the agenda for these sessions was to encourage SAP customers to evaluate the NewGL.

I had previously taken a look at Johannes Le Roux’s slideshare deck on migration to the NewGL and noticed on slide 16 that he described reporting gaps in the migration that could be addressed by Excel-based reporting tools like E-Server and Spreadsheet Server.

I guess in my mind the question was: is that a practical way to address the reporting challenge, and should companies consider converting historical data, or is that seen as impractical and risky? Additionally, what are businesses really trying to report?

In a follow-on dialog, Le Roux indicated that history migration is not an option, so you will need some type of database to do comparison reporting – BW, HANA etc. He felt unsure that Microsoft Excel would work unless you download all the historical and current data to a central server datamart.

I also know that in discussions with Winshuttle customers, a good number have already made the move to the NewGL, but a great many are also still using Classic and have no upcoming plans to switch, for this and other reasons.

That said, a lot of the impetus behind evaluating the NewGL is driven by the convergence of accounting standards, in particular the adoption of IFRS instead of, or in parallel with, GAAP or local GAAP.

The push to the NewGL

The adoption of IFRS requires impacted enterprises to report under IFRS and local Generally Accepted Accounting Principles (GAAP) in parallel for a period of at least one year before the final change-over date.

This requires multi-GAAP accounting and reporting capabilities in the finance systems, which can be a challenge with the Classic GL.

Embedding IFRS changes into the ERP is the preferred option in the long term. There are alternative approaches that involve managing IFRS changes outside the ERP system, but these would often be considered less acceptable from an audit standpoint.

An alternative to changing the SAP system and adopting the NewGL is to post local GAAP adjustments to an IFRS-configured system. This approach helps adoption by making top-side adjustments and is potentially cheaper and faster to implement than the other two possible options for SAP systems, as no software upgrade is required.
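
In spirit, a top-side adjustment is just an extra journal entry layered over the local GAAP balance to restate it under IFRS, as in this deliberately simplified sketch; the accounts and amounts are invented, and real adjustments carry full audit detail:

```python
# Simplified top-side adjustment: local GAAP balances + IFRS restatement entries.
local_gaap = {"development_costs_expensed": 100_000, "intangible_assets": 0}

# Under IAS 38, qualifying development costs are capitalized rather than expensed,
# so the IFRS view posts an adjustment pair against the local GAAP balances.
adjustments = [("intangible_assets", +100_000),
               ("development_costs_expensed", -100_000)]

ifrs_view = dict(local_gaap)
for account, amount in adjustments:
    ifrs_view[account] += amount

print(ifrs_view)  # {'development_costs_expensed': 0, 'intangible_assets': 100000}
```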

The problem with this approach is the impact on GL close process timing. There is extensive manual compilation outside the SAP application to determine journal entries, there is a need for controls to pass audits, and the SAP system must be customized in order to create and feed data at an appropriately accurate and granular level. This is one of the reasons that external reconciliation and period-close solutions like Trintech and BlackLine are so popular.

This approach further entails a high cost in sustaining reporting and reconciliation – so retaining the Classic GL is quick to implement but difficult to sustain in the long term, as it is mostly manual and impacts close processes and reconciliations.

From 2005 onwards, consolidated financial statements of listed European companies have had to comply with IFRS (IAS). Many German companies began adopting those standards in the 1990s, on a voluntary basis, because of their need to access international capital funding. Spanish companies, by contrast, were not permitted to adopt IFRS before 2005. So for some companies at least, the matter is a moot point, but for the remainder, plans will need to be put in place to get ready for a move at a fiscal year end.

What’s in a name?

The NewGL is also known as FlexGL, New-GL and Flex-GL. The official name of the parallel ledger solution is the SAP General Ledger, and it enables combining the Classic, Profit Center and Special Purpose Ledgers into a new environment.

Several sessions at the Financials show specifically targeted conversations around the NewGL, but also made mention of the upcoming Smart Accounting that is to be offered with HANA.

On HANA the new accounting model will be referred to as Smart Accounting or Smart Financials though the final descriptor is not yet set. On HANA the promise is lightning fast continuous reconciliation.

You can read more on this by checking out this supplementary content:


December 6, 2013  10:21 AM

Predictions for 2014, the internet of things, predictive analytics and SAP’s HANA as a DBaaS

Clinton Jones

A few events this week triggered some thoughts about where HANA and in-memory computing should be considered in the context of your overall SAP ERP landscape.

As many will know, SAP is officially positioning itself as a premier cloud provider, achieved through cloud-based product offerings that include Ariba, SAP Business ByDesign (ByD), HANA in the cloud, SuccessFactors and other products in the SAP cloud portfolio.

SAP AG’s CEO, Jim Hagemann Snabe, spoke at Credit Suisse’s 2013 Annual Technology Conference this week and claimed that the cloud makes SAP more agile and able to compete with small companies and startups. This is not to say that SAP turns its back on on-premise enterprise customers – they will always exist – rather that SAP sees the cloud as offering 100% recurring revenue as opposed to what it refers to as the “traditional business model”, which yields only 22% recurring revenue. For SAP, ‘the Cloud’ is therefore a big roll of the dice, but also where it sees the biggest opportunities.

Surprisingly, at Winshuttle we see new installations of SAP continue to pop up on the radar. This is surprising because best-of-breed software vendors constantly try to convince the market that ERP is dead – claims that continue to be undermined by business decisions. Perhaps more surprising is the fact that these new adopters of ERP run the full gamut of organization sizes and industries.

For these new adopters of SAP it makes sense to go straight to HANA as the platform, and possibly even ERP in the cloud. HANA as the underlying database makes sense principally because SAP and SAP BASIS administrators have always hated Oracle, SQL Server and other database administrators playing around with the underlying database at the RDBMS level. The preferred way to administer the database has always been through the BASIS layer.

With SAP having engineered and essentially ‘owning’ the database component, everything now falls to BASIS administration and operating system administrators, with the DBA essentially made redundant. So if you’re implementing SAP for the first time, HANA is likely to be your first choice of database platform – a difficult position to dispute when big guns like Gartner even position SAP as a market leader in operational database management systems.

Perhaps more significant is the fact that HANA itself doesn’t require SAP’s ERP to be present to be useful. Many of SAP’s stories for HANA, old and new, revolve around its applicability even in the absence of ERP, particularly for data coming from things like SCADA systems and, potentially, the ‘internet of things’, alongside the traditional business transactional data commonly found in ERP.

SAP is making big noise about big data and the pertinence of HANA and the SAP analytics tools that can be connected to it. So much so that there is a great deal of thought leadership being driven out of SAP by folks like David Ginsberg, SAP’s Chief Data Scientist, and Shekhar Iyer, Global VP of Business Intelligence and Predictive Analytics, who promote not only HANA but also SAP analytics services and the emergent role of the data scientist. One of the biggest areas of special interest in this space is predictive analytics.

While legacy on-premise customers contemplate where HANA fits into their technology mix, I’d encourage them to give serious consideration to the fact that HANA is accessible in several different ways: as SaaS, on premise as an appliance, and theoretically on generic hardware. As Eby-Brown so eloquently described in their ‘Journey with HANA’ webinar this week, they were able to demonstrate not only that certain ERP processes could be executed faster on HANA but, more importantly, that they could execute some functions that had previously been impossible due to the nature of their data. SAP even offers an Amazon cloud instance that can be ‘played with’ for testing.

So, for those still skeptical about the relevance, consider loading data from pretty much any system that holds large tracts of data and determine whether HANA isn’t a better-performing alternative to your existing BI/BW or DataMart environment. It is hard to argue against the potential when market leaders like SAS have officially bought into HANA as a credible alternative platform for the data repository.
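If you want to run that experiment, the mechanics are modest. Below is a rough sketch using hdbcli, SAP's Python driver for HANA; the host, credentials, and the SALES_HISTORY table with its four columns are all illustrative placeholders, and the incumbent system's extract is assumed to be a plain CSV file.

```python
# A rough sketch of a do-it-yourself HANA trial, assuming SAP's hdbcli
# Python driver. Host, credentials and the SALES_HISTORY table are
# illustrative placeholders, not a real schema.
import csv
import time

from hdbcli import dbapi

conn = dbapi.connect(address="hana-trial-host", port=30015,
                     user="DEMO", password="...")
cur = conn.cursor()

# Bulk-load a CSV extract from the incumbent system in batches.
# (Values arrive as strings; a production load would cast types or use
# a server-side import instead.)
with open("sales_history.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == 10_000:
            cur.executemany(
                "INSERT INTO sales_history VALUES (?, ?, ?, ?)", batch)
            batch = []
    if batch:
        cur.executemany(
            "INSERT INTO sales_history VALUES (?, ?, ?, ?)", batch)
conn.commit()

# Time a representative aggregate.
start = time.perf_counter()
cur.execute("SELECT region, SUM(net_value) FROM sales_history GROUP BY region")
cur.fetchall()
print(f"HANA aggregate took {time.perf_counter() - start:.3f}s")
conn.close()
```

Run the equivalent aggregate against your incumbent BW or DataMart and compare wall-clock times; the comparison is crude, but it is usually enough to tell you whether a deeper evaluation is warranted.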

If my crystal-ball skills are any good, 2014 will see more migrations to and implementations of HANA, along with more evaluation of HANA for the predictive analytics side of business intelligence, particularly as Database as a Service (DBaaS), thereby minimizing businesses’ need to stand up new on-premise infrastructure to support fast, large-scale analytics. The fundamental question is going to be whether it is practically possible to populate these cloud-based databases with the piles of data that businesses have, to date, hosted in-house.
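To put some illustrative numbers on that question (these are back-of-envelope figures, not a benchmark): a 5 TB on-premise data set is roughly 4 × 10^7 megabits, and over a dedicated 100 Mbps uplink that works out to about 4 × 10^5 seconds, or around four and a half days of continuous transfer, before allowing for compression, throttling or retries. Whether that is tolerable depends largely on whether the load is a one-time seeding exercise or a recurring feed.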


October 21, 2013  1:56 PM

The rise of the Process Efficiency Tools (PETs)

Clinton Jones

Roughly a year ago I commented on the fact that ERP solutions run the risk of being implemented and run in relative isolation from the larger requirements of the business. I described ERP as an island, and it is a phenomenon that continues even with new implementations today. The causes are numerous, from the cost of integration to flaws in the implementation methodology and overall philosophy; probably the worst is when the ERP implementation is treated as an IT project.

As a consequence, folks like the sales team, engineering and, yes, even accounting end up investing in supplementary technologies that help them simply get their jobs done. It could be argued that if you look at offerings from companies like Salesforce.com, IBM’s Maximo, WorkForce.com and others, the only reason they exist is that the core ERP system lacks either functionality or usability.

SAP doesn’t work the way I need it to

Arguing that SAP lacks functionality is a hard claim to swallow, given that SAP has such a large customer base across such a dizzying array of industry segments. However, it would be true to say that it lacks certain ways of working that are more attuned to how salespeople, maintenance engineers and human resources specialists would like to work. Any organization that has committed to ERP has to maximize the value that ERP potentially represents, and this means you may have to consider spending more money on more things…

In my organization, Winshuttle, we have similar challenges. We have an accounting system which integrates with our customer sales system, and for a while we used the service aspect of this system, but to use it more effectively we found that customization was an activity we would inevitably have to undertake.

In an ideal world with deep pockets and unconstrained timelines we would likely always choose best-in-class to achieve our objectives, but since that cannot be our reality we have to look pragmatically at what we can afford. Microsoft Excel gets many organizations over some significant operational hurdles without forcing too many compromises.

Excel and SAP don’t really play nicely together

The problem is that, natively, SAP and Excel don’t really play nicely together, and when they do, it isn’t a consistent experience.

In every organization I have worked in, Excel has been the mainstay that business users default to when they need to collate, track or present certain kinds of monetary or statistical information. Even in the marketing organization, loading up leads from trade events usually comes through a workbook or spreadsheet. Excel is ubiquitous; trying to deny its relevance, or pretending that you can live without it or something like it, is a fool’s errand, and trying to eliminate its use is pointless if you are not offering meaningful and appropriate alternatives.

MS Office is a PET

Microsoft Office itself is a PET. After all, it allows you to create documents in a relatively consistent format without having to suffer the clatter of typewriter iron with sheaves of crisp white cartridge and shimmering carbon paper. Indeed, MS Office is even pitched as well aligned with Six Sigma, and you can learn how to align your MS Office use with this operational improvement and effectiveness philosophy in mind. For some amusing anecdotes, tips and comments on using MS Office efficiently and effectively, read through Annik Stahl’s ten years’ worth of posts as the Crabby Office Lady columnist.

Business intelligence tools like BI Excellence come with an integrated ability to export to Microsoft Excel, and a growing number of other technologies on the market support loading data into your core system of record from formats that avoid both the traditional transaction entry screens and the development of costly application interfaces, alongside newer user experience options like Fiori and SAP Screen Personas. What matters most is the way you allow users to use Microsoft Office in the workplace together with your ERP: managing risk, ensuring compliance and streamlining operational use.
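As an illustration of what loading data into the system of record without the transaction screens can look like, here is a minimal sketch that reads a leads workbook with openpyxl and pushes each row into SAP over RFC using the open-source PyRFC bindings. The connection details and the Z_CREATE_LEAD function module are hypothetical stand-ins; the real call depends entirely on your business object and release, and commercial PETs aim to deliver this plumbing without custom code.

```python
# A minimal sketch, not a vendor's method: read a leads workbook and push
# each row into SAP over RFC. Connection details and Z_CREATE_LEAD are
# hypothetical; substitute the BAPI or custom RFC that fits your object.
from openpyxl import load_workbook  # reads .xlsx workbooks
from pyrfc import Connection        # SAP's open-source Python RFC bindings

conn = Connection(ashost="sap-host", sysnr="00", client="100",
                  user="RFC_USER", passwd="...")

wb = load_workbook("trade_show_leads.xlsx", read_only=True)
ws = wb.active

# Assume row 1 is a header: name, email, company.
for name, email, company in ws.iter_rows(min_row=2, values_only=True):
    result = conn.call("Z_CREATE_LEAD",
                       NAME=name, EMAIL=email, COMPANY=company)
    print(result)

conn.close()
```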

If you have any doubts about the usability and effective use of your ERP system today, be aware that all is not lost. You do have the opportunity to improve things; all you need to remember is that despite spending hundreds of thousands or even millions of dollars on its implementation, there are still some missing components in the landscape. The components you are missing primarily focus on making people, not technology, more efficient when working with ERP.

When behemoths like SAP were originally conceived, technology was expensive and people were relatively cheap. Today that is no longer as true, yet efficiency tools don’t generally come bundled out of the box with the solution; you have to invest in them. If you don’t invest in PETs, you will be investing in customization.

