Traveling abroad brings with it a series of experiences that can be both frustrating and surprising. This past weekend I had the opportunity to visit one of the lands of the Vikings, Norway. Although I only stayed in Oslo, I did have the opportunity to travel on the three principal local public transport systems: buses, trams and the metro.
Just as in other large metropolitan cities, Ruter# (as it is known) coordinates, orders and markets public transport in Oslo and Akershus. All services are performed by various operating companies that work under contract for Ruter, and by NSB with local trains – all within the same ticket and price system.
This consistency in the ticket and price system is great! From an end-user experience perspective it works for the most part, but for out-of-towners there can be some challenges. The first of these is that the buses come in a variety of colours: red, neon green, and bicoloured red and yellow. The red ones are local buses, criss-crossing Oslo and providing links to all areas not served by one of the other forms of transport.
The green buses are regional buses, travelling further afield and generally starting or finishing at Oslo bus terminal. An important distinction is that on the green regional buses you must enter at the front and show your ticket, stating if you will be travelling further than the city limits.
On the red local buses you can enter using any door and need only scan your ticket to validate a period pass or pay for a single journey. So here we hit the first inconsistency: if you don’t know this, you could have an embarrassing time.
Like many transit systems these days, Ruter has upgraded from the old paper ticket to the impulskort, which contains an RFID loop. Tagging cards is cheap enough these days that this is the best way to provide a semi-durable ticket. One issue that arises with the impulskort is that on some buses the LCD display houses the reader, while on others the reader is housed in a separate scanning pad.
For those not familiar with both technology environments this too can be embarrassing or downright awkward, and can compound delays in boarding the bus.
Much has already been said in the past regarding the UI issues associated with working with ERPs. A new generation of applications is being churned out in the form of FIORI applications that essentially enable business users to leverage, on a mobile device or in a browser, the same functionality that historically has been available largely only via the SAP GUI, the NWBC or the SAP Portal.
This tells us that legacy methods are acknowledged as less than ideal, and that there is an expectation that something fresher and more contemporary should be available for use – and here is where the rub sets in.
This is not to say that those old paper tickets didn’t get the job done, just that they were vulnerable to forgery and inappropriate reuse, inflexible, and not very useful for statistical reporting. In many respects, modern-day systems suffer from the same deficiencies as those old paper tickets.
We want more control, more flexibility and better reporting; however, the way that we make all of those mechanisms possible and the way we facilitate those functions must still be inherently usable.
This is a balance that is not easy to find. I refer to overly complex solutions as ones that get you stuck in a deadwood forest. If you fail to address the concerns of an aging UI, though, you land in a wasteland like ‘death valley’, where systems don’t keep up with the complexity and demands of business and processes.
There is a clear path for automation in the absence of UI redesign, though not necessarily an obvious one – a path that requires understanding the dimensions and maturity of people, process and technology, but also the context in which your business processes function.
For finance operations using SAP this means leveraging the skillset of your people, whether tenured or immature, along with the most commonly used tools such as workflow, Microsoft Excel and web forms. Products like those from Winshuttle can support you in this. Most particularly, it means endorsing the position that your SAP ERP is your system of record – a hungry and demanding system with high expectations in terms of data inputs.
Much has been made of the expectations of the millennial workforce and of course they will be the knowledge workers of tomorrow – they will expect better systems than those of the past and they will want high performance, a slick UI and useful and responsive reporting. This can only be achieved by re-architecting the user experience and undoubtedly FIORI is on that path.
In parallel we have to think of all the other aspects of working with systems, the need to move away from manual data entry and closer to integrated systems. This is a hard expectation to meet when there’s a counter-pull suggesting that best-in-class solutions are more effective and less costly.
In evaluating an approach to finding the clear path you need to have vision and an idea of the lie of the land as it is today, but you also have to accept that no single solution is likely to fit all needs.
Challenge your assumptions regarding the solutions that you think are most appropriate, see what your competitors and peers are doing and don’t be shy to take on something new especially if it is low risk.
I just returned from SAP Insider Financials 2014 in Orlando, and apart from attending a number of SAP customer speaking sessions I also attended several sessions conducted by SAP. Part of the agenda for these sessions was to encourage SAP customers to evaluate the NewGL.
I had previously taken a look at Johannes Le Roux’s SlideShare deck on migration to the NewGL and noticed on slide 16 that he described reporting gaps in the migration that could be addressed by Excel-based reporting tools like E-Server and Spreadsheet Server.
In my mind the questions were: is that a practical way to address the reporting challenge, and should companies consider converting historical data, or is that seen as impractical and risky? Additionally, what are businesses really trying to report?
In a follow-on dialogue, Le Roux indicated that history migration is not an option, so you will need some type of database to do comparison reporting – BW, HANA, etc. He was unsure that Microsoft Excel would work unless you download all the history and current data to a central server datamart.
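Le Roux’s point can be made concrete with a sketch: once historical Classic GL balances and current NewGL balances have both been extracted to a central datamart, comparison reporting reduces to a full outer join on account and period. The account numbers and amounts below are invented purely for illustration.

```python
# Hypothetical comparison report: Classic GL history vs. NewGL balances,
# both extracted to a central datamart. Accounts and amounts are invented.

classic_gl = {  # (account, period) -> balance from the legacy extract
    ("100000", "2013-12"): 50_000.00,
    ("200000", "2013-12"): -12_500.00,
}

new_gl = {  # (account, period) -> balance reported by the NewGL
    ("100000", "2013-12"): 50_000.00,
    ("200000", "2013-12"): -12_000.00,
}

def comparison_report(classic, new):
    """Full outer join on (account, period), reporting the variance."""
    rows = []
    for key in sorted(set(classic) | set(new)):
        c, n = classic.get(key, 0.0), new.get(key, 0.0)
        rows.append({"account": key[0], "period": key[1],
                     "classic": c, "new": n, "variance": n - c})
    return rows

for row in comparison_report(classic_gl, new_gl):
    print(row)
```

In practice the join would run inside BW or HANA rather than in application code; the sketch only shows why a shared store holding both histories is a prerequisite for comparison reporting at all.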
I also know from discussions with Winshuttle customers that a good number have already made the move to the NewGL, but a great many are still using the Classic GL and have no upcoming plans to switch, for this and other reasons.
That said, much of the impetus behind evaluating the NewGL is driven by the convergence of accounting standards, in particular the adoption of IFRS in place of, or in parallel with, local GAAP.
The push to the NewGL
The adoption of IFRS requires impacted enterprises to report under IFRS and local Generally Accepted Accounting Principles (GAAP) in parallel for a period of at least one year before the final change-over date.
This requires multi-GAAP accounting and reporting capabilities in the finance systems which can be a challenge with Classic GL.
Embedding IFRS changes into the ERP is the preferred option in the long term. There are alternative approaches that involve managing IFRS changes outside the ERP system, but these would often be considered less acceptable from an audit standpoint.
An alternative to changing the SAP system and adopting the NewGL is to post local GAAP adjustments to an IFRS-configured system. This approach eases adoption by making top-side adjustments, and is potentially cheaper and faster to implement than the other two possible options for SAP systems, as no software upgrade is required.
The problem with this approach is its impact on the timing of the GL close process. There is extensive manual compilation outside the SAP application to determine journal entries, and there would be a need for controls to pass audits and for customization of the SAP system in order to create and feed data at an appropriately accurate and granular level. This is one of the reasons that external reconciliation and period-close solutions like Trintech and Blackline are so popular.
This approach also entails a high sustainment cost for reporting and reconciliation. So, retaining the Classic GL is quick to implement but difficult to sustain in the long term, as it is mostly manual and impacts close processes and reconciliations.
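To see why the top-side approach carries that sustainment cost, here is a minimal sketch (with invented account names and amounts) of deriving IFRS figures by layering adjustment entries over a local GAAP trial balance. Every such entry is something that must be compiled, controlled and reconciled outside the ledger at each close.

```python
# Hypothetical top-side adjustment: derive IFRS figures by layering
# balancing journal entries over a local GAAP trial balance.
# Accounts and amounts are invented for illustration.

local_gaap = {"fixed_assets": 100_000.0, "depreciation_exp": 20_000.0}

# Each top-side entry is compiled (often manually) outside the ERP
# and must net to zero across its lines to keep the books balanced.
ifrs_adjustments = [
    {"fixed_assets": 5_000.0, "depreciation_exp": -5_000.0},  # e.g. a useful-life difference
]

def apply_topside(trial_balance, adjustments):
    """Return an IFRS view of the trial balance with adjustments applied."""
    result = dict(trial_balance)
    for entry in adjustments:
        assert abs(sum(entry.values())) < 0.01, "unbalanced top-side entry"
        for account, amount in entry.items():
            result[account] = result.get(account, 0.0) + amount
    return result

ifrs_view = apply_topside(local_gaap, ifrs_adjustments)
```

The mechanics are trivial; the burden is that the list of adjustments, and the evidence behind each one, lives outside the system of record and grows with every period.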
From 2005 onwards, consolidated financial statements of listed European companies have had to comply with IFRS (IAS). Many German companies began adopting those standards in the 1990s, on a voluntary basis, because of their need to access international capital funding. Spanish companies, by contrast, were not permitted to adopt IFRS before 2005. So for some companies at least the matter is moot, but for the remainder, plans will need to be put in place to get ready for a move at a fiscal year end.
What’s in a name?
The NewGL is also known as FlexGL, New-GL and Flex-GL. The official name of the parallel ledger is the SAP General Ledger, and it enables combining the Classic, Profit Center and Special Purpose Ledgers into a new environment.
Several sessions at the Financials show specifically targeted conversations around the NewGL, but also made mention of the up-and-coming Smart Accounting that is to be offered with HANA.
On HANA the new accounting model will be referred to as Smart Accounting or Smart Financials, though the final descriptor is not yet set. On HANA the promise is lightning-fast continuous reconciliation.
A few events this week triggered some thoughts about where HANA and in-memory computing should be considered in the context of your overall SAP ERP landscape.
As many will know, SAP is officially positioning itself as a premier cloud provider, and this is achieved in a number of ways through cloud-based product offerings that include Ariba, SAP Business ByDesign (ByD), HANA in the cloud, SuccessFactors and other products in the SAP cloud portfolio.
SAP AG’s CEO, Jim Hagemann Snabe, spoke at Credit Suisse’s 2013 Annual Technology Conference this week and claimed that the cloud makes SAP more agile and able to compete with small companies and startups. This is not to say that SAP is turning its back on on-premise enterprise customers – they will always exist. Rather, SAP sees the cloud as offering 100% recurring revenue, as opposed to what it refers to as the “traditional business model”, which yields only 22% recurring revenue. For SAP, ‘the Cloud’ is therefore a big roll of the dice, but also where SAP sees the biggest opportunities.
Surprisingly, at Winshuttle we see new installations of SAP continue to pop up on the radar. This is surprising because best-of-breed software vendors try constantly to convince the market that ERP is dead, yet those claims continue to be undermined by business decisions. Perhaps more surprising is the fact that these new adopters of ERP run the full gamut of organization sizes and industries.
For these new adopters of SAP it makes sense to go straight to HANA as the platform, and possibly even ERP in the cloud. HANA as the underlying database makes sense principally because SAP and SAP BASIS administrators have always hated having Oracle, SQL Server and other database administrators playing around with the underlying database at the RDBMS level. The preferred way to administer the database has always been through the BASIS layer.
With SAP having engineered and essentially ‘owning’ the database component, everything now falls to BASIS administrators and operating system administrators, with the DBA essentially being made redundant. So if you’re implementing SAP for the first time, HANA is likely to be your first choice of database platform – a difficult position to dispute when big guns like Gartner position SAP as a Market Leader in Operational Database Management Systems.
Perhaps more significant is the fact that HANA itself doesn’t require SAP’s ERP to be present to be useful. Many of SAP’s stories for HANA, old and new, revolve around applicability even in the absence of ERP – particularly data coming from things like SCADA systems and potentially the ‘internet of things’, apart from the traditional business transactional data commonly found in ERP.
SAP is making big noise about big data and the pertinence of HANA and the SAP analytics tools that can be connected to it – so much so that there is a great deal of thought leadership being driven out of SAP by folks like David Ginsberg, SAP’s Chief Data Scientist, and Shekhar Iyer, Global VP of Business Intelligence and Predictive Analytics, who not only promote HANA but also SAP analytics services and the emergent role of the Data Scientist. One of the biggest areas of special interest in this space is predictive analytics.
While legacy on-premise customers contemplate where HANA fits into their technology mix, I’d encourage them to give serious consideration to the fact that HANA is accessible in several different ways: as SaaS, on premise as an appliance, and theoretically on generic hardware. As Eby-Brown so eloquently described in their ‘Journey with HANA’ webinar this week, they demonstrated not only that certain ERP processes could be executed faster on HANA but, more importantly, that they could execute some functions that had previously been impossible due to the nature of their data. SAP even offers an Amazon cloud instance that can be ‘played with’ to test.
So, for those still skeptical about the relevance: consider loading data from pretty much any system that has large tracts of data and, from that, determine whether HANA isn’t perhaps a better-performing alternative to your existing BI/BW or datamart environment. It is hard to argue against the potential when market leaders like SAS have officially bought into HANA as a credible alternative platform for the data repository.
If my crystal ball skills are any good, 2014 will see more migrations and implementations of HANA, and more evaluation of HANA for the predictive analytics modeling aspect of business intelligence, particularly as Database as a Service (DBaaS), thereby minimizing businesses’ need to stand up new on-premise infrastructure to support fast, large-scale analytics. The fundamental question is whether it is practically possible to populate these cloud-based databases with the piles of data that businesses have, to date, hosted in-house.
Roughly a year ago I commented on the fact that ERP solutions run the risk of being implemented and run in relative isolation from the larger requirements of the business – I described ERP as being an island – and it is a phenomenon that continues even with new implementations today. The causes are numerous, from the costs of integration to flaws in implementation methodology and overall philosophy; probably the worst is when the ERP implementation is considered an IT project.
As a consequence, folks like the sales team, engineering and, yes, even accounting end up investing in supplementary technologies that help them simply get their jobs done. It could be argued that if you look at offerings from companies like SalesForce.com, IBM’s Maximo, WorkForce.com and others, the only reason they exist is that the core ERP system either lacks functionality or usability.
SAP doesn’t work the way I need it to
Arguing that SAP lacks functionality is hard to swallow, by virtue of the fact that SAP has such a large customer base across such a dizzying array of industry segments. It would be true to say, however, that it lacks certain ways of working that are perhaps more attuned to the way salespeople, maintenance engineers and human resources specialists would like to work. Any organization that has committed to ERP has to maximize the value that ERP potentially represents. This means you may have to consider spending more money on more things…
In my organization – Winshuttle – we have similar challenges. We have an accounting system which integrates with our customer sales system, and for a while we used the service aspect of this system; to use it more effectively, however, we found that customization was an inevitable activity that we would have to undertake.
In an ideal world with deep pockets and unconstrained timelines we would likely always choose best-in-class to achieve our objectives, but since that cannot be our reality we have to look pragmatically at what we can afford. Microsoft Excel gets many organizations through some significant operational hurdles without making too many compromises in efficiency.
Excel and SAP don’t really play nicely together
The problem is that, natively, SAP and Excel don’t really play nicely together – and when they do, it isn’t a consistent experience.
In every organization I have worked in, Excel has been the mainstay that business users default to when they need to collate, track or present certain kinds of monetary or statistical information. Even in the marketing organization, loading up leads from trade events usually comes through via a workbook or spreadsheet. Excel is ubiquitous – trying to deny its relevance, or pretending that you can live without it or something like it, is a fool’s errand, and trying to eliminate its use is pointless if you are not offering meaningful and appropriate alternatives.
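That ubiquity is partly because a spreadsheet is the lowest-friction interchange format there is. As a minimal sketch of the trade-event scenario above (column names and data are invented), here is the kind of normalisation step that happens before leads from a workbook export ever reach a system of record:

```python
# Minimal sketch: read trade-event leads from a CSV export of a workbook
# and normalise them before loading into a system of record.
# Column names and rows are invented for illustration.
import csv
import io

raw = """name,company,email
Ada Lovelace,Analytical Engines Ltd, ADA@EXAMPLE.COM
Charles Babbage,Analytical Engines Ltd,charles@example.com
"""

def load_leads(fileobj):
    """Parse a CSV of leads, trimming whitespace and normalising emails."""
    leads = []
    for row in csv.DictReader(fileobj):
        leads.append({
            "name": row["name"].strip(),
            "company": row["company"].strip(),
            "email": row["email"].strip().lower(),  # normalise for de-duplication
        })
    return leads

leads = load_leads(io.StringIO(raw))
```

The point of the sketch is not the parsing itself but the gap it exposes: without tooling, every such file implies a manual re-keying or a one-off import into the core system.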
MS Office is a PET
Microsoft Office itself is a PET: after all, it allows you to create documents in a relatively consistent format without having to rely on the clatter of typewriter iron with sheaves of crisp white cartridge and shimmering carbon paper. Indeed, MS Office is even pitched as well aligned with Six Sigma, and you can learn how to align your MS Office utilization with this operational improvement and effectiveness philosophy in mind. For some amusing anecdotes, tips and comments on using MS Office efficiently and effectively, take a read of Annik Stahl’s (the Crabby Office Lady columnist) ten years’ worth of posts.
Business intelligence tools like BI Excellence come with an integrated ability to export to Microsoft Excel, and an increasing number of other business intelligence technologies on the market support loading data into your core system of record from formats that avoid the need to use traditional transaction entry screens or to develop costly application interfaces – things like FIORI and SAP Screen Personas. The way you allow users to use Microsoft Office in the workplace together with your ERP is the most important thing: managing risk, ensuring compliance and streamlining operational use.
If you have any doubts about the usability and optimized effective use of your ERP system today, be aware that all is not lost. You do have the opportunity to improve things; all you need to remember is that despite spending hundreds of thousands or even millions of dollars on its implementation, there are still some missing components in the landscape. The components you are missing primarily focus on making people, not technology, more efficient when working with ERP.
When behemoths like SAP were originally conceived, the technology was expensive and people were relatively cheap – today that is not as true, and out-of-the-box efficiency tools don’t generally come bundled with the solution; you have to invest in them. If you don’t invest in PETs you will be investing in customization.
Probably the most significant comment you will hear is that the products are a mish-mash of technologies without any specific cohesiveness, and without a particularly consistent set of underlying technologies or a consistent user experience. In some respects that is true; in others it is quite untrue. Consider an early post made on SCN back in 2004 by Karin Schattka, in which she plausibly demonstrated the apparent mish-mash of technologies in use even at that time being credibly rendered either in the NetWeaver Portal or subsequently in the NWBC.
Of particular importance is the fact that the acquired technologies display the most acute differences of all the products in the SAP portfolio, but even that is only a half-truth – many have ABAP as a core underpinning technology, or at least connect with ABAP systems using conventional methods or PI/XI interfaces, and now even Gateway can be used. The real challenge is that different user needs have to be served by different technologies and approaches, and innovation doesn’t have to be homed purely in the world of ABAP. For hard-core ABAP developers that would be preferable, and from an operations perspective, avoiding an army of disparate skillsets is good for keeping IT headcount down. But it seems SAP has a plan for changing and improving all that.
One could easily critique the differences between ERP, CRM, APO and SRM, but at their core they share fundamental design aspects that remain somewhat aligned with the original concept of the R/3 Business Suite and its subsequent versions. Products like B1 for the SMB market, Ariba, SuccessFactors, ByDesign, MDM, StreamWork, Syclo, Sybase and the BusinessObjects BI portfolio, however, are so wildly different from the original concept that one wonders why they were ever acquired to be part of the total portfolio offering.
5 Core Focus Areas
Just as an industrial giant like BMW can produce motorcycles, automobiles, pick-up trucks, goods trucks, buses and aeroplane engines, so too can a software company have a portfolio of products that are not necessarily all fully integrated and not necessarily all complementary – some products will be built for specific segments without synergies with others. Having a spread of technologies moves the business closer to being a one-stop shop for all the technologies a given business might need.
SAP today very clearly goes after five areas: the classic Business Applications, Databases and pure technology solutions, Analytics, the Cloud, and Mobile.
All areas overlap in some respects, but for newcomers to the vendor the main draw is often the business applications for which SAP is most renowned. Of late, much has been made in the press of the in-memory computing option of SAP HANA as an alternative database to contenders like the Oracle and Microsoft SQL Server RDBMSs, but in the less publicized circles of Human Capital Management and Human Resources, SAP made big waves with the acquisition of SuccessFactors.
My recent visit to Budapest for the Deloitte Shared Services Conference, courtesy of Winshuttle, involved a number of conversations with leaders in the HCM space for multinationals. When questioned on whether they had SuccessFactors in play or were planning to implement it, many of the responses were ‘no’ or ‘we’re going to wait and see’. When I probed further, it seemed that integration was still the area they were most uncertain about. In-memory computing didn’t come up, though it is probably in there somewhere. Consider why, and you’ll understand that business users, who were the primary attendees at the conference, don’t care about the underlying machinery; they care about the comfort of the ride.
Those that had implemented SAP’s Business Applications HR modules, for example, didn’t see any changes coming that would be of particular concern, and of course those who had implemented the original version of SAP’s performance management module were generally unimpressed. The SuccessFactors acquisition, though, presented a whole raft of new opportunities that were constantly under consideration but not yet decided upon; there was no compelling event as yet forcing their hand. Does that mean they’re holding back on SAP investment? Quite the contrary: they’re exploring additional avenues that may or may not involve expanding existing uses or adopting different SAP products.
In selecting any technology it seems that enterprises look at a number of variables among them:
- The business need
- What the market is offering
- The reputation of the vendor
- The cohesiveness of the vendor’s product portfolio
- What peers are doing
Although the last point is often claimed to be a non-criterion, we know from past experience that seeing a competitor with a particular technology spawns envy, and as a consequence observing and identifying what peers and competitors are doing and investing in is part of the evaluation criteria. Many companies, it seems, are seriously looking at HANA as an underpinning technology so that they can run their systems more cheaply, deal with one vendor (one throat to choke) and improve ERP and analytics performance. If it’s in the cloud, great, but in all likelihood a hybrid approach of mixed cloud and on-premise technologies will do.
The first three variables in the list speak for themselves, and of course SAP has a strong brand and reputation in the market, but cohesiveness is where things seem to unravel a little. Analyst Elizabeth Hedstrom Henlin of TBR wrote some time back that “SAP is in a strong position across core and new addressable markets.” The portfolio addresses diverse customer needs and is backed by some proof points in support of integration. Henlin also said she sees SAP’s mobile division as the heart of SAP’s growth to date, with the acquisitions of Syclo and Sybase serving as the core of an applications-led, horizontal approach to mobility that touches and integrates SAP’s diverse portfolio and go-to-market strategy. So back then, in 2012 at least, it seems mobile was the flavor of the day – and logically that won’t have changed: even if you change core underlying technology like the database, even if you push infrastructure to the cloud, even if you add analytics, you still need a user experience, and in all likelihood mobile needs to be part of it. So there is some clear logic to having a mobility offering that works seamlessly with all of the others.
Herding the developer cats – Vishal Sikka is the man!
As recently as July, Henlin stated that SAP’s alignment of development entirely under Vishal Sikka brought the promise of increased speed in the expansion and strategic growth of the product portfolio, and that the ongoing restructuring of SAP’s product teams to integrate on- and off-premises resources was the critical asset for SAP to sustain growth into 2014. So what did all that mean? Consolidation?
Just like every technology company, SAP’s engineering teams suffer from the bipolarity of wanting to work on the cool stuff while also needing to fix and maintain the old stuff. The latter is uncool but necessary, and doesn’t appear to be part of Sikka’s portfolio right now. It is important to mention sustainment, though; after all, it is the maintenance revenue stream that pays for the fixing of old stuff, and that is a large chunk of change in SAP’s annual coffers.
Sikka’s realignment of SAP’s R&D functions is likely focused on the new hip stuff, but there’s a need for oversight of sustainment too, so in all likelihood there’s an element of dominion in his role there as well.
Let’s look at another couple of pups in the litter.
Enrico Camerinelli’s comments on Ariba seem to align with the above, though he suggests that SAP’s acquisition of Ariba was purely about driving existing shareholder value. Ariba’s pre-acquisition shareholders enjoyed a one-off premium of nearly 20% over the company’s share value when it was acquired, and SAP secured a specialized and highly populated B2B network into which it was able to expand its market reach. He feels, then, that there isn’t really a commitment to facilitating any kind of integrated B2B financial supply chain; his comments are based on SAP’s inability to articulate a clear strategy on how it intends to leverage the Ariba acquisition for financial services.
HANA, on the other hand, continues to be where the big focus will be, while the remaining products in the portfolio will be offered but given less consideration in the media and show-and-tells. Greg Chase commented this week that more than a third of the offerings at the upcoming SAP TechEd Las Vegas – a stunning 263 items – are HANA-related, with a big focus on Analytics, Big Data and In-Memory Computing. Only 5 mention Ariba, and 23 SuccessFactors.
KXEN doesn’t feature at all, but this is hardly surprising given that the acquisition is only recent news. KXEN technology at least has pertinence in the analytics, big data and HANA play.
With SAP’s portfolio continuing to grow, one can only wonder at the spectacles Sikka will be able to pull out of the proverbial magician’s hat as he continues to remould R&D at SAP. Perhaps these puppies aren’t as ugly as you think, and what we have is the establishment of a hunting pack positioning itself to present businesses with a new perspective on how they can buy and use SAP and enterprise technology to be successful.
Apart from the large presence of Deloitte consultants and partners, there were of course many companies represented and many of them spoke about the ways in which their individual businesses were transforming their business operations through the use of Shared Services models, Business Process Outsourcing or the implementation of the concept of Global Business Services.
I had a chance to have a number of conversations with company representatives and consultants alike but I also had the chance to talk to a couple of vendors and representatives of development agencies and discovered some interesting facts that I thought I would share.
Top of mind was the revelation from MIDAS that the greater Manchester area is home to more than 7 million people within an hour’s commute, and that over 30 shared service centres are located in the area, among them Aegis, Sodexo, RBS, BUPA and Worldpay.
SAP customers located in the area include Northern Rail, Smith & Nephew, Kellogg Company, HJ Heinz, AstraZeneca, Premier Foods, Japan Tobacco and of course Sodexo, as well as a number of consulting groups.
Another important conversation I had was with a couple of the vendors supporting SAP Finance groups focused on the Record-to-Report process, with a particular concern for accelerating and improving the period close. Verifying the accuracy of the financial close process and ensuring the integrity of financial statements are two important corporate challenges, and they are best addressed with automation as an aid.
I have long been an advocate of trying to encourage organizations to automate as much of the finance process as possible. Of course in mature organizations that are heavily regulated or have a great deal of external accountability there is often a considerable amount of automation either with tight integration via configuration of the SAP modules or through the building of interfaces between various systems.
In the SAP ecosystem several key players provide keenly positioned solutions that help businesses close the manual entry gap more effectively; among these are Runbook, Chesapeake, Trintech and Blackline, all of whom were at the conference. Comparisons among them can be challenging: each brings different strengths that may matter to organizations at different levels of finance automation maturity.
Trintech, for example, offers an integrated solution beginning with reconciliations and continuing through the close, risk management, compliance and financial reporting, with flexibility for current practices, including configurable templates, workflow and task management. There is some degree of standard and ad hoc reporting capability as well. In the financial governance software market, the competitive landscape varies by product line. No single competitor faces Trintech across all product lines: primarily they compete with Chesapeake Systems and BlackLine Systems in reconciliations, Paisley Consulting and OpenPages in the Sarbanes-Oxley compliance market, and MedAssets in the healthcare market.
BlackLine is a well-established SAP partner that runs with the tagline ‘No More Bullsheet’; it positions itself on the basic premise that you shouldn’t be using spreadsheets as part of your reconciliation and close process. Its subscription-based offering provides workflow, reconciliation review and approval, task lists used both for the close and for the preparation and delivery of schedules to stakeholders like auditors, who can download the analyses directly through their own account on the system, as well as other functions.
Runbook provides workflow to minimize the dependence on spreadsheets, emails and phone calls, arguing that using a mix of tools to close the period makes your financial close more frustrating and more expensive than it should be. Leading with SMARTClose for reconciliation, the product is described as flexible, intuitive and powerful: it allows businesses to take their existing process and apply best practices and reusable templates to transform their close.
Chesapeake System Solutions offers T-Recs for easing the financial close process by way of a rule-driven solution, similar in some respects to the other offerings. T-Recs claims to be able to reconcile any account based on customer-identified definitions of data characteristics, business processes, and data display templates.
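None of these vendors publish their matching engines, but the general idea of rule-driven reconciliation is easy to illustrate. The sketch below is a minimal, hypothetical example in plain Python, not any vendor's implementation: it pairs ledger entries with statement lines using two user-configurable rules, a reference match and an amount tolerance.

```python
# Illustrative sketch of rule-driven reconciliation; the field names
# and rules here are invented for the example, not a vendor's product.

def reconcile(ledger, statement, amount_tol=0.01, require_ref=True):
    """Pair each ledger entry with at most one statement line.

    Rules (user-configurable in a real tool):
      - references must match when require_ref is True
      - amounts must agree within amount_tol
    Returns (matched_pairs, unmatched_ledger_entries).
    """
    matched, unmatched = [], []
    remaining = list(statement)  # statement lines not yet consumed
    for entry in ledger:
        hit = next(
            (line for line in remaining
             if (not require_ref or line["ref"] == entry["ref"])
             and abs(line["amount"] - entry["amount"]) <= amount_tol),
            None,
        )
        if hit is not None:
            matched.append((entry, hit))
            remaining.remove(hit)  # each statement line matches once
        else:
            unmatched.append(entry)
    return matched, unmatched
```

A commercial product layers workflow, approvals and an audit trail on top of this kind of matching core; the rules themselves become configuration rather than code.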
Not all of them have strengths in all geographies, and accordingly they have different market stories. Pretty much all of them have a smattering of customers in the FTSE 1000 and NYSE 1000; however, unlike vendors in the OCR and scanning solutions space, the principal positioning of these companies is to cut down on unmanaged objects in the reconciliation process.
There have been many cries over the years to get business to move away from Excel, and yet it remains the dominant product of choice for the majority of bookkeepers and accountants. Despite its many weaknesses, it remains the most flexible choice, especially for businesses that are shy about spending yet more on their SAP environment or cannot justify expenditure on additional products with very specific functionality. So Excel often becomes the default ‘only choice’.
The issue revolves largely around unmanaged spreadsheets and the need to build and use good spreadsheet designs. A well-designed spreadsheet with good, robust integration to the backend ERP system and a governance wrapper can effectively achieve the same objectives.
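As a minimal sketch of what a "governance wrapper" can mean in practice, the hypothetical check below validates rows exported from a spreadsheet before anything is posted to the backend ERP system. The column names and the zero-amount rule are invented for illustration; the point is that every row either passes explicit checks or is reported with its line number.

```python
import csv
import io

# Hypothetical schema: required columns and their casting functions.
REQUIRED = {"account": str, "period": str, "amount": float}

def validate_rows(csv_text):
    """Return (valid_rows, errors) for a CSV export of a spreadsheet.

    Rows that fail a type cast or a business rule are reported with
    their line number instead of being silently posted to the ERP.
    """
    valid, errors = [], []
    reader = csv.DictReader(io.StringIO(csv_text))
    for lineno, row in enumerate(reader, start=2):  # line 1 is the header
        try:
            rec = {col: cast(row[col]) for col, cast in REQUIRED.items()}
        except (KeyError, TypeError, ValueError):
            errors.append((lineno, "missing or malformed column"))
            continue
        if rec["amount"] == 0:  # example business rule: no zero postings
            errors.append((lineno, "zero amount"))
            continue
        valid.append(rec)
    return valid, errors
```

The same idea scales up: version the validated file in an ECM system such as SharePoint, and you have traceability as well as validation.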
As one finance expert on LinkedIn wrote, accountants are like rock candy: “The bottom line is that if you cut an accountant in half, they have EXCEL written through the middle. That has remained a constant for years, and is unlikely to change, so maybe the focus should be on how to build great spreadsheets, rather than killing Excel…”
So my thoughts return again to my sponsors at the event and the fact that, for more than ten years now, Winshuttle has been advocating for Excel as the environment that much of business continues to rely on.
It is something that is unlikely to go away any time soon, and something that, even with the implementation of one or more of the technology solutions described above, will continue to be in active use. Leveraging robust ECM solutions like SharePoint together with SAP gives you the best of both worlds. The tight integration of Excel and SharePoint gives you a wealth of additional metadata, enabling you to leverage the source documents natively; that is something you often lose with an island system like a pure reconciliation solution. Yes, the latter may be best of breed or best in class, but does it actually offer you the flexibility to address what you consider the uniquely differentiating aspects of your business? Don’t automatically assume that Excel is bad, bad, bad; it has its place, and you can make it more robust and dependable.
The trick, as with anything, is getting the balance right between appropriate and proper use of tools to automate and smooth operational processes.
The launch of Fiori and Screen Personas by SAP represents an interesting commentary on how customers use SAP and on how SAP is actively pursuing initiatives to improve the user experience. For many years SAP has been a much-maligned product from a user-experience perspective, which is hardly a surprise given the complexity and functionality that the leading ERP product affords a dizzying array of industry segments. Fiori and Screen Personas could easily be thought of as alternatives to one another, but in reality they tackle different aspects of accessibility and usability. It is easy to dismiss this as “lipstick on the proverbial pig”, but when you consider the strategy and the actual application, the products provide much more.
A prerequisite for Fiori is the presence of NetWeaver Gateway. The applications are priced per user as a bundle, with some prerequisite licensing requirements.
Screen Personas tackles a different problem, namely the complexity of the existing SAP UI. A number of third-party products, custom developments and customizations have emerged over the years to simplify the overall user experience of core SAP functionality. Products like GuiXT on the SAP GUI client and Winshuttle for Microsoft Excel made large inroads in reducing the number of fields a user needed to touch in order to complete a transaction in SAP. With Screen Personas, however, group- and user-specific user-experience management can be implemented in a centralized way that leverages the existing organizational and security models in SAP.
SAP positions Screen Personas as “Personalization without Programming”. Using a simple drag-and-drop approach to screen design, users can modify many common SAP GUI screens to make them more usable and visually appealing. Before this product became available, custom development or a third-party product would have been the only way to simplify the user experience. Users access Screen Personas through the web browser or NWBC. SAP Screen Personas works on most standard SAP GUI Dynpro screens but does not work for Web Dynpro applications such as SAP SRM or the SAP CRM Web Client UI.
Prerequisites are the SAP Innovation Kernel, Release 7.21, and at present only English and German are supported. Personas is licensed on a per-user basis, with a user defined as anyone who uses a modified screen. There is no separate developer license for people who build Personas screens; this is included in the license. Authorizations are handled in the backend, and all authorization steps happen the same way whether you use Personas, the SAP WebGUI or the SAP GUI.
SAP recently announced that it would entertain involvement in its PartnerEdge program from partners who had previously never been involved in the SAP space. This apparently ‘new’ offering through its website differs from past partner programs in that it promotes very specific capabilities, namely to “help create and monetize innovative, specific applications in the mobile, cloud, database or high-performance in-memory areas”. It is clear from this statement that the targets are mobile, cloud and HANA.
While historically an on-premise software technology provider, SAP itself has produced a number of free mobile applications for a variety of uses, from Business Objects to order status inquiries from ERP. Downloading the applications is probably fairly straightforward, but getting them to talk to your back-end systems is another matter; you have to navigate the complex world of SAP security authorizations, externally facing ERP systems and so on.
Against the backdrop of more recent announcements regarding SAP system vulnerabilities, it is likely that uptake among these newcomers, and business adoption levels generally, will be low.
The SAP application landscape, like those of many enterprise back-office behemoths, suffers from a kind of innovation stalemate. When the business commits resources to enabling certain functionality, the likelihood of success is higher than when individuals try to create a groundswell of interest by leveraging peripheral or micro apps, like many of these mobile applications, to address edge-case business scenarios.
It will be interesting to see whether opening the proverbial floodgates to innovation partners with little or no prior experience of working with SAP and SAP systems will in fact make a difference. The landscape is complex, the technology quirky and old-fashioned in some respects, and yet the business dependence on all of it is immense.
The opportunities for those innovators who manage to crack the nut of addressing a real business need cost effectively and at the same time keeping IT and the Auditors at bay will be tremendous.
One of my colleagues frequently uses what he refers to as the buses-versus-trains analogy to describe some of the options companies have with regard to the integration-enabling technologies available for use with SAP.
Depending on the characteristics of the conversation you are having, the analogy works well on a number of levels.
On the one hand, a bus has a maximum capacity of around 100 people or fewer; on the other, commuter rail or even light rail has a much larger capacity, with some trains or trolley buses accommodating several hundred passengers simultaneously. To understand why, you have to look at the infrastructural requirements of railed versus wheeled transportation. Rail is often selected where a clearly defined commuter route services the majority. The choice between light rail and commuter rail is often bound up in the distances to be traveled, the speeds to be attained, the number of stops and so on. In a business process you can interpret this as something akin to variance or logic complexity. In the ABAP world you would look at IDocs, PI/XI integration and perhaps other complementary technologies like EDI, BizTalk, Tibco and so on. The point is that these approaches typically require some degree of heavy lifting to establish.
Heavy Capital Investment is not a prerequisite for SAP Integration
Defining a data route, putting down the supporting technology and then establishing a marshaling yard for trains of data that have issues or risk failure for any number of reasons is another aspect of the overall design that adds a further layer of complexity. For this reason, SAP’s Solution Manager monitoring and third-party solutions from companies like Seeburger exist to help the business keep track of what its integration layers are doing.
Another approach might be to deploy custom ABAP programs to achieve the integration. Many companies with long-established SAP infrastructure take this approach, developing a combination of ABAP, Web Dynpro or Java applications to keep data flowing from the many possible sources to the final target system. With these programming-intensive approaches, pivoting the process and adapting to a changing set of business requirements is relatively difficult. As with anything programmatic, it can of course be done; the question is how quickly, and with what level of commitment. Changing the technology that supports a given process also brings significant risk if unforeseen scenarios or outcomes arise.
Is support for SAP collaboration automation without programming really possible?
With that said, buses on wheels offer some significant advantages over railed transport in terms of the flexibility they afford a city to serve areas that are either not easily serviced by rail or don’t justify the capital investment in it. Additionally, buses can serve a smaller volume of transportation objectives without major expense. However, there comes a point where buses are not a practical approach, namely when requirements carry demanding throughput expectations: too many buses on the roads lead to congestion, which is only exacerbated when more buses are added. If a bus route services too few requests, the transportation authority can reevaluate the route, pivot the service area or discontinue the service; under such circumstances, the consumers of that service are forced either to self-drive or to fall back on lower-effort modes like cycling or walking, depending on what is most practical.
There are a couple of technologies that support the bus scenario when looking at usability and automation with collaboration in the SAP space, but they all have varying strengths and weaknesses. Among those worth looking at are Winshuttle, K2, BlackLine and Promenta.
In your evaluation process, though, consider some of the following factors and determine whether they are important as selection or buying criteria:
- Support of secondary technologies like SharePoint and Microsoft Excel – the ability to incorporate Microsoft Excel based data gathering processes may be important in the scenarios that you are looking to automate with collaboration
- Ease of deployment and maintenance
- Technical versus a functional focus – can this technology be deployed without major IT involvement? In this era of shrinking IT operations and support budgets a technology solution that requires extensive IT involvement may not meet your agility objectives. Is a “no programming” approach important to you?
- Proven history of deployments with contactable references
- Functional focus agnosticism – does your overarching objective for collaborative automation include a mix of functionalities that transcend any single functional area? Often the materials creation process for example, incorporates not only elements of data management but also finance, logistics, quality management and others.
- Scalability – how many differentiated processes would you be looking at and is this something you could deploy in an evolving way? When you make a commitment to a given technology are you required to build a fortress or a stockade? Sometimes a stockade will do.
- Workflow – what are your workflow requirements? Are they fully bound to SAP, or is workflow in SAP simply a nice-to-have when in reality you just need some kind of auditable workflow?
There’s a lot of talk that HANA adoption may be on the up-and-up and that SAP’s Q1 numbers were largely buoyed by a transformation in the mindset of IT. The “radical transformation of the industry” statement was made by SAP co-CEO Jim Hagemann Snabe in an interview on CNBC and reported by Forbes.
There should be some concern about this perception of IT’s role in servicing the business, though. Conceptually, renting someone else’s infrastructure to run IT operations, which is how cloud technology is positioned, isn’t really anything terribly new, is it? Data processing bureaux have been around almost since the first commercially viable computer systems. The nature of big businesses, though, is that they like control, and pushing their technology off to a third-party provider will not be palatable to many of them. It may be palatable to IT, perhaps, but ultimately not very appealing or of much interest to the business. How would IT convince the business to commit to a HANA investment, cloud-based or otherwise? The answer has to be simple: higher performance, scalability and a lower total cost of ownership (TCO). The cloud has the potential for the first two, but does it really mean a lower TCO?
SAP’s current incarnation of in-memory technology is something that all the major technology players have fiddled with to some extent. Most significantly, though, the recent proposal that businesses move their On-Line Transaction Processing (OLTP) systems entirely to in-memory technology seems extraordinarily expensive. Memory-laden systems are something Oracle has been pushing for some time with its own appliance offerings, and undoubtedly there is some sentiment that the whole HANA initiative is a direct attempt to undermine Oracle’s existing stronghold as the RDBMS supporting SAP ERP. But it is much more than that: it is a genuine acknowledgement that there is a lot of data in those systems and that current analytics approaches are not performing well enough to address business needs.
I won’t go into the benefits of memory-based OLTP versus a hybrid approach of volatile memory, disk caching and disk storage, but it is important to understand that there is an inevitability to this technology, and now is as good a time as any to get on board. Any aspiration IT might have of consolidating all its peripheral systems onto a single HANA instance is likely overly ambitious. Consider, too, that the technology is being positioned as if it were the best solution for everything, i.e. both OLTP and On-Line Analytical Processing (OLAP). See Martin Klopp’s article on consolidation and some unaudited HANA performance numbers; of particular interest may be the statement that “HANA inserts rows at 1.5M records/sec/core or 120M records/sec per node”. Those sound like screaming numbers, particularly if you are planning on using a product like Winshuttle Transaction with SAP ERP to push batched or pre-staged data into your OLTP system.
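The quoted figures are easy to sanity-check against one another; taken at face value, they imply an 80-core node:

```python
# Back-of-the-envelope check of the quoted (unaudited) HANA insert rates.
per_core_rate = 1.5e6   # 1.5M records/sec/core, as quoted
per_node_rate = 120e6   # 120M records/sec/node, as quoted

# If both numbers hold, the node in question must have this many cores.
implied_cores = per_node_rate / per_core_rate
print(implied_cores)  # 80.0
```

That the two numbers are internally consistent with a plausible core count is reassuring, but it is no substitute for an audited benchmark.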
The ‘solves all performance issues’ notion that IT may be entertaining is a little surprising, since many of the use cases hailed as incredibly beneficial have been squarely in the OLAP and analytics space, and more particularly in scenarios where there are no synchronization or latency issues between the OLTP data and the replica made accessible to the OLAP technology. For OLTP consumers, the expected benefits of in-memory technology are pretty clear: faster data retrieval, faster decision making and faster storage. The Farber et al paper “The SAP HANA Database – An Architecture Overview” describes HANA succinctly, but it also points out that some of HANA’s inherently advantageous characteristics are in fact its compression schemes, which mean less memory consumption and lower memory bandwidth utilization. Perhaps surprising is the suggestion that real-world OLTP actually doesn’t match the TPC-C profile. This has resulted in SAP’s Benchmark Council taking a different view on transaction-processing throughput and how it should be measured.
In their paper, Farber et al suggest that real OLTP workloads have a larger proportion of read operations than the standard TPC-C profile suggests, which is perhaps particularly true when one considers that changing and updating existing records in a given ERP system is very commonplace.
Those unfamiliar with the TPC-C benchmark may be interested to know something about the TPC. In 1988, the database and application vendors formed a consortium called the Transaction Processing Performance Council (TPC). The TPC’s goal was to reduce the benchmarketing hype and smoke being created by hardware, database and application vendors by defining a level playing field on which all vendors could compete and be measured. In 1989 the TPC-A benchmark was born; it defined metrics for performance in transactions per second (tps) as well as price/performance ($/tps). The TPC-C benchmark soon followed and continues to be a popular yardstick for comparing OLTP performance across hardware and software configurations.
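The two TPC metrics are simple ratios, which a made-up example makes concrete. The figures below are invented purely for illustration, a hypothetical $250,000 system completing 9 million transactions over a one-hour measured interval:

```python
# Hypothetical figures purely to illustrate the two TPC metrics.
system_cost = 250_000.0    # total system price in USD (invented)
transactions = 9_000_000   # transactions completed in the run (invented)
duration_s = 3_600         # measured interval in seconds (one hour)

tps = transactions / duration_s         # throughput in transactions/sec
price_performance = system_cost / tps   # cost per unit of throughput

print(tps)                # 2500.0 tps
print(price_performance)  # 100.0 $/tps
```

Vendors compete on both numbers at once: a faster system with a disproportionately higher price can still lose on $/tps.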
Wholesale porting of SAP ERP to HANA may not necessarily bring immediate performance benefits without modifications to the underlying application code; this is borne out by the raft of training being made available specifically for developing on HANA platforms, which suggests that some of the data retrieval logic may need to be reworked. A question, then, is what one should do in terms of a migration strategy when considering a HANA-based system as the next target for an ERP migration. Should you migrate your existing system as-is, and can you in fact do that without coding changes? Or should you rather meticulously plan to rework some of your existing code in preparation for a migration?
SAP’s key event, Sapphire, is coming up in Orlando next week, and when you’re having a HANA conversation this is certainly something you will want to ask.