A few events this week triggered some thoughts about where HANA and in-memory computing should be considered in the context of your overall SAP ERP landscape.
As many will know, SAP is officially positioning itself as a premier cloud provider, and this is achieved in a number of ways through cloud-based product offerings that include Ariba, SAP Business ByDesign (ByD), HANA in the cloud, SuccessFactors and other products in the SAP cloud portfolio.
SAP AG’s CEO, Jim Hagemann Snabe, spoke at Credit Suisse’s 2013 Annual Technology Conference this week and claimed that the cloud makes SAP more agile and able to compete with small companies and startups. This is not to say that SAP is turning its back on on-premise enterprise customers; they will always exist. Rather, SAP sees the cloud as offering 100% recurring revenue as opposed to what it refers to as the “traditional business model”, which yields only 22% recurring revenue. For SAP, ‘the Cloud’ is therefore a big roll of the dice but also where SAP sees the biggest opportunities.
Surprisingly, at Winshuttle we see new installations of SAP continue to pop up on the radar. This is surprising because best-of-breed software vendors try constantly to convince the market that ERP is dead, yet those claims continue to be undermined by business decisions. Perhaps more surprising is the fact that these new adopters of ERP run the full gamut of organization sizes and industries.
For these new adopters of SAP it makes sense to go straight to HANA as the platform, and possibly even ERP in the cloud. HANA as the underlying database makes sense principally because SAP and SAP BASIS administrators have always disliked having Oracle, SQL Server and other database administrators playing around with the underlying database at the RDBMS level. The preferred way to administer the database has always been through the BASIS layer.
With SAP having engineered and essentially ‘owning’ the database component, everything now falls to BASIS and operating system administrators, with the DBA essentially being made redundant. So if you’re implementing SAP for the first time, HANA is likely to be your first choice of database platform. That is a difficult position to dispute when big guns like Gartner position SAP as a Market Leader in Operational Database Management Systems.
Perhaps more significant is the fact that HANA itself doesn’t require SAP’s ERP to be present in order to be useful. Many of SAP’s stories for HANA, old and new, revolve around its applicability even in the absence of ERP – particularly for data coming from things like SCADA systems and potentially the ‘internet of things’, apart from the traditional business transactional data commonly found in ERP.
SAP is making big noise about big data and the pertinence of HANA and the SAP analytics tools that can be connected to it. So much so that there is a great deal of thought leadership being driven out of SAP by folks like David Ginsberg, SAP’s Chief Data Scientist, and Shekhar Iyer, Global VP of Business Intelligence and Predictive Analytics, who not only promote HANA but also SAP analytics services and the emergent role of the data scientist. One of the biggest areas of special interest in this space is predictive analytics.
While legacy on-premise customers contemplate where HANA fits into their technology mix, I’d encourage them to give serious consideration to the fact that HANA is accessible in several different ways: as SaaS, on premise as an appliance, and theoretically on generic hardware. As Eby-Brown so eloquently described in their ‘Journey with HANA’ webinar this week, they were able to demonstrate not only that certain ERP processes could be executed faster on HANA but, more importantly, that they could execute some functions that had previously been impossible due to the nature of their data. SAP even offers an Amazon cloud instance that can be ‘played with’ for testing.
So, for those still skeptical about the relevance, consider loading data from pretty much any system that holds large volumes of data and determining from that whether HANA isn’t perhaps a better-performing alternative to your existing BI/BW or data mart environment. It is hard to argue against the potential when market leaders like SAS have officially bought into HANA as a credible alternative platform for the data repository.
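A low-risk way to run that kind of comparison is a small timing harness that executes the same aggregate query against each back end and records the best elapsed time. The sketch below is illustrative only: it uses an in-memory SQLite database as a stand-in, and the table and query are invented for the example. Against a real landscape you would pass in DB-API connections to HANA (e.g. via the `hdbcli` client) and to your existing datamart instead.

```python
import sqlite3
import time

def time_query(conn, sql, runs=3):
    """Run `sql` several times on a DB-API connection; return best elapsed seconds and row count."""
    best = float("inf")
    rows = []
    for _ in range(runs):
        start = time.perf_counter()
        cur = conn.cursor()
        cur.execute(sql)
        rows = cur.fetchall()
        best = min(best, time.perf_counter() - start)
    return best, len(rows)

# Demo with an in-memory SQLite database standing in for a real datamart;
# against HANA you would pass a connection from hdbcli.dbapi.connect(...) instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 100.0), ("EMEA", 250.0), ("APJ", 75.0)])
elapsed, n_rows = time_query(conn, "SELECT region, SUM(amount) FROM sales GROUP BY region")
print(n_rows)  # prints 2: one aggregated row per region
```

Running the identical harness against both connections keeps the comparison honest: same SQL, same fetch pattern, best-of-N timing to smooth out noise.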
If my crystal ball skills are any good, 2014 will see more migrations to and implementations of HANA, and furthermore more evaluation of HANA for the predictive analytics modeling aspect of business intelligence, particularly as Database as a Service (DBaaS), thereby minimizing businesses’ need to stand up new on-premise infrastructure to support fast, large-scale analytics. The fundamental question is going to be whether it is practically possible to populate these cloud-based databases with the piles of data that businesses have, to date, hosted in-house.
Roughly a year ago I commented on the fact that ERP solutions run the risk of being implemented and run in relative isolation from the larger requirements of the business – I described ERP as being an island – and it is a phenomenon that continues even with new implementations today. The causes are numerous, from the costs of integration to flaws in the implementation methodology and overall philosophy – probably the worst being when the ERP implementation is considered an IT project.
As a consequence, folks like the sales team, engineering and, yes, even accounting end up investing in supplementary technologies that help them simply get their jobs done. It could be argued that if you look at offerings from companies like Salesforce.com, IBM’s Maximo, WorkForce.com and others, the only reason they exist is that the core ERP system either lacks functionality or usability.
SAP doesn’t work the way I need it to
Arguing that SAP lacks functionality is a hard one to swallow by virtue of the fact that SAP has such a large customer base across such a dizzying array of industry segments. However, it would be true to say that it lacks certain ways of working that are perhaps more attuned to how salespeople, maintenance engineers and human resources specialists would like to work. Any organization that has committed to ERP has to maximize the value that ERP potentially represents. This means you may have to consider spending more money on more things…
In my organization – Winshuttle – we have similar challenges. We have an accounting system that integrates with our customer sales system, and for a while we used the service aspect of this system, but to use it more effectively we found that customization was an inevitable activity we would have to undertake.
In an ideal world, with deep pockets and unconstrained timelines, we would likely always choose best-in-class to achieve our objectives, but since that cannot be our reality we have to look pragmatically at what we can afford. Microsoft Excel gets many organizations through some significant hurdles around operational efficiency without forcing too many compromises.
Excel and SAP don’t really play nicely together
The problem is that, natively, SAP and Excel don’t really play nicely together, and when they do, it isn’t a consistent experience.
In every organization in which I have worked, Excel has been the mainstay that business users defaulted to when they needed to collate, track or present certain kinds of monetary or statistical information. Even in the marketing organization, loading up leads from trade events usually comes through via a workbook or spreadsheet. Excel is ubiquitous – trying to deny its relevance, or pretending that you can live without it or something like it, is a fool’s errand, and trying to eliminate its use is pointless if you are not offering meaningful and appropriate alternatives.
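That trade-show-leads scenario hints at why the workbook route needs a validation step before anything lands in the system of record. Here is a minimal, hedged sketch of that idea: the CSV column names and cleanup rules are invented for illustration, not a fixed SAP or CRM format.

```python
import csv
import io

# A spreadsheet of trade-show leads exported to CSV; the column names here are
# illustrative assumptions, not a fixed SAP or CRM format.
raw = """name,email,company
Jane Doe , JANE.DOE@example.com ,Acme Corp
,missing-name@example.com,Beta Ltd
John Smith,not-an-email,Gamma GmbH
"""

def clean_leads(csv_text):
    """Normalize whitespace/case and split rows into accepted and rejected lists."""
    good, rejected = [], []
    for row in csv.DictReader(io.StringIO(csv_text)):
        name = row["name"].strip()
        email = row["email"].strip().lower()
        # Basic plausibility checks before the record is allowed near the system of record.
        if name and "@" in email and "." in email.split("@")[-1]:
            good.append({"name": name, "email": email, "company": row["company"].strip()})
        else:
            rejected.append(row)
    return good, rejected

good, rejected = clean_leads(raw)
print(len(good), len(rejected))  # prints "1 2": one clean lead, two sent back for correction
```

The point is not the particular rules but the pattern: accept the spreadsheet as the collection medium, then gate what actually reaches the core system.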
MS Office is a PET
Microsoft Office itself is a PET; after all, it allows you to create documents in a relatively consistent format without having to suffer the clatter of typewriter iron with sheaves of crisp white cartridge and shimmering carbon paper. Indeed, MS Office is even pitched as well aligned with Six Sigma, and you can learn how to align your MS Office utilization with this operational improvement and effectiveness philosophy in mind. For some amusing anecdotes, tips and comments on using MS Office efficiently and effectively, read through ten years’ worth of posts by Annik Stahl, the Crabby Office Lady columnist.
Business intelligence tools like BI Excellence come with an integrated ability to export to Microsoft Excel, and there is an increasing number of other technologies on the market – things like Fiori and SAP Screen Personas – that support loading data into your core system of record in ways that avoid the traditional transaction entry screens or the development of costly application interfaces. The most important thing is the way you allow users to use Microsoft Office in the workplace together with your ERP: managing risk, ensuring compliance and streamlining operational use.
If you have any doubts about the usability and effective use of your ERP system today, be aware that all is not lost. You do have the opportunity to improve things; all you need to remember is that, despite the hundreds of thousands or even millions of dollars spent on its implementation, there are still some missing components in the landscape. The components you are missing primarily focus on making people, not technology, more efficient when working with ERP.
When behemoths like SAP were originally conceived, technology was expensive and people were relatively cheap – today that is less true, and out-of-the-box efficiency tools don’t generally come bundled with the solution – you have to invest in them. If you don’t invest in PETs, you will be investing in customization.
Probably the most significant comment you will hear is that the products are a mish-mash of technologies without any specific cohesiveness, without a particularly consistent set of underlying technologies or a consistent user experience. In some respects that is true, but in other respects it is quite untrue. Consider an early post made on SCN back in 2004 by Karin Schattka, in which she plausibly demonstrated the apparent mish-mash of technologies in use even at that time being credibly rendered either in the NetWeaver Portal or subsequently in the NWBC.
Of particular importance is probably the fact that the portfolio of acquired technologies shows the most acute differences of all the products in the SAP portfolio, but even that is only a half-truth – many have ABAP as a core underpinning technology, or at least connect with ABAP systems using conventional methods, PI/XI interfaces and now even Gateway. The real challenge is that different user needs have to be served by different technologies and approaches, and innovation doesn’t have to be homed purely in the world of ABAP. For hard-core ABAP developers that would be preferable, and from an operations perspective, avoiding an army of disparate skillsets is good for keeping IT headcount down. But it seems SAP has a plan for changing and improving all that.
One could easily critique the differences between ERP, CRM, APO and SRM, but at least at their core they retain some fundamental design aspects still somewhat aligned with the original concept of the R/3 Business Suite and its subsequent versions. Products like B1 for the SMB market, Ariba, SuccessFactors, ByDesign, MDM, StreamWork, Syclo, Sybase and the BusinessObjects BI portfolio, however, are so wildly different from the original concept that one wonders why they were ever acquired to be part of the total portfolio offering.
5 Core Focus Areas
Just as an industrial giant like BMW can produce motorcycles, automobiles, pick-up trucks, goods trucks, buses and aeroplane engines, so too can a software company have a portfolio of products that are not necessarily all fully integrated or all complementary – some products will be built for specific segments without synergies with others. Having a spread of technologies moves the business closer to being a one-stop shop for all the technologies a given business might need.
SAP today very clearly goes after five areas: the classic Business Applications, Databases and pure technology solutions, Analytics, the Cloud, and Mobile.
All areas overlap in some respects, but for newcomers to the vendor the main draw is often the business applications for which SAP is most renowned. Of late much has been made in the press of the in-memory computing option of SAP HANA as an alternative database to contenders like the Oracle and Microsoft SQL Server RDBMSs, but in the less publicized circles of Human Capital Management and Human Resources, SAP made big waves with the acquisition of SuccessFactors.
My recent visit to the Deloitte Shared Services Conference in Budapest, courtesy of Winshuttle, involved a number of conversations with leaders in the HCM space at multinationals. When questioned on whether they had SuccessFactors in play or were planning to implement it, many of the responses were ‘no’ or ‘we’re going to wait and see’ – when I probed further, it seemed the integration aspects were still the area they were most uncertain about. In-memory computing didn’t come up, though it is probably in there somewhere. Consider why, and you’ll understand that business users, who were the primary attendees at the conference, don’t care about the underlying machinery; they care about the comfort of the ride.
Those that had implemented SAP’s Business Applications HR modules, for example, didn’t see any changes coming that would be of particular concern, and of course those who had implemented the original version of SAP’s performance management module were generally unimpressed. The SuccessFactors acquisition, however, presented a whole raft of new opportunities that were constantly under consideration but not yet decided upon; there was as yet no compelling event forcing their hand. Does that mean they’re holding back on SAP investment? Quite the contrary – they’re exploring additional avenues that may or may not involve expanding existing uses or adopting different SAP products.
In selecting any technology it seems that enterprises look at a number of variables among them:
- The business need
- What the market is offering
- The reputation of the vendor
- The cohesiveness of the vendor’s product portfolio
- What peers are doing
Although the last point is often claimed to be a non-criterion, we know from past experience that seeing a competitor with a particular technology spawns envy, and as a consequence observing and identifying what peers and competitors are doing and investing in is part of the evaluation criteria. Many companies, it seems, are seriously looking at HANA as an underpinning technology so that they can run their systems more cheaply, deal with one vendor (one throat to choke) and improve ERP and analytics performance. If it’s in the cloud, great, but in all likelihood a hybrid approach of mixed cloud and on-premise technologies will do.
The first three variables in the list speak for themselves, and of course SAP has a strong brand and reputation in the market, but cohesiveness is where things seem to unravel a little. Analyst Elizabeth Hedstrom Henlin of TBR wrote some time back that “SAP is in a strong position across core and new addressable markets.” The portfolio addresses diverse customer needs and is backed by some proof points in support of integration. Henlin also said she sees SAP’s mobile division as the heart of SAP’s growth to date, with the acquisitions of Syclo and Sybase serving as the core of an applications-led, horizontal approach to mobility that touches and integrates SAP’s diverse portfolio and go-to-market strategy. So back then, in 2012 at least, it seems mobile was the flavor of the day – and logically that won’t have changed. Even if you change a core underlying technology like the database, even if you push infrastructure to the cloud, even if you add analytics… you need a user experience, and in all likelihood mobile needs to be part of that, so there is some clear logic to having a mobility offering that works seamlessly with all of the others.
Herding the developer cats – Vishal Sikka is the man!
As recently as July, Henlin stated that SAP’s alignment of development entirely under Vishal Sikka brought the promise of increased speed in the expansion and strategic growth of the product portfolio, and that the ongoing restructuring of SAP’s product teams to integrate on- and off-premises resources was the critical asset for SAP to sustain growth into 2014. So what did all that mean? Consolidation?
Just like every technology company, SAP’s engineering teams suffer from that bipolarity associated with wanting to work on the cool stuff while also needing to fix and maintain the old stuff. The latter is the uncool but necessary work, and it doesn’t appear to be part of Sikka’s portfolio right now. Sustainment is important to mention, though; after all, it is the maintenance revenue stream that pays for the fixing of old stuff, and that is a large chunk of change in SAP’s annual coffer collections.
Sikka’s realigning of SAP’s R&D functions is likely focused on the new hip stuff, but there’s a need for oversight of sustainment too, so in all likelihood there’s an element of dominion in his role there as well.
Let’s look at another couple of pups in the litter.
Enrico Camerinelli’s comments on Ariba seem to align with the above, though he suggests that SAP’s acquisition of Ariba was purely about driving existing shareholder value. Ariba’s pre-acquisition shareholders enjoyed a one-off premium of nearly 20% over the company’s share value when it was acquired by SAP, and SAP secured a specialized and highly populated B2B network into which it was able to expand its market reach. He feels, then, that there isn’t really a commitment to facilitating any kind of integrated B2B financial supply chain, basing his view on SAP’s inability to articulate a clear strategy for how it intends to leverage the Ariba acquisition for financial services.
HANA, on the other hand, continues to be where the big focus will remain, while the remaining products in the portfolio will be offered but given less consideration in the media and show-and-tells. Greg Chase commented this week that more than a third of the offerings at the upcoming SAP TechEd Las Vegas in the US – a stunning 263 items – are HANA related, with a big focus on Analytics, Big Data and In-Memory Computing. Only 5 mention Ariba and 23 SuccessFactors.
KXEN doesn’t feature at all, but this is hardly surprising given that the acquisition is only recent news. KXEN technology at least has pertinence in the analytics, big data and HANA play.
With SAP’s portfolio continuing to grow, one can only wonder at the spectacles Sikka will be able to pull out of the proverbial magician’s hat as he continues to remould R&D at SAP. Perhaps these puppies aren’t as ugly as you think, and what we have is the establishment of a hunting pack positioning itself to present businesses with a new perspective on how they can buy and use SAP and enterprise technology to be successful.
Apart from the large presence of Deloitte consultants and partners, there were of course many companies represented and many of them spoke about the ways in which their individual businesses were transforming their business operations through the use of Shared Services models, Business Process Outsourcing or the implementation of the concept of Global Business Services.
I had a chance to have a number of conversations with company representatives and consultants alike but I also had the chance to talk to a couple of vendors and representatives of development agencies and discovered some interesting facts that I thought I would share.
Top of mind was the revelation from MIDAS that the greater Manchester area is home to more than 7 million people within an hour’s commute and that over 30 shared service centres are located in the area, among them Aegis, Sodexo, RBS, BUPA and Worldpay.
SAP customers located in the area include Northern Rail, Smith & Nephew, Kellogg Company, HJ Heinz, AstraZeneca, Premier Foods, Japan Tobacco and of course Sodexo, as well as a number of consulting groups.
Another important conversation I had was with a couple of the vendors supporting SAP Finance groups focused on the record-to-report process, with a particular concern for accelerating and improving the period close. Verifying the accuracy of the financial close process and ensuring the integrity of financial statements are two important corporate challenges, and they are best addressed with automation as an aid.
I have long been an advocate of trying to encourage organizations to automate as much of the finance process as possible. Of course in mature organizations that are heavily regulated or have a great deal of external accountability there is often a considerable amount of automation either with tight integration via configuration of the SAP modules or through the building of interfaces between various systems.
In the SAP ecosystem, several key players provide keenly positioned solutions that help businesses close the manual entry gap more effectively; among these are Runbook, Chesapeake, Trintech and BlackLine, all of whom were at the conference. Comparisons between them can be challenging: each brings different strengths that may matter to different organizations at different levels of finance automation maturity.
Trintech, for example, offers an integrated solution beginning with reconciliations and continuing through the close, risk management, compliance and financial reporting, with flexibility for current practices including configurable templates, workflow and task management. There is some degree of standard and ad hoc reporting capability as well. In the financial governance software market, the competitive landscape varies by product line. No single competitor faces Trintech across all product lines; primarily it competes with Chesapeake Systems and BlackLine Systems for reconciliations, Paisley Consulting and OpenPages in the Sarbanes-Oxley compliance market, and MedAssets in the healthcare market.
BlackLine is a well-established SAP partner that runs with the tagline ‘No More Bullsheet’ and positions itself on the basic premise that you shouldn’t be using spreadsheets as part of your reconciliation and close process. Its subscription-based offering provides workflow, reconciliation review and approval, task lists used for the close, and preparation and delivery of schedules to stakeholders like auditors, who can download the analyses directly through their own account on the system, as well as other functions.
Runbook provides workflow to try to minimize the dependence on spreadsheets, emails and phone calls, arguing that the use of mixed tools to close the period makes your financial close more frustrating and more expensive than it should be. Leading with SMARTClose for reconciliation, it is described as flexible, intuitive and powerful in that it allows businesses to take their existing process and apply best practices and reusable templates to transform their close.
Chesapeake System Solutions offers T-Recs for easing the financial close process by way of a rule-driven solution, similar in some respects to the other offerings. T-Recs claims to be able to reconcile any account based on customer-identified definitions of data characteristics, business processes and data display templates.
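The rule-driven matching these tools perform can be illustrated in miniature: pair ledger entries with bank statement lines on a customer-defined key, and whatever is left over goes to a review queue. The field names, sample data and matching rule (reference plus amount) below are assumptions for illustration, not any vendor's actual format.

```python
# Minimal sketch of rule-driven reconciliation: match ledger entries to bank
# statement lines on (reference, amount); unmatched items go to a review queue.
ledger = [
    {"ref": "INV-1001", "amount": 1200.00},
    {"ref": "INV-1002", "amount": 560.50},
    {"ref": "INV-1003", "amount": 75.25},
]
bank = [
    {"ref": "INV-1001", "amount": 1200.00},
    {"ref": "INV-1003", "amount": 75.25},
    {"ref": "FEE-0042", "amount": 15.00},
]

def reconcile(ledger, bank, key=lambda r: (r["ref"], round(r["amount"], 2))):
    """Return (matched keys, unmatched ledger rows, unmatched bank rows)."""
    bank_index = {key(r) for r in bank}
    ledger_index = {key(r) for r in ledger}
    matched = bank_index & ledger_index
    open_ledger = [r for r in ledger if key(r) not in matched]
    open_bank = [r for r in bank if key(r) not in matched]
    return matched, open_ledger, open_bank

matched, open_ledger, open_bank = reconcile(ledger, bank)
print(len(matched), len(open_ledger), len(open_bank))  # prints "2 1 1"
```

Real products layer configurable rules, tolerances and workflow on top of this core matching step, but the shape of the problem is the same.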
Not all have strengths in all geographies, and accordingly they have different market stories. Pretty much all of them have a smattering of customers in the FTSE 1000 and NYSE 1000; however, unlike vendors in the OCR and scanning solutions space, the principal positioning of these companies is to cut down on unmanaged objects in the reconciliation process.
There have been many cries over the years to get business to move away from Excel, and yet it remains the dominant product of choice for the majority of bookkeepers and accountants. Despite its many weaknesses, it remains the most flexible choice, especially for businesses that are shy about spending yet more on their SAP environment or cannot justify expenditure on additional products with very specific functionality. So Excel often becomes the default ‘only choice’.
The issue revolves largely around unmanaged spreadsheets and around building and using good spreadsheet designs. A well-designed spreadsheet with good, robust integration with the backend ERP system and a governance wrapper can effectively achieve the same objectives.
As one finance expert on LinkedIn wrote, accountants are like rock candy: “The bottom line is that if you cut an accountant in half, they have EXCEL written through the middle. That has remained a constant for years, and is unlikely to change, so maybe the focus should be on how to build great spreadsheets, rather than killing Excel…”
So my thoughts return again to my sponsors at the event and the fact that for more than ten years now, Winshuttle has been advocating Excel as the environment that much of business continues to rely on.
It is something that is unlikely to go away any time soon and something that, even with the implementation of one or more of the technology solutions described above, will continue to be in active use. Leveraging robust ECM solutions like SharePoint together with SAP gives you the best of both worlds. The tight integration of Excel and SharePoint gives you heaps of additional metadata that enables you to leverage the sourcing documents natively – something you often lose when using an island system like a pure reconciliation solution. Yes, the latter may be best of breed or best in class, but does it actually offer you the flexibility to address what you consider to be the uniquely differentiating aspects of your business? Don’t automatically assume that Excel is bad, bad, bad; it has its place and you can make it more robust and dependable.
The trick, as with anything, is getting the balance right, between appropriate and proper use of tools to automate and smooth operational processes.
The launch of Fiori and Screen Personas by SAP represents an interesting commentary on how customers use SAP and how SAP is actively pursuing initiatives to improve the user experience. For many years SAP has been a much-maligned product from a user experience perspective. This is hardly a surprise given the complexity and functionality that the leading ERP product affords a dizzying array of industry segments. Fiori and Screen Personas could easily be thought of as alternatives to one another, but in reality they tackle different aspects of accessibility and usability. It is easy to consider this as “lipstick on the proverbial pig”, but when you consider the strategy and the actual application, the products provide much more.
A prerequisite for Fiori is the presence of SAP NetWeaver Gateway. The applications are priced per user for the bundle, with some prerequisite licensing requirements.
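The Gateway dependency exists because Fiori apps consume their back-end data as OData services. As a rough illustration of the shape of that traffic, the snippet below builds a typical OData v2 query URL and unpacks the `d.results` envelope from a canned sample response. The service path, entity set and field names are invented for the example; a real call would also need an HTTP client and SAP authentication.

```python
import json
from urllib.parse import urlencode

# Hypothetical Gateway service path and entity set, invented for illustration.
base = "https://gateway.example.com/sap/opu/odata/sap/ZSALESORDER_SRV/Orders"
query = urlencode({"$filter": "Status eq 'OPEN'", "$top": "5", "$format": "json"})
url = f"{base}?{query}"

# A sample OData v2 JSON response body; note the characteristic d.results envelope.
payload = json.loads("""
{"d": {"results": [
    {"OrderID": "4711", "Status": "OPEN", "NetValue": "1200.00"},
    {"OrderID": "4712", "Status": "OPEN", "NetValue": "88.40"}
]}}
""")
orders = payload["d"]["results"]
open_values = [float(o["NetValue"]) for o in orders]
print(len(orders), sum(open_values))
```

The same query-option vocabulary ($filter, $top, $orderby and so on) applies across Gateway services, which is what lets Fiori apps stay thin on the client side.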
Screen Personas tackles a different problem, namely the complexity of the existing SAP UI. Here, a number of third-party products, custom developments and customizations have evolved over the years to simplify the overall user experience of core SAP functionality. Products like GuiXT on the SAP GUI client and Winshuttle for Microsoft Excel made large inroads into reducing the number of fields a user needed to touch in order to complete a transaction in SAP; with Screen Personas, however, the approach enables group- and user-specific, focused user-experience management in a centralized way that leverages existing organizational and security models in SAP.
SAP positions Screen Personas as “Personalization without Programming”. Using a simple drag-and-drop approach to screen design, users can modify many common SAP GUI screens to make them more usable and visually appealing. Before this product was made available, custom development or a third-party product would have been the only choice for simplifying the user experience. Users access Screen Personas through the internet browser or the NWBC. SAP Screen Personas works on most standard SAP GUI Dynpro screens but does not work for Web Dynpro applications such as SAP SRM or the SAP CRM Web Client UI.
Prerequisites are the SAP innovation kernel, release 7.21, and at present only English and German are supported. Personas is licensed on a per-user basis, with users defined as anyone who uses a modified screen. There is no separate developer license for people who build Personas screens; this is included in the license. Authorizations are handled in the backend, and all authorization steps happen the same way whether you use Personas, the SAP WebGUI or the SAP GUI.
SAP recently announced that it would entertain involvement in its PartnerEdge program from partners who had never previously been involved in the SAP space. This apparently ‘new’ offering through their website differs from past partner programs in that it promotes very specific capabilities, namely to “help create and monetize innovative, specific applications in the mobile, cloud, database or high-performance in-memory areas”. It is clear from this statement that the targets are mobile, cloud and HANA.
While historically an on-premise software technology provider, SAP itself has produced a number of free mobile applications for a variety of uses, from BusinessObjects to order status inquiries from ERP. Downloading the applications themselves is probably fairly straightforward, but getting them to talk to your back-end systems is another matter; you have to navigate the complex world of SAP security authorization, externally facing ERP systems and so on.
Against the backdrop of more recent announcements regarding SAP system vulnerabilities, it is likely that adoption levels among newcomers and the business will be low.
The SAP application landscape, like those of many enterprise back-office behemoth systems, suffers from a kind of innovation stalemate. When the business commits resources to enabling certain functionality, the likelihood of success is higher than when individuals try to create a groundswell of interest in functionality by leveraging peripheral or micro apps, like many of these mobile applications, to address certain edge-case business scenarios.
It will be interesting to see whether opening the proverbial floodgates to innovation partners with little or no prior experience of working with SAP and SAP systems will in fact make a difference. The landscape is complex, the technology quirky and old-fashioned in some respects, and yet the business dependence on all of it is immense.
The opportunities for those innovators who manage to crack the nut of addressing a real business need cost-effectively, while at the same time keeping IT and the auditors at bay, will be tremendous.
One of my colleagues frequently uses what he refers to as the buses-versus-trains analogy to describe some of the options that companies have with regard to the integration-enabling technologies available for use with SAP.
Depending on the characteristics of the conversation you are having, the analogy works well on a number of levels.
On the one hand, a bus has a maximum capacity of around 100 people or fewer; on the other, commuter rail or even light rail has much larger capacity, with some trains or trolley buses accommodating several hundred passengers simultaneously. When considering why this is possible, you have to look at the infrastructural requirements associated with railed versus wheeled transportation. Railed transportation is often selected where a commute route has been clearly defined and serves the majority. The choice between light rail and commuter rail is often bound up in the distances to be traveled, the speeds to be attained, the number of stops and so on. In a business process you can interpret this as something akin to variance or logic complexity. In the ABAP world you would look at IDocs, PI/XI integration and perhaps other complementary technologies like EDI, BizTalk, TIBCO and so on. The point here is that these approaches typically require some degree of heavy lifting to establish.
Heavy Capital Investment is not a prerequisite for SAP Integration
Defining a data route, putting down the supporting technology, and then establishing a marshaling yard for trains of data that have issues or risk failure for any number of reasons adds another layer of complexity to the overall design. For this reason, SAP’s Solution Manager monitoring and third-party solutions from companies like Seeburger exist to help the business keep track of what its integration layers are doing.
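The marshaling-yard idea can be sketched in a few lines: instead of halting a batch when one record fails, the failed record is shunted aside with its error for later review. This is a minimal illustrative sketch, not any vendor's actual implementation; the `post_record` validation rule and field names are assumptions for the example.

```python
from dataclasses import dataclass, field

@dataclass
class MarshalingYard:
    """Holds records that failed to post, with the reason, for later review."""
    failed: list = field(default_factory=list)

def post_record(record):
    # Stand-in for the real posting step; an actual integration would call
    # SAP here (e.g. via an IDoc or RFC). The validation rule is illustrative.
    if not record.get("material"):
        raise ValueError("missing material number")

def run_batch(records, yard):
    """Post each record; divert failures to the marshaling yard instead of halting."""
    posted = 0
    for rec in records:
        try:
            post_record(rec)
            posted += 1
        except ValueError as err:
            yard.failed.append({"record": rec, "error": str(err)})
    return posted

yard = MarshalingYard()
batch = [{"material": "MAT-001"}, {"material": ""}, {"material": "MAT-002"}]
print(run_batch(batch, yard), "posted;", len(yard.failed), "held for review")
```

The monitoring tools mentioned above essentially give the business a view into that `failed` queue at enterprise scale.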
Another approach might be to deploy custom ABAP programs to achieve the integration. Many companies with long-established SAP infrastructure take this approach, developing a combination of ABAP, Web Dynpro, or Java applications to keep data flowing from the many possible sources to the final target system. With these programming-intensive approaches, pivoting the process to adapt to a changing set of business requirements is relatively difficult. As with anything programmatic, it can of course be done; the question is how quickly, and with what level of commitment. Changing the technology that supports a given process also brings significant risk if unforeseen scenarios or outcomes arise.
Is support for SAP collaboration automation without programming really possible?
With that said, buses offer some significant advantages over railed transport in the flexibility they afford a city to serve areas that are either not easily reached by rail or that don’t justify the capital investment in rail. Buses also make it possible to serve a smaller volume of transportation objectives without major expense. There comes a point, however, where buses are no longer practical because throughput expectations are too demanding: too many buses on the roads lead to congestion that is only exacerbated when more are added. If a bus route serves too few riders, the transportation authority can reevaluate the route, pivot the service area, or discontinue the service, at which point consumers are forced to self-drive or fall back on simpler options like cycling or walking, depending on what is most practical.
A couple of technologies support the bus scenario when looking at usability, automation, and collaboration in the SAP space, each with varying strengths and weaknesses. Technologies to look at in this area include Winshuttle, K2, Blackline, and Promenta.
In your evaluation process, consider some of the following factors and determine whether they are important selection or buying criteria:
- Support for secondary technologies like SharePoint and Microsoft Excel – the ability to incorporate Excel-based data-gathering processes may be important in the scenarios you are looking to automate with collaboration
- Ease of deployment and maintenance
- Technical versus functional focus – can this technology be deployed without major IT involvement? In this era of shrinking IT operations and support budgets, a solution that requires extensive IT involvement may not meet your agility objectives. Is a “no programming” approach important to you?
- Proven history of deployments with contactable references
- Functional agnosticism – does your overarching objective for collaborative automation include a mix of functionalities that transcends any single functional area? The materials creation process, for example, often incorporates not only elements of data management but also finance, logistics, quality management, and others.
- Scalability – how many differentiated processes are you looking at, and is this something you could deploy in an evolving way? When you commit to a given technology, are you required to build a fortress or a stockade? Sometimes a stockade will do.
- Workflow – what are your workflow requirements? Are they fully bound to SAP, or is workflow in SAP simply a nice-to-have when in reality you just need some kind of auditable workflow?
There’s a lot of talk that HANA adoption may be on the rise and that SAP’s Q1 numbers were largely buoyed by a transformation in the mindset of IT. The “radical transformation of the industry” statement was made by SAP co-CEO Jim Hagemann Snabe in an interview on CNBC and reported by Forbes.
There should be some concern about this perception of IT’s role in servicing the business, though. Conceptually, renting someone else’s infrastructure to run IT operations, the way cloud technology is positioned, isn’t really anything terribly new, is it? Data processing bureaux have been around almost since the first commercially viable computer systems. Big businesses, though, like control, and pushing their technology off to a third-party provider will not be palatable to many of them. It may be palatable to IT, perhaps, but ultimately not very appealing or of much interest to the business. How would IT convince the business to commit to a HANA investment, cloud-based or otherwise? The answer has to be, simply: higher performance, scalability, and lower total cost of ownership (TCO). The cloud has the potential for the first two, but does it really mean a lower TCO?
SAP’s current incarnation of in-memory technology is something all the major technology players have experimented with to some extent. Most significantly, the recent proposal that businesses move their online transaction processing (OLTP) systems entirely into memory seems extraordinarily expensive. Memory-laden systems are something Oracle has been pushing for some time with its own appliance offerings, and there is undoubtedly some sentiment that the whole HANA initiative is a direct attempt to undermine Oracle’s stronghold as the RDBMS supporting SAP ERP. But it is much more than that: it is a genuine acknowledgement that there is a lot of data in those systems and that current analytics approaches are not performing well enough to address business needs.
I won’t go into the benefits of memory-based OLTP versus a hybrid approach of volatile memory, disk caching, and disk storage, but it is important to understand that there is an inevitability to this technology, and now is as good a time as any to get on board. Any aspiration IT might have of consolidating all its peripheral systems onto a single HANA instance is likely overly ambitious. Consider, too, that the technology is being positioned on the presupposition that it can be the best solution for everything, i.e. both OLTP and online analytical processing (OLAP). See Martin Klopp’s article on consolidation and some unaudited HANA performance numbers; of particular interest may be the statement that “HANA inserts rows at 1.5M records/sec/core or 120M records/sec per node”. Those sound like screaming numbers, particularly if you are planning on using a product like Winshuttle Transaction with SAP ERP to push batched or pre-staged data into your OLTP system.
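A quick sanity check on those quoted (and, remember, unaudited) figures shows what they would imply if taken at face value:

```python
# Figures as quoted in the article referenced above (unaudited).
rows_per_sec_per_core = 1_500_000
rows_per_sec_per_node = 120_000_000

# Taken together, the two rates imply 80 cores per node.
implied_cores = rows_per_sec_per_node // rows_per_sec_per_core
print(implied_cores)  # 80

# At the quoted node rate, a billion-row load would take under ten seconds.
seconds_for_1b_rows = 1_000_000_000 / rows_per_sec_per_node
print(round(seconds_for_1b_rows, 2))  # 8.33
```

Whether a real batched load through the application layer would ever approach the raw insert rate is, of course, a different question.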
The ‘solves all performance issues’ assumption IT may be making is a little surprising, since many of the use cases hailed as incredibly beneficial have been squarely in the OLAP and analytics space, particularly where there are no synchronization or latency issues between the OLTP data being replicated or made accessible to the OLAP technology. For OLTP consumers to reap the benefits of in-memory technology, the expectations are pretty clear: faster data retrieval, faster decision making, and faster storage. The Farber et al. paper “The SAP HANA Database – An Architecture Overview” describes HANA succinctly, and it also points out that some of HANA’s inherently advantageous characteristics are in fact its compression schemes, which mean less memory consumption and lower memory bandwidth utilization. Perhaps surprising is the suggestion that real-world OLTP doesn’t actually match the TPC-C profile, which has led SAP’s Benchmark Council to take a different view on how transaction processing throughput should be measured.
In their paper, Farber et al. suggest that real OLTP workloads have a larger portion of read operations than the TPC-C standard suggests, which is perhaps particularly true when one considers how commonplace changing and updating existing records in a given ERP system is.
Those unfamiliar with the TPC-C benchmark may be interested to know something about the TPC. In 1988, database and application vendors formed a consortium called the Transaction Processing Performance Council (TPC). The TPC’s goal was to reduce the benchmarketing hype and smoke being created by hardware, database, and application vendors by defining a level playing field on which all vendors could compete and be measured. In 1989 the TPC-A benchmark was born; it defined metrics for performance in transactions per second (tps) as well as price/performance ($/tps). The TPC-C benchmark soon followed and continues to be a popular yardstick for comparing OLTP performance on various hardware and software configurations.
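The price/performance metric is simply the total cost of the tested configuration divided by its measured throughput. The figures below are hypothetical, purely to show how the number is derived:

```python
# Hypothetical figures for illustration only; real TPC results vary widely.
total_system_cost_usd = 500_000  # price of the full tested configuration
measured_tps = 1_250             # sustained transactions per second

# Price/performance: dollars spent per unit of throughput.
price_per_tps = total_system_cost_usd / measured_tps
print(price_per_tps)  # 400.0 ($/tps)
```

Two systems with identical throughput can thus rank very differently once hardware, software, and support pricing enter the denominator's numerator.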
Wholesale porting of SAP ERP to HANA may not necessarily bring immediate performance benefits without modifications to the underlying application code; this is borne out by the raft of training being made available specifically for developing on HANA platforms, which suggests that some of the data retrieval logic may need to be reworked. The question then becomes one of migration strategy if you are considering a HANA-based system as the next target for your ERP. Should you migrate your existing system as-is, and can you in fact do that without coding changes? Or should you meticulously plan rework of some of your existing code in preparation for the migration?
SAP’s key event, Sapphire, is coming up in Orlando next week, and when you’re having a HANA conversation this is certainly something you will want to ask.
I am notoriously bad at crystal-ball gazing on matters in general, but in the world of IT I think I have been around long enough to tell the difference between a rat and a cat, and in this instance I call rat!
The rat in question is whether Microsoft’s Office 365 will be successful and achieve the pervasive use and ‘buy-in’ that major corporate customers would consider viable for their long-term IT sustainment needs.
There are plenty of partners and vendors out there promoting cloud-based technologies, and Microsoft is a little later to the game than the likes of Google, but in the grand scheme of things, will the biggest corporations make the leap?
From my perspective, much of Office 365 is viewed as little more than a file-sharing environment for Word documents, PowerPoint presentations, and Excel spreadsheets. Many organizations attempt to point to wide adoption through sensational headlines like “Fast Growing Wholesale Sales Agency Turns To Office 365”, but when you dig a little deeper and look at the characteristics of the case-study customer, you find that the customer is a small or medium business. convergencealimentaire.info published an infographic back in 2012 identifying that ten brands principally own the world of consumer goods, i.e. they are the manufacturers, buyers, distributors, and sellers of much of the stuff the world consumes.
Can one reasonably expect these companies to move their corporate information in bulk to a hosted provider like Microsoft or Google? I think not. Many of these companies run the leading ERPs, sometimes many versions of them. In the idealistic world of cloud software providers, all of these solutions are envisioned as running across the interwebs.
Office 365 itself has a very chequered history, from the early days of Hotmail through Passport, Live, and so on. The latest incarnation, seemingly building on the successes of Xbox and the apparent success of Google in the cloud, has undoubtedly seen significant investment. Certainly in the days when I worked with Microsoft, the investment in datacenters alone was considerable, and in the intervening years no doubt more revenue has flowed to shoring up this infrastructure. Is this ultimately all enough?
For cloud solutions, or any solution for that matter, to be of significant interest, four fundamental factors need to be in play:
- Easy to use
- Cheap
- Appropriate
- Reliable
Cheap is relative, of course. My Google storage just came up for renewal for 2013 and didn’t auto-renew because the credit card attached to my Google Wallet was out of date, and I am debating whether I need to renew at all. I subscribed almost on a whim, but do I really need this thing? I don’t have a Chromebook, I don’t use an Android phone, and I spend my life in MS Outlook, Word, Excel, and PowerPoint; on reflection I am not even sure why I signed up in the first place, and I am sure I am not alone. For my employer or a customer, investing in Google’s cloud would cost considerably more, and the decision would not be taken lightly. MS Office Home Premium subscriptions strike me as expensive at $99.99 a year. I would pay $20 a year for MS Office, but PCMag (http://www.pcmag.com/article2/0,2817,2414803,00.asp), for example, predicts it will fail, and I am inclined to agree. The last time I bought a full suite of MS Office was five years ago, and I consider it an investment well made; it lasted me until December, when Windows 8 decided to delete it from my machine as part of the upgrade from Windows 7 – a major #fail there, Microsoft. If Microsoft can’t get consumer pricing right, what is the likelihood of getting it right for corporations?
Appropriate is, of course, all about relevance. While the look and feel of Office 365 Professional is not much different from any of the other new Metro-themed applications, does it address the functional needs my business requires? Unlike products from companies like Adobe, Apple and Microsoft products don’t transform over time; they tend to stagnate until the next major version and as a result lag behind in features and functionality that seem more contemporary. Adobe releases frequent incremental changes to the whole functionality and experience – for some this spells instability, for others it speaks to relevance and appropriateness. Cloud solutions can have new functionality released centrally, but the question has to be asked: is that functionality enough to address the needs of the average corporate user? I try to send folks URLs to networked documents internally rather than attaching them to emails, but sometimes sending a document is a more effective way to get content where it needs to go. Again, I may not be perfectly representative of the average user, but in conversations I have learned that sharing and sending documents is the norm rather than the exception. How easy, even with cloud applications, is it to share documents this way without compromising access controls and overall document integrity and security?
Reliability is potentially the biggest bugaboo that will make or break cloud solutions for the corporate customer in particular. Anyone who has worked the support gristmill will tell you that quality of service is a challenge even with internal solutions; move them out into a world managed by a third party, and all of a sudden your business users no longer have the proverbial employee throat to choke. So many factors can influence quality of service that it seems highly unlikely businesses will adopt cloud technologies like Office 365 wholesale unless they can afford a compromise in system availability and performance.
Ease of use is probably the one area where software producers like Microsoft have tremendous opportunities to better understand their existing and future products. What features and options do users really use? It is possible to instrument existing desktop applications, but anxiety around performance, privacy, and other factors influences whether this actually happens. This extends further to the questions of who scrutinizes these statistics and, when a decision is made to change or axe a feature, what level of consensus is required from existing customers before it actually happens.
Ease of use may be the one thing Office 365 is wonderfully successful at, but it is likely to fail on the reliability and appropriateness fronts. For mainstream corporate users, all four factors will come into play. My prediction is that Office 365 will languish for several years and eventually move to commodity status, like email, for ordinary consumers but not for those with rich coffers and deep pockets.
I have just spent the week at the SAPInsider Financials 2013 conference, held in Las Vegas and organized by WISPubs. SAPInsider is the premier SAP-focused conference track for all things SAP, and the Financials show is one of the highlights of the SAP finance calendar. Attending the Financials conference is always an opportunity to have a number of conversations with different stakeholders in the SAP world, and this year in particular there was a strong focus on controls, audit, compliance, and generally keeping your SAP systems safe.
I was particularly interested in conversations with customers and prospects alike who face the challenge of being prevented from performing mass actions in their SAP environment. Access to mass transactions tends to be restricted because of the illusion that entering data manually through a transaction numerous times is safer than running a mass action that could potentially break dozens, hundreds, or thousands of records, or worse, create garbage records in your system that are almost impossible to get rid of or correct. The fear of mass transactions is understandable, but withholding access is not, given that IT and the auditors don’t own the business data: the business is ultimately responsible for making sure that SAP data is correct, present, and accounted for. Worse, delays in changing data can lead to profoundly negative outcomes for the business, far worse than the risk of the minor errors that may arise from the incorrect use of a mass tool.
SAP is many things to many business users, but one thing it definitely cannot be described as is an application platform that is easy and forgiving to use. A modicum of proficiency in navigating transaction screens, application logic, and the UI as a whole is accepted as a requirement for effective use; even so, there are a great many inefficiencies in the way the application is presented to the ordinary user. Recent developments in UI optimization have attempted to remediate the situation, but on the whole the experience is still subpar, and the gap continues to widen as business users come to expect a richer and more sophisticated experience.
The enhancements to the user experience still do not address the mass actions that business users may need to perform, and as a consequence those users are often left with a very limited set of options for running mass actions against SAP systems.
One approach is to deploy custom code that performs mass actions based on a data input file. That input file is often a plain text file, and the automation code is often very fixed and rigid in nature. This rigidity may achieve the objectives of the day, but in the long term, as requirements evolve and further requirements arise, the custom application may prove to be more of a maintenance problem than initially thought.
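The rigidity shows up clearly in how such loaders typically read their input: fixed character positions per field. This is a minimal sketch, not any particular company's code; the field names and column offsets are invented for illustration. Hard-coded slices like these are exactly what breaks when the file layout changes.

```python
# Hypothetical fixed-width layout; offsets are hard-coded, so any change
# to the input file format silently breaks the loader.
FIELDS = {
    "material": (0, 10),   # characters 0-9
    "plant":    (10, 14),  # characters 10-13
    "quantity": (14, 22),  # characters 14-21
}

def parse_line(line):
    """Slice one fixed-width line into named, whitespace-trimmed fields."""
    return {name: line[start:end].strip() for name, (start, end) in FIELDS.items()}

record = parse_line("MAT-000042PL01    1500")
print(record)  # {'material': 'MAT-000042', 'plant': 'PL01', 'quantity': '1500'}
```

Adding a field, widening a column, or switching suppliers' file formats all require a code change and retest, which is where the long-term maintenance burden comes from.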
A second option the business may want to consider is a third-party tool for mass loading of SAP data. Such a tool can use the existing security authorizations and the existing transaction processing logic a user would follow when posting records manually into SAP, and it can be enhanced further by applying workflow wrappers to the end-to-end process, giving visibility into the data activities the business intends to apply to the system of record. Of particular interest is the fact that the workflow wrappers need not be exposed exclusively to business users; they can also be provisioned so that folks from SAP Security, SAP IT, and any internal controls or audit group can weigh in and add their voice to any given planned action.
SAP doesn’t natively support the use of workflow with mass transactions without special configuration, but with third-party applications, particularly those that run atop SharePoint, it is possible to achieve business efficiency while still maintaining governance, controlling risk, and ensuring compliance.