Working alongside SAP Business Suite


April 6, 2016  4:28 PM

How many SAP systems do business users really use?

Clinton Jones
SAP Instance, Cloud instances, Landscape, SAP, SAP system



I recently wrote a piece, "How many SAP systems are enough?", over on IT Toolbox.

The catalyst for the piece was largely a discussion with an existing Winshuttle customer that is looking to expand their adoption of Winshuttle technology.

In my experience, working with many SAP systems is fraught with challenges for end users, but a deconstruction of how customers actually work with those systems reveals some interesting characteristics of usage, and usage patterns.

SAP has made it easy to proliferate systems. Most customers have at least three or four, even if they are virtualized, but as soon as you introduce additional functionality, projects and the fall-out of M&A activities, you suddenly find dozens of systems in the landscape. And those are just the ABAP systems; there are non-ABAP systems to consider too.

In reality, what I have found is that even amongst the master data teams in a given organization, there are only a small number of systems that are used regularly. Regular users are almost exclusively using one system and only one system. You can consider this, then, their favourite.

While IT generally has many systems to choose from (internally I have more than twenty – and we don’t even use SAP!) – this should not be confused with the way business users use SAP systems. In fact it should be a major red flag to audit and compliance if users are using more than one ERP system. Reconciliations, consolidations and ensuring all the necessary checks and balances are in place for transaction processing can be terribly tricky when you have users using too many systems.

What do you see?

I am convinced that the sample of customers I looked at is representative. Even more interesting is that almost none of them spend any meaningful time with integration solutions outside of productive systems. This behaviour with Excel-to-SAP integration tools must surely reflect, in some way, their behaviour when using the SAP UIs directly.

On a separate note, if you have so many systems, are you aggressively looking to reduce them? If not, why not? One of the oft-stated reasons is that the functionality requirements of the various systems in the landscape are different – that different business units have different needs. I am not 100% convinced by this argument. Many businesses have successfully consolidated SAP systems or run their global business operations off a single SAP instance.

I am not saying it is easy, I am simply saying it can be done.



March 11, 2016  1:22 PM

Is Dropbox just a drop kick from an enterprise goal?

Clinton Jones
CRM, Enterprise, ERP, Integration, SAP, Shadow IT

A colleague recently posted an article internally.

The Yahoo article described Salesforce and Dropbox and the fundamentally different ways in which these two organizations view the sales organization in the enterprise space.

We have similar challenges in the SAP space at Winshuttle.

As a part of the SAP Nation we offer some very distinctive products that improve how companies use SAP and Salesforce but we are torn between selling to individuals, workgroups, departments, Corporate IT and the office of the CIO. Who is the best target audience for conversations?

Selling operational DNA

My view is that when you sell a CRM or ERP concept, you're selling something that is going to be part of a business' operational DNA.

This makes a big difference when compared with selling a file sharing and collaboration utility, or any single-function utility for that matter. We don't want to be thought of as just some silly little narrow-focus utility – we want widespread adoption and, in particular, endorsement by IT. In fact, in some bigger companies we have proven our value and earned it!

You don’t get eight out of the top ten most powerful brands in the world as customers by selling a novelty item. For some companies, if Winshuttle stops working, business efficiency is seriously impaired.

In the dim past, before CRM systems as we understand them today, I remember the 'smart' and 'professional' account managers and salesmen used to write and keep copious notes about customers. These people gathered business cards and had bulging Filofax binders with business card slots as an alternative to a bulky desk-bound Rolodex.

Some of the biggest software makers like Dropbox often say that they don’t need a huge enterprise-sales team to sell their products to other companies.

Because the company wouldn’t invest in these proprietary solutions and systems, the ‘smart’ guys and gals in sales bought their own systems and proudly paraded them in meetings and around the office.

Eventually, due to genius, pressure, inspiration or clever marketing – all the sales guys had these systems and they were branded with the company logo!

Pioneering BYOD

In the accounting department we had a similar thing: the Kalamazoo cheque, payslip and cash analysis systems. We also had desktop calculators, or at least some of us did.

Today it would be a bit laugh-worthy, but those who were serious about their job would either invest in specialized stationery or even buy their own calculators. Calculators back then were prized office accessories, and desktop computers were almost unheard of.

If you were super serious you would invest in an adding machine that had a tally-roll – if you could afford it, you could use cash register paper if it had a regular print head; otherwise you had to buy expensive thermal paper, just like for the fax machine!

Eventually someone decided everyone could be more accurate if they all had calculators, and although the tally-roll versions took longer to reach us, everyone was issued some sort of desktop calculator.

When you changed jobs you eventually influenced your new employer to adopt these systems because you’d already been sold on them in a previous role or job.

Viral stuff

Dropbox has been successful because, in my view, corporate IT failed.

Enterprise IT gets a bad rap a lot of the time but with this topic it has been an abysmal failure at providing a simple yet critical resource.

The need was for a simple external file sharing and personal file synchronization tool – something that overcomes the limitations of email attachment sizes and avoids choking mail servers.

Corporate IT will say it failed because of budget, security, provisioning and other concerns blah blah blah… but what is the reality?

The reality is that they had nothing decent that was easy to administer and cheap to deploy.

Corporate IT was wrapped up in trying to cope with the vulnerabilities of Microsoft technologies as a whole – if you remember the ILOVEYOU worm, the Pikachu virus, Klez, Zlob and Stuxnet, that was their life ten years ago.

Trying to curtail and manage internet access and dealing with all those viruses, trojans and malware kept them busy, on top of managing the standardized back office systems like ERP and CRM. Why then would corporate IT care much about file sharing?

They care about file sharing less than they care about Excel spreadsheets – corporate IT in the enterprise just doesn’t get it…

“Why don’t you email it, thumb drive it” they would say… whatever!… We found a tool called YouSendIt – it worked for a while; unsanctioned for use, but a great single-purpose file transmission tool.

If you were lucky, you managed to get an FTP site but such geriatric resources were horrible to work with and IT didn’t really like having to manage them either.

Dropbox was viral though; word of mouth made it a staple for everyone, and it has now pretty much proven itself and is ubiquitous.

There are still kinks, risks and all kinds of other things that could go awry, but I am sure they will work them out. The only challenge is that they may have to spend so much on hardening and securing the technology that it loses its simplicity and early charm – and of course, if you want all that management and control as a customer, you need to be prepared to pay.

For this ‘utility’ to go the next round – to get broad endorsement, acceptance, sign-off and funding by IT – there have to be changes, and I believe IT will become hard-nosed and resistant. At the moment the use of Dropbox in corporations is still largely a skunk-works operation, or what we call Shadow IT. 500 million users, but how many of those are just ordinary individuals?

Perfect for now

Dropbox may get a couple of beach-head enterprise deals the way it operates today, but as soon as it gets too big and doesn't align with all the other enterprise things, its capabilities will be curtailed or corporate IT will have its hand forced and clamp down on it.

I am already seeing this.

For early releases of beta versions of Winshuttle software like Studio v11.0 and Winshuttle Analytics I have had to use Dropbox.

I like to get product into the hands of enterprise evaluators as fast as possible.

I share meeting recordings, PowerPoint slides, PDF files, even photos this way. It's so easy; I love it. But increasingly my business partners are getting blocked by their corporate IT policies. Corporate IT in the enterprise, it seems, is waking up to the fact that by letting Dropbox go viral, it has allowed a vulnerability to creep in. The magnitude of the vulnerability is yet to be determined. Microsoft will tell corporate IT they don't need Dropbox: they have SharePoint, they have Office 365, they even have Microsoft OneDrive – they don't need yet another file sharing tool.

I have 30GB of OneDrive storage. Do I use it? Nope…

In fact I am not even sure how to use OneDrive, though it is probably really easy. My Dropbox folder is 8GB (I managed to co-opt some signups by friends and family and garnered the 500MB increments for every signup early on, when YouSendIt (now Hightail) started curtailing my 'fair usage' and restricting what I could send).

Dropbox serves my purposes, and periodically I clear it out when it tells me I am out of space. It costs me nothing, and I am neither compelled to upgrade it nor discard it.

My files are anywhere I can get to the internet, and they are on all my devices; it is 'perfect' for my purposes. Perfect until the day I am asked for money, or until my business partners can use it no longer. My partners are struggling now, as mentioned – corporate IT seems to be restricting access. Maybe I should consider OneDrive?

For Salesforce, just as for ERP, I see things a little differently then.

Both Enterprise Resource Planning and Customer Relationship Management are more than the Filofax, and a lot more than a file sharing and synchronization tool, though there are elements of those to them too – Salesforce has a content management system, and SAP has a document management system and supports various kinds of documentation libraries.

The expectation of these companies is that they must be highly secure, highly robust, highly available and highly flexible. They must be, after all: for the salesman and for daily operations, this will be the first port of call for everything they might want to do with a customer or a part of the business.

As the Yahoo article says, for the likes of Salesforce there are competitors, pricing challenges, doubters and existing approaches to displace; hearts and minds to win. Winshuttle has the same, and so does Dropbox with its 500 million users.

Being the 800lb gorilla in a particular space is not an opportunity to be complacent; it is instead an opportunity to rethink your game.

Salesforce has relatively poor traction in parts of Europe for exactly this reason: the proposed approach of keeping your stuff in the cloud is still somewhat revolutionary for some parts of Europe – disconcerting and anxiety-inducing. There's a lot of fear and distrust of all things related to the cloud, and major fears about privacy and protecting secrets. Salesforce has a tough time convincing Europeans that they should shift from on-premise to the cloud; SAP and Dropbox will have the same challenge, as do Microsoft and others.

Why change to enterprise?

The other variable to consider is the lack of a compelling event.

Why should corporate IT have a compulsion to adopt Dropbox for the enterprise?

Because Dropbox created a bold consumer version of a file sharing product? Hardly a compelling argument.

While software companies with widely adopted solutions may be presented as thriving, what I have today may be adequate – why would I consider changing at all?

I might only consider a change if you give me a viable alternative or cut away what I use today – the latter seems to be presenting itself as the compelling event.

I see these same issues with SAP customers switching from traditional methods of ERP usage and integration. They prefer EDI, they prefer PI/XI and robust integration services – not because these are necessarily more reliable or flexible, but because they are known, secure and robust. Robust, though, for only specific scenarios (read: inflexible) – the scenarios defined at the outset, and the solutions constructed by IT at a point in time, are rapidly being outstripped by the needs of the business.

So for me, as for Dropbox, the challenge is to position and convince SAP customers that there are easier and better ways to integrate data for SAP and Salesforce. Dropbox has to demonstrate that there is no extra overhead for IT and just loads of value. With Winshuttle for Excel in particular this is an ongoing battle with IT, but the reality is that Winshuttle is a lot more than smarter Excel – it works with other sources in interesting and innovative ways. Winshuttle wants to be the SAP and Salesforce application data management platform of first choice, just as Dropbox wants to be the file hosting, cloud storage, file synchronization and personal cloud solution of choice.

I don’t have the luxury of being able to give Winshuttle software away for free but I do have the advantage of being able to tell you what the switch to a new way of doing things is worth – it’s worth better data quality, faster and better application data management with a lean toolset and the opportunity to redirect limited and valuable resources to higher value work.

February 9, 2016  4:53 PM

The business calendar and why I just don't get it

Clinton Jones
Calendar, Workflow

Why are we bothering with maintaining a business calendar at all in process management?

Of course the answer isn't simple. At the most basic level, it is because when we do process metrics we want to be sure not to penalize anyone unduly because a weekend or a public holiday fell during the flow of a process.

At a more sophisticated level it is in order to provide opportunities to smooth the flow of work and not have weekends and public holidays interfere with the end-to-end process measurement.

What this fails to recognize though is that some people are permanently connected to their work and very attentive around workflow tasks and will approve stuff at all hours of the day.

This means that even if the calendar says they are booked off work (past siren time), enjoying their weekend, or even on holiday, if they choose to take action on a workflow task on their phone, tablet or PC, they are effectively confounding the metrics – and worst of all, we don't typically give them credit for their accelerated performance on those occasions. Shouldn't we?

If a workflow process redirects something for action, like a dispatcher, and moves a task to someone else dynamically based on the calendar, then that's a great advantage and a really good reason to have the calendar influence the metrics. But if its only purpose is to provide a process safety net for performance calculations, then I am not convinced of its merits.

A few calendar examples

As a precursor to working in the Middle East I bought a small book years ago entitled ‘Don’t They Know It’s Friday: Cross Cultural Considerations for Business and Life in the Gulf’ by Jeremy Williams. It is a lightweight easy read and for those who work or have worked in one of the Gulf states it is something one can easily relate to.

Working in the Middle East on a Sunday you become particularly aware of the fact that the rest of the world is not working. Correspondence with suppliers, vendors and some customers goes unanswered and you can forget about phoning them.

In this transcontinental business world that we live in, it is not unusual to have business processes serviced by groups in all kinds of geographies. As a consequence, when my day starts, the day is already over for people just my side of the international date line, and those in Hawaii and on the West Coast have long since gone home and may possibly even have gone to bed, with a full equivalent day still to come.

If my business process relies on any of those people to do something, they're not going to pick it up until they start their day again. The reality, though, is that 5–8 hours will have elapsed before they get to the work at hand.
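The arithmetic of those handoffs is easy to sketch. Here is a minimal illustration – the cities, dates and working hours are invented for the example, not taken from any real process – of how a task submitted at the start of one region's day sits untouched for half a day simply because of time zones:

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# A task submitted at 09:00 on a Monday in Dubai; the approver
# works a 09:00-17:00 day in Seattle. When can they even see it?
submitted = datetime(2016, 2, 8, 9, 0, tzinfo=ZoneInfo("Asia/Dubai"))

# The same instant on the approver's clock: still Sunday evening.
local = submitted.astimezone(ZoneInfo("America/Los_Angeles"))
print(local.strftime("%A %H:%M"))  # Sunday 21:00

# Earliest realistic pickup: 09:00 their next working day.
pickup = local.replace(hour=9, minute=0) + timedelta(days=1)
lag_hours = (pickup - submitted).total_seconds() / 3600
print(f"The task waits {lag_hours:.0f} hours before anyone looks at it")
```

No calendar logic is needed to see the lag; it falls straight out of the timestamps.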

Rethinking the process metric

I think there is an argument for rethinking the process metric and going back to the fundamental question of 'why do I have a process metric at all?'. The reason I believe we have metrics at all is that we need to demonstrate progress against goals and service levels mutually agreed within the business, and to have a good handle on bottlenecks, inefficiencies or problems in the process. These could relate to a task or an individual.

The laggard participant, for example, who always takes ages to approve or action things; or the task so complex that it has many subtasks and actions taking place outside the transparency of the workflow.

We want to understand who the laggards are, call them out and try to ascertain why they are delaying the process. We want to understand which tasks are troublesome and look for ways to optimise and streamline.

If we hide behind calendars though, we frustrate transparency on other aspects of work and allow people to game the reporting statistics – this becomes really important if you're paying a premium for a specific service level.

Consider the person who submits the request 5 minutes before the deadline or the person who arrives at work earlier than their colleagues and performs all the work assigned to them in their queue before everyone else even gets started. Does the former get penalized and the latter get rewarded? Is their behaviour transparent in the process metrics?

If our objective is to have a task completed within 24 hours of it being submitted, what is the basis of that 24-hour metric – is it based on Standardised Work? Is it based on assumptions about down-time, cycle times, smoke breaks, lunch breaks and other delays?

The preferred KPI

In some instances 'timing' is factored in. This makes it an 'adjusted KPI' – but sometimes the adjusted KPI is an accepted value, not the preferred KPI. When we walk up to a cash machine we expect to conclude our transaction in just a couple of minutes. We wouldn't be very happy if the machine said, 'Come back in half an hour; I will have your money ready then…'

The preferred KPI would be the amount of time it would take if this were a single-participant process – that's your Standardised Work.

This means that if I have to code and post a 1,000-line accounting document in the system of record, I should be able to do that in, say, an hour from the moment I receive it – assuming I have all the data I need to do the job properly and I am not doing something else at that moment. It also means that in an 8-hour day I should be able to process at least 8 of them. It follows that if my group receives 240 of these documents per day, I need 30 people to get through the work within the SLA.
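That staffing arithmetic, using the same assumed figures, can be written down as a quick sanity check:

```python
import math

docs_per_day = 240    # documents arriving per day
hours_per_doc = 1     # Standardised Work: one document per hour
working_hours = 8     # hours in a working day

docs_per_person = working_hours / hours_per_doc       # 8 documents/day each
headcount = math.ceil(docs_per_day / docs_per_person)  # round up: people are whole
print(headcount)  # 30
```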

In reality though, if I receive all 240 documents in the last hour of the work day, I am not going to get the work done within that hour. You could argue that that same staggering will likely occur tomorrow too, but as we all know, work doesn’t always flow that way. This is one of the reasons why another measure often gets introduced in the form of a business process management metric that tracks cadence or flow and alerts on extraordinary peaks and troughs. Interestingly, these too become subject to the calendar.

A further complication occurs when the period end arrives, which then overrides all the KPIs and simply states: 'Here's the date by which it all needs to be done!'

So what I would really like to see is the calendar discarded altogether, and acceptance that if a process took 50 hours to complete because a weekend fell between submission and completion, that is how long it took – we may have to get granular with the classification of the time itself, but that's another matter.
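A sketch of what that 'no calendar' metric might look like: report the raw elapsed time, and classify the weekend hours rather than subtracting them away. The timestamps are invented for illustration:

```python
from datetime import datetime, timedelta

start = datetime(2016, 2, 5, 15, 0)   # submitted Friday 15:00
end = datetime(2016, 2, 7, 17, 0)     # completed Sunday 17:00

# Raw elapsed time: no adjustments, this is just how long it took.
total_hours = (end - start).total_seconds() / 3600

# Granular classification: count (rather than subtract) the hours
# that fell on a weekend. Monday is 0, so Saturday/Sunday are 5/6.
weekend_hours = sum(
    1 for h in range(int(total_hours))
    if (start + timedelta(hours=h)).weekday() >= 5
)
print(total_hours, weekend_hours)  # 50.0 41
```

The 50 hours stays on the record; the 41 weekend hours become an attribute of it, so nothing is hidden and cadence issues still show up.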

In aggregate it will all smooth out anyway but at least it will eliminate or flag up any work cadence issues. Today that may not be very transparent with the dependence on calendars.

In the end we all have to own the end-to-end quality of the work we do, even when we have to rely on others to help us achieve our goals.



January 29, 2016  11:54 PM

A join, a join, my kingdom for a join…

Clinton Jones
IN MEMORY, Query, SAP, Winshuttle

I was quietly minding my own business when one of our sales account managers over at Winshuttle sent me an unsolicited email suggesting that I make myself available for a couple of hours to walk a prospect through a business problem and see how we could solve the challenge with a product or combination of products.

I don't shy away from challenges, but without more context I mentally rolled my eyes and thought, oh boy! The next thing I saw was another calendar invitation from a solutions engineer suggesting that I join a 'discovery' call with the prospect and dig into what exactly they wanted to be able to do. At this point, you can imagine, more eyes rolled heavenward as I wondered what I had let myself in for.

In the world of SAP there are four kinds of people: the know-it-alls, the know-nothings, the know-a-bits and the fakers. In my time working with the product I have seen and danced with all four. If I were to classify myself, it would be in the 'know-a-bit' class. I am not downplaying what I know; I just know that there's an awful lot I don't know, and that chasm grows wider every day – or rather, the bit I know seems to become increasingly diminutive the more I learn.

So when you come across someone who has played with SAP for some time, you're always a little wary; after all, the more years of experience the person has with SAP, the more likely they are to know some stuff. Play along with me and you will see what I mean.

So on the call I am presented with a problem, a problem that interestingly, appears on SAP’s SCN too!

Dear… how to find in active customers list? ex : i gave given open items  days 3 months in ova8 – suppose if i want to know which customers are not paying money for nearly 3 months then how to find inactive customers list or give me any table

Ok, so the English is godawful and broken, but you get the idea. How do I identify active customers? Or rather, in this case, how do I identify inactive customers? You would think this is an easy thing to ascertain – after all, doesn't SAP deliver everything? Well, if you look at that SCN article, evidently not.

The clue comes in the posting that says:

Technically Inactive Customers means the Customers who have done no transactions (Sales / Payments) with business since long time. While what you need is the list of Customoers (sic) who have not made any payments since last three months. For this I believe FBL5N will be a good report for get the necessary details. Secondly you can also approach your FI Consultant. He can pull data of Incoming Payments from relevant tables & get the list of Customers who have paid in last three months. Which will means that all the others Customers have not paid.  Also try T-Code FD11 which will give Analysis for Customer’s Account.

And now we come to the critical part, “approach your FI Consultant. He can pull data of Incoming Payments from relevant tables & get the list of Customers“.

This is basically where the conversation with this prospect started.

Effectively he tells us that one of his IT staff took two days to dump the data from the relevant tables and then analyse the data to arrive at the answer. How would you do it?

Again, being pretty lazy, I didn't want to spend too much time working out exactly what I needed, so I begged forgiveness and ignorance, asked the caller for the table names and fields, and a day later I received them.

The issue here is that when you build a query of joined tables, the relationships between the tables are generally by some common key, like the customer ID. The further challenge is that in certain instances – sales invoices vs. payments from customers – the customer ID may be common but the document IDs in SAP are not. Every payment gets its own document ID even though it may relate to a sales invoice ID. So I started simply enough by joining customer to BSAD and then customer to BSID; because I knew about this document ID challenge, I guessed that an outer join might be better. Unfortunately I only get to use one outer join, because the Winshuttle Studio Query module allows a single outer join to minimize the impact on system performance.
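The reason the outer join matters can be shown in miniature. A sketch in pandas – with simplified stand-ins for the SAP fields (KUNNR = customer, BUDAT = posting date) and invented data, not a real extract – shows what an inner join would silently lose:

```python
import pandas as pd

# Toy customer master and a toy cleared-items table.
customers = pd.DataFrame({"KUNNR": ["C1", "C2", "C3"]})
bsad = pd.DataFrame({
    "KUNNR": ["C1", "C1", "C2"],
    "BUDAT": ["2015-10-01", "2015-12-15", "2015-06-30"],
})

# A left outer join keeps customers with no items at all -- exactly
# the rows an inner join would drop, and exactly the customers we
# are hunting for when looking for inactive accounts.
joined = customers.merge(bsad, on="KUNNR", how="left")
never_active = joined[joined["BUDAT"].isna()]["KUNNR"].tolist()
print(never_active)  # ['C3']
```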

BSID and BSAD are the tables in SAP with open and cleared items of customer accounts. Think of them as a mashup of the primary accounting data also found in BSEG and BKPF. Again, with credit to SAP's SCN articles:

So the main tables of the FI document are BKPF and BSEG; if BSEG-KOART (account type) is D, it means it's a customer item, so you can find that item in BSID or BSAD.

By using BSID and BSAD it is possible to find customer items faster than by directly reading BKPF and BSEG, because KUNNR (customer) is not a key field in BKPF or BSEG and these tables can be massive. So you can consider BSID and BSAD secondary index tables that improve performance when you need to pick customer data directly. Additionally, they are transparent tables – not that you should worry too much about this, but it effectively means that as they appear in the dictionary (as tables), so they are in the database – and with company code and customer number as keys they are really easy to use. You can read more on the other types of tables here – all of which, incidentally, can be used and joined to with a Winshuttle Query.
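The secondary-index idea can be sketched in miniature: if customer is not a key of the big table, finding a customer's documents means a full scan, whereas a smaller table keyed by customer makes it a direct lookup. The data here is a toy stand-in, not a real SAP structure:

```python
# BKPF-like table keyed by document number; customer is NOT a key.
bkpf = {
    "0001": {"KUNNR": "C1"},
    "0002": {"KUNNR": "C2"},
    "0003": {"KUNNR": "C1"},
}

# Without an index, finding C1's documents means scanning every row.
scan = [doc for doc, row in bkpf.items() if row["KUNNR"] == "C1"]

# A BSID/BSAD-style secondary index keyed by customer turns that
# scan into a direct lookup.
index = {}
for doc, row in bkpf.items():
    index.setdefault(row["KUNNR"], []).append(doc)
lookup = index["C1"]

print(sorted(scan))    # ['0001', '0003']
print(sorted(lookup))  # ['0001', '0003']
```

Same answer either way; the index just avoids touching every row of a table that, in a real system, can be massive.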

So, returning to my problem: joining these tables and extracting them in real time meant that I could identify the active customers and their last activity, but the rest didn't really work, because every event potentially had more than one entry, and in the end I wasn't getting a very satisfactory result.

So I bit the bullet and created a single-table query script for each table, but then added the sweetener – I linked them together as a chain and bound each query to a different sheet in the workbook.

Ok, so be aware that I am looking at a limit of roughly a million rows per sheet in Excel (1,048,576 to be exact), but hey! I am working with IDES data, and even the best of our organ grinders can't make that much stuff in these tables unless they really want to.

My queries ran pretty quickly – less than 30 seconds each – dumping KNA1, KNB1, BSID and BSAD. I could have joined the first two; after all, I only needed the first one for the customer name, but in the end it was a quick run anyway. I needed KNB1 because the same customer can appear under different company codes, so knowing the valid combinations was important.

All told, about 22k customer/company code combinations, a few thousand BSID entries and about 30,000 BSAD entries. I extracted the document posting date because that was the key to the last activity.

Now that I had the data, what next?

Well, this is where the neat little Winshuttle Analytics add-on for Excel kicked in. Converting all of the Excel table extracts to in-memory tables with WS Analytics, I now had the ability to perform some ad hoc inquiries.

The first of these was: how many unique combinations of company code and customer ID were in BSAD? Next, what was the latest entry in BSID and BSAD for each of those combinations? Building the queries took seconds; casting them down the combinations, a few seconds more. The next part of the challenge was doing the same against the KNB1 table, because in the end I wanted to identify the dormant accounts. The casting took a little longer: Winshuttle Analytics is effectively a custom plug-in for Excel that brings in-memory tables and data pivots on steroids to the table, but I am still hamstrung by the speed of the Microsoft Excel calculation engine, and sometimes, to be quite frank, it can be a dog. I got there after a few minutes and then added a few more tweaks. I developed a third formula in Excel, one that would assign a deletion candidate indicator. This I could use with a transaction recording to flip the record deletion flag in the SAP master record.
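The gist of that last-activity-plus-deletion-flag step can be sketched in plain pandas (which is not what Winshuttle Analytics uses internally – field names, dates and the cutoff here are illustrative only): take the latest posting date per company code/customer combination across the item tables, join it to the KNB1-style combinations, and flag anything with no recent activity:

```python
import pandas as pd

# Toy stand-in for the combined BSID + BSAD extracts.
items = pd.DataFrame({
    "BUKRS": ["1000", "1000", "2000"],   # company code
    "KUNNR": ["C1", "C2", "C1"],         # customer
    "BUDAT": pd.to_datetime(["2016-01-20", "2015-08-01", "2015-12-31"]),
})
# Toy stand-in for KNB1: the valid company code / customer combinations.
knb1 = pd.DataFrame({"BUKRS": ["1000", "1000", "2000", "2000"],
                     "KUNNR": ["C1", "C2", "C1", "C3"]})

# Latest posting date per combination.
last = items.groupby(["BUKRS", "KUNNR"], as_index=False)["BUDAT"].max()

# Left join so combinations with no items at all survive (BUDAT is NaT).
merged = knb1.merge(last, on=["BUKRS", "KUNNR"], how="left")

# Deletion candidate: no activity at all, or none in the last three months.
cutoff = pd.Timestamp("2015-10-29")  # "today" minus three months
merged["DEL_FLAG"] = merged["BUDAT"].isna() | (merged["BUDAT"] < cutoff)

dormant = merged[merged["DEL_FLAG"]][["BUKRS", "KUNNR"]].values.tolist()
print(dormant)  # [['1000', 'C2'], ['2000', 'C3']]
```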

The whole exercise took about an hour – pretty good when you compare that with two days!

I showed this to our visitors. I was disappointed that I hadn't been able to do it all with a query, but in the end it didn't make much sense to do it all with a query. The alternative would likely have been a long-running ABAP program, but I managed it with an hour of thinking and doing, nothing more than a couple of desktop tools and, most importantly, no changes to my SAP system!


June 1, 2015  11:03 AM

SAP Security Authorizations – the devil is in the details

Clinton Jones
ABAP, authorizations, PFCG, SAP Fiori, Security

I’ll be the first to admit that my knowledge of SAP security authorizations is lightweight.

I know enough to be dangerous, but not enough to be anything close to a ninja or a genuinely knowledgeable person. If I am confounded by something, invariably I have to read up about it or find an SME to help me.

What I can tell you, though, is that in my present and past lives, SAP security authorizations have been the bane of the lives of users, support personnel and developers alike.

I am not entirely sure why this is the case.

My long-held suspicion is that this has a lot to do with mistakes, omissions and explicit decisions made over the years with respect to the way solutions and applications have been architected. Unfortunately, a holistic view of the security authorization model is often absent among developers; they rely on the auditors and the authorization framework to determine what can and cannot be done safely – not a very good approach, and one fraught with risk.

The situation is further compounded by the absence of robust knowledge and subject matter expertise among developers as a whole. The latter is one of the reasons why the annual security and controls audit that many companies are subject to is an equally painful experience. The auditors come in, run a bunch of scripts, dump a bunch of data to files and then run away with them, returning some weeks later with a list of questions the length of one's arm. It is something I became quite good at handling, but nonetheless something I really hated – a massive time suck.

When we implemented BW and the Portal, admittedly some years ago, we found our ABAP security authorizations model became infinitely more complex. There were security and authorization settings specific to those environments too! Then came SAP CRM and, again, authorizations had to be set up for that as well. With the growing adoption of HANA and the promise of S/4HANA, the question will come up as to whether the security model will change and to what extent. It seems, for security at least, there’s more complexity coming…

SAP Fiori Security Authorizations

SAP Fiori applications communicate with the ABAP stack through OData services. In addition to ABAP authorizations, users must be granted authorization to access the HTML5-based Fiori applications and the OData services in SAP NetWeaver Gateway. SAP Fiori applications therefore require users and roles in SAP NetWeaver Gateway.

An SAP NetWeaver Gateway PFCG role contains the start authorizations for OData services. SAP doesn’t deliver these roles to customers.

The SAP Fiori Launchpad relies on the authorizations in the ABAP back-end server and the authorizations to access the OData services in the SAP NetWeaver Gateway in the ABAP front-end server. Configuring these relies on Launchpad catalogs and UI PFCG roles.

A technical role is needed for the SAP Fiori Launchpad, and SAP delivers a predefined set of technical objects for it, in particular the required profiles. For your own catalogs, these and possibly other technical objects are also required. The UI PFCG roles bundle all the front-end privileges required to execute Fiori apps. Through user assignment to back-end roles, additional privileges are provided to execute the specific applications or access the OData services. The powerful ‘fact sheets’ require authorization roles in the front end, which grant users authorizations for the ICF services and business server pages (BSP) that support the fact sheets.

Simple? Not a chance…

It could be that things have started to simplify, but I have some doubts; or rather, I am not optimistic. This all suggests ‘more complex’. Simplifying things on the front end will mean more complexity on the back end.

I was reminded of this again this week when discussing EHS/EHSM with a consultant. We were debating the merits of using BAPIs and RFMs (remote-enabled function modules) to create and maintain EHS data, and considered the implications for the security authorization model; after all, we wouldn’t be using transactions for the automations. In the end we concluded it might be OK, but the question, ultimately, will be whether or not this stands up to audit scrutiny.

If you wonder what all the fuss could be about, you only have to trawl over to SAP SCN and read some of the many discussions on security and authorizations.

Conversely, if you’re the kind of person who likes to learn through self-study, you might want to make a small investment in Andrea Cavalleri and Massimo Manara’s book from Galileo Press, 100 Things You Should Know About Authorizations in SAP (ISBN 9781592294060). For the ABAP developer, a slightly older but indispensable guide is Authorizations in SAP Software: Design and Configuration by Volker Lehnert (ISBN 9781592293421). Both are available at SAP Press and through Amazon. Neither covers the Fiori model; for that you will need to look elsewhere, such as SAP Help.

Unfortunately the SAP Fiori Security Guide that was originally published as a PDF has been taken down from its original location, perhaps because it is a bit out of date, so again, refer to SAP Help for more information.



April 28, 2015  2:17 PM

Bad bad bad ABAPer

Clinton Jones Clinton Jones Profile: Clinton Jones
ABAP, Inventory, SAP

My eldest son says that I am ‘very judgemental’ – I don’t really have a defence and though I do tend to complain a lot about stuff, the truth is that there is a lot of stuff out there to complain about!

You can guess, then, where this post is going. It’s another rant about bad programming and poorly applied thinking about technology. I’ll concede that things have changed immensely in the years since SAP was first let loose on the unsuspecting business community, but you would be surprised how much new bad code crops up.

Yesterday I was posed the challenge of trying to build automation around a multi-headed gorgon in the form of the SAP transaction MI10. It’s one of those peculiar transactions that shouldn’t be used much but which probably gets used quite a lot, given that inventory sometimes gets damaged, lost, or mysteriously grows legs and walks out of the warehouse.

At face value, the transaction is probably not altogether terrible, but it is when you discover that certain complex scenarios make it super difficult to work with.

Getting physical with your inventory

To automate any kind of process for more efficient use, you need to look for repeatable patterns that reduce the amount of logic needed to stream the data. In this instance, if you are making use of serialized inventory and something needs to be stated or changed in relation to it, you need to abide by your system’s accounting rules and provide the write-downs and write-offs with as much information as possible in order to avoid an audit exception note.

With tight margins, efforts to constrain inventory and free up working capital, the physical inventory process is key to ensuring that you have in stock what your system says you have in stock, and if you have variances – account for them.

MI10 can be used as an umbrella transaction for the equivalent activities of taking and recording physical inventory and making adjustments. SAP supports breaking this process into three task streams but also allows you to unify them in MI10. All well and good. At the time of making the postings to adjust inventory, you should know the physical quantity of items you wish to adjust for. In MI10 this quantity then dictates how many fields need to be populated with data, by rendering a screen with as many field spaces for serial numbers as there are items to adjust for.

This would be a fine approach, except that the ABAP program doesn’t make use of an ALV grid; instead it presents up to 19 fixed fields which are used to check against existing serial numbers. The screen is pretty dumb. When you have more than 19 items to enter, you also have to scroll down to get to more fields.

For a human entering the data this is fine, for the most part, but transcription errors are easy to introduce and manual entry is terribly inefficient.
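If you were scripting this screen anyway, the pagination the fixed layout forces on you can be sketched in a few lines; a minimal illustration (the 19-field page size comes from the screen described above):

```python
def serial_pages(serials, page_size=19):
    """Split a list of serial numbers into screen-sized chunks, mirroring
    the fixed block of 19 fields the MI10 serialization screen presents
    before you have to scroll for more."""
    return [serials[i:i + page_size] for i in range(0, len(serials), page_size)]

# A 40-serial adjustment would need three screen pages: 19 + 19 + 2 entries.
```

Any screen-driving automation (BDC or GUI Scripting) would then fill one page per chunk and scroll between them, which is exactly the kind of extra logic an ALV grid would have made unnecessary.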

It’s unfortunate that this is such a difficult transaction to work with, and unfortunate that serialization was apparently tackled as an afterthought in the way the program was enhanced.

When I tried to automate this process using an alternative method, say with a BAPI, I found one – BAPI_MATPHYSINV_POSTDIFF – but unfortunately it doesn’t support the important serialization aspect. Weird, since BAPI_MATPHYSINV_COUNT does…
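For non-serialized stock, at least, the BAPI route is workable from outside SAP, for example via the open-source pyrfc library’s `Connection.call()`. The BAPI names below are real; the parameter and field names are illustrative assumptions only, so verify the exact interface in SE37 for your release:

```python
# Hedged sketch: posting physical inventory count differences over RFC.
# `conn` is assumed to be a pyrfc.Connection (or any object with the same
# call(name, **params) shape), so the function itself stays testable offline.

def post_inventory_differences(conn, doc_number, fiscal_year, items):
    """Post count differences for a (non-serialized) physical inventory
    document. BAPI_MATPHYSINV_POSTDIFF has no serial number table, which
    is exactly the gap discussed above."""
    result = conn.call(
        "BAPI_MATPHYSINV_POSTDIFF",
        PHYSINVENTORY=doc_number,   # illustrative parameter names
        FISCALYEAR=fiscal_year,
        ITEMS=[{"ITEM": i["item"], "DIFF_QTY": i["qty"]} for i in items],
    )
    # BAPIs follow the explicit-commit transaction model
    conn.call("BAPI_TRANSACTION_COMMIT", WAIT="X")
    return result.get("RETURN", [])
```

The returned RETURN table would then be checked for error messages before trusting the posting.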

So it’s disappointing, to say the very least. Looking out on SCN, it seems the only way around this is to once again get an ABAPer to write something to fix an inherent defect in the thinking around this important business process.

March 23, 2015  11:06 AM

Training employees in how to use SAP

Clinton Jones Clinton Jones Profile: Clinton Jones
eLearning, IT training, Job training, SAP

It’s a conundrum that has pestered business managers running SAP for decades.

How do you ensure that your employees have just the right level of understanding and skills to meet the demands of the job, while keeping them motivated and assuring data quality?

Michael Management (MMC), a specialist SAP eLearning company, revealed some interesting characteristics about how SAP users feel about their training in its most recent survey.

Almost half of the respondents (44%) indicated that they preferred self-paced online eLearning, and 43% indicated that they hadn’t received sufficient training to perform their job responsibilities adequately. Almost a third indicated that they should have received 41 hours or more of training in the last year to do their jobs properly, and yet hadn’t received it. MMC also revealed that one of the reasons the training doesn’t happen is that for more than a quarter of the respondents there isn’t enough budget for training, and another quarter indicated that there was nothing particularly useful or appropriate on offer. With only a quarter indicating that there isn’t enough time for training, what choices are left?

On the job training

On-the-job training seems to be the most common approach to getting new employees up to speed, but with this being heavily dependent on how well processes and procedures are documented, as well as the relative effectiveness of the person giving the OJT, it should be little surprise that employees are frustrated or demoralized. Larger, more sophisticated organizations have comprehensive and extensive training programs blending job shadowing, classroom instruction and eLearning, but it can be a challenge keeping this content current and contemporary.

One of the ways to minimize the dependence on training programmes to assure data quality and business process efficiency is to automate as many processes as possible, or to reduce processes to simple procedures that can be easily understood and leave little to interpretation or discretionary decision making. Businesses don’t necessarily want to employ automatons, but at the same time they do want to ensure that employees cross all the t’s and dot all the i’s in the execution of their job tasks.

Another survey, by ON24, a webcasting and virtual communications solutions company, indicated that more than half of entry-level employees need explicit skills training, and that the best-trained industry segment is medical and pharmaceutical, where half of respondents indicate that they receive good training. While almost half of all the respondents reported that training is a top priority in their organizations, more than three quarters also feel that training helps them do their jobs better, improving company performance and advancing personal career goals.

SAP implementation projects

Training as part of a new SAP implementation is pretty commonplace; it is usually an integral part of the project delivery and gets incorporated into the overall budget of the implementation. But long after the project has ended and the consultants have moved on, how does your organization ensure that employees know what they are doing, and why, to ensure data quality and proper business transaction processing?

Many of the companies that I interact with indicate that the appropriate use of SAP is not a core focus of training for existing or new hires. What they do indicate is that data quality and operational data processing challenges continue to plague the business, so much so that they are scurrying to implement more robust approaches to data gathering and data maintenance in core systems of record like SAP.

By implementing increased oversight and governance to the end to end data management process, particularly for master data and master records objects, they are obviating the need to expend extraordinary amounts on expensive training and education programs for employees.

Your own army of SAP consultants

It’s my opinion that spending money on training employees in how to use an infinitely configurable ERP solution like SAP is a bit of a double-edged sword. When you educate your employees about ERP capability through formal classroom education programs with certification at the conclusion, you are grooming some of your potentially best employees to ultimately become SAP consultants. If you’re a consulting company, or have other ambitious ERP projects, then fine.

If you can retain them and leverage that learning for future projects then that is all well and good, but invariably those very classroom settings, expensive as they are, don’t teach your employees much about the way you run your business and why certain processes and procedures exist. You’re better off investing in some very targeted eLearning and using that to explain concepts and principles that are relevant to your business and how it uses SAP. At Winshuttle this is further augmented by periodic webinars, online training courses and webcast content.

The difference between what you save with appropriate eLearning programs and what you would spend on expensive proprietary classroom courses is better invested in lean data management approaches to automating and guiding processes and procedures.

Mechanisms such as automations with templates and workflows assure data quality, process governance and ensure compliant data management processes are in place and adhered to.

March 9, 2015  1:53 PM

Mass Materials Change

Clinton Jones Clinton Jones Profile: Clinton Jones
AppData, BDC, BODI, BODS, ERP, LSMW, SAP, Winshuttle

There are many options available for mass data maintenance in SAP, and while third-party applications that connect over RFC are not necessarily the ideal method for mass actions in SAP, the other options tend to be either very technical or very limited.

LSMW is quite frankly a sorely abused tool that auditors complain vociferously about. IT has no business messing with data in productive systems and business users have no business having access to LSMW.

The mass transactions and even the BAPIs on offer natively in SAP, including the newer and revised ones, lag behind the capabilities of the configured or customized transactions in ERP itself, and often never get configured or modified to align with what the business needs, so they simply never catch up.

It might be proposed that the business consider BODI/BODS, but this quickly falls into the realm of heavy technical lifting: architecting and configuring a complex solution with a long tail of deployment before the business can actually get going. Besides, BODS/BODI are often exclusively tools in the domain of IT or technically proficient users. If, in the intervening period between selection and implementation of such a solution, the business pivots or changes direction, how quickly do these solutions realign with the requirements of the business? My experience, and that of others, suggests that they take a long time, and that maintaining them comes at tremendous technical and operational expense.

In the absence of something better, the business is back to an agile solution that can quickly connect to the configured transactions in SAP ERP (like MM01/MM02), calling them over RFC and filling the screen fields in the background (BDC) or interactively (GUI Scripting). Performance and throughput hinge heavily on several factors, not least of which is where the application is positioned and the resources committed to the task.

Proximity is everything

The closer the client is to the SAP applications, the faster the throughput (you’re not then dealing with network latency as a degrading factor). Also consider running these sessions against a batch server rather than just any old application server on the front end. If the RFC transaction calls have to compete with regular dialog users in their daily work, then someone or something is going to have a bad performance experience; the batch processor is the more tolerant party, but the wetware (people) tend not to be…

The selected scripting method also has a tremendous impact. If you use GUI Scripting, the process is going to be dog slow as each screen gets painted and each field filled; this classic screen scraping should be avoided as much as possible.

While BDC over RFC is not as high-performance as LSMW, you reap benefits in other ways: the user who owns the data gets to load the data (not usually an option with LSMW), and the errors (if any) show up in the Excel workbook or Access database in line with the records you are trying to change or create. Using other methods, you often have to convert the data to CSV or text, load the file, and then try to reconcile the items to the log – not pretty, and also pretty painful.
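That row-aligned error reporting is simple to sketch in code. A minimal illustration (the `post` callable stands in for whatever RFC/BDC call applies one record):

```python
def apply_rows(rows, post):
    """Attempt each record and keep the outcome alongside the source row,
    mimicking how a workbook-based loader reports errors in line with the
    records instead of in a separate log you have to reconcile."""
    results = []
    for row in rows:
        try:
            post(row)                     # e.g. one RFC/BDC call per record
            results.append({**row, "status": "OK", "message": ""})
        except Exception as exc:
            results.append({**row, "status": "ERROR", "message": str(exc)})
    return results
```

Because each result carries its source row, a failed record can be corrected in place and resubmitted without touching the rows that already posted.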

Finally I will add that abuse of these third party tools is also often a challenge seen in the field, consider where they are positioned.

Transaction recorders are not recommended for multimillion-record data sets. Do they get used for them? Sure. But should they be? Probably not.

My experience is that once you hit about 30,000 rows of data, the question becomes: how long are you prepared to wait for this job to run? With throughput of around 30 fully loaded materials per minute using a single BDC session on a small ECC box, can you afford almost 1,000 minutes (16–17 hours) of load time?
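The arithmetic behind that figure is worth keeping handy as a back-of-the-envelope check (the ~30 materials/minute rate is from experience, not a guarantee):

```python
def load_time_hours(rows, rate_per_minute=30):
    """Estimated wall-clock hours for a single BDC session at a given
    throughput; 30 fully loaded materials per minute is an assumed rate
    observed on a small ECC box, so substitute your own measurement."""
    return rows / rate_per_minute / 60

# 30,000 rows at 30/minute is 1,000 minutes, i.e. roughly 16.7 hours.
# Splitting the workload across, say, four parallel sessions would cut
# that to about 4.2 hours, if the server can absorb the load.
```

The same function also makes it easy to ask the question the other way round: given the batch window you have, how many rows can a single session realistically handle?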

Do you have thoughts on better ways than these?

February 20, 2015  3:20 PM

Getting ready for SAP S/4 HANA

Clinton Jones Clinton Jones Profile: Clinton Jones

The SAP press engine wheeled out a big gun this month with the announcement of S/4HANA, pitched as the successor to SAP ERP; Hasso Plattner and Bill McDermott both made statements about the importance of the announcement. While there still seem to be a great many unanswered questions about how S/4 will work and run, one thing is clear: SAP has been preparing for this announcement for some time. While the specifics may still be very murky, we know a few very specific things.

The first fact we know is that S/4HANA will run on the SAP HANA platform, and the primary user experience will be delivered through the Fiori application layer. For some companies this will be a welcome announcement as they toss coins to decide whether to adopt, stay with or abandon SAP as an ERP platform provider. There can be little doubt that for companies that have made a significant investment in SAP’s flagship ERP product, this announcement holds the promise of a new lease of life for SAP’s aging applications.

The HANA platform was the first horse to be presented from the new SAP stable, around 2010, with the SAP Fiori UX formally released a short time later and now boasting some 400+ role-based apps providing a personalized, responsive and simple user experience.

Certainly the customers I speak to are all over the place in terms of their hopes and aspirations for HANA. Some are great advocates, having already moved off a dependency on one of the traditional RDBMSs for the underlying database, but few if any talk about applications of their own developed on the HANA platform. Internally we have played around with the standard APIs and demonstrated an ability to write directly to the tables, but without clear use cases it seems a little premature to put much more effort into this.

What does hold promise is the announcement that SAP S/4HANA will be offered in the cloud, on premise and as a hybrid deployment, to provide maximum choice to customers. SAP Simple Finance was the first SAP S/4HANA solution to be offered, announced in June 2014. While the specifics of the next-generation solutions seem a little vague, I did reach out to some of the more prominent advocates of HANA to determine exactly how customers could transition to an S/4HANA experience today, even with only Simple Finance 1.0 on offer.

An important component that has to be considered is the SAP Landscape Transformation Replication Server, which you can learn more about through the Application Operations Guide targeted for customer consumption.

SAP Landscape Transformation (SAP LT) Replication Server is technology that allows you to load and replicate data in real-time from ABAP or non-ABAP source systems to SAP HANA environments. LT Replication Server uses a trigger-based replication approach to pass data from the source system to the target system.

The Replication Server can be installed either as a separate SAP system or, if you have enough iron, on any ABAP source system.

The following graphic outlines the basic concept and the typical landscape (for an ABAP source system) using the trigger-based data replication approach of the SAP LT Replication Server.


In order to replicate data, the Replication Server needs to be configured to load and replicate data from one source system to up to four target database schemas of HANA systems. Data can be replicated either in real time or on a schedule. When using trigger-based data replication, table-based retrieval of data occurs from the application tables in the source system (or source systems). Transformation rules can also be defined for the data replication process.

As you can see, the integration from the application tables to the Replication Server is by way of RFC, but between the Replication Server and a HANA target system the connection is achieved using a DB connection, and so is likely to perform well.

Using RFC

Typically on ERP systems, each time a connection is established, logon data and other system parameters, such as character code pages, are exchanged, which causes a load of approximately 2.5–3 KB on the network each time a user logs on to the SAP system. For example, when you use a BDC session with Winshuttle Transaction to send data from Excel, the initial logon tends to take some time.

The largest amount of overhead for an RFC transmission occurs when making the connection (or calling a function such as a transaction). The size of the data blocks is configurable, but there is a whole art to choosing the optimal block size for the payload you intend to exchange over RFC; you often need to make a compromise between the size of the data block and the manageability of the data. SAP’s own tests have shown that for data of approximately 100 KB and greater, the overhead of making the connection or calling the function is negligible.

SAP says large amounts of data only produce additional improvements in performance when ‘very good data compression programs’ are used, one can only assume that the mechanisms used for the LT replication fall into this category.
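Putting rough numbers to this: using the ~3 KB connection-setup figure quoted above as an assumption, the setup cost as a share of total wire traffic falls away quickly as payloads grow, which is exactly why the 100 KB threshold makes the overhead negligible:

```python
def rfc_overhead_fraction(payload_kb, setup_kb=3.0):
    """Share of total bytes on the wire spent on connection setup.
    The 3 KB default is SAP's published approximation for logon
    overhead, not a measurement of any particular system."""
    return setup_kb / (payload_kb + setup_kb)

# For a 1 KB payload the setup dominates (75% of traffic);
# at 100 KB it drops below 3%, effectively negligible.
```

This is also the intuition behind batching many records into one RFC call rather than making one call per record: the fixed cost is paid once instead of thousands of times.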

Certainly, returning to the considerations of finance: if your intent is to continue running your financials on ERP, and you plan to do more analytics and reconciliation on an S/4 system today, then you will need to set up replication of BSEG, BKPF, BSIS and BSAS to the S/4 system as a minimum. If you have big data in these tables, the replication may not perform very well unless perhaps you are running on a HANA database.

I have been told that in Simple Finance 1.0 the standard accounting interface BAPIs are used, but in 2.0 there are dedicated posting programs, so for existing SAP customers, trying out S/4 today on Simple Finance 1.0 may not be as difficult as one thinks. You can read more about the possibilities for integrating ERP finance with S/4 in the blog post by Jens Krueger, along with more on how to optimize RFC.

January 14, 2015  2:55 PM

I said you could save time and money – here’s an explanation on how

Clinton Jones Clinton Jones Profile: Clinton Jones
Data, ERP, IT Automation, ROI, SAP R/3

Over the years I have gone to great pains to explain how businesses can save time and money by automating business processes in systems of record like SAP, but despite this, there are some who doubt that real time and money savings can be obtained through automation of things like transactions.

The power of preparation

The approach is actually quite straightforward: if you already have your data in a spreadsheet, the next question is how you get that data into the system.

Many respected experts, industry productivity gurus in particular, will argue that the data should never be in a spreadsheet and should never have found its way into one. This statement is made on the assumption that data processing tasks are all very hygienic and straightforward.

As any business user will tell you, placing orders, making adjustments and mass-changing data is anything but straightforward, and a spreadsheet is often the most straightforward way of collating and compiling data before taking any action.

If you’re taking the output of the spreadsheet and either copying and pasting the data into entry screens or manually transcribing it, consider how much time you are spending on that task. In fact, if you are even converting it into comma-separated values for loading through some tool, consider the amount of time you spend on that, the frequency, and the relative reliability of that approach.

Injecting data into systems is a prescription for improved productivity

Every minute you spend on transcription, copy and paste, reformat, checking and rechecking, could be precious moments that you will never recover.

If you eliminate those steps, accelerate the data injection into the system of record and avoid the likelihood of duplicate entries, omissions and transcription faults then you will reap the reward of time savings.

Multiply the time saved by the value of your time and that’s a part of your ROI. You can test this out with any ROI calculator that allows you to define a given process.
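A minimal version of that calculation can be written in a few lines; all the inputs are yours to estimate for your own process:

```python
def annual_saving(minutes_saved_per_run, runs_per_month, hourly_rate):
    """A minimal ROI sketch: time saved per run, times frequency, times
    the value of the user's time. The figures in the example below are
    purely illustrative, not benchmarks."""
    hours_saved = minutes_saved_per_run * runs_per_month * 12 / 60
    return hours_saved * hourly_rate

# e.g. saving 30 minutes on a task run 20 times a month, at $50/hour,
# recovers 120 hours, or $6,000, per year
```

A fuller calculator would add error-rework avoided and opportunity cost, but even this bare version usually makes the case.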

The great thing about using a standardized method for interfacing with your system of record is that it can be used for small and large quantities of data alike, and depending on the nature of the tool you use to create the automation, you may have many automation scenarios for the different tasks you need to perform.

While continuing to use spreadsheets to stage data may seem old-fashioned to some, the reality is that this is one of the most powerful and flexible ways available to prepare data, and while automation doesn’t necessarily save you time in the preparation stage, it will save you time when you finally have to apply the prepared data to your system of record.

