I have a confession to make. The confession is that I love to get mired in funny sayings, words and colloquialisms.
In particular I like to look to the origins or etymology of words and sayings to determine if the original meanings still hold true.
I also love the concept of automation and the notion that automation can bring tremendous value not only in terms of time savings but also in the context of data quality.
Spick and span apparently dates to the late 16th century (in the sense ‘brand new’): from spick and span new, an emphatic extension of the dialect phrase “span new” which has its origins from the Old Norse spán-nýr, from spánn ‘chip’ + nýr ‘new’ – meant as new as a freshly cut wood chip, such as those once used to make spoons.
In a metaphor dating from at least 1300, something span-new was meant to describe something as neat and unstained. The term spick is influenced by the Dutch spiksplinternieuw, literally ‘splinter new’; a spick (a spike or nail) was another metaphor for something neat and trim.
Apparently in 1665, Samuel Pepys used spicke and span in his famous diary. Now why wouldn’t you want some shiny new data?
Just like turns of phrase, though, automation has a root: successful automation depends on good quality data. Yet when organizations look to automate processes, data management and data hygiene are often overlooked in terms of their influence on the success of automation initiatives.
Data quality at the root
Poor data management and data hygiene can happen to the best of organizations; you only have to look at the article about Target Canada’s failed ERP implementation in the context of its North American expansion to understand how.
It can happen to any organization no matter how experienced and no matter how deep the pockets.
When failures happen, scapegoats are sought out, excuses made and the proverbial heads roll. If ever you wanted an example of how bad data can impact operations and reputation – Target Canada is likely a good recent one.
Just looking at customer data you can see how horribly things can go wrong if you have a poor or incomplete plan. According to SiriusDecisions and others:
- 25% of the average B2B database is inaccurate
- 60% of companies in their surveys have an overall data health scale of “unreliable”
- 80% of companies have “risky” phone contact records
- 64% of companies in one survey classified their data as “unreliable”
- 34% described their data as “questionable”
- 61% of companies say over a third of their records are “incomplete”
That’s an awful lot of bad data.
In looking at how you work with SAP application data, there are some key questions that you need to ask.
- What sources are used to create, build, maintain, and update SAP records?
- How do master data and operational teams keep data current and appropriate?
These are mission-critical questions that need a master and operational data management plan. Operating without a plan puts operations and your business reputation at risk and in the end these can all influence growth and revenue!
Some ideas on how to clean it all up
Here are a couple of ideas on how you can stem the tide of growing bad data and clean up what you have.
Establish an application data management strategy
If you don’t have a strategy, it’s time to set one up. Different operational groups tend to have different degrees of maturity around data management strategy. I have found that it varies wildly from business unit to business unit.
Where cost and cost management is a big issue I have found a strong focus on the definitions of the accounting classifications, where the product portfolio is large, products and materials are key. Where the business’ lifeblood was the supply chain, the data surrounding procurement and procurement processes was key.
Prioritize the data you know to be troublesome and then establish a strategy and business case along with a budget to support your plan.
Introduce data health checks
Good data quality can only be achieved by doing a data health checkup: profiling your data, deduplicating, correcting and revising. After you have done the initial passes of data adjustment, the next step is defining a health KPI (Key Performance Indicator) and then regularly checking the quality of your data against that KPI – and reporting on it!
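The health-check loop described here can be sketched in a few lines. This is a minimal illustration only – the field names, the duplicate key and the blended score are hypothetical, not the output of any particular profiling tool:

```python
# A minimal sketch of a data health check over exported records.
# Field names ("name", "postal_code") are illustrative assumptions.

def health_kpi(records, required_fields):
    """Return completeness, duplicate rate and a blended health score."""
    if not records:
        return {"completeness": 0.0, "duplicate_rate": 0.0, "score": 0.0}

    # Completeness: share of required fields actually populated.
    filled = sum(1 for r in records for f in required_fields if r.get(f))
    completeness = filled / (len(records) * len(required_fields))

    # Duplicate rate: records sharing a normalized (name, postal code) key.
    seen, dupes = set(), 0
    for r in records:
        key = (str(r.get("name", "")).strip().lower(), r.get("postal_code"))
        if key in seen:
            dupes += 1
        seen.add(key)
    duplicate_rate = dupes / len(records)

    # One blended KPI to report against, period over period.
    return {"completeness": round(completeness, 3),
            "duplicate_rate": round(duplicate_rate, 3),
            "score": round(completeness * (1 - duplicate_rate), 3)}
```

Running this on each extract and charting the score over time gives you exactly the kind of regular, reportable KPI the paragraph describes.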
Health checks are also important because if your business is growing – so is your data! Engagement in data hygiene activities constitutes an investment in the data – one that positions your business for further automation activities, which in turn can drive sales and revenue.
As part of the strategy budget you will need to make investments in profiling tools and reporting tools as well as mechanisms that can help in the updating and appending of data records in SAP. There are a number of different options that you can look at from companies like SAP and also SAP partners like Winshuttle for both the desktop working with Excel data and the enterprise deployment working with forms and workflow.
Also take a look at the concept of Lean SAP Application Data Management – an approach that involves building MDG (Master Data Governance) and MDM (Master Data Management) solutions without programming, using BUAD (Business Unit Application Development) concepts. Solutions are delivered by and within the business, without necessarily impacting or demanding scarce resources from IT, who are engaged in bigger and more strategic business transformation initiatives.
Establish data management standards
Good data quality needs to be ensured at the start, so catch bad data through a gating procedure before it contaminates the good work that you have already done. Many types of data need regular updates, and data has a natural decay rate – particularly contact data, because people change jobs, companies go out of business, and mergers and acquisitions occur.
All of these events directly impact the effectiveness of your customer and vendor data but they also impact things like procurement contracts, items that you may stock and even the definitions of components that might be a part of a bill of materials for something that your business makes.
Keeping all the records in the database in a healthy state is strategic to maintaining robust transaction processing and good quality business reporting.
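The gating procedure mentioned above can be as simple as a set of field-level rules that a record must pass before it is accepted. A hypothetical sketch – the rules, field names and country whitelist are illustrative assumptions, not any standard:

```python
import re

# Hypothetical entry rules: each field maps to a predicate it must pass.
RULES = {
    "name": lambda v: bool(v and v.strip()),
    "email": lambda v: bool(v and re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v)),
    "country": lambda v: v in {"US", "CA", "DE", "GB"},  # sample whitelist
}

def gate(record):
    """Return (accepted, failed_fields) for a candidate record."""
    failures = [f for f, ok in RULES.items() if not ok(record.get(f))]
    return (not failures, failures)
```

Rejected records go back to the requester with the failed fields listed, rather than landing in the master data and decaying there.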
Engage in routine maintenance and review and – be assertive about it
Application data functions like an engine. Regular maintenance keeps the engine running smoothly and efficiently with the best performance. Data management and hygiene is not a one-time tactical activity – it is strategic and helps keep the business running strong.
Putting off data health checks, and avoiding the hard decisions of gating bad data and engaging in active data maintenance, will directly impact revenue, opportunities, customer interaction and business reputation.
Finally, consider that sometimes, just sometimes, data quality can even decide whether or not someone gets to keep their job.
In my own words, that’s pretty much how I see the Japanese 5S methodology for workplace organization.
The transliterated terms of seiri, seiton, seiso, seiketsu, and shitsuke literally mean sort, set, shine, standardize and sustain and they have some strong parallels to the whole lean methodology and principles of six sigma.
In the context of application data management they have strong relevance, perhaps in some areas more than others.
Consider for example the concept of seiri – seiton, or sort and set. In a workshop this is simply understood as a principle whereby you go through your tasks, tools and materials, prioritize them and then set them up in a way that makes you most efficient and effective. Folks in logistics get this as a way to optimize things like routes, picking, packing and the general movement of resources. Think of it as having the right stuff in the right place at the right time.
With application data management I think you can consider this as a focus on what really matters. I often talk about the cascading effects of mistakes made early on in the data management process having a tremendous impact downstream. This means that master data and master records are often a big focus area of sorting through what is critical and setting it all straight!
A lot to do
There is a lot of master data in any application, but in an SAP system there are a few key bits that have the biggest impact. Of course, depending on your focus, you may choose one set as more important than another. In finance, for example, we care about the bank account details, the postal address and phone number, the contact person in accounts, the validity of the tax documents and perhaps the payment terms, as basic examples.
While we also care about the credit limit, the account assignment and the physical address, these are less important for accounting and finance and more important, perhaps, for sales, procurement and logistics.
For a master data group, that’s a challenge. But it is less of a challenge if you make each of the stakeholders responsible for their own bits. Look to an SAP system and you’ll see that it supports all of these ideas. There’s a transaction for example, that allows you to manage all aspects of a customer or vendor – centrally; and then there are transactions that allow you to manage just pieces.
In the blurred lines of responsibility in the world of SAP – and in fact systems of record in general – the segregation of responsibility for data is not always as clear as a transaction would define it, and so customers are presented with the challenge of somehow reconciling several different ways of working within the constraints of the technology.
This is where having a strategy for collaboration that transcends the limitations and complexities of the underlying technologies is important and this is where template-based and form-based data collection approaches have a role to play.
Many Winshuttle customers start off clearly understanding the importance of sort and set but combine this with shine and standardization – the notion of cleaning up data and ensuring that there is a procedure and policy in place for not only ad hoc data management but also the regular process of new data creation and scheduled data maintenance. Winshuttle refers to the 5S kind of approach to application data management as Lean Application Data Management.
What could go wrong?
Customer data alone is constantly changing. Data decays at an average rate of 2 percent per month, which means you can expect 25 to 30 percent of your organization’s contact data to go bad each year. Basic data goes out of date or is simply incomplete. Looking again to the impact downstream, it makes perfect sense that we understand how important it is to have a good set of routines for managing that data appropriately.
If data management doesn’t have a set of priorities, policies and procedures, then the downstream effect is rework, failures and risk: risk of penalties, losses and reputational damage. SiriusDecisions postulates that it costs $1 to verify a record as it is entered, $10 to correct it later and $100 if nothing is done.
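As a quick sanity check on these figures: 2 percent monthly decay compounds to roughly 22 percent a year, so the 25 to 30 percent figure implies monthly decay closer to 2.5 percent. Either way, the 1-10-100 rule makes the cost of inaction stark. A minimal sketch (record counts are illustrative):

```python
def annual_decay(monthly_rate):
    """Fraction of records gone bad after 12 months of compounding decay."""
    return 1 - (1 - monthly_rate) ** 12

def one_ten_hundred(n_records, bad_fraction):
    """Cost under the SiriusDecisions 1-10-100 rule: $1 to verify a record
    at entry, $10 to correct a bad one later, $100 if nothing is done."""
    bad = n_records * bad_fraction
    return {"verify_at_entry": n_records * 1,
            "fix_later": bad * 10,
            "do_nothing": bad * 100}
```

For a database of any size, verifying everything at entry costs an order of magnitude less than correcting later, and two orders less than doing nothing.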
The fifth S is shitsuke – sustain. Of course this is well understood by Six Sigma advocates in the context of implementing continuous improvements under Plan Do Check Act. This is the review of what you do and have done and the course adjustment. Five ‘S’ is not prescriptive but it is suggestive, it suggests that if you look to your work and understand how well it is aligned with these five principles of effective and efficient work, then you can be more effective and more efficient – get more done, get it done with quality and ultimately reap the rewards of time savings, lower cost, less fire-drills and reduced risk.
I recently wrote a piece on How many SAP systems are enough, over on IT Toolbox.
The catalyst for the piece was largely a discussion with an existing Winshuttle customer that is looking to expand their adoption of Winshuttle technology.
Working with many SAP systems in my experience is fraught with challenges for end-users but a deconstruction of how customers actually work with those SAP systems reveals some interesting characteristics of usage, and usage patterns.
SAP has made it easy to proliferate systems. Most customers have at least three or four, even if they are virtualized, but as soon as you introduce additional functionality, projects and the fall-out of M&A activities, you suddenly find dozens of systems in the landscape – and those are just the ABAP systems; there are the non-ABAP systems to consider too.
In reality what I have found though, is that even amongst the master data teams in a given organization, there are only a small number of systems that are used regularly. Regular users are almost exclusively using one system and only one system. You can consider this, then, their favourite.
While IT generally has many systems to choose from (internally I have more than twenty – and we don’t even use SAP!) – this should not be confused with the way business users use SAP systems. In fact it should be a major red flag to audit and compliance if users are using more than one ERP system. Reconciliations, consolidations and ensuring all the necessary checks and balances are in place for transaction processing can be terribly tricky when you have users using too many systems.
What do you see?
I am convinced that the sampling of customers that I took a look at is representative. In fact, even more interesting is that almost none of them spend any meaningful time with integration solutions outside of productive systems. This behaviour with Excel-to-SAP integration tools must surely represent in some way their behaviour when using the SAP UIs directly.
On a separate note, if you have so many systems, are you aggressively looking to reduce them? If not, why not? One of the oft-stated reasons is that the functionality requirements of the various systems in the landscape are different – that different business units have different needs. I am again not 100% convinced by this argument. Many businesses have successfully consolidated SAP systems or run their global business operations off a single SAP instance.
I am not saying it is easy, I am simply saying it can be done.
A colleague recently posted an article internally.
The Yahoo article described Salesforce and Dropbox and the fundamentally different ways in which these two organizations view the sales organization in the enterprise space.
We have similar challenges in the SAP space at Winshuttle.
As a part of the SAP Nation we offer some very distinctive products that improve how companies use SAP and Salesforce but we are torn between selling to individuals, workgroups, departments, Corporate IT and the office of the CIO. Who is the best target audience for conversations?
Selling operational DNA
My view is that the reality is that when you sell a CRM or ERP concept you’re selling something that is going to be part of a business’ operational DNA.
This makes a big difference when it is compared with selling a file sharing and collaboration utility or any single function utility for that matter. We don’t want to be just thought of as some silly little narrow focus utility – we want widespread adoption and in particular, endorsement by IT – in fact in some bigger companies we have proven our value and earned it!
You don’t get eight out of the top ten most powerful brands in the world as customers by selling a novelty item. For some companies, if Winshuttle stops working, business efficiency is seriously impaired.
In the dim past, before CRM systems as we understand them today, I remember that the ‘smart’ and ‘professional’ account managers and salesmen used to write and keep copious notes about customers. These people gathered business cards and had bulging Filofax binders with business card slots as an alternative to a bulky desk-bound Rolodex.
Some of the biggest software makers like Dropbox often say that they don’t need a huge enterprise-sales team to sell their products to other companies.
Because the company wouldn’t invest in these proprietary solutions and systems, the ‘smart’ guys and gals in sales bought their own systems and proudly paraded them in meetings and around the office.
Eventually, due to genius, pressure, inspiration or clever marketing – all the sales guys had these systems and they were branded with the company logo!
In the accounting department we had a similar thing, the Kalamazoo cheque, payslip and cash analysis systems. We also had desktop calculators, or at least some of us did.
Today it would be a bit laugh-worthy, but those who were serious about their job would either invest in specialized stationery or even buy their own calculators. Calculators back then were prized office accessories and desktop computers were almost unheard of.
If you were super serious you would invest in an adding machine that had a tally-roll – if you could afford it, you could use cash register paper if it had a regular printer head; otherwise you had to buy expensive thermal paper, just like for the fax machine!
Eventually someone decided everyone could be more accurate if they all had calculators, and although the tally-roll versions took longer to reach us, everyone was issued with some desktop calculator device.
When you changed jobs you eventually influenced your new employer to adopt these systems because you’d already been sold on them in a previous role or job.
Dropbox has been successful because in my view corporate IT failed.
Enterprise IT gets a bad rap a lot of the time, but on this topic it has been an abysmal failure at providing a simple yet critical resource: a simple external file sharing and personal file synchronization tool – something that overcomes the limitations of email attachment limits and avoids choking mail-servers.
Corporate IT will say it failed because of budget, security, provisioning and other concerns blah blah blah… but what is the reality?
The reality is that they had nothing decent that was easy to administer and cheap to deploy.
Corporate IT was wrapped up in trying to cope with the vulnerabilities of Microsoft technologies as a whole – if you remember the ILOVEYOU worm, the Pikachu virus, Klez, Zlob and Stuxnet, that was their life ten years ago.
Trying to curtail and manage internet access, dealing with all those viruses, trojans and malware kept them busy. Then also coping with all the management of the standardized back office systems like ERP and CRM, why then would corporate IT care much about file sharing?
They care about file sharing less than they care about Excel spreadsheets – corporate IT in the enterprise just doesn’t get it…
“Why don’t you email it, or thumb-drive it?” they would say… whatever! We found a tool called YouSendIt – unsanctioned for use, but a great single-purpose file transmission tool that worked for a while.
If you were lucky, you managed to get an FTP site but such geriatric resources were horrible to work with and IT didn’t really like having to manage them either.
Dropbox was viral though; word of mouth resulted in it becoming a staple for everyone, and now it has pretty much proven itself and is ubiquitous.
There are still kinks, risks and all kinds of other things that could go awry, but I am sure they will work them all out. The only challenge is that they may have to spend so much on hardening and securing the technology that it loses its simplicity and earlier charm – and of course, if you want all that management and control as a customer, you need to be prepared to pay.
For this ‘utility’ to go the next round – to get broad endorsement, acceptance, sign-off and funding by IT – there have to be changes, and I believe IT will become hard-nosed and resistant. At the moment the use of Dropbox in corporations is still largely a skunk-works operation, or what we call Shadow IT. 500 million users, but how many of those are just ordinary individuals?
Perfect for now
Dropbox may get a couple of beach-head enterprise deals the way it operates today, but as soon as they get too big and don’t align with all the other enterprise things, their capabilities will be curtailed or corporate IT will have its hand forced and clamp down.
I am already seeing this.
I like to get product into the hands of enterprise evaluators as fast as possible.
I share meeting recordings, PowerPoint slides, PDF files, even photos this way. It’s so easy; I love it. But increasingly my business partners are getting blocked by their corporate IT policies. Corporate IT in the enterprise, it seems, is waking up to the fact that by letting Dropbox go viral, they have allowed a vulnerability to creep in. The magnitude of the vulnerability is yet to be determined. Microsoft will tell corporate IT they don’t need Dropbox – that they have SharePoint, they have Office 365, they even have Microsoft OneDrive – they don’t need yet another file sharing tool.
I have OneDrive storage, 30 GB of it; do I use it? Nope…
In fact I am not even sure how to use OneDrive, though it is probably really easy. My Dropbox folder is 8 GB (I managed to co-opt some signups by friends and family and garnered the 500 MB increments for every signup, early on when YouSendIt (now Hightail) started curtailing my ‘fair usage’ and restricting what I could send).
Dropbox serves my purposes, and periodically I clear it out when it tells me I am out of space. Dropbox costs me nothing, and I am neither compelled to upgrade it nor to discard it.
My files are anywhere I can get to the internet and they are on all my devices; it is ‘perfect’ for my purposes. Perfect until the day I am asked for money, or until my business partners can use it no longer. My partners are struggling now, as mentioned – corporate IT seems to be restricting access – maybe I should consider OneDrive?
For Salesforce, just like ERP, I see things a little differently, then.
Both Enterprise Resource Planning and Customer Relationship Management are more than the Filofax and a lot more than a file sharing and synchronization tool, though there are elements of those to them too – Salesforce has a content management system and SAP has a document management system and supports various kinds of documentation libraries.
The expectation of these companies is that they must be highly secure, highly robust, highly available and highly flexible. They must be: after all, for the salesperson and for daily operations, these systems will be the first port of call for everything they might want to do with a customer or a part of the business.
As the Yahoo article says, for the likes of Salesforce, there are competitors, there are pricing challenges, there are doubters, there are existing approaches to displace; hearts and minds to win, Winshuttle has the same, so does Dropbox with its 500 million users.
Being the 800lb gorilla in a particular space is not an opportunity to be complacent it is instead an opportunity to rethink your game.
Salesforce has relatively poor traction in parts of Europe for exactly this reason. The proposed approach of keeping your stuff in the cloud is still somewhat revolutionary for some parts of Europe – disconcerting and anxiety-inducing. There is a lot of fear and distrust of all things related to the cloud, and major fears about privacy and protecting secrets. Salesforce has a tough time convincing Europeans that they should shift from on-premise to the cloud; SAP and Dropbox will have the same challenge, as do Microsoft and others.
Why change to enterprise?
The other variable to consider is the lack of a compelling event.
Why should corporate IT have a compulsion to adopt Dropbox for the enterprise?
Because Dropbox created a bold consumer version of a file sharing product? Hardly a compelling argument.
While the software companies behind widely adopted solutions may be presented as thriving, what I have today may be adequate – why would I consider changing at all?
I might only consider a change if you give me a viable alternative or cut away what I use today – the latter seems to be presenting itself as the compelling event.
I see these same issues with SAP customers switching from traditional methods of ERP usage and integration. They prefer EDI, they prefer PI/XI and robust integration services – not because these are necessarily more reliable or flexible, but because they are known, secure and robust. Robust, though, for only specific scenarios (read: inflexible) – the scenarios defined at the outset, and the solutions constructed by IT at a point in time, are rapidly being outstripped by the needs of the business.
So for me, as for Dropbox the challenge is to position and convince SAP customers that there are easier and better ways to integrate data for SAP and Salesforce. Dropbox has to demonstrate that there is no extra overhead for IT and just loads of value. With Winshuttle for Excel in particular this is an ongoing battle with IT, but the reality is that Winshuttle is a lot more than smarter Excel – it works with other sources and in interesting and innovative ways. Winshuttle wants to be the SAP and Salesforce Application Data Management platform of first choice just as Dropbox wants to be the file hosting and cloud storage, file synchronization, personal cloud solution of choice.
I don’t have the luxury of being able to give Winshuttle software away for free but I do have the advantage of being able to tell you what the switch to a new way of doing things is worth – it’s worth better data quality, faster and better application data management with a lean toolset and the opportunity to redirect limited and valuable resources to higher value work.
Why are we bothering with maintaining a business calendar at all in process management?
Of course the answer isn’t simple. At the most basic level it is because when we do process metrics we want to be sure to not penalize anyone unduly because the weekend had come or there was a public holiday during the flow of a process.
At a more sophisticated level it is in order to provide opportunities to smooth the flow of work and not have weekends and public holidays interfere with the end-to-end process measurement.
What this fails to recognize though is that some people are permanently connected to their work and very attentive around workflow tasks and will approve stuff at all hours of the day.
This means that even if the calendar says that they are booked off work (past siren time) and enjoying their weekend or even their holidays, if they choose to take action on a workflow task on their phone, tablet or even PC, they are effectively confounding the metrics – and worst of all, we don’t typically give them credit for their accelerated performance on those occasions. Shouldn’t we?
If a workflow process redirects something for action, like a dispatcher, and moves a task to someone else dynamically based on the calendar, then that’s a great advantage and a really good reason to have that calendar influence the metrics. But if the only purpose is to provide a process safety net for performance calculations, then I am not convinced of its merits.
A few calendar examples
As a precursor to working in the Middle East I bought a small book years ago entitled ‘Don’t They Know It’s Friday: Cross Cultural Considerations for Business and Life in the Gulf’ by Jeremy Williams. It is a lightweight easy read and for those who work or have worked in one of the Gulf states it is something one can easily relate to.
Working in the Middle East on a Sunday you become particularly aware of the fact that the rest of the world is not working. Correspondence with suppliers, vendors and some customers goes unanswered and you can forget about phoning them.
In this transcontinental business world that we live in, it is not unusual to have business processes serviced by groups in all kinds of geographies and as a consequence when my day starts, the day is already over for people just my side of the international date line and those in Hawaii and the West Coast have long gone home and may possibly have even gone to bed with a full equivalent day still to come.
If my business process relies on any of those people to do something, they’re not going to pick it up until they start their day again. The reality though, is that 5 – 8 hours will have elapsed until they get to the work at hand.
Rethinking the process metric
I think there is an argument for rethinking the process metric and going back to the fundamental question of ‘why do I have a process metric at all?’. The reason I believe we have metrics at all is that we need to demonstrate progress against goals and service levels mutually agreed within the business, and to have a good handle on bottlenecks, inefficiencies or problems in the process. These could relate to a task or an individual.
The laggard participant, for example who always takes ages to approve or action things, or the task that is so complex that it has many subtasks and actions that take place outside of the transparency of the workflow.
We want to understand who the laggards are, call them out and try to ascertain why they are delaying the process. We want to understand which tasks are troublesome and look for ways to optimise and streamline.
If we hide behind calendars, though, we frustrate transparency on other aspects of work and allow people to game the reporting statistics – this becomes really important if you’re paying a premium for a specific service level.
Consider the person who submits the request 5 minutes before the deadline or the person who arrives at work earlier than their colleagues and performs all the work assigned to them in their queue before everyone else even gets started. Does the former get penalized and the latter get rewarded? Is their behaviour transparent in the process metrics?
If our objective is to have a task completed within 24 hours of it being submitted, what is the basis of that 24-hour metric – is it based on Standardized Work? Is it based on some assumptions about down-time, cycle times, smoke breaks, lunch breaks and other delays?
The preferred KPI
In some instances ‘timing’ is factored in. This makes it an ‘adjusted KPI’ – sometimes an accepted value, but not the preferred KPI. When we walk up to a cash machine we expect to conclude our transaction in just a couple of minutes. We wouldn’t be very happy if the machine said, ‘Come back in half an hour; I will have your money ready then…’
The preferred KPI would be the amount of time it would take if this was a single-participant process – that’s your Standardized Work.
This means that if I have to code and post a 1000-line accounting document in the system of record, I should be able to do that in, say, an hour from the moment I receive it – assuming I have all the data I need to do the job properly and I am not doing something else the moment I receive it. This also means that in an 8-hour day I should be able to process at least 8 of them. It follows, then, that if my group receives 240 of these documents per day, I need 30 people to get through the work within the SLA.
In reality though, if I receive all 240 documents in the last hour of the work day, I am not going to get the work done within that hour. You could argue that that same staggering will likely occur tomorrow too, but as we all know, work doesn’t always flow that way. This is one of the reasons why another measure often gets introduced in the form of a business process management metric that tracks cadence or flow and alerts on extraordinary peaks and troughs. Interestingly, these too become subject to the calendar.
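The headcount arithmetic above can be checked with a few lines, using the numbers from the example (an hour per document, an 8-hour shift, 240 documents a day):

```python
import math

def headcount_needed(docs_per_day, minutes_per_doc, hours_per_shift=8):
    """People required to clear the daily volume within one shift,
    assuming Standardized Work and no interruptions or staggered arrivals."""
    capacity_per_person = (hours_per_shift * 60) / minutes_per_doc
    return math.ceil(docs_per_day / capacity_per_person)
```

Note that this is a steady-state figure: as the paragraph above points out, it says nothing about what happens when all 240 documents arrive in the last hour of the day.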
A further complication arises at period end, which then overrides all the KPIs and simply states: ‘Here’s the date by which it all needs to be done!’
So what I would really like to see is the calendar discarded altogether and the acceptance that if a process took 50 hours to complete because a weekend happened in the middle between submission and completion, that was how long it took – we may have to get granular with the classification of the time itself but that’s another matter.
In aggregate it will all smooth out anyway but at least it will eliminate or flag up any work cadence issues. Today that may not be very transparent with the dependence on calendars.
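The raw, calendar-free measurement argued for above amounts to nothing more than wall-clock elapsed time between submission and completion. A minimal sketch, with purely illustrative timestamps (a Friday-afternoon submission completed on Monday morning):

```python
from datetime import datetime

# No calendar adjustment: if a weekend sits between submission and
# completion, the process simply "took" the whole weekend.
submitted = datetime(2015, 3, 6, 16, 0)   # Friday 16:00 (illustrative)
completed = datetime(2015, 3, 9, 9, 0)    # Monday 09:00 (illustrative)

elapsed_hours = (completed - submitted).total_seconds() / 3600
print(elapsed_hours)  # -> 65.0
```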
In the end we all have to own the end-to-end quality of the work we do, even when we have to rely on others to help us achieve our goals.
I was quietly minding my own business when one of our sales account managers over at Winshuttle sent me an unsolicited email suggesting that I make myself available for a couple of hours to walk a prospect through a business problem and see how we could solve the challenge with a product or combination of products.
I don’t shy away from challenges, but without more context I mentally rolled my eyes and thought, oh boy! The next thing I saw was another calendar invitation from a solutions engineer suggesting that I join a ‘discovery’ call with the prospect and dig into what exactly they wanted to be able to do. At this point, you can imagine more eyes rolled heavenward and me wondering what I had let myself in for.
In the world of SAP there are four kinds of people: the know-it-alls, the know-nothings, the know-a-bits and the fakers. In my time of working with the product I have seen and danced with all four. If I were to classify myself, it would be in the ‘know a bit’ class. I am not downplaying what I know; I just know that there’s an awful lot I don’t know, and that chasm grows wider every day – or rather, the bit I know seems to become increasingly diminutive the more I learn.
So when you come across someone who has played with SAP for some time, you’re always a little reticent – after all, the more years of experience the person has with SAP, the more likely they are to know some stuff. Play along with me and you will see what I mean.
So on the call I am presented with a problem, a problem that interestingly, appears on SAP’s SCN too!
Dear… how to find in active customers list? ex : i gave given open items days 3 months in ova8 – suppose if i want to know which customers are not paying money for nearly 3 months then how to find inactive customers list or give me any table
OK, so the English is godawful and broken, but you get the idea. How do I identify active customers? Or rather, in this case, how do I identify inactive customers? You would think this is an easy thing to ascertain – after all, doesn’t SAP deliver everything? Well, if you look in that SCN article, evidently not.
The clue comes in the posting that says:
Technically Inactive Customers means the Customers who have done no transactions (Sales / Payments) with business since long time. While what you need is the list of Customoers (sic) who have not made any payments since last three months. For this I believe FBL5N will be a good report for get the necessary details. Secondly you can also approach your FI Consultant. He can pull data of Incoming Payments from relevant tables & get the list of Customers who have paid in last three months. Which will means that all the others Customers have not paid. Also try T-Code FD11 which will give Analysis for Customer’s Account.
And now we come to the critical part: “approach your FI Consultant. He can pull data of Incoming Payments from relevant tables & get the list of Customers”.
This is basically where the conversation with this prospect started.
Effectively he told us that one of his IT staff took two days to dump the data from the relevant tables and then analyse it to arrive at the answer. How would you do it?
Again, being pretty lazy, I didn’t want to spend too much time working out exactly what I needed, so I begged forgiveness and ignorance and asked the caller for the table names and fields; a day later I received them.
The issue here is that when you build a query of joined tables, the relationships between the tables are generally based on some common key, like the customer ID. The further challenge is that in certain instances – sales invoices vs. payments from customers – the customer ID may be common but the document IDs in SAP are not. Every payment gets its own document ID even though it may relate to a sales invoice ID. So I started simply enough by joining customer to BSAD and then customer to BSID; because I knew about this document ID challenge, I guessed that an outer join might be better. Unfortunately I get to use only one outer join, because the Winshuttle Studio Query module allows only one outer join to minimize the impact on system performance.
BSID and BSAD are the tables in SAP with the open and cleared items of customer accounts. Think of them as a mashup of the primary accounting data also found in BSEG and BKPF. Again, with credit to SAP’s SCN articles:
So the main tables of an FI document are BKPF and BSEG. If BSEG-KOART (account type) is D, it is a customer item, so you can find that item in BSID or BSAD, where:
BSID/BSAD = BKPF + BSEG.
By using BSID and BSAD it is possible to find customer items faster than by reading BKPF and BSEG directly, because KUNNR (customer) is not a key field in BKPF or BSEG, and those tables can be massive. So you can consider BSID and BSAD secondary index tables that improve performance when you need to pick customer data directly. Additionally, they are transparent tables – not that you should worry too much about this, but it effectively means that as they appear in the dictionary (as tables), so they exist in the database – and company code and customer number as key fields make them really easy to use. You can read more on the other types of tables here – all of which, incidentally, can be used and joined to with a Winshuttle Query.
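To see why the document-ID mismatch makes a single joined query awkward, here is a toy illustration in plain Python with hypothetical sample rows (real extraction would of course go through the query tool, not Python). A left outer join from the customer list to the cleared items keeps customers with no items at all, which is exactly what you want when hunting for inactive accounts, but it also produces one row per document for active customers:

```python
# Hypothetical sample data: KUNNR = customer number, BELNR = document
# number, BUDAT = posting date (field names as used in the FI tables).
customers = [{"KUNNR": "C001", "NAME1": "Acme"},
             {"KUNNR": "C002", "NAME1": "Globex"}]

bsad = [  # cleared items: same customer, distinct document numbers
    {"KUNNR": "C001", "BELNR": "140000001", "BUDAT": "2015-01-15"},
    {"KUNNR": "C001", "BELNR": "140000002", "BUDAT": "2015-02-20"},
]

# Left outer join customers -> BSAD on KUNNR.
joined = []
for c in customers:
    items = [i for i in bsad if i["KUNNR"] == c["KUNNR"]] or [None]
    for i in items:
        joined.append((c["KUNNR"], i["BUDAT"] if i else None))

print(joined)
```

C001 appears once per document and C002 appears once with no date – multiply that by open items, payments and company codes, and the duplication that frustrated the joined-query approach becomes obvious.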
So, returning to my problem. Joining these tables and extracting them in real time meant that I could identify the active customers and their last activity, but the rest didn’t really work: every event potentially had more than one entry, and in the end I wasn’t getting a very satisfactory result.
So I bit the bullet and created a single-table query script for each table, but then added the sweetener – I linked them together as a chain and bound each query to a different sheet in the workbook.
OK, so be aware that I am looking at a limitation of roughly 1,000,000 rows per sheet in Excel (1,048,576 to be precise), but hey! I am working with IDES data, and even the best of our organ grinders can’t make that much stuff in these tables unless they really want to.
My queries ran pretty quickly – less than 30 seconds each – dumping KNA1, KNB1, BSID and BSAD. I could have joined the first two, since I only needed KNA1 for the customer name, but in the end it was a quick run anyway. I needed KNB1 because the same customer can appear under different company codes, so knowing the valid combinations was important.
All told, there were about 22k customer/company code combinations, a few thousand BSID entries and about 30,000 BSAD entries. I extracted the document posting date because that was the key to the last activity.
Now that I had the data, what next?
Well, this neat little Winshuttle Analytics add-on for Excel kicked in at this point. Converting all of the Excel table extracts to in-memory tables with WS Analytics I now had the ability to perform some ad hoc inquiries.
The first of these was: how many unique combinations of company code and customer ID were in BSAD? Next: what was the latest entry in BSID and BSAD for each of those combinations? Building the queries took seconds; casting them down the combinations took a few seconds more. The next part of the challenge was doing the same against the KNB1 table, because in the end I wanted to identify the dormant accounts. The casting took a little longer: although Winshuttle Analytics is effectively a custom plug-in for Excel that brings in-memory tables and data pivots on steroids to the table, I am still hamstrung by the speed of the Microsoft Excel calculation engine, and sometimes, to be quite frank, it can be a dog. I got there after a few minutes and then added a few more tweaks. I developed a third formula in Excel, one that would assign a deletion-candidate indicator. This I could use with a transaction recording to flip the deletion flag in the SAP master record.
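The last-activity analysis described above reduces to a simple reduction over the merged postings. Here is a sketch in plain Python rather than Winshuttle Analytics, with hypothetical sample rows; the field names (company code, KUNNR, posting date) mirror the extract, and the three-month cutoff date is illustrative:

```python
from datetime import date

knb1 = [("1000", "C001"), ("1000", "C002"), ("2000", "C001")]  # (company code, customer)
activity = [  # merged BSID/BSAD postings: (company code, customer, posting date)
    ("1000", "C001", date(2015, 2, 20)),
    ("2000", "C001", date(2014, 6, 1)),
]

cutoff = date(2014, 12, 1)  # e.g. "no activity in the last three months"

# Latest posting date per company code / customer combination.
last_seen = {}
for bukrs, kunnr, budat in activity:
    key = (bukrs, kunnr)
    if key not in last_seen or budat > last_seen[key]:
        last_seen[key] = budat

# Deletion-flag candidates: combinations with no activity since the
# cutoff, including those with no postings at all.
candidates = [k for k in knb1 if last_seen.get(k, date.min) < cutoff]
print(candidates)  # -> [('1000', 'C002'), ('2000', 'C001')]
```

The resulting candidate list is what would feed the transaction recording that flips the deletion flag on the master record.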
The whole exercise took about an hour – pretty good when you compare that with two days!
I showed this to our visitors. I was disappointed that I hadn’t been able to do it all with a query, but in the end it didn’t make much sense to do it all with a query. The alternative would likely have been a long-running ABAP program, but I managed it with an hour of thinking and doing, nothing more than a couple of desktop tools and – most importantly – no changes to my SAP system!
I’ll be the first to admit that my knowledge of SAP security authorizations is lightweight.
I know enough to be dangerous but not enough to be even close to something like a ninja or knowledgeable person. If I am confounded by something invariably I have to read up about it or find a SME to help me.
What I can tell you, though, is that in my present and past lives, SAP security authorizations have been the bane of users’, support personnel’s and developers’ lives.
I am not entirely sure why this is the case.
My long-held suspicion is that this has a lot to do with mistakes, omissions and explicit decisions made over the years in the way solutions and applications have been architected. Unfortunately, a holistic view of the security authorization model is often absent among developers; they rely on the auditors and the authorization framework to determine what can and cannot be done safely – not a very good approach, and one fraught with risk.
The situation is further compounded by the absence of robust knowledge and subject matter expertise among developers as a whole. The latter is one of the reasons why the annual security and controls audit that many companies are subject to is an equally painful experience. The auditors come in, run a bunch of scripts, dump a bunch of data to files and then run away with them, returning some weeks later with a list of questions the length of one’s arm. It is something I became quite good at handling but nonetheless something I really hated – a massive time suck.
When we implemented BW and the Portal, admittedly some years ago, we found our ABAP security authorizations model became infinitely more complex. There were security authorizations that were now specific to those environments too! Then came SAP CRM, and again, authorizations had to be set up for that too. With the growing adoption of HANA and the promise of S/4HANA, the question will come up as to whether the security model will change, and to what extent. It seems for security at least, there’s more complexity coming…
SAP Fiori Security Authorizations
SAP Fiori applications communicate with the ABAP stack through OData services. In addition to ABAP authorizations, users must be granted authorization to access the HTML5-based Fiori applications and the OData services in SAP NetWeaver Gateway. SAP Fiori applications therefore require users and roles in SAP NetWeaver Gateway.
An SAP NetWeaver Gateway PFCG role contains start authorizations for OData services. SAP doesn’t deliver these roles to customers.
The SAP Fiori Launchpad relies on the authorizations in the ABAP back-end server and the authorizations to access the OData services in the SAP NetWeaver Gateway in the ABAP front-end server. Configuring these relies on Launchpad catalogs and UI PFCG roles.
A technical role is needed for the SAP Fiori Launchpad, and SAP delivers a predefined set of technical objects for it, in particular the required profiles. For ‘own’ catalogs these, and possibly other technical objects, are also required. The UI PFCG roles bundle all the front-end privileges required to execute Fiori apps. Through user assignment to back-end roles, additional privileges are provided to execute the specific applications or access the OData services. The powerful ‘fact sheets’ require authorization roles in the front end, which grant users authorizations for the ICF services and business server pages (BSP) that support the fact sheets.
Simple? Not a chance…
It could be that things have started to simplify, but I have my doubts – or rather, I am not optimistic. All of this suggests ‘more complex’: simplifying things on the front end will mean more complexity on the back end.
I was reminded of this again this week when discussing EHS/EHSM with a consultant. We were debating the merits of using BAPIs and remote-enabled function modules (RFMs) to create and maintain EHS data, and considered the implications for the security authorization model – after all, we wouldn’t be using transactions for the automations. In the end we concluded it might be OK, but the question ultimately will be whether or not this stands up to audit scrutiny.
If you wonder what all the fuss could be about then you only have to trawl over to SAP SCN and read some of the many discussions on security and authorizations.
Conversely, if you’re the kind of person who likes to learn through self-study, then you might want to make a small investment in Andrea Cavalleri and Massimo Manara’s book from Galileo Press, 100 Things You Should Know About Authorizations in SAP (ISBN 9781592294060). For the ABAP developer, a slightly older but indispensable guide is Authorizations in SAP Software: Design and Configuration by Volker Lehnert (ISBN 9781592293421). Both are available at SAP Press and through Amazon. Neither contains anything on the Fiori model; for that you will need to look elsewhere, such as SAP Help.
Unfortunately, the SAP Fiori Security Guide that was originally published as a PDF has been taken down from its original location, perhaps because it is a bit out of date – so again, refer to SAP Help for more information.
My eldest son says that I am ‘very judgemental’. I don’t really have a defence, and though I do tend to complain a lot about stuff, the truth is that there is a lot of stuff out there to complain about!
You can guess then where this post is going. It’s another rant about bad programming and poor thinking applied to technology. I’ll concede that things have changed immensely over the years since SAP was first let loose on the unsuspecting business community, but you would be surprised how much new bad code crops up.
Yesterday I was posed the challenge of trying to build automation around a multi-headed gorgon in the form of the SAP transaction MI10. It’s one of those peculiar transactions that shouldn’t be used much but which probably gets used quite a lot, given that inventory sometimes gets damaged, lost, or mysteriously grows legs and walks out of the warehouse.
At face value the transaction is probably not altogether terrible, but it is when you discover that certain complex scenarios make it super difficult to work with.
Getting physical with your inventory
To automate any kind of process for more efficient use, you need to look for repeatable patterns to reduce the amount of logic needed to stream the data. In this instance, if you are making use of serialized inventory and something needs to be stated or changed in relation to it, then you need to abide by your system’s accounting rules and provide the write-downs and write-offs with as much information as possible in order to avoid an audit exception note.
With tight margins and efforts to constrain inventory and free up working capital, the physical inventory process is key to ensuring that you have in stock what your system says you have in stock – and, if you have variances, to accounting for them.
MI10 can be used as an umbrella transaction for the equivalent activities of taking and recording physical inventory and making adjustments. SAP supports breaking this process into three task streams but also allows you to unify them in MI10. All well and good. At the time of making the postings to adjust inventory, you should know the physical quantity of items you wish to adjust for. In MI10 this quantity then dictates how many fields need to be populated with data, by rendering a screen with as many field spaces for serial numbers as there are items to adjust for.
This would be a fine approach except that the ABAP program doesn’t use an ALV grid; instead it presents up to 19 fixed fields which are checked against existing serial numbers. The screen is pretty dumb. When you have more than 19 items to enter, you also have to scroll down to get to more fields.
For a human entering the data this is fine, for the most part, but transcription errors are easy to introduce and manual entry is terribly inefficient.
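Any scripted entry against this screen therefore has to paginate its input to match the 19 fixed fields per page. A minimal sketch of that chunking logic (the field count and paging behaviour are as described above; the actual screen interaction would be done in the recording tool, not Python, and the serial numbers here are invented):

```python
def paginate(serials, page_size=19):
    """Split a flat list of serial numbers into screen-sized pages."""
    return [serials[i:i + page_size] for i in range(0, len(serials), page_size)]

serials = [f"SN{i:04d}" for i in range(45)]  # 45 hypothetical serial numbers
pages = paginate(serials)
print(len(pages), [len(p) for p in pages])  # -> 3 [19, 19, 7]
```

Each page then corresponds to one fill-and-scroll cycle in the recorded script, which is precisely the kind of brittle screen dependency an ALV grid (or a serialization-aware BAPI) would have avoided.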
It’s unfortunate that this is such a difficult transaction to work with and it is unfortunate that serialization was apparently tackled as an after-thought in the way that the program was enhanced.
When I tried to automate this process using an alternative method, say with a BAPI, I found one – BAPI_MATPHYSINV_POSTDIFF – but unfortunately it doesn’t support the important serialization aspect. Weird, since BAPI_MATPHYSINV_COUNT does…
So it’s disappointing to say the very least. Looking out on SCN it seems the only way around this is to once again get an ABAPer to write something, something to fix an inherent defect in the thinking around this important business process.
How do you ensure that your employees have just the right level of understanding and skills to meet the demands of the job while keeping them motivated and assuring data quality?
Michael Management (MMC), a specialist SAP eLearning company, revealed some interesting characteristics in its most recent survey about how SAP users feel about their training.
Almost half of the respondents (44%) indicated that they preferred self-paced online eLearning, and 43% indicated that they hadn’t received sufficient training to perform their job responsibilities adequately. Almost a third indicated that they should have received at least 41 hours of training in the last year to do their jobs properly, yet hadn’t received it. MMC also revealed why the training doesn’t happen: for more than a quarter of the respondents there isn’t enough budget for training, and another quarter indicated that there was nothing particularly useful or appropriate on offer. With only a quarter indicating that there isn’t enough time for training, what choices are left?
On the job training
On the job training seems to be the most common approach to getting new employees up to speed, but with it being heavily dependent on how well processes and procedures are documented, as well as on the relative effectiveness of the person giving the OJT, it should be little surprise that employees are frustrated or demoralized. Larger, more sophisticated organizations have comprehensive and extensive training programs blending job shadowing, classroom instruction and eLearning, but it can be a challenge keeping this content current and contemporary.
One of the ways to minimize the dependence on training programmes for assuring data quality and business process efficiency is to automate as many processes as possible, or to reduce processes to simple procedures that can be easily understood and that leave little to interpretation or discretionary decision making. Businesses don’t necessarily want to employ automatons, but at the same time they do want to ensure that employees cross all the t’s and dot all the i’s in the execution of their job tasks.
Another survey, by ON24, a webcasting and virtual communications solutions company, indicated that more than half of entry-level employees need explicit skills training, and that the best-trained segment is the medical and pharmaceutical industry, where half of respondents indicate that they receive good training. While almost half of all respondents reported that training is a top priority in their organizations, more than three quarters also feel that training helps them do their jobs better, improving company performance and advancing personal career goals.
SAP implementation projects
Training as part of a new SAP implementation is pretty commonplace; it is usually an integral part of the project delivery and gets incorporated into the overall budget of the implementation. But long after the project has ended and the consultants have moved on, how does your organization ensure that employees know what they are doing, and why, to ensure data quality and proper business transaction processing?
Many of the companies I interact with indicate that the appropriate use of SAP is not a core focus of training for existing staff or new hires. What they do indicate is that data quality and operational data processing challenges continue to plague the business – challenges so great that they are scurrying to implement more robust approaches to data gathering and data maintenance in core systems of record like SAP.
By implementing increased oversight and governance of the end-to-end data management process, particularly for master data and master record objects, they are obviating the need to spend extraordinary amounts on expensive training and education programs for employees.
Your own army of SAP consultants
It’s my opinion that spending money on training employees to use an infinitely configurable ERP solution like SAP is a bit of a double-edged sword. When you educate your employees about ERP capability through formal classroom programs that conclude with certification, you are grooming some of your potentially best employees to ultimately become SAP consultants. If you’re a consulting company, or have other ambitious ERP projects, then fine.
If you can retain them and leverage that learning for future projects, then that is all well and good; but invariably those very classroom settings, expensive as they are, don’t teach your employees much about the way you run your business and why certain processes and procedures exist. You’re better off investing in some very targeted eLearning and using that to explain the concepts and principles that are relevant to your business and how it uses SAP. At Winshuttle this is further augmented by periodic webinars, online training courses and webcast content.
The difference between what you save with appropriate eLearning programs and what expensive proprietary classroom courses cost is better spent on deploying lean data management approaches to automating and guiding processes and procedures.
Mechanisms such as automations with templates and workflows assure data quality and process governance, and ensure that compliant data management processes are in place and adhered to.
There are many options available for mass data maintenance in SAP, and while third-party applications that connect over RFC are not necessarily the ideal method for mass actions in SAP, the other options tend to be either very technical or very limited.
LSMW is, quite frankly, a sorely abused tool that auditors complain vociferously about. IT has no business messing with data in productive systems, and business users have no business having access to LSMW.
The mass-maintenance transactions, and even the BAPIs SAP offers natively – including the newer and revised ones – lag behind the capabilities of the configured or customized transactions in ERP itself, and they often never get configured or modified to align with what the business needs, so they simply never catch up.
It might be proposed that the business consider BODI/BODS, but this quickly falls into the realm of heavy technical lifting: architecting and configuring a complex solution with a long tail of deployment before the business can actually get going. Besides, BODS/BODI are often exclusively tools in the domain of IT or technically proficient users. If, in the intervening period between selection and implementation of such a solution, the business pivots or changes direction, how quickly do these solutions re-align with its requirements? My experience, and that of others, suggests that they take a long time, and that maintaining them comes at tremendous technical and operational expense.
In the absence of something better, the business is back to an agile solution that can quickly connect to the configured transactions in SAP ERP (like MM01/MM02), calling them over RFC and filling the screen fields in the background (BDC) or interactively (GUI Scripting). Performance and throughput hinge on several factors, not least the way the application is positioned and the resources committed to the task.
Proximity is everything
The closer the client is to the SAP applications, the faster the throughput – you’re not dealing with network latency as a degrading factor. Also consider running these sessions against a batch server rather than just any old application server on the front end. If the transaction call over RFC has to compete with regular dialog users going about their daily work, then someone or something is going to have a bad performance experience; the batch processor is more tolerant, but the wetware (people) tend not to be…
The selected scripting method also has a tremendous impact – if you use GUI Scripting then the process is going to be dog slow as each screen gets painted and each field filled. This classic screen scraping should be avoided as much as possible.
While BDC over RFC doesn’t perform as well as LSMW, you reap benefits in other ways: the user who owns the data gets to load it (not usually an option with LSMW), and any errors show up in the Excel workbook or Access database in line with the records you are trying to change or create. With other methods you often have to convert the data to CSV or text, load the file, and then try to reconcile the items with the log – not pretty, and also pretty painful.
Finally, I will add that abuse of these third-party tools is also a challenge often seen in the field – consider where they are positioned.
Transaction recorders are not recommended for multi-million-record data sets. Do they get used for them? Sure. But should they? Probably not.
My experience is that once you hit about 30,000 rows of data, the question becomes: how long are you prepared to wait for this job to run? With throughput of around 30 fully loaded materials per minute using a single BDC session on a small ECC box, can you afford almost 1,000 minutes (16–17 hours) of load time?
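The load-time arithmetic is worth making explicit, since it is the simplest sanity check before committing to a recorder-based load. Using the figures from the text (30,000 rows at roughly 30 fully loaded materials per minute through a single BDC session):

```python
# Estimated load time for a single BDC session at the observed rate.
rows = 30_000
rows_per_minute = 30  # approximate throughput on a small ECC box

minutes = rows / rows_per_minute
hours = minutes / 60
print(minutes, round(hours, 1))  # -> 1000.0 16.7
```

If that number doesn’t fit your maintenance window, that is your cue to look at parallel sessions, a different load mechanism, or splitting the data set.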
Do you have thoughts on better ways than these?