There is a deep connection among test cases, test scenarios, and the SRS. The story begins with the SRS, i.e. the Software Requirement Specification document. Preparation of the SRS begins with a deep analysis of the business, including exhaustive discussions with business process owners and walkthroughs of the various business processes. As a matter of fact, any shortfall in the SRS leads to a bigger disaster than any glitch that happens later. Once the SRS is signed off, the relevant team starts building test scenarios, and test cases then emerge from these scenarios. When the testing team is ready with test scenarios, it identifies the overall scope of testing. On this basis, the QA team prepares a detailed test plan, and the plan then goes into execution. The team responsible for execution prepares the test cases.
Actually, there are two parallel activities taking place at this juncture. While the QA team is busy documenting test cases, the development team begins the coding phase of the SDLC (Software Development Life Cycle). In a waterfall model, these two activities might be sequential. In an agile methodology, however, by the time each code chunk is ready for testing, the QA team is ready with the test cases. This mechanism shrinks the overall execution time of the project.
Test Cases, Test Scenarios, and SRS Relationship
While in the waterfall scenario the test phase comes after the development phase, in agile, development takes place in iterations, and so does testing. Each iteration clears development and testing in this fashion. Hence, in the former case, the overall time increases because testing begins only when coding finishes: the development team hands over the complete code to the QA team, which then starts studying it and preparing test cases.
In a nutshell, test cases are all about the ‘how’ of the business. If you are driving a business with the help of a business application, test cases validate each ‘how’ related to every process of that business.
Every application that you develop has some fundamental requirements. These requirements arise from the business and its users. Let us look at this closely to understand clearly what a test case in software testing is. It is essentially important for the software testing community to understand this clearly so as to ensure the successful launch of any software that their development team creates. Actually, any software development is a collaborative effort. It mainly involves three sets of key stakeholders for the right kind of development, and the number of stakeholders increases further when it comes to deployment of the application.
The first set of key stakeholders is the business analyst and the customer. If these two don’t gel well, it creates a lot of trouble later. A business analyst should never hesitate to double-check a customer requirement before documenting it. In fact, he will have to walk through all possible business scenarios to ensure the right requirements are captured. Now, when it comes to development, whether you follow the waterfall approach or the scrum methodology, understanding what a test case is remains crucial. So the second set of key stakeholders is the customer and the developers. The third set is a tri-party association comprising the tester, the developer, and the customer. As we see, the customer is an omnipresent entity throughout this crucial journey of requirement gathering, development, and testing.
What Is A Test Case
As a matter of fact, the customer plays the biggest role in all phases of any development. The higher the level of engagement with the customer, the fewer the troubles. Now, about what a test case is in software development and testing. A test case, basically, is a set of conditions that a tester defines to verify that the system under test works correctly and satisfies the actual requirements. The very process of developing test cases helps in finding problems in the requirements or design of an application.
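To make this concrete, here is a minimal sketch of how a single requirement turns into test cases, using Python’s `unittest`. The `withdraw` function and its overdraft rule are hypothetical, invented only for illustration:

```python
import unittest

# Hypothetical requirement (invented for illustration): a withdrawal
# must reduce the balance, and must be rejected when it exceeds it.
def withdraw(balance, amount):
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

class WithdrawalTestCase(unittest.TestCase):
    # Each test method is one test case: a condition the tester
    # defines to check the system against the requirement.
    def test_valid_withdrawal_reduces_balance(self):
        self.assertEqual(withdraw(100, 40), 60)

    def test_overdraft_is_rejected(self):
        with self.assertRaises(ValueError):
            withdraw(100, 150)

# Run with: python -m unittest <module_name>
```

Notice that each ‘how’ of the requirement (how a valid withdrawal behaves, how an overdraft is handled) becomes its own test case, which is exactly the scenario-to-case breakdown described above.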
This post is in continuation of my previous post regarding Big Data predictions and concerns for 2017 and beyond. In the previous post, we talked about a huge increase in big data and spending. Let us talk further about big data predictions and concerns for the coming years.
Analysts believe much of that new spending will go towards cloud-based big data analytics (BDA) solutions. In fact, IDC forecasts, “Through 2020, spending on cloud-based BDA technology will grow 4.5x faster than spending for on-premises solutions.” What this means is that open source big data solutions will be in high demand as companies go for cloud-based solutions. And it is quite visible: we can see technologies like Hadoop, Spark, Storm, and others dominating big data analytics. As a matter of fact, all of the leading cloud computing vendors, like Amazon Web Services, Microsoft Azure, IBM, and Google Cloud, offer big data analytics products, and many smaller firms offer cloud-based analytics products as well.
A Forrester report for 2017 says, “Investment in artificial intelligence (AI) will triple as firms look to tap into complex systems, advanced analytics, and machine learning technology.”
Another pointer in the IDC report says, “By 2020, 50% of all business analytics will incorporate prescriptive analytics built on cognitive computing functionality.”
A study by Markets & Markets states, “The streaming analytics market is estimated to grow from USD 3.08 Billion in 2016 to USD 13.70 Billion by 2021, at a Compound Annual Growth Rate (CAGR) of 34.8%.”
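The quoted CAGR is easy to sanity-check: a compound annual growth rate is just (end / start)^(1/years) − 1. The sketch below uses plain Python and only the figures from the quote, and it does reproduce the 34.8% figure for the five years from 2016 to 2021:

```python
def cagr(start, end, years):
    # Compound annual growth rate: the constant yearly rate that
    # turns `start` into `end` over `years` periods.
    return (end / start) ** (1 / years) - 1

rate = cagr(3.08, 13.70, 5)  # USD billions, 2016 -> 2021
print(f"{rate:.1%}")  # -> 34.8%, matching the quoted figure
```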
Big Data Predictions and Concerns
If we talk about salaries of big data professionals, a report from Robert Half Technology lists average salaries for 2017 as follows: Big Data Engineers, $123,000–$158,000; Data Scientists, $99,500–$132,000; BI Analysts, $97,000–$132,500.
As a matter of fact, there are quite a few more indicators and pointers about big data predictions and concerns. All these points hint towards growth in big data technologies, big data volumes, manpower requirements, skill demands, and greater opportunities to meet customers’ expectations.
Before we talk about big data predictions, let us admit that big data security is a major concern. In addition, training, skillsets, tools, and applications form another wide spectrum of major concerns. But as we all know, all of this creates an even bigger spectrum of opportunities. Technologies like artificial intelligence, prescriptive analytics, and real-time streaming are set to play a lead role in big data solutions. Alright, so now let us talk about the big data predictions and concerns for 2017 and beyond.
A recent EMC Digital Universe Study says, “The digital universe is doubling every two years and will multiply 10-fold between 2013 and 2020.” The benefit of this prediction is that the accuracy and breadth of any enterprise or business analytics will increase tremendously. The risk is catering to this tsunami of data with appropriate manpower, skills, and infrastructure. As per a study by Dell EMC, global storage systems comprised 4.4 trillion gigabytes of data at the end of 2013. This is bound to reach around 17.6 trillion gigabytes of data by the end of 2017.
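These figures are mutually consistent, which is easy to verify: doubling every two years means multiplying by 2^(years/2). A quick sketch using only the numbers quoted above:

```python
def project(base, years, doubling_period=2.0):
    # Exponential growth: the base doubles every `doubling_period` years.
    return base * 2 ** (years / doubling_period)

# 4.4 trillion GB at the end of 2013, doubling every two years:
print(project(4.4, 4))        # end of 2017: 17.6 trillion GB, as quoted
print(project(4.4, 7) / 4.4)  # 2013 -> 2020: ~11x, roughly the "10-fold" claim
```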
The credit for such humongous data growth goes to other important IT trends, including the Internet of Things (IoT), social media, and data from mobile devices. Therefore, just wait for more and more data to accumulate worldwide. It will be interesting to see whether reality outpaces this prediction with even more data hoarded than forecast.
IDC says, “Worldwide revenue for big data and business analytics will grow from nearly $122 billion in 2015 to more than $187 billion in 2019.” As the data increases, big data analytics capabilities will also increase. This will raise confidence in big data technologies, and hence spending will increase. Definitely, in order to store and analyze big data, enterprises will increase their spending. If we draw a direct relationship between the growth of data and spending, spending is going to be 50% higher in the next five years. Of the total spending estimates, almost double will go into big data software. Big data spending will rise fastest in sectors like utilities, resource industries, healthcare, and banking. But above all, manufacturing will remain the biggest big data spender.
We will be discussing the same in the next post…
Big data has come in a big way and is gaining momentum just as fast. In a very short period, it has become a familiar phenomenon all across the globe, so much so that it is no longer a new technology. In fact, it keeps introducing newer terms and technologies along with it. Most businesses are talking about big data analytics, because they realize that business analytics will go haywire if they don’t consider all the appropriate data. This data might be lying anywhere that is relevant to the business: on the cloud, in the data center, on social media, or elsewhere. Hence, complete mining of this data is needed to gain realistic insights. It is essential if they want to remain competitive in a constantly evolving marketplace. In fact, many organizations have already seen the benefit of big data solutions, and now they want more.
Refinement of big data capabilities sits at the top of the 2017 priority list for most big organizations. In fact, many startups are already familiar with its power and hence have been into it since the beginning. Primary trends also show that big data analytics is crucial for such enterprises. They are looking for ways to gather more data, more quickly, because for most of them the initial investment in big data technology has been paying off well. That is why expanding their big data projects is on the cards. The aim is simple: firstly, to get much better insights into the business; secondly, to achieve even greater financial results.
Big Data Analytics
At the same time, the demand for big data talent is increasing exponentially, creating a talent crunch. Because of this, companies are hiring less qualified or incompetent people, as there is no other option. This, in turn, impacts the business goals they are then unable to achieve. On the other hand, salaries are extremely high for the right candidates. It is really difficult to get big data analytics professionals. As a matter of fact, technology can play a major role here through the development of big data analytics tools.
Advanced analytics can’t happen without big data integration. In fact, big data integration is the key to achieving advanced analytics. Big data has tremendous scope. As a matter of fact, organizations that use analytics gain huge competitive benefits in terms of reduced costs, enhanced production efficiency, reduced risks, and so on. Employing advanced analytics throughout your enterprise increases these benefits manifold.
Having the right tools is much more important for any enterprise before deploying advanced analytics. If you are new to big data, you must understand its nitty-gritty well. The concept and volume of processing change completely. You need the support of innovative technology to build advanced decision-making capabilities.
But before anything else, you must assess the current level of integration among your key business applications. This includes legacy applications that are actively operational and driving business value. There should be minimal trouble in terms of infrastructure. If everything else is smooth, you can take the decision to scale up. With so many critical things happening around, it might be confusing where to start. Big data comprises huge volumes of structured and unstructured data derived from various sources. In fact, never try untested tools when dealing with big data across business units, data centers, and other sources.
Advanced Analytics needs Big Data Integration
As a matter of fact, there is a difference between advanced analytics and business intelligence tools. Even a small organization runs multiple applications, including ERP. If all these applications are integrated well, the organization can easily build a crisp dashboard providing a real-time view of the business. This can help it get timely insights for decision-making. Firstly, be clear about your objectives. Secondly, your goal must be to gain better outcomes in a measurable manner. Therefore, it is critical to identify your data sources and to assess your data pipe sizes, infrastructure, and other technology areas.
In a nutshell, the success of your advanced analytics project will purely depend on your ability to capture data and on your integration infrastructure.
This is, in fact, a great initiative by Google in the Artificial Intelligence (AI) and Machine Learning areas. DeepMind is going to take on its next challenge in a big way. Google’s secretive AI arm aims to develop human-like reasoning. That is why DeepMind has partnered with Blizzard to build a high-level AI research platform. AI agents will be put to a hard test using the real-time strategy game. As a matter of fact, machines will be developing strategies and reacting to other players. The approach intends to gain the benefits of applying learning AI in real-life scenarios.
It will be a level up in AI research. Earlier achievements include mastering Atari arcade classics and beating human world champions at board games. But this time, the machines will have a chance to act in a more life-like gaming universe. The announcement came earlier this month at BlizzCon 2016 in Anaheim, California: Google DeepMind is teaming up with video game giant Blizzard Entertainment. With this, they aim to turn one of its hit titles into a learning environment for AI. StarCraft 2 is a widely popular real-time strategy game, and the new partnership will see it open up into a sandbox for teaching and testing machines.
DeepMind Partners with Blizzard
The strategic alliance is on a scientific mission to push the boundaries of AI. The DeepMind team will be developing programs that use AI tools to learn and solve any sort of complex problem, and, in fact, without needing to be told how. As a matter of fact, games are the best means to do this, because the environment allows researchers to build and test smarter, more flexible AI algorithms rapidly and effectively. At the same time, it provides instant feedback on how a player is doing through scores. That is exactly what DeepMind aims to exploit.
The latest release on the Indian government’s official press site talks about the impact of technology. Bandaru Dattatreya, the Union Minister of State (Independent Charge), Ministry of Labour and Employment, says it all depends on the user’s mindset. If you are using technology with a positive mindset, it will definitely yield positive results. What this means is that technology itself is neither good nor bad; it is the mindset with which you use it that decides its impact. Accountability and transparency are the key factors for good governance, and technology can always help in ensuring them. With the rapid changes in technology, one must keep changing with the times. There is, in fact, no life without technology. To bring revolution into life, you need to use technology. Definitely, any revolution comes along with some challenges. Even then, it is technology that helps in tackling those challenges.
Impact of Technology Depends on User’s Mindset
Now look at the practical side of this to understand the impact of technology on our work life. For project management or testing automation, we use some tool or methodology. The same tool or methodology brings different results in different scenarios or organizations. You will even find two different teams using the same testing tool in the same organization yielding different results: for the first team it could be highly successful, while for the second team it could lead to failure. So we see the same tool bringing a positive as well as a negative impact. There could be various reasons for this, including a lack of knowledge about the tool, a lack of good governance in the second case, or a lack of accountability and transparency.
As Dattatreya says further, the needs of the younger generation in India are different. Their thinking is different, and hence it must be addressed in a different way. This generation uses technology with great comfort and ease. In fact, it is a good sign for India. It shows that India is going to be the most competitive country in the field of technology.
The event, ‘Technology and the Future of Work’, was organized by the Ministry of Labour and Employment. The motive of the program was to discuss in detail how technology affects and supports human life, especially in the context of job creation and destruction, and its impact on the direction and quality of human life.
Data Protection Officers are in high demand. In fact, the demand is at its peak, and it is about to increase exponentially worldwide because of increasing threats and vulnerabilities. Data protection is the topmost priority for any organization, while cyber attacks, on the other hand, are on the rise. If you are in the UK or working for a British firm anywhere in the world, then the EU GDPR Practitioner qualification can help you become a DPO. In fact, suitable training in cyber security can help you achieve the ISO 17024-accredited EU GDPR Practitioner (EU GDPR P) qualification. This way, you can fulfill the role of data protection officer (DPO) under the EU General Data Protection Regulation.
A recent study by IAPP determines that the GDPR’s global reach will necessitate at least 75,000 data protection officers (DPOs) worldwide. The report also states that the United States will require the largest number of DPOs (9,000), followed by China (7,568) and Switzerland (3,682).
A certification as an EU General Data Protection Regulation (GDPR) Practitioner provides a comprehensive understanding of the subject. It also helps you acquire the practical skills to fulfill the role of DPO under the GDPR. Similar courses are in high demand in every country.
IT Governance supports professional development in this regard. After completing the course, students sit a 90-minute multiple-choice exam accredited by IBITGQ. If you qualify, you receive the ISO 17024-accredited EU GDPR Practitioner qualification. This certificate signifies that you now have the knowledge and skills to help organizations achieve compliance with the GDPR and to take on the responsibilities of a DPO.
Data Protection Officers Gain Demand
It is important to know what this certification covers. The Certified EU General Data Protection Regulation (GDPR) Practitioner training course covers the GDPR principles, the role of the DPO, setting up a privacy compliance framework, data protection impact assessments, data mapping, the roles of data processors and controllers, data breach reporting requirements, achieving compliance, and similar topics.
GDPR compliance becomes important for all UK-based organizations. It is now a mandate for all organizations to comply with the GDPR in order to stay within the law. IT Governance is among the pioneers in providing cyber security and data protection services, and the firm guides UK-based organizations to comply with the EU General Data Protection Regulation (GDPR). The initiation of this whole exercise comes from Karen Bradley MP, Secretary of State for Culture, Media and Sport, who has confirmed that the EU General Data Protection Regulation (GDPR) will apply in the UK.
In addition, Elizabeth Denham, Information Commissioner, welcomed this announcement, saying, “One of the key drivers for data protection change is the importance and continuing evolution of the digital economy in the UK and around the world. That is why both the ICO and UK government have pushed for reform of the EU law for several years.”
Alan Calder, the founder and chief executive of IT Governance, says “With the government confirming the GDPR will apply to the UK, organizations need to prioritize GDPR compliance to avoid the possible fines of up to €20 million or 4% of global annual turnover and the threat of a lawsuit from aggrieved data subjects.” He further adds, “Boards should be concerned about these risks, put GDPR compliance at the top of their agendas and push for adequate technical, administrative and operational security measures in line with the GDPR requirements”.
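The ceiling Calder quotes combines two limbs, and under the Regulation the applicable cap is the higher of the two. A small sketch of that arithmetic (the turnover figure is invented purely for illustration):

```python
def max_gdpr_fine_eur(global_annual_turnover_eur):
    # Upper GDPR fine tier: up to EUR 20 million or 4% of global
    # annual turnover, whichever is higher.
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# A hypothetical firm with EUR 1 billion turnover: the 4% limb wins,
# so its exposure is EUR 40 million, double the flat floor.
print(max_gdpr_fine_eur(1_000_000_000))  # -> 40000000.0
```

For smaller firms the flat EUR 20 million floor dominates, which is why even modest organizations treat GDPR compliance as a board-level risk.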
GDPR Compliance Becomes Important
As a matter of fact, the GDPR is about to become law within the next 18 months. That is why organizations around the world that directly or indirectly collect and process data on European residents will have to comply with the Regulation.
IT Governance’s Certified EU General Data Protection Regulation (GDPR) Foundation and Practitioner training courses can help delegates gain a basic understanding of the Regulation and/or fulfill the role of data protection officer (DPO). That is why GDPR compliance becomes a top priority.
The concept of a separate set of testers is vanishing, and so is a testing phase that comes only after the production phase. Science says practice testing protects memory against stress. As a matter of fact, testers must stay away from stress when it comes to testing. In fact, learning by taking practice tests has great power. The strategy is known as retrieval practice, and it can protect memory against the negative effects of stress. There is a lovely video in that context.
“Typically, people under stress are less effective at retrieving information from memory. We now show for the first time that the right learning strategy, in this case, retrieval practice or taking practice tests, results in such strong memory representations that even under high levels of stress, subjects are still able to access their memories,” says senior study author Ayanna Thomas, Ph.D. She is an associate professor and director of the graduate program in psychology at Tufts, with a strong focus on this line of research.
“Our results suggest that it is not necessarily a matter of how much or how long someone studies, but how they study,” says Amy Smith. Amy is a graduate student in psychology at Tufts. She is also the corresponding author on the study.
Practice Testing Can Create Perfect Testers
“Even though previous research has shown that retrieval practice is one of the best learning strategies available, we were still surprised at how effective it was for individuals under stress. It was as if stress had no effect on their memory,” Smith says. “Learning by taking tests and being forced to retrieve information over and over has a strong effect on long-term memory retention, and appears to continue to have great benefits in high-stakes, stressful situations.”
“Our one study is certainly not the final say on how retrieval practice influences memory under stress, but I can see this being applicable to any individual who has to retrieve complex information under high stakes,” says Thomas. “Especially for educators, where big exams can put a great deal of pressure on students, I really encourage employing more frequent, low-stakes testing in the context of their instruction.”
This is the concluding post from my discussion with Bill Tolson, Vice President of Marketing, Archive360. In the first post, we discussed the salient features of his company Archive360 and its various offerings. In the second post, we discussed Archive2Azure, their recently introduced new solution. Here, in this concluding Q&A, Bill highlights the ideal use cases for Archive2Azure.
Here we go:
Q: What are the ideal use cases for Archive2Azure?
Most companies have huge amounts of unstructured data floating around their enterprise that is, in reality, not managed at all. This content is usually only visible to the owner/custodian. In fact, up to 80% of enterprise data can be considered unstructured/low-touch data. However, this content consumes storage resources and can be a major problem in compliance and eDiscovery situations. Archive2Azure is perfect for corporate grey data: not the obviously valueless content (which should be defensibly deleted), but rather the unstructured data that could still be needed. For example, many companies archive departed employees’ data for a period equal to the statute of limitations for wrongful termination lawsuits. Another use is to securely store legal department eDiscovery data sets, which can grow into the tens of TB, in case that data is needed in a later appeal. There is no limit to the reasons for utilizing Archive2Azure. The bottom line is that Archive2Azure can archive and manage a company’s grey data at a much lower cost than can be done on-premises.
Q: What is your existing global presence paradigm (in what countries do you offer your solutions) and how do you plan to expand it further in 2017-18?
Archive360 offers our solutions and support around the world. Earlier this year we began a new channel partner program to increase our channel partner base. We currently work with best of breed partners and continue to look to add to that partner base.
Q: Where could readers go to learn more?
Readers can go to the Archive360 website for more information or send an email directly to: email@example.com
We are in conversation with Bill Tolson, Vice President of Marketing, Archive360. In the previous and opening post, he talked about his company Archive360 and its solutions. As we learned, the Archive360 product line consists of three solutions: Archive2Anywhere, Archive2Azure, and FastCollect. Archive2Anywhere is the only solution in the market specifically designed to deliver fast, trouble-free, predictable legacy email archive migrations, with verifiable data fidelity and defensible chain of custody reporting. Archive2Azure is a managed compliance storage solution based on Microsoft Azure. And FastCollect for PSTs is the only automated PST discovery and migration solution that ensures ALL message stubs, metadata, and message content are located, rehydrated, migrated, deleted, and audited in a manner ensuring full legal and regulatory compliance. In this post, Bill talks about Archive2Azure, their recently introduced new solution.
We proceed with our discussion with Bill Tolson below.
Q: You recently introduced a new solution, Archive2Azure. Could you explain what was going on with your customers and/or in the market that prompted its introduction? And could you explain how Archive2Azure addresses these trends/market needs?
Bill: We noticed that many of our legacy email archive customers, after successfully migrating their legacy archives to Office 365, were still not shutting down the archive due to the need to keep the Journal available for regulatory reasons. Archive2Azure was originally designed to move those legacy Journal files to Azure intact and manage them there, so that the expensive legacy email archive could be shut down, saving costs. Soon after, clients began asking if they could move other content up to Azure. They wanted to move the majority of their low-touch or grey unstructured data up to the cloud to save on infrastructure costs.
Archive2Azure provides a much-needed management layer to Azure so that customers can utilize the full Azure stack to better store and manage their unstructured data. Archive2Azure is the only Azure application to enable very low-cost storage, retention/disposition, custom on-demand indexing and search, and legal case management including search, access controls, review, hit highlighting, tagging, and export.
We will be closing our discussion with Bill from Archive360 in the next post.
We are in discussion with Bill Tolson, Vice President of Marketing, Archive360. He will be talking about Archive360 and its solutions. In fact, Archive360 has recently introduced a new solution, Archive2Azure. Bill will explain what was going on with their customers and/or in the market that prompted its introduction, and how Archive2Azure addresses these trends and market needs. In addition, he will discuss the ideal use cases for Archive2Azure. And finally, we will learn about Archive360’s existing global presence (in what countries they offer their solutions) and how they plan to expand it further in 2017-18.
Q: Name and title?
A: Bill Tolson, Vice President of Marketing, Archive360
Q: Could you introduce Archive360 and its solutions?
Archive360 has been, and remains, the market leader in legacy email archive migration software, having successfully migrated more than 12 petabytes of data for more than 500 organizations worldwide since 2012. The Archive360 product line consists of three solutions: Archive2Anywhere, Archive2Azure, and FastCollect.
The company’s first product, Archive2Anywhere, was introduced in 2012 and is the only solution in the market specifically designed to deliver fast, trouble-free, predictable legacy email archive migrations, with verifiable data fidelity and defensible chain of custody reporting. A big differentiator for Archive2Anywhere is our ability to rehydrate (or recombine) email stubs in the Exchange mailbox with the archived message/attachment in the legacy archive before the content is migrated. Many other migration vendors simply delete the stubs, which is legally dangerous because email stubs can take on additional metadata that could be responsive to eDiscovery. Archive360 has never been replaced at a client, thanks to its extremely consistent and proven technology. Archive2Anywhere is able to migrate 11 of the most popular legacy email archives.
Archive2Azure is a managed compliance storage solution based on Microsoft Azure. It is delivered as part of the Archive2Anywhere platform and is the industry’s first solution allowing for complete elimination of legacy email archives and long-term management of other low-touch or ‘grey’ data, including inactive employee work files and PSTs, file system content, system-generated data, and data sets generated from eDiscovery. Archive2Azure delivers long-term, secure retention and management of low-touch unstructured data, including journal email for regulatory compliance.
FastCollect for PSTs is the only automated PST discovery and migration solution that ensures ALL message stubs, metadata, and message content are located, rehydrated, migrated, deleted, and audited in a manner ensuring full legal and regulatory compliance. FastCollect for PSTs enables you to scan your enterprise, including individual desktops, for all or select PSTs and migrate them to the new repository, automatically.
Archive360 is a global organization and delivers its solutions through a network of specialist partners. Archive360 is a Microsoft Cloud Solution Provider and the Archive2Azure solution is Microsoft Azure Certified.
We will continue the discussion with Bill Tolson in the next post…
Bosch Technology Solutions launches innovative smart solutions for transforming India. As we all know, software expertise is a key differentiator for smart solutions. India is a developing country, and there is huge scope for connected industry because the country is undergoing a paradigm shift: India is positioning itself as a global manufacturing hub. In that respect, Bosch Technology Solutions brings a wide range of initiatives, comprising a number of technology solutions and services such as automation, drives, software, sensors, mobility, and predictive maintenance. These solutions cater to many industry segments in India, including healthcare, manufacturing, and pharmaceuticals. Some of the best practices of Bosch manufacturing are part of these technology solutions. The smart initiatives include:
Smart Mining: The Bosch Group is widely present in the mining industry and is a pioneer in hydraulics, fuel injection systems for mining equipment, and security technologies. Bosch Technology Solutions includes smart solutions for the mining sector. As a strategic move, Bosch plans to tie up with local system integrators in order to gain a deeper footprint and thereby provide end-to-end solutions. The key solutions focus on minimizing downtime and lowering lifecycle costs. In addition, the Bosch IoT platform connects miners and mining equipment.
Bosch Technology Solutions Launches Smart Initiatives
Smart Solution for Construction Industry: India is focusing on infrastructure development by stimulating growth in the construction equipment sector, and Bosch Technology Solutions is contributing significantly to this push. The new range of affordable yet highly effective power tools from Bosch India is a big boon to the segment.
Smart Mass Transportation System: Bosch Technology Solutions is bringing a major transformation to all key metros in the country. At most of these places, the key transportation technologies come from Bosch. These include passenger announcement systems, passenger information display systems, and CCTV with end-to-end integration. These solutions cater to metros, airports, and railways.
Smart Agriculture: The last vertical from Bosch Technology Solutions that we discuss here is agriculture. This solution covers farm and livestock operations, and supply chain management. The purpose is to enhance productivity and sustainability. Agriculture operations are being enhanced by digitization, automation, big data, IoT, crop modelling, and sensors. Bosch Technology Solutions thus becomes a catalyst for India’s growth.
India is advancing fast, and so are its technology needs. Technology plays a major role in management and mobility, and that is where Bosch India is playing a significant role in transforming the country. The Smart Traffic Solution from Bosch is one such leap toward fulfilling Prime Minister Narendra Modi’s dream of building 100 smart cities in the country. Bosch India is taking many initiatives in this direction.
The deadlines and targets are being met well, so the progress is well under control. Bosch is bridging a wide gap by providing solutions to streamline and manage India’s traffic. That is the purpose of a smart traffic solution.
In the smart city segment, a smart traffic solution is a prime requirement, and Bosch offers end-to-end solutions here by drawing on its core expertise in Mobility Solutions. Bosch India has already been running pilot projects in this regard for more than a year, and a pilot is running successfully in Bengaluru, where Bosch is prominently active under the umbrella of smart city solutions. For instance, the Intelligent Parking System allows commuters to find a space in the parking lots of crowded areas by detecting available parking slots in the vicinity of their destination.
Smart Traffic Solution From Bosch India
Parking slot occupancy is detected through sensors. The smart traffic system further allows the end user to search, book, and pay for a parking space via mobile. Apart from the Intelligent Parking System, the Intelligent Traffic Management System includes Citizen Safety & Security and Intelligent Transportation Systems.
The key differentiator for any smart traffic system is software expertise. That is where Bosch has an edge over its counterparts working in the same segment, and the credit goes to Bosch’s huge R&D division, which designs state-of-the-art solutions. Other components of the smart traffic solution include:
- Smart Traffic Signal Management
- Smart Transport System including a real-time alert mechanism
- Smart Route Management System
Bosch India talks about Beyond Mobility. A lot of work has been done in their R&D divisions to offer connected and smart solutions. The time has come when business segments need to think beyond mobility. Various research initiatives by Bosch India and the Make in India concept have enabled a mid-price product range, a superb example of strong localization. Some of the initiatives by Bosch India towards Beyond Mobility are:
- Running smart city pilot in Bengaluru
- Creating smart manufacturing solutions for Indian factories to enhance productivity
- Smart solutions for the mining sector in order to increase operational efficiency and enhance productivity (Smart Mining).
- Introducing affordable power tools for various industrial segments like construction, metalworking, and woodworking.
- Solutions to manage and control India’s fast growing mass transport needs.
- Smart agriculture solutions to connect and collaborate the complete ecosystem.
Bosch is a global enterprise and a leader in technology and services. This is its first-ever exhaustive showcase of connected and smart solutions in the business sector beyond mobility in India. Three of its divisions contributed around 15 percent of the Group’s turnover in India in 2015: Industrial Technology, Energy & Building Technology, and Consumer Goods. That itself shows the rising trend of these industries in the country. These solutions are now available in India, and relevant industries can deploy these Beyond Mobility solutions from Bosch India.
Beyond Mobility by Bosch India
“Customers are increasingly looking for suppliers who can offer them end-to-end solutions. Bosch is leveraging the combined strengths of its products to offer integrated solutions to meet the demands of customers across key growth sectors in India,” says Dr. Steffen Berns, Managing Director, Bosch Limited. Dr. Steffen is also president of the Bosch Group in India.
The world is changing quite fast, and that is impacting requirements: connectivity requirements are increasing tremendously. It is the linkage of the physical and virtual worlds that provides the opportunity to offer new applications. “Bosch is uniquely positioned to offer meaningful solutions by harnessing its expertise across hardware, software, and services,” mentions Dr. Berns. He adds, “India benefits significantly from this aspect as it is home to Bosch’s largest engineering center outside Germany, and we do understand the country’s local market requirements.” Bosch India is thus offering connected solutions and services for smart cities, home appliances, smart manufacturing, packaging, construction, transportation, mining, energy, and agriculture.
This is the concluding post from my discussion with Gary Watson, VP of Technical Engagement and Nexsan Founder. In the first post, we discussed the salient features of his company Nexsan and its unified storage solution with complete availability and security. In the second post, we discussed how ransomware has become a key concern and how Nexsan addresses it. Similarly, 3rd-party data hosting is another pain point that leads to fear among CIOs. Here, in this concluding Q&A, Gary highlights what other key trends are there in the marketplace and what influence they have had on Nexsan’s current and future offerings.
Here we go:
Q: What other key trends are you seeing in the marketplace, and what influence have they had on your current/future offerings?
Gary: The big story is mobility, and in our view, the lack of preparation for this trend by the legacy storage companies. The difficulties of getting at corporate data have led to the wild explosion of 3rd-party data hosting services like box.com and Google Drive, which as I’m sure you are aware, have led to a lot of fear and frustration among CIOs. Users are going around their IT departments, potentially at some risk to themselves and certainly at much risk to their employers, and this should be a wake-up call to the on-premise storage vendors that they need to provide solutions to these use cases that exceed the convenience and performance of the “Shadow IT” vendors. This is what led to our acquisition of Connected Data last year, and our efforts to make an enterprise-ready version of their battle-tested concept, which led us to develop UNITY.
Note – We have filed for a trademark on UNITY and are in litigation with EMC over a product they launched later that’s using the same name. Just to be clear, these are two unrelated products.
Thank you, Gary!
To conclude, I would agree that 3rd-party data hosting leads to fear and frustration among CIOs.
We are in conversation with Gary Watson, VP of Technical Engagement and Nexsan Founder. In the previous and opening post, he talks about his company Nexsan, the company’s elevator pitch, and a brief description of key offerings. As we learned, a unified storage solution has become a bare minimum necessity for any enterprise. It has to be seamless and secure in all aspects, thereby enabling a connected workforce. As mobility is the key, the data must be accessible by all relevant stakeholders anywhere and anytime. But as flexibility increases in technology and business, threats and vulnerabilities increase too. Ransomware is one such critical issue Gary will be talking about in this post.
We proceed with our discussion with Gary below.
Q: Is ransomware a key concern for your customers? How are they addressing it?
Gary: We’ve had customers who had their data stored in Assureon, who lost no data and very little time during ransomware attacks as the architecture is highly resistant to malware of any sort. We also have customers using the snapshotting feature of our other storage products to provide a fallback position that allows restoration of data without the lengthy downtime of a conventional recovery from backup media. We have noticed an increase in the amount of ransomware that specifically targets snapshots, and while our systems have not as yet been targeted, we don’t suggest snapshots as the only defense. Assureon provides a far higher degree of protection.
Apart from that, we tell our users to follow best practices including the use of anti-malware software, repeated training of users, and keeping patches up to date. Such conventional efforts certainly reduce the risk of an attack, but since something like a billion dollars will be paid to ransomware criminals during 2016, we can expect the pace and sophistication of the attacks to continue increasing for the foreseeable future. Our position is that the storage hardware needs to start taking some responsibility for protecting at least the unstructured data that is unlikely to change, and we plan to continue innovating along these lines.
We will be closing our discussion with Gary from Nexsan in the next post.
We are in discussion with Gary Watson, VP of Technical Engagement and Nexsan Founder. He will be talking about unified storage solutions. In fact, these solutions are focused on seamlessly and securely enabling a connected workforce. It is all about empowering enterprises to secure the valuable business data. At the same time, it allows users to access this data anytime, anywhere.
Q: Your name and title?
A: Gary Watson, VP of Technical Engagement and Nexsan Founder
Q: Company elevator pitch?
Nexsan™ is a global leader in unified storage solutions that are focused on seamlessly and securely enabling a connected workforce. Its broad solution portfolio empowers enterprises to securely manage, protect and utilize valuable business data – while allowing users to sync, share and access files from any device, anywhere, anytime. Headquartered in Campbell, California, Nexsan is a wholly owned company held by Imation Corp. (NYSE:IMN).
Q: Brief description of key offerings?
We have three product lines – our UNITY unified block and file storage solution with integrated mobile access, our Assureon high-integrity archiver which protects valuable fixed content against accidental or deliberate modification or destruction, and our E-Series P and BEAST block storage solutions for Fibre Channel, SAS, and iSCSI SANs.
UNITY is a filer that can scale from a few terabytes to over 5 petabytes while providing secure access to data via iSCSI, Fibre Channel, NFS, CIFS/SMB, FTP, HTTP, and uniquely, client software that is available for iOS, Android, Windows, and MacOS. It also has integrated support for Nexsan’s popular Assureon secure archiver.
Nexsan is also well known for its 17-year history of providing dense and reliable storage for SAN and DAS applications. Our flagship E-Series P systems have just been updated with new controller hardware which more than doubles the throughput and connectivity as compared to its very popular predecessor.
We will continue the discussion with Gary Watson in the next post…
This is the concluding post from my discussion with Connor Cox, Director of Business Development, DH2i. In the first post, we discussed the salient features of his company DH2i and its container virtualization and management software DxEnterprise. In the second post, we discussed how DxEnterprise differentiates itself from competitors. Here, we talk about DH2i’s new channel partner program, DxAdvantage. In this concluding Q&A, Connor highlights the key features of the DxAdvantage Partner Program and how it creates a win-win situation for all.
Here we go:
Q: You have announced a new channel partner program — could you provide the program highlights, what types of partners are you seeking, and what is the program’s value for partners?
Cox: The DxAdvantage Partner Program has actually been going strong for a couple of years already, and we have crossed the threshold where partners are starting to reach out to us proactively instead of us having to hunt for partners ourselves. We are trying to build on this organic momentum by getting the DxAdvantage Program more publicized through this official announcement. Currently, we are represented by partners on 4 different continents: North America, Europe, Asia and Africa.
We are looking for value-added resellers and consultants, and we are also open to OEMs if they have an innovative idea of how to integrate DxEnterprise software with their own product offerings.
The DxAdvantage Partner Program entitles partners to significant discounts on DxEnterprise software and includes technical sales training, opportunity identification, and sales collateral in a customized portal for their business. The program gives partners the power to deliver superior outcomes to their customers with SQL Server environments in terms of cost savings, next-level consolidation, and simplified management. A lot of our partners also view DxEnterprise as the missing piece in their product offerings—the ability to offer a unique container solution for new and existing SQL Server workloads.
We are in conversation with Connor Cox, Director of Business Development, DH2i. In the previous and opening post, he talks about his company DH2i and throws light on DxEnterprise. As we learned, DxEnterprise is a powerful tool. It helps you control your server and data center costs substantially, and in addition to cost savings, it removes complexities. DxEnterprise is container virtualization and management software that decouples new and existing SQL Server instances from the underlying infrastructure to make them portable. It then enables the stacking of these containerized instances. This improves server utilization and reduces the number of OS licenses. In a nutshell, it enhances manageability, reduces complexities, and cuts down operating costs.
We proceed with our discussion with Connor below.
Q: How is DxEnterprise differentiated over competitors, like Docker?
Cox: To be honest, we don’t even think we should be grouped in with companies like Docker. Companies like Docker focus on stateless applications with their containers and the main goal is distribution and provisioning—something they have grown extremely proficient at. However, there is very little built-in technology focusing on the management of these containers, especially in terms of high availability. On the contrary, DH2i is much less focused on the container and more so the advanced management framework it enables. We can containerize new and legacy deployments of stateful applications like mission-critical SQL Server.
The closest thing we qualify as a competitor is WSFC in the sense that we both compete as a solution for SQL Server instance-level HA—though our software is much more than just an HA solution. We differentiate DxEnterprise predominantly on the simplistic management that it enables at a low cost. DxEnterprise allows for unlimited cluster size on Standard Edition SQL Server and clusters can be made up of any mix of OS and SQL versions/editions (back to SQL Server 2005 and WS OS 2008R2). They don’t have to be like-for-like—a requirement of failover cluster instances.
We will be closing our discussion with Connor from DH2i in the next post.
We are in discussion with Connor Cox, Director of Business Development, DH2i. He will be talking about DxEnterprise and the company’s newly launched DxAdvantage Channel Partner Program. In fact, DxEnterprise is a unique concept. But before that, a little background on the company.
DH2i provides Microsoft Windows Server application portability and management solutions. Its flagship solution, DxEnterprise, containerizes and decouples Windows Server applications, such as SQL Server, from the host OS and underlying IT infrastructure. Customers can simplify and dramatically improve the management of their data center environment. In addition, they can ensure SLA compliance and lower costs by 30%-60%. DH2i is on Gartner’s list of Cool Vendors in the “Cool Vendors in Servers and Virtualization, 2015” report.
Q: Name and title?
A: Connor Cox, Director of Business Development.
Q: Could you introduce DH2i and DxEnterprise?
A: DH2i is a software company founded in 2010 by Don Boxley and OJ Ngo with the mission of making software to reduce the cost and complexity of deploying mission-critical Windows Server applications. Our initial focus has been SQL Server for two reasons: it’s expensive to license and it’s mission-critical.
DxEnterprise is container virtualization and management software that decouples new and existing SQL Server instances from the underlying infrastructure to make them portable. We encourage customers to stack these containerized instances to up their server utilization and reduce the number of OSes they need to license and manage as well. In addition, an automated, instance-level high availability engine is built-in to the software with QoS controls to ensure that all your SLAs are met. In total, DxEnterprise is a cost saving, consolidation, HA, and management solution that can help virtually anyone with SQL Server under management.
We will continue the discussion with Connor Cox in the next post…
This is the last post of an interview series with Bruce Coughlin, EVP, CTP. You can read the first post here and the second post here. In this concluding post, Bruce talks about the expansion of their Cloud Adoption Program and how CTP plans to scale it globally to match clients’ needs. Bruce emphasizes the need for a Digital Innovation Practice in order to cater to IoT and other emerging technologies. Digital innovation is the future.
What are the expansion plans for 2017?
First, we will continue expanding CTP’s Sales and Delivery teams both in the US and across EMEA. The market opportunity is truly staggering and we look forward to scaling globally to support our clients’ needs.
Second, we will expand our Cloud Adoption Program in order to cover end-to-end cloud adoption for clients across all industries and on all major public cloud providers including Amazon Web Services, Google Cloud Platform, and Microsoft Azure.
Third, we are growing our Digital Innovation Practice to cover Internet of Things and other emerging technologies as we see that all of our enterprise clients are putting digital transformation initiatives at the center of their corporate strategy.
Fourth, we are expanding CTP’s managed services capabilities. Last April, we announced a strategic partnership with Rackspace to deliver professional and managed services for enterprises. Rackspace and CTP have integrated our offerings to create a new category of managed services and provide enterprise clients with unparalleled end-to-end cloud services.
Is cloud the ultimate future for an enterprise? What benefits does it bring?
Indeed it is, for most applications and data, but not all. The benefits include cost efficiencies, at about 40-60 percent in operational cost savings. However, the largest point of value is the ability to bring agility to an enterprise. When an enterprise is able to scale rapidly to meet demand or enter new markets quickly, they unlock huge business value.
Digital Innovation Is the Future
What are your views on an organization outsourcing its complete IT function?
I’m not sure enterprises are going to outsource all IT functions. They will, however, outsource some operations, platform support, and development. Keep in mind that the enterprises need to drive the innovation aspect of their businesses, and IT should be there to support the business, no matter if it’s outsourced or not.
Is cloud security still a concern?
We solved the cloud security issues a few years ago. In fact, most systems residing on public clouds are typically more secure than on-premise systems. We’ve completed over 350 enterprise cloud projects, many of which were in highly regulated industries including financial services. Ensuring security is systematic to everything we do.
Which kinds of organizations don’t prefer cloud adoption?
Risk-averse organizations typically wait for others to succeed before they dip their toes in, and even then it’s a scaled approach. These are typically industries that are less aggressive in their markets, such as manufacturing or retail. That being said, we are seeing successful cloud adoption projects across all industries and organization types.
We are talking to Bruce Coughlin, Executive Vice President (EVP), Cloud Technology Partners (CTP). In the previous post he talks about CTP’s comprehensive framework for cloud adoption. It is very interesting to note how end-to-end cloud services mark a successful 2016 for CTP. In this post, we are discussing Cloud Adoption, Digital Innovation, and Managed Services. And how all these help in accelerating their collective customers into successful cloud deployments.
What made CTP open operations in EMEA? What are the primary goals?
Our key partners including Amazon Web Services (AWS), Microsoft, Google, and Rackspace are rapidly expanding their reach across EMEA. Their expansion illustrates the demand for cloud services that exists in enterprise and global accounts. We are thrilled to begin leveraging our own enterprise cloud adoption, digital innovation and managed services to help accelerate our collective customers into successful cloud deployments.
Our expansion abroad is in line with the rapid increase in cloud adoption across the region. To support this growth, AWS is opening new regions in the UK and France to complement its other European regions in Frankfurt, Germany and Dublin, Ireland. Microsoft currently operates six Azure European regions and Google is expanding its cloud services from four European data centers in Ireland, Finland, Belgium and the Netherlands.
End-To-End Cloud Services Deliver Great Results
This commitment is encouraging to many companies looking to migrate their applications to the cloud. We partner with the world’s leading cloud platform providers and now have the dedicated resources abroad to help accelerate cloud adoption for our global clients.
Would you like to share one of the biggest success stories of 2016?
2016 was a huge year for CTP. We doubled the size of the company, expanded globally and established a strategic partnership with Rackspace to enable new end-to-end cloud services for our enterprise customers. Early in the year, we launched The Doppler Quarterly, our flagship cloud computing news, and best practices publication that is read by thousands of IT professionals around the world. We closed our series C round, achieved the AWS Migration competency for our work migrating enterprise applications to the cloud and were named a Gartner “Cool Vendor in Cloud Computing.”
As Executive Vice President at CTP (Cloud Technology Partners), Bruce Coughlin is responsible for all customer-facing functions from sales through delivery. In his role, he works with all of Cloud Technology Partners’ go-to-market functions, including sales, technical pre-sales, marketing and alliances, to build and execute on global expansion strategies. From a delivery perspective, he works across Cloud Technology Partners’ practice areas to build the teams, structure, and process required to execute on the company’s industry-leading market position and service offerings. Additionally, Bruce Coughlin has responsibility for guiding the strategic evolution and growth of CTP on a global basis.
Please tell us about CTP’s comprehensive framework for cloud adoption.
Based on the best practices from hundreds of successful engagements, CTP’s Cloud Adoption Program brings together the best technology, practices and learnings for enterprise-wide cloud adoption and has been a huge success with our enterprise clients who want to ensure their cloud initiatives start – and stay – on track.
The Cloud Adoption Program is a comprehensive approach to enterprise cloud covering:
● Cloud Strategy & Economics
● Cloud Security & Governance
● Application Portfolio Assessment
● Application Migration & Development
● Cloud Ops
Starting with a workshop, the Cloud Adoption Program aligns key stakeholders and ensures client teams are working towards the same cloud goals and objectives. This prescriptive approach to cloud adoption and application migration has proven invaluable to dozens of enterprises including the most prominent financial services, telecommunications, manufacturing and technology firms in the world.
Cloud Technology Partners
We will be discussing many important aspects with Bruce, such as:
What made CTP open operations in EMEA? What are the primary goals?
Would you like to share one of the biggest success stories of 2016?
What are the expansion plans for 2017?
Is cloud the ultimate future for an enterprise? What benefits does it bring?
What are your views on an organization outsourcing its complete IT function?
Is cloud security still a concern?
Which kinds of organizations don’t prefer cloud adoption?
Project lifecycle and project management have a very close relationship. The success of a project depends highly on its timely closure. Projects rarely close ahead of schedule, but it does happen, and it might even happen repeatedly; in that case, there is something extraordinary about that project manager. But there is a catch. The project manager might be playing a bigger game: projecting over-stretched timelines in the project plan and then squeezing them comfortably. Such flukes, however, do not happen repeatedly, and it is unlikely that all the other stakeholders would overlook them.
Project lifecycle and project management go hand in hand. If you don’t manage a project well, it could have severe impacts on the project, and when a project undergoes serious negative impacts, it might die early. The project may hang without any ownership in that case. Project lifecycle and project management are therefore directly proportional to each other. If you manage a project well, you get credit for it, and vice versa. Here are 4 posts that will help you from various angles. All four posts touch on project lifecycle and project management in a different style; looking at the same thing from various angles always helps.
Project Lifecycle And Project Management
A project manager can’t afford to vanish out of the stage during any phase of a project. It does not mean that you have to be present in front of all team members all the time. That might not be possible, especially during an offshore project when your teams are scattered geographically. But there are ways of being present and making your presence felt across the teams and during all important project review meetings. Read more.
If systems and processes are not in place to manage each and every phase of a project, it leads to panic across the teams. The smooth sailing gets disturbed thereby causing turbulence and unnecessary hindrance. Read more.
Do you have an Idea Factory within your organization? It gives you an indication that the teams are alive and functioning properly. Read more.
Following are the top 15 pain areas of a software project. Read more.
Learning how to enhance project quality is critical for every project manager. Project quality is a multidimensional array of tasks. If a project manager doesn’t master how to enhance project quality, it will create a lot of trouble for him. He will never be able to inculcate the relevant passion in his teams; rather, the existing zeal and fire in the teams will start deteriorating. When you are the captain of the ship, in the case of any crisis all eyes will fall on you, and everyone will have a lot of expectations. At that moment, you have to be their source of motivation and energy.
Whatever the intensity of the crisis, it is you who has to ensure that the journey doesn’t stop. You will have to bear the brunt of disruptions, and you will have to find newer ways to get results even out of those disruptive moments. That is why learning how to enhance project quality is important; acquiring that expertise matters, and so does sustaining it. In the same context, here are 4 posts:
How To Enhance Project Quality
The definition of QUALITY varies in different contexts. On one hand, we talk about software quality. That means adopting standards and measures. Read more.
Project initiation is the beginning. If there is no initiation, there will be no way forward. Once the ignition does its work, only then can the engine start. If all blessings are intact in the right place, the right team formation becomes a big boon for project drive and completion. Read more.
Project management must have inbuilt sensors. These sensors must be capable of detecting flaws (or shortfalls) in the system. This is important because you need to monitor project progress and quality. These sensors must have a fast response time and should be active in all kinds of environments. That will help you improve the health of the project management ecosystem. Read more.
A successful product development does not mean its deployment will also go successfully. Read more.
For project managers, one art is very important: how to avoid project failures. Once a project manager acquires expertise in avoiding project failures, he can lead other project managers and climb up the ladder. Once you reach this level, you become a natural leader, mentor, and coach for others, and the largest chunk of the benefit goes to the teams working under you, because the smart ones will also start acquiring knowledge on how to avoid project failures, even without your guidance. They start noticing something unique in you, and that starts driving them in that direction. You end up building healthy competition not only among your peers but also within your teams. Here are five posts in that regard:
Overseas projects handling is an altogether different game. You need to be quite careful in selecting and supervising teams for the overseas project. Read more here.
Understanding people-related issues is an art. Some of these issues, if not handled properly, may push the project to a critical edge. Read more.
How To Avoid Project Failures
Project governance challenges are an integral part of a project. There is no point in avoiding these challenges; the best way is to master handling them. In fact, there is always scope for improvement in any kind of governance. It can always be enhanced and improved to an optimized level. Reaching an optimized level does not mean the scope for further improvement is over: the day you stop seeking improvement in governance, your project conditions may start deteriorating. Read more here.
Project management means no failures. As a matter of fact, management is meant to clear all obstacles, and projects do encounter obstacles. In fact, a project manager needs to anticipate risks well in advance. Mitigation of risks is important, and the other important aspect is to learn from failures. Read more.
What happens when a project fails? It is the turn of a project manager to give valid reasons to safeguard himself. Read more.
Continuing from my previous post on Custom Insights, let us learn more about it. Firstly, Apteligent has come out with its latest industry report, ‘Network Crash Edition’. The report reveals certain amazing and shocking findings, many of them previously unknown. It says interactions with cloud services result in a substantial increase in the number of crashes in iOS and Android apps. Secondly, it also evaluates each app store category and the impact networking issues have on it. Moreover, it points out why these issues keep occurring at such a high rate. All of this implies the power of Custom Insights.
Key features of Custom Insights™ are:
- Correlate vast amounts of underlying app data with marketing performance metrics in a simple way
- Make data-driven decisions with custom reports containing the statistical analysis and insights to improve your mobile business
- Leverage bleeding edge data science tools and Apteligent’s global data to contextualize your app’s data for predictive analysis
Custom Insights from Apteligent
As a matter of fact, key takeaways from the report include:
- 20 percent of mobile app crashes are correlated with a network issue
- Android Nougat is 2.5 times more likely to have a network crash than iOS 10
- Fabric, Twitter’s mobile platform, crashes apps. It ranks third worst in analytics and fifth worst in advertising
- Medical, finance, and shopping apps are extremely susceptible to network crashes
- 88 percent of network calls involved in a crash were successful but returned unexpected data that led to a crash. In fact, 10 percent of successful network calls returned no data and then led to a network crash
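The last takeaway above suggests a defensive-coding angle: a successful network call is no guarantee the payload is usable. As a rough illustration (the function and payload shape here are hypothetical, not taken from the report), a response can be validated before use so that empty or unexpected data degrades gracefully instead of crashing:

```python
import json

def parse_user_response(raw_body):
    """Validate a network response before use instead of assuming its shape.

    Returns an (id, name) tuple, or None when the payload is not what we
    expect -- the caller can then fall back gracefully rather than crash
    on an empty body or missing keys. (The payload shape is invented.)
    """
    if not raw_body:                 # successful call, but no data
        return None
    try:
        payload = json.loads(raw_body)
    except ValueError:               # body was not valid JSON
        return None
    if not isinstance(payload, dict):
        return None
    user_id = payload.get("id")
    name = payload.get("name")
    if user_id is None or name is None:   # unexpected data shape
        return None
    return (user_id, name)

print(parse_user_response('{"id": 7, "name": "Ana"}'))  # (7, 'Ana')
print(parse_user_response(""))                          # None
```

The point is simply that every exit path returns something the caller can handle, which is the discipline the 88-percent finding argues for.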
It is quite critical for organizations to be able to find how user behavior causes certain network calls. In fact, these calls can easily lead to a crash. Such crashes increase customer dissatisfaction and can even cause deterioration in business and brand reputation. As Andrew Levy, co-founder and CSO of Apteligent, summarizes about Custom Insights: “There are plenty of ‘big data’ solutions in the market today. The unique value in Custom Insights is that it allows organizations to go beyond dashboards of data. Our platform makes proactive recommendations by analyzing data across tens of thousands of apps.”
Apteligent Custom Insights enables you to find critical data points. As a matter of fact, Apteligent is a global company providing predictive app intelligence. Its most recent data report carries the title ‘Network Crash Edition’. The report highlights discoveries on the basis of its newest offering, Apteligent Custom Insights. Andrew Levy is the CSO (Chief Strategy Officer) and co-founder of the company. The key findings from the report are:
- Android Nougat is 2.5 times more likely to suffer a network crash than iOS 10
- Fabric, Twitter’s mobile platform, crashes apps. In fact, it ranks third worst in analytics and fifth worst in advertising
- As a matter of fact, medical, finance, and shopping apps are extremely susceptible to network crashes
- 88 percent of network calls involved in a crash were successful but returned unexpected data that led to a crash. In fact, 10 percent of successful network calls returned no data and then led to a network crash
- Finally and amazingly, 20 percent of mobile app crashes correlate with a network issue
Since the outcome results from tens of thousands of mobile apps, it represents hundreds of millions of application launches. The revelation that cloud services are a source of crashes brings Apteligent’s Custom Insights to the forefront. In fact, it becomes an essential tool for companies. Rather, it will help them improve the overall experience faster and more accurately, because the tool touches some new but crucial data points to pinpoint the exact issues and causes. In fact, it reveals many new insights.
Apteligent Custom Insights
As a result, Apteligent’s Custom Insights™ reveals cloud services as a source of crashes. It reveals that almost 20 percent of mobile app crashes are due to a network issue. This is a big concern for organizations and apps. In fact, it turns more serious when we are talking about big data, mobility, IoT, etc. Apteligent is an expert and a leader in predictive app intelligence.
We shall be talking more on Apteligent Custom Insights in the next post.
ERP is never a one-time expense. Along with the capital investment in any ERP, you invite a big operational expense. The major chunk of this annual investment goes into the top three activities. First and foremost is the development of new reports. Second is customization of existing reports. And third is document processing. You really need to introspect here. How many developers do you have in the organization for ERP output reports and document processing? What is the per-developer annual expense? How much backlog is there all the time? How much idle time do they get when there is no considerable requirement?
I think with advancements in technology and the entry of smart SAP output management tools, the scenario is going to change drastically. There is going to be a decrease in demand for ABAP coders. A tool like Compleo is powerful for ERP output reports and document processing. It is a SAP-certified product from Symtrax. Compleo is powerful enough to help manage complex reporting and document processes, and their distribution, with ease and without ABAP programming. Businesses using SAP are adopting Compleo fast so as to cut down their large pool of ABAP programmers, thereby gaining a long-term benefit.
ERP Output Reports and Document Processing
As a matter of fact, three major global giants have certified Compleo. These are IBM for their AS/400 systems, SAP, and Oracle. All three agree that Compleo from Symtrax is a magical tool that reduces development cost drastically. In fact, Compleo does wonders with any ERP you use. It comes with an on-premise, per-user license model. The product is modular, so you need not purchase all modules; buy as per your organizational need. Moreover, whatever modules and number of licenses you procure, the cost recovery is fast. In fact, in most cases, you recover the cost within a year. Thereafter it is a profit for your organization in terms of saving the cost of high-level developers. I will talk about the product features in detail in the next post.
Whenever you plan, you can look into these key considerations to start business process automation. One thing you must take note of: it is not possible to automate every process. There has to be a proper assessment, and this is true for each process. Some processes might not be appropriate to automate. And in fact, there will be some processes that you will find difficult to automate completely. That is why it is important to learn the key considerations to start business process automation.
An in-depth analysis of the processes is an important part of the key considerations to start business process automation. Key users are the right candidates for such probing and understanding. They will be able to point out many insights. How many times does a process require deviation? How much repetition is there? Who approves deviations? Who analyzes the repetitive jobs? These are the important insights. Any automation must result in cutting the workload. In fact, it must free up staff time, and the staff must utilize this free time in carrying out other activities, those that require their skills and experience. As a matter of fact, the whole setup must become more efficient and productive.
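As a rough sketch of such probing (the log format and process names here are invented for illustration), even a few lines of analysis over a process log can surface the repetition counts and deviation rates the key users are asked about:

```python
from collections import Counter

# Hypothetical process log: (process_name, outcome) pairs gathered
# from key users; "deviation" marks a run that left the standard path.
log = [
    ("invoice_approval", "normal"),
    ("invoice_approval", "deviation"),
    ("invoice_approval", "normal"),
    ("goods_receipt", "normal"),
    ("goods_receipt", "normal"),
]

runs = Counter(p for p, _ in log)                      # how much repetition?
deviations = Counter(p for p, o in log if o == "deviation")

for process in runs:
    rate = deviations[process] / runs[process]
    print(f"{process}: {runs[process]} runs, {rate:.0%} deviation rate")
```

Highly repetitive processes with low deviation rates are the natural first candidates for automation; high deviation rates signal a process that needs human judgment.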
Key Considerations To Start Business Process Automation
Automation without gathering complete information from the respective key users might go to waste. As a matter of fact, this information includes complete process knowledge. Always take the key users along during the automation journey; otherwise, handing over a system at the end might not get you proper results. Their consent is important. They must get training to become conversant with the new system, and proper hands-on practice is important in that regard. It helps them use the system effectively and efficiently, which in turn ensures a smooth transition. Definitely, there is resistance from users to any change happening. One reason for this is the fear of losing importance in the organization. But this is not true: they must understand that the new system will increase their efficiency and importance.
In fact, the management always seeks results, especially whenever there is an investment. Implementing BPA involves investment. Therefore, it is important to monitor the progress of automation. These key considerations to start business process automation are always helpful.
Business Process Automation, Business Process Management, and Business Process Engineering (BPA, BPM, and BPE) are different from each other. It is important for a business to clearly understand the difference between BPA, BPM, and BPE. BPA, or Business Process Automation, in simple terms is using technology to automate business activities or services in order to achieve a particular process, function, or workflow. Business processes evolve out of the various activities and functions of the business. There needs to be a business strategy in place, and automation must result in cost reduction.
Firstly, it involves the integration of various systems and applications. Secondly, it also results in restructuring within the organization. And all this is done with the help of technology. ERP is the best example of BPA, because it involves all functions and key processes of the organization. To understand the difference between BPA, BPM, and BPE, it is important to understand each.
BPA, BPM, and BPE – Main Differences
Business Process Management or BPM is an alignment of all business elements in order to improve business operations and performance. BPM must enhance business efficiency, creativity, flexibility, and technology. It involves a systematic approach to enhance business workflow. In fact, it helps in achieving business goals in a better way. The three key focus areas of BPM are lowering human errors, making communication a highly effective tool, and optimizing clarity in roles. It aims to improve overall as well as individual performance. In fact, it is a process of managing and optimizing an organization’s processes. That is why we sometimes describe it as a ‘process optimization process’. There is no doubt that organizations that are process driven perform better than those that are people driven.
Finally, we talk about BPE to understand the difference between BPA, BPM, and BPE. BPE is a framework to enable the execution and maintenance of various key business processes. It aims to leverage a proper handshake between the business processes and business applications in an organization.
IT systems are bound to bring automation to an organization, in one form or the other. There are enormous benefits of business process automation. In fact, sometimes the benefits come to light only after automation. Mostly you get more benefits than you could perceive. The only thing is to move in the right direction. Definitely, automation is always there for a purpose, and this purpose is obviously bigger than the expense incurred in doing it. Actually, projecting the right benefits is also an art. Management will always seek the right justification for any investment. If you are not able to convince your management about automation, you lack something: either you are not clear about the benefits of business process automation, or you lack convincing power. Your confidence and knowledge are at stake at this juncture.
IT systems bring ease and comfort to the users in an organization. This is the foremost criterion of automation. In fact, these factors must be at the top of the benefits of business process automation. Another approach is to do a POC (Proof of Concept) or a pilot to convince key users. This way you can easily showcase the benefits of business process automation. In fact, if you are able to convince them, they become your voice in convincing management for investment approval. The design of your various systems must be such that they easily talk to each other. This approach enables you to automate in a much faster, easier, and more effective manner. It makes integration and information flow easier.
Benefits of Business Process Automation
Selecting the right tool for getting maximum benefits of business process automation is very important. In fact, the tool must be capable of streamlining all the key processes, and it must also have the ability to integrate the operations and applications in place. Overall, the purpose is to streamline the business.
A business comprises two sets of patterns. The first is a set of predictable patterns, and the second is of unpredictable patterns. The former results from operations that are repetitive in nature, while the latter arises out of operations that are not. But both cases involve a series of processes. Business process automation is possible for predictable patterns and series of operations. Basically, any automation should focus on the primary operations that run the business. Such operations are the core processes of the business. In fact, these are mission-critical processes.
Automation is an evolving mechanism. It triggers from the knowledge an organization carries. And once you automate key business processes, it helps in enhancement of the same processes. That is how the circle completes. Automation brings satisfaction in an organization. It brings maturity in roles and responsibilities. As a matter of fact, it also brings maturity in business and business processes.
Any BPM (Business Process Management) tool is useless if it doesn’t talk about automation. In fact, business process management is incomplete if it lacks focus on automation. There has to be a complete study of the existing IT systems. You can also think of procuring some exclusive BPA tools. There might be a need for re-engineering of business process management in the organization. Whatever system you deploy in the organization, it aims to fulfill some business process or the other. Business processes evolve with the needs of a business. A business has a number of needs, and IT systems can’t cater to all of them. A business strategically keeps some processes out of the system. But it is not fair to keep the key processes out of the IT systems’ scope.
Automation Brings Maturity in Business
Similarly, there are many BPM tools available in the market. It is important to ascertain the need first.
Business Process Management and Business Process Automation have a deep and direct relationship. I have seen many business process management experts who don’t talk about automation. In fact, they should highlight the key business benefits of automation and how business process automation can act as a catalyst for the business. Automation may lead to different action points. In fact, automation may lead to major changes in business processes. And as a matter of fact, it might also lead to a change in responsibilities and roles for many in the organization.
Coming back to my point above: why don’t many experts educate organizations on business process automation while talking about business process management? They might not be aware of the benefits. It may also happen because they belong to a pre-digital era where data processing was the prime job of IT. Unless you taste the pudding, you don’t get the confidence to talk about it.
Business Process Management – BPM
On the other hand, the younger generation belonging to the digital era wants automation in every process. In fact, if there is no automation, they see it as a waste of time and energy. Automation brings different meanings to different organizations, and it also brings a different meaning to the various stakeholders. But the ultimate goal remains the same: it needs to bring down the manual work. Also, it speeds up the processes by means of various good things. Firstly, it reduces rework. Secondly, it makes the same information available universally, thereby reducing the ambiguity of data and information in the organization. Thirdly, it helps people work smartly and more efficiently. Lastly, it helps management take fast and accurate decisions. In fact, with the help of automation you can reach the level of a real-time environment.
To summarize, business process automation means streamlining of processes. As a result, it helps in reducing costs. Automation and business process management need to be applied across the organization. As a matter of fact, this results in restructuring and integration of key applications. It has to happen across the enterprise in order to save time and money.
Business process automation is an essential key to any business today. In fact, it applies to all businesses across the globe. Whether your business runs within the boundaries of a state, region, or country doesn’t matter. Rather, the larger the organization, the higher the necessity. As a matter of fact, it is important, and it must be a top priority for an organization of any size, irrespective of your geographical stretch across the globe. It is helpful in reducing organizational workload. Not only that, it also helps in downsizing the organization. Contrarily, despite reducing manpower, you achieve more. Also, it gets you business results in a shorter timespan, thereby increasing organizational throughput. And in fact, it makes your decision engine run faster, and in a more effective manner.
As a matter of fact, you shouldn’t believe in reducing your manpower; then automation can do magic for you. You can think of expansion and diversification in a smarter way. Shedding your talent pool is a foolish act in my opinion. This is not the right way of reducing cost and increasing profits. In fact, it reflects a lack of vision. Better use this spare pool in a different way: use it for increasing work efficiency. Your older employees are always an inspiration for younger employees, and they can be good role models for the younger recruits. And all this can become true with the help of business process automation.
Business Process Automation and Its Importance
Mostly, people misguide others while talking about business process automation. They project it as a costly affair. On the contrary, that is just a misconception. They might think like this for their own reasons, or rather for lack of knowledge: they are not clearly able to perceive the real benefits. But if you have the ability to understand the above factors, and the capability to make it happen, you are able to recover this investment in no time.
Let me talk about 10 posts on software testers. The first of the 10 posts on software testers is Knock, Knock, It Is a Tester Here, which talks about all that a tester handles during his job. His job includes documentation, test cases, testing, testing reports, feedback, and verification. In fact, the feedback component goes back and forth. Firstly, the tester finds bugs and submits a feedback report of the bugs to the programmers. They, in turn, fix the bugs and submit feedback to the tester.
The second of the 10 Posts on Software Testers is Progressive Software Testing Approach by acquiring Soft Skills – Step by Step. This post highlights the soft skills a tester requires in his career. In fact, these skills help to stay on and progress well.
The third of the 10 Posts on Software Testers is BVT or BVA – Boundary Value Testing or Analysis. This post talks about what we call Boundary Value Testing, also known as Boundary Value Analysis. In fact, the name of the post speaks for itself.
Fourthly, we talk about What is Black Box Testing. Here too, the title confirms what this post is about.
Next, it is Equivalence Partitioning (or Class) Testing Method. Though the title is quite clear about what it covers, I would like to say a bit about it. This post tells the importance, purpose, and significance of Equivalence Partitioning Testing, or Equivalence Class Testing.
10 Posts on Software Testers
My sixth post of the 10 Posts on Software Testers is Twenty ways to ensure complete coverage of software testing. This is an interesting post. In fact, it talks about all modalities of testing at a micro level.
The seventh post is Testing does not ensure risk-free or bug-free software. As a matter of fact, it takes more than testing to ensure foolproofing.
The next post is Twelve essential Steps of Software Testing Life Cycle (STLC). Here, let us re-examine STLC.
My ninth post of the 10 Posts on Software Testers is Change of Career, Testing to Development and vice versa. It is interesting to learn how easy, or how difficult, it is to change hats.
And the last post is 5 essentials while building test environment for software testing.
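To give a flavor of the boundary value and equivalence partitioning techniques from the third and fifth posts above, here is a minimal sketch. The validation rule (ages 18 to 60 are accepted) is a hypothetical example, not taken from those posts:

```python
def is_valid_age(age):
    """Hypothetical rule under test: an applicant's age must be 18-60."""
    return 18 <= age <= 60

# Boundary value analysis: probe each edge of the valid range,
# since off-by-one defects cluster at boundaries.
boundary_cases = {17: False, 18: True, 19: True, 59: True, 60: True, 61: False}

# Equivalence partitioning: one representative per class
# (too young, valid, too old) is enough to cover the class.
partition_cases = {5: False, 40: True, 90: False}

for age, expected in {**boundary_cases, **partition_cases}.items():
    assert is_valid_age(age) == expected, f"failed at age {age}"
print("all boundary and partition cases passed")
```

The two techniques complement each other: partitioning keeps the case count small, while boundary analysis catches the edge defects partitioning alone would miss.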
Overseas projects handling is an altogether different game. You need to be quite careful in selecting and supervising teams for any overseas project. Understanding the culture, people, customs, traditions, language, and economy of the destination country is among the key factors. The teams handling overseas projects require a different kind of training. There are additional skills, in fact, to inculcate in your team, including some soft skills. Definitely, this is in addition to the technical and functional skills that are essential in any case. Handling a different set of management and users in a foreign country requires a certain set of qualities.
Language is a big barrier in overseas projects. In many cases, I had to visit countries where English is not the primary or secondary language. Therefore, you must have clarity on this beforehand. In fact, for day-to-day needs, you must learn some common words, because these words or sentences will be your survival tools. They will help you not only at the customer location but also at the hotel and in the city. At such locations, I always select a user in the customer organization who is good at English. He or she becomes the connecting factor between my users and me during meetings, deployment, training, or discussions. In fact, such users (even if you are able to get just a couple of them) are very helpful in review meetings with the top management there.
Don’t forget to learn about their culture and traditions before landing in the country. In fact, it will help you a lot. Rather, it will be a strong bonding factor there. I use it a lot to build momentum and interest during training sessions. Sharing your culture with them and talking about theirs creates a rhythm and a connection, and it helps me in closing the sessions successfully.
I have a bad habit of becoming one of them. As a matter of fact, it helps me a lot. I don’t mind listening to the music they like, watching their choice of movies, and shopping at the places they recommend.
How do you ensure complete coverage of software testing? First and foremost, check if you have a proper balance in your testing team, in skills as well as responsibilities. If you use automation tools, it is important to engage manual testing teams as well. The testing team has to be conversant with all the relevant processes. All this is important to understand for complete coverage of software testing. The two prime aspects of testing goals are business and customer requirements. In addition, the testing procedures must match global standards. Of course, you may customize these standards as per your customer’s requirements.
If these goals are not on target, then you may face adverse consequences during the implementation phase. Rather, the pain will arise further post-deployment. The larger the coverage, the higher the chances of success. Otherwise, reworks and reinstallations can squeeze your organization badly. This results in losses that are not only financial but also operational and reputational.
Complete Coverage of Software Testing
Finding more bugs in the product during various phases of development does not ensure complete coverage of software testing. In fact, it might be a result of product development lying in poor hands. That needs probing: whether the programmers are weak in skills, or there is a lack of supervision, direction, and management. On the other hand, less detection of bugs does not mean complete coverage of software testing is not in place. In fact, development by highly skilled programmers might result in a strong product with the fewest bugs in it. Documentation of business and customer requirements plays a major role. In addition, objectively driven, complete, and correct requirement capturing helps in achieving your targets of complete coverage of software testing.
Verifying or vetting these documents is the next step. If the customer does not verify and approve these requirements, development and deployment can land in big trouble. After approval, understanding of these documents by all stakeholders is critical. On this basis, the scope of development and testing is created. Ensure each step below to ensure complete coverage of software testing.
- Test strategy, plan, and scope are correct.
- A clear cut scope helps in achieving targets.
- Select the correct test methodology and test tools.
- Test cases must cover all the business rules and customer requirements.
- Prepare a test bed that closely matches the real environment.
- Keep the test environment consistent throughout unless scope changes.
- Test scenarios must be there for all test cases.
- Test report has to be comprehensive.
- Ensure real bug reporting and mitigation.
- Ensure to report assumptions and exceptions.
- Adhere to schedules.
- Fix all bugs and verify.
- Engage customer and development team during the testing phase.
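The step above requiring that test cases cover all business rules lends itself to a simple traceability check. A minimal sketch, using invented rule and test-case IDs:

```python
# Hypothetical traceability check: every business rule ID must be
# referenced by at least one test case, or coverage is incomplete.
business_rules = {"BR-01", "BR-02", "BR-03", "BR-04"}

# Each test case lists the business rules it exercises (IDs invented).
test_cases = {
    "TC-001": {"BR-01"},
    "TC-002": {"BR-02", "BR-03"},
}

covered = set().union(*test_cases.values())
uncovered = business_rules - covered

if uncovered:
    print("coverage gap:", sorted(uncovered))   # BR-04 has no test case
else:
    print("all business rules covered")
```

Keeping such a rule-to-test mapping up to date is what makes "complete coverage" checkable rather than a matter of opinion.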
Who says a good programmer can’t become a good tester? It is important to learn how a good programmer can be a good software tester. Most people will say quite confidently that a good programmer can’t be a good tester. But if you look at it from another angle, you will agree that a good programmer can be a good tester as well. In fact, if a good programmer understands a few realities, he will easily learn how to be a good software tester. While programming is creativity, testing is in a way anti-creativity, because a tester always tries to find bugs in the software that a programmer builds. Now look at it this way.
The role of a tester is to find the defects in code. This, in turn, enhances the capabilities of a programmer. Hence, a programmer should always be thankful to the testers who spend their energies finding flaws in his creation. Once a programmer learns two exclusive arts, it will be quite easy for him to understand how a programmer can be a good software tester. These two arts are self-analysis and self-criticism. Criticism, in fact, is a very constructive tool. If a programmer is able to open his third eye, the eye of criticism, he can easily be a good tester. The love for one’s own creation stays forever. The same is true in the case of a programmer and his program or code.
How Can Good Programmers Be Good Software Testers?
But after coding, if he learns to make a critical analysis of his code, it can help him make it bug-free. Although it is difficult to weigh criticism equally with praise, it is quite important. This, in fact, helps a programmer build stringent test cases, which, in turn, will help in micro-level testing of the product. Rather, it will also remove the generic fear factor that lies underneath. Flaws found by the tester and becoming known to all, especially seniors, build fear in any programmer. But crossing this hurdle is crucial in learning how a good programmer can be a good software tester.
Moreover, walking on two different roads will bring in the expertise of both. Otherwise, regular coding and only coding will make you only a good programmer. On the other hand, handling a double-edged sword in a professional way will make you a double expert. Hopefully, by now you have become aware of how a good programmer can be a good software tester.
The person in this question is a Bachelor of Engineering in EEE (Electrical and Electronics Engineering). Also, she has very poor knowledge of programming. But she wants to become a good software tester, and therefore she wants to know the important skills to develop for this purpose. My reply goes like this. Having done EEE, or not having good knowledge of programming, doesn’t by itself mean that you have the right aptitude to become a good software tester. Software testing is not an easy task, and it also requires good scripting knowledge. You will have to write queries to run test scripts in software testing.
Software testing is about checking the loopholes, leakages, and shortfalls in meeting customer requirements. It is not an easy task to become a good software tester. You need a certain specific kind of skillset to ascertain this. Check if you are able to clear the advanced-level STQC exam or test. There is a basic-level test that anybody can clear. But for clearing the advanced-level STQC test you need to be very clear and thorough on software testing and quality.
Some Tips to Become A Good Software Tester
An analytical mind is important to become a good software tester. In software testing, your mind must work like a business analyst’s. Only then will you be able to understand exact customer requirements. And you can test software only if you understand customer and business requirements thoroughly. It is quite clear that to test a product you must be in the customer’s shoes. If you can’t try to use software like a customer, you won’t be able to justify your job as a good software tester. Whether you are a part of automation testing or manual testing, coding and scripting are important to learn. Though you won’t be doing it as much as a programmer, it is necessary; otherwise, survival becomes difficult.
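As a taste of the query writing mentioned above, here is a minimal sketch of a data-integrity test a tester might script. The tables and data are hypothetical, using an in-memory SQLite database:

```python
import sqlite3

# Hypothetical check a tester might script: no order may reference a
# customer that does not exist (a basic referential-integrity test).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER);
    INSERT INTO customers VALUES (1), (2);
    INSERT INTO orders VALUES (10, 1), (11, 2), (12, 99);
""")

# A LEFT JOIN exposes orders whose customer is missing.
orphans = conn.execute("""
    SELECT o.id FROM orders o
    LEFT JOIN customers c ON o.customer_id = c.id
    WHERE c.id IS NULL
""").fetchall()

print("orphan orders:", orphans)   # order 12 points at a missing customer
```

Even this small a script illustrates why a tester benefits from SQL and scripting skills: the defect (order 12) would be tedious to spot by clicking through the application.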
Well, if you think IoT predictive maintenance SOA (Service Oriented Architecture) is going to be the same as that of an enterprise, you are wrong. Even a large enterprise still can’t match IoT predictive maintenance SOA. IoT predictive maintenance SOA will be entirely different from a regular predictive maintenance SOA designed with enterprise architecture in mind. Typically, enterprise or company-level maintenance is taken care of by internal staff, or maybe a hybrid setup. A hybrid setup might have various propositions: in a few cases, the external agency’s share will be minimal, while in others it could be the major chunk. Moreover, in certain cases, some organizations outsource it completely. You can correlate it with a lift contract, where the vendor or service agency visits every quarter for preventive maintenance activities.
An IoT system will not be as simple as the above. It will be extremely heterogeneous. While in the case of a medium or small organization most of the preventive maintenance jobs are internal, in the case of IoT predictive maintenance SOA there will be a high dependency on external agencies, and not on a single one but on many. In such cases, a very high level of coordination is important. In fact, there will be a large team for coordination. Definitely, if you automate this system to a large extent, then the system itself can take care of the coordination mechanism. In such automation, the system will generate reminders, alerts, escalations, and so on.
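A rough sketch of such reminder/alert/escalation logic might look like the following; the thresholds, readings, and action names are invented for illustration, not from any real maintenance system:

```python
# Hypothetical sketch of the reminder/alert/escalation logic an
# automated coordination system might apply per monitored asset.
def classify_reading(days_since_service, vibration_mm_s):
    """Map a sensor reading to an action level (thresholds are invented)."""
    if vibration_mm_s > 7.0:
        return "escalation"     # imminent failure: page the external agency
    if vibration_mm_s > 4.5 or days_since_service > 90:
        return "alert"          # schedule predictive maintenance soon
    if days_since_service > 75:
        return "reminder"       # routine nudge to the coordination team
    return "ok"

print(classify_reading(30, 8.1))    # escalation
print(classify_reading(100, 2.0))   # alert
print(classify_reading(80, 2.0))    # reminder
print(classify_reading(30, 2.0))    # ok
```

In a real IoT deployment, each action level would route to a different party (internal team, vendor, government agency), which is exactly where the coordination and governance burden described above comes from.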
Moreover, governance here is multi-layered and most crucial. In fact, coordination and governance together will carry the complete responsibility of streamlining the mechanism. Basically, IoT Predictive Maintenance SOA will involve many vendors, government agencies, private companies, and individual professionals.
IoT Predictive Maintenance SOA
In a nutshell, IoT predictive maintenance SOA is like an ocean accommodating many ecosystems, while an enterprise architecture varies between a pond and a river.
In my opinion, the Data Governance and Management Director should have dual reporting. Since he is responsible for maintaining the master data of the organization's critical databases, quality, integrity, and consistency matter most. In such a case, he should report to the Director of Quality Control & Audits, because quality control is paramount: a small glitch could cause severe damage to the organization's reputation, especially in financial and legal matters. There have to be regular audits and a strong inbuilt control mechanism, so that the team stays confident of its processes and procedures. That is why external control and audits are necessary for exhaustive data governance.
Moreover, the same team that is responsible for data governance and management is also responsible for a few more key activities: monitoring data quality, and updating or changing governance rules. For administering those rules, a parallel reporting line must be assigned to the head of mainstream business applications. If that platform is SAP, for instance, then the reporting can go to the CTO (Chief Technology Officer) of the business. In fact, I would recommend a third reporting line to the CISO (Chief Information Security Officer). In a nutshell, this forms a core governing body for data governance.
On the other hand, many organizations don't entertain dual or multiple reporting. If your organization falls into this category, then let the reporting go to the CISO, because this role too has a major part to play in data governance. In that case, the other two will be integral members of the core governance body. This governing body takes care not only of data governance but also of the related activities mentioned above.
Data management is a serious affair. There are examples where negligence in this regard has led to long legal battles resulting in hefty penalties and business closures.
Yes, there are ways of measuring the ROI of PR and advertising, and in an effective manner too. In fact, there are two kinds of PR and advertising efforts in any organization or business. Initially, there will be new plans and executions. Gradually, there will be new as well as ongoing activities. In some cases, a few of the activities span only a short period; these are, in fact, like a project. Measuring the ROI of PR and advertising is possible only if you follow certain rules. For instance, if there are new advertising and PR efforts, you must know some basic things clearly and objectively. Firstly, you must be very clear about your expected outcome, because only then can you measure ROI. Break your expected outcomes into small components and then measure accordingly. It is something like a WBS (work breakdown structure) in Project Management.
Let us take some concrete examples of measuring the ROI of PR and advertising. Suppose your new activity is advertising a new Insurance Plan on social media platforms. Your ROI, in this case, is measured against the planned number of hits. In addition, the analytics must report conversions, because mere hits can't result in business. Still, hits are important, since without hits you can't get conversions, and it is the number of conversions that brings revenue. If hits and conversions fall short of your plan, then the whole project needs analysis. Your campaign might be well run, but the advertisement itself might not be effective enough to draw attention.
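The hits-and-conversions arithmetic above can be made concrete with a small Python sketch. All the figures here are invented for illustration; real campaigns would pull these numbers from the platform's analytics:

```python
def campaign_roi(ad_spend, hits, conversions, revenue_per_conversion):
    """Return (conversion rate, ROI) for an ad campaign.
    ROI = (revenue - cost) / cost, expressed as a fraction."""
    conversion_rate = conversions / hits if hits else 0.0
    revenue = conversions * revenue_per_conversion
    roi = (revenue - ad_spend) / ad_spend
    return conversion_rate, roi

# Illustrative figures only: 50,000 hits, 200 policies sold,
# 400 earned per policy, 60,000 spent on the campaign.
rate, roi = campaign_roi(ad_spend=60000, hits=50000,
                         conversions=200, revenue_per_conversion=400)
print(f"Conversion rate: {rate:.2%}, ROI: {roi:.0%}")  # → Conversion rate: 0.40%, ROI: 33%
```

Breaking the expected outcome into components (hits, conversion rate, revenue per conversion) mirrors the WBS idea: each component can be measured and diagnosed separately when the overall number disappoints.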
ROI of PR and Advertising
Similarly, suppose for the same Insurance Plan launch you hire a PR agency to drive a bloggers' meet. The agenda is to invite bloggers, educate them on the new plan, and expect a post from each on their blog. The purpose is to spread the message as far as possible, and to the most relevant audience. The ROI of PR and advertising depends on certain factors, but first of all, your objectives must be crystal clear.
There is no standard set for measuring social media ROI. Firstly, it is rarely run sensibly. Often an organization is not even clear about who owns this portfolio. The probable candidates are Marketing, Sales, IT, MIS, or Operations. Why Operations? Just think. Actually, social media ROI is a tricky thing. Firstly, as with any investment or project, there has to be clarity on what you want to achieve. I see many companies just landing on any social media platform officially and randomly. In such cases, they are not clear about the purpose, and there is no agenda in their plan. With such an anomaly, you can never think of social media ROI. Secondly, if you are clear about your goals, do you have a plan in place?
It is important to learn which platform or platforms you should pitch in on. Spending your energies on all platforms is neither possible nor advisable. Thirdly, never invest your money in an external agency thinking you are in safe hands and everything will go smoothly. Remember, your involvement and engagement are very important; never let them drive the show without you. Lastly, list down everything you are investing in building your social media ROI.
Social Media ROI
At times, companies do it on an event basis, while others do it on a regular basis. Both are fine, but it all depends on your business and your goals. An example of an event-based social media drive is a beauty pageant held every year in a particular month. The company stays dormant on social media for the eight months after the event is over, but the moment registrations for the next event are about to start, it becomes quite active on its social media pages and accounts. Moreover, it depends on the particular campaign you run on a social media platform.
Practically, measuring ROI is possible only if you are clear about the desired outcomes you have in mind. Some people claim higher outreach. Though outreach is important, it is of no use if it has not reached the right segment of people. A mere tweet with no purpose will not fetch the desired results, though it may occasionally produce a surprise outcome.
Obviously, like any other investment, measuring the ROI of BI is quite possible. You know the operational and capital expenses of a BI deployment, but that is only one side of the picture. There are many things you have to think of. Why you deploy BI is important to understand. Management approval comes only once you are able to justify the investment, and that justification comes only if you are able to convince them. There are certain things to keep in mind while trying to do so. Never speak the vendor's language. Rather, do your homework properly, accurately, and completely with the vendor before taking the case to your management for discussion and approval. Probe your vendors as much as possible, and in complete depth.
Firstly, you need to be convinced yourself. You must be very clear about the purpose of deploying BI in your organization. In addition, it is important to understand the goals to achieve. In fact, the goals must be clear both function-wise and for the organization as a whole. Aim for a micro-level picture rather than looking at it broadly. Once you are clear about purpose and goals, your task becomes easier. Never talk to a single vendor, and even when talking to multiple vendors, make sure they represent different solutions or products.
ROI of BI
To calculate the ROI of BI, here are some key points. It must decrease the time to release management reports and dashboards. In return, decision-making must become faster. At some point after installation, critical information should be available in real time or near real time. Think along these lines and you will generate more ideas in this regard. There must be objective criteria to measure the ROI of BI, and regular monitoring is important to analyze the results and outcomes. In fact, you must question the vendor or installation partner before your management starts questioning you. Be a little proactive in this regard.
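One way to turn the points above into an objective criterion is a simple cost model: value the analyst hours saved on reporting, add any revenue credited to faster decisions, and compare against the annual BI spend. The formula and every number below are illustrative assumptions, not figures from any real deployment:

```python
def bi_roi(annual_bi_cost, hours_saved_per_month, analyst_hourly_cost,
           extra_revenue_from_faster_decisions=0):
    """Illustrative ROI model for a BI deployment.
    Gains = labour saved on reporting + revenue attributed to faster decisions.
    Returns ROI as a fraction of annual cost."""
    labour_saved = hours_saved_per_month * 12 * analyst_hourly_cost
    gain = labour_saved + extra_revenue_from_faster_decisions
    return (gain - annual_bi_cost) / annual_bi_cost

# Hypothetical numbers: BI costs 50,000/year, saves 120 analyst-hours a
# month at 40/hour, and faster decisions are credited with 15,000 of revenue.
print(f"BI ROI: {bi_roi(50000, 120, 40, 15000):.0%}")  # → BI ROI: 45%
```

The model itself matters less than agreeing on it with your management before deployment; then the monitoring discussed above has fixed, objective inputs to track.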
Recently I was at one of the busiest Cafe Coffee Day (CCD) outlets, the one at Rajiv Chowk Metro Station in New Delhi. We can draw some important project management lessons from my experience there. During my transit, I was feeling hungry, and that is why I was there to grab a quick bite. There were a few good project management lessons up for grabs too. But before that, it is important to map the scene onto project management terms. It goes like this. Firstly, each table is a customer organization. Secondly, all the people sitting at a table are the various departments of that organization. Thirdly, each order is a customer requirement. As a matter of fact, some requirements may demand a certain variation from the standard recipe; this is customization in the scope of work.
The floor manager is the Project Manager. Chefs, kitchen staff, cleaning staff, and serving staff are the teams, and these teams are working on various projects at the same time. In fact, the same resources are working on multiple projects. The items on the menu are deliverables, and each constituent of a menu item is an agreed deliverable. We ordered two desserts and three food items. The filling in one of the sandwiches was too little. Both desserts were missing a few constituents. Moreover, the stuffing in one of the food items was too spicy, and very oily too.
Some action was needed. The project manager should know about project anomalies; if deliverables are not as per commitment, it is time to raise an alarm. Hence I told the service staff to call the floor manager. In no time the floor manager was there, and a conversation was about to start between a customer and a project head.
Project Management Lessons
Firstly, I showed him the stuffing in the sandwich, and he agreed that it was insufficient. That is a good sign in a project manager: a good project manager will admit it when there is a shortfall in the product or service. Unnecessary arguments and false claims are a big 'No' for a project manager. Then the desserts were also replaced. This, however, is not a good sign: if the ingredients were available, why were they not included in the first place? This is another project management lesson. Never leave loose ends for the customer to catch you out on.
This is the last post in the series of interviews with Rohan Shravan about his journey, his learning, and the new product #iamAble Able 10 tablet for professionals. In this post, we are talking about the 4 Quality Control Stages that helped Notion Ink Device Labs offer a Swap Warranty in a glitch-free manner. These 4 Quality Control Stages are quite stringent. At each stage, I learned, there is a high volume of rejection, and that rejection helps the organization churn out the best pieces. Hence it helps in lowering the rejections and defects occurring at the consumer or user level.
9. What stringent quality measures are taken during the development lifecycle?
A SWAP warranty is possible for us because our devices go through 4 Quality Control stages. Primary QC is done by our manufacturing plant, which confirms the mean-time-between-failure numbers for us. Then our team, right on the assembly line, rejects devices which do not conform to the functional or design requirements. In the next stage, we get QC done through a third-party agency, which adds a very strong, impartial look at everything. Finally, the devices are air-shipped to India to reduce in-transit time and damage. Once they reach our labs, every single device again goes through a full functional as well as design coherence test. That's when we add those mint chewing gums and our letters for our customers.
10. End consumer or business? Where does it fit best?
It is best suited for mobile professionals. Our core focus with Able is the enterprise, though it is available online as well for consumers looking for a reliable device with instant connectivity.
11. Is this an international product?
Yes, it is also available in the European Enterprise Market.
12. Please throw some light on the battery part. What is the backup it provides?
Able has an 8100 mAh battery. On normal usage, it should work for 8 hours without any trouble, but battery performance depends on a lot of things. On a poor WiFi network, watching YouTube can drain Able in under 6 hours.
4 Quality Control Stages Demand High Discipline
13. What is the current volume and pricing model?
All enterprise orders are different. Most go with Windows Pro licences and want 2 or 3 years of either SWAP or onsite warranty, as well as additional spares. So pricing for the enterprise segment can vary anywhere between INR 24990 and INR 34990. In the consumer market, Able is available at INR 24990.
14. Any other aspect you would like to highlight if skipped by me?
One thing which differentiates us from others is our in-house design. Everything you see in Able comes out of our lab. We feel that designers have a stronger connection with the consumers using their devices when something goes wrong, compared to traders, for whom it was "the Chinese guys who made some mistake". We are very emotional about our products, and every piece of feedback helps us improve further.
15. What next in terms of technology and upscaling?
We feel there is a need for a proper product which can compete with the Microsoft Surface. The Surface is an excellent example of a 2-in-1 which can replace your laptop. We are very keen to enter this segment, but the price point needs to be right. We hope we can surprise people soon.