The clean room delivery model


There are always new trends on the horizon when it comes to software development approaches that improve efficiency.

The Clean Room Delivery Model is the epitome of software development and one that BizXcel plans to investigate further to see how it relates to the Business Intelligence and Analytics environment.

This model looks at automating as much of the development life cycle as possible, integrating development tools into a continuous process from start to finish. Tools include source control, app build, repository/app versioning, quality checks (functional and non-functional testing), security checks, deployment and support.

Companies have different requirements, so the model varies between teams. For example, some code analysis tools could be used to analyze the code just before the app build step.

Each step has checkpoints to ensure that all automated tests pass (unit, functional and non-functional). If one of these does not pass, the process stops and reports on the issue. The ideal run has no issues and completes without intervention.

Software development has steadily increased how much of the process is automated. In the past, this involved numerous customized scripts to automate app builds, test case execution, and deployment between servers, all of which were time consuming. Automation was also segmented, which often meant building multiple versions of the same application.

But with cloud-based models, the trend toward automation is allowing us to connect all of these steps. This is called Continuous Integration and Continuous Delivery (CI/CD). When code is checked into a server, a trigger starts the build process on the build server. Once the build completes, the automated tests are executed; a trigger then sends the application to the test server for automated functional testing, another trigger passes it on to a non-functional load test server for automated load testing, and optionally to any other server.

We can even take this one step further and automatically deploy right to production. This is called Continuous Deployment. The process automatically deploys to the production server after all development and testing triggers pass. Most companies choose to stop just before this final step and deploy the application to production manually, due to the sensitivity of the environment.
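As a rough illustration, the checkpoint behaviour described above can be sketched as a chain of stages where the first failure halts the run. The stage names and checks below are invented for the example; real pipelines are defined in a CI tool rather than hand-rolled like this.

```python
# Illustrative only: stage names and checks are invented, and real
# pipelines live in a CI tool (Jenkins, GitLab CI, etc.).

def run_stage(name, check):
    """Run one checkpoint; report and return whether it passed."""
    passed = check()
    print(f"{name}: {'passed' if passed else 'FAILED'}")
    return passed

def pipeline(stages, deploy_to_production=False):
    """Run stages in order; the first failure halts the whole run."""
    for name, check in stages:
        if not run_stage(name, check):
            return False  # stop and surface the issue
    if deploy_to_production:
        print("deploying to production")            # continuous deployment
    else:
        print("awaiting manual production deploy")  # continuous delivery
    return True

stages = [
    ("build", lambda: True),
    ("unit tests", lambda: True),
    ("functional tests", lambda: True),
    ("load tests", lambda: True),
]
pipeline(stages)
```

Whether the final step is automatic or manual is just the `deploy_to_production` toggle, mirroring the delivery-versus-deployment distinction above.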

Here are the essential elements of the clean room delivery model:

  • Source control

  • Automated app builds

  • Repository/app versioning

  • Automated quality checks (functional and non-functional testing)

  • Security checks

  • Deployment

  • Support


So, what does this all mean? What it really means is a fundamental shift in culture, so that we are focusing on how to effectively and efficiently ensure our customers are getting the highest standard of service. Is your company thinking about this type of development?

We need to start informing our decisions with data


Decisions, decisions…making a decision can sometimes be overwhelming and cause a great sense of pressure and anxiety.

We’ve all seen the meme where the boyfriend asks the girlfriend what she wants to eat for dinner and the need for that quick decision causes some anxiety, an argument may even ensue, and the decision is delayed and sometimes never made. Now, this could very well be due to the oh so real hanger (hunger-induced anger = hanger), but it nevertheless rings true.

Making a big decision with consequences, whether financial or otherwise, can certainly be tough. However, a best practice is to consult the available resources and ground the conclusion in data.

Gone are the days where a reliable and trusted decision can be made on intuition alone. We need to embrace data and learn how to incorporate it into our business decisions.

In his blog post, Kevin Hanegan of Qlik discusses five main issues that data illiteracy causes when it comes to making informed decisions in the workplace. They are:

  • Basing decisions on data that is incorrect and not trusted

  • Basing decisions on incorrectly built and interpreted visualizations and analytics

  • Incorrectly communicated decisions

  • Making ineffective decisions based solely on intuition or solely on data

  • Failure to make any decisions because of the fear that there is not enough data or due to incorrect mental models

Kevin discusses each point in detail in his blog post.

Data is the new oil


In 2017, data surpassed oil to become the most valuable asset on earth. Let that sink in for a moment.

There is certainly no shortage of data analytics tools out there to help you capture, house, maintain and analyze your data, and we’re seeing an increased focus on data security as well. But more interestingly, we’re seeing an increase in the monetization of data. The value of our data is growing, and it’s important to think about how we will measure and maintain a healthy environment within and around our data ecosystems. David Couchon of Qlik discusses this very topic in his blog post “Competing and Winning in the Data Economy”.


Stop the Spreadsheet Madness

Do you have 2019 Business Intelligence goals of decreasing expenses or increasing revenue?

Those goals often fall under one of these headings:

  • Better Customer Knowledge

  • Greater Overall Efficiency

  • Updating Processes

  • Improving Risk Assessment

  • Identifying Future Opportunities

When these targets are met, they invariably lead to saving money or making money.

Valuable data accumulates throughout every organization, and analyzing what gets tracked, generated, collected and monitored as part of making business decisions makes immediate objectives possible and larger desired outcomes attainable.

Companies often think they are too small for analytics, that they have to make do with a cobbled-together in-house solution, or that they are stuck with the limited reporting capabilities of existing platforms. But if they know they are sitting on data that could be working harder for them, they can start the transformation gradually.

Whatever the catalyst for implementing analytics or improving what is in place, the endeavour shouldn’t be intimidating and the best approach is to do it incrementally.

One project tackled is better than trying to have an all-encompassing roll out that is long, expensive and disruptive.

Pick an initiative, define the objectives, consult with and involve team members who can contribute to the success of the project.


Maximizing the benefits of pulling meaning from data and adding analytics as a regular practice is an ongoing cycle.

Flexibility is key to progress; be sure to allow for adjustments so there is steady advancement from one stage to the next.

If new people are brought in or outside expertise is required as part of becoming a data driven company, facilitate helpful internal communication so the work being done is part of a shared goal for everyone.

Update the onboarding process for new hires to include data literacy and have resources for existing employees to add data literacy to their skills.

The ability for everyone in the organization to understand dashboards and have confidence in the accurate reports generated as each project comes to fruition will build momentum, spark creativity for additional use cases and ultimately save time.

How? By automating manual processes, eliminating duplicated efforts, stopping the spreadsheet madness and seeing the whole story that lives inside your data from all its sources.

Inevitably someone will ask, why change? Why now? Will problems actually get resolved? Will the results really lead to greater profitability? Where do we start?

These are the right questions and seeing is believing.

BizXcel has built thousands of analytics applications for businesses and organizations in many different industries and sectors.

Getting from here to there doesn’t have to be an added strain on IT.

Book a demo and then let’s talk about analytics strategies to meet your 2019 Business Intelligence objectives.

Flood Zone Mapping


We’ve had the opportunity to work with a really interesting wholesale broker and managing general agency who serves independent insurance agents in parts of the United States.  One of their top priorities once Qlik Sense was installed was to explore mapping capabilities. The company wanted to upload addresses from their system into Qlik, see those addresses on a map with FEMA flood maps overlaid, and easily identify which points fall into flood zones. Their second request was to view the FEMA flood maps in Qlik Sense, use a lasso tool to select a particular area, and get back all addresses within that selection.

Using GeoAnalytics Base and a tool called LocationIQ, our Professional Services team was able to deliver what they were looking for. To make this happen, our team had to build script logic to fetch all the points, since the service’s calls have limits that needed to be accounted for, and the flood data needed to be integrated with their current application.

There was certainly a good amount of work to be done here. Our team had to understand how the FEMA service worked and its limitations, then build the logic to retrieve flood zones from it. Once this was completed, our team created the script logic. We ended up leveraging the Batch Geocoding option offered by LocationIQ, as it allowed the client to geocode all current addresses, with new ones completed daily. To ensure everything worked smoothly, we sent the LocationIQ team a processed CSV file containing the quote addresses, and they returned the appropriate latitude and longitude coordinates.
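To illustrate the kind of script logic involved, here is a minimal sketch of batching addresses so each call stays under a per-request limit. The `MAX_PER_CALL` value and the `geocode_batch` stand-in are assumptions for the example, not LocationIQ’s actual API.

```python
# Illustrative only: MAX_PER_CALL and geocode_batch stand in for a real
# batch-geocoding service call; they are not LocationIQ's actual API.

MAX_PER_CALL = 100  # hypothetical per-request address limit

def chunked(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def geocode_batch(addresses):
    """Placeholder for one service call; a real version would POST a
    CSV of addresses and parse latitude/longitude from the response."""
    return [(addr, 0.0, 0.0) for addr in addresses]

def geocode_all(addresses):
    """Geocode every address while respecting the per-call limit."""
    rows = []
    for batch in chunked(addresses, MAX_PER_CALL):
        rows.extend(geocode_batch(batch))
    return rows
```

The same chunking pattern applies to any geocoding or mapping service that caps how many records one request may carry.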

This client is thrilled with the mapping capabilities and they are using it to drive strategic conversations.

What would an extra day mean to you?


What would you do if you could get back 20 hours a quarter and redirect that time and effort elsewhere into your work?

Well for some of our insurance customers they are celebrating this huge win with us – an entire 80 hours a year is saved because of a claims report our team developed inside the Insurance Template powered by Qlik Sense.

Before this report was created, it was taking companies 20 hours a quarter to gather these numbers from different data sources, which left the door open for human error. Prompt reporting is crucial in the insurance industry, and these customers can now get the same numbers, eliminate the risk of human error and produce the report in a few clicks. Customers can analyze gross unpaid claims by accident year and drill down by Claim Type and Major Cause of Loss.
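The aggregation behind such a report can be sketched in plain Python. The sample claim records and field names below are invented for illustration and are not taken from the actual template.

```python
# Sample claim records invented for illustration.
from collections import defaultdict

claims = [
    {"accident_year": 2017, "claim_type": "Auto",     "unpaid": 120_000},
    {"accident_year": 2017, "claim_type": "Property", "unpaid": 80_000},
    {"accident_year": 2018, "claim_type": "Auto",     "unpaid": 50_000},
]

by_year = defaultdict(int)           # gross unpaid claims per accident year
by_year_and_type = defaultdict(int)  # drill-down by Claim Type
for c in claims:
    by_year[c["accident_year"]] += c["unpaid"]
    by_year_and_type[(c["accident_year"], c["claim_type"])] += c["unpaid"]

print(by_year[2017])                     # 200000
print(by_year_and_type[(2017, "Auto")])  # 120000
```

In Qlik Sense the same roll-up and drill-down happen interactively through selections rather than code.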

By producing the claims report faster, companies can move immediately to communication and strategic decisions around clients, products and services, and avoid late reporting penalties, keeping themselves in good standing. The ability to provide accurate, prompt reporting has a direct impact on the overall outcome of the claims. So, what would an extra day mean to you?

Assembling and validating data gains speed with the insurance template



The Insurance Template powered by Qlik Sense is improving daily operations for insurance companies. There are four applications in this accelerator; for the sake of this blog, we will discuss the business value of the Operations Application. This application gives management and staff the ability to analyze and monitor their daily operations. It focuses on business retention and growth, the impacts of discounts and charges, policies in force, profitability analysis and diagnostics, and claims analysis, including loss development and large loss analysis.


Business Pains

Some of the insurance companies we are currently working with didn’t have an analytical tool in place before Qlik Sense. They had to rely on reports from their system vendors and data extracts taken directly from the database, then assemble these into Excel reports. As you can picture, this took a considerable amount of time to assemble and then validate for accuracy. Because of the repetitive nature of these reports, valuable business analyst time was tied up, preventing those individuals from performing other value-added work and leaving some metrics unexamined on a regular basis.

Let’s look at business and policy retention. These reports were taking some of our clients two person-days to assemble and validate. As a result, the report was not being generated as frequently as they would have liked. The need to understand changes in business retention is critical to remaining competitive, with new companies entering the markets on a regular basis.

Another challenge companies were experiencing was understanding their profitability. The financial statements provided profit or loss at a corporate level but no insight into where they were making money and where they were losing it. This needed to be broken down by the agent selling the business, geography, industry and product for further analysis.

Profitability in an insurance company is measured on two levels. What is referred to as Gross reflects how much profit or loss the company has on the insurance products it has sold to its customers. Profitability at this level reflects what products they have sold and indicates how they are managing their operations.

The second level is what translates back to the financial statements: Net profitability. The world of insurance consists of insurance companies selling a policy contract directly to a customer and then ceding (essentially selling a portion of this policy, or reinsuring it) to other companies so that risk is spread. This is accomplished through contracts like subscription, quota share, excess loss, stop loss and catastrophe. So, when a loss triggers one of these contracts, the primary company can recover a portion of the loss from the other companies. This results in a net loss that flows through the financials. It is important to understand and manage both levels of profitability, as together they reflect the true profitability story.
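A simplified numeric sketch of the two levels, using a quota-share cession as the reinsurance example. All figures are invented for illustration.

```python
# All figures invented; a 25% quota-share cession keeps the arithmetic simple.

premium_written = 1_000_000  # gross premium on policies sold
gross_loss = 400_000         # losses before any reinsurance recovery
quota_share = 0.25           # portion of the business ceded to reinsurers

# Gross level: how the underlying insurance products performed.
gross_result = premium_written - gross_loss        # 600000

# Net level: cede 25% of the premium, recover 25% of the loss.
net_premium = premium_written * (1 - quota_share)  # 750000.0
recovery = gross_loss * quota_share                # 100000.0
net_loss = gross_loss - recovery                   # 300000.0
net_result = net_premium - net_loss                # 450000.0
```

The gross figure tells the product-management story; the net figure is what flows through to the financial statements after the reinsurance recovery.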


Business Value

As part of the implementation process with these specific insurance clients, we worked closely with them to validate all the data and expressions used for the various metrics in the application. This led to strong confidence in the solution’s ability to deliver accurate results. This piece alone has resulted in large time savings (which always equate to dollars) previously spent assembling, validating, producing and distributing reports. Rather than hiring additional analysts or going without timely information, these clients can get answers almost instantly. Their work has become less stressful and they can focus on other high priority tasks. Goodbye to manual reporting!

That retention report we discussed above? It used to take two person-days to generate and is now generated with a click. Based on the needs of these companies, most have their data updated and refreshed overnight, allowing them to come into work in the morning ready to work with fresh data they can trust. Those quarterly reports can now be generated any time, because no additional effort is needed from the IT department or Business Analysts. Those reports alone, on a quarterly basis, save companies a minimum of eight person-days per year.

Circling back to profitability: with additional filters, companies can examine profitability in great detail, across more than 200 dimensions. This has been key to separating profitable from unprofitable business. It has then been tied into rate reviews, which help determine pricing adjustments and product development. This used to be a tedious and slow process that put companies at a competitive disadvantage. With the insurance template powered by Qlik Sense, they are able to adjust to market changes quickly and stay competitive.


BizXcel Inc. is a recognized Qlik Partner, certified with a specialization in the insurance industry.

5 reasons BI platforms fail


Business intelligence platforms are a valuable asset to organizations. They provide visibility, time savings, the ability to invest in ongoing improvements, real-time access to information and a single version of the truth for everyone to work off of, removing confusion and misunderstandings.

Consolidating reports and having instant access to all your information at your fingertips in one place with the ability to explore it at will without IT is undoubtedly appealing.

And with the deluge of data most organizations experience, it only makes sense to tame and harness the power of it.

However, it is no secret that dashboard failures are notorious, and organizations have spent billions of dollars on BI products that haven’t panned out.

So what’s going on?

Here are five common reasons business intelligence platforms fail:

1. KPIs not identified

Without effectively capturing key requirements from users, whether they be executive, finance, sales, operations, etc., you run the risk of having a fancy tool with lots of features that’s of no use to anyone.

Taking the time in the beginning to identify the key performance metrics that need to be tracked is one of the most important parts of the entire process.

2. End users are not involved in the design and development

When you don’t involve the very people who will be using the platform during the design and development you risk losing their buy-in. Not only is it important to understand how they will use the system and use their knowledge to ensure it is the right fit for the job, but their involvement goes a long way to making them comfortable with the change and turning them into champions of the new system. With their investment in the project, it is more likely they will use it and promote it to others.

3. Doesn’t fit with current processes

Another mistake that often occurs when not having end users involved in the development (and sometimes even when they are), is not ensuring that the use of the new platform fits well with the way the users currently work. If users have to spend extra time and energy using the new system, it risks not being used at all. Furthermore, if the system isn’t intuitive and user-friendly, users will quickly become frustrated.

4. Poor dashboard design

When it comes to dashboard design, a lot of time is invested in making the visualizations look fancy and slick. While undoubtedly a perk, for a dashboard to be truly worth the time and money invested, it simply needs to highlight the right information, to the right people, in a way they can understand and act on.

If the design of the dashboard doesn’t allow this to take place, it doesn’t matter how nice it looks.

5. Implementation takes too long

A big reason many BI platforms fail to provide their full potential is the length of time it takes to get them up and running – in some cases months and even years.

There are a few ways this could adversely affect a project. First, the needs of the business can easily change in this time, making the objectives of the project obsolete. Second, the longer the project goes on the bigger the risk of it becoming bogged down with shifting priorities that may dilute its initial purpose. Third, organizations risk losing user buy-in for a project that looks like it will never be completed and leaves users frustrated because they don’t have access to information they need.

While the benefits of a business intelligence platform are undeniable and the only way for organizations to stay ahead of the curve today, it is important to ensure that your project doesn’t fail due to any of these pitfalls.

Sales manager: Excel expert or manager?


Have you ever stopped to think how much time you spend over the day trying to gather the information you need to make a decision? It would likely scare you if you began tracking it and thought about how ineffective this time is.

If a Sales Manager spends 1 hour over the course of a day accumulating and manipulating data in Excel in order to figure out what is going on with their team, products, region, etc., they are losing over 250 hours a year fighting to get actionable information. This means they are not doing all of the things they need to be doing as a Sales Manager during those hours.

What happens if you are not an Excel expert? Or if you simply don’t have the time to spend gathering and manipulating the data in Excel? You rely on gut instinct. This may work some of the time but it is a dangerous gamble for the business, especially when sales are the future of your business. Relying on gut feelings could be placing your business at risk.

Managers should be managing, not spending their precious time working Excel to bring information together so they can make a decision. These time wasters are distractions that lower the organization’s performance.

I suspect that you, as a Sales Manager, don’t enjoy the time spent searching for data and poring over it trying to make sense of it. You would rather arrive quickly at an understanding of what is performing and what is not, take that insight and create plans and strategies to improve sales, and take proactive measures to ensure you meet your targets. You would like to know which sales representative needs your extra assistance and motivate them to reach their full potential.

The good news is that all of this time spent chasing actionable information and insights is not necessary. There are easy-to-use, self-serve tools that allow Sales Managers to actually manage with accurate and current information. They can bring data together from different sources and let the associations between the information lead to better, enlightened decisions.

What are you going to do with the extra 250 hours a year you will free up?

Take a few of those minutes now to learn how you can get control and recover those lost hours.


When should you use industry key performance indicators?


Robert Stickle, our COO, once worked for an insurance company where one of the performance indicators used by the industry as a benchmark was the “Expense Ratio”.  At the time, that company didn’t focus on this ratio.  The reasons were many, but it came down to the fact that bringing the company to the industry benchmark for this key performance indicator would have caused them to miss their strategic objectives.

The issue was that the benchmark was based on the entire industry and what regulators felt was an acceptable value.  The company was small, which reduced its economies of scale and made achieving this benchmark difficult.  The company’s strategic objectives required them to focus on different KPIs.

One of the objectives was to reduce the number of claims.  No one wants to have an insurance claim; it means they have lost something they valued, which causes stress and emotional upheaval.  So, to help clients avoid losses, the company ran a robust loss prevention program that assisted clients in identifying and mitigating risks.  However, this very program added costs and drove the “Expense Ratio” up.

Rather than focusing on the “Expense Ratio”, they measured themselves on a number of other indicators that matched their strategic and tactical objectives, such as Claims Frequency and Underwriting Ratios.  Claims Frequency helped them refine their loss prevention programs toward the areas where their clients were having losses.  The Underwriting Ratio allowed them to ensure that, while their Expense Ratio might not meet the industry benchmark of the time, they remained profitable.
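As a rough sketch, the indicators mentioned can be expressed with standard textbook definitions. Exact formulas vary by company and regulator, and the Underwriting Ratio is interpreted here as the combined ratio; none of this is taken from the company in the story.

```python
# Textbook-style definitions; actual formulas vary by company and
# regulator, and "Underwriting Ratio" is read here as the combined ratio.

def expense_ratio(expenses, earned_premium):
    return expenses / earned_premium

def loss_ratio(incurred_losses, earned_premium):
    return incurred_losses / earned_premium

def combined_ratio(expenses, incurred_losses, earned_premium):
    # Underwriting is profitable when this is below 1.0.
    return (expense_ratio(expenses, earned_premium)
            + loss_ratio(incurred_losses, earned_premium))

def claims_frequency(claim_count, policies_in_force):
    return claim_count / policies_in_force
```

A company can miss the industry Expense Ratio target yet still keep the combined ratio under 1.0, which is exactly the trade-off described above.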

Having said all of that, they did keep an eye on the Expense Ratio, but their internal target for that KPI was higher than the benchmark.  Since the industry used it as a KPI they couldn’t ignore it, but they didn’t allow it to rule them.  They set their own path and developed performance indicators that assisted them in achieving their objectives.

Because it was an industry KPI and benchmark, they were still compared to others on it, even though they didn’t focus on it the same way as the rest of the industry, and they had to be able to defend their position and results.

The key for acceptance in the industry was having the strategic plan and developing their performance indicators to support that plan.  This made it easy to defend why they didn’t achieve the Expense Ratio indicator. 

In closing, the first step is to have a strategic plan and objectives.  Performance indicators have the ability to shape the behaviour of your organization and staff, so you need to ensure they are driving you to where you want the organization to go, rather than just following the industry.  Once you have the strategic direction set, examine the industry KPIs and their associated benchmarks to determine which ones will assist in achieving your goals.  Where there are gaps, develop your own custom performance indicators.  For industry key performance indicators that don’t match your plan, look to see if they can be useful with a modified target benchmark.  If you are going to ignore them completely, make sure you have the rationale to support why.  Others will compare you to the industry on those, but if it is planned, it is easy to defend.

How self service BI empowers workers with more control and faster access


There is an increasing awareness in today’s economic environment of the power of data analytics.

With business needs shifting every day, organizations must remain agile and maintain the ability to make smarter, faster decisions. Because of this, business intelligence is becoming more important than ever.

The most successful organizations understand that one of their most valuable assets is their data; it is what gives them their competitive edge and allows them to discover new business opportunities.

But despite this awareness of the power of data, many organizations are still making decisions based on gut feeling, anecdotal evidence, intuition, and benchmarking – a perilous position.

As Marshall McLuhan said, “A point of view can be a dangerous luxury when substituted for insight and understanding.”

Much of this is because the current tools organizations are using aren’t up to meeting the demand for information and analytics.

For some organizations, there has been a heavy reliance on Excel and its ability to process and analyze small data sets. However, Excel has many inherent limitations that can severely hinder organizations.

In the past, to overcome these limitations, organizations implemented traditional BI systems which many are still using today.

These were fine when the demand for data was limited, but today’s information workers are demanding more data, more control and faster access to BI and business data.

And often it seems like IT is acting as a gatekeeper between them and the information they need.

The Waiting Game

Traditional BI is a long, painful cycle. IT spends months or even years building out dashboards and reports based on user requirements and when users get their hands on them, they immediately request changes. Then IT must spend even more weeks and months implementing them.

Once the system is finally in place, the flow of data moves slowly. Employees must wait days or weeks to get their hands on the reports and information they need. And if they want to explore the data further or ask more questions they are reliant on IT again.

It’s not that IT wants to keep these employees from the data, they are just chained to unwieldy and time consuming systems. Many departments have been stripped down to the barest numbers and the increased demand leaves them overwhelmed and everyone frustrated.

Information workers are increasingly dissatisfied with the prescriptive reports and dashboards handed down to them from IT.

There is a disconnect between the user and the data, often crippling decision-making capabilities.

This lack of system agility means that opportunities are often missed. Time and effort are at a premium and people can’t wait days or weeks for IT to generate costly reports that may or may not provide all the information they need.

Today’s organizations need speed, flexibility, user friendliness and the ability to generate knowledge quickly and accurately and distribute it anywhere without fuss or complexity.

Removing Distractions and Improving Performance with Self-Service BI

According to the Wisdom of Crowds report, an annual survey of business intelligence users by BI specialist Howard Dresner, self-service BI has been among the top technology priorities for two years running.

Self-service BI systems, like QlikView and Qlik Sense, create an environment in which information workers can create and access specific sets of reports, queries and analytics themselves with little involvement by IT. They get real-time data and analysis precisely focused on the business problem they are trying to solve.

Information workers can create personalized reports and analytical queries, as well as share and collaborate on knowledge and analysis across individuals and groups within the organization.

Self-service BI systems overcome the distractions and limitations introduced by traditional BI systems which didn’t allow for speed, agility or ease of use. There is no longer a disconnect between BI and business analytics. Employees can jump in and start analyzing data without having to wait for IT to run complex reports.

Information workers are no longer limited to pre-defined paths they must follow, or questions that need to be formulated ahead of time, possibly causing them to miss out on crucial associations and insights.

They are free to ask and explore any way they want – up, down and sideways if need be; allowing them to do their work smarter and faster.

Self-service BI enables users to be self-sufficient, creating reports and accessing the information they need to solve problems and answer questions as fast as they can think them up. No wait time. No IT middle-men.

“This type of BI empowers everyone in the organization to get the information when they want it,” says Lucas Blancher, BizXcel BI Specialist. “It gives users the ability to start at a high level with a piece of information, have it speak to them and continue down a path with it, exploring what it means. It allows them to not only ask one question, but the next question and any other questions they have after that one. It’s very flexible.”

Easy to Use, Available Everywhere

Self-service BI systems are designed to be easy-to-use with most as intuitive as Google and Facebook. With the right training people are quickly able to start working with them to follow their own paths to solutions.

Analysis need not only take place on desktops in the office either. Most self-service BI systems are designed for mobile, so decisions can be made in the factory, on the retail floor, on the road with the sales person’s tablet or on the manager’s smartphone.

Easing the Burden on IT

No longer bogged down writing and re-writing reports, tweaking queries and building cubes, IT is free to concentrate on other tasks. They can remain focused on data security, data and application provisioning, data governance, and system maintenance.

Searching for Insight

As John Dryden said, “He who would search for pearls must dive below.” If organizations expect to be able to get the full value from their data and achieve a true competitive edge, they must allow information workers to access it and explore it fully in order to gain the insights they need. Self-service BI is a big step in this direction, allowing people to search deeper and in ways they couldn’t imagine before.

What behaviour are your performance indicators driving?


Creating your own KPIs and performance indicators also means carefully considering the type of behaviour each indicator will create. As the saying goes, you achieve what you measure. The act of measuring something places focus on it, and as a result we are pulled towards achieving it. It can be easy to put a performance indicator in place thinking it will lead you to what you are trying to achieve, only to find it has a different effect.

For instance, let's say that you are managing a call support centre for your organization. One of your strategic objectives is to improve the experience and satisfaction of those who call in and use your service. A tactical goal supporting that objective is to reduce the wait time before reaching a customer service representative (CSR). As such, you put in a performance indicator that measures the number of calls each CSR takes in a day, thinking that if each CSR takes more calls per day, the current wait time will drop.

Sure enough, after implementing the performance indicator and holding people accountable to it, the number of calls per CSR increased and wait times did drop. However, so did the satisfaction rating. What occurred was that CSRs were rushing calls so they could get to the next one, and as a result the individuals calling in for assistance were not having their problems completely dealt with. The organization was getting what it asked for, more calls per day per CSR, but the indicator was actually working against the strategic objective of improving experience and satisfaction.

Performance indicators are powerful and help drive the behaviours in your organization in the direction you want. However, they must be carefully thought out or they can do as much damage as good. In our example, the number of calls per day per CSR may have reduced the caller's wait time, but it didn't improve the satisfaction level. If we really examine the indicator that was created, we would see it for what it is: an efficiency indicator for the staff. In this case, however, it wasn't driving effectiveness, which is what the manager was really after.

When we are building a new indicator to use in measuring our organization, we need to examine each potential indicator against what kind of behaviour it is going to generate.  Is the type of behaviour that will be created the behaviour that we are looking for?

We also need to look at the performance indicator in context with the other indicators the company is using. The manager may need to keep the number of calls per CSR per day to watch for trends and to help identify problem areas. However, it may be an internal indicator that isn't widely published, used in conjunction with a satisfaction indicator, type of call, average call length or other indicators to ensure that balance is achieved while driving towards the strategic objective of improving experience and satisfaction.

With more information about the calls being captured, the data can be used to discover insights about what is going on, and proper initiatives can be undertaken. Through the data discovery process, the manager may learn that 40% of calls are simply asking for hours of operation and locations. Using features in the phone system and changes to the website might reduce call volume, allowing wait times to drop for those who really need assistance.
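To make the balancing act above concrete, here is a minimal Python sketch using hypothetical call-log data (the field names and figures are ours, purely for illustration). It keeps the efficiency metric (calls per CSR) side by side with effectiveness metrics (satisfaction, call length), and also measures how many calls could be deflected to self-service:

```python
from statistics import mean

# Hypothetical call-centre log: one record per completed call.
calls = [
    {"csr": "Ana", "minutes": 4.0, "satisfaction": 3, "reason": "hours/location"},
    {"csr": "Ana", "minutes": 9.5, "satisfaction": 5, "reason": "claim"},
    {"csr": "Ben", "minutes": 3.5, "satisfaction": 2, "reason": "hours/location"},
    {"csr": "Ben", "minutes": 12.0, "satisfaction": 5, "reason": "billing"},
]

def indicators(calls):
    """Report the efficiency metric together with effectiveness metrics."""
    by_csr = {}
    for c in calls:
        by_csr.setdefault(c["csr"], []).append(c)
    return {
        csr: {
            "calls": len(rows),                                         # efficiency
            "avg_minutes": mean(r["minutes"] for r in rows),            # efficiency
            "avg_satisfaction": mean(r["satisfaction"] for r in rows),  # effectiveness
        }
        for csr, rows in by_csr.items()
    }

# Share of calls that never needed a CSR at all (candidates for phone-system
# or website self-service, like the hours-and-locations calls in the example).
deflectable = sum(c["reason"] == "hours/location" for c in calls) / len(calls)
```

Publishing calls per CSR only alongside average satisfaction keeps the efficiency pressure from quietly undermining the strategic objective.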

This has been just one example of how performance indicators and KPIs can be leveraged to drive a strategic objective. The key is to create indicators with the desired behaviour in mind and to look at all of the related indicators together. What you measure will drive behaviour, so it is imperative that you are measuring the right things.

Getting started with Qlik


After writing many emails over the years about how to get up and running with Qlik as a developer, I thought it would be useful to compile this information into one simple post. One of the great things about Qlik is that QlikView and Qlik Sense share a lot of similarities in the core engine, so what works in QlikView should work in Qlik Sense when it comes to load scripts and expressions.

One of the many great things about Qlik that I have found over the years is its sense of community. People who work with Qlik technologies love to talk about them, and there is a great deal of information out there. With so many people talking about Qlik and so much information available, it can be a bit daunting to figure out where to start. This page should serve as an index, by topic, of resources created to help users and developers get the most out of QlikView and Qlik Sense.

I will try to improve and refine this list of resources as time passes, as I hope it will cover everything from beginners to Qlik’ers looking to get more power out of the Qlik platform.  I know it will never be an exhaustive list but I hope it acts as a good starting point.  If you feel that I have left any links off that should be there, please email me at lblancher(at)


Happy Qlik’ing!



Getting Started

Getting Started with Qlik Sense

Getting Started with Qlik Sense Enterprise (Server)



Free Qlik Sense Training videos

Free QlikView Training eLearning

Qlik Community

Qlik Branch - Developers Community

Qlik Blogs



Set Analysis

A Beginner's Guide to Set Analysis Video Series

A great write up on how Set Analysis works

A tool for generating Set Analysis

How to use the Set Analysis Wizard





Qlik Sense Extensions

D3 - Open source visualization Library

Predictive Analysis

5 reasons to not procrastinate on analytics


The number of companies adopting analytical solutions continues to grow quickly, as evidenced by offerings in self-serve BI capabilities, cloud-based solutions, pay-as-you-go models, the ability to get analytics on any device, and more. Business Intelligence is becoming commonplace and extremely affordable, in part because of the self-serve aspect of the tools now on the market. If you are not using analytics in your business, you are quickly becoming disadvantaged.

Let’s look at 5 typical reasons given for procrastinating on analytics and why they are no longer good reasons for delaying.

1. You are looking for a canned solution handed to you that will provide the analytics and KPIs you need to run your business. The problem with a canned solution and KPIs is that it isn't designed for the uniqueness of your business. Every business is unique; that is how we compete with others in our space. Therefore we need a solution that allows us to measure and analyze our own key metrics, to ensure we are succeeding on our own strategy.

2. You have so many business units, divisions or departments, each with different needs, that it seems too large a project to fund or accomplish. This may be the case, but the key to implementing a massive project is to break it into smaller, manageable projects. Take the enterprise-wide analytics project and break it down. Pick a division, department, region, etc. that would really benefit from the solution and start there. Once analytics is successful there, it will be easier to roll out to others.

3. You have data in different spots and different sources and you don't know how to bring it all together. There are tools available, like Qlik, which allow you to bring this data together without getting into expensive data warehouse projects.

4. You don't have all of the data to support what you would like to measure. Don't feel like you are alone in this. As you create new strategies for your business, it is common that you won't have historical data to support the new metrics you have to track, because they are new strategic initiatives. Start with the data you have; it may be possible to use proxies for a period of time while you determine the required data and its sources and begin collecting it.

5. You want the solution to be perfect, or believe that it needs to be. It is great to strive for perfection, but that shouldn't stop you from working with analytics now. In order to improve you have to understand where you are today and create a plan, and analytics are key to that understanding. Allow analytics to evolve as you evolve on your path towards greater success.

The bottom line is: don't procrastinate. Begin working with what you have and get value from it. As you gain value, you will be able to undertake new initiatives to get what you need to reach the next level. You will find that as you learn more about your data, your operations and the metrics you use, your needs will continue to evolve.

P&C insurance company empowered through Qlik Sense


Ayr Farmers Mutual Insurance Company is empowered through quicker and deeper analysis of its data.

Ayr Farmers Mutual Insurance Company needed a more confident picture of the company's operations, with the ability to analyze it at their fingertips.

For 125 years, Ayr has been building strong relationships and supporting the community in the Property and Casualty Insurance space. To continue to serve their customers, consolidating, validating and performing detailed analysis of their member data is critical, but their existing tools and methods made this difficult and time consuming.

With no analytics solution prior to Qlik Sense, the process was tedious and tied up valuable resources to manually compile, consolidate, validate and prepare the data in a report. Ayr was relying on standard reporting through their core policy, claims and billing software system, a time-consuming process that did not allow the company to get at the data quickly enough to report various KPIs and statistics.

Immediate wins

Matt Papple, the Business Analyst and project lead for the Insurance Template for Qlik Sense®, identified that ease of implementation and ease of use were extremely important in an analytics solution. With Matt supporting Ayr employees and clients through analytics and other tasks, it was important that the solution be up and running quickly and confidently, without negatively impacting other projects that were taking place. Rather, he was looking to increase efficiency so he could focus on analysis and eliminate the manual process. "We are now able to generate the data and reports much faster than life without Qlik Sense® without the potential for errors and wait times to generate and consolidate."

Ayr Farmers Mutual Insurance Company was one of the original five Ontario mutuals that assisted in the development of the Insurance Template for Qlik Sense®, providing continuous feedback to deliver an accelerator template that would deliver immediate value and transform with their organization.

Before the Insurance Template for Qlik Sense®, Ayr was spending hours, and in some cases days, to consolidate and validate the data because of its susceptibility to error. Now, with the solution in place, Matt Papple says, "We have a solution that provides fast and instant access to our data where we can gain insights and provide reports in a quick and cost-effective way with minimal turnaround time."

Instant answers, instant decision making

Now, operations are more easily analyzed and understood through easy drill-down capabilities and visualizations of company KPIs and many other sheets. According to Matt Papple, "We also like the option to quickly extract the information in an additional report via PDF or Excel file for easy distribution."

The Insurance Template for Qlik Sense® has allowed the company to view over 144 sheets of data across 4 applications: governance, operations, agent and broker performance, and workflow. The ability to analyze these sheets helps identify weaknesses and opportunities for improvement and growth.

Ayr uses the template for Qlik Sense® to review the operations and governance applications daily to monitor the health of the organization. The company also counts the retention rate data and visualizations among its most valuable sheets: prior to the solution, it would take 2 days of manual work to calculate, extract and go through the process. Now it is a few clicks, with the ability to dive deeper into the data and understand relationships and impacts. The profitability sheets are allowing Ayr to analyze for rate reviews and product development, which is speeding up time to market and giving the company a competitive edge. Ayr can identify by postal code where they are performing well and where they are not. With the ability to sort by gross loss ratio, Ayr can now quickly analyze their performance across various lines by postal code.
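As a rough sketch of the gross loss ratio analysis described above (the field names and figures here are hypothetical illustrations, not Ayr's data), gross loss ratio is incurred losses divided by earned premium, aggregated by postal code and sorted worst-first:

```python
# Hypothetical policy-level records; the field names are illustrative only.
records = [
    {"postal": "N0B", "earned_premium": 120000.0, "incurred_losses": 54000.0},
    {"postal": "N0B", "earned_premium": 80000.0,  "incurred_losses": 70000.0},
    {"postal": "N2J", "earned_premium": 200000.0, "incurred_losses": 50000.0},
]

def gross_loss_ratio_by_postal(records):
    """Gross loss ratio = incurred losses / earned premium, by postal code."""
    totals = {}
    for r in records:
        prem, loss = totals.get(r["postal"], (0.0, 0.0))
        totals[r["postal"]] = (prem + r["earned_premium"],
                               loss + r["incurred_losses"])
    ratios = {p: loss / prem for p, (prem, loss) in totals.items()}
    # Worst-performing areas first, as when sorting the sheet by loss ratio.
    return sorted(ratios.items(), key=lambda kv: kv[1], reverse=True)
```

A ratio approaching or exceeding 1.0 flags a postal code where incurred losses are consuming the earned premium.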

Matt Papple says, "We can get answers much faster, almost instantly. It no longer takes a considerable amount of time and effort to generate what used to be manual reports." Effort can now be redirected to other tasks, because this is now the natural way of doing business for Ayr Farmers Mutual Insurance Company.

BizXcel Inc. is a recognized Qlik Partner, certified with a specialization in the insurance industry.

The foundation of analytics is your people


As with any new tool or program implemented in an organization, there will always be a ramp-up period. But are you able to keep the momentum going rather than letting it fall away?

I could not agree more with David Avery at Qlik, who said, "The foundation of analytics at your organization is not data – it's the people using it." Programs and tools don't work unless your people make them work. We love enabling our clients and creating analytic super teams, which is why I love what Avery has to offer in this article about his starter kit.

David says that to have an all-star team you need a few important things:

- A champion who is willing to lead the analytics implementation. If you don't have a bold leader, you don't have a project that will get off the ground, because decisions get drawn out and it's hard to get buy-in from the surrounding environment.

- You need those data gurus. These are the individuals who set up, install and extract data. Most companies will outsource, but if you have a data guru, don't lose them! They don't necessarily build the applications, but they deal with all of the logistics of getting everything working, and they do it smoothly.

- And now you need the development superstars. These individuals build out your applications and make it all come alive visually.

- Once you have applications, you'll need accurate reporting, so you will need a strong business analyst on your team, or at least access to an outsourced analyst.

- Next you need someone who can help your data speak and bring the numbers to life by being an exceptional storyteller.

- And finally, Avery says that your analytics must be embedded into the culture. People must see value in it and want to use it. These champions can be found throughout your organization.

I highly recommend reading David Avery's full article to see the first tool in his starter kit.

Why small businesses need analytics too

When you hear the words "data analytics," you don't often associate them with a small business. People have this idea that data analytics comes with a large price tag and that small businesses cannot compete in that market because of it.

But with the growth of technology, there is better access to data analytics tools than ever before, along with different buying options; look at the cloud, for example, where many companies are moving towards a monthly subscription offering.

Let's look at ABC Company, which sends bi-annual snail mail campaigns to existing and potential customers. They send their standard stack of flyers out to hundreds of people about discounts, deals, new services and so on. Whether these people need the information or not, they feel it's best to get in front of them any way possible. ABC Company started analyzing sales from those flyers, and when it came to preparing budgets for the following year they sat back and thought long and hard.

You see, even small businesses need a data analytics tool. With a smart and powerful tool, they can send targeted information to existing customers and to potential customers based on geographic area or age group. This can lead to higher levels of engagement and higher sales, because the data has been analyzed and acted on.

Small businesses are starting to see the value of analytics, but they know they face some barriers, including the size of their organization, their resources and their current data environment, if any exists.

But I also see the other side of that. Small businesses get to start from scratch. A clean slate. They have the ability to work with clean data and begin asking questions they hadn't thought of before. Running any business is difficult, and as important as it is for a large company to have clarity into the future, it is just as important for small companies. This allows for agility and helps make sure your business is around for years to come.

The challenge of insurance policies in force in insurance analytics


There are a couple of challenges faced when trying to bring insurance in force data into any analytics platform.

First is the challenge of Policies In Force (PIF) being a point-in-time metric. PIF is a snapshot, at a specific moment, of the policies that are in force as of that moment. This makes it difficult to perform calculations on, because the value changes over time, and it is usually handled by running multiple reports and building different spreadsheets. Creating this information typically isn't very dynamic, and it takes time to create and manage.

Users also expect that we can compare multiple periods or do time series analysis, which is difficult when you have to regenerate the data for each period in the series. The challenge is further complicated because we don't know what that series will be in advance, so we can't prepare the data in advance. Our users want to select based on their business questions of the day, not on what was determined by the IT department months or years ago when the solution was built. With analytics solutions, we want to be able to pick any period and get instant results.
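The point-in-time nature of PIF can be sketched in a few lines of Python. Assuming hypothetical policy records with effective and expiry dates (our own illustrative schema, not any vendor's), the count for any date is derived on demand rather than pre-built, which is what makes arbitrary period selection possible:

```python
from datetime import date

# Hypothetical policy records; field names are illustrative only.
policies = [
    {"policy": "P-001", "effective": date(2023, 1, 1), "expiry": date(2024, 1, 1)},
    {"policy": "P-002", "effective": date(2023, 6, 1), "expiry": date(2024, 6, 1)},
    {"policy": "P-003", "effective": date(2022, 3, 1), "expiry": date(2023, 3, 1)},
]

def policies_in_force(policies, as_of):
    """PIF as a point-in-time metric: in force when effective <= as_of < expiry."""
    return [p["policy"] for p in policies
            if p["effective"] <= as_of < p["expiry"]]

# Any date can be asked for without regenerating reports or snapshots.
pif_feb = policies_in_force(policies, date(2023, 2, 1))  # P-001 and P-003
pif_jul = policies_in_force(policies, date(2023, 7, 1))  # P-001 and P-002
```

A time series is then just this function evaluated at each period end the user selects.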

The second challenge of Policies in Force (PIF) in analytics is merging, or integrating, in force data with transaction data to allow for a full analysis. Because in force data is a snapshot of the business at a point in time, it behaves differently than transactional data: whereas transactional data builds a story of the business over time, in force data tells a story at a specific moment and may not be relevant before or after that moment.

Integrating the two is important because it allows the whole story to be discovered in one spot. When looking at a dashboard, the user wants to see metrics that tell how many policies are currently in force, but also what premium has been written and earned for the period or year to date. All three are different types of calculations that would traditionally be done in separate reports, would not be dynamic and certainly would not be together. The expectation of analytical solutions is that they are dynamic and linked: we can select different periods or dimensions and have results brought back to our screen without having to request a new report.
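To show why written and earned premium are genuinely different calculations, here is a hedged illustration of one common approach, straight pro-rata earning over the policy term (actual earning methods vary by product and jurisdiction, and the figures are hypothetical):

```python
from datetime import date

def earned_premium(written, effective, expiry, as_of):
    """Pro-rata earned premium: the written premium times the share of the
    policy term elapsed as of a given date, clamped to [0, term]."""
    term_days = (expiry - effective).days
    elapsed = max(0, min((as_of - effective).days, term_days))
    return written * elapsed / term_days

# A $1,200 annual policy effective Jan 1, 2023 has earned roughly half
# of its written premium by early July.
ep = earned_premium(1200.0, date(2023, 1, 1), date(2024, 1, 1), date(2023, 7, 2))
```

On a dashboard, the written figure comes straight from the transactions, while the earned figure depends on the date the user selects, which is exactly why the two calculations behave so differently.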

So how is this problem solved? I'll be honest, this had us scratching our heads for a while. Over a couple of years we experimented with several different solutions but found that they didn't meet our criteria. The solution needed to allow the selection of any period (year, quarter or month range), and it needed to handle new periods being added to the data without being rebuilt. Therefore, pre-building the metrics was out of the question if the solution was to be dynamic for the user. It also had to handle large data sets and still give the user a responsive result.

We eventually solved this problem using Qlik and its associative engine (either QlikView or Qlik Sense). Using a combination of advanced data modelling and visualizations with formulas, we were able to build a Policies In Force solution that integrates with transactional data. Our users are now able to explore their Policies in Force and immediately look at other aspects of their business, such as written and earned premiums or incurred claims, all in one spot. This allows them to extract more meaning and do a deeper dive into the business to see the whole story in their data. With a deeper understanding of their business, they are able to support their decisions with facts instead of intuition.

An additional benefit of our method for Policies in Force (PIF) calculations is business retention analysis. With the ability to calculate PIF for any period, we can do accurate PIF retention calculations. This method is better than the traditional way of calculating retention using transactional data alone. With PIF information for a point in time, mixed with rolling period analysis, we get an accurate picture of business retention. Policy retention analysis over periods of time lets users see trends that may not have been visible in the past using traditional tools and methods.
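One simple reading of PIF retention, sketched here with hypothetical policy IDs (the actual Qlik implementation described in the article is more involved), is the share of policies in force at the start of a period that are still in force at its end:

```python
def retention_rate(pif_start, pif_end):
    """Share of policies in force at the period start that remain in force
    at the period end. Returns None when the starting set is empty."""
    start, end = set(pif_start), set(pif_end)
    if not start:
        return None
    return len(start & end) / len(start)

# Of 4 policies in force a year ago, 3 are still in force today: 75% retention.
rate = retention_rate({"P-1", "P-2", "P-3", "P-4"}, {"P-1", "P-2", "P-4", "P-9"})
```

With PIF computable for any date, this rate can be rolled across whatever pair of periods the user selects.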

Data can give you a competitive advantage: expense management

We know that analyzing expenses is vital to an organization and is surely being done by the CFO or the individual responsible for this business task. But what's important here is how it is being analyzed, what's being analyzed and how much time the analysis takes.

We had a great meeting last week with a company that is currently creating reports manually, struggling with communication around the data and lacking the ability to investigate business lines and geography. As the business continues to grow, more data is becoming available, and they realize the data at hand is what can give them their competitive advantage. Each manager has been spending hours every week creating reports for their department, which takes valuable time away from other aspects of their job.

This company is after an easy-to-read dashboard that makes their data speak for all business users. They feel the whole company should have the business's vital signs at their fingertips, not just the CFO. It's time that all managers sleep better, with the ability to watch performance, production and quality numbers at their fingertips, whether off site or on the floor.

Businesses, especially in manufacturing, are under increased demand and pressure to improve efficiencies and profitability. The company we met with last week sees the Qlik Sense solution as a huge opportunity to consolidate all of their data, analyze sales trends, identify gaps and enable executives, managers and front-line staff through data visibility.

What we keep hearing from our manufacturing clients is that they are now taking the applications built in Qlik Sense and making them available on the floor, whether through televisions that are updated every hour or iPads that can be easily carried around the floor and used as visuals during department meetings. This is allowing departments to make smarter, faster business decisions that are ultimately saving them time and money in their products and services.

A great example: a company shared with us that a huge challenge they were experiencing was trucks showing up to pick up product and leaving without that scheduled product. Why? Because the business line was not running efficiently and waste was increasing. With real-time intelligence, this company was able to monitor, make quick decisions and adjust the business line so that product was ready to be shipped in time, keeping customers satisfied and keeping their business.

So now executives can dive deeper into the business in an instant and secure manner; managers can deliver top-of-the-line products, streamline efficiencies, identify risks and manage costs; and front-line employees have visibility into their department. Accountability has now shifted from executives alone to company-wide.

Why your board of directors need analytics


Analytics always seems to be the hot topic at board meetings and continues to make it into strategic planning documents. Many decision makers, such as boards of directors, are now pushing to understand data analytics: how it can improve the company, how it can support the board in better decision making and how it can help with meeting efficiency.

In our experience, there are many benefits that data analytics can offer a board of directors:

1. Identify new opportunities that can assist in strategic planning.
2. Gain support from members by having the ability to get insights into the company, resulting in evidence-based decision making.
3. Become your competitive advantage.
4. Create accountability for board members, who can access a dashboard to continually assess the current situation, not just during meetings.
5. In relation to number 4, increase trust and communication.

It's all about providing the board with the information they need at the right time, and creating a clear picture of the specific ways analytics can create business value. Companies can no longer stick with a "let's invest and hope for the best" mentality. The capital is not available, nor is the time, for these types of mistakes, because one mistake puts the competition two steps ahead.