Do you have the right environment?

Even with the best intentions, business intelligence initiatives often end up failing. Why is this? The main reason is that most companies put their focus and effort into the tools for the job and forget to make sure they have the right environment for the initiative to succeed.

If you’re thinking of embarking on a new BI project or are in the midst of one, take a moment to make sure you have the right environment to support it. Here are five areas you should review:

Strong Mission

Everyone on the team should know what the main goals of the group are and how the BI project fits in with these goals. Not only will they be more likely to support the project when they see this, but it will also help them apply the data they gather later to meet these goals.

Trust

Is the team working in an environment of trust? Trust is necessary to ensure people speak up when something isn't going well with the tool or when they find an issue; without that trust, people may hesitate to raise problems.

Collaboration and Sharing

An environment that fosters participation and the sharing of information ensures that more value comes from the data and that people build on what others have done.

Knowledge-Building

A team that is encouraged to keep updating its knowledge and learning new skills will be more likely to want to learn a new tool and to appreciate what it can bring to the team.

Curiosity

Teams that have been encouraged to ask questions and explore new information will do better at fully mining the data available to them in a new BI project.

Is your data discovery investment leaving you disappointed?

On calls with prospects, we often hear about their disappointment with analytics platforms they have invested so much time and energy in. The platforms sounded fantastic in the vendor's pitch, but they ended up being more of a headache than anything else: end users never embraced them as expected, and the ROI never materialized.

They then end up investing in additional systems hoping to entice users with access to data and better insights, but usually see their efforts fall short again.

If this is you, you're not alone, and there is a way to move forward. Once you find a platform that checks all the must-haves on your list, you need to focus on your users to ensure it provides what they are looking for.

So what drives user adoption? 

What we have learned is that users are engaged and energized when they can see new insights in a new format, because it challenges the way they think and can point them to new solutions.

To make sure your new data discovery investment doesn't leave you disappointed, here are a few important steps to keep in mind:

1. Talk to the end users to understand what they need and want

It seems simple enough, but this component is often overlooked. Before you jump in full force to create new KPIs, set the data aside for a moment and talk to the users. Find out why they use the data, what decisions they make, what data would add more value if it were visible, how they would like reports displayed, and so on. During this process you will probably be inundated with information, but you don't have to act on it all at once. Compile the information, present it back to the users, and ask them to prioritize the objectives. Perhaps some existing fields already address part of what the users asked for, while other elements may take longer. Users get excited to see change and the platform come to life; it's when the process stalls that they begin to lose confidence in the platform and those responsible for it.

2. Bring real-life business scenarios to life with applications

Once you’ve gathered the information from users, seek out case studies and applications that will help them visually see their similar experiences. Anyone can talk up a system, but it’s when the system can speak for itself that people see its full potential. An analytical platform should be able to tell the users a story about the data, and as quickly as users can ask questions, the story should be changing from the click of a mouse. Keep these conversations going with users, include them before, during, and even after the process so applications can be refined and grow with the changes and needs of the organization. 

3. Focus on the data and tie up the loose ends

Now that you have gathered all of this information, you can begin to identify what is needed to meet the requirements. It is important to identify any other systems that may need upgrading or internal processes that need to be tweaked or added.

Companies can find themselves hesitating at this step. If internal processes need to be revisited, companies often get overwhelmed because there just doesn't seem to be any more time in the day. The great thing about updating processes is that it can be done through business intelligence consulting or business process improvement. Our clients have engaged us for these services so they can continue to focus on the data while BizXcel maps out the processes, gives recommendations and even rewrites them – again, all by engaging the end user.

 

If you take the time upfront you will gain user trust and buy-in, and you will get the most out of the analytical platform so it doesn't leave you disappointed.

Same old thinking

There’s a lot of noise out there. The capabilities and technology specifically in the BI and data stream are so diverse, so powerful and are in such a competitive state of growth right now that many organizations struggle to identify the best way to leverage sophisticated BI tools without risking business users feeling overwhelmed and not adopting it. Regardless of any other factors, one fact will govern the success of your company with business analytics:

I believe that the best teams are those who use data to reinforce the behaviours of communicative and collaborative problem solvers. 

Regardless of anything else, the whole point of business analytics is to give business users the ability to become think tanks; to leverage their expertise and experience and direct it towards actively building process and planning for success. When making changes in an organization, the HiPPO (Highest Paid Person's Opinion) no longer dictates decisions by gut feel but rather facilitates decision making by unlocking the expertise of their people and allowing the data to tell a story.

By enabling collaborative data discovery, companies can support more effective business units that perpetuate a culture of data, one that permeates all levels of the organization and becomes integral to every aspect of decision making.

Tools like Qlik Sense enable this cyclical reaction, where business users have easy access to the data they need in order to push performance, inform predictions and collaboratively leverage organizational expertise. When people can share, develop and make decisions from the right data tools, it fosters engagement, which drives strong user adoption, a data-driven culture and a strong ROI.

Are you giving your data a voice?

Data is a powerful story, but without giving it a voice the impact is lessened. You don't find everything in the numbers; you find it by allowing your data to speak and tell a story. Why do we struggle so much to hear what our data is telling us? Why don't we put the time and effort into making the numbers come alive and make sense?

We’ve all heard of the popular phrase; work smarter, not harder. The same applies to analytics. By keeping our analytics organized, insightful, consistent messaging, visually appealing and engaging for our audience we can function smarter as a team without stretching all of our resources.

Visualizations are great because they highlight an important message through the use of colour, and they help engage the audience, but it doesn't stop there. Everyone wants to feel connected to the data, and if the story is told correctly, with clear connections and messaging, the next steps become clearer for people to work towards. When data is disorganized and insights are not easily recognized, you can have all the data in the world and it means nothing.

When we identify a problem and clearly understand the impact on the organization we can work hard to resolve it together.

So spend time on giving your data a voice and the actions will flow from that.

Business intelligence return on investment

We often get asked what the return on investment will be from undertaking a business intelligence (BI) project.  Depending on the circumstances, a hard ROI number can be calculated for the project before we begin.  More often it is not possible to calculate a hard ROI number or value.  We have to rely on what I refer to as soft ROI: value that is hard to quantify but is real nevertheless.

A hard ROI is possible when we can point to an existing report-development process and add up the amount of time spent on it and the value of that time.  This includes the time spent gathering, validating and correcting the data into something useful, building the reports and distributing them.  In these situations, it isn't new reports and knowledge that are sought, but rather getting the reports and knowledge they currently receive for less cost and in less time.  The value of the report and the knowledge is already established, otherwise they wouldn't continue the manual process to get it.  As long as the business intelligence project costs less than the three-year or five-year cost of the manual process, there is ROI in the BI project.
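
To make that concrete with a rough, purely hypothetical illustration: if an analyst spends two days every month gathering, correcting and distributing a report, at a loaded cost of $400 per day that manual process costs about $9,600 per year, or roughly $28,800 over three years. Any BI project that automates the report for less than that three-year figure shows a positive hard ROI on cost replacement alone.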

Most projects no longer fall under the situation above.  For many companies, BI solutions and tools have been around long enough that the replacement of manual processes has already been addressed.  The value they seek now is in understanding, or knowing, what they don't know today.

It is difficult to measure the value of having facts and information on which to base decisions.  What is the ROI of making a better decision?  What is the cost of using our gut feeling and making a wrong decision?  It could be small or it could mean the end of the organization.  Having the correct information could be priceless.  There is the value of making the better decision, which could result in cost savings or higher revenues, but there is also personal value in the form of reduced stress when that decision no longer has to be made with no information.

Evidence-based decisions are about minimizing risk.  When decisions are made using information and data, we are making informed decisions.  That does not guarantee our decisions are correct.  If it were black and white, a decision would not have to be made; the right course of action would just be known.  Evidence-based decisions bring information into the decision process and assist us in making the correct decision.

Estimating the ROI in these situations is difficult.  How do you put a value on learning what you don't know?  It is through insight that we grow and control our future, so what is the value of acquiring that insight?  Often, not undertaking a project to gain this insight results in harm coming to us because our competitors are implementing solutions to gain it.  Being able to keep up with our competitors is a matter of survival.  Using new information and knowledge to get ahead is priceless.  Measuring the future ROI of better decisions is difficult because the results are mixed together with everything else we do.  Measuring the cost of not implementing business intelligence to understand and gain insight might be easier: it will show itself in a poor bottom line.

Qlik pioneered Data Discovery and leads the industry with its patented in-memory associative data indexing (QIX) engine, the Associative Model, which allows you to probe all of your associated data.  Unlike traditional query-based tools that force users down pre-defined drill-down paths and limit them to structures conceived when the tool was created, Qlik allows users to explore all of their data in the tool, navigating through it as questions arise.  With a few selections or simple searches using the global search capabilities, users can find the data they need to answer their question.  Most importantly, they can then jump to the next question and find the data to answer it.

Not sure where to get started? When it comes to planning there is only one question you need to answer…

Do you want to be a great organization? And if so, are you willing to put in the hard work and dedication to be one?

If you’re aiming to be a great organization a plan is the first step. 

Data-driven succession planning

No matter how successful a company is, succession planning is an essential part of doing good business. It's easy to put off planning when things seem to be lining up perfectly, but there are many reasons succession planning should never be put off.

No one can plan for a disaster. No matter how great the company is or how great the employees are, we can't predict unforeseen illness, sudden decisions to retire or quit, or dismissals. So although we cannot plan for such adversities, we can put in place a system that will help business change go smoothly while maintaining positive relationships with employees.

Succession planning is an invaluable benefit to your company now. Just as companies grow and change every day, so does a succession plan. What you may have had in the past will likely not be relevant today. A succession plan is not meant to be a reactive solution to change; used and planned properly, it helps a company stay afloat, be agile and position itself to take advantage of new opportunities.

Not only does this type of planning sustain workplace balance, it also brings the bigger picture to light. It lets high-performing individuals be recognized and allows the necessary time to coach these employees for new responsibilities.  It can uncover real potential from hidden talent as well.

Recently, we have had a few of our regular clients ask us about succession planning. They have seen some changes in their organization and know there will be many retiring over the next 5 years so they want BizXcel to come in and help with this consulting piece. This includes capturing corporate knowledge and identifying knowledge and skills to continually build competitive advantage for years to come.

We focus on two things: consulting and training. One of our clients asked us, “How do I monitor employees to make sure our high performing potential talent doesn’t go unnoticed?” This particular client recognizes that if those high performing employees are not given the proper support and training they could become disengaged and even search for employment elsewhere. 

Consulting and training is a great way to start the process. And in addition to these services, there are tools that can help manage succession planning that are very useful for managers. 

With the right data analytics tool a manager can track and manage positions, employee counts, employee performance that is aligned with core competencies, current skill sets versus skill sets needed, hiring data and more. 

Data helps a company know at what level people are currently performing and what needs to be done for succession planning to take place.

Using data analytics for succession planning, you can uncover:

• Hidden talent

• Hidden staffing risks

• Average tenure/turnover rates

• Combinations of job, skill and geography

Take the future of your business into your own hands. 

Stop relying on Excel

There’s no denying how valuable data is to an organization, from sales forecasting to human resources, the marketing department, management, finance and all other areas. 

Companies see the important role that data plays and are joining the millions of companies adopting big data strategies. You would think that is great, right? Not always. Companies may be jumping on board with big data, but they're trying to make it happen with outdated systems and tools. We hear from many companies that they still use spreadsheets to communicate in meetings. That means time wasted on manual updates and on trying to create simplistic visualizations. Think of how much time a year is wasted on spreadsheets when data should be at your fingertips, in real time.

I’m not saying spreadsheets are terrible, because they are great for the right job. But if you are communicating the present state of a company and trying to make decisions for the future, a spreadsheet is not the way to go. 

Employees want information. They want to be kept in the loop and see how they impact the numbers. It provides accountability and we have seen proper business intelligence tools improve communication and even relationships within an organization. 

People rely on spreadsheets because:

1)They’ve always used it and they feel most comfortable with it

2)They don’t know the benefits of a powerful BI tool

3)They don’t know where to start

4)They don’t think they have enough data for a tool to be useful

Why you should be using a BI tool:

1) Not everyone is fluent in Excel. A simple tool can give many people in an organization the full picture and the power to generate new insights

2) Data governance is not possible in a spreadsheet; you run the risk of misplacing data or having it deleted with no backup. Even more inconvenient is sharing through email – our inboxes are already full as it is. It's more effective to have a dashboard that is accessible at any time by login

3) Data doesn't stay hidden

4) Data analysis becomes much simpler with clean visualizations

5) Data can be updated in real time, every few hours, or every week or month

Companies won’t be able to rely on spreadsheets for ever.

 

Are you willing to take the risk?

How I use data in the boardroom

There are 5 key reasons that I use data in the boardroom. Having an agile BI tool allows me to make decisions on the spot with updated real-time data. 

Here is how data talk always makes it into the Boardroom and promotes agility throughout the organization. 

1. Open and honest data leads to open and honest conversations. I am a strong believer that numbers should be transparent. Data shouldn't be boring or feel like a chore; it should be the driver for conversations. Managers should be exceptional at data storytelling so employees feel comfortable enough to ask questions and provide feedback or comments.

I love the book The Five Dysfunctions of a Team by Patrick Lencioni. Dysfunction #2 is fear of conflict, which shows up as artificial harmony. I use data (and good storytelling skills) to avoid artificial harmony tendencies.

Patrick Lencioni says, “If we don’t trust one another, then we aren’t going to engage in open, constructive, ideological conflict. And we’ll just continue to preserve a sense of artificial harmony.” Have you ever been in a meeting where one person is speaking and everyone else is smiling and nodding their head? That is artificial harmony. I want our team to be engaged, understand the data, connect with it, ask questions and comment. Data gives life to what is happening in the organization and I find it eliminates artificial harmony in meetings. 

2. Data builds better strategy and helps drive change. One cannot plan ahead without numbers and data analysis. In our meetings we like to discuss our current strategy and revisit our strategy map. We evaluate based upon real-time data and real outcomes rather than turning it into a guessing game. Strategy discussions don't have to drag out over multiple days or weeks; we can make quick decisions based on valuable content.

3. Sending reports to staff before we meet. When our sales team gets together to strategize, I like to send out data reports in advance so people can put their thoughts together prior to the meeting. Data helps us make great use of our time because reporting is an asset, not a headache.

4. If someone on our team wants to ask questions and drill deeper, we have the ability to do so on the spot and not let the passion or the question leave the boardroom. I've seen too many companies with great ideas and great questions; once they leave the boardroom, they more often than not disappear into the background. With our BI tool we can get instant, valuable insights and dig in multiple directions. This is where the data visualizations come into play – information simplified is information well retained. There's something so empowering about live data changing right in front of your eyes.

5. I believe that data should be shared by all; it should elicit passion in everyone. Data aligns departments, strategy and people. We have found in our own organization that performance increases when everyone is using analytics (or is shown the numbers) and tracking their performance, because it provides accountability in the workplace.

Data can always have a seat at our boardroom table. 

Show me the money - give them the info

When was the last time you had to justify a request for a large capital expenditure in your business?  Perhaps if you’re working in manufacturing, you need investments in machinery to make your production more efficient.  If you’re in the service industry, maybe more investment is required in soft skills training.   In today’s economy, business leaders are consistently operating under more scrutiny with respect to making financial decisions and increasing return on their investment for shareholders.

Oftentimes when training soft skills in manufacturing facilities, we will hear about the struggles that shop floor operators go through when trying to do their job.  A lot of the time, faulty or old machinery is blamed for inefficiencies and poor quality.  And sometimes it seems like requests for new machinery fall on deaf ears.  But think for a moment: perhaps it's that capital expenditures require a lot of supporting information before a purchase can be justified.  Approaching the negotiating table without that information is one reason why expenditures might not even be considered.

Make sure you are armed with the information you need to support an informed decision.  Where does this information come from?   It might come from quality figures captured on a production floor database system.    Machine uptime/downtime might be captured in a spreadsheet updated by the maintenance department.   Employee satisfaction rates are tracked in human resources software.   When requesting capital expenditures, every piece of information that you can gather together is beneficial in its justification.

BI analytics tools are available that can help you achieve this.  In particular, Qlik Sense is a very easy-to-use product that will enable you to pull information in from all of the various systems mentioned above.  Its interface supports strong visualizations and allows for very quick analysis.

So, the next time you find yourself asking the powers that be for more money for your shift, department or division, come prepared with the answers to all of the questions and figures to justify your request.  It might be a very pleasant surprise when the answer is ‘Yes’.

Qlik Sense repository failure to start after a Windows upgrade

This week we had an interesting support case to fix and I thought I would share the finding in case someone else has the same problem we did.

One of our customers had a Qlik Sense server that needed to have Windows (Windows 2012 R2 x64) updates applied. After Windows Update had completed, the server restarted, and upon restart the Qlik Sense Repository service would no longer start.

The error in the System_Repository.txt log was the following:

Unable to determine the provider name for provider factory of type 'Devart.Data.PostgreSql.PgSqlProviderFactory'

After reviewing the update logs, I found that many updates and security patches were applied to the .NET Framework installed on the server. I assumed that one of these updates had uninstalled the Qlik Sense PostgreSQL drivers and that I should attempt to re-register the missing DLL using the GACUtil.exe tool. After re-registering the DLL, the Qlik Sense Repository service still would not start.

At the same time, the Qlik Support team had responded to me. Based on their feedback, I was able to edit the server's machine.config file (found in %windir%\Microsoft.NET\Framework\v4.0.30319\config\machine.config) and fix the problem.

You will see that in the System.Data section of the machine.config file there is a second DbProviderFactories node. Removing this second entry and restarting the server fixed the problem. The Qlik Sense Repository started up and everything was back online.
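
For reference, the offending section looks roughly like the sketch below. This is only an illustration: the provider entries in your machine.config will differ, and the name, invariant and description attributes shown here are assumptions (only the Devart factory type from the error message above is taken from the actual log). The empty duplicate element is the one to delete.

<!-- %windir%\Microsoft.NET\Framework\v4.0.30319\config\machine.config (illustrative sketch) -->
<system.data>
  <DbProviderFactories>
    <!-- the registered providers, including the Qlik Sense PostgreSQL driver -->
    <add name="dotConnect for PostgreSQL"
         invariant="Devart.Data.PostgreSql"
         description="Devart dotConnect for PostgreSQL"
         type="Devart.Data.PostgreSql.PgSqlProviderFactory, Devart.Data.PostgreSql" />
  </DbProviderFactories>
  <DbProviderFactories/> <!-- duplicate node left behind by the update: remove this line -->
</system.data>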

A special thanks to Pierce @ Qlik Technical Support and to Mike over at stackoverflow.com for helping get this problem resolved.

Here is the full stacktrace of the problem from the logs.

8 20160309T111920.895-0500 ERROR XXXXXXX System.Repository.Repository.Core.Repository.Common.ModelRepository`1[[Repository.Domain.Model.LocalConfig, Repository.Domain, Version=2.1.1.0, Culture=neutral, PublicKeyToken=null]] 14 08513d28-2a50-43de-8973-903425e70657 NT AUTHORITY\SYSTEM Unexpected error in ExecuteGetAll Unable to determine the provider name for provider factory of type 'Devart.Data.PostgreSql.PgSqlProviderFactory'. Make sure that the ADO.NET provider is installed or registered in the application config.↵↓An exception was thrown while invoking the constructor 'Void .ctor()' on type 'DatabaseContext'. ---> Unable to determine the provider name for provider factory of type 'Devart.Data.PostgreSql.PgSqlProviderFactory'. Make sure that the ADO.NET provider is installed or registered in the application config. (See inner exception for details.) at System.Data.Entity.Infrastructure.DependencyResolution.DefaultInvariantNameResolver.GetService(Type type, Object key)↵↓ at System.Collections.Concurrent.ConcurrentDictionary`2.GetOrAdd(TKey key, Func`2 valueFactory)↵↓ at System.Data.Entity.Infrastructure.DependencyResolution.CachingDependencyResolver.GetService(Type type, Object key)↵↓ at System.Linq.Enumerable.WhereSelectArrayIterator`2.MoveNext()↵↓ at System.Linq.Enumerable.FirstOrDefault[TSource](IEnumerable`1 source, Func`2 predicate)↵↓ at System.Linq.Enumerable.WhereSelectArrayIterator`2.MoveNext()↵↓ at System.Linq.Enumerable.FirstOrDefault[TSource](IEnumerable`1 source, Func`2 predicate)↵↓ at System.Data.Entity.Infrastructure.DependencyResolution.CompositeResolver`2.GetService(Type type, Object key)↵↓ at System.Data.Entity.Infrastructure.DependencyResolution.DbDependencyResolverExtensions.GetService[T](IDbDependencyResolver resolver, Object key)↵↓ at System.Data.Entity.Internal.InternalConnection.get_ProviderName()↵↓ at System.Data.Entity.Internal.DefaultModelCacheKeyFactory.Create(DbContext context)↵↓ at System.Data.Entity.Internal.LazyInternalContext.InitializeContext()↵↓ at System.Data.Entity.Internal.InternalContext.Initialize()↵↓ at Repository.Core.Repository.Database.Common.AbstractDatabaseContext..ctor()↵↓ at lambda_method(Closure , Object[] )↵↓ at Autofac.Core.Activators.Reflection.ConstructorParameterBinding.Instantiate()↵↓ at Autofac.Core.Activators.Reflection.ConstructorParameterBinding.Instantiate()↵↓ at Autofac.Core.Activators.Reflection.ReflectionActivator.ActivateInstance(IComponentContext context, IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.InstanceLookup.Activate(IEnumerable`1 parameters)↵↓ at Autofac.Core.Lifetime.LifetimeScope.GetOrCreateAndShare(Guid id, Func`1 creator)↵↓ at Autofac.Core.Resolving.InstanceLookup.Execute()↵↓ at Autofac.Core.Resolving.ResolveOperation.GetOrCreateInstance(ISharingLifetimeScope currentOperationScope, IComponentRegistration registration, IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.ResolveOperation.Execute(IComponentRegistration registration, IEnumerable`1 parameters)↵↓ at lambda_method(Closure )↵↓ at Repository.Core.Repository.Common.EntityTransactionRepository.All[T](Boolean includeDeleted)↵↓ at Repository.Core.Repository.Common.EntityTransactionRepository.Find[T](Expression`1 expression, Boolean includeDeleted)↵↓ at Repository.Core.Repository.Common.ModelRepository`1.ExecuteGetAll(Expression expression, Boolean appendPrivileges, Int64 privilegesFilter, Func`2 queryModifier) 08513d28-2a50-43de-8973-903425e70657

9 20160309T111921.565-0500 ERROR XXXXXXX System.Repository.Repository.QRSMain 14 f49d0072-30c1-4e2e-bcca-8f016ff0ad38 NT AUTHORITY\SYSTEM Fatal exception Unable to determine the provider name for provider factory of type 'Devart.Data.PostgreSql.PgSqlProviderFactory'. Make sure that the ADO.NET provider is installed or registered in the application config.↵↓An exception was thrown while invoking the constructor 'Void .ctor()' on type 'DatabaseContext'. ---> Unable to determine the provider name for provider factory of type 'Devart.Data.PostgreSql.PgSqlProviderFactory'. Make sure that the ADO.NET provider is installed or registered in the application config. (See inner exception for details.)↵↓The "GetAll" operation failed↵↓An exception was thrown while invoking the constructor 'Void .ctor(Qlik.Sense.Logging.IQSLogManager, Repository.Core.INodeStaticInfo, Qlik.Sense.Common.Security.ISecuritySetup, Qlik.Sense.Common.Communication.REST.Server.IRESTEngineFactory, Repository.Core.ISystemInformation, Qlik.Sense.IO.ISystemFolderInformation, Repository.Core.Certificates.ICertificatePasswordVerificationWebService, Qlik.Sense.Common.Logging.ILogMaster)' on type 'SetupService'. ---> The "GetAll" operation failed (See inner exception for details.) at System.Data.Entity.Infrastructure.DependencyResolution.DefaultInvariantNameResolver.GetService(Type type, Object key)↵↓ at System.Collections.Concurrent.ConcurrentDictionary`2.GetOrAdd(TKey key, Func`2 valueFactory)↵↓ at System.Data.Entity.Infrastructure.DependencyResolution.CachingDependencyResolver.GetService(Type type, Object key)↵↓ at System.Linq.Enumerable.WhereSelectArrayIterator`2.MoveNext()↵↓ at System.Linq.Enumerable.FirstOrDefault[TSource](IEnumerable`1 source, Func`2 predicate)↵↓ at System.Linq.Enumerable.WhereSelectArrayIterator`2.MoveNext()↵↓ at System.Linq.Enumerable.FirstOrDefault[TSource](IEnumerable`1 source, Func`2 predicate)↵↓ at System.Data.Entity.Infrastructure.DependencyResolution.CompositeResolver`2.GetService(Type type, Object key)↵↓ at System.Data.Entity.Infrastructure.DependencyResolution.DbDependencyResolverExtensions.GetService[T](IDbDependencyResolver resolver, Object key)↵↓ at System.Data.Entity.Internal.InternalConnection.get_ProviderName()↵↓ at System.Data.Entity.Internal.DefaultModelCacheKeyFactory.Create(DbContext context)↵↓ at System.Data.Entity.Internal.LazyInternalContext.InitializeContext()↵↓ at System.Data.Entity.Internal.InternalContext.Initialize()↵↓ at Repository.Core.Repository.Database.Common.AbstractDatabaseContext..ctor()↵↓ at lambda_method(Closure , Object[] )↵↓ at Autofac.Core.Activators.Reflection.ConstructorParameterBinding.Instantiate()↵↓ at Autofac.Core.Activators.Reflection.ConstructorParameterBinding.Instantiate()↵↓ at Autofac.Core.Activators.Reflection.ReflectionActivator.ActivateInstance(IComponentContext context, IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.InstanceLookup.Activate(IEnumerable`1 parameters)↵↓ at Autofac.Core.Lifetime.LifetimeScope.GetOrCreateAndShare(Guid id, Func`1 creator)↵↓ at Autofac.Core.Resolving.InstanceLookup.Execute()↵↓ at Autofac.Core.Resolving.ResolveOperation.GetOrCreateInstance(ISharingLifetimeScope currentOperationScope, IComponentRegistration registration, IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.ResolveOperation.Execute(IComponentRegistration registration, IEnumerable`1 parameters)↵↓ at lambda_method(Closure )↵↓ at Repository.Core.Repository.Common.EntityTransactionRepository.All[T](Boolean includeDeleted)↵↓ at 
Repository.Core.Repository.Common.EntityTransactionRepository.Find[T](Expression`1 expression, Boolean includeDeleted)↵↓ at Repository.Core.Repository.Common.ModelRepository`1.ExecuteGetAll(Expression expression, Boolean appendPrivileges, Int64 privilegesFilter, Func`2 queryModifier)↵↓ at Repository.Core.Repository.Common.ModelRepository`1.ExecuteGetAll(Expression expression, Boolean appendPrivileges, Int64 privilegesFilter, Func`2 queryModifier)↵↓ at Repository.Core.Repository.Common.ModelRepository`1.GetAll(Expression expression, Boolean appendPrivileges, Int64 privilegesFilter)↵↓ at Repository.Core.Repository.Common.SecurityAwareRepository.RunWithoutSecurity[TResult](Func`1 func)↵↓ at Repository.Core.Settings.LocalConfigStash.GetFromRepository(LocalConfigKey key, ISecurityAwareRepository repository)↵↓ at Qlik.Sense.Common.Ioc.WorkScope.Work[T,TResult](Func`2 func)↵↓ at System.Collections.Concurrent.ConcurrentDictionary`2.GetOrAdd(TKey key, Func`2 valueFactory)↵↓ at Repository.Core.Settings.LocalConfigStash.TryGet(LocalConfigKey key, LocalConfigCachePolicy cachePolicy, ISecurityAwareRepository repository, LocalConfig& localConfig)↵↓ at Repository.Core.Settings.LocalConfigStash.Get[T](LocalConfigKey key, T defaultValue, LocalConfigCachePolicy cachePolicy)↵↓ at Repository.Core.SystemInformation.b__0(ISecurityAwareRepository repository)↵↓ at Qlik.Sense.Common.Ioc.WorkScope.Work[T](Action`1 action)↵↓ at Repository.Core.SystemInformation.GetTemporaryFolder(ISystemFolderInformation systemFolderInformation)↵↓ at Repository.Core.Services.SetupService..ctor(IQSLogManager logManager, INodeStaticInfo nodeInfo, ISecuritySetup securitySetup, IRESTEngineFactory restEngineFactory, ISystemInformation systemInformation, ISystemFolderInformation systemFolderInformation, ICertificatePasswordVerificationWebService certificatePwdWebService, ILogMaster logMaster)↵↓ at lambda_method(Closure , Object[] )↵↓ at Autofac.Core.Activators.Reflection.ConstructorParameterBinding.Instantiate()↵↓ at Autofac.Core.Activators.Reflection.ConstructorParameterBinding.Instantiate()↵↓ at Autofac.Core.Activators.Reflection.ReflectionActivator.ActivateInstance(IComponentContext context, IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.InstanceLookup.Activate(IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.InstanceLookup.Execute()↵↓ at Autofac.Core.Resolving.ResolveOperation.GetOrCreateInstance(ISharingLifetimeScope currentOperationScope, IComponentRegistration registration, IEnumerable`1 parameters)↵↓ at Autofac.Core.Activators.Reflection.ConstructorParameterBinding.Instantiate()↵↓ at Autofac.Core.Activators.Reflection.ReflectionActivator.ActivateInstance(IComponentContext context, IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.InstanceLookup.Activate(IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.InstanceLookup.Execute()↵↓ at Autofac.Core.Resolving.ResolveOperation.GetOrCreateInstance(ISharingLifetimeScope currentOperationScope, IComponentRegistration registration, IEnumerable`1 parameters)↵↓ at Autofac.Core.Resolving.ResolveOperation.Execute(IComponentRegistration registration, IEnumerable`1 parameters)↵↓ at Autofac.ResolutionExtensions.ResolveService(IComponentContext context, Service service, IEnumerable`1 parameters)↵↓ at Autofac.ResolutionExtensions.Resolve[TService](IComponentContext context, IEnumerable`1 parameters)↵↓ at Qlik.Sense.Common.Ioc.WorkScope.Work[T](Action`1 action)↵↓ at Repository.QRSMain.Main() f49d0072-30c1-4e2e-bcca-8f016ff0ad38

Do you have a data-driven culture? 7 things to consider

Today, business disruption is greater than ever. Organizations are facing a constant battle for continued success and survival. One of the biggest battles is knowing how far your company is willing to go to get those data insights.

We can no longer accept mediocrity in today's business world; we cannot wait for the data to find us. We need to accept the intuitive data and the non-intuitive data, and embrace both as a learning experience.

To ensure you are building a data-driven culture there are seven things you must consider:

1. Companies that are data-driven have clearly defined processes that support key metrics. Once these are in place, companies work hard to communicate them to staff and make sure staff understand the metrics and how their work contributions fit into them.

2. Data-driven cultures work hard to make sure there is only one single version of the truth. This enables users, keeps them up to date in a timely manner, and ensures data is fresh, relevant and centralized. Too often we see clients working in a siloed environment where data is not cross-functional, which decreases the value of the data.

3. To mitigate risk to the company, there is governed data access. Companies must define what these policies are and, once again, communicate them to staff.

4. Consistently measure and communicate progress when it comes to metrics. There is nothing worse than knowing there are metrics but not knowing if your company/team is actually working towards them. 

5. I can’t stress this point enough. It is so crucial to have the right resources at all times. Don’t hire people specifically based upon their skills. It’s important but it should not be the deciding factor. Because technology is changing so rapidly you need to hire according to their ability and willingness to learn. Let’s face it, the skills you hire for today may not be the skills you need 6 months from now, so you need people who are agile and able to adapt to new technology, new models and be good at problem solving. 

6. Naturally, following #5, it makes sense to invest in training. Because technology changes so quickly and teams face challenges often, it is important to invest in training to keep people feeling enabled, engaged and enthused. We can't expect people to figure it out themselves, so we must equip them with the proper tools to contribute to the data-driven culture we strive for.

7. Collaborate, collaborate, collaborate! We know different departments tend to speak very different languages, so it is essential to get your IT department and the rest of the business on the same playing field. Often there is a high level of disconnect because IT goals are different from management goals, and IT may have the technical background but lack the strategic analysis background. It's time to get everyone thinking with the right mindset.

Companies will always be learning in the world of analytics so don’t get discouraged. We do know one thing for sure, and that is that collecting and analyzing data is no longer optional.

Where to begin with your analytics

We hear this so many times: “I know I need analytics and better reporting but I don't know where to start.”  People feel overwhelmed by the task of either implementing analytics or improving what they have.  With this being the beginning of a new year, let's tackle this question of “where to start”.

Understand that you are definitely not alone.  As I said, we hear it often.  Others are also wrestling with this, but ignoring analytics and reporting is not really an option anymore.  Your competitors are leveraging their data to gain insight and knowledge.

The first step is to understand what is causing this feeling of being overwhelmed.  By understanding the root cause of that feeling, you can begin to create a plan to deal with it.  Is it a lack of knowledge about your options around analytics and business intelligence? Is it the volume or diversity of the data you are being asked to deal with? Maybe it is the sheer size of the solution, problem or project?

When you develop your plan, we advise you to break it down into small projects that last no more than three months.  You may end up with a series of projects, but this provides a number of benefits.  Firstly, projects of three months or less tend to be less complex than longer projects because the complexity has been broken into manageable pieces.  Secondly, by tackling smaller projects, we can evaluate the changing needs of the organization over time and, if need be, reprioritize the projects to address those changing needs.  And lastly, completing a three-month project allows for celebrating the win.  We can begin to build momentum and morale in the team rather than feeling like we are caught on a treadmill that we can't get off of.

Having decided to do smaller projects, what should the first project be?  The bottom line is that it should be the one that will deliver the most value.  It sounds straightforward, but deciding this can be daunting as well.  Many organizations will have defined methods like NPV, IRR or payback period.  Don't be overwhelmed by this if you don't currently have a method.  If you can't easily determine which project will deliver the most value, keep it simple.  Pick something that may be a little easier to accomplish (lower risk) and will allow some momentum to build with a quick completion.  Your first project will be as much about learning as it is about providing the organization with the solution.

The final step is to celebrate the win after completing your first project.  Throw your team a party or lunch, whatever works for your team.  Celebrating the project is also about showing it off.  Hold an open house or presentation to allow others in the organization to view what you have done.  It will build morale in the team and organization.  In addition, it will result in others beginning to think about what they truly need for analytics and reporting.

In the end, just move forward by getting started.  Staying put and doing nothing really means falling behind as others move beyond you.  

Is your dashboard stagnant?

I always find it interesting that as a company we go through planning processes each year but then fail to support the implementation and wind up scratching our heads as to why we didn’t reach our objectives.  Each year we develop or update our strategic plan.  We develop budgets for the coming year.  We spend countless hours planning our future, what we want it to look like and developing strategies to get us there.  And then we put them in a binder (or file folder) and shelve them until next year when we review them in preparation for the planning process again. 

There are many reasons that these game-changing plans get shelved.  We become absorbed in the day-to-day activities and don't act on strategies.  Or we may have budgeting processes that don't actually include the strategic plan.  But one of the biggest reasons is that we don't ensure the metrics we define and use to monitor our business truly reflect the new strategy that will propel us into the future we have envisioned.

We are likely still using the same dashboards and metrics that we defined many years ago.  Once we develop our dashboards and analytical solutions we tend to forget that they may not be forever.  Hopefully they reflected the strategy of the day when they were created but now they may no longer be relevant.  They therefore should have a sunset just the same as our strategic plan does.

We understand the need to refresh our strategic plan annually but we don’t do the same for the very thing that is used to measure its success.  Instead we continue to use the key performance indicators of the past and wonder why we are not achieving our goals from the current strategy.  The obvious disconnect is that we plan annually but we don’t change what we are going to use as a map to get there.  How do we know if we are achieving our plan if the metrics that we use to guide our business haven’t been updated to reflect our new plan?

I've written in the past about my thoughts on creating custom KPIs and how the KPIs we use to measure our business drive the behaviour of our employees and therefore our organization's results.  So if you are using metrics from several years ago and they don't reflect the strategies you are planning to implement to drive your company into the future, you shouldn't be surprised when you don't achieve your strategic objectives.

Do your organization (and your career) a favour and add “updating the metrics and dashboards” to your annual planning process to ensure you are positioned to reap the rewards of your plan.

Is your company disadvantaged?

Most companies collect mountains of data, with some not even realizing how much they really have.  The challenge for most is to turn this valuable asset into something that generates dollars.  Some companies sell their data or subsets of the data but for most of us, this is not an option.   So how do we leverage this very real asset?

If selling the data isn't an option then we must leverage it to make better decisions, faster and with accuracy.

Today, having actionable knowledge is the key to competitive advantage. In today's knowledge economy, what is differentiating companies is the knowledge they possess and what they can do with this knowledge.  It is about making the right decisions before the other guy and then acting on them before someone else does.

In its infancy, the internet was presented as the great equalizer.  Today what we see is those with the information and resources to do something with it are the companies that are leaping in front and dominating their market and industry.

Unfortunately, it appears that large companies have the advantage at this time.  They have been amassing large stores of information, they have the computing power and software to work with it, and they have people that can work with this information and turn it into something.

Sounds bleak for the rest of us, but there is an equalizer.  Qlik platforms allow non-analysts to analyze and begin turning all of that data into knowledge.  They now have the ability to turn the information into actionable knowledge. In the past, access to this information was kept to a select few: the analysts who had the tools to do something with the data.  But today, everyone can participate in leveraging the data they have access to, resulting in fact-based decisions that benefit their team, department and organization.

Qlik uses the concept of “Natural Analytics ™”.  Its platforms allow the exploration of data in a natural way that works the way our minds work.  With its tools you can leap from one insight to explore another and use your natural ability to spot patterns to reach evidence-based decisions quickly.  All of this power is available to everyone, from the smallest company to the largest enterprise.  It is a great equalizer.

 

 

Natural Analytics ™ is a registered trademark of Qlik.  

5 steps to creating custom key performance indicators

In order to create custom Key Performance Indicators (KPIs) and their supporting indicators, you have to plan where you want to take your organization.  You need to look at where you have been, what your strategic objectives are and how you measure the success of those objectives.  AND you have to have a clear understanding of the milestones that you must achieve on the way to those successes.

Performance indicators are powerful tools that can assist in shaping the behaviours of your staff and guide the direction of the company.  So it is important that the direction is set and the indicators are aligned with the plan.  This is one of the reasons we recommend a holistic approach to developing key performance indicators, so they work in concert with your organization's plan, people and processes.  We use our Delta P4 Methodology to accomplish this.

When you get down to actually creating the performance indicators, the following five-step process will assist you.

1. Discovery
The Discovery step involves the kick-off of the project, with meetings and communications to inform the organization and departments of what is being undertaken and the expected benefits to the organization and to themselves.  This is also the first opportunity to begin managing the expectations of the various stakeholders in the analytics project.

During the Discovery step, the organization's existing strategic plan and objectives are reviewed, as well as any departmental plans and objectives and business cases related to the project.  As mentioned above, it is key to have alignment of the performance indicators with the plans of the organization.

2. Problem Definition
During the Problem Definition step, the various stakeholder groups for the project are identified and sessions are held with representatives of those groups to determine their needs and requirements.  Hearing directly from the various groups that will be using, or impacted by, the analytics project helps increase the success rate of the final solution.

3. Determining Measurements
Once the problems and challenges have been defined, the KPIs and metrics are determined.  Specifically, the measurements needed to support and ensure that the objectives and goals are being achieved are fleshed out.  As part of this process, the data currently being collected and available is reviewed to understand whether there are any gaps between what is needed to measure success and what the current data will support.  This step is best accomplished by utilizing the subject matter experts within the organization.

4. Design Solutions
The Design Solutions step focuses on the output of the previous steps to begin developing a solution.  The purpose of this step is not the development of the actual solution, but rather developing the requirements and possibly a look and feel of what will eventually be the final solution.  If any data gaps were identified in the previous step, these are examined and potential follow-up projects are put forth for consideration to close those gaps.

5. Review
The last step in the process is to review the outcomes with management and the stakeholders to ensure that a complete understanding has occurred and the correct solution(s) will be created.  It is also an opportunity to step back and ensure that what is being proposed still aligns with the organization’s strategic plans. 

This step is vital.  If adjustments need to be made to the final proposed solution, better they occur at this stage rather than when the solution is being rolled out to the users.

Using these steps, you can create custom Key Performance Indicators that help drive your organization to achieve its goals.  Remember, just as an organization will review their strategic plan annually to ensure that it is relevant and still steering them in the right direction, the Key Performance Indicators also need to be reviewed to ensure that they are still relevant and aligned with the plan, people and process of the organization.

Upgrading to Qlik Sense 2.0 - Applications not migrated after update (Hostname/IP doesn't match certificate's altnames)

Here at BizXcel, we just upgraded our Qlik Sense 1.1 server to Qlik Sense 2.0.1 and had a couple of “growing” pains that I want to share, along with how to work around them.  The upgrade to Qlik Sense 2.0 was simple and didn't produce any errors, except that when we logged in to the /hub no applications were visible to any users.  After reviewing the Apps section of the QMC, I noticed that all of our applications had a Migration status of 'Unknown'.  This got me thinking that any Qlik Sense 1.1 applications had to undergo an upgrade from 1.1 to 2.0 as well.  After looking around, I was able to find the migration logs located at 'C:\ProgramData\Qlik\Sense\Log\AppMigration' in a default install.  Once I opened the logs I found the following message.

Logger  Severity               Date      MicroSeconds   Message

17b3e342-7d61-4abd-ab07-0fc9c51e3c94              INFO      2015-07-02T17:04:18.038Z            576059360.5       Initial state set to 'Migration pending'

17b3e342-7d61-4abd-ab07-0fc9c51e3c94              INFO      2015-07-02T17:04:18.040Z            576061226           Transitioned state from 'Migration pending' to 'Migration in progress'

17b3e342-7d61-4abd-ab07-0fc9c51e3c94              WARN   2015-07-02T17:04:18.256Z            576277020.1       Migration failed: Hostname/IP doesn't match certificate's altnames: Error: Hostname/IP doesn't match certificate's altnames\n    at SecurePair.<anonymous> (tls.js:1389:23)\n    at SecurePair.emit (events.js:92:17)\n    at SecurePair.maybeInitFinished (tls.js:979:10)\n    at CleartextStream.read [as _read] (tls.js:471:13)\n    at CleartextStream.Readable.read (_stream_readable.js:340:10)\n    at EncryptedStream.write [as _write] (tls.js:368:25)\n    at doWrite (_stream_writable.js:225:10)\n    at writeOrBuffer (_stream_writable.js:215:5)\n    at EncryptedStream.Writable.write (_stream_writable.js:182:11)\n    at write (_stream_readable.js:601:24)\nFrom previous event:\n    at Function.Promise$Defer (C:\\Program Files\\Qlik\\Sense\\ServiceDispatcher\\Node\\migration-service\\src\\node_modules\\bluebird\\js\\main\\promise.js:267:13)\n    at new e (C:\\Program Files\\Qlik\\Sense\\ServiceDispatcher\\Node\\migration-service\\src\\migrate\\mocks\\deferred.js:1:127)\n    at Object.o.rpc (C:\\Program Files\\Qlik\\Sense\\ServiceDispatcher\\Node\\migration-service\\web\\assets\\core\\models\\rpc-session.js:1:2575)\n    at o.rpc (C:\\Program Files\\Qlik\\Sense\\ServiceDispatcher\\Node\\migration-service\\web\\assets\\core\\models\\engine.js:1:2297)\n    at o [as openDoc] (C:\\Program Files\\Qlik\\Sense\\ServiceDispatcher\\Node\\migration-service\\web\\assets\\core\\models\\engine.js:1:1452)\n    at Object._.openApp (C:\\Program Files\\Qlik\\Sense\\ServiceDispatcher\\Node\\migration-service\\web\\assets\\core\\models\\engine.js:1:2706)\n    at C:\\Program Files\\Qlik\\Sense\\ServiceDispatcher\\Node\\migration-service\\src\\migrate\\migrate.js:1:959\nFrom previous event:\nFrom previous event:\n    at new Promise (C:\\Program Files\\Qlik\\Sense\\ServiceDispatcher\\Node\\migration-service\\src\\node_modules\\bluebird\\js\\main\\promise.js:84:37)

17b3e342-7d61-4abd-ab07-0fc9c51e3c94              INFO      2015-07-02T17:04:18.257Z            576278677.3       Transitioned state from 'Migration in progress' to 'Migration failed'

17b3e342-7d61-4abd-ab07-0fc9c51e3c94              INFO      2015-07-02T17:04:18.258Z            576279473.3       Notifying callback URI

17b3e342-7d61-4abd-ab07-0fc9c51e3c94              WARN   2015-07-02T17:04:18.289Z            576310403.4       Failed to notify callback URI 'https://XXX.XXX.XXX.XXX:4242/qrs/app/17b3e342-7d61-4abd-ab07-0fc9c51e3c9...Hostname/IP doesn't match certificate's altnames

17b3e342-7d61-4abd-ab07-0fc9c51e3c94              INFO      2015-07-02T17:04:18.290Z            576311158.9       Log closed, migration completed

If you look closely at this log, you will see two errors with the same error message, “Hostname/IP doesn't match certificate's altnames”.  The first error is from the invocation of the migration script, and the second is from the attempt to report the migration status back to the console.

To fix the problem, you need to make two changes to the Qlik Sense 2.0 Migration Service application files.  Start by stopping the Qlik Sense Repository Service (with the related Scheduler, Engine and Proxy services) and the Qlik Sense Dispatcher.

The first change is to promise.js, located at C:\Program Files\Qlik\Sense\ServiceDispatcher\Node\Migration-Service\src\node_modules\bluebird\js\main\promise.js.  Between lines 84 and 85, insert the following new line: 'process.env.NODE_TLS_REJECT_UNAUTHORIZED = 0;'.
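
As a rough sketch, the edited region ends up looking something like this (only the inserted line matters; the surrounding lines are simply whatever ships with your install):

// promise.js (bluebird), around lines 84-85 -- illustrative sketch of the workaround
// ... existing line 84 ...
process.env.NODE_TLS_REJECT_UNAUTHORIZED = 0; // inserted: tell Node.js not to reject the self-signed certificate
// ... existing line 85 ...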

The second change is to request.js in C:\Program Files\Qlik\Sense\ServiceDispatcher\Node\Migration-Service\src\utils\request.js.  In this file you will find 'rejectUnauthorized: !0,'; change it to 'rejectUnauthorized: 0,'.
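
Again as a sketch, the relevant option goes from strict to permissive (the rest of the options object stays as shipped):

// request.js -- HTTPS request options used by the migration service
// before: rejectUnauthorized: !0,   (!0 evaluates to true, so the certificate must validate)
// after:
rejectUnauthorized: 0,               // falsy: accept the certificate even though the hostname/IP doesn't match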

Once you have completed these two changes, you can start the Repository, Scheduler, Engine, Proxy and Dispatcher services back up.  Then log into the QMC, go to Apps and select the application you would like to migrate to 2.0.  Once the application is selected, press the Migrate button at the bottom of the screen and it should now migrate without any problems.

I believe the issue we encountered comes from how Node.js handles self-signed certificates that use IP addresses instead of hostnames; i.e. our Qlik Sense server uses XXX.XXX.XXX.XXX instead of qliksense.bizxcel.com. Reading https://github.com/chilts/awssum/issues/164 gave me the information needed to work around this problem. I do recommend reverting both changes once the application migrations are done, so that certificate validation is restored and no security weakness is left in the code going forward.

Happy Qlik’ing,

Lucas

 

Qlik Sense as a platform for change

iStock-694005116.jpg

Over the past year, as I have been learning Qlik Sense, I have realized that there is something fundamentally different about Qlik Sense when compared to QlikView. It’s not that Qlik Sense is based on cutting-edge web standards such as WebSockets, or that it uses intelligent visualizations; it’s that it is designed from the ground up to change organizations. It’s built using all of these fancy things to help people understand what is happening in a way they never could before.

Over the last 10 years, a lot of consultants have talked about the “paperless” office. To me, the objectives of the paperless office are very noble, but most organizations haven’t reached their goals because of one fundamental issue: people like paper! People can touch it, feel it or even smell it if they want to. Also, it doesn’t occupy much space on their desk. Moving from paper to digital has been hard because of that. To me, the biggest innovation to happen since the paperless movement began (but not because of it) is mobile computing. Laptops, tablets, phones and watches have become part of people’s everyday lives and are being used by everyone, regardless of where they are! This computing revolution happened too late to boost the paperless one, but it is having an impact on it.

Enter Qlik Sense, with its mobile-first development model where organizations can build once and run anywhere. Now, instead of relying on static paper reports, teams can use their own devices (BYOD) to get any information they need. No longer is it necessary to say, “I’ll have to get that report”; it should be at their fingertips on their phone. Do you need to compare this year’s vs. last year’s financials to spot a change during a meeting? No longer is it necessary to find a computer and print what is needed. Everyone can pull out their phone and just get the information they need, with no trees killed in the process!

Another philosophical shift that Qlik had in mind when designing Qlik Sense was the “drink your own medicine” mantra. Cutting-edge companies such as Amazon and Google follow it when developing their systems and build better systems because of it. What this means is that there are no more hidden or vendor-only APIs: what they use, you get to use as well! For Qlik Sense, all of the visualizations that are available out of the box use the same APIs that third-party developers get to use when developing their own extensions. When you look at Qlik Sense from this perspective, it becomes a platform for development, not just a visualization tool! Developers can use Qlik Sense to power visualizations built with libraries such as D3.js and many others out on the web. Developers can also see what other people are doing by taking a look at http://branch.qlik.com.
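
To give a feel for that point, here is the rough shape of a Qlik Sense visualization extension. This is only a minimal sketch; the title rendering below is a placeholder, not something the built-in charts actually do:

// A bare-bones extension skeleton using the same paint() contract that
// the out-of-the-box charts are written against.
define([], function () {
    "use strict";
    return {
        paint: function ($element, layout) {
            // layout carries the evaluated properties (and hypercube data when
            // one is defined); a real extension would hand this off to a
            // rendering library such as D3.js instead of this placeholder.
            $element.html("<div>" + layout.title + "</div>");
        }
    };
});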

Qlik Sense capabilities can be taken outside of the Qlik Sense walls as well. Often, it’s important to show team members production or sales information on a regular basis. With traditional tools, a person has to navigate to a separate system and look for the information themselves. This takes time, and if it is cumbersome it just won’t get done. With Qlik Sense, you can take the information to them. All of the visualizations can be integrated into other systems by embedding them right into the systems users already work in every day. If you need yesterday’s sales numbers or call volumes in a corporate portal, you can simply embed the chart with Qlik Sense and users get to see the numbers every time they visit the site. Now users don’t have to go anywhere for the information; they just get it! It might only save them 30 seconds, but that’s one more call your sales team can be making!
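
As a rough illustration, here is a minimal sketch of embedding an existing chart in another web page with Qlik Sense’s mashup-style JavaScript API. The server address, app ID, object ID and div ID are placeholders, and the page is assumed to have already loaded RequireJS and the qlik.js bootstrap from the Qlik Sense server:

require(["js/qlik"], function (qlik) {
    // Open the app that holds the chart we want to surface in the portal.
    var app = qlik.openApp("your-app-id", {
        host: "qliksense.example.com",
        prefix: "/",
        port: 443,
        isSecure: true
    });
    // Render the existing "yesterday's sales" chart into a div on this page.
    app.getObject("salesChartDiv", "your-object-id");
});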

Qlik Sense isn’t just another BI tool; it has the power to change how organizations work. Qlik created a platform for organizations to build upon, one that moves them towards better-informed team members and better profits for shareholders.

Setting up a generic LDAP user directory connector (UDC)

One of the requirements for our Qlik Sense server setup was to create a UDC to pull in users from our LDAP source.

This being my first time creating a UDC, I started by watching the YouTube video http://youtu.be/40GjDjvEhZ8 that was created by Michael Tarallo. It’s a nice, simple video that explains what a UDC is and how it can be used. It also shows how to create a few sample connectors.

Once I had some basic knowledge of how UDCs work, I followed the documentation available in the Qlik Sense help to set up and configure the connector.

**Important. I suggest using a separate LDAP tool to confirm that your credentials and information are correct. This makes the LDAP UDC creation process simpler since you can eliminate an extra layer of potential problems.

Here are some tips and things to pay attention to when configuring the connector options.

Editing the UDC

  • Path
    • Using your LDAP tool to get the exact path and pasting it in here will ensure you have it correct. It’s easy to miss spaces, commas or other characters, so double-check (see the sample values after this list).
  • LDAP filter
    • Since I don’t write LDAP filters often, I found it helpful to use an LDAP tool that generated the filter for me; I then just pasted it into this field.
  • Page size of search
    • The documentation mentions this, but I wanted to point it out as well. This field often needs to be left with no value.
  • Directory Entry Attributes
    • These values are case sensitive and need to match up with your specific LDAP configuration. This is another good reason to use an outside LDAP tool since you will need to verify the attributes for your LDAP setup.
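
For illustration, here is the general shape of the values those fields end up holding. The host name and DN components below are placeholders for your own directory, so don’t copy them as-is, and the exact filter syntax depends on your LDAP schema:

Path:         ldap://ldap.example.com:389/ou=people,dc=example,dc=com
LDAP filter:  (&(objectClass=inetOrgPerson)(memberOf=cn=qlik-users,ou=groups,dc=example,dc=com))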

Errors/Debugging

  • In most cases your logs will be located here: C:\ProgramData\Qlik\Sense\Log
  • Make sure to pay attention to all levels of logging as some important messages will be classified under the INFO level.
  • When I experienced odd errors or behaviour I found a simple restart of the Qlik Sense services and QMC helped.

Syncing

  • If you have a large number of users to sync, I suggest limiting the number you sync until you are confident that all the attributes are correct. You could accomplish this with an LDAP filter. This will save you the time of having to delete the “bad” users afterwards.

Following the help documentation and making some minor changes to the default field values, I was able to get the LDAP UDC working. The Qlik community is a great source for more information and help if you run into any issues.

Qlik Sense server installation

Eric Delorme, a member of our IT team, recently had the opportunity to install a basic version of the Qlik Sense server platform for the first time. This particular installation was set up as a single-node Qlik Sense environment with all services installed on the same machine. Overall, he found the installation fairly straightforward.

He started by downloading the appropriate version of the Sense Platform. The steps required for accessing and downloading the correct version are outlined in the Download Video link below.

After downloading the setup file he followed the steps outlined in the ‘Qlik Sense Quick Installation Guide’, which is available through the Installation PDF Guide link below. The guide is well written, easy to follow, and should assist you in getting Sense successfully installed.

***Important. When you get to step 10 of the ‘Quick Step by Step Installation’, double-check that you have entered the correct machine address. This is where he made his mistake: he accidentally typed an incorrect value, which resulted in having to re-install Qlik Sense later. If you need to uninstall Qlik Sense, the Qlik community has information on the necessary steps. We've also provided a link to a post that goes over what needs to be completed for a successful uninstallation.

Eric followed all the on-screen instructions from the installation wizard and completed the installation. Since he had initially typed in the incorrect machine address he wasn’t able to connect, but some searching in the online Qlik community led him to the problem and its solution.

Other than having to re-install because of that user error, the installation went well and he was able to successfully install Qlik Sense Server. If you encounter any errors during the installation, we suggest starting with the ‘Troubleshooting’ section of the PDF and then visiting the online Qlik community for further assistance.

 

Helpful links:

Connecting to web services with QlikView

iStock-800364688.jpg

Over the past while, I have been on a bit of a web services kick with QlikView. Unfortunately for me, the software’s standard GET connection didn’t really allow me to return much data directly into QlikView. I kept running into a couple of different issues: either the web service returned JSON or it required a POST command. As QlikView does not support these types of services, I had to do some home-brewing to get them to work in QlikView. After some thought and some time, I created a Java framework that can be customized by anyone who needs it. Hopefully I’ve made your life easier if you ever come across this scenario! All you’ll need is a little Java know-how and access to Google.

The code base that was created has been uploaded to GitHub (see link below) and has been listed on http://branch.qlik.com. While it is meant to get someone up and running fast, I wouldn’t consider it production material (use at your own risk), so please feel free to modify it as needed!

The Java code is based on the testing framework provided as part of JBoss’s RESTEasy platform. The web server accepts HTTP GET commands and translates them into the appropriate HTTP command, such as POST. The data stream is fetched in Java and then returned to QlikView in a friendly format (XML).

There are some sample proxies included. The first is /general/Get2Get_JSON2XML, where the code proxies a GET to another GET but converts the JSON returned from the web service into XML that QlikView can understand.

There are two proxies included for connecting to Infusionsoft. These proxies can be accessed at ‘/Infusionsoft/DataQuery’ and ‘/Infusionsoft/DataCount/DataCount’. Stay tuned to future blog posts for further instructions on how to use them.

If you are trying to run the program for the first time, follow these instructions.

  1. Download (or clone) the Maven project from GitHub (see link below)
  2. Import the extracted project into Eclipse (or another IDE)
  3. Using the m2e plugin, run a Maven build with the goals ‘clean compile package’ on the imported project
  4. Once the project has built, you can use the Start.java test case to run the service.
  5. Once the service has started, you can use QlikView to connect to various web services.

 

To test the installation, we can try accessing geo information. Using the /general/Get2Get_JSON2XML proxy, we can connect to the Data Science Toolkit and get geo information based on a latitude and longitude. This is a pseudo reverse-geocoding script for when you have the latitude and longitude but no city information.

 

// Start of [37.769456,-122.429128] LOAD statements
politics:
LOAD name,
    friendly_type,
    code,
    type
FROM [http://localhost:8080/general/Get2Get_JSON2XML?url=http://www.datasciencetoolkit.org/coordinates2politics/37.769456,-122.429128] (XmlSimple, Table is [array/politics]);
 
location:
LOAD longitude,
    latitude,
    %Key_location_6B88F4B6C3E6C05D    // Key for this table: array/location
FROM [http://localhost:8080/general/Get2Get_JSON2XML?url=http://www.datasciencetoolkit.org/coordinates2politics/37.769456,-122.429128] (XmlSimple, Table is [array/location]);
// End of [37.769456,-122.429128] LOAD statements 

 

 

And there you have it! A quick example of how you can access non-traditional web services with QlikView; all you have to do is take it and run!

Stay tuned for blog posts containing more innovative, out-of-the-box QlikView solutions!

 

Downloads:

Lucas