A core component of HyperAutomation in a data-driven organisation is the ability to automate analytics. Organisations everywhere understand a basic truth in today’s economy: No enterprise can lead – or even survive – without analytics.
But not all analytics platforms are created equal. Choosing the right one is essential to making discoveries that have the power to drive real change. Too often, however, the evaluation is based on narrow criteria around features and functions alone, limiting the impact analytics can bring to your organisation.
This guide will arm you with the knowledge you need to confidently choose a modern data analytics platform.
Source of Value
BI drives value against both your short- and long-term data analytics goals. Examples of how a data analytics platform will drive value for your business include:
- Better understand your KPIs and what’s happening in your organisation (and why)
- Empower your workforce to make better decisions from actionable or active intelligence*
- Solve a specific business problem in your industry
- Uncover hidden insights in your data and communicate them
- Better leverage your assets and finances via meaningful alerts
- Create advanced analytics to support and forecast large strategic decisions
- Include analytics on external portals for partners and customers
- Build new types of analytics for unique challenges
- Democratize analytics by embedding them in operational apps
* “Active Intelligence is a state of continuous intelligence where the technology and the processes support the triggering of embedded actions from real-time, up-to-date data.” – Elif Tutuk, Vice President, Innovation and Design, Qlik, QlikWorld 2021
Avoid Business Velocity Pitfalls and Maximise the Benefits to Stakeholders
If you want analytics to have a widespread impact on your organisation in the short and long term, look for a platform that will help all your users become more data-literate.
Ensure that the data analytics platform targets broad user communities in your organisation rather than a small number of more skilled analysts. Restricting analytics to specialists creates a bottleneck and impedes your business velocity.
Even better, provide different user groups with meaningful visualisations and data experiences, to accelerate the adoption process.
Similarly, look for potential limitations such as:
- Only select users having access to dashboards, reports, and self-service capabilities
- The data analytics platform not catering to a mobile workforce
- An inability to embed analytics in operational applications and workflows
“The capabilities of your data analytics platform can also enhance or curb your business velocity.” – Mike Capone, Chief Executive Officer, Qlik, QlikWorld 2021
A data analytics platform that uses a query-based architecture will struggle to bring many different data sources together. SQL joins can leave data behind or introduce errors such as double counting. A query-based architecture is also unable to provide a user with a synchronised, timely, global view of the data, which is essential for data exploration. In terms of business velocity, query-based architectures slow down analytics through poor performance at scale and delayed calculations.
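The double-counting risk is easy to reproduce. In this hypothetical example, an order that ships in two boxes is counted twice when its amount is summed through a join:

```python
import sqlite3

# Hypothetical orders/shipments tables to illustrate join fan-out.
con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE shipments (order_id INTEGER, box TEXT)")
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 100.0), (2, 50.0)])
# Order 1 ships in two boxes, so it appears twice on the shipment side.
cur.executemany("INSERT INTO shipments VALUES (?, ?)",
                [(1, "A"), (1, "B"), (2, "C")])

true_total = cur.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
joined_total = cur.execute(
    "SELECT SUM(o.amount) FROM orders o "
    "JOIN shipments s ON o.order_id = s.order_id"
).fetchone()[0]

print(true_total)    # 150.0
print(joined_total)  # 250.0 -- order 1's amount is double-counted
```

Unless every aggregation is carefully written around such fan-outs, query-based reporting quietly inflates the numbers.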
Qlik Data Analytics (QDA) uses an associative engine to overcome the crippling limitations of query-based architecture. Not only can all your data be combined, but the associative engine will index the relationships between values in your data and create a compressed binary cache in memory, optimised for dynamic calculations. This approach allows for global, real-time exploration of data and instant response times, all while scaling to great numbers of users and large, complex data sets.
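As a simplified illustration only (not Qlik’s actual engine), an associative model can be sketched as an index from every field value to the rows containing it, so a selection in one field immediately reveals the associated values in every other field:

```python
# Toy associative index: (field, value) -> set of row ids.
# Purely illustrative; real engines add compression and dynamic calculation.
from collections import defaultdict

rows = [
    {"region": "EMEA", "product": "Bikes", "year": 2020},
    {"region": "EMEA", "product": "Cars",  "year": 2021},
    {"region": "APAC", "product": "Bikes", "year": 2021},
]

index = defaultdict(set)
for rid, row in enumerate(rows):
    for field, value in row.items():
        index[(field, value)].add(rid)

def associated(field, value, other_field):
    """Values of other_field associated with the selected value."""
    return {rows[rid][other_field] for rid in index[(field, value)]}

print(associated("region", "EMEA", "product"))
```

Because the relationships are pre-indexed, exploring an association is a set lookup rather than a fresh query against every source.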
Financial Considerations, TCO, and ROI
For an analytics platform, Total Cost of Ownership (TCO) can include a number of factors on top of the initial licensing fees.
Beware of hidden costs, including:
- Software subscription (licensing and maintenance) for the core analytics product, third-party products, and required underlying technologies
- Cloud Infrastructure (Hardware costs), including servers for production and development, and maintenance
- Internal ongoing support costs, such as IT, database administration, and vendor management
- Implementation costs and professional services
- Ongoing costs of Software-as-a-Service (SaaS) offerings
- User training and enablement
- Network, compute, and storage costs
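To see how recurring costs dominate TCO over time, here is a hypothetical comparison with made-up figures:

```python
# Hypothetical multi-year TCO comparison; all figures are illustrative.
def tco(costs: dict, years: int = 3) -> float:
    """Total cost of ownership: one-off costs plus recurring costs over N years."""
    return costs["one_off"] + years * costs["recurring"]

platform_a = {"one_off": 50_000,    # implementation + professional services
              "recurring": 30_000}  # licences, support, training, infrastructure
platform_b = {"one_off": 10_000,    # cheaper to start...
              "recurring": 45_000}  # ...dearer to run

print(tco(platform_a))  # 140000
print(tco(platform_b))  # 145000
```

The platform with the lower entry price is already the more expensive one by year three, which is why TCO, not licence fees, should drive the comparison.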
Return on Investment (ROI) comes from features and functionality that add value in terms of time (automation and alerting), accuracy, and intelligence.
Analytics Use Cases
A data analytics platform should have the capability to accommodate all your use cases within a unified, governed and secure framework.
For example, reporting is just as important as it was decades ago, so you shouldn’t need separate architecture to support it. And if self-service data visualisation is high on your priority list, you’ll also want to think about how your business users – who need more than just static reports – will consume visualisations and dashboards that analysts produce. Finally, with even more innovative use cases like immersive and conversational analytics on the horizon, you’ll want a fully open platform that can be customized and extended to support every possibility.
Take your users into account. You have a broad range of users, all with different skill sets. This includes the business analysts who build visualisations and analytics, but also business users who want to interactively explore, upper-level management and executives, external clients, partners, and beyond. Then there are the more specialised users: ‘citizen’ data scientists, data managers, developers, and IT administrators.
Your analytics platform should give everyone – regardless of their skill set (level of data literacy and system familiarity) – the power to make discoveries in your data and drive action.
Self-service analytics are the bedrock of a data-driven culture. Self-service analytics tools empower non-technical business users to make data-informed decisions. Using these tools, a business user can get the up-to-date information they need without relying on IT or an analyst. This can include informed guidance in the form of recommendations.
Combining self-service analytics with training and streamlined business processes will increase data literacy across all business functions.
An often overlooked component of self-service analytics is the ability to collaborate, particularly at scale. For this to happen, your data analytics platform must put an end to data silos. To succeed, you need to adopt the fourth wave of BI, which includes process automation and a data-driven culture.
Analytics Apps and Dashboards
It is crucial to consider complexity when choosing an analytics platform.
Qlik® is a complete BI platform that runs on the cloud, making it faster and easier to set up than other popular platforms.
Qlik has streamlined the data journey from the data source to the data consumer. Other platforms require many components to act as data gateways and to facilitate data loading, whereas Qlik SaaS is cloud-managed and features centralised data load, prep, and modelling.
Large communities of less skilled users need HyperAutomation, not static reports. They need a way to search and explore data – uncovering patterns, connections, and insights that drive meaningful decisions. Interactive dashboards and guided analytics apps let them do just that, benefitting a wide variety of business users, managers, and executives.
Centrally deployed apps
Centrally deployed apps are the first use case of Qlik Data Analytics, fundamental to how a business operates.
- There are intuitive authoring tools for the rapid development of dashboards and analytics applications
- There are application-level controls and functionality for creating an interactive experience, including sliders, buttons, layout options, etc.
- An application should guide a user through a linear process of exploration
- Data needs to be able to be reduced dynamically, allowing the same applications to be deployed with different subsets of data for users based on entitlements
- Data and visualizations should be packaged and deployed together within applications
- Apps should be designed for broad deployment, to large communities of users, across geographies, without performance loss
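The dynamic data-reduction point above can be sketched as follows; the entitlement rules and data here are hypothetical:

```python
# Entitlement-based data reduction (hypothetical rules): the same app
# definition is deployed to each user with only the rows they may see.
data = [
    {"region": "EMEA", "sales": 120},
    {"region": "AMER", "sales": 200},
    {"region": "APAC", "sales": 80},
]

entitlements = {"alice": {"EMEA", "AMER"}, "bob": {"APAC"}}

def reduce_for(user: str) -> list:
    """Return only the rows the user is entitled to see."""
    allowed = entitlements.get(user, set())
    return [row for row in data if row["region"] in allowed]

print(len(reduce_for("alice")))  # 2
print(reduce_for("bob"))         # [{'region': 'APAC', 'sales': 80}]
```

The key design point is that the app is authored once; the reduction happens at deployment or session time, so every user group gets the same experience over a different slice of data.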
Embedded and Custom Analytics
Embedded analytics integrate data analytics capabilities within business applications or information services anywhere they deliver business value. Embedded analytics provide tailored, in-context insights that accelerate specific actions to address gaps and opportunities, or enhance decision-making ability.
Embedded analytics can range from a simple solution such as embedding objects in web mash-ups to far more complex use cases such as embedding a wide range of analytic capabilities with visualizations directly into applications, such as ERP, CRM or financial management – to help provide relevant insights immediately within a business user’s typical working environment.
Qlik Sense apps, sheets and visualizations can be embedded in (for example):
- Web applications
- Intranet and extranet sites
There are two ways of embedding the Qlik Sense content:
- iFrame integration using the Single Integration API
- Div integration using the Capability APIs (Mashups)
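As a sketch, the iFrame approach boils down to constructing a URL. The `/single/` path with `appid` and `sheet` parameters follows Qlik’s documented Single Integration API pattern; the server name, app id, and sheet id below are placeholders:

```python
# Build a Single Integration API URL for iFrame embedding.
# Server, app id, and sheet id are hypothetical placeholders.
from urllib.parse import urlencode

def single_url(server: str, app_id: str, sheet_id: str,
               options=("ctxmenu", "currsel")) -> str:
    """Return a /single/ URL suitable for an <iframe src=...> attribute."""
    params = urlencode({"appid": app_id, "sheet": sheet_id,
                        "opt": ",".join(options)})
    return f"https://{server}/single/?{params}"

url = single_url("qlik.example.com", "my-app-id", "my-sheet-id")
print(url)
```

Div integration via the Capability APIs is more involved, but offers finer control over individual objects than a whole embedded sheet.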
The Qlik Sense® platform allows you to integrate custom data sources by using the Qlik data eXchange (QVX) SDK. It encapsulates the logic in a custom connector to provide a seamless user experience.
Extending Qlik Sense with custom visualizations
Qlik Sense visualization extensions enable new ways of visualizing data and enhance user input and interaction. Visualization extensions are regular Qlik Sense objects to which you have added your own rendering code. They can be charts of different types, or other items like tables or filter panes. In QlikView®, these types of extensions are referred to as Object Extensions.
To expand Qlik Sense, you can integrate it into your own software using the Qlik Sense APIs and SDKs. Qlik Sense can, for example, be integrated into a Windows application or a web-based application.
You can also leverage the Qlik engine in your applications, as well as automate common tasks and build your own client.
Remote work is not a temporary adjustment. According to Growmotely’s Future of Work report in 2020, 61% of professionals answered that they would prefer a fully remote position in the future.
With Qlik, things like data prep and app development are done via the web, so there’s no need to download or maintain tools on your desktop or laptop. SaaS with Qlik is also 100% cloud-based, so you can avoid having to deploy products across desktops, servers, and the cloud. Furthermore, Qlik runs on your preferred cloud solution, such as AWS, Azure, Qlik’s own cloud, or any other.
Reporting and Alerting
The aim of every manager is to leverage data to increase business velocity and therefore business value. Monitoring, measuring, and managing by exception is made easier with automated alerting tools that enable efficient communication of exceptions requiring attention.
Examples of when you might need alerts that leverage data to increase business velocity:
- Supply Chain – Inventory falls below a threshold
- Systems – Data quality issues
- Vertical – Consumer Notification
- Contracts – Renewals
- Social – Updates
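The supply-chain example above amounts to a threshold rule. A minimal sketch of exception-based alerting, using hypothetical inventory data and a made-up reorder threshold:

```python
# Exception-based alerting sketch: evaluate each metric against its rule
# and surface only the exceptions. Data and threshold are hypothetical.
inventory = {"widget": 3, "gadget": 42, "gizmo": 7}
REORDER_THRESHOLD = 10

alerts = [f"Reorder {item}: only {qty} left"
          for item, qty in inventory.items() if qty < REORDER_THRESHOLD]

for alert in alerts:
    print(alert)
```

In a real platform the rule runs continuously against live data and the exceptions are pushed to the responsible user, rather than waiting for someone to open a report.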
Platform-Wide Analytics Capabilities
Another area to consider is your choice of deployment: cloud, on-premises, or a hybrid environment. There can be major functional differences between cloud and on-premises analytics, such as the following.
Check whether your data sources can be accessed from the cloud, and determine whether this requires all your data to be moved to the cloud.
The platform should be able to be seamlessly deployed across combinations of on-premises, private cloud, and public cloud sites. Find out if the private cloud offerings are available and managed by trusted third parties.
You should also ascertain whether there is a full SaaS offering hosted by the vendor, capable of working in a multi-cloud setup.
Data and connectivity
Adopting native tools for data and connectivity can help reduce TCO. Other considerations include:
- Self-service data preparation for business users is available in the cloud environment
- There are more powerful ETL tools or scripting available for complex data integration, transformation, and modelling
- There is a broad set of connectors for file-based, on-premises, cloud, and web sources
- That all required data sources are accessible
- There is a complete and accurate catalogue of metadata associated with each data source
- The lineage of each dataset is preserved as the data is prepared, so a user can understand its origin, evolution, and meaning
- There is a global mechanism for offering governed data sources to users for analysis
- Many different data sources can be combined for analysis without data loss or inaccuracy
- Check if the data needs to be fully modelled and cleaned before it can be made available
- The data sources should continue to be up-to-date as changes occur to the underlying data
- See if full and incremental data reloads are scheduled or event-based
- The platform should be able to handle streaming data
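The full/incremental reload point above can be sketched with a hypothetical high-water-mark approach: fetch only the rows newer than the mark recorded after the previous load.

```python
# Incremental reload sketch (hypothetical tables): only rows newer than
# the high-water mark from the previous load are fetched and appended.
source = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-02-01"},
    {"id": 3, "updated_at": "2024-03-01"},
]

target = [{"id": 1, "updated_at": "2024-01-01"}]  # state after the last load
high_water_mark = max(row["updated_at"] for row in target)

# ISO-8601 dates compare correctly as strings.
new_rows = [row for row in source if row["updated_at"] > high_water_mark]
target.extend(new_rows)

print(len(new_rows))  # 2
print(len(target))    # 3
```

Whether this runs on a schedule or is triggered by an event in the source system is exactly the scheduled-versus-event-based distinction in the checklist above.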
Big (and small) data capabilities
Having the flexibility to scale will help deliver against your data pipeline strategy. You also need to ensure that:
- The platform can connect to a variety of data lakes and other very large data sources
- The platform can scale to massive data sets without sacrificing speed or flexibility of analysis
- There are facilities for user-driven, dynamic reduction of very large data sets for analysis
- Users can combine big data and “small” data, such as user-provided spreadsheets
- The platform can index big data, to support interactive exploration while leaving the data where it resides
Various accredited data vendors now offer Data as a Service, improving the insights that can be acted upon:
- Users can access and subscribe to third-party and syndicated data sources from within the platform
- Ensure you know the range of topic areas offered in the data service
- Check there is assistance when integrating a third-party data set with internal data
Advanced analytics and augmented analytics
Forecasting is key, and automating this function provides an enabler for change as part of your HyperAutomation roadmap.
As always integration is key.
- The platform integrates with advanced analytics engines (R & Python and Instant ML tools)
- Integration supports both batch and real-time integration, updating calculations as the user explores
- There are onboard, engine-level capabilities for generating insights using correlation, prediction, outlier identification, etc.
- There is an auto-suggestion of visualisations and insights based on the data
- Insight suggestions are context-aware, taking into account selections/search criteria as users explore
- Machine learning is available to enhance suggestions and analytical processes
- Natural-language generation and search are available
- Self-service data preparation is augmented with machine intelligence to assist users and automate processes
Governance and Deployment
Client and administration
User experience (UX) is one of the components to consider when looking at the tool itself. Also consider:
- The core analytics client can be zero-footprint – using HTML5/web technology or built with native code
- Ensure all clients (web, desktop, mobile) deliver the same analytics experience
- Check there is support for multiple languages and accessibility
- There should be a centralised management and administration UI available
- The management UI should allow for administering all assets, including apps, data sources, users, and workspaces
- The management UI should also provide access to all configurations, including tasks/scheduling, security, governance, deployment, and licensing
The platform needs to meet the needs of the business to ensure compliance with policies such as business continuity. Also consider:
- The architecture is modular and workload-optimized (containerised, microservices-based, etc.)
- There is support for high availability and failover
- Awareness of the physical location of the data and analytics content
- That there are import/export capabilities for moving content across environments
- The platform is a unified architecture without multiple/disconnected components
Security and governance
Persona-based access controls and data encryption at rest are the ideal. Other operational elements include:
- All analytics use cases in an organisation can be handled seamlessly within a unified, governed platform
- There are governed repositories of measures, dimensions, and analytics content
- Governed data sources are available for analytics use
- There are governed workspaces for teams and business functions
- There is a flexible, rules-based security model for all functionality
- Data security/reduction extends down to the row and column level
- There is auditing/usage analysis for analytics apps, content, data, and objects
- The platform integrates with third-party security and management tools
- App version control/integration is available
When it comes to scalability here are some key indicators to look for:
- Clear scalability benchmarks
- A statement that the solution scales to large numbers of concurrent users
- That the solution can scale to large data volumes
- That the solution can scale across geographies
- The solution offers clustering and load-balancing
- That the analytics engine can scale and still offer dynamic calculation without impacting performance or flexibility
The right BI solution will put you on the path to more agile business processes, new opportunities, and better customer relationships. Plus, with a platform that empowers everyone in your organisation to make discoveries in your data, you can drive increased data literacy and widespread Digital Transformation.