Optimized data helps Vancouver Airport save time.
Need to optimize your data? Read on.
Even though passengers have been flying on planes for more than 100 years, there’s still something exciting about it. Apart from it being an amazing technical feat, there’s a thrill in knowing that within a few hours, you could be in any number of places, on any number of adventures.
On the ground, however, a massively complicated series of coordinated efforts is constantly underway to make this a reality and ensure everything runs smoothly and safely.
This is the scene at Vancouver International Airport (YVR), Canada’s second-busiest airport and North America’s natural gateway to Asia. We’ve won several international awards, including being named Skytrax’s Best North American airport for 10 years running. My role as Lead Data and Analytics Architect is to make sure everyone in the organization, as well as our partners, get the most out of our data. By doing so, we can make everyone’s time at our airport as efficient and enjoyable as possible.
Before 2013, YVR didn’t do much in the way of business intelligence (BI). It was more like reporting, and it wasn’t holistic. In 2013, however, we advanced our strategy, taking more of an enterprise approach to building our BI program.
We started with Qlik, which became foundational to our entire architecture. We used QlikView for all the data extraction and transmissions and Qlik Sense for servicing our BI content. Eventually, we sourced data from 35 different systems and built about 40 different dashboards that covered all facets of the organization, from revenue streams to cost maintenance, from marketing to health and safety. Qlik allowed us to rapidly build all these systems in an iterative, agile way, but we still viewed this as just the foundation for our next phase.
This takes us to 2018, when we re-assessed everything in a new strategic plan. There were multiple business and technological factors driving the decision to take our hub strategy to the next level. We want to be the number one hub airport between Asia and North America, but we aren’t the only ones with that goal.
Like us, our competitors have their sights on the lucrative Asia market. Until only a few years ago, we had a competitive advantage in being the closest physical location to Asia in North America. Advances in aircraft technologies disrupted that strategy, however, and we now have to rely on other factors to compete.
In the industry we have something called minimum connection time (MCT), which measures how fast you can turn around planes and passengers. The lower your MCT, the more money airlines make, and the more airlines will serve your airport. At the same time, the airport has to physically grow to keep up with this demand. We're expanding as fast as we can, but construction is a lengthy process.
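To make the metric concrete, here is a minimal sketch of the idea behind MCT: a connection is only viable if the layover is at least the airport's minimum connection time. The 45-minute threshold and the timestamps below are invented for illustration, not YVR's actual figures.

```python
from datetime import datetime, timedelta

# Assumed threshold for illustration only; real MCTs vary by airport,
# terminal, and domestic/international routing.
MCT = timedelta(minutes=45)

def connection_ok(arrival: datetime, departure: datetime,
                  mct: timedelta = MCT) -> bool:
    """Return True if the layover meets the minimum connection time."""
    return (departure - arrival) >= mct

arrive = datetime(2024, 6, 1, 14, 10)
depart = datetime(2024, 6, 1, 15, 5)   # 55-minute layover
print(connection_ok(arrive, depart))   # → True
```

Driving MCT down means more itineraries like this one clear the threshold, which is why the metric matters so much to a hub airport.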
So how do we drive down our MCT while processing all these additional passengers within the same physical constraints of the airport? By using data, of course.
Around the same time we developed our new strategic plan, we started hiring a new breed of analyst who pushed our organization to be more data-driven. I affectionately call them “data junkies,” and their demands for data were very different from the status quo at YVR. When training them on our various dashboards, they would say, “This is great, but I need to get to the data behind these visualizations.” The thing was, our architecture didn’t allow for that access.
The airport is growing significantly year over year, and data is one of the key things we can use to drive efficiency in not just passenger, bag, and plane movement, but the security, customs, and check-in processes as well. Our old architecture didn’t gather this data quickly enough for our analysts, and it wasn’t scalable.
Alongside our physical growth, we witnessed massive data growth, and not just because of the increased number of bags, aircraft, and people. Everything we do has a software component that generates events. Every time we updated to a new system, like when we overhauled our baggage system in 2015, it generated more data than our previous system. We recently went live with a system that tracks aircraft movement every few seconds. That generates billions of events.
The systems are so interconnected that sometimes they aren’t even interacting with humans; they’re system-to-system or system-to-sensor. We had to figure out how we were going to scale to work with all this interconnected data, and in real time so it was valuable to the people who run the airport on a daily basis.
Using these business and technical inputs, we built a technical architecture framework so we could go out into the marketplace and find the solution we needed. That framework gave us some guideposts. Even though this was early 2018, it was a different world in terms of available solutions. There were established vendors, such as Oracle and IBM, who had some technology in analytics, but not a complete set, and they were legacy-based. There were open-source vendors working around this new wave of technologies. Then you had your cloud vendors.
We needed a solution that would truly put data at the core of everything we do at YVR and that would allow us to build a real data hub: a central location where every area of the organization can draw on for any use case.
Our BI team is lean, so I had to select technologies that didn't require an army of system administrators and performance tuners. We chose to use POCs instead of RFPs. I think RFPs tend to be about checking boxes, and vendors will claim to be capable of a lot of things only for some significant gaps to emerge once you go into production. With BI, you need to see it, in your own use case, to believe it. From our POCs, we confirmed the technologies we needed to build up the data hub, and we felt the strongest move was to go deeper with Qlik.
We liked Qlik’s evolution from being a product to an entire platform. They are building this entire stack from ingestion to the presentation analytics layer—and integrating everything with other technologies. The direction they were heading was exactly in line with our vision.
What we have built with our data hub is an architecture that connects everything, because under our new operating model, data is a shared company-wide asset. We want to embed data intelligence in everything we do, so the hub is designed to supply data to whoever, or whatever, in our organization needs it. We are using Qlik to replicate all our data into various zones, so our analysts can work without going back to, and potentially compromising, the source system.
One of these zones is what we call the Raw Zone, where we’ll keep our raw data until the end of time. Storage is cheap, right? Data is too precious a resource to delete just because the use case hasn’t presented itself yet. You can’t recreate data once it’s gone, and every attribute of every table is critical—maybe not today, but months or even years from now. Then we are using Qlik Sense for our corporate dashboard tool, as well as self-service analytics.
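The defining property of a Raw Zone is that it is append-only: events land with their original payload untouched and are never updated or deleted. This is a minimal sketch of that pattern, assuming an in-memory list as a stand-in for whatever durable storage the hub actually uses; the source names and fields are invented for illustration.

```python
import json
import time

def land_event(store: list, source: str, payload: dict) -> dict:
    """Append a raw event to the zone; never mutate or delete records."""
    record = {
        "landed_at": time.time(),  # when the hub received the event
        "source": source,          # originating system, kept for lineage
        "payload": payload,        # raw event, stored exactly as received
    }
    store.append(json.dumps(record))  # append-only: no updates, no deletes
    return record

raw_zone = []
land_event(raw_zone, "baggage_system", {"bag_id": "B123", "status": "loaded"})
print(len(raw_zone))  # → 1
```

Because every attribute is kept, a use case that only emerges years later can still be answered from the original records.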
The most important dashboard we’ve created is our capacity management dashboard, displaying the efficiency of how our airport processes passengers according to that MCT metric—the most critical metric to our airport’s new business strategy. In our pre-data hub era, it took 33 hours to run this query, so we would only produce an MCT report four to six times a year. Today, we run this same query every night, and in fact we could run it every hour if we wanted. The beauty of this dashboard is we can drill right down to look at the lowest sub-processes at any time of day to see how we can better optimize. And producing this dashboard didn’t require anything of the BI team.
We’ve even harnessed data for gate optimization and situational awareness. Typically, our gate management system assigns gates based on parameters such as wind or size of aircraft, but there are additional data sets we could utilize to intelligently assign a flight to a gate.
Let’s say we have a flight coming into YVR from Hong Kong, and we know that 30% of its passengers are connecting to LAX. Why can’t we use that extra data point to assign a gate that reduces the walk time for that connecting flight? Meanwhile, the situational awareness dashboard shows how we’re doing in terms of loading bags onto that connecting flight. By showing in real time the flights that are at risk of being delayed because of slow baggage loading, we can make adjustments to ensure the plane is in the air on time.
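The gate-assignment idea above can be sketched as a simple scoring problem: weight the walk time from each candidate arrival gate to each connecting flight's gate by the share of passengers making that connection, and pick the gate with the lowest expected walk. All gate names, walk times, and passenger shares below are invented for illustration; they are not YVR's gate management logic.

```python
# Walk time in minutes between gate pairs (assumed values).
walk_minutes = {
    ("D1", "D3"): 4, ("D1", "E7"): 12,
    ("D5", "D3"): 6, ("D5", "E7"): 8,
}

# Share of arriving passengers connecting onward via each gate,
# e.g. 30% of this flight connects to a departure at gate D3.
connections = {"D3": 0.30, "E7": 0.10}

def expected_walk(arrival_gate: str) -> float:
    """Passenger-share-weighted walk time from a candidate arrival gate."""
    return sum(share * walk_minutes[(arrival_gate, gate)]
               for gate, share in connections.items())

best = min(["D1", "D5"], key=expected_walk)
print(best)  # → D1 (expected walk 2.4 min vs 2.6 min for D5)
```

In practice the same score could be one input among the existing parameters such as wind and aircraft size, rather than the sole criterion.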
As I mentioned, our BI team here at YVR is quite lean. It’s doubled from being just myself and my director, Bernie, to a four-person team that includes a lead for our data governance and user adoption programs, and another person dedicated to Qlik development.
One of the things we’re most excited about is Qlik Catalog. We have a lot of data going into this hub, and while we’ve been leveraging spreadsheets and Qlik Master Library to catalog some items, we need something more scalable. Qlik Catalog is going to keep the business from getting lost in all that data.
Qlik is a strategic partner in advancing YVR’s analytics program. The way Qlik has grown from a product into a platform nicely mirrors our growth as an organization. There’s a lot of synergy in our relationship, and I can’t wait to see the evolution of what we’re going to build. We’ve also relied heavily on GINQO, our local Qlik partner. Their team of Qlik specialists has been instrumental in helping me build out these solutions. Our transformation truly was a team effort.
Together, we’re going to make sure that YVR remains North America’s top airport, and that passengers have a swift journey through our hub on the way to their next adventure.