Optimizing Terminal Hygiene at a Major International Hub
The Challenge
A leading international airport serving more than 45 million passengers annually operates across 14 terminals, each with distinct layouts, passenger volumes, and vendor contracts. The operations team was responsible for holding three separate cleaning vendors accountable to service-level agreements (SLAs) covering more than 400,000 square feet of public-facing space.
On paper, the program looked sound. Vendors submitted daily completion reports, supervisors signed off on weekly summaries, and quarterly audits produced scores that, more often than not, came back in the acceptable range. In practice, the picture was considerably messier.
Passenger satisfaction surveys showed inconsistent hygiene scores. Restroom conditions varied sharply between terminals and across shifts. When the operations leadership team tried to trace specific incidents back to service records, the investigation routinely hit a wall: vendors had documentation showing the work was completed, but there was no independent record to validate the claim.
The core problem was not bad vendors. It was a system that created no reliable mechanism for independent verification. Without that, every dispute became a conversation about competing narratives rather than a review of objective data.
What the Operations Team Found
Before deploying any new technology, the operations team conducted a structured audit of the existing oversight process. Over a four-week period, inspectors walked 60 randomly selected shifts across all 14 terminals and recorded findings independently of vendor reports.
The results were instructive. In roughly 28% of the cases reviewed, the independent inspector found discrepancies between what the vendor reported and what was actually observed on the ground. In some cases, tasks marked complete had not been performed. In others, the timing was significantly off: services were logged as completed during peak passenger hours when the physical evidence suggested the work had actually been done well outside the service window.
The audit also revealed a secondary problem: even when vendor performance was genuinely good, there was no way to demonstrate that to internal stakeholders or to the airport's partners. Good performance and bad performance looked identical in the existing reporting framework, because neither was independently verified.
The Solution
The airport implemented GPS-verified digital inspections across all 14 terminals, with a phased rollout beginning with the two highest-traffic international concourses. Inspectors used mobile devices to conduct structured assessments, with each data point timestamped and geotagged at the point of collection.
The key design principle was that no data entered the system through the vendors themselves. All inspection scores, photo evidence, and completion records came from an independent operations team using a standardized rubric. Vendors continued to submit their own records, but those records now sat alongside independent data in a unified dashboard.
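To make that design concrete, the following is a minimal sketch of what an independently collected inspection record might look like. The InspectionRecord name and every field in it are illustrative assumptions based on the attributes the case describes (score, timestamp, geotag, photo evidence), not the airport's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative schema only: the field names are assumptions, not a real system's.
@dataclass(frozen=True)
class InspectionRecord:
    terminal: str                     # e.g. "C"
    category: str                     # rubric category, e.g. "restroom_cleanliness"
    score: int                        # 0-100 on the standardized rubric
    inspector_id: str                 # independent operations inspector, never a vendor
    timestamp: datetime               # timestamped at the point of collection
    lat: float                        # geotag latitude
    lon: float                        # geotag longitude
    photo_refs: tuple[str, ...] = ()  # references to attached photo evidence

# Example record, as an inspector's device might emit it.
record = InspectionRecord(
    terminal="C",
    category="restroom_cleanliness",
    score=78,
    inspector_id="ops-117",
    timestamp=datetime.now(timezone.utc),
    lat=12.3456,                      # placeholder coordinates
    lon=-65.4321,
    photo_refs=("photos/insp-00123.jpg",),
)
```

Because the record is immutable and stamped at the moment of collection, vendor-submitted reports can sit alongside it in the dashboard without either feed being able to alter the other.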
Several elements of the implementation proved particularly valuable:
Standardized scoring across terminals. Before this rollout, different terminals used different rubrics and different supervisors applied different standards. The new system defined precise criteria for each inspection category, from restroom cleanliness to waste bin fill levels, and applied those criteria uniformly. For the first time, a score of 78 in Terminal C meant exactly the same thing as a score of 78 in Terminal H.
Real-time exception alerting. When an inspection score fell below a defined threshold, the relevant vendor supervisor and the operations duty manager both received immediate notifications. This replaced the previous process of discovering issues in the next day's summary report, often hours after the problem had already affected passengers. A brief code sketch of this alerting logic appears below.
Shift-level trend visibility. The dashboard made it possible to see performance not just by terminal but by shift. This surfaced a pattern that had been invisible in aggregated reports: one vendor's performance was consistently strong on day shifts and significantly weaker on overnight rotations, where supervision was lighter.
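As referenced above, here is a minimal sketch of how threshold alerting and shift-level aggregation could work over records shaped like the earlier InspectionRecord sketch. The threshold value, the three-shift boundaries, and the notify hook are all assumptions for illustration.

```python
from collections import defaultdict
from statistics import mean

ALERT_THRESHOLD = 70  # assumed cutoff; the real value is a policy decision

def check_alert(record, notify):
    """Fire an immediate notification when a score falls below the threshold."""
    if record.score < ALERT_THRESHOLD:
        notify(f"Terminal {record.terminal}: {record.category} scored "
               f"{record.score} at {record.timestamp:%H:%M} UTC")

def shift_of(ts):
    """Map a timestamp to an assumed three-shift rotation."""
    if 6 <= ts.hour < 14:
        return "day"
    if 14 <= ts.hour < 22:
        return "swing"
    return "overnight"

def mean_score_by_shift(records):
    """Average scores by (terminal, shift) to surface patterns like the
    day-versus-overnight gap described above."""
    buckets = defaultdict(list)
    for r in records:
        buckets[(r.terminal, shift_of(r.timestamp))].append(r.score)
    return {key: mean(scores) for key, scores in buckets.items()}
```

In practice, check_alert would run on ingest, with notify wired to a paging or messaging service, while mean_score_by_shift would feed the dashboard's trend view.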
Results
Within the first six months of full deployment, the airport saw measurable improvements across every tracked metric.
Missed service events, defined as documented instances where a scheduled task was not completed within its service window, dropped by 41%. The reduction came partly from improved vendor performance and partly from better detection: events that would previously have been logged as completed regardless of actual status were now being flagged accurately.
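The detection logic implied by that definition can be sketched in a few lines. The function below joins a vendor's schedule against independently verified completions; its name, data shapes, and tolerance parameter are illustrative assumptions, not the airport's actual pipeline.

```python
from datetime import timedelta

def missed_events(scheduled, verified, tolerance=timedelta(0)):
    """Return scheduled tasks with no independently verified completion
    inside their service window.

    scheduled: iterable of (task_id, window_start, window_end) tuples
    verified:  dict mapping task_id -> list of verified completion datetimes
    """
    missed = []
    for task_id, start, end in scheduled:
        in_window = [t for t in verified.get(task_id, [])
                     if start - tolerance <= t <= end + tolerance]
        if not in_window:
            missed.append(task_id)
    return missed
```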
Vendor dispute resolution time fell from an average of nine days to under 24 hours. With GPS coordinates, timestamps, and photo evidence attached to every inspection record, it became possible to resolve most disputes by reviewing the data rather than negotiating between conflicting accounts. The vendors themselves adapted quickly to this new environment, because the data protected them as much as it held them accountable. When a passenger complaint came in claiming a terminal had been dirty at a specific time, vendors could now point to inspection data showing their team had serviced that area on schedule.
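The lookup that settles such a complaint is equally simple to sketch; records_near and its parameters are hypothetical, but they illustrate the pattern of answering a dispute with a filtered query rather than a negotiation.

```python
from datetime import timedelta

def records_near(records, terminal, complaint_time, window=timedelta(hours=1)):
    """Pull independent inspection records for a terminal within +/- window
    of a complaint, so the dispute is settled from data rather than memory."""
    return [r for r in records
            if r.terminal == terminal
            and abs(r.timestamp - complaint_time) <= window]
```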
The unified cross-vendor dashboard also changed the nature of the quarterly vendor review meetings. Instead of reviewing aggregate scores that told an incomplete story, operations leadership could now walk through specific incidents, trend lines by shift and terminal, and comparative performance across vendors. The conversations became more substantive, and vendors began bringing their own improvement plans to the table rather than responding defensively to complaints.
Key Takeaway
The most significant shift at this airport was not technological. It was relational. Independent verification changed the structure of the vendor relationship from one built on trust in paperwork to one built on shared access to objective data.
When both parties see the same numbers, disputes shorten, accountability increases, and there is room for something more productive than defensiveness. Vendors who perform well have evidence to show for it. Vendors who fall short cannot obscure that fact, but they also cannot be blamed for things they did not do.
The operations team came away with a clear principle: quality oversight is only as credible as the independence of the data behind it. Self-reported compliance is not compliance. It is a representation of compliance. The two are not the same, and no number of well-designed SLAs can make them equivalent without an independent verification layer.
This case represents a hypothetical composite based on patterns common across large aviation operations. No specific airport is identified.
Ready to see IQS Flow in action?
See how independent quality intelligence transforms vendor oversight.
Request a Demo