When two equal-sized drops collide in a two-dimensional extensional flow of a second immiscible fluid, the time required to drain the thin film between the drops prior to coalescence is referred to as the drainage time. In a previously proposed scaling theory [H. Yang et al., Phys. Fluids 13, 1087–1106 (2001)], we found that in the low-Ca regime, the dimensionless product of the film drainage time (t_d) and the applied shear rate (G) should scale as Ca^(1/2). Yet recent numerical simulations contradict this result, showing that the dimensionless drainage time (t_d G) in this regime scales as Ca. Furthermore, existing experiments suggest that the drainage time may become independent of Ca in the limit Ca ≪ 1. In this paper, we attempt to resolve these apparent contradictions. First, we carry out coalescence experiments in a four-roll mill for significantly smaller drops than have heretofore been studied. Our results show that as the drop radius R is decreased for a fixed Ca range, the scaling exponent m in the correlation t_d G ∼ Ca^m falls in the range 1 ⩽ m ⩽ 4/3, but never exhibits a value smaller than 1. Thus, we corroborate the numerically predicted scaling of t_d G with Ca in the low-Ca regime. We then reexamine the scaling theory and find that the disagreement between scaling theories and the numerical simulations (as well as the present experiments) ultimately emanates from a fundamental limitation in the definition of the drainage time. Finally, our experiments show that the scaling exponent unexpectedly increases when the viscosity ratio is increased from λ = 0.19 to λ = 6.8 for drop radii smaller than 27 μm. We show that one must evidently account for interfacial “slip” between the drops and the surrounding film to explain this observed increase in m. We define a slip parameter that gives an a priori estimate of the importance of slip in the experimental data.
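The correlation t_d G ∼ Ca^m implies that the exponent m can be extracted from a linear fit of log(t_d G) against log(Ca). A minimal sketch of that extraction (the data here are synthetic, generated with a known exponent; this is not the paper's actual data or analysis code):

```python
import numpy as np

def fit_scaling_exponent(ca, td_g):
    """Fit m and prefactor C in t_d*G = C * Ca^m by least-squares
    regression in log-log coordinates. Returns (m, C)."""
    log_ca = np.log(np.asarray(ca, dtype=float))
    log_td = np.log(np.asarray(td_g, dtype=float))
    m, intercept = np.polyfit(log_ca, log_td, 1)
    return m, np.exp(intercept)

# Synthetic illustration: 20 points in a low-Ca decade, built with m = 1.2
# (a value inside the experimentally observed range 1 <= m <= 4/3).
ca = np.logspace(-3, -2, 20)    # hypothetical capillary numbers
td_g = 5.0 * ca**1.2            # hypothetical dimensionless drainage times
m, c = fit_scaling_exponent(ca, td_g)
print(round(m, 3))              # → 1.2
```

With real, noisy drainage-time measurements the fitted m would carry uncertainty, and comparing it against the candidate scalings (1/2, 1, 4/3) is what distinguishes the competing theories.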