Problem: Suppose a dog named Tika is chasing a duck in a straight line. If the duck's speed is given by d'(t) = 5 feet per second and Tika's speed by T'(t) = 2t feet per second, how far has Tika traveled when her speed is equal to the duck's speed? If the duck gets a 100 foot head start, how far has Tika traveled when she catches the duck?

Figure: The Dog Tika Chasing a Duck

Tika's speed equals the duck's speed when 2t = 5, that is, after t = 5/2 seconds. To compute the distance she has traveled in this time, we integrate her speed from 0 to 5/2:

\int_0^{5/2} 2t\,dt = t^2\Big|_0^{5/2} = \frac{25}{4} = 6.25 \text{ feet}
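As a quick sanity check, the same integral can be evaluated symbolically. This is a minimal sketch assuming the SymPy library is available; it is not part of the original problem:

    from sympy import symbols, integrate, Rational

    t = symbols('t')

    # Distance Tika covers from t = 0 to t = 5/2 at speed T'(t) = 2t
    distance = integrate(2 * t, (t, 0, Rational(5, 2)))
    print(distance)  # 25/4, i.e. 6.25 feet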

To find how far Tika must run to catch the duck, we must find the functions that give the distance traveled by Tika and by the duck in the first t seconds. These are just antiderivatives of the speed functions: d(t) = 5t and T(t) = t². Since the duck gets a 100 foot head start, we should solve the equation 100 + 5t = t² for t. Rearranging gives t² - 5t - 100 = 0, and the quadratic formula (keeping the positive root) yields t = (5 + √425)/2 = (5 + 5√17)/2 ≈ 12.81 seconds. Substituting into T(t), we find that Tika must run a total of about 164 feet.
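The catch-up calculation can also be sketched numerically, again in Python (standard library only; the variable names are just for illustration):

    import math

    # Duck's position: 100 + 5t (100-foot head start, constant 5 ft/s).
    # Tika's position: t², the antiderivative of her speed 2t.
    # She catches the duck when t² - 5t - 100 = 0; keep the positive root.
    t_catch = (5 + math.sqrt(5**2 + 4 * 100)) / 2
    print(t_catch)       # ≈ 12.81 seconds
    print(t_catch ** 2)  # Tika's distance: ≈ 164.04 feet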