A radio signal travels at 3.00 • 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 • 10^7 meters? Show your work.

would the answer be:
rate = 3.00 × 10^8 m/s
distance = 3.54 × 10^7 m (the problem gives 3.54 × 10^7, not 3.6 × 10^7)
time = distance / rate = (3.54 × 10^7) / (3.00 × 10^8) ≈ 0.118 s ???
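A quick sanity check of the time = distance / rate arithmetic in Python (the variable names here are just my own labels):

```python
# Values taken from the problem statement.
speed = 3.00e8    # speed of the radio signal, in m/s
height = 3.54e7   # height of the satellite above Earth's surface, in m

# time = distance / rate
travel_time = height / speed
print(round(travel_time, 3))  # 0.118 seconds
```

Since 3.54 / 30 = 0.118 exactly, the signal takes about 0.118 s (a bit over a tenth of a second) to reach the surface.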