Did You Know? Women were the pioneers of physical therapy in the United States! The profession gained official recognition during World War I, when female civilian employees of the U.S. Army were tasked with rehabilitating injured soldiers, primarily through massage techniques.