Say you need to buy a car. My guess is that you would do a few things. You would probably watch for that particular model on the road, talk to friends who own the model you are considering, and take a test drive. Then, most likely, you would consult a resource like Consumer Reports to learn about the vehicle’s reliability. Finally, with that information in hand, you would decide whether to purchase that model or pursue another one.
Okay, now you’re a member of a PA faculty unit and a question arises: how does your program make decisions? You would probably take steps similar to those for buying the car. You might bring up the item informally with your peers in the program, reach out to a peer in another program who can provide a different perspective, or discuss it formally at a meeting where you can talk through potential answers. Inevitably, though, someone will ask, “Is there any data on this somewhere?”
That “somewhere” could be as close as a PAEA survey report!
Several times a year, PA programs are asked to complete PAEA surveys, some of which are required while others are voluntary. Survey data are collected by the PAEA Research Team, which then develops the reports and makes them available on the PAEA website. These reports provide a rich source of current information that program directors and faculty can use to support program decisions.
Because faculty are often inundated with survey requests, we thought it might be helpful to provide a brief look at each of the PAEA survey instruments and overall program response rates to illustrate the importance of completing PAEA-specific surveys.
The Program Survey. This is an annual survey sent to program directors, usually in early summer. Data include general and financial information as well as information on program personnel and students. Generally completed by the program director with input from faculty and staff, this survey also includes questions associated with the Support to Advance Research (STAR) Program. Program response rate: 100%.
The Curriculum Survey. This annual survey is also sent to program directors in early summer. However, unlike the program survey, which asks the same questions each year, the curriculum survey rotates its focus each year among didactic curriculum, clinical curriculum, and admission prerequisites. This means that each area is captured in a rotating three-year snapshot rather than annually. It’s not uncommon for the program director to delegate these surveys to the faculty member most closely associated with that aspect of the program. Program response rate: 100%.
Faculty and Directors Survey. Formerly included as part of the program survey and completed by the program director, this component was broken out into its own survey in 2014 to improve data validity. Usually deployed in the spring, the survey is distributed by program directors, who forward a link to all applicable faculty to complete individually. It collects data on demographics, job satisfaction, roles and responsibilities, and salary directly from PA faculty, program directors, and medical directors. Program response rate (defined as programs with at least one respondent): 89%. Individual overall response rate: 60%.
Student Surveys. Two student surveys are reported in the Student Report: 1) the Matriculating Student Survey (MSS), and 2) the End of Program Survey (EOPS). Unlike the other surveys, which come out at regular intervals, these surveys are deployed to coincide with the entry and exit of each program’s student cohort. PAEA Research Team staff monitor these dates and notify the program director via email, who then forwards the survey link to students for completion. Program directors may delegate the task of overseeing students’ survey completion to another faculty member. Some of the data collected on the two surveys overlap, including demographics, student health and well-being, future practice, and financial information. Other data are specific to student characteristics at entry to or completion of the program.
The most recent Student Report was released this past August. The student surveys have the lowest response rates of all the PAEA surveys, and those rates vary widely. The MSS most recently had a 73% overall program response rate, and the EOPS had a 67% overall program response rate (both defined as programs with at least one respondent). Although program participation may appear high, participation within programs is far lower, with only 4,624 students completing the MSS and 3,188 students completing the EOPS.
Like any survey, these reports are only as accurate as the data that programs collect from their students. Data collection may be hindered by concerns about privacy, the perception that a voluntary survey is unimportant, or simple survey fatigue among students or faculty at the time of deployment. However, we encourage programs to frame student participation as a professional responsibility for the greater good, one that will also help programs respond to future student needs.
Putting It All Together
If your program has a unique approach for increasing participation in voluntary surveys, we would like to hear about it. With everyone’s participation, PAEA survey reports could become your “Consumer Reports” for answering questions with up-to-date information.