As a first-year graduate student in the Master of Public Affairs (MPA) program, I have found that the Truman School provides new students with scores of opportunities to learn about the policy process and to gain hands-on experience relevant to their career trajectories. As a student on the "policy track," public policy analysis and research form an important part of my experience here at TSPA. Working at the Institute of Public Policy (IPP) as a Graduate Research Assistant has given me real-world policy experience as well as important skills that complement my MPA coursework.
First, IPP has given me experience synthesizing large amounts of material into analyzable parts and themes. None of my prior jobs required me to deliver information to supervisors or clients that was "short and sweet," but my GRA work at IPP has demanded informed, research-supported opinions on a short timeline. For example, my background research for a brief on the impact of law enforcement training required me to jump headfirst into academic literature and case studies on policing, to discern which findings were most valid and applicable for our client, and to deliver them to my supervisor and other staff in a timely fashion. While literature reviews are not rocket science, the social sciences are cluttered with hypothesized relationships that are constantly being validated, half-supported, or entirely discredited. Working at IPP has made me a better researcher by giving me more experience sifting through academic journals and government reports for the most important, credible, and fact-based takeaways.
Assisting at IPP has also helped me develop models and frameworks for understanding policies and processes. While data cleaning and data entry are important parts of the work we do as GRAs, much of my work at IPP has involved qualitative, big-picture thinking. For example, in the law enforcement training project mentioned above, background research revealed a surprisingly limited number of measures and metrics for determining the impact of training on peace officers. In response, my supervisor, a policy analyst, and I spent nearly a month piecing together a comprehensive, evidence-based model to help our client understand the outcomes of law enforcement training and how police departments or researchers can measure impact. Experience in this kind of model-building and framework construction has had immediate payoffs in my academics, coming in handy for policy process projects, evaluations of neighborhood empowerment programs, and developing a deeper understanding of topics like collaborative governance and organizational dynamics.
Finally, working at IPP has underscored the importance of measuring impact in my work. All of the projects I have had the chance to work on at IPP involve delivering some value to the public (e.g., reducing prison re-entry, improving a community's trust in police, and improving sexual education). Yet demonstrating impact is often incredibly difficult. Not all measures are easily quantified, monetized, or even agreed upon. Consider community trust in local law enforcement. Should it be measured by survey data, the percentage of cases solved by local departments, or the number of civil liability complaints against officers? While there is no clear consensus in academia, each of these measures evidently carries some potential value. If local departments use survey data to measure trust, it becomes difficult to estimate cost savings. On the other hand, if law enforcement uses civil liability suits to measure impact, it captures only the avoidance of negative interactions with police, not the presence of positive ones. This illustrates an important part of what I have learned at IPP: measuring impact is difficult, yet critical for policy analysis.
In sum, IPP has given me a deeper appreciation of the research process, including its bends, turns, stressors, and eventual product. While solid research is hard to produce, its potential impact makes the process a reward in itself.