By the numbers
After implementing a new analyst onboarding program in 2015:
- 92 percent of trainees who completed the program rated all sessions as effective or better
- 35 percent of trainees are from other analytics groups
- 83 percent of trainees have completed the program
- 100 percent of trainees contribute to projects within one month of starting
In 2014, Children's Hospital of Philadelphia (CHOP) launched its Office of Clinical Quality Improvement with two analysts. Vaidehi Mehta was one of the first hired, and now she is one of the managers of a team of 60 analysts. Early on, Mehta saw the role change. No longer were analysts just number crunchers; they needed to play the part of storyteller—someone who could transform raw numbers into actionable information.
But as her team started to grow, Mehta, manager of Clinical Data and Analytics at CHOP, says it became apparent that traditional "analyst mentor" onboarding wasn't effective. There was a problem with retention. New hires reported they were learning techniques that differed from their peers', and they felt ill-prepared when onboarding ended.
Mentors were equally frustrated: they felt overburdened and had no standard methodology to teach. Something had to change. What started as a quality improvement experiment at CHOP has grown into a well-functioning, productive department now serving several areas of the organization. Here, she explains moving from onboarding analysts to assembling them.
Why did you change the onboarding process?
We realized standardization is more important than pairing each new hire with an individual analyst. Each analyst mentor taught slightly different things, and new hires felt that shadowing, rather than actually practicing the tools, left them unprepared when they had to take on a new project.
How do you define this new breed of analyst?
Traditional back-end analysts are great at crunching numbers, but they often don't need much context about where a problem is coming from. The new breed of analyst is someone who can unlock the data by understanding the clinical question and problem at hand.
This analyst actively listens to problems and uses that information to translate clinical context into specific data elements. We look for smart, curious, analytic-minded people who are passionate about health care and the domain of clinical science, and interested in using data to solve problems.
What were key changes you made?
We created a development team where our experienced analysts teach sections of the onboarding program. The makeup of that team includes:
- Four analysts from the Improvement Analyst team
- Experience ranging from six months to a year and a half
- Varying backgrounds and previous experiences
- Different comfort levels with tools and techniques
- Varying understanding of challenges facing new analysts
Over a five-week period, trainees move through "Content Weeks" to sequentially build skills using a past quality improvement project. A different analyst teaches each week, exposing the trainee to more staff members.
How did you decide to develop onboarding internally and not through HR?
We needed a way to teach both the technical skills and the soft skills; the two are intermingled. It was hard to separate them and outsource the training to someone who doesn't know the details of our specific role.
Each of the tools we use is nuanced, and we needed to build a program that clearly showed which tool or technique to use in each project phase. Soft skills are general to many roles, but because those skills are so entangled with the technical aspect, it's difficult to push that piece to HR training.
Describe the pod system.
As our team grew, we started to realize we were losing some of the benefits that come with having a small team. After onboarding, each analyst joins a "pod" of four to six people who are knowledgeable about a subset of project work. A pod is a space to work through detailed technical discussions by brainstorming or actively testing an approach to problem solving. It's a first line of defense.
Did the department's transformation catch on?
Over time, we showed success with quality improvement methodology, and individual departments decided they wanted dedicated support. We expanded rapidly, and at the same time we started to identify top-down initiatives: analysts look across a large expanse of data to see whether there's a problem we can proactively identify. With these approaches, we can attack a problem from two directions.