Practice Plan Management | Process-Centered Management Comes to Health Care


By David Hefner, Harry Bloom, and Dwight Monson

This article explores how four academic institutions have successfully implemented process-centered management and the benefits achieved. These examples provide practical approaches to back up the buzzwords.

Over the last decade, two lessons particularly important for the health care industry have been learned, and relearned, in corporate America:

1) customer satisfaction is best managed around end-to-end processes, not in fragmented departmental encounters; and 2) sustainable gains in efficiency and growth are realized by optimizing end-to-end processes, not functions. Health care provider organizations have begun to grapple with these powerful concepts, with greater focus on managing total patient stays around service lines and major support processes (see Figure 1). Predictably, the primacy of process orientation threatens traditional, department-focused leadership. Nevertheless, a growing body of evidence suggests that process-centered operations result in higher-quality outcomes for patients and lower costs for payers. It is wise to ask: What lessons can the medical profession draw from experiences in other industries, and how can the transition to process-centered management be accelerated while preserving many of the benefits of departmental structures?

Process-Centered Management in the Business World

Back in the 1980s, increased foreign competition, deregulation, quantum leaps in information technology, and more demanding customers presented challenges and opportunities to U.S. industry. Leaders in corporate America came to recognize the importance of organizing around core processes for greater efficiency and better service to customers. A frenzy of reengineering (or radical process redesign) appeared between the late ’80s and the mid-’90s. Despite well-publicized failures of reengineering, it is clear that intense focus on redesigning core work “processes” and integrating these redesigns with information technology has benefited the competitiveness of American industry in the last half of this decade. In fact, industry’s preoccupation with work redesign and technology has arguably helped fuel the longest peacetime cycle of sustained expansion for the U.S. economy in this century.

Today, there is general recognition that traditional, functionally organized management alone is simply not adequate to sustain a competitive cost position and high customer satisfaction. Leading organizations such as Procter & Gamble, American Standard, GTE, and Citibank have markedly shifted the focus of management to define “process-focused teams.” These teams utilize end-to-end performance measures and operational clout in actively guiding outputs across departmental barriers to meet and exceed the expectations of customers. In many cases, functional departments have not been dismantled; rather, they have been transformed into specialized centers of excellence oriented to the broader goals of the organization through process management. In its most developed state, financial performance for the entire organization is being measured by process, in addition to more traditional revenue and cost centers.

The Role of “Process Champions”

In this transformation, a new role, the “process champion” or “process owner,” has been defined as part of a matrix organization. At Procter & Gamble, the “product delivery process champion” ensures that key customers such as Wal-Mart receive exceptional service with a single point of customer contact. The GTE “customer service process champion” makes certain that telephone service is activated promptly for new customers. At Progressive Insurance, the “claims process champion” ensures that claims are settled quickly and efficiently, often at the accident site and always to the delight of the insured. In each case, process champions are accountable for achieving process goals, cutting across departmental lines to meet or exceed the performance expectations of customers.

Process-Centered Management in Healthcare: Four Vignettes

Recently, similar success stories have begun to emerge in health care institutions. Here, we will illustrate the experiences of four health care providers with process-centered management. While each of the institutions has made the transition to a process focus to a differing extent, all have used this powerful technique to break through traditional barriers and achieve higher performance for patients, while lowering costs.

CASE STUDY #1

HENRY FORD HEALTH SYSTEM’S SUPPLY CHAIN PROCESS

Henry Ford Health System, long regarded as a progressive model of organizational and operational integration, is a giant health system comprising several large hospitals, a medical group of 800 physicians, an expansive ambulatory care network, and a market-leading health plan. In response to continued pressure on reimbursement, Henry Ford has mounted a concentrated effort to reduce non-labor expenses and improve cash flow by optimizing two key processes: supply chain management and billing and collection.

For both processes, Henry Ford’s senior leadership has set improvement targets, defined key sub-processes, and appointed process champions from among respected administrative and clinical leaders. These process champions have been asked to galvanize their peers across traditional departmental and organizational boundaries, questioning past practices, identifying production bottlenecks, assessing options for improvement, and advancing new models of organizational and operational design.

Selecting the right kind of leadership to lead these efforts is critical. For example, Dr. Douglas Weaver, Department Chair of Cardiology and the champion for the “Manage Demand” sub-process within the supply chain process, has a two-pronged program:

■ work with clinical leadership to standardize duplicative products (e.g., stents, pacemakers, catheters, and contrast media) and
■ negotiate with a small number of suppliers across multiple product lines to achieve highly significant price reductions.

Dr. Weaver has managed to engage his physician colleagues in a way that administrative leadership was unable to achieve.

CASE STUDY #2

WINTHROP/SOUTH NASSAU HEALTH SYSTEM’S CUSTOMER-FOCUSED PROCESSES

Roughly 18 months ago, Winthrop University Hospital, a highly regarded teaching hospital located on Long Island, New York, redesigned its departmental management structure in favor of a cross-departmental process model (see Figure 2). The institution faced intense competition from an aggressive local delivery system, plus significant reimbursement cutbacks. Winthrop needed to improve efficiency and strengthen relationships with community physicians, who supply 70% of admissions, without jeopardizing its deserved reputation for high quality.

Previous attempts at process redesign convinced leadership that a successful effort would require a new paradigm, one engaging both clinicians and staff. Winthrop’s leadership team began by defining its key processes, setting targets (cycle time, cost and quality) and enlisting new leadership to be responsible for process outcomes. Process owners led efforts to redesign virtually every process across the medical center, including:

■ customer-focused processes (e.g., “Manage Patient Flow” and “Deliver Patient Care”)
■ growth-oriented processes (e.g., “Develop New Products and Services”)
■ support processes (e.g., “Manage Supply Chain”)

A significant number of newly appointed process owners were physicians, who brought a strong market-focused perspective to the leadership.

Process-centered management had a dramatic effect at Winthrop:

■ Three champions were appointed to lead “Develop New Products and Services”: a staff physician, a voluntary physician, and a senior nurse. Each champion was selected because he/she demonstrated an aptitude for entrepreneurial activity. Their first priority was to redesign the approval process for new product and service offerings to reduce cycle time and improve the odds of success in the market.

■ The head of Cardiovascular Surgery teamed with a senior nurse manager to co-lead the “Deliver Patient Care” process. This team succeeded in introducing innovations in the Winthrop care model, promoting active teamwork between nurses and physicians to improve patient care significantly.

Process-centered redesign at Winthrop has resulted in a 10% to 12% reduction in controllable cost. One-half of these identified savings have already been implemented. More important, a new spirit of genuine cooperation and entrepreneurialism now characterizes the culture of the organization, resulting in better services to patients and their families.

CASE STUDY #3

UNIVERSITY OF MARYLAND’S PATIENT ACCESS PROCESSES (FRONT-END)

For most practice plans and hospitals, it now takes more staff more time to collect less revenue. Designing the front-end of a patient encounter to preregister and verify insurance coverage is critical to improving cash collections.

At the University of Maryland, these front-end processes are moving from a highly decentralized departmental focus to a highly centralized, process-centered model within its practice plan, University Physicians Inc. Initial estimates showed that ineffective front-end processes were lowering collections by $10 million to $15 million per year (15% of predicted net revenues). A respected associate professor of Internal Medicine, Dr. Louis Domenici, was appointed by the Dean and the Clinical Affairs Committee to oversee the redesign of critical front-end processes. By using interdisciplinary teams, a new operating model was developed and is being implemented. The organization has already improved annual revenues by $11 million on a run-rate basis, compared with the previous 12 months.

Moving from a highly decentralized organization to a more centralized operation is challenging, as many former practice plan administrators can attest. By putting in place performance measures and involving both faculty and staff, the University of Maryland has derived operating models that draw the best from department and process managers. “High touch” customer service provided by the departments is preserved, while pre-registration and insurance verification are centralized to improve consistency and accuracy. Process champions, both physicians and administrators, are held accountable for producing enterprise-wide, quantifiable results, with a portion of compensation tied to actual performance.

CASE STUDY #4

UNIVERSITY OF FLORIDA’S “ACTIVITY CAPTURE THROUGH COLLECTION” PROCESSES (BACK-END)

Faced with declining payments and increasing rejections and denials, the University of Florida Faculty Practice Plan embarked on a two-step process to improve collections. First, design teams grouped departments into billing and collection units according to volumes and common operational characteristics (see Figure 3). Second, leadership reconnected the front-end and back-end of the collection processes by establishing team process owners with a single point of accountability and agreed-upon team targets. Now, clinicians and staff know who to turn to in resolving patient financial concerns. Department chairs can work with the same individuals to improve collections and assist in meeting or exceeding departmental budgets. With these process managers in place, annual collections have improved by $15 million over the last two years. At the same time, customer service has been improved with more accurate bills and more timely resolution of problems.

Process-Centered Management “Lessons Learned”

What are the common themes, principles and “lessons learned” from these diverse organizations?

■ Define an operating model that maximizes process performance potential for each process. Traditional solutions of centralizing or decentralizing operations are ineffective in improving service and reducing costs. A more pragmatic, process-centered approach will utilize both organizational models to best accomplish defined tasks. For instance, registration and insurance verification lend themselves to centralization, while other activities in the patient access process are performed more effectively in a decentralized or hybrid structure. Giving careful consideration to each sub-process can significantly enhance the overall process.

■ Establish and build consensus around key measures of success. Defining specific measures of performance (e.g., percentage of gross revenue collected and number of days in accounts receivable) is essential to long-term success. Building consensus around performance measures enables process champions to assume responsibility for improving outcomes across institutional boundaries and provides a performance-based focus for the entire organization.

■ Appoint physicians and respected administrators as process champions who have the capacity to influence behavior in areas beyond their direct control. Physicians are looking for more substantive involvement in managing the entire clinical enterprise. Appointing credible leaders and charging them with accountability for achieving desired results is essential for generating organizational support to make process-centered management successful.

■ Link process performance to process champion compensation. Introducing new kinds of measures and rewards is the quickest way to shift the attention of leadership to what really matters. A significant portion of the process champion’s compensation should be linked to specific quality and cost performance measures for which he/she is responsible.

■ Transition the budgeting process of the organization to reflect core processes of the organization. As the process champions and their associated process teams become active, it becomes increasingly important to mirror process organization in budget and financial systems. Generally, this is a drawn-out process that takes several years, but it is effective in shifting focus away from cost centers and departments to the new central organizing structure of a process-centered enterprise.

Harry Bloom is a director and leader of the Performance Improvement practice at CSC Healthcare Group in New York, 212-903-9300. David Hefner, principal, and Dwight Monson, director, are co-directors of the School of Medicine Reengineering practice at CSC Healthcare Group in Chicago, 312-470-8600.

The Impact of the 2008 Economic Recession on U.S. Medical Schools and Related Organizations

By Jack Krakower, PhD; Margarette C. Goodwin; Heather Sacks; and David Hefner

ABSTRACT

The near collapse of the U.S. financial system in 2008 had broad impact that did not spare the U.S. health care system, including medical schools and teaching hospitals. Two years later, the media are still filled with reports about schools and hospitals that are struggling to survive. In order to obtain insight into the impact of the financial crisis on academic medical centers, AAMC staff conducted interviews with the leadership of medical schools that reported experiencing material losses in the funds supporting their organizations.

This article describes the magnitude of the losses reported by participating schools and the factors that should be considered in interpreting their impact. It describes the actions and strategies that schools adopted to address financial stress and suggests steps that might be taken by other schools that find themselves in similar circumstances. It concludes with observations by some interviewees regarding the use of the crisis to bring about needed changes that would not have been possible without a “burning platform.”

An added commentary by a health system president warns against tactical approaches adopted by most schools that fail to address systemic changes required to assure the long-term health of academic medical centers.

BACKGROUND

The collapse of the investment sector in 2008 had widespread national and international impact. Not unlike other economic and health sectors, the fall-out included material losses in critical sources that support medical schools and teaching hospitals in the United States. News media began reporting the impact of the losses in the fall of 2008, and by early 2009, furloughs and staff layoffs were reported at universities, medical schools, and teaching hospitals across the country.1

The Association of American Medical Colleges (AAMC) determined it would be useful to investigate the targeted impact of the crisis on U.S. medical schools and steps taken to address the situation. This paper summarizes the causes, impacts, and steps some institutions have taken to address problems related to the financial crisis.

In July 2009, AAMC staff contacted Group on Business Affairs (GBA) Principal Business Officers (PBOs) asking those at schools experiencing material revenue reductions attributable to the crisis to participate in a brief phone interview. PBOs from 23 schools agreed to participate in a 30-minute interview. A copy of the interview protocol is included as an appendix to this document.

SOURCE AND MAGNITUDE OF LOSSES

There is no simple way to characterize the magnitude of losses experienced by schools in the study, although the sources are generally predictable. The losses reported by participating PBOs ranged from those described as imminent but never materialized, to realized losses of under $1 million to more than $80 million, with anticipated losses at one school approaching $200 million over the next three years.

To understand the impact of losses at a given school, one needs to understand the funding source(s) affected, intended use(s) of the lost funds, the magnitude of the loss, and the ability of the institution to cover such losses from other funding sources. At some schools, FY09 losses were defined as funds allocated or appropriated that were subsequently unavailable for spending (i.e., budget reductions), whereas other schools defined losses as the difference between budget projections and funds available for spending (i.e., revenue shortfalls). In order to understand the impact of the size/magnitude of a given loss, one also needs to compare current year revenues to prior year revenues for the source in question.

In the descriptions that follow, "losses" include both budget reductions and revenue shortfalls. In some cases, losses may have been covered from alternative fund sources such that existing commitments (e.g., faculty salaries) and planned commitments (e.g., program expansions) were ultimately met.

As might be predicted, reductions in state support were the most common source of material losses for public schools (and even a handful of private schools). Schools that relied heavily on investment earnings (e.g., from endowments, quasi-endowments, unendowed reserves, and working capital) to support current operations, particularly private institutions, reported experiencing material losses.1 While media reports have suggested that losses at teaching hospitals have been widespread, the majority of PBOs in our sample reported that their affiliated hospitals expected to break even or end the year with a positive bottom line.2 Even more surprising to us, the majority of PBOs reported that clinical practice income was at least holding steady.3

However, this strategy was not available to all schools, several of which relied heavily on investment income to support current operations. In many of these cases, endowments had been established to support the creation of new faculty positions and research centers. Faced with reduced earnings, the schools found themselves in the position of having either to underwrite these commitments from other fund sources or to materially cut other expenses in order to fund these commitments.

Most PBOs with whom we spoke reported losses in no more than two sources – most often state support and investment earnings. Although gift revenues were impacted at many schools, these losses tended to be relatively small. Furthermore, because gifts are often restricted by the donor, medical schools were not reliant on these funds for day-to-day operations.

Several schools experienced significant losses from multiple sources and found themselves facing dire circumstances due to commitments previously made based on revenue assumptions. These commitments included recruiting new faculty to grow the research enterprise, constructing research buildings, and hiring critical but marginally revenue-producing faculty for clinical programs. This scenario occurred in both relatively small medical schools and some of the largest schools in the country. In these circumstances, the coupling of financial losses from multiple sources with the costs of significant new commitments created the "perfect storm."

One of the common refrains we heard from both public and private schools was that the medical school was often viewed by the parent university as the "fat cat" school that could bear a larger share of the budget cuts. It’s easy to envision how those who do not understand the nature of medical school financing might come to this conclusion. The median total revenue for public medical schools in FY 2008 was $424 million; the median for private schools was $651 million. The relatively large size of total revenues characteristic of most medical schools sometimes leads to the erroneous conclusion that cuts of a few million dollars or additional parent taxes on the medical school have relatively little impact on operations of the medical school.4 However, funds available for discretionary spending are but a small fraction of total revenues.5

The principal sources of seemingly "discretionary" funds that support the operations of medical schools are state support (in the case of public schools), unrestricted gifts, endowment earnings, indirect cost recoveries, and tuition. Only a handful of schools also have access to patent income.

Although "discretionary" fund sources represent a fraction of the budget for most medical schools, more often than not they are the primary source of funding for medical education and medical school administration, as illustrated by the circumstances facing one school that participated in the study. In FY08, this school’s total budget was nearly a billion dollars, but state support to the medical school has historically been among the lowest in the country. At the onset of the crisis, state support to the medical school was reduced by one-third. Although the reduction amounted to little more than one-half percent of the school’s total budget, state funds were the primary source of support for medical education, tenured faculty salaries, student services (e.g., financial aid), and medical school administration (e.g., admissions). In this case, the school was able to divert dean’s development funds to cover some of the loss but was unable to cover the entire loss, thus resulting in cutbacks in services and staff.

On the one hand, this example illustrates why it is difficult to characterize the relationship between the magnitude of losses and their effect on a given medical school. On the other hand, we did find patterns between the source of funds cut and the impact. Because state funds are generally used to cover expenses related to teaching and administrative infrastructure, reductions in state funds typically required medical schools to reduce administrative staff in the dean’s office and in departments. The impact on educational programs was less clear because schools took a variety of steps to preserve educational programs as discussed in the next section.

As previously mentioned, some schools relied on earnings from endowments and quasi-endowments to fund key aspects of their current operations. Schools that relied on this funding model experienced material shortfalls due to the loss of investment income.6 These were most often research-intensive schools that had made commitments based on investment income to hire new faculty and construct new facilities needed to expand their research mission. The loss of investment income meant not only that they could not continue to grow research, but that they were challenged to meet commitments made to new faculty and to fund debt service on new facilities.

While this study focused on reductions in revenues, one PBO pointed out that the other side of the equation is material increases in expenses, particularly expenses associated with compliance, quality assurance, and technology. The school in question had experienced cost increases that exceeded 15% annually for the last six years – all without offsetting sources of revenue.

STEPS TAKEN TO ADDRESS LOSSES

Steps taken to address budget reductions and revenue shortfalls can be characterized as falling into two categories – tactical and strategic. The former encompasses short-term steps aimed at staunching losses and shoring up the enterprise, whereas the latter encompasses longer-term actions driven by a more holistic approach to dealing with financial problems.

While a strategic orientation may have guided the steps schools took to address the budget crisis, most of the actions taken were described in more tactical terms. It should be noted that the degree to which a particular medical school’s leadership participated in making decisions about how to approach the budget crisis varied across schools, so that medical schools that might have otherwise chosen a strategic approach were sometimes partially constrained by tactical steps mandated by the parent university.

The tactical steps offer no surprises – reductions in discretionary expenditures (e.g., travel), hiring and salary freezes, and administrative staff furloughs and layoffs. It was rare that furloughs or layoffs applied to faculty, and when layoffs were required, the dean’s office generally took a proportionately greater share of the burden. Budget reductions were frequently administered across-the-board, with the dean’s office and departments sharing the burden; however, at some schools, reductions were distributed across departments based on "ability to pay."

Other tactical steps taken to address budget reductions and revenue shortfalls included:

• Consolidating basic sciences departments
• Consolidating administrative functions across departments (e.g., IT, grants management)
• Consolidating administrative functions between the medical school, clinical practice, and an affiliated hospital (e.g., IT, budgeting, cashier, safety, human resources, credentialing)
• Increasing tuition and student fees
• Delaying capital projects, thus preserving cash
• Deferring building maintenance
• Changing benefits plans, including retirement contributions
• Eliminating bonuses and position reclassifications
• Implementing clinical performance incentives and clinical workload standards
• Changing faculty compensation – limiting guaranteed salary for tenured faculty
• Developing new clinical service lines in order to increase patient volumes and reduce expenses
• Consolidating faculty clinical practice plans
• Delaying implementation of strategic priorities (e.g., increasing class size, electronic medical records)
• Implementing plans to consume less energy
• Implementing "green initiatives" to reduce waste
• Closing regional physician offices and clinics
• Implementing voluntary retirement programs or buy-outs for senior faculty
• Outsourcing services, such as printing
• Delaying chair recruitment and giving greater consideration to internal candidates
• Developing new programs to generate tuition revenues (e.g., post-baccalaureate certificates)
• Enforcing policies on space utilization and density
• Developing partnerships with affiliate hospitals that result in profit sharing
• Draining reserves

A number of schools reported that protecting the educational mission influenced their decisions, though generally not in the context of an overall institutional strategy for dealing with losses. Only three of the 23 schools interviewed reported steps the institution took in the context of an over-arching strategic plan. Even when schools adopted a strategic approach, many (if not most) of the steps they took were identical to those of schools whose approach was tactical. However, in these schools, the decision-making process included identifying steps needed to achieve both short-term and long-term organizational goals.

It should be noted that the schools that adopted a strategic approach faced very significant budget reductions and/or revenue shortfalls (in the tens of millions of dollars), but not all schools that faced significant losses adopted a strategic approach. In some cases, the magnitude of budget reductions and revenue shortfalls was both sudden and unanticipated; there was simply no time to develop a process for strategic decision-making.

As described by the PBO at one school that adopted a longer-term strategic approach: "Our overall intention was to change the culture of how people operate at the institution and to have their response be long term and sustainable, even as times got better." This school’s plan included four components. First and foremost, they considered the feasibility of developing and implementing "more systematic business practices" as the "keystone" to long-term cost reductions. As a part of their overall strategy, the medical school partnered with their affiliate hospital in the planning process. The dean and hospital CEO committed to working collaboratively to make the organization a "high performing, highly reliable organization" and to strive for "resource maximization." The result of their efforts included eliminating and recombining redundant hospital and medical school support/administrative services in areas including purchasing, housekeeping, and capital planning. Other steps included cuts in discretionary spending, educating faculty and staff about the costs of running the school ("reduce, reuse, and re-think"), taking steps to reduce energy consumption, and reaching out to faculty and staff for ideas to improve efficiency. It should be noted that this was accomplished in the absence of a reporting structure that required the dean and hospital CEO to work together!

LESSONS LEARNED

PBOs who participated in the interviews were asked to describe what they learned from their experience that might benefit other schools facing similar circumstances. Their advice can be organized into three categories: 1) communication, 2) processes, and 3) crisis as an opportunity.

Nearly all of the PBOs with whom we spoke talked about the importance of frequent, honest, and open communication as critical to dealing with the situations that they faced. They spoke about the importance of regular meetings among the dean, chairs, and, where possible, health system personnel, as well as "town hall" meetings with faculty and staff. The mechanism for communicating with students, residents, and postdoctoral trainees was not always clear, but many PBOs said that budget news updates were frequently posted on their websites. Even though communication was identified as paramount, significant shortfalls surfaced. Among these was that communications between the dean’s office and chairs did not always filter down to faculty. Equally problematic, messages conveyed by chairs (and sometimes the dean’s office) were sometimes inconsistent.

The second category relates to how the process was managed and how that management materially influenced success or failure in dealing with the situations schools were facing. Advice and observations included:

 Chairs & faculty almost reflexively denied the circumstances facing the institution,
taking a position that the institution could ―ride out the storm.‖
 Even when chairs were involved in deciding how to cut expenses, it was not
uncommon for them to take a position that the cutbacks should not apply to their
department.
 The dean’s office did not always have transparent knowledge of departmental fund
balances.
 Having a consolidated budget process that includes the medical school and the
health system is essential to decision-making.
 If layoffs are necessary, effect them all at once and in unison.
 It is essential to align spending with priorities.
 Small decisions can cause huge ripples.
 Faculty should be directly involved in the earliest stage of the decision-making
process, especially if the magnitude of cuts involves reducing faculty salaries,
departmental reorganization, or other actions that affect them.
 Students should be engaged early in the process.
 Celebrate your successes!

The third category of lessons learned has to do with the notion that a crisis may create opportunities to change the organization in ways that are necessary but might otherwise be extremely difficult to implement. It can be characterized by Rahm Emanuel’s expression, "Never let a serious crisis go to waste." As noted above, several schools used the crisis they faced as the opportunity to consolidate departments and administrative functions, reallocate resources across departments, restructure compensation, and implement productivity measures and standards. In addition, the crisis provided some organizations with the basis for strengthening the partnership between the medical school, the clinical practice, and affiliated hospitals.

CONCLUDING OBSERVATIONS

The impact of the financial crisis on schools that participated in the study ranged from imminent, devastating cuts that did not materialize to situations that involved material budget and program cuts, staff furloughs, and layoffs. We note, parenthetically, that we did not hear of one instance where a faculty member was laid off as a consequence of financial stress. Recent conversations with PBOs suggest that many schools continue to operate in a psychological state that is akin to "waiting for the ax to fall," or living on an earthquake fault line – even though the financial crisis has yet to seriously impact their organization.

Decision-making and infrastructure conditions that led to the circumstances facing schools seem to cluster into four categories: (1) revenue projections built on the assumption that "past performance is indicative of future results," particularly with respect to assumptions about the continued growth of the research enterprise and indirect cost recoveries and the growth in endowments and endowment earnings; (2) resource commitments made based on the assumption that increasing investments, particularly in the research enterprise, would result in increased revenues; (3) lack of management systems and accountability mechanisms at both the dean’s office and department levels that hindered the ability of schools to react quickly; and (4) dependence on state/parent funds as the sole source of support for education and administrative infrastructure.

Steps taken to address the financial crisis seemed to fall into two overlapping categories: tactical and strategic – with the latter category encompassing responses found in the former category, but including a set of over-arching goals and communications processes that focused on bringing about changes to the entire organization. The crisis forced some schools to confront weak accounting, reporting and management systems, and provided a rationale for introducing greater accountability into their systems. It also obliged some schools to identify and remove artificial political barriers that resulted in duplicative and inefficient systems.

Shrinking resources, particularly with regard to endowment earnings and state budget reductions, forced many schools to turn to short-term tactical solutions such as hiring freezes, layoffs, and furloughs. Even as the national economy shows signs of recovery, the impact of the financial crisis on medical schools, particularly those that rely heavily on state funds, will likely extend well into FY2011.

Commentary: “Fair Warning” – A Health System Executive’s Perspective

Those of you who have attended auctions may recall the auctioneer’s cry of "fair warning," which signals bidders that they have one last chance to change the course of events before the gavel falls and the bidding ends. Reading the draft report of this study compels me to offer my own version of "fair warning" to my medical school colleagues. I offer this in hopes that we will jointly recognize and come to grips with the failed notion of incremental reactions as a substitute for an overarching strategic, and complementary tactical, multi-year approach to fundamentally redesigning our operating models. My advice is informed by serving as the senior executive of two teaching hospitals and practice plans, as well as serving as a consultant to the leadership of more than seventy-five medical schools and AMCs over the last twenty-five years.

With the near collapse of the financial industry, the rapid decreases in revenues and investment income, and the concomitant recession we are in the midst of, most care delivery institutions restored their margins by dramatically reducing their cost base (see the previous reference to the COTH quarterly report).


The financing and feedback loops of hospitals and health delivery enterprises are crystal clear when compared to medical schools. All hospitals are obliged to produce balance sheets and understandable profit/loss statements. Hospitals that fail to generate a sufficient margin, let alone break-even, are subject to severe treatment by external entities such as bond rating agencies and creditors. Hence, the strategic planning and financial modeling processes of hospitals tend to have built-in upside and downside scenarios that enable their leadership to prescribe precipitous actions that preserve the long-term viability of the enterprise.

The financing of medical schools is, at best, characterized as "one-off." While they are commonly supported by only a handful of fund sources (e.g., clinical practice revenues, grants and contracts, hospital support, university overhead, and philanthropy), the interplay of these dollars and the administrative infrastructures vary dramatically. It could be argued that hospitals, unlike their medical school counterparts, are not subject to the same labor constraints with respect to cutting costs (e.g., faculty tenure and due process). However, the external visibility and stresses of unions, media scrutiny, and serving the safety-net needs of their communities are similar constraints and trade-offs that lead me to believe we all have our constraints to wrestle with.

A historical and somewhat analogous version of "fair warning" comes from the food preservation industry. In the 1920s, the first patents for refrigerators, then described as a "refrigerating machine for cooling and preserving foods at home," were issued. The dominant ice cutters’ union in Minnesota dismissed the newfangled machines as dangerous contraptions. Furthermore, they were certain that, at best, all they needed to do in response was to learn how to cut ice more efficiently.

As I read about or witness first-hand the tactical rather than strategic approaches taken by our medical schools, I cannot help but conclude that we are focusing on cutting the ice more efficiently. Taking steps to protect the status quo ignores reality. Unfortunately, there is evidence to suggest that this is not the first time that our schools have chosen to believe that what is happening around us is not applicable. The study done by Heinig, Dickler, Korn, and Krakower in 2007 demonstrated that while the NIH had made it clear that research funding would be flat (at best) over the next 3-5 years, almost all schools in the country reported that their expected research portfolios would grow by 3%-5% per year for the same period.7 We see the same gold rush mentality as it relates to the temporary ARRA stimulus funding that wears off in the next 12 months: have we not just increased our research base and consumption in ways that will inevitably lead to heightened research shortfalls?

I do not for a moment believe that we are out of the economic woods and can continue to operate as we have. If we do not press forward with substantive and systemic change, the likes of which we have not experienced for 40 years, then we will indeed experience the bang of the hammer and have missed our own industry’s "fair warning."

1) Unendowed reserves and working capital include funds invested for the short and intermediate term – investment vehicles like the Common Fund. Their earnings were typically driven by Treasury bill rates which were below 0.2% for most of 2009, and were below 0.1% for the last quarter of the year.
2) A comparison of the COTH data reflecting margins for the fourth quarter of 2008 to the second quarter of 2009 lends support to PBOs’ comments. That is, the median margin went from –1.3% to +8.1% during this period.
3) According to data collected on the FY 2009 LCME Part I-A Annual Financial Questionnaire, the average change in practice plan revenue was about 6%.
4) The medians obscure huge differences between schools – total revenues in FY08 ranged from more than $1.5 billion to less than $10 million. See http://www.aamc.org/data/finance/.
5) Almost 40% of the funds supporting medical school activities are not directly controlled by the leadership of medical schools. Most often, these revenues are related to expenses of affiliated hospitals or clinical practice operations that are "off the books" of the medical school. Furthermore, funds associated with sponsored grants and contracts, which represent nearly 30% of total medical school revenues, are "restricted" by sponsors to a specific purpose, as are most gifts.
6) It is important to recognize that endowments exist because donors elect to fund certain programs. In general, the funds are restricted to specific uses and cannot be set aside and used for one-time expenses. If, for example, a given endowment is established to fund new faculty positions or a research center, once established the school is generally obliged to support the activity even if the endowment earnings fall short. Consequently, it would be a mistake to characterize “relying on endowment earnings” as a “choice.” Comment by George Andersson, CFO, Washington University School of Medicine.
7) Stephen J. Heinig, M.A., Jack Y. Krakower, Ph.D., Howard B. Dickler, M.D., David Korn, M.D. Sustaining the Engine of U.S. Biomedical Discovery. NEJM, September 6, 2007, p. 1042-1047.

APPENDIX

INTERVIEW QUESTIONS

I. Nature of crisis:

1) Cause(s) of stress including reductions in

• state support
• endowment earnings
• gifts
• clinical practice income
• hospital support
• other fund sources

2) $ Magnitude

3) Previous use of funds (e.g., recruit faculty, financial aid, support/underwrite research)

II. Steps taken to address crisis:

1) Staff actions (e.g., furloughs, lay-offs, hiring freezes)

2) Revenue/Expenditure actions (e.g., increase tuition, limit discretionary spending, reduce salaries, eliminate raises, delay capital projects)

3) Restructuring/programmatic actions (e.g., combine departments, change curriculum, reorganize administrative functions)

4) Decision process

5) Who was involved (e.g., dean, FPP director, chairs, hospital, parent university)

6) Communication strategy

7) Complicating factors (e.g., unions, tenure, commitments, partners, contracts)

8) Timeframe — near and longer-term actions

III. Outcomes and Unanticipated Consequences

IV. Advice/Lessons for others

1) major challenges

2) roadblocks

3) missteps

4) facilitating conditions

5) crisis as an opportunity to drive change

1 See, for example, Sickinger T. Downturn Cripples OHSU Staff. Oregonlive.com. December 2, 2008. http://www.oregonlive.com/business/oregonian/index.ssf?/base/business/122819191672940.xml&coll=7

Loyola University Health System Cutting More Than 440 Jobs. Chicagotribune.com. May 12, 2009. http://www.chicagotribune.com/topic/wgnam-loyola-090512,0,1204730.story

Gallagher K. Medical College of Wisconsin Will Cut Its Budget by 5%. Milwaukee Journal Sentinel. March 21, 2009. http://www.jsonline.com/business/41608377.html

Jan T. Harvard to Lay Off 275. Boston.com June 23, 2009. http://www.boston.com/news/local/breaking_news/2009/06/harvard_u_to_la.html

The UW System Furlough Guidelines. University of Wisconsin System. http://www.uwsa.edu/furloughs/

Furlough Information. University of Wisconsin Department of Medicine. http://www2.medicine.wisc.edu/home/hr/furlough

Implementing a Simpler Approach to Mission-Based Planning in a Medical School

Tod B. Sloan, MBA, MD, PhD, Celia I. Kaye, MD, PhD, William R. Allen, Brian E. Magness, and Steven A. Wartman, MD, PhD

Abstract

Changes in the education, research, and health care environments have had a major impact on the way in which medical schools fulfill their missions, and mission-based management approaches have been suggested to link the financial information of mission costs and revenues with measures of mission activity and productivity. The authors describe a simpler system, termed Mission-Aligned Planning (MAP™), and its development and implementation, during fiscal years 2002 and 2003, at the School of Medicine at the University of Texas Health Science Center at San Antonio, Texas. The MAP system merges financial measures and activity measures to allow a broad understanding of mission activities and to facilitate strategic planning at the school and departmental levels.

During the two fiscal years mentioned above, faculty of the school of medicine reported their annual hours spent in the four missions of teaching, research, clinical care, and administration and service in a survey designed by the faculty. A financial profit or loss in each mission was determined for each department by allocation of all departmental expenses and revenues to each mission. Faculty expenses (and related expenses) were allocated to the missions based on the percentage of faculty effort in each mission. This information was correlated with objective measures of mission activities.

The assessment of activity allowed a better understanding of the real costs of mission activities by linking salary costs, assumed to be related to faculty time, to the missions. This was a basis for strategic planning and for allocation of institutional resources.

Changes in the education, research, and health care environments have had a major impact on the way that medical schools fulfill their missions. Specifically, rising costs, including escalating salaries for faculty, coupled with declining clinical care reimbursement and shrinking educational support have presented substantial challenges, particularly regarding teaching. Without a regular assessment of resource allocation and faculty effort, the allocation of institutional resources can rapidly become misaligned with the activities and contributions of the faculty. It is therefore essential that methods be developed that permit objective assessments of resource allocation. Schools need a firm grasp on the efforts of their faculty and their allocated costs in order to plan strategically for the continued academic and financial success of the institution.

The Association of American Medical Colleges (AAMC) has promoted a methodology for this challenge known widely as mission-based management (MBM).1 In general, this involves the quantification of the activities of faculty in relation to the traditional missions of teaching, research, clinical care, and administration and service. MBM has also assessed the productivity of this activity and the costs associated with each mission. MBM systems have served as a tool for medical schools to optimize the alignment of institutional resources with both the existing activities of the faculty and new strategic initiatives. Many articles in the literature have shared various approaches and the associated results, which often involve the redistribution of resources based on the quantity and quality of faculty effort.1–15 Implicit is the recognition that historic methods of allocating funds do not match the faculty’s actual contributions to the missions. If data can be obtained that are derived from these actual contributions, then they can be used to “correct” the maldistribution of resources.

In addition, MBM has attempted to address the problem of defining the relative value of the productivity measures in each mission-related activity. Methods that have attempted to do so often result in a complex process that is difficult and/or expensive to administer. Further, such approaches may find it problematic to value nontraditional or novel methods of instruction. Additionally, academic faculty can find the assessment of productivity threatening. Nevertheless, using a system of weighted measures for faculty contributions can have the advantage of moving efforts more rapidly in the direction of desired change.

However, changes in the direction of the school’s missions may be possible using a system which is simpler, less expensive, and more acceptable to faculty than one that emphasizes productivity and the value of each mission-related activity. The faculty of the School of Medicine at the University of Texas Health Science Center at San Antonio (UTHSCSA) participated in the development of a simpler system, and its implementation over two fiscal years, FY2002 and FY2003, has provided the opportunity to assess its utility.

In this article, we describe that system and its development. It is a relatively simple method of mission-based management within a medical school, which focuses on assessing faculty activity in each of the missions and the associated revenues and costs. We have termed this process Mission-Aligned Planning (MAP™). Our goal was to gain the knowledge and insight necessary to guide the institution through its current challenges, change its direction in selected areas, and improve the operating margin of each department and the medical school.

Development of the System

The development of the MAP system, which began in 2001, was based on three criteria:

▪ that it involve a relatively simple method of assessing faculty effort and the cost of that effort, but make no attempt to weight the productivity or value of faculty effort in its primary data acquisition;

▪ that it prove to be useful at different administrative levels throughout the institution; and

▪ that it minimize cost by using as much available information and resources as possible.

We (the authors) developed the MAP system as a three-step process. After a series of departmental meetings organized by the dean and us to inform the faculty about the system’s goals and the institution’s commitment to the system, the first step was the enumeration of the faculty activities, which were measured by a survey of the faculty in each of the two academic years devoted to forming the database. Next, a process was developed to obtain the expenses and revenues relative to the school’s four missions of teaching, research, patient care, and administration and service. Administrative time was defined as activities within the institution (e.g., administrative positions or committee work) and was divided into activities within the department or hospital, for the medical school, and for the university. Service time was defined as time spent conducting activities outside the institution (e.g., national lectures, service on professional societies). Faculty development time (e.g., increasing administrative skills) was not differentiated between within and outside the institution.

This required that a consensus be achieved regarding specific conventions to be used for faculty activities that simultaneously apply to more than one mission (e.g., providing care for a patient while teaching medical students and/or residents). Finally, all faculty activity and financial information data were merged into a unified financial format that would facilitate both analytic and strategic decision making. In order to protect the confidentiality of faculty members as well as to permit department chairs the opportunity to manage their departments effectively, each of the school’s 13 clinical departments received individual data on its own faculty only; the dean’s office received information aggregated at the department and school-wide levels. (Those departments are Anesthesiology, Family and Community Medicine, Internal Medicine, Obstetrics and Gynecology, Ophthalmology, Orthopaedics, Otolaryngology, Pediatrics, Psychiatry, Radiation Oncology, Radiology, Rehabilitation Medicine, and Surgery.) The basic science departments at the institution are part of a separate graduate school of biomedical sciences and were not part of the MAP system. Because financial information is most easily acquired for a full academic year, it was decided to use two consecutive academic years, FY2002 and FY2003, to form the initial MAP database. At UTHSCSA, the academic and fiscal years are coincident and run from September 1 through August 31. We will refer to years as fiscal years throughout this article.

The activity survey was developed through a faculty-led process designed to maximize faculty input and buy-in. The survey sought to identify all activities considered essential for carrying out the school’s missions. Initially, the MAP system was “beta tested” in a department that had high levels of activity in all mission areas. It quickly became evident that there were several activities that appeared to be unique to this department. As a result, the activity identification process was broadened by using workgroups within each of the clinical departments to incrementally expand the scope of the activity survey and include the unique aspects of each department. Early in the process, two departments piloted the survey instrument to gain specific feedback on the activities measured and the methodology. A final faculty consensus group representing all departments finished the survey instrument and assisted in developing the instruction set for the system’s administration.

The faculty consensus group also determined the amount of time to be allocated to a faculty activity in cases where the time spent would be either difficult to recall (e.g., lecture preparation) or where substantial variation would likely exist (e.g., the percentage of time spent in clinical work and research activities that would be allocated to teaching). For example, the group decided to limit the time allocated for lecture preparation to an agreed-upon amount, based on the type of lecture (new lectures, significant updates of previous lectures, and minimal updates). And in those cases where teaching time was spent during research or clinical care activities, the group determined which portion of that time would be allocated to teaching or to the primary activity (research or clinical care). These conventions were revised for the second-year survey based on data from the first year’s faculty survey.

The faculty activity survey took approximately one year to develop. The consensus group decided to allocate 50% of time to teaching when either clinical activity or research activity was being conducted in the presence of students and the faculty indicated that teaching was occurring. This was unchanged during the second year of the survey, as the initial survey results indicated a median estimate of 50% by the faculty. Lectures were categorized as either new or as major or minor updates of previous lectures. The consensus group elected to allocate lecture preparation times of 16, 8, and 2 hours, respectively, per hour of lecture. The first survey results indicated median lecture preparation times of 10.4, 3.7, and 1.3 hours per hour of lecture; these medians were used during the second survey period.
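The conventions above amount to simple arithmetic applied to each survey entry. The sketch below (in Python, with illustrative function and variable names that are not part of the MAP instrument) shows how the second-year conventions described here might be applied: the median preparation multipliers per hour of lecture and the 50% split of teaching conducted during clinical or research activity.

```python
# A minimal sketch of the MAP time-allocation conventions described above.
# The preparation multipliers are the second-year medians quoted in the text;
# the function and variable names are illustrative, not taken from the survey.

PREP_HOURS_PER_LECTURE_HOUR = {
    "new": 10.4,           # new lecture
    "major_update": 3.7,   # significant update of a previous lecture
    "minor_update": 1.3,   # minimal update
}

def lecture_teaching_hours(lecture_hours, category):
    """Lecture time plus the convention-based preparation allowance."""
    return lecture_hours * (1.0 + PREP_HOURS_PER_LECTURE_HOUR[category])

def split_combined_activity(hours, teaching_fraction=0.5):
    """Split clinical or research time conducted with learners present into a
    teaching portion and a primary-mission portion (50/50 by convention)."""
    teaching = hours * teaching_fraction
    return teaching, hours - teaching

# Example: 3 hours of new lectures plus 200 hours of clinical work with residents
# during which the faculty member indicated teaching was occurring.
teaching = lecture_teaching_hours(3, "new")               # 34.2 hours
teach_part, clinical_part = split_combined_activity(200)  # 100.0 and 100.0 hours
print(round(teaching + teach_part, 1), round(clinical_part, 1))
```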

Two other issues that needed resolution were the allocation of state-appropriated revenue and the allocation of faculty salary expenses. An administrative consensus group decided to allocate 80% of state-appropriated revenue to the teaching mission and 20% to administrative activities. This was based on the premise that these funds were intended by the state legislature to support the educational mission and that a portion was necessary to support the administrative infrastructure to facilitate teaching. Lengthy discussions occurred regarding this approach, notably whether some of these funds should be allocated to faculty support for start-up research activities. The decision to do so is an internal one, but the simplicity of the MAP methodology is such that this and other approaches can easily be changed and the resulting impact assessed. Since both education and administration had negative operating margins and required cross subsidization from the clinical and research missions, decreasing revenue to either of these would not change the overall picture.
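Because the MAP allocations are simple proportional rules, the impact of a different convention can be assessed by recomputing mission margins, which is the kind of what-if comparison referred to above. The sketch below is illustrative only: all dollar figures and the alternative split are hypothetical, and only the 80/20 convention itself comes from the text.

```python
# Sketch: compare mission operating margins under the 80/20 convention for
# state-appropriated revenue versus a hypothetical alternative split.
# All dollar figures are invented for illustration.

def mission_margins(revenue_by_mission, expense_by_mission, state_revenue, state_split):
    """state_split maps mission -> fraction of state-appropriated revenue allocated to it."""
    margins = {}
    for mission, expense in expense_by_mission.items():
        revenue = revenue_by_mission.get(mission, 0.0) + state_revenue * state_split.get(mission, 0.0)
        margins[mission] = revenue - expense
    return margins

revenue = {"teaching": 2.0, "research": 30.0, "clinical": 60.0, "administration": 0.0}   # $M, hypothetical
expense = {"teaching": 12.0, "research": 28.0, "clinical": 50.0, "administration": 8.0}  # $M, hypothetical
state = 10.0  # $M of state-appropriated revenue, hypothetical

print(mission_margins(revenue, expense, state, {"teaching": 0.8, "administration": 0.2}))
print(mission_margins(revenue, expense, state, {"teaching": 0.7, "research": 0.1, "administration": 0.2}))
```

In both scenarios of this toy example, teaching and administration run negative margins that are cross-subsidized by the clinical and research missions, which mirrors the observation in the text that shifting revenue between education and administration would not change the overall picture.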

Distribution of faculty salary expenses also deserves a note. First, it was agreed that there would be no distinction between missions when the cost of faculty time was assessed in the MAP system. An hour of teaching would be assigned the same cost (the actual cost of time for that faculty member) as an hour of clinical work or research. Second, it was decided to allocate salary expenses based on the percentage of faculty time spent in each of the four missions exclusive of clinical at-home calls. In the survey instrument, call time was defined as time away from the campus during which the faculty member was available by pager or phone for consultation. Call time that required the faculty to be “in-house” was included in clinical time.

The final survey instrument was distributed as an Excel spreadsheet and included a request for an estimate of the hours spent in mission-specific activities. Information was also requested regarding mission activities in specific geographic locations (e.g., different hospitals and clinics) and with different levels and types of trainees (e.g., medical students, graduate medical trainees, graduate students). Also requested were an estimate of time to complete the survey instrument, an estimate of lecture preparation time, and an estimate of the fraction of time during research or clinical activities that should be devoted to teaching. The spreadsheet compared the entered data to each faculty member’s estimate of typical weekly hours of activity to assist in identifying areas of underreporting or overreporting. Faculty were asked to reexamine their entries when they exceeded maximum reasonable limits, and they were asked for corrected data when time entries were obviously in error. In order to make the survey most effective for mentoring, additional information such as detailed reports of products of academic work (e.g., papers published, presentations made) was also recorded.
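The spreadsheet’s internal consistency check can be imagined roughly as follows. The threshold, the assumed number of working weeks, and the field names are illustrative assumptions, not the actual workbook logic.

```python
# Rough sketch of the survey's consistency check (illustrative only).

WEEKS_PER_YEAR = 48   # assumed working weeks; not from the original instrument
TOLERANCE = 0.25      # flag entries more than 25% away from the weekly estimate

def check_entry(annual_hours_by_mission, estimated_weekly_hours):
    """Compare the sum of annual mission hours against the faculty member's
    own estimate of a typical week and flag likely over- or underreporting."""
    total_annual = sum(annual_hours_by_mission.values())
    implied_annual = estimated_weekly_hours * WEEKS_PER_YEAR
    ratio = total_annual / implied_annual if implied_annual else float("inf")
    if ratio > 1 + TOLERANCE:
        return "possible overreporting"
    if ratio < 1 - TOLERANCE:
        return "possible underreporting"
    return "consistent"

entry = {"teaching": 900, "clinical": 1100, "research": 500, "admin": 200}
print(check_entry(entry, estimated_weekly_hours=55))  # 'consistent'
```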

The collection of activity data was conducted starting approximately three months following the completion of the fiscal year. Departments were asked to obtain survey information from all faculty, regardless of salary status, who participated in mission activities in excess of ten hours for the academic year. Each department strove for a 100% completion rate, as the departmental leadership wanted all of their mission activities to be recorded. The responsibility for having the faculty complete the surveys and then working with the faculty to correct survey entries was assumed by the respective departments. Therefore, the departmental leadership had access to departmental data immediately following the submission of the data. Faculty rosters, the human resource database, and clinical care records were used to identify faculty missed in the initial survey replies. After the spreadsheets were completed, the results were collated into a departmental aggregate by an impartial intermediary (TS) who acted as both an institutional advocate and as a faculty advocate to ensure the completeness and integrity of the process.

The aggregate data were verified with the department leadership before presentation to the dean. A similar process was followed during the second survey year except that the consensus groups used information learned in the first year’s survey process to make changes in the instructions to improve the second year’s process. The faculty consensus group also reevaluated the time allocated for lecture preparation as well as the fraction of clinical and research time allocated to teaching when teaching was being conducted in the context of clinical and research activities. This second survey requested the same data; only the instructions and spreadsheet format were changed slightly.

It was recognized that the data entered by faculty would be an imperfect recollection of exact time utilization; therefore, a large variety of objective activity measures was collected to corroborate the departmental aggregate activity data. For example, the Dean’s Office collected readily available information about the hours of lecture activity, measures of student rotations, and various other measures of educational activities conducted by departments. For the clinical mission, RVU (relative value unit) activity was collected. RVUs are a common scale developed by the Health Care Financing Administration (HCFA) and subsequently modified to quantify the work and resource costs needed to provide physician services across all fields of medicine.16 The HCFA system of RVUs has a physician component, a practice component and a malpractice component. For our analysis, only the physician component was used.

Grant funding data was collected for objective measures of the research mission. For administrative activity (exclusive of service activity), the number of department faculty was recorded. It must be emphasized that these “productivity” measures were not primary data elements in the survey; rather, they were elements assembled by the Dean’s Office to provide a “reality check” on the information from the surveys.

In addition, no effort was made to assign relative values to various educational, clinical, research, or scholarly activities. For the purposes of our analysis, we assumed that all activities of the faculty were valuable, and that it was the role of the chair to direct faculty to tasks that were most beneficial to the department and school. In essence our goal was to obtain a reasonable picture of the activity distribution across the missions. We recognized that a focus on “perfect data” would not only be costly but likely impossible (e.g., measuring actual clinical teaching time).

The second and third steps were the development and preparation of the unified financial spreadsheets for merging the activity data and the expenses and revenues within each of the departments. Table 1 shows the unified financial worksheet for a hypothetical department. The administrative consensus group of school leadership and departmental administrators also determined the conventions for the allocation of institutional funds to the various missions. With respect to revenues, the allocation of practice plan income to the clinical mission and of National Institutes of Health grant support to the research mission was straightforward. As mentioned above, each department’s state appropriations were allocated according to an 80/20 split, with 80% of state funding used to support the educational mission and 20% used to support the departmental administrative infrastructure. On the expense side, faculty salaries were allocated across the missions proportionate to each department’s overall faculty survey results. While the faculty survey results do not provide a precise reflection of the effort of departmental support staff, department chairs felt that the activities of their support staff generally occur in the mission areas of the faculty they support. The allocation of non-salary expenses into the four missions was accomplished through a set of expense allocation guidelines developed by the administrative consensus group.
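A minimal sketch of that worksheet logic, under the allocation conventions just described (an 80/20 split of state appropriations and salaries spread across missions in proportion to surveyed effort), is shown below. All dollar figures and field names are hypothetical.

```python
# Hypothetical unified financial worksheet for one department (invented figures).

MISSIONS = ["teaching", "clinical", "research", "admin_service"]

effort_share = {"teaching": 0.36, "clinical": 0.37, "research": 0.17, "admin_service": 0.10}
state_appropriation = 4_000_000
revenue = {
    "teaching": 0.80 * state_appropriation,        # 80% of state funds to education
    "clinical": 18_000_000,                        # practice plan income
    "research": 6_000_000,                         # grants and endowments
    "admin_service": 0.20 * state_appropriation,   # 20% to administrative infrastructure
}
faculty_salaries = 20_000_000
other_expense = {"teaching": 500_000, "clinical": 4_000_000,
                 "research": 1_500_000, "admin_service": 800_000}

# Salary expense follows surveyed effort; the margin is revenue minus total expense.
margin = {}
for m in MISSIONS:
    expense = faculty_salaries * effort_share[m] + other_expense[m]
    margin[m] = revenue[m] - expense

for m, v in margin.items():
    print(f"{m:>14}: {v:>12,.0f}")
```

With these invented inputs the output reproduces the qualitative pattern described in the article: negative margins for teaching and administration, positive margins for the clinical and research missions.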

The final presentation of information used for strategic planning included bar graphs created from these worksheets that demonstrated the positive or negative financial margins in each mission for each department. Additional information was gained by correlating various mission activity subsets with the respective expense and revenue amounts.

In order to correlate the objective teaching data with the MAP survey teaching hours, regression analysis was used. The total teaching hours recorded in the survey for all categories for each department were compared to the sum of the percentage contributions of that department in each objective teaching category. In order to determine the correlation of the data submitted in the FY2002 and FY2003 years, the hours in each category of mission activity, adjusted for the number of FTEs (full-time equivalents), were compared. A p value less than 0.05 was considered significant.
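A rough sketch of this comparison is shown below, assuming SciPy is available. The per-department values are invented, and the use of a simple linear regression is one reasonable reading of the analysis described, not necessarily the exact procedure used.

```python
# Hypothetical check of MAP-reported teaching hours against objective measures.
from scipy.stats import linregress

# One value per department: share of objective teaching measures vs. MAP hours.
objective_share = [0.02, 0.05, 0.08, 0.10, 0.15, 0.20, 0.18, 0.22]      # illustrative
map_teaching_hours = [1200, 3500, 5100, 6800, 9800, 13500, 11900, 14800]  # illustrative

result = linregress(objective_share, map_teaching_hours)
print(f"slope={result.slope:.0f}, r={result.rvalue:.3f}, p={result.pvalue:.4f}")
# A p value below 0.05 would be taken as a significant association.
```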

The First Two Years

Below we describe the results of using the MAP system during its first two years. Faculty surveys were completed three to four months after the end of each of the two fiscal years so that the financial information would match the recording of effort. In our description below, FY2002 refers to the fiscal and academic year of September 1, 2001, through August 31, 2002, and FY2003 refers to the fiscal and academic year of September 1, 2002, through August 31, 2003. During FY2002, 880 survey instruments were completed, representing 802.7 FTE; for FY2003, 987 survey instruments were completed, representing 892.8 FTE (the number of salaried faculty had increased substantially by the time the second survey was conducted). Faculty indicated a median time for completing the survey instrument of 2.0 hours for FY2002 (with several faculty indicating substantially longer times to complete the survey). The median time for completing the survey instrument for FY2003 was 1.5 hours.

Table 2 shows the hours that all the school’s faculty reported, in the two surveys, toward fulfilling the various missions. Teaching hours shown included hours delivering lectures to students, non-lecture time (e.g., small group sessions, ward rounds), lecture preparation, teaching during clinical care (50% of the time when clinical care was being delivered and the faculty indicated that teaching was also occurring), teaching during research (50% of the time when teaching was being done during research activities), commuting between teaching at different locations, administrative time for teaching (e.g., organizing student rotations), and teaching development (e.g., taking courses on teaching). For the purposes of this report, teaching included hours spent with all students of the university: medical, dental, graduate, nursing, allied health, and all graduate medical trainees in programs sponsored by the institution (e.g., residents, fellows).

Also shown are the hours recorded in the clinical mission. These were the hours delivering direct clinical care (i.e., where the faculty were providing clinical care without teaching), delivering clinical care where teaching was occurring (50% of the clinical care time when teaching was occurring), commuting between clinical activity at different locations, other professional activities (e.g., dictating, record reviewing, legal work), faculty clinical development time (e.g., learning new clinical skills, continuing medical education courses), and hours spent “on call.” Time recorded for faculty who were required to remain in the hospital on call (e.g., obstetrics, anesthesiology) was recorded in direct clinical care. Call time outside the hospital was divided into “pager call” (where a faculty member was not in the hospital but had to be available by pager to return, such as a surgeon on call) and “phone call” (where a faculty member was not in the hospital but had to be available by phone for consultation). The time spent on “pager call” or “phone call” was not included in the time allocated to the clinical mission for the purposes of determining the allocation of faculty-related expenses in the financial margin.

The hours recorded in the research mission were: time when the faculty was conducting research without teaching students, time when teaching was occurring (50% of the time was allocated to teaching and 50% to research), and time spent in faculty research development (e.g., learning new research skills). Time spent in research was not differentiated between the types of research (e.g., bench research, clinical research, population research).

Excluding pager and phone call, the distribution of the hours entered for FY2002 and FY2003 was 36.2% teaching in the first year (35.9% in the second year), 36.8% clinical work (36.4%), 16.9% research (17.7%), and 10.1% administration and service (10.0%). When the aggregate financial spreadsheet for the entire school is merged with the mission activities, the pattern of finances that emerges is shown in the financial margin bar chart in Figure 1. As noted by the negative margin, the cost of education far exceeded its allocation of funds. As expected, the clinical mission was the major activity for which revenues exceeded expenses, with the research mission also providing a positive margin (note that research revenues included nonfederal payers and endowments). Also as expected, the administration and service mission expenses exceeded revenues. Hence the cross-subsidization of the teaching mission and the administration and service mission occurred primarily from the clinical mission and, to a minor extent, the research mission.

The objective information collected for the missions was used to confirm and corroborate the activity measures that were recorded by the surveys. This was particularly important for the teaching mission, where independent measures of clinical teaching were often not available. Figure 2 is a bar plot by department, for the first survey year, of the percentage of the entire teaching load as measured by independent objective data on teaching activity in each of four categories (undergraduate medical education course directorships, medical school didactic course contact hours, third-year medical school clerkship faculty hours, fourth-year medical school student rotation hours) versus the total teaching hours for each department that were collected by the survey (i.e., MAP total teaching hours). The general concurrence of the first four measures with the MAP measure supports the belief that the reported survey hours reasonably reflected the relative contribution of each department to the overall teaching load. Using regression analysis, there was a significant correlation between aggregated objective measures of teaching and total teaching hours reported in MAP, with a p value of .011.

Bar charts similar to that shown in Figure 1 were constructed for individual departments and served to highlight issues that then formed the basis of departmental strategic decision making. Information gained during the first MAP program year (FY2002) was used to assist the school leadership in the allocation of institutional funds for FY2004 (September 1, 2003 through August 31, 2004). However, since the effects of the FY2004 allocations were not known prior to the start of FY2005, there were no reallocations based on the information from the second MAP program year (FY2003). As the effects of the FY2004 allocations were understood in the context of information gathered during the third MAP program year (FY2005), it is expected that a refinement in the allocation of institutional funds will occur in FY2006 or later.

We learned from this experience that the use of a full year of faculty survey and financial data, which must be collected and analyzed in the fiscal year following the year surveyed, results in a two-year delay between the year surveyed and implementation of changes in resource allocation. This delay could be reduced by utilizing a shorter survey period (e.g., first six months of the fiscal year), with analysis during the second half of the fiscal year and changes in allocation in the next fiscal year. Alternatively, since we observed little change in faculty activity between the two years, data from a previous year could be used to predict the subsequent year’s activity if no significant shift in activity has occurred. In future years, it is expected that insights derived from the MAP system will enable the school leadership to maintain a solid alignment of institutional funds with the missions and strategic plans of the school.

The emphasis in this analysis has been directed to the development of insights involving the net balance and cross-subsidization of school and departmental missions for the purpose of guiding strategic planning while improving overall financial performance. The analysis is also a tool to assist the departments in moving toward the vision of the school’s leadership. Obviously, this was particularly important where negative margins in key mission areas were of concern, or where the operating margin for the entire department was negative. A good example is shown in Figure 3 for a department that had substantial external support for the educational mission yet an overall negative departmental margin. In addition, for this department the usual areas of positive margin (clinical and research) were also negative. The plan at the time of the first survey had been for the department to expand its clinical enterprise to improve the operating margin. However, the actual negative margin in clinical work suggested that simply expanding the existing clinical paradigm would worsen the margin. Instead the department was encouraged to reexamine each clinical activity, reduce those with negative margins to provide only the services necessary for the educational mission, and expand in areas of positive clinical margin. In this case the analysis suggested a strategic change in clinical activity was needed, not a change in resource allocation. As seen in the second survey year, this philosophy was helpful.

A second example is seen in Figure 4, which shows a financial bar chart for a specific department. In this department, the slightly negative operating margin significantly worsened in the second year (FY2003). Our analysis suggested that the overall change was caused by worsened margins in all mission areas. Of particular interest was the change in the research mission. To understand this change, objective data and the survey data for all departments were merged (Figure 5): the average research support per hour of research activity was plotted against the average hours per FTE recorded in the research mission for every department. Two departments depart substantially from the roughly linear relationship followed by the rest. One (Department Y) has relatively high funding per hour of research; the other (Department Z, the department in question) has low research support despite a high number of research hours per FTE. This suggested that the department in question could improve its financial performance by increasing external research funding or decreasing time spent conducting underfunded research. Alternatively, some effort in research could be reallocated to provide revenue to the department by increasing clinical activity and income. Again, a strategic change in faculty effort was needed, not a reallocation of resources.

Why the System Works

The success of the implementation of the MAP system was due to several factors. First, the leadership of the medical school indicated its commitment to the system by visiting all departments to explain the system. Second, because the survey tool was developed by the faculty themselves, it reflected their efforts well. The final survey tool also included various measures of academic productivity that could be used by the departmental leadership for faculty mentoring and career guidance. Concerns about misuse of individual data by central administration were addressed by reporting only aggregate data to the leadership of the school. Hence, only the faculty and the departmental leadership had access to individual data. Finally, only faculty in clinical departments participated in this project, obviating the need for comparisons between clinical and basic science faculty. In this context, the faculty embraced the opportunity to develop the survey and willingly completed the tool.

Concerns have been raised that a reporting tool such as this may be misconstrued as being in conflict with other effort-reporting measures, such as reporting time on federally funded projects or contracted time (e.g., Veterans Administration salary eighths). Since the data were immediately available to the faculty and departmental leadership as the survey instruments were completed, the faculty had an opportunity to ensure that the reporting was congruent among the various reports. This issue is complicated by the fact that reporting periods are not identical for different sources of funds, and definitions of time and effort differ. The school of medicine specifically asked faculty to compare their self-reported MAP data to federal effort reports to ensure appropriate completion of all forms.

This method of mission-based analysis differs substantially from those described in the majority of the published articles on the topic. The fundamental difference is that our approach estimates the relative operating margin for each mission by determining the net cost of activity in the missions; the productivity and/or relative value of the activities contributing to each mission was not used as a primary measure. It should be noted that we placed no constraints on the data faculty could enter, unlike some approaches that force a limited number of hours worked per week.2,4 We also made relatively few assumptions regarding time allotments (i.e., limitation of lecture preparation time, the percentage of clinical and research time allocated to teaching when the faculty indicated that teaching was occurring with these activities). The cost of mission activities so measured includes all costs directly linked to mission activities as well as the part of faculty salaries allocated by the proportion of faculty time devoted to each mission. This allows us to focus directly for future planning on the largest expense and the most crucial resource, faculty time.2,3 The result is a more realistic assessment of the faculty (salary) cost of a mission. In addition, since each salary cost can in theory be directed to another mission, the opportunity cost of faculty activities can be estimated. 

In contrast, a system that emphasizes productivity bases resource allocation on past strategic decisions and past performance. Such a system has been termed “backward looking” when future allocations are made strictly on the basis of past decisions and data. It tends to drive the organization toward the status quo unless sufficient incentives are built in to approach new activity targets. Further complicating this approach are the lags between the time the activity is measured, the time the resources are distributed, and the time the next assessment is made to determine the effect. In effect, by the time the data are collected from the past budget year, the next opportunity for resource reallocation is either midyear or the following fiscal year, often resulting in up to a two-year lag. This built-in delay makes decision making difficult.4

The approach presented here does not include valuations of particular activities based on past strategic decisions. As a result, the data are more straightforward, and strategic decisions regarding future resource allocation can be made more readily, based on an assessment of the actual cost of faculty effort as well as opportunity costs associated with particular activities. In this sense, the MAP approach is more forward-looking than other MBM approaches. However, the time delay associated with the use of a full fiscal year as the survey period led to a two-year delay between survey year and changes in resource allocation for the MAP system, as would be true with other MBM systems also. This long delay could be reduced by use of a shorter survey period, as noted previously.

A system of resource allocation based on productivity—“leadership by the numbers”5—also hampers strategic planning by the need to then follow the data.6 This often leads to the search for “perfect” data, an exercise that adds complexity while delaying decision making.7 This focus on data may distract from strategic thinking by impeding the ability to respond to strengths, weaknesses, opportunities and threats to the school’s missions. Hence our goal was to develop a less complex system that gathered meaningful information to facilitate basic and strategic decision making.

We achieved our goal of a simple, inexpensive system which can provide helpful insights in making strategic decisions. Given the relatively low cost of the MAP system, it is important to emphasize that the real work was distributed among faculty and administrators, who we believe had a vested interest in the project. The collection and correction of survey data were done by the departmental administrators. Actual data gathering and entry were tasked to individual faculty. The financial analyses were conducted by the office of the associate dean for finance of the medical school. Only one individual (TS) received direct salary support and that was a faculty member who acted as the intermediary between the departments and the medical school administration on a part-time basis. This person developed the data repository, managed the data collation, and provided the school leadership with the aggregated effort data that were merged with the financial information. He also worked with the department chairs and administrators to ensure the completeness and integrity of the data while assuring its confidentiality. As a result, the medical school administration was able to work confidently with aggregate departmental data while individual departments were able to manage their faculty as best suited the department’s mission, all in a complex and challenging environment where multiple missions must be served by faculty with widely differing individual interests and motivations.6,8,9 For some departments, initiatives to increase the clinical and/or research missions consistent with the school’s vision were developed with the understanding that they would eventually have positive margins. In others, where the operating margins for the clinical and research missions were negative (or where the overall balance was negative), the data were examined for insights that could help guide the department leadership.

Clearly, the real cost of the MAP system is the faculty time and effort in the survey completion and related administrative costs. Fortunately, the time to complete the survey decreased in the second year, suggesting that faculty were more familiar with the instrument. Other than the part-time individual mentioned above, there were no recurring fixed costs. Hence, the costs to use survey data in subsequent years are minimized. The similarity of the data between the two survey years suggests that the frequency of subsequent surveys can be decreased unless major shifts in faculty number, activity, or school financing occur. Other approaches have focused on assessing productivity for medical school missions, and proposals for relative weighting of these products have been published.1,2,7,10 –13 Each of these approaches requires that a value judgment be derived that can be used to strategically drive the productivity of the faculty if reallocation of faculty activity is anticipated. However, previous publications have indicated that academic faculty are generally not accustomed to having their activity measured in this way.9 Therefore a method that surveys time utilization may be less threatening than one focused on productivity or the relative values of various activities. While the assessment of productivity (and any associated incentive plans) can be effective in improving positive margins and providing funds for missions with negative margins, this approach can be threatening for the teaching mission or the administrative and service mission, where increased productivity may not increase a positive margin or even contribute to a meaningful expansion. Further, measuring and valuing teaching conducted during the clinical and research missions is quite difficult, as objective measures are hard to come by. Research may be difficult to assess, since productivity is cyclical and has a built-in lag time.14 When RVUs are used as the basis for clinical incentive systems,2,10 these units do not allow assessment of the impact or quality of the interaction, especially if teaching is occurring during the clinical care. Similarly, it is difficult to assess the impact or quality of teaching in a large-group format versus more personalized teaching to a small group or an individual.3,7

The MAP system presented here is useful independent of the issue of quality, since it assesses only the cost of the activity. This cost of activity assessment also allows estimation of the opportunity costs when certain activities are reduced in favor of another, or when a new activity is introduced and the costs and revenues can be estimated. We agree that quality issues must be considered, but independent quality indicators can be used to weigh the various mission activities in light of their costs and operating margins.

Mission-based management, as promoted by the AAMC, has been defined as “a process for organizational decision making that is mission driven, ensures internal accountability, distributes resources in alignment with organization-wide goals, and is based on timely, open, and accurate information.”8 The methodology we present allows calculated “strategic” decision making, since the direction of movement is not predetermined and the estimated costs of strategic initiatives can be developed. As challenges or opportunities present themselves, the impacts of shifts in activities can be estimated because the relative costs and impacts on the operating margins can be approximated.

One potential drawback of the method presented here is the necessity to use a survey instrument. Previous publications have noted the value of faculty survey techniques in budgeting and manpower planning.2 The survey method, when linked to productivity measures, has the inherent disadvantage of potential abuse, especially when known incentives cause the faculty to inflate the time entered.2,8,15 Since the current system does not link directly to productivity, the potential for abuse is reduced. Also, the independent collection of activity indicators from available school data allows confirmation of the validity of the survey’s findings. If, as in our case, the survey instrument is developed by the faculty themselves, faculty anxiety is reduced.8,13,14 Further, since no weighting of the value of activities occurs with this method, the faculty have more confidence that the data will not be manipulated. The only manipulation in this system is through the use of conventions for lecture preparation and for the fraction of teaching credited when clinical and research activities are occurring simultaneously. Fortunately, these were set by the faculty conventions group and updated based on the findings of the initial faculty survey. Other survey-based methods have reported poor faculty participation in the survey.9 However, since each departmental administrator had a vested interest in ensuring that the full departmental contribution to the missions was recorded, we achieved full participation.

Survey methodology can also be criticized because of the inaccuracies inherent in the memory of faculty regarding their activities over the past year. In our school, the survey results were remarkably unchanged in hours logged per FTE when the two survey years were compared, suggesting that the data are likely representative of the distribution of faculty effort. When the aggregate data from FY2002 and FY2003 are compared on hours per FTE in each category of reporting, the p value is .0001. Consistent with this, a large number of faculty resubmitted the data from FY2002 for the FY2003 year. However, if the accuracy of data is to be confirmed, an independent system for recording activity (e.g., lectures) would be needed. This would clearly increase the complexity (and cost) of the methodology and likely still fall short of measuring the teaching associated with clinical and research activities. The survey has the advantage of establishing time utilization patterns that can be used for mentoring and strategic planning of faculty members’ careers. For example, the extent to which an individual’s survey deviates from the expected can serve as a focus for this planning.

With respect to the survey being based on a faculty member’s recollection of his or her time utilization, it is essential to use corroborating objective information if decisions are made regarding resource allocation. A variety of information sources are generally available for the clinical and research missions. However, key objective measures may need to be derived for the teaching mission. At our institution a variety of measures of teaching effort (e.g., shown in Figure 4) were readily available and were used to corroborate the relative distribution of teaching effort.

One useful aspect of this approach is that the data can be used for any subset of the entire dataset (e.g., by school, departmental, division, or individual faculty member). Thus the data are helpful for the school to assess its overall operating margins, for departments to review their contributions to the missions, and for individual faculty to consider their progress towards promotion and tenure. Two departments utilized the survey to analyze clinical productivity (based on RVUs) and the faculty time recorded in the survey. This was used to assist faculty in focusing their efforts and balancing their academic time. A third department utilized the faculty survey by comparing the effort recorded to the acquisition of grant funding and publication of scholarly manuscripts in order to assess accountability for academic research time. Finally, the data may be useful to provide reports of accountability and activity to external stakeholders (such as time and effort reporting).

A second useful attribute of this method is that the simplicity of the operating margin analysis allows assessment of the impact of different assumptions on conclusions. For example, as mentioned above, state appropriations were divided between the teaching and administrative missions. It would be straightforward to assess the impact of changing the 80:20 ratio for this, or the inclusion of some of these funds to stimulate research. Similarly, it is not difficult to assess the impact of apportioning a different proportion of clinical time to teaching. In our original analysis, we allocated 50% of clinical time to teaching if both activities were occurring simultaneously. As shown in Figure 6, we recalculated the operating margin for the school of medicine by assuming all clinical time was applied to the clinical mission, and none to teaching. Comparing Figure 6 with Figure 1, which utilized a 50% allocation of clinical time to teaching if teaching of students was occurring, the operating margin is improved in the teaching mission and decreased in the clinical mission, reflecting the impact of faculty (and related) expenses on the clinical mission. This difference suggests the degree of cross-subsidization from the clinical mission that is needed to support the teaching mission.
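Because the margin analysis is a linear allocation, such sensitivity checks amount to re-running the same arithmetic under different parameters. The sketch below illustrates the idea with invented figures; it is not the school’s actual model, and only the teaching and clinical missions are shown.

```python
# Illustrative sensitivity check: how margins shift when the clinical-time-to-teaching
# fraction or the state-appropriation split changes. All dollar figures are invented.

def mission_margins(teach_share_of_clinical=0.5, state_split_to_teaching=0.8,
                    state_funds=4_000_000, clinical_with_teaching_cost=6_000_000):
    """Return (teaching margin, clinical margin) under the given assumptions."""
    teaching_revenue = state_split_to_teaching * state_funds
    teaching_expense = 2_000_000 + teach_share_of_clinical * clinical_with_teaching_cost
    clinical_revenue = 18_000_000
    clinical_expense = 8_000_000 + (1 - teach_share_of_clinical) * clinical_with_teaching_cost
    return teaching_revenue - teaching_expense, clinical_revenue - clinical_expense

print(mission_margins())                           # baseline: 50% split, 80/20
print(mission_margins(teach_share_of_clinical=0))  # all shared time charged to clinical
```

Under these invented inputs, charging all shared time to the clinical mission improves the teaching margin and reduces the clinical margin, the same direction of change described for Figure 6.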

In conclusion, over the two years during which the MAP system has been used, we have gained much useful insight into the budgetary challenges faced by the medical school. The data have been useful in the allocation of state resources to incrementally correct maldistributions caused by historical methods that no longer reflect actual contributions to the school’s activities or their desired strategic directions. As a simple, inexpensive tool, it has been easily integrated into the budgeting and planning process and has served to inform strategic decision making and resource allocation.

This article was originally published in the November 2005 issue of Academic Medicine.

References

1. Nutter DO, Bond JS, Coller BS, et al. Measuring faculty effort and contributions in medical education. Acad Med. 2000;75:199–207.
2. Daugird AJ, Arndt JE, Olson PR. A computerized faculty time-management system in an academic family medicine department. Acad Med. 2003;78:129–36.
3. Watson RT, Romrell LJ. Mission-based budgeting: removing a graveyard. Acad Med. 1999;74:627–40.
4. Whitcomb ME. Mission-based management and the improvement of medical students’ education. Acad Med. 2002;77:113–14.
5. Howell LP, Hogarth MA, Anders TF. Implementing a mission-based reporting system at an academic health center: a method for mission enhancement. Acad Med. 2003;78:645–51.
6. Ridley GT, Skochelak SF, Farrell PM. Mission-aligned management and allocation: a successfully implemented model of mission-based budgeting. Acad Med. 2002;77:124–29.
7. Mallon WT, Jones RF. How do medical schools use measurement systems to track faculty activity and productivity in teaching? Acad Med. 2002;77:115–23.
8. Brigham EJ, Tellers CA, Rondinelli R. Academic survival through mission-based management. Am J Phys Med Rehabil. 2001;80:778–85.
9. Garson A, Strifert KE, Beck JR, et al. The metrics process: Baylor’s development of a “report card” for faculty and departments. Acad Med. 1999;74:861–70.
10. Cramer JS, Ramalingam S, Rosenthal TC, Fox CH. Implementing a comprehensive relative-value-based incentive plan in an academic family medicine department. Acad Med. 2002;75:1159–66.
11. Bardes CL, Hayes JG. Are the teachers teaching? Measuring the educational activities of clinical faculty. Acad Med. 1995;70:111–14.
12. Hilton C, Fisher W Jr, Lopez A, Sanders C. A relative-value-based system for calculating faculty productivity in teaching, research, administration, and patient care. Acad Med. 1997;72:787–93.
13. Coleman DL, Moran E, Serfilippi D, et al. Measuring physicians’ productivity in a Veterans’ Affairs Medical Center. Acad Med. 2003;78:682–89.
14. Howell LP, Hogarth M, Anders TF. Creating a mission-based reporting system at an academic health center. Acad Med. 2002;77:130–38.
15. Ruedy J, MacDonald NE, MacDougall B. Ten-year experience with mission-based budgeting in the faculty of medicine of Dalhousie University. Acad Med. 2003;78:1121–29.
16. Johnson SE, Newton WP. Resource-based relative value units: a primer for academic family physicians. Fam Med. 2002;34:172–6.

Aligning Compensation with Education: Design and Implementation of the Educational Value Unit (EVU) System in an Academic Internal Medicine Department


Steven Stites, MD, Lisa Vansaghi, MD, Susan Pingleton, MD, Glendon Cox, MD, and Anthony Paolo, PhD

Abstract

The authors report the development of a new metric for distributing university funds to support faculty efforts in education in the department of internal medicine at the University of Kansas School of Medicine.

In 2003, a committee defined the educational value unit (EVU), which describes and measures the specific types of educational work done by faculty members, such as core education, clinical teaching, and administration of educational programs. The specific work profile of each faculty member was delineated. A dollar value was calculated for each 0.1 EVU. The metric was prospectively applied and a faculty survey was performed to evaluate the faculty’s perception of the metric.

Application of the metric resulted in a decrease in university support for 34 faculty and an increase in funding for 23 faculty. Total realignment of funding was US$1.6 million, or an absolute value of US$29,072 ± US$38,320.00 in average shift of university salary support per faculty member. Survey results showed that understanding of the purpose of university funding was enhanced, and that faculty members perceived a more equitable alignment of teaching effort with funding.

The EVU metric resulted in a dramatic realignment of university funding for educational efforts in the department of internal medicine. The metric was easily understood, quickly implemented, and perceived to be fair by the faculty. By aligning specific salary support with faculty’s educational responsibilities, a foundation was created for applying mission-based incentive programs.

The rapidly changing environment of academic medicine continues to pose challenges to its leaders. Those responsible for allocating funding within academic medical centers face the pressures of limitations in resources related to decreased government funding for medical education and increasing demands on faculty time, along with increased demand for public accountability.1 Increasingly, leaders of academic medical centers are recognizing the importance of developing systems that specifically assign resources in support of all academic missions, but especially the mission of educating students and residents.2,3

In 1999, Watson and Romrell reported development of a process that came to be known as “mission-based budgeting.” The three-step process described by the University of Florida group consisted of identifying revenue streams to fund each of the institution’s missions, evaluating each faculty member’s productivity with regard to each mission, and aligning funding source with faculty effort.4 Interest in mission-based budgeting and management has grown. The Association of American Medical Colleges (AAMC) has established a Mission-Based Management (MBM) Program to aid deans and department chairs in the task of realigning funds to match missions. In 2000, position papers of the AAMC’s MBM task force emphasized the need for deans and faculties to develop formalized methods for allotting financial resources to support their institutions’ goals in education, research, and patient care.3,5 The MBM task force for medical education emphasized that each medical school should establish guidelines and metrics consistent with the school’s education mission.3

In addition, the task force suggested applying a template for approaching MBM in education, beginning with listing all faculty educational activities, then assigning each activity a weight in relative value units (RVUs). Factors recommended for consideration included time required to perform the educational function, time required to prepare, level of faculty expertise, and the relative importance of the activity to the professional development of the institution’s trainees and to the institution’s mission. The group also recommended attempts at the potentially difficult but important task of linking compensation to quality of teaching, rather than focusing exclusively on quantity of work.3

The AAMC’s call for more robust mechanisms for faculty evaluation and compensation has led to several published chronicles of experience in MBM.2,6 –10 Initial reactions to MBM have been mixed, with obstacles including faculty’s resistance to change and logistical difficulties with collecting data.2 However, deans and department chairs are recognizing the value of quantifying their faculty’s educational and clinical activities as a method of developing evidence-based accountability of faculty for progress toward goals.11,12

In the midst of growing discussions of MBM within the community of academic medicine, data collection was underway at our institution to define how our faculty members devoted their time to each of the department’s missions. The intent was to link faculty members’ compensation to designated funding sources according to mission, with clinical productivity defining compensation for patient care, grant dollars supporting research, and state funding supporting education of students and residents. However, the MBM data had not yet been used to develop a metric for distributing state-appropriated dollars for education.

Funding for our department’s educational mission originates from two sources: state appropriations and Medicare Direct Medical Education (DME) funding. The state of Kansas appropriates funding for the University of Kansas Regents’ system, which then allocates funding for the University of Kansas Medical Center. The University of Kansas School of Medicine receives a portion of the funds, which are then allocated to academic departments by the dean.

In addition, DME Medicare dollars are paid from the University of Kansas Hospital to the School of Medicine, and these funds are pooled with the state appropriations. Resident education at our institution is therefore underwritten in part by the state of Kansas: our relatively low portion of Medicare DME funding is insufficient to support the salaries and benefits of our residents and cannot provide compensation for faculty members’ efforts in resident education.

Distribution of departmental funds, including the portions of clinical revenues and research overhead distributed to the divisions, in addition to the allocated state funds, has previously been the prerogative of the department chair. No clear metric linking these funds with mission for individual faculty was used. For example, some subspecialty divisions with minimal direct involvement with teaching had historically benefited from generous allotments of university funding, while other divisions, such as general medicine, received disproportionately little university support despite having extensive educational responsibilities. This inequity was further compounded by the assignment of the bulk of the funding to individual faculty salary lines with no provision for meaningful adjustments from year to year based on changing levels of responsibility for and participation in mission-critical education activities. Absent a well-defined method of linking mission-related activities to compensation, faculty expected that their individual levels of state funding would at least remain stable from one fiscal year to the next. More often, the expectation was that their compensation would automatically increase at the same rate as that of other state employees. Yet in years of budgetary contraction, the expectation was that faculty salaries would not decrease. Finally, the historic methods of allocating state funds to faculty left no room for the creation of meaningful incentives for clinical productivity and for exceptional performance in medical education, particularly for faculty members whose salary sources were not consistent with their mission-based activities.

In order to improve upon our institution’s history of ill-defined revenue streams and to strengthen financial support of educational efforts, our department of internal medicine was tasked by the university with developing a system for distributing state appropriations that supported the educational mission of the school of medicine. Our department was selected to pilot a mission-based budgeting effort because departmental distribution of university funding was perceived to be especially misaligned, and because the department was one of the few within the institution lacking incentive programs for clinical productivity and teaching. We formed a committee to address the challenge. Here, we report the design and implementation of a simple, prospective, and time-based system for compensating educational efforts in our internal medicine department, and our faculty’s responses to the changes. The educational value unit (EVU) system resulted in the alignment of expectations of physician’s educational effort with compensation and accountability, dramatically changing how our department paid for the educational mission and how our faculty understood its funding.

Development of the EVU Metric 

In 2003, the EVU Task Force was formed with eight members including division directors, residency program directors, clerkship directors, and financial administrators, as well as representation from the medical school leadership. The committee met weekly for four months. Initially, efforts were focused on a review of recent mission-based analysis of faculty activities and compensation at the medical school. Faculty Medicare time sheets, historical distribution of financial support, and the educational responsibilities of the department were carefully considered. In addition, a review of the literature was conducted to identify other efforts in the medical education community to align educational effort with compensation.

As suggested by the AAMC’s MBM Task Force,3 the committee began the RVU-based method of listing all educational activities in the department and assigning a relative weight. We encountered difficulty with comparing and assigning value to the various teaching efforts of our faculty members and were concerned about the subjectivity inherent in the weighting process. Standardizing the array of each faculty’s educational activities with an RVU-based scale and translating these values into a specific dollar amount challenged our ideal of defining expectations prospectively for each faculty member and detracted from faculty’s autonomy in determining how they could best contribute to education. Moreover, the foreseeable task of updating the system with each change in curriculum or personnel was daunting.

After much deliberation, the committee determined to generate a new metric. Several criteria were identified as vital. The committee sought a system that could be easily understood and adopted by faculty; one that would engender a prospective, goal-setting approach; and that would allow efficient use of faculty time and resources. Based on these criteria, the group decided against an RVU-based metric, and chose instead to create a time-based metric.

The EVU was defined as a unit of time spent in education of students and residents. By using a time-based metric, we avoided subjective assignment of relative values to different educational activities, and chose to value different educational activities of faculty members with the same metric, regardless of subspecialty or level of experience. In order to translate time spent in teaching effort to EVUs, 0.1 EVU was designed to represent approximately four hours of work per week. In theory, the EVU for a particular activity represents the fraction of the time devoted to purely education related functions while completing the activity.

After developing the concept of the EVU and relating it to faculty time, the committee further defined the core and clinical subdivisions of the EVU (see Table 1). The core EVU was defined as teaching time spent educating students and residents that is not associated with billable clinical activity. Examples of core education include time spent participating in Grand Rounds, Morning Report, clinicopathologic conference, small-group discussion with medical students, and all development time for didactic lecture preparation and presentation. Core EVU time was also allotted for the administration of education, for residency program directors, fellowship program directors, and clerkship directors (see Table 2). The allotment for program administration was taken from national certifying bodies as well as our own experience. For the first year of implementation, each faculty member was presumed, based on committee consensus and review of prior mission-based reports, to contribute a baseline of 0.2 core EVU while conducting non-billable clinical activities. This presumption was to be validated during the year with recorded logs of educational time submitted by faculty members.

In contrast, the committee defined clinical EVUs as those associated with billable clinical activities, and thus the data could be accrued automatically based on inpatient and outpatient attending schedules (see Table 3). Clinical EVUs were not meant to fully replace clinical income, but rather to compensate for the expected decrease in faculty efficiency and productivity during patient care in the presence of learners.13 For example, a faculty attending on an inpatient service would accrue a clinical EVU allotment in recognition of his or her time spent during rounds in bedside teaching, including listening and providing feedback to a learner’s presentation of a patient, and informal discussions of diagnostic and therapeutic topics related to a specific patient’s care. However, if the attending physician also presented a lecture for the team that was not directly related to patient care, then time spent preparing and delivering the lecture would be recorded in the faculty’s core EVU log. The value of the clinical EVU was based on the committee’s analysis of mission-based reports and their collective experience with the impact that student and resident learners have on rounding efficiency in our institution.

The committee communicated plans for implementation of the new metric to the faculty through divisional and departmental meetings, e-mails, and one-on-one sessions with faculty members to review expectations for teaching time and compensation. Faculty members kept core EVU logs of hours spent in teaching and administration and submitted them during the year for review.

Implementation of the EVU Metric and Faculty Response 

The calculated annual total EVU production for the department was 24.8, including 15.4 core EVUs (of which 2.3 were administrative) and 9.4 clinical EVUs. When the total university educational funding for the department (US$3.12 million) was divided by this total, each 0.1 EVU was worth US$12,562.00.

An EVU template was developed for each faculty member, allowing them to determine their proportion of work and compensation for the educational mission. The EVU calculation shown below is for a hospitalist with 4.5 months of inpatient rounding and 2.5 months of general medicine consults, who also serves as student sub-internship clerkship director. This faculty member’s total EVU allotment is 0.4275, producing a total of US$53,702.55 of educational salary support from university funding.

Clinical EVU:
Inpatient attending: 0.020/month × 4.5 months = 0.09 EVU (US$11,305.80)
Consults with resident: 0.015/month × 2.5 months = 0.0375 EVU (US$4,710.75)
Total clinical EVU: 0.1275 (US$16,016.55)

Core EVU:
Baseline expectation: 0.20 EVU (US$25,124.00)
Administrative (subinternship director): 0.10 EVU (US$12,562.00)
Total core EVU: 0.30 (US$37,686.00)

Total: 0.4275 EVU (US$53,702.55)
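The arithmetic behind this template can be sketched as follows. The rates and the US$12,562 value per 0.1 EVU come from the example above; the variable names and layout are illustrative, not the department's actual spreadsheet.

```python
# Sketch of the EVU template calculation for the hospitalist example above.

DOLLARS_PER_TENTH_EVU = 12_562.00   # value of each 0.1 EVU, from the text

clinical_evu = 0.020 * 4.5 + 0.015 * 2.5   # inpatient attending + resident consults
core_evu = 0.20 + 0.10                     # baseline expectation + subinternship director
total_evu = clinical_evu + core_evu        # 0.4275 EVU

support = total_evu / 0.1 * DOLLARS_PER_TENTH_EVU
print(f"total EVU = {total_evu:.4f}, university support = US${support:,.2f}")
# total EVU = 0.4275, university support = US$53,702.55
```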

Fifty-seven faculty members had a change in their salary structure as a result of the EVU system (see Figure 1), 34 of whom had a decrease in salary support from the university. Among those whose university support decreased, the mean change was -US$28,814 ± US$30,158 (mean ± SD) for a net loss for those faculty of US$979,676. The remaining 23 faculty saw an average increase in university support of US$29,453 ± US$19,979 for a net gain of US$677,419. Overall, there was a total realignment of US$1.66 million in funding among faculty members with an average shift in university funding, in absolute dollars, of US$29,072 per faculty member. In addition to the 23 faculty who had an increase in university dollars, there was a net gain of US$302,257 in distribution of university funds that was used for salary support for new faculty. Specifically, ten new faculty members were hired with a base of 0.2 core EVU, or US$25,124, per faculty, and the remaining approximately US$50,000 was held in reserve for support of educational missions, including recruitment of faculty for the following academic year. A number of faculty members who were heavily involved in teaching were able to decrease their clinical responsibilities, allowing time for teaching activities while maintaining their salaries. Those who were less involved in teaching had a decrease in university educational support, and as a result were more dependent on clinical productivity to maintain their salaries. Despite the large shift in university funding distribution, application of the metric did not appreciably change total faculty compensation, but rather created a realignment of salary sources with the department’s educational and clinical missions. Individual faculty members who faced a decrease in university funding because they did not have a significant teaching mission were given adequate warning and were expected to increase their clinical productivity or identify other sources of salary support.

In December 2003, four months after implementing the EVU system, a faculty survey was conducted to evaluate changes in faculty perceptions regarding distribution of state funds. Faculty were asked their perceptions of the purpose of university educational funding before and after implementation of the EVU system, the implications of the EVU compensation for educational productivity, and their perceptions of the fairness of the EVU metric. Potential differences in faculty perceptions before and after implementation of the EVU system were evaluated using proportional analyses.

Individuals excluded from participation in the survey were volunteer faculty (not eligible for university funding), emeritus professors, and members of the EVU committee. Although 57 faculty members had changes in university funding, 79 questionnaires were distributed to the remaining full-time department faculty members. (Twenty-two were faculty members who were considered full-time but were on research or other tracks.) Twenty-nine faculty members returned completed questionnaires (37%).

Faculty members were asked to identify their perceptions of the purpose of university funding before and after implementation of the EVU system. They were asked to check all responses that applied. While only 27 of 84 responses (32%) indicated that university funding had been directed toward teaching efforts under the previous system, 28 of 44 responses (64%) indicated that the EVU system matched teaching efforts with university funds (p < .001) (see Table 4). We found no other statistically significant differences.

When asked about the implications of the EVU compensation system for educational productivity in their division, 11 (39%) faculty members believed productivity would be better, and 13 (46%) felt that it would be unchanged. With regard to research productivity, two (7%) felt that it would be better, 17 (59%) believed it would be unchanged, and ten (35%) believed research productivity would be worse than it was under the previous system.

Faculty members were also asked about their perceptions of the fairness of the dollar amount assigned to each 0.1 EVU. Fourteen (47%) respondents stated that the dollar amount was “somewhat fair” or “very fair,” and 6 (20%) faculty members thought that the dollar amount was somewhat or very unfair.

An additional outcome of implementing the EVU system was a dramatic improvement in faculty attendance at Grand Rounds, clinicopathologic conference, and Morbidity and Mortality Conference. For example, attendance at Grand Rounds more than doubled, from 14 faculty members per session to an average of 31 faculty members per session. We verified faculty self-reporting logs against the residency program lecture schedules, conference attendance rosters, and other backup data to ensure accuracy, and we found no evidence of faculty overreporting of educational effort.

Conclusions

We have described the development and application of a simple EVU metric that has allowed alignment of educational expectations with compensation and accountability in an academic department of internal medicine. The metric is easily understood, quickly implemented, and perceived to be fair by the faculty. After initially attempting to adapt published RVU-based systems to our department’s needs, we found the task of enumerating and assigning relative values to each educational activity to be daunting. The committee foresaw that, even if a list of RVU-weighted educational activities could be agreed upon within the committee, it would not be well received by the faculty because of its subjectivity, and it would be too cumbersome to allow timely implementation. Finding no readily applicable precedent in the literature, we chose to create an MBM system that could be tailored to meet the specific needs of our department. Our system can be distinguished from previously reported metrics by three key characteristics: it is time-based, it is prospective, and it compensates bedside teaching in addition to formal lectures and program administration.

Instead of using well-described RVU-based metrics,2,6–10 we created a simple system that allowed faculty to self-report their time spent in educational effort. We established a market value for an internist’s teaching time, which is not specialty-specific. We considered whether various subspecialties should be reimbursed for teaching time differently, but the committee felt that educational funding should be related to the time invested in education and not based on medical specialty training, which does not necessarily enhance teaching ability. As with any system that establishes a flat compensation rate based on teaching activity, our metric may discourage subspecialists with higher rates of reimbursement for clinical work from teaching, just as it encourages faculty in fields with lower clinical compensation rates to participate in teaching activities. Thus far, there has been no decrease in subspecialist involvement in teaching efforts in our department, but this possibility will warrant further observation.

We used the new EVU system prospectively, using previously gathered MBM data to determine reasonable expectations of faculty effort associated with various teaching and administrative activities. Our prospective approach shortened transition time and allowed the departmental leadership to set clear expectations of teaching productivity by faculty members. A clinical productivity incentive program was simultaneously implemented. Faculty salaries were structured according to expected teaching effort and clinical productivity, and faculty were responsible for meeting teaching expectations in order to maintain their university funding, and for generating the expected patient care work to maintain their clinical salary or be eligible for a productivity bonus. Faculty members experienced a significant change in the allocation of university funding, but this change was generally perceived as fair and consistent with the university’s mission-based emphasis on funding educational endeavors. For example, two faculty members had disproportionately large amounts of salary support from university funds, yet had relatively little participation in teaching efforts. Although both saw decreases in their compensation from university funds of more than US$100,000 (see Figure 1), both were heavily involved in clinical activities and thus were able to fully maintain their salaries with clinical income. By reallocating university funds from these two individuals to other faculty members who participated in medical education but had less clinical income, funding sources were more closely matched with missions in education and patient care. In addition, by creating compensation sources specifically for medical educators, we now have a foundation for creating incentive programs to reward quality teaching.

One of the key missions of an academic internal medicine department is to provide excellent clinical education for residents and students. The EVU committee valued the importance of bedside teaching and wanted to encourage faculty to teach on the inpatient services and in their outpatient practice. However, we recognized the tendency of learners to decrease faculty efficiency13,14 and the possibility that clinical productivity-based incentive plans may have the unintended effect of discouraging faculty from teaching during rounds or clinics. To address this concern, we designed the clinical EVU as an adjunct to clinical RVU production for faculty providing patient care in the presence of learners. By adding university funding in support of teaching that coincides with direct patient care, we can supplement a faculty physician’s clinical billing to provide an incentive for faculty to teach while they care for patients.

Our department was the first within our institution to implement an MBM system for medical education. The department-wide implementation was somewhat unique, since most programs reported in the literature have been implemented medical school-wide.2 Lack of institutional precedent contributed to skepticism and inertia, but also provided freedom for innovation. Smaller numbers of participants allowed close observation of the impact on departmental finances, productivity, and morale as we piloted the program.

As we hoped, faculty participation in resident teaching and attendance at departmental conferences have dramatically improved. When provided with clearly defined expectations of teaching productivity and prospectively determined compensation for teaching efforts, our faculty members responded with more enthusiasm and interest in medical education. Scheduling faculty for medical student and resident lectures became easier, and faculty attendance at resident morning report improved. By heightening awareness of our educational mission within the department, we hope to ensure that it is viewed with importance equal to our patient care and research missions.

Although the survey results were generally positive, the response rate was low (37%), suggesting that the findings may not generalize to the faculty who did not respond. However, informal feedback from faculty and the ease of implementation suggest that support was widespread. In addition, compliance with the educational logs was 100%. The EVU system is now being evaluated for more widespread use throughout the School of Medicine. We believe that the EVU metric can be easily adapted to the full range of medical specialties, and that it will be particularly useful in clinical departments that sponsor required medical student clerkships yet typically receive lower per capita state funding under the existing, historical model of resource allocation.

By electing to implement our newly created EVU system instead of an RVU-based system, we were able to efficiently link university educational funding sources with faculty teaching efforts. A potential drawback to our system is that, by choosing to use a flat, time-based reimbursement rate for all educational activities, the department chair and the EVU committee relinquished influence over which specific educational activities our faculty members choose to emphasize in their allotted time. In addition, the value of the EVU depends on university funding, which can vary from year to year. Indeed, one major challenge is to maintain the value of the EVU even while faculty size may change. However, many of these drawbacks are not unique to our EVU system, but rather are inherent in many mission-based budgeting strategies aimed at support of educational productivity.

Another limitation is that we do not yet have an incentive program to measure quality of teaching effort and adjust compensation accordingly. Possibilities for incorporating quality assessment include learner and peer evaluations of teachers and learner performance on examinations. While high-quality educational effort is clearly a key outcome, defining and measuring quality teaching is far more complex than simply counting hours. We chose to proceed with implementation of the EVU metric to establish a baseline for mission-based compensation for teaching, while foundations for quality measures are in development. Our struggle to define and measure quality is not unique to our institution, or to the realm of education. As the academic medical community searches for innovative ways to identify and enhance both clinical and educational quality, we will need to incorporate newly developed quality measures into our metric.

Acknowledgement

The EVU committee members were Susan Pingleton, MD, Steven Stites, MD, Glendon Cox, MD, Jeffrey Whittle, MD, Daniel Stechshulte, MD, Amy O’Brien-Ladner, MD, Daniel Hinthorn, MD, and Christopher McGoldrick.


This article was originally published in the
December 2005 issue of Academic Medicine.

References

1 Phillips RL Jr, Fryer GE, Chen FM, Morgan SE, et al. The Balanced Budget Act of 1997 and the financial health of teaching hospitals. Ann Fam Med. 2004;2:71–78.


2 Mallon WT, Jones RF. How do medical
schools use measurement systems to track
faculty activity and productivity in teaching?
Acad Med. 2002;77:115–23.


3 Nutter DO, Bond JS, Coller BS, et al. Measuring faculty effort and contributions in medical education. Acad Med. 2000;75:200–07.


4 Watson RT, Romrell LJ. Mission-based
budgeting: removing a graveyard. Acad Med.
1999;74:627–40.


5 D’Allesandri RM, Albertsen P, Atkinson BF, et al. Measuring contributions to the clinical mission of medical schools and teaching hospitals. Acad Med. 2000;75:1232–37.


6 Ruedy J, MacDonald NE, MacDougall B. Ten-year experience with mission-based budgeting in the Faculty of Medicine of Dalhousie University. Acad Med. 2003;77:1121–29.


7 Williams RG, Dunnington GL, Folse JR. The
impact of a program for systematically
recognizing and rewarding academic
performance. Acad Med. 2003;78:156–62.


8 Ridley GT, Skochelak SE, Farrell PM. Mission
aligned management and allocation. Acad
Med. 2002;77:124–29.

9 Tarquinio GT, Dittus RS, Byrne DW, Kaiser
A, Neilson EG. Effects of performance-based
compensation and faculty track on the
clinical activity, research portfolio, and
teaching mission of a large academic
department of medicine. Acad Med. 2003;78:
690–701.


10 Howell LP, Hogarth MA, Anders TF.
Implementing a mission-based reporting
system at an academic health center: a
method for mission enhancement. Acad Med.
2003;78:645–51.


11 Jarrell BE, Mallot DB, Peartree LA, Calia FM.
Looking at the forest instead of counting the
trees: an alternative method for measuring
faculty’s clinical education efforts. Acad Med.
2002;77:1255–61.


12 Bland CJ, Wersal L, VanLoy W, Jacott W. Evaluating faculty performance: a systematically designed and assessed approach. Acad Med. 2002;77:15–30.


13 Vinson DC, Paden C, Devera-Sales A. Impact
of medical student teaching on family
physicians’ use of time. J Fam Pract. 1996;43:
112–13.


14 Skeff KM, Bowen JL, Irby DM. Protecting
time for teaching in the ambulatory care
setting. Acad Med. 1997;72:694–96.

Creating a Mission-Based Reporting System at an Academic Health Center

MBM – Creating a Mission-Based System

Lydia Pleotis Howell, MD, Michael Hogarth, MD, and Thomas F. Anders, MD

Abstract

The authors developed a Web-based mission-based reporting (MBR) system for their university’s (UC Davis’s) health system to report faculty members’ activities in research and creative work, clinical service, education, and community/university service. They developed the system over several years (1998 –2001) in response to a perceived need to better define faculty members’ productivity for faculty development, financial management, and program assessment. The goal was to create a measurement tool that could be used by department chairs to counsel faculty on their performances. The MBR system provides measures of effort for each of the university’s four missions. Departments or the school can use the output to better define expenditures and allocations of resources. The system provides both a quantitative metric of times spent on various activities within each mission, and a qualitative metric for the effort expended.

The authors report the process of developing the MBR system and making it applicable for both clinical and basic science departments, and the mixed success experienced in its implementation. The system appears to depict the activities of most faculty fairly accurately, and chairs of test departments have been generally enthusiastic. However, resistance to general implementation remains, chiefly due to concerns about reliability, validity, and the time required for completing the report. The authors conclude that MBR can be useful but will require some streamlining and the elimination of other redundant reporting instruments. A well-defined purpose is required to motivate its use.

The development of mission-based management programs has been the focus of many academic medical centers. The Association of American Medical Colleges (AAMC) has encouraged its use. The AAMC defines mission-based management as “a process for organizational decision making that is mission-driven, ensures internal accountability, distributes resources in alignment with organization-wide goals, and is based on timely, open and accurate information.”1 An essential aspect of mission-based management is the ability to measure faculty and department activities that contribute to the missions of the school. This is, however, a highly controversial area, since faculty fear that poorly designed measurement systems will adversely affect their salaries, promotions, workloads, and allocation of support. Relative-value units (RVUs), commonly used for billing, are a generally accepted method of gauging clinical productivity; however, there are only a few published methods describing productivity measures for non-clinical missions, such as education.2–6 Likewise, only a few of the published mission-based management systems have attempted to integrate the information from all missions for an individual faculty member.7,8

In this article we describe our development of a mission-based reporting (MBR) system that measures faculty members’ quantitative and qualitative efforts in the four missions of clinical work, research, education, and administration/community-service activities. We designed MBR as a reporting system for chairs to provide them with quantitative and qualitative information about their departments related to each of the four missions. We avoided the term mission-based management because we wanted to deemphasize control and the negative connotations of the term management. We intended, rather, to imply that the term reporting should lead to recognition of faculty members’ efforts and growth in their careers. The purpose of MBR is to provide a reporting tool for use in evaluating faculty resources and department performance, both retrospectively and prospectively. The tool helps chairs to better fulfill the missions of their departments and the school, plan for the future, and mentor and reward individual faculty members.

System Design

Technical characteristics: We initially designed the MBR system in 1998 as an Excel spreadsheet, but changed it to a Web-based program early in the course of development so that participating faculty could better access their individual records and enter and view their own results. The current version of MBR employs a three-tier architecture with a Web browser as the client software, an application server for middle-tier “business logic,” and a relational database for data storage. Since the MBR system is a Java Servlet 2.1-compatible system, it can be implemented on a large variety of server environments. User summary reports are provided as portable document format (PDF) files, constructed “on the fly” from data in the database and submitted to the Web browser when a user requests the report. We chose the PDF format because it produces high-fidelity printing, constructing summary reports with a professional appearance. A printed record is available for each individual faculty member. Printable summary reports compile data by department, for the school as a whole, and by faculty rank and/or series across departments (Charts 1–3). Security levels exist so that an individual faculty member can view his or her own personal record only. A department chair can view the records of all faculty members within his or her own department, and the deans can view the records of all faculty and departments.

Designing the database structure: We designed the basic data-entry module in three sections: an Activity Section for faculty to enter their year’s activities, an Evaluation Section for qualitative assessment of performance, and an automated Summary “Report Card.” Each of the three sections is further subdivided according to the university’s four missions: clinical service, investigation and creative work (i.e., research/scholarship), teaching, and administration/university/community service. Before a faculty member begins to enter data, that individual’s “budgeted” or “targeted” percent effort for each mission is entered by the department manager. Budget projections (targets) of faculty effort by mission for each faculty member are required as part of each department’s annual budget submission. These budgeted projections are entered into the MBR system.

The MBR system is a self-report system whereby individual faculty members enter their data (quantitative and qualitative) by mission and immediately see the relative values of their efforts. Faculty entries are later reviewed and validated by the department chair during an annual career-planning session required for all faculty. Based on the faculty member’s entries in the Activity Section, the MBR program computes an estimate of the time spent in each activity, using the RVU codes embedded in the program. Activity scores for each mission are summed. Each mission summary score is then transferred to the “% Actual” field in the summary report card. A grand total for percent effort is also computed. The summary report card thus compares previously entered “projected” or “targeted” effort with actual activities entered by the faculty member for each mission (Chart 1).
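
The actual MBR application is a Java servlet system, but the report-card computation it performs can be sketched in a few lines. In the hypothetical Python sketch below, the activity names, RVU weights, and mission assignments are invented for illustration; only the overall logic (RVU-weighted activity entries summed by mission and compared with budgeted targets) follows the description above.

    # Hypothetical sketch of the "% Actual" computation described above.
    # Activity names and RVU weights are illustrative, not UC Davis's actual codes.
    RVU_WEIGHT = {              # assumed fraction of a year per unit of activity
        "clinic_half_day_weekly": 0.010,
        "ward_month_attending": 0.060,
        "peer_reviewed_article": 0.100,
        "major_committee": 0.030,
    }
    MISSION = {
        "clinic_half_day_weekly": "clinical",
        "ward_month_attending": "clinical",
        "peer_reviewed_article": "research",
        "major_committee": "service",
    }

    def percent_actual(entries):
        """Sum RVU-weighted activity entries into percent effort per mission."""
        totals = {}
        for activity, count in entries.items():
            mission = MISSION[activity]
            totals[mission] = totals.get(mission, 0.0) + count * RVU_WEIGHT[activity] * 100
        return totals

    targeted = {"clinical": 50, "research": 30, "teaching": 10, "service": 10}
    entries = {"clinic_half_day_weekly": 4, "ward_month_attending": 2,
               "peer_reviewed_article": 2, "major_committee": 1}

    actual = percent_actual(entries)
    for mission, target in targeted.items():
        print(f"{mission:>9}: targeted {target}%, actual {actual.get(mission, 0.0):.0f}%")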

Defining activities and computing RVUs: Faculty from diverse departments within the University of California Davis School of Medicine served on committees dedicated to defining parameters for each of the university’s four missions (listed earlier). Faculty volunteered, were appointed, or were selected to serve on committees because of their special interests or expertise. In general, committees were open to anyone who wished to serve, but committee size did not exceed 15 for any one committee. Two of us (LH and TA) served as chair or co-chair for each of the committees. We charged each committee to select and define the most relevant and representative activities for its assigned mission. The charge urged comprehensiveness but, at the same time, demanded simplicity.

The Activity Section translates activities into quantitative time/effort-based metrics. Thus, another charge to the committees was to estimate the quantity of time expected to complete each activity over the course of a year (Chart 4). The quantity of time was defined as a percentage of a year spent performing that activity, using a 50-hour work week as the standard. The committees achieved consensus on estimated average times to accomplish each activity based on personal experience and creative deduction. For example, there is no easily established standard for the length of time it takes to complete a manuscript. However, promotion committees generally expect faculty to publish the equivalent of at least two journal articles per year. Our clinical faculty strive to have a minimum of 20% of their time protected for scholarly activities. Thus, the RVU time allotment for a journal article for a clinical series faculty member was calculated accordingly.
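
The article does not spell out the exact arithmetic behind this allotment, but one hedged reading of the figures mentioned (a 50-hour work week, roughly 20% protected scholarly time, and an expectation of two articles per year) is sketched below; the 52-week year is our own assumption.

    # Illustrative derivation of an RVU time allotment for one journal article.
    # The exact calculation is not given in the text; the figures below follow the
    # stated 50-hour week, ~20% protected scholarly time, and two articles per year.
    HOURS_PER_WEEK = 50
    WEEKS_PER_YEAR = 52                      # assumed
    annual_hours = HOURS_PER_WEEK * WEEKS_PER_YEAR      # 2,600 hours

    protected_fraction = 0.20                # protected scholarly time for clinical faculty
    articles_per_year = 2                    # promotion-committee expectation

    hours_per_article = annual_hours * protected_fraction / articles_per_year
    fraction_of_year = hours_per_article / annual_hours
    print(f"~{hours_per_article:.0f} hours per article ({fraction_of_year:.0%} of a year)")
    # ~260 hours per article (10% of a year) under these assumptions.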

In a later refinement, a higher RVU score was assigned to articles published in peer-reviewed journals than to limited-distribution articles because promotion committees value the former more highly. Similarly, book chapters were given more relative value for clinician–educator faculty than for research faculty. In the same spirit, abstracts and “submitted” grants were weighted more for junior than for senior faculty. Such differential weightings of time-based RVU codes motivate and reward faculty for activity that is aligned toward academic success in their respective series (i.e., track) and rank. The MBR program knows which RVU codes to select for a given faculty member because the department manager enters the rank and series of each faculty member at the same time that the percent “targeted” effort from the budget is entered. The faculty member entering data is “blind” to the RVU weight assigned to each activity.

Both the teaching and the clinical services committees were required to distinguish patient care with students from clinical service without associated teaching. Since published reports indicate that faculty spend approximately 43–53% of time teaching residents in ambulatory care settings,9,10 we designed the MBR system to allocate 50% of clinical time spent with trainees to the clinical mission and 50% to the teaching mission. The clinical services module was designed as a logic tree requiring faculty to enter the weekly half-days in the clinic with and without students, and the number of months per year as ward attending with and without students. The MBR program then allocates effort to the two missions automatically. In the first version of the MBR system, these calculations had been left to the individual faculty member. Significant confusion and misinterpretation of instructions led us to automate the input via the structured decision tree.
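
A minimal sketch of this automated allocation follows, assuming hypothetical effort weights per clinic half-day and per ward month; only the 50/50 split for time spent with trainees is taken from the description above.

    # Sketch of the 50/50 allocation of clinical time with trainees described above.
    # The effort weights per clinic half-day and per ward month are hypothetical.
    EFFORT_PER_WEEKLY_CLINIC_HALF_DAY = 0.010   # fraction of a year, assumed
    EFFORT_PER_WARD_MONTH = 0.060               # fraction of a year, assumed

    def allocate(clinic_with, clinic_without, ward_with, ward_without):
        with_trainees = (clinic_with * EFFORT_PER_WEEKLY_CLINIC_HALF_DAY
                         + ward_with * EFFORT_PER_WARD_MONTH)
        without_trainees = (clinic_without * EFFORT_PER_WEEKLY_CLINIC_HALF_DAY
                            + ward_without * EFFORT_PER_WARD_MONTH)
        clinical_mission = without_trainees + 0.5 * with_trainees   # half of trainee time
        teaching_mission = 0.5 * with_trainees                      # the other half
        return clinical_mission, teaching_mission

    clinical, teaching = allocate(clinic_with=3, clinic_without=2, ward_with=2, ward_without=1)
    print(f"Clinical mission: {clinical:.1%}, teaching mission: {teaching:.1%}")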

Similarly, for the administration/university/community service mission, we did not want to credit all committee and administrative activities equally. The university endorses community service, and the promotion committees expect some service activities of faculty. However, academic advancement is not enhanced by excessive community service at the expense of scholarship. Therefore, less RVU credit and fewer opportunities were provided in the Activity Section for these activities. Only major school and university committees, such as the institutional review board, promotion committee, and admission committee, were included. These committees require large time commitments of faculty and are considered important for the school’s function. We did not include minor committees and service work outside the university but credited them qualitatively in the Evaluation Section. We coded administrative activities that are considered part of the job description of a chair, dean, division chief, or other leader on the basis of the size of the department/division or scope of the responsibility.

For the qualitative metrics designed for the Evaluation Section, the committees were charged with developing a list of standards reflecting the quality of the work performed. The standards were ranked from 0 to 5. Thus, the Evaluation Section (Chart 2) summarizes the qualitative aspects of the activities scored previously. The teaching mission is evaluated from the perspectives of students and peers, and these scores are averaged to achieve a final evaluation score for teaching. Individual evaluation standards are not additive. An individual faculty member records only one standard for each mission. This evaluation score is then automatically imported to the Summary Report Card and can be viewed separately for each mission.

As part of the Summary Report Card, the computer also multiplies the evaluation score by the activity score to achieve a single quantity/quality product for each mission. The mission products are then summed to obtain a single summary score for each faculty member. The following theoretical model drives the interpretation of this summary score. If a faculty member’s actual activities total 100% and her or his evaluation codes for each mission are 3, the resultant final summary score of 300 (100 × 3) reflects expected and appropriate performance. In other words, faculty members whose summary scores are at least 300 are on target for academic advancement. A score below 300 suggests substandard performance for the year and requires attention from the chair. A score above 400 indicates outstanding performance worthy of an incentive reward.
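
A minimal sketch of this quantity/quality scoring model, assuming the percent-actual figures for each mission have already been computed as described above (the mission percentages and evaluation codes below are illustrative):

    # Summary score = sum over missions of (actual percent effort x evaluation code 0-5).
    def summary_score(actual_percent, evaluation):
        return sum(actual_percent[m] * evaluation[m] for m in actual_percent)

    actual_percent = {"clinical": 45, "research": 25, "teaching": 20, "service": 10}  # totals 100%
    evaluation = {"clinical": 3, "research": 3, "teaching": 4, "service": 3}

    score = summary_score(actual_percent, evaluation)   # 320 for this example
    if score < 300:
        note = "below target; warrants attention from the chair"
    elif score > 400:
        note = "outstanding; worthy of an incentive reward"
    else:
        note = "on target for academic advancement"
    print(score, "-", note)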

Implementation: Testing and Modifications

Phase 1: Selected feasibility testing: We chose to test and modify the MBR system in three phases. In phase 1 in 1998, we tested the initial RVU and performance codes created by each committee for inconsistencies, omissions, and other user-entry problems on 21 randomly selected volunteer faculty members. Of the 21, two had quantitative scores less than 100% (56.0 and 55.9%), six had scores between 100% and 150%, and 13 had scores higher than 150%. The faculty with the high scores were hardworking, but not working at the level their scores would indicate, nor were the two faculty members whose scores were less than 100% considered to be “slackers.” The mission in which the largest number of faculty showed discrepancies between targeted effort and actual effort was the teaching mission. Sixteen of 21 faculty exceeded their targeted expectations by more than 10%. The next most discrepant mission was the investigation and creative work mission, with nine of 21 faculty demonstrating similar over-reporting. For the clinical mission, all of the faculty had discrepancies of less than 10% between targeted effort and actual effort. For the administration/university/community service mission, department chairs and deans had actual percentages below the targeted percentages because some of their activities had not been included. In response to this initial pilot trial, adjustments were made to the RVU codes. For the teaching mission, time values believed to be excessive were decreased for some activities. In the quantitative portion of the administration/university/community service section, a line was added for “administrative stipend” (% salary support) to account for time spent on administrative activities relevant to the job descriptions of department chairs or other leaders. The results from the phase 1 trial enabled us to better define activities and adjust the RVU weighted scores.

Phase 2: Pilot testing with selected departments: In phase 2 in 1999, we tested the revised system on 131 faculty members from eight departments. These departments ranged in size from five to 28 faculty members and included two basic science departments, three surgical departments, two medical departments, and one hospital-based specialty department, with an almost even division between clinical and basic science activities. Faculty members in each of the test departments completed MBR data entry online prior to their annual career planning sessions with their chairs. The printed results for each faculty member were validated by the chair and discussed with the faculty member.

For the investigation and creative work mission, only one department did not have faculty members who were under target. Half of the departments had more than 48% of their faculty under target, suggesting under-performance. The under-target faculty in this mission tended to be basic scientists or faculty with large percentages of time designated for research. They were often junior faculty who were still in the start-up phases of their research careers. Based on these findings, several new activities were added to the investigation and creative work section to reflect work in progress. Credit for published abstracts, grants submitted but not yet funded, cost recovery on grants, and time spent in study sections was added. These activities were also given greater RVU weight for junior faculty than for senior faculty. Only one department produced results that showed that the majority of its faculty were over target for the investigation and creative work mission. This was a surgery department whose faculty had been budgeted with minimal time for research. As a consequence, even modest scholarly output made it fairly easy for these faculty to exceed their targeted time.

For the teaching mission, all of the phase 2 trial departments produced results that indicated that the majority of faculty were on or over target. The improvements to the RVU weightings after phase 1 had been successful. Only one fourth to one third of the faculty were under target. Almost equal numbers of faculty were over and under target. For the administration/university/community service mission, five of the eight departments also showed the majority of their faculty to be on or over target for that mission. Likewise, for the clinical mission, six of the eight departments with clinicians showed that more than 50% of their members were on target. In two departments large percentages of faculty were under target. One of these was a hospital-based specialty whose clinical activities were not easily measured by the system. The results of phase 2 pointed to yet other areas in need of revision.

Phase 3: School-wide implementation: Based on the experience from phase 2, we made additional refinements, focusing primarily on further fine-tuning the RVU scores. Because some faculty were concerned about the invisibility of RVU equivalents of the activity scores, we revised the program so that a mouse click provides the actual RVU weight used in the computation. In addition, we added “help” buttons for specific items whose definitions had been ambiguous. A mouse click on the help button now provides a specific definition of the activity.

During the post-phase 2 refinements, we reconvened the committees. Their further guidance and advice were reflected in the revision. Many committee members had experienced first-hand the phase 2 implementation. Throughout all phases of MBR development, we actively pursued dialogue with our faculty. We discussed difficulties and changes in a variety of forums such as the faculty senate, the Council of Department Chairs, and the curriculum committee, and at department faculty meetings. Individual faculty provided input directly or via e-mail. Phase 3 tested MBR in a school-wide trial of all faculty and departments.

We modified the RVU coding system to stratify faculty by rank and faculty series. Since junior faculty are often in more of a “building” phase of their careers, with less published investigative/creative work or funded grants, instructors and assistant professors were given more credit for work in progress than were senior faculty. Stratification based on rank and series also expanded the system’s summary reporting and data-analytic capacities.

In 2000 for phase 3, the dean’s office required use of the new version of the MBR system for annual faculty career planning by all departments in the school. The dean’s office did not articulate a clear purpose for MBR but did clearly state that the results of MBR would not be used for any salary or promotion planning. The dean’s office implied that the results would be used only to further refine categories of academic activity and the RVU and Evaluation Section codes.

Discussion

Developing an MBR system is a complex task requiring careful group planning, considerable administrative support, and significant time for design, testing, and modification. Even then, there are obstacles to general faculty acceptance and uniform use. It is not clear from the extant literature that any mission-based management system has gained general acceptance and is regularly being employed successfully.

The system we describe differs from other published mission-based systems in several ways. One important difference concerns the definition of the research/ scholarly mission and what types of work should be included as evidence of productivity. In our system, we specifically selected the term “investigation and creative work” to encompass the scholarship of education, application, and integration as well as the scholarship of discovery. The former are evidenced by publication of books, book chapters, educational manuals, review articles, and peer-reviewed articles describing clinical experience. In other mission-based systems, many of these activities would be included under the educational mission.6,7 However, our university defines all of these types of activities as creative scholarship and views them as research-specific to one or more of the academic series. The university criteria are reinforced in the MBR system by giving due credit for integrative and educational publications for faculty in the education and clinical series. RVU credits were weighted according to the publication (chapter versus peer review) and the faculty member’s rank and series.

Another difference unique to MBR is the separation of quantitative and qualitative measurements of productivity. The system described by Nutter et al. integrates a qualitative multiplier directly into the quantitative RVU score assigned to each activity.6 By separating the two in MBR, department chairs or administrators can consider each dimension separately for different purposes. Examining the quantitative component alone can be useful in determining staffing or assignment of duties to an individual. The qualitative component can be examined separately to advise faculty about areas needing improvement. The quantity/quality product provides an indication of the cost–benefit value of the activity. The summary score might be useful in the promotion process or in comparing faculty for other forms of rewards. School administrators might also consider rewards on a broader department level. For example, the mission-based management system at the University of Florida bases 20% of the department’s budget allocation on the qualitative component of its effort in the educational mission.11

It is important to note that the phase 2 trial with eight departments demonstrated that many of the faculty in the clinical departments had quantitative scores significantly exceeding 100%. This indicates that most faculty are working more than the 50-hour week, which had been considered the standard in creating this MBR system. We were not surprised by this result. We operate a rapidly growing primary care network in a highly competitive managed care market. The faculty’s clinical workload has significantly increased.

If the quantitative RVU scores assigned to clinical activities are deemed to be accurate and fair, faculty members should be able to advance successfully academically by working only slightly above 100% time. If faculty members are academically successful only by working clinically at effort levels that greatly exceed 100%, then the expectations that surround academic advancement and the assignment of clinical workload are in direct conflict. Demanding continued performance much greater than 100% will lead to faculty burn-out and problems with retention. Exit interviews by the dean with a number of faculty have suggested that “private” group practice is a more personally rewarding and manageable alternative than the 150% effort required of academic medicine. We believe that it is important to document faculty efforts beyond normal working hours in order to support academic advancement and better align faculty compensation to faculty effort.

During the phase 2 trial, we also found it interesting that the mission with the most discrepancy between target effort and actual effort was the investigation and creative work mission. Basic scientists were understandably suspicious of a system that made them look underproductive. Gauging research productivity had been problematic during the design stage. The research subcommittee had specifically concluded that quantitative effort in this mission should be based only on final products (published papers, funded grants). The other missions were largely time-based. Since the MBR system is designed to be implemented annually, research productivity may be specifically compromised because of publication lag times and grant-submission review cycles. Most research projects take several years before coming to fruition. Since work in progress was not originally credited and only published work was considered, a faculty member could appear to be under-productive one year and over-productive the next year when the work that was in progress the first year was finally published in the second year.

We used the results of phase 2 to revise the MBR system. In phase 3, we included additional credit for salary support from grants, for abstracts, and for new grant submissions. The AAMC’s mission-based management program noted that there are some advantages in including these activities, and that they are included in mission-based systems at other schools.12 Despite these additions, some element of under-reporting of faculty efforts in the investigation and creative work mission may continue to exist. Discovery-type research is by nature an inefficient process in which many time-consuming efforts do not result in funded grants or publishable work. If the MBR system described here is to be used as part of annual faculty career counseling, chairs will need to be cognizant of this issue and not unfairly evaluate a faculty member unless a trend is observed for more than one year. This mission will merit continued scrutiny as the system is further refined.

The difficulties we encountered in phase 3 testing of the MBR system include a persistent general resistance by faculty and chairs. Faculty concerns focused on resistance to quantification of their activities, a belief that the information collected would be more harmful than helpful, and a conviction by each specialty that its activities are unique and therefore cannot be fitted into a general template. Similar difficulties have been encountered by others and remain a challenge for general implementation.

One significant remaining challenge that requires further refinement is the area of on-call time. The issues of in-house versus at-home call, 24-hour versus night and weekend call, procedural versus consultative call, and resident versus non-resident supported call are difficult to equilibrate between specialties.

Despite these ongoing challenges, we believe that the overall experience with the MBR system at UC Davis has been positive. Significant faculty-wide attention has been focused on the benefits of MBR, and there has been general recognition of its necessity. Skeptical department chairs became more enthusiastic when shown the summary results for their faculty. In general, chairs of the eight test departments in phase 2 felt that the MBR system did give higher scores to the faculty that they had previously perceived as high achievers, and lower scores to those faculty whom they felt were relatively weaker. They also found MBR to be a good springboard for discussions with faculty members during their annual career-counseling sessions.

We are making an effort to overcome continued resistance by some faculty and to address the barriers to implementation. Existing data collected by other administrative units, such as a faculty member’s clinical RVU-generation report and research grant and contract dollars, should be downloaded directly to that individual’s MBR record. Such automation reduces redundancy, minimizes individual input, and increases data integrity and report accuracy. However, MBR may never gain acceptance until input efforts result in responsive decision making for allocation of resources to departments and/or for more streamlined procedures for academic advancement.

MBR can be used by department chairs as a management tool for individuals, to discuss faculty performances and goals, determine salary, or automate some of the tedious hard-copy paperwork required for promotion actions. For departments, examination of the total projected effort and actual effort expended in each mission can aid in determining faculty staffing and work assignments, identifying recruitment needs, and developing department budgets. For the school, MBR data can be used to aid in equitable allocation of funds and space to missions and departments. Allocation of positions and money to departments based on MBM has been described elsewhere.9 Use in decision making, however, requires trust in the accuracy of the system. Future efforts to ensure accuracy and build trust will require refinement of quantitative and qualitative scores for each mission. Comparison of MBR results with successful promotion actions is one way to establish validity.

Acknowledgement

Special thanks to Benny Poon, Medical Informatics Group, for his programming expertise in the development of the Web-based MBR system.


This article was originally published in the
February 2002 issue of Academic Medicine.

References

1 Association of American Medical Colleges.
Mission-Based Management Program:
Introducing the MBM Resource Materials.
Washington, DC: AAMC, 2000.


2 Bardes CL, Hayes JG. Are the teachers
teaching? Measuring the educational activities
of clinical faculty. Acad Med. 1995;70:111–4.


3 Bardes CL, Hayes JG, Falcone DJ, Hajjar DP,
Alonso DR. Measuring teaching: a relative
value scale in teaching. Teach Learn Med.
1998;10:40–3.


4 Bardes CL. Teaching counts: the relative-value scale in teaching. Acad Med. 1999;74:1261–3.


5 Sachdeva AK, Cohen R, Dayton MT, et al. A new model for recognizing and rewarding the educational accomplishments of surgery faculty. Acad Med. 1999;74:1278–87.


6 Nutter DO, Bond JS, Coller BS, et al. Measuring faculty effort and contributions in medical education. Acad Med. 2000;75:199–207.


7 Garson A, Strifert KE, Beck R, et al. The
metrics process: Baylor’s development of a
“report card” for faculty and departments.
Acad Med. 1999;74:861–70.


8 Hilton C, Fisher W, Lopez A, Sanders C. A
relative-value– based system for calculating
faculty productivity in teaching, research,
administration, and patient care. Acad Med.
1997;72:787–93.


9 Zweig SC, Glenn JK, Reid JC, Williamson HA, Garrett E. Activities of the attending physician in the ambulatory setting: what part is teaching? Fam Med. 1989;21:263–7.


10 Melgar T, Schubiner H, Burack R, Aranha A, Musial J. A time–motion study of the activities of attending physicians in an internal medicine and internal medicine–pediatrics residents continuity clinic. Acad Med. 2000;75:1138–43.


11 Watson RT, Romrell LJ. Mission-based
budgeting: removing a graveyard. Acad Med.
1999;74:627–40.


12 Holmes EW, Burks TF, Dzau V, et al. Measuring contributions to the research mission of medical schools. Acad Med. 2000;75:304–13.

Looking at the Forest Instead of Counting the Trees: An Alternative Method for Measuring Faculty’s Clinical Education Efforts

MBM – Looking at the Forest (Jarrell)

Bruce E. Jarrell, MD, David B. Mallot, MD, Louisa A. Peartree, MBA, and Frank M. Calia, MD

Abstract

Purpose: To present an alternative approach to mission-based management (MBM) for assessing the clinical teaching efforts of the faculty in the third and fourth years of medical students’ education. 

Method: In fiscal years 2000 and 2001, interviews were conducted with department chairs and faculty members with major responsibilities in education at the University of Maryland School of Medicine. Using a standard worksheet, each rotation was categorized according to the amounts of time students spent in five teaching modes. After each department described its rotation and maximum teaching time, the department team and the MBM team negotiated the final credit received for its course. This final determination of departmental clinical teaching was used in subsequent calculations. Adjustments were made to the department clinical education time based on the teaching mode. Groups of medical students were surveyed to determine the relative value of each teaching mode. These relative values were then used to modify the clinical education times credited to the department. The last step was to distribute the effort of the faculty between clinical and educational missions.

Results: The data analysis showed approximately 57,000 credited faculty hours in one year for direct education of medical students in each curriculum year. These hours equal the annual workload of 28 full-time faculty members. 

Conclusions: A powerful use of MBM data is to move from thinking about resource allocation to thinking about the effective management of a complex organization with interlaced missions. Reliable data on faculty’s contributions to medical students’ education across departments enhances other MBM information and contributes to a picture of the dynamic interconnectedness of missions and departments.

One outcome of the profound economic changes in medical reimbursement over the last 15 years is a need for more attention to resource allocation within academic medical centers (AMCs).1 One tool currently enjoying interest is mission-based management (MBM), whereby money or effort is matched, albeit with great difficulty, to the AMC’s three traditional missions of education, research, and clinical care. Decisions regarding departmental support by the dean can then be made on a mission-directed rather than on a historical basis.2 Progress has been made in reliably consolidating clinical and research budgets from various accounting systems, allowing a global view of those missions. This consolidation has permitted resource allocation to be data-based. The educational activity has been more difficult to measure.3 Many efforts at educational assessment have relied on faculty-effort surveys that depend largely on self-reporting, with inherent problems of inaccuracy, low response rates, and difficulty categorizing various teaching activities.4,5 This is particularly problematic when faculty report their clinically based teaching activities. Difficulties in accurately and consistently separating clinical care from teaching time, and in dealing with trainees at multiple levels of sophistication, add to the complexity. One additional problem with self-reporting is the tendency to define educational time as the time left over after the amounts of time devoted to other, more definable, missions (i.e., research, clinical care, and administration) have been determined.

From an institutional perspective, teaching responsibilities are assigned to departments, and oversight is provided by a curriculum committee. This assignment creates a departmental teaching responsibility that is, in turn, determined by the size of the medical student class and the lengths of rotations. The department must allocate resources, including faculty, to undertake that educational load. How the department meets that educational requirement is determined by its teaching philosophy in conjunction with economic factors, the residency workforce, school policies, and external review boards. Thus, although the total departmental teaching load can be estimated accurately and is relatively predictable from year to year, the contribution of any single faculty member might vary as frequently as daily and is much less predictable. In this study, we present an alternative approach to assessing the clinical education efforts of the faculty in the third and fourth years of medical students’ education.

Method: At the University of Maryland School of Medicine, year one and year two have interdisciplinary curricular blocks that use both basic science and clinical faculty. Faculty are assigned hour-for-hour credit for lectures, small-group sessions, and teaching laboratories. In addition, each of those teaching modes is credited with additional time that reflects class preparation and test development. The additional time assigned to each mode was debated and determined by the Fiscal Affairs Advisory Committee (FAAC), the committee charged with overseeing MBM. Course administration credit is based on the length of the course (see Figure 1).

Figure 1

Departmental teaching responsibility in the clinical years is determined by aggregating faculty effort in a variety of teaching modes. Third- and fourth-year rotations have two main components. The first, smaller component is formal classroom sessions with no patient interaction, and includes lectures and small-group discussions. Departmental credit for these sessions is determined by the average number of didactic sessions for each rotation. For each hour of formal didactic teaching, an additional half hour is added to reflect the faculty time necessary to prepare for the session. This “prep” time is less than the credit assigned for preclinical didactic activities. These hours can be attributed to individual faculty members in the department, but in our analysis the total time for these activities is accounted for only at the departmental level. The second and major component of clinical education is face-to-face teaching with an attending physician in the presence of a patient.

In the clinical setting, the occasion for faculty to educate medical students depends on the number of trainees rotating through the department and the amount of time students spend “face-to-face” with faculty. Assuming every interaction consists of one student with one faculty member for a given number of hours per day, it is possible to define in hours the maximum time for “face-to-face” teaching while delivering care. This number of hours is calculated by multiplying the number of trainees in the rotation by the number of days in the rotation by the number of hours per day of faculty interaction in a clinical setting:

Maximum teaching time = number of medical students × length of rotation in days × agreed-upon number of hours with faculty in a clinical setting

This maximum time is defined as the upper limit of the department’s teaching load, which would be the total faculty teaching effort if every trainee were taught in a one-to-one ratio with a faculty member and full credit was given for teaching even though clinical care was also being delivered.
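
As a worked example under assumed inputs (the rotation size, length, and daily contact hours below are illustrative; the text does not specify a particular rotation):

    # Worked example of the maximum-teaching-time upper bound defined above.
    students_per_rotation = 12          # assumed
    rotation_length_days = 40           # assumed (roughly an eight-week clerkship)
    contact_hours_per_day = 4           # assumed "agreed-upon" hours with faculty

    maximum_teaching_time = students_per_rotation * rotation_length_days * contact_hours_per_day
    print(maximum_teaching_time)        # 1,920 hours for this hypothetical rotation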

After determining the maximum teaching time, in fiscal years 2000 and 2001, we conducted interviews with each department chair and one or two faculty members with major responsibilities in education. Using a standard worksheet, completed at the time of the interview, each rotation was categorized according to the amount of time students spent in large versus small groups and by the trainee mix. We categorized activities into five teaching modes: one student and one faculty attending physician; a small group of medical students (two to four) and a faculty attending physician; one student with one faculty attending physician and one resident; small groups (three to five) of medical students and residents and a faculty attending physician; and a large group of both types of trainees (more than five) and a faculty attending physician. During the departmental interview, the clerkship or rotation was discussed and categorized based on teaching groups. Special teaching situations (i.e., “teaching attending physician”) and unique teaching features of the clerkship were discussed so that department-specific credit could be applied. After the department described its rotation, the department team and the MBM team negotiated the final credit that the department received for its course. We used this final determination of departmental clinical teaching in subsequent calculations.

Using the data gathered from each department, adjustments were made to the department clinical education time based on the teaching mode. We surveyed groups of medical students to determine the relative value of each type of educational interaction (see Appendix A). The resulting relative values were reviewed by key “education” faculty members from a variety of clinical departments. We found the values generated by the medical students to be consistent with the perceptions of faculty reviewers. The students’ values for clinical education time were: an attending physician with a medical student (one hour), an attending physician with two to four medical students (0.77 hours), an attending physician with a medical student and a resident (0.56 hours), an attending physician with three to five medical students and residents (0.36 hours), and an attending physician with more than five medical students and residents (0.25 hours). These relative values were then used to determine the clinical education times credited to the department. (For a more detailed discussion of our method for measuring clinical education time, see Appendix B.)
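
Applied to hours categorized by teaching mode, these student-derived relative values discount group teaching relative to one-on-one teaching. A brief sketch follows; the hour totals assigned to each mode are hypothetical, and only the relative values are taken from the survey results above.

    # Weighting clinical contact hours by the student-derived relative values above.
    # The hours assigned to each teaching mode are hypothetical.
    RELATIVE_VALUE = {
        "one_student": 1.00,
        "two_to_four_students": 0.77,
        "one_student_one_resident": 0.56,
        "three_to_five_students_and_residents": 0.36,
        "more_than_five_students_and_residents": 0.25,
    }
    hours_by_mode = {                      # illustrative departmental totals
        "one_student": 400,
        "two_to_four_students": 100,
        "one_student_one_resident": 450,
        "three_to_five_students_and_residents": 1000,
        "more_than_five_students_and_residents": 100,
    }

    credited_hours = sum(hours * RELATIVE_VALUE[mode] for mode, hours in hours_by_mode.items())
    print(f"{credited_hours:,.0f} weighted clinical education hours")   # 1,114 in this example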

The last major step in our clinical education method was to distribute the effort of the faculty between their clinical and education missions. Up to this point, the method produced a total time that medical students and faculty were together in a clinical setting in which the faculty member performs both clinical and educational activities. This total time then had to be split between these missions to give appropriate credit and avoid double counting. Through a series of discussions with the members of the school of medicine’s FAAC, and with faculty and leadership in the office of medical education, we decided that for every eight hours of patient care delivered in the presence of a medical student, 1.5 hours (18%) of clinical educational time would be credited to the department. This ratio of clinical education to patient care is based on the students’, the faculty’s, and the administration’s input and discussion and does not include time directed to teaching residents in the clinical setting. This method of allotting credit also did not include faculty’s scholarly educational activities unrelated to the medical students’ curriculum, e.g., developing innovative teaching materials for future use, writing textbooks, and advising students.

Finally, we identified unique education endeavors within each department. An example would be teaching attending physicians with no clinical responsibilities, who receive full hour-for-hour teaching credit without any reduction in their credit for patient care. Another example would be credit given to the department of radiology for teaching medical students during basic third-year clerkships that include significant radiology components. Credit hours for education administration were also given for clerkship directors. We derived data from discussions with the individual departments to determine allocations for these special situations. After these data were summarized, they were given to the departments for review and further input. The data were then submitted to the FAAC and became a key component in institutional decision making.

Results: The total faculty time allotted to teaching medical students is summarized in Figure 2, which also shows the breakdown between the contributions of basic science faculty and clinical faculty. The data exclude the participation of residents, fellows, and staff in the curriculum. Faculty teaching time in each of the first two years is nearly equal, approximately 10,000 hours, reflecting the similarity in curriculum structure. The distribution of second-year hours between preclinical and clinical departments reflects clinicians' participation in the current curriculum. The large number of clinical faculty hours in the third year of medical school reflects the individual and small-group teaching modes in the clinical setting as well as the increased time medical students spend with faculty. The time in year four is significantly less than that in year three because the year itself is shorter and many students spend considerable time at community sites or other medical institutions.


Figure 3 summarizes the distribution of faculty education hours for all four years of medical school and shows the relative ranking of education hours among the departments of the school of medicine. Under our method, clinical departments that have required clinical rotations are credited with large numbers of hours. For example, faculty in the department of medicine received the greatest amount of credit because of the length of the third-year clerkship, the amount of teaching required in fourth-year sub-internships, and a significant teaching contribution in year two. On the other hand, many medical students are assigned to community sites for the obstetrics and gynecology clerkship, resulting in fewer hours credited to the department.

Our method produced additional data not directly tied to the MBM process. For instance, the distribution of clinical teaching in third-year clerkships was an attending physician with a medical student, 19%; an attending physician with two to four medical students, 7.1%; an attending physician with a medical student and a resident, 21.5%; an attending physician with three to five medical students and residents, 47.5%; and an attending physician with more than five medical students and residents, 4.9%. This distribution also shows the efficiency of teaching within a department. These data, originally collected for MBM purposes, can then be reviewed and analyzed by the curriculum committee and individual departments.

Our data analysis showed approximately 57,000 credited faculty hours in one year for the direct education of medical students, summed across the curriculum years. Using a standard work-year definition of 2,080 hours (52 weeks × 40 hours), these 57,000 hours of credited education time equal the workload of approximately 28 full-time faculty members. However, this credited time is not the total cost of medical students' education, because it does not include residents' teaching of students or the indirect expenses the department incurs in coordinating the educational effort or mentoring the students. In addition, the data do not take into account the number of faculty members necessary to generate the patient volumes needed to sustain a teaching program. While our method produces a relative ranking among departments, it also provides an overall faculty effort number for the medical students' education mission, which can then be compared with faculty's efforts in the clinical and research missions.
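The full-time-equivalent comparison above is simple arithmetic; a brief illustrative check (figures from the text, variable names hypothetical):

```python
WORK_YEAR_HOURS = 52 * 40                  # standard 2,080-hour work year
credited_education_hours = 57_000          # credited faculty hours from the analysis above
fte_equivalent = credited_education_hours / WORK_YEAR_HOURS   # about 27.4, i.e., ~28 FTE
```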

Discussion

The Association of American Medical Colleges (AAMC) has identified six core principles as central to MBM: integrating a school’s financial statements, measuring faculty and departmental activities and contributions to mission, building organizational support for reporting tools and metrics, guiding the dynamics of leadership, holding faculty and department and institutional leaders accountable, and building trust and institutional perspective.2 With regard to the education mission of an AMC, the AAMC’s second core principle—faculty and departmental activities and contribution to mission— has historically been measured through an aggregation of individual faculty members’ teaching activities. The crux of the problem, however, is that individual faculty members’ activities, even if accurate, do not necessarily reflect the educational mission of the school. That mission is defined by external accrediting bodies, the dean’s office, departmental chairs, and faculty education committees. Individual faculty members assume that a variety of teaching activities are central to the education mission, especially in the clinical years. These activities may enrich the students’ experiences and augment the curriculum. However, our method of measuring the faculty’s contribution seeks to segregate those activities tied to the core education mission for purposes of MBM.

Our method yields data that describe the educational effort in a reproducible manner. The data verify our impressions about the effort and time expended by each department and should encourage discussion about the relative educational effort. Our method does not focus on individual faculty members, but it does spotlight different departmental teaching activities and raises questions about the merit and/or cost of those activities, furthering the MBM core principle, “building trust and institutional perspective.” At our medical school, the FAAC uses MBM to make fiscal recommendations to the dean. In fiscal year 1999, the FAAC recommended redistributing $3.0 million over two years among departments that were critical to the educational mission, but that were in financial difficulty. These decisions were based on perceptions of the faculty’s educational activities but had little supporting data. In fiscal year 2002, the school of medicine redistributed $1.0 million of the dean’s funds among departments important to the education of medical students. In this redistribution, the method we describe in this paper provided key education data for the FAAC’s discussions.

Previously, annual departmental support from the dean's office had been a continuation of historic allocations. Figure 4 is a scattergram of the historical allocation of dean's funds in fiscal year 2001. The data show the lack of correlation between the dean's historical financial support to departments and the medical student education data derived from our analysis. Thus, a redistribution based on our method could align resource allocation with educational effort. The FAAC operates under the assumption that medical students' education should be a factor in departmental support.

Our method has other advantages. Gathering and compiling the data are less time-consuming than surveying individual faculty members. Our method permits separation of teaching medical students from teaching residents, and it can be used for other education evaluations as well. The next applications of our method will be to analyze faculty time spent teaching residents and graduate students and to refine the measures of education merit described above. Because it relies on meetings with the chair and the lead educators in each department, our method also adheres to the MBM core principle of “building organizational support for reporting tools and metrics”: each department is able to demonstrate the uniqueness of its own educational approach.

Conclusion

One of the major stated objectives of MBM is to provide a medical school's decision makers with accurate information with which they can allocate resources.6 A more powerful use of MBM data is to move from thinking about resource allocation (“accounting”) to thinking about the effective management of a complex organization with interlaced missions. The addition of reliable data on medical students' education across departments enhances other MBM information and contributes to a picture of the dynamic interconnectedness of missions and departments. In contrast, summation of individual faculty members' efforts (i.e., survey methods) does not necessarily reflect the overall mission of the school and is unlikely to produce an accurate picture of a complex organization. These summations may even obscure direct educational activity and hinder an open discussion of the place of education in a school's mission. The University of Maryland School of Medicine's experience with MBM and the use of our method produce an informative image of a complex environment. In describing this image, it is more informative to describe the forest than to count the trees.

This article was originally published in the December 2002 issue of Academic Medicine.

References

1 Cohen JJ. Financing academic medicine: strengthening the tangled strands before they snap. Acad Med. 1997;72:520.

2 Cohen JJ. Introducing the Mission-Based Management Resource Materials. Washington, DC: Association of American Medical Colleges, 2000.

3 Nutter DO, Bond JS, Coller BS, et al. Measuring faculty effort and contributions in medical education. Acad Med. 2000;75:200–7.

4 Hilton C, Fisher W. A relative-value-based system for calculating faculty productivity in teaching, research, administration, and patient care. Acad Med. 1997;72:787–91.

5 Mallon WT, Jones RF. How do medical schools use measurement systems to track faculty activity and productivity in teaching? Acad Med. 2002;77:115–23.

6 Watson RT, Romrell LJ. Mission-based budgeting: removing a graveyard. Acad Med. 1999;74:627–40.

Appendix A

Estimation of the educational value of learning in a group setting:

 The medical school is developing an internal methodology to measure education in the clinical setting. As part of this methodology we are trying to understand how trainees perceive the amount of teaching they receive in certain situations while clinical care is being delivered. We will evaluate this information along with responses from department leadership and selected faculty to the same questions.

Instructions: 

Please think about the amount of teaching that you, as an individual, receive in one-on-one interaction with an attending. With that in mind, compare it to the teaching that you, as an individual, receive in the situations delineated below. Please try to generalize your answers across all of your third-year required rotations and not to bias your answers based upon your current position or upon some good or bad anecdotal experience.

Please respond to each question in comparison to one-to-one teaching, that is, you and one attending in the clinical setting. This will be considered a 1:1 teaching value. Please do not consider lectures, small groups, or seminars in your answers. For the purpose of this survey, we are interested in education during rounds, ambulatory clinic, the operating room, reviewing films/tests, etc.

Example:

Consider a rotation where one aspect of the rotation has you and several other students with an attending discussing a clinical problem. If you consider that the amount of teaching that you receive from the attending is similar to what it would have been if it had been just you and the attending, you would answer 1:1. If you thought that this experience is less than that, for example one half the benefit, then you would answer 1:2.

  1. Individual + an attending 1:1
  2. Individual + an attending + one resident __
  3. Individual + an attending + group of students (3–5) __
  4. Individual + an attending + group of students and residents (3–5) __
  5. Individual + an attending + group of students and residents (>5) __
  6. Does teaching directed to a higher-level trainee have a similar benefit as that directed to a lower-level trainee?
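The article reports only the resulting relative values (1.0, 0.77, 0.56, 0.36, 0.25), not the scoring rule applied to the survey responses. One plausible reading, sketched below with hypothetical function names, is that an answer of "1:n" is scored as 1/n and scores are averaged across respondents; this is an assumption, not the school's documented procedure.

```python
# Hypothetical scoring of the survey's ratio answers; the article does not
# specify how responses were aggregated into the published relative values.

def ratio_to_value(answer: str) -> float:
    """'1:2' means the teaching felt worth half of one-on-one time, i.e., 0.5."""
    numerator, denominator = answer.split(":")
    return float(numerator) / float(denominator)

def relative_value(answers: list[str]) -> float:
    """Average score across all respondents for a single survey item."""
    return sum(ratio_to_value(a) for a in answers) / len(answers)

# Example: three students rate an item at 1:1, 1:1.5, and 1:2 -> roughly 0.72.
print(round(relative_value(["1:1", "1:1.5", "1:2"]), 2))
```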

Appendix B

An In-depth Look at the University of Maryland School of Medicine’s Method for Measuring Faculty’s Clinical Education Time

Example:

Suppose that for Department 1, the analysis shows that ten third-year students rotate through this department every four weeks. The students spend five hours a day with faculty in the clinical setting. On average, the following teaching modes for the students are described by the department:
▪ Group A: Attending + one medical student (10% of the rotation)
▪ Group B: Attending + two to four medical students (15% of the rotation)
▪ Group C: Attending + one medical student + one resident (11% of the rotation)
▪ Group D: Attending + three to five medical students and residents (32% of the rotation)
▪ Group E: Attending + more than five medical students and residents (32% of the rotation)

Implications:

Calculations for Group A or B. Suppose for the 10% of time spent in Group A that the typical experience is an attending with one third-year student (and no residents) for four hours in a medical clinic. Of that clinical care time, 18% or 0.72 hours is credited as educational time. Since this is Group A teaching mode, the department is credited with one student × 0.72 hours × 1.0 = 0.72 hours of clinical education time per day for third-year students. Thus, the department received 0.18 hours (or 10.8 minutes) of credit for teaching this student for each hour. Since there are five clinical days per week and four weeks per rotation, the total time for Group A is 0.72 × 5 × 4 = 14.4 hours per rotation. For students in Group B, the department would receive 0.138 hours (or 8.25 minutes) of credit for teaching each student for each hour.

Clinical implication. One student and no residents (Group A) would slow down the clinical activity rate, but still allow the attending to perform other independent activities. In one hour, the student might be able to see one patient, develop a diagnosis and treatment plan, discuss the case with the attending and revisit the patient with the attending at which time the attending could evaluate and treat the patient. There is significant time for one-on-one teaching at the student level. The attending has a relatively low “overhead” of getting to know the student, teaching the student, and evaluating the student’s ability and performance. Several students with no resident participation (Group B) would likely slow down the clinical activity rate more significantly and not allow the attending to perform independent activities. There would still be significant time for one-on-one teaching at the student level. The attending’s overhead is higher than for one student.

Calculations for Group C. Suppose for the 11% of time spent in Group C that the typical experience is an attending with one student and one resident for three hours while delivering clinical care in the operating room. Of that clinical care, 18% or 0.54 hours is credited as educational time. Since this is Group C teaching mode, the department is credited with one student × 0.54 hours × 0.56 = 0.30 hours of clinical education time for the third-year student. Thus, the department received 0.10 hours (or 6.0 minutes) of credit for teaching this student for each hour. Since there are five clinical days per week and four weeks per rotation, the total time for Group C is 0.56 × 5 × 4 = 11.2 hours per rotation.

Clinical implications. One student with one resident could have a positive or negative effect on patient flow depending on the training level of the resident and whether the student works with the resident or independently. Since some of the attending’s time is needed to supervise and teach the resident, less time would be available for one-on-one teaching with the student. In addition, the teaching that is not one-on-one with the student would have to be directed at two levels of trainees, reducing its effectiveness at the student level. The attending’s overhead is low for this combination of trainees.

Calculations for Group D or E. Suppose for the 32% of time spent in Group D that the typical experience is an attending making rounds with two third-year students and two residents on the inpatient floor each day for six hours. Of that clinical care, 18% or 1.08 hours is credited as clinical education time. Since this is Group D teaching mode, the department is credited with two students × 1.08 hours × 0.36 = 0.78 hours of clinical education time for the third-year students. Thus, the department received 0.065 hours (or 3.9 minutes) of credit for teaching each student for each hour. Since there are five clinical days per week and four weeks per rotation, the total time for Group D is 1.08 × 5 × 4 = 21.6 hours per rotation. For students in Group E, the department would receive 0.045 hours (or 2.7 minutes) of credit for teaching each student for each hour.

Clinical implications. A small group of students and residents (Group D) would most likely have a positive effect on the clinical activity rate but would leave less time for teaching one-on-one with students. Also, group teaching would have to be directed to the multiple levels of the trainees. The attending's overhead would be higher in this setting. With a larger group of students and residents, the attending would be required to supervise the residents and, therefore, would have less time for one-on-one teaching with the students. Note that although the attending's teaching time in Groups C, D, and E might be less than in Groups A and B, students in Groups C, D, and E might receive additional teaching from residents, a factor which is not assessed in this analysis.
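The per-hour credits quoted in the calculation paragraphs above all follow the same pattern: 18% of clinical time, weighted by the relative value of the teaching mode. They can therefore be checked with a few lines of code; this sketch uses illustrative names and the relative values from the main text.

```python
# Per-hour departmental credit for each teaching mode, per student
# (illustrative names; values follow from the 18% ratio and the relative values).
EDUCATION_FRACTION = 0.18
RELATIVE_VALUE = {"A": 1.00, "B": 0.77, "C": 0.56, "D": 0.36, "E": 0.25}

for mode, weight in RELATIVE_VALUE.items():
    credit = EDUCATION_FRACTION * weight              # hours of credit per contact hour
    print(f"Group {mode}: {credit:.3f} h ({credit * 60:.1f} min) per student per hour")

# Output: A 0.180 h (10.8 min), B 0.139 h (8.3 min), C 0.101 h (6.0 min),
# D 0.065 h (3.9 min), E 0.045 h (2.7 min); the appendix rounds Group B to 0.138 h.
```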

Mission Aligned Management and Allocation: A Successfully Implemented Model of Mission-Based Budgeting

MBM – Mission Aligned Management (Wisconsin)

Gordon T. Ridley, MHA, Susan E. Skochelak, MD, MPH, and Philip M. Farrell, MD, PhD

Abstract

In response to declining funding support and increasing competition, medical schools have developed financial management models to assure that resource allocation supports core mission-related activities. The authors describe the development and implementation of such a model at the University of Wisconsin Medical School. The development occurred in three phases and included consensus building on the need for mission-based budgeting, extensive faculty involvement to create a credible model, and decisions about basic principles for the model. While each school may encounter different constraints and opportunities, the authors outline a series of generic issues that any medical school is likely to face when implementing a mission-based budgeting model. These issues include decisions about the amounts and sources of funds to be used in the budgeting process, whether funds should be allocated at the department or individual faculty level, the specific metrics for measuring academic activities, the relative amounts for research and teaching activities, and how to use the budget process to support new initiatives and strategic priorities. The University of Wisconsin Medical School’s Mission Aligned Management and Allocation (MAMA) model was implemented in 1999. The authors discuss implementation issues, including timetable, formulas used to cap budget changes among departments during phase-in, outcome measures used to monitor the effect of the new budget model, and a process for school-wide budget oversight. Finally, they discuss outcomes tracked during two years of full implementation to assess the success of the new MAMA budget process.

In the early 1990s, medical schools began examining their budgeting processes to align their resource allocations with the fulfillment of their multiple missions. The Association of American Medical Colleges (AAMC) encouraged and supported schools’ widespread interest in mission-based management (MBM) by creating forums for institutional leaders to share their variety of approaches, and several years later developed an operational framework.1–3 Early efforts at conceptualizing and developing models of ways to link academic resources with faculty effort have been described to help other institutions develop their own resource-allocation plans.4 –10 The University Hospital Consortium initiated a related effort for identifying revenue streams (“funds flow”) among schools and their related hospitals and practice organizations.11

Currently, approximately 25% of U.S. medical schools are working on the development of metrics to measure the teaching and academic activities of faculty.12 However, relatively few have implemented systems that link the budgeting process with those metrics.

At the University of Wisconsin Medical School, a method for alignment of resource allocation and academic mission has been developed and is in its second year of implementation. In this article, we report on the process of development, the model’s successes thus far, and the lessons we learned in development and implementation.

The Mission Aligned Management and Allocation Model

The University of Wisconsin Medical School's annual operating budget is approximately $300 million (2000-2001), including university and state funding, extramural grant support, hospital support, and practice plan contributions. Of these funds, the school has direct control over around $70 million received from the university/state and from the practice plan. Approximately two thirds, or $47 million, of these funds are allocated to the school's 25 departments and one third to support such schoolwide needs as facilities, libraries, animal care, and administration. The mission-aligned model is applied to the entirety of the $47 million departmental allocation and not to any other school funds. Allocations to departments are based on quantification of their contributions to education, research, service, and school strategic priorities.

Development

Between 1994 and 1996, a variety of forces prompted the University of Wisconsin Medical School to explore resource alignment and accountability models, eventually naming its plan Mission Aligned Management and Allocation (MAMA). Those forces included the appointment of a new dean, a lean prognosis for the adequacy of existing resources, growing skepticism about the longstanding budget model, and the appointment of many new chairs and other leaders. The school's 14 clinical practice partnerships unified to become the University of Wisconsin Medical Foundation in 1996, founded on principles of accountability, productivity, and academic mission.13 This not-for-profit organization contributes a portion of its revenue to the school and sought a rationalized method for its distribution. For all of these reasons, the context for change was favorable.

The dean of the medical school initiated planning for mission-aligned budgeting in 1995, emphasizing that “process is as important as product.”14 Despite initial support for the concept, three phases of planning for implementation were needed until a final product garnered sufficient acceptance from the various constituencies within the medical school.

Phase 1. A task force representing the school’s many constituencies was assembled in 1995. After a year of work, the task force achieved consensus around acceptable measures of academic activity (research awards, lectures, mentoring, etc.) but fell short of an operational model suitable for implementation. Most significantly, this group established principles that eventually served as a guiding force for future model development. In addition, the first phase involved approximately 100 faculty (including department chairs) who, through informal and formal communication, began the cultural transformation that proved essential to achieve faculty “buy in” and successful implementation.

Phase 2. During Phase 1, it became clear that although the school had a strategic plan in place, more effort was needed to delineate priority programs eligible for preferential allocation of discretionary funds. A second task force, consisting of a subset of chairs and associate deans, worked in 1998 to refine the initial ideas, identify the need for predetermined strategic priorities, and build a climate for greater acceptability for mission-based resource allocation. This group established that departments rather than individual faculty should receive budget allocations, and began to quantify curriculum components for a complex educational program. The task force set the stage for definitive model development, which occurred in the subsequent academic year.

Phase 3. A steering committee of 16 leaders, chaired by the dean with equal numbers of basic science chairs, clinical science chairs, associate deans, and faculty at large, was convened for the final phase of the process. The faculty included members of the medical school’s governing body, the Academic Planning Council— our university-designated “official” governance body. Subcommittees increased the total number of faculty involved to over 60. The group worked for approximately six months, and in July 1999 completed an operational model that was implemented in July 2000. The model has been the sole basis of departmental allocations of medical school funds for the 2000-2001 and 2001-2002 budgets.

Description of MAMA

Each year the school calculates the portion of its total budget, derived from university/state support and the faculty practice plan contribution (sometimes referred to as the “dean's tax” at various institutions), to be allocated to departments. This amount is then divided into five categories (education, research, academic service, leadership development, and the dean's discretionary funding) and allocated as follows (a minimal allocation sketch appears after the feature list below):

▪ Sixty percent to education, based on department contributions to medical student, graduate study, allied health, and undergraduate teaching

▪ Twenty percent to research, based on extramural funding and salaries received from grants

▪ Ten percent to academic service, based on a per-capita distribution 

▪ Ten percent at the dean’s discretion, based on alignment with the school’s strategic priorities 

▪ Two percent to leadership activities such as sponsorship of training programs and participation on key school committees (these funds are a subset of the funds allocated to education)

There are several other features of the plan:

▪ Only academic funds originating in the medical school are allocated, thus excluding extramural or hospital support. 

▪ Each department develops its own allocation methods for distributing these funds to its infrastructure, programs, and facilities, and for faculty compensation. 

▪ Implementation is phased over three years. 

▪ Credit is awarded for faculty effort that crosses departmental lines, such as interdisciplinary courses and research grants. 

▪ Strategic priorities influence allocations of the dean’s discretionary category. 

▪ An oversight committee adjudicates disagreements over application of the model or major policy issues. 

▪ The dean and associate deans evaluate the entire model at two-year intervals. 

▪ Allocations to departments are unrestricted, and chairs have flexibility (within the guidelines of their compensation plans) for allocation of funds to individual faculty. 

▪ There is no faculty self-reporting of academic activity. 

▪ There is a commitment to avoid unintended consequences by monitoring and adjustment. 

▪ The model’s content is transparent to faculty members. 

▪ There are no large shifts of resources between basic science and clinical science departments.
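For clarity, here is a minimal sketch of the top-level split described above. The dollar figure and the function name are illustrative; only the percentages (with the 2% leadership share carved out of the education pool) come from the article, and how each pool is then divided among the 25 departments is governed by the measurement criteria discussed below.

```python
# Illustrative top-level split of the funds allocated to departments under MAMA.
def mama_pools(departmental_allocation: float) -> dict[str, float]:
    education = 0.60 * departmental_allocation
    leadership = 0.02 * departmental_allocation    # a subset of the education share
    return {
        "education": education - leadership,
        "leadership": leadership,
        "research": 0.20 * departmental_allocation,
        "academic_service": 0.10 * departmental_allocation,   # per-capita distribution
        "dean_discretionary": 0.10 * departmental_allocation,  # strategic priorities
    }

pools = mama_pools(47_000_000)   # roughly $47 million flowed to departments in 2000-2001
```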

Decision making to create the model. Mission-aligned budgeting processes require critical decision making around a limited number of issues. The resolution of each issue is described below.

Scope of funds for allocation. The spectrum for potential resources to be included in mission-aligned models is broad, from discretionary, special funds for particular purposes to the inclusion of all funds over which the school has some influence. The Phase 1 plan restricted mission alignment to funding for faculty salaries. However, a consensus developed that all medical school funds provided to departments should be subject to the model in order to strengthen its impact on academic productivity and not create a perception of “protected budgets.” Of the $70 million in funds directly controlled by the dean’s office, approximately one third are used for school-wide purposes such as libraries, animal care, facilities, and information technology. It was considered practical not to submit these expenditures to a mission-aligned model but rather to annually evaluate them to assure that they support the mission. All other funds, such as grants and hospital support, were already restricted in use. At the University of Wisconsin–Madison, student tuition is paid directly to the state of Wisconsin, returned to the university, and becomes one source of the university’s allocations to its schools and colleges.

Relationship to strategic planning. A major obstacle to faculty acceptance of mission-aligned budgeting was the perception that it would either encourage academic activity in a random, indiscriminate manner or carry a strong bias toward status-quo activities. The school had previously undertaken strategic planning, but not comprehensively or linked to budgeting. In 1997, the dean established a faculty task force to develop strategic priorities. It identified six major strategic priorities, since expanded to ten, that form the centerpiece of the school's strategic plan and serve as a guide for mission-aligned budgeting.14

Allocation of funds to departments or to individual faculty. Early on, some questioned how a model solely allocating resources from the school to departments could have an impact on the alignment of academic work with mission, as teaching and research are primarily individual behaviors. A very creditable, precise system might be in place for delivering resources to departments, but if they continued their current methods of compensation to individual faculty— often based more on historic factors than an alignment with the academic mission—the entire purpose of the exercise would be frustrated.

The steering committee decided that the model should measure each department’s academic activity in aggregate, and should allocate school funds to departments on this basis. Departments of the University of Wisconsin – Madison enjoy a strong tradition as the academic and financial home for faculty, and it was concluded that department chairs were best able to judge the individual academic efforts of their diverse faculties. In fact, MAMA takes advantage of chair leadership and has increased the chairs’ authority. Also, there was trepidation about using the MAMA model to develop a school-wide individual compensation model—a “one size fits all” model for 1,200 faculty—which would require an exponential increase in the model’s complexity. Instead, the chosen model measures departmental academic activity in the aggregate, and funds are transferred from the school to the department on an unrestricted basis, allowing the department chair and executive committee discretion in how academic work and compensation are distributed among department faculty.

The question remained, however, of how to obtain alignment at the individual faculty level. The steering committee concluded that departments should replicate the model's principles in their compensation plans and other allocation mechanisms. Department compensation plans must adhere to guiding principles established by the school and the practice organization and are subject to approval by the school's compensation committee. This combination allows flexibility across departments while still providing assurance that school-wide strategic priorities are met.

Proportions of funds allocated to education and research. The model called for annual creation of separate funding pools for education and research, which are then each distributed to departments. One initial and very important question was how to determine the relative sizes of the two pools.

The steering committee decided on a three-to-one ratio for allocation of funds to education and research. This ratio was chosen for a number of reasons, including the acknowledgement that education is supported primarily through tuition and state revenues, and has no other significant source of funding. Research is expected to be predominantly supported from extramural sources at the University of Wisconsin Medical School. The ratio was analyzed through a number of empiric measures that attempted to determine the magnitude of revenues needed to support the teaching mission of the medical school. This exercise was complicated by the fact that it is difficult to separate faculty activity into discrete categories of “teaching, research, service, and clinical practice” and that the indirect costs for teaching have not been well defined. Three separate empiric approaches were used to determine the magnitude of funds allocated to the educational mission: (1) calculation of the faculty full-time-equivalent (FTE) requirements for the number of credit hours offered by the school based on university FTE teaching standards; (2) calculation of the faculty's teaching contact hours using a relative-value-unit factor; and (3) analysis of tuition recovery. These three calculations approximated an absolute cost for education that allowed a consensus to form around the three-to-one ratio for education-to-research funds. This ratio was established firmly before any departmental modeling exercise was performed—a sequence that proved essential when some departments that were found to have budget “gaps” asked to change from a 3:1 ratio to a ratio more favorable to them.

Measurement of academic activity. Methods to measure academic activity were studied and debated intensively during all three phases of model development. These methods included faculty contact hours, revenues generated by research and clinical practice, a relative-value system for weighting academic activity, and individual reporting of comprehensive activities such as publications, presentations, and committee work. The medical school already had access to data that measured academic activity at the department level: courses and clerkships offered by each department, numbers of graduate students, extramural research awards, and numbers of faculty, including those participating in mission-related leadership activities. The steering committee selected global measurement criteria of departmental academic activity, as shown in List 1.

List 1
Measurement Criteria for a Faculty Member’s Academic Activity, University of Wisconsin Medical School, 2001
Courses and clerkships, based on credit hours and enrollment
Mentorship of doctoral students
Extramurally funded research
Faculty salary support obtained from extramural sources
Participation on major academic committees
A global “service” allocation based on number of faculty in each department
Leadership roles for training programs

These measures were described as proxies for academic activity; exclusion from this list did not represent devaluation of a particular faculty member’s work. For example, productive research can be done without extramural funding, but it is difficult to measure and therefore was not chosen as an allocation criterion. Rather, the assumption was made that departments with funded research programs could choose to use portions of their MAMA support for unfunded research. Publications and similar activities were an expected outcome of the measured academic activity, and thus were not an allocation criterion. These benchmarks were expected to be determined at the department level, based on individual faculty roles, and incentives could be created through individual department compensation plans. 

The steering committee and the dean were firmly committed to this level of specificity and have resisted attempts to include efforts at a highly detailed level, many of them requiring faculty self-reporting. These measures will be refined and improved with actual experience.

Resource allocation versus resource identification. The option of developing a medical center funds-flow model (encompassing university, hospital, and practice organization funds) was thoroughly considered,8 either in addition to or in lieu of mission-aligned allocation. In order to create a well defined, achievable end product, the school elected to defer consideration of a medical center funds-flow model and instead focus all energies on tight linkage between academic mission and resource allocation, with a specific target date for implementation. The three medical center entities recognized the need for improving the factual basis for the considerable amount of funds they exchange, and progress will continue on the most important of these.

Implementation timetable. The dean directed that a mission-aligned budget model be initiated in the first fiscal year after the steering committee completed its work and fully implemented after a three-year phase-in period. This became a useful parameter for compressing the group’s work and encouraged the use of readily available and verifiable data sources in the MAMA model.

Formula budgeting versus leadership flexibility. The need for a more transparent, quantitative resource allocation model was obvious, and the measures chosen on which to base the allocation— course direction, lectures, mentoring of graduate students, etc.— were indisputable means of doing so. Some faculty leaders suggested perfecting this method and using it to allocate the entire budget of the school.

However, during Phase 2, department chairs emphasized that the leadership expected of the dean’s office would be undermined by a formula that modeled 100% of all funds. The chairs clearly stated that the dean needed a source of strategic funds to enhance mission outcomes and stimulate change. Without some strategic funds under the dean’s discretion, the school’s need to support emerging areas of research and learning might be forfeited and along with it the dean’s negotiating influence with chairs. There was also a perception that however refined the model became it could never respond to all valid needs of the learning community. Some level of dean’s discretion could make the model responsive to unmeasurable needs of an extremely complex organization.

The solution to this dilemma was an additional category, the Academic Discretionary Fund, equal to 10% of the total allocation to departments. It is completely at the dean’s discretion to allocate to departments and is heavily weighted toward strategic priorities.

Measuring the product. Outcome measures have been defined to allow evaluation of the model’s impact at the completion of the third year of implementation, including teaching quality before and after MAMA, extramural research support, salaries supported by extramural sources, the tendency for faculty to seek teaching roles, and others.

Including graduate medical education (GME). Because GME is a major teaching activity for most clinical departments, there was initial interest in using it as a basis for allocating school funds. However, because funding support for GME rests solely with the school's affiliated hospitals, it was decided to continue treating support for GME as a funds flow between departments and hospitals, and to assure that this funds flow would also be reassessed for fairness and mission alignment.

Implementation without destabilizing departments. The model in its pure form required redistribution of funds among departments, and while no department’s critical mass of funds was threatened, more movement of funds was prescribed than could be immediately accomplished. Implementation will occur over three years, and as departments reorient their academic activity, substantial compliance should be achieved. A formula limits a department’s maximum annual loss to the lesser of two amounts: one third of the formula-derived reduction, or 3% of the department’s revenue from all sources.
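A minimal sketch of the phase-in cap just described (names illustrative; only the "lesser of one third of the formula-derived reduction or 3% of all-source revenue" rule comes from the article):

```python
# Cap on a department's annual budget reduction during the MAMA phase-in.
def capped_annual_reduction(formula_reduction: float, all_source_revenue: float) -> float:
    return min(formula_reduction / 3.0, 0.03 * all_source_revenue)

# E.g., a $900,000 formula-derived reduction for a department with $20 million in
# all-source revenue is phased at min($300,000, $600,000) = $300,000 in year one.
print(capped_annual_reduction(900_000, 20_000_000))
```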

Avoiding manipulation of the model and adjudicating disputes. A faculty committee, advisory to the dean, was established during the first year of model implementation. It was acknowledged that no budget model would ever perfectly reflect all academic work. Therefore, the MAMA Oversight Committee was established to review and revise the model as issues arose during implementation. The committee was appointed by the medical school’s governing body, the Academic Planning Council, and chaired by the senior associate dean for academic affairs. During the first year of implementation, a number of questions were reviewed that resulted in revisions or clarifications to the model.

Discussion—Measures of Success

The University of Wisconsin Medical School's experience with mission-aligned budgeting has been positive to date; the model is now in the second year of implementation. There has been evidence of increased academic productivity at both the department level and the individual faculty level. Even in the early phases, it became clear that department chairs and faculty were motivated to obtain more resources— or prevent loss of resources— by engaging in activities that earned more support under the MAMA plan. For example, discussion at department meetings began to focus on how to place more salaries on grants and involve more faculty members in teaching. One advantage of the department-linked MAMA method is that the role of the department chair as a manager and motivator has been strengthened. In addition, the chairs can now reinforce faculty accountability to the medical school. They have emphasized, with dean's office guidance, encouraging their faculty to “close MAMA gaps” by methods shown in List 2.

List 2
Methods for Departments to Close Gaps in the Mission Aligned Management
and Allocation System, University of Wisconsin Medical School, 2001

Provide more instruction in more courses
Participate in more education leadership roles
Allow attrition (don’t fill faculty vacancies)
Attract graduate students
Use gift funds
Obtain more salary support from grants
Prepare more grant proposals and receive more awards
Provide service on targeted committees
Contribute activities that match strategic priorities
Create named professorships

Other changes have been correlated with the MAMA plan’s implementation. Course directors have reported that it has been easier to recruit faculty for medical student teaching. There is new interest and energy from faculty for educational programs, as evidenced by a number of new courses and clerkships that have been proposed. For example, a new clerkship in radiology and an integrated neurosciences clinical clerkship have been approved. Increased research productivity has been correlated with the implementation of MAMA. When compared with the pre-MAMA plan base year (1997), total research awards to medical school departments and research awards to clinical departments have both increased by 25% in three years.

Concomitantly, some “gaming” of the system has already been evident, including strenuous vying for curriculum time. A checks-and-balances system has been developed to try to shield the educational mission from mercenary goals. The Educational Policy Council (curriculum committee) operates separately from the MAMA Oversight Committee and is charged with the responsibility of maintaining curriculum standards and quality. The Educational Policy Council has defined a limit on the number of credit hours and contact hours that will be available for medical student teaching each semester, to limit the tendency for additional courses to enhance department revenues rather than to support the academic goals of medical education. The council has developed competency standards for each year of medical education and is planning for ongoing curricular revision without consideration of budget implications. The MAMA Oversight Committee is charged with phasing in implementation of budget changes that result from curricular revision, ideally separating the educational standards from direct influence of departmental budget considerations.

Despite the three-year implementation plan there is still work to be done, as the MAMA model is considered a work in progress. Further refinement of longitudinal outcome measures is occurring to help assess whether the mission-aligned budgeting process has indeed helped to achieve the school’s mission-related goals and strategic priorities. More specific measures of quality, especially in teaching, are being developed for courses and clerkships, including position descriptions for course and clerkship directors. The credit hour and enrollment measures for education are good approximations of contact hours, but are unduly influenced by the number of small groups that are offered within courses, and refinement of these measures is under discussion. Finally, the clinical practice plan and the university hospital are working on plans to align their resources more closely with activities that support their core missions. This last step has always been anticipated so that departments can respond to mutually compatible reward systems from the school, hospital, and practice organization.

Conclusion

The MAMA budget process at the University of Wisconsin Medical School has helped focus attention on the school’s prime mission and strategic goals and helped define the roles of departments and individual faculty in achieving those goals. It has given the school, especially the dean’s office and department chairs, a tool for motivating behavior in support of the academic mission and allowed all constituencies to see how the school’s resources are allocated. While the initial outcomes have been positive at this stage of the second year of implementation, careful monitoring and refinement are necessary to ensure that the alignment of the budgeting process with academic mission is truly helping the University of Wisconsin Medical School achieve its mission of meeting the health needs of Wisconsin and beyond through excellence in education, research, patient care, and service.

This article was originally published in the February 2002 issue of Academic Medicine.

References
1 Nutter DO, Bond JS, Coller BS, et al. Measuring faculty effort and contributions in medical education. Acad Med. 2000;75:200–7.

2 Holmes EW, Burks TF, Dzau V, et al. Measuring contributions to the research mission of medical schools. Acad Med. 2000;75:304–15.

3 Association of American Medical Colleges. Mission-Based Management Program: Introducing the MBM Resource Materials. Washington, DC: Association of American Medical Colleges, 2000.

4 Cramer SJ, Ramalingam S, Rosenthal TC, Fox CH. Implementing a comprehensive relative-value-based incentive plan in an academic family medicine department. Acad Med. 2000;75:1159–66.

5 Johnston MC, Gifford RH. A model for distributing teaching funds to faculty. Acad Med. 1996;71:138–40.

6 Garson A, Strifert KE, Beck JR. The metrics process: Baylor's development of a "report card" for faculty and departments. Acad Med. 1999;74:861–70.

7 Scheid DC, Hamm RM, Crawford SA. Measuring academic production—caveat inventor. Acad Med. 2000;75:993–5.

8 Watson RT, Romrell LJ. Mission-based budgeting: removing a graveyard. Acad Med. 1999;74:627–40.

9 Watson RT, Suter E, Romrell LJ, Harman EM, Rooks LG, Neims AH. Moving a graveyard: how one school prepared the way for continuous curriculum renewal. Acad Med. 1998;73:948–55.

10 Watson RT. Managed education: an approach to funding medical education. Acad Med. 1997;72:92–3.

11 Burnett DA. Funds flow initiative: collaborative efforts supporting mission-based business transformation. Paper presented at the University Health System Consortium, Oakbrook, IL, October 15, 1998.

12 Mallon WT, Jones RF. How do medical schools use measurement systems to track faculty activity and productivity in teaching? Acad Med. 2002;77:115–23.

13 Rao V, Farrell PM. Transformation of faculty practice organizations: the University of Wisconsin experience. Paper presented at the Council of Deans Spring Meeting, Santa Barbara, CA, April 18, 1999.

14 Bonazza J, Farrell PM, Albanese M, Kindig D. Collaboration and peer review in medical schools' strategic planning. Acad Med. 2000;75:409–18.