
Performance appraisal in a matrix

Author: Kevan Hall

“I’m accountable for performance appraisal in the matrix, but more than half the work happens outside my line of sight.” This frustration is common in matrix organizations, where direct line managers carry formal accountability for performance decisions while much of the individual’s work happens through projects, cross-functional initiatives, and stakeholders who do not manage the individual and do not have visibility of the range of other things they do. Many company frameworks acknowledge that dotted line managers “provide input”, yet they often stop there. Without a clear method, that input often creates more noise than clarity. Managing performance is one of the common challenges covered in our guide to matrix management.

Predictable challenges in performance evaluation in the matrix

  1. Proxy wars. Performance conversations become a continuation of unresolved resource or priority conflicts between managers. Feedback is shaped less by observed contribution and more by whose agenda won the quarter. The employee becomes collateral damage in a broader power struggle.
  2. Narrative capture. In the absence of agreed evidence standards, the loudest or most confident stakeholder defines the story. Recency bias, charisma, and storytelling skill outweigh substance. Over time, this rewards visibility over value and erodes trust in the process.
  3. Proximity bias. Hybrid and virtual contributors are systematically underrated because their impact is less observable. Work done asynchronously, across time zones, or through influence rather than instruction disappears from traditional appraisal lenses, even when outcomes are strong.

From the individual’s perspective, the appraisal feels unfair and is more likely to damage performance than to improve it.

What counts as credible performance evidence in a matrix?

To reduce politics, matrix organizations need a Minimum Viable Evidence standard. This does not mean more documentation or complex scoring models. It means being explicit about what counts as valid evidence of performance in cross-functional work.

Credible evidence in a matrix includes delivered outcomes, not effort. It includes decisions influenced or unblocked, not meetings attended. It also includes commitments made across functions and whether they were kept, and stakeholder impact, both positive and negative, that can be traced to specific actions.

Without agreed and flexible priorities, how would a direct line functional manager evaluate the performance of one of their people who (quite correctly) prioritized cross-functional team results over functional goals?

By contrast, weak evidence includes visibility, responsiveness at all hours, calendar density, or how often someone speaks in meetings. These signals feel persuasive, yet they correlate poorly with real contribution in matrix environments.

When solid and dotted line managers anchor their input to shared evidence standards, feedback becomes comparable and discussable rather than personal.

Best practice is a formal 360-degree feedback system, where individuals get input from a range of people who can accurately evaluate their performance across multiple teams and with multiple, sometimes competing, stakeholders.

If you don’t have a formal 360 process, set up an informal one by asking for input from a richer range of observers.

How should solid and dotted line managers work together?

Effective matrix performance reviews rely on a triad-based cadence involving the employee, the solid line manager, and the dotted line manager. Two moments in that cadence matter far more than the annual review itself.

The first is expectation-setting early in the cycle. This is where the triad aligns on what good performance looks like across both lines. It clarifies which outcomes matter, which stakeholders count, and where trade-offs are expected. This conversation dramatically reduces later disagreement because managers are calibrating against the same definition of success.

The second moment is calibration late in the cycle. This is not about negotiating scores. It is about separating impact from preference. Managers and other raters compare evidence, test assumptions, and surface differences in interpretation before conclusions harden. Importantly, this happens before ratings are finalized, not after trust has already been damaged.

As part of these processes, solid and dotted line managers need to agree explicitly who will do what, as detailed in our blogs on solid and dotted line management and on the role of the function and the team in cross-functional teams.

What should dotted line input actually look like?

Assuming the solid line manager drives the overall process, dotted line feedback should be short, specific, and grounded in observed behavior and outcomes. A simple structure helps prevent overreach and keeps the focus on contribution.

First, where the dotted line manager drives cross-functional activity, they should be the one to evaluate performance on that part of the individual’s job.

They should also give input on questions such as: what I needed from you and whether it was delivered; what I observed you do that created value or risk; where you unblocked progress across boundaries; and one change that would increase your cross-functional impact next cycle.

This format makes the dotted line role explicit. It provides evidence, not verdicts. It informs and adds to the solid line manager’s judgment.

How should disagreements be handled without escalation?

Disagreement between solid and dotted line managers is common in a matrix. The risk is escalation by default, where differences are pushed upward instead of worked through. To avoid this, organizations need a shared norm: disagreement triggers inquiry, not advocacy.

When perspectives diverge, managers return to the agreed evidence standards and early expectations. They ask what data would change their view and whether the disagreement reflects different information or different preferences. Only when accountability or integrity is at risk should escalation occur. This approach reinforces accountability without control rather than reverting to hierarchy.

How do hybrid and virtual teams avoid bias in reviews?

Hybrid and virtual work amplifies invisibility and proximity bias. To counter this, managers must deliberately design for contribution visibility rather than rely on memory. Lightweight tracking of outcomes, decisions, and cross-functional commitments creates a shared record that reduces reliance on anecdote.

Managers should also space feedback across the cycle instead of concentrating it at year end. Distributed contribution becomes visible when it is noticed in context, not reconstructed months later. This discipline directly supports fairness in virtual and hybrid teams.

What is the practical takeaway for leaders and HR?

Matrix performance reviews do not fail because people lack goodwill. They fail because organizations lack a common method. By defining Minimum Viable Evidence, using a triad cadence, and structuring solid and dotted line input, leaders can turn a political process into a developmental one.

If you need to improve your matrix management capability, see our definitive guide or speak to one of our specialists.
