Why change management is the decisive factor in Legal Ops success — three proven frameworks, the psychology of lawyer adoption, and a practical four-phase approach to embedding lasting behavioural change.
## Why Change Fails (And How to Prevent It)
Most legal operations initiatives fail not because of bad technology or poor strategy, but because the people dimension is underinvested. This is not a new observation, but it is a consistently misapplied one. Organisations spend millions on a new contract lifecycle management system, on process redesign, on ambitious technology implementations — and then spend a fraction of that on helping their teams actually adopt the change.
The research is consistent. Thomson Reuters’ *Future of Professionals* report (2025) identified adoption and behaviour change as the primary failure mode for legal technology investments. Not feature gaps. Not technical problems. People.
This chapter treats change management as a professional discipline in its own right — not as a soft add-on to be bolted on after the technology decision is made. How your team responds to change, how they adopt new tools and processes, and how they sustain new ways of working determines whether your investment delivers its intended value. This is the decisive difference between success and expensive underperformance.
## Three Frameworks for Understanding Change
Different change management frameworks emphasise different dimensions of the change process. Understanding three complementary approaches — Kotter’s 8-step model, ADKAR, and the People-Process-Technology framework — gives you diagnostic lenses for identifying why adoption stalls and what to do about it.
### Kotter’s 8-Step Model
John Kotter’s seminal research on organisational change identified eight steps that successful change initiatives follow:
**1. Create urgency.** Establish a clear, credible case for why change is necessary. Without genuine urgency, people defer new behaviours, waiting to see if the change will actually stick.
**2. Build a guiding coalition.** Assemble a leadership group that believes in the change and has the credibility to drive it. This coalition is not necessarily the most senior people — it is people who are trusted, respected, and perceived as having “skin in the game.”
**3. Form a strategic vision.** Define a compelling picture of the future after change. This vision should be clear enough to guide decision-making but compelling enough to inspire discretionary effort.
**4. Enlist a volunteer army.** Move beyond the guiding coalition to recruit broader participation. People who choose to support change are more effective advocates than people who feel obligated.
**5. Enable action by removing barriers.** Identify and eliminate the obstacles that prevent new behaviour. Often these are not technical but organisational — policies, processes, or power structures that reward old ways of working.
**6. Generate short-term wins.** Create visible, tangible successes early. These wins build momentum, prove the change is working, and counter the narrative of “this will never stick.”
**7. Sustain acceleration.** Build on early wins rather than declaring victory prematurely. Change requires sustained effort; the risk of backsliding increases if momentum dissipates.
**8. Institute change.** Embed the new way of working into systems, processes, and culture so it becomes “how we do things here” rather than a temporary initiative.
A consistent pattern in legal technology deployments is that organisations skip steps 1, 2, and 8. They move directly to implementing the technology and running some training, but fail to establish genuine urgency (step 1), assemble respected leadership to champion the change (step 2), or institutionalise the new way of working (step 8). This produces the failure pattern seen repeatedly: initial adoption after launch, followed by gradual drift back to legacy processes as the initiative loses institutional support.
### ADKAR
The ADKAR model, developed by Prosci, provides a diagnostic framework for understanding what is blocking adoption when it stalls. ADKAR is an acronym for the five dimensions required for successful change:
**Awareness** — Do people understand that change is needed? Do they grasp the business case and the consequences of not changing?
**Desire** — Do people want to support the change? Are they motivated to participate, or are they complying reluctantly?
**Knowledge** — Do people know how to change? Do they have the skills, training, and understanding required to work the new way?
**Ability** — Can people demonstrate the new way consistently and independently? Have they moved from knowing to doing?
**Reinforcement** — Are the new behaviours sustained and reinforced, or is there pressure to revert to old ways?
ADKAR’s power is diagnostic. If adoption is stalling at week 6 of a 12-week rollout, the question is not “why aren’t people using the new tool?” but “which ADKAR element is the bottleneck?” An associate who knows precisely what the new CLM does, has attended training, and understands the business case but still sends contracts via email does not have an Awareness or Knowledge gap — she has an Ability or Desire gap. She may lack confidence in the tool under time pressure, or she may lack personal incentive to change. The intervention is different depending on which element is failing.
Map your current adoption initiative against all five ADKAR elements. Where is the bottleneck? Intervene directly at the broken element rather than re-running generic training in hopes it will fix everything.
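Because ADKAR is sequential, the first weak element is the bottleneck: there is no point training for Knowledge if Desire is missing. That triage logic can be sketched in a few lines of Python. This is an illustrative sketch only; the pulse-survey scoring, the 3.0 threshold, and the intervention wording are hypothetical examples, not Prosci's instrument.

```python
# Illustrative ADKAR bottleneck triage. Survey scale, threshold, and
# interventions are hypothetical examples for this sketch.

ADKAR_INTERVENTIONS = {
    "awareness": "Re-communicate the business case and consequences of not changing.",
    "desire": "Address WIIFM: articulate the personal benefit for this group.",
    "knowledge": "Run targeted, role-specific training on the identified gap.",
    "ability": "Provide coached practice and floor-walker support under real workload.",
    "reinforcement": "Add recognition, metrics visibility, and manager follow-up.",
}

def adkar_bottleneck(scores: dict[str, float]) -> tuple[str, str]:
    """Return the first weak ADKAR element and a matching intervention.

    The model is sequential, so elements are checked in order: the first
    element scoring below threshold blocks everything after it.
    """
    for element in ("awareness", "desire", "knowledge", "ability", "reinforcement"):
        if scores.get(element, 0.0) < 3.0:  # below 3 on a 1-5 pulse survey
            return element, ADKAR_INTERVENTIONS[element]
    return "none", "No bottleneck detected; maintain reinforcement."

# Example: trained associates who understand the case but avoid the tool.
element, action = adkar_bottleneck(
    {"awareness": 4.5, "desire": 2.1, "knowledge": 4.0, "ability": 3.2, "reinforcement": 3.5}
)
print(element, "->", action)
# prints: desire -> Address WIIFM: articulate the personal benefit for this group.
```

The ordering in the loop is the point of the sketch: a team with both an Awareness gap and a Knowledge gap gets the Awareness intervention first, because training people who do not yet believe change is needed wastes the training.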
### The People-Process-Technology (PPT) Framework
The People-Process-Technology framework rests on a simple but profound observation: sustainable organisational change requires alignment across all three pillars. They are interdependent.
**Technology** performs best when it supports well-designed processes. A tool that automates a broken process simply produces a faster broken process. Conversely, a well-designed process supported by poor technology creates friction that eventually drives users back to workarounds.
**Processes** perform best when they are intuitive, well-documented, and aligned with how people actually work. A process that ignores human cognition or behaviour will be subverted — people will find workarounds, especially in high-pressure environments.
**People** are the most important pillar to invest in first. Engagement, motivation, training, and behaviour change are the hardest elements to shift and the longest to embed. If you optimise people last, you will have built expensive technology supporting imperfectly designed processes with people who lack either the knowledge or the motivation to use them.
In practice, this means:
1. **Start with people.** Understand the current behaviours, motivations, and blockers. What is the existing workflow that the new process will displace? What do people value about it? What pain points are you solving? If the people dimension is not solved, technology and process redesign alone will fail.
2. **Design process around how people work.** Map the actual workflow, not the ideal workflow. Involve the people doing the work in process design. A process designed by someone who has not done the work in years will miss critical decision points and dependencies.
3. **Choose technology that supports the process.** The technology is a tool, not the change itself. Its job is to make the new process easier, not to impose a predetermined way of working that ignores how the team actually functions.
4. **Reinforce the people pillar continuously.** Long after technology is deployed and processes are documented, behaviour change is still fragile. Reinforcement, recognition, storytelling, and periodic retraining sustain the change and counter the constant pressure to revert.
## The Psychology of Lawyer Adoption
Lawyers as a professional cohort bring specific behavioural characteristics shaped by their education, professional incentives, and daily work. Understanding these dynamics is not about caricature — it is about designing change strategies that work with how lawyers actually think and operate, leveraging their professional strengths rather than treating their natural instincts as obstacles.
**Risk aversion as professional identity.** Lawyers are trained extensively to identify risk and protect against downside outcomes. They are paid to worry. A new tool or process represents an unknown, and unknowns carry risk. The instinct to question the unfamiliar is not obstruction — it is professional due diligence. A lawyer questioning a new workflow is executing their professional training. Change management strategies that acknowledge this — building in a testing phase, inviting scepticism, designing small-scale pilots — align with rather than resist this professional reflex.
**Expertise-based authority.** The legal profession’s value proposition rests fundamentally on individual expertise. A lawyer’s professional standing is built on depth in their subject matter. A tool that claims to perform work that previously required lawyer judgment challenges the professional’s sense of indispensability. The subconscious response is often: “If a machine can do what I do, what am I worth?” This is not paranoia — it is a rational response to a genuine threat to professional identity. Effective change communication directly addresses this by positioning the tool as elevating the lawyer’s work, freeing them from commodity tasks to focus on judgment-based advisory where their expertise is truly irreplaceable.
**Perfectionism and control.** Legal work product faces external scrutiny — from courts, regulators, counterparties, and boards. The tolerance for error is near zero in many contexts. Ceding control over any part of the work product to a technology introduces a perceived quality risk that many lawyers find challenging. This is not paranoia either — it is professional prudence. Change strategies that build in lawyer oversight, validation, and final sign-off reduce the perceived risk and recognise the lawyer’s legitimate role as the final quality gate.
**Billable hour conditioning.** For lawyers trained in private practice, time spent learning a new tool is time not spent billing. This creates a deeply ingrained habit of optimising for immediate output over long-term efficiency. Even in-house, where billable hours are irrelevant, the psychological habit persists. Time spent in training or learning a new system feels unproductive. Change strategies that account for this — showing time savings concretely, positioning the investment as “reclaiming time,” offering training in small increments rather than marathon sessions — align with how lawyers actually experience their relationship to time.
**Autonomy expectation.** Senior lawyers, particularly partners and general counsel, are accustomed to high autonomy. Standardised processes and mandatory workflows feel like constraints on professional judgment — even when the intent is to free that judgment for higher-value work. Autonomy is a fundamental professional value in law; it is not obstruction. Change strategies that preserve autonomy within guardrails — “we are standardising the routine work so you can exercise independent judgment on the complex work” — are more successful than approaches that attempt to impose standardisation on the entire workflow.
Effective change management accounts for these real psychological and professional dynamics. A change plan that works with these dynamics — rather than against them — dramatically outperforms one that treats adoption as a training problem alone, regardless of how excellent the underlying technology is.
## The WIIFM Principle
### What’s In It For Me?
Every individual asked to change their behaviour — adopt a new tool, follow a new process, report data in a new format — implicitly asks: “What’s in it for me?” If the answer is unclear, unconvincing, or absent, adoption stalls.
The most effective approach is to frame WIIFM at the individual level rather than the organisational level. “This will save the department £500K per year” is a compelling board-level argument. It lands very differently with the associate who needs to spend 45 minutes learning a new system. Individual adoption accelerates when each person sees a personal, tangible benefit.
Effective WIIFM must be **personal and immediate**:
**For senior lawyers (partners, GCs):** Your direct reports work more efficiently, escalating only genuinely complex matters that require your judgment. You spend less time on administrative oversight and more time on the strategic advisory and leadership work you entered the profession to perform. Your matter management metrics improve, giving you clearer insight into your function’s performance.
**For mid-level lawyers (senior associates, counsel):** This reclaims 5–8 hours per week from administrative overhead, redirecting your time to substantive client work where you build your expertise and strengthen client relationships. Your matter management data flows automatically into your cycle time metrics, visibly demonstrating your efficiency — both of which support your case for advancement.
**For junior lawyers:** This embeds the firm’s negotiation positions and standard clause language directly into your workflow, accelerating your professional development. You spend more time on substantive legal analysis and less time on precedent searches or guessing at what the “house way” is on a particular clause.
**For business stakeholders:** You get your contracts faster, with real-time visibility into where every request sits in the queue. You stop spending time chasing legal for updates because the system notifies you automatically.
### Implementing WIIFM
Map every stakeholder group affected by the change. For each group, define the specific, personal benefit they will experience — not the organisational benefit, but what each person gains individually.
Build these benefits into training materials, launch communications, and ongoing reinforcement. When you talk to a partner about the new CLM, lead with “this frees you from administrative work to do strategic work.” When you talk to an associate, lead with “this saves you 6 hours a week.” When you talk to business stakeholders, lead with “you get faster turnaround and visibility.”
Measure adoption by stakeholder group and investigate promptly when any group falls below target. The adoption gap almost always maps to a WIIFM gap. If partners are not directing their teams to use the tool, you do not have a training problem — you have failed to articulate a compelling WIIFM to partners.
## Managing Shadow AI
### The New Governance Challenge
Shadow IT — the use of unapproved technology tools by employees — has been a governance challenge for over a decade. In 2026, Shadow AI has emerged as the more consequential successor. Lawyers and business users are independently adopting generative AI tools — ChatGPT, Claude, Gemini, and a proliferation of legal-specific AI applications — often without organisational approval, oversight, or governance.
The risks are material and distinct from traditional Shadow IT concerns:
**Confidentiality breach.** Lawyers inputting client-privileged information, deal terms, litigation strategy, or sensitive business data into consumer AI tools risk waiving privilege and breaching confidentiality obligations. Most consumer AI tools’ terms of service explicitly permit the provider to use input data for model training, and the data may be exposed to other users or retained indefinitely. This is not a theoretical risk — regulatory bodies including the Law Society of England and Wales have already issued guidance warning against inputting client information into public AI tools.
**Accuracy risk.** AI-generated legal analysis requires validation by a qualified lawyer to ensure accuracy. AI systems can generate confident, plausible assertions that are factually incorrect — hallucinations in the technical literature. When unvalidated output reaches clients, counterparties, or regulators, liability exposure increases significantly. Validated AI use, by contrast, reduces downstream review burden and catches errors before they escape the organisation.
**Regulatory non-compliance.** Jurisdictions including the EU (under the AI Act) and Australia (under evolving AI governance frameworks) are imposing obligations on the use of AI in regulated contexts. Uncontrolled Shadow AI use may inadvertently trigger compliance obligations the organisation is not yet meeting — audit trails, bias testing, or documentation of AI decision-making.
**Data sovereignty.** Consumer AI tools process data on infrastructure that may not comply with the organisation’s data residency requirements or the data protection obligations in its client contracts. A tool that processes prompts on overseas cloud infrastructure may breach the data-residency commitments those contracts require.
### The Governed Enablement Framework
Organisational strategy in 2026 must account for AI’s genuine productivity benefits while establishing governance that is pragmatic rather than prohibitive. A blanket prohibition drives adoption underground, making usage invisible and unmanageable. The effective response is a **governed enablement model** that harnesses AI’s benefits within a compliance framework:
**1. Establish an AI Acceptable Use Policy.** Define what AI tools are approved, what data can be input, and what use cases are permitted. The policy should distinguish between consumer-grade tools (restricted or prohibited) and enterprise-grade tools with appropriate security, privacy, and auditability controls (approved for specific use cases). The policy should also clarify what constitutes confidential data and which roles are authorised to use AI on specific data types.
**2. Provide approved alternatives.** If lawyers are using ChatGPT because the organisation has not provided a sanctioned AI tool, the Shadow AI problem is fundamentally a supply problem, not a demand problem. Deploy enterprise AI tools with appropriate security, privacy, and auditability controls. Give people a sanctioned channel that solves their productivity problem. If the official channel is clunky or slow, people will use the faster unsanctioned alternative.
**3. Train on responsible use.** Every legal team member should complete training on AI-specific risks: confidentiality, accuracy validation, bias, and regulatory requirements. This training should be practical and scenario-based — “here is a confidentiality question and here is how you get an answer safely” — rather than abstract. Training should be regularly updated as the technology and regulatory landscape evolves.
**4. Monitor and guide.** Deploy network monitoring to detect the use of unapproved AI tools. Make the response proportionate and constructive. Initial instances are treated as coaching opportunities, not disciplinary events, with escalation reserved for repeated use after training. The goal is sustainable behaviour change through education and accessible approved alternatives, not enforcement through fear.
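The monitor-and-guide step can be sketched in code, assuming IT can export proxy or DNS logs as (user, domain) pairs. This is a minimal illustration of the coaching-first, escalate-on-repeat policy described above; the domain watchlist and thresholds are placeholders to be maintained with your IT security team, not an authoritative list.

```python
# Sketch of Shadow AI detection from proxy/DNS logs. The watchlist and
# thresholds are illustrative assumptions; maintain your own with IT security.
from collections import Counter

AI_DOMAIN_WATCHLIST = {
    "chat.openai.com", "chatgpt.com", "claude.ai", "gemini.google.com",
}

def flag_shadow_ai(log_records, coaching_threshold=1, escalation_threshold=5):
    """Count watchlist hits per user and classify the proportionate response:
    initial use triggers a coaching conversation; repeated use escalates."""
    hits = Counter(
        user for user, domain in log_records if domain in AI_DOMAIN_WATCHLIST
    )
    return {
        user: ("escalate" if n >= escalation_threshold else "coach")
        for user, n in hits.items()
        if n >= coaching_threshold
    }

records = [("alice", "chatgpt.com")] * 6 + [("bob", "claude.ai"), ("carol", "intranet.local")]
print(flag_shadow_ai(records))  # {'alice': 'escalate', 'bob': 'coach'}
```

Note what the output drives: a conversation, not a sanction. The function surfaces who needs coaching and who has continued after training, which is exactly the distinction the governed enablement model asks you to make.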
Shadow AI is actively present in most legal organisations today. Establishing a governed enablement framework now allows your organisation to capture productivity benefits while maintaining risk control and compliance. Organisations that move first in this space are building competitive advantage through responsible, visible, auditable AI integration.
## The Four-Phase Adoption Plan
Most change management failures are traceable to insufficient time spent in the early phases. Organisations rush from decision to launch, missing the foundation-building work that determines whether adoption will stick.
### Phase 1: Awareness (Weeks 1–2)
**Objective:** Establish urgency, answer “why,” and frame the personal benefit.
**What to do:**
- Communicate the change clearly through multiple channels: email, team meetings, intranet, Slack, one-on-ones. Repetition builds clarity.
- Lead with the problem being solved, not the tool. “We are reducing time spent chasing contract status” not “we have a new CLM.”
- Address the implicit “what’s in it for me?” directly. What does each stakeholder group gain?
- Acknowledge the cost of change. “This will require learning something new” is honest. “This will feel disruptive initially but is worth it” is credible.
- Invite questions without defensiveness. Questions are information about what people are concerned about.
**Success criteria:**
- At least 80% of stakeholders can articulate why the change is happening and what benefit they personally will experience.
- Questions and concerns are being surfaced rather than silently harboured.
### Phase 2: Training (Weeks 3–4)
**Objective:** Build knowledge and confidence through role-specific, hands-on training.
**What to do:**
- Deliver differentiated training by role. Senior lawyers get a 30-minute executive briefing focused on strategic benefits and oversight responsibilities. Working-level lawyers get hands-on workshops with their actual work examples. Support staff get operational training on their specific tasks. One-size-fits-all training serves no one well.
- Use live examples. Show a contract they recently worked on. Walk through how the new process would handle it. Make it concrete, not abstract.
- Emphasise that the first month post-launch will include intensive support. People are more willing to try something new if they know help is immediately available.
- Record sessions if possible. People will need to rewatch; live sessions are rarely sufficient for retention.
**Success criteria:**
- At least 80% of stakeholders complete training.
- Post-training surveys show confidence in ability to use the new tool/process.
- Role-specific training is differentiated, not generic.
### Phase 3: Supported Launch (Weeks 5–8)
**Objective:** Build competence and confidence through intensive support and rapid problem-solving.
**What to do:**
- Go live with intensive support immediately available. Assign “floor walkers” or digital champions who can answer questions in real time. These are not trainers — they are peers who have learned the tool and are stationed where people work, able to answer quick questions without requiring a formal support ticket.
- Establish a dedicated support channel (Slack, email, or helpdesk) with committed response times — ideally same-day for urgent issues. Fast response during launch is critical; waiting three days for a response creates frustration and drives users back to workarounds.
- Track adoption daily during this phase. You need to see: How many people are logging into the tool? How many are completing tasks? Where are they dropping off? This granularity allows rapid intervention.
- Intervene immediately when usage drops. If a team is not adopting, find out why. Is it a knowledge gap? Is the tool slower than the legacy process? Is there a specific manager not championing it? Act before the drop-off becomes entrenched.
- Celebrate early wins publicly. “The contracts team is turning around reviews 3 days faster” creates positive momentum.
**Success criteria:**
- At least 60% of target users are actively using the new tool/process by end of week 6.
- Support response times are consistently met.
- Issues are resolved within 24 hours.
- Adoption is accelerating, not stalling.
### Phase 4: Reinforcement (Ongoing)
**Objective:** Sustain behaviour change, counter backsliding, and build the new way of working into standard operations.
**What to do:**
- Publish adoption metrics monthly. Show progress toward targets. Celebrate teams that are adopting successfully. This creates peer pressure to adopt — in a good way.
- Share stories of how the tool saved specific team members time or improved their work. “Sarah used to spend 6 hours per week searching precedent contracts — now she spends 45 minutes. She is using the time to develop new client relationships.” These stories are more persuasive than any metric.
- Address laggards individually. The conversation is “how can we help you?” not “why aren’t you using this?” Laggards often have legitimate blockers — the tool doesn’t work for their specific workflow, they lack confidence, or they have a workaround that still works. Listen and intervene based on what you hear.
- Build the new tool or process into performance reviews and team objectives. What gets measured and rewarded gets done. If adoption is not connected to performance conversations, people infer it is optional.
- Plan for training refreshers at 6 months and 12 months. New team members will need training; people will have forgotten features they have not used. Refreshers are normal and expected.
**Success criteria:**
- At least 80% of target users are consistently using the new tool/process by end of month 4.
- Adoption is sustained above 75% at month 12.
- The new way of working is becoming “how we do things here” rather than “the new tool the organisation is making us use.”
## Change Champions
The four-phase approach is executed more effectively with peer advocates driving adoption from within the team. These are change champions or “floor walkers” — not formal project resources but respected peers who have embraced the change and help others do the same.
### Identifying Champions
The best change champions are:
- **Early adopters.** They grasp the benefit quickly and are naturally curious about new tools.
- **Respected peers.** People listen to them. This is not automatically the most senior person — it is the person whose judgment colleagues trust and whose opinion influences behaviour.
- **Credible teachers.** They can explain concepts clearly and patiently. Some early adopters are enthusiasts who leave people behind; good champions are teachers.
- **Not just managers.** Peer champions are often more influential than managers. A senior associate’s peer adoption influences her more than her director’s mandate.
Identify champions early — even before launch. Involve them in design and testing. They become your stakeholder advisory board and your implementation team.
### What Champions Do
- **Answer questions in real time.** They sit with the team or are on Slack during the supported launch phase, answering quick questions without requiring a formal support ticket.
- **Demonstrate value to peers.** They share their own workflows, showing how they are getting value. Stories from peers are more persuasive than training materials.
- **Surface feedback.** They gather concerns, friction points, and suggestions from the team and feed them back to the project team. They are your early warning system for adoption problems.
- **Model the behaviour.** They use the tool consistently and visibly. They ask their colleagues “how did you do that in the new system?” normalising the new way of working.
### Supporting Champions
- **Recognition.** Acknowledge their contribution explicitly. Recognise them in team meetings and performance conversations.
- **Time allocation.** They need protected time to support peers. If they are expected to do their normal job plus champion a tool, the championing suffers. Budget their time.
- **Access to the project team.** Give champions direct access to the project team for escalations and feedback. They should not be a bottleneck between the team and the project; they should be a conduit.
- **Training-the-trainer support.** Equip them with deeper knowledge than the general user base so they can confidently answer complex questions.
## Measuring Adoption
Adoption is not binary — it is a spectrum. Moving from “zero usage” to “all users are expertly using the tool” is a progression, and understanding where you sit on that spectrum and where the blockers are allows targeted intervention.
### Key Adoption Metrics
**Tool activation rate:** What percentage of target users have logged into the system? This is the broadest metric — have people even tried? Target: 80% within 2 weeks of launch.
**Active usage rate:** What percentage of target users are using the tool on a daily or weekly basis? This is a better proxy for genuine adoption than activation. A user who logs in once and never returns is activated but not adopting. Track both daily and weekly active users. Target: 60% at 4 weeks, 75% at 8 weeks, 80% at 12 weeks.
**Task completion via new tool vs legacy method:** For specific workflows, track what percentage of work is being done via the new tool vs the old process. This is the truest adoption metric — are people actually doing the work the new way? Target: Varies by workflow, but for a new CLM you might target 50% of contracts managed through CLM at 4 weeks, 80% at 8 weeks, 95% at 12 weeks.
**User satisfaction:** Run a brief survey or use NPS (Net Promoter Score) to gauge satisfaction. “How likely are you to recommend this tool to a colleague?” Satisfaction is a leading indicator of sustained adoption. Dissatisfied users will find workarounds. Target: NPS > 30 by week 8.
**Time-to-competency:** How long does it take a new user to reach independent competency? This is tracked during the supported launch phase. If time-to-competency is very long, training or tool usability is the issue. Target: 80% of users at independent competency by end of week 4.
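These metrics are simple ratios, which makes them easy to compute from raw usage exports. The sketch below illustrates this in Python; the data shapes (user lists, per-task markers, 0–10 survey scores) are assumptions for illustration. NPS follows its standard definition: the percentage of promoters (scores 9–10) minus the percentage of detractors (scores 0–6).

```python
# Illustrative adoption-metric calculations. Data shapes are assumed:
# user lists from the tool's usage export, survey scores on a 0-10 scale.

def activation_rate(target_users, logged_in):
    """Share of target users who have logged in at least once."""
    return len(set(logged_in) & set(target_users)) / len(target_users)

def active_usage_rate(target_users, weekly_active):
    """Share of target users active in the measurement week."""
    return len(set(weekly_active) & set(target_users)) / len(target_users)

def new_tool_share(tasks):
    """Share of completed tasks done the new way; tasks is a list of
    'new' or 'legacy' markers, one per completed task."""
    return tasks.count("new") / len(tasks)

def nps(survey_scores):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in survey_scores if s >= 9)
    detractors = sum(1 for s in survey_scores if s <= 6)
    return round(100 * (promoters - detractors) / len(survey_scores))

team = ["ana", "ben", "cam", "dee", "eli"]
print(activation_rate(team, ["ana", "ben", "cam", "dee"]))  # 0.8
print(nps([10, 9, 8, 7, 3]))  # 20  (40% promoters - 20% detractors)
```

The distinction between the first two functions matters in practice: a high activation rate with a low active usage rate is the signature of people who tried the tool once and went back to the old way.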
### Using Metrics to Diagnose and Intervene
Track metrics weekly during the launch phase and monthly thereafter. When a metric falls below target:
- **Low activation (< 80% logged in by week 2):** People have not even tried. The barrier is likely awareness or motivation. Do a targeted re-communication emphasising personal benefit. Is there a manager not championing the change? Address it.
- **High activation but low active usage (60% activated, 20% actively using at week 4):** People tried it but did not persist. The barrier is likely knowledge, ability, or perceived value. The tool may be too complex, the training may have missed something, or the tool may be slower than the legacy process for their specific workflow. Investigate actively and intervene based on what you find.
- **Stalled progress (70% active usage at week 6, still 70% at week 10):** You have hit an adoption ceiling. The remaining 30% likely have a specific blocker — technical, workflow-specific, or motivational. You need to understand what it is and intervene on that specific blocker, not just push harder.
- **Low satisfaction (NPS < 20 at week 8):** The tool may have quality issues, may not support critical workflows, or may feel slower than the alternative. Do not push harder; diagnose the satisfaction problem and fix it.
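The diagnostic rules above can be expressed as a small triage function. This is an illustrative sketch, not a prescribed implementation: the thresholds mirror the targets stated in this section and should be adapted to your own rollout plan.

```python
# Hedged sketch of the weekly metric triage. Thresholds mirror this
# section's targets and are meant to be adapted, not taken as canonical.

def diagnose(week, activation, active_usage, nps_score=None, prev_active_usage=None):
    """Return (finding, likely_barrier) pairs for the current metrics."""
    findings = []
    # Target: 80% activated within 2 weeks of launch.
    if week >= 2 and activation < 0.80:
        findings.append(("low activation", "awareness or motivation"))
    # Tried it but did not persist: active usage well below activation.
    if week >= 4 and activation >= 0.60 and active_usage < 0.5 * activation:
        findings.append(("activated but not persisting",
                         "knowledge, ability, or perceived value"))
    # Adoption ceiling: active usage flat between measurements.
    if prev_active_usage is not None and week >= 10 and active_usage - prev_active_usage < 0.02:
        findings.append(("adoption ceiling",
                         "specific blocker for the remaining holdouts"))
    # Target: NPS above 20-30 by week 8.
    if nps_score is not None and week >= 8 and nps_score < 20:
        findings.append(("low satisfaction", "tool quality or workflow fit"))
    return findings

print(diagnose(week=4, activation=0.85, active_usage=0.20))
# [('activated but not persisting', 'knowledge, ability, or perceived value')]
```

A function like this does not replace the conversations the section recommends; it tells you which conversation to have first.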
## In the Trenches
**The Tool Nobody Used**
A regional Australian law firm invested £280,000 in a state-of-the-art document automation platform. The technology was excellent — it could produce first drafts of complex commercial agreements in under three minutes, with clause selection guided by an intuitive decision-tree interface. The vendor’s demo was impressive. The managing partner approved the spend enthusiastically.
Twelve months later, adoption was at 11%. The firm’s lawyers were still drafting documents from prior-matter precedents saved on their personal drives.
The post-mortem identified three failures, all on the people pillar. First, the lawyers had not been consulted during the selection process — the tool was perceived as something imposed by management rather than something designed for them. Second, training was a single two-hour session delivered on a Friday afternoon when half the team was in client meetings. Third, and most critically, no one had addressed WIIFM: the senior partners who controlled workflow allocation did not understand how the tool benefited them personally, so they never directed their teams to use it.
The firm hired a Legal Ops consultant who restarted the adoption effort from Phase 1. She conducted individual interviews with every partner to understand their concerns, then redesigned the training programme around role-specific use cases — showing each partner how the tool would reduce their administrative overhead and free their teams for higher-value work. She identified three “champion” lawyers who had experimented with the tool independently and were willing to demonstrate its benefits to peers. She embedded adoption targets into the firm’s quarterly review process.
Within six months, adoption reached 67%. Within twelve months, 84%. The tool’s ROI, which had been negative for the entire first year, turned positive in month 14. More importantly, the partners began actively directing work to the system, asking champions to help laggards adopt, and treating the tool as foundational to the practice.
The lesson was not about the tool. It was about starting with people, not technology.
## Checklist
- **Draft a WIIFM statement for your current or next initiative for each stakeholder group.** Write one paragraph for each group explaining the personal benefit they will experience. If you cannot articulate a compelling personal benefit, the initiative has an adoption problem that no amount of training will fix.
- **Identify three potential change champions in your team.** Look for respected peers who are early adopters and good teachers — not necessarily the most senior people, but the people whose judgment colleagues trust.
- **Assess your Shadow AI exposure.** Ask your IT security team for a list of AI-related domains accessed from the corporate network in the past 90 days. Do you have an AI Acceptable Use Policy? Is it current? Have you published approved alternatives?
- **Choose your change framework.** For your next initiative, decide whether ADKAR, Kotter, or PPT is the best diagnostic lens. Map your current initiative against it. Where are the gaps?
- **Set adoption targets with timelines for your current initiative.** Define what success looks like — activation rate, active usage rate, task completion rate, satisfaction. Schedule a 30-day review to assess progress and intervene if targets are falling short.
## Suggested Reading
- [Kotter’s 8-Step Change Model](https://www.kotterinc.com/8-steps-process-for-leading-change/)
- [Prosci ADKAR Model Overview](https://www.prosci.com/methodology/adkar)
- [Prosci - Change Management Research](https://www.prosci.com/resources)
- [McKinsey - The Influence Model for Change](https://www.mckinsey.com/capabilities/people-and-organizational-performance/our-insights/the-four-building-blocks--of-change)
- [Thomson Reuters - Future of Professionals](https://www.thomsonreuters.com/en/reports/future-of-professionals.html)