How we measure our own delivery, what we do when things fall short, and how engagements improve over time. This page exists because we think regulated service providers should be held to the same standards they deliver to others.
We operate an integrated management system and apply it to ourselves — not just to our clients. The standards we audit others against, we are audited against. The documentation disciplines we require of manufacturers, we apply to our own processes. The corrective action cycles we deliver as a service, we run internally when our own performance falls short.
This is not a credential statement. It is a description of how the organisation actually works — and why clients in ongoing engagements report fewer surprises, more predictable timelines, and improving outcomes over successive cycles.
Performance indicators are drawn from direct cross-industry experience in engineering, healthcare, energy, laboratory, and regulated service environments. We measure what has operational consequence — not what produces a favourable dashboard.
The proportion of regulatory submissions, registrations, and documentation outputs accepted at first review — without a request for additional information, correction, or resubmission. Tracked per engagement and per service type.
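As an illustrative sketch only (not a description of our actual tooling), a first-time acceptance rate of this kind reduces to a simple proportion tracked per service type; the field names below are hypothetical:

```python
from collections import defaultdict

def first_time_acceptance_rate(submissions):
    """Proportion of outputs accepted at first review, grouped by service type.

    Each submission is a dict with hypothetical fields:
    'service_type' and 'accepted_first_review' (True/False).
    """
    totals = defaultdict(lambda: [0, 0])  # service_type -> [accepted, total]
    for s in submissions:
        totals[s["service_type"]][1] += 1
        if s["accepted_first_review"]:
            totals[s["service_type"]][0] += 1
    return {k: accepted / total for k, (accepted, total) in totals.items()}

subs = [
    {"service_type": "MHRA registration", "accepted_first_review": True},
    {"service_type": "MHRA registration", "accepted_first_review": True},
    {"service_type": "EUDAMED submission", "accepted_first_review": False},
    {"service_type": "EUDAMED submission", "accepted_first_review": True},
]
rates = first_time_acceptance_rate(subs)
```

The same grouping applied per engagement rather than per service type gives the engagement-level view.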
Milestone delivery against the schedule agreed at the start of each engagement. Where delays arise — from any source — they are flagged proactively before the deadline, not discovered at it.
The proportion of client documentation assessed as audit-ready at initial review, versus requiring rework before it meets the standard. Rework rate is tracked as a quality indicator, not absorbed silently into the engagement.
Proportion of MHRA registrations, EUDAMED submissions, and competent authority interactions that achieve the required outcome without regulatory objection, query, or request for further evidence.
Internal non-conformances raised, classified by severity, assigned, and closed within defined timescales. Repeat findings tracked separately — a repeat finding is treated as a failure of the corrective action, not just the original issue.
Structured feedback collected at the close of each project engagement and at defined review points within ongoing retainer arrangements. Feedback is documented, not informally noted.
Internal audit is conducted against ISO 9001, ISO 27001, and the applicable sector-specific standards relevant to active service delivery. Audit scope covers all major process areas: regulatory delivery, UKRP and EC-REP representation, EUDAMED operations, document control, information security, and continual improvement.
Audit cycles are risk-based. Higher-risk or higher-frequency delivery areas — MHRA registration, EUDAMED submissions, vigilance reporting — are audited more frequently than lower-risk administrative processes.
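A risk-based cycle can be expressed as an interval table per process area. The areas and intervals below are examples under that assumption, not the actual audit schedule:

```python
from datetime import date, timedelta

# Hypothetical audit intervals in months -- shorter for higher-risk,
# higher-frequency delivery areas, longer for administrative processes
AUDIT_INTERVAL_MONTHS = {
    "MHRA registration": 6,
    "EUDAMED submissions": 6,
    "vigilance reporting": 6,
    "document control": 12,
    "administrative processes": 24,
}

def next_audit_due(area: str, last_audited: date) -> date:
    """Approximate next due date, treating a month as 30 days."""
    return last_audited + timedelta(days=30 * AUDIT_INTERVAL_MONTHS[area])
```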
Internal audit findings are classified, assigned, and tracked to closure with the same rigour applied to findings raised by external bodies. An internal non-conformance is not treated as less significant because it was self-identified.
One principle governs internal audit: if we would not accept a finding being left open or unresolved by a client, we do not accept it in our own system.
Management review is conducted at defined intervals and produces documented decisions — not discussion notes. Every output has an owner and a timescale.
Results against the indicators described in Section 2. Trends over time, not just point-in-time snapshots.
Open findings, closure rates, repeat findings, and any patterns emerging across audit cycles.
Aggregated across engagements. Themes, not just scores.
Updates to UK MDR, MHRA guidance, EU MDR/IVDR implementing acts, EUDAMED requirements, and ESOS/energy regulation. These are reviewed before they create compliance gaps — not after.
Whether the organisation has the capacity and technical knowledge to deliver current and anticipated workload to the required standard.
Current risk register reviewed. New risks or opportunities identified since the previous review assessed and recorded.
Improvement is systematic, not reactive. Patterns observed across client engagements — recurring documentation gaps, common EUDAMED submission errors, frequently misunderstood MHRA requirements — are fed back into delivery methods, client communication templates, and internal process controls.
The objective is that each successive engagement cycle produces fewer findings, shorter corrective action timescales, and more predictable outcomes. This is not reset at the end of each engagement. Knowledge from one client’s regulatory environment informs how we approach the next similar situation.
Root cause identified — not just the symptom addressed. Action taken, effectiveness verified, closed with documented evidence. A corrective action is not marked closed until it has demonstrably prevented recurrence.
Identified proactively, not just in response to failures. Sources include audit results, performance data, client feedback, regulatory updates, and lessons from adjacent sectors. No minimum severity threshold for logging an improvement.
Lessons from each engagement are captured in controlled internal records. Operational knowledge is not held only by individuals — it is documented so that it persists across personnel changes and long gaps between similar engagements.
As UK and EU regulatory frameworks evolve — MDR/IVDR implementing acts, MHRA post-Brexit guidance updates, EUDAMED module changes — delivery methods and documentation are updated before the changes create compliance exposure for clients. Not after.
Engagements involving regulatory representation, EUDAMED management, and technical documentation review require handling sensitive commercial and regulatory data. This is treated as a defined obligation, not an afterthought.
Information security management is aligned to ISO 27001:2022. Controls are applied to how client data is stored, accessed, transmitted, and retained.
Client records are accessible only to personnel working on that engagement. No shared inboxes, generic logins, or uncontrolled document stores.
Where engagements require transfer of technical files, regulatory submissions, or quality records, a controlled exchange route is confirmed before commencement. Files are not sent via unencrypted email as a default.
Client data is retained for the period required by the applicable regulatory framework and deleted thereafter. Retention periods are defined per data category, not held indefinitely by default.
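As a hedged illustration of a per-category schedule, retention might be encoded as follows. The categories and periods here are placeholders only; actual periods come from the applicable regulatory framework, not from this sketch:

```python
from datetime import date

# Example retention periods in years per data category (illustrative only)
RETENTION_YEARS = {
    "technical_documentation": 10,
    "correspondence": 5,
    "general_project_records": 3,
}

def deletion_due(category: str, engagement_end: date) -> date:
    """Date after which records in this category fall due for deletion."""
    years = RETENTION_YEARS[category]
    try:
        return engagement_end.replace(year=engagement_end.year + years)
    except ValueError:  # engagement ended on 29 Feb, target year not a leap year
        return engagement_end.replace(year=engagement_end.year + years, day=28)
```

Defining the schedule as data rather than ad-hoc decisions is what makes "not held indefinitely by default" auditable.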
UK GDPR and EU GDPR obligations are embedded in how we operate. Data is not used beyond the agreed scope of the engagement. The full privacy policy is available at cyberpan.com/privacy-policy.
Regulatory problems identified in documentation review or pre-submission checks cost significantly less to fix than those discovered at MHRA query or competent authority objection stage. Our monitoring is designed to catch them early.
Clients supported through MHRA inspection, Notified Body audit, or internal management review cycles consistently report fewer findings. Documentation is structured to withstand scrutiny from the first draft, not prepared retrospectively.
Milestone adherence is tracked. When a delay is anticipated — from any source, including third parties — it is communicated in advance. Deadlines are not missed silently.
Manufacturers in ongoing UKRP or EC-REP retainer arrangements see improving outcomes over successive annual cycles. Each renewal incorporates lessons from the previous year. The service does not reset.
All outputs — registration records, submission logs, correspondence, mandate documentation — are structured to support your own regulatory inspections, internal audits, and management reviews. You are not dependent on us to reconstruct the record.
Technical files, device records, and regulatory submissions contain commercially sensitive and sometimes clinically significant information. It is handled under defined controls — not passed around in unencrypted emails or stored in personal cloud drives.