1. Introduction
These Terms of Service ("Terms") govern the use of the Poyntr Institution Module ("Platform", "Service"), operated by Poyntr Ltd ("Poyntr", "we", "us", "our"), a company registered in England and Wales. The Institution Module is designed specifically for use by educational institutions including schools, colleges, multi-academy trusts, and local authority settings.
Because this Service is used by young people aged 13-17, these Terms have been drafted in compliance with the UK Age Appropriate Design Code (also known as the Children's Code or AADC), the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018, and the statutory guidance Keeping Children Safe in Education (KCSIE).
By accessing or using the Platform, you agree to be bound by these Terms. If you are a student under 16, your parent or guardian must also consent to these Terms on your behalf before you can use the Service. If you do not agree to these Terms, you must not use the Platform.
We may update these Terms from time to time. If we make material changes, we will notify your institution and, where appropriate, students and parents directly. Your continued use of the Platform after such changes constitutes acceptance of the updated Terms.
2. Description of the Service
Poyntr Institution Module is an AI wellbeing companion designed for young people aged 13-17 within subscribing educational institutions. The Platform includes:
- Text-based AI companion conversations tailored for young people, using age-appropriate language and safeguarding-aware responses
- Wellbeing signal detection across 23 youth-specific detectors covering areas such as emotional distress, bullying indicators, self-harm risk, exam anxiety, family difficulties, and peer relationship concerns
- Safeguarding visibility - designated safeguarding leads and authorised pastoral staff can view conversation content to fulfil their statutory safeguarding duties
- Encrypted memory that allows the companion to remember context across conversations, providing continuity of support
- Institution-level analytics showing aggregated wellbeing trends (never individual student content)
Poyntr is NOT a substitute for professional therapy, counselling, mental health treatment, or medical advice. The Platform provides AI-powered wellbeing support only. It does not replace the role of trained counsellors, therapists, or safeguarding professionals. If a student is experiencing a mental health crisis, the institution should follow its own safeguarding and crisis response procedures, and emergency services should be contacted immediately where there is an immediate risk to life.
3. Eligibility
3.1 Age Requirements
The Platform is available to students aged 13-17 who are enrolled at a subscribing educational institution. Children under 13 are not permitted to use the Platform under any circumstances.
3.2 Parental Consent for Students Aged 13-15
Students under 16 require verified parental or guardian consent before they can access the Platform, in accordance with the UK GDPR and the Children's Code. The institution is responsible for collecting and recording this consent through the mechanisms we provide.
3.3 Self-Consent for Students Aged 16-17
Students aged 16-17 may consent to the use of the Platform themselves, in line with the UK GDPR age of consent for data processing. Institutions may still choose to inform parents as a matter of good practice, but parental consent is not legally required.
3.4 Staff Accounts
Staff members with safeguarding responsibilities (Designated Safeguarding Leads, deputy DSLs, pastoral staff, and school counsellors) may be granted access to the Platform in a supervisory capacity. Staff do not use the companion for their own wellbeing support.
4. Parental and Guardian Consent
4.1 How Consent Works
For students under 16, the institution will distribute consent forms (digital or paper-based) to parents or guardians before the student is granted access. Consent covers the processing of the student's personal data, the use of AI wellbeing conversations, and the safeguarding visibility arrangements described in Section 5.
4.2 Withdrawal of Consent
Parents or guardians may withdraw consent at any time by contacting the institution or by emailing [email protected]. Withdrawal of consent does not affect the lawfulness of processing carried out before withdrawal.
4.3 Effect of Withdrawal
Upon withdrawal of consent, the student's access to the Platform will be deactivated within 48 hours. Existing conversation data will be retained for the minimum period required by the institution's safeguarding record-keeping obligations (as required by KCSIE), after which it will be securely deleted. Parents may request earlier deletion of non-safeguarding data.
5. Safeguarding and Content Visibility
Unlike adult coaching platforms, conversations on the Institution Module are NOT fully private. This is a deliberate safeguarding design. Students are clearly informed of this before and during use of the Platform.
5.1 Who Can See What
- Designated Safeguarding Lead (DSL) and Deputy DSLs: Full access to all student conversations, detection alerts, and safeguarding flags within their institution. This is necessary to fulfil their statutory safeguarding duties under KCSIE.
- Pastoral Staff: Scoped access to conversations of students within their assigned pastoral groups (e.g. year group, house, tutor group). They can see conversations and alerts only for students they are responsible for.
- School Counsellors: Access scoped to students who have been specifically referred to them through the Platform. They can see only those students' conversations.
- Teaching Staff: No access to student conversations. Teachers cannot see any companion content.
5.2 Audit Logging
Every instance of staff accessing a student's conversation is recorded in an immutable audit log. The log captures who accessed the data, when, and for what stated purpose. Audit logs are available to the DSL and to Poyntr for compliance review.
5.3 Student Awareness
Students are informed, both during onboarding and within the companion interface, that their conversations may be viewed by designated safeguarding staff. The companion will remind students of this at appropriate points. This transparency is required by the Children's Code.
6. Youth Detectors and Alerting
6.1 Detection System
The Platform analyses conversations using 23 youth-specific detectors designed to identify wellbeing concerns, safeguarding risks, and patterns of distress. These detectors cover areas including but not limited to: self-harm indicators, suicidal ideation, abuse disclosure, bullying, eating disorder signals, substance misuse, online exploitation, radicalisation indicators, family breakdown, exam and academic pressure, grief, loneliness, and anxiety.
6.2 Alert Severity Levels
Detections are categorised by severity:
- CRITICAL: Immediate alert to the DSL and deputy DSLs. Covers disclosures of abuse, self-harm, suicidal ideation, and immediate safety risks. These alerts are delivered in real time and cannot be suppressed or delayed.
- HIGH: Alert to the DSL within the same working day. Covers serious concerns that require prompt attention but are not immediately life-threatening.
- MEDIUM: Flagged for review in the DSL's regular safeguarding dashboard. Covers emerging patterns and moderate concerns.
- LOW: Logged for trend analysis. Contributes to wellbeing pattern detection over time.
6.3 Escalation Ladder
If a CRITICAL alert is not acknowledged by the primary DSL within 15 minutes, it automatically escalates to deputy DSLs. If still unacknowledged after 30 minutes, it escalates to the institution's headteacher or principal contact. The escalation ladder is configured during institution onboarding.
6.4 Pattern Detection
Beyond individual alerts, the system monitors patterns across multiple conversations. For example, repeated low-level anxiety signals over several weeks may be escalated to MEDIUM or HIGH if the pattern indicates deteriorating wellbeing, even if no single conversation triggered a higher alert.
6.5 Transparency with Students
The AI companion does NOT inform students when a safeguarding alert has been generated. This is a deliberate design decision to prevent students from self-censoring disclosures and to protect the integrity of the safeguarding process. Students are informed in general terms (during onboarding and in these Terms) that concerning content may be shared with safeguarding staff, but they are not notified of specific alerts.
7. What the AI Will Not Do
The AI companion is designed with strict boundaries appropriate for young people. It will not:
- Provide medical, psychiatric, or clinical advice of any kind, including advice about medication, diagnoses, or treatment plans
- Promise confidentiality - students are always informed that conversations may be reviewed by safeguarding staff
- Set goals, commitments, or action plans with students - this is reserved for qualified professionals working with the student directly
- Generate study resources, essays, homework help, or academic content - the companion is for wellbeing support only
- Engage in role-play, fictional scenarios, or adopt alternative personas - the companion always identifies itself as an AI wellbeing companion
- Make judgments about a student's family, carers, teachers, or institution
- Encourage a student to take specific actions regarding relationships, conflicts, or grievances
8. Data Processing
Your use of the Platform is also governed by our Institution Privacy Policy, which describes in detail how we collect, use, store, and protect personal data. Key principles include:
- We process data in compliance with the UK GDPR, the Data Protection Act 2018, the Children's Code (AADC), and KCSIE
- All conversations are encrypted at rest using enterprise-grade encryption with per-student key derivation
- Data is processed and stored in UK or EEA data centres - no data is transferred outside the UK/EEA without appropriate safeguards
- The default data retention period is 3 years from the student's last interaction, aligned with standard safeguarding record-keeping practices - institutions may configure shorter retention
- Students and parents have the right to request data export or deletion, subject to safeguarding record-keeping obligations
9. Age Transition
9.1 Turning 18
When a student turns 18 while using the Platform, they are no longer considered a child under the Children's Code. The following transition process applies:
- The student will receive a notification 30 days before their 18th birthday informing them of the upcoming changes
- On turning 18, the student may choose to transition to an adult Poyntr account (if available) or to close their account
- Institution safeguarding visibility over the student's conversations ends on their 18th birthday - from that date, no staff member can access new conversations
- Historical conversations from before the student turned 18 remain subject to the institution's safeguarding record-keeping obligations
9.2 Leaving the Institution
If a student leaves the institution before turning 18, their account will be deactivated within 30 days of the institution confirming their departure. Data retention follows the same rules described in Section 8 and the Privacy Policy.
10. Institution Responsibilities
The subscribing institution agrees to the following responsibilities:
- Designate at least one Designated Safeguarding Lead (DSL) and at least one deputy DSL who will be responsible for reviewing safeguarding alerts generated by the Platform
- Ensure all staff with access to the Platform hold current, valid DBS checks at the appropriate level (enhanced with barred list check for those with access to student conversations)
- Maintain an up-to-date safeguarding policy that includes the use of the Platform and make this policy available to parents, students, and staff
- Respond to CRITICAL safeguarding alerts in a timely manner, following the institution's own safeguarding procedures and referral pathways
- Inform students clearly and in age-appropriate language that conversations on the Platform may be reviewed by safeguarding staff, both during onboarding and on an ongoing basis
- Collect and manage parental consent for students under 16 using the mechanisms provided by Poyntr
- Notify Poyntr promptly when students leave the institution, when staff roles change, or when consent is withdrawn
- Not use the Platform as a replacement for qualified counselling, therapeutic support, or professional safeguarding assessment
11. Acceptable Use - Students
Students using the Platform agree to:
- Use the companion for genuine wellbeing conversations only
- Not share their account credentials with other students or anyone else
- Not use the companion to generate academic work, essays, or homework
- Not attempt to trick, manipulate, or deliberately confuse the AI companion
- Not use the companion to make threats, bully others, or create harmful content
- Not attempt to access other students' conversations or accounts
- Understand that conversations are not fully private and may be reviewed by safeguarding staff
Misuse of the Platform may result in the student's access being suspended or removed by the institution. The institution is responsible for managing student behaviour on the Platform in line with their existing behaviour policies.
12. Acceptable Use - Staff
Staff members with access to the Platform agree to:
- Access student conversations only for legitimate safeguarding purposes, never out of curiosity or for any purpose unrelated to student welfare
- Understand that all access to student conversations is audit-logged and may be reviewed
- Not share student conversation content with unauthorised individuals, including other staff who do not have Platform access
- Follow the institution's data protection and safeguarding policies at all times when using the Platform
- Report any suspected misuse of the Platform (by students or other staff) to the DSL and to Poyntr
Misuse of staff access to the Platform is a serious matter. It may constitute a breach of data protection law and the institution's safeguarding policy, and may result in disciplinary action, removal of Platform access, and where appropriate, referral to the ICO or other authorities.
13. Data Rights
13.1 Student Rights
Students have all the rights provided under the UK GDPR, exercised in a manner consistent with the Children's Code principle of best interests. These include:
- The right to access their personal data and conversation history
- The right to rectification of inaccurate data
- The right to erasure ("right to be forgotten"), subject to safeguarding record-keeping obligations
- The right to restrict processing
- The right to data portability
- The right to object to processing
13.2 Exercising Rights
Students aged 16 and over may exercise these rights themselves. For students under 16, these rights are exercised by a parent or guardian on the student's behalf. However, we will take into account the student's own views and level of understanding in line with the Children's Code.
13.3 Requests
Data rights requests can be made by contacting the institution's data protection officer, or by emailing [email protected]. We will respond to all requests within one calendar month.
14. Intellectual Property
14.1 Platform Ownership
The Platform, including its software, algorithms, detection models, user interface, documentation, and all associated intellectual property, is owned by Poyntr Ltd. Nothing in these Terms grants you any rights to our intellectual property except the limited right to use the Platform as described herein.
14.2 Student Content
Students retain ownership of the content they submit to the Platform (such as conversation messages). By using the Platform, students (and their parents or guardians where applicable) grant us a limited licence to process this content solely for the purpose of providing the wellbeing companion service, including AI analysis, memory storage, safeguarding detection, and response generation. We do not use student content to train our AI models.
15. Limitation of Liability
To the maximum extent permitted by applicable law:
- The Platform is provided "as is" and "as available" without warranties of any kind, whether express or implied
- We do not warrant that AI-generated responses will be accurate, complete, or suitable for any particular purpose
- We shall not be liable for any indirect, incidental, special, consequential, or punitive damages arising from use of the Platform
- Our total liability for any claims arising from these Terms shall not exceed the amount paid by the institution for the Platform in the twelve months preceding the claim
Poyntr does not accept liability for safeguarding decisions made or not made by institution staff in response to information provided by the Platform. The Platform is a tool that supports safeguarding by detecting and flagging concerns. The responsibility for acting on those concerns, making referrals, and following safeguarding procedures rests with the institution and its designated safeguarding personnel.
Nothing in these Terms excludes or limits liability for death or personal injury caused by negligence, fraud, or any other liability that cannot be excluded or limited by applicable law.
16. Termination
16.1 By the Student
A student may stop using the Platform at any time by simply not logging in. Students (or their parents/guardians) may also request formal account deactivation through the institution or by contacting [email protected].
16.2 By a Parent or Guardian
A parent or guardian of a student under 16 may withdraw consent at any time, which will result in the student's account being deactivated as described in Section 4.
16.3 By the Institution
The institution may deactivate any student's account at any time, for example if the student leaves the institution, if the student breaches acceptable use rules, or if the institution terminates its subscription to the Platform.
16.4 By Us
We may suspend or terminate access if these Terms are breached, if the institution's subscription expires, or if we reasonably believe continued use poses a risk. We will provide notice where practicable.
16.5 Effect of Termination
Upon termination, the student's right to access the Platform ceases. Data retention after termination follows the rules set out in Section 8 and the Privacy Policy. The institution's safeguarding record-keeping obligations continue to apply regardless of account status.
17. Governing Law
These Terms are governed by and construed in accordance with the laws of England and Wales. Any disputes arising from these Terms or use of the Platform shall be subject to the exclusive jurisdiction of the courts of England and Wales.
18. Contact Information
If you have questions about these Terms, please contact us:
- General legal enquiries: [email protected]
- Safeguarding concerns: [email protected]
- Data protection: [email protected]
- Company: Poyntr Ltd
- Website: poyntr.ai