The Centre for Protecting Women Online is delighted to announce the launch of Round 2 of our Industry and Innovation Fellowships. This exciting opportunity is designed for professionals based outside academia who want to collaborate with researchers to tackle some of the most urgent challenges facing women and girls in digital spaces.
If you work in industry, the charity sector, policing, technology, design, policy, or frontline services, and are passionate about making online environments safer, we encourage you to apply and to share this opportunity within your organisation and networks.
What are the Industry and Innovation Fellowships?
Our Fellowships are designed to enable meaningful knowledge exchange between academia and practice. Fellows will spend dedicated time at the Centre working on a specific project, contributing expertise, and co-creating impactful outputs with CPWO researchers and our local, national, and international partners.
You can:
- Pitch your own project idea, or
- Join one of our existing projects
Fellows will also:
- Take part in the Centre’s research culture and public engagement activities
- Present their work at a dedicated Centre event
- Connect with CPWO’s extensive professional and academic networks
Who Can Apply?
We welcome applications from professionals at all career stages who meet the following criteria:
Applicants must:
- Be based in the UK
- Be affiliated with an eligible UK-registered organisation
- Provide written permission from their employer to undertake the fellowship
- Confirm salary costs for the period of the fellowship
Please note: self-employed applicants and applicants based outside the UK are not eligible.
Organisations must:
- Be registered in the UK
- If employing more than 50 staff and/or with a turnover above £2m, be prepared to offer the Fellow’s time as an in-kind contribution (unless a strong justification for funding is provided)
The Centre can cover secondment costs for smaller organisations (subject to capacity and funding), for up to 55 days (equivalent to 3 months full-time).
Fellowship Structure
- Offered on a secondment basis
- Can be full-time or part-time (minimum 1 day per week active engagement)
- Must be completed with agreed outputs delivered within 12 months
- Flexible start dates (from 1 October 2026), subject to agreement
We are also open to discussing alternative models—please get in touch ahead of the deadline to explore options.
Project Areas You Can Join
Round 2 Fellows can contribute to a wide range of cutting-edge projects, including:
This project will explore the next generation of online and technology-facilitated harms affecting women and girls, from deepfakes and synthetic media to immersive technologies and AI-enabled coercion. The Fellow will lead a horizon-scanning exercise, analysing global trends, risk typologies, and implications for policing readiness.
This project will examine how policing can ethically collaborate with online platforms and industry partners to share intelligence and identify risk-related patterns of offending across digital ecosystems (e.g. dating, gaming, and social platforms). The Fellow will help design a governance model, stakeholder map, and practical framework for responsible data-sharing to enhance public protection.
This project investigates the psychological impact of investigating online sexual and gender-based harms, focusing on resilience, supervision, and wellbeing among digital investigators. The Fellow will identify best practice models and produce recommendations for force-level implementation.
This project explores the convergence between sexualised online offending and ideologically motivated harassment or extortion (e.g. “Com” networks). The Fellow will analyse hybrid offender behaviours, develop typologies, and identify investigative and prevention opportunities for policing.
This project will explore how technology-facilitated harms against women and girls arise, and how they can be mitigated, at different stages of the software development lifecycle. The Fellow will lead a reflexive activity drawing on their own organisational experience, analyse industry codes of conduct, policies, and guidance documents, and conclude with implications for future technology development and governance.
This project will explore how technology companies currently comply with the UK legal requirements set by the Online Safety Act 2023. The Fellow will lead an empirical study collecting primary and/or secondary data on ‘safety by design’ practices in the technology industry, with a particular focus on best practice and on the challenges stakeholders face in implementing the UK legal requirements.
During the synthesis phase of the design process, designers take different perspectives (including their own, the users’, and the businesses’) to make sense of data and problems and to arrive at design decisions and artifacts. This project will explore the impact of gender biases and stereotypes during sense-making activities within the communities of design research and design practice.
This project would explore how markers of difference, including but extending beyond gender (e.g. race, ethnicity, disability, sexuality), intersect to shape both experiences of online violence and interventions against it in a specific context (e.g. dating apps, online companions). The specific research question would be developed in collaboration with the Fellow.
This project would aim to map the existing support and resources available to (local?) schools for addressing online sexual violence among students. Once the mapping is complete, existing provision could be audited to identify strengths, limitations, specificities, and inconsistencies.
This project will explore how existing Responsible AI frameworks are implemented in practice, and where they fall short in preventing online harms against women and girls. The Fellow will undertake an extensive, systematic landscape review of current Responsible AI principles and standards and examine how these translate into real-world practices. The project will identify gaps both in the frameworks themselves and in how they are operationalised, particularly where shortcomings contribute to the creation, amplification, or neglect of harms affecting women and minorities online. The outcome will provide evidence-based insight into where Responsible AI practice succeeds, where it fails, and what changes are needed to ensure technology development better protects women and vulnerable groups from online abuse and violence.
This project will investigate the radicalisation pathways that lead people, particularly men, into the manosphere. Using social media data, we will examine the micro (individual), meso (social), and macro (global) factors that draw men into these communities. The study will analyse the content they post, the external sources they link to, and the networks they form within and across online communities to understand influence and dominant narratives. Particular attention will be paid to the mental health issues that emerge and to how these shape men’s perceptions, beliefs, and self-conceptualisations. The overall aim is to better understand why men are drawn to these spaces and to use this knowledge to inform appropriate strategies for mental health support, education, and initiatives that strengthen social connection among men, thereby preventing and mitigating engagement with these communities and the adoption of misogynistic narratives. This is a cross-disciplinary project that combines human behaviour research and responsible AI approaches for large-scale data analysis. Therefore, we seek expertise in textual and social data analysis (including programming skills) and/or in psychology and social science focused on men, radicalisation, and online harm.
This project will examine how people are coming to rely on AI companions and the implications of those interactions for real face-to-face interactions and relationships. The Fellow will work closely with charity organisations and frontline services to understand whether, and in what circumstances, people turn to AI chatbots for companionship, the motivations behind this use, and the nature of the questions and conversations taking place. Insights from this qualitative work will then inform a second phase of the project assessing the potential impact of these virtual interactions on real relationships. The project aims to generate evidence on both risks and potential benefits, contributing to guidance and design recommendations that ensure AI companion tools are safe.
This project will explore semi-automatic methods to populate and extend an existing ontology of online gender-based harms. The Fellow will investigate approaches for extracting relevant data from online news, platforms, and public conversations, combining these with a human-in-the-loop process to ensure accuracy and contextual understanding. Working alongside domain experts, the Fellow will test and refine techniques to identify and classify concepts, instances, and relationships relating to online harms and gendered violence, and integrate them into the ontology. The aim of the project is to develop novel, responsible methodologies that support scalable ontology building while retaining critical human oversight, ultimately strengthening the representation and understanding of online gender-based harms.
You can also propose your own project idea aligned with the Centre’s mission.
What’s In It For You?
As a Fellow, you will:
- Develop new skills and research experience
- Influence policy, practice and technology design
- Build strong cross-sector networks
- Co-create outputs such as reports, toolkits, blogs, podcasts, or policy briefings
- Contribute directly to protecting women and girls online
Key Dates
- Application deadline: 16 March 2026 (12 noon)
- Decisions communicated: Mid-April 2026
- Start date: From 1 October 2026 (flexible)
How to Apply
Applicants should complete the application form and submit it, along with a supporting letter from their employer, to: protecting-women-online@open.ac.uk
Your application will include:
- Your background, skills and experience
- A description of your project idea or contribution to an existing project
- Planned activities and outputs
- Your preferred start date and time allocation
- Employer confirmation of eligibility, time commitment, and funding or in-kind support
Get in Touch
We strongly encourage potential applicants to contact us ahead of the deadline to discuss project ideas, eligibility, funding models, or partnership options.