Prototypes

Participants from the policy design workshops produced 11 prototypes that can help create safer online experiences for women and tackle OGBV.

Fictional platforms were created for the participants to build their solutions for. These apps gave participants the opportunity to think about and design solutions for what is necessary rather than just what is currently possible.

The prototypes built during the workshops were based on a set of personas of highly visible women online.

Find out more about the personas →

Calm the Crowd
Viral Notification
Com Mod
Image Shield
Reporting 2.0
Report Hub
Reporteroo
One Click
GateWay
iMatter
Wellbeing Check-Up
A screenshot of the mycro social media platform app, with a pop-up directed to the user stating “We are seeing increased activity on your account. Do you want to respond?” with the option to select yes or no.
A second screenshot featuring the Calm the Crowd interface’s settings menu, allowing the user to set options for “automatic trigger, choose from accounts and keyword filter”.
A third screenshot featuring the Calm the Crowd interface’s settings menu asking the user “Who do you want to hear from?”. The first option “Accounts that are likely inauthentic” is expanded to reveal options for “block, mute, restrict and stop all”. Further options for “newly created accounts” and “accounts flagged as problematic” are listed below.

Calm the Crowd

Calm the Crowd offers users more granular control settings by prompting them to check their settings when a spike in abuse is detected. They can then set granular controls, such as blocking, muting or restricting accounts the platform flags as likely to be inauthentic. It would also allow users to create their own keyword filters for the replies and comments they see, and to choose the types of accounts they want to hide, see or limit replies from.
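As an illustration only, the filtering behaviour described above could be sketched as follows. All names, fields and defaults here are invented for the sketch; the prototype itself does not specify an implementation.

```python
# Hypothetical sketch of Calm the Crowd's granular reply filtering.
# Account flags and settings fields are assumptions, not the prototype's spec.
from dataclasses import dataclass, field

@dataclass
class Account:
    handle: str
    likely_inauthentic: bool = False
    newly_created: bool = False

@dataclass
class Settings:
    hide_inauthentic: bool = True      # accounts flagged as likely inauthentic
    hide_new_accounts: bool = False    # newly created accounts
    keyword_filter: tuple = field(default_factory=tuple)

def visible(reply_text: str, author: Account, s: Settings) -> bool:
    """Return True if a reply should be shown under the user's settings."""
    if s.hide_inauthentic and author.likely_inauthentic:
        return False
    if s.hide_new_accounts and author.newly_created:
        return False
    if any(kw.lower() in reply_text.lower() for kw in s.keyword_filter):
        return False
    return True
```

The key design point is that each rule is a user-set toggle, so the person under attack decides what disappears from view rather than the platform deciding for them.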

This prototype was designed for the persona Yvonne. View this persona ➝

A second screenshot of an app with a pop-up from Viral Notification reading “Your video is getting a lot of views. Viral Mode might be able to help.”
A third screenshot featuring the Viral Mode interface, with options for the user to enable “cooling off period, allow comments and allow downloads”.

Viral Notification

Viral Notification provides greater user controls on a video-sharing site. It was designed in response to a scenario where someone's content had unexpectedly gone viral: users can initiate Viral Mode, in which they can turn off commenting or downloads, via a clearly visible button.
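A minimal sketch of the spike detection that might trigger the Viral Mode prompt. The thresholds, window and function name are all assumptions for illustration; the prototype only specifies that the notification appears when content is getting a lot of views.

```python
# Illustrative viral-spike check: compare recent hourly views
# against the earlier baseline (all thresholds are invented).
def is_viral_spike(hourly_views, window=3, ratio=10.0, floor=1000):
    """Flag when recent views jump well above the prior baseline."""
    if len(hourly_views) <= window:
        return False
    baseline = sum(hourly_views[:-window]) / len(hourly_views[:-window])
    recent = sum(hourly_views[-window:]) / window
    return recent >= floor and recent >= ratio * max(baseline, 1)
```

When the check fires, the platform would show the “Viral Mode might be able to help” prompt rather than silently changing any settings.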

This prototype was designed for the persona Paula. View this persona ➝

A screenshot of the Com Mod interface presenting the user with two options: “I will moderate” or “I will assign community mode”.
A second screenshot of the Com Mod interface presenting the user with options to grant other users permission to moderate their account, including whether they “can upload, can delete or can restrict”.

Com Mod

Com Mod allows users the option to delegate reviewing and reporting abuse to trusted communities/contacts at a more granular level (i.e. per post, for a specific amount of time). This solution builds on the idea of shared responsibility and reducing burden on the user who is under attack or receiving abuse.

This prototype was designed for the persona Amy. View this persona ➝

A screenshot of an app with the Image Shield pop-up, which reads “Safety Check-in: our systems detected a video you may be in. Would you like to review it?” The user then has the option to “review video, ask a friend to review or dismiss”.
A second screenshot of an app with the Image Shield pop-up, which reads “Review by category”. The user then has the option to select “escalated review, report for review, archive, get additional support or dismiss”.

Image Shield

Image Shield gives users more control over their content and images. A notification flags when the system recognises the user in a video posted by an external account; they can then review the video or dismiss the notification. Users also have the option to delegate reviewing and reporting any abuse to trusted communities or contacts, and to collect and archive flagged content with a date stamp, platform, name, and flag filter.

This prototype was designed for the persona Mouna. View this persona ➝

A screenshot of the mycro social media platform app, with a pop-up directed to the user stating “We are seeing increased activity on your account. Do you want to respond?” with the option to select yes or no.

Reporting 2.0

Reporting 2.0 offers an improved reporting flow that allows users to easily access information and effectively communicate the full scope of the abuse they are experiencing. It provides easy access to key terminology and policies: for example, a hover button over the categories of abuse gives a short explanation of the relevant policies or community guidelines, so the user can ensure they are reporting abuse according to the company’s community guidelines. It also allows users to add contextual information to a report, including geographical, cultural and linguistic nuance, and to file a report in the original language of the abuse.

This prototype was designed for the persona Karishma. View this persona ➝

A screenshot of an app home screen, with a menu featuring the option to go to Report Hub.
A second screenshot featuring the Report Hub interface, which shows that the user has 14 open cases. They have the option to view all their open cases and draft new cases.
A third screenshot featuring the Report Hub interface, which shows that the user made a report at 18:28 today. The report is currently under review, and the interface provides options for further help and resources, including enforcement resources and viewing related policies.

Report Hub

Report Hub provides a reporting dashboard that allows users to track the status of all their reports using key milestones on a timeline, for example ‘report made’, ‘report under review’, ‘review complete’ and ‘decision appealed’. Timestamps help the user understand the timeline of the process, and the feature is accessible from the homepage at all times. Users also have the option to save draft reports, add further evidence, or even hand over a report to a trusted contact if they are feeling overwhelmed.
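A sketch of how the per-report timeline could be represented. The milestone names come from the description above; the class and its methods are illustrative assumptions.

```python
# Illustrative data structure for Report Hub's milestone timeline.
from datetime import datetime

MILESTONES = ("report made", "report under review",
              "review complete", "decision appealed")

class Report:
    def __init__(self):
        self.timeline = []  # ordered list of (milestone, timestamp)

    def record(self, milestone: str):
        """Append a timestamped milestone as the report progresses."""
        if milestone not in MILESTONES:
            raise ValueError(f"unknown milestone: {milestone}")
        self.timeline.append((milestone, datetime.now()))

    def status(self) -> str:
        """Current status shown on the dashboard: the latest milestone."""
        return self.timeline[-1][0] if self.timeline else "draft"
```

Keeping the timeline as an append-only list of timestamped milestones is what makes the dashboard view (and the ‘report made at 18:28’ detail in the screenshots) straightforward to render.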

This prototype was designed for the persona Paula. View this persona ➝

A screenshot of Reporteroo featuring two user input questions. The first states “I feel this is a violation of hate speech because...” with the option for the user to select their response, followed by a free text field. The second input asks “Are you reporting in the language of the abuse?”, with the user having the option to select “yes”, “no, do not translate” or “no, please translate”.

Reporteroo

Reporteroo is a reporting dashboard that provides transparency and accountability around the reporting process during and post-reporting. It provides specific prompts for users based on the category of abuse, so they can provide the context and information needed for the platform to respond more effectively. It also provides the option for users to flag if they are reporting in the same language as the abuse, and if not, to specify which language they are translating to and from. It includes a toggle option that allows users to choose whether they want the content of the reports to be visible or not.

This prototype was designed for the persona Karishma. View this persona ➝

A screenshot of an app with a user profile with the One Click option pop up appearing in the top right of the screen.
A second screenshot featuring the One Click settings, which include filters for “hide profanity, disable tagging and hide keywords”, the last of which allows users to add their own keywords.

One Click

One Click allows users to set a time-limited safety mode. It can be toggled on when users want to shield themselves from potential pile-ons, and can be accessed and enabled in ‘one click’ from Settings and from pages throughout the platform, e.g. the feeds, posts or profile page. Safety mode features could include disabling comments or activating a ‘delay period’ for comments, activating a profanity or keyword filter, flagging keywords, and disabling tags.
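The time-limited, all-in-one nature of the toggle could be sketched like this. Class and field names are invented; the prototype only specifies the filters shown in the screenshots and the expiry behaviour.

```python
# Illustrative sketch of One Click's time-limited safety mode:
# one call enables several protections, which lapse automatically.
import time

class SafetyMode:
    def __init__(self):
        self.expires_at = 0.0
        self.filters = {"hide_profanity": False,
                        "disable_tagging": False,
                        "hide_keywords": set()}

    def enable(self, hours: float, keywords=()):
        """The 'one click': turn everything on for a limited window."""
        self.expires_at = time.time() + hours * 3600
        self.filters["hide_profanity"] = True
        self.filters["disable_tagging"] = True
        self.filters["hide_keywords"] = set(keywords)

    def active(self) -> bool:
        return time.time() < self.expires_at
```

Making the mode expire by default, rather than requiring the user to remember to turn it off, reflects the scenario it was designed for: a pile-on that is intense but temporary.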

This prototype was designed for the persona Yvonne. View this persona ➝

A second screenshot featuring the panic button interface, with options for the user to select “apply for temporary protected status, flag content and account, archive content and get in touch with a supportive organisation”.

GateWay

GateWay allows users to alert a platform that they are being attacked. It gives the option to request protected status, flag abusive content, collect and archive abusive content as evidence, and generate and share evidence reports. Users also have the option to connect to trusted and verified Civil Society Organisations to seek support in handling online abuse.

This prototype was designed for the persona Mouna. View this persona ➝

A screenshot of an app with a pop-up from iMatter appearing on the screen, which reads “Hi, we’ve received your report”.
A second screenshot featuring the iMatter interface, in which the user is in a conversation with a chatbot. The conversation reads “Hi, just to let you know we’ve received your report, can I ask you some further questions so we can help you further?” The user replies “sure”. The chatbot then states “We’re reviewing your report right now. Would you like for one of our human report supporters to follow up with you to give you further support?” The user replies “yes”, and the chatbot then asks “Would you like to view some support materials related to your report?”
A third screenshot featuring the iMatter interface, with a person reaching out to the user via the chat. Their message reads “Hi Paula, I’m Julia, I’m checking in to see how you are feeling and if you’re happy with the outcome of the report. How are you feeling?” It then asks the user “How did we do?”, allowing them to set a slider between “not so good” and “excellent”, before finally asking “Do you want to share your experience with us?”

iMatter

iMatter provides a chat interface to support users through the reporting process. iMatter is accessed from the homepage where the chatbot states that a report has been received. When clicked, a chat opens and guides the user through the status of their report and offers resources such as community support, and the opportunity to chat with a psychologist. A follow-up conversation checks how the user is doing, asks if they need further support, and offers the option to leave feedback about their experience of the reporting process.

This prototype was designed for the persona Paula. View this persona ➝

A screenshot of the Wellbeing Check-Up interface, which asks the user whether they are ready to run a risk assessment on their profile, while also presenting them with the option of opting in or out of regular check-ups.
A second screenshot of the Wellbeing Check-Up interface, which asks the user “What are you experiencing?”, allowing them to choose an option. It then asks “Do you want to add any information around identity or national context?” (yes or no), followed by a free text field. Lastly, it asks “Do you want to add any information about a specific occurrence or political factor?” (yes or no).

Wellbeing Check-Up

Wellbeing Check-Up provides a short, multiple-choice pop-up risk assessment that can suggest settings to modify based on the user's experience. It allows users to self-assess their risk, and allows the platform to give prompts around risk and to gauge the level of risk a user’s profile is exposed to at a given time, using indicators that are set by the user.
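The idea of combining user-set indicators into a risk level could be sketched as below. The indicator names, weights and band thresholds are entirely invented for illustration; the prototype only specifies that the indicators are set by the user.

```python
# Illustrative risk scoring for Wellbeing Check-Up:
# answers are the user's yes/no indicators, weights are user-set.
def risk_level(answers: dict, weights: dict) -> str:
    """Combine user-set indicators into a coarse risk band."""
    score = sum(weights.get(k, 0) for k, v in answers.items() if v)
    if score >= 5:
        return "high"
    if score >= 2:
        return "medium"
    return "low"
```

The resulting band is what would drive the pop-up's suggestions, e.g. recommending stricter comment settings when the band is high.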

This prototype was designed for the persona Yvonne. View this persona ➝