Natural Language Processing Areas of Development

Stella Polaris Knowledge Center / Natural Language Processing / Natural Language Processing Areas of Development
Area
Short description
Reason for Being Key Area
Crime Phase
Technology
Purpose
Expand existing solutions to new countries

Most existing solutions are used in only a small number of countries, while they could be useful in most countries.

🤕Identified Organizational Pain Point
Detect CSAM on encrypted devices

Nearly half of the surveyed police officers reported that encryption is the biggest challenge they face in child sexual abuse investigations, according to the NetClean Report 2019.

🤕Identified Organizational Pain Point
⚖️Prosecution
💻Device extraction/search
🧑‍⚖️Perpetrator prosecution
Speed up co-op from social media platforms

Social media companies are sometimes so slow to provide information requested by the police that evidence is lost.

🤕Identified Organizational Pain Point
⚖️Prosecution
🧑‍⚖️Perpetrator prosecution
Digital Post Crime Solutions

This area currently has very few tools/projects, suggesting it might be low-hanging fruit that could bring great improvements.

⚪Lack of Tools/Projects in Area
❤️‍🩹Post-crime efforts
Detect self-production of live-streaming

Voluntary and induced (through grooming or sexual extortion) self-produced live-streamed child sexual abuse were both reported to be common types of live-streamed material in investigations (NetClean Report 2019).

⚪Lack of Tools/Projects in Area
🚨Detection
🚨Grooming detection/prevention
Automatic in-chat grooming detection

There has been an increase in grooming cases on social media and gaming platforms, and a relative lack of tools/projects aimed at detecting it.

⚪Lack of Tools/Projects in Area
🚨Detection
🚨Grooming detection/prevention
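As a rough illustration of the kind of text analysis such a tool relies on, the sketch below trains a tiny bag-of-words Naive Bayes classifier to flag risky chat messages. Everything here is invented for illustration — the training messages, the `NaiveBayesFlagger` class, and its decision rule; a production grooming detector would be trained on a large, expert-labelled corpus and combined with human review.

```python
# Minimal sketch: bag-of-words Naive Bayes for flagging risky chat
# messages. The training examples below are invented placeholders;
# a real system would use a curated, expert-labelled corpus.
import math
from collections import Counter

def tokenize(text):
    return text.lower().split()

class NaiveBayesFlagger:
    def __init__(self):
        self.word_counts = {"risky": Counter(), "benign": Counter()}
        self.doc_counts = {"risky": 0, "benign": 0}

    def train(self, text, label):
        self.doc_counts[label] += 1
        self.word_counts[label].update(tokenize(text))

    def score(self, text, label):
        # Log-probability with add-one (Laplace) smoothing.
        total_docs = sum(self.doc_counts.values())
        prior = math.log(self.doc_counts[label] / total_docs)
        vocab = set(self.word_counts["risky"]) | set(self.word_counts["benign"])
        denom = sum(self.word_counts[label].values()) + len(vocab)
        return prior + sum(
            math.log((self.word_counts[label][w] + 1) / denom)
            for w in tokenize(text)
        )

    def is_risky(self, text):
        return self.score(text, "risky") > self.score(text, "benign")

flagger = NaiveBayesFlagger()
# Placeholder training data (invented for illustration only).
flagger.train("keep this our secret do not tell your parents", "risky")
flagger.train("send me a photo of yourself do not tell anyone", "risky")
flagger.train("good game want to play another round", "benign")
flagger.train("what homework did you get today", "benign")

print(flagger.is_risky("do not tell your parents about this secret"))
```

In practice such a classifier would only score conversations; the decision to intervene would sit with platform moderators or law enforcement.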
Therapy Chatbot for Abuse Survivors

A therapy chatbot could anonymously help children cope with trauma, and could be a first step towards talking to a parent or a therapist.

⚪Lack of Tools/Projects in Area
💻Technological Trend
❤️‍🩹Post-crime efforts
🔠Text analysis/processing
Advanced Network Analysis

Advanced network analysis using deep learning flags suspicious accounts and prevents grooming attempts on child-friendly platforms such as social media and gaming sites.

💻Technological Trend
🔍Prevention
📊Data analysis and management
🚨Grooming detection/prevention
Automatic flagging of CSAM

Automatic flagging of CSAM on relevant platforms “at its source” helps remove material before it is circulated.

💻Technological Trend
🚨Detection
🖼️Image and video detection/classification
🛑Consumption of child abuse material prevention
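To illustrate flagging “at its source”, the sketch below matches uploads against a hash list of known material. It uses exact SHA-256 fingerprints for simplicity; real deployments use perceptual hashes (e.g. PhotoDNA or PDQ) that also catch resized or re-encoded copies. The `known_hashes` entries and helper names are hypothetical.

```python
# Minimal sketch of hash-list matching: new uploads are hashed and
# compared against a database of hashes of known abuse material, so a
# match can be blocked before the file circulates. Exact SHA-256 is
# used only for simplicity; production systems rely on *perceptual*
# hashes that survive re-encoding and resizing.
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical hash list of known material (placeholder entries).
known_hashes = {fingerprint(b"known-bad-file-bytes")}

def should_block(upload: bytes) -> bool:
    return fingerprint(upload) in known_hashes

print(should_block(b"known-bad-file-bytes"))
print(should_block(b"harmless-holiday-photo"))
```

The key design point is that platforms never need to store the material itself — only the hash list is shared between organizations.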
Advanced Visual Recognition of Child Sexual Abuse

Computer vision AI with advanced age, object, voice, location and facial recognition capabilities to gather more context on an image, for quicker and more accurate victim and perpetrator identification.

💻Technological Trend
🚨Detection
⚖️Prosecution
🖼️Image and video detection/classification
🛑Consumption of child abuse material prevention
🕵️‍♀️Perpetrator investigation
Automatic risk analysis of text

Language processing AI for advanced risk detection, analyzing text to assess the abuse risk of any given piece of content “pre-facto”.

💻Technological Trend
🔍Prevention
🔠Text analysis/processing
🚨Grooming detection/prevention
Video Clustering for Identification

Data-rich pattern recognition AI for advanced grouping and prioritization tools, using deep learning to cluster videos that share the same voice, location and child’s face.

💻Technological Trend
⚖️Prosecution
🔎Facial/object detection
🆔Identification
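A minimal sketch of the clustering step, assuming embeddings have already been extracted by pretrained face or voice models: toy vectors are greedily grouped by cosine similarity to a running cluster centroid. The vectors and the `cluster` helper are invented for illustration; a real pipeline would use a more robust algorithm such as HDBSCAN on much higher-dimensional embeddings.

```python
# Minimal sketch: grouping videos by similarity of their embeddings
# (e.g. face or voice vectors from a pretrained deep model). The
# vectors below are invented toy data.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def cluster(embeddings, threshold=0.9):
    """Greedy clustering: join each vector to the first existing
    cluster whose centroid is within the similarity threshold."""
    clusters = []  # list of lists of video indices
    for i, vec in enumerate(embeddings):
        for members in clusters:
            centroid = np.mean([embeddings[j] for j in members], axis=0)
            if cosine(vec, centroid) >= threshold:
                members.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Toy embeddings: videos 0 and 1 are near-duplicates, video 2 differs.
vids = [np.array([1.0, 0.0, 0.1]),
        np.array([0.9, 0.0, 0.2]),
        np.array([0.0, 1.0, 0.0])]
print(cluster(vids))
```

Grouping by shared voice, location and face in this way lets investigators prioritize series of videos that likely involve the same victim.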
On-device Content Blocking

On-device content blocking AI built into mobile hardware for real-time, dynamic filtering and blocking.

💻Technological Trend
🔍Prevention
🖼️Image and video detection/classification
🛑Consumption of child abuse material prevention
Therapy Chatbot for Perpetrators

Chatbots where perpetrators can anonymously get help coping with their desires. Further, the chatbot can recommend anonymous hotlines to the perpetrator.

💻Technological Trend
🔍Prevention
🔠Text analysis/processing
🛑Consumption of child abuse material prevention
Hidden CSAM marketing detection

AI that can detect whether sites or social media pages are covertly marketing CSAM disguised as legal content, for example using images of clothed children to market CSAM.

💻Technological Trend
🔍Prevention
🔠Text analysis/processing
🛑Consumption of child abuse material prevention
AI text content moderators

AIs that, based on a specific online community’s policies, can flag and remove content as it is created.

💻Technological Trend
🔍Prevention
🔠Text analysis/processing
🛑Consumption of child abuse material prevention
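A minimal sketch of policy-driven moderation, assuming a community's rules can be expressed as regex patterns: each message is checked against every rule and the names of the violated ones are returned. The policy names and patterns are invented examples; real moderation stacks pair rules like these with trained classifiers and human review.

```python
# Minimal sketch: a rule-based moderator that applies a community's
# own policy, expressed as regex patterns, to flag content as it is
# created. The policy entries are invented examples.
import re

policy = {
    "no_contact_requests": re.compile(r"\b(send|share)\b.*\bphoto", re.I),
    "no_offsite_moves":    re.compile(r"\b(whatsapp|telegram|snap)\b", re.I),
}

def moderate(message: str):
    """Return the names of every policy rule the message violates."""
    return [name for name, pattern in policy.items() if pattern.search(message)]

print(moderate("send me a photo on Telegram"))
```

Because the rules live in a plain data structure, each community can maintain its own policy without changing the moderation code.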
www.childhood.org

Childhood

Blasieholmstorg 8, 111 48 Stockholm

+46(0)8-551 175 00

info@childhood.org