Moderating online child sexual abuse material (CSAM): does self-regulation work, or is greater state regulation needed?


Bibliographic Details
Authors: Bleakley, Paul (Author) ; Martellozzo, Elena (Author) ; Spence, Ruth (Author) ; DeMarco, Jeffrey (Author)
Format: Print Article
Language: English
Published: 2024
In: European journal of criminology
Year: 2024, Volume: 21, Issue: 2, Pages: 231-250

MARC

LEADER 00000naa a2200000 4500
001 189050582X
003 DE-627
005 20240603135211.0
007 tu
008 240603s2024 xx ||||| 00| ||eng c
035 |a (DE-627)189050582X 
035 |a (DE-599)KXP189050582X 
040 |a DE-627  |b ger  |c DE-627  |e rda 
041 |a eng 
100 1 |a Bleakley, Paul  |e VerfasserIn  |0 (DE-588)1284636674  |0 (DE-627)1840335718  |4 aut 
109 |a Bleakley, Paul 
245 1 0 |a Moderating online child sexual abuse material (CSAM)  |b does self-regulation work, or is greater state regulation needed?  |c Paul Bleakley, Elena Martellozzo, Ruth Spence, Jeffrey DeMarco 
264 1 |c 2024 
336 |a Text  |b txt  |2 rdacontent 
337 |a ohne Hilfsmittel zu benutzen  |b n  |2 rdamedia 
338 |a Band  |b nc  |2 rdacarrier 
500 |a Literaturverzeichnis: Seite 247-250 
520 |a Social media platforms are crucial public forums connecting users around the world through a decentralised cyberspace. These platforms host high volumes of content and, as such, employ content moderators (CMs) to safeguard users against harmful content like child sexual abuse material (CSAM). These roles are critical in the social media landscape; however, CMs’ work as “digital first responders” is complicated by legal and systemic debates over whether the policing of cyberspace should be left to the self-regulation of technology companies, or if greater state regulation is required. In this empirical policy and literature review, major debates in the area of content moderation and, in particular, the online policing of CSAM are identified and evaluated. These include the issue of territorial jurisdiction, and how it obstructs traditional policing; concerns over free speech and privacy if CMs are given greater powers; and debates over whether technology companies should be legally liable for user-generated content (UGC). In outlining these issues, a more comprehensive foundation for evaluating current practices for monitoring and combatting online CSAM is established, which illustrates both the practical and philosophical challenges of the existing status quo, wherein the state and private companies share these important responsibilities. 
650 4 |a content moderation 
650 4 |a Social Media 
650 4 |a online harms 
650 4 |a child sexual abuse material 
650 4 |a Cybercrime 
650 4 |a online policing 
700 1 |a Martellozzo, Elena  |e VerfasserIn  |0 (DE-588)1022152564  |0 (DE-627)71692823X  |0 (DE-576)364897341  |4 aut 
700 1 |a Spence, Ruth  |e VerfasserIn  |0 (DE-588)123035770X  |0 (DE-627)1752711173  |4 aut 
700 1 |a DeMarco, Jeffrey  |e VerfasserIn  |4 aut 
773 0 8 |i Enthalten in  |t European journal of criminology  |d London [u.a.] : Sage, 2004  |g 21(2024), 2 vom: März, Seite 231-250  |w (DE-627)378572970  |w (DE-600)2134760-8  |w (DE-576)109717155  |x 1477-3708  |7 nnns 
773 1 8 |g volume:21  |g year:2024  |g number:2  |g month:03  |g pages:231-250 
951 |a AR 
ELC |b 1 
LOK |0 000 xxxxxcx a22 zn 4500 
LOK |0 001 4533774725 
LOK |0 003 DE-627 
LOK |0 004 189050582X 
LOK |0 005 20240603135211 
LOK |0 008 240603||||||||||||||||ger||||||| 
LOK |0 040   |a DE-21-110  |c DE-627  |d DE-21-110 
LOK |0 852   |a DE-21-110 
LOK |0 852 1  |m p  |9 00 
LOK |0 938   |k p 
ORI |a WA-MARC-krimdoka001.raw