Abstract
Social media has emerged as a common platform for knowledge sharing and exchange in online communities. However, it has also become a hotbed for the diffusion of problematic content. Content moderation is crucial for maintaining a safe and healthy online environment by regulating the distribution of user-generated content (UGC). Engaging users in content moderation fosters a sense of shared responsibility and empowers them to actively shape the environment of online communities. Leveraging the expertise of moderators leads to a deeper contextual understanding of content, thereby improving the overall consistency and legitimacy of content moderation in compliance with community or platform guidelines. Nevertheless, the collaborative effort of a more inclusive moderation process remains unexplored in prior studies. While fairness, transparency, and ethics in content moderation have received increasing attention, prior research often assesses the content moderation perceptions of users and moderators in isolation, resulting in an incomplete perceptual understanding of content moderation decision-making. To address these limitations, this research proposes UMCollab, a user-moderator collaborative content moderation framework that incorporates the dynamics of user engagement and the domain knowledge of moderators into deep learning models to facilitate content moderation decision-making. Additionally, this research empirically investigates user perceptions of content moderation from the perspectives of review information comprehensiveness, user roles, and content familiarity. UMCollab leverages graph learning to model user engagement, further enhanced by the credibility and stance of users' online discussions. It also employs attention mechanisms to learn moderators' domain knowledge from their decisions on UGC in accordance with online community rules.
Moreover, this research conducts an online experiment in which participants with diverse backgrounds and online engagement roles complete a series of content moderation tasks and evaluate their perceptions of content moderation. The findings of this dissertation hold significant potential for enhancing the effectiveness, fairness, transparency, and sense of community ownership in moderating UGC on social media. By providing theoretical, methodological, and technical contributions to content moderation, this research aims to improve the safety and success of online communities.