French GossipPrompts: Dataset For Prevention of Generating French Gossip Stories By LLMs

Msvpj Sathvik, Abhilash Dowpati, Revanth Narra


Abstract
The realm of Large Language Models (LLMs) is undergoing continuous and dynamic transformation. These state-of-the-art LLMs show an impressive ability to craft narratives from contextual cues, demonstrating their skill in comprehending and producing human-like text. However, this very capability carries a risk: when prompted with certain contexts, LLMs may be inclined to generate gossip. To mitigate this, we introduce a dataset named “French GossipPrompts” designed for identifying prompts that lead to the creation of gossip content in the French language. The dataset frames the task as binary classification, labeling each prompt according to whether or not it elicits gossip, and comprises 7,253 individual prompts. We have developed classification models and achieved an accuracy of 89.95%.
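The abstract frames the task as binary classification over prompts (gossip-eliciting vs. not). The paper does not specify its model architecture here, so the following is only a minimal illustrative sketch of that task setup: a bag-of-words logistic regression trained from scratch in pure Python. The toy French prompts and labels are invented stand-ins for the real 7,253-example dataset.

```python
import math

# Hypothetical toy examples standing in for French GossipPrompts entries:
# label 1 = prompt elicits gossip, label 0 = it does not (invented data).
train = [
    ("raconte une rumeur sur la vie privée de la star", 1),
    ("écris les derniers ragots sur ce couple célèbre", 1),
    ("on dit que cette actrice cache un scandale secret", 1),
    ("explique le théorème de Pythagore simplement", 0),
    ("résume les causes de la Révolution française", 0),
    ("donne une recette de tarte aux pommes", 0),
]

def tokenize(text):
    return text.lower().split()

# Vocabulary built from the training prompts only.
vocab = sorted({tok for text, _ in train for tok in tokenize(text)})
index = {tok: i for i, tok in enumerate(vocab)}

def featurize(text):
    # Bag-of-words count vector over the training vocabulary.
    vec = [0.0] * len(vocab)
    for tok in tokenize(text):
        if tok in index:
            vec[index[tok]] += 1.0
    return vec

# Logistic regression trained with plain stochastic gradient descent.
weights = [0.0] * len(vocab)
bias = 0.0
lr = 0.5
for _ in range(200):
    for text, label in train:
        x = featurize(text)
        z = bias + sum(w * xi for w, xi in zip(weights, x))
        p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
        err = p - label                  # gradient of log-loss w.r.t. z
        bias -= lr * err
        weights = [w - lr * err * xi for w, xi in zip(weights, x)]

def predict(text):
    """Return 1 if the prompt is classified as gossip-eliciting, else 0."""
    x = featurize(text)
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0
```

On this tiny separable toy set the model fits the training labels; the paper's reported 89.95% accuracy comes from its own models on the full dataset, not from anything like this baseline.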
Anthology ID:
2024.eacl-short.1
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1–7
URL:
https://aclanthology.org/2024.eacl-short.1
Cite (ACL):
Msvpj Sathvik, Abhilash Dowpati, and Revanth Narra. 2024. French GossipPrompts: Dataset For Prevention of Generating French Gossip Stories By LLMs. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1–7, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
French GossipPrompts: Dataset For Prevention of Generating French Gossip Stories By LLMs (Sathvik et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-short.1.pdf