Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- rdc_ffg4n5k: "This facial recognition is going to ruin a lot of lives. The intention does not …"
- ytc_UgyqTJeQ2…: "Here is a Q - If this guy knew all this why the F did he start it - The lesson -…"
- ytc_UgynPymqA…: "wow, just the intro clip is insane. "if we give these people too much power". We…"
- ytc_UgxbvYwAF…: "It seems to me that a case could be made - as a class action suit, I'd imagine -…"
- ytc_UgwPuuK8o…: "You can't COMMAND something or someone to create "art" for you because "feel a t…"
- ytc_UgxxkrAsU…: "Thousands of lines of codes is going into making AI. I don't know about y'all, b…"
- ytc_Ugw4KvOKq…: "AI test net is already been used in Gaza, a flattened moon scape of western/Zion…"
- ytc_Ugx7GVCwd…: "Hey I am a sex worker. R u going to fuck an AI. In 2045? 😂😂😂…"
Comment
When I was in school studying art and media we were required to use AI generators for certain classes to "learn about the technology". It was literally required for the work we did and we'd fail those assignments if we refused, which could majorly affect our grades. I didn't realise just how damaging it was at the time but still felt really weird about using it. Eventually I learned that the data the AIs are trained on is taken from people without their permission and now I despise AI. I think this is such a cool way to fight back against it and I think it's great people are talking about it. Thanks!
Source: youtube · Video: Viral AI Reaction · Posted: 2024-10-20T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyUO9wkI6VEEJblShx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz1F2BQl_U0vkj_KTt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxZIu7_i3Wd0g_X17B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"frustration"},
  {"id":"ytc_UgxjMUbIfXmr8l0jGvJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx6dIMMtqTmKcjHPut4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugwzg7_lUIZqxYSm9Qh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyIggz4pgLI10mwLu94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwn68OjOSfNszRVtpF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw9nuieYsXxL1OyLK54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxecLnqrmvuHP2pfnh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"}
]
```
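Responses like the one above should not be trusted blindly: the model may drop a record, invent a label outside the codebook, or return malformed JSON. Below is a minimal validation sketch in Python. The allowed values per dimension are an assumption inferred from the examples on this page (the real codebook may include more labels), and `ALLOWED` and `validate_batch` are hypothetical names, not part of any tool shown here.

```python
import json

# ASSUMPTION: allowed labels per dimension, inferred from the responses
# shown on this page; the actual codebook may differ.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"outrage", "fear", "frustration", "resignation",
                "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records that pass schema checks."""
    records = json.loads(raw)  # raises ValueError on malformed JSON
    valid = []
    for rec in records:
        # Every record must be an object with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Every dimension must be present with a value from the codebook.
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"deontological",'
       '"policy":"liability","emotion":"mixed"}]')
print(len(validate_batch(raw)))  # → 1
```

Records that fail a check are dropped rather than repaired, so the comment IDs missing from the validated output can be queued for re-coding.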