Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a record by its comment ID.
Random samples
- "GTFOH this dude is a talking contradiction... he says "I believe we are living i…" (ytc_UgySSgqeo…)
- "The present reality of AI is that entry level jobs are being erased. Young peopl…" (ytc_Ugyd8pmSm…)
- "Bro if my dad or brother saw my ai chats I would be dead 🫠…" (ytc_UgwtKzmvV…)
- "I don’t think chat gpt is going to give them your name though. Obviously it’s go…" (ytc_UgwGGfgwW…)
- "We need to all email Character AI about changing their community guidelines when…" (ytc_UgwA29wLN…)
- "@MyTurtleApril Honestly, it's not even an opinion. People can say that AI art is…" (ytr_UgzJx0Sd2…)
- "We might get UBI in return for our attention. It will be a strange new world, b…" (ytc_Ugzd5W24Y…)
- "May be google or any platform that use ai for creation should label it with "AI …" (ytc_UgwZarlE2…)
Comment
There are, to be fair, some pretty damn good videos with both well-grounded facts and well-reasoned arguments that lead up to a conclusion that might be summarized as "AI is going to kill us." Or at least, y'know, "AI has the potential to become the highest degree of dangerous."
For some reason, we've ended up with these two camps, where some people like to emphasize the immediate, current issues with AI tech, and others like to emphasize the potential future existential risks. But there is not necessarily any contradiction between these two, and many of the fundamental problems are the same.
Source: youtube · 2025-10-13T13:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugx3UBh1Px04fo6VaqJ4AaABAg.APtAB9-VyNQAPts0GgBQoT","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzIbnuMdt6nBCliBs54AaABAg.APsie7217_IAPsmplAeyiR","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxEJIXNCC-qNjuxaC94AaABAg.APrdpw50jWaAPx4JLnur7m","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxN0Gcy_vm7VLTFtH94AaABAg.AMtVk3blbvzAMvgGGfZbqH","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxN0Gcy_vm7VLTFtH94AaABAg.AMtVk3blbvzANKVn1OpSk1","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugwt0Abe-66nck8-Qy94AaABAg.ANmoYTSH9POAODYXbMMJK3","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugwt0Abe-66nck8-Qy94AaABAg.ANmoYTSH9POAODZRMUkYHW","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwN4JXE1QnI4j7PyQJ4AaABAg.AMvXSxYKqNlANOfyCuJxCh","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugwx1JKZQSI3vA1t6Zt4AaABAg.AMuiZDjdolJAMuojRyXcNd","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgylOM3Y16ciMR6G_6N4AaABAg.AMugSTclqn2AMuhEhhsldm","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
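The "look up by comment ID" workflow described above can be sketched in a few lines of Python. This is a minimal illustration, not the tool's actual implementation: the `lookup_by_id` helper name is hypothetical, and the array literal copies just two records from the raw response shown above.

```python
import json

# Two records copied verbatim from the raw LLM response above; in practice
# the full array would be loaded from wherever the tool stores its outputs.
RAW_RESPONSE = """
[
  {"id":"ytr_UgxEJIXNCC-qNjuxaC94AaABAg.APrdpw50jWaAPx4JLnur7m","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwN4JXE1QnI4j7PyQJ4AaABAg.AMvXSxYKqNlANOfyCuJxCh","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
"""

def lookup_by_id(coded_rows, comment_id):
    """Return the coding record matching comment_id, or None if absent."""
    return next((row for row in coded_rows if row["id"] == comment_id), None)

coded = json.loads(RAW_RESPONSE)
record = lookup_by_id(coded, "ytr_UgwN4JXE1QnI4j7PyQJ4AaABAg.AMvXSxYKqNlANOfyCuJxCh")
print(record["responsibility"], record["emotion"])  # company outrage
```

Because the model returns one JSON object per comment keyed by `id`, a dict built once (`{row["id"]: row for row in coded}`) would make repeated lookups O(1) instead of scanning the list each time.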