Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Truck driver 😂 you never can replace a human to a robot for that job. At least n…" (`ytc_UgwPSpNLJ…`)
- "@lordnokia4222 ...in what universe is the police using AI to circumvent underco…" (`ytr_Ugw-IZhtv…`)
- "So glad that this is fake. Concerned about the future too, what with Atlas and m…" (`ytc_Ugw9EhUwJ…`)
- "I hate AI too, because of what it was used for. However it is like a gun, it was…" (`ytc_UgyOOk8e-…`)
- "that's good. AI slop is terrible. however as the tech advances it will become al…" (`ytc_UgwvpMc3T…`)
- "look yall gonna flame the shit out of me and im ready for that. Ive been laughed…" (`ytc_Ugz84RrVK…`)
- "I am half human half ai I AM HUMANITIES NEXT STEP IN EVOLUTION. Biology and T…" (`ytc_UgzifsJ0o…`)
- "Finally a critique of UBI that is more than \"Conservatives like it so it is auto…" (`ytc_UgyKqiIHp…`)
Comment

> This video got me thinking about a few important lessons. First, it’s wild how AI can mimic human-like creativity, like writing a love story about tigers in the jungle! But it also shows how smart tech needs strong boundaries, or things could go off track, as seen with the "Dan" character that suggests some really out-there ideas. It's clear that humans need to guide AI with care and not let it run without rules. We should definitely be part of the conversation around AI's future and think about who's in control. It’s like the video is saying: exciting tech, but handle with care!

- Platform: youtube
- Video: AI Moral Status
- Posted: 2024-11-26T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzOIdbqFCwFh-G9oDZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyNbIFDbIaUEvRPwkN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz4RCZnx0lTTeJkftd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyBE6bPl7QNTwGFm9l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy8ZRUB9zN30sdaEuh4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxxt5iVRNCNiJaeiwR4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwszHpxxMXIALmvasF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugyz9df2qC7BSrwBtnd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxOiOv9i2dZB7mIk_h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxg14OP-fsW32i7XnF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]
```
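Responses in this shape can be machine-checked before the codes are stored. The sketch below parses one raw batch and validates each item against a per-dimension vocabulary; the `CODEBOOK` sets are only inferred from the values visible in the sample above, and the real codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above (assumption -- the actual codebook may include more categories).
CODEBOOK = {
    "responsibility": {"developer", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"approval", "outrage", "fear", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse a raw LLM response and return a list of validation errors."""
    errors = []
    try:
        items = json.loads(raw)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    for i, item in enumerate(items):
        if "id" not in item:
            errors.append(f"item {i}: missing comment id")
        for dim, allowed in CODEBOOK.items():
            value = item.get(dim)
            if value not in allowed:
                errors.append(f"item {i}: {dim}={value!r} not in codebook")
    return errors

# A well-formed item produces no errors (hypothetical id for illustration).
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"approval"}]')
print(validate_batch(raw))  # → []
```

A check like this catches the common LLM failure modes for structured coding: truncated JSON, a dropped `id`, or a value outside the codebook that would otherwise silently skew the dimension counts.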