Raw LLM Responses
Inspect the exact model output for any coded comment.
Comments can be looked up by comment ID. Random samples (truncated previews with their comment IDs):
- "You wouldn't be this critical if I hadn't used Ai" Yeah I like to be nicer to p… (`ytc_Ugw-2OtJp…`)
- @nickolasolds315 Did they have videos of AI robots in the 70s? No. But we hav… (`ytr_UgyKPaaae…`)
- It's not about AI of today but of the future, like next year and every year afte… (`ytc_UgzwPAPiz…`)
- That's stable diffusion which is pretty primitive compared to DALL-E 3. It will … (`ytc_UgxgRPkOb…`)
- AI is just a scapegoat. The real question is what if we outsource all our jobs o… (`ytc_UgyJI5ZNo…`)
- AI was supposed to be the hammer to drive the nail, to boost us, not to make us … (`ytc_Ugyzq3bXU…`)
- Alexis is pretty much a robot without the body and head. I seen a video where ha… (`ytc_UgwmR6xCO…`)
- We know he was murdered. We know it has to do with these AI evil doers. We know… (`ytc_Ugzutm1XD…`)
Comment

> Wow... AI, the new Y2K. This is what happens when you watch too many sci-fi movies and invite some dude who doesn't code AI (apparently his company "works with" AI, which is not the same as coding it) to speak about something he doesn't really have an insight into. It doesn't matter what an AI model does or appears to do, there will be a line of code somewhere in there that will tell it it can do it.

Platform: youtube · AI Moral Status · 2025-06-12T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy_9kqDidogMRbzWSZ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzIkE4AAyP9wUWHQA94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugx8bDxA22LsxrWc8-14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgxsBh1KRkrOuNd7yp14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwZrYSK6Wpx6GpNlrJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwh235PoHTc3eaH7E54AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy56UbnSjlpIfSkjQx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzwQBhryKBG5vziHxB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwr9NBBeltQzbCnigd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugz_VLUvgzNBFx359zd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
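The raw response is a JSON array of per-comment codes, one object per comment, with the four dimensions shown in the coding-result table. A minimal sketch of parsing and sanity-checking such a response — note the allowed value sets below are inferred from the sample output above, not an official schema:

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above -- assumptions, not an exhaustive vocabulary.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed", "unclear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response and index it by comment id.

    Raises ValueError when an entry is missing a dimension or uses a
    value outside the (assumed) allowed vocabulary.
    """
    coded = {}
    for row in json.loads(raw):
        comment_id = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{comment_id}: bad {dim!r} value {row.get(dim)!r}")
        # Keep only the four coding dimensions, dropping any extras.
        coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-entry response, in the same shape as the dump above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"industry_self",'
       '"emotion":"indifference"}]')
codes = parse_codes(raw)
```

Indexing by ID makes it cheap to join a code back to its source comment, which is exactly the lookup this view supports.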