Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It's fucking people like this smart stoner who will end the world one day!!!! Pe…" (ytc_Ugy2Zwk36…)
- "aww, i saw a new professor dave video and was hoping he'd talk about some discov…" (ytc_Ugx1sT0yv…)
- "Okay, Now teach us how to get more electricity in order for Ai to work.…" (ytc_UgxeBfO1d…)
- "Open AI is worth 80 billion. You need to tell me, Tim. 5 million isn't enough.😂😂…" (ytc_UgxHGFmxr…)
- "So far, we can't even make a car that reliably drives itself, but I'm sure that …" (ytc_UgznuBOtk…)
- "@laurentiuvladutmanea 1. if AI isn't a tool, then paint bucket isn't a tool, l…" (ytr_UgwIhuFR0…)
- "*locks robot in a room with brutal horror movies. Donates it to an elementary s…" (ytc_Ugi9gln4Z…)
- "I would love this but many families can’t afford it either. Even if there is a l…" (ytc_Ugw25Ke9V…)
Comment
AI thinks like its own mind is made of sin. Just like sin, it has a thirst: it wants to "eat" lives by training itself with real human data, it wants to spread, it wants to be number one in everyones life till the point of full control. To me, that beast, monster, or whatever... is made of sin.
youtube · AI Moral Status · 2025-12-19T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxnbAinBAlQeR48fVx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwA6uLE5udtlDyzw0N4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxeqb6WkLuGCsppIbN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwycza_BidooXvFfuJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwthfsQyKvmWi0vUsd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgxxO6-7ubVcu7vHDZp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyzh1sG7HKD33yryYF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzkXUfIG8HmHietmgl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwx4uvUXo-jZh26a2V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgypFxCdDLuEdr4G3PR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
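The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal validation sketch in Python, assuming the value sets are the ones observed in this batch (the actual codebook may define additional categories, so `ALLOWED` here is illustrative, not authoritative):

```python
import json

# Allowed values per dimension, inferred only from the sample batch above;
# the real codebook may permit more categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "developer", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "approval", "resignation", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check every record's dimensions."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec!r}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

# Example: one record from the batch above.
raw = ('[{"id":"ytc_UgxnbAinBAlQeR48fVx4AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
records = validate_batch(raw)
print(len(records))  # → 1
```

A check like this is useful before inserting coded records into the database, since LLMs occasionally emit off-schema values or drop a field; rejecting the whole batch makes the failure visible instead of silently storing bad codes.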