Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> All those who afraid of AI, I have a question. Why would AI want to get rid of humans? We fight and kill others because it's in our DNA. Our purpose is to live and reproduce and all our actions are centred around it Expecting AI to have those instincts is stupid.

Source: youtube · Topic: AI Governance · Posted: 2025-06-29T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwGrfHHXuX1i-qNyTd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx0zX5aDM0OEFpUUL14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw6pBFqR7pz7t8Srl54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFGalfNcxBP9yH4Nl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9Z-X30WnMhp1OUOZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxGcR7_R7Qvr8MeJN54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy0cF1yBc17qsboVpJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwGjhqRTDx1wKdg1Ph4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz9EkrlgORT4o__cBt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxmsaCZGoPsSARqFtB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
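The raw LLM response is a JSON array with one coding object per comment, keyed by comment ID. Before storing a batch like this, it is worth parsing it and checking every value against the codebook. The sketch below is a minimal, hypothetical validator: the allowed category sets are inferred only from the values visible in the table and response above, so a real codebook may contain more categories.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# Assumption: the actual codebook may define additional categories.
SCHEMA = {
    "responsibility": {"none", "developer", "government", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear", "mixed"},
}

def validate(records):
    """Return (comment_id, dimension, bad_value) triples for every
    coding value not found in SCHEMA; an empty list means the batch
    is clean."""
    errors = []
    for rec in records:
        for dim, allowed in SCHEMA.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append((rec.get("id"), dim, value))
    return errors

# One record copied from the raw response above.
raw = '''[
  {"id":"ytc_UgwGrfHHXuX1i-qNyTd4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]'''

print(validate(json.loads(raw)))  # → [] (every value is a known category)
```

A check like this catches the common failure mode where the model invents a label outside the codebook (e.g. `"responsibility": "society"`), which would otherwise silently pollute the coded dataset.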