Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
The real danger isn't that AI learns, it's that it might learn the wrong lessons…
ytc_UgxEgIWEv…
@aliceuwu1013 Using your logic I should accuse any hunter I meet of taking huma…
ytr_UgzdqoRTv…
The problem, though, is that even though human artists and programmers and so on…
ytc_UgyBh4ENg…
53:40 The solution is, you can't have cooperation without the option to not coop…
ytc_UgzmTc702…
As a s computer engineer , i say we are doomed. AI has progress so fast that AI …
ytc_Ugxokd4Y2…
AI p.AI.nters are getting worse and worse with their p.AI.nting skills that its …
ytc_UgzPTRo_3…
This is just wonderful. I can't wait to see them giving way more human jobs to A…
ytc_UgyeEJBP-…
If you're making imaging AI programs , please make sure you code it where the Bi…
ytc_UgwoIPVbr…
Comment
I guess human beings will be working on destroying the AI systems and ruin it, so they can find a role in life !!
Sounds ugly and crazy.
But I guess it is a battle of survival against some machines or the people who own these machines.
youtube
Cross-Cultural
2025-10-25T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_UgxqcCw_MhJL5wjhGpF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwvuP1G9MXgt7T4obN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgwLh334TBwrlpr4DWR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyaJ2EFhY4XQ21sUJZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz39VeVv4WCuyCD15d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugyl0npULexH58dFvxJ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzd7u62qtcpLKuERVh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwKD-t5dB10z3uf8s94AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxKkQv1PF8TQegc2ZJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx634UzxisGhIGiH9Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
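The lookup-by-comment-ID flow shown above can be sketched as follows. This is a minimal sketch, not the site's actual code: the `index_codings` helper is a hypothetical name, and the allowed-value sets are inferred only from the handful of records visible in this dump, so they may be incomplete.

```python
import json

# Assumption: allowed dimension values inferred from the records shown above;
# the real coding scheme may include values not observed here.
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) and index
    the records by comment ID, skipping any record whose dimension values
    fall outside the observed coding scheme."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

# Usage with one record in the shape seen above (ID shortened for illustration):
raw = ('[{"id":"ytc_Ugx...","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
codings = index_codings(raw)
print(codings["ytc_Ugx..."]["emotion"])  # -> resignation
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse of the raw response, then constant-time lookup per comment.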