Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
It’s fascinating to hear Sam talk about AI becoming an “extension of ourselves.”…
ytc_UgxcQKwnM…
Why are people so eager to to have Ai when it could have so many problems how wi…
ytc_UgzFWy-oG…
AI is a new tool. Motion picture scared photographers, photography scared painte…
ytr_UgwWai_WC…
BANNED ROBOT !! 🤖 This does not feel safe. We understand evil exists and it …
ytc_UgxXpSpAo…
This isn’t about a fictional rogue AI. Our growing dependence on AI is dismal an…
ytc_UgxGrxsnn…
If you have seen any Disney/Marvel/star wars movie… you’d know that they are com…
ytc_UgzS3Ma_j…
In my view this more validates A.I art since it inspired artists to be creative …
ytc_UgzW2v-gC…
Sorry, the correct answer is D, as no robot is actually capable of feeling emoti…
ytr_UgzWrxKHu…
Comment
"AGI in 2026" is major BS.. we have no idea how to make machine act like that and machine learning doesnt even come close to that in terms of internal working. Roman just said that CEO's of these AI companies (who make money purely on stocks and not product) told him.. yeah, what else should they do? They need funding. If you are interested in no marketing opinion about AI and AGI check Tomáš Mikolov.
youtube
2024-08-10T21:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgzHmB5ZT-MSE3pZJ_14AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgwMAIdlfZLfGpOQq8N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPPTWfOiUN0p5wCqZ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx6ouZc0QQ6fwQ5Vth4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx9_CQvUBOzx8j3H5V4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx742ZzUDqfcZhYe1t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgzvSXAMfQlFQXdiW0Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzxs2_0HBC8o4DjLhh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Ugw_QELV0Q3MwUTRKKZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzwRhUkQg24RRGFQOZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}]
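The raw response above is a JSON array with one object per comment, each carrying the four coded dimensions. A minimal sketch of the lookup-by-ID step, assuming only the field names visible in the response (the set of valid values per dimension is inferred from this page and may be incomplete):

```python
import json

# Two rows copied from the raw response above, as a stand-in for a full batch.
RAW = (
    '[{"id":"ytc_UgzHmB5ZT-MSE3pZJ_14AaABAg","responsibility":"none",'
    '"reasoning":"mixed","policy":"unclear","emotion":"outrage"},'
    '{"id":"ytc_Ugx9_CQvUBOzx8j3H5V4AaABAg","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]'
)

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the model's JSON array and key each coding by comment ID."""
    out = {}
    for row in json.loads(raw):
        # Skip malformed rows instead of crashing on a bad model response.
        if "id" not in row or not all(dim in row for dim in DIMENSIONS):
            continue
        out[row["id"]] = {dim: row[dim] for dim in DIMENSIONS}
    return out

codings = index_codings(RAW)
print(codings["ytc_Ugx9_CQvUBOzx8j3H5V4AaABAg"]["responsibility"])  # company
```

Indexing by ID this way is what makes the "Look up by comment ID" view cheap: each coding can be fetched in constant time once the batch response is parsed.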