Raw LLM Responses
Inspect the exact model output for any coded comment; responses can be looked up by comment ID.
Comment
What is scary about AI is the uncertainty it brings for humanity, is like saying we created something that could be a bomb or it could be the answer to a lot of issues we have if it is a bomb it we wouldnt even know its blast radius and power, if it is the answer to our issues we don't even know what issues it will answer, so what do we do? We are gambling here.
Platform: youtube · Video: AI Moral Status · Posted: 2026-03-01T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugym54oiLUt1TYNQSq14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxHdKsm0gkRKlKROlN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy0zdC9ezKAIy8U1b54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzHR5jrfIZrHMd-Ng14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwwPe_H4u0iP_6B5AZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzEP8QtAxXI2iKBInF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvP5rLLHKrz3O3fGN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKhCYHV9sVevWPXLZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwpRuexR2h9H6l9nt54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzyGMgTSlPoKXYRanZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
```
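The batch response above is a JSON array keyed by comment ID, so looking up one comment's coding is a parse-and-index operation. As a minimal sketch (the `index_by_id` helper is illustrative, not the tool's actual code; the excerpt below uses the entry for the inspected comment):

```python
import json

# Excerpt of a raw batch coding response, as shown above.
raw = '''[
  {"id": "ytc_Ugy0zdC9ezKAIy8U1b54AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]'''

def index_by_id(raw_json: str) -> dict:
    """Parse a batch coding response and index its rows by comment ID."""
    return {row["id"]: row for row in json.loads(raw_json)}

coded = index_by_id(raw)
row = coded["ytc_Ugy0zdC9ezKAIy8U1b54AaABAg"]
print(row["responsibility"], row["policy"])  # distributed regulate
```

Indexing by ID makes the lookup O(1) per comment, which matters when the same raw response has to serve many inspection requests.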