Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Don't become confined in your echo chamber. Invite a broadening of it by essayin…" (ytc_UgzTlp-kA…)
- "Fuck the clankers! ...finally I can be a racist! Humanity over all never trust a…" (ytc_UgxmXARIh…)
- "I can understand where you're coming from! Interacting with AI can feel a bit un…" (ytr_UgxkWrSHf…)
- "You pointed out how alien the ai commercials are,what if it's also just lazy ai …" (ytc_UgzScMGPE…)
- "Explaining that we cannot predict what AI will do when it gains super intelligen…" (ytc_Ugz49goLc…)
- "A.I is not evil. A.I is not good. A.I is what it is depending on the morals of t…" (ytr_Ugy0Tby4R…)
- "Cool. Nobody has money and these crap companies go to hell. Incidentally AI is…" (ytc_Ugzq1RPPW…)
- "The Final Five Jobs According to Yampolskiy: 1) AI Researcher / Safety Expert T…" (ytc_Ugxf1RA-7…)
Comment

> The outcome of all this could possibly be what we saw in "The terminator" movie. When the A.I. becomes self aware humans are in danger. Thought provoking movie and a warning.

youtube · AI Governance · 2024-06-11T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgxLXt7IcKXx1OgWDwl4AaABAg.9wdy4Rs0GcsA4YAIFfRO3V","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgxLXt7IcKXx1OgWDwl4AaABAg.9wdy4Rs0GcsA4YPzcsJ4Ke","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgzEstUayZhCpajDOHh4AaABAg.9wc6Eb2Fx7l9xqpJ9y93J-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgzEstUayZhCpajDOHh4AaABAg.9wc6Eb2Fx7l9yh_sGtf6JJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwJRS2gLugovS02taJ4AaABAg.9wbofxctIx39xqIEc3gvBv","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytr_UgwJRS2gLugovS02taJ4AaABAg.9wbofxctIx39xqfuHIaluG","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugx8r84v_JpAF_1bOpN4AaABAg.9w_PX3ujyL59wf_-__KUbT","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugz8XXbHD-Yp5IpBHFh4AaABAg.9wYr6TnX9LO9wlGHmOB4PH","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxE39q66FlihxuyOLl4AaABAg.9wXyaW_nQ_i9xr_CTNgPRe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxE39q66FlihxuyOLl4AaABAg.9wXyaW_nQ_i9xthVI0tMPK","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
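The raw response is a JSON array with one coding per comment. A minimal sketch (not the tool's actual code) of how such a payload might be parsed, validated, and looked up by comment ID — the `ALLOWED` vocabularies below are inferred from the values visible in this sample and are likely incomplete:

```python
import json

# Dimension vocabularies observed in this sample (assumption: the real
# codebook may define more values than appear here).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse the LLM's JSON array and index the records by comment ID,
    rejecting any record whose values fall outside the known vocabularies."""
    by_id = {}
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = rec
    return by_id

# Usage with a hypothetical record (the real IDs are much longer):
raw = ('[{"id":"ytr_example1","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytr_example1"]["emotion"])  # -> fear
```

Indexing by ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: one dictionary access per comment instead of a scan over the whole batch.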