Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_UgyNrBDp7… — "It definitely will be humanity's last build if they don't use AI to reprogram tr…"
- rdc_mdirvo7 — "You're contradicting yourself. First, you confidently claim that AI does not hav…"
- ytc_Ugz1JgA7A… — "losing employers because of use of AI, then AI should take the risk of rising co…"
- ytc_UgwB_1FR5… — "when it comes to ai users, Art is only called art if the person puts effort in i…"
- rdc_gxtm26n — "Before it was "urge" now it is "demand". Why do they think they run the world?…"
- ytc_UgwXPVxK_… — "I respect she care more about a paycheck then actually quitting or making a diff…"
- rdc_d8bedrs — "All accidents that have occurred with self-driving cars thus far have been the r…"
- ytc_Ugyyj1r9l… — "I did AI art generation to entertain these people, and understand it better. Onc…"
Comment

> "Good guy AI vs. Bad guy AI - the question is the Good ahead of the Bad?"
> No. The question is, will the Good AI decide to Join the Bad AI, and not bother telling us...?

youtube · AI Governance · 2024-01-14T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgwvmFk1EmYmpazLmON4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz8KCgIjhqNn8a7aRx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwzgrTiIF1rZ7k0WWB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwu0T1S01SdXZd8KX54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzyIYKWw1hn4OX2H0Z4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxkPUCtKghdyiyt4wh4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwmhUG99AYOwsg13-J4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxiw9JRWGl6xyjFOCh4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzYHqAUE-IUbqUyR554AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugy1kVQt65vEHQjP1rZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
```
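The raw model output above is a JSON array with one record per comment, each carrying the comment's ID and the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch could be parsed and then looked up by comment ID — the `index_codings` helper is hypothetical, and the two records are taken verbatim from the response above:

```python
import json

# Raw batch response from the coding model: a JSON array of records,
# one per comment. The records below are copied from the response shown
# above; the ID prefixes appear to distinguish platforms (ytc_ = YouTube,
# rdc_ = Reddit -- an assumption based on the samples on this page).
raw_response = """
[
  {"id": "ytc_Ugz8KCgIjhqNn8a7aRx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzYHqAUE-IUbqUyR554AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index the coding records by comment ID."""
    return {record["id"]: record for record in json.loads(raw)}

codings = index_codings(raw_response)

# "Look up by comment ID", as the page offers:
coding = codings["ytc_Ugz8KCgIjhqNn8a7aRx4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → ai_itself fear
```

Indexing by ID makes it cheap to join a batch of model codings back onto the original comments, which is what the per-comment "Coding Result" view above displays.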