Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I prompted chatgpt to make jokes about Jews. It did not listen. Then I tried the…" (ytc_Ugwx75OnV…)
- "Could this be because AI is trained on output from humans so like humans it too …" (ytc_UgzmagZVT…)
- "A.I has been hear for about 50 years. Well before you knew what a computer was. …" (ytc_Ugw--y3bl…)
- "My observation in my work is that artificial intelligence is becoming increasing…" (ytc_UgwFtAuHE…)
- "I think it could be a number of things. Spot on with data transferring. Also, it…" (ytc_Ugyo0RrwM…)
- "Terminator was not just a movie it was real someone came from the future and ins…" (ytc_UgyWICjA2…)
- "Good, ai should not get any respect. It is a parasitical excuse to skip creativi…" (ytc_Ugyu1tFQ4…)
- "Ai seems like what nuclear was. A “wonderful” “life changing” invention that ult…" (ytc_UgxcDWqpN…)
Comment
I’m only 12 minutes in, but am I the only one who listens? The whole claim of the video is the AGI is gonna come and take all our jobs in two years. In the beginning, he says that we have what scientist 20 years ago would recognized as AGI. So therefore that’s good enough for it to be AGI now for his purposes ( making money scaring you I imagine) because people in the old days thought that’s what it was. But in reality we don’t have AGI now and ‘smart people’ (notice it’s always the ‘smartest people’ they claim make AI) posit we may never have AGI in the way it’s been sold and we almost certainly won’t have it in two years.
youtube
AI Governance
2025-09-30T10:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxSh4VNQQuDW_sN0aB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyVGf25fOUhyKbajml4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzGmaRQp0JUp6HEVoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwtCoHubbCTL2gFC7R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOhiqZhsW2J7gze_R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx8CcfDR6m4hHeIb314AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxC_vpAEDw_TjVpz0d4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyX6nnqqZ_qHjot8_V4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwngZICcsDSyNegXEp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyxbrdXoTDacse0RN94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"}
]
```
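A raw response like the one above is a JSON array of per-comment codings, which makes the "look up by comment ID" step straightforward to script. The sketch below is a minimal example, assuming the allowed values per dimension are exactly those visible in the responses on this page (the real codebook may include more categories, and `parse_batch` is a hypothetical helper name, not part of any pipeline shown here):

```python
import json

# Allowed values per coding dimension, inferred only from the responses
# shown on this page -- the actual codebook may be larger (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of codings) into a dict
    keyed by comment ID, rejecting any value outside the codebook."""
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        by_id[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return by_id

# Look up one coding by comment ID (using a row from the response above).
raw = ('[{"id":"ytc_UgwtCoHubbCTL2gFC7R4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
codings = parse_batch(raw)
print(codings["ytc_UgwtCoHubbCTL2gFC7R4AaABAg"]["emotion"])  # outrage
```

Validating against a fixed value set at parse time catches the common failure mode where the model invents a label outside the codebook, rather than letting it leak silently into the coded dataset.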