Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I think the capabilities of AI are extremely exagerated. While some jobs are at …" (ytc_UgxzDglcJ…)
- "It's not Nanobanana 2 if that's what you're wondering. It's more like a rebrande…" (rdc_nnvu8k1)
- "Bad idea shut that robot down, she'll make new robots and they'll probably extin…" (ytc_UgwDBXbGn…)
- "Please scream at the ai menu before they get to me. By the time they get to me t…" (ytc_Ugwruda-S…)
- "Nop. Still shit. You can never get anything real done with AI support. Giant tim…" (ytc_UgzMf2FbC…)
- "i will not watch a single ad here till you fix this massive issue google !!! b…" (ytc_UgxOrX3E6…)
- "Wonderfuly revealing..If this trusted reliable presstitute liar pays homage to a…" (ytc_Ugx5YGDM3…)
- "That one robot gave the camera the most bombasticyest side eyes in all of histor…" (ytc_Ugyn55qH7…)
Comment

> Considering the difficulty being free of AI enough to have access codes and passwords and such beyond its reach, it might be worth considering what it would mean if AI was as dangerous as we already know it could be, plus if it had access to nukes. There's extensive surveillance these days, so what are the practical things in place to protect this stuff?

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2024-06-03T07:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxMyQqwZTj7E74Rbpp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxElwnbpadHs60A_b14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwrzdvSF_bxkibu_Eh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxMLUQjyIfo523o66V4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyEYi7IYpHAusZ-nIh4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyjBI_kDBh9wF1aWZ54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxD03v0mcD1FeLaD2d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwqpB_m8GxjoiGDjS14AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz1S2Sn0LCywfXbsAl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy9bGljAb63uJG4MVl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
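A raw response like the one above can be parsed and indexed by comment ID before it reaches the dashboard. The sketch below is a minimal, hypothetical validator: the allowed value sets per dimension are inferred only from the values visible in this sample (the real codebook may define more categories), and the `validate_batch` helper is an illustration, not the pipeline's actual code.

```python
import json

# Allowed values per dimension, inferred from the sample response shown
# above; the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval", "mixed", "unclear"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid or not cid.startswith(("ytc_", "rdc_")):
            continue  # skip records without a recognizable comment ID
        if any(rec.get(dim) not in vals for dim, vals in ALLOWED.items()):
            continue  # drop records with out-of-codebook values
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# One record from the sample response above.
sample = ('[{"id":"ytc_UgyEYi7IYpHAusZ-nIh4AaABAg",'
          '"responsibility":"unclear","reasoning":"consequentialist",'
          '"policy":"regulate","emotion":"fear"}]')
coded = validate_batch(sample)
```

Once indexed, a lookup by comment ID is a plain dict access, e.g. `coded["ytc_UgyEYi7IYpHAusZ-nIh4AaABAg"]["policy"]` yields `"regulate"` for the sample record.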