Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Plot twist, your whole reality has been AI generated, and you are an AI too.…" (ytr_UgxbpLCrX…)
- "Omg this is awesome!! Learning shit we actually can use in the real life!!! Yay …" (ytc_UgzhrCfyt…)
- "So artist stealing the work of AI for their own profit without consent from the …" (ytc_UgzPqTWi2…)
- "Sorry to tell you this, but AI is getting pretty realistic, but AI is knocking l…" (ytc_UgwvfkUC8…)
- "I'm so glad I never got into ChatGPT. Ive never used that kind of AI for anythin…" (ytc_UgwpXhQjz…)
- "It is not art because most of these AI’s use actual art from other artists to ma…" (ytr_UgxPKbdC3…)
- "@ItsMeTord you're also ignoring the fact that any program used to help the disa…" (ytr_Ugz5rkY64…)
- "Kashmir Hill of NYT on twitter was walking around NYC finding stores with facial…" (rdc_jcgs3l5)
Comment
How about creating some kind of tool to stop dangerous technology, robots or apps, from acting without approval. Like the old "Ctrl,Alt,Delete" to stop a program? An electromagnetic pulse tool to stop a machine? Or a netting simulating a faraday cage to grapple a machine? Lol im just thinking about stuff i have seen in the movies. I do feel this is going to be a problem if the good people lose control over AI or robots.

I feel if we prevent or lessen the ability of AI to move about, there could be a solution there.
youtube
AI Governance
2026-03-07T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy4CX-2_WXCDRmoIVZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx1PpLGOwLbGql9iJ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzJ24f4wsEG2iXJ4994AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwJkHQ9FyOLuJlESxV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgzP74pxa68mYhrUOLp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw1dZOuE1Dnpjw5Hnd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxbPDqLKgUKdlP6iHd4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugwi6TuNYd7QUvSwYUF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxU2T8pX0KuW2C22ol4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyoH8jH8ljC9ZzuPuJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
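A raw batch response like the one above can be turned into per-comment records with a short parsing sketch. This is a minimal, hypothetical helper (not the tool's actual code): it assumes the response is a JSON array of objects keyed by comment `id` with the four dimensions shown in the coding result table, and the allowed-value sets below are only those observed in this sample, so they are likely incomplete.

```python
import json

# Values observed in the sample response above; the real coding scheme
# may define more (this is an assumption, not the full codebook).
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "virtue", "deontological", "contractualist"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: codes},
    dropping malformed records instead of failing the whole batch."""
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            continue  # skip records without a comment ID
        codes = {dim: record.get(dim) for dim in OBSERVED_VALUES}
        if all(codes.values()):  # require every dimension to be present
            coded[cid] = codes
    return coded

# One record from the sample response, reproduced verbatim:
raw = ('[{"id":"ytc_Ugy4CX-2_WXCDRmoIVZ4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
result = parse_coding_response(raw)
print(result["ytc_Ugy4CX-2_WXCDRmoIVZ4AaABAg"]["policy"])  # regulate
```

Dropping malformed records rather than raising keeps one bad line in a batch of ten from discarding the other nine codings; unexpected values can still be flagged afterwards by checking each code against `OBSERVED_VALUES`.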