Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "The major risk with AI is the same risk with just I, believing every single thin…" — ytc_UgzHwC2g-…
- "Well there goes Assimov's "Rules for Robots"... The developers have conveniently…" — ytc_Ugz6r6S68…
- "What’s funny about this, is that all data sets are biased. There’s no such thing…" — ytc_UgzmWKwf7…
- "This is stupid. Everyone knows ai is kaaali ma via a light slit experiment. Al…" — ytc_Ugy8GfZfb…
- "Artificial intelligence has some amazing potential benefits in the health care f…" — ytc_UgyotEls0…
- "@psychedelicJake I am elated that AI is helping you in such a meaningful way. I …" — ytr_Ugy5OnUqt…
- "Tip on extra alienation of these guys (because they are not artists). Start sayi…" — ytc_Ugxhg2YQm…
- "Skynet becoming self aware is only moments away....either that or we'll get bett…" — ytc_UgyD6DZxi…
Comment
In regard to the military use of AI: it's not just that big countries could invade small countries more easily due to less backlash; small countries could also fight back and retaliate far more easily. They lack the personnel due to their size, but they won't need it anymore if they send robots instead. Also, their population won't die out as fast, since it isn't humans doing the fighting. So nothing will really change in that regard. The far bigger risk is a terrorist using 100 autonomous drones to bring down airplanes. Nobody can stop this scenario today.
youtube
AI Governance
2025-06-20T11:2…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxG_bW-Ga44iH1nQdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyiwpxK51Qn7h6meiR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwDLvinVNtQGrtSEfF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyGJc4MatqIAwsv90t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxOOPFK7kjqEn4I_-R4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxEx5cGAbGxaSINSht4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx6f9LsQrJTj4sR9694AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzReP_rguS7WN-Mfb54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzQ6jysPESiymlsXBd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgystHbHW3zDK-ZPitZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
```
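A raw response like the one above can be parsed into a per-comment lookup and validated against the coding dimensions shown in the result table. The sketch below assumes the allowed values are exactly those appearing in this response (the real codebook may define more categories); the function name and schema are illustrative, not part of the original tool.

```python
import json

# Allowed values per coding dimension, inferred from the response above
# (assumption: the actual codebook may include additional categories).
SCHEMA = {
    "responsibility": {"none", "unclear", "distributed", "ai_itself", "government", "developer"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability", "ban"},
    "emotion": {"indifference", "fear", "outrage", "approval", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if a record is missing a dimension or uses a value
    outside the schema, so malformed model output is caught before storage.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r}: {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in SCHEMA}
    return coded

# Example with a two-record subset of the response above:
raw = '''[
 {"id":"ytc_UgxG_bW-Ga44iH1nQdx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyiwpxK51Qn7h6meiR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''
coded = parse_coding_response(raw)
print(coded["ytc_UgyiwpxK51Qn7h6meiR4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes the "Look up by comment ID" view above cheap: each inspected comment maps directly to its coded dimensions without rescanning the batch response.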