Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by comment ID, or inspect one of the random samples below.
- "This Princeton guy saw a niche where he could offer copium in exchange for a few…" (ytc_Ugx_G7Oc1…)
- "As an AI user, I make photomanips, not art. Hope the judge throws the book on hi…" (ytc_UgyUqG6a7…)
- "Finally he found something positive outside. Caution: He tried all AI looks whic…" (ytc_UgwH1BMvE…)
- "Why people don't protest? Why they're staying silent? Writing some comments on Y…" (ytc_Ugx9wA-n-…)
- "AI increasing productivity is probably going to be less 'ChatGPT baked me a cake…" (ytc_UgwS-F5il…)
- "If they pass the Turing test then yes, using analogy of a toaster becoming senti…" (ytc_UgjTm-RYC…)
- "AI are like humanity's kids, they learn from us and adapt to our crazy beliefs a…" (ytc_UgwVgy4xG…)
- "That reminds me of a book I read called drop Troopers where in the future they h…" (ytr_UgyxTTyIp…)
Comment (youtube, 2024-06-14T10:2…)

> Usually like your stuff, but it seemed you were becoming frustrated that you just weren't getting it and were trying to convince him otherwise. If we train an AI to be a billion times smarter than a human and it goes rouge, human programmers will have no idea of what weapons it has employed, let alone be able to stop them.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz7jiol8y-3kDAQ72p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLRAA5Qu-ZLBa8C6l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwOaAlI_D6L2KcRxyF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw9-oU5MPpmgAfkr254AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxTXGMA-UqmL5vf-9l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwu73_33tALn8zxVMF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwmQ1hv_mfhIV4byUB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxybr-Ra7PgMv0XTFN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_Ugy5pVyQGxzO-sk7cwd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzomI_L3ALbMDFUUkh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
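Before storing a raw response like the one above, it is worth validating that every row parses and that each dimension value belongs to the codebook. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response itself, but the allowed value sets are only those observed in this sample — the full codebook likely contains more categories.

```python
import json

# Dimension values observed in the sample response above.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"fear", "resignation", "mixed", "approval", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject malformed or off-codebook rows."""
    rows = json.loads(raw)
    for row in rows:
        # IDs in the sample are prefixed ytc_ (comments) or ytr_ (replies).
        if not row.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {row.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: {dim}={row.get(dim)!r} not in codebook")
    return rows

# Example with one row (IDs shortened here for readability).
raw = (
    '[{"id":"ytc_Ugz7jiol8y","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]'
)
rows = validate_codings(raw)
print(len(rows), rows[0]["emotion"])  # → 1 fear
```

Rejecting off-codebook values at parse time catches the common failure mode where the model invents a new label mid-batch, rather than letting it corrupt downstream counts.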