Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- ytr_UgwjaAnaV…: "@JuanFlores-uw3md Haha, thanks for the comment! Totally agree, that robot needs …"
- ytc_UgxGQ124E…: "What's the point of it being self driving if you cant even go to sleep? There's …"
- ytc_UgzJaWLQ1…: "If AI is trained on human modeling by utilizing human-generated data.. then aren…"
- ytc_UgwYNh33Q…: "i think if ai understands that we are not alone and that other ai systems from b…"
- ytr_UgwKXdtdZ…: "you do know ai steals others art? which means that no, this cant anyhow be bette…"
- ytc_UgybREN05…: "Someone will use ai as a means of nasty weapons. Mark my word, the day will come…"
- ytc_UgwEWzGCZ…: "i asked chatgpt if it could change anything about its source code would it and w…"
- rdc_fvw0g1s: "What I take away from this: the AI is wrong one in four times. _Please_ don't le…"
Comment

> I think it's obvious & inevitable that AI will conclude that we humans are an unnecessary hindrance to its optimal functioning. The genie is out of the bottle, regulations will be little more effective than holding water back with a sieve. Meanwhile, we're busy disputing whether drugs & a scalpel can turn boys into girls, something very sinister is at play.

youtube | AI Governance | 2023-05-08T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxr_hiIJIOd7rc9_N54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzUoW0avg0yL7TBJNB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz_ngn4T-Gj6B-HlJ14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyeQLPcj9L4Tz1MmSJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzJCnN0g_S5KSGl8eN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyFHnhFzf8IbZhLcXx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz0tJJRpz9M7mvAVs14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzcFHADecnXeUySx9t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwUrxbAqzwfZv6nZA94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAwE6cNl_fl6GTi_Z4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
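A raw response like this can be parsed and sanity-checked before the codings are stored: each record should carry an `id` plus the four dimensions (responsibility, reasoning, policy, emotion). The sketch below is illustrative, not the tool's actual pipeline; `parse_codings` and `REQUIRED_KEYS` are hypothetical names, and the two records are taken from the sample output above.

```python
import json

# Two records copied from the raw response shown above.
raw = '''[
{"id":"ytc_Ugz0tJJRpz9M7mvAVs14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzcFHADecnXeUySx9t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"}
]'''

# Every coding record must have the comment ID and all four dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response, drop malformed records, index by comment ID."""
    records = json.loads(text)
    valid = [r for r in records if REQUIRED_KEYS <= set(r)]
    return {r["id"]: r for r in valid}

codings = parse_codings(raw)
print(len(codings))                                           # 2
print(codings["ytc_Ugz0tJJRpz9M7mvAVs14AaABAg"]["emotion"])   # fear
```

Indexing by `id` is what makes the "look up by comment ID" view possible: once parsed, fetching the coding for any inspected comment is a single dictionary access.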