Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "First of all aí can’t be human second I don’t think ai have all of eternity…" (ytr_Ugwls6Kxn…)
- "AI isn’t problem it’s the living breathing biased hate of humans against other h…" (ytc_UgwPEXqH1…)
- "the rest i asked ChatGPT / did he killed anyone who offended him? / ChatGPT said: / …" (ytc_UgwYM02ds…)
- "people keep saying plumber, but you should be an electrician...you basically hav…" (ytc_UgyK4pWF6…)
- "We appreciate your insight into Sophia's quest for wisdom and her interaction wi…" (ytr_UgxjgM3Ll…)
- "On AI being used for military purposes, from the project nimbus wiki we have: \".…" (ytc_Ugzg0p-nh…)
- "is it The pedestrian fault? yes / walking in the night with no light / or some…" (ytc_UgyevoCeQ…)
- "Its not like other tools, its comparable to when people edit images and call it …" (ytc_Ugw0ejnnH…)
Comment
I have real problems when people say we need to keep innovating for the sake of innovation. Ethical choices in science are what stand between us and the great abyss as a species. How much would we learn if we could custom make Small Pox into an ultra death plague (I mean more so), but is it a good thing? We tend to think of science as linear when its a matter of focus and emphasis. I'd rather see our energy go elsewhere than putting it to our next robot overlords. So would most scifi writers.
Source: youtube, posted 2013-06-28T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyOxEIKsDCqbgwTT2h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"curiosity"},
  {"id":"ytc_UgxH0Wy3FMkd5KI1wo14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxtxPvmXODUuSYeIAx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzn9BYaa_5CS2U65F54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugzyxc0e-ge6ci6YGCl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwySONz01Vb8ZOh5lV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxI4atIpbp3sYEpPWx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw1EHIFC_6CcLQ3WPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyPQqvg69EcNw6hOq94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxc40VukF1SsusaWo94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"}
]
```
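The raw LLM response above is a JSON array with one object per coded comment, so looking up a comment's codes by ID reduces to parsing the array and indexing it. A minimal sketch, assuming the response text is available as a string (the `lookup` helper and the two-row sample array here are illustrative, not part of the tool):

```python
import json

# Hypothetical raw response, shaped like the JSON array dumped above:
# one object per comment, keyed by the comment's YouTube ID.
raw_response = """
[
  {"id": "ytc_UgyOxEIKsDCqbgwTT2h4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "curiosity"},
  {"id": "ytc_UgxI4atIpbp3sYEpPWx4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the array by comment ID for O(1) lookup.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; raises KeyError if absent."""
    return codes_by_id[comment_id]

print(lookup("ytc_UgxI4atIpbp3sYEpPWx4AaABAg")["policy"])  # → regulate
```

The same indexing step works unchanged on the full ten-row response shown above, since every row carries the same five keys.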