Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples

- "This is why I wish they kept AI in the lab until it was ready. We’re already see…" (rdc_mvagvsd)
- "That'd insinuate that they steal AI though. I mean, I suppose, in a way, they do…" (ytr_UgyFYj2Zc…)
- "I disagree, you asked it to essentially play a character so it gave you response…" (ytc_UgykhrbCv…)
- "atp js use ai, it gonna take over the future. i do art 4 fun but let ppl do what…" (ytc_Ugzayl0m7…)
- "Moving to TX soon and very excited about the thought of gardening. What are thin…" (rdc_eh5yc37)
- "In tandem with an opt-in method, AI art prompts should require artist names to b…" (ytc_UgzqGn1bU…)
- "Exactly, but like one person in the video said about democratically deciding how…" (ytr_UgwjKAJzI…)
- "I think people using ai to cheat in school, especially college isn’t great. Howe…" (ytc_Ugwz4l-zF…)
Comment
In large deviations theory you look for the most probable path to an improbable event. The rise of nonstate actors carrying out attacks with nuclear weapons. Enrichment technology has been captured by the US government (bought and then intentionally failed) like SILEX. But China and soon the US is ramping up nuclear power production to meet demand of AI. As small countries decide to do the same the “captured” technology will spontaneously spread. Then even terrorist organizations will be able to produce fissionable materials.
Source: youtube
Posted: 2026-02-12T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
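Each coded comment carries the same four dimensions shown in the table above. A minimal sketch of how a record could be checked against the coding scheme, assuming the allowed values are exactly those observed in this page's raw responses (the actual codebook may define more):

```python
# Hypothetical validator for a coded comment. The allowed values below are
# only the ones observed in the raw responses on this page, not the full
# codebook -- treat them as an assumption.
ALLOWED = {
    "responsibility": {"government", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passes."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result from the table above passes cleanly.
coded = {"responsibility": "government", "reasoning": "consequentialist",
         "policy": "liability", "emotion": "fear"}
print(validate(coded))  # → []
```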
Raw LLM Response
```json
[
  {"id":"ytc_UgyWjJLiqlwNWZ5SZxp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzcZ-xmUygLWRgITvR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy4GZUKO40KiMNNEu54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgytxHN3TXeqjuzNV7l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw5ivfJMXefl_xy_uF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgydbwbvxxvD9WxS63V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzxgIjTe0pUM8zrZnx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx-YZDTO8iXJFVJnHF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxykABvf4S2tY-C7cd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzmL-5yHHRGn43mhhR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
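The lookup-by-comment-ID flow described at the top of the page can be sketched as parsing the raw LLM response (a JSON array of coded comments) and indexing it by `id`. A minimal example, using two entries copied from the response above:

```python
import json

# Raw LLM response: a JSON array of coded comments. Two entries are copied
# here from the full response above; the real batch holds ten.
raw = '''[
 {"id":"ytc_UgytxHN3TXeqjuzNV7l4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzmL-5yHHRGn43mhhR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]'''

# Index the batch by comment ID so any coded comment can be fetched directly.
by_id = {rec["id"]: rec for rec in json.loads(raw)}

# This is the comment shown in the Coding Result table above.
code = by_id["ytc_UgytxHN3TXeqjuzNV7l4AaABAg"]
print(code["policy"], code["emotion"])  # → liability fear
```

Indexing once up front makes every subsequent ID lookup O(1), which matters when the same batch is inspected repeatedly.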