Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "AI doesn't need to be sentient or think it is sentient, the madness of creating …" (ytr_Ugy-LA-03…)
- "ok. You need to study video codecs and compression. Not all platforms use the sa…" (ytc_UgzerCKrD…)
- "The Ibiza Final Boss is just Steven Bartlett in disguise. You know im not wrong.…" (ytc_UgylGgCKd…)
- "There needs to be a world wide agreement of all nations to stop the development …" (ytc_Ugx6bvyKw…)
- "Yudkowsky: An AI will create subgoals which are not foreseeable by the designers…" (ytc_UgyyLzF6c…)
- "Gotta love bbc these days, suggesting he left because of AI isn’t really a corre…" (ytc_UgybMPlwi…)
- "I was like it’s a robot NO it’s a human NO Ummmm………. I really don’t know 😂…" (ytc_UgyG4DASv…)
- "@TheAngryDesigner No design tool prior did the actual creating for you. They ch…" (ytr_UgxODvg91…)
Comment
All scenarios considered in this story omit the environmental constraints currently facing humanity. Runaway climate change will be accelerated by runaway AI, and will deepen inequality, poverty, and civil instability. While no one can be certain what will happen next, I believe environmental crises will end humanity. AI will merely tip us over the edge. We can't be so obedient to tech giants and politicians that hand them everything they want. I also fear AI is taking the place of "God" in many peoples' lives, as generations become further removed from religion and in-person community, further pushing us toward a dependence and defensive connection to AI. This is coming from an Atheist Scientist.
youtube · AI Governance · 2025-08-03T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwPBr0agmpPinc6u5B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyzEK6WreDQ4jkuorx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugxv1kdutQCxxjVwxKl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzkSJ0OT0mgIOMA_np4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-CWuH43zGoSKeuz94AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzFOuqypNO93zv5oRZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz_Vxzc4w5Uxdm7RYR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz-_XGrkAsblYhdqLl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxlaHZOjWe8tu7l4HB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzhl_an1MfvXhFXxKZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"approval"}
]
```
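A batch response in this shape can be parsed and sanity-checked before the rows are stored. The sketch below is illustrative, not the pipeline's actual code: the allowed value sets are inferred from the rows shown on this page and are assumptions, since the real codebook may define additional categories.

```python
import json

# Allowed values per dimension -- inferred from the sample rows above
# (assumption: the actual codebook may include more categories).
DIMENSIONS = {
    "responsibility": {"ai_itself", "company", "developer", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse the raw model output and keep only rows whose
    values all fall inside the expected value sets."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in allowed for dim, allowed in DIMENSIONS.items())
    ]

# Usage: one well-formed row passes the filter.
raw = ('[{"id":"ytc_example","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(parse_codings(raw)))  # 1
```

Rows with out-of-vocabulary values are dropped rather than repaired, so a drift in the model's output format surfaces as missing rows instead of silently corrupted codes.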