Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- If you can be replaced by an AI bot, then your job wasn't all that useful now wa… (ytc_UgxHosoMj…)
- Considering that the kind of "people" major corporations are (as per the "Citize… (ytc_UgzvWemwL…)
- Isn’t this how almost every “robot taking over the world” (idfk what to call it)… (ytc_UgyzDfsRV…)
- The reason they dont have an opt in system is because 99% of people wouldnt opt … (ytc_UgyKirazB…)
- How is a society supposed to function if every single job is manned by a robot.… (ytc_UgyFaw3qy…)
- Bro… he asked the A.i “the definition of a LIE” and he gives him a “Line” then a… (ytc_Ugz8IobmZ…)
- 1. A robot may not harm a human being. 2.A robot may not harm humanity, or, by i… (ytc_UgjMFF-zo…)
- How do we know that AI hasn't surpassed the meta, and this video, speaking out a… (ytc_UgwaPjSNB…)
Comment
There is no such thing as a safe path when it comes to AI. Like humans AI requires lots and lots of energy to power it. You can't power it using renewables due to the unstable nature of the climate but good old fossil fuel like coal and the other alternative nuclear. So in the future humanity will have to compete with AI for jobs but also power bills. You can't have it both ways. It is a very grim and dark future. Save the planet and humanity or AI for the select few.
youtube · AI Responsibility · 2025-07-02T06:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyTPCEJ4D_msaYnZDl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyb2DU9aVAMp8tPs9B4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxdWQF4z3o5PB8jRX14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwjsbloA1MlM5PPUmB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyEqFQjoneO3uw1T-R4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzsS81O2DH-PjwjueV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxRXpbE36Br2KsXuap4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyAKn47SyD2NJBRa8B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzzmDjMQQknoBnNCtZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugxob68L4acuP7gd8nh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
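The raw response above is a JSON array with one coded record per comment, keyed by comment ID. Before such a batch is stored, it is worth validating that every row carries an ID and an allowed label for each of the four dimensions. The sketch below shows one minimal way to do this; the permitted label sets are assumptions inferred only from the values visible in this response, not a confirmed codebook.

```python
import json

# Allowed labels per coding dimension.
# NOTE: these sets are assumptions inferred from the sample response above,
# not the project's authoritative codebook.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "user", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"resignation", "fear", "indifference", "outrage", "approval", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and reject malformed rows."""
    rows = json.loads(raw)
    if not isinstance(rows, list):
        raise ValueError("expected a JSON array of coded comments")
    for i, row in enumerate(rows):
        # Every row needs a YouTube-comment-style ID.
        if not str(row.get("id", "")).startswith("ytc_"):
            raise ValueError(f"row {i}: missing or malformed comment id")
        # Every dimension must carry one of the allowed labels.
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"row {i}: bad value {row.get(dim)!r} for {dim!r}")
    return rows

raw = ('[{"id":"ytc_UgyTPCEJ4D_msaYnZDl4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
print(len(validate_batch(raw)))  # 1
```

A row that fails validation (wrong label, missing ID) raises `ValueError` with the row index, so a single bad record can be logged and re-queued for recoding without discarding the whole batch.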