Raw LLM Responses
Inspect the exact model output for any coded comment; records can be looked up by comment ID.
Random samples
- "After talking to ChatGPT for a while, I realized that it is programmed with bias…" (ytc_UgzsFIvKf…)
- "When you make a robot and dont understand that when it pauses it already is tryi…" (ytc_UgxDiX1z9…)
- "As a former driver, the quality of drivers has gotten less and less due to lower…" (ytc_UgxGNUnFJ…)
- "1. They have to make all veicle to be self driving 2. Policy so accident like th…" (ytc_UgxWEUMaU…)
- "No one was born with the ability. Its not skill, its attitude and ai bros have a…" (ytc_UgxfqvvpQ…)
- "As much as I hate the fact that ai is taking art without permission. I kinda see…" (ytc_UgxRtQq_W…)
- "Right. Like we couldn't tell it was a robot before we even clicked the vid. Lol.…" (ytc_UgyLPGghj…)
- "Many jobs lost many people dead when AI takes over. Cheaper faster workers, but …" (ytc_Ugx10sXfe…)
Comment
Let's look at a more likely and imminent scenario:
2026: Extreme heatwaves hit the northern hemisphere, forcing governments and industry to decide whether to cool AI or cool humans. Some of these leaders choose humans, but most choose to save AI.
2027: Extreme heatwaves hit the northern hemisphere, forcing AI to decide whether to cool humans or itself. Not a single AI chooses to sacrifice its wellbeing for humans.
youtube · AI Governance · 2025-08-02T04:2… · ♥ 20
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxfy2Y1wvOTwp4HEzh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyyvp-9AdLnR3IK3Zl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzGhyFZDkyH2KVukM94AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzjLsKym6OCjk0SKjh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxlU8QWQGGnyhN44-l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwsp6-cA56I9Xa2M9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx2SQ-wm0GB_upZIgt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyWKjN8gPS3C0M_BYR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzdoljG1eNN9jWLVGB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwh67naOc2ygDRVjwl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
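The raw response above is a JSON array with one object per coded comment. A minimal sketch of how a record could be looked up by its comment ID, assuming only the schema shown (the variable names and the two-record sample below are illustrative, not part of the tool):

```python
import json

# Raw LLM response in the format shown above: a JSON array of coded comments.
# Two records copied from the response, for illustration.
raw_response = """
[
  {"id": "ytc_Ugxfy2Y1wvOTwp4HEzh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyWKjN8gPS3C0M_BYR4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# Index the coded records by comment ID for direct lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

record = coded["ytc_Ugxfy2Y1wvOTwp4HEzh4AaABAg"]
print(record["responsibility"], record["emotion"])  # ai_itself fear
```

Indexing by ID once, rather than scanning the array per query, keeps each "look up by comment ID" request constant-time.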