Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Reminds me of a novel I read called Turing Evolved by David Kitson. It was all about the extreme safeguards necessary when developing AI with the potential to do great harm. How confident are you that if you gave AI an armored chassis and a gun, that it wouldn't just start mowing down everyone indiscriminately? If you're interested in this type of thing, it was certainly a good read.
Source: youtube
Topic: AI Governance
Posted: 2023-04-01T21:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyqABI9b_-DTzDeFE94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwdc9Y4q7TUE2q_gGl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzU4FQvoKP8CqH7ZbB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyThcSc0G1OYnDhwCd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzx-gW6wKsmNgCiX7J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwjlPPFhVY4nRMk80R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxgej8pmx4O42pkTfx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx63Ew_axkjTeIvN8B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyhFTzwp1qXorZX1SJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx8aHsuguSmlW5MCH54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
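The raw response above is a JSON array, one object per coded comment, each keyed by its comment ID. A minimal sketch of how such a batch could be parsed and indexed for lookup by comment ID (the variable names here are illustrative, not part of the tool itself):

```python
import json

# Two rows copied from the raw model output above; a real batch
# would contain the full array returned by the model.
raw_response = """[
  {"id":"ytc_UgyThcSc0G1OYnDhwCd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx8aHsuguSmlW5MCH54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]"""

# Index the batch by comment ID so any single coded row can be
# retrieved directly, mirroring the page's lookup-by-ID feature.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

row = codes_by_id["ytc_UgyThcSc0G1OYnDhwCd4AaABAg"]
print(row["responsibility"], row["emotion"])  # developer fear
```

The dict-by-ID indexing assumes IDs are unique within a batch; if the model ever repeats an ID, later rows silently overwrite earlier ones, so a production loader might want to check for duplicates.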