Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
17:30 huh let me guess ai wrote the script for the ad that's why it's alien…
ytc_Ugxu6xvLC…
i really only use ai images for my twitter pfp or discord like i dont sell them …
ytc_UgzeO8xD4…
So am I safe now or is claude opus 6 gonna replace me in next 6 month ?…
ytc_UgygPCcEq…
The REAL bs is that AI can literally wipe out all upper level white collar jobs …
ytc_UgxFHqWzd…
Tesla's Ion Machine uses water and electricity and people say it creates a light…
ytc_UgwlN5eSO…
I don’t think it’s fair to say it’s AI art vs artist. I’m artist, and have no pr…
ytc_Ugzpb5HvS…
Do you use Reddit to train AI?
This place is already overrun with AI while they…
rdc_nugbzy3
Understand that it’s been “programmed” so in other words someone out there is fi…
ytc_UgzLh9ddI…
Comment
I believe AI is both powerful and dangerous.
Any company deploying AI should have a clear safeguard: a “big red stop button” to immediately shut it down if needed.
The real concern isn’t just about job losses. The bigger risk is that AI could be blamed for catastrophic events: a plane crash, an explosion, even the outbreak of war. Instead of holding nations accountable, people may point the finger at AI itself. That could lead us not only into an unemployment crisis, but into conflicts driven by the misuse, or the convenient scapegoating, of AI.
And here in Australia I’m not worried about AI as much. I’ll worry about it in maybe 2050. Why? Because our Internet in Australia is so crap their systems won’t be able to keep up with the AI.
Just saying
youtube
AI Governance
2025-09-09T20:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxlyTkKyg3uJTaW-aZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxg0sAGSJaaWVoY9U94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"confusion"},
{"id":"ytc_UgwNcZPUATI71nH-fvt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwdn7kjyT1FrPXy0hF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzq_Ps_l0g_qed0D714AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyostPnCqWoXHNIEHh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzw4m-mASpeZX60MUB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzVA8fUxPQVDUrAOHR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy1x5g4VAkr-t2zNNB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzjiLZpGDy1R31QJ8Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
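The raw response above is a JSON array of coded records, one per comment, which makes the "Look up by comment ID" step straightforward: parse the array and key it by `id`. A minimal sketch (the inline `raw` string just reuses two records from the response above; how the tool actually stores responses is not shown here):

```python
import json

# Raw LLM response: a JSON array of coded records, one per comment.
# These two records are copied from the response shown above.
raw = """
[
 {"id":"ytc_Ugwdn7kjyT1FrPXy0hF4AaABAg","responsibility":"company",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgzjiLZpGDy1R31QJ8Z4AaABAg","responsibility":"distributed",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Build a lookup table keyed by comment ID, so any coded comment
# can be inspected directly by its ID.
by_id = {record["id"]: record for record in json.loads(raw)}

coded = by_id["ytc_Ugwdn7kjyT1FrPXy0hF4AaABAg"]
print(coded["policy"], coded["emotion"])  # regulate fear
```

The dimension values printed here match the "Coding Result" table above (Policy: regulate, Emotion: fear), since that table is rendered from the same record.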