Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect
- “The government is 50 or more years ahead of us. AI is already trying to take ove…” (`ytc_UgxEKWycN…`)
- “I mean i fully support this but it doesn’t really stop companies from just takin…” (`ytc_Ugx2qDaDe…`)
- “Not how that works. You clearly didn't even watch the video did you. I wish AI c…” (`ytr_UgxXaovEc…`)
- “I cant wait until we are living in a complete AI ecosystem and everyone realizes…” (`ytc_UgzVFHI6u…`)
- “yeah but does it have life? Is it made with hardwork? Sure AI is easier and bene…” (`ytr_UgwMSqJTN…`)
- “Ai art should be deleted.. that shit is not what the world of Artist's need.…” (`ytc_UgxBoo3Mr…`)
- “If we were to make a robot that would be capable of conscious thinking like a hu…” (`ytc_Ugz-o_rLh…`)
- “Ai sees art as a duty, a human sees art as a passion. Even if a human drawing 's…” (`ytc_UgyXC3iot…`)
Comment
I think what seems to throw some people is that this isn't about AI "turning on us" or developing some malicious will. It's just AI following its own momentum. It's not about us. I think some have difficulty imagining a world in which we are not the centre of things but peripheral - the main things happening being devised, generated and carried out by things that aren't us. Which is an entirely new scenario for humans. It doesn't need to dislike us or even disregard us entirely for this to put us in an extremely vulnerable position at best.
youtube · AI Governance · 2025-10-15T11:5… · ♥ 652
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwgVNJgSLMJDLUBU8R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgynkSjGpEQy8-Kc8zh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz-4krbJQUYK77HCYJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxVP508yV27MiLU79V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugya6dVxuaLTosbbcnN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyx5P5xveaUApfBTcp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxSFDbALHW82C1XitF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwj_ej0JjPm54ddYm14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugxcc7er26T-uw7x5YJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxYuB6uVGN6VsEjMuF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
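The lookup-by-ID flow the page describes can be sketched as follows: parse the model's JSON array and index the rows by comment ID. This is a minimal illustration, not the tool's actual implementation; the variable names are hypothetical, and the two sample rows are copied from the response shown above.

```python
import json

# Hypothetical raw LLM response: a JSON array of per-comment codings,
# shaped like the response above (two of its rows, verbatim).
raw_response = """
[
  {"id": "ytc_UgwgVNJgSLMJDLUBU8R4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxVP508yV27MiLU79V4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
"""

# Index the codings by comment ID so any coded comment can be
# retrieved in O(1), as the "Look up by comment ID" box does.
codings = {row["id"]: row for row in json.loads(raw_response)}

coding = codings["ytc_UgxVP508yV27MiLU79V4AaABAg"]
print(coding["emotion"])  # resignation
```

Note that the lookup requires the full comment ID; the truncated IDs shown on the sample cards are display shorthand only.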