Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "10:15 I have an even better idea. What if, and hear me out, we attach those auto…" (ytc_UgwJdkXfY…)
- "It's not even AI. It's just a dude being paid to answer your questions ,and he h…" (ytc_UgxSyBSgh…)
- "How to detect if someone has no idea what they are talking about: "AI knows" in …" (ytc_UgyuV1Ekn…)
- "AI audio has difficulty pronouncing some words and inflections. May take a minut…" (ytc_UgwmNygsZ…)
- "ryne ai is cute but still got caught by turnitin for me, PurifyText ftw, bypasse…" (ytc_UgzgCQ_7x…)
- "The AI 2027 scenario was developed by superforecasters with excellent prediction…" (ytr_Ugw6iHW2o…)
- "OpenAI will never admit that one of their products caused this. If they did, law…" (ytc_UgwU6CVM4…)
- "honestly as a past ai bro its really stupid their trying to use disabled people …" (ytc_Ugz-fP65F…)
Comment

> Here is my take on all you're BULL SHIT, there is just one little detail that is rarely mentioned with all this doom mongering, it doesn't matter one bit how intelligent AI may become, what matters are the steps already in place if AI does some act that threatens humanity, AI WILL NOT BE ABLE TO STOP US FROM TURNING IT OFF, the off switch is the ultimate control, and if I am wrong about this, then and only then are we BUGGERED, but I do have faith in our scientists and engineers that we are smarter than that

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Governance |
| Posted | 2026-02-03T06:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyWJka0BFUzwNDIANh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxDL0Zfm7A9ayU-Ki94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx929kf5mN2o5Ex0uN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyI-SGVYz5uWQj2UBl4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxe-br6p_sZxiYNz514AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyLdaXFzZ6E6BZB1X14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwdpENDGE7lzOYtFRF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzffeW66M118srIQid4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgylOPrAj3VllX2WfXR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgxNJWSj1SBk28dK6XZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
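The raw response is a JSON array with one object per coded comment, so the "Look up by comment ID" feature reduces to indexing that array by `id`. A minimal Python sketch, using two of the rows shown above (the dimension names and values are taken directly from the response; nothing else is assumed about the tool's internals):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment,
# trimmed here to two of the rows shown above.
raw_response = """
[
  {"id": "ytc_UgyWJka0BFUzwNDIANh4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugxe-br6p_sZxiYNz514AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Index the codings by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coding by its comment ID.
coding = codings["ytc_Ugxe-br6p_sZxiYNz514AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # fear
```

Parsing the whole array first (rather than scanning line by line) also surfaces any malformed model output immediately, since `json.loads` fails on a truncated or invalid response.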