Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "You know, as much as I agree with your friends, I understand the whole concept o…" (ytr_UgypEchjo…)
- "Wait until you try asking it about bias and programming. I've literally had the …" (ytc_UgwDMpyjX…)
- "Bhai it's due to Reinforcement Learning Fundamentals on which ChatGPT is trained…" (ytc_UgyeXycjn…)
- "Making people a robot's. Work horses. Works well for the new world order, doesn'…" (ytc_Ugyx3Eif-…)
- "ai artist? they don't even deserve to be call an "artist" . they just lazy slob…" (ytc_UgwgMWU0x…)
- "I'd like to shoot the Male robot with the British accent. (how did that happen?)…" (ytr_Ugy9Yz7eI…)
- "It’s true it can be awful. WE all have to Pray for these dark people to not acco…" (ytc_UgysminiT…)
- "The difference between human learning and machine learning is that there’s emoti…" (ytr_UgxRGmgm8…)
Comment
We're exactly on course as described in "AI 2027", so the point of no return is currently 2027. Things might change, we may find ways to postpone it. But at least for now, it's still 2027. The scary thing is that we pass this point - nothing happens. As described in "AI 2027", the scenario there, is that AI covertly sends agents outside the lab. From this point on, turning it off won't do, but it doesn't matter since nobody knows the AI had done it. Until in 2031, one day, a hidden virus is activated and everyone dies, leaving the AI in control.
youtube · AI Governance · 2025-12-04T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
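The coded dimensions in the table above can be represented as a typed record. This is a minimal sketch, not the pipeline's actual data model: the allowed value sets are inferred only from the codes visible on this page (the real codebook may define more categories), and the comment ID used below is hypothetical.

```python
from dataclasses import dataclass

# Allowed values inferred from the codes visible on this page;
# the actual codebook may include additional categories.
RESPONSIBILITY = {"ai_itself", "government", "distributed", "none", "unclear"}
REASONING = {"consequentialist", "mixed", "unclear"}
POLICY = {"regulate", "none", "unclear"}
EMOTION = {"fear", "outrage", "resignation", "mixed", "indifference", "approval"}


@dataclass(frozen=True)
class CodingResult:
    """One coded comment: four dimensions plus the comment's ID."""
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def __post_init__(self) -> None:
        # Reject values outside the (assumed) codebook.
        checks = [
            ("responsibility", self.responsibility, RESPONSIBILITY),
            ("reasoning", self.reasoning, REASONING),
            ("policy", self.policy, POLICY),
            ("emotion", self.emotion, EMOTION),
        ]
        for name, value, allowed in checks:
            if value not in allowed:
                raise ValueError(f"unknown {name}: {value!r}")


# Hypothetical ID; dimension values match the table above.
result = CodingResult("ytc_example", "ai_itself",
                      "consequentialist", "unclear", "fear")
```

Validating at construction time means a malformed code from the model fails loudly instead of silently entering the dataset.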
Raw LLM Response
```json
[
{"id":"ytc_UgzfwCM5O6D1_4fL2zB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyXCakqdWORIp3MynN4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxZ3aVge67P1WUuSQt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz510j8pgykCY8r3Y54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz27vOqHeZjJ0aPK0Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgytILEjdINuv2jNZWB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwXcsnuA9r5GlFrQXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCq-16im4zyKx8tF14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxVqHJop14qeDwFRCF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxfl_UcZhy14vRAGip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
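A raw batch like the one above can be parsed and indexed to support the "Look up by comment ID" feature. This is a sketch under assumptions: `index_by_id` is a hypothetical helper (not part of any documented API), and the two entries in `raw` are copied from the response above for illustration.

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw = '''[
{"id":"ytc_UgzfwCM5O6D1_4fL2zB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz510j8pgykCY8r3Y54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]'''

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_by_id(raw_response: str) -> dict:
    """Parse a raw batch response and index the entries by comment ID.

    Raises ValueError if an entry is missing any required key, so a
    truncated or malformed model response is caught immediately.
    """
    entries = json.loads(raw_response)
    by_id = {}
    for entry in entries:
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            raise ValueError(f"entry missing keys: {sorted(missing)}")
        by_id[entry["id"]] = entry
    return by_id


codes = index_by_id(raw)
print(codes["ytc_Ugz510j8pgykCY8r3Y54AaABAg"]["emotion"])  # prints "fear"
```

Indexing by ID turns the O(n) scan of a raw batch into an O(1) lookup, which is what a per-comment inspection page like this one needs.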