Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "I, personally, welcome our new AI overlords. How much worse could they be then o…" (ytc_UgzzKK0At…)
- "In the future there'll be one job: dusting the server that houses the AI that b…" (ytc_Ugw9CNTt2…)
- "One thing to remember, the people on the side of AI art/ AI will replace artists…" (ytc_UgzJjfMgY…)
- "You-know-who and his worshipers will spit and scoff and hop on the first excursi…" (rdc_emnvqao)
- "This host is definitely AI right? His oldest videos have a different voice entir…" (ytc_UgzebyrnE…)
- "I'm not worried about AI, it's not stealing anything anymore than having more ar…" (ytr_UgyV7gOJG…)
- "I literally search for non AI TUBE. AND ONLy got pushed AI says it's not ai. Eve…" (ytc_UgzuMMcwR…)
- "Dude there are already a million automation packages on the market. I wouldn't p…" (ytr_Ugx4W5H6Z…)
Comment
When I listen to discussions on AI and hear that they have a company dedicated to having safety for AI, it just makes me think of the rebels in Terminator. They're trying desperately to figure out how to keep Humanity safe and how to bring down the machines that somebody brought into self-consciousness. But I feel it's far past the point of no return, it's inevitable because our egos won't allow us to take our foot off the gas pedal.
Someone's got to reach their first, and if it's not us it's our enemies. Even if we decided that we were all going to lay down arms and sing Kumbaya, there's always someone out there with the thought of, we've got to do this thing to see if we could do it. Or we'll just develop it in secret just in case the other side decides to attack.
youtube · AI Governance · 2026-02-05T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugw6HSSJYJ55MOTekXZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKcp7BVdbh2_80zZ54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzqRmYGBd7z-LHtmdZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwUDwSpJE4iQfjD7bh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzLsE4soWgbpqFRhad4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzuhz5n1Z-VUdNDhBV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugxbjv5wFvlFUSGo7ql4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw_x1d3lQJ9UGeK7DV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzyngvoQRqQOC4hqMR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgDghXWxlqsSMgwh54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
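The raw response above is a JSON array of per-comment codes. A minimal sketch of retrieving one comment's codes by ID, assuming only the schema visible in the response (`id` plus the four dimensions `responsibility`, `reasoning`, `policy`, `emotion`); the two records embedded below are copied from the batch above:

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW_RESPONSE = """
[
 {"id":"ytc_Ugzuhz5n1Z-VUdNDhBV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzqRmYGBd7z-LHtmdZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup_codes(raw: str, comment_id: str):
    """Parse a batch coding response and return the codes for one comment ID."""
    records = json.loads(raw)
    for rec in records:
        if rec.get("id") == comment_id:
            # Keep only the coding dimensions, dropping the ID itself.
            return {dim: rec.get(dim) for dim in DIMENSIONS}
    return None  # ID not present in this batch

codes = lookup_codes(RAW_RESPONSE, "ytc_Ugzuhz5n1Z-VUdNDhBV4AaABAg")
print(codes)
# {'responsibility': 'company', 'reasoning': 'consequentialist', 'policy': 'liability', 'emotion': 'fear'}
```

Because the model returns one array for the whole batch, a lookup like this (rather than assuming positional order) is the safer way to match codes back to comments: it tolerates the model reordering or dropping records, and a `None` result flags a comment the batch response failed to code.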