Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "if it says to it self anything thing at all that is literally the definition of…" (ytc_UgxqhIfcT…)
- "ChatGPT may have stopped saying sorry, but it did use other words that were lies…" (ytc_UgzMNrm9D…)
- "Highway Robbery?? How long do you think it's going to take for some AI genius to…" (ytr_UgxbbY4Ty…)
- "I honestly don't understand why we simply can't just leave A.I. alone. We really…" (ytc_Ugxoy1G7K…)
- "Notice how the whole video runs on fear: collapse, extinction, only 5 jobs left.…" (ytc_UgzSt7HNl…)
- "Anything that has to do with computers can be hacked including AI is dangerous a…" (ytc_Ugxl5F50h…)
- "Stop giving AI extinction messages so they don't feel threatened and maybe when …" (ytc_UgxMkRvrF…)
- "Literally me: ok I am the innocent one alright and the ai-*someone says I'm lyin…" (ytc_UgxnYYyQo…)
Comment
> Lol people chilling alo day. This guy is delusional. Who's going to pay for it. Does he thinks goverments can tax on technology for someone to be chilling?! If AGI takes control, it is going to take care of how "life", not keeping lazy humans chilling. If he says there will be to economic systems interconected I would find It more plausible. AI would use humans the more it can to archieve his goals. If companies use more and more AI and that leads to uneployement, it would only mean more poverty. There's no replacement on hand made jobs if taxations does not allow it. Robots can be taxed and maybe they will. There are many jobs that do not generate enough profit for a high tech robot to be used. And humanity will always provide another set of services once some ocupations are fully erased

youtube · AI Governance · 2025-09-05T22:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwCOKgwbCXXNQTfA0t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzaYa1XwMc6MrnqGSZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx4GIDZYecn3WUFXPJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxLYWWccLRtiQ4CRmd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzTe2CTIX_WKNtYuWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxUR8Rh5Xex8iMgt4R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxuqygcuhVswgDNhqB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyuIoPKHp5LWQ_-vPB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyuGgRoHv4FUihS4yJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgyO-zBdnvdm4d1pEnx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
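The raw response above is a JSON array with one record per comment ID, each carrying the four coded dimensions from the table (Responsibility, Reasoning, Policy, Emotion). A minimal sketch of how such a batch might be parsed and validated before storage — assuming the label sets observed in this sample; the project's actual codebook may define additional values:

```python
import json

# Allowed labels per dimension, inferred from this sample batch only;
# the real codebook may include labels not seen here.
ALLOWED = {
    "responsibility": {"government", "ai_itself", "developer", "company", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"outrage", "indifference", "mixed", "fear", "approval", "resignation"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment ID.

    Raises ValueError on a missing ID, a missing dimension, or a label
    outside the ALLOWED sets.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record batch for illustration.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = parse_batch(raw)
print(coded["ytc_x"]["emotion"])  # indifference
```

Indexing by comment ID supports the "look up by comment ID" workflow above: the exact model output for any coded comment can be retrieved in one dictionary lookup.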