Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- A true act of courage — standing firm before Microsoft executives, sacrificing w… (ytc_Ugzjh9GIk…)
- Medical diagnosis was what a lot of these algorithms were initially ment to do. … (ytc_UgyxqZylM…)
- The working class has nothing to loose, but its chains... we need to unite befor… (ytc_UgyV36FDc…)
- I agree, if you want your autonomous weapons to attack indiscriminately. Howe… (ytr_Ugy3QOWo5…)
- If ai prompters separated themselves no one has an issue Its only the people who… (ytc_UgwfOlPO0…)
- It’s really frustrating because I am pretty sure I have been exposed to the viru… (rdc_fjz83td)
- Sorry but i don't remember who said that but it's quote: "I don't want to ai do … (ytc_UgxaTNjQo…)
- Dr. Saidy’s talk regarding artificial intelligence(AI) in healthcare made the ar… (ytc_UgzQJf9HJ…)
Comment
In the end Ai will decide we are useless and terminate all humanity looks like the end of time , some one is making a lot of money now but my question is ? Do they realise that they will wipe out humanity and them selfies all together !!! I think we have a chance to shut down AI OR LIMIT its capabilities or once they turn to robots it will know that people are dumb and start to kill every one moving and every animal moving cause it does not need air , water or oxygen to survive and by that time it will have power to power it self for 500 years .
youtube
AI Governance
2025-10-19T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwHMluy0kJn5blLn594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx0AYObMigx_P66EPd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzonYyRbbkQKlcSgEN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKEebplAYMBz0uMvt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyoPoJvRDedTM7F0CN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgybWovYb32e7JxmV5t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxTnsusfTkdWI48NhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzUCu4ShSb1Jcph7Nl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy_douuXaThiwuE12t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy4h4BJR_QGRr0AuiN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
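The raw response above is a JSON array with one object per comment, keyed by comment ID. A minimal sketch of how such a response can be parsed into a comment-ID lookup (as in the "Look up by comment ID" view); the variable names and the truncated sample rows reproduced here are illustrative, not the tool's actual implementation:

```python
import json

# Two rows excerpted from the raw LLM response shown above.
raw_response = """[
  {"id": "ytc_UgwHMluy0kJn5blLn594AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyoPoJvRDedTM7F0CN4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]"""

# Build a comment-ID -> codes dictionary for O(1) lookup by ID.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

entry = codes_by_id["ytc_UgyoPoJvRDedTM7F0CN4AaABAg"]
print(entry["policy"])   # regulate
print(entry["emotion"])  # approval
```

The same dictionary would back a detail view like the "Coding Result" table above, with each JSON field rendered as one dimension row.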