Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- `ytc_UgwfQcRny…`: Very well spoken. Do we want something that has an impaired/or no empathy to ou…
- `ytr_Ugzg4nOwc…`: But isn't the point of an AI to solve problems FOR you? Kinda redundant then, d…
- `ytr_UgyE3Uevc…`: It will actually become impossible. Say you prompt the AI to write in your style…
- `ytc_UgyEnkXzO…`: Driverless big machine in normal traffic is crazy. There are problems with selfd…
- `ytc_Ugx9V2yKH…`: AI will know what people can do! If we can apply the Golden Rule to the way we t…
- `ytc_UgzY4CZg7…`: It's not even worth a time trying to argue with so called AI Artist. Bcuz in the…
- `ytc_Ugz-euLHb…`: I'm a Teacher's assistant at university, basically I'm a laboratory instructor. …
- `ytc_UgwJe8nIZ…`: The only thing that will be able to effectively CONTROL AI will be AI -ever more…
Comment
Such a shame. They could have had individual liberty and a wildly successful middle class if they just fell in love with God and let Christ be the example. A loving God that deals with you with dignity in private and forgives you for making mistakes, and offers redemption and forgiveness. A path to a better person through rebirth and gratitude. Only the most egregious crimes are dealt with harshly and most infractions only exacting a fine. But instead they chose to let the iron fist of an automated government become their God. 1.3 billion people can't figure out a path to liberty and self government. That's a sad statement about humans. It's frightening that it's possible anywhere.
youtube
2024-06-12T18:1…
♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxQjBBi1w7C3tVKA3J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyuYee48Geq-BkOAgB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyGXDr9fhUpda1KQW14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz1ZM99IecJSniPjxV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyBSeSCJRb4PDTyVDZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwvxKVKbJ2RaQyTJ3h4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxh6dznZ0Xd34t0rwZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxqUHXUCBt9ZBz6uER4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_Ugw-QfSv9OoL7ekXhQd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxLr55yQKqcElulWJ94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
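The lookup-by-comment-ID view above can be reproduced by parsing the raw model output and indexing each record by its `id` field. A minimal sketch, assuming the response is a JSON array of per-comment codes as shown (the `index_codes` helper and the inline sample are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (abbreviated here to a single record from the batch above).
raw_response = """
[
  {"id": "ytc_Ugxh6dznZ0Xd34t0rwZ4AaABAg", "responsibility": "none",
   "reasoning": "virtue", "policy": "none", "emotion": "mixed"}
]
"""

def index_codes(raw: str) -> dict:
    """Parse the model output and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_codes(raw_response)
# Look up the coding result for one comment by its ID.
result = codes["ytc_Ugxh6dznZ0Xd34t0rwZ4AaABAg"]
print(result["reasoning"], result["emotion"])  # virtue mixed
```

In practice the raw response may contain extra text around the JSON array, so a production version would need to extract the array (e.g. between the first `[` and the last `]`) before calling `json.loads`.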