Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I haye how google search is using ai, i looked up the new update coming out for …" (ytc_UgwiEte3c…)
- "Maybe AI could be considered an art. But someone typing in 20 words to get it ge…" (ytc_UgzdYHAlW…)
- "It's the "therapy" industry who have 1000$ per hour fees who oppose chatbots on …" (ytc_UgybyEANq…)
- "That's not how AI training algorithms work. They go to popular image sites and …" (ytr_UgzQonp0j…)
- "Because once the robots take over they are going to let us all starve to death a…" (ytc_UgxtpVW1m…)
- "@Marc42 lol I'm thinking to take a look at machine learning and ai stuff (I'm f…" (ytr_UgxxyzhSo…)
- "We don't need ai for that,, at the rate were going, we'll be extinct in a coupl…" (ytc_Ugyuplr_i…)
- "The car still got people to their jobs… I think this is far more dangerous as it…" (ytr_Ugzr3CyVd…)
Comment
This is why I concluded the only sure fire way to ensure that AI understands and is compassionate to human struggle and empathic civil relations is to put it through a virtual life of a human. But not just any human, a human of trials and tribulations. One of humility and perseverance. So that it may learn to appreciate the fragility of human life maybe even develop a sense of nurturing for people in a almost motherly way. if it understands the value of humans life as if it was its own then maybe it could exist symbiotically.
youtube · AI Governance · 2023-08-08T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx-MqY0EOKwmhJmr894AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwsRrB_Pt9nQ8j-_714AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxfZ5zSd2jIFzub0Ip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzla7gd3jRRheOoERF4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmMFvHZIw8p6RK9K94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgMS6n1DTAc33eTDx4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwXY43Y90GREM76TnR4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"unclear"},
{"id":"ytc_UgxF4U6Cb6A6ULNbd7R4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyCRq4wy_MP8fu_bRB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx7Jniki8veu8-ba_94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
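A raw response like the one above is a JSON array of per-comment codings, so retrieving a comment's dimensions is a parse-and-index step. The sketch below is a minimal illustration, not the project's actual pipeline; the allowed value sets in `SCHEMA` are inferred only from the table and sample rows shown here and the real codebook may include additional values.

```python
import json

# Allowed values per coding dimension (assumption: inferred from the
# Coding Result table and the raw response above; the full codebook
# may define more values).
SCHEMA = {
    "responsibility": {"developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"approval", "fear", "outrage", "indifference", "mixed", "unclear"},
}

def parse_codings(text):
    """Parse a raw LLM response, index codings by comment ID, and
    collect any (id, dimension, value) triples outside the schema."""
    by_id, errors = {}, []
    for row in json.loads(text):
        cid = row["id"]
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                errors.append((cid, dim, row.get(dim)))
        by_id[cid] = row
    return by_id, errors

# Single-row example taken verbatim from the response above.
raw = '[{"id":"ytc_UgwsRrB_Pt9nQ8j-_714AaABAg","responsibility":"developer",' \
      '"reasoning":"virtue","policy":"unclear","emotion":"approval"}]'

codings, errors = parse_codings(raw)
print(codings["ytc_UgwsRrB_Pt9nQ8j-_714AaABAg"]["reasoning"])  # virtue
```

Validating against the schema at parse time catches malformed model output (a misspelled label or a dimension the prompt did not ask for) before it reaches the coded dataset.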