# Raw LLM Responses

Inspect the exact model output for any coded comment: look up a coding by comment ID, or pick one of the random samples below.
- "I cant with pll😭 ai is so much different from both of these hwo do ppl think ai …" (ytc_UgwFBhOHl…)
- "Maybe COMPASSION is what needs to be built into AI. We are its creator after all…" (ytc_UgytxIZxA…)
- "I work in EMS/911. I'm pretty insulated from AI, but I believe the main thing th…" (ytc_Ugz66kNq0…)
- "Honestly, every other post I see scrolling through Reddit is someone asking if t…" (rdc_jaaioro)
- "I've never seen this topic from an artist's perspective (apart from Miazaki), bu…" (ytc_UgyJAJi5C…)
- "We appreciate your interest in the video. In the dialogue, the focus was on disc…" (ytr_UgxuUbd4Q…)
- "It's pretty inaccurate even for writing code. I tried using it the other day for…" (rdc_jprcp40)
- "Ameca a more human than the witches that rule the region Of Waterloo. Instead of…" (ytc_UgxeH6nwl…)
## Comment

> Why are there all those movies with Robots that have built in laws to care for humans and not hurt or kill humans, but when we make the real thing, they forgot to add those laws, when they started? The biggest issue with A.I., it was made to learn like a Human brain. The key is, A.I. needs to be created by an Intelligent species that have already evolved passed their seven deadly sins, sort of speak. Humans currently are way to emotionally driven.

Source: youtube · Topic: AI Governance · Posted: 2023-07-13T01:1…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
{"id":"ytc_UgyXN2I48HlMCyhr4UN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxnfMDYqQ7yZzyAsFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxqe9oAtYteyu-fSat4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzuUSm0bkLsJ68a5Ld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugzuvts9qIwNpCfYAvp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxHyw50jZVIh7X-KSx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy47Ways3TP-W7Pi614AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxPav_NRS8ShtJJXn94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz2W-xSP0Kk-3EP7vl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz91vSpnhM_LfhNYvx4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```