Raw LLM Responses

Inspect the exact model output for any coded comment: look one up by comment ID, or click one of the random samples below.
- "I was being some alien kinda shit and the ai asked me how my species reproduced …" (`ytc_UgyK5e1wN…`)
- "That reminds me of that that time I was looking at my wallpaper which was the p…" (`ytc_UgwOVauFv…`)
- "@103days It isn't the imitation that is the deciding factor. It's the intention…" (`ytr_UgyOPf4vo…`)
- "Correct, *current* AIs are not smart enough to stop us from unplugging them. T…" (`rdc_l5txiut`)
- "There was this one show called person of interest. In it a guy builds an ai call…" (`ytc_UgxnPCumy…`)
- "AI technology is truly transforming the field of robotics! The advancements are …" (`ytc_Ugyq2Kjjg…`)
- "apperently theres so much ai art now that ai is pulling from ai to make images a…" (`ytc_Ugwwhmm7b…`)
- "What would you think if Putin or Kim Um were the ones setting the ethnic policy …" (`ytc_UgyIf85Qu…`)
Comment

> After watching some of the key points of this podcast, I think we should realize that AI should be treated now as tools rather than advanced being unless we deliberately want to replace ourselves.
>
> In order to make it only tools, it should never have certain advantage that a human has or powerless to be compared to what human's biological form. With these only rules, we can have something that serve humans well or if it turns out to be a nightmare at least we can respect it and it can respect us as well.
>
> This is like playing to become God but the only thing that we don't know is what was that God thinking in particular when creating humans in the first place and not the same or can even be powerful than God itself.

youtube · AI Governance · 2025-09-06T00:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
  {"id":"ytc_UgzzuJuv3QNNH1MWf_F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwiQ7e1tSjMPbKEefR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzkFCxwzziRUUGHBUN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy1QDnEn1XOqIXLJ_F4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxS5FG2g1JLkB-dvDt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxUEUhZBPbCo52wZp54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzY63qnR0731CsoX094AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx8jtL87iUuV0pksVF4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_Ugz1miqf1gUTos0VXQt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzK-1t9RM1ArtcYk2F4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
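A downstream consumer has to parse this raw response and discard malformed rows before storing codes. Below is a minimal Python sketch of that step; the allowed values per dimension are inferred only from the codes visible in this response and in the table above, so the real codebook may define more categories, and `parse_response` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the codes observed above;
# the actual codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def parse_response(raw: str) -> list[dict]:
    """Parse one raw LLM response; keep only rows with an id and known values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue  # a row without a comment ID cannot be joined back
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid
```

A row like `{"id":"ytc_x","responsibility":"developer","reasoning":"contractualist","policy":"liability","emotion":"indifference"}` passes, while a row with an unknown value (or a missing `id`) is silently dropped; a stricter pipeline might instead log such rows for re-coding.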