Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- "When hon say about his goal then sophia cut his talking and say that hes older v…" (ytc_UgyP6qkN3…)
- "I do believe it's sad, yet I won't be soft about it. To the person who repeated …" (ytc_UgwLx-6lo…)
- "He says AI is dangerous but takes the middle lane when asked which of his friend…" (ytc_UgwUuqk_P…)
- "I don't care. if I like ai art more ill buy that. if I can get it for free I'll …" (ytc_UgyKNg2WE…)
- "Don't become schizophrenic and stay conscient that we are the creator of AI. It …" (ytc_UgyhiPEUS…)
- "What goes up must come down, and when this whole AI bubble comes crashing down i…" (rdc_ohzyct4)
- "That's a great question! Sophia's design is focused more on her AI capabilities …" (ytr_Ugz2Les2I…)
- "The fact that he hesitated to push the button on all AI means he is not so sure …" (ytc_UgwlyyhXS…)
Comment
> I have a simple solution. Why wouldn't people just bypass the ethical concerns with human genetics modification, and plus embed AI into the human brain. That sure will push the next gen of humans. Then, they can exist along side with their super intelligent digital part.
>
> AI was developed with ambiguous concerns of dangers. That's why now that it serves the upperclassmen, the continued development cannot be stopped. And for the several reasons Geoffrey has pointed out. So if we cannot go back, just go forward, leveraging humans.
youtube · AI Governance · 2025-07-17T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy7WWfgekIn6l-Jws94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwpM5-tdThQktVbeoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyA_et-PO_LHq1Enol4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugxu5wP2UXoOptZZShZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzDKtZ9oWIh7lea2D54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx4Gwiqnluiqb9-gWB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy7K5msKLQxrTtsGQJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy2__FmT674icdzqNN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwOK8mLJyUublTXp714AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwLMx0GZ70olVy2gbB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
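The raw response above is a JSON array in which each record carries a comment ID plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how the "look up by comment ID" step could work, assuming the response parses as valid JSON (the variable and function names here are illustrative, not part of the tool):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = """[
  {"id":"ytc_Ugy7WWfgekIn6l-Jws94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwpM5-tdThQktVbeoV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse a batch coding response and index the records by comment ID."""
    return {record["id"]: record for record in json.loads(response_text)}

codes_by_id = index_by_id(raw_response)

# Look up the coding for one comment by its ID.
record = codes_by_id["ytc_Ugy7WWfgekIn6l-Jws94AaABAg"]
print(record["emotion"])  # resignation
```

Indexing by ID is what lets the dashboard jump from a comment (e.g. one of the random samples above) straight to its coded dimensions without rescanning the whole batch.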