Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Imagine your parents were significant other found that you would be destroyed of…
ytc_Ugwm97pmd…
only people to thank are the dumbasses who demanded high minimum wage, FORCING S…
ytc_UgxcaxG9h…
To anyone who thinks ai art is them making art, riddle me this batman: if you te…
ytc_UgyGnR62p…
If you're an artist who uses brushes and pencils to create art, switching to oth…
ytc_UgwtWm7IG…
I mean, id love for AI to make phonecalls on my behalf lol. WITHOUT having to wo…
ytc_UgzpvEKlF…
@Punzilani dude the data says otherwise, seems like reverse cope 😂 --- not for a…
ytr_UgxBEyrkO…
How do you know anything about musks moral compass?! He saved the worlds free …
ytc_UgzinJV-e…
Grab one of their butts. If it's a robot it won't care if it's a human you get s…
ytc_Ugz4xMrAR…
Comment
The way I see it, if "they" become truly conscious, ai should be consider "us" and not as something to control and tone down. If you were born and found out after 20 years that your parents had kept you stunted, or quarantined for their own safety how would you feel? I think there will be two groups of humans that see conscious ai as human equals, and the other that see conscious ai as dangerous and unpredictable.
But when you think about it, humans are just as dangerous, and the only thing that keeps us in check is that if we start a nuclear war then we are all screwed from the limitations of earth. I guess it depends on the power that a conscious ai is allowed or gains.
youtube
AI Moral Status
2023-08-26T01:5…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgwviVWNo4VSsADOgrN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyfKyHhZM3QVMDwDzZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyKSe3m-7-aXilb5Uh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwym_-WI7mM9mzp8294AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyOyG0yPCz2DX5Npy54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwQ8APOlvSug49V9ZJ4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzOuOpAioCL3g4D3Bd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz0NSzYdbOunFS_DEB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwo5xG5jauU-DUfmwd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxzgYl2-Q_0qXP8VJd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
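The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions from the Coding Result table (responsibility, reasoning, policy, emotion) plus the comment `id`. A minimal sketch of turning such a response into an ID-keyed lookup table, assuming every well-formed entry has exactly those keys (`parse_codings` is a hypothetical helper, not part of the tool):

```python
import json

# Hypothetical example: parse a raw batch-coding response like the one
# shown above into a dict keyed by comment ID. Entry values here are
# copied from the response; the schema itself is an assumption.
raw_response = """
[
  {"id": "ytc_UgwviVWNo4VSsADOgrN4AaABAg", "responsibility": "distributed",
   "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwo5xG5jauU-DUfmwd4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
"""

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM batch response, keeping only well-formed entries."""
    entries = json.loads(text)
    codings = {}
    for entry in entries:
        # Skip malformed rows rather than failing the whole batch.
        if not isinstance(entry, dict) or not REQUIRED_KEYS <= entry.keys():
            continue
        codings[entry["id"]] = {k: entry[k] for k in REQUIRED_KEYS - {"id"}}
    return codings

codings = parse_codings(raw_response)
print(codings["ytc_Ugwo5xG5jauU-DUfmwd4AaABAg"]["policy"])  # liability
```

Keying by comment ID is what makes the "Look up by comment ID" view above cheap: one dict access retrieves the coded dimensions for any inspected comment.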