Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- @ "concious AI"? "He/him"? do you understand the technology you are talking abou… (ytr_UgwUhTR3w…)
- It’s happening now sir, two robots started talking, in the own language, in a la… (ytc_UgwStWdzo…)
- The thing is… I go back, look at what happened, and then build something togethe… (ytc_UgxWtrGTS…)
- I hate people who use AI to do things for them. Like, yeah! AI can be helpful in… (ytc_Ugw_fBiAS…)
- the is-ought gap is not that great of a barrier to a secular (objective) ethic. … (rdc_di32txv)
- Why should people care? If AI art is bad, artists shouldn't worry. If it's good,… (ytr_UgwqhLBol…)
- After perusing the comments here, maybe all our traumas deserve a mythic mediocr… (ytc_UgxNnxJVM…)
- 1:12:00 The story of the Kenyan workers is absolutely DIABOLICAL. Shame on Sam A… (ytc_UgxkD-13z…)
Comment
So what? We re all humans, all insignificant in the cosmos and im supposed to care that these companies use my data for training their model lol? I can assure you, what I said to chatgpt is almost 100% already on the internet. To believe you re unique to the point that whatever you share with gpt is yours and only yours when billions of humans have preceded us and went through the same if not worse as us is crazy lmao. Unless you re coming up with some genius idea to cure cancer, create some brain integrated vr or something else crazy, its not worth worrying about. The benefits outweigh the negatives. Matter of fact, im happy they re using user data to train their models, thats good. The tool gets better and better .
Source: youtube | AI Moral Status | 2025-08-15T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwxrTzoiaWuQL8NzXl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwOMVqWe6G_dMZXHL94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw_dpzI-mbykYAlsD94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7LtcudTObch4Yu_V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkHHosMpBLErEfN2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzyiwstRSUIRd7e6kV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugw-HpkQJ1vjICK-vuV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyr3RU63o7QeIfaAMZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgywUhh4ZCbYo1UOEpl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx4J9VoiMJSEROW7YZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
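A response like the one above can be turned into a lookup table keyed by comment ID. The sketch below is a minimal parser with validation; the allowed values per dimension are inferred only from the codings shown here, and the real codebook may define more categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown above;
# the actual codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "user"},
    "reasoning": {"consequentialist", "virtue", "deontological"},
    "policy": {"none", "industry_self", "liability"},
    "emotion": {"indifference", "mixed", "approval", "outrage"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coding records) into
    {comment_id: coding}, rejecting records with a missing id, a missing
    dimension, or an out-of-vocabulary value."""
    codings = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        # Keep only the four coded dimensions, dropping any extra keys.
        codings[cid] = {dim: rec[dim] for dim in ALLOWED}
    return codings

raw = (
    '[{"id":"ytc_UgwOMVqWe6G_dMZXHL94AaABAg","responsibility":"none",'
    '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)
print(parse_codings(raw)["ytc_UgwOMVqWe6G_dMZXHL94AaABAg"]["emotion"])  # indifference
```

Failing loudly on out-of-vocabulary values is deliberate: it surfaces model drift (new or misspelled labels) at parse time rather than silently contaminating the coded dataset.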