Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by ID.
Random samples (click to inspect):

- "In 10 years news will be AI has killed 7 billion people, the next minute compani…" (`ytc_Ugw2d3JgM…`)
- "I think that the most ironic thing about AI “art” is that it cannot exist withou…" (`ytc_UgxvBgtE6…`)
- "I know you're not a tiddy guy, but this is where TDD steps in. If you write a sm…" (`ytc_Ugz4csTyn…`)
- "why don’t we just ask AI how we are supposed to regulate AI for human prosperity…" (`ytc_Ugx0n8jew…`)
- "The professor knew that we’ better avoid sensitive issues like Taiwan because ob…" (`ytc_UgyJ8RslC…`)
- "Is Ai a byproduct of humans themselves- something smart enough to think for itse…" (`ytc_Ugw4jKbb4…`)
- "I'd say just use AI when the requirements are clear and code from scratch when y…" (`ytr_Ugxn5GEO5…`)
- "Bro, they can't even program an AI to /not/ get completely red pilled and based …" (`ytc_UgxLW73Ks…`)
Comment (youtube · AI Moral Status · 2024-06-01T10:4…):

> I think AI can already mimic human emotions to some extent, just companies like OpenAI do a second layer of training on the AI to act as a robot that doesn't have emotions. It would be interesting to use an AI without the second layer
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyVpeo3HcEr-kpG63R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgymOEYzz6QM2NCeMkB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwLdNeKBxYxAHRs-9Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwWchYWhQ6_EcPiFRp4AaABAg","responsibility":"company","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyhxDD6CpFSUhi5Bcd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgykqZfnrwBybCYfxZZ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwfKkCLcEtyUp0zCb94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxR2EwbXf7nLxDth0V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxXSl6ZfLhW0xPMnZ94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugycr4LjEP2HUbKSWUZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
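A raw batch like the one above can be parsed and sanity-checked before its rows are stored as coding results. The sketch below is illustrative only: the allowed values per dimension are inferred from the examples on this page, not from the project's actual codebook, which may define more categories.

```python
import json

# Allowed values per coding dimension, inferred from the examples shown
# above; the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "company", "user"},
    "reasoning": {"mixed", "unclear", "deontological", "consequentialist"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "mixed", "indifference", "fear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings.

    An entry is kept if it has an "id" and every dimension holds a
    value from the (assumed) vocabulary; malformed entries are dropped
    rather than raising, so one bad row does not lose the batch.
    """
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        if not isinstance(entry, dict) or "id" not in entry:
            continue
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid

raw = (
    '[{"id":"ytc_example1","responsibility":"company","reasoning":"mixed",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"ytc_example2","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print([e["id"] for e in validate_batch(raw)])  # ['ytc_example1']
```

Dropping invalid rows (instead of failing the whole batch) is one reasonable design choice here; a stricter pipeline might instead log and re-prompt for the rejected IDs.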