Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I mean, in every sci fi apocalypse story the AI gets too powerful and destroys us. It’s already advancing at an alarming rate, and no one can really control it, every country in the world can use it and start developing new, potentially dangerous things with it. I predict it is going to cause some kind of catastrophe in the near future. At least the we don’t have powerful robots yet. So they can’t actually go to war with us. But maybe AI can figure out how to launch nukes
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2025-06-09T16:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxM4qolk2tUN7grPmt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyTdg93fA-JpP-V49V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw0U6jEbY3rslkTf7d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwzT9ogFSsNbypIAGd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxapkZFcYaMqqM1Hd94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyUaxiDfNJdxEnC1_N4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugxl1N2V_VvPb65gf5l4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1IoopP8afLgYCs7F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxRlXaZG2K19l8mTml4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgwBZXSkSBd7Amkemwd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"}
]