Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.
- "So instead of petitioning waymo to turn down their horns volume. You're running …" (ytc_Ugz-hl6KV…)
- "Y’all need to stop. You’re giving ai and robotics WAAAY too much power and advan…" (ytc_UgyzPTUz5…)
- "To everyone with a depressed outlook. We're right at the start of dealing with t…" (rdc_emoamhs)
- "My biggest concern would be that a student using AI in place of hard work, grit,…" (ytc_Ugz5feAIn…)
- "...unless if i can prove that I didn't break the law do you promise not to tell …" (ytc_UgzjRCaEm…)
- "This is dangerous technology aimed at the younger generation. TRANShumanism. The…" (ytc_Ugzc5d8Sv…)
- "AI also lacks any sort of real subtlety, whatever you ask it to do will recreate…" (ytc_UgyXpkXRg…)
- "Good to know is that poisoning does not need to work forever to still be effecti…" (ytc_Ugwajgtzs…)
Comment (source: youtube, video: "AI Moral Status", posted 2023-05-06T07:0…):

> I definitely know now that AI isn't as smart as i thought it would be. it actually sounds like is Giving a China communist 1 child rule so not creative. and at the rate most humans are to self destruct I can easily say we won't ever over populate the world. we have massive amounts of land that with a genius plan we could help attract population to. not impressed with this AI non original answers
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxP3qyw7L2Ur-596YF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzKei2TYwtkh0RaesZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyUoyNilX6S4xGxrgV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzBfju5Mf6lXUbr5dx4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxXqUFI-ePdFgclOwp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgziruiAQa9DxBiVr0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgynL53Z92tQSD_4ueZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz9QZJPsowENlbUdvd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwdDKPDR8qhMjWAdTN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyycBUTS8GfRlzMezB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
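A raw response like the one above is a plain JSON array, one record per comment, each carrying the four coded dimensions shown in the result table. The sketch below shows one way to parse such a response and index it by comment ID for lookup; it is illustrative only, not the tool's actual implementation, and the allowed category values are inferred solely from the samples on this page (the real codebook may define more).

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# This is an assumption; the actual codebook may include other categories.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response (JSON array) and index codes by comment ID."""
    records = json.loads(raw)
    by_id = {}
    for rec in records:
        # Reject records whose values fall outside the expected categories.
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r} value {rec.get(dim)!r}")
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id

# Hypothetical single-record response for demonstration.
raw = ('[{"id":"ytc_abc","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none",'
       '"emotion":"indifference"}]')
codes = index_codes(raw)
print(codes["ytc_abc"]["emotion"])  # -> indifference
```

Indexing by ID makes the "look up by comment ID" view a dictionary access, and the validation step surfaces any record where the model drifted outside the coding scheme.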