Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
I’m both a traditional artist and an entry level developer of AI. AI art sadden…
ytc_UgwCDJYJy…
Lol how many people are doing their work on the fixed cost claude plans? My comp…
rdc_obxcngz
@bradweir3085 harming openai by assassinating someone who was about to testify a…
ytr_UgyypQpm3…
Wow this video looks real. The ai Feet looks like they’re touching the ground .…
ytc_UgwVAWMZP…
I rather generate a art with ai and try tracing it soi could learn a little bit…
ytc_UgySSgsj-…
I’d understand the argument for AI if it did touch ups like shading or fixing po…
ytc_Ugwjkc4YB…
The same ai also got 90% of the predicted crimes right and machines don't care …
ytc_UgxjZqGgx…
People out here thanking chatGPT meanwhile I bully the living hell out of it whe…
ytc_Ugxr2ZB-s…
Comment
So we are essentially building our replacement? Is that what you guys think? We are building something that can take over ourselves? Displace us? Then what are we going to do when all the things are taken over by robots? Die? Leave the planet? Wherever we go we’ll do the same it’s ingrained in us this very comment I’m writing is a product of my experiences and is being used to train models. Those models will produce and end up being services used by some other ai. What’s the end goal? If there are no humans to consume those goods to live our lives. You guys are just stupid and don’t see the bigger picture. If we are building such a thing is because we are supposed to. We are evolving
youtube
2025-04-24T15:1…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgykwUlT5zj0Hm9vAp94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxqS7gztEwRC1DBmmp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugy0ryeza2oEFqlMcBJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwNm16tE9PETV1TGJp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxhnfn_asJGkv6O8x54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzjt5tPW2GS0BqOR8Z4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwFxxJ5n0vaAx8vOK94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgxiHerhyi3lFqHrU3R4AaABAg","responsibility":"company","reasoning":"mixed","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugz0HPUe4zx874lqnJl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxJy5LbFo9tL_oqbZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
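The lookup-by-comment-ID behavior described above can be sketched as follows: parse one raw LLM response (a JSON array with one object per coded comment) and index it by comment ID. This is a minimal sketch, assuming responses are stored as JSON text; `index_by_id` is a hypothetical helper, and the two records are taken verbatim from the raw response shown above.

```python
import json

# A raw LLM response is a JSON array of coded comments, each carrying a
# platform-prefixed ID (e.g. "ytc_…" for YouTube comments) plus the five
# coding dimensions. Two records copied from the response above:
raw_response = '''[
  {"id": "ytc_Ugy0ryeza2oEFqlMcBJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugxhnfn_asJGkv6O8x54AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]'''

def index_by_id(response_text: str) -> dict:
    """Parse one raw LLM response and map comment ID -> coded dimensions."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_by_id(raw_response)
print(codes["ytc_Ugy0ryeza2oEFqlMcBJ4AaABAg"]["emotion"])  # fear
```

With the index in hand, a single comment's coding can be fetched in constant time rather than rescanning the whole batch.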