Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

| Comment ID | Preview |
|---|---|
| ytr_Ugzyit1EL… | It’s a good step in the right direction. And I feel like there’s a lot of artist… |
| ytc_UgybWk3fz… | Its definitely beneficial. Ai and chatgpt are the future whether you like it or … |
| ytc_UgyFcb5h2… | Me: If the US were to create an AI regulatory agency, what would be a good name … |
| ytr_UgyUt3MAB… | How would crypto be better than learning something? Knowledge is power. Crypto i… |
| ytr_UgzdR9j-9… | homie so upset about tech boots he wrote an essay to say 'it makes a shit versio… |
| ytc_UgxHaTtuC… | Around the time that this video was posted, a similar scandal happened at a cult… |
| ytc_UgyfTHMmi… | The more that people trust AI over themselves, the more it is already taking ove… |
| ytc_Ugz2Qrw4-… | "AI is a meta-solution...without question, there is nothing more important than … |
Comment
> This is really pretty stupid. You’re asking Computer nerves what they think will happen. Of course everything they can imagine regarding computers can be taken over by other computer, computers, and by robots. I’m sure it’s possible to develop a robot. That’s totally automated control by Computer to make sugar cubes but well a sugar cube company invest $1 billion in a sugar maker when they’re only selling a couple million dollars a year with the sugar cubes. If you’re an administrator of some sort, yeah you’re in trouble. If you play with computers all day yeah you got a problem but when you’re swimming pool is dirty the robot is not gonna clean it for you. We already have swimming pool robots and we still have to have pool services. This is so much bullshit. This is what happens when you listen to a bunch of Computer assholes who think they know everything and no absolutely nothing
Source: youtube · AI Jobs · posted 2025-12-26T17:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxLaObH6QYOtrf6wXh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwdl_5xpduUS3jGSIV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzsUF3DdM4enYrVLhJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgztoTrpPTT0kqmGT3Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugzn2Xv_5JbRmgXE4v94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyRsFhUCkOCd4VV9Vx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugws_QmCS7LfVF5hF7N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzTfAiYAn25Um6lVpx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx81fqeERP5TS1Vz1x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyrITX0dkXM8yXej7F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
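The raw LLM response is a flat JSON array keyed by comment ID, with one coded row per comment. A minimal sketch of how such a batch could be parsed, validated, and indexed for the "look up by comment ID" view above — the `ALLOWED` sets list only the values visible in this dump (the full codebook may include more categories), and the function name is illustrative, not from the actual pipeline:

```python
import json

# Allowed values per coded dimension, as observed in this dump.
# Assumption: the real codebook may define additional categories.
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "distributed", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def index_coded_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index rows by comment ID,
    silently dropping rows with a missing ID or an unknown value."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        comment_id = row.get("id")
        if not comment_id:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            coded[comment_id] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Two rows taken verbatim from the response above.
raw = '''[
  {"id":"ytc_UgxLaObH6QYOtrf6wXh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzTfAiYAn25Um6lVpx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

coded = index_coded_batch(raw)
print(coded["ytc_UgxLaObH6QYOtrf6wXh4AaABAg"]["emotion"])  # prints: indifference
```

Indexing by ID makes each coded comment retrievable in O(1), which matches how the inspection page pairs a comment with its coding result.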