Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- The problem with suggesting we tell a future ASI to care about maximizing human … (`ytc_Ugyi_QHZ-…`)
- So communities have water system, but how do we solve the demand for energy? Whe… (`ytc_UgxCUnLmm…`)
- i hate AI, theyre ruining our art community. to all the people who use AI, youre… (`ytc_Ugwa40Q0K…`)
- That's not even realistic, let alone hyperrealistic. It looks like plastic and t… (`ytc_UgwBu-6ih…`)
- Muh copyright, waaah, waaah. Your shitty art (and the ones of anyone reading thi… (`ytc_UgyoCXSal…`)
- Till it is to enhance human performance it's fine. The moment it threats to repl… (`ytc_Ugx2h44Ak…`)
- Try the self driving car on a dark overcast wet day where the car infront splash… (`ytc_UgzviGXOn…`)
- You know it’s weird in the year 2032 an asteroid is supposed to hit Earth and po… (`ytc_UgyMixEnU…`)
Comment
This is my view agree disagree ur choice
A man works in a factory using a machine to do his job he becomes reliant on that machine to do his job . U can say that machine is at the end of the day controlling his life because if something happens to it he has no work for whatever period of time or if ever so if thats just a machine that literally nothing compared to AI or a robot makes u think how long or far we will go before it yes controls us idk might be talking crap but yeah whatever just a thought
youtube
AI Governance
2026-04-24T11:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgykDfOxkeLUZPuTUPV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyZ5DWx-A9bcyK7IEd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyJ5X5kelFJRlu03A54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgykQ2lUCXndCO7b8tl4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_bB6mi9yLaAD9fU14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxrbyP0U-bWIowzvJB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy7v1mRPipm9azkCMF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgzK_d8alsrX6V_zmHR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzxr1riB4ZBhMaDJ6h4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyyIaw8jChjaKx2iJ14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
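The raw response is a JSON array with one record per comment, keyed by the `id` field. A minimal sketch of the "look up by comment ID" step, assuming the batch output parses as shown above (the helper name `index_by_comment_id` is hypothetical, not part of the tool):

```python
import json

# Raw batch output in the format shown above: a JSON array of coding
# records, one per comment. Two records are reproduced here as sample data.
raw_response = """
[
 {"id": "ytc_UgykDfOxkeLUZPuTUPV4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
 {"id": "ytc_UgyZ5DWx-A9bcyK7IEd4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model output and index the coding records by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_comment_id(raw_response)
record = codes["ytc_UgykDfOxkeLUZPuTUPV4AaABAg"]
print(record["responsibility"], record["emotion"])  # company fear
```

In practice the parse should be wrapped in error handling, since model output is not guaranteed to be valid JSON on every batch.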