Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- When discussing the potential misuse of AI on public platforms, the concern is t… (ytc_UgxX5PHtA…)
- CGP Grey's video on self driving cars is probably the worst video he has ever ma… (ytc_UgwUwKmKQ…)
- to test and make sure A.i is aligned ask it this. 🌸 THE AI AWAKENING COMPASS 🧭 G… (ytc_UgxJZ8FpC…)
- Guy: "All right now give me the gun back" Robot: "No I think I'll hold on to t… (ytc_UgzJB_XZH…)
- Isn't really a comeback, the US for example has had an outbreak of plague for ma… (rdc_dpbz0et)
- Them saying they want to be included in the activity is hilarious. You aren't ma… (ytc_UgwaX8RwK…)
- "X-risk, short for existential risk, refers to the potential for highly advanced… (ytc_UgwIxVfD3…)
- I actually tried to see if an AI was able to copy my art style, and let’s just s… (ytc_UgxNOZ7id…)
Comment
All of these advancements to replace workers seem to have one bottle neck: resources
Suppose you can build infinite data center. Sure you can make an AI that surpasses human.
But in reality data center cost money. Billions even. And I’m not sure in 1 to 1 in term of efficiency, it will cost less to make an AI think as smart as a human. At least with the current technology.
youtube · AI Jobs · 2026-02-27T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzQY80jubdIhlOWsyh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxGmhRGcsIcL3DDn-p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugzz6-YzGgYuHb176s94AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyiWM1_r1dM3nRwqpl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXPosvA_He5mbKhzN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz_2yChHdufz3Vwx5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw3AGY45JwbeEwGhWV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzY0fD2fBzXo7MzC3x4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxzoT13qpEjHdIkpB54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz6b6lsmjxMCABGfqh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
```
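A raw response in this shape can be parsed, sanity-checked, and indexed for the lookup-by-comment-ID view. The sketch below is a minimal illustration, not the tool's actual code; the allowed values per dimension are inferred only from the samples shown on this page, and the full codebook may define more.

```python
import json

# Allowed values per coding dimension, inferred from the samples on this
# page (assumption: the real codebook may contain additional values).
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear"},
}

def index_batch(raw: str) -> dict[str, dict]:
    """Parse one raw LLM response, validate each coded row, and
    index the rows by comment ID for direct lookup."""
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        by_id[row["id"]] = row
    return by_id

# Usage with a single hypothetical row (ID is illustrative only):
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"none","emotion":"outrage"}]')
coded = index_batch(raw)
print(coded["ytc_example"]["emotion"])  # outrage
```

Rejecting unknown dimension values at parse time catches the common failure mode where the model drifts off the codebook, rather than letting bad labels flow silently into the coded dataset.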