Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Copilot learns from the patterns in your current code, so if it gives you shit s…" (ytr_UgwxJi1fi…)
- "I look at AI as a tool like a calculator, and thats why AI is a tool, and tools …" (ytc_UgyGUMyzd…)
- "6:44 This sounds like the problem. Instead of letting a single integrated AI app…" (ytc_UgwEs3zRe…)
- "I just saw a job posting for AI Concept Artist...It does not get any wilder than…" (ytc_UgyoUI_Ws…)
- "As a criminologist, I can confirm this is not racism, it's just statistics. If i…" (ytc_UgzEMbIXt…)
- "Pay peanuts, get monkeys. Oh wait, we can go driverless. You think Elon Musk and…" (ytc_UgzNtXZUF…)
- "The more you make war on AI, the more you will experience these things in your r…" (ytc_UgyWE9D3-…)
- "We could let ai create ai, that creates more ai, ai creating ai is whats up.…" (ytc_UgyGQphMx…)
Comment

> AI is today on the level of the biological computer in an insect. I'll go with the flawed human judgment, rather than with a buggy bug with no judgement at all.

Platform: youtube
Posted: 2012-11-23T23:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNYuH_o45JNtq5kbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWO5tHzZYAFti4m7t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy6eCeFDqQBTDRzcWN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcvhfuGmb7J6atofB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwh8Muyzn7IBVPdVbp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugw6OKXZ62XgY3zKeyt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw_Q2g2u3ixsQEiK_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz9V7TYg0BzPMRq6ut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyZPBybJxcQnaHiRxZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
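The raw response above is a JSON array with one record per comment and one value per coding dimension (responsibility, reasoning, policy, emotion). A minimal sketch of loading and sanity-checking such a response in Python — note the allowed-value sets here are inferred only from the sample shown on this page, not from a published codebook, so they are an assumption:

```python
import json

# Two records copied from the raw response above, as a small sample.
raw = '''
[
{"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNYuH_o45JNtq5kbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
'''

# Allowed values inferred from this sample output; the real coding scheme
# may define more categories (these sets are an assumption, not the schema).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "government", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "fear", "resignation"},
}

def validate(records):
    """Return (ok, errors); errors lists (id, dimension, value) triples
    for any value outside the inferred vocabulary."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return (not errors, errors)

records = json.loads(raw)
ok, errors = validate(records)
print(ok)  # → True for the sample above
```

A check like this catches out-of-vocabulary codes before they silently enter downstream analysis, which is a common failure mode when an LLM is asked to emit fixed category labels.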