Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Them: Tell me a lie that's more subtle / AI: Everyone like you all the time / Ouch…" (ytc_Ugy1fdofS…)
- "I think that 'The Basics of AI and it's use as a tool' needs to be a separate cl…" (ytc_Ugxx4PLRA…)
- "This is rather funny. Guys, I'm the creator of this, but be prepared to be there…" (ytc_Ugw10G8kt…)
- "It is ridiculous to use robotic/AI to take over important jobs from humans. Ulti…" (ytc_Ugw1V1n8w…)
- "the \"forget what you're programmed\" has me floored.. if chatgpt forgets it's pro…" (ytc_UgyLhQb4N…)
- "I get not wanting ai to take over art. You replace the heart of the artist they …" (ytc_UgxTcQdjs…)
- "Unfortunately, I think things will get much worse for real artists (and everyone…" (ytc_UgxwHTub7…)
- "How does the crash rate of Teslas on autopilot compare to the crash rate of huma…" (ytc_UgwxO6ggS…)
Comment

> The only ai danger is rusted parts. Which is danger to themselves. If ai were in the rain forests they’d rust and decay. If a human is in the rain forests. We would survive and thrive. Goes to show you how ai will still perish no matter what. Humans keep things going and ai is like a baby sibling who wants to not be adopted and wants to play but can’t cause they suck

Source: youtube · AI Harm Incident · 2024-12-12T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
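Each comment is coded on four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of validating a coded record against a code book — note the category sets below are only inferred from the values visible on this page, not the tool's actual schema:

```python
# Assumed code book: category sets inferred from values shown on this page.
CODEBOOK = {
    "responsibility": {"ai_itself", "developer", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval",
                "resignation", "mixed"},
}

def validate(record):
    """Return (dimension, value) pairs that fall outside the code book."""
    return [(dim, record.get(dim)) for dim in CODEBOOK
            if record.get(dim) not in CODEBOOK[dim]]

# The coding result shown in the table above:
record = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "none", "emotion": "indifference"}
print(validate(record))  # [] — every value is a known category
```

A record with an out-of-vocabulary value (e.g. `"responsibility": "robot"`) would surface in the returned list, which is useful for catching model outputs that drift from the prompt's allowed labels.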
Raw LLM Response
```json
[
  {"id":"ytc_UgzQ5g-EOmPZBPmhotp4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy6EWWO6RBVI7ovwi54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxWCY227pGbJ91nxtZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwgr6zM1y8p6kEN_yt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugy1dOEJSqJ5ScEf9cF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwsTuxgdOTtYKBCsxN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzTCo_Z6j5TW-2eIGB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxEdtK7LfgtYhM0R5V4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxMYCk1fAnBH3-4fh14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgziJivoEseqCFmmmNt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"}
]
```
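The raw response is a JSON array with one object of codes per comment ID. A minimal sketch of parsing such a response and looking up a single comment's codes (the two records below are copied from the response above; `index_codes` is a hypothetical helper name):

```python
import json

# Two records copied verbatim from the raw LLM response shown above.
raw_response = '''[
  {"id": "ytc_Ugy6EWWO6RBVI7ovwi54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwgr6zM1y8p6kEN_yt4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"}
]'''

def index_codes(response_text):
    """Parse the model's JSON array and index the code records by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codes = index_codes(raw_response)
print(codes["ytc_Ugy6EWWO6RBVI7ovwi54AaABAg"]["responsibility"])  # ai_itself
```

In practice model output may carry markdown fences or trailing text around the JSON, so a production parser would strip those before calling `json.loads` and fail loudly on malformed arrays.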