Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.

Random samples
- “I don’t fully agree — building a global platform takes more than emotional inte… (`ytc_Ugx75Wq1g…`)
- 17:00 i mean, not gonna defend big tech Google and their bad commercials but rig… (`ytc_UgxCOEAWQ…`)
- The day when whats under the mask looks like a human being, we're in trouble!😊😊… (`ytc_UgyvDb5lj…`)
- I had absolutely no idea that Elon was involved in the founding of OpenAI, this … (`ytc_UgwIyfvos…`)
- Dense people in 2023: You're not a real author if you use AI to write your books… (`ytc_Ugx2pr2-s…`)
- AI detection also says the declaration of independence is AI generated. I don't … (`ytc_UgxralsCo…`)
- Not to mention if an A.I. gets commercialized and a singular company cuts as man… (`ytc_UgwscNfB6…`)
- Do we really need robots in this world? That's what people are for, we support a… (`ytc_UgxpDN6We…`)
Comment

> @benwahhhh FSD = Full Self Driving. It's in an early (invitation) beta test and you are warned to be ready to take over at any time as they collect data. It's more like Full Time Babysitting, but it's interesting to watch it improve with each new version.

| Field | Value |
|---|---|
| Platform | youtube |
| Event | AI Harm Incident |
| Timestamp | 2022-09-06T23:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxGFRO40uyLs743S5l4AaABAg.9f_h_lmNDG49fcWjV5ST1T","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxobKS_erN63elZXjh4AaABAg.9f_a1Hqi_7l9fbSmUtYvnp","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgxobKS_erN63elZXjh4AaABAg.9f_a1Hqi_7l9fbTEJekhWM","responsibility":"none","reasoning":"virtue","policy":"regulate","emotion":"indifference"},
  {"id":"ytr_UgxDw_hf_yKQUFR5B6J4AaABAg.9f_Wn24aPFn9fenuOIXq0-","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgxDw_hf_yKQUFR5B6J4AaABAg.9f_Wn24aPFn9fesqHFLEiO","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwA-cuAlrYFrmRrmFt4AaABAg.9f_82ZYVkcS9feQ47I6m8q","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugy428XSf8iGdC4L1Wd4AaABAg.9f_66rDSbnc9fbE4RBpxo-","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytr_UgyWzPD4vYfX32DHyzF4AaABAg.9f_41m8mCQv9fdQ0viC_09","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyWzPD4vYfX32DHyzF4AaABAg.9f_41m8mCQv9feF4WLy4x0","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugyvgjaq2iSqCaKCETl4AaABAg.9fZyR6xVMg39f_-0TZPBxt","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
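A raw response like the one above must be parsed and checked before its rows can populate the Coding Result table. Below is a minimal validation sketch in Python; the field names come from the JSON shown, but the sets of allowed values are only inferred from values that appear in this dump — the actual codebook may define more categories, and `validate_batch` is a hypothetical helper, not part of any real pipeline.

```python
import json

# Allowed values per coding dimension, inferred from this dump.
# Assumption: the real codebook may include additional categories.
SCHEMA = {
    "responsibility": {"user", "developer", "company", "government",
                       "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "outrage", "indifference", "mixed",
                "fear", "resignation"},
}

def validate_batch(raw: str) -> list:
    """Parse a raw LLM response (a JSON array of coded comments) and
    reject any row whose dimension value is outside the schema."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim}={row.get(dim)!r}")
    return rows

# Hypothetical single-row response, in the same shape as the dump above.
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
print(len(validate_batch(raw)))  # 1
```

Validating eagerly like this surfaces malformed or off-schema LLM output at ingest time, before a bad value reaches the per-comment display.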