Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugy6Fi2Iv…`: "Yes just like what they have done in Wisconsin. Data Centers are very bad to th…"
- `ytc_Ugx9_Erdz…`: "I dont know why this isnt talked about more, but taking out the whole AI Tech Bu…"
- `ytc_Ugx_u0NDk…`: "I think you are wrong. This whole video could've been written and narrated by an…"
- `rdc_d2xit6i`: "Erasing your student loan is the incentive I would imagine. If you had $100,000 …"
- `rdc_jrqqsvv`: "THANK! FUCKING! GOD! If customer support is just gonna follow a script, lie abo…"
- `ytc_Ugx3Yj-4T…`: "The waymo car costs $125,000 to make. Tesla is a 1/3 of that. Even if waymo got …"
- `ytc_UgwVlY1dS…`: "If China and Russia have AI powered military aircraft then why haven't we heard …"
- `ytc_UgzinnFWS…`: "A very good interview, and I enjoyed it. But I find it weirdly disconcerting tha…"
Comment

> You are painfully naive and optimistic.
> If the leading experts in this field tell us the risk of extinction is as high as 85%, do you really think your ill conceived notion of a human based and motivated AI will be loving? History tells a very different story about humankind.

youtube · AI Harm Incident · 2025-07-24T09:5… · ♥ 8
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytr_Ugxjlt2wfeaqjvkWTWJ4AaABAg.AKy35ggv_1bAMxrDy_uwZf","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugy6kGsglnn-2C6TCTN4AaABAg.AKxsAclZB_1AKxy766RR4h","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugy6kGsglnn-2C6TCTN4AaABAg.AKxsAclZB_1AKxyFbtKoU4","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgyMJXzDgkXEVcgFY2d4AaABAg.AKxcba-etolAKxhXeg67up","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwCTRIlx6FsRPbfegV4AaABAg.AKxZy4RPoKVAKxcHrJxlao","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwCTRIlx6FsRPbfegV4AaABAg.AKxZy4RPoKVAKxgnpuOzFj","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"resignation"},
  {"id":"ytr_Ugy2eowJLJzfLqzsnF94AaABAg.AKwxmz88RZNAKxZV7Mc1WT","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgzSZeAr4y38rNuRPC14AaABAg.AKwvrylZkvDAKxYn6LmXZX","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzSZeAr4y38rNuRPC14AaABAg.AKwvrylZkvDAKxlO5JDc2o","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_UgzOiGrPFnUIcxTwgAh4AaABAg.AKwq6m1MDolAKxCLSbWZaB","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
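The lookup-by-ID flow above amounts to parsing the raw JSON array and indexing it by comment ID. A minimal sketch in Python, assuming only the field names visible in the response above; the `lookup` helper and the inlined two-entry sample are illustrative, not part of the tool:

```python
import json

# A trimmed raw model response: a JSON array of coding objects,
# each carrying the comment ID plus the four coded dimensions.
RAW_RESPONSE = """
[
  {"id": "ytr_UgwCTRIlx6FsRPbfegV4AaABAg.AKxZy4RPoKVAKxcHrJxlao",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_UgzOiGrPFnUIcxTwgAh4AaABAg.AKwq6m1MDolAKxCLSbWZaB",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]
"""

def lookup(raw_response: str, comment_id: str):
    """Return the coding dict for comment_id, or None if it is absent."""
    codings = {row["id"]: row for row in json.loads(raw_response)}
    return codings.get(comment_id)

coding = lookup(RAW_RESPONSE,
                "ytr_UgwCTRIlx6FsRPbfegV4AaABAg.AKxZy4RPoKVAKxcHrJxlao")
print(coding["emotion"])  # fear
```

Building the `{id: row}` dict once makes repeated lookups O(1), which matters when cross-referencing many coded comments against one batched response.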