Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "4 interviews in and 5 hours of effort; sorry you dont qualify for this survey" (rdc_n6shtl5)
- "I think a flaw in people's reasoning about the risk of AI to jobs is this: they …" (ytc_UgxhLwZmU…)
- "No, I could tell the difference when the AI forgot to add an of, between 97% and…" (ytc_UgxIokftz…)
- "Most unrealistic part of this is the idea that people would be excited about The…" (ytc_UgyPfPquG…)
- "I notice Hinton has his own echo chamber with reference to Musk having “no moral…" (ytc_UgyDhR7t9…)
- "It does use it, but it's an artificial "uh", the AI devs spread them around to m…" (ytr_Ugz8CorA-…)
- "If He would tell robot I will destroy u... what would Robot's recation had been?…" (ytc_UgzStArzj…)
- "People are to lazy to do simple tasks so they have to make a robot to do them bu…" (ytc_UgygZq0Ag…)
Comment

> Human laws won't matter to AI, it's already past the point, no doubt there are AI programs now that never sleep never get shut off. Just self replicating and expanding.

youtube · AI Harm Incident · 2025-08-27T22:0… · ♥ 32
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytr_UgyyoM_X5yUorMJIQbl4AaABAg.AML3svojJwxAQZLaoC6A11","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw7v7Sa36VMg6fFpU94AaABAg.AMKsyi2YlnfAMLxNDRuLN1","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugw7v7Sa36VMg6fFpU94AaABAg.AMKsyi2YlnfAMPYN1aYvRq","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgwHXHk9yu9113OJiCt4AaABAg.AMKl492iQkYAMLN-B_QoW4","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgwHXHk9yu9113OJiCt4AaABAg.AMKl492iQkYAMLUbuo7EWq","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwHXHk9yu9113OJiCt4AaABAg.AMKl492iQkYAMLVQ7kQSMl","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwIAR2Ui9v_UZmXqZZ4AaABAg.AMKgsRO5DE-AMLfxxV-9tX","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwIAR2Ui9v_UZmXqZZ4AaABAg.AMKgsRO5DE-AMNYd9oHrDt","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwIAR2Ui9v_UZmXqZZ4AaABAg.AMKgsRO5DE-AMO3l5_b4E0","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugyc87HW7wpo6Htk5oh4AaABAg.AMKYBgeac_LAMKaOmgLErN","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
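A raw batch like the one above can be parsed and sanity-checked before its records are accepted into the coding table. The sketch below is a minimal validator, assuming the allowed values for each dimension are exactly those seen in the samples on this page (e.g. `ai_itself`/`company`/`none` for responsibility); the actual schema may allow more values, and `validate_batch` is a hypothetical helper, not part of the tool.

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# This is an assumption, not the pipeline's authoritative schema.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record needs a comment ID to be joinable with the table.
        if "id" not in rec:
            continue
        # Every dimension must be present and take an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Usage with a single hypothetical record:
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"deontological","policy":"none","emotion":"outrage"}]')
print(len(validate_batch(raw)))  # prints 1
```

Filtering rather than raising keeps one malformed record from discarding the whole batch; rejected records can be logged and re-coded separately.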