Raw LLM Responses
Inspect the exact model output for any coded comment. Look up a comment by its ID, or pick from the random samples below.
Random samples — click to inspect
cant we just consider that making ai efficient would make it malicious? lying an…
ytc_Ugy43wJiB…
But why AI? If AI is going to take EVERYONES jobs what’s the point of education …
ytr_Ugyg2fyb3…
In the 90s we called it adaptive optimization. Basically what we currently have …
ytc_UgzcdBpQK…
[The Marshall Project](https://www.themarshallproject.org/2020/11/17/we-re-track…
rdc_jg0gt5x
I can't with AI being claimed as "art". It's insulting to me because i spend hou…
ytc_UgypRJzoN…
TAX businesses using AI / Automation / Machine learning that has replaced a cer…
ytr_UgyC_mtJu…
There needs to be a global law against automating certain jobs, because otherwis…
ytc_UgzW3JQnV…
This is going to get worse and worse overtime. With ai videos on the rise people…
ytc_UgxhmkSON…
Comment
Dude i literally had a dream about ai last night... like im just starting this video right now im not even a minute in but in my dream it had something to do with as soon as ai actually becomes intelligent like a human and can think for itself that it will explode because it can just make itself more advanced constantly. That would be crazy if this is what this is about lol.
youtube
AI Moral Status
2025-10-30T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxFRZ3ULDDHYaVFVa54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwpBMyuyzK-7qhVLfV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy0-5WwJf846xl2_8R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwKsvre-Pndgqw1PnN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzOuqD7kyxc9-ouC594AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyXzt7wWl9tcxfpmVh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwdhM7CJ9AOS3XZOYt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzF4EYAm-1EQ2_o6pl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9yVxH60zBVEnhqIZ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxxy_KXEeL86_ndwU94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
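A raw response like the one above can be parsed into per-comment codings keyed by ID, with each dimension checked against the codebook. The sketch below is a minimal, hedged example: the allowed value sets are inferred only from the samples visible on this page (the full codebook may define more categories), and `parse_codings` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Allowed values per dimension, inferred from the samples shown above.
# ASSUMPTION: the real codebook may include categories not seen here.
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "company", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"fear", "indifference", "approval", "outrage", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: coding}, validating each row."""
    rows = json.loads(raw)
    codings = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r} value {row.get(dim)!r}")
        # Keep only the coding dimensions, dropping the id field itself
        codings[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return codings

# Usage: look up a single comment's coding by its ID
raw = ('[{"id":"ytc_UgxFRZ3ULDDHYaVFVa54AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = parse_codings(raw)
print(codings["ytc_UgxFRZ3ULDDHYaVFVa54AaABAg"]["emotion"])  # fear
```

Validating at parse time catches the most common LLM-coding failure mode, an out-of-vocabulary label, before it silently enters the analysis.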