## Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by browsing the random samples below.

### Random samples
- "Future proof occupations: 1) Owner/controller of AI 2) Owner/controller of Robot…" (ytc_UgwP2rzlU…)
- "Robot arms are gonna be building that burguer too soon so i dont know what jobs …" (ytc_UgzsjY7n3…)
- "The silver lining is that the AI bros put their own poison on the internet. Ther…" (ytc_Ugwq4ANIs…)
- "I was actually mixed on the film (though entirely in agreement with its conclusi…" (rdc_g99v2jp)
- "I agree. The last person who understands how an LLM works is the CEO. Better of …" (ytr_Ugxf2zuBx…)
- "My idea is purchase robots have salary given by how much work did they finish ei…" (ytc_UgxOHduIx…)
- "@RenderingUserImagination and creativity still exist though it doesn't matt…" (ytr_UgymEcaL3…)
- "thank you! i've been put off by the focus on 'AI art is bad' since the beginning…" (ytc_UgzjJm73l…)
### Comment

> "A robot may not injure a human being or, through inaction, allow a human being to come to harm. A robot must obey orders given it by human beings except where such orders would conflict with the First Law. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law." (Isaac Asimov)

youtube · AI Governance · 2024-09-04T18:3…
### Coding Result

| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
### Raw LLM Response

```json
[
  {"id": "ytc_Ugwywp2i1_YnDFbUgQV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwmD0dXLAc_333tHPx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzXCX5uyis33VSekPx4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzuiJ5hCEfrbNW_XEZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugyra2pmdT4QcwDM6Vp4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzeiKZEvt7tNSYzSTd4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxQypKqgic-nLtLewB4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwaIBXnf7XXjyXGdg14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzZ1dOTVfF74D4L3d94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwvN7s9-WesXbMm3JN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
```
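The raw response is a JSON array of per-comment codes, one object per comment ID. A minimal sketch of how such a batch could be parsed and indexed for the ID lookup described above (the two records are copied verbatim from the response; the helper name `index_by_id` is our own, not part of the tool):

```python
import json

# Two coded records copied from the raw LLM batch response shown above.
raw = '''[
  {"id": "ytc_Ugwywp2i1_YnDFbUgQV4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzXCX5uyis33VSekPx4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"}
]'''

def index_by_id(raw_json: str) -> dict:
    """Parse a batch response and key each coded record by its comment id."""
    return {rec["id"]: rec for rec in json.loads(raw_json)}

codes = index_by_id(raw)
print(codes["ytc_UgzXCX5uyis33VSekPx4AaABAg"]["reasoning"])  # deontological
```

Keying on `id` makes it cheap to join the model's codes back onto the original comments when auditing a specific ID.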