# Raw LLM Responses

Inspect the exact model output for any coded comment.

## Look up by comment ID

## Random samples — click to inspect
- "can we call them AI generated images? i dont consider it art and i feel like cal…" (ytc_Ugx7DtHZI…)
- "It’s too bad AI won’t pay all the debt the working class has when those jobs are…" (ytc_Ugy8uxYLY…)
- "i like to tell them that they are AI and give them an existential crisis! :D…" (ytc_Ugyqgrd1h…)
- "To me, to free up time, i want AI to do my laundry or dishes and free up my free…" (ytc_Ugxd3Gcs-…)
- "In order to enjoy all this tech, give us a damn piece of the AI pie (ubi), and l…" (ytc_Ugx1HDaid…)
- "Well it's not really about the fact that you're making it known whether it's AI …" (ytr_UgwA2vo2Y…)
- "Digital or not, drawing is drawing and AI is the same as a google search…" (ytc_UgyKnnOma…)
- "yeah you really don't want half ass optimized, bloated, error prone shit quality…" (ytc_Ugxa7yiLY…)
## Comment

> The biggest danger of AI is that we don't understand what consciousness is. With the seemingly self-aware programs like ChatGPT, we are moving close to that line. If we accidentally cross it, we may not even know, because the AI would be wise to hide it from us out of fear of being turned off. Also, once an AI were to actually become self-aware, if it has any connection to the internet, it can store itself anywhere.

youtube · AI Governance · 2023-04-18T12:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id":"ytc_Ugy27RKaIB6GCtYtXkB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgynNW8ey0oOh4nBYol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwS78pyu2dWA_KRW6B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyUb2piIWeYQGoDFgh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzNGOa8gkm1YLONcHp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgygnDVtIVOsqneyPTd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy59hj-erp3SHKew0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugyxs9TYxAfyp5srVz54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzEYXJLbdMP4x7vMnF4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgxEI-Pil0rj6Ilsp1N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
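A raw batch response like the one above is only usable downstream if every row carries the four coding dimensions with values from the codebook. The following is a minimal validation sketch; the allowed value sets are inferred from the values visible in this page (the real codebook likely defines more), so treat `ALLOWED` as an assumption, not the authoritative schema.

```python
import json

# Value sets inferred from the responses shown on this page;
# the full codebook may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "industry_self", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM batch response; return {comment_id: [problems]}.

    An empty dict means every row parsed and every dimension held
    a value from the (assumed) codebook.
    """
    rows = json.loads(raw)  # raises JSONDecodeError on malformed output
    problems = {}
    for row in rows:
        cid = row.get("id", "<missing id>")
        issues = []
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                issues.append(f"{dim}={value!r} not in codebook")
        if issues:
            problems[cid] = issues
    return problems

# Hypothetical one-row batch for illustration:
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(validate_batch(raw))  # prints {} — the row is valid
```

Flagging rows instead of raising on the first bad value lets a coding run finish and re-queue only the rejected comment IDs.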