Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If AI takes all our jobs, jobs wont be needed. If AI takes a lot of jobs but not…
ytc_Ugxl67z6e…
@vuivraalbastra For now, use tools like Nightshade or Glaze. Until better policy…
ytr_Ugy8cHPX4…
Yeah this is quite terrifying. My job is safe though. I’m a librarian and I don’…
ytc_UgyX5PHYK…
A.I. is enormously overblown. A.I. is GIGO. Garbage In Garbage Out. Meaning it …
ytc_Ugw21Z5p6…
And before we get this unlimited energy utopia, we get to compete with the AI co…
ytc_UgzkckSJ_…
The irony of thing is that is that a lot of non-fluent english people (A bunch I…
ytc_UgzlTdd5X…
How is not trivial for an AI to search the codebase for where the e product is m…
rdc_n3memet
Also in the future to avoid this it might not be a bad idea to use a screen reco…
rdc_kgp3ctb
Comment
The average ai researcher is full of shit. One in 1000 people are capable of even understanding the implications of their breakfast, let alone the complexity derived from complex mathematics. Build real models and show your work, or i call bullshit. Show me your work towards this effect or i say you arent an expert, and this opinion is founded on fear and lack of information.
youtube
AI Moral Status
2025-12-11T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugz0hDBH9UtOdSPTOrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbdyM7s2gMT7f8FaV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyhKgJlNZI86S5cUrJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyDV-TJMWgKB2L0utN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwSU4qMEdQZmQvugaF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxU9oC6TAXg-YaMBXt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzKSmt6AwMAZsDpnCB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzEg8sHD2YwX9rCsw14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5egmnrKORgJwgIjx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPXsjNLTRQEwjxr5l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
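The raw response above is a JSON array with one object per coded comment, keyed by comment ID. A minimal sketch of how such a payload could be parsed and a row looked up by ID — the `parse_codes` helper and the `ALLOWED` vocabularies are assumptions inferred from the values visible on this page, not the tool's actual code or full codebook:

```python
import json

# Single-row payload in the same shape as the raw response above.
RAW = '''[
 {"id":"ytc_UgxPXsjNLTRQEwjxr5l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# Assumed per-dimension vocabularies, collected from the sample output;
# the real codebook may include values not shown here.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"indifference", "fear", "outrage"},
}

def parse_codes(raw: str) -> dict:
    """Return {comment_id: codes}, rejecting rows with out-of-vocabulary values."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={row[dim]!r}")
        coded[cid] = {k: v for k, v in row.items() if k != "id"}
    return coded

codes = parse_codes(RAW)
print(codes["ytc_UgxPXsjNLTRQEwjxr5l4AaABAg"]["policy"])  # liability
```

Validating against a fixed vocabulary at parse time catches malformed or hallucinated labels before they reach the coded-comment store.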