Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
it's not monster in ChatGPT, it's a monster in OpenAI and he's called Sam Altman…
ytc_Ugw61BwZM…
the philosophic question is who should we blame? the robot? the company? the comp…
ytc_UgyZpVGAz…
I worked on chatbots for an insurance company once. We have an emotion detector …
rdc_ksqvg2c
The A.I. developers should, then, democratize their earnings and share with ever…
ytc_Ugw1LxDPW…
@mxmajewski Are you comparing hating on AI slop to actual genocides? Jesus you a…
ytr_UgxvRTv-d…
Autopilot is just for highway driving and it is just cruise control but better. …
ytc_UgwnC3TSE…
Making something more selfishly intellectual (political on how to get it's way) …
ytc_Ugx4yQ7xe…
So ChatGPT and DeepSeek ARE in a neck n neck race to become Sky Net... 😮…
ytc_Ugz9M7dTG…
Comment
I already use AI to help speed up my workflow, but AI doesn’t code for me. AI is dumb. It only knows how to map requests to results, and doesn’t know if the result is correct. It’s like someone who doesn’t know how to cook following a recipe. Only in coding, the recipe constantly changes. The AI engine doesn’t know what taste is, so what it ends up making is purely based on a path. That chicken Alfredo could have maple syrup on it and AI would tell you with confidence that it made your meal correctly.
youtube
2023-11-29T15:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_Ugyo6eyWVjjV0hRQHn94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwjSdu0SoT5FDJSKxN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxLXAQtm0it7zc_9j54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyRYGUdaEPxVfUL0px4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugys4WuQy7-wLuohbrB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzYfhVQO5xEWx-7d154AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxTfzduK6NxGuFvVcR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugz8PNetZOL_rU64znd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxfH44jeDztHHuGYdt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwD3zSuab4SS2BrWb14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"}
]
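The coder's output above is a flat JSON array, one record per comment, with four categorical dimensions. A minimal sketch of how such a response could be parsed and validated before loading it into the coding table, assuming the allowed labels are exactly those seen in the samples above (the real codebook may permit more categories, and `validate_codes` is a hypothetical helper name):

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# This is an assumption, not the project's actual codebook.
ALLOWED = {
    "responsibility": {"none", "user", "company", "developer", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "approval", "fear", "unclear"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse an LLM coding response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
    return records

codes = validate_codes(
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"deontological","policy":"liability","emotion":"fear"}]'
)
```

Validating before insertion means a hallucinated label (e.g. `"responsibility": "society"`) fails loudly at coding time instead of silently polluting the results table.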