Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
So your main argument is that this technology has the capacity to replace lots of jobs that are right now held by humans and put them out of work, so it should be stopped or heavily regulated.
Couldn’t this be applied to almost every advancement in automation that has happened? Self check out in grocery stores, automated robots in factories, the train…all of these technologies put a lot of people out of work, and I don’t think stopping them from ever being released would have made the world a better place.
This just seems like an arbitrary standard to apply to AI, when it has historically never applied to any other automation.
Source: youtube · Posted: 2023-02-21T20:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz3T1Tjk05STTtKU6J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugz3gP3unbp7vFr9mT54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugz4Fq1-X0Fi7Qy0YNJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx97eZiray_k_dLSEp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx5Lj1EN4AntGR70A94AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"mixed"},
  {"id":"ytc_UgzvrliuF3eN0yxcWt94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwZvjjdt08VAZ_bdmd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxj0a1cycUr4H-IfIB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzSj1qImUYwcwH8yX54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxvvcmZnjOqr8YM0SF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
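The coding table above is derived from this batched response by matching on comment ID. A minimal sketch in Python of that lookup, assuming the raw response is available as a JSON string (only the first two rows of the batch are reproduced here; variable names are illustrative):

```python
import json

# Raw LLM response: a JSON array of per-comment codings, as shown above
# (truncated to two entries for brevity).
raw_response = """[
  {"id": "ytc_Ugz3T1Tjk05STTtKU6J4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugz3gP3unbp7vFr9mT54AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]"""

# Index the batch by comment ID so any single coded comment can be retrieved.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up the comment displayed on this page and read off its dimensions.
coding = codings["ytc_Ugz3T1Tjk05STTtKU6J4AaABAg"]
print(coding["policy"])   # regulate
print(coding["emotion"])  # indifference
```

Building the dict once makes repeated lookups O(1) per comment, rather than rescanning the array for each ID.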