Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
These idiots don't need to create actual AI for it to be dangerous. All they need to do is create something bad enough to mutate and propogate across the open web. It doesn't even have to have a nefarious purpose, it likely won't even have one whatever ends up making life real hard. Coders don't know all the possible outcomes of what they code that's what scares me the most.
Source: youtube · Posted: 2024-01-07T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_UgzVqyB9SIVezavFaXV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyFWWHttt6GYJiYOGp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwhUmj-yjh_8_9hvlB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyifBFxDWceqRkbIrV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwymHr25Fn_Bm1BZap4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxQKayT2Xl5R6wE9ip4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxLU1huybkn4UXr5gp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyxrQyqC0j_LzOI7914AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz9DWlmX0IDtYu9aGJ4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyIs8MgLDka7Z3zadJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
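The raw response above is a flat JSON array with one record per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of how such a payload could be parsed and indexed by comment ID (function and variable names here are illustrative, not taken from the pipeline; the array below is abridged to two of the records shown above):

```python
import json

# Abridged copy of the raw LLM response shown above: a JSON array of
# per-comment codes. A real pipeline would read the full model output.
raw_response = """[
  {"id": "ytc_UgzVqyB9SIVezavFaXV4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyIs8MgLDka7Z3zadJ4AaABAg", "responsibility": "user",
   "reasoning": "virtue", "policy": "none", "emotion": "resignation"}
]"""

# The four coding dimensions used in the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the LLM's JSON array and index records by comment ID,
    keeping only the expected coding dimensions and defaulting any
    missing dimension to "unclear"."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = index_codes(raw_response)
print(codes["ytc_UgzVqyB9SIVezavFaXV4AaABAg"]["emotion"])  # fear
```

Indexing by ID is what makes a per-comment lookup like the one on this page cheap: each "Raw LLM Response" batch is parsed once, then any coded comment can be fetched in constant time.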