Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "It's moments like this that I feel today's conversations about AI echo those had…" (ytc_Ugx5pELiD…)
- "I am a retired short-haul trucker. I don't think that we can expect driverless …" (ytc_Ugy39x4dr…)
- "Okay and how exaxtly did this "destroy" the AI twitter account? He's still makin…" (ytc_UgwD8Og55…)
- "I don't understand the problem with the greedy humans of present. If AI is harmf…" (ytc_UgyuN6cyu…)
- "How about companies that replace their workers with AI pay those old employees 5…" (ytc_UgyCPWExr…)
- "In matters relating to the social sciences, AI bases its reasoning and argumenta…" (ytr_Ugyg0i4fI…)
- "Just FYI, algorithmic trading has already taken over the stock market and makes …" (ytr_UgxxVufKS…)
- "In 1996, i wrote various AI reports for my masters level class. Some came true. …" (ytc_UgxLPbL-7…)
Comment
"Equivalent to our own features? Not really. More so that we create stuff to help us do things easier. How are cars for example benchmarked in any way to what we can do? It's more of we find a problem, find a way to solve it using technology and then give it out for others to use. AI is simply one of the ways that we can do it. We can find flaws in how humans learn, and then counter these flaws for AI so that they can learn better than us/"
Source: youtube
Posted: 2013-06-25T02:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugwf8OHi6HLTfI3l9Tl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy2wim4Us7Hq2JT5fh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzs9gC2tJDC-stZdFx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyvDTGSeY2ddK3gNQp4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzwZ3jfrMIcqMn2plR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyJOLbkO5X0n5YcUn54AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxGE5rP2don4fja5Yd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwsC4kTTP1wwi9SEFN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwc6bUu3Lj4NXzfPAl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwqA6jr9GQVtv3byz14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
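The raw response above is a JSON array of per-comment codings, one object per comment ID. A minimal sketch of how such output could be parsed and sanity-checked before it is stored, assuming the label sets visible in this sample (the actual codebook may define more values than appear here):

```python
import json

# Allowed labels per dimension, inferred from the sample response above.
# This is an assumption for illustration, not the project's full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "indifference", "approval", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed rows with known labels."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip malformed entries
        if all(row.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical two-row response: the second row carries an unknown label.
raw = (
    '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_y","responsibility":"alien","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(validate_codings(raw)))  # 1 — only the first row passes
```

Rejected rows could instead be queued for re-prompting rather than dropped; the set-membership check is the part that matters, since LLM coders occasionally emit labels outside the requested scheme.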