# Raw LLM Responses

Inspect the exact model output for any coded comment, looked up by comment ID.

## Random samples
- Oh, cool. I didn't know you were the guy who knew how the human brain works 1:1.… (ytr_UgzmgIDC_…)
- Dealing with AI or humans. A human taxi driver can be a pervert or a criminal. A… (ytc_Ugyp_KAFl…)
- What people fail to do or understand is that an AI that is smart enough to consi… (ytc_UgwJHucgC…)
- Remember when "they" tried to connect these 2 massive mainframes? Not only could… (ytc_Ugxmh8WkR…)
- too much control and beauracracy, ai cp with common sense is already illegal. st… (ytc_UgxBaEO1u…)
- The boy was deeply looking into how to do it. Not saying it was right but why bl… (ytc_Ugx7ZUSuu…)
- I'm not a smart guy and I'm certainly not AI. The internet which AI feed off is … (ytc_UgxQ0B5F8…)
- I'm cautious of AI as much as anyone but to think this Syndey talk was anything … (ytc_Ugx-b6hFH…)
## Comment
Machines react based on input, following the code they’re given. If you do x, you will always receive y. Humans think. They feel. They can react differently to the same situation and two humans can take two entirely different things from the same inspiration. If you type in ‘draw me a blue car’ there’s no way you’ll get anything but a normal, blue car. An artist can draw a blue car with funny headlights and cartoonish quality and add bubbles and antlers and many different things. You have to specify to the robot exactly what you want in a photo. An artist comes up with that on their own.
youtube · AI Responsibility · 2023-01-21T04:3…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
[
{"id":"ytc_UgwZCM5IPraiq0Vc3F54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxlz47ZRNuQBdlXWQ14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy7rUP7OsjPk4CCDqp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgycAqT4uGTrLQGXMg14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxLUZOuvLhWnsllPD54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxJ6dOJVCwrAD-9TQ14AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyAOfZv-8nymMtEa854AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwRz22x2ZU43QKWw4d4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzKUUqRVEAGra6dWpt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxxbrrfnnV9f8V6gMB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
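The "look up by comment ID" step above can be sketched in a few lines: parse the raw LLM response (a JSON array of coding records, as shown) and index the records by their `id` field. This is a minimal illustration, not the tool's actual implementation; the function and variable names are hypothetical.

```python
import json

# Two records copied from the raw response above; the real array holds one
# record per comment in the batch.
RAW_RESPONSE = """[
  {"id":"ytc_UgwZCM5IPraiq0Vc3F54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7rUP7OsjPk4CCDqp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"}
]"""

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index each coding record by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_codings(RAW_RESPONSE)
coding = codings["ytc_Ugy7rUP7OsjPk4CCDqp4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # → ai_itself outrage
```

Indexing once into a dict makes each subsequent ID lookup O(1), which matters when inspecting many comments against a large coded batch.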