Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "Having a couple self driving cars is great but the more you add to the road the …" (ytc_UgxCneDVc…)
- "Wait until the insurance companies have an AI second opinion Dr..... If the trea…" (ytc_UgwliOZRH…)
- "The average consumer doesn’t care. It’s a small group of artist and fans who are…" (ytc_UgxL72Uaw…)
- "The question I have and can never find an answer to is, if AI replaces humans in…" (ytc_UgyLvRbRS…)
- "The reason we need to be polite when dealing with ai or scripted bots is that wh…" (ytc_Ugwn-qttp…)
- "AI will take it long to do non repetitive engineering & design jobs, AI will tak…" (ytc_UgyCT9fr1…)
- "Who pays for all that then? Economy needs labor. There could be just so many plu…" (ytc_UgxjeGdnw…)
- "I get where you're coming from! It's definitely a thought-provoking concern. In …" (ytr_Ugw3q1pfn…)
Comment
I know virtually nothing about AI but I’ve never been concerned about it in and of itself and its application, it’s the human side of it that I think will always make it flawed. AI can only learn from what it researches and interprets from material created and presented by actual people, so even if it learns from another amalgamation of information it still has to be engendered from a human source. The other thing is AI cannot do anything or act in any manner from this information without human intervention, so it’s only as dangerous as the person who consumed and consequently takes further action from that overview.
youtube · AI Harm Incident · 2026-04-07T07:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugw94st37z9u5eKoSGd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyPf8Y29nf3fcgFDrl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwGTj0TuvB0PEI3pGB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugzgjyg4b5klMLmPv194AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz4HsBjgaRcV_MH7tZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzekvc7UN9OPZcNA6J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyJFcYQHpdYH20IrDF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJ1WcQ8YQdi-hkgZh4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwUXTu8j6i2NshJli14AaABAg","responsibility":"user","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzGWlhS1HipDO4DsgF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
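A raw response like the one above can be parsed and indexed by comment ID for lookup. The sketch below is a minimal example, not the tool's actual implementation; the allowed value sets are inferred from the values visible in this export, not from an official codebook.

```python
import json

# Value sets inferred from this export (assumption, not an official codebook).
RESPONSIBILITY = {"developer", "company", "user", "ai_itself", "none", "unclear"}
REASONING = {"consequentialist", "deontological", "virtue", "contractualist", "unclear"}
POLICY = {"regulate", "liability", "none", "unclear"}
EMOTION = {"fear", "outrage", "resignation", "indifference", "mixed"}


def parse_codes(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response (a JSON array of per-comment codes)
    and index the rows by comment ID, validating each dimension."""
    rows = json.loads(raw)
    by_id = {}
    for row in rows:
        assert row["responsibility"] in RESPONSIBILITY, row
        assert row["reasoning"] in REASONING, row
        assert row["policy"] in POLICY, row
        assert row["emotion"] in EMOTION, row
        by_id[row["id"]] = row
    return by_id


# One row from the response above, used here as sample input.
raw = ('[{"id":"ytc_Ugzgjyg4b5klMLmPv194AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"resignation"}]')
codes = parse_codes(raw)
print(codes["ytc_Ugzgjyg4b5klMLmPv194AaABAg"]["emotion"])  # resignation
```

Indexing by ID makes the "inspect any coded comment" lookup a single dictionary access, and the assertions catch any coded value that falls outside the expected label set.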