Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "So humans are upset that our selfish nature is going to be judged by a potential…" (ytc_UgyD9x4G-…)
- "Well now i know how to tell someone theyre not a real artist if they use AI to m…" (ytc_UgwqcJmp_…)
- "That desire to perpetuate itself is what makes AI so dangerous. It's what caused…" (ytc_Ugz4swK42…)
- "@ImpeRiaLismus I'd say ai bros are more pathetic throwing a tantrum and creatin…" (ytr_UgzBIte97…)
- "There is really a very narrow path we have to go down that will likely result in…" (ytc_Ugxy2satT…)
- "Funny video, but you are missing the point. The thing about GenAI is... That it…" (ytc_UgzUjjguP…)
- "It's not that it's shit, it's that it is overvalued. The bulk of the wealth in t…" (rdc_nk6z5wa)
- "The other issue that isn't brought up is this -- say that AI does everything the…" (ytc_UgwmySliX…)
Comment

> I will never let a driver less car drive me. I would not get in the vehicle. And you only get one life it’s your fault if you put it in the hands of a robot and think you are safer. Yes I have no doubt some robots can drive better than most people. It’s the idea of letting a robot have to power of your life in its control. Nope not for me. Good luck people. Remember you get one life.

youtube · AI Governance · 2025-09-22T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzjfbaw1SI23W9Nsvp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxqfRBIrSQDLGO8cxt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyPzTuEYwDj2oDV8eZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwI0bS_44y4imRbYtB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbHzlCq0Bqv80YnSR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzgGURy9fXjGjiXe2t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwMZ6vWBZ87V9JYD3h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgykprT_ILZKaGzDE8d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgybviMWbVq7ybXWI9p4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugx5qXIK-_Yn3GIa9Vd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
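The raw response above is a JSON array, one object per coded comment, keyed by comment ID. A minimal sketch of how such a batch could be parsed and validated before the codings are trusted; note that the allowed value sets below are assumptions inferred from the samples on this page, not a confirmed coding schema, and `parse_batch` is a hypothetical helper name:

```python
import json

# Assumed vocabularies per dimension, inferred from the observed output
# on this page (not an authoritative schema).
ALLOWED = {
    "responsibility": {"user", "developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "ban", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment ID.

    Raises ValueError if a row is missing a dimension or uses an
    out-of-vocabulary value, so malformed model output fails loudly
    instead of being silently stored.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single hypothetical row:
raw = '[{"id":"ytc_x","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"}]'
print(parse_batch(raw)["ytc_x"]["emotion"])  # fear
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: each card only needs a dictionary lookup, not a scan of the raw response.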