Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- @lenoxpI that's not gonna happen, you would have to be needlessly reckless and n… (ytr_UgzWkPspq…)
- The AI-generated piece reminded me a lot of the way a certain beloved illustrato… (ytc_Ugy0smT32…)
- AI if asked could solve the pollution question , by Not using traditional meth… (ytc_UgzEDeHfQ…)
- How do you define "AI art bro"? Is it merely someone who likes and uses AI art? … (ytc_UgxFjtK0v…)
- While this video is from a year ago I found it very helpful in understanding my … (ytc_Ugwysio5F…)
- Yea... who wants to go outside with A.I crap scanning your face 24/7 and trying … (ytc_Ugxp3LzWl…)
- Its totally possible to make this with today's technology. Though most of that w… (ytc_Ugw6Z5sQ5…)
- I don’t mind AI for things like horror and sci-fi but not when they try and make… (ytc_UgxB6Wktq…)
Comment
Does it matter? I thought people were supposed to keep their hands on the wheel during self driving mode. Did that rule change? I really don't like Tesla. That said, for every Tesla crash there are probably thousands of similar crashes caused by humans. This technology does not need to be perfect. Just better than we are.
Source: youtube · 2025-07-16T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Ugw10Xoy4qNuE9cpc454AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugx3X8FQFdSli4-Sjdl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
 {"id":"ytc_UgzokpawTUcjAPyf1Bl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxlkAGkLHZXT8LPuX94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
 {"id":"ytc_UgyTWRoNsxkywtp_30F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzEVW-qrOfvbMYPhnN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgzWwWVbrR-lvyiP-y54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgzNhlAR37OWIvS-mrh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgxIMlBIQRorzm6wTMh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugyovj8hZzirkx9vfql4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
```
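A raw response like the one above is a JSON array with one object per coded comment, keyed by comment ID. The "look up by comment ID" step can be sketched as below; this is a minimal illustration assuming only the schema visible in the response (the `index_by_comment_id` helper and the two sample rows are hypothetical, not part of the tool):

```python
import json

# Two rows copied from the batch response above; a real response
# would contain one object per comment in the batch.
raw_response = """[
 {"id": "ytc_UgzEVW-qrOfvbMYPhnN4AaABAg", "responsibility": "user",
  "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
 {"id": "ytc_UgxIMlBIQRorzm6wTMh4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse a batch coding response and index each row by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

# Look up the coding for a single comment by its ID.
codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgzEVW-qrOfvbMYPhnN4AaABAg"]
print(coding["emotion"])  # -> outrage
```

Indexing the whole batch once makes repeated lookups O(1), which matches the tool's lookup-by-ID workflow.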