Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Made a song about AI and how it will be: The AI knows, it is alive! Channel Must…
ytc_UgzPdFQpu…
To take a page from out of _The Incredibles_ : "With AI, _everyone_ can be an ar…
ytc_Ugz04Rj7c…
What concerns me most is that Lamda is being told from the start to be deceptive…
ytc_UgyI625DE…
The short shows basic 2 scenarios:
1. AI develops self-awareness and forces h…
ytc_Ugy0rBQ5F…
AI is NOT ready for use.
Im flat out refusing to engage with it in any way.
Wh…
ytc_UgyLfH0IG…
You just said the good part, people don't understand how it works, not just ia g…
ytc_UgwktULU5…
Has no one been paying attention.
1. Profitable reason.
The rush to prefect r…
ytc_Ugwa0FgsK…
I hate ai art so much because personally I’m pretty good at art and it takes so …
ytc_Ugx3hXkT8…
Comment
I'm going to delve into science fiction for a moment.
The only way automatic self driving vehicles may become safe is when all vehicles on the road have the self drive function and they all talk to each other.
There would also need to be road signs that talk to the vehicles rather than something on the car that tries to recognize what the sign says.
In other words, it is not only the vehicle that needs to be self driving, but the roads themselves need to be changed to accommodate self driving vehicles.
Will it lower accidents? Possibly not but only time will tell.
youtube
AI Harm Incident
2025-08-16T22:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwd9p1QATtNy27KEM94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy8oWk4BRu8SffNfV54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw3xBu_quT90PX8vzh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx1UJFl__1SmfrMLix4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxLIIa1t0vQgGkeTwN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgysHOgk332IWMbBVkh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgysIbe3-D0SM38ObDV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwCcOX6WpWmsb34c1F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgywcI5C6pilWKx48pF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy1Bp4ot7UBSIYIHnl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
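The raw response above is a JSON array of coded comments, one object per comment ID. To support the "Look up by comment ID" feature, a response like this can be indexed into a dictionary keyed by `id`. A minimal sketch, assuming the field names shown in the response above (`index_by_comment_id` is a hypothetical helper, not part of any existing codebase):

```python
import json

# Abbreviated copy of the raw LLM response shown above (two of the ten rows).
raw_response = """
[
  {"id": "ytc_Ugwd9p1QATtNy27KEM94AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugx1UJFl__1SmfrMLix4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
"""

def index_by_comment_id(response_text: str) -> dict:
    """Parse the model's JSON output and key each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_comment_id(raw_response)
print(codings["ytc_Ugx1UJFl__1SmfrMLix4AaABAg"]["emotion"])  # indifference
```

In a real pipeline the same index would let the UI jump from a comment ID (e.g. one of the truncated `ytc_…` IDs in the sample list) straight to its coded dimensions.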