# Raw LLM Responses
Inspect the exact model output for any coded comment.
## Random samples

- "That's not true nevertheless, you can use ai to specifically make specific detai…" (ytc_Ugywn2Iq7…)
- "Greedy billionaires wanting to be trillionaires and fire the human factor and yo…" (ytc_UgxZkmag-…)
- "The way AI bros act towards artists in this comment section is so disrespectful.…" (ytc_UgzxIMHzD…)
- "This is bull. Mikow Krakow said AI is as about as intelligent as a retarded cock…" (ytc_UgzW7ghBe…)
- "Just realised that my dad is an AI, he never says when he doesn't understand or …" (ytc_UgywxK6bh…)
- "LOL, thats what happens when you are being cheap, you get a bad product. Most pe…" (ytc_UgwVHBFcM…)
- "Marxist Communist moron. What an idiot. You can't put the Genie back in the bott…" (ytc_Ugxk1FN5A…)
- "pump the brakes!!! or our greed and lust for power will kill everything. everyth…" (ytc_Ugz2qNfZJ…)
## Comment

The real reason: self driving cars sit at a stop sign forever because they’re designed too safely. The minute they learn to drive aggressively, is the minute they are better than us. Self driving cars will only become a thing, the day the force is to not drive anymore. Self driving cars need other self driving cars around them, human driving cars don’t belong in the same road. It takes a universal decision, which we are not emotionally mature enough to make.

Source: youtube · Posted: 2023-08-06T21:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[
  {"id":"ytc_UgzdVIG57GtNOjvd3uF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwsKVIEfgID2qH8as94AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwMS9LqYL3GDmvPKBl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxkKpk9wkCqcXKnyEB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw3TdOeBRcBYkhvEwR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx8OfmIZWGSMNADXHx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw9QC9e0jrMkxV2kwx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwhBbOGJtV7N5CIAPJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx5JnSnKs0HJedPInd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugyo_wwtgkLc7leV2ap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
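The raw response is a JSON array of per-comment records, each carrying the four coded dimensions shown in the table above. A lookup like the one this page performs can be sketched as follows (a minimal sketch: `index_codings` is an illustrative helper, not part of the pipeline, and only the first record is reproduced for brevity):

```python
import json

# First record of the raw LLM response shown above (truncated for brevity).
raw = ('[{"id":"ytc_UgzdVIG57GtNOjvd3uF4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')

# The four coding dimensions every record must contain.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_json: str) -> dict:
    """Parse a batch coding response and index the records by comment id."""
    records = json.loads(raw_json)
    by_id = {}
    for rec in records:
        # Reject records missing any of the expected dimensions.
        missing = [d for d in DIMENSIONS if d not in rec]
        if missing:
            raise ValueError(f"{rec.get('id')}: missing {missing}")
        by_id[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return by_id

codings = index_codings(raw)
print(codings["ytc_UgzdVIG57GtNOjvd3uF4AaABAg"]["emotion"])  # indifference
```

Indexing by `id` makes the per-comment lookup O(1), which matches how the coding result for a single comment is displayed above.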