Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Hello friends this is Elon Musk official have you get a gift for Elon musk befor…" (ytr_Ugzb3R966…)
- "Despite a boom in U.S. oil & natural gas extraction energy cost are high in the …" (ytc_Ugy8I2qPG…)
- "Big BS... AI can't be really creative, nor does it understand unique context. Th…" (ytc_UgxWDT-Fl…)
- "The danger itself is very real, I just don't think this particular guy is very g…" (ytc_UgxKngtaK…)
- "So we’re just not gonna have someone behind the wheel regardless of it being a s…" (ytc_UgxLlyLmb…)
- "This dude talk like he is an AI. He don't want to die. Not my cup of tea. 😂😅😂😅…" (ytc_UgxRKf7rF…)
- "Greed and control is horrible no one should be controlled even by AI or robota f…" (ytc_UgxOz9kcP…)
- "Everybody in this situation are clowns the people using AI aren’t making actual …" (ytc_UgwzmpRkq…)
Comment

> There is a major problem: when things go wrong who is held responsible? I believe that truck drivers should be retained on autonomous vehicles the same as one would retain pilots on planes. The technology can be affected by interferences, that is when you need truck drivers. Also if the technology and drivers work together it will make trucking safer and more effecient. Perhaps introduce "road trains" the same as those used in Australia.

Source: youtube · AI Jobs · 2025-08-22T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugzv78bHsM2J8p6L1Kh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxDIAuqb89fbblgplp4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgyyhRe5GA5yp08s3h14AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwgzTVtOfTrIs-aH2p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwwXgiZKSOfWlwu9gR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzDyVb4xvZIkaHpy4x4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxLMCI6A6xdsvw33F14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwHBqiZs3jJmiTZNQN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgypNMacUxFXxktiJ8V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx266PktmLShAhWgT94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
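As a minimal sketch of how a raw response like the one above can be consumed: the LLM returns a JSON array of per-comment codes whose fields match the Coding Result table (responsibility, reasoning, policy, emotion), which can be parsed and indexed by comment ID for the look-up view. The function name and validation logic here are illustrative assumptions, not part of the actual pipeline; the sample row and ID are taken from the response shown above.

```python
import json

# One row copied verbatim from the raw LLM response above; a real
# response is an array of many such rows.
raw_response = """
[
  {"id": "ytc_UgxDIAuqb89fbblgplp4AaABAg",
   "responsibility": "distributed", "reasoning": "deontological",
   "policy": "liability", "emotion": "indifference"}
]
"""

# Fields every coded row must carry, per the Coding Result table.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse a raw coding response and return {comment_id: codes},
    silently skipping rows that are malformed or missing fields."""
    rows = json.loads(raw)
    indexed = {}
    for row in rows:
        if isinstance(row, dict) and EXPECTED_KEYS <= row.keys():
            indexed[row["id"]] = {k: row[k] for k in EXPECTED_KEYS - {"id"}}
    return indexed

codes = index_codes(raw_response)
print(codes["ytc_UgxDIAuqb89fbblgplp4AaABAg"]["policy"])  # liability
```

Skipping malformed rows rather than raising keeps one bad row in a batch from discarding the rest of the model's output; a stricter pipeline might log or re-prompt instead.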