Raw LLM Responses
Inspect the exact model output for any coded comment. Look a comment up by its ID, or pick one of the random samples below to inspect it.
- "by 18:40, I'm getting the feeling that Microsoft isn't using chatbots but actual…" (ytc_Ugx17ik2Z…)
- "What does AI need to survive that humans do not need? Electrical power. Shut d…" (ytc_Ugwsev04k…)
- "How do I contact you about appearing in my documentary of call centers being rep…" (ytc_Ugwk46Kzr…)
- "Yep the world is taking a turn towards tech and AI, best move on the chess board…" (ytc_UgyHMdVCl…)
- "Racism will finally be resolved by AI ENGINES that are programmed without bias l…" (ytc_Ugxd3FZHg…)
- "Fight your robot against a robot made in Japan so your robot will be slaughtered…" (ytc_UgxosZL0s…)
- "Nonetheless it's safer than a human driver! Right! There are people that have dr…" (ytc_Ugxn07BJe…)
- "People are the ones in control of what the AI do or dont do. And well i dont tru…" (ytc_UgxmYW6wb…)
Comment
I wonder if the driver knew pressing and holding the parking button on the stalk activates the emergency handbrake ? Other than that I'm very surprised the tesla didn't automatically stop, mine is scared of its own shadow and automatically steers if it thinks there is potential for a collision. I suspect the car has been modified / manipulated.
Source: youtube, posted 2024-11-19T10:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
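The four coded dimensions, and the values that appear on this page, can be summarised as a small record type. This is a minimal sketch based only on what is visible here, with field names taken from the raw response below; the full codebook may allow other values.

```python
from dataclasses import dataclass

# Values observed on this page; the full codebook may define more.
RESPONSIBILITY = {"ai_itself", "company", "user", "unclear"}
REASONING = {"consequentialist", "deontological", "unclear"}
POLICY = {"ban", "regulate", "liability", "none", "unclear"}
EMOTION = {"fear", "outrage", "indifference", "approval", "unclear"}


@dataclass
class CodingResult:
    """One coded comment, mirroring the fields of the raw LLM response."""
    id: str
    responsibility: str = "unclear"
    reasoning: str = "unclear"
    policy: str = "unclear"
    emotion: str = "unclear"
```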
Raw LLM Response
[{"id":"ytc_UgwWesBbYB_YhO-86Pd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyzE2bzoJdkLjoYzkR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzJg4bnS8nPqcbxra14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5Q4q8bEJjCJhach54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzGWpnJw3CCyWh46Ol4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxX-945EBbX8NbDIGB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz92iH0tyjxK1cAWVp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyJ6fZhTA6eGMHsu854AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxnGp6rKFpGEQlre3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdvBFa7gxSkWMBcrp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"})
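A coding result that is entirely "unclear", as in the table above, can arise when the comment's ID is absent from the batch response or when the response text does not parse as valid JSON (note that the response above closes with `)` rather than `]`). The sketch below shows one way such a lookup could work; `find_coding` is a hypothetical helper, not the project's actual parser.

```python
import json


def find_coding(raw_response: str, comment_id: str) -> dict:
    """Return the coding for one comment from a raw batch response.

    Falls back to all-"unclear" when the response is not valid JSON or
    the requested comment ID is not present in the batch.
    """
    fallback = {
        "id": comment_id,
        "responsibility": "unclear",
        "reasoning": "unclear",
        "policy": "unclear",
        "emotion": "unclear",
    }
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        return fallback  # e.g. a stray ')' where ']' was expected
    for record in records:
        if record.get("id") == comment_id:
            return record
    return fallback
```

With a well-formed response, an ID that appears in the batch returns its record; a missing ID, or a malformed response like the one above, falls back to the all-"unclear" result shown in the coding table.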