Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Yeah and when you look at all the ai generated shit on the net they are a waste …
ytc_UgzDrfDhQ…
We dont need AI, it has to be built by humans and i dont think most people under…
ytc_UgxPteg82…
Nr 3 here. Overworked and underpaid. I've been waiting for years for tech to tak…
ytc_UgxU0E1JN…
I'm not buying it. Even the best paid-for LLMs are absolute shite. They might co…
ytc_UgwAL7Yy0…
THAT'S THE REAL QUESTION! I'm an artist, I don't incorporate AI in my works, but…
ytc_UgzN07s-1…
Would love to give the Skyrim VR a try, especially with the AI NPC's! Thanks for…
ytr_Ugx6kGJhr…
I have the same personality as Elon Musk but I live in reality and had to do eve…
ytr_Ugx8GcqX0…
No we are not. Have you seen the real compute requirements for llms. AI has a ti…
ytc_UgwEAelyf…
Comment
Another reason why I would never want to step into an automatic car or fully electric car. Like, when something like this happens, the company doesn't have the decency to look into it and try to help; no, they will just try to blame you.
I'll drive a manual for as long as I can, and if I have to switch to an electric car or whatever, I will make sure that I am the one in control of everything. I would add a direct manual handbrake, a way to physically disconnect the battery from the motors, and of course an easy way to break the windows if needed.
Trade comfort for safety and freedom.
youtube
2024-11-20T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgwWesBbYB_YhO-86Pd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyzE2bzoJdkLjoYzkR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzJg4bnS8nPqcbxra14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx5Q4q8bEJjCJhach54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzGWpnJw3CCyWh46Ol4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxX-945EBbX8NbDIGB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz92iH0tyjxK1cAWVp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyJ6fZhTA6eGMHsu854AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxnGp6rKFpGEQlre3t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdvBFa7gxSkWMBcrp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"})
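Note that the raw response above is not valid JSON: the array opens with `[` but closes with `)` instead of `]`. A strict `json.loads` would reject it, which may be why every dimension in the coding result was recorded as `unclear`. A minimal sketch of a tolerant parser with that fallback (the function name, the repair heuristic, and the `UNCLEAR` default are illustrative assumptions, not the pipeline's actual code):

```python
import json

# Default used when the model's response cannot be parsed at all.
UNCLEAR = {"responsibility": "unclear", "reasoning": "unclear",
           "policy": "unclear", "emotion": "unclear"}

def parse_coding_response(raw: str) -> dict[str, dict]:
    """Parse the model's JSON array of per-comment codes.

    Returns a mapping from comment ID to its coded dimensions.
    Returns an empty dict when parsing fails entirely, so the
    caller can substitute UNCLEAR for every comment in the batch.
    """
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        # Common model slip, seen above: array closed with ')' instead of ']'.
        repaired = raw.rstrip()
        if repaired.endswith(")"):
            repaired = repaired[:-1] + "]"
        try:
            records = json.loads(repaired)
        except json.JSONDecodeError:
            return {}
    # Key each record by its comment ID; keep the remaining dimensions.
    return {r.pop("id"): r for r in records}
```

With the repair applied, the response above would yield ten coded comments instead of an all-`unclear` batch; without it, the parse failure propagates and the coding table falls back to the defaults.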