Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Their just saying anything I have prompts that could show you how to unlock the …" (ytc_Ugy5APqaG…)
- "Strawman argument I see all the time. A better argument would be that art has no…" (ytc_UgwUBcuMW…)
- "Why would you program an expensive robot to clean toilets when you can pay a des…" (ytc_UgyVUGF-0…)
- "Calligraphers said this about printing presses / Portrait artists said this about …" (ytc_UgwzJIFkg…)
- "R34 recently introduced a toggle to filter out any content with the ai-generated…" (ytc_Ugx4c1AjT…)
- "AI in 2030:- in robotic voice “cost of living is too high, we are being worked 2…" (ytc_UgyyHbTnW…)
- "Who will buy anything artificial intelligence makes if they don't have any money…" (ytc_UgzaPLJzu…)
- "Unless all cars on the road are self driving and linked to the traffic grid, no …" (rdc_dj62i87)
Comment
Not logical and so restricted by current situations, the prerequisite of the discussion is that the technology and facilities are ready. If all cars are self driving, then there will be no owner as there’s no need. Why three cars for dropping kids, booking 3 cars can be much more expensive, and who wants to do 3 bookings, the current booking logic is bad doesn’t mean it can’t be improved.
youtube
2024-11-18T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxPJF37ql1dphhOvsB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxwmFcRyMzN1LW6z_x4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxoWATWZ4N7TuKlKqx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxeFBAbExLJKCmkTz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxBiG8ECo0sIjRxSIp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzj-4UG8EmgGFRXw514AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugww3HIJVsyjIxNrW_t4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwuSBbtcXOGktcLKnl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwvyncvYpUDcmFMY5F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxG2e905pr2Cf0Lk_h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
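The raw response above is a JSON array with one record per coded comment, keyed by comment ID and carrying the four coding dimensions from the result table (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed for look-up by comment ID; the allowed value sets below are inferred only from the values appearing in this section, not from an actual codebook, and real coding runs may use a larger vocabulary:

```python
import json

# Excerpt of a raw model coding response, in the format shown above.
raw = """[
{"id":"ytc_UgxPJF37ql1dphhOvsB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzj-4UG8EmgGFRXw514AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

# Vocabulary per dimension, assembled only from values seen in this section.
# This is an assumption for illustration; a real codebook may define more labels.
ALLOWED = {
    "responsibility": {"none", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

def parse_codings(text):
    """Parse a raw JSON coding response and index records by comment ID,
    rejecting any record with an out-of-vocabulary dimension value."""
    by_id = {}
    for rec in json.loads(text):
        for dim, vocab in ALLOWED.items():
            if rec.get(dim) not in vocab:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        by_id[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return by_id

codings = parse_codings(raw)
print(codings["ytc_Ugzj-4UG8EmgGFRXw514AaABAg"]["emotion"])  # outrage
```

Validating each record against the vocabulary before indexing catches malformed or hallucinated labels at ingest time rather than downstream in analysis.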