## Raw LLM Responses

Inspect the exact model output for any coded comment.
Responses can be looked up by comment ID, or opened from the random samples below (comment text and IDs are truncated in the preview):

- "Like anything in life, a simple infection can kill Ai... Be ready with the infe…" (ytc_UgwlhZO67…)
- "There's exactly 1 AI image that I still kinda think about to this day. A while a…" (ytc_Ugy-cO4Mo…)
- "No because it uses no actually artwork in the generation process. The art that w…" (ytr_UgwQNeL7-…)
- "Please keep in mind that we are still far away from a true AGI. All publicly ava…" (ytc_Ugz-wJU7k…)
- "okay so ur saying im born with art skills and i learned how to get adhd mr ai su…" (ytc_Ugzeqht55…)
- "If you work in a job that requires efficiency and repetition that follows a logi…" (ytc_UgyRZlecs…)
- "this is bullshit. the AI doesn't have feelings like survival. it can't think …" (ytc_Ugx1cjMRO…)
- "Interviews using AI? Wow! I remember when I struggled with marketing until I fou…" (ytc_UgzsNdP5J…)
## Comment

> *They are NOT statistically safer.* Waymos only drive in safe roads, safe conditions. Real drivers do not have that choice. Comparing 'safe' miles versus 'all' miles makes the car restricted to safe miles look safer. Statistics can be be very deceptive. Will Pichai (Google CEO) agree to being tied on down on a chair, put in the middle of a square, with dozens of Waymos cruising around randomly *WITHOUT ANY BACKUP REMOTE DRIVER.?* Nope.

Source: youtube, posted 2025-06-29T22:4…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response

```json
[
  {"id": "ytc_Ugw0NqzS2wUaGjnaG2V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugy3C1ispwr-yVvOmih4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxKQI3PP4FCBUhRBhx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwsB4p7Iy_9r2erxQ94AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwyhfrtGaIW6m0Mq1R4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxtEqkt-n0QJ8dkGGp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxZNRwrLl24jdtkcNd4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_Ugxua3n_0_IJrDb_pAJ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxAgE0vQupprhUgDQ94AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytc_Ugzvb1ecNHkaLWtAfAN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
```