Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I would imagine a law-specific AI model could be very useful but still require heavy verification. It's very clear that for a general model it would be nearly impossible for predictive text generator to accurately site any law decisions. Unless you give it a specific case, but even then I would guess it would be prone to screwing it up.
Platform: youtube · Video: AI Responsibility · Posted: 2023-06-12T17:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[{"id":"ytc_Ugx0ySlvkjP5XeCQRAJ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwn_R__QVOZ_O2tYNR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzcABFYR9npLxBQ6r94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwk2s8VxMyqRUF0iuR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzS9NjiAqiNSjv-2bB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz7xkhdxmcG0IkXcLV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyVQUjRKE7vLsbPxPB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzk1uhye11QHbLF6Z54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx5ld0moXVu-aNbOhJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxAhDVjf0JZhiDaZHp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"}]
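Since the coder returns one JSON array per batch, looking up the coding for a single comment means parsing that array and filtering by `id`. Below is a minimal sketch of such a lookup; the `lookup` helper is hypothetical (not part of any tool shown here), and the sample records are copied from the raw response above.

```python
import json

# Two records copied from the raw LLM response above; the real batch
# contains ten.
raw_response = """[
 {"id":"ytc_Ugwk2s8VxMyqRUF0iuR4AaABAg","responsibility":"developer",
  "reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzS9NjiAqiNSjv-2bB4AaABAg","responsibility":"user",
  "reasoning":"deontological","policy":"none","emotion":"outrage"}
]"""

def lookup(raw: str, comment_id: str):
    """Return the coded record for one comment ID, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

record = lookup(raw_response, "ytc_Ugwk2s8VxMyqRUF0iuR4AaABAg")
print(record["responsibility"], record["emotion"])  # developer fear
```

In practice the raw response would first be validated (well-formed JSON, one record per submitted comment, values drawn from the allowed codes for each dimension) before any lookup, since the model can return malformed or incomplete batches.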