Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "ChatGPT is blocked at my work but I just use bing search with the built in AI se…" (rdc_l57zf7g)
- "It turns out that there is a lot wrong with being online. Predators grooming chi…" (ytc_UgwoTh9yF…)
- "OpenAI boss Altman just accused humans for being more wasteful than AI. They are…" (ytc_UgwETB4fe…)
- "*Eugh,* I feel so bad for you. It's unfortunately quite similar to a time that *…" (ytr_Ugx_7LFD-…)
- "As ai is is developing I think humans now have limited thinking as have tested a…" (ytc_Ugzo-k2ba…)
- "this is all bullshite we as humans could allways turn of the power AI needs to b…" (ytc_UgwvnDqAA…)
- "I agree with everything that you are saying. I find it interesting that curren…" (ytc_Ugwc-OWG7…)
- "Ya'll it's not too hard to know the difference between AI and real art, plus if …" (ytc_UgxwXjNyV…)
Comment
17:30 Tyson may be a physicist but that doesn't mean he knows everything. He just doesn't understand AI. My 40 years of programming and most AI researchers agree that AI does pose significant dangers and AGI will arrive before the turn of the decade.
youtube · AI Moral Status · 2025-07-26T19:5… · ♥ 18
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
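Each coded comment carries the four dimensions shown above. A minimal sketch of how a result could be validated against its allowed categories; the category lists here are assumptions inferred from the values visible on this page, not a definitive codebook:

```python
# Hedged sketch: validate one coding result against allowed category values.
# The category sets below are inferred from values seen on this page and are
# assumptions, not the pipeline's actual codebook.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "user", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological",
                  "virtue", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "disapproval", "approval", "mixed"},
}

def validate(coding: dict) -> list[str]:
    """Return a list of problems; an empty list means the coding passed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = coding.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The coding result shown in the table above.
print(validate({"responsibility": "ai_itself",
                "reasoning": "consequentialist",
                "policy": "regulate",
                "emotion": "fear"}))  # []
```

A check like this catches a model that drifts off the category vocabulary before the coding is stored.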
Raw LLM Response
```json
[{"id":"ytc_UgwvfH3DYcJINVQd5v14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw32AZTaoruABeiTiR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxWF4MB3b50NcdNM0t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzr53-ydkVmP_aVi_N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyW0-DTgTVQe-PbptR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwsU-037qinPuvJFZt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwY0iMpgV1G3RQjBIh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgwxVWoBzOsfq9T13NZ4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxeol6pm3FB_zFwpTl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzYnEHKRab6Xoy08aV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]
```
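A raw batch response like the one above can be parsed and indexed by comment ID to support the "look up by comment ID" workflow at the top of the page. A minimal Python sketch; a shortened two-entry copy of the array stands in for the full response:

```python
import json

# Raw LLM response: a JSON array of per-comment codings, one object per
# comment (shortened here to two entries from the array shown above).
raw_response = """[
{"id":"ytc_UgwvfH3DYcJINVQd5v14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxWF4MB3b50NcdNM0t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]"""

# Parse the batch and index each coding by its comment ID.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one coded comment by its ID.
coding = codings["ytc_UgxWF4MB3b50NcdNM0t4AaABAg"]
print(coding["policy"], coding["emotion"])  # regulate fear
```

Indexing into a dict makes the per-comment lookup O(1), which matters when a run codes thousands of comments in batches.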