Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugwjfhuq8…: "I have one weird conversation with Ai where after talking with chat gpt about ra…"
- ytc_UgzMp826d…: "Make no mistake: AI and its potential to do more harm than good to society is re…"
- ytc_Ugw2NkSYO…: "Good thing I’ve never asked my ai chatbot about anything illegal- which it’s not…"
- ytc_UgxdB0QDV…: "AI had always scared the crap out of me. All of this technology scares me. Going…"
- ytc_UgygIz3NO…: "Sorry, but I don't buy that a Tech CEO would know this better than anyone. That'…"
- ytr_UgxPhUljJ…: "First the unemployed will seek jobs in branches not affected by AI yet. Once the…"
- ytc_UgyE5RBiK…: "I use "artist" very loosely because landscapes, cars, houses, bread, beer, wine,…"
- ytc_Ugy7ovF47…: "I believe we are missing the point when it comes to AI.The industry pushing for …"
Comment

> The only way AI could harm humans is if humans program AI to harm humans or to harm and eliminate that which is a cause of harm to something outside itself, i.e. humans harming plant life, animal life, animate life and inanimate life.

| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Harm Incident |
| Posted | 2023-06-02T10:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyHr0NlnBNNZ3S2IGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyCE9vZy1w9Nxe_-0l4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzWYN9-yWfVTvG6F6d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyXwVVFUSjmc7T9PC94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwE0846lQ-qN7vdkx94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzkQfJAl1li8d-DIaB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwmRMtP316ptKhc5FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxpst7DW2VbEzg-BqZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzp0R7Gm1FHAhK40bh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKF_NWQMyrmKz-sJR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
```
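The "look up by comment ID" step can be sketched in Python: parse the model's JSON array and key each coding record by its `id`. This is a minimal illustration, not the tool's actual implementation; `index_by_id` is a hypothetical helper, and the embedded payload is trimmed to two of the records shown above.

```python
import json

# Trimmed sample of the raw LLM response above (hypothetical subset for illustration).
raw_response = """
[
  {"id":"ytc_UgyHr0NlnBNNZ3S2IGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxKF_NWQMyrmKz-sJR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
"""

def index_by_id(response_text):
    """Parse the model's JSON array and map comment ID -> coding record."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(raw_response)
coded = codings["ytc_UgxKF_NWQMyrmKz-sJR4AaABAg"]
print(coded["responsibility"], coded["policy"], coded["emotion"])
# developer liability resignation
```

Keying by ID this way makes the coding table for any sampled comment an O(1) lookup, which is what the inspect-by-ID view needs.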