Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Superintelligence... I certainly haven't experienced it using ChatGPT or G…
ytc_Ugz9jbIK0…
If AI replacing 80% of jobs cannot coexist with current capitalism.
Either:
…
ytc_Ugz7w5Pls…
There are only a few times I’ve ever used AI, and it was only for fun or when th…
ytc_UgxV_gzor…
if ur job was replaceable by AI then ur job was too easy. I work w/ several "Dat…
ytc_UgwFR3AVN…
I am a mechanical engineer, and run into this situation recently. I was trying t…
ytc_Ugwr4KXbF…
AI has taken translations down already by 2010. Biggest mistake of my life to st…
ytc_Ugzs96327…
Short answer is no, programmers won't be replaced by AI. While AI can streamline…
ytc_UgzayfLor…
If AI increases productivity to the point where human labor is not as necessary,…
ytc_UgyS3GJCL…
Comment
Why can’t ai make mechanical agents for mechanical tasks instead of trying to pull teeth everytime from the ai. Good example is for the ai to pass to a dumb agent to count up to one million. Or calculate mathematics problems. The stupid ai couldn’t do step by step calculations of 5x5 matrices. It should have tools like wolfram alpha or nasa horizons
youtube · AI Responsibility · 2025-10-09T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw6nia84y7t65s1IK94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwetN_J-bOhbvQ4AZR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy2gU0N5Dw-auyxiqp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwYXf3CX96H63RqqLd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgzuSj4A4OdQyojcwhF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxJQOxZEcS9YCSRwXF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyiP4eJ62vMkZ8jwOl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwAXm2Ng6LNYsjTgkF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxGrYIX5b02__DNTK14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx6WtzleY_3Ez_qvY54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
```
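The look-up-by-comment-ID workflow above can be sketched in Python, assuming the raw model output is the JSON array shown (only two rows are reproduced here for brevity; the variable and key names mirror the fields in the response):

```python
import json

# Raw LLM response: a JSON array of coded comments, as displayed above.
# Two rows copied from the response; a real lookup would load all of them.
raw_response = """[
{"id":"ytc_Ugw6nia84y7t65s1IK94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyiP4eJ62vMkZ8jwOl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding result by its ID.
row = coded["ytc_Ugw6nia84y7t65s1IK94AaABAg"]
print(row["responsibility"], row["emotion"])  # developer outrage
```

Because the model returns one JSON object per comment, keyed by `id`, indexing into a dict is all the "look up by comment ID" feature needs.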