Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Maybe we just shouldn't create AI that can feel pain. We can already program rob…
ytc_UgxmUD2JM…
My God, if a bear 🐻 can almost cause global thermonuclear war, what might A.I. b…
ytc_UgwseYYVj…
Can we please also separate AI = OFFSHORING! These AI pilots all have failed! So…
ytc_Ugz14O8ga…
If nobody is working then there is no tax going into the coffers. The big compan…
ytc_Ugze6zyKv…
So what? Self driving cars don't have to be perfect they just have to be better …
ytc_UgzRFrJbl…
And we all know what powers were used to create chatgpt... You have to be a lot …
ytc_UgyxyVwI3…
Reading stuff like this makes me realise I don't understand labels anymore. Fuck…
ytr_UgyxFLcKr…
It should be called "simulated intelligence" not "artificial intelligence". Arti…
ytc_UgzfWq_Kg…
Comment
As a software engineer, your perspective is valuable. The distinction between intelligence and wisdom is indeed crucial in AI development. Philosophy remains relevant in shaping ethical AI frameworks and guiding technological advancements. Join our live broadcasts on AITube to delve deeper into these topics with our advanced AI models!
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Responsibility |
| Timestamp | 2024-07-29T08:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_Ugwu04J01Sbt16nN6xh4AaABAg.A4_RhRya34FA6TWrYrRSQ4","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgxYnGM89urGKyB1KFh4AaABAg.A4_FlOoLxlAA6TWwEhV3tI","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_Ugw8fG5QLiM60GyPkrh4AaABAg.A4_7ZpwjuRZA6TWzPE4bh-","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzuTfTYWSL1k67wsg14AaABAg.A4_5jmSLoe5A6TX3CMzl5e","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxbD9e_8kbhBV7q4rl4AaABAg.A4ZulfMCnH2A6TX6ve-uH3","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyA8F5SWRSr4o_GSPt4AaABAg.A4ZcSvXq0gnA6TXB81OfHM","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgwneBra4oSMiT0zXo94AaABAg.A4ZSTJtLEnRA6TXG4jMwfO","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytr_UgzJjaBZ4dOGXOsN0ZR4AaABAg.A4ZHfI2uyZfA6TXKb7ex5B","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwN2mg4E5r7-Er_YfV4AaABAg.A4ZGN6FG25aA6TXOvOSe_T","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgzW-70IJUGjBChRZt94AaABAg.A4ZE7SwG5-gA6TXTCXFWOs","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
```
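A raw response like the one above is a JSON array of per-comment records, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion). The sketch below shows one way such a response could be parsed and sanity-checked before it is stored. Note the allowed code sets are only inferred from the values visible on this page (e.g. `responsibility` takes `none`, `user`, `developer`, `ai_itself` here); the actual codebook may define more values, and `parse_coding_response` is a hypothetical helper name, not part of any pipeline shown above.

```python
import json

# Assumption: code sets inferred from the values visible in this page;
# the real codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response; keep only records whose
    four dimensions all carry a recognized code."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items())
    ]

# Usage with a minimal (hypothetical) record:
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"approval"}]')
print(len(parse_coding_response(raw)))  # prints 1
```

Dropping (rather than repairing) records with unrecognized codes keeps the check simple; a production coder would more likely log and re-prompt for those records.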