Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Studies have shown that developers using AI are 19% slower than when working wit…
ytr_Ugy_7tOWQ…
Seeing this robot we can come conclusion that this robot has a designer but athe…
ytc_UgzrF-1OA…
Merry Christmas to you too, Sam. I totally agree with this. It's unfair, In my c…
ytc_UgznZJHPX…
Machine learning- We created specific instruction set Standards by our own langu…
ytc_UgzpPKFHu…
I don’t think that what you said is accurate as AI is currently training on way …
ytc_UgwnunNj5…
@savagee4641 Robots don't have emotion.If robots don't have emotion than how we …
ytr_UgzopjSIz…
I think this snake will eat its own tail. But unfortunately, it's a large snake,…
rdc_nklqd20
I haven't worked in years so I'm ahead. Ask AI: Why does Hinton look like my f…
ytc_Ugy1lLyuW…
Comment
Yes true, but the fact is AI will take part everywhere! For example, In the Future, Flights come with built in AI assistance then flight makers projects it like, AI saves flight from danger. But, when AI causes flight crashes then nobody points AI faults, &
Still makers continue adding more AI controls to planes.
Keynote : such things are going to happen, and we can't even stop it.
youtube
AI Responsibility
2025-08-14T04:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_UgwrJhCTZbtpttuSaNZ4AaABAg.AKbKzX3D7jfAOhn1-P4Cx6","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgwrJhCTZbtpttuSaNZ4AaABAg.AKbKzX3D7jfAP4jVTJ-m1u","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxlxyTAp3C6Cx3KSUV4AaABAg.AKKW7IENboxAKYfi6fhn0q","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugw0oJeHvUlTWXybU8x4AaABAg.AJB_R8Rl98KALn7sFLG2lX","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_Ugw0oJeHvUlTWXybU8x4AaABAg.AJB_R8Rl98KAM7U1JDf97W","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytr_UgwQJ43JYV6Q1F8SHat4AaABAg.AIvTfKzSk4jANzxNQRyqUX","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwIqP3UbqGwbel1BtV4AaABAg.AIowsdJkfE9AJv2jYRW0HU","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UgwIqP3UbqGwbel1BtV4AaABAg.AIowsdJkfE9ALn7d8FtzBz","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgwIqP3UbqGwbel1BtV4AaABAg.AIowsdJkfE9AMeg1j0KIyo","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgzRE8Y8VZvwWnihVXB4AaABAg.AIgwhjn9tuzAV1e9tGXE4I","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
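The raw response above is a JSON array in which each record carries a comment `id` plus the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal validation sketch in Python, assuming the allowed labels are limited to the values visible in this sample (the real codebook may define more):

```python
import json

# Allowed labels per dimension, inferred only from the sample records and
# the Coding Result table above -- an assumption, not the full codebook.
SCHEMA = {
    "responsibility": {"ai_itself", "none", "unclear", "distributed", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "industry_self"},
    "emotion": {"fear", "indifference", "mixed", "approval", "resignation"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records.

    A record is kept when it has an "id" and every dimension holds
    one of the allowed labels.
    """
    valid = []
    for rec in json.loads(raw):
        if "id" not in rec:
            continue
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical record mirroring the first entry in the response above.
raw = ('[{"id":"ytr_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
print(len(validate(raw)))  # 1
```

Records with unknown labels (e.g. a hallucinated `"emotion":"joy"`) are silently dropped here; in practice one would likely log them for re-coding instead.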