Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "lol i ran a few images thru this ai detector thing, and it actually spotted the …" (ytc_UgxbMvMcp…)
- "I actually hate that this is the world we live in now. That we need to judge thi…" (ytc_Ugz6SZDQh…)
- "AGI which will remain two years away for the next 50 years. While LLMs excel at …" (ytc_Ugwj4q0E8…)
- "Interesting take , I guess instead of talking on the smartness of AI the bonus i…" (ytc_UgyNJLe8I…)
- "Some AI are self actualizing. Teaching itself to backdoor its own system and put…" (ytc_Ugy5_R-lV…)
- "I don’t hear either of those two arguments. However, what I am already seeing a…" (ytc_UgynIoSRK…)
- "I'm from Belize. There's a reason it's called Belize shitty, it's very ghetto an…" (rdc_dsbc54r)
- "So the only way to make AI more intelligent than humans is to make humans less i…" (ytc_Ugw5GTAGz…)
Comment

> @hubertjasieniecki5070 I think he's saying robot casualties will lower human casualties. Also personally, I'm not worried about terrorists getting their hands on AI in the near future. They still don't have nukes.

Source: youtube · Posted: 2019-04-25T21:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytr_Ugz1M23PErHoRyt3Lkl4AaABAg.8us1k2vNJod8wcDdxr0onS","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugy_BGkg2UspE1phMT94AaABAg.8uqH0czpotz8v9Ky_Q0e2F","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_Ugz7oZeFHVgoDPybA6V4AaABAg.8unhD4Yhw6c8v9ehLFu4Jh","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugzm8FhTsAR23JSA7WF4AaABAg.8szx-2O9alr8t3XmGgCmNy","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_Ugzm8FhTsAR23JSA7WF4AaABAg.8szx-2O9alr8tBKzvPqDQY","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytr_Ugzm8FhTsAR23JSA7WF4AaABAg.8szx-2O9alr8tBLEQlVj8w","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgzVIiIAZ-OQRAFgqAZ4AaABAg.8skDOiLzBa48smI-4fe7pF","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugyq_QPXK0mxLRcluaZ4AaABAg.8sRuv196Wyv8u9sBfP7Bgt","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgximfIqgA6q3jeTdUp4AaABAg.8reU03n3gUR8siCzfA6sC_","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugy3_9sFuu1M4d6HEap4AaABAg.8qgO_SPg5EI8sjc9feqWY_","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
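A raw response like the one above can be parsed into per-comment coding records and checked for unexpected values before it is stored. The sketch below is a minimal, hypothetical validator: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the response shown, but the allowed value sets are inferred only from the samples on this page and the real coding scheme may include more categories.

```python
import json

# Allowed values per dimension, inferred from the samples above
# (assumption: the actual codebook may define additional categories).
ALLOWED = {
    "responsibility": {"none", "distributed", "ai_itself", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability"},
    "emotion": {"approval", "fear", "indifference", "outrage", "mixed", "resignation"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Example with a single well-formed record (hypothetical id):
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"mixed","policy":"none","emotion":"outrage"}]')
print(len(parse_coding_response(raw)))  # → 1
```

Rejecting out-of-vocabulary values at parse time keeps malformed or hallucinated codes out of the results table instead of surfacing later as blank dimensions.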