Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID

Random samples — click to inspect:
- ytc_UgygH_3NX… — "For all those that criticize driverless cars, my grandmother was fatally struck …"
- ytr_UgwbcVw3U… — "1. Technical Reality: AI Self-Improvement and Risk / Scaling Laws: AI capability …"
- ytc_Ugxze1GQB… — "When you choose to work for evil - you will turn into deceit sooner or later. Th…"
- ytc_UgwQzaL7z… — "Shadiversity's downfall was funny to me. He wrote a book because he had interest…"
- ytc_UgxOWUqbC… — "Stuff that the ai is named allah and really wants to be part of the parade😂…"
- ytc_UgzDfsapx… — "I love how these AI people have all the answers. You have a question, they will …"
- rdc_k8whrf4 — ">Keeping a 'human in the loop' in nuclear command and control is essential gi…"
- ytc_UgimsOSI-… — "If its my car it better put my life first, if 2 people jump out in front of my c…"
Comment
we're at the AI equivalent of where the atomic bomb could cause a chain reaction that destroys the whole planet, Its just a probability, no one knows for sure what it will do but I sure do hope that people make sure that the percentage is lower than the 0.2 or 0.02 (I don't know if that's right) that is was for the Manhattan project
Platform: youtube
Topic: AI Governance
Posted: 2024-04-20T11:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugzetiop39l6WX4GZI54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw566Qs3Q-V1eam3I54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw0rExiCYEJeD4HAsB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxH8NAWXDYNQmSc7rp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwGcXkpCQV7N6KagN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyp8OlELh7er5nsBUV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxnm588c-YZ1lyLBlt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFBYVEzLdwyvT8nsV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwFGKtVn8-FRq2Dpll4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxu7leEA9xcFl24W2B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
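The response above is a flat JSON array in which each record codes one comment on four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be parsed and indexed by comment ID follows; note that the allowed value sets are inferred from this single sample, not from the project's full codebook, so they are assumptions:

```python
import json

# Verbatim copy of the raw LLM response shown above.
RAW_RESPONSE = """[
{"id":"ytc_Ugzetiop39l6WX4GZI54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw566Qs3Q-V1eam3I54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw0rExiCYEJeD4HAsB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxH8NAWXDYNQmSc7rp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwGcXkpCQV7N6KagN4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyp8OlELh7er5nsBUV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxnm588c-YZ1lyLBlt4AaABAg","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzFBYVEzLdwyvT8nsV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwFGKtVn8-FRq2Dpll4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugxu7leEA9xcFl24W2B4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]"""

# Allowed values per dimension — inferred from the values appearing in this
# one sample; the real codebook may define more (assumption).
ALLOWED = {
    "responsibility": {"developer", "distributed", "user", "company", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "outrage", "resignation", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid records by comment ID.

    Records with a missing or unrecognized dimension value are skipped,
    since a constrained-output model can still occasionally drift off-schema.
    """
    valid = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid[rec["id"]] = rec
    return valid
```

Indexing by ID is what makes the "look up by comment ID" view above cheap: the coding shown for the displayed comment (distributed / consequentialist / regulate / fear) is a single dictionary access once the batch response is parsed.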