Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response directly by comment ID.
Random samples

- "The risk of autonomous weapons that can decide whom to kill is real. The problem…" (`ytc_UgwaG7MRK…`)
- "Good grief, please spare us all this . If this is what AI is best at, then are w…" (`ytc_UgwZ9BlvY…`)
- "Yep. So old it's reposted on /r/jokes https://www.reddit.com/r/Jokes/comments/b…" (`rdc_kubqz2l`)
- "Hmmm artificial intelligence after you insert micro chip in human beings,half h…" (`ytc_UgwhD_uR-…`)
- "AI is on the edge of becoming so accepted that it will be impossible to throttle…" (`ytc_UgzqnCrGG…`)
- "You’ve made some valid points but I like ai even though I’m an artist I hope in …" (`ytc_Ugx6DiNvF…`)
- "@GrumpDog What i am saying is that, AI is not a tool. Comparing a tool like a …" (`ytr_UgzVngHa7…`)
- "Tldr: the reason AI couldn't do it is because the guy who prompted it didn't hav…" (`ytc_Ugzl9QCsN…`)
Comment

> Bet, AI's don't work like people. An AI behaves as it is designed to.
> The big concern becomes when it decides what it's designed to do accidentally conflicts with our own interest.

Source: youtube (AI Governance), posted 2024-02-06T16:0…, ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
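
A coded record like the one above can be checked programmatically before it is stored. The sketch below is a minimal validator; the allowed values are only those observed on this page (the full coding scheme may permit more, e.g. other reasoning styles or policy stances), and the example record is hypothetical.

```python
# Sketch: validating one coded record against the dimensions shown above.
# NOTE: these value sets are inferred from the outputs on this page only;
# the actual coding scheme may include additional categories.
OBSERVED_VALUES = {
    "responsibility": {"developer", "user", "ai_itself", "government", "none"},
    "reasoning": {"consequentialist", "virtue"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "outrage", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with a coded record (empty if it looks valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in OBSERVED_VALUES.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in observed values")
    return problems

# Hypothetical record in the same shape as the coding result above.
record = {"id": "ytc_example", "responsibility": "developer",
          "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
print(validate(record))  # → []
```

Running the validator over a whole batch makes it easy to flag any response where the model drifted outside the coding scheme.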
Raw LLM Response

```json
[
{"id":"ytr_UgzoiqacwDnTsirvXtl4AaABAg.9zb5LdYWvsR9zmWQZlgGtJ","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzoiqacwDnTsirvXtl4AaABAg.9zb5LdYWvsR9zobHOTJRd6","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytr_Ugxqa0xJTu6jeoZc2Vp4AaABAg.9zax21AhBMW9zb0XwiL8YM","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A-UHouISpuH","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A-zeYdA34G6","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A0BTnkQyGBq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A0EIUq1Gy0b","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxO1r6O8j6UlDPOd_F4AaABAg.9zamwSx9CB2A0EPCVssybp","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytr_UgyAn9UMXeFG0p5fasN4AaABAg.9zaeaIkd0ji9zaxX0297JL","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzwoaXz9oqsdxViHWF4AaABAg.9zaPkUsCS8F9zaxm-ga8gV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
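
The raw response is a JSON array of coded comments, so "look up by comment ID" reduces to parsing the array and indexing it by `id`. A minimal sketch, using a hypothetical two-item response in the same shape as the one above:

```python
import json

# Hypothetical raw LLM response: a JSON array of coded comments,
# matching the shape of the real output shown above.
raw_response = '''[
  {"id": "ytr_abc", "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]'''

records = json.loads(raw_response)
by_id = {r["id"]: r for r in records}  # index once, then look up by comment ID

print(by_id["ytr_abc"]["emotion"])  # → fear
```

Building the dictionary once gives constant-time lookups, which matters when the same batch of responses is inspected repeatedly.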