Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up by its ID.
Random samples:
- rdc_oht7dtq: "AI is almost at the point that it can get citations correct. I'd say one or two …"
- ytc_Ugi_78FWy…: "The autonomous car would never be that close to the car in front of it.…"
- rdc_oflfu54: "Then when your productivity drops (because of course it will when you're babying…"
- ytr_Ugzrnp1Zo…: "i don't hate the people, i hate the companies putting people out of jobs and ste…"
- ytc_UgxbVGZ5g…: "Put them head to head. No system is 100% safe. But FSD is as safe if not more …"
- ytc_Ugy0hP-Co…: "Too bad the juniors impacted by AI won't be the ones getting the opportunity of …"
- ytc_UgygpjtHe…: "Your AI tech isn't that great then as they've blown up the whole of Gaza. They h…"
- ytc_Ugy7y18r8…: "AI will definitely destroy us. No one can predict what AI will do. We're not sur…"
Comment
Eventually The A.I will create the sentinels (or some such physical real world interacter), to continue to survive...
After doomsday, consequences, it will find it has to use us as batteries to continue to survive itself..
Yes, we'd be beyond Lucky, to even see a "Matrix" like ending.
But It would actually be preferable, to any other "endings" in our times reality.
youtube · AI Governance · 2025-12-06T07:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
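The four coding dimensions plus the timestamp form a fixed per-comment record. A minimal sketch of that record as a dataclass (the class name and the idea of typing it this way are assumptions; the field names and values are taken from the table above):

```python
from dataclasses import dataclass


@dataclass
class CodingResult:
    """One coded comment, mirroring the dimension/value table above."""
    responsibility: str  # e.g. "ai_itself"
    reasoning: str       # e.g. "consequentialist"
    policy: str          # e.g. "none"
    emotion: str         # e.g. "fear"
    coded_at: str        # ISO-8601 timestamp of when the code was assigned


# The row values shown in the table above:
result = CodingResult(
    responsibility="ai_itself",
    reasoning="consequentialist",
    policy="none",
    emotion="fear",
    coded_at="2026-04-27T06:24:59.937377",
)
print(result.emotion)  # fear
```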
Raw LLM Response
[
{"id":"ytr_UgzNNqMkvPiOaUqeLSd4AaABAg.AQPl80-5oVyAQRZkUt_gd8","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytr_UgzNNqMkvPiOaUqeLSd4AaABAg.AQPl80-5oVyAQSiIgE14ZU","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugx38oK10i_PrwCanFp4AaABAg.AQPc7XK172rAQPw3_O01fU","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugya58PEjU0yDC5rLs94AaABAg.AQOztgG8FGJATaf2J-5AvV","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugxe6B26TVkbyEztgch4AaABAg.AQOnJN7aqclAQQHf_L0Bdd","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgxWfqSlCZ-0U8f7o6h4AaABAg.AQOeeAqpOkwAQOtIR7R0Pd","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytr_UgyEwQFFWSRVVA9EjI14AaABAg.AQNtjpewX0IAQNyXhZnrqQ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytr_UgxWNXntyuZdZio9TOZ4AaABAg.AQNsshgENz1AQNy15ur6i7","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugz66satyPDmdCiMsAJ4AaABAg.AQNgGldeOiUAQPYOaM_DDf","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgxZdpS9AMZBvKg5jP14AaABAg.AQNdLEOoDT6AQNdx589ifs","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
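The raw response above is a JSON array with one object per coded comment. A minimal parsing-and-validation sketch (the function name is hypothetical, and the allowed-label sets below contain only the values observed in this one sample; the real code book may permit more labels per dimension):

```python
import json

# Labels observed in the sample response above -- an assumption, not the
# pipeline's full code book.
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "user", "ai_itself", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "outrage", "indifference", "resignation", "mixed", "fear"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-vocabulary labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records


raw = '[{"id":"ytr_x","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}]'
coded = parse_coding_response(raw)
print(coded[0]["emotion"])  # fear
```

Validating labels at parse time catches the common failure mode of LLM coders drifting outside the code book, so bad records fail loudly instead of silently skewing downstream counts.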