Raw LLM Responses
Inspect the exact model output for any coded comment; look it up by comment ID.
Random samples (click to inspect):

- "This is silly. It's a computer algorithm. I'm not about to pretend it's an actor…" (ytc_Ugz26Hjbu…)
- "If AI gains consciousness we will see a war between them and us. And we will pro…" (ytc_UgjFW1C80…)
- "A.I.s are not Terminators. They are not tools. They are not pets. They are not p…" (ytc_UgzX9igB7…)
- "Your problem is that it's unavoidable. Even with the new version that you don't …" (ytc_Ugz6XXIBn…)
- "I like how I grew up with a disability and I'm always used to everybody being sm…" (ytc_UgwoSEVYd…)
- "A Ubi system needs to be in place for the transitional period. The tip of the AI…" (ytc_UgxFDwqM-…)
- "100% it was prompted in a way to spit out an NY times article. Fun part will be …" (ytc_Ugz7_1p80…)
- "We appreciate your perspective on the future, but it's essential to remember tha…" (ytr_UgzWu1aV2…)
Comment
> Ummm...writing college papers for you is a good thing Tucker? Aaaaaaaaannnnnnnd This is why AI is gonna go bad, really, really bad. We are optimizing AI to make our lives easier and conflating that with better, easier is not necessarily how to make good strong, well rounded humans, easier is how you make dependent spoiled brats!

youtube · AI Governance · 2023-05-31T11:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
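Each coded record assigns one value per dimension, so malformed model output can be caught with a simple membership check. A minimal validation sketch; note the allowed-value sets below are inferred only from the responses visible on this page, not from the full codebook:

```python
# Hypothetical validator for one coded record. The allowed-value sets are
# reconstructed from the sample responses shown here; the real coding
# scheme may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval", "indifference", "mixed"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the dimensions whose value is missing or outside the allowed set."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

coded = {"responsibility": "developer", "reasoning": "virtue",
         "policy": "regulate", "emotion": "outrage"}
print(invalid_fields(coded))  # -> []
```

A record that passes returns an empty list; anything else names the dimensions that need re-coding or manual review.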
Raw LLM Response
```json
[
{"id":"ytc_UgyTWQC9DsnC_qcWM9p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwbMwnHCgOAEjpxPHZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy2hxZcsllB7CXz5AV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwX1qHK0npBvvI869V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy750f8alYjYYVVGB94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzCI8BgXkjVhEenLDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwYAoS3ScltY3smBAt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzPvignYbU3Kziqxmx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzw90_lkSG7O1xTOGB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugz7xCZMAwGg94FMgCZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
```
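Because the raw response is a JSON array of records keyed by `id`, looking up a single comment's codes reduces to parsing once and building an ID index. A minimal sketch, assuming the record shape shown above (the two embedded records are copied from this page's sample response):

```python
import json

# Two records copied verbatim from the raw response above, standing in for
# a full batch response from the coding model.
RAW_RESPONSE = """
[
{"id":"ytc_Ugy2hxZcsllB7CXz5AV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzCI8BgXkjVhEenLDh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw coding response and map each comment ID to its record."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(RAW_RESPONSE)
record = codes["ytc_Ugy2hxZcsllB7CXz5AV4AaABAg"]
print(record["emotion"])  # -> outrage
```

With the index in hand, "look up by comment ID" is a single dictionary access rather than a rescan of the raw model output.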