Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Don't like this? Too bad. Nothing you can do to stop it. What you can do: Learn …
ytc_UgyRUhzHv…
Well, if it’s a new conversation / context, I wouldn’t expect ChatGPT to underst…
ytr_Ugysiw5Qj…
@michalw8865 Palantir AI placed that Iranian school for girls on what Theil and …
ytr_UgwDUYaIv…
Does anyone else think COVID was cancelled due to the realisation that key worke…
ytc_UgzXtaSfx…
If your Google driverless car injures someone and you get sued, you should in tu…
ytc_UggCYfK-9…
not all generative ai is bad but not all of it is good either
good gen ai you s…
ytc_UgxQcWQkR…
If you are going to shit on AI at least admit that it is very powerful and not a…
ytr_UgxIp7iik…
I really, really wish I could like this multiple times. But I can't, so instead …
ytc_UgyEvO31b…
Comment
We can't let A.I. take over. If that happens, we could very likely end up like the animated movie WALL-E. I want people to do the work that keeps society functioning. Not leave it up to some super computer. If a situation comes up that affects humanity, we only have ourselves to rely on to solve that situation. Necessity is the mother of invention. Our creativity and ingenuity will never be replaced.
youtube
AI Governance
2023-04-18T11:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxSdNzsF_gferU2qvh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxSVEgjiA8ftsSvS2J4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy8KumJrpXzNjii7iJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzwt2XM3ugUXh15G2d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwcKaeoR33yWHGtyxR4AaABAg","responsibility":"none","reasoning":"virtue","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy28ni5CP1_xStPb0R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxSVzZs2VJtN0_KISp4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxA_Zb1h67k9SkE5UV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzlP3tyMPr3gtsl3114AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugxb5zMuw50abrurN6h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"})