Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "We know there is no stopping AI. Just like there is no stopping mosquitoes, or …" (ytc_UgzEkc8rH…)
- "Those AI detectors barely work. One time my teacher said I used ai and gave me a…" (ytc_UgxH7XrfB…)
- "All Driverless vehicles can not & shouldn't be allowed to built. Reducing these …" (ytc_UgzVXiKcP…)
- "A machine when not powered is just an over glorified piece of scarp that has no …" (ytr_UgyF680Og…)
- "> NLP (natural language processing) has advanced a lot but is still somewhat …" (rdc_fvwhe5o)
- "Great question. Self-driving trucks are being tested, but the tech isn’t ready f…" (ytr_UgyfhKw4D…)
- "I was hoping you'd end this video like that, though I think stressing driver res…" (ytc_UgzeEF1cF…)
- "Imagine a world with higher corporate taxes, universal basic income, and enough …" (ytc_UgyXyjOtU…)
Comment
How come an idiot like me knew AI was a slippery slope and given the tendency to take something designed to "aid" humanity gets used against us, but the godfather of AI didn't?? Or did he? Not only stop AI, but send it off into space, it's so bad for us. What was the initial objective of AI? So we wouldn't have to do anything or think for ourselves? What??
youtube · AI Governance · 2023-07-07T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyKZ7EO1UmJS8sn60x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugzzmk1HHLW_AkY9iol4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy9-GE4LB-WYC59QI94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxPn1Vhn4q7NGCkp2J4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzJpl3xrczepiqv14x4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugwiu6SzeMGfMlzZpn94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwJps3IdX6C3ZDjJJR4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwXfdCleiztV1AqD1J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy8v-BInqt9IZFSxxF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwgXXex0La6CYOUFMR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]