Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- `ytc_Ugxxp2-6f…`: When one of the pioneers of AI — someone who helped build the technology itself …
- `rdc_ckjlisa`: It's more "FBI announces facial recognition system" . I'm sure they've had it fo…
- `ytr_UgzxD4uYu…`: AI replaces real humans with corporate driven machines. The USA is better off wi…
- `ytr_Ugzj5iEhX…`: @jamescarroll8917 Here's the issue. AI can work exactly like humans, it can be b…
- `ytc_UgyWrCFut…`: I dont think people know how our brains work. Our "inspiration" is basicaly us l…
- `ytc_UgzLMm5Wa…`: If you’ve used an LLM for an extended period of time, you ‘d realize this is not…
- `ytc_Ugy3D9Oyk…`: im an artist too, since i was 5. im 17 now, and my art is below average. im agai…
- `ytc_UgwcfQchZ…`: Google, right now, "are tesla self driving cars safer than humans" and top resul…
Comment

> Firstly - GREAT video as always.
> The only thing I can note / disagree with is your comment relating to AI not having a gut feeling (keeping in mind I'm restricting this to existing AI). If we look at what that feeling actually is - it's often just a collection of memories and experiences that in some small way relate back to the instance in question and mean that a person decides to do or not do something based on that 'generated' feeling. In the case of AI, that gut feeling would just translate to something like - 'unlikely but possible' then it decides to do the unlikely thing - which turns out to be right sometimes.

youtube · AI Governance · 2023-07-07T05:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzYRLpjT7igmP2HfO14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxXAmcJ-P4C9mFPhQZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzwOjbuQiwFi0zNGUJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyX7PFuwBsFM7Xdnht4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwtfCHPblHZqxZYwUF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxIjL7hNTlrYJFKvp54AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugw1Mik5I-N8vnsHNEF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyA09LGyCAVf5_tUUN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugw1r-9lNrJG2t2HZjZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzFKmdH4KoByTWbm-B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
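The lookup-by-comment-ID step above can be sketched as follows: the raw LLM response is a JSON array of coded records keyed by `id`, so a single pass builds an index from comment ID to its coded dimensions. This is a minimal sketch, assuming the response format shown above; the helper name `index_by_id` is illustrative, not the tool's actual API, and the two records here are abbreviated copies of entries from the response.

```python
import json

# Two records copied from the raw response above, standing in for the
# full ten-record payload. (Illustrative subset, not the tool's output.)
RAW_RESPONSE = """[
  {"id": "ytc_UgzYRLpjT7igmP2HfO14AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxIjL7hNTlrYJFKvp54AaABAg", "responsibility": "unclear",
   "reasoning": "mixed", "policy": "unclear", "emotion": "approval"}
]"""


def index_by_id(raw: str) -> dict[str, dict]:
    """Parse a raw LLM response and index its coded records by comment ID."""
    records = json.loads(raw)
    return {record["id"]: record for record in records}


codes = index_by_id(RAW_RESPONSE)
record = codes["ytc_UgxIjL7hNTlrYJFKvp54AaABAg"]
print(record["emotion"])  # → approval
```

Indexing once and looking up by ID keeps the inspection view O(1) per comment, which matters when a batch response codes many comments at a time.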