Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples — click to inspect

- "Not sure what the popularity target is here? Half the planet is worried alread a…" (ytc_UgzWcN_bX…)
- "At some point, some one will say, enough, and “pull the plug” on the entire prog…" (ytc_UgwyJ9rvT…)
- "i can't help but wonder if people in 100 years will look back on people saying t…" (ytr_Ugy58e41t…)
- "No shit. Its a fucking LLM. Choose the right tool for the problem. You buy 1 gal…" (rdc_ohuwpik)
- "Why on earth would just stand there with your hands up as if the robot would rea…" (ytc_UgzW8twVT…)
- "Oh, we know. Just watch the first Terminator movie. Most, not all, but most of t…" (ytc_UgynESom9…)
- "I think I generally agree with your point, but I'd push you a bit to say it's cl…" (ytr_UgxnO6auS…)
- "Dude, looking at your hands while dreaming is EXACTLY how AI makes hands with al…" (ytc_UgzJ69S7V…)
Comment

> Give it time. Marques Brownlee was just saying he thought they would never be able to make videos at a high level and now they can. As soon as someone takes the time to create the lawyer product to sell it will be trained on all the court cases and precedents. The error rate will plummet and some jobs will be lost. Given time ai or language models will keep getting better. I like the way Lawyers are like if you’ve been in a Train Accident we can help no crap it’s a situation where they are guaranteed to win lol

Source: youtube · Topic: AI Responsibility · Posted: 2024-02-18T18:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgypNYedn2sp8DJJeo54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyLE9J43zlEwVIzAq94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwEYJd7ODSEN0VTtOF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxNC0E0KqRzvMvFtsh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxkVdA_YLOEHZjvB754AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzaYOQA5PjuIOqEX4h4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxK4QwUyqEzPQH-lzd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdP-mYV7VjS-kCerZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxQ4iT4NQIsEumXFfB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxR8YvAYE9DLQg5iJV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
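The batch response is a JSON array of rows keyed by comment ID, which is what makes the look-up-by-ID view possible. A minimal sketch of that lookup (field names are taken from the response above; the function name and the truncated two-row sample are illustrative, not the tool's actual code):

```python
import json

# Illustrative two-row excerpt of a batch coding response; the field
# names ("id", "responsibility", "reasoning", "policy", "emotion")
# match the raw output shown above.
raw_response = """
[
  {"id": "ytc_UgypNYedn2sp8DJJeo54AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzaYOQA5PjuIOqEX4h4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict[str, dict]:
    """Parse a batch coding response and index its rows by comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_UgzaYOQA5PjuIOqEX4h4AaABAg"]["emotion"])  # fear
```

With the rows indexed this way, pulling up the coded dimensions for any comment is a single dictionary lookup rather than a scan of the whole batch.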