# Raw LLM Responses

Inspect the exact model output for any coded comment.
## Random samples
- "I swear artists have been one the most underappreciated community in our society…" (ytc_UgwezWEt2…)
- "AI ⚡ always existed — like plasma or electricity -- The danger isn't AI … it's m…" (ytc_Ugz3KvYLs…)
- "Ironically if ai bros made their own art to feed into a machine, then they would…" (ytc_UgxRf5tZf…)
- "@明智吾郎-e4b And have another AI firm replace him? Ending the company stops nothing…" (ytr_Ugy3PvdMn…)
- "Artists already were never financially stable, literally nothing changed except …" (ytc_UgxpKY7PH…)
- "\"AI\" in its current form is not AI. They're LLMs and are marketed as AI to boost…" (ytc_Ugzu02jCO…)
- "A self driving truck can't do a VI there aren't censors on ball joints and a few…" (ytc_UgxypTIRh…)
- "Can you teach Ai to have feelings. Because it needs to have fare of being discon…" (ytc_Ugw3GcvzN…)
## Comment

I think self driving cars will 'help' by allowing for increased density. That in turn allows for more economical public transport.

And instead of a person owning a car, you could have a small fleet of shared cars per apartment or neighborhood.

There is another opportunity with self driving cars: specialty vehicles. You don't always need a lot of cargo space, so vehicles can specialize in more extreme ways for people vs cargo.

Source: youtube, 2025-08-03T12:2…
## Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
## Raw LLM Response
```json
[{"id":"ytc_UgzKTDLwimp5Rr2cPTB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxi1LiR5R2U5pkjcwt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyzpCEMNypKwyyRBaB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxSyZejgjMPxAsa09p4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyC-yKwLwXOgjg468t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxFvT5ZX6aVpSciauJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyyupQhlG59ch68O5J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwuaw0MgC4iUeqBY0R4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugysn3qk3P_xumYM6754AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwpmBeDXG4ETE38btJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}]
```
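The raw response above is a JSON array with one object per coded comment, carrying the same four dimensions shown in the Coding Result table. A minimal Python sketch of how such output could be parsed and indexed by comment ID (the function name, variable names, and the two inlined sample records are illustrative, not part of the tool):

```python
import json

# Two records copied from the raw response above, inlined for a self-contained example.
raw_response = """[
 {"id": "ytc_UgzKTDLwimp5Rr2cPTB4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_Ugwuaw0MgC4iUeqBY0R4AaABAg", "responsibility": "company",
  "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the model output and map each comment ID to its coded dimensions."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Keep only the known coding dimensions; ignore any extra keys.
        coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

codes = index_codes(raw_response)
print(codes["ytc_Ugwuaw0MgC4iUeqBY0R4AaABAg"]["emotion"])  # outrage
```

Indexing by ID like this is what makes a per-comment lookup cheap: one `json.loads` pass, then constant-time dictionary access per inspected comment.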