Raw LLM Responses
Inspect the exact model output for any coded comment, or look a comment up directly by its ID.
Random samples
- ytr_Ugx6-Qyi8…: I understand your point, but "stolen" is just not the right wor…
- ytc_Ugw5UHgfS…: Considering AI managed by governments, we must hope that these governments work …
- ytr_Ugx_sNwG2…: @avgjoe5969 Tesla’s collision‑avoidance system by itself does not steer the car.…
- ytc_UgydukbPt…: Ehh. If there had been an actual human behind that photo you got by prompting, w…
- ytc_Ugxdgd138…: You wood be surprised how corporate media and their sponsors might do to achieve…
- ytc_Ugy-5WYoG…: I think it is you have a choice but they should look like not the humans…
- ytc_UgxsuHWom…: I'm a director of learning and org development and one of my instructional desig…
- ytc_Ugw6qkiZP…: The problem starts when people decide ai isn’t just a glorified google search en…
Comment
The difference between these two is staggering... Shatner, much more informed on the tech, understands the moral and philosophical implications of AI in our society (for good or bad) and how important regulation is. Mandel, the ignoramus that he is, thinks it's just a "fun new toy", like an Instagram filter... He's clueless about how it can impact anyone other than himself.
youtube · Viral AI Reaction · 2023-05-25T20:4… · ♥ 168
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwIONyndZH2MzhvKNF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugy39cXPigq7TATqDFt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwSUZG2GhMMWuapfs54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXh-XrmqXEYw8O67p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyPL3uECD3mwXERK314AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw1w0yy4Gft0KFTDQ14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwRyVuwHDq2eDT1ZRN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx7g1EizZT2yqQZP5B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgyaAfxJ3VOCYbqrDYt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwbrNJifHF1Vs6wsAR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"}
]
```