Raw LLM Responses
Inspect the exact model output that produced the codes for any given comment.
Look up by comment ID
Random samples — click to inspect
- "Conveniently ignores the difference in the error _rate_ between humans and "AI" …" (ytc_UgxStcDnh…)
- "Where is the energy to actually make Ai work? If it was truly going to replace w…" (ytc_UgxU0ZIDL…)
- "I think it's really great that Elon Musk is telling us the dangers of artificial…" (ytc_Ugy8Kq0cZ…)
- "Absolutely - as a lawyer using it carefully in some situations. For example, I u…" (rdc_mzz8xrp)
- "we don't want AI we don't need it... it's from machine people that has no heart …" (ytc_UgztaxWh5…)
- "@Biatba For now... I think the reason AI image debate is so fierce is because i…" (ytr_UgwAuXmWq…)
- "I fear that we are indeed off the path and if mankind wishes to still exist a th…" (ytc_UgwxQs1c2…)
- "This is one of the actual dangers of ai, not the ai itself but people not compet…" (ytc_UgwiJ0oC-…)
Comment
The fact is AI will take part in everywhere! For example, In the Future, Flights come with built in AI assistance then flight & makers projects it like, AI saves flight from danger.
But, when AI causes flight crashes then nobody points AI faults, &
Still makers continue adding more AI controls to planes.
Keynote : such things are going to happen, and we can't even stop it.
youtube
AI Jobs
2025-08-14T04:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgzgN8KVnoxqWIuM1m94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgybFxpGsY64arlF8_F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzhTU1W8BHGPDLvyS54AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugwocj9QWH1RMmxVORB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxLcBrwRBPUw3ji0CF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxGU7--HjAK2qDEgup4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyXfXzmVIT_jqxnKtF4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzxlecfIMOXlqHn7mt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw0ijIBKHFEBTb-OjZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyJy7KLqCaut3I9Yjp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"})