Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> Essays are overrated for teaching, the only thing they've ever taught me is to write an essay. A lot of what people are worrying about is not a worry. We're worried about losing our jobs. Meanwhile, there's people protesting because they won't get to retire and a global birth rate that's falling. AI SHOULD be rolling out while it's being created. I'll tell you what is truly dangerous, rolling out a full-blown finished AGI. An unfinished product allows us to find the bugs and adapt.

youtube · AI Governance · 2023-03-30T13:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzWPcnYG67xr7tD2_Z4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyoFziHVBHKhvCAHQ94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgywImcfTgHbBrYi96p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz9gfET2w2gstZaW6V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxz5Dx-gliAsCxOS8V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwRba43u_waxtwIdBp4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxInjTexF_urganT6V4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzssxmUIdzXKp8pZP94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzAnw35jUAr6FYmxvZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzYr0crY9GeTLgRkW94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
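The comment-ID lookup this page provides can be sketched in a few lines of Python. This is a minimal sketch assuming the batch format shown above; the `index_codings` helper and the `DIMENSIONS` tuple are illustrative names, not part of the tool, and the example records are copied from the response above.

```python
import json

# Excerpt of a raw batch response in the format shown above.
raw_response = """
[
 {"id":"ytc_UgzWPcnYG67xr7tD2_Z4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_Ugz9gfET2w2gstZaW6V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
"""

# The four coded dimensions seen in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(payload: str) -> dict[str, dict[str, str]]:
    """Parse a raw batch response and index each record by its comment ID."""
    records = json.loads(payload)
    by_id = {}
    for rec in records:
        # Skip malformed records rather than failing the whole batch.
        if "id" not in rec or not all(dim in rec for dim in DIMENSIONS):
            continue
        by_id[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_Ugz9gfET2w2gstZaW6V4AaABAg"]["emotion"])  # resignation
```

Skipping malformed records instead of raising keeps a single bad line in the model output from invalidating the rest of the batch.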