Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples:

- "Everyday the media becomes more and more exhausting Everyone will have AI and t…" (ytc_UgzPYqzC9…)
- "There is not reason to further develop ai. In fact measures should be put in pla…" (ytc_UgzruYHr9…)
- "Okay, so if anything is a facial recognition conspiracy it's the \"#tbt\" trend, o…" (rdc_eej9dt2)
- "I started my higher education in 2019, when AI wasn’t a thing (TM) yet, and am n…" (ytc_Ugy01fFLt…)
- "AI good until 10 tables once that over real developer needs came in and AI get m…" (ytc_Ugx9xbD_t…)
- "ai replacement for job is not sustainable, they require high energy, a high qual…" (ytc_UgxJ0Xy4n…)
- "Short term profit has been a safety risk ever since companies let their product …" (ytc_UgyNn7xp4…)
- "Even when unsupervised self-driving is available, it will never be as fast, comf…" (ytc_UgyZhUjQH…)
Comment

> Can we just agree that he fixated on the idea of AI ending civilization too much? He’s just starting his own one for some reason is why I’m saying…

| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2023-04-18T12:2… |
| Likes | 1 |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgxbKAHOwWMHYie6h7B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyzHbjn68cChDqNQ8h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyzedawhG8_lOgkb1l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxSXFXSlFlKPxGXCrx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw-nfbPQCIXek8wjnV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw36Ia2Z-e2wtdNNkN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxX1zwFnL4jW1ZBrZh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy1BlHzLf5RT3xhort4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx6qsnc_WWI9ec3CRZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgznJOuiw3qT3sm5fMJ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"none","emotion":"approval"}]
```
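A minimal sketch of how a lookup over such a response might work, assuming only the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `code_lookup` helper and the fallback-to-"unclear" behavior are illustrative assumptions, not the page's actual implementation. Falling back to "unclear" on a missing ID or unparseable response would produce exactly the all-"unclear" rows seen in the Coding Result table.

```python
import json

# Truncated sample of a raw model response in the format shown above
# (two entries kept for brevity; field names match the page's schema).
raw = (
    '[{"id":"ytc_UgxbKAHOwWMHYie6h7B4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"},'
    '{"id":"ytc_UgznJOuiw3qT3sm5fMJ4AaABAg","responsibility":"developer",'
    '"reasoning":"contractualist","policy":"none","emotion":"approval"}]'
)

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")


def code_lookup(raw_response: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment ID, defaulting every
    dimension to "unclear" when the JSON fails to parse or the ID is absent."""
    fallback = {dim: "unclear" for dim in DIMENSIONS}
    try:
        records = json.loads(raw_response)
    except json.JSONDecodeError:
        return fallback
    for rec in records:
        if rec.get("id") == comment_id:
            return {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
    return fallback


print(code_lookup(raw, "ytc_UgznJOuiw3qT3sm5fMJ4AaABAg"))
# {'responsibility': 'developer', 'reasoning': 'contractualist',
#  'policy': 'none', 'emotion': 'approval'}
```

Note that the unbalanced bracket in the final entry of the raw response above (`"approval"]}` instead of `"approval"}]`) would make `json.loads` raise, which is one plausible way a comment ends up coded entirely "unclear".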