Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a coding record by its comment ID.
Random samples:

- @juliosantana1646 Humans do not need solar power in order to exist. We do not ne… (ytr_UgyOGV90r…)
- Everyone should stand up together. 1. With the productivity increasing by using… (ytc_Ugy4ZEqP0…)
- You are very misinformed. I can program 3-5x faster and of higher quality using … (ytr_UgyRhcypV…)
- People !! This is San Francisco ... of course they don't allow facial recognitio… (ytc_Ugy4vJ0Px…)
- I think when computers started getting popular company's grew a lot bigger and s… (ytr_UgxBLJhFP…)
- We have an entire film series about why sentient AI is a horrible horrible thing… (ytc_UggQEfhGw…)
- Call me crazy 🥴 okay, but wouldn't be easier to hijack a driverless truck, .... … (ytc_UgwbaD4SS…)
- poor baby; there is no privacy, and what little there was is now gone or chompin… (ytc_UgwvqK564…)
Comment (youtube · AI Governance · 2025-09-20T02:3…)

> As a Medevac pilot in Alaska it’s possible for AI to take my job, but cost prohibitive. They will need drones that can carry two medical personnel a patient and a rider that can land in unimproved places. It’s possible but you won’t have anyone to blame if something goes wrong. I think 2030 is very alarmist. Especially with so much of the pushback I see.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzP12EuEPD8aoU6SaZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy6kk9PcrCiVEYTeo54AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwRFF7XoLvoZdAAzNN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugxu4VCdvsbVMxIic5t4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzGI2DK-lPdch0EDal4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy9S1dtJUx3oQU5omV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxhWfSM89SwgcNKmY14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwxTFegdTFawgG9-3R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxxkAeake_HbXFK-7d4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugw7uvYD15AP88CwmhR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
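The lookup-by-ID step above can be sketched in a few lines: the model returns a JSON array of per-comment records, so indexing that array by the `id` field gives constant-time retrieval of any comment's coding. This is a minimal illustration, not the tool's actual implementation; `index_by_id` and the two sample records (taken from the batch above) are stand-ins for whatever the real pipeline uses.

```python
import json

# A trimmed copy of the raw batch response shown above: a JSON array in which
# each element codes one comment along the four dimensions.
raw_response = """[
  {"id": "ytc_UgzP12EuEPD8aoU6SaZ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxhWfSM89SwgcNKmY14AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "mixed"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coding record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
print(codes["ytc_UgzP12EuEPD8aoU6SaZ4AaABAg"]["emotion"])  # -> resignation
```

With the index in hand, looking up the Medevac-pilot comment (or any other coded comment) reduces to a single dictionary access on its `ytc_…`/`ytr_…` ID.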