Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I actually had no idea that people used ai to summarize messages- I’m so happy t…" (ytc_UgyRRi_qZ…)
- "A recent situation on twt had a digital artist discuss how their art piece got s…" (ytc_Ugz2gqiN2…)
- "@Codestar-u1i Nobody's whining, the same as nobody's panicking, contrary to what…" (ytr_UgxN1GgsL…)
- "Humans have been conditioned for centuries to work. Its a large part of our self…" (ytc_UgxzQuzfD…)
- "While i agree with you, im afraid this video wont age well in the long run. You…" (ytc_Ugyyx5Q8X…)
- "Seriously, the number of people that actually believe that we are close to this …" (ytc_UgwPfsARE…)
- "Even if ai possibly takes over the world its still 100's of years far Not in ou…" (ytc_UgwSgmR7i…)
- "How do the rich get richer when no one has a job to pay for all the cheaply made…" (ytc_UgxWupXRX…)
Comment
I think AI will cause a huge disruption but wont lead to our extinction at least in the short term. AI requires power to operate and our respective infastructres for providing that power are exceedingly fragile. Doesnt take much to blackout a whole region. Last i checked there wasn't any kind of robot that could go on the run and move the AI programming around to find more power.
youtube · AI Governance · 2024-01-18T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugy1UtSEqExuIaa6Lv14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugxbo16YqAfbqQn9W5d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxV-FAhOA1qqV5VlW14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugx4D9k-ChzSQcbiCIR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxVuSpfgtvWbDZEU7R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxEq7YdaN7zmXU09mZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwvDdo3JIt5aVXB_F94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzeXcLhY6HHL8v-vOB4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"disapproval"},
{"id":"ytc_UgxHq_quEZlPVQSrfx14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzhpy0r6TzTWeu5Dud4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
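
The raw response is a JSON array of per-comment codes across four dimensions. A minimal sketch of how such a response could be parsed and validated before ingestion; the allowed label sets below are only those observed in this batch (the actual codebook may define more), and the function name is illustrative, not part of the tool:

```python
import json

# Label sets observed in the batch above; the real codebook may be larger.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "approval",
                "disapproval", "indifference"},
}

def parse_codes(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response and index valid records by comment ID.

    Records missing a field or using an unknown label are skipped,
    so one malformed entry does not discard the whole batch.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not isinstance(cid, str):
            continue
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

raw = ('[{"id":"ytc_x","responsibility":"company","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"nobody","reasoning":"mixed",'
       '"policy":"none","emotion":"fear"}]')
codes = parse_codes(raw)
# "ytc_y" is dropped: "nobody" is not a known responsibility label.
```

Skipping rather than raising on a bad record keeps one hallucinated label from invalidating the other nine codes in the same response; the dropped IDs can then be re-queued for recoding.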