Raw LLM Responses
Inspect the exact model output behind any coded comment, or look up a specific comment by its ID.
Random samples — click to inspect
- "Question though - if you use AI to generate a script for a video - can someone c…" (`ytc_UgyLZP_B3…`)
- "Just cancel it. We all need to send the message that this wasn’t okay. They’ll o…" (`rdc_n7lknq5`)
- "I can build an AI enterprise grade automation system within 3 months, alone as o…" (`ytc_UgzHVB2lq…`)
- "I'm in financial services for a hospital, and there's already an AI bot working …" (`ytc_UgwXfOfjr…`)
- "I think there are two critical points missing from your scenario. 1. The economi…" (`ytc_Ugy3pMXef…`)
- "It started already a long time ago that the mankind destroys itself. With the am…" (`ytc_Ugyae36GZ…`)
- "Create new laws, educate people, spread the information, not post pictures of ou…" (`ytr_Ugy263vQp…`)
- "Exactly. This is an article on /r/worldnews about a broken learning algorithm th…" (`rdc_e7j5n6h`)
Comment
When a craftsman creates a tool, do they do so to take the day off or be more productive. With these advances in developing AI, we also need to remember to invest in our people. We need to remember that people can grow and learn, and we have a responsibility as leaders to see them rise. If an entire corporation can be automated with 1 employee, then this should be exciting if employees are empowered and self motivated to leave their current positions to fulfill visions which empower society. Technology and automation brings the question: "if you don't have to work what will you do?". This will collapse a financial economy, but requires one of empathy and trust. Let me know if this makes you excited for the future or more concerned. I'm surprised if you got this far :)
youtube · AI Jobs · 2025-07-18T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxJgVs7dMlnvIp6g9R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw8dtRFfaJWqHNFc-F4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyDWMMws6OhmADc3NB4AaABAg","responsibility":"leader","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzgKDMoUPElDL3BDFV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLxx5cUCk4bwGW74V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx9UwXW8xSyFIn-4Wt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz8juHD94UBgQN3sld4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyazB9kspHosD-2uHt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz64sbcTlvWrxID18x4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"mixed"},
{"id":"ytc_Ugx_59vXj07pVBzT67N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
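A raw response like the one above must be parsed and checked before its codings are trusted. The sketch below shows one way to do that, keeping only rows whose four dimensions take values seen in this page's outputs; the `ALLOWED` sets are assumptions reconstructed from the responses shown here, not the project's full codebook, and `validate_codings` is a hypothetical helper, not part of the tool itself.

```python
import json

# Dimension values observed in the raw responses on this page.
# Assumption: the real codebook may define additional labels.
ALLOWED = {
    "responsibility": {"company", "ai_itself", "leader", "distributed",
                       "none", "developer", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "industry_self", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "mixed", "unclear"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response; keep only well-formed coding rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Each row must be an object with an "id" plus the four dimensions.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

# Hypothetical two-row response: the second row uses an off-codebook value.
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue",'
       '"policy":"regulate","emotion":"fear"},'
       '{"id":"ytc_y","responsibility":"robot","reasoning":"virtue",'
       '"policy":"none","emotion":"fear"}]')
print(len(validate_codings(raw)))  # 1: only the first row passes
```

Dropping malformed rows rather than raising keeps a batch usable when the model occasionally emits a stray label; logging the rejects instead would be an equally reasonable choice.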