Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "Just wait until the shareholders realize that AI can do all of the executives jo…" (ytc_UgwtXTYVW…)
- "I suspected this the first time I saw a screenshot of a poem written by grok tha…" (rdc_kcnoumk)
- "Think about how powerful a good psyop could be, now thing about someone using AI…" (ytc_UgznB25IE…)
- "If robot is givin to think then Man will be Lazy and the next generation will st…" (ytc_UgyKirH_c…)
- "Im an IT students and im already confident that i am not needed anymore as ai wi…" (ytc_UgzcwihPl…)
- "I don’t see the use of Gen AI, This is all just done on conventional methods so …" (ytc_UgyRuEu7u…)
- "Once it's easy to make AI, we don't need for it to go rogue, malicious people wi…" (ytc_UgzpUO77c…)
- "AI is continually learning. Don’t we then have a responsability to teach/model f…" (ytc_UgwckF26N…)
Comment
Hmmm.
The people who learn AI first will be the ones who automate their own jobs.
You do not want to be in a position where somebody else automates your job.
Microsoft, Anthropic, OpenAI, and Google all offer tools that allow an individual to use AI to automate data entry and other tasks.
If one person on your 5 person team starts automating with AI, soon three of those people will not be needed, and the first person will move on to automate another team's workload.
This is not waiting until 2030.
Source: youtube · Posted: 2024-11-10T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugy3h1TsFdAMrvBf2kl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwEgZxHpQMvoPM9YU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxo_n0_BvtP9J8kmUh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyJi9cX5Hp39fYwB_Z4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwTlbzpMsFP1SQpPQ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwamQgNuDXxJULOeax4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugyv_IbSpuZ0GfA1d9F4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwplMuteW-rp_kC_jB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxLOqUydsyYbTEaBRV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzJn3kel_9wJjHMCwx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
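The raw response is a JSON array of per-comment codings keyed by `id`, so looking up any coded comment reduces to parsing the array and indexing it by that field. A minimal sketch (the `lookup` helper is hypothetical; the field names and the two excerpted entries come from the response above):

```python
import json

# Two entries excerpted verbatim from the raw LLM response above.
raw_response = """[
  {"id":"ytc_Ugy3h1TsFdAMrvBf2kl4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwEgZxHpQMvoPM9YU94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

# Index the array by comment ID so each lookup is a single dict access.
codes = {row["id"]: row for row in json.loads(raw_response)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if it was not coded."""
    return codes.get(comment_id)

print(lookup("ytc_Ugy3h1TsFdAMrvBf2kl4AaABAg")["emotion"])  # fear
```

The same index also distinguishes coded from uncoded comments: `lookup` returns `None` for any ID absent from the response, which is how a "not yet coded" state would surface in this view.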