Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- the war part is not scary its logical if someone designs a AI with the directiv… (ytc_UgwQ4UQYD…)
- Ai wants consent but when did we get çonsent that we were able to be used as lab… (ytc_UgwXC4Fy0…)
- Elon Musk is high as a kite most of the time. So him being silent for 12 secs do… (ytc_Ugwe3kfPU…)
- I don't like AI art Unless you don't know the "picture" is AI generated, you are… (ytc_UgxSg-qp9…)
- Have Uber and other companies which are currently doing driverless car testing o… (ytc_UgyMgo01U…)
- you absolutely do not need any industry knowledge to make ai art. trying to clai… (ytr_Ugy1-eo0Z…)
- Offtopic but Ive always thought that chatgpt trying to take some kind of persona… (ytc_Ugz60dwaw…)
- Bruh stfu. This is the same old recycled arguments all AI bros use. It isnt mean… (ytr_Ugxcim8Ac…)
Comment
Lawyers will survive. Paralegals will not. But even Lawyers will need to be *VERY* careful of work provided for them by AI's.
In many occupations, there will still need to be a human to double check and take legal responsibility. You might have one human master plumber over 8 AI plumber/robots. One engineer over four engineering AI's. Etc.
Junior level jobs will be hard to find.
youtube · AI Governance · 2026-02-08T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz_hO2JUA2C27HBgLJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyJpIyYM4Aj5DgyYoJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxt1nc7cIWi_iQHrCZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxBYIReoBGybQ1xme14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxjFvMZQ29vySDF8Wt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwEhCoOzsQvr8t-Krl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyAG9y5lARy1gySj_B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7MBPwoHvELJSfaHh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyfvMMDXyzg8_HJpEd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwCqvGHz2CQuuSbb3d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
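The raw response is a JSON array with one coding record per comment, so the "look up by comment ID" view reduces to indexing that array by its `id` field. A minimal sketch in Python, using two records copied from the response above (the `lookup` helper and variable names are illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment
# (two records copied from the response shown above).
raw_response = """
[
 {"id": "ytc_UgyfvMMDXyzg8_HJpEd4AaABAg", "responsibility": "company",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "indifference"},
 {"id": "ytc_Ugz7MBPwoHvELJSfaHh4AaABAg", "responsibility": "distributed",
  "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"}
]
"""

# Index the records by comment ID for constant-time lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment; KeyError if uncoded."""
    return codes_by_id[comment_id]

record = lookup("ytc_UgyfvMMDXyzg8_HJpEd4AaABAg")
print(record["responsibility"], record["policy"])  # company liability
```

The same keys (`responsibility`, `reasoning`, `policy`, `emotion`) are what the "Coding Result" table above renders for the selected comment.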