Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples (preview truncated, comment ID follows):

- "AI going be just like there Creator's Murderous, greedy, seeking power or domina…" (ytc_UgxhKIZkg…)
- "You have actually helped me out a lot when it came to my own artwork and I am ve…" (ytc_UgyF6ltnW…)
- "@Synthstrucplexfyimmunosystemci I love these AI generated responses on AI workin…" (ytr_Ugx2CNzu2…)
- "If AI could be use to control evil people it will be useful to human, but if it …" (ytr_UgyOlJhLe…)
- "#1 people dont need AI #2 technology in the past 20 years has not improved quali…" (ytc_UgzPwYOfB…)
- "Im so tired of ai voice ads and movie summaries... Its so boring and lazy to me…" (ytc_UgyVSdwf9…)
- "Any non-religious arguement for why humans deserve rights would also apply to an…" (ytc_Ugy-JKQrS…)
- "So he clearly stated that he does ai art and people "realized" that an ai art ac…" (ytc_UgxHKhMfA…)
Comment
With industry pivoting to use AI to do the work of junior and intermediates … while seniors supervise the work, eventually all the seniors will retire and there will be no juniors or intermediates who were developed to take over supervising AI.
At that point a lot of knowledge gap will be there where we won’t know how to do the work because AI is doing it and no one knows how to check it. We will just do what AI says and hope it works.
Eventually AI will make a mistake and a massive incident will occur at an industrial plant that will have devastating consequences. We won’t know how to fix it and we will slowly lose modern technology as the plants we used to rely on become too dangerous to let it operate when we have no knowledge how to keep it safe and no one wants to rely on a computer to do the work.
youtube · AI Governance · 2025-06-26T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgxSK-OzCKcEAPjKgIF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxDLtkyAd8f-yD6pox4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzv7_MAG8AKic11zVx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyfbySFj1qTo83zub54AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwnL8Qudr8she09U6R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwNfPYwUjKCbGYSPbF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugwn0oAd-wUMOPFM1QF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgynxaSuCAb4VLoFk9F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugx-HdMMrinyDY9URvF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrHK0ZZldXRXzDUbd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
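Each raw LLM response is a JSON array of coded records, one per comment ID, with the four dimensions shown in the table above. A minimal sketch of how such a batch might be parsed and sanity-checked is below; the per-dimension value sets are inferred only from the labels visible in this sample and are likely incomplete, so treat them as illustrative, not as the full codebook.

```python
import json

# Label vocabulary observed in this batch (assumption: the real codebook
# may define additional values for each dimension).
OBSERVED_VALUES = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"unclear", "regulate", "liability", "ban"},
    "emotion": {"indifference", "mixed", "fear", "outrage", "resignation"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array of coded comments) and
    flag any record whose label falls outside the observed vocabulary."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in OBSERVED_VALUES.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with the first record from the batch above:
raw = ('[{"id":"ytc_UgxSK-OzCKcEAPjKgIF4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"indifference"}]')
batch = parse_batch(raw)
print(batch[0]["emotion"])  # indifference
```

Validating labels at parse time catches the common failure mode where the model drifts from the codebook (e.g. inventing a new emotion label), rather than letting such records silently enter the coded dataset.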