Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgyJhFteJ…: The app sora AI did take "Work Smarter Not Harder" to another level 💀💀…
- ytr_Ugy0wV1fY…: @daveidmarx8296 Robot means slave. And so both a computer AND a human being can …
- ytc_UgznfkfuY…: the ai summaries of the topics being about how ai is bad is hilariously ironic…
- ytc_Ugz9WPSud…: We should question the voice behind AI 😕... The bible prophecy is unravelling be…
- ytc_UgyKGAh__…: AI's threat to humans is directly correlated to those who program AI. Greed prog…
- ytc_UgxUROqrZ…: 1:30:42 AI can easily get out to the physical world via camera and microphones, …
- ytr_Ugy-isrs7…: @JackaryWare They were following too close. Full stop.✋ If you have swerve to ano…
- ytr_UgxDYWtd8…: Didn't show it in this video, however we pitch it as a solution to amplify their…
Comment
I didn't get too far into this. The algo feeds it to me. Saying 99% of jobs will be gone by 2027 due to AI is not realistic. Applying it isn't as easy as they think. The energy required, etc. Have you spoken to an answering system AI lately? Enraging. Also, not possible to get just very basic needs met by 2027 with AI. Nope. Is it going to give us free power and water? because it will need massive amounts to survive. It's going to be more like 10 yrs at least. Have you seen our cars and transportation? 😂Is AI going to fix logistics? Replace truckers? Replace housecleaners? Plumbers? Electricians? Doubtful.
youtube · AI Governance · 2025-09-04T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_UgwGpPNYqAYvslOwyhx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgyvcZGqF8nKresMccx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgyCl3eTbIjK3Pa2po14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxUEPZ860O2GLx0XC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxfUewggNa45WjQX6d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_Ugx4pvafxxu_WYlZzAd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzKiRYE7HneNmIK1dx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyX2UHqc7m9NJnxzTF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugzp7vHYGoGd3lHDwUp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwpDMlKRDuNLw4kMGR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
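The raw response is a JSON array with one record per comment, each carrying the four coding dimensions from the table above. A minimal sketch of how such a response could be parsed and sanity-checked before storage; the allowed value sets below are inferred from this one sample output, not from any published codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the sample response
# above (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and check every record's fields."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this app start with ytc_ (comment) or ytr_ (reply).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_UgwGpPNYqAYvslOwyhx4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"fear"}]')
codings = validate_codings(raw)
print(len(codings))  # 1
```

Validating against a fixed value set at ingest time catches the common failure mode where the model invents a label outside the schema, before it silently pollutes the coded dataset.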