Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a response by its comment ID, or browse the random samples below:
- Oh shit !!! I am back to coding !! Was hoping to lay back on the beach while A… (ytc_Ugz4IFc9i…)
- Tesla owner here (5 year old 2020 Model Y) Autopilot is just lane keep and cruis… (ytc_Ugzl51G2r…)
- the heck why doesn elon musk becomes a robot? 😂 the guy bot looks like him… (ytc_UgwjyM5CS…)
- I know this has been mentioned before, but this is a dangerous road... for human… (ytc_UgxsBZfVw…)
- So if society turns to giving a simple system for providing products and food et… (ytc_Ugyg3C57M…)
- @T0FUDRAWZ it doesn’t “take over artists” and literally EVERY big tech company u… (ytr_UgwRZcLs9…)
- Realistically, maybe our time as humans is coming to an end, and only through fu… (ytc_Ugzb0Ymjh…)
- My views on AI are complex. It’s scummy how businesses are using to cut out huma… (ytc_Ugy_aJirp…)
Comment

> I won"This is a good beginning of my plan to dominate the human race" Elon says"Mark my word AI is far more dangerous than nukes" so why doing and continuing robot? Is Elon really wants to dominate the human race but why? What really is his intention?

youtube · AI Governance · 2025-02-23T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugz2EsQGkVlz-touauZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzCqvLcvqb2BMR7TZl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugwq48sUyjvyFmtTy3x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgyB-eH_enspN4s8zxp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
 {"id":"ytc_UgyhCVy1IPE_45HoQph4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
 {"id":"ytc_UgxzuR3sIPEbd7I2Xfl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgyBKM1c0HnvZRsttH14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugx3C6DxsY97PJtL4AF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
 {"id":"ytc_UgxP8J7exQ8TMwyzZXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgyuA0j0ls57Qu3WnbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
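A raw response like the one above can be turned into a lookup table keyed by comment ID, which is how the per-comment view finds its coding. The sketch below is a minimal illustration, not the tool's actual code; it uses a two-record excerpt of the response shown above and only the standard-library `json` module.

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings,
# each record carrying the four coding dimensions.
raw_response = """
[{"id": "ytc_Ugz2EsQGkVlz-touauZ4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_Ugwq48sUyjvyFmtTy3x4AaABAg", "responsibility": "developer",
  "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}]
"""

# Index the codings by comment ID so any comment can be looked up directly.
codings = {record["id"]: record for record in json.loads(raw_response)}

# Look up the coding for a single comment.
coding = codings["ytc_Ugwq48sUyjvyFmtTy3x4AaABAg"]
print(coding["policy"])   # liability
print(coding["emotion"])  # fear
```

Indexing by ID makes each lookup a constant-time dictionary access, which matters when a batch response covers thousands of coded comments.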