Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "What a load of c**p 😂. You got precisely the answers you wanted to receive. AI i…" (ytc_UgxFfjc23…)
- "based ai / jokes aside it is really interesting, i think these problems would eli…" (ytc_UgxSfGo8E…)
- "I want you to look at the symbol for the Teamsters. There are horses on it. Jo…" (ytc_UgzkyzV5g…)
- "The ai isn't gonna be 100% perfect its gonna make some mistakes but only people …" (ytc_UgzCGfZQx…)
- "One day technology will control humans . One step closer . Crazy to see how many…" (ytc_Ugx7cIBHJ…)
- "I don't know how big AI is going to impact. It is definitely a productivity tool…" (ytc_Ugyzpp59A…)
- "Me 3, I find it creepy and disturbing. I like seeing real people. We don't need …" (ytr_UgySY-ir7…)
- "What happened to that provision in the \"big beautiful bill\" that made it ILLEGAL…" (ytc_Ugzl89dOC…)
Comment
Yeah, I don't think we have much else to fear except for millions and millions of people losing their jobs which in itself is a little apocalypse. AI will always need outside help to improve and will need a insane hardware for it at some point, like entire buildings full of hardware when you consider how much hardware ChatGPT already needs and hardware improvement in the way of "better but smaller" is hitting its soft limit nowadays already. As is the improvement of rechargeable batteries which makes intelligent robots running around in everyday life impossible, they would be out of power in less than 1 hour. I mean, don't get me wrong, AI absolutely will reach a conscious state someday. But it still wouldn't have power over us.
youtube · AI Governance · 2023-07-07T14:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugyft19WDc6KAqUSS2N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx6RqEiCc74Qj6as1x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgmK2M2qlHpmCFC7J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyqiIT6jkjFKFRvect4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwAxKFL70UfpfjPxrF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwu19479bLPHtsb2Td4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxcS-wuo0aOpYAueVV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzgeTYr6OeFge0_pbF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzlCVMRe2grRMYSVRV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwhS5zWhwpOQQUpOLN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"}
]
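A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal validator, assuming the allowed values per dimension are the ones that appear in this sample (the full codebook may define more categories) and that valid comment IDs start with `ytc_` or `ytr_` as in the samples shown here:

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the real codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "company", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "liability", "ban"},
    "emotion": {"indifference", "approval", "fear", "resignation", "outrage", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with a plausible
    comment ID and a known value in every coding dimension."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not str(row.get("id", "")).startswith(("ytc_", "ytr_")):
            continue  # skip rows without a recognizable comment ID
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_Ugyft19WDc6KAqUSS2N4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"none","emotion":"indifference"}]')
print(len(validate_codings(raw)))  # → 1
```

Dropping malformed rows rather than raising keeps a single bad coding from discarding the rest of a batch; a stricter pipeline could log the rejects for re-coding instead.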