Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "She is studying law. I am sure she is going after the company that created AI th…" (ytc_UgwBk3kCo…)
- "This video contains some good points, but other parts of it are just downright w…" (ytc_Ugya5c2gs…)
- "The AI sees it, and ram the woman anyway! Proof that AI will murder us all!…" (ytc_UgxyFaNdQ…)
- "According to the data AGI would not be able to gain control to autonomous weapon…" (ytr_UgyOEX1rW…)
- "Waiting for mass civil wars world-wide when AI will take over all jobs...keep fe…" (ytc_UgwY_m7GT…)
- "Leave the native lands and people's alone, they need their land and water for su…" (ytc_Ugx7tPeYV…)
- "Waymo cars getting pissed at each other. Didn't take the AI long to get an angry…" (ytc_Ugw45GJX8…)
- "I know one thing , not if but when AI gets the capibility to lie to decive to sn…" (ytc_UgxG0b8-c…)
Comment

> It would be smarter, instead of trying to achieve singularity like it's some Apex of importance, assuming your use of AI and vision for AI in the future is for the betterment as a tool for mankind, it would be better to ensure that you maintain a strict line between robot and singularity that doesn't get crossed so that the tool stays as a tool not as a imitation of humans. I know these jackass atheist scientists have already crossed that line but that's what a smarter person would have done. All these scientists working day and night time, to solve problems that they create. Smh. Next year we will be paying carbon credits well we mow the lawn and watch rockets go to outer space for no reason. I guess artificial intelligence is a step up from no intelligence huh?

youtube · AI Moral Status · 2022-12-25T17:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwbF3SZDPPy7mC8Xt54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxItMLnOqf3B_0I8EB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgynYoq9AnxoOIgXm-l4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx8OOzeOeyRDvV0aTJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxev-pyKYsQlaoDCiJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugy8NKjVd-6ZbCnzhvp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugx0yH380Ar9FamrCV94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxAzX9_Uv__DYnj0D54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugyq--qaM5NRN-uBodx4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgztpdMv0Wl6LoSW--B4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"}
]
```
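The model returns one JSON array per batch, with each element carrying the four coding dimensions (responsibility, reasoning, policy, emotion) keyed by comment ID. A minimal sketch of how such a response could be parsed and looked up by ID, matching the "Look up by comment ID" feature above; the field names come from the raw response shown, while the helper name `index_by_id` is hypothetical:

```python
import json

# One record copied verbatim from the raw response above; a real
# batch would contain ten such objects.
RAW_RESPONSE = """
[
  {"id": "ytc_Ugxev-pyKYsQlaoDCiJ4AaABAg",
   "responsibility": "developer",
   "reasoning": "deontological",
   "policy": "regulate",
   "emotion": "approval"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse the model's JSON array and key each coding by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_id(RAW_RESPONSE)
print(codings["ytc_Ugxev-pyKYsQlaoDCiJ4AaABAg"]["policy"])  # regulate
```

The dictionary returned here is what a coding-result view like the table above would read from: each dimension in the table is one field of the record for that comment ID.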