Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Its silly. Imagine if a theology prof started expounding on ai? Thats how he sou…" (ytr_UgyTsBk3P…)
- "So this is the problem, I'm newbie in coding, yes I learned from AI but I've onc…" (ytc_UgwzbRE3W…)
- "Coexistence between Christians and Jews is possible, but it requires finding com…" (ytc_UgxGNSLnm…)
- "The developers should program the AI with a set of laws like in that movie IRobo…" (ytc_Ugzuu-STT…)
- "I believe AI is dangerous in the wrong hands which it will be if it isn’t alread…" (ytc_UgwYAXhzM…)
- "These are shot in segments because its impossible for all the camera angles to o…" (ytc_Ugz-N18A_…)
- "The problem is that robots don't buy stuff. They don't consume anything other th…" (ytr_UgzcymAQm…)
- "I dont have the mental strength to deny things that seem even vaguely sentient. …" (ytc_UgyhXbB9h…)
Comment

> ChatGPT will not be AGI. Anyone using it for real purposes can see that from a mile away, and no update on it will change that. A different architecture to LLM is needed. So no AGI by 24 or 25. You should probably lower your expectations to the next decade.

youtube · AI Governance · 2023-12-24T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UgxLXt7IcKXx1OgWDwl4AaABAg.9wdy4Rs0GcsA4YAIFfRO3V","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgxLXt7IcKXx1OgWDwl4AaABAg.9wdy4Rs0GcsA4YPzcsJ4Ke","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzEstUayZhCpajDOHh4AaABAg.9wc6Eb2Fx7l9xqpJ9y93J-","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgzEstUayZhCpajDOHh4AaABAg.9wc6Eb2Fx7l9yh_sGtf6JJ","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwJRS2gLugovS02taJ4AaABAg.9wbofxctIx39xqIEc3gvBv","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgwJRS2gLugovS02taJ4AaABAg.9wbofxctIx39xqfuHIaluG","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx8r84v_JpAF_1bOpN4AaABAg.9w_PX3ujyL59wf_-__KUbT","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugz8XXbHD-Yp5IpBHFh4AaABAg.9wYr6TnX9LO9wlGHmOB4PH","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgxE39q66FlihxuyOLl4AaABAg.9wXyaW_nQ_i9xr_CTNgPRe","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxE39q66FlihxuyOLl4AaABAg.9wXyaW_nQ_i9xthVI0tMPK","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
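A raw response like the one above is a JSON array with one object per coded comment, carrying the four coding dimensions shown in the result table (responsibility, reasoning, policy, emotion). A minimal sketch of parsing and validating such a response follows; the allowed-value sets are inferred from the values visible above and are assumptions, not the project's full codebooks.

```python
import json

# Value sets observed in the sample response above; the real
# codebooks may contain more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "approval", "indifference", "resignation", "mixed"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of objects)
    into a dict keyed by comment ID, checking each dimension."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec["id"]
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{comment_id}: unexpected {dim}={value!r}")
        coded[comment_id] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-record response for illustration.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"unclear","policy":"unclear","emotion":"fear"}]')
codes = parse_coding_response(raw)
print(codes["ytr_example"]["emotion"])  # fear
```

Keying by comment ID makes it easy to join the LLM's codes back onto the original comment records, and the validation step surfaces any category the model invented outside the expected sets.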