Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Self checkout is great. I get it done faster than a cashier would, and I’m enter…
rdc_jrpd1di
I don’t think AI will get so smart that we can’t control it. Computers always ne…
ytc_Ugx46fp5k…
Well we don't know the full story, but it could raise the question for the publi…
rdc_dfepeps
"there should be some regulations of self driving cars"
There is - as long as it…
ytr_UgzRJBNNr…
10000 lines of code per day? What sort of project would require that? Implementi…
ytc_UgwzGAKsV…
if you are a chatgpt user there's a free Chrome extension called chatgpt booster…
ytc_Ugxf2RaJf…
More fantastic work... this video was REALLY great. All technology is a double e…
ytc_UgyOCcvBu…
I do not like AI at all and it genuinely concerns me. There NEEDS to be laws on …
ytc_UgzCnstxe…
Comment
We don't have resources, and we don't have AI still. We have just a machine for now. And I don't think that AGI will be born! The AI today we have is totally internet things and lot of storage. AGI is a tiny probability to happen, if if happens, we will have rains of it. Power holds power, not machine, mean who has that machine.
youtube · AI Governance · 2025-08-13T03:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgzHbyHr8BQmKOJI_8t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyfv3_yck0fEbd-vIl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzb1_gtmPpOHb6sXWd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxlaLVtkoMqseLSwN94AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugwx6T6j_PG_4hmoZ2x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxmen0r82zywpa0aT94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzhNbZtrih6h9sxn1Z4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwOx40P27mm7BJIWAt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxvatfpCv0Y9hZ4x1t4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxFudW6sfQhYS5ANwx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
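The raw response is a JSON array of per-comment codings, which supports the lookup-by-comment-ID view described at the top of the page. A minimal sketch of that parse-and-index step, reusing two records from the response above (the field names are taken from the example; the `index_codings` helper is illustrative, not part of the tool):

```python
import json

# Raw LLM response: a JSON array of per-comment codings. The coding
# dimensions (responsibility, reasoning, policy, emotion) mirror the
# Coding Result table above.
raw_response = """
[
  {"id": "ytc_UgzHbyHr8BQmKOJI_8t4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyfv3_yck0fEbd-vIl4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "approval"}
]
"""

def index_codings(raw: str) -> dict:
    """Parse a raw response and index the codings by comment ID.

    Entries missing an "id" field are skipped so that one malformed
    record does not break the whole lookup table.
    """
    codings = json.loads(raw)
    return {row["id"]: row for row in codings if "id" in row}

by_id = index_codings(raw_response)
print(by_id["ytc_UgzHbyHr8BQmKOJI_8t4AaABAg"]["emotion"])  # → indifference
```

With the table keyed by ID, "Look up by comment ID" reduces to a single dictionary access, and a missing ID surfaces immediately as a `KeyError` rather than a silent miss.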