Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Yeah cause it's the fastest way to win. AI doesn't have a conscious. It only …" (rdc_o7ohrwj)
- "This is a SKILL not a gift. You spend time on skills and invest in specific area…" (ytc_UgwlQHkZl…)
- "Joe...search for Sydney AI here on youtube. then you will see the true potential…" (ytc_UgxTRyxPV…)
- "There is a book called the bible and it has told us the end from the beginning .…" (ytc_UgzbK1wJg…)
- "After 1 year people still working, big companies layoff because of bad investmen…" (ytc_UgzTQ6EVQ…)
- "You should have been downloading everything that you thought was valuable over t…" (ytc_UgyAq6TKz…)
- "wasnt it clear? anyone who ever just tought about it know. exactly that would ha…" (ytc_UgzPiPWm8…)
- "Guess actors are about to learn how little their opinions actually matter when m…" (rdc_mif57g5)
Comment
Putting regulations on AI would require a great deal of understanding on the part of law makers. They don't have it in them. AI will be pushed forward, like many technologies, by military uses. Unfortunately, the military can not / will not restrict it, because the other side likely won't restrict it either. The outcome of a new super-intelligence is probably a foregone conclusion.
youtube
AI Governance
2024-12-26T06:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxpAOssCerc6HFFQW54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
{"id":"ytc_Ugyb95tn5bvaaDGWKW54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzcS-IqeOiDGyDOymN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugz2uxhmjbnsuAQ7O6d4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgywRcGVPEVi1WyxfFR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxPnUjkOFLIUQU-6wx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxI1RL7DabZSl8h4Xp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwbFmFVwvgFEZaGqtF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgycKBydweDTgcl8xgN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxOjoOqdetaSMkko7R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"]}