Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytr_Ugx-LaCUz…`: "As long as AI isn't set to create new artworks by itself, in my opinion, it will…"
- `ytc_UgyZGgq-D…`: "AI videos on YouTube are sterile and totally alien. Tired of the voices and imag…"
- `ytc_Ugxnal0PQ…`: "If kids studied fractionary reserve banking, marketing, psychology, “bizz” The…"
- `ytc_UgwAH6_bP…`: "What will really happen with AGI? Yes – it will arrive. But we’ll also learn th…"
- `ytc_Ugyil2_Iu…`: "NOOOO YOU CANT GIVE AI DATA AND THEN USE SAID DATA TO PREDICT THAT BLACL PEOPLE …"
- `ytc_UgyWZitvS…`: "I work in animal care. Me and a good chunk of my coworkers cant and most lukely …"
- `ytc_UgyRb62kT…`: "AI supporters in my eyes: WE WANT NO CREATIVITY OR HAPPINESS OR PURPOSE FOR ANYO…"
- `ytr_UgxK3gevL…`: "@missange4701 They use an algorithm that makes art based on bits and pieces of p…"
Comment

> Yeah, & the climate catastrophe has always been just around the corner
> The AI catastrophe has a fixed start date. The day a super-intelligent AGI is developed. But it is a fixed, certifiable event, unless the proper alignment research & solutions have been found. It can be avoided through tractable action, the climate catastrophe is everpresent as a means of control & cant be solved or it loses its fear power. If we solve the AI alignment problem we win great rewards as a species

youtube | AI Governance | 2025-01-08T19:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
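A coded record like the one above can be sanity-checked before display. The sketch below validates a record against the value sets that actually appear in this page's sample output; note these sets are inferred from the visible data, not taken from an authoritative codebook, and the helper name is hypothetical.

```python
# Value sets observed in the sample output on this page (an assumption,
# not the pipeline's official codebook).
OBSERVED_VALUES = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage"},
}

def invalid_fields(record: dict) -> list[str]:
    """Return the coding dimensions whose value falls outside the observed sets."""
    return [dim for dim, allowed in OBSERVED_VALUES.items()
            if record.get(dim) not in allowed]

# The coding result shown above passes cleanly:
coded = {"responsibility": "none", "reasoning": "consequentialist",
         "policy": "regulate", "emotion": "fear"}
print(invalid_fields(coded))  # []
```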
Raw LLM Response
```json
[
{"id":"ytr_UgzIX7SHunxDEMATiMZ4AaABAg.ACW3Fys5U0YAD2OWEvhvTu","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_Ugx7KBNUotC0dAjw5TN4AaABAg.ACOt6o6cjrhAD2elcgzl-0","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgzQuKqgc4PmJAJYV4F4AaABAg.ABfDlAN8Qs7AKJ63JKZkXF","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzvAOmAlIBGdsn8OAV4AaABAg.ABeUru8ninjAD2LNgGtg4o","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_Ugymf9LUofld2b-knd94AaABAg.ABdGHGCMk3hAHdGjApepmf","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytr_UgxTuDfVJrNYvEcqgzp4AaABAg.ABKAcsFvqymANTDoSx3Zkl","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgyF6c5XmakE-LcmqJ94AaABAg.AB7_w-V2edyABilxEclSuT","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyF6c5XmakE-LcmqJ94AaABAg.AB7_w-V2edyAJ68GdJAzfv","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytr_Ugx7QO9apJwSwFJJDFd4AaABAg.AB1m99hR1YWAB1mtBj0h6c","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgypQfzIYTdnRIeNxVB4AaABAg.AB044muIXKSABDzq74UheL","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"}
]
```
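The raw response is a JSON array of per-comment records, so "look up by comment ID" reduces to parsing it and keying on `id`. A minimal sketch, assuming the field names shown in the response above; the comment IDs here are placeholders, not real ones from the dataset:

```python
import json

# Hypothetical raw LLM response following the schema shown above
# (id, responsibility, reasoning, policy, emotion); IDs are made up.
RAW_RESPONSE = """
[
  {"id": "ytr_abc", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_def", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict[str, dict]:
    """Parse the model's JSON array and key each record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(RAW_RESPONSE)
print(codes["ytr_abc"]["emotion"])  # fear
```

Keying on `id` also makes it easy to detect records the model dropped or duplicated, by comparing the dict's keys against the set of comment IDs that were sent in the prompt.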