Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding by its comment ID.
Random samples

- "The complete and utter moral bankruptcy of Muskkk, Peter Thiel, Balaji Srinavass…" (`ytc_UgzLcGUnV…`)
- "All these things could be programmed out. Just like you have parameters when usi…" (`ytc_UgzBf7ifW…`)
- "While AI could definitely remove some stress and gruelling work hours for anima…" (`ytc_UgwBTd7eU…`)
- "here’s the thing cameras are all around and we can’t deny that and if we’re not …" (`ytc_UgzlQW2VA…`)
- "11:48 TESLA'S ARE NOT SELF DRIVING CARS!!!!! They are level 2 autonomous vehicl…" (`ytc_UgzUPzX6P…`)
- "Deepika Rathore ftyuiop jop aertyuiop automatically dtyuiop jop awertyuiop dty…" (`ytc_UgyXdlsh-…`)
- "This is i robot all over again robots with guns = wnd of the world…" (`ytc_Ugxq7vshf…`)
- ""In the pursuit of artificial minds, let us not forget the fragility of our own …" (`ytc_Ugw_qY6_3…`)
Comment
AI will reduce the doctors needs by almost less then 50% to the current strength....basic to middle level medical needs are already very effective by ChatGPT itself....still we don't see the net effect coz edu genz ppl jus started after 10 to 15 yrs mostly ppl stick into.... after that only doctor will feel that effective only patients will rare doctor dependent high lvl cases ... which is very very few....in contrast... already mbbs is flop n Md struggles for job n patients now itself...rapid increase of seats.... Sector is going to collapse in n after 2035....Mark my words! Chill
youtube · AI Jobs · 2025-06-13T16:2… · ♥ 15
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id": "ytc_UgzRRl0GxDNGJww5RqB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz-Mt71qjARW8fpoNl4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy5WByzl6rwng3nyKB4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy7KdxpbK9GMQZKYLt4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_UgzsJrrZbcR5sjLxwNx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugx3m5XSTfHbqw650rp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgywNQqgJ5GRTgvE3dt4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwBf_2wiXuE-k9hY1p4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwb9T7WVRGkknGKLOl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyDm4-UKNOkd87RibR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
```
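A raw response like the one above can be parsed and indexed by comment ID, which is what the lookup described at the top of this page requires. The sketch below is a minimal, hypothetical helper, not the tool's actual implementation: it assumes every row in the model's JSON array carries exactly the five fields seen in the example (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) and rejects rows that are missing any of them. The two sample rows are copied from the response above.

```python
import json

# Two rows copied from the raw model response shown above (assumed format:
# a JSON array of per-comment codings with five fields each).
RAW_RESPONSE = """
[
 {"id": "ytc_UgzRRl0GxDNGJww5RqB4AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_Ugz-Mt71qjARW8fpoNl4AaABAg", "responsibility": "government",
  "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]
"""

REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index the codings by comment ID.

    Hypothetical helper: raises ValueError if a row lacks a required field,
    so malformed model output fails loudly instead of silently dropping rows.
    """
    out = {}
    for row in json.loads(raw):
        missing = REQUIRED_FIELDS - row.keys()
        if missing:
            raise ValueError(f"{row.get('id', '<no id>')}: missing {missing}")
        out[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return out

by_id = index_codings(RAW_RESPONSE)
print(by_id["ytc_Ugz-Mt71qjARW8fpoNl4AaABAg"]["policy"])  # regulate
```

Keying on the full `ytc_…` ID means a truncated ID from the sample list (e.g. `ytc_UgzLcGUnV…`) would need a prefix match instead of an exact dictionary lookup.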