Reduce maximum comment depth to 50 by @nutomic #5009
Goddamnit. I fucking hate paginating comments and would rather just fetch all the top level ones and control the depth based on the number of total comments. I also hate that they see the API through the lens of Lemmy-UI (IMO the worst way to interact with Lemmy).
I don’t understand your comment. This is a fix for a crash in the backend; I don’t see how it relates to lemmy-ui, because it seems like any frontend hitting the same API route would trigger the same crash. Also, 50 is a lot. Finding a post with 50 comments is rare, and finding one with a chain of over 50 in a row is even rarer. Such a thread would be clunky to display in the main comment tree anyway.
This isn’t just a comment with 50 replies, this is 50 levels of indentation.
It’s also funny to criticize Lemmy for being biased towards its own frontend when it probably has more (active and working) frontends than any competitor (Reddit, Mbin)
I read the PR. It seems more like a hacky bandaid rather than addressing the actual issue. But I digress.
It’s also possible I misunderstood where/how the limit was being applied. My understanding was that it was limiting the response to 50 per depth (50 seems to be the arbitrary limit for most of the API’s list endpoints). What I really don’t want to do is have to paginate the request for the top level comments.
e.g. if a post has 100 comments and, say, 60 of them are top-level, I much prefer to be able to get all 60 in one go. Depending on the total number of comments provided in the getPost call, I dynamically set max_depth higher (3-5) or lower (as low as 1) and fill in the deeper comments manually with a “show more” button. The exception is when linking directly to a comment, where it uses the path to calculate the exact depth to fetch.
finding one with a chain of over 50 in a row is even more rare. Such a thread would be clunky to display in the main comment tree anyways
I’m working around that without pagination, but it’s a low-priority fix since Patrick’s Law comes into play. It’s like Godwin’s Law, except it says that once a comment thread gets deeper than 9, it’s a slapfight that’s best avoided.
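For illustration, here is a minimal TypeScript sketch of the dynamic max_depth strategy described above, assuming Lemmy’s v3 HTTP API (GET /api/v3/post and GET /api/v3/comment/list with a max_depth parameter). The depth thresholds, the pickMaxDepth/depthFromPath helpers, and the instance URL are assumptions made for the example, not the actual client code.

```typescript
// Sketch of the "dynamic max_depth" approach described above.
// Assumes Lemmy's v3 HTTP API; response field names are written from memory.
const INSTANCE = "https://lemmy.example"; // hypothetical instance

// Pick a shallower tree for busier posts; deeper branches get a
// "show more" button that fetches the subtree on demand.
function pickMaxDepth(totalComments: number): number {
  if (totalComments <= 25) return 5;
  if (totalComments <= 100) return 3;
  return 1;
}

async function loadCommentTree(postId: number) {
  // getPost: the post's counts include its total comment count.
  const postRes = await fetch(`${INSTANCE}/api/v3/post?id=${postId}`);
  const { post_view } = await postRes.json();
  const total: number = post_view.counts.comments;

  // One request for the whole depth-limited tree -- no pagination
  // of the top-level comments.
  const params = new URLSearchParams({
    post_id: String(postId),
    max_depth: String(pickMaxDepth(total)),
  });
  const listRes = await fetch(`${INSTANCE}/api/v3/comment/list?${params}`);
  const { comments } = await listRes.json();
  return comments;
}

// When deep-linking to a single comment, its `path` ("0.<id>.<id>...")
// gives the depth directly, so the exact max_depth to request can be
// derived from it. The "-2" assumes top-level comments count as depth 0.
function depthFromPath(path: string): number {
  return path.split(".").length - 2;
}
```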
yea it would definitely suck if it only loaded 50 comments at a time, or 50 replies under a comment, but I think it’s fine as-is
lol for sure, 9 is already a lot
Why?
I’ve got a laundry list of reasons, but suffice it to say that pretty much every third party client I’ve ever used has been miles ahead in UX and polish.
One example is that if the API throws any error response and lands you on an “Error” page (post removed, user deleted, etc.), the whole UI is stuck there until you refresh the whole page (e.g. clicking “back” updates the URL to your previous page, but you’re still seeing the error).
A new official UI is coming, which might improve the UX.