Harper Evolves

(elijahpotter.dev)

24 points | by chilipepperhott 4 hours ago

4 comments

  • torginus 4 hours ago

    The biggest weakness of genetic algorithms is that they can't make use of gradients, meaning they have no idea how to 'move' towards the solution. They end up guessing and refining their guesses, which makes them much slower to converge.

    Their advantage is that they don't require gradients (so the fitness function doesn't need to be differentiable), but I don't think they're going to be the next big thing.
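
    A minimal sketch of the point above, assuming a toy one-dimensional fitness function (everything below is illustrative and not from the article): the genetic algorithm only evaluates fitness, never its derivative, and "moves" by mutating and re-selecting guesses.

      # Illustrative sketch (not from the article): a tiny genetic algorithm
      # maximizing a toy fitness function without using its gradient.
      import random

      def fitness(x: float) -> float:
          # Toy objective with a peak at x = 3. A GA only needs to evaluate
          # this function, never differentiate it.
          return -(x - 3.0) ** 2

      def evolve(pop_size=50, generations=100, mutation_scale=0.5):
          population = [random.uniform(-10, 10) for _ in range(pop_size)]
          for _ in range(generations):
              # Selection: keep the fitter half ("refining the guesses").
              population.sort(key=fitness, reverse=True)
              survivors = population[: pop_size // 2]
              # Mutation: each child is a survivor plus random noise; no
              # gradient tells us which direction to move.
              children = [s + random.gauss(0, mutation_scale) for s in survivors]
              population = survivors + children
          return max(population, key=fitness)

      if __name__ == "__main__":
          best = evolve()
          print(f"best x ~= {best:.3f}, fitness = {fitness(best):.4f}")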

  • dang 4 hours ago

    Recent and related:

    Harper – an open-source alternative to Grammarly - https://news.ycombinator.com/item?id=44331362 - June 2025 (162 comments)

  • jokoon 4 hours ago

    Probably much better than gradient descent and co.

  • dang 4 hours ago

    The submitted title was "Evolution is still a valid machine learning technique" but that sentence doesn't appear in the article, which is really about (interesting!) specific work on a specific program. I couldn't find a better representative sentence in the article so I went with the page's own title, which is what the HN guidelines default to anyway (https://news.ycombinator.com/newsguidelines.html).