frontpage.

Made with ♥ by @iamnishanth

Open Source @Github


What right has a "personal fortune" to be anything but working capital?

1•silexia•6s ago•0 comments

1•muthuishere•7s ago

Private Hosted OpenClaw that can connect to your data with included AI models

https://platform.joinable.ai
1•tnac•3m ago•0 comments

Show HN: Machine – One VM per Project

1•katspaugh•3m ago•0 comments

Directory of Blogs with a /Now Section

https://nownownow.com/
2•James72689•6m ago•0 comments

Five LLM agents play Werewolf in-browser, each with a private DuckDB

https://kayhan.dev/posts/013-werewolf-five-agents-one-browser/
1•keynha•11m ago•0 comments

My Wi-Fi Was Faster Than Ethernet So I Fixed It

https://www.youtube.com/watch?v=pzb7py2HeqA
1•iamflimflam1•23m ago•0 comments

An example of functional slop code

https://manemasters.vip/
1•AndrewKemendo•25m ago•1 comment

Driving

https://jzhao.xyz/posts/driving
1•wonger_•28m ago•0 comments

Groww beat every odd to get here. Now what?

https://the-ken.com/newsletters/two-by-two/groww-beat-every-odd-to-get-here-now-what/
1•vidyesh•42m ago•0 comments

AI Poop Analysis App Offered to Sell Me Database of Its Users' Poops

https://www.404media.co/ai-poop-analysis-app-offered-to-sell-me-access-to-its-users-poops/
1•tjek•42m ago•0 comments

Tesla Solar Roof is on life support as it pivots to panels

https://electrek.co/2026/05/14/tesla-solar-roof-promise-vs-reality-pivot-panels/
19•celsoazevedo•43m ago•2 comments

In Japan, we don't see robots as a threat: just a form of presence in the world

https://english.elpais.com/science-tech/2026-05-16/takeshi-yoro-anatomist-in-japan-we-dont-see-a-...
1•pilingual•45m ago•0 comments

Danger Testing

https://www.dangertesting.com/
1•skogstokig•52m ago•0 comments

Anyone on the Internet Can Ring Your Doorbell

https://www.abgeo.dev/blog/anyone-can-ring-your-doorbell
1•jrdres•54m ago•0 comments

Coal Makes a Comeback, Fueled by War in the Middle East

https://www.wsj.com/business/energy-oil/coal-makes-a-comeback-fueled-by-war-in-the-middle-east-fb...
2•JumpCrisscross•55m ago•0 comments

Grok vs. ChatGPT vs. Gemini Comparison 2026: Complete Guide (Tested)

https://aithinkerlab.com/grok-vs-chatgpt-vs-gemini-comparison-2026/
1•carlual•57m ago•1 comment

We refrigerated our way out of needing each other

https://pilgrima.ge/p/the-middleman
1•momentmaker•57m ago•0 comments

Achieving last-iterate convergence in a QNN via an autonomous Gmetric driver

https://github.com/unbconductor/psi.emergence
1•psiemergence•1h ago•0 comments

Grafana Labs internal source code accessed

https://twitter.com/grafana/status/2055827123236171827
11•jschorr•1h ago•1 comment

Show HN: Serene Bach – a Go weblog engine that runs as CGI or HTTP

https://github.com/serendipitynz/serenebach
2•takkyun•1h ago•0 comments

Show HN: Brokkr - Scalable cluster management for GPU/HPC workloads

https://github.com/jackthepunished/brokkr
1•bhdr26k•1h ago•0 comments

As the West Dries Out, a New Generation of Dams Rises

https://www.bloomberg.com/news/features/2026-05-15/colorado-builds-new-dams-in-a-race-with-the-we...
2•divbzero•1h ago•1 comment

Learning to Write (Again)

https://jampa.bearblog.dev/learning-to-write-again/
1•tjampa•1h ago•0 comments

The latest X algorithm has been published to GitHub

https://twitter.com/elonmusk/status/2055277918633562153
3•guiambros•1h ago•0 comments

A Tale of Two File Names

https://tomgalvin.uk/blog/gen/2015/06/09/filenames/
1•GranPC•1h ago•1 comment

Refray – ∞-way RW Git sync tool and auto conflict resolution, for leaving GitHub

https://github.com/MaigoLabs/refray
2•azaneko•1h ago•0 comments

Recent Developments in LLM Architectures: KV Sharing, MHC, Compressed Attention

https://magazine.sebastianraschka.com/p/recent-developments-in-llm-architectures
1•pretext•1h ago•0 comments

I Found Ultra-Pure Quantum Crystals in an Abandoned Mine in the Atacama Desert

https://medium.com/@breid.at/ultra-pure-quantum-crystals-from-an-abandoned-mine-in-a-mysterious-d...
1•vi_sextus_vi•1h ago•0 comments

We Built a Web That Consumes Us

https://gist.github.com/motyar/e53a2c23362a5d5a73a6895e79ee3d20
2•motyar•1h ago•0 comments

Show HN: I built a Ruby gem that handles memoization with a TTL

https://github.com/mishalzaman/memo_ttl
48•hp_hovercraft84•1y ago
I built a Ruby gem for memoization with TTL + LRU cache. It’s thread-safe, and has been helpful in my own apps. Would love to get some feedback: https://github.com/mishalzaman/memo_ttl

Comments

locofocos•1y ago
Can you pitch me on why I would want to use this, instead of Rails.cache.fetch (which supports TTL) powered by redis (with the "allkeys-lru" config option)?
film42•1y ago
Redis is great for caching a customer config that's hit 2000 times/second by your services, but even then, an in-mem cache with short TTL would make redis more tolerant to failure. This would be great for the in-mem part.
thomascountz•1y ago
I'm not OP, nor have I read through all the code, but this gem has no external dependencies and runs in a single process (as does ActiveSupport::Cache::MemoryStore). That could be a "why you should" or a "why you should not" use this gem, depending on your use case.
hp_hovercraft84•1y ago
Good question. I built this gem because I needed a few things that Rails.cache (and Redis) didn’t quite fit:

- Local and zero-dependency. It caches per object in memory, so no Redis setup, no serialization, no network latency.
- Isolated and self-managed. Caches aren't global. Each object/method manages its own LRU + TTL lifecycle and can be cleared with instance helpers.
- Easy to use. You just declare the method, set the TTL and max size, and you're done. No key names, no block wrapping, no external config.
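To make the declaration-only API concrete, here is a dependency-free toy sketch of per-method TTL memoization. This is illustrative only, not the gem's code: it shares one cache across instances and omits the LRU eviction the gem provides (see the repo for the real implementation). The `Pricing` class and `memoize_with_ttl` name are made up for the example:

```ruby
require 'monitor'

# Toy per-method TTL memoization. The real gem adds LRU eviction,
# per-instance cache isolation, and a `memoize :name, ttl:, max_size:` macro.
module ToyMemo
  def memoize_with_ttl(name, ttl:)
    cache = {}                 # NOTE: shared across instances, unlike the gem
    lock = Monitor.new
    original = instance_method(name)
    define_method(name) do |*args|
      lock.synchronize do
        entry = cache[args]
        if entry && (Time.now - entry[:at]) < ttl
          entry[:value]        # fresh hit: skip the expensive call
        else
          value = original.bind(self).call(*args)
          cache[args] = { value: value, at: Time.now }
          value
        end
      end
    end
  end
end

class Pricing
  extend ToyMemo

  attr_reader :calls

  def quote(sku)
    @calls = (@calls || 0) + 1   # stand-in for an expensive computation
    sku.length * 10
  end
  memoize_with_ttl :quote, ttl: 60
end
```

Calling `quote` twice with the same argument inside the TTL window runs the underlying method only once.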

JamesSwift•1y ago
For what it's worth, ActiveSupport::Cache::Store is a really flexible API that imposes minimal contractual obligations (read_entry, write_entry, and delete_entry are the entire set of required methods), but still lets you layer specific functionality (e.g. TTL) on top via an optional 'options' param. You could get the best of both worlds by adhering to that contract, and then people can swap in e.g. the Redis cache store if they want a network-shared store.

EDIT: see https://github.com/rails/rails/blob/main/activesupport/lib/a...
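The contract is small enough to sketch without Rails. Here is a simplified, dependency-free toy of that shape — the class names are made up, and the real `ActiveSupport::Cache::Store` entry methods take more arguments and handle far more (serialization, races, instrumentation):

```ruby
# Toy "cache store contract": subclasses implement only the low-level entry
# methods, and shared behavior like TTL is layered on top via an options hash.
class ToyStore
  def fetch(key, options = {})
    entry = read_entry(key)
    if entry && !expired?(entry, options)
      entry[:value]
    else
      value = yield
      write_entry(key, { value: value, written_at: Time.now })
      value
    end
  end

  private

  def expired?(entry, options)
    ttl = options[:expires_in]
    !!ttl && (Time.now - entry[:written_at]) > ttl
  end
end

# A concrete store supplies just the three required methods.
class MemoryToyStore < ToyStore
  def initialize
    @data = {}
  end

  private

  def read_entry(key)
    @data[key]
  end

  def write_entry(key, entry)
    @data[key] = entry
  end

  def delete_entry(key)
    @data.delete(key)
  end
end
```

Swapping in a different backend then means writing three methods, while `fetch` and the TTL handling stay shared.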

hp_hovercraft84•1y ago
That's actually a really good idea! I'll definitely consider this in a future update. Thanks!
qrush•1y ago
Congrats on shipping your first gem!!

I found this pretty easy to read through. I'd suggest setting a description on the repo too so it's easy to find.

https://github.com/mishalzaman/memo_ttl/blob/main/lib/memo_t...

hp_hovercraft84•1y ago
As in identify where the source code is in the README?
zerocrates•1y ago
I think they mean just set a description for the repo in github (set using the gear icon next to "About"), saying what the project is. That description text can come up in github searches and google searches.
film42•1y ago
Nice! In rails I end up using Rails.cache most of the time because it's always "right there" but I like how you break out the cache to be a per-method to minimize contention. Depending on your workload it might make sense to use a ReadWrite lock instead of a Monitor.

My only suggestion is not to wrap the caller's error in your memo wrapper.

> raise MemoTTL::Error, "Failed to execute memoized method '#{method_name}': #{e.message}"

It doesn't look like you need to catch this for any operational or state-tracking reason, so IMO you should not catch and wrap. When errors are wrapped with a string like this (and caught/re-raised) you lose the original stacktrace, which makes debugging challenging. Especially when your error is something like "pg condition failed for select" and you can't see where it failed in the driver.

hp_hovercraft84•1y ago
Thanks for the feedback! That's a very good point, I'll update the gem and let it bubble up.
JamesSwift•1y ago
I thought Ruby would auto-wrap the original exception as long as you are raising from a rescue block (i.e. as long as $! is non-nil). So in that case you can just

  raise "Failed to execute memoized method '#{method_name}'"
And Ruby will set `cause` for you

https://pablofernandez.tech/2014/02/05/wrapped-exceptions-in...
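A minimal demonstration of that behavior (plain Ruby, no gem involved; method and message are invented for the example):

```ruby
# Re-raising inside a rescue block automatically links the new error to the
# original via Exception#cause (Ruby 2.1+), so the driver-level failure and
# its backtrace survive the wrapping.
def risky
  raise ArgumentError, "pg condition failed for select"
end

wrapped = begin
  begin
    risky
  rescue
    # $! is non-nil here, so this RuntimeError gets `cause` set for free
    raise "Failed to execute memoized method 'risky'"
  end
rescue RuntimeError => e
  e
end

# wrapped.message => "Failed to execute memoized method 'risky'"
# wrapped.cause   => the original ArgumentError, backtrace intact
```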

film42•1y ago
TIL! That's pretty cool. I still think if you have no reason to catch an error (i.e. state tracking, etc.) then you should not.
gurgeous•1y ago
This is neat, thanks for posting. I am using memo_wise in my current project (TableTennis) in part because it allows memoization of module functions. This is a requirement for my library.

Anyway, I ended up with a hack like this, which works fine but didn't feel great.

   def some_method(arg)
     @_memo_wise[__method__].tap { _1.clear if _1.length > 100 }
     ...
   end
   memo_wise :some_method
JamesSwift•1y ago
Looks good. I'd suggest making your `get` wait to acquire the lock until needed. E.g. instead of

  @lock.synchronize do
    entry = @store[key]
    return nil unless entry

    ...
you can do

  entry = @store[key]
  return nil unless entry

  @lock.synchronize do
    entry = @store[key]
And similarly for other codepaths
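A runnable sketch of the suggested pattern — an optimistic unlocked read, then a re-check under the lock. This is illustrative, not the gem's code, and whether the unlocked read is safe depends on the Ruby implementation's memory model:

```ruby
require 'monitor'

class DoubleCheckedStore
  def initialize
    @store = {}
    @lock = Monitor.new
  end

  def get(key)
    entry = @store[key]            # fast path: a miss never takes the lock
    return nil unless entry
    @lock.synchronize do
      entry = @store[key]          # re-check: another thread may have evicted it
      return nil unless entry
      return nil if Time.now > entry[:expires_at]
      entry[:value]
    end
  end

  def set(key, value, ttl)
    @lock.synchronize do
      @store[key] = { value: value, expires_at: Time.now + ttl }
    end
  end
end
```

Under contention, misses and cold keys skip the lock entirely; only live entries pay for synchronization.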
chowells•1y ago
Does the memory model guarantee that double-check locking will be correct? I don't actually know for ruby.
JamesSwift•1y ago
I think it wouldn't even be a consideration here, since we aren't initializing the store, only accessing the key. And there's already a check-then-set race condition in that scenario, so I think it is doubly fine.
hp_hovercraft84•1y ago
Good call, but I think I would like to ensure it remains thread-safe as @store is a hash. Although I will consider something like this in a future update. Thanks!
wood-porch•1y ago
Will this correctly retrieve 0 values? AFAIK 0 is falsey in Ruby

`return nil unless entry`

chowells•1y ago
No, Ruby is more strict than that. Only nil and false are falsey.
wood-porch•1y ago
Doesn't that shift the problem to caching false then :D
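One common fix is to test key presence rather than truthiness, so a cached nil or false counts as a hit instead of triggering recomputation (illustrative sketch, not the gem's code):

```ruby
class PresenceCache
  def initialize
    @store = {}
  end

  # key?-based lookup: a stored nil or false is still a cache hit
  def fetch(key)
    return @store[key] if @store.key?(key)
    @store[key] = yield
  end
end
```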
RangerScience•1y ago
you can probably always just do something like:

  def no_items?
    !items.present?
  end
  
  def items
     # something long
  end

   memoize :items, ttl: 60, max_size: 10
Just make sure the expensive operation results in a truthy value, then add some sugar for the falsey value, done.
madsohm•1y ago
Since using `def` to create a method returns a symbol with the method name, you can do something like this too:

  memoize def expensive_calculation(arg)
    @calculation_count += 1
    arg * 2
  end, ttl: 10, max_size: 2

  memoize def nil_returning_method
    @calculation_count += 1
    nil
  end
hp_hovercraft84•1y ago
This is why I love working with Ruby!
deedubaya•1y ago
See https://github.com/huntresslabs/ttl_memoizeable for an alternative implementation.

For those who don't understand why you might want something like this: if you're doing high enough throughput that eventual consistency is effectively the same as atomic consistency, and IO hurts (i.e. Redis calls), you may want to cache in memory with something like this.

My implementation above was born out of the need to adjust global state on-the-fly in a system processing hundreds of thousands of requests per second.

kartik_malik•1y ago
In React?