This interface needs a better relationship with streaming. There is always a lag in response, and a lot of people are going to want to stream the response in non-blocking threads instead of hanging the process while waiting for the response. It's possible this is just a documentation issue, but either way, streaming is a first-class citizen for anything that takes more than a couple of seconds to finish and uses IO.
Valid point. I'm actually already working on testing better streaming using async-http-faraday, which configures the default adapter to use async_http with falcon and async-job instead of thread-based approaches like puma and SolidQueue. This should significantly improve resource efficiency for AI workloads in Ruby - something I'm not aware is implemented by other major Ruby LLM libraries. The current approach with blocks is idiomatic Ruby, but the upcoming async support will make the library even better for production use cases. Stay tuned!
This will synchronously block until `chat.ask` returns, though. Be prepared to pay for the memory of your whole app - tens to low hundreds of MB - being held alive doing nothing (other than handling new chunks) until whatever streaming API this is using under the hood has finished streaming.
Yes, it does. Ruby has a global interpreter lock (GIL) that prevents multiple threads from executing Ruby code in the interpreter at the same time. So Puma does have threads, they just can't run Ruby code simultaneously. They can hide IO though.
Thanks for flagging this. The eval was only in the docs and meant only as an example, but we definitely don't want to promote dangerous patterns in the docs. I updated them.
But when later does come, it can take dev-years to fully disentangle the global state and allow code reuse. Did you gain dev-years in productivity by using it in the first place? Probably not.
If you have good reason to believe that an app will stick around for more than a year, be maintained by more than 3 people, or grow to more than 500k lines of code (sub in whatever metrics make sense to you), don't put off removing global state for later. You will regret it eventually, and it doesn't cost much to do it right the first time.
(Also, no mainstream language I'm aware of forces you to not use global state. Even Java, famed for its rigidity, has global state readily available if you really do need it.)
Global state is a tool that will almost always lead to bad architecture in an app where architecture matters. I'm sure you can point to a counterexample or two where a set of devs managed to keep disciplined indefinitely, but that doesn't change the fact that allowing people to reach into a mutable variable from anywhere in the system enables trivially accessible spooky action at a distance, and spooky action at a distance is a recipe for disaster in a medium to large code base.
In a project with more than a few people on it, your architecture will decay if it can decay. Avoiding global state removes one major source of potential decay.
“Almost” is key there. I respect your position, but it’s an always/never take, and the longer I am in this industry, the more I find myself leaning into “it depends.” Here’s a take that articulates this being done well on a large codebase better than I can in a short comment: https://dev.37signals.com/globals-callbacks-and-other-sacril...
No, it isn't—I'm the one who inserted the word "almost" into that sentence! Where did you get the idea that I meant always/never?
Like I said, you can point to exceptions but that doesn't change the rule. It's better to teach the rule and break it when you really know what you're doing—when you understand that you're breaking a rule and can articulate why you need to and why it's okay this time—than it is to spread the idea that globals are really just fine and you need to weigh the trade-offs. The odds are strongly against you being the exception, and you should act accordingly, not treat globals as just another tool.
Sometimes amputation is the right move to save someone's life, but you certainly should not default to that for every papercut. It's a tool that comes out in extreme circumstances only when a surgeon can thoroughly justify it.
Respectfully, your response further illustrates what I meant about your take being an always/never one. I'm aware you're the one who put "almost" in there, and I didn't mean to imply you were being stubborn with that take; that's why I said (and genuinely meant) that I respect it.
But I’m also aware that you’re comparing using global state to amputating a human limb. I don’t think it’s nearly that extreme. I certainly wouldn’t say global state “almost always leads to bad architecture,” as evidenced by my aligning with a framework which has a whole construct for globals baked into it (Rails’ Current singleton) that I happen to enjoy using.
Sure, global state is a sharp knife, which I already said. It can inflict pain, but it’s also a very useful tool in certain scenarios (more than would equate to “almost [never]” IMO).
So your response aligns with how I took your original post, and what I inferred “almost” really meant: basically never. My point is that I don’t agree with your take being a “rule.” While I understand your perspective, instead of saying basically never, I would say, “it depends.”
Sorry, I like Ruby, but this is nonsense. Rails apps get enormous very quickly, like apps written in every other framework. In most work you can't just declare that your world will be small, your world is as big as your problem is.
The happiness of the developer writing code can be the misery of the one having to read and debug it.
I worked in Ruby for a couple of years around 2009, and having to deal with code that implemented most of its logic via method_missing is still one of the strongest negative memories I have about coding.
`binding.irb` and `show_source` have been magical in my Ruby debugging experience. `binding.irb` to trigger a breakpoint, and `show_source` will find the source code for a method name, even in generated code somehow.
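A minimal sketch of that workflow (the method here is made up):

    def mystery(list)
      binding.irb   # execution stops here and drops into an IRB session, no require needed
      list.map(&:upcase)
    end

    mystery(%w[a b c])
    # inside the session:
    #   list                   # inspect locals
    #   show_source "mystery"  # prints the method's source, wherever it was defined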
Another annoying one from that category is Ruby's forwarded methods. Since they're created via generated, injected code, you can't query at runtime which method they forward to. Or not easily, anyway.
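For example (class invented for illustration):

    require "forwardable"

    class Playlist
      extend Forwardable
      def_delegators :@tracks, :size, :each   # generates Playlist#size and Playlist#each

      def initialize(tracks)
        @tracks = tracks
      end
    end

    Playlist.new(%w[a b]).size  # => 2
    # There's no reflection API that tells you #size is forwarded to @tracks;
    # you have to find the def_delegators call (or dump the generated code) yourself.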
What I advise (and aim for) is only pulling out the sharp knives for "library" code; application code should stay "simple" (and thus much more easily navigable). Otherwise you can absolutely make a bloody mess!
Every language prioritizes something (or some things) because every language was made by a person (or people) with a reason: Python and correctness; Java and splitting up work; Go and something like "simplicity" (not that these are the only priorities for each language). As another comment points out, Matz prioritized developer happiness.
My favorite example of this is the amazingly useful and amazingly whack Ruby array arithmetic; subtraction (`arr1 - arr2`) is element-wise removal, but addition (`arr1 + arr2`) is a simple append. These are almost always exactly what you want when you reach for them, but they're completely "incorrect" mathematically.
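Concretely:

    [1, 2, 2, 3] - [2]   # => [1, 3]        every occurrence of 2 is removed
    [1, 2] + [2, 3]      # => [1, 2, 2, 3]  plain concatenation, duplicates kept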
I'd say they both optimize for DX, but they come at it from very different angles. Ruby is focused on actually writing the code: making it feel expressive, intuitive, and fun.
Go is more about making it easier to build fast and robust systems. But it really doesn't care if the code itself is ugly and full of boilerplate.
As I've gotten more experience, I've come to really appreciate Go's tradeoffs. It's not as fun up front, but on the other hand, you're less likely to get server alerts at 4am. It really depends what you're building though.
The Go ecosystem is generally good. However, given that Go as a language doesn't have any "fancy" (for lack of a better word) syntactic features, you can't create DSLs like this,
though Ruby's expressiveness comes at a cost, and I'd personally stick with Go in a team but use something like RubyLLM for personal projects.
I'm wary of equating "ability to create dsls like this" with "prioritizing developer experience".
Ruby and go each prioritize different parts of the developer experience. Ruby prioritizes the experience of the author of the initial code at the expense of the experience of the maintainer who comes later. Go prioritizes the experience of a maintainer over the experience of the initial author. Both prioritize a developer, both de-prioritize a different developer, and which one makes more sense really depends on the ratio of time spent on writing greenfield code versus maintaining something another human wrote years ago who's long gone.
I disagree with the idea that Go prioritizes the maintainer. More lines of code typically makes maintenance more difficult. Go is easy to read line by line, but the verbosity makes it more challenging to understand the bigger picture.
I find changes in existing Go software often end up spreading far deeper into the app than you'd expect.
The runtime is fantastic, though, so I don't see it losing its popularity anytime soon.
> More lines of code typically makes maintenance more difficult.
That’s kind of just the surface level of maintenance though. Go is not so much focused on making it easy to read a single file, but on minimizing the chains of abstraction and indirection you need to follow to understand exactly how things work.
It’s much more likely that all the logic and config to do something is right there in that file, or else just one or two “Go to definition” clicks away. You end up with way more boilerplate and repetition, but also looser coupling between files, functions, and components.
Contrast that to a beautiful DSL in Ruby. It’s lovely until it breaks or you need to extend it, and you realize that a small change will require refactoring call sites across a dozen different files. Oh and now this other thing that reused that logic is broken, and we’ve got to update most of the test suite, and so on.
> or else just one or two “Go to definition” clicks away
This is the biggest part of it: maintainers need static analysis and/or (preferably and) very good grepability to help them navigate foreign code. Ruby by its nature makes static analysis essentially impossible to do consistently, whereas Go leans to the opposite extreme.
Surely you can have semantically the same API in Go:
// gollm is a hypothetical library here; Must is needed because of Go's explicit error handling:
func Must[T any](v T, err error) T {
    if err != nil {
        panic(err)
    }
    return v
}

chat := Must(gollm.Chat().WithModel("claude-3-7-sonnet-20250219"))
resp := Must(chat.Ask("What's the difference between an unexported and an exported struct field?"))
resp = Must(chat.Ask("Could you give me an example?"))
resp = Must(chat.Ask("Tell me a story about a Go programmer"))
for chunk := range resp { // requires Go 1.23+ range-over-func iterators
    fmt.Print(chunk.Content)
}
resp = Must(chat.WithImages("diagram1.png", "diagram2.png").Ask("Compare these diagrams"))

type Search struct {
    Query string `description:"The search query" required:"true"`
    Limit int    `description:"Max results" default:"5"`
}

func (s Search) Execute() ([]string, error) { ... }

// Methods can't have type parameters in Go, so tool registration would be a package-level generic function:
resp = Must(gollm.WithTool[Search](chat).Ask("Find documents about Go 1.23 features"))
And so on. The syntax is different, of course, but the semantics (save for language-specific nuances, like error handling and the lack of optional arguments) are approximately the same, the biggest difference being that WithSomething() has to precede Ask().
I was an early contributor to Langchain and it was great at first - keep in mind, that's before chat models even existed, not to mention tools, JSON mode, etc.
Langchain really, I think, pushed the LLM makers forward toward adding those features but unfortunately it got left in the dust and became somewhat of a zombie. Simultaneously, the foundational LLM providers kept adding things to turn them more into a walled garden, where you no longer needed to connect multiple things (like scraping websites with one tool, feeding that into the LLM, then storing in a vector datastore - now that's all built in).
I think Langchain has tried to pivot (more than once perhaps) but had they not taken investor $$ early on (and good for them) I suspect that it would have just dried up and the core team would have gone on to work at OpenAI, Anthropic, etc.
langchain and llamaindex are such garbage libraries: not only do they never document half of the features they have, but they also keep breaking their APIs from one version to the next.
I was about to mention those. I decided a while ago to build everything myself instead of relying on these libraries. We could use a PythonLLM over here because it seems like nobody cares about developer experience in the Python space.
Thank you! This is what the Ruby community has always prioritized - developer experience. Making complex things simple and joyful to use isn't just aesthetic preference, it's practical engineering. When your interface matches how developers think about the problem domain, you get fewer bugs and more productivity.
I think it's the very nice-looking and clean high-level API that should be a pleasure to use (when it fits the job, of course).
I'm pretty sure these API semantics (an instance builder to configure, then ask/paint/embed with a language-native way to handle streaming and declarative tools) would look beautiful and be easy to use in many other languages; e.g. I can imagine a similar API - save, of course, for the Rails stuff - in Python, C# or Erlang. While this level of API may not be sufficient for all possible LLM use cases, it should certainly speed up development time when it's all that's needed.
It's the extra parens, semi-colons, keywords and type annotations. Ruby makes the tradeoff for legibility above all else. Yes, you can obviously read the TypeScript, but there's an argument to be made that it takes more effort to scan the syntax as well as to write the code.
Also:
const chat: Chat = LLM.chat;
...is not instantiating a class, where Ruby is doing so behind the scenes. You'd need yet another pair of parens to make a factory!
> It's the extra parens, semi-colons, keywords and type annotations.
I always thought such minor syntactic differences were unimportant, except to folks who are still learning syntax and haven't yet seen enough languages to stop caring much about it.
YMMV of course, but whether I need to jump through hoops with some API, or have things conveniently returned to me in a single call, matters a lot for my developer happiness. Whether my code needs semicolons or indentation or parens feels like such a negligibly tiny nuance that things like this don't even blip on my mental radar... I always think about what the code does, and don't even see those details (unless I have a typo lol).
Maybe my opinion on this is just the echoes of the ancient C vs Pascal vs BASIC syntax holy wars from when I was still a schoolkid, idk. I mean, when I wrote Scheme or Lisp I didn't really "see" all those parentheses (but then, I just checked some Lisp code and the syntax looks off and takes time to get through, since I haven't practiced it in a long while and it's pretty different from anything I've used recently).
Again, YMMV, but `const chat = new LLM.Chat();` and `chat = RubyLLM.chat` are exactly the same thing to me - I don't remember the actual tokens from the screen, I immediately mentally process both as something like "instantiate a chat object and assign `chat` to it" (without really verbalizing it much, but as a concept/idea). And I don't think a little syntactic noise like `const` or `;` makes things worse or better for me. Although, to be fair, I could be wrong here - I haven't really done any experiments in this regard, with properly defined methodology and metrics, and my subjective perception could be deceptive. Sadly, I'm no scientist and not even sure how to set one up correctly...
What do you mean by 'assignment or creating a new object'? It assigns chat to ... whatever RubyLLM.chat returns. Do you mean the function could have a clearer name?
Perhaps to you, but anyone that’s ever worked with Ruby knows that there’s no such thing as fields/properties/members in Ruby. There are only methods. Parentheses are optional for method calls.
You're confusing beautiful with simple. There's a lot of complexity and magic that's hidden behind the curtains of that "beautiful" syntax. Great for scripts and small programs, and an absolute nightmare on large projects. It's too simple.
As a general rule of thumb, don’t yuck someone else’s yum. Plenty of people like the trade offs Ruby makes. And plenty of absolutely huge businesses use it quite successfully (e.g., Shopify, GitHub, GitLab, Airbnb, Stripe).
Can somebody explain to me what is so great about this package? It just seems to be making API calls. I am not critical, I am just genuinely curious since I don't understand the landscape.
I understand it's a Ruby thing, `chat = RubyLLM.chat` looks odd. How do I know whether it's a function call returning an object or just an assignment? Why not just use `RubyLLM.chat()` and eliminate the ambiguity?
In Ruby, chat = RubyLLM.chat is a method call, since Ruby doesn't have properties, only methods. Dropping the parentheses is standard Ruby style, familiar to Ruby developers. Adding parentheses is allowed, but it isn't idiomatic and doesn't read as cleanly. The library aims for a clean style consistent with Ruby conventions.
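That is:

    chat = RubyLLM.chat     # method call, parentheses omitted (idiomatic)
    chat = RubyLLM.chat()   # exactly the same call, just with explicit parentheses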
That’s an example of what your code could look like. For example, you might have a Rails app with a Document model and have added search to that model via Searchkick. Then this code is called by the library to execute a search.
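Roughly like this; I'm going from memory of the gem's tool DSL, so the exact keyword names (description, param, desc:) may differ, and Document/Search are just illustrative:

    class Search < RubyLLM::Tool
      description "Searches our document database"
      param :query, desc: "The search query"
      param :limit, desc: "Maximum number of results"

      def execute(query:, limit: 5)
        Document.search(query).limit(limit).map(&:title)
      end
    end

    chat.with_tool(Search).ask "Find the onboarding docs"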
Feels more useful for something like a CLI where you want to run one-off commands to test something, rather than running it in production, given how non-deterministic the behavior can be. For example, for something like
chat.ask "What's being said?", with: { audio: "meeting.wav" }
you definitely don't want users to get a valid response only 75% of the time, maybe?
I am really impressed and delighted by how simple this library is.
I agree that waiting for a response can be an issue. I don't think this is meant for such purposes, but rather for tools that would process and create artifacts based on inputs.
I love Mistral and local LLMs, so that would probably be the thing I would like to add.
I run engineering for a venture backed AI-first startup and we use Ruby/Rails.
For us, it made sense to leverage one of the best domain modeling and ORM frameworks out there. Most of our inference is http calls to foundational models, but we can still fine tune and host models on GPUs using Python.
Inference matters, but part of building an effective user platform are the same old SaaS problems we’ve had before, and Rails just works. Inbound and outbound email done in a day. Turning an OCR’d title from ALL CAPS into Title Case is one method call and not a whole custom algorithm, etc.
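(That one method call presumably being ActiveSupport's titleize; outside Rails you'd require it explicitly:)

    require "active_support/core_ext/string"

    "SOME PROPERTY TITLE".titleize  # => "Some Property Title"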
A lot of people seem to think Ruby is slow for some reason but it’s as fast as Python, and with falcon as fast as node for async behavior. Safe to say the application language taking 0.03 seconds instead of 0.003 seconds when you have to wait 3 seconds for first token is absolutely not the bottleneck with LLM heavy workflows, anyway.
And yes, metaprogramming is a powerful tool with which you can easily shoot yourself in the foot. We culturally just don’t write any code that’s not greppable so don’t use method_missing kinds of things unless it’s in a robust gem like active record. Pretty trivial problem to solve really.
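For contrast, here's a sketch of the kind of thing we avoid (class and method names invented): `repo.find_by_email(...)` works at runtime, yet grepping the codebase for "find_by_email" finds nothing.

    class Repository
      def initialize(records)
        @records = records
      end

      # Dynamic finders: find_by_<attribute>(value)
      def method_missing(name, *args)
        if name.to_s.start_with?("find_by_")
          attr = name.to_s.delete_prefix("find_by_")
          @records.find { |r| r.public_send(attr) == args.first }
        else
          super
        end
      end

      def respond_to_missing?(name, include_private = false)
        name.to_s.start_with?("find_by_") || super
      end
    end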
PS - We’re hiring, if that philosophy aligns with you!
What's a good way to learn the modern Ruby ecosystem nowadays?
I played with Ruby when I was a teenager (~2015 or so), and I definitely remember enjoying it. I know there's still a vocal group of users who love it, so I would be interested in digging in again.
It's my favorite programming language but I seldom get to use it because I'm an AI Engineer. But I just recently went out on my own so I guess that can change now, hm...
In terms of LLM code generation as well, the well-structured nature of a Rails application, where there is a place for everything and a structure for tests to be added, really helps from the perspective of getting a comprehensible application out of it that is easy to modify. In addition to the existence of well-tested components for most normal web application tasks, maybe it helps that a lot of Rails has already been based on old-fashioned code generation for 20 years.
I have this same suspicion. I dusted off a hobby Rails app from two years ago that I was making with Cursor. I decided to try completely changing the main functionality of the app with the much better LLMs of today and was shocked at how well it did in one shot.
Now compare that to my recent experience with having Cursor help me work on a preexisting Node/React app...geez. What a pain. (It doesn't help that I wasn't the one that originally created the React app though.)
Saw this gem of a gem on Reddit earlier today, and there were some trollish comments about no one using Ruby anymore, blah blah blah, which quietly bummed me out. Surprised and delighted to see it at #1 here on HN tonight!
There’s a lot of folks who get immense schadenfreude talking about things they know nothing about to strangers on the internet who also don’t know anything.
Ruby, especially with Rails, is particularly suited for AI coding because of how mature it is and its convention over configuration: most of the important stuff is up to date in the model, and the entire thing comes with a fairly comprehensive set of ideas about how it should be used cohesively to build entire apps.
> It's worth remembering that the trolls that complain about Ruby do so because they care about it.
I don't think so. I mean, there are complaints about stuff you care about, like people complaining about healthcare. (edit: there are other forms of caring, see my grandchild comment)
Dissing on Ruby is definitely not this; they are not Ruby users wanting Ruby to be better. They don't even know Ruby; they just know that dissing on it is socially accepted and makes them feel good.
Usually people get huffed up about stuff they care about. Caring doesn’t mean wanting it to be better, it could also mean get worked up about.
Surely a one-off comment about nobody using Ruby doesn’t mean you “care”, but if it is true that it is the same people who keep commenting, they obviously care.
Is it because they are jealous of the beauty of Ruby/Rails, as a Rubyist I’d think so, but who is to say really. Maybe they worked at a company where they replaced whatever their favourite stack is with Rails and they have hated Ruby ever since. It could be anything.
You wouldn’t keep responding to stuff you don’t care about at some level.
Equating caring with wanting it to be better was a mistake on my part. It made my comment untrue, and it got you worked up. Sorry for that.
All in all, I don't think the other forms of caring apply either. I think that parroting "Ruby is dead" doesn't mean they care about Ruby; it's just a thing people like to parrot, without the meaning really registering in their heads. A form of bonding, a form of distraction, a form of opening a social interaction, a form of self-reassurance, etc. It is a lot of things, and caring about Ruby at all is usually not among them (IMO).
I agree. Parroting some meme isn't caring per se. But I was working from the earlier claim that it's the same names who keep doing it. If you say "Ruby is dead" 5 times a year it isn't necessarily "caring"; if it becomes 100 times a year, there is something else at play.
I'm not sure what your point is. I care about Ruby and want it to die because I have worked on the Gitlab codebase, which is written in Ruby. It's a bad language and it stopped me being able to understand behaviours and fix bugs.
In contrast I have also worked on VSCode which is similarly huge but written in Typescript. Faaaar easier to work with, enough that I've been able to contribute a couple of medium sized features and several bug fixes.
So when people say "yeay Ruby" I try to discourage them because I don't want more Ruby code in the world that I might have the misfortune of having to interact with in future.
I think you are confusing the beauty and elegance of the language with the crap that people write.
My experience is that the sort of folks who misuse Ruby's powerful features are the sort of idiots who don't realise that just because a thing can be done doesn't mean it should be done. These are the sort of people who are capable of misusing most languages.
Was it really idiomatic Ruby that "stopped you being able to understand behaviours and fix bugs"? Or was it unorganized monkey patching?
I have a hard time thinking of Ruby as difficult to understand, particularly compared to its opposites, Lisp, Erlang, or Haskell: languages that are extremely simple, to the point where the burden of complexity is shoved into the code itself.
IIRC Github was originally written in Ruby as well.
Now that they use something "far easier to work with", the UX gets to suffer accordingly.
I've never been in a situation where making the customer happy was synonymous with applying best practices to the tech stack or otherwise making it so everyone and their dog can contribute.
It ranks right after Shell (#8) and C (#9). Ruby is still a mainstream language, and it's fairly easy to find a Ruby job. Compare that to Clojure or Haskell.
It's dropped in relative popularity, but the demand feels like it's still increasing to me (been doing Ruby professionally since 2006), just not as fast as some of the other languages.
Keep in mind the number of developers overall is rising rapidly still.
Compare that to Rust. For all the hype it has, very few companies are shipping products with it. There are a few, but nowhere near as many as are using Ruby.
I get your point but it's a case of right tool for the job. Every copy of Ruby now includes YJIT written in Rust because Rust is the right tool for that task.
It's easy to forget though that number of lines of code required to do something is also a valid metric and Ruby beats Rust on that.
So if you're shipping CRUD web apps that might be a more important metric than say memory usage or CPU time.
Different job, different tool. More people want to ship web apps than write their own JITs.
That's probably true, but also a poor measure of success. I bet there are more companies using Ruby than there are companies using C++, too. They fill different niches, and different types of companies deliver very different products using those languages.
The ratio of people who can code in Python or Ruby to people who can code in Rust or C++ is very high.
I don’t know why “number of companies using language X” is a metric that is used here. Wordpress is serving 43% of websites on the internet as of 2025, so we should all be learning PHP!
That's very nice, but not in itself a good argument for language use. If you count using a system written in a language, then almost every programmer uses Ruby daily as both Github and Gitlab are written in Ruby. Similarly you probably interact quite frequently with (banking) systems written in COBOL, but nobody would call COBOL a popular language.
Ruby will always have a special place in my heart. I cut my teeth as a young programmer on that language, and I learnt its value (as well as the value of using something else) along the way.
Ruby code can be downright poetic, for better or worse. There's a certain kind of magic to the kind of code it enables. That's not always good, but it _is_ beautiful.
I encourage everybody to read the venerable "Why's Poignant Guide to Ruby" [1] to see what I'm talking about.
I wish Ruby was cross-platform. It still only works on Windows using the MSYS2 emulation layer, and the only reason as far as I can tell is that it committed hard and early to `fork()` as the main way to use multiple cores.
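i.e. the classic pre-fork model, which simply isn't available on native Windows builds:

    4.times do |i|
      fork do                      # one copy-on-write child process per core
        puts "worker #{i} in pid #{Process.pid}"
      end
    end
    Process.waitall                # reap the children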
Over the last 10 years the number of programmers has grown enormously. So ruby dropping in position does not necessarily imply the absolute number of ruby programmers went down.
The best Ruby always surpasses the elegance of the best Python... Unfortunately, for practical reasons, at this point I go for Python: more libraries, fewer problems with the C implementation of the interpreter (I had issues with the GC in the past), and better LLM understanding of the code.
Surely you can have the same API elegance and overall semantics in Python?
from decimal import Decimal      # needed by the get_weather tool below
from typing import Annotated

chat = python_llm.Chat()
_ = chat.ask("What's the best way to learn Python?")
# Analyze images
_ = chat.ask("What's in this image?", image="python_conf.jpg")
# Generate images
_ = python_llm.paint("a sunset over mountains in watercolor style")
# Stream responses
for chunk in chat.ask("Tell me a story about a Python programmer"):
print(chunk.content)
# Can be a class if necessary, but for this weather thingy we can probably do with a simple callable
# Requires Python 3.9+ for typing.Annotated
def get_weather(
latitude: Annotated[Decimal, "Latitude of the location"],
longitude: Annotated[Decimal, "Longitude of the location"]
) -> str:
"""
Gets current weather for a location.
"""
...
_ = chat.with_tool(get_weather).ask("What's the weather in Berlin? (52.5200, 13.4050)")
(The `_ =` bits are mine, to emphasize we have a meaningful result and we're knowingly and willingly discarding it. Just a habit, I hope it doesn't bug people.)
Ruby has significantly more capable metaprogramming facilities, but they aren't used in RubyLLM; it's all just objects and methods (the biggest difference being the use of an iterable in Python vs providing a block in Ruby, as I felt an iterable would be more Pythonic here), which is nothing Ruby-specific.
And IMHO advanced metaprogramming should be used carefully, as it may make code pretty but really hard to comprehend and analyze. My largest issue with Rails is the difficulty of telling where things come from and what's available (the lack of explicit imports, plus the ability to re-open any class or module and inject more stuff into it so there's no single place that defines it, is a double-edged sword that may lead to chaos if wielded carelessly - YMMV, of course, I'm merely stating my personal preferences here).
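For example, any file in the app (or any gem) can do this, and nothing at the call site tells you which one did:

    class String          # re-opens the core String class
      def shout
        upcase + "!"
      end
    end

    "ruby".shout   # => "RUBY!"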
Why does this say it was posted four hours ago on the front page, four days ago on Algolia, and 3 days ago on /from, while the comments here are all from minutes to hours ago?
1. Moderators can re-submit interesting stories from a second chance pool: https://news.ycombinator.com/pool (This might happen automatically from time to time?) When this happens some of the timestamps get updated but others don't.
2. Moderators can invite users via email to re-submit stories: https://news.ycombinator.com/invited
If you're wondering about `module RubyLLM`: that's just how Ruby is often written.
Addendum:
Ruby does not require you to put the opening and closing parenthesis on a function to run that function, and it's not always put there when you have zero or 1 parameter (I find it to be cleaner when you have a parameter, but have no opinion when there isn't a parameter)
In the example code from the link itself, you'll see:
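    chat.ask "What's being said?", with: { audio: "meeting.wav" }

which is the same as

    chat.ask("What's being said?", with: { audio: "meeting.wav" })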
> Ruby does not require you to put the opening and closing parenthesis on a function to run that function, and it's not always put there when you have zero or 1 parameter
mind = blown
I always liked how functionName denotes the function and functionName() calls the function, and then it denotes the result e.g. in JavaScript or in math. But just saying functionName to call a function makes the code read more like English. Code that reads like English > code that reads like math. (And you can still talk about functions of course.)
It comes with the downside that if you want to pass the function itself around, you need to do f=something.method(:the_function), and then f.call(args). It's not a huge deal, but... meh.
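For example:

    add_two = 2.method(:+)    # grab a Method object
    add_two.call(3)           # => 5
    [1, 2, 3].map(&add_two)   # => [3, 4, 5]  (& converts it back to a block)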
The API looks nice on the surface, but this will be expensive to operate due to Ruby and Rails’ lack of support for concurrency. Everything is blocking, which is not a great match for the async nature of interacting with these models.
Unfortunately, 5 years after the release you linked, almost none of this has made it to Rails or even to relatively new libraries this post is about. The reason (imho) are unfortunate design choices in how the language incorporates concurrency - it’s just not well done.
Well, instead of looking at the first two hits on Google, I spent several years on a platform team at a multi-billion dollar company using mostly Rails, working on solving real-world problems caused by Ruby/Rails' design choices, which led me to believe that Ruby concurrency as of today is hot garbage.
They need fundamental breaking changes to the language to fix this, which means people won’t be able to use their beloved pile of 438 gems that haven’t seen a commit in 7 years. If I had to bet, I’d say the language is dead. It might still be a nice niche language for easy prototyping, but the world has moved on to async/await (js/python/Rust/C++) or stackful coroutines (Go).
Aside from that the DSL is quite excellent.
Check out the async gem, including async-http, async-websockets, and the Falcon web server.
https://github.com/socketry/falcon
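A rough, untested sketch of how the two could combine, assuming RubyLLM's block-based streaming plays nicely inside a fiber (which is what the async-http-faraday experiments mentioned upthread are about):

    require "async"
    require "ruby_llm"

    Async do |task|
      ["Summarize doc A", "Summarize doc B"].each do |question|
        task.async do
          RubyLLM.chat.ask(question) do |chunk|   # chunks stream into the block
            print chunk.content
          end
        end
      end
    end   # the reactor runs until both asks have finished streaming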
Even a goto can be elegant sometimes.
I thought it was Python and readability and "one way of doing things".
It doesn't deal with any of the hard problems you'll routinely face in implementation.
If you look at the TypeScript options, it's like giving yourself a waterboarding session of your own volition.
This is mainly a matter of syntactic style!
Ruby: late to the party, brought a keg.
Keep going! Happy to see ollama support PR in draft.
If you don’t like it, don’t use it.
Love this project!
https://news.ycombinator.com/item?id=43369977
But it seems hashnode.dev as a domain is blocked entirely. Hopefully Ruby gets another chance in the AI era.
`Document.search(query).limit(limit).map(&:title)` - how do you define the documents to search on?
haven't tried it yet though
Allowing AI to eval() code or execute any SQL statement would scare the crap outta me personally.
You’re totally right that eval()’ing unknown code is terrible but it doesn’t look like the gem itself is doing that.
The usage of eval() is in a user-written tool in the docs. Definitely a bad example and it should probably be changed.
https://guides.rubyonrails.org/getting_started.htm
Just have a toy app you want to build in mind
Outside of Rails, you might find interesting libraries like sinatra, sequel, roda, dry-rb, faraday, sorbet, TruffleRuby…
Don’t let it bum you out.
You'll often see the same names coming back on every post to angrily insist that no one is interested in Ruby
...apart from them obviously because if they didn't they would be busy trolling something else. :P
I think so. I'm not an expert but the Gitlab codebase seems like fairly typical Ruby to me.
It all seems pretty negative value to me, though. It's terribly slow.
It is somewhat objectively true:
https://octoverse.github.com/2022/top-programming-languages
https://github.blog/wp-content/uploads/2024/10/GitHub-Octove...
It doesn't mean much, and this library can be reproduced in any of those top 10 languages from what I can tell.
Of the many developers who used to write Ruby (myself included), I would wager not many of those same people still do.
Engineering is the art of trade-offs.
[1]: https://poignant.guide/
> 166 points|ksec|4 days ago|21 comments
> 168 points by ksec 4 hours ago
But yes; it’s a bit dodgy to resurface old news like this imo and pretend it’s new news.
I’d go as far as to say that being at #1, under the circumstances, means it’s been artificially boosted somehow.
I haven’t the foggiest why anyone would bother though.
We're running a few very popular services on rails and it's really not a problem in any way.
https://www.ruby-lang.org/en/news/2020/12/25/ruby-3-0-0-rele...
https://thoughtbot.com/blog/my-adventure-with-async-ruby
First two hits on Google.