Who's gonna help vibe coders with new tech stacks now?

The developers of new languages, frameworks, and libraries had better start writing very exhaustive docs, plus intent-proving tests/examples, integration tests/examples, edge-case tests/examples, and entire open-source apps, or an entire generation of “dreamers” is going to have to finally “learn to code”.

This is a catastrophe: Stack Overflow provided data to LLMs, LLMs replaced Stack Overflow, and now no new Q&A hub exists to provide fresh data. It’s a self-undermining causal loop, like mold growing on food, consuming it, and dying once the food is gone.

From: https://x.com/Hesamation/status/2011251156467794250

PS. Hate to say I saw this coming, but I saw this coming.

6 Likes

Made my day :grin:

They will just not use those new languages/frameworks/libraries. I guess it would be very hard to launch a new language nowadays.

Frameworks and libraries may be easier; LLMs seem to be picking them up, but it’s not great.

For instance, I just asked Perplexity how to define OpenAPI specs in Elixir to see if it would list my library (oaskit), and it does not. It does not even list OpenApiSpex, which is widely used in the Elixir community (except in follow-up questions)!

3 Likes

Agreed.

I dunno… I just went to Kagi Assistant with the prompt:

What is the best way to define OpenAPI specs in Elixir?

and this was the response…

I don’t know if what it had to say was correct or if the results were influenced by this being a search engine assistant vs. a more conventional coding tool. But it definitely discussed both those libraries and that was my first (and last) try to ask about that subject.

I would say the whole area of LLM capability, weakness, and utility doesn’t lend itself well to simple anecdotes for any practical purpose. Capabilities, training, models, context, and prompting all matter in terms of the results that you might or might not get.

UPDATE:

OK, OK… I had to try that in Cursor, too, allowing Cursor to auto-select the model to use and I got similar results for the same prompt:

Checking if this codebase already uses OpenAPI, then summarizing best practices for defining OpenAPI specs in Elixir.

[5 tools called]

Checking dependencies and project structure:

[3 tools called]

Checking for a web framework and summarizing best practices for OpenAPI in Elixir:

[2 tools called]

Here are the best ways to define OpenAPI specs in Elixir:

Recommended approach: OpenApiSpex (code-first)

OpenApiSpex is the most popular library for OpenAPI in Elixir. It uses a code-first approach where you define schemas and operations alongside your code.

Key features

  1. Code-first: define schemas as Elixir modules using OpenApiSpex.schema/1
  2. Request/response validation: automatic validation of params and body
  3. Test helpers: validate responses against the spec in tests
  4. Swagger UI integration: serve interactive documentation via plugs
  5. Active maintenance: actively maintained, supports Elixir ~> 1.14
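Getting started is a one-line dependency; this is a minimal sketch, assuming you pull OpenApiSpex from Hex (the `~> 3.18` version constraint is illustrative, not a confirmed latest release):

```elixir
# mix.exs: add OpenApiSpex as a dependency
# (the "~> 3.18" constraint is illustrative; check Hex for the current version)
defp deps do
  [
    {:open_api_spex, "~> 3.18"}
  ]
end
```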

Basic example structure

# Define schemas as modules
defmodule MyApp.Schemas.User do
  require OpenApiSpex
  alias OpenApiSpex.Schema

  OpenApiSpex.schema(%{
    title: "User",
    type: :object,
    properties: %{
      id: %Schema{type: :integer, description: "User ID"},
      name: %Schema{type: :string, description: "User name"},
      email: %Schema{type: :string, format: :email}
    },
    required: [:id, :name, :email],
    example: %{
      id: 1,
      name: "John Doe",
      email: "john@example.com"
    }
  })
end

# Define operations in your controller
defmodule MyAppWeb.UserController do
  use MyAppWeb, :controller
  use OpenApiSpex.ControllerSpecs

  alias MyApp.Schemas.User

  operation :show,
    summary: "Get user by ID",
    parameters: [
      id: [in: :path, type: :integer, description: "User ID", required: true]
    ],
    responses: [
      ok: {"User response", "application/json", User}
    ]

  def show(conn, %{"id" => id}) do
    # Your implementation
  end
end

Alternative: Oaskit (OpenAPI 3.1)

Oaskit is newer and supports OpenAPI 3.1 with JSON Schema validation. It’s good if you need 3.1 features, but some features are still in development.

Best practices

  1. Code-first: define schemas as modules near your controllers
  2. Validate inputs and outputs: use test helpers to ensure responses match the spec
  3. Include examples: add example values to schemas
  4. Modular schemas: reuse schema modules across operations
  5. CI/CD integration: validate specs in your build pipeline
  6. Serve UI: use SwaggerUI or Redoc plugs for interactive docs
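As a sketch of point 2 above, response validation in tests goes through OpenApiSpex’s `TestAssertions.assert_schema/3`; the module name `MyAppWeb.ApiSpec` and the `/api/users` route below are illustrative assumptions, not names from any real project:

```elixir
# ExUnit test validating a controller response against the spec.
# MyAppWeb.ApiSpec and the route are assumed names, for illustration only.
defmodule MyAppWeb.UserControllerTest do
  use MyAppWeb.ConnCase, async: true
  import OpenApiSpex.TestAssertions

  test "show/2 returns a body matching the User schema", %{conn: conn} do
    api_spec = MyAppWeb.ApiSpec.spec()

    json =
      conn
      |> get("/api/users/1")
      |> json_response(200)

    # Fails with a descriptive error if the response drifts from the schema
    assert_schema(json, "User", api_spec)
  end
end
```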

For your project

Since this appears to be a system component architecture, consider:

  • Adding OpenApiSpex to components that expose HTTP APIs
  • Defining schemas in a shared location if multiple components need them
  • Generating static specs for production and dynamic specs for development

Should I help set up OpenApiSpex for a specific component, or show how to structure schemas for your architecture?

2 Likes

You even gave a good example of the problem (as I see it coming up in the future).

Kagi is referencing the Elixir Forum.
Posts written by real people are where LLMs take their ‘knowledge’ from.

Who is writing new text/giving help for new (upcoming) computer languages?

Disclaimer: I am using LLMs too from time to time, mostly for scaffolding, adding specs, docs and so on - and I’m enjoying this!

Not going to argue a lot about LLMs.
Just my 2 cents.

Best regards

1 Like

OK… so you go to an LLM… but the LLM doesn’t have the information because it was never written… because LLMs.

So then what? You may well come to a forum like this looking for help. I would expect that if an LLM isn’t helpful for a given language, library, etc., communities like this would operate as they did before the LLM age… at least until such time that there’s a body of work that the LLMs can search or draw on. I don’t think they really change much here.

Now I do think that LLMs change things.

For one, if you’ve got your shiny new computer language (or an updated one) and you have an interest in promoting it, you’ll likely create more and better documentation than you might otherwise, so that the LLMs have something to consume. And you’ll work out ways to get it in front of the LLM so your information can be included in those results. Also, if LLM companies find that they don’t have enough information, or that their users are finding the results consistently stale… they’ll figure out ways to make it easier to get fresh information. That might not look like what we’ve used up to this point in time… but the old ways aren’t the only ways. I don’t see the problem as unsolvable.

The bar will be raised for certain ecosystems and new ecosystems, at least as far as issues like business adoption are concerned, but I expect that only incrementally so. If you are starting up a business and you’re planning on leveraging these LLMs for getting product built… you’re not going to choose the upstart language/ecosystem which the LLM doesn’t have deep training on. But this isn’t a new or even an LLM-exclusive problem. How many times in this community has the complaint that no companies are hiring Elixir devs been brought up? Or in other forums, potential employers complaining that the reason they don’t choose Elixir is that they can’t find a big enough labor pool to draw on? I saw that long before LLMs were a factor. Whether those complaints are right or wrong… LLMs don’t fundamentally change anything in this regard… they just widen the existing gap a bit.

4 Likes

This to me only means that there’s even less space and attention for new languages and frameworks.

I am not even sure that’s a bad thing. Most new languages I had a look at outed the creator as somebody who had a very specific axe to grind and made it a point to show that to the world. Some of the newer tech is genuinely interesting and might progress the area but generally there is too much churn.

The danger here is the same as always: what happens if the LLM companies end the free lunch?

My hope for this rather crazy era is that some brave souls will make smaller more specialized open models that we can all download and use ourselves. Because otherwise we would look back at this time as a collective fever dream.

6 Likes

$20/30 a month doesn’t sound like a free lunch to me.

True but you can get very far without paying a cent. People mostly pay for an even more direct integration inside their machine as opposed to copy-pasting from a browser (i.e. Claude Code, Gemini CLI and the like).

1 Like

Yes, you can get a hint or two without paying anything, but the trouble is I often need LLMs for sessions that last longer than that, which is why I pay. And as emphasized in earlier posts, I’m not complaining about having to pay, but about sometimes spending way too much time without getting nearly the value I was looking for (but I already elaborated on that issue elsewhere, so I’ll refrain from repeating myself).

The content on this forum has noticeably changed. Save for some notable examples, it’s largely library/community announcements and discussions like these, more than it is people asking questions. I (subconsciously) completely stopped answering questions here months ago, and of course there are generally fewer people around here because they’re all just talking to their robots. It’s bad for people like me who get excited and motivated by seeing other humans have lots of active discussion about the thing I’m interested in. All that to wonder: when will this be a problem for existing languages too (if it will)?

I am and I don’t! Nor do I have my company pay on my behalf and I won’t until I’m told I have to. I’m also not going to get into that, though, since I already did a while back.

6 Likes

But it’s ebb and flow, right? I don’t engage nearly so much now either, and the robots aren’t all that bad with Elixir… given the right nudges and used judiciously. But once the robots lose that utility, I don’t think the answer for people is to just give up. For the enthusiast, a return to the forum would be sensible. Now, some places, like Stack Overflow, where engagement is a bread-and-butter issue, might not survive that kind of volatile coming and going. But a lot of grassroots, nonprofit communities will still be relatively vibrant, if perhaps catering to a smaller core group of users.

I guess being old has some advantages. I remember a time before the public Internet… And even back then, as a beginner, I was able to find the reference materials and communities where I could learn new programming languages and technologies without too much trouble… And the available information was much sparser back then than it is today.

1 Like

I want to echo what you have said about the content on the forum.

It seems to me that the percentage of non-human written content has gone up dramatically.

Personally, I don’t come to this forum to read AI generated posts. I sincerely hope that we can shift the balance closer to how things were even one year ago.

5 Likes

Oh I’m not saying that. I know people will do what people do to get what they need. I’m wondering how it will affect language evolution and what those interactions will look like after people are used to not having to deal with communicating with pesky humans as often. It’s been a while since I’ve read a long passionate post about postgres from @sbuttgereit ( :wink: ) or really such posts from anyone other than @garrison (who I’m now @-ing because I see them typing) who operates at a level that is way outside of my wheelhouse. Things change and that’s fine. I’m not mad about it, just sad.

1 Like

LLMs are language models; they are one-shot and zero-shot machines. They are better at translating things between languages than at anything else. If this trend continues (which looks likely), it will never have been easier to get adoption for a new framework/language/library.

@zachdaniel called this a while ago btw and was 100% correct.

Nearly every library posted here over the last couple of months has been 100% Claude so at this rate it’s just going to be these discussions lol. Welcome to the dark forest!

5 Likes

The heart is simply an acknowledgement :wink:

1 Like

Yeah, I’m not saying that no LLM knows about those libs :D; they pick things up over time. But some do not, or it can be slow!

It is if you’re subsidizing a company that’s burning VC runway capital like it’s going out of style.

3 Likes

For how long? My understanding is that the AI companies are running the free tier at a severe cost, much more so than any regular SaaS offering. There will be a reckoning soon (1~2 years max).