It's exactly this type of thinking that made the approach popular in the first place - "just before the turn of the century" - along with the fact that there is only "one place" that needs to be updated during maintenance, rather than having to push out yet another deploy for all the affected (desktop) applications. But in reality there are a lot of things that can interfere with that ideal ...
tl;dr: Stored procedures are powerful, especially for addressing vendor-specific optimizations and issues - however, once they are used as an application-development Swiss army knife, any kind of (poorly informed) misuse or abuse will quickly land you in a rigid (and brittle) cul-de-sac.
I'm surprised that nobody has mentioned what I perceive as the most severe pain point: stored procedures exist in an environment where circumstances can quickly conspire to make (unit-)testing them a royal PitA - often to the point that "proper" testing doesn't get done (more on that later).
To put matters in perspective one has to take into account what was going on - "just before the turn of the century" - some of which may have left entrenched manifestations to this day in the guise of "legacy, mission-critical systems". Before DevOps was invented, DBAs and application developers mostly lived in two entirely separate cultures and camps (aka departments), and while it was recognized that application development should receive solid DBA support, good DBAs were scarce and typically overloaded with priority work in operations.
Meanwhile, among some application developers it wasn't that uncommon to view SQL, like other DB-related technologies, as just too strange and arcane to be bothered with ("Why can't I just use a QBE tool to generate that query for me?"). So there was often an "us vs. them" situation between application developers and DBAs. Right off the bat this was not an ideal environment for developing stored procedures.
Now there are situations where the use of stored procedures is entirely justified; however, the golden hammer was a common affliction, and it could have one of two extreme outcomes for stored procedures:
- Application developers don't bother to learn the vendor-specific procedural language first and are then surprised when there are significant differences from THAT OTHER LANGUAGE they would rather be using - i.e. the stored procedures aren't going to be particularly well written. (Some stored procedures can be written in a hosted/managed language - Java in Oracle, .NET in SQL Server - but there can be constraints and penalties that make these kinds of solutions less than ideal.)
- An ambitious application developer seeks to exploit the procedural language extension to the fullest (i.e. actually puts some effort into learning it) - and promptly goes overboard by implementing actual domain logic inside the stored procedures, which basically lets the implementation of your bounded context dip into the database. Within the previously described culture there were typically two types of planned deployments to production: the slow one, when no DB changes were required, and the really slow (and tedious) one, when DB changes like updating stored procedures were required (i.e. additional coordination with the in-demand DBAs). So in reality the deployment inertia inherent to stored procedures served as a significant disincentive to put anything into them that had any risk of needing to change in the future.
I also believe DB-based stored procedures are the wrong place to "decouple the schema from the application". Some aspects of the object-relational impedance mismatch don't simply evaporate when you stop using OO. In some sense the DB is dealing with "data-at-rest" while the application is dealing with "data-in-use", and those are very different responsibilities facilitated by some very different "engines". So while the DB schema and the application schema may be similar initially, the pressure to keep both running at peak efficiency can cause them to diverge over time.
To manage complexity the application should only deal with the application schema, while the database, and that includes the stored procedures, should only deal with the DB-schema. The schema transformation should be the responsibility of a repository (which would actually wrap a mapper like an Ecto repo rather than be represented by it).
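To make that division of responsibility concrete, here is a minimal sketch of a repository that owns the DB-schema-to-application-schema translation. The `Customer`/`customers` names and the trivial first/last-name mismatch are invented for illustration, and SQLite stands in for a real RDBMS:

```python
import sqlite3
from dataclasses import dataclass

# Hypothetical application-side model: the application thinks in terms of
# a Customer with one "name" field, while the DB schema happens to store
# first and last name separately (a deliberately trivial schema mismatch).
@dataclass
class Customer:
    id: int
    name: str

class CustomerRepository:
    """Owns the DB-schema <-> application-schema translation, so the rest
    of the application never sees the table layout (or stored procedures)."""

    def __init__(self, conn: sqlite3.Connection):
        self.conn = conn

    def get(self, customer_id: int):
        row = self.conn.execute(
            "SELECT id, first_name, last_name FROM customers WHERE id = ?",
            (customer_id,)).fetchone()
        if row is None:
            return None
        # Map the DB-schema row into the application-schema object.
        return Customer(id=row[0], name=f"{row[1]} {row[2]}")

    def add(self, customer: Customer):
        # Map the application-schema object back into the DB schema.
        first, _, last = customer.name.partition(" ")
        self.conn.execute(
            "INSERT INTO customers (id, first_name, last_name) VALUES (?, ?, ?)",
            (customer.id, first, last))

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customers (id INTEGER PRIMARY KEY, first_name TEXT, last_name TEXT)")
repo = CustomerRepository(conn)
repo.add(Customer(id=1, name="Ada Lovelace"))
print(repo.get(1))  # Customer(id=1, name='Ada Lovelace')
```

If the DB schema later diverges (say, the table gets split or denormalized for performance), only the repository changes; the application-schema `Customer` stays put.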
Well, for one, the procedural extensions can feel quite restrictive, especially compared to the facilities available in current mainstream programming languages - at times it can feel like you are working in an environment that was current 10 years ago. Then there is the environment the code has to run in: there are packages that allow activities like manipulating files and accessing the network, but they often have arcane interfaces and are locked down for security reasons. And you need to know the (vendor-specific) RDBMS engine in detail, or be prepared to step on some landmines - for example:
"Final question: what happens when a PL/SQL block executes a query that calls a user-defined function that, in turn, executes a SQL statement?" I asked.
"Lots and lots of context switches?"
"Yes, indeed, a whole mess of context switches, and each switch is a delay. So the more you switch, the slower your code goes. Remember FORALL from that training I did a few months ago? Reduction of context switches was the primary design goal of that feature."
That may be true for a SaaS or PaaS product. But there are cases where the application is the product and corporate clients want to use their own in-house DBA expertise - i.e. they want to use their preferred RDBMS. In that case you want to minimize the use of stored procedures, as they would only increase the effort of porting to "yet another database".
Back to my other point. Popular wisdom often declares that unit tests should not involve the database - because interacting with the database slows the unit tests down (which need to be fast, otherwise they aren't run often enough, etc.). Well, stored procedures live in the database - worse yet, they usually select, insert, update, and/or delete data directly from the DB tables. Unit testing stored procedures proceeds at a glacial pace, especially if you have to test across transaction boundaries with lots of inserts, updates, deletes (truncates), and verifies during test setup and cleanup. By and large I see stored procedures as a potential impediment to automated testing. And even if you can get past the slowness of automated DB testing - you typically still need full autonomy over your testing environment.
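The overhead described above follows a recognizable setup/exercise/verify/cleanup shape. A minimal sketch, with SQLite standing in for a real RDBMS (it has no stored procedures, so a plain UPDATE plays the part of the procedure under test - the table and names are invented):

```python
import sqlite3
import unittest

class OrderDiscountProcTest(unittest.TestCase):
    """Sketch of the fixture cycle a stored-procedure test forces onto the
    database. Against a real RDBMS every step here is a round trip."""

    def setUp(self):
        # Setup: schema creation plus seed data - the slow part when it
        # runs over the wire against a shared instance.
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute(
            "CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
        self.conn.executemany("INSERT INTO orders VALUES (?, ?)",
                              [(1, 10.0), (2, 20.0)])

    def tearDown(self):
        # Cleanup (truncates/rollbacks) so the next test starts clean.
        self.conn.close()

    def test_half_price_discount(self):
        # Exercise the "procedure" (a plain UPDATE stands in for it here) ...
        self.conn.execute("UPDATE orders SET total = total * 0.5")
        # ... then verify by querying the tables it mutated.
        totals = [r[0] for r in self.conn.execute(
            "SELECT total FROM orders ORDER BY id")]
        self.assertEqual(totals, [5.0, 10.0])
```

In-memory SQLite makes this instant; point the same fixture at a shared, DBA-controlled instance and every setUp/tearDown becomes the glacial part.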
Let's go back to "just before the turn of the century" and see what was going on there.
- The corporate databases were HUGE. So huge, in fact, that due to cost (licenses and hardware) there were only three instances of each database in the entire organization: dev, test and prod.
- Due to hardware cost, "dev" couldn't even accommodate a production-sized set of data - to test against representative data you had to run against "test". Shared database integration was rampant, so one single "dev" instance was shared among all the application developers of the various active application development project teams. While the application developers typically had the necessary privileges to place testing instrumentation into the "dev" instance, they didn't bother, because there was no point: "dev" was everybody's experimental playground and people frequently didn't clean up after themselves, so packages could be invalid, constraints disabled, etc. - or your unit test could fail because of what somebody else was doing to the database at the time. So when you were truly testing, you ran against "test" - however ...
- "test" was locked down entirely and completely under the control of the "more operations-oriented" (and hence rarely available) DBAs. To thoroughly test your stored procedures you needed a DBA to move your code into "test" for you. And introducing test instrumentation code into the "test" instance was not an encouraged practice, as that wasn't "production code". Furthermore, any manipulation of the data in "test" outside of production applications was discouraged, in order to keep the data representative of production between the data reloads that happened maybe every 3 to 6 months. Basically "stored procedures" were synonymous with "lots of red tape".
All in all, under circumstances like that, most people would avoid stored procedures unless absolutely necessary. This may help explain why "not many people advocate using it anymore". Of course there are always other opinions. I suppose that if an application/service has full autonomy over its database there are fewer downsides to stored procedures - however, that only works if the product team is in full control of the development, testing and production instances (DevOps style). But even then I don't think you can, even today, adopt a "functional programming" style inside stored functions and procedures, so any logic that would benefit from that style would have to stay outside the database.
Future of Stored Procedures
Personally I view the NoSQL alternatives as exactly that - alternatives for situations where an RDBMS is overkill or an ill fit (e.g. object-relational impedance mismatch). That doesn't mean there is no room for improvement in the RDBMS concept. For my taste the current mainstream incarnations of RDBMS engines are still pretty monolithic affairs, ideally cramming everything onto one single piece of iron and requiring relatively complex and kludgy configurations when they do need to scale across servers.
I suspect that to scale (for huge data stores) the RDBMS concept needs to be deconstructed into its constituent parts like mass storage, query engine and application data cache which can then be distributed and scaled separately as is alluded to in "Turning the database inside out with Apache Samza". For the time being Datomic seems to be a step in that direction - note however that Datomic doesn't actually store any data but simply uses existing persistent storage solutions for its mass storage component.
Given that kind of an outlook I don't see stored procedures having much of a resurgence in the future either, save for engine/platform-specific optimizations. I guess it would be possible to (optionally) host code in the application data cache (collocated with the application), but who knows whether that would still be considered equivalent to "stored procedures".