The Code and Commitment That Carries Drupal: A Conversation with David Bekker
David Bekker, known in the Drupal world as “daffie,” is a developer whose contributions run deep and wide through the platform’s core. With over 700 commit credits, he has helped shape critical systems like the Database API and the database drivers that control how Drupal handles data. He works at Finalist, which supports him in dedicating part of his paid time to open source, and he uses that time to make Drupal faster, more flexible, and more scalable.
In this interview with Alka Elizabeth of The DropTimes, David shares the thinking that drives his work on performance, scalability, and core stability. He reflects on his journey into Drupal, the thrill of solving hard problems, and the art of making changes that ripple across thousands of sites. Whether it’s experimenting with NoSQL or reimagining how data flows through Drupal, David is building for what comes next, one thoughtful commit at a time.
TDT [1]: You’ve been part of the Drupal ecosystem for over a decade. What initially drew you to Drupal, and what has kept you so deeply involved, particularly in core development?
David Bekker: After being ill for a long time, I started looking for a way to get back to work. I explored several options. At first, I looked into mainframes, but the only way to gain experience was through expensive courses, and I couldn't afford those at the time. Drupal became my second choice. Now, I've been actively working with Drupal for 10 years, and I really enjoy it.
TDT [2]: With over 700 Drupal core commits to your name, you're among the top contributors globally. What does that level of sustained contribution mean to you, and how has it shaped your perspective as a developer?
David Bekker: What I enjoy most about working on Drupal core is collaborating with people who are incredibly smart. You can learn so much from them. The trade-off is that thoughtful and reliable improvements take time to land. What keeps me going in the long run is having a bigger goal, something meaningful I can fully commit to from time to time. My personal drive comes from wanting to make a major improvement in Drupal: making it fast for sites with logged-in users.
TDT [3]: You maintain the Database API and several database drivers, which are essential infrastructure for how Drupal handles data. What are the biggest challenges in keeping that layer stable yet adaptable to new demands like NoSQL integration?
David Bekker: The biggest challenge with any change to a low-level system is that everything running on top of it must continue to work. Backwards compatibility is extremely important. Additionally, all changes need to be covered by automated tests. We can’t introduce a change that breaks 100,000 Drupal sites. The main challenge in adapting the Database API to work with a NoSQL database lies in removing code that directly uses SQL strings, or in providing alternatives within the API. For example, when selecting data from the database and filtering for records where the column “id” equals 4, you typically have two options: `->where('id = 4')` or `->condition('id', 4, '=')`. Both work with MySQL, MariaDB, and PostgreSQL. However, a NoSQL database like MongoDB cannot support `->where()`, because that method passes a raw SQL snippet through to the database.
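To make the difference concrete, here is a minimal sketch against Drupal's Database API; the `example` table and its columns are hypothetical, used only to show the two styles side by side.

```php
<?php

// Hypothetical 'example' table, used only to contrast the two styles.
$connection = \Drupal::database();

// Portable: condition() describes the comparison structurally, so a
// driver can translate it into SQL or into MongoDB's query language.
$portable = $connection->select('example', 'e')
  ->fields('e', ['id', 'label'])
  ->condition('id', 4, '=')
  ->execute()
  ->fetchAll();

// Not portable: where() accepts a raw SQL snippet that is passed to the
// database as-is, which only SQL backends can understand.
$sql_only = $connection->select('example', 'e')
  ->fields('e', ['id', 'label'])
  ->where('e.id = :id', [':id' => 4])
  ->execute()
  ->fetchAll();
```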
TDT [4]: Your work with MongoDB, especially around performance for authenticated users, caught a lot of attention. What led you to explore that integration, and what were the key takeaways from the experience?
David Bekker: I've been fascinated for a long time by how applications handle large numbers of logged-in users, say, over a million. How do those applications work, and what would we need to change to make that possible with Drupal as well?
The key difference between applications that serve many logged-in users and a standard Drupal site lies in how they interact with the database. A typical Drupal site is built for anonymous users. Everyone sees the same content, and we put a caching server like Redis in front of the Drupal site to make everything fast. In that setup, how the data is stored in the database doesn’t really matter, because the caching server handles performance.
The challenge with a site that serves many logged-in users is that this caching trick barely works, if at all. Every page view has to go through the entire Drupal stack. The biggest performance bottleneck is fetching data from the database.
To keep the application fast, it's crucial to store data in the database in a way that allows you to retrieve it using the simplest possible queries. Simple queries are fast; complex ones can be painfully slow.
That means the way the application consumes data should dictate how that data is stored in the database: in practice, performance demands force us to structure the storage around the application's read patterns. I'm actively exploring solutions to this challenge, and that's what led me to consider a NoSQL approach.
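As an illustration of that idea, here is a minimal sketch using the official mongodb/mongodb PHP library; the `user_dashboard` collection and its contents are hypothetical, standing in for data that would otherwise require several joins.

```php
<?php

require 'vendor/autoload.php';

// Hypothetical setup: everything a dashboard page needs for one user is
// stored together as a single document, keyed by the user ID.
$client = new MongoDB\Client('mongodb://localhost:27017');
$collection = $client->selectCollection('drupal', 'user_dashboard');

// One primary-key lookup replaces what would otherwise be several joins
// across user, profile, and content tables on a relational backend.
$dashboard = $collection->findOne(['_id' => 42]);
```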
TDT [5]: When you're working at the core level, changes ripple across the entire ecosystem. How do you approach making decisions that balance innovation, stability, and long-term maintainability?
David Bekker: For every improvement, you need to weigh the benefits it brings against the potential downsides. A few years ago, we had the opportunity to store data in a slightly more efficient way. It would have saved disk space and used a bit less memory on the database server. Which, of course, is a very good thing. However, it would have required every Drupal site to go through a major and complex upgrade. In the end, I decided not to make that change.
TDT [6]: How does your role at Finalist support or influence your open-source work?
David Bekker: At Finalist, I have the opportunity to spend 20% of my paid time contributing to open source, which is great! Finalist genuinely values supporting the Drupal project. What makes this even more meaningful is that we share a clear belief: storing entity data in JSON objects marks an important shift. It’s a direction we’re consciously exploring, because we see real potential to improve performance and flexibility for Drupal sites.
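To give a rough picture of what that shift could look like (this is a sketch of the idea, not Drupal's actual storage format), imagine every field of a node stored together in one JSON object:

```php
<?php

// Hypothetical field names; the point is that the whole entity instance
// is a single document, so loading it is one read rather than a query
// per field table.
$entity = [
  'nid'    => 42,
  'type'   => 'article',
  'title'  => 'Storing entities as JSON',
  'status' => 1,
  'body'   => ['value' => '<p>...</p>', 'format' => 'basic_html'],
  'tags'   => [3, 17, 28],
];

$json = json_encode($entity, JSON_PRETTY_PRINT);
```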
TDT [7]: You’ve contributed to Drupal for nearly a decade. What’s changed the most in how the core team collaborates and operates, and where do you think the contributor experience could still improve?
David Bekker: When I started working on Drupal core, we were still using patches instead of pull requests. We’re still in the process of migrating to GitLab, and to me, that migration will make contributing to Drupal core better and easier for new people to get started.
TDT [8]: Not all impact is visible. What’s a piece of your work that most people outside the core team might not know about, but you're particularly proud of?
David Bekker: Database drivers have been moved into their own modules. That might not sound very exciting, but it’s a significant change. It enables contributed database drivers to exist, and even allows one database driver to depend on another. For example, we're currently working on a new database driver for MySQL/MariaDB that supports parallel database queries. The existing driver doesn’t offer that capability. We still need to write the code that actually makes use of those parallel queries, but once we do, it will help make Drupal sites faster.
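The new driver's API isn't shown here, but as a rough illustration of the underlying capability, plain mysqli (with mysqlnd) can already run two queries in parallel over separate connections; the credentials and queries below are placeholders.

```php
<?php

// Two connections, because one mysqli connection can only have a single
// asynchronous query in flight at a time.
$link1 = mysqli_connect('localhost', 'user', 'pass', 'drupal');
$link2 = mysqli_connect('localhost', 'user', 'pass', 'drupal');

// Dispatch both queries without waiting; they now run on the server
// at the same time.
mysqli_query($link1, 'SELECT COUNT(*) FROM node', MYSQLI_ASYNC);
mysqli_query($link2, 'SELECT COUNT(*) FROM users', MYSQLI_ASYNC);

$pending = [$link1, $link2];
while ($pending) {
  $read = $pending;
  $error = $reject = [];
  if (mysqli_poll($read, $error, $reject, 1) < 1) {
    continue;
  }
  foreach ($read as $link) {
    if ($result = mysqli_reap_async_query($link)) {
      var_dump($result->fetch_row());
      $result->free();
    }
    $pending = array_filter($pending, fn ($l) => $l !== $link);
  }
}
```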
TDT [9]: As Drupal explores more flexible, decoupled, and cloud-native architectures, where do you see the role of traditional CMS logic going? And how can technologies like MongoDB help get us there?
David Bekker: When I look at Drupal CMS, I believe its future lies in decoupled sites for logged-in users. To turn that vision into a truly powerful solution, there are several key improvements we need to make:
- Store entity data in JSON objects. All data for each entity instance should be stored as a single JSON object.
- Asynchronous PHP. Modern CPUs have multiple cores; let’s use them. The main benefit here is support for parallel database queries.
- Preload the Drupal Kernel into memory before the user request arrives. We can use tools like FrankenPHP in worker mode or Swoole to achieve this.
- Use materialized views. These are similar to regular database views, but their results are stored in the database like regular tables. This lets us structure the data in a way that fits the application’s needs; see the sketch after this list.
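For that last point, here is a minimal sketch using PostgreSQL syntax through Drupal's database connection; the view name and aggregation are hypothetical, chosen only to show the pattern of paying the query cost once and reading the result back cheaply.

```php
<?php

$connection = \Drupal::database();

// Pay the cost of the aggregation once, ahead of time. The result is
// stored like a regular table (PostgreSQL syntax; names are made up).
$connection->query("
  CREATE MATERIALIZED VIEW user_article_counts AS
  SELECT uid, COUNT(*) AS articles
  FROM node_field_data
  WHERE type = 'article'
  GROUP BY uid
");

// At request time, the expensive work is already done: one trivial read.
$count = $connection
  ->query('SELECT articles FROM user_article_counts WHERE uid = :uid', [':uid' => 42])
  ->fetchField();

// Refresh periodically (for example from cron) to pick up new content.
$connection->query('REFRESH MATERIALIZED VIEW user_article_counts');
```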
Once we implement the above, combined with the improvements introduced in Drupal CMS, I believe Drupal can become a fantastic backend or headless solution for iOS, Android, and JavaScript application developers.
Today, there are many headless CMS platforms available. But when teams need a truly complex headless backend, with fine-grained access control, multilingual content, and deep custom logic, they often end up building something from scratch. Drupal can drastically reduce that effort.
It offers a mature, extensible framework and a strong community of agencies and experts to support it. With these next steps, Drupal can become for mobile and JavaScript applications what MySQL was to the LAMP stack 20 years ago. Drupal can grow beyond anything it has been before.
TDT [10]: For someone interested in contributing to deeper systems like the Database API, what’s the mindset or skill set they should be cultivating, not just technically, but in how they approach problem-solving and collaboration?
David Bekker: For me, it was always about the question: what needs to change in Drupal to make it work well with a large number of logged-in users? That quickly leads you to the Database API.
The mindset they should be cultivating, in my view, is:
- Be patient: fixing things in Drupal Core takes time, much more than in a contrib module.
- Think about how to make Drupal better in the long run. Client work is usually focused on Drupal 10, sometimes 11. Core work is often aimed at the next major version of Drupal (12).
- Learn from other contributors. Most of them are incredibly smart, and there’s a lot to learn from them. If you want to grow into a senior developer or architect, working on Drupal Core will get you there.