New Lakeflow Designer offers drag-and-drop interface to generate production pipelines; Lakeflow now Generally Available

SAN FRANCISCO, June 11, 2025 -- Data + AI Summit — Databricks, the Data and AI company, today announced the upcoming Preview of Lakeflow Designer. This new no-code ETL capability lets non-technical users author production data pipelines using a visual drag-and-drop interface and a natural language GenAI assistant. Lakeflow Designer is backed by Lakeflow, the unified solution for data engineers to build reliable data pipelines faster with all business-critical data, which is now Generally Available.


Traditionally, enterprises have faced a significant tradeoff: either let analysts create pipelines with no-code/low-code tools, sacrificing governance, scalability and reliability, or rely on technical data engineering teams to code production-ready pipelines, even though those teams are overloaded and their backlogs are long. In the end, most enterprises adopt a combination of both approaches, resulting in complex environments to manage and maintain. What data-driven enterprises really want is the best of both worlds: no-code pipelines with governance, scalability and reliability.

"There's a lot of pressure for organizations to scale their AI efforts. Getting high-quality data to the right places accelerates the path to building intelligent applications," said Ali Ghodsi, Co-founder and CEO at Databricks. "Lakeflow Designer makes it possible for more people in an organization to create production pipelines so teams can move from idea to impact faster."

Lakeflow Designer: AI-Native Drag-and-Drop Data Prep for the Business Analyst

The new Lakeflow Designer empowers business analysts to build no-code ETL pipelines with natural language and a drag-and-drop UI that provides the same scalability, governance, and maintainability as those built by data engineers. Backed by Lakeflow, Unity Catalog, and Databricks Assistant, Lakeflow Designer eliminates the divide between code and no-code tools. With this new approach, non-technical users gain the speed and flexibility they require to solve business problems without burdening data engineers with maintenance issues and governance headaches.

Additional Lakeflow Capabilities Launching

  • Lakeflow Enters GA: Today, Lakeflow became generally available, providing a unified data engineering solution from ingestion to transformation and orchestration. Notably, the new Lakeflow Declarative Pipelines capabilities allow data engineers to build end-to-end production pipelines in SQL or Python without having to manage infrastructure.
  • New IDE for Data Engineering: Lakeflow is debuting a brand new development experience that speeds up data pipeline development with AI-assisted coding, debugging and validation in an integrated UI.
  • New Ingestion Connectors: New point-and-click ingestion connectors for Lakeflow Connect are launching for Google Analytics, ServiceNow, SQL Server, SharePoint, PostgreSQL, and SFTP, joining connectors for Salesforce Platform and Workday Reports that are already available.
  • Direct Write to Unity Catalog with Zerobus: Zerobus enables developers to write high volumes of event data with near real-time latency to their lakehouse without the need to manage extra infrastructure like a message bus. This streamlined, serverless infrastructure provides performance at scale for IoT events, clickstream data, telemetry and other event-driven use cases.

Customer Momentum

"The new editor brings everything into one place — code, pipeline graph, results, configuration, and troubleshooting. No more juggling browser tabs or losing context. Development feels more focused and efficient. I can directly see the impact of each code change. One click takes me to the exact error line, which makes debugging faster. Everything connects — code to data; code to tables; tables to the code. Switching between pipelines is easy, and features like auto-configured utility folders remove complexity. This feels like the way pipeline development should work." — Chris Sharratt, Data Engineer, Rolls-Royce

"Using the Salesforce connector from Lakeflow Connect helps us close a critical gap for Porsche from the business side on ease of use and price. On the customer side, we're able to create a completely new customer experience that strengthens the bond between Porsche and the customer with a unified and not fragmented customer journey," said Lucas Salzburger, Project Manager, Porsche Holding Salzburg.

"Joby is able to use our manufacturing agents with Lakeflow Connect Zerobus to push gigabytes a minute of telemetry data directly to our lakehouse, accelerating the time to insights — all with Databricks Lakeflow and the Data Intelligence Platform." — Dominik Müller, Factory Systems Lead, Joby Aviation

Availability

At Data + AI Summit, Databricks is launching Lakeflow into General Availability. The new IDE for data engineering is entering Public Preview, new ingestion connectors are launching across various release states, and Zerobus is entering Private Preview. Lakeflow Designer will enter Private Preview shortly after Data + AI Summit.

About Databricks

Databricks is the Data and AI company. More than 15,000 organizations worldwide — including Block, Comcast, Condé Nast, Rivian, Shell and over 60% of the Fortune 500 — rely on the Databricks Data Intelligence Platform to take control of their data and put it to work with AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark™, Delta Lake, MLflow, and Unity Catalog. To learn more, follow Databricks on X, LinkedIn and Facebook.

Contact: Press@databricks.com

