  1. Databricks / Spark equivalent to lookup (done via CROSS APPLY in SQL)

    Jan 8, 2021 · My users have a small sparse configuration of "scaling factors" they wish to apply to a table of fixed size (50,000 rows). This is how it's currently configured and turned into a small dataframe: [ScalingFactor(rank, scale) for (rank, scale) in zip(rank_lbounds, scale_factors)]
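The "sparse scaling factors" lookup described in this question is essentially a range lookup: find the band a rank falls into and apply that band's scale. Assuming `rank_lbounds` holds the lower bound of each band (an assumption, since the post is truncated), the per-row logic can be sketched in plain Python with `bisect`:

```python
import bisect

# Hypothetical sparse configuration: each rank lower bound maps to a scale.
rank_lbounds = [0, 1000, 10000]   # assumed band boundaries
scale_factors = [1.0, 0.9, 0.75]  # assumed scale per band

def scale_for(rank: int) -> float:
    """Return the scale of the band containing `rank` (last lbound <= rank)."""
    i = bisect.bisect_right(rank_lbounds, rank) - 1
    return scale_factors[i]

print(scale_for(500))    # falls in the [0, 1000) band -> 1.0
print(scale_for(25000))  # falls in the [10000, ...) band -> 0.75
```

In Spark the same effect is usually achieved with a range join between the fact table and the small configuration dataframe, which the lateral-join discussions below address.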

  2. LATERAL Thinking: Reproducing OUTER and CROSS APPLY

    Mar 27, 2023 · For a refresher, OUTER APPLY and CROSS APPLY are T-SQL specific syntax extensions most commonly used to “join” a query’s results with the output of a table-valued function (TVF) whose...
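The distinction the article draws can be sketched in plain Python: a table-valued function is applied to each row; CROSS APPLY drops rows whose TVF output is empty, while OUTER APPLY keeps them, padded with NULLs. The function and data below are illustrative, not from the article:

```python
def split_tags(row):
    """A toy table-valued function: one output row per tag."""
    return [{"id": row["id"], "tag": t} for t in row["tags"].split(",") if t]

rows = [{"id": 1, "tags": "a,b"}, {"id": 2, "tags": ""}]

# CROSS APPLY: rows with empty TVF output disappear.
cross_apply = [out for r in rows for out in split_tags(r)]

# OUTER APPLY: empty TVF output is padded with NULLs instead.
outer_apply = [out for r in rows
               for out in (split_tags(r) or [{"id": r["id"], "tag": None}])]

print(cross_apply)  # id 2 is gone
print(outer_apply)  # id 2 survives with tag=None
```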

  3. JOIN | Databricks Documentation

    Jan 30, 2025 · Learn how to use the JOIN syntax of the SQL language in Databricks SQL and Databricks Runtime.

  4. Databricks : Equivalent code for SQL query - Stack Overflow

    Sep 18, 2019 · I'm looking for the equivalent Databricks code for the query. I added some sample code and the expected output as well. For the moment I'm stuck on the CROSS APPLY STRING_SPLIT part. Sample SQL data: CREATE TABLE FactTurnover (ID INT, SalesPriceExcl NUMERIC(9,4),
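In Spark SQL, the usual replacement for `CROSS APPLY STRING_SPLIT(col, ';')` is `explode(split(col, ';'))`, which multiplies each input row into one output row per delimited value. A plain-Python sketch of that row multiplication (the column names are illustrative):

```python
# Sketch of what Spark's explode(split(col, ";")) does per row:
rows = [(1, "10;20"), (2, "30")]  # hypothetical (ID, delimited values)

exploded = [(id_, part)
            for (id_, csv) in rows
            for part in csv.split(";")]

print(exploded)  # one output row per delimited value
```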

  5. databricks - Is there a way to use CROSS APPLY from SQL to Spark SQL

    Sep 8, 2023 · I have a complex stored procedure that uses multiple views/functions, and inside there are multiple cross applies; I am not sure if there is an "easy solution" to replicate it in spark.sql. df = spark.sql(f""" select * from table CROSS APPLY ( some business rule ) """)

  6. JOIN - Azure Databricks - Databricks SQL | Microsoft Learn

    Jan 30, 2025 · CROSS JOIN. Returns the Cartesian product of two relations. NATURAL. Specifies that the rows from the two relations will implicitly be matched on equality for all columns with matching names. join_criteria. Optionally specifies how the rows from one table reference are combined with the rows of another table reference.

  7. How to use CROSS JOIN in Databricks? - CastorDoc

    CROSS JOIN is a powerful tool in the world of data analysis, allowing you to combine data from multiple tables in unique and meaningful ways. In this article, we will explore the basics of CROSS JOIN and provide a step-by-step guide on how to use it in Databricks.

  8. Example Notebook - SQL Joins - Databricks - GitHub Pages

    Look at the data model with two tables below. There are two facts that make it a good fit to illustrate the different types of join operations: not every customer placed an order, and not every order has a customer (one has a null value).

  9. Solved: SQL Update Join - Databricks Community - 27437

    Dec 4, 2019 · I'm importing some data and stored procedures from SQL Server into Databricks. I noticed that updates with joins are not supported in Spark SQL; what's the alternative I can use? Here's what I'm trying to do: Another thing I was unable to do in Spark SQL are CROSS APPLY and OUTER APPLY, are there any alternatives for those 2? Thanks in advance. Mike
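The usual answer on Databricks is Delta Lake's MERGE INTO, whose WHEN MATCHED THEN UPDATE clause covers the UPDATE ... JOIN pattern. Its matching semantics can be sketched in plain Python (the tables and column are illustrative):

```python
# Sketch of: MERGE INTO target USING source ON target.id = source.id
#            WHEN MATCHED THEN UPDATE SET price = source.price
target = {1: {"price": 10}, 2: {"price": 20}}
source = {2: {"price": 25}, 3: {"price": 30}}

for key, src in source.items():
    if key in target:                 # WHEN MATCHED
        target[key]["price"] = src["price"]
    # (a WHEN NOT MATCHED THEN INSERT clause would add key 3 here)

print(target)  # only the matched row was updated
```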

  10. 8. Joins - Databricks

    Cross-joins, in the simplest terms, are inner joins that do not specify a predicate. A cross join will join every single row in the left DataFrame to every single row in the right DataFrame. This will cause an absolute explosion in the number of rows contained in the resulting DataFrame.
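The row explosion warned about here is easy to quantify: a cross join always yields len(left) × len(right) rows. A plain-Python sketch of the Cartesian product Spark's `crossJoin` computes:

```python
from itertools import product

left = [("a",), ("b",), ("c",)]
right = [(1,), (2,)]

# Every left row paired with every right row, as in DataFrame.crossJoin.
crossed = [l + r for l, r in product(left, right)]

print(len(crossed))  # 3 * 2 = 6 rows
```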
