This page is a quick reference checkpoint for MIN OVER in Spark SQL: behavior, syntax rules, edge cases, and a minimal example, plus a link to the official vendor documentation.
MIN OVER returns the smallest value in the window frame.
When used with `OVER`, `MIN` returns one value per row within the specified window rather than collapsing rows as `GROUP BY` does.
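For a side-by-side view of that difference, here is a minimal sketch against the same `sales(category, amount)` table used in the example below: `GROUP BY` yields one row per category, while `MIN ... OVER` keeps every row.

```sql
-- GROUP BY collapses each category to a single summary row.
SELECT category, MIN(amount) AS category_min
FROM sales
GROUP BY category;

-- MIN ... OVER keeps every input row and repeats the per-category minimum on each.
SELECT category, amount,
       MIN(amount) OVER (PARTITION BY category) AS category_min
FROM sales;
```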
If this behavior feels unintuitive, the tutorial below explains the underlying pattern step-by-step.
`min(expr) OVER (window_spec)` is allowed because Spark explicitly states that aggregate functions may be used with `OVER`.
```sql
SELECT category, amount,
       MIN(amount) OVER (PARTITION BY category) AS category_min
FROM sales;
```
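The frame matters once you add `ORDER BY`. The sketch below assumes a hypothetical `sale_date` column on the same `sales` table; with an `ORDER BY` and no explicit frame, the default frame runs from the start of the partition to the current row, so `MIN` becomes a running minimum per category.

```sql
-- Running minimum: with ORDER BY and no explicit frame, the default frame is
-- RANGE BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW.
-- sale_date is a hypothetical column, shown for illustration only.
SELECT category, sale_date, amount,
       MIN(amount) OVER (PARTITION BY category ORDER BY sale_date) AS running_min
FROM sales;
```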
If you came here to confirm syntax, you’re done. If you came here to get better at window functions, choose your next step.
MIN OVER is part of a bigger window-function pattern. If you want the “why”, start here: Aggregate Window Functions
Reading docs is useful. Writing the query correctly under pressure is the skill.
For the authoritative spec, use the vendor docs. This page is the fast “sanity check”.
View Spark SQL Documentation →

Looking for more functions across all SQL dialects? Visit the full SQL Dialects & Window Functions Documentation.