Merged
1 change: 1 addition & 0 deletions .gitignore
@@ -16,3 +16,4 @@ out/
*.pem
*.pub
/created_scripts/
/local/
11 changes: 11 additions & 0 deletions CHANGELOG.md
@@ -1,6 +1,17 @@
# DLSync Changelog

This project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [3.1.0] - 2026-02-10
### Added
- Added support for AGENTS object type
- Added support for SEMANTIC_VIEWS object type
- Added support for CORTEX_SEARCH_SERVICES object type
- Added support for NOTEBOOKS object type
### Fixed
- Fixed rollback for undeployed declarative scripts
- Moved SESSION_POLICIES, PASSWORD_POLICIES and AUTHENTICATION_POLICIES from account-level to schema-level objects
- Fixed dependency override failure for unknown scripts not in the current changed scripts

## [3.0.1] - 2026-02-05
### Fixed
- Fixed session context issue when managing account-level objects (databases, schemas). DLSync metadata tables now use fully qualified names to prevent "table does not exist" errors after Snowflake automatically switches session context.
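The 3.0.1 session-context fix above boils down to always addressing metadata tables by fully qualified name. A minimal sketch (illustrative names only, not DLSync's actual classes):

```java
// Hypothetical helper: qualify metadata table names so queries keep working
// after Snowflake switches the session's current database/schema.
public class QualifiedName {
    public static String qualify(String database, String schema, String table) {
        return database + "." + schema + "." + table;
    }

    public static void main(String[] args) {
        // An unqualified "SCRIPT_HISTORY" breaks once the session context
        // changes; the qualified form resolves regardless of context.
        System.out.println(qualify("DLSYNC_DB", "DLSYNC_META", "SCRIPT_HISTORY"));
    }
}
```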
11 changes: 5 additions & 6 deletions README.md
@@ -113,7 +113,7 @@ Where
- Other account-level objects (users, resource monitors, integrations, etc.)
- **database_name_*:** is the database name of your project,
- **schema_name_*:** are schemas inside the database,
- **object_type:** is type of the object only 1 of the following (VIEWS, FUNCTIONS, PROCEDURES, FILE_FORMATS, TABLES, SEQUENCES, STAGES, STREAMS, TASKS, STREAMLITS, PIPES, ALERTS, DYNAMIC_TABLES, MASKING_POLICIES),
- **object_type:** is type of the object only 1 of the following (VIEWS, FUNCTIONS, PROCEDURES, FILE_FORMATS, TABLES, SEQUENCES, STAGES, STREAMS, TASKS, STREAMLITS, PIPES, ALERTS, DYNAMIC_TABLES, MASKING_POLICIES, NOTEBOOKS, CORTEX_SEARCH_SERVICES, SEMANTIC_VIEWS, AGENTS),
- **object_name_*.sql:** are individual database object scripts.
- **config.yml:** is a configuration file used to configure DLSync behavior.
- **parameter-[profile-*].properties:** is parameter to value map file. This is going to be used by corresponding individual instances of your database.
@@ -131,7 +131,7 @@ For example, if you have a view named `SAMPLE_VIEW` in schema `MY_SCHEMA` in dat

The structure and content of the scripts differ based on the type of script. This tool categorizes scripts into 2 types: Declarative scripts and Migration scripts.
#### 1. Declarative Script
This type of script is used for object types of VIEWS, FUNCTIONS, PROCEDURES, FILE_FORMATS, PIPES, MASKING_POLICIES, STREAMLITS, RESOURCE_MONITORS, NETWORK_POLICIES, SESSION_POLICIES, PASSWORD_POLICIES, AUTHENTICATION_POLICIES, API_INTEGRATIONS, NOTIFICATION_INTEGRATIONS, SECURITY_INTEGRATIONS, STORAGE_INTEGRATIONS, and WAREHOUSES.
This type of script is used for object types of VIEWS, FUNCTIONS, PROCEDURES, FILE_FORMATS, PIPES, MASKING_POLICIES, NOTEBOOKS, CORTEX_SEARCH_SERVICES, SEMANTIC_VIEWS, AGENTS, STREAMLITS, RESOURCE_MONITORS, NETWORK_POLICIES, SESSION_POLICIES, PASSWORD_POLICIES, AUTHENTICATION_POLICIES, API_INTEGRATIONS, NOTIFICATION_INTEGRATIONS, SECURITY_INTEGRATIONS, STORAGE_INTEGRATIONS, and WAREHOUSES.
In this type of script, you define the current state (desired state) of the object.
When a change is made to the script, DLSync replaces the current object with the updated definition.
These types of scripts must always have a `create or replace` statement. Every time you make a change to the script, DLSync will replace the object with the new definition.
@@ -450,7 +450,7 @@ The role must have privileges to create and manage DLSync tracking tables in the
> **Note:** the database and schema specified in the connection will be used to store the DLSync tracking tables.

### Database-Level Objects
For managing database-level objects (VIEWS, FUNCTIONS, PROCEDURES, TABLES, SEQUENCES, STAGES, STREAMS, TASKS, ALERTS, DYNAMIC_TABLES, FILE_FORMATS, PIPES, MASKING_POLICIES), the role must have:
For managing database-level objects (VIEWS, FUNCTIONS, PROCEDURES, TABLES, SEQUENCES, STAGES, STREAMS, TASKS, ALERTS, DYNAMIC_TABLES, FILE_FORMATS, PIPES, MASKING_POLICIES, SESSION_POLICIES, PASSWORD_POLICIES, AUTHENTICATION_POLICIES, NOTEBOOKS), the role must have:
- **USAGE** on the target schema
- **CREATE** privileges for the specific object types being deployed (CREATE VIEW, CREATE FUNCTION, CREATE TABLE, etc.)
- **ALTER** privileges on existing objects in the schema
@@ -465,9 +465,8 @@ For managing account-level objects, the role must have:
- **CREATE INTEGRATION** (for SECURITY_INTEGRATIONS, STORAGE_INTEGRATIONS, NOTIFICATION_INTEGRATIONS, API_INTEGRATIONS)
- **CREATE RESOURCE MONITOR** (for RESOURCE_MONITORS)
- **CREATE NETWORK POLICY** (for NETWORK_POLICIES)
- **CREATE SESSION POLICY** (for SESSION_POLICIES)
- **CREATE PASSWORD POLICY** (for PASSWORD_POLICIES)
- **CREATE AUTHENTICATION POLICY** (for AUTHENTICATION_POLICIES)

> **Note:** SESSION_POLICIES, PASSWORD_POLICIES, and AUTHENTICATION_POLICIES are schema-level objects (created within a schema) but can be applied to accounts or users via `ALTER ACCOUNT SET` or `ALTER USER SET`.

## DLSync Metadata Tables
DLSync stores script metadata, deployment history, and logs in the database.
4 changes: 2 additions & 2 deletions build.gradle
@@ -17,8 +17,8 @@ dependencies {

implementation 'org.apache.commons:commons-text:1.14.0'
implementation 'net.snowflake:snowflake-jdbc:3.25.1'
implementation 'ch.qos.logback:logback-core:1.5.18'
implementation 'ch.qos.logback:logback-classic:1.5.18'
implementation 'ch.qos.logback:logback-core:1.5.25'
implementation 'ch.qos.logback:logback-classic:1.5.25'
implementation 'org.slf4j:slf4j-api:2.0.4'
implementation 'com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:2.18.2'
implementation 'commons-cli:commons-cli:1.9.0'
@@ -0,0 +1,40 @@
CREATE OR REPLACE AGENT ${EXAMPLE_DB}.${MAIN_SCHEMA}.SALES_ASSISTANT
COMMENT = 'Sales assistant agent for product and order queries'
PROFILE = '{"display_name": "Sales Assistant", "avatar": "sales-icon.png", "color": "blue"}'
FROM SPECIFICATION
$$
models:
orchestration: claude-4-sonnet

orchestration:
budget:
seconds: 30
tokens: 16000

instructions:
response: "You will respond in a friendly but professional manner about sales and products"
orchestration: "For product questions use Search; for sales analytics use Analyst"
system: "You are a helpful sales assistant that helps with product and order inquiries"
sample_questions:
- question: "What products do we have in stock?"
answer: "I'll search our product catalog to find available items."
- question: "How many orders were placed today?"
answer: "I'll analyze our sales data to find the order count."

tools:
- tool_spec:
type: "cortex_analyst_text_to_sql"
name: "SalesAnalyst"
description: "Analyzes sales data and generates reports"
- tool_spec:
type: "cortex_search"
name: "ProductSearch"
description: "Searches product catalog"

tool_resources:
SalesAnalyst:
semantic_view: "${EXAMPLE_DB}.${MAIN_SCHEMA}.SALES_ANALYTICS"
ProductSearch:
name: "${EXAMPLE_DB}.${MAIN_SCHEMA}.PRODUCT_SEARCH"
max_results: "10"
$$;
@@ -1,4 +1,4 @@
CREATE OR REPLACE AUTHENTICATION POLICY mfa_policy
CREATE OR REPLACE AUTHENTICATION POLICY ${EXAMPLE_DB}.${MAIN_SCHEMA}.MFA_POLICY
AUTHENTICATION_METHODS = ('PASSWORD', 'KEYPAIR')
MFA_ENROLLMENT = 'REQUIRED'
CLIENT_TYPES = ('ALL')
@@ -0,0 +1,12 @@
CREATE OR REPLACE CORTEX SEARCH SERVICE ${EXAMPLE_DB}.${MAIN_SCHEMA}.PRODUCT_SEARCH
ON PRODUCT_NAME
ATTRIBUTES PRICE, STOCK
WAREHOUSE = ${MY_WAREHOUSE}
TARGET_LAG = '1 hour'
AS (
SELECT
PRODUCT_NAME,
PRICE,
STOCK
FROM ${EXAMPLE_DB}.${MAIN_SCHEMA}.PRODUCTS
);
@@ -0,0 +1,5 @@
CREATE OR REPLACE NOTEBOOK ${EXAMPLE_DB}.${MAIN_SCHEMA}.SALES_ANALYSIS_NOTEBOOK
MAIN_FILE='sales_analysis.ipynb'
QUERY_WAREHOUSE='${MY_WAREHOUSE}'
COMMENT = 'Example notebook deployed via DLSync'
;
@@ -1,4 +1,4 @@
CREATE OR REPLACE PASSWORD POLICY password_strength_policy
CREATE OR REPLACE PASSWORD POLICY ${EXAMPLE_DB}.${MAIN_SCHEMA}.PASSWORD_STRENGTH_POLICY
PASSWORD_MIN_LENGTH = 12
PASSWORD_MAX_LENGTH = 256
PASSWORD_MIN_UPPER_CASE_CHARS = 1
@@ -0,0 +1,29 @@
CREATE OR REPLACE SEMANTIC VIEW ${EXAMPLE_DB}.${MAIN_SCHEMA}.SALES_ANALYTICS
TABLES (
products AS ${EXAMPLE_DB}.${MAIN_SCHEMA}.PRODUCTS
PRIMARY KEY (ID)
COMMENT = 'Product catalog table',
orders AS ${EXAMPLE_DB}.${MAIN_SCHEMA}.ORDERS
PRIMARY KEY (ID)
COMMENT = 'Customer orders table'
)
RELATIONSHIPS (
orders (PRODUCT_ID) REFERENCES products
)
DIMENSIONS (
products.product_name AS products.PRODUCT_NAME
WITH SYNONYMS = ('product', 'item')
COMMENT = 'Name of the product',
orders.order_date AS orders.ORDER_DATE
WITH SYNONYMS = ('date', 'purchase date')
COMMENT = 'Date when order was placed'
)
METRICS (
orders.total_quantity AS SUM(orders.QUANTITY)
WITH SYNONYMS = ('quantity', 'units sold')
COMMENT = 'Total quantity ordered',
orders.order_count AS COUNT(orders.ID)
WITH SYNONYMS = ('number of orders', 'count')
COMMENT = 'Total number of orders'
)
COMMENT = 'Semantic view for sales analytics';
@@ -1,4 +1,4 @@
CREATE OR REPLACE SESSION POLICY session_timeout_policy
CREATE OR REPLACE SESSION POLICY ${EXAMPLE_DB}.${MAIN_SCHEMA}.SESSION_TIMEOUT_POLICY
SESSION_IDLE_TIMEOUT_MINS = 60
SESSION_UI_IDLE_TIMEOUT_MINS = 15
COMMENT = 'Session timeout policy for security';
2 changes: 1 addition & 1 deletion gradle.properties
@@ -1 +1 @@
releaseVersion=3.0.1
releaseVersion=3.1.0
4 changes: 2 additions & 2 deletions pom.xml
@@ -27,13 +27,13 @@
<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-core</artifactId>
<version>1.5.18</version>
<version>1.5.25</version>
</dependency>

<dependency>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
<version>1.5.18</version>
<version>1.5.25</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
1 change: 1 addition & 0 deletions src/main/java/com/snowflake/dlsync/ChangeManager.java
@@ -98,6 +98,7 @@ public void rollback() throws SQLException, IOException {
.stream()
.filter(script -> !config.isScriptExcluded(script))
.filter(script -> !script.getObjectType().isMigration())
.filter(script -> scriptRepo.isScriptPreviouslyDeployed(script))
.filter(script -> scriptRepo.isScriptChanged(script))
.collect(Collectors.toList());
dependencyGraph.addNodes(changedScripts);
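The rollback fix adds one predicate to the selection pipeline: a script is only a rollback candidate if it was previously deployed *and* has since changed. A simplified, self-contained sketch (the `Script` record and hash map stand in for DLSync's actual classes):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class RollbackFilter {
    record Script(String id, String hash) {}

    // Keep only scripts that were previously deployed AND whose content
    // hash differs from the deployed one. Undeployed scripts are skipped,
    // which is the bug this change fixes.
    public static List<Script> changedDeployedScripts(List<Script> scripts,
                                                      Map<String, String> deployedHashes) {
        return scripts.stream()
                .filter(s -> deployedHashes.containsKey(s.id()))           // previously deployed
                .filter(s -> !deployedHashes.get(s.id()).equals(s.hash())) // changed since deploy
                .collect(Collectors.toList());
    }
}
```

With `{a→h1, b→old}` deployed, a script `a` with hash `h1` is unchanged, `b` with `h2` is a rollback candidate, and a never-deployed `c` is ignored instead of failing.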
@@ -108,12 +108,14 @@ public List<Script> getDependencyOverride(Script script, List<? extends Script>
.filter(dependencyOverride -> dependencyOverride.getScript().equals(script.getFullObjectName()))
.flatMap(dependencyOverride -> dependencyOverride.getDependencies().stream())
.map(dependencyName -> findScriptByName(nodes, dependencyName))
.filter(Optional::isPresent)
.map(Optional::get)
.collect(Collectors.toList());

return scriptsDependencyOverrides;
}

private Script findScriptByName(List<? extends Script> allScripts, String fullObjectName) {
return allScripts.parallelStream().filter(script -> script.getFullObjectName().equals(fullObjectName)).findFirst().get();
private Optional<? extends Script> findScriptByName(List<? extends Script> allScripts, String fullObjectName) {
return allScripts.parallelStream().filter(script -> script.getFullObjectName().equals(fullObjectName)).findFirst();
}
}
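The `Optional` change above replaces a bare `findFirst().get()` — which throws `NoSuchElementException` when a dependency override names a script outside the current changed set — with a lookup that simply drops unknown names. A minimal sketch with stand-in types:

```java
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;

public class DependencyLookup {
    record Script(String fullObjectName) {}

    // Returns empty instead of throwing when no script matches the name.
    static Optional<Script> findScriptByName(List<Script> all, String name) {
        return all.stream()
                .filter(s -> s.fullObjectName().equals(name))
                .findFirst();
    }

    // Resolve override names to scripts, silently skipping unknown ones.
    static List<Script> resolve(List<Script> all, List<String> dependencyNames) {
        return dependencyNames.stream()
                .map(n -> findScriptByName(all, n))
                .filter(Optional::isPresent)
                .map(Optional::get)
                .collect(Collectors.toList());
    }
}
```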
4 changes: 4 additions & 0 deletions src/main/java/com/snowflake/dlsync/doa/ScriptRepo.java
@@ -185,6 +185,10 @@ private boolean insertScriptEvent(Script script, String status, String logs) thr
return statement.executeUpdate() > 0;
}

public boolean isScriptPreviouslyDeployed(Script script) {
return scriptHash.containsKey(script.getId());
}

public boolean isScriptChanged(Script script) {
// return true;
return !scriptHash.getOrDefault(script.getId(), "null").equals(script.getHash());
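The two repo checks are simple map operations over the id-to-deployed-hash map that DLSync loads from its metadata table. A self-contained sketch (simplified signatures, not the actual `ScriptRepo` API):

```java
import java.util.Map;

public class DeploymentState {
    private final Map<String, String> scriptHash; // script id -> last deployed hash

    DeploymentState(Map<String, String> scriptHash) {
        this.scriptHash = scriptHash;
    }

    boolean isScriptPreviouslyDeployed(String scriptId) {
        return scriptHash.containsKey(scriptId);
    }

    boolean isScriptChanged(String scriptId, String currentHash) {
        // getOrDefault makes never-deployed scripts compare as "changed",
        // which is why rollback needs the separate deployed check above.
        return !scriptHash.getOrDefault(scriptId, "null").equals(currentHash);
    }
}
```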
@@ -8,17 +8,21 @@ public enum ScriptObjectType {
STREAMLITS("STREAMLIT", ObjectLevel.SCHEMA, false),
PIPES("PIPE", ObjectLevel.SCHEMA, false),
MASKING_POLICIES("MASKING POLICY", ObjectLevel.SCHEMA, false),
CORTEX_SEARCH_SERVICES("CORTEX SEARCH SERVICE", ObjectLevel.SCHEMA, false),
SEMANTIC_VIEWS("SEMANTIC VIEW", ObjectLevel.SCHEMA, false),
AGENTS("AGENT", ObjectLevel.SCHEMA, false),
NETWORK_RULES("NETWORK RULE", ObjectLevel.SCHEMA, false),
RESOURCE_MONITORS("RESOURCE MONITOR", ObjectLevel.ACCOUNT, false),
NETWORK_POLICIES("NETWORK POLICY", ObjectLevel.ACCOUNT, false),
SESSION_POLICIES("SESSION POLICY", ObjectLevel.ACCOUNT, false),
PASSWORD_POLICIES("PASSWORD POLICY", ObjectLevel.ACCOUNT, false),
AUTHENTICATION_POLICIES("AUTHENTICATION POLICY", ObjectLevel.ACCOUNT, false),
SESSION_POLICIES("SESSION POLICY", ObjectLevel.SCHEMA, false),
PASSWORD_POLICIES("PASSWORD POLICY", ObjectLevel.SCHEMA, false),
AUTHENTICATION_POLICIES("AUTHENTICATION POLICY", ObjectLevel.SCHEMA, false),
API_INTEGRATIONS("API INTEGRATION", ObjectLevel.ACCOUNT, false),
NOTIFICATION_INTEGRATIONS("NOTIFICATION INTEGRATION", ObjectLevel.ACCOUNT, false),
SECURITY_INTEGRATIONS("SECURITY INTEGRATION", ObjectLevel.ACCOUNT, false),
STORAGE_INTEGRATIONS("STORAGE INTEGRATION", ObjectLevel.ACCOUNT, false),
WAREHOUSES("WAREHOUSE", ObjectLevel.ACCOUNT, false),
NOTEBOOKS("NOTEBOOK", ObjectLevel.SCHEMA, false),

// Migration-enabled objects
TABLES("TABLE", ObjectLevel.SCHEMA, true),
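The enum pattern above attaches three attributes to each object type: its SQL keyword, the level it lives at, and whether it is migration-based. A compilable sketch with only a few representative constants (not the full DLSync enum):

```java
public class ObjectTypes {
    enum ObjectLevel { SCHEMA, ACCOUNT }

    enum ScriptObjectType {
        // New in this release: schema-level, declarative
        NOTEBOOKS("NOTEBOOK", ObjectLevel.SCHEMA, false),
        // Moved from ACCOUNT to SCHEMA by this PR
        SESSION_POLICIES("SESSION POLICY", ObjectLevel.SCHEMA, false),
        WAREHOUSES("WAREHOUSE", ObjectLevel.ACCOUNT, false),
        // Migration-enabled objects evolve via migration scripts
        TABLES("TABLE", ObjectLevel.SCHEMA, true);

        final String keyword;
        final ObjectLevel level;
        final boolean migration;

        ScriptObjectType(String keyword, ObjectLevel level, boolean migration) {
            this.keyword = keyword;
            this.level = level;
            this.migration = migration;
        }

        boolean isMigration() { return migration; }
    }
}
```

Rollback and deployment code can then branch on `isMigration()` and `level` without string comparisons.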
112 changes: 112 additions & 0 deletions src/test/java/com/snowflake/dlsync/parser/SqlTokenizerTest.java
@@ -926,6 +926,118 @@ void parseAccountScriptTypeRoleMultipleMigrations() {
assertEquals("user2", thirdMigration.getAuthor(), "Third author should be user2");
}

// ===== Cortex Search Service Tests =====

@Test
void parseSchemaScriptTypeCortexSearchService() {
String filePath = "db_scripts/db1/schema1/CORTEX_SEARCH_SERVICES/PRODUCT_SEARCH.SQL";
String name = "PRODUCT_SEARCH.SQL";
String scriptType = "CORTEX_SEARCH_SERVICES";
String content = "CREATE OR REPLACE CORTEX SEARCH SERVICE db1.schema1.PRODUCT_SEARCH\n" +
" ON product_description\n" +
" ATTRIBUTES product_name, category\n" +
" WAREHOUSE = MY_WH\n" +
" TARGET_LAG = '1 hour'\n" +
"AS (SELECT * FROM db1.schema1.PRODUCTS);";

SchemaScript script = SqlTokenizer.parseSchemaScript(filePath, name, scriptType, content);

assertNotNull(script, "Script should not be null");
assertEquals("PRODUCT_SEARCH", script.getObjectName(), "Object name should be PRODUCT_SEARCH");
assertEquals("db1".toUpperCase(), script.getDatabaseName(), "Database name should be db1");
assertEquals("schema1".toUpperCase(), script.getSchemaName(), "Schema name should be schema1");
assertEquals(ScriptObjectType.CORTEX_SEARCH_SERVICES, script.getObjectType(), "Object type should be CORTEX_SEARCH_SERVICES");
assertEquals(content, script.getContent(), "Script content should match the input content");
}


// ===== Semantic View Tests =====

@Test
void parseSchemaScriptTypeSemanticView() {
String filePath = "db_scripts/db1/schema1/SEMANTIC_VIEWS/SALES_ANALYTICS.SQL";
String name = "SALES_ANALYTICS.SQL";
String scriptType = "SEMANTIC_VIEWS";
String content = "CREATE OR REPLACE SEMANTIC VIEW db1.schema1.SALES_ANALYTICS\n" +
" TABLES (\n" +
" products AS db1.schema1.PRODUCTS PRIMARY KEY (PRODUCT_ID)\n" +
" )\n" +
" DIMENSIONS (\n" +
" products.product_name AS products.NAME\n" +
" )\n" +
" METRICS (\n" +
" products.total_revenue AS SUM(products.PRICE)\n" +
" );";

SchemaScript script = SqlTokenizer.parseSchemaScript(filePath, name, scriptType, content);

assertNotNull(script, "Script should not be null");
assertEquals("SALES_ANALYTICS", script.getObjectName(), "Object name should be SALES_ANALYTICS");
assertEquals("db1".toUpperCase(), script.getDatabaseName(), "Database name should be db1");
assertEquals("schema1".toUpperCase(), script.getSchemaName(), "Schema name should be schema1");
assertEquals(ScriptObjectType.SEMANTIC_VIEWS, script.getObjectType(), "Object type should be SEMANTIC_VIEWS");
assertEquals(content, script.getContent(), "Script content should match the input content");
}


// ===== Agent Tests =====

@Test
void parseSchemaScriptTypeAgent() {
String filePath = "db_scripts/db1/schema1/AGENTS/SALES_ASSISTANT.SQL";
String name = "SALES_ASSISTANT.SQL";
String scriptType = "AGENTS";
String content = "CREATE OR REPLACE AGENT db1.schema1.SALES_ASSISTANT\n" +
" COMMENT = 'Sales assistant agent'\n" +
" FROM SPECIFICATION\n" +
" $$\n" +
" tools:\n" +
" - tool_spec:\n" +
" type: \"cortex_search\"\n" +
" name: \"Search\"\n" +
" $$;";

SchemaScript script = SqlTokenizer.parseSchemaScript(filePath, name, scriptType, content);

assertNotNull(script, "Script should not be null");
assertEquals("SALES_ASSISTANT", script.getObjectName(), "Object name should be SALES_ASSISTANT");
assertEquals("db1".toUpperCase(), script.getDatabaseName(), "Database name should be db1");
assertEquals("schema1".toUpperCase(), script.getSchemaName(), "Schema name should be schema1");
assertEquals(ScriptObjectType.AGENTS, script.getObjectType(), "Object type should be AGENTS");
assertEquals(content, script.getContent(), "Script content should match the input content");
}

@Test
void parseSchemaScriptTypeAgentWithProfile() {
String filePath = "db_scripts/db1/schema1/AGENTS/SUPPORT_AGENT.SQL";
String name = "SUPPORT_AGENT.SQL";
String scriptType = "AGENTS";
String content = "CREATE OR REPLACE AGENT db1.schema1.SUPPORT_AGENT\n" +
" COMMENT = 'Customer support agent'\n" +
" PROFILE = '{\"display_name\": \"Support Bot\", \"avatar\": \"support.png\", \"color\": \"green\"}'\n" +
" FROM SPECIFICATION\n" +
" $$\n" +
" models:\n" +
" orchestration: claude-4-sonnet\n" +
" orchestration:\n" +
" budget:\n" +
" seconds: 30\n" +
" tokens: 16000\n" +
" tools:\n" +
" - tool_spec:\n" +
" type: \"cortex_analyst_text_to_sql\"\n" +
" name: \"Analyst\"\n" +
" $$;";

SchemaScript script = SqlTokenizer.parseSchemaScript(filePath, name, scriptType, content);

assertNotNull(script, "Script should not be null");
assertEquals("SUPPORT_AGENT", script.getObjectName(), "Object name should be SUPPORT_AGENT");
assertEquals("db1".toUpperCase(), script.getDatabaseName(), "Database name should be db1");
assertEquals("schema1".toUpperCase(), script.getSchemaName(), "Schema name should be schema1");
assertEquals(ScriptObjectType.AGENTS, script.getObjectType(), "Object type should be AGENTS");
assertEquals(content, script.getContent(), "Script content should match the input content");
}


}