Over the last couple of weeks I have been helping a client archive some old data from a retired application, as they wanted to decommission both the application and its database server. The archived data, whilst not likely to be queried regularly, still needs to be accessible should the situation arise, so they wanted to move it to a serverless SQL instance.
Understanding the Source Database
As this was a new database for me, I wanted to see what objects existed in the source, so I wrote the following query against sys.objects to list everything in the database. For the purpose of this example I am using a database called SQLWatch. This is the database that supports the SQL Watch monitoring application, an open-source tool that gives you great insight into SQL Server and its workload, which also makes it a nice candidate for the demo.
SELECT
s.name AS SchemaName,
o.name AS ObjectName,
o.type AS ObjectTypeCode,
o.type_desc AS ObjectType
FROM
sys.objects o
JOIN
sys.schemas s ON o.schema_id = s.schema_id
WHERE
o.is_ms_shipped = 0 -- Exclude system objects
ORDER BY
s.name,
o.type_desc,
o.name;
As you can see from the output, there are almost 600 objects in the database, from tables to primary keys. Now, if I was looking to script all of these, I could have used a database project in SQL Server Data Tools or a third-party tool like Redgate's SQL Compare.
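Incidentally, if you just want a feel for the shape of the database before wading through the full listing, grouping the same catalog view by type gives a quick breakdown. This is just a convenience query of my own; the archive process doesn't depend on it:
-- Quick breakdown of user objects by type
SELECT
o.type_desc AS ObjectType,
COUNT(*) AS ObjectCount
FROM sys.objects o
WHERE o.is_ms_shipped = 0
GROUP BY o.type_desc
ORDER BY ObjectCount DESC;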
Generating Table Scripts Without Third-Party Tools
I didn’t need all the objects; I just needed the tables, and only a subset of the tables at that. Rather than going through the Visual Studio install, I decided to see, with the help of ChatGPT, if I could generate a script to script out the tables. It came up with the following cursor-based solution:
DECLARE @TableName NVARCHAR(256), @SchemaName NVARCHAR(256)
DECLARE @SQL NVARCHAR(MAX) = ''
-- Cursor to loop through user tables
DECLARE table_cursor CURSOR FOR
SELECT s.name AS SchemaName, t.name AS TableName
FROM sys.tables t
JOIN sys.schemas s ON t.schema_id = s.schema_id
ORDER BY s.name, t.name
OPEN table_cursor
FETCH NEXT FROM table_cursor INTO @SchemaName, @TableName
WHILE @@FETCH_STATUS = 0
BEGIN
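-- Build the CREATE TABLE header for the current table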
SET @SQL = 'CREATE TABLE [' + @SchemaName + '].[' + @TableName + '] (' + CHAR(13)
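-- Append each column definition: name, type (with length or precision/scale where needed), nullability, and any identity seed/increment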
SELECT @SQL = @SQL +
' [' + c.name + '] ' +
UPPER(tp.name) +
CASE
WHEN tp.name IN ('char', 'varchar', 'nchar', 'nvarchar')
THEN '(' + CASE WHEN c.max_length = -1 THEN 'MAX' ELSE CAST(c.max_length / CASE WHEN tp.name LIKE 'n%' THEN 2 ELSE 1 END AS VARCHAR) END + ')'
WHEN tp.name IN ('decimal', 'numeric')
THEN '(' + CAST(c.precision AS VARCHAR) + ',' + CAST(c.scale AS VARCHAR) + ')'
ELSE ''
END +
CASE WHEN c.is_nullable = 0 THEN ' NOT NULL' ELSE ' NULL' END +
CASE WHEN c.is_identity = 1 THEN ' IDENTITY(' + CAST(ic.seed_value AS VARCHAR) + ',' + CAST(ic.increment_value AS VARCHAR) + ')' ELSE '' END + ',' + CHAR(13)
FROM sys.columns c
JOIN sys.types tp ON c.user_type_id = tp.user_type_id
LEFT JOIN sys.identity_columns ic ON c.object_id = ic.object_id AND c.column_id = ic.column_id
WHERE c.object_id = OBJECT_ID('[' + @SchemaName + '].[' + @TableName + ']')
ORDER BY c.column_id
-- Remove the final comma
SET @SQL = LEFT(@SQL, LEN(@SQL) - 2) + CHAR(13) + ');' + CHAR(13) + CHAR(13)
PRINT @SQL
FETCH NEXT FROM table_cursor INTO @SchemaName, @TableName
END
CLOSE table_cursor
DEALLOCATE table_cursor
Take the output of this script and run it, and it will recreate all the tables in the SQLWatch database. I then made the necessary manual tweaks to the tables. One thing to watch: PRINT truncates nvarchar output at 4,000 characters, so a particularly wide table may need its script retrieved another way. If you only wanted a subset of the tables, you could add a WHERE clause to the cursor's SELECT statement to filter out the tables you didn't want to script.
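For example, something like this, where the schema and the name pattern are purely illustrative and should be adjusted to the tables you actually need:
-- Cursor filtered to a subset of tables (illustrative filter)
DECLARE table_cursor CURSOR FOR
SELECT s.name AS SchemaName, t.name AS TableName
FROM sys.tables t
JOIN sys.schemas s ON t.schema_id = s.schema_id
WHERE s.name = 'dbo'                -- illustrative: keep only one schema
AND t.name LIKE 'sqlwatch_logger%'  -- illustrative: match a naming pattern
ORDER BY s.name, t.name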
Moving the Data
Then I used some SSIS packages to map the old tables to the new tables, and the data was moved across.
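If SSIS isn't available, a plain INSERT ... SELECT over a linked server can do the same job for modestly sized tables. This is a minimal sketch, assuming a linked server called [ArchiveServer] pointing at the serverless instance and a hypothetical table [dbo].[sqlwatch_example]; identity columns and very large tables need extra handling that I'm glossing over here:
-- Minimal sketch: push one table's rows to the archive database.
-- [ArchiveServer], [SQLWatchArchive] and the table/columns are placeholders.
INSERT INTO [ArchiveServer].[SQLWatchArchive].[dbo].[sqlwatch_example]
(id, collected_on, [value])
SELECT id, collected_on, [value]
FROM [dbo].[sqlwatch_example];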
Clearly, this was a quick and dirty way of moving the data, and I was able to disregard all the other database objects, including primary key and foreign key relationships. If you need to move those objects, there will be a bit more effort involved, and there are some really useful tools that can help you with that.
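If you do need to carry the relationships across, it helps to see what is there first. A quick query against sys.foreign_keys, in the same spirit as the earlier catalog queries, lists the relationships this approach ignores:
-- List the foreign key relationships the quick-and-dirty approach disregards
SELECT
fk.name AS ForeignKeyName,
OBJECT_SCHEMA_NAME(fk.parent_object_id) + '.' + OBJECT_NAME(fk.parent_object_id) AS ReferencingTable,
OBJECT_SCHEMA_NAME(fk.referenced_object_id) + '.' + OBJECT_NAME(fk.referenced_object_id) AS ReferencedTable
FROM sys.foreign_keys fk
ORDER BY ReferencingTable;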
Final Thoughts
This wasn’t a highly engineered solution. It was practical, fast, and cost-effective. If your requirements are similar—archiving data you rarely need but can’t delete—it’s a model worth considering. For more complex needs, particularly if referential integrity or logic is required, I’d recommend a more robust toolset and testing strategy.
But for quick and efficient archiving to a serverless SQL instance? This approach works.