With the [usp_ParseTrelloBoard] stored procedure completed in the previous section, we can import a JSON export file from Trello and return a replica of the board layout in the resultset.

Our next task is to automate the process by directing the output to a delimited text file that can be opened in Excel or Google Sheets, then emailing that file to a designated recipient.

While you could enhance the T-SQL stored procedure to format the output for an attachment via the sp_send_dbmail procedure, you'd quickly find that T-SQL alone can't handle the file management tasks of moving input files through the queue. You could open the surface area up with xp_cmdshell, but it's arguably more secure to move the file management tasks to SSIS. In SSIS, a Foreach Loop container can handle looping through the files in the source folder and passing each one through the entire workflow.

The catch with using SSIS, however, is that the usual method of exporting query results to an external file, the Data Flow task, hits a roadblock. Why? When you configure a destination connection in the Data Flow, the columns to be exported must be pre-defined in the metadata of the source connection. When you have multiple Kanban boards passing through, each with different lists (columns) defined, the source column metadata becomes a moving target.

The Script task comes to the rescue. It can call the procedure, loop through all the columns and rows in the resulting SqlDataReader, and write the output to a file using .NET Framework methods. You get the combined power of .NET and SSIS to prepare an output file for any Trello board, regardless of the number of lists and cards.

Control Flow

At a glance, we’ll construct our SSIS package as follows…

  1. A Foreach Loop container will loop through all the JSON files in a given source folder.
  2. For each file…
    1. The source file will be moved to the Processed folder.
    2. A Script task will accept the name of the file in the Processed folder, pass it into the stored procedure, write the procedure’s resultset to a text file, and pass back the name of the board in the source file and the path of the output file.
    3. Another Script task will then compose an email, sending the output file as an attachment to the recipient, along with the name of the board and the creation date of the source file in the subject line for identification.
    4. The source file is then moved to the Archive folder before the loop advances to the next source file in the queue.
  3. Once the queue is completed, we’ll erase the contents of the Output subdirectory, as the files have been emailed and should be in the Sent folder of your email server.

Let’s take a deep-dive into the design…

Variables

We’ll work with two sets of variables for this package.

On the project level, we’ll maintain the list of folders for handling the source and output files, the connection info for the DB server hosting the stored procedure, and the credentials needed for emailing the output file.

On the package level, the variables will focus on the current file in the queue (namely capturing the board name and creation date) and the fully-qualified paths for all the workflow stages.

Looping Through The Source Folder

Our Foreach Loop container will use a Foreach File enumerator to loop through all the source files and pass them through the workflow. We’ll use an expression based on a project parameter to specify our source directory. We’ll retrieve the name and extension of each file and map it to the CurrentFile local variable. By excluding the directory path (i.e., retrieving just the name and extension), we can easily prepend other directories to the file name for routing purposes throughout the workflow.

Moving to Processing

Our first File System Task handles moving the file out of the source directory into the processing directory. We prepend the source directory path from the project parameters to the current file name to form the source for the task, but need only the processing path (also stored as a project parameter) for our destination.
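As a sketch, the task’s source path can be built with an SSIS expression along these lines (the parameter and variable names here are illustrative, not necessarily those in the finished package):

```
Source:      @[$Project::SourceFolder] + "\\" + @[User::CurrentFile]
Destination: @[$Project::ProcessedFolder]
```

Because CurrentFile holds only the name and extension, the same pattern works for every later routing step: just swap in a different folder parameter.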

Querying, Parsing and Outputting via Script

The heart of our code is embedded in a Script task using VB.NET.

We’re passing in variables for the path and name of the file which we are about to process, as well as for the desired output file.

On the outbound side, we’ll pass back the name of the Kanban board the processed file represents, as well as the date the processed file was exported from Trello.

Once task execution starts, we pursue three major tasks…

In the first task, we use System.Data.SqlClient objects to instantiate a connection for calling the stored procedure. We can reference the properties of the package’s Connection Manager object to avoid passing additional variables.

Depending on your driver implementation, some connection string parameters from the Connection Manager object aren’t supported, so you may need to build a stripped-down version (here, we pass in just the instance name and default DB name, and enable Windows authentication).
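A minimal sketch of that connection setup, assuming a Connection Manager named "TrelloDB" (the manager name and property names are illustrative):

```vb
Imports System.Data.SqlClient

' Pull server and database names from the package's Connection Manager
' so we don't need extra variables for them.
Dim cm As ConnectionManager = Dts.Connections("TrelloDB")
Dim serverName As String = cm.Properties("ServerName").GetValue(cm).ToString()
Dim dbName As String = cm.Properties("InitialCatalog").GetValue(cm).ToString()

' Stripped-down connection string: instance, default DB, Windows authentication.
Dim connStr As String = String.Format(
    "Data Source={0};Initial Catalog={1};Integrated Security=SSPI;",
    serverName, dbName)

Dim conn As New SqlConnection(connStr)
conn.Open()
```

Building the string by hand keeps out any provider-specific keywords from the Connection Manager that the SqlClient driver would reject.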

In our second task, we specify our stored procedure for the SqlCommand object built on the SqlConnection, indicating our input parameter (the in-process file) and output parameter (the Kanban board name).

Calling the ExecuteReader method yields a SqlDataReader object for our resultset. We can leverage its FieldCount property to know the exact number of columns (Kanban lists) in the resultset and append the correct delimiter to each column and row.

We use a StreamWriter object to write the formatted resultset to the file target indicated in the inbound variable. I opted for tabs (Chr(9)) as the delimiters because they let you open the text file in Excel and Google Sheets without running the Import wizard (vbTab works too).

In our last task, once we’ve looped through all the columns and rows, we close the StreamWriter and the DataReader. Only after the DataReader is closed can you access the output parameter’s value for the name of the Kanban board.
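The second and third tasks together can be sketched like this, assuming package variables User::ProcessedFilePath, User::OutputFilePath and User::BoardName, and parameters @SourceFile and @BoardName on the procedure (all of these names are illustrative):

```vb
Imports System.Data
Imports System.Data.SqlClient
Imports System.IO
Imports System.Text

' Call the procedure with the in-process file and an output parameter
' for the board name.
Dim cmd As New SqlCommand("dbo.usp_ParseTrelloBoard", conn)
cmd.CommandType = CommandType.StoredProcedure
cmd.Parameters.AddWithValue("@SourceFile",
    Dts.Variables("User::ProcessedFilePath").Value.ToString())
Dim boardParam As SqlParameter =
    cmd.Parameters.Add("@BoardName", SqlDbType.NVarChar, 128)
boardParam.Direction = ParameterDirection.Output

' Stream the resultset to a tab-delimited text file; FieldCount tells us
' how many Kanban lists (columns) this particular board has.
Dim rdr As SqlDataReader = cmd.ExecuteReader()
Using writer As New StreamWriter(
        Dts.Variables("User::OutputFilePath").Value.ToString())
    While rdr.Read()
        Dim line As New StringBuilder()
        For i As Integer = 0 To rdr.FieldCount - 1
            If i > 0 Then line.Append(Chr(9))   ' tab delimiter between columns
            If Not rdr.IsDBNull(i) Then line.Append(rdr.GetValue(i).ToString())
        Next
        writer.WriteLine(line.ToString())
    End While
End Using

' The output parameter is populated only after the reader is closed.
rdr.Close()
Dts.Variables("User::BoardName").Value = boardParam.Value.ToString()
```

Since FieldCount is read at runtime, the same script handles boards with any number of lists, which is exactly the flexibility the Data Flow destination couldn’t give us.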

Emailing the Output Via Gmail

Emailing the output file as an attachment should be a breeze, as you already have the output file path as a parameter value from the previous task. All you need is the Send Mail task, requiring the name of the SMTP server, credentials to connect to the server, from and to addresses, subject and body copy, and the fully-qualified path and filename of the output file. This works if your SMTP server supports Windows authentication or allows anonymous connections.

But what if you’re running the package outside your workplace, say from home with a personal Gmail account using basic authentication? The Send Mail task explicitly does not support basic authentication. In this case, you’ll need another Script task to send the email…

We once again create a Script task using VB.NET, this time importing the System.Net.Mail namespace for easy reference to email classes (newer class libraries like MailKit are available, so feel free to substitute).

We’re relying on more project parameters this time for two reasons.

  1. First, it will enable us to pass through different email account credentials at runtime, if required.
  2. Second, we have the option to configure a project parameter as Sensitive so that its contents are masked in the Integration Services Catalog.

At the beginning of our code block, we assign our parameters to local variables. Note the use of the GetSensitiveValue method to retrieve the values of the protected parameters for the username and password.

We then instantiate an SmtpClient object and assign network credentials for basic authentication, followed by a MailMessage object with its basic delivery properties, plus an Attachment for our output file. The project parameters provide the sender and recipient info; the only local variables are for the subject line and attachment filename.

During debugging, be aware that in the case of Gmail, Google blocks basic-authentication sign-ins it considers less secure, so all transactions beyond the login may be refused. You may need to adjust your Google account settings (for example, by generating an app password) to have a successful transmission. It’s another incentive to look at newer email class libraries that support more secure authentication methods.

After sending the email, we need to dispose of all the SMTP-related objects in order to remove any references to the output file. Otherwise the downstream delete of all the files collected in the Output subdirectory may fail due to a file lock on an attachment file.
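Putting it together, the send-and-dispose sequence looks roughly like this (the project parameter and variable names are illustrative, and the Gmail host/port values assume the standard smtp.gmail.com STARTTLS endpoint):

```vb
Imports System.Net
Imports System.Net.Mail

' Sensitive project parameters are masked in the SSIS catalog and must be
' read with GetSensitiveValue.
Dim smtpUser As String =
    Dts.Variables("$Project::GmailUser").GetSensitiveValue().ToString()
Dim smtpPass As String =
    Dts.Variables("$Project::GmailPassword").GetSensitiveValue().ToString()

Dim client As New SmtpClient("smtp.gmail.com", 587)
client.EnableSsl = True
client.Credentials = New NetworkCredential(smtpUser, smtpPass)

Dim msg As New MailMessage(smtpUser,
    Dts.Variables("$Project::RecipientAddress").Value.ToString())
msg.Subject = Dts.Variables("User::EmailSubject").Value.ToString()

Dim attach As New Attachment(
    Dts.Variables("User::OutputFilePath").Value.ToString())
msg.Attachments.Add(attach)

client.Send(msg)

' Dispose everything so no handle remains on the attachment file; otherwise
' the downstream delete of the Output subdirectory can fail with a file lock.
attach.Dispose()
msg.Dispose()
client.Dispose()
```

Wrapping the three objects in Using blocks achieves the same cleanup; the explicit Dispose calls are shown here to underline why the later folder purge depends on them.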

Email Content

Upon successful delivery, you’ll see an email with the subject line containing the Kanban board name and the date the export from Trello took place.

There is no body message in this design, but the attachment (named after the JSON export file, but with a .txt extension) is readily downloadable for opening in Excel, or can be opened directly in Google Sheets.

While column widths need to be adjusted, the key takeaway is that the layout matches the Kanban board at the time of export and the tab delimiters allow for precise formatting of the individual lists and their corresponding cards.

Summary

In summary, we learned how versatile T-SQL is at reading and transforming structured, non-tabular data sources (like JSON documents), and how window functions and the PIVOT operator can reshape data for advanced presentation purposes.

We also learned how SSIS with .NET Script tasks can address the challenges of dynamically-structured data sources with varying numbers of fields.

Lastly, we learned how to send email messages with .NET to leverage email systems not directly accessible by native SSIS components.

With T-SQL, SSIS, and the .NET Framework on the SQL Server platform, your ETL capabilities are nearly unlimited.

Please be sure to visit my blog at www.mattyzdba.net for additional SQL Server development and administrative techniques.