
How to Use the Bucket Channel to Schedule an Archive to Box

JamesTravaglini
5 years ago

You may be wondering what the Bucket channel is now that you've started to use our newest feature, Pipelines. The Bucket channel is a built-in channel that lets you create temporary storage tables, or objects, in a pipeline and then access that data in all of your later steps. In this tutorial, we'll use this method to create a temporary CSV table and archive data from Quickbase to Box on a monthly schedule.

Before we start the tutorial, you'll first want to access Pipelines and connect to the Box channel. For step-by-step information on connecting to Box, click here.

Now that you have connected to the Box Channel we can start creating our Pipeline!

Step A: Define Your Table Using the Bucket Channel

1. Locate the Bucket channel > Pipeline Rows > Define Table. Drag this action to the first step of your pipeline.
2. Determine your table's Header Separator. In this example, I'm using a comma.
3. Define your header rows. NOTE: These will correspond to fields in Quickbase in later steps.
For my example, I'm archiving Time Cards. I want Date, Employee, Hours, and Related Time Card as my headers, which should look as follows: Date,Employee,Hours,Related Time Card
4. Determine the Date and DateTime Formats.
5. Define the type of field needed for each column. In this example it is fine to leave all columns as "String".
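If it helps to picture what Step A sets up, the Bucket table behaves like an in-memory CSV with a fixed, comma-separated header row. Here's a rough Python sketch of that idea (the header names mirror this example; everything else is illustrative, not actual Pipelines code):

```python
import csv
import io

# Headers defined in Step A, separated by commas
HEADERS = ["Date", "Employee", "Hours", "Related Time Card"]

# The Bucket table starts out as an empty CSV with just the header row;
# rows get appended in later steps
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=HEADERS)
writer.writeheader()

print(buffer.getvalue().strip())  # Date,Employee,Hours,Related Time Card
```

Leaving every column as "String" in Step A matches this sketch: CSV cells are plain text unless you tell the consumer otherwise.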

Step B: Search Your Quickbase Table

1. Go to the Quickbase Channel > Records > Search Record. Drag this Step below Step A.
2. Select the table you want to search for records.
3. Select all the fields from the table that will be needed in upcoming steps. e.g. Date, Employee, Hours, Related Time Card.
I also created a formula checkbox field called "Previous Month" to use in a query.
4. (optional) Create a query for your records so you're only archiving the data you need. In my example, I'm filtering to records where "Previous Month" is "Yes" (true/checked).
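The "Previous Month" checkbox isn't built in; it's a formula field you create yourself, and its logic boils down to a simple date comparison. Here's a hedged Python sketch of that logic (the function name and sample dates are mine, for illustration only):

```python
from datetime import date

def in_previous_month(d: date, today: date) -> bool:
    """True when d falls in the calendar month immediately before today's month."""
    # Step back one day from the first of the current month
    # to land somewhere in the previous month
    first_of_month = today.replace(day=1)
    last_of_previous = date.fromordinal(first_of_month.toordinal() - 1)
    return (d.year, d.month) == (last_of_previous.year, last_of_previous.month)

print(in_previous_month(date(2020, 6, 15), date(2020, 7, 1)))  # True
```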

Step C: Add a Row for each record returned from Step B.

1. Locate the Bucket channel > Pipeline Rows > Add a Row. Drag this step under where it says "For each Record Do" in Step B.
2. Select Step A where it says "Pipeline Row". This will populate the fields defined for each column in your spreadsheet, e.g. Date, Employee, Hours, Related Time Card.
3. Drag the appropriate fields from Step B to their corresponding columns.
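Conceptually, the "For each Record Do" loop with Add a Row appends one CSV line per record returned by the Step B search. A rough Python sketch of that loop (the sample records are invented for illustration):

```python
import csv
import io

HEADERS = ["Date", "Employee", "Hours", "Related Time Card"]

# Stand-ins for the records returned by the Step B search
records = [
    {"Date": "06-01-2020", "Employee": "A. Smith", "Hours": "8", "Related Time Card": "101"},
    {"Date": "06-02-2020", "Employee": "B. Jones", "Hours": "7.5", "Related Time Card": "102"},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=HEADERS)
writer.writeheader()
for record in records:       # "For each Record Do"
    writer.writerow(record)  # Step C: Add a Row

print(buffer.getvalue())
```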

Step D: Download CSV

1. Locate the Bucket channel > Pipeline Rows > Download CSV. Drag this step below Step B (outside the "For each Record Do" loop, so the CSV is downloaded once after all rows have been added).
2. Select Step A, in the area for "Pipeline Row".

Step E: Upload File in Box

1. Locate the Box channel > Files > Upload File In. Drag this step below Step D.
2. Determine the folder path needed. In this example, I'll be uploading to a folder I titled "QB". Your folder path should look as follows: All Files/QB
3. Name your file. In this example, I want the file name to be the date the upload was made. To do this, I'm using a date/time conversion to return today's date, appended with .csv. e.g. {{|date_mdy}}.csv

For more information on working with date and time in pipelines, click here.
4. Drag the download URL object from Step D to the URL in Step E.
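Assuming the date_mdy conversion renders dates as MM-DD-YYYY (check the date/time article linked above to confirm), the file-name logic is equivalent to this small Python sketch (the function name is mine, not a Pipelines API):

```python
from datetime import date

def archive_filename(today: date) -> str:
    # Format today's date as MM-DD-YYYY and append the .csv extension
    return today.strftime("%m-%d-%Y") + ".csv"

print(archive_filename(date(2020, 7, 1)))  # 07-01-2020.csv
```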

Final Step: Schedule your Pipeline!

In the upper right corner of your screen, locate "Schedule pipeline" (just to the right of "Run Pipeline"). From here you can set the pipeline to run on the 1st of every month.
NOTE: the schedule time is in UTC.
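Because the schedule runs in UTC, it's worth double-checking what a given UTC run time means in your local timezone. A quick Python sketch (the 05:00 UTC run time and the UTC-4 offset are hypothetical examples):

```python
from datetime import datetime, timezone, timedelta

# Hypothetical: the pipeline is scheduled for 05:00 UTC on the 1st of the month
run_utc = datetime(2020, 7, 1, 5, 0, tzinfo=timezone.utc)

# What that means in a UTC-4 local timezone (e.g. US Eastern daylight time)
local = timezone(timedelta(hours=-4), "UTC-4")
print(run_utc.astimezone(local))  # 2020-07-01 01:00:00-04:00
```

If your "1st of the month" archive needs to land on the 1st in your own timezone, pick a UTC hour that stays on the right local date.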

Our pipeline is now complete! Below is what the final product should look like.

When you're ready, feel free to test the pipeline by clicking "Run Pipeline". Then go to Box and you'll see your file uploaded!
Find more Pipelines How-Tos Below:
Pipelines and Object Linking
How to use Pipelines to email a report at a specific time of day
Importing Pipeline YAML
Build better trend reports with Pipelines
Generate template records with Pipelines
Version 1.0