Got CSV To Import?


I wanted to show you a simple way to read a CSV file and process it before sending it to QuickBase. Normally, when a file input control is on a form, you select a file from the local file system and the file is sent directly to the server when the form posts. However, with the HTML5 File API you can actually read the selected file, manipulate its contents, and send the results to the server through an AJAX call so the host page never reloads.

To simplify the procedure even further, I wrote a carefully crafted piece of code that makes the process dead simple while still allowing a great deal of customization for integration with QuickBase. Here is the code:

Pastie Database

https://haversineconsulting.quickbase.com/db/bgcwm2m4g?a=dr&rid=280

In a nutshell, all you have to do is place a file control on your page and give it an id, such as the myFile shown below:

<input type="file" id="myFile" accept=".csv">

After the user selects a file, the CSV data will magically show up and can be accessed through the variable csv in the following short snippet of code, where the same id is passed to gotCSV():

gotCSV("myFile").then(function(csv){
  // your code here
  console.log(JSON.stringify(csv, null, "  "));
});

So if the CSV file contained the following data:

name,phone,email
John,(800) 555-1212,john@doe.com
Jane,(888) 555-1212,jane@doe.com
Baby,(877) 555-1212,baby@doe.com

You will be able to access the csv variable as an array of objects whose properties correspond to the column names:

[
  {
    "name": "John",
    "phone": "(800) 555-1212",
    "email": "john@doe.com"
  },
  {
    "name": "Jane",
    "phone": "(888) 555-1212",
    "email": "jane@doe.com"
  },
  {
    "name": "Baby",
    "phone": "(877) 555-1212",
    "email": "baby@doe.com"
  }
]

Once you have the CSV represented as a JavaScript object you can go to town manipulating it. It now becomes a trivial matter to transform or filter the incoming CSV data prior to importing it into QuickBase.
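For example, a hypothetical cleanup pass over the sample data above (dropping rows without an email, stripping phone formatting, lowercasing emails; the rules here are purely for illustration) is just array filtering and mapping:

```javascript
// Hypothetical cleanup pass over the parsed rows (the array gotCSV() resolves
// with): drop rows lacking an email, strip phone formatting, lowercase emails.
function cleanRows(rows) {
  return rows
    .filter(function(row) {
      return row.email && row.email.indexOf("@") !== -1;
    })
    .map(function(row) {
      return {
        name: row.name,
        phone: row.phone.replace(/\D/g, ""), // "(800) 555-1212" -> "8005551212"
        email: row.email.toLowerCase()
      };
    });
}
```

You would call cleanRows(csv) inside the then() callback before sending anything to the server.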

Notes:

(1) The function gotCSV() wraps the two asynchronous processes of (a) the user selecting a local file and (b) reading the local file using the HTML5 File API, and returns a promise.

(2) When the promise resolves, the data in the selected file will have been converted to an array of objects and made available as the JavaScript variable csv.

(3) This code is very generic and will allow you to perform a wide variety of tasks such as:

  • cleaning up dirty data prior to import
  • normalizing data
  • removing duplicate data
  • preventing the import of redundant data
  • converting quantities prior to import
  • supplying default data prior to import
  • splitting the raw CSV data into parent/child components prior to import
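A couple of the items above, sketched as plain functions (keying duplicates on email and defaulting a blank phone number are assumptions chosen for illustration):

```javascript
// Sketch of two items from the list: removing duplicate rows (keyed on the
// email column, an assumption) and supplying a default phone when blank.
function dedupeAndDefault(rows) {
  var seen = {};
  return rows
    .filter(function(row) {
      if (seen[row.email]) return false; // drop redundant rows
      seen[row.email] = true;
      return true;
    })
    .map(function(row) {
      row.phone = row.phone || "(000) 000-0000"; // default value
      return row;
    });
}
```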

(4) To use the code, the only thing you have to do is place your custom code where the comment "your code here" resides. The promise returned by gotCSV() will ensure that your code runs automatically as soon as the file is selected and read by the HTML5 File API.

(5) This code should work in all evergreen browsers, and in IE11 as well:

http://caniuse.com/fileapi

(6) It may take some time to grok what this code is doing, but I assure you this is an extremely powerful technique that can greatly streamline your workflow if implemented.

(7) Enjoy!


Ⲇanom the ultimate (Dan Diebolt), Champion


Posted 5 years ago


Dinesh

Hi Dan, very supportive of you, and thanks for the quick response. I will definitely try this.
I have a simple follow-up question.
1) Once my code is done and I wish to execute the import, do I call the API to import the CSV, or can I pass control to QB's native import functionality, which does a lot (for example, it matches the .csv columns with QB column names)? As I understand it, and let me take a guess (obviously no marks for it), your answer would be to call the API. If I do so, don't I lose the column-matching capability of QB's default import functionality? Looks like that is a small giveaway, or am I wrong somewhere?
I will try this, for sure.

Ⲇanom the ultimate (Dan Diebolt), Champion

You have to call the API. This code reassembles the CSV to call API_ImportFromCSV:

gotCSV("myFile").then(function(csv){
  console.log(JSON.stringify(csv, null, "  "));

  // underscore's pluck: collect every "name" value (see note below)
  var names = _.pluck(csv, "name");
  console.log(names);

  // reassemble the rows into a CSV string
  var csv_blob = _.map(csv, function(row) {
    var items = [];
    items.push(row["name"]);
    items.push(row["phone"]);
    items.push(row["email"]);
    return items.join(",");
  }).join("\n");
  console.log(csv_blob);

  // dbid, apptoken and clist are assumed to be defined elsewhere on the page
  var promise = $.post(dbid, {
    act: "API_ImportFromCSV",
    records_csv: csv_blob,
    clist: clist,
    skipfirst: "0"
  });

  $.when(promise).then(function(xml){
    alert("imported!");
  });

});

The above code didn't transform the data in any way, but you can imagine that any manner of filtering or transformation can take place once you have the csv object isolated. I threw in a pluck example for you (look for names).
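In plain JavaScript (without underscore), the reassembly step can be sketched like this; the column order has to match the clist you pass to API_ImportFromCSV:

```javascript
// Rebuild a CSV string from the array of row objects produced by gotCSV().
// columns lists the properties in the order matching your clist.
function toCsvBlob(rows, columns) {
  return rows.map(function(row) {
    return columns.map(function(col) { return row[col]; }).join(",");
  }).join("\n");
}
```

For the sample data, toCsvBlob(csv, ["name", "phone", "email"]) reproduces the original rows. Note that a value containing a comma would itself need quoting before being joined this way.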

William

Hi,

I was able to implement this, based on the example here, and it worked great, except that if a field contains a non-ASCII character such as a Spanish word (e.g. mañana), then it's imported as ma¿1⁄2ana ....

Any tips as to how to fix this? Thanks!

Ⲇanom the ultimate (Dan Diebolt), Champion

You can remove diacriticals with code similar to this example:

var str = "ç, á, é, í, ó, ú, â, ê, ô, ã, õ, à";
str.normalize('NFD').replace(/[\u0300-\u036f]/g, "") 
// "c, a, e, i, o, u, a, e, o, a, o, a"
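To apply the same normalization to every field of the parsed rows before building the csv_blob, a sketch like this would work (it assumes the row objects produced by gotCSV()):

```javascript
// Apply the same diacritic removal to every string field of a parsed row.
function stripDiacritics(s) {
  return s.normalize("NFD").replace(/[\u0300-\u036f]/g, "");
}

function normalizeRow(row) {
  var out = {};
  Object.keys(row).forEach(function(key) {
    out[key] = stripDiacritics(String(row[key]));
  });
  return out;
}
```

You would map the csv array through normalizeRow before reassembling the CSV string.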

William

Thanks, this worked well... I spent ages trying to get the encoding to work in the CSV import but could not get it right. Although QuickBase's native import-table function respects file encoding, getting it to work with the API seems very difficult. Normalizing is fine, though, as the whole app is in English anyway.

Dinesh

Thanks. That is abundantly clear!

Darrin

Dan - thank you; we're just getting into coding with our application and this is very helpful for what we need to do. However, I'm not clear as to how (or if) this would work from the main dashboard using the widgets. I tested your code by creating an HTML page and attaching it to a web page widget. The HTML button to select a file seems to work, but it never gets to the JS function. Is there something else I need to do?

Ⲇanom the ultimate (Dan Diebolt), Champion

I posted a self-contained HTML file that you can use anywhere - inside or outside of QuickBase. What the file demonstrates is that you can read the CSV out of a file selected through a file control and log the CSV to the console. To see the console output, press F12 in any modern browser. The demo stops there; you have to do something further with the CSV other than log it to the console to make the example useful. If you are new to script, you should learn how to use the JavaScript console, as it will be your greatest aid in debugging and developing script.

Darrin

Thank you - the console advice is very helpful. Before my last post I tested your HTML outside of QB with a simple alert statement and it worked fine. That's why I figured it was something I was doing wrong with the web page widget. The console has allowed me to see that it is a 'ran insecure content' warning on the 'd3.v3.min.js' script reference. If I comment out that script line and the subsequent parse call, it works fine and gets to my alert, so I will investigate my issue further. Thanks again.

Ⲇanom the ultimate (Dan Diebolt), Champion

Try using a version of d3 hosted from an https CDN, such as:

https://cdnjs.com/libraries/d3/

If I recall correctly, the d3 version on that CDN was corrupted at the time I created this demo, so I pulled the library from d3js.org instead.

Darrin

One additional issue I am running into with code that mimics this approach. I have a master table called Opportunity that has children in a Task table. For a given opportunity, I want to modify the records in the Task table (say, delete the completed date for every task related to the opportunity). This technique seems perfect, as I planned to grab the record ids for each task with a simple API_DoQuery based on the related opportunity, then build a CSV string and call API_ImportFromCSV. The problem with the code below is that instead of listing each task record id followed by a new line, it concatenates all the ids together and then adds a new line, and this is repeated for however many tasks I have. The XML returned looks fine, so could it be something with my each loop?

// parentrid is the record id of the opportunity
// column 48 is related opportunity
var query2 = "{48.EX." + parentrid + "}";
var clist2 = "3";

// initialize ajax
$.ajaxSetup({data: {apptoken: apptoken}});

// find all task record ids for a certain opportunity
var promise4 = $.get(tbl_tsk, {
  act: "API_DoQuery",
  query: query2,
  clist: clist2
});

// query then build csv string
$.when(promise4).then(function(xml4){
  var csv = "";
  $(xml4).find('record').each(function() {
    csv += $("record_id_", xml4).text() + "/n";
  });
  console.log(csv);
});

Ⲇanom the ultimate (Dan Diebolt), Champion

Push your field values into an array and, when finished, join the array with either a comma or a new line. Like this:

https://haversineconsulting.quickbase.com/db/bgcwm2m4g?a=dr&rid=338

FWIW, it is best to start a new thread when joining an old thread with a different issue.
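The push-then-join pattern can be sketched in isolation like this. (Two things worth noting about the snippet above: "/n" is a literal slash-n rather than the newline "\n", and $("record_id_", xml4) searches the whole response rather than the current record, which is why all the ids concatenate on each pass.)

```javascript
// Sketch of the push-then-join pattern for assembling the CSV body.
// In the jQuery loop you would push the current record's value, e.g.
// lines.push($(this).find("record_id_").text()); plain values are used
// here so the pattern itself is clear.
function buildCsv(recordIds) {
  var lines = [];
  recordIds.forEach(function(rid) {
    lines.push(rid);          // push one row's value(s)
  });
  return lines.join("\n");    // join with a real newline, "\n"
}
```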

eduardo

I have no experience in programming, but I use the cut-and-paste technique (JavaScript, jQuery, etc.) with success, because I do understand the logic.

I have learned a lot from your answers, Daniel, and have implemented most of your techniques (IOL, getScript and so on). Thanks a lot in advance.

I do spend a lot of time with language syntax, because sometimes things obvious for the pros are not so obvious for the newbies.

Automating the import of CSV files (or tab-delimited files) was something my process needed.

I have successfully implemented the lines of code in this thread and they are working.

My dream is to be able to manipulate the CSV JavaScript object prior to importing, to match my table (before creating the csv_blob).
Starting with two features:

1. My CSV file has two columns:
[Date]: YYYYMMDD, which I want to format as DDMMYYYY
[Value]: xxxxxx.yy, which I want to format as xxx.xxx,yy

2. I want to create a new column in the object, so as to create a unique record prior to importing, thus reducing the possibility of duplicate entries:

[New column]: concatenating [NAME]+[Date]+[Value]. This creates a unique record in my configuration.

My idea is to include the code where you say "your code comes here".

Could you help me, or point me to something simple to start from?
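A minimal sketch of the two transformations described above (the property names Date, Value and NAME are assumptions taken from the column descriptions; this is the kind of code that would sit at the "your code here" comment):

```javascript
// Format "xxxxxx.yy" as "xxx.xxx,yy" (European-style grouping).
function euroFormat(v) {
  var parts = v.split(".");                                  // "123456.78"
  var intPart = parts[0].replace(/\B(?=(\d{3})+(?!\d))/g, ".");
  return intPart + "," + (parts[1] || "00");                 // "123.456,78"
}

// Reformat the date, reformat the value, and add a unique-key column.
function transformRow(row) {
  var d = row.Date;                                          // "YYYYMMDD"
  row.Date = d.slice(6, 8) + d.slice(4, 6) + d.slice(0, 4);  // "DDMMYYYY"
  row.Value = euroFormat(row.Value);
  row.UniqueKey = row.NAME + row.Date + row.Value;           // new column
  return row;
}
```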

Ⲇanom the ultimate (Dan Diebolt), Champion

>I do spend a lot of time with language syntax because sometimes things obvious for the pros are not so obvious for the newbies.

Run your code through a linter such as JSLINT or JSHINT to catch errors, typos and deviations from best coding practices:

JSLINT
http://www.jslint.com/

JSHINT
http://www.jshint.com/

Or use the linter built into jsfiddle.net (JSHINT) or a linter plugin for your favorite editor. The free text editor Notepad++ has a JSLINT plugin:

Notepad++
http://notepad-plus-plus.org/

JSLINT Plugin for Notepad++
http://sourceforge.net/projects/jslintnpp/
If you post code, use a pastie or pastebin.com to get syntax highlighting and formatting appropriate for the language.

Regarding your CSV question, post a new message, because it is difficult to separate your issue from the original question and I need the extra forum points to win the all-expenses-paid trip to Hawaii QuickBase is offering.

Dinesh

Hi Dan, apologies for raising this in the same thread, but it is related to the same example I was trying. It works well, but I have two small queries which I have no clue how to handle.
1) How can I encode the columns when the data contains special characters? Since my data has text columns with special characters, my data gets split and goes into multiple columns rather than the column I intend. Is there an encode or escape character I can use?
2) How can I import user email addresses into user fields? My feed has a user column containing the user's email address, which I thought would automatically be assigned to the user field. I presumed this from my experience using the QuickBase API_AddRecord, which automatically assigns the user based on the email address, but somehow this is not happening?

Thanks a lot.
Dinesh

Ⲇanom the ultimate (Dan Diebolt), Champion

I am short on time so I can't scan the entire thread. If you are importing CSV there aren't many special characters to worry about. All you have to do is surround the entire column value with double quotes, and if there is a double quote within the column value, substitute two double quotes for it. So a sentence field might originally look like this:

     He said "call me on Monday" so I did just that.

and you would "escape" it for CSV like this:

     "He said ""call me on Monday"" so I did just that."
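As a sketch, that quoting rule is a one-liner:

```javascript
// Escape a single field for CSV: wrap in double quotes and double any
// embedded double quotes.
function escapeCsvField(value) {
  return '"' + String(value).replace(/"/g, '""') + '"';
}
```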

On other matters ask a new question. I am going for the pointage!

Kelly

Dan,
I would like to apply this, but I may need more assistance. Can you provide me with your business contact info? 