  • 1.  EXCLUDE DUPLICATE RECORDS FROM AN ARRAY PRIOR TO IMPORTING TO QUICKBASE WITH API_ImportFromCSV

    Posted 11-05-2014 13:28

    As far as I know there are two alternatives:

    1.- Make each record in the array unique (there are leads in the community on how to do that after reading and parsing the text file), then import against a key field in the Quickbase table.

    That's fine as long as you don't have to redo all your relationships.

    2.-

    Use API_DoQuery from a script in the table you want to import to, then compare that query's results against the array you built from the file you have already read, using the Underscore library. In a way it is like Excel when you do vertical lookups or filter with Advanced Filter; surely most of us do similar things in Excel.

    All ideas taken from dandiebolt.
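    As a rough sketch of alternative 2 (all names, keys, and values below are hypothetical), the comparison step could look like this once the existing key values and the file rows are both in plain arrays:

    ```javascript
    // Hypothetical data: existingKeys would come from parsing the
    // API_DoQuery response; fileRows from parsing the uploaded text file.
    var existingKeys = ["1001", "1002", "1003"];
    var fileRows = [
      ["1002", "Widget", "Blue"],   // key already in the table -> drop
      ["1004", "Gadget", "Red"]     // new key -> keep
    ];

    // Keep only the rows whose key (first column) is not in the table yet --
    // the same effect as an Excel VLOOKUP that returns #N/A for new rows.
    var rowsToImport = fileRows.filter(function (row) {
      return existingKeys.indexOf(row[0]) === -1;
    });
    // rowsToImport → [["1004", "Gadget", "Red"]]
    ```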

    My problems:

    a.- I don't know exactly what the output format of API_DoQuery is: is it an array, is it an object, how is the data structured? From my limited experience that makes a huge difference, especially when you are a "cut and paste script writer" like me: first I think in Excel, then I try to replicate that with script.

    b.- I have been reading through the Underscore.js library to find a function that does this for me.

    So I found one that does exactly what I want:

    _.difference(array, *others)

    Similar to without, but returns the values from array that are not present in the other arrays.

    _.difference([1, 2, 3, 4, 5], [5, 2, 10]); => [1, 3, 4]
    As always I am confused about the correct syntax: Do I have to put the result of API_DoQuery into *others? Do I have to transform the result of API_DoQuery in some way so that I can put the new array into *others?
    Can the new array then be used in API_ImportFromCSV as-is, or must it be transformed in some way? Thanks in advance.





  • 2.  RE: EXCLUDE DUPLICATE RECORDS FROM AN ARRAY PRIOR TO IMPORTING TO QUICKBASE WITH API_ImportFromCSV

    Posted 11-06-2014 14:13
    Most of Underscore's methods work on collections, which is a term they coined to refer to either JavaScript arrays or objects. In my opinion the Underscore documentation does not call attention to this fact explicitly enough. Methods that only operate on arrays appear under the Array section of their documentation:

    http://underscorejs.org/


    Regarding which method to use to reject duplicates, I would use _.reject(), because you will be rejecting whole records, not just an isolated field, if the key already exists in the table. See:
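    A minimal sketch of that idea, with hypothetical field names and key values (the Underscore form is shown in comments; the runnable part below uses only native JavaScript so it works even without the library loaded):

    ```javascript
    // Hypothetical data: records as objects keyed by field name.
    var existingKeys = ["1001", "1002"];
    var fileRecords = [
      { key: "1001", name: "Widget" },   // already in the table -> reject
      { key: "1004", name: "Gadget" }    // new -> keep
    ];

    // With Underscore loaded this would be:
    //   var newRecords = _.reject(fileRecords, function (rec) {
    //     return _.contains(existingKeys, rec.key);
    //   });
    // Native equivalent -- reject the whole record when its key exists:
    var newRecords = fileRecords.filter(function (rec) {
      return existingKeys.indexOf(rec.key) === -1;
    });
    // newRecords → [{ key: "1004", name: "Gadget" }]
    ```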

    http://underscorejs.org/#reject


    You will need to transform the output of API_DoQuery into an array of arrays or an array of objects, as it returns an XML document. Also, once your data is filtered down to prevent duplicates, you will need to reassemble it into a CSV string, as API_ImportFromCSV is the only method that imports multiple records.
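    To illustrate both steps (a real page script would use the browser's DOMParser on the actual API_DoQuery response; the toy extractor below only handles a simplified record/f shape for the sake of a self-contained sketch, and the sample XML is invented):

    ```javascript
    // Simplified sample resembling a structured API_DoQuery response.
    var sampleXml =
      '<qdbapi>' +
      '<record><f id="6">1001</f><f id="7">Widget</f></record>' +
      '<record><f id="6">1002</f><f id="7">Gadget</f></record>' +
      '</qdbapi>';

    // Step 1: XML document -> array of arrays (toy extractor, sketch only).
    function recordsToRows(xml) {
      var rows = [];
      var recordRe = /<record>([\s\S]*?)<\/record>/g;
      var fieldRe = /<f id="\d+">([\s\S]*?)<\/f>/g;
      var m, f;
      while ((m = recordRe.exec(xml)) !== null) {
        var row = [];
        fieldRe.lastIndex = 0;              // reset between records
        while ((f = fieldRe.exec(m[1])) !== null) row.push(f[1]);
        rows.push(row);
      }
      return rows;
    }

    // Step 2: filtered rows -> CSV string for API_ImportFromCSV,
    // quoting any value that contains a comma, quote, or newline.
    function rowsToCsv(rows) {
      return rows.map(function (row) {
        return row.map(function (v) {
          var s = String(v);
          return /[",\n]/.test(s) ? '"' + s.replace(/"/g, '""') + '"' : s;
        }).join(",");
      }).join("\n");
    }

    var rows = recordsToRows(sampleXml); // [["1001","Widget"],["1002","Gadget"]]
    var csv = rowsToCsv(rows);           // "1001,Widget\n1002,Gadget"
    ```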

    Feel free to contact me off-world if you need additional help.

    Also regarding the first User Group Conference in Brazil I came up with a great name and theme song:

    QuickBase Mas Que Nada (QuickBase No Way)

    https://www.youtube.com/watch?v=zeBDoNBNMro


    We can theme the conference around all the great things you can do with QuickBase that are considered "impossible" with purely native features. I am working on my presentation of using Lua (created in Brazil!) with QuickBase: http://www.lua.org/


  • 3.  RE: EXCLUDE DUPLICATE RECORDS FROM AN ARRAY PRIOR TO IMPORTING TO QUICKBASE WITH API_ImportFromCSV

    Posted 08-08-2019 22:00
    Quick question on this old topic: would you be able to run these multiple API calls by setting up the IOT and putting them on the module.js page? This is exactly what I need to do in order to create additional child records when multiple multi-select fields are updated (without duplicating ones already created); currently I am using a single webhook for the API_ImportFromCSV. If it can be done with this type of code, I will learn how to write it! I don't want to spend a million hours if it would not work in the end. Thanks for your feedback, Dan!