Ignore values when importing from CSV

  • Question
  • Updated 1 year ago
  • Answered
I am running an API_ImportFromCSV command:
<qdbapi>
   <usertoken> EXCLUDED </usertoken>
   <records_csv>
      <![CDATA[
%RepeatOn%
[Location Code.csv],[Location Description.csv]
%RepeatOff%
     ]]>
   </records_csv>
   <clist>13.14</clist>
</qdbapi>


I receive the following error message:
<?xml version="1.0" ?>
<qdbapi>
<action>API_ImportFromCSV</action>
<errcode>51</errcode>
<errtext>Trying to add a non-unique value to a field marked unique</errtext>
<errdetail>Your import data has duplicate values for the following unique fields<BR/>
<BR/>
Field name: Location Code<BR/>
Duplicate values:<BR/>
470780</errdetail>
<num_recs_input>63</num_recs_input>
<num_recs_added>0</num_recs_added>
<num_recs_updated>0</num_recs_updated>
<num_recs_unchanged>0</num_recs_unchanged>
</qdbapi>

So, I have designed a webhook that updates my Sites table with data from my Project Code table. What I would like to do is update the Sites table whenever an update is made to the Project Codes table. However, several project codes may use the same location code (see below). The result is that I will often have duplicate values in the data I am trying to import into the Sites table, and as a result the command terminates.

How can I design a means to parse out duplicated location codes from the CSV file I want to import into my Sites table? The Project Code and Location Code fields must remain the key fields for their respective tables.

Example Project Code table record:
Project Code*, Project Description, Location Code, Location Description 
*table key

Example Sites table record:
Location Code* (fid 13), Location Description (fid 14)
*table key 

Each project code record has a location code assigned to it. These location codes, as they are added and modified, should update the Sites table via the API. I import a massive list of 10,000+ records provided to me by our client each week and would like to avoid the import failing on duplicate location codes.


Example Import:
201745846389, Westpoint South AMA , 470780, Westpoint South 
201845788857, Westpoint South IDBC, 470780, Westpoint South 
201700015000, Mayfield North KRS, 487975, Mayfield North
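One way to strip the duplicates before the import is to key the rows on Location Code and keep only the first occurrence. A minimal JavaScript sketch (JS is the approach the community posts suggest; the function name and the assumption that Location Code and Location Description sit in columns 3 and 4 come from the example import above, not from any Quickbase API):

```javascript
// Deduplicate import rows by Location Code (column index 2 in the example
// import), keeping only the two columns the Sites table needs (fids 13 and 14).
function dedupeByLocationCode(csvText) {
  const seen = new Set();
  const out = [];
  for (const line of csvText.trim().split("\n")) {
    const cols = line.split(",").map(c => c.trim());
    const locationCode = cols[2];
    if (!seen.has(locationCode)) {
      seen.add(locationCode);
      out.push([locationCode, cols[3]].join(","));
    }
  }
  return out.join("\n");
}

const sample = [
  "201745846389, Westpoint South AMA , 470780, Westpoint South",
  "201845788857, Westpoint South IDBC, 470780, Westpoint South",
  "201700015000, Mayfield North KRS, 487975, Mayfield North",
].join("\n");

console.log(dedupeByLocationCode(sample));
// 470780,Westpoint South
// 487975,Mayfield North
```

The returned string is exactly what would go inside the `records_csv` CDATA block of the `API_ImportFromCSV` call, so the import never sees a duplicate Location Code.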
Cee Jones

Posted 1 year ago
Matthew Neil
If "Location Code" is the parent table to the projects, you could just import all the lines into the projects table, and if the Location Code doesn't already exist, it will create a new record for that location code.

That way you aren't importing at all to the "Location Code" table, but just using an Action / Webhook to make the needed records.

Is this Telecom? or Infrastructure? 
Cee Jones
Thanks for the reply. But I think you may want to re-read my post. If I wasn't clear, I'll try to re-articulate the problem. The initial webhook works fine until I run into the problem wherein several child records being updated in the projects table have the same location code. When the command encounters this scenario, it ceases to execute any portion of the command and returns an error:

<errtext>Trying to add a non-unique value to a field marked unique</errtext>
<errdetail>Your import data has duplicate values for the following unique fields</errdetail>


I have to first figure out a way to parse the data before import, ideally without doing it in a separate app like Excel.

This is what I'm working on now. I found a few posts in the community that seem to offer a solution using JS. I'm not well versed in JS; it will take me a few days of trial and error to figure out his suggestion, so I'm looking for a way to embed this in the webhook I designed (see above) if possible.
Matthew Neil
My mistake.

I'm thinking of two options: one with code, the other native but more manual.

(1) Use a script to pull the missing Locations and combine the 'like' values prior to import,
or
(2) Create a summary report with the rows grouped by Location Code, so you only get unique values. Then export that report as CSV and import it into the Location Code table.
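Option (1) might look something like this in JavaScript. The `existingCodes` array stands in for whatever a query against the Sites table returns (the function name and sample data are placeholders based on the example import earlier in the thread); the script then drops both codes already in Sites and codes repeated within the weekly file:

```javascript
// Given the Location Codes already in the Sites table and the raw weekly
// import rows, return only the new, unique Location Code / Description
// pairs: the CSV body for the subsequent API_ImportFromCSV call.
function missingLocations(existingCodes, rows) {
  const seen = new Set(existingCodes);
  const out = [];
  for (const row of rows) {
    // Columns assumed: Project Code, Project Description,
    // Location Code, Location Description.
    const [, , code, description] = row.split(",").map(c => c.trim());
    if (!seen.has(code)) {
      seen.add(code);
      out.push(`${code},${description}`);
    }
  }
  return out.join("\n");
}

// 470780 already exists in Sites, so only Mayfield North survives.
const weekly = [
  "201745846389, Westpoint South AMA , 470780, Westpoint South",
  "201845788857, Westpoint South IDBC, 470780, Westpoint South",
  "201700015000, Mayfield North KRS, 487975, Mayfield North",
];
console.log(missingLocations(["470780"], weekly));
// 487975,Mayfield North
```

The same filtering keeps the weekly 10,000-record file from ever handing `API_ImportFromCSV` a duplicate value for the unique Location Code field.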