Convert a Connected Table to Normal Table

Greetings,

To populate one of the tables in our application, a user downloads a large CSV (over a million rows) from a database and loads it into Google Drive, from which it is imported into QuickBase daily in replace mode. This means records are only added/removed once a day, not continuously, so there is a delay throughout the day. We have an opportunity to receive the records from this service in JSON format instead, which would reduce manual effort and give us a near-real-time view.

We attempted a test, but the response was:

    {
      "data": [],
      "metadata": {
        "createdRecordIds": [],
        "lineErrors": {
          "1": ["You don't have permission to add records to this table."]
        },
        "totalNumberOfRecordsProcessed": 1,
        "unchangedRecordIds": [],
        "updatedRecordIds": []
      }
    }

Authentication checked out: the service was able to add records to a non-connected table with the same credentials:

    {
      "data": [],
      "metadata": {
        "createdRecordIds": [1558344],
        "totalNumberOfRecordsProcessed": 1,
        "unchangedRecordIds": [],
        "updatedRecordIds": []
      }
    }

Is it possible to either:

a) allow a service to create/remove records in a connected table, or
b) disconnect the Google Drive integration and turn this into a regular table?

I would like to keep the existing table rather than create a new one, because it has many formulas and relationships to other tables. Hoping someone here is an expert in connected tables!
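For reference, the JSON responses above match the shape of the QuickBase RESTful records endpoint (POST /v1/records). A minimal sketch of how such a request is assembled, assuming that endpoint; the realm, table ID, token, and field ID here are hypothetical placeholders, not values from the post:

```python
import json

# Hypothetical placeholders -- substitute your own realm, table DBID, and token.
QB_REALM = "yourrealm.quickbase.com"
USER_TOKEN = "b2xxxx_xxxx_xxxxxxxxxxxxxxxxxxxxxxxx"

def build_add_record_request(table_id, field_values):
    """Build headers and JSON body for a QuickBase 'insert records' call.

    field_values maps field ID (as a string) to the value to store.
    """
    headers = {
        "QB-Realm-Hostname": QB_REALM,
        "Authorization": f"QB-USER-TOKEN {USER_TOKEN}",
        "Content-Type": "application/json",
    }
    body = {
        "to": table_id,
        "data": [{fid: {"value": v} for fid, v in field_values.items()}],
    }
    return headers, json.dumps(body)

headers, body = build_add_record_request("bxxxxxxxx", {"6": "example value"})
print(body)
```

Whether the call succeeds against a connected (sync) table is exactly the permission question raised above; the request shape is the same either way.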
------------------------------ Geoff Barrenger ------------------------------

Re: API_DoQuery Response Different in Web vs Terminal AppToken/UserToken

Hi - just in case anyone else searches and finds this, I figured it out:

    curl -X POST 'https://domain/db/table' \
      -H 'Content-Type: application/xml' \
      -H 'Content-Length:' \
      -H 'QUICKBASE-ACTION: API_DoQuery' \
      -d '<qdbapi>
            <usertoken>xxxxxxxxxxxxxxxx</usertoken>
            <query>{6.CT.1427445}AND{99.EX.TRUE}</query>
            <clist>74.104.52</clist>
            <slist>98</slist>
          </qdbapi>'

The query-string format doesn't return the records in terminal, but if you POST the request as XML you get the desired result. I still don't know why the URL version doesn't work as well - probably a formatting or quoting issue - but this turned out great.

------------------------------ Geoff Barrenger ------------------------------

API_DoQuery Response Different in Web vs Terminal AppToken/UserToken

Hi,

I'm having trouble getting a DoQuery to work with curl in terminal. I'm doing it this way so I can then parse out just what I need. The end goal is:

1) A user scans a QR code with a client.
2) The client sends the QR code to QuickBase.
3) QuickBase returns an image URL corresponding to that QR code.
4) The client displays the image to the user.

When I make the call in the web browser with an apptoken, I get the expected result (needs to be logged in). When I make the call in terminal with a usertoken, I get a response, but it contains no data. Any ideas? (I also noticed I needed to use %27 instead of quotes in the web browser.)

I am searching field 6 to see if it contains both "1427445" and "alternate10"; the expected response is the file name and the URL. Interestingly, when I did an API_Authenticate and then tried in terminal with an apptoken and ticket, I still got the same result!
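The XML POST above can also be built programmatically rather than hand-quoted in the shell. A sketch in Python that assembles the same `<qdbapi>` body (the token is the placeholder from the post):

```python
import xml.etree.ElementTree as ET

def build_doquery_xml(usertoken, query, clist, slist):
    """Build the <qdbapi> XML body for an API_DoQuery POST."""
    root = ET.Element("qdbapi")
    for tag, value in (("usertoken", usertoken), ("query", query),
                       ("clist", clist), ("slist", slist)):
        ET.SubElement(root, tag).text = value
    return ET.tostring(root, encoding="unicode")

body = build_doquery_xml(
    "xxxxxxxxxxxxxxxx",               # placeholder user token
    "{6.CT.1427445}AND{99.EX.TRUE}",
    "74.104.52",
    "98",
)
print(body)
```

Building the body this way sidesteps shell quoting entirely, which is likely why the XML POST behaved where the raw query-string URL did not.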
Web browser:

    https://domain.quickbase.com/db/table?a=API_DoQuery&apptoken=xxxxxxxxxxxxxxxxxxxxx&includeRids=1&query={%276%27.CT.%271427445%27}AND{%276%27.CT.%27alternate10%27}&clist={26.6.52}

Result (this works):

    <qdbapi>
      <action>API_DoQuery</action>
      <errcode>0</errcode>
      <errtext>No error</errtext>
      <dbinfo>
        <name>Assets</name>
        <desc>Assets available in AEM / ASC.</desc>
      </dbinfo>
      <variables></variables>
      <chdbids></chdbids>
      <record rid="1158942">
        <filename>s7-1427445_alternate10.psd</filename>
        <social_url>RESPONDS WITH THE URL TO AN ASSET WHICH I NEED SUCCESSFULLY</social_url>
        <update_id>1631916573812</update_id>
      </record>
    </qdbapi>

Terminal:

    curl "https://domain.quickbase.com/db/table?a=API_DoQuery&usertoken=xxxxxxxxxxxxxx&includeRids=1&query={'6'.CT.'1427445'}AND{'6'.CT.'alternate10'}&clist={26.6.52}"

Response (does not work - no record returned):

    <?xml version="1.0" ?>
    <qdbapi>
      <action>API_DoQuery</action>
      <errcode>0</errcode>
      <errtext>No error</errtext>
      <dbinfo>
        <name>Assets</name>
        <desc>Assets available in AEM / ASC.</desc>
      </dbinfo>
      <variables></variables>
      <chdbids></chdbids>
    </qdbapi>

------------------------------ Geoff Barrenger ------------------------------

Re: Clear out all Data in a Field - without deleting the field itself

I appreciate this cheap and cheerful suggestion! It works - not ideal once our image count goes up, but it works as expected. I think eventually I will just move this to another table and do a table-to-table lookup as Mike and Mark have suggested - I can keep the publish data in a separate table and just wipe that table out monthly. Thanks all for the great suggestions.

------------------------------ Geoff Barrenger ------------------------------

Clear out all Data in a Field - without deleting the field itself

Hello,

I feel as though I'm overlooking something very simple here. Is there any way to clear out all data in a specific field, without deleting the field itself?
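Comparing the two DoQuery calls above, the working browser URL uses %27 where the failing curl URL uses literal single quotes, so the difference is plausibly just URL encoding. A quick way to produce the encoded form is Python's standard library; this sketch encodes the query component from the curl command:

```python
from urllib.parse import quote

query = "{'6'.CT.'1427445'}AND{'6'.CT.'alternate10'}"

# Percent-encode the value, keeping the braces and dots the query syntax needs;
# the single quotes become %27, matching the URL that worked in the browser.
encoded = quote(query, safe="{}.")
print(encoded)
```

This prints `{%276%27.CT.%271427445%27}AND{%276%27.CT.%27alternate10%27}`, i.e. exactly the query portion of the working browser URL.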
Or even better: clear out all data in a specific field if certain parameters are met?

Example: I'm keeping track of the status of images in another system (available/unavailable, etc.). If an image is unpublished, it is generally moved out of the library, and its status in QB should become null. Unfortunately, the library can only report on what's in it - not what's NOT in it. So how do we wipe out all of the other data?

I don't need to do this every week - most of the time, just loading new publish dates is fine - but once a month or so I'd like to wipe the "Publish Status" field and start from scratch. Generating a full report every week is too time-consuming; loading only the differences is more efficient. Any ideas?

------------------------------ Geoff Barrenger ------------------------------

Re: Retaining Filters on 'New' Table View

Although the new view has some nice features, losing the quick filters is a big step back. Being able to set up reports so that other users can filter down to exactly what helps them is a huge benefit of the sidebar filters. Is there any response from QB about this? Losing the sidebar filters would be a huge loss.

------------------------------ Geoff Barrenger ------------------------------

Re: API API_AddRecord - with Update

https://developer.quickbase.com/operation/upsert

This absolutely did the trick - it wasn't hard to format. Now I just have to work out where it needs to be used. Thank you!

------------------------------ Geoff Barrenger ------------------------------

Re: API API_AddRecord - with Update

"Upsert" might be good! I hadn't seen this yet... I'll have a read through and see if I can format what I need this way. API_ImportFromCSV has some benefits - it's one call for a lot of data instead of many little calls - but individual calls give more of a real-time update, which is what we're trying to accomplish.
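One programmatic route to the monthly wipe (my assumption, not something suggested in the thread) is the RESTful upsert linked above: send the affected record IDs with an empty value for the field to clear. A sketch of just the request body, with hypothetical field IDs (3 = Record ID#, 8 = Publish Status):

```python
import json

def build_clear_field_payload(table_id, record_ids, key_fid, clear_fid):
    """Build an upsert body that blanks `clear_fid` on each listed record,
    matching existing records by `key_fid` (e.g. Record ID#)."""
    return json.dumps({
        "to": table_id,
        "data": [
            {str(key_fid): {"value": rid}, str(clear_fid): {"value": ""}}
            for rid in record_ids
        ],
        "mergeFieldId": key_fid,
    })

# Hypothetical table ID and record IDs for illustration.
payload = build_clear_field_payload("bxxxxxxxx", [101, 102, 103],
                                    key_fid=3, clear_fid=8)
print(payload)
```

Whether an empty value fully nulls out every field type is worth verifying on a test table first; this is a sketch of the shape of the call, not a tested recipe.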
If anyone else has any other suggestions please send them along; otherwise, thank you Nathan and Mark.

------------------------------ Geoff Barrenger ------------------------------

Re: API API_AddRecord - with Update

Hi, thanks for the quick reply. No - it launches from an AppleScript that audits a list of files in specific folders and creates records from the files and the metadata in them. Occasionally, though, there will have been a previous file with the same name (unavoidable); in that case the key field value stays the same, but the other fields should be updated with the new metadata. There is no way to know in advance (short of a DoQuery first?) whether a file with the same name already exists. Is the only solution to query first, then AddRecord if there is no match and EditRecord if there is? Seems cumbersome...

------------------------------ Geoff Barrenger ------------------------------

API API_AddRecord - with Update

Hello,

Is it possible to use API_AddRecord with updates? Occasionally there are already historical records with the same (unique) key field value, and those calls fail. The other option seems to be API_ImportFromCSV, but that would require a fairly large rewrite. If API_AddRecord had a "merge with" or "update" modifier, that would be fantastic. Any ideas?

------------------------------ Geoff Barrenger ------------------------------
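As the replies in this thread conclude, the RESTful upsert covers the add-or-update case without a query-first round trip: with mergeFieldId pointing at the unique key field, a matching record is updated and a non-matching one is created. A minimal sketch of the request body, with hypothetical field IDs (6 = file name as the key, 7 = metadata):

```python
import json

def build_upsert_payload(table_id, rows, merge_fid):
    """Build an upsert body; each row maps field ID -> value.
    Rows whose merge field matches an existing record are updated;
    the rest are created as new records."""
    return json.dumps({
        "to": table_id,
        "data": [{fid: {"value": v} for fid, v in row.items()} for row in rows],
        "mergeFieldId": merge_fid,
    })

payload = build_upsert_payload(
    "bxxxxxxxx",                                            # hypothetical table ID
    [{"6": "s7-1427445_alternate10.psd", "7": "new metadata"}],
    merge_fid=6,
)
print(payload)
```

Multiple rows can go in one `data` array, so the AppleScript audit could batch its files into a single call rather than issuing one AddRecord per file.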