February PDX Qrew Meeting Notes
Diego Alvarez, Tammy King, Pat Wenger, Pierce Wagner, Nico Cantillo, Lee Gilmore, Jonathan Miller, Elena Larabee
Intros
Diego Alvarez was the first to present a problem. He is working on a formula to compare date values between a set of fields. The difference between the dates determines a duration between start and finish, displayed as a number of days. The collaboration energy was strong with this one as we all discussed and postulated the best solution. Tammy really took the lead with assistance from Pat, Lee, and Elena. In a few minutes we had a working formula and were ready to move on to the next problem.
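The notes didn't capture the final formula, but a typical Quickbase sketch (assuming Date fields named [Start Date] and [Finish Date], which are stand-ins for whatever Diego's fields are called) looks something like:

```
// Numeric formula field: whole days between start and finish.
// Subtracting two Date fields yields a Duration; ToDays()
// converts that Duration into a number of days.
ToDays([Finish Date] - [Start Date])
```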
Pat and team Harder have been working on understanding best practices and methods for improving Dashboard report loading performance. We have many Dashboards containing gauge reports that struggle to load. Some reports, which we call My Reports, filter by the current user so each user gets a custom experience. Quickbase tells us the username lookup is the likely cause of the slow load times, and recommends filtering on the email address with a ToUser() formula rather than on the QB username. As a group at Harder we have discussed this many times but are still unclear on the logic behind that recommendation. At the Qrew meetup, no one had a solid explanation and we weren't certain how to explain why one works better than the other (this is going to be my Elephant's Child at Empower this year).
One response from Lee was to use a User table, which is pretty much what we have with our Employees table. We use both User type fields and an Employees table in various locations throughout the realm. The easiest way to set up a nice gauge report is to use the User field; employing an Employees table is more complex and takes longer.
Another solution is to use a formula checkbox field to identify records where a User field is assigned to the current user. In this case we use the formula UserToEmail(User()) to inform the checkbox. Then in the My Reports filter we use "is equal to" (checked) in place of "is current user", and apply the same filter to the gauge report. The explanation is that the formula does all the work once, whereas "is current user" performs the same work for each report. I would really like to see a visual representation of how this works.
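As a sketch of that pattern (the [Assigned To] field name is an assumption), the checkbox formula might look like:

```
// Formula checkbox: true when this record's assigned user
// matches the person currently viewing the report
UserToEmail([Assigned To]) = UserToEmail(User())
```

The My Reports filter then becomes: checkbox field "is equal to" checked.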
Next, Jonathan was asking on behalf of Kennedy, who wants to know how to transfer a Pipeline from one account to another. There are many variables involved in answering this question, but here is a TL;DR general answer, and we can work on this at the next PDX Qrew meetup.
If the email address is owned by your company, then to transfer Pipelines to another account, a Realm Admin can change that email into a service account. If the email is not on your domain, log in to that user's Pipelines and transfer them manually. The manual transfer takes several steps: first export the YAML, next set up a service account, and last replace the slugs and import the YAML into the service account.
The goal is to replace the part of the YAML called a slug (e.g. iAmAsLuG) and then import the modified YAML into the desired account, preferably a service account.
```yaml
# Account slugs:
#   - quickbase[iAmAsLuG]:
```
- Click "Profile and preferences" located in the upper right corner of the dark grey App bar
- Select Impersonate a user and enter the user to impersonate (the account being transferred away from)
- One at a time, click the pipeline to export
- Click the kebab icon
- Click Export
- Click "Download all YAML" or select all the text inside the black background in the Export Pipeline window and save it to a text file
- Once all the YAML is downloaded, open the YAML in VS Code or Notepad++
- Find all the slugs in the text and replace them with the one from the service account
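The find-and-replace step can be scripted if there are many files. A minimal sketch in plain Node.js (the slug values are made up; the YAML shape follows the commented export header shown earlier):

```javascript
// Swap every occurrence of the old account slug in an exported
// Pipeline YAML string for the service account's slug.
function replaceSlug(yamlText, oldSlug, newSlug) {
  // split/join avoids regex-escaping issues with arbitrary slug text
  return yamlText.split(oldSlug).join(newSlug);
}

// Example with made-up slugs:
const exported = '# Account slugs:\n#   - quickbase[iAmAsLuG]:\nsteps: []\n';
const modified = replaceSlug(exported, 'iAmAsLuG', 'myServiceSlug');
```

VS Code or Notepad++ find-and-replace does the same job interactively; scripting only pays off when there are dozens of Pipelines.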
What service account? Read on.
Create an alias User:
- Have IT add a publicly accessible email address to your domain as an alias forwarding to one or more Users who will access the Pipelines
- Verify the email can receive email from outside the domain by sending an email to the address from gmail or some other email service. If you get a bounce email, tell IT to make it public
- Add the email as a User to Quickbase
- On the Users page of the admin console, locate the email
- Under Authentication, select "Use Quickbase to manage password", then wait for the email
- The registration email is sent to the alias email and then forwarded to you
- Click the registration link and fill out the form. Record the information in a secure location like RoboForm so others can access it in case you win the lottery
Make the alias email into a service account:
- Admin console > Permissions, add the User to Permitted Users and groups with Build Pipelines and Create user tokens toggled on
- Admin console > Users, search for the Alias User, click on the User so the sidebar pops out
- In the sidebar scroll down and toggle on "Use as a service account", add whoever needs access to the service account
Refresh the page and open Profile and preferences (dark grey, upper right); there should now be a Switch Accounts section with the alias listed. At this point you have all the Pipelines YAML; now you need a slug.
Get a service account slug:
- On your account, click Profile and preferences, then click Switch to service account
- Once connected as the service account, click Profile and preferences again
- Click Channel accounts
- On the Channel accounts page, copy the Account slug for the Quickbase connector
- Open each of the Pipelines and find and replace the old slug with the service account slug
- On the service account, import the modified YAML files
- Test the import
- Turn off the old Pipelines
- Turn on the new Pipelines
- Switch back to your account
Easy, right? To be fair, once the process is understood it is really easy to perform and can be done in a few hours, depending on the number of Pipelines to move. This process should also only happen once, because from this point forward everyone is using the service account and no one is adding Pipelines on their own account. @Kennedy, join us for the next Qrew meetup and we can walk through the process.
Next, Elena shared some interesting methods around the Quickbase Document Generation (doc gen) "Document templates" tool. She talked about solving the problem of combining individual PDFs into one PDF output (we call it bundling at HMC). The process uses the Document Generator RESTful API. A JavaScript code page generates PDFs for multiple child records via URL parameters, then sends the separate PDFs to a PDF microservice such as ConvertAPI, which loops through the PDFs and combines them into a single document. The final document is put back into a record in the app as an attachment. For Elena's solution, ChatGPT helped complete the JavaScript, and she is using pdf-lib to perform the bundling. The only caveat to this solution is file size: if the combined file size is greater than the allowed attachment size (50 MB at last check), this method does not work. Informing the user who clicked the button of the error is the next problem looking for a solution.
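The size caveat can at least be checked before calling the microservice. A minimal sketch, assuming the 50 MB limit from the notes (verify the actual attachment limit on your plan):

```javascript
// Refuse to bundle when the combined size of the individual PDFs
// would exceed the attachment size limit.
const MAX_BUNDLE_BYTES = 50 * 1024 * 1024; // 50 MB, per the notes

function canBundle(pdfSizesInBytes, limit = MAX_BUNDLE_BYTES) {
  const total = pdfSizesInBytes.reduce((sum, n) => sum + n, 0);
  return total <= limit;
}

// Two 20 MB files fit; three do not
const twentyMB = 20 * 1024 * 1024;
const fits = canBundle([twentyMB, twentyMB]);             // true
const tooBig = canBundle([twentyMB, twentyMB, twentyMB]); // false
```

A check like this is also a natural place to surface an error message to the user who clicked the button, before any upload happens.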
Regarding security and uploading business files to microservices, it is not certain who has access to the uploaded files; further investigation is recommended.
Nico also has customers using the Document Templates feature in Quickbase. He mentioned that the cell height is statically limited to 50px and cannot be changed (without effort). He was unable to explain why 50px is the defined setting and was equally confused. Maybe someone at Quickbase can explain the 50px baseline and why this value was not left to the builder to decide.
Finally, we discussed a Pipeline question: is it better to batch update or loop update records on a scheduled Pipeline? For this problem I have a garbage-collection process where I want to schedule the clearing of values entered in different fields after a condition is met. In this case, we have employee records where one or more employees are onboarding, and upcoming dates are filled out prior to the onboarding date as a means of preparing. The Pipeline looks for records two weeks past the onboarding date and clears the date fields. In my experience with integration reads, batch updates are more efficient than loop updates. I started a Pipeline using the RESTful API to get a report; once I had the report, I wanted to parse its JSON to prepare a CSV for upsert with blank values. The entire group suggested this was more easily done with a formula field plus a search and loop of record updates in the Pipeline. I would still like to see if it is possible to get a report, make some changes, and then batch upsert at some point while playing with Pipelines.
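For the batch route, the queried JSON could be reshaped into an upsert payload that blanks the date fields. A sketch of that transform (the table ID and field IDs 8 and 9 are made-up examples; field 3 is conventionally Record ID#, and whether an empty value or null clears a date field is worth verifying against the API docs):

```javascript
// Turn records returned by the Quickbase query endpoint into an
// upsert body that keeps each record's key and clears the given fields.
function buildClearPayload(tableId, records, clearFieldIds, keyFid = 3) {
  const data = records.map((rec) => {
    const row = { [keyFid]: { value: rec[keyFid].value } };
    for (const fid of clearFieldIds) {
      row[fid] = { value: '' }; // assumed: empty value clears the field
    }
    return row;
  });
  return { to: tableId, data };
}

// Example: two queried records with two assumed date fields (8 and 9)
const queried = [
  { 3: { value: 101 }, 8: { value: '2024-05-01' }, 9: { value: '2024-05-15' } },
  { 3: { value: 102 }, 8: { value: '2024-06-01' }, 9: { value: '2024-06-15' } },
];
const payload = buildClearPayload('bxxxxxxxx', queried, [8, 9]);
```

The resulting payload would then be sent as one records upsert instead of one call per record, which is where the batch efficiency comes from.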
Overall the meeting went really well and it feels like we solved some problems. Afterwards we decided to visit Eem and continued our conversation around Quickbase, life, the Universe, and everything.