Re: Slowly Changing Dimensions
Hi Mark, sorry for the late response. The requirement changed, so it is no longer necessary to automatically join products and prices. In the altered use case the user always selects from a list of possible conditions for a certain product, so I could simply use a filter condition on the related field in Quickbase to get the currently valid entries with your formula query approach. Thanks for your help on this topic.
Best regards, Martin
------------------------------ Martin Suske ------------------------------

Re: Slowly Changing Dimensions
Hi Blake, thanks for your feedback. My example leaves out some other tables because I wanted to focus on the problem of filtering out the currently valid row from "Product Prices". Your idea of summarizing it up into a "Products" table would help if I could put a filter on the summary and only use rows whose validity range contains the current date; for today that would be the 2.50$ row, since today falls between 2022-04-01 and 2022-09-30. I just checked and there is a filter setting for summary fields, but I guess I'm not able to apply a dynamic date formula there(?)

Another problem is that I would need to be able to sell products "in the future". Using the example model again, I should have added a "Sales Date" column to my Order table:

T_ORDER
PRODUCT_ID, PRICE, SALES_DATE
1, 2.50$, 2022-10-01

So if I sell Milk today (2022-09-12) with a future sales date of 2022-10-01, I would need to retrieve the 3.00$ price row, which applies to that period. I think it might be necessary to build a key that somehow contains the different date ranges in "Product Prices" and that can be matched against a key from the Orders table. But since I have a single date on the Order side and multiple date ranges in "Product Prices", I'm not sure how a comparison of those keys could work. Any further ideas are highly appreciated.
Kind regards, Martin
------------------------------ Martin Suske ------------------------------

Slowly Changing Dimensions
Hello Community, I need to be able to have different versions of parameters in a dimension table, depending on a given time period, and I wonder how to solve this in Quickbase. Simplified example:

T_PRODUCT_PRICES
ID, NAME, PRICE, VALID_FROM, VALID_TO
1, Milk, 2.00$, 2022-01-01, 2022-03-31
1, Milk, 2.50$, 2022-04-01, 2022-09-30
1, Milk, 3.00$, 2022-10-01, <<null>>

T_ORDER
PRODUCT_ID, PRICE
1, 2.50$

=> How can I get the currently valid price and use it in a relationship with a fact table? Are there any best practices in Quickbase?
Kind regards, Martin
------------------------------ Martin Suske ------------------------------
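For anyone who wants to prototype the lookup logic outside of Quickbase before modelling it, here is a minimal Python sketch based on the simplified T_PRODUCT_PRICES data from the post above: given a product ID and a (possibly future) sales date, it returns the price from the row whose validity range covers that date, treating an empty VALID_TO as open-ended. The data and function names are illustrative only.

from datetime import date

# Rows mirror the simplified T_PRODUCT_PRICES example; valid_to=None means open-ended.
PRODUCT_PRICES = [
    {"id": 1, "name": "Milk", "price": 2.00, "valid_from": date(2022, 1, 1), "valid_to": date(2022, 3, 31)},
    {"id": 1, "name": "Milk", "price": 2.50, "valid_from": date(2022, 4, 1), "valid_to": date(2022, 9, 30)},
    {"id": 1, "name": "Milk", "price": 3.00, "valid_from": date(2022, 10, 1), "valid_to": None},
]

def price_on(product_id, sales_date):
    """Return the price whose validity range covers sales_date, or None if nothing matches."""
    for row in PRODUCT_PRICES:
        if row["id"] != product_id:
            continue
        starts_on_time = row["valid_from"] <= sales_date
        still_valid = row["valid_to"] is None or sales_date <= row["valid_to"]
        if starts_on_time and still_valid:
            return row["price"]
    return None

print(price_on(1, date(2022, 9, 12)))   # 2.5 -> the row valid today
print(price_on(1, date(2022, 10, 1)))   # 3.0 -> the row valid for the future sales date

The same two conditions (VALID_FROM on or before the sales date, VALID_TO empty or on or after it) are what a Quickbase filter or formula query would have to express to resolve the price for a given order date.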
Re: Connected Table - number import problem
No, it is set to 2. I just tested with a point as the decimal separator and the appropriate format mask, and it worked. So I guess it is a problem with the comma as the decimal separator.
------------------------------ Martin Suske ------------------------------

Connected Table - number import problem
Hello, I set up a connected table that contains decimal numbers using a comma as the decimal separator. The data looks like this:

ID,Num_1
"1","1234,56"

When I create the table, the fields are created as type "text" and the number values look like they do in the import file (1234,56). After changing the field type to "Number - Currency" with the format "12.345.678,00", the number field shows the correct value, 1.234,56 €. I then update the import file to

ID,Num_1
"1","2234,56"

and trigger a refresh of the connected table. After the refresh the value is displayed as 223.456,00 €. It seems like the update process in QB no longer recognizes the decimal separator. Does anybody have an explanation or a solution for this?
------------------------------ Martin Suske ------------------------------

Re: Is there any way to copy data from an Excel file and paste it in the Grid Edit report?
Just came across this issue too, and it seems to be by design that the grid edit function does not allow external clipboard data to be inserted. I had no problems copying the same data into other QB fields. I cannot understand the restriction, though; it seems quite common to insert data from other sources into a data grid, and validation could be implemented for special types. Giving every user the ability to upload/merge data, or building individual pipelines for that purpose, would be worse for me.
Edit: I just learned about the "Paste Special" function that can be called from the context menu of a cell in grid edit mode. I guess that'll solve my issue.
------------------------------ Martin Suske ------------------------------

Custom CSV Export
Hi all, is there a more flexible way of defining CSV outputs for reports (changing the separator, enclosing strings in quotes to avoid confusion with the separator)? I'd like to use the report subscription via email function, but it only provides CSV with a comma as the separator, and text fields are not enclosed in quotes, so I have to make sure there are no commas in the content, right? Is there a less error-prone way of exporting data using pipelines? Or do I have to write custom code using the API for this requirement?
Best regards, Martin
------------------------------ Martin Suske ------------------------------
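If the answer turns out to be custom code against the API, the export formatting itself is the easy part. Below is a minimal Python sketch, assuming the report rows have already been fetched (for example via the Quickbase REST API) into a list of dicts; the field names, file name and semicolon separator are placeholders:

import csv

# Illustrative rows, e.g. as returned by a report query and reshaped into dicts.
rows = [
    {"ID": "1", "Name": "Milk, organic", "Num_1": "1234,56"},
    {"ID": "2", "Name": "Butter", "Num_1": "2234,56"},
]

with open("report_export.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(
        f,
        fieldnames=["ID", "Name", "Num_1"],
        delimiter=";",           # any single-character separator
        quoting=csv.QUOTE_ALL,   # enclose every field, so separators inside the content stay safe
    )
    writer.writeheader()
    writer.writerows(rows)

csv.QUOTE_ALL also escapes embedded quote characters, so commas or semicolons inside text fields no longer break the file.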
Re: Questions on data import
Good point, Don. I cannot be sure the output will remain constant once multiple data sources are attached in the future. Also, data quality can be an issue, so error checking will be important in the process. Thanks for your input.
------------------------------ Martin Suske ------------------------------

Re: Questions on data import
Hi Mark, thanks for your answer. The T2T method sounds interesting and would solve the field mapping and transformation issues. I wonder, though, what you would suggest for the initial import. I assume I'll have to deal with CSV files separated by semicolon, tab or comma, and I need to fill that temporary table with 25,000 rows in the first place. Which interface would provide the necessary performance and flexibility in file structure? I should have mentioned: in the end the whole import process should run automatically or be started by some trigger.
Best regards, Martin
------------------------------ Martin Suske ------------------------------

Questions on data import
Hello Quickbase Community, I'm looking for some experienced opinions on how to approach a current requirement. I basically need to upsert 25,000 data rows each day, and I have little to no control over the output format of the source systems. My main concern is: will I be able to perform the necessary transformations within Quickbase, or should I rely on an additional tool that sits between the source and Quickbase and transforms the data into formats Quickbase can handle?

I have already worked with connected tables and found them pretty smart for the task, but they only process comma-separated files whose headers match the Quickbase table fields. I then tested the pipeline CSV Handler with 25 rows of test data and had to wait almost 5 minutes for them to load; that doesn't seem to be the way to go with 25,000 rows, although I could flexibly change the separator and map source to target fields. I also had a look at the "Bulk Record Set - Import with CSV" pipeline, which seems suited for large data loads, but I read about a limit of 10,000 rows per pipeline execution, which would require further preparation of my source data as well.

My question is: is there a way of dealing with data transformation and mass data imports within Quickbase, or should I definitely use an ETL framework upfront? Any input is highly appreciated.
Thanks and best regards, Martin
------------------------------ Martin Suske ------------------------------
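If the process does end up in an external ETL step or custom code, the 10,000-row ceiling can be worked around by splitting the daily load into chunks and upserting them against the Quickbase REST API. A rough Python sketch, assuming the requests library and using placeholder values for the realm, user token, table ID and field IDs (field 6 acts as the merge key here):

import requests

REALM = "yourrealm.quickbase.com"   # placeholder realm hostname
USER_TOKEN = "b12345_xxx_yyy"       # placeholder user token
TABLE_ID = "bq1234567"              # placeholder table ID
MERGE_FIELD_ID = 6                  # placeholder key field used to match existing records
CHUNK_SIZE = 10_000                 # stay below the per-call row limit

HEADERS = {
    "QB-Realm-Hostname": REALM,
    "Authorization": f"QB-USER-TOKEN {USER_TOKEN}",
    "Content-Type": "application/json",
}

def upsert(records):
    """Send records to the table in chunks; each record maps field IDs to {'value': ...}."""
    for start in range(0, len(records), CHUNK_SIZE):
        chunk = records[start:start + CHUNK_SIZE]
        response = requests.post(
            "https://api.quickbase.com/v1/records",
            headers=HEADERS,
            json={"to": TABLE_ID, "data": chunk, "mergeFieldId": MERGE_FIELD_ID},
        )
        response.raise_for_status()   # basic error check; inspect the JSON body for row-level errors

# Example: two rows, with field 6 as the external key and field 7 as a currency field.
upsert([
    {"6": {"value": "1"}, "7": {"value": 1234.56}},
    {"6": {"value": "2"}, "7": {"value": 2234.56}},
])

Transformations such as normalizing the decimal separator or remapping column names would happen in the same script before the upsert call, which keeps the whole run scriptable from a scheduler or trigger.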