Forum Discussion
_anomDiebolt_
8 years ago · Qrew Elite
It is almost always a better idea to clean up and parse your data before importing it into a new system (here QuickBase). I am going to assume you have already looked at this option and are accepting the fact that the un-parsed data has landed in your application.
You used the word "consistency" in describing the information you want to parse out of the text field. More than likely your data is less consistent than you assume: information could appear out of order, could be missing, or could contain unexpected characters. One example of the data is insufficient to derive its actual format and structure. Because of this variability, it is probably best to initially supervise the parsing manually rather than completely automate the process.
Here is a quick demo - Click the Parse button to parse the fields and click the Reset button to reset the fields:
Regexp Capturing Groups
https://haversineconsulting.quickbase.com/db/bnm9bcsys?a=er&rid=1
Pastie Database
https://haversineconsulting.quickbase.com/db/bgcwm2m4g?a=dr&rid=652
Notes:
(1) To simplify the maintenance of the demo I have made the [Text] field readonly on the HTML form and provided a Reset button to clear the parsed fields.
(2) For the City, State and Zip information I have parsed the string into individual fields.
(3) This is just a simple demo of parsing data out of a field using script and regular expressions. I wrote the script so that a user unfamiliar with regular expressions can modify it without having to understand every detail. However, regular expressions are powerful enough that the script could easily be adapted to much more complex parsing requirements.
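The demo's actual patterns live in the linked app, but the capturing-group technique behind note (2) can be sketched as follows. The pattern and the sample address here are illustrative assumptions, not the demo's real regex:

```javascript
// Sketch of parsing "City, State Zip" with regex capturing groups.
// The pattern and sample value are illustrative; the demo app's
// actual regex and field names may differ.
var text = "Springfield, IL 62704";
var re = /^(.+),\s*([A-Z]{2})\s+(\d{5}(?:-\d{4})?)$/;
var match = re.exec(text);

var city, state, zip;
if (match) {
  city = match[1];   // "Springfield"
  state = match[2];  // "IL"
  zip = match[3];    // "62704"
}
```

Each parenthesized group lands in its own slot of the match array, which is what lets one expression populate the separate City, State, and Zip fields.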
If you need further assistance with implementing this solution feel free to contact me off-world using the information in my profile:
https://getsatisfaction.com/people/dandiebolt/
- ArchiveUser · 8 years ago · Qrew Captain
This works, but you have to copy the text from the source into a regular text box, save the record, then go back in and click the Parse button.
Is there a line of jQuery we can put in the module.js that will automatically copy over the Connected data field [body] to the regular text field [Bodytwo]?
thank you,
- ArchiveUser · 8 years ago · Qrew Captain
Also, can we automate selecting the Parse button?
- _anomDiebolt_ · 8 years ago · Qrew Elite
>This works, but ...
All of that can be automated by binding the button to a script that loops over all the records in a query, parses out the results, and imports the parsed text into the appropriate fields/table. Those details are specific to your setup. Additionally, you will still need a button to kick off the automatic parsing process (the loop), as I don't know of a way to run script after a connected table updates.
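The loop logic can be sketched roughly as below. Fetching the records from QuickBase (e.g. via a query API call) and importing the results back are specific to your app and are omitted; `records`, `parseRecord`, and the pattern are illustrative stand-ins:

```javascript
// Parse one record's text value into its pieces, or return null on failure.
// The pattern is illustrative; a real app would use its own regex.
function parseRecord(text) {
  var re = /^(.+),\s*([A-Z]{2})\s+(\d{5})$/;
  var m = re.exec(text);
  return m ? { city: m[1], state: m[2], zip: m[3] } : null;
}

// Loop over already-fetched records, collecting successes for import
// and failures for manual review before any full automation.
function parseAll(records) {
  var parsed = [];
  var failed = [];
  records.forEach(function (r) {
    var result = parseRecord(r.text);
    if (result) {
      parsed.push({ rid: r.rid, fields: result });
    } else {
      failed.push(r.rid); // review these manually
    }
  });
  return { parsed: parsed, failed: failed };
}
```

Keeping a `failed` list is what makes the batch run supervisable: records the regex cannot handle surface immediately instead of silently importing bad data.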
However, I would still approach the problem using a button on individual records so you can supervise and observe the parsing. It is unlikely that one example of data captures the true variability of your incoming data, and my regular expression may fail in special cases. For example, (1) missing or extra spaces in the City, State and Zip information or (2) multiple newline characters in the Description information will cause the current regular expressions to fail. Regular expressions are powerful enough to address the variance in your data, but you have to learn these characteristics by manually supervising the script's parsing before moving to full automation.
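The whitespace failure mode can be demonstrated concretely. Both patterns below are illustrative assumptions, not the demo's actual expressions:

```javascript
// A strict pattern breaks when spacing varies; a more permissive one
// tolerates extra or missing whitespace around the separators.
// Both patterns are illustrative examples.
var strict = /^(.+), ([A-Z]{2}) (\d{5})$/;
var loose = /^(.+?)\s*,\s*([A-Z]{2})\s+(\d{5})\s*$/;

var messy = "Springfield ,  IL   62704";
strict.test(messy); // false - the extra spaces break the strict pattern
loose.test(messy);  // true

// Similarly, using [\s\S] instead of . lets a Description group span
// multiple newline characters, which "." alone does not match.
```

This is why supervising a few batches first pays off: each failure reveals a variant the pattern must be loosened to accept.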
If you need further assistance with implementing this solution feel free to contact me off-world using the information in my profile:
https://getsatisfaction.com/people/dandiebolt/