HeyJim Posted July 28, 2005

When a new enrollment comes in, I first enter all of the relevant information in QuickBooks. After that I open a form in our database and enter all of the same information into the database. There's simply gotta be a better way! At first I thought of entering all of the information in a spreadsheet and processing it that way, assigning each cell of information to a different variable. I'd also have to have a corresponding cell in the spreadsheet that indicates what information goes in which cell (what I'll call the title cell). I'd guess it wouldn't be too hard to figure out a way to ignore the title cells in the spreadsheet, but since I'm far, far away from being competent in ME I thought I'd ask for some advice first. How would you address this little project? Am I even on the right track with processing a CSV file or a text file, or is there a better development path to follow? Thanks for all input.
Cory Posted July 28, 2005

I'm not sure how many of what you are entering, or when, but if there were multiple items I would use ME to create a CSV file, prompting you for each value, and simply import that into the DB. Also, most DBs can link to an external file, which could be a CSV; in that case you would never have to import at all but could just use the file as a sort of table. Of course, you could also create an action query to port the data to a proper table if need be. You would still need ME to enter the info in QB. QB can import a special tab-separated value file, but I haven't had a chance to figure out the mappings and what all the header info does yet. One day... Also, in the CSV you won't have the title-cell problem, obviously. But like I say, I'm not entirely sure what you are doing, so I might be way off base.
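The CSV-building step Cory describes happens inside Macro Express, but the core idea can be sketched in a few lines of Python. This is only an illustration; the enrollment field names below are assumptions, not anything from the thread:

```python
import csv
import io

# Hypothetical enrollment fields -- the real field list is an assumption here.
FIELDS = ["first_name", "last_name", "ssn", "plan"]

def record_to_csv_row(record, fields=FIELDS):
    """Render one enrollment record as a single CSV line (no trailing newline).

    csv.writer automatically quotes any field containing a comma, so a value
    like "Smith, Jr." cannot break the column layout."""
    buf = io.StringIO()
    csv.writer(buf).writerow([record[f] for f in fields])
    return buf.getvalue().rstrip("\r\n")
```

For example, a record whose last name is `Smith, Jr.` comes out as `Jo,"Smith, Jr.",123-45-6789,A`, which any DB importer will read back as four columns.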
HeyJim Posted August 1, 2005

"...I would use ME to create a CSV file"

Thanks. I'll pursue learning that option first. If ME can create the CSV file AND assign each field to a specific variable, I should then be able to Text Type each variable into the fields as needed. (?) I'm thinking along the lines of having ME request the data for one field at a time; creating the CSV file would simply be a byproduct... hmmm. I don't know... maybe I can just import it after all... you might have a better idea there than I realized... (I'm only halfway through the manual, so I'll be a while :-0)
Cory Posted August 2, 2005

I'm not 100% sure I follow, but you mention using Text Type to create the file. That will work, but a better approach is to write to the file directly: use Variable Modify String - Append To Text File instead. It's very simple. Depending on your user input, things may vary quite a bit in structure, but I would suggest a "New Enrollment" macro that prompts the user for each bit of client info individually: "First name", "Last name", "SSN" and so on. Then show the user the data to confirm it's correct, after which ME could write all the variables to a CSV and/or enter them in QB. Or better yet, have ME prompt the user for all the info twice and compare against the previous values to ensure they entered it all correctly! It won't win you any friends, but the data entry error rate will drop like a rock.

A couple of suggestions I would offer:

1. Make some validation routines. My favorite is a repeat-until-variable-is-not-null loop; this way they can't just skip a field. I'm going to make a subroutine soon that will test SSNs for validity. If you're interested, let me know and I'll email you when it's done.
2. Write a header row to the CSV. It makes import mapping easier. Simply test whether the file exists first, and if it doesn't, create the header row.
3. Test the string variables for commas. If you're using a CSV file, you will need to add quote marks around any entry that contains one. Another option is to use a tab-separated file instead to avoid this.
4. You will need to generate a CR+LF to put at the end of each line. You can steal my subroutine to do this; it also does TAB.

If you need any help, just ask.
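Suggestions 1 and 2 above translate directly into any scripting language. Here is a rough Python sketch of the same two ideas, a can't-skip-it prompt loop and a header row written only when the file is new; the field names and prompt labels are made up for the example:

```python
import csv
import os

FIELDS = ["first_name", "last_name", "ssn"]  # assumed field list

def prompt_nonblank(label, ask=input):
    """Suggestion 1: repeat until the user supplies a non-empty value,
    so a field can never be skipped."""
    value = ""
    while not value.strip():
        value = ask(f"{label}: ")
    return value.strip()

def append_record(path, record, fields=FIELDS):
    """Suggestion 2: append one record, writing a header row first
    if (and only if) the file does not exist yet."""
    is_new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        if is_new_file:
            writer.writeheader()
        writer.writerow(record)
```

The `ask` parameter exists only so the prompt loop can be driven by something other than the keyboard (useful for testing); in an interactive macro it would simply be the user prompt.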
kevin Posted August 2, 2005

Cory's suggestions are good and I agree. I have a couple of further comments.

On suggestion 3 ("Test the string vars for commas..."): if you wish, you can simply surround every field with double quotes. This eliminates the need to test for commas. What I do is leave the double quotes off if a field cannot possibly contain a comma; if any field could, I put them on for all fields.

On suggestion 4 ("You will need to generate a CR+LF..."): the Variable Modify String: Append %T1% to Text File command has an option to append the CR/LF for you.
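The quote-everything approach kevin describes corresponds to what Python's `csv` module calls `QUOTE_ALL`, versus the default `QUOTE_MINIMAL` (quote only fields that need it). A small sketch contrasting the two; note that, like the ME append option kevin mentions, `csv.writer` also appends the line terminator for you:

```python
import csv
import io

def csv_line(values, quote_all=False):
    """Render one CSV line. With quote_all=True every field is wrapped in
    double quotes (kevin's approach); otherwise only fields containing a
    comma (or quote/newline) are quoted."""
    style = csv.QUOTE_ALL if quote_all else csv.QUOTE_MINIMAL
    buf = io.StringIO()
    csv.writer(buf, quoting=style).writerow(values)
    return buf.getvalue().rstrip("\r\n")
```

So `csv_line(["Smith, Jr.", "Jim"])` gives `"Smith, Jr.",Jim`, while passing `quote_all=True` gives `"Smith, Jr.","Jim"`.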
Cory Posted August 2, 2005

You're right, of course, about the CR/LF, and it points out a gap in my explanation which I would like to address so I don't look too ignorant. <G> The reason I mentioned it is that I often accumulate the entire result (everything that will be written to the file) in a string variable. That way I'm not constantly hitting the file; I write it all at the end of the session. This is especially nice when working over a network where multiple users are accessing the file. If the file is local and only used by one person, there's probably no benefit, so your suggestion is much cleaner.

This reminds me of another suggestion for HeyJim, though: he should accumulate at least one complete record at a time before writing it to the CSV file. He wouldn't need my CR+LF subroutine for this, but it will ensure that if the macro is interrupted he won't have a partial field set for the record. That could be disastrous if the macro were started up again and began writing to the end of the last, incomplete line. As long as the users complete all the entries it wouldn't be a problem, but if you use validation routines there is a good chance that a user will just bail when they don't pass one of the validation rules.
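Cory's accumulate-then-write idea, in rough Python terms: buffer only complete records in memory and touch the file once per session, so an interrupted macro can never leave a partial line on disk. This is an illustrative sketch, not anything from Macro Express itself:

```python
class SessionBuffer:
    """Accumulate whole CSV lines in memory; hit the file once at the
    end of the session (Cory's string-variable approach)."""

    def __init__(self):
        self.lines = []

    def add_record(self, values):
        # Only a fully collected record ever reaches the buffer, so an
        # interruption mid-entry leaves the file untouched.
        self.lines.append(",".join(values))  # naive join; assumes no commas in values

    def flush(self, path):
        """Write every buffered record in one append, CR+LF terminated."""
        with open(path, "a", newline="") as f:
            f.write("\r\n".join(self.lines) + "\r\n")
        self.lines = []
```

The single `flush` at the end is also what makes the shared-network-file case friendlier: the file is held open briefly once, instead of once per field.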
kevin Posted August 2, 2005

These are good points. Depending on the specific application, I may even go one step further: it might be wise to write everything to a temporary file and only use it if the macro completes successfully. Your point about incomplete records is valid, but it could also be a problem if each record is complete yet one or more records are missing.
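kevin's temporary-file idea can be sketched like this in Python: the session writes to a scratch file, and only a cleanly finished session gets folded into the master CSV. The paths and the `success` flag are illustrative assumptions:

```python
import os

def merge_session(master_path, session_path, success):
    """Fold the temporary session file into the master CSV only if the
    session finished cleanly; otherwise discard it, so neither partial
    records nor a partial batch of records ever reaches the master."""
    if success:
        with open(session_path, newline="") as src, \
             open(master_path, "a", newline="") as dst:
            dst.write(src.read())
    os.unlink(session_path)  # the scratch file is removed either way
```

This also covers kevin's missing-records case: an aborted session contributes nothing at all, rather than a complete-looking but short batch.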