In my previous blog I talked about how my SQL wizard (Tom Celvi) was able to help migrate masses of data when Excel and Integration Manager could not handle it. I asked him if he would elaborate briefly on his steps. Thanks to Tom for this contribution!
A custom database was created
“I did this so I could keep the initial import data separate from GP and to provide a central point of access for both the GP production and test companies.
This way, I could use the same data and processes for both testing and production deployment without having to re-import the data for both test and production runs.
Additionally, I created a method for capturing the results of each employee import, so we could review failures and successes. I created a single stored procedure to process the data into a company. This stored procedure was developed in the test company, and then deployed and run in the production company once the data import was acceptable.
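Tom doesn't show his results-capture schema, but a minimal sketch of that kind of table might look like the following (the table and column names here are my assumptions, not his actual design):

```sql
-- Hypothetical log table: one row per employee import attempt,
-- so failures and successes can be reviewed after each run.
CREATE TABLE dbo.EmployeeImportResults (
    ImportRunID   INT          NOT NULL,
    EMPLOYID      CHAR(15)     NOT NULL,           -- GP employee ID
    TargetCompany VARCHAR(5)   NOT NULL,           -- test vs. production company
    Succeeded     BIT          NOT NULL,
    ErrorMessage  VARCHAR(255) NULL,
    ProcessedAt   DATETIME     NOT NULL DEFAULT (GETDATE())
);
```

Because the same stored procedure runs against both the test and production companies, a column like TargetCompany lets each run's results be reviewed separately.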
By placing this data into a separate database, I was able to use the actual GP table definitions and names, which provides some data quality checking at the time of data import.”
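One quick way to reproduce a GP table definition in a separate staging database is a `SELECT INTO` with a false predicate. This is my sketch, not necessarily Tom's method; `StagingDB` is a placeholder name, and `TWO` is the GP sample company database:

```sql
-- Copy the structure (columns and types, no rows) of the GP Payroll
-- Employee Master table into the staging database.
USE StagingDB;
SELECT *
INTO   dbo.UPR00100           -- Employee Master
FROM   TWO.dbo.UPR00100
WHERE  1 = 0;                 -- false predicate: structure only, no data
```

Note that `SELECT INTO` does not carry over constraints or defaults; scripting the full DDL from the company database preserves more of the built-in data quality checks Tom mentions.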
A view was built in the custom database
“To grab the employee IDs of any employees that had received checks in the past 6 months.
That view was then used for filtering employees during transaction processing.”
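A hypothetical version of that filtering view might look like this, assuming the payroll check history lives in UPR30100 (Payroll Check History) with EMPLOYID and CHEKDATE columns; verify the table and column names against your GP install:

```sql
-- Employees who received at least one check in the last 6 months;
-- used to filter which employees get transaction data imported.
CREATE VIEW dbo.vEmployeesWithRecentChecks
AS
SELECT DISTINCT EMPLOYID
FROM   TWO.dbo.UPR30100                          -- Payroll Check History
WHERE  CHEKDATE >= DATEADD(MONTH, -6, GETDATE());
```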
The flat file data was uploaded into the custom database
“I just uploaded the flat file data to the actual GP table definition, but in the custom database. This gave me confidence that the file data was usable, since the original definition was able to accept it.”
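One common way to load a flat file straight into a table that carries the GP definition is `BULK INSERT`. This is a sketch under my own assumptions; the file path, delimiters, and header row are placeholders:

```sql
-- Load the flat file into the staging copy of the Employee Master table.
-- If a row violates the GP column types, the load fails, which is the
-- early data-quality check described above.
BULK INSERT StagingDB.dbo.UPR00100
FROM 'C:\import\employees.txt'
WITH (
    FIELDTERMINATOR = '\t',    -- tab-delimited file (assumption)
    ROWTERMINATOR   = '\n',
    FIRSTROW        = 2        -- skip a header row (assumption)
);
```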
The stored procedure used eConnect to insert the data into the production and test databases
“This provided additional system integrity validation (making sure all codes existed and that all data values sent to GP were actually valid).”
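eConnect installs its business-logic stored procedures (such as taCreateEmployee) into each company database, so a SQL-side loop can call them directly. The call below is heavily abbreviated and the parameter values are invented for illustration; taCreateEmployee takes many more parameters, so check the eConnect documentation for the full signature:

```sql
-- Abbreviated sketch: one eConnect call per employee. eConnect raises
-- an error state if any code or value fails its validation.
DECLARE @ErrorState INT,
        @ErrString  VARCHAR(255);

EXEC TWO.dbo.taCreateEmployee
    @I_vEMPLOYID   = 'ACKE0001',         -- sample values only
    @I_vLASTNAME   = 'Ackerman',
    @I_vFRSTNAME   = 'Pilar',
    @I_vEMPLCLAS   = 'HOURLY',
    @O_iErrorState = @ErrorState OUTPUT,
    @oErrString    = @ErrString  OUTPUT;

IF @ErrorState <> 0
    PRINT 'eConnect validation failed: ' + @ErrString;
```

The error-state output is what feeds the per-employee success/failure capture described earlier.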
“One item of note, it appears that eConnect requires the Class ID information to be rolled down manually to the employee level.”
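If eConnect will not roll class defaults down for you, one workaround is to copy the class-level values to each imported employee yourself after the import. This is only a sketch: dbo.EmployeeClassDefaults is a hypothetical stand-in for the actual GP class setup table, and the column names shown are examples, not verified GP fields:

```sql
-- Sketch only: roll class-level settings down to each imported employee.
-- Replace dbo.EmployeeClassDefaults and the column list with the real
-- GP class setup table and the fields your import needs.
UPDATE emp
SET    emp.VacationAccrual = cls.VacationAccrual,   -- example fields
       emp.SickTimeAccrual = cls.SickTimeAccrual
FROM   TWO.dbo.UPR00100 AS emp                      -- Employee Master
JOIN   dbo.EmployeeClassDefaults AS cls
       ON cls.EMPLCLAS = emp.EMPLCLAS;
```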