Posts Tagged ‘SQL’
This time I would like to share something I recently learned "the hard way," naturally on a support case.
First of all, I want to stress the importance of validating your setup, and of confirming that the engineers/partners who were involved in your company's Dynamics GP implementation followed the best practices established by Microsoft and supported by many of my colleagues.
In my case, a customer did a side-by-side upgrade of SQL Server. With this came the issue of not having the DYNSA login in the new SQL instance, so we followed certain processes to make sure DYNSA was the owner of the databases Dynamics GP uses.
But you may ask: who or what is DYNSA? My friend Mariano Gomez has a very complete post about this subject, which you can find here: Mariano's DYNSA Info. Since I am not reinventing the wheel, take a look at Mariano's blog; it's packed with information for all audiences (GP related!).
So while I was working on this customer's DYNSA setup, I suddenly remembered another case where I was having issues with a third party. I jumped into their environment (literally) and started investigating the DB configuration; to my surprise, the owner of the databases was an AD account, not DYNSA. I proceeded to replace the owner, and then certain SQL reports started working and producing results. I am still intrigued as to why, but I will do full research in my spare time.
So, back to the issue I was having for this customer while trying to archive data: just imagine a SOP30300 table with 11 million records and a huge customer base.
The queries were taking countless hours to execute, not to mention the impact on the processor, memory, and of course the user experience.
I learned that the customer had a "maintenance plan" that executed the Shrink process on SQL. As you may know, I am a SQL enthusiast, and I recalled an important blog post from another noted resource, Mr. Pinal Dave, aka "The SQL Authority." Here is his post about why it is BAD to shrink a DB: Shrink is Bad for you… One section of the article explains that shrinking a DB to recover disk space will actually fragment your tables; obviously, to reduce fragmentation you rebuild indexes, which grows the file right back. So this maintenance plan was shrinking to reduce disk space, then rebuilding indexes to improve performance, and the disk space was gone again. Wise words from a mentor who prefers to stay in the shadows: "with current prices on storage, why waste time shrinking when you can focus on performance?"
I ended up tweaking some SQL scripts to automate a SQL job that finds fragmented tables in the DB and maintains them as part of DB maintenance. As I said, "go buy another disk drive and add it to your server, move the logs to this new disk, keep the data apart from the logs, and you will be better off than you are now."
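The job I describe above can be sketched along these lines; this is a minimal illustration, not the exact script I used, and it assumes SQL Server 2005 or later. The 10%/30% thresholds are common rules of thumb, not fixed values.

```sql
-- Sketch: find fragmented indexes in the current DB and reorganize or rebuild them.
-- Run from the Dynamics GP company database; schedule as a SQL Agent job step.
DECLARE @sql nvarchar(max);
DECLARE frag_cursor CURSOR FOR
    SELECT 'ALTER INDEX [' + i.name + '] ON [' + s.name + '].[' + o.name + '] '
         + CASE WHEN ps.avg_fragmentation_in_percent > 30
                THEN 'REBUILD'      -- heavy fragmentation: full rebuild
                ELSE 'REORGANIZE'   -- light fragmentation: cheaper online fix
           END
    FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') ps
    JOIN sys.indexes i ON i.object_id = ps.object_id AND i.index_id = ps.index_id
    JOIN sys.objects o ON o.object_id = ps.object_id
    JOIN sys.schemas s ON s.schema_id = o.schema_id
    WHERE ps.avg_fragmentation_in_percent > 10
      AND i.name IS NOT NULL;       -- skip heaps, which have no index name
OPEN frag_cursor;
FETCH NEXT FROM frag_cursor INTO @sql;
WHILE @@FETCH_STATUS = 0
BEGIN
    EXEC sp_executesql @sql;
    FETCH NEXT FROM frag_cursor INTO @sql;
END
CLOSE frag_cursor;
DEALLOCATE frag_cursor;
```

As always, test this in a non-production copy first and run it in a maintenance window; rebuilds take locks and generate log activity.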
I hope my experience helps you for a better community.
Until my next post
Francisco G. Hillyer
Customers are truly surprised when I show them how Integration Manager can pull information from one company, or even better, from another SQL server, into Dynamics GP as a source.
Yes, I know you are surprised too that I am telling you your source can be a SQL query script…
We have many ways of sorting data in Integration Manager; we even have filters and sort fields on the source definition. But nothing beats the customer's frustration when they have a sorted file and Integration Manager scrambles it, then they open up the distributions, for example on a JE import, and see that the order is not even close to what the source file has.
One important thing: there is a setting in the IntegrationManager.ini file called UseOptimizedFilter. Normally I see this option with the value True.
When this option is enabled, Integration Manager uses a series of ninja techniques to optimize the data upload but that has an impact on the sort order.
If you want the distributions, or your transactions, in the order you have them in your source file, close Integration Manager, go to this file located in the Integration Manager code folder, and set this setting to False.
Try your integration once again and voila! It's just like magic; everything is the way you had it in your source file.
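For reference, the entry in IntegrationManager.ini looks something like the fragment below. The section header shown here is only illustrative; section names vary between versions, so search the file for the setting name itself.

```ini
; IntegrationManager.ini - section name may differ in your version;
; search the file for UseOptimizedFilter.
[IMGlobal]
UseOptimizedFilter=False
```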
I am hoping to save techs from headaches here, and if this tip helps you, let us know, we want to know if our work can actually influence a better Dynamics community.
Lately, for many of our customers, the Connect section has not been working as expected; for a few it's just an ugly Error 404: File or directory not found.
Or it simply gets in the way of Integration Manager. Even though many colleagues like Mariano Gomez, Mark Polino, David Musgrave, Leslie Vail, and others have provided help to the community on these issues, one question remained unanswered: how do you remove it en masse, without the administrator logging into each account?
Famous engineer Tami Jones from the RoseASP team came up to my beachfront office with a wonderful view of the Pacific Ocean (OK, it's a 3-wall cubicle with no view) and asked me how we could automate this process, since, as you know, we team up with myGPcloud as well.
So I decided to take a look at the homepage behavior and came up with the following descriptions:
SectionID value 1 – the To Do section
SectionID value 2 – the Outlook section
SectionID value 3 – the Metrics section
SectionID value 4 – the My Reports section
SectionID value 5 – the Quick Links section
SectionID value 6 – the Connect section
The table responsible for storing the values that display or hide those sections is SY08100 in the DYNAMICS database. So here is the script that will help you remove the Connect section:
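A minimal version of that script might look like this. It assumes SY08100 carries the SectionID values listed above and a visibility flag column; verify the actual column names in your GP version before running, and back up first.

```sql
-- Hide the Connect section (SectionID = 6) on the home page for all users.
-- Run against the DYNAMICS database. Column names assumed; verify in your version.
UPDATE DYNAMICS..SY08100
SET Visible = 0
WHERE SectionID = 6
```

Adding `AND USERID = 'someuser'` to the WHERE clause is the kind of tweak mentioned below to limit the update to one user.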
With a little tweak you can limit the update statement to work only on 1 user or a set of users.
Always remember to backup your database and try statements in a test DB.
Have a great day!!!
A few months ago a customer called saying that whenever he tried to post cash receipts he was getting the following error: A get/change last operation on table doCreditCardAuthorizationActivity cannot find the table.
He said that this was the first time they got the error and they could not continue working.
I did my research and found no mention of that table at all.
Then I asked if any third-party products were installed. He said, "No, the only change is that we updated to SP5"… and that's where I got the clue.
I reviewed the Dynamics.set file and saw an unusual product: 6499, Dynamics Online Services.
I went to “Microsoft Dynamics GP > Tools > Customization > Customization Maintenance”
and disabled Dynamics Online Services, then asked the user to reproduce the steps that gave him the error, and it did not appear.
So I used the following approach to solve the issue:
click on Start > Programs > Microsoft Dynamics > GP 10 > Database Maintenance
I followed the wizard, and when I got to the step to select products, I selected Dynamics Online Services and chose to recreate stored procedures, views, and triggers.
After processing, my understanding was that the error would disappear, but instead we got the following error: A get/change last operation on table doCreditCardAuthorizationActivity failed accessing SQL data.
When I read "failed accessing SQL data," I immediately went to SQL Server and ran the Grant.sql script, and after that, no more errors.
This issue was recently raised on the Dynamics GP online community, and Ian Stewart posted a similar solution as well:
Have all users log out of GP and log in as 'sa'.
Go to Microsoft Dynamics GP > Maintenance > SQL, select DYNAMICS as the database, click the DO Setup table, and check the options Create Table and Create Auto Procedures.
Then select each company and, for each one, recreate tables and auto procedures for the following tables: DO Cash Receipts (History, Open, and Work), DO Credit Card Services Activity, DO Credit Card Setup, DO Sales Payment Work and History.
Click process and that’s it.
Francisco G. Hillyer
In today's post I want to share a SQL script that I consider very important. Lately I have worked on cases where I witnessed how other partners/customers use SQL to remove stranded users.
A few do this task from inside GP, others run the following code:
Delete from Activity where userid = 'XXXX'
But let's analyze this: they are not really removing a stranded user and all of the activity associated with that user. What they are really doing is just removing the record from the ACTIVITY table so that Dynamics GP does not see the user. But what happens if they remove the wrong user ID via SQL or GP? Or if the user is logged into two or more different companies at the same time?
Does the message "The selected document is being edited by another user" sound familiar?
Let me explain how this SQL script works, and then I'll provide it. Remember, with this script you don't need the users to log off from Dynamics GP.
SQL Server maintains a record of active user sessions in the SYSPROCESSES table of the MASTER database. Inside sysprocesses there is a column named loginame, and we base our script on sysprocesses to clear the ACTIVITY table in the DYNAMICS database.
Once the ACTIVITY table has been cleaned out, we are ready to clean two tables in the TEMPDB database: first DEX_SESSION and then DEX_LOCK, in order to eliminate locks and processes in temp tables.
The next step is to clean batch activity (SY00800) and resource activity (SY00801) in order to have a valid session clean up.
Here is the code:
delete from DYNAMICS..ACTIVITY
where USERID not in (select loginame from master..sysprocesses)
delete from tempdb..DEX_SESSION
where session_id not in (select SQLSESID from DYNAMICS..ACTIVITY)
delete from tempdb..DEX_LOCK
where session_id not in (select SQLSESID from DYNAMICS..ACTIVITY)
delete from DYNAMICS..SY00800
where USERID not in (select USERID from DYNAMICS..ACTIVITY)
delete from DYNAMICS..SY00801
where USERID not in (select USERID from DYNAMICS..ACTIVITY)
Note: Make sure you have a backup of your databases before running scripts that modify your data.
I am also including the published KBs that show how to remove records from DEX_LOCK and DEX_SESSION, which together make up the script I just provided.
It started out very vanilla. After all, it doesn't matter the quantity of data… it is all about the quality, right? I only needed current-year employee payroll transactions. My methodology is tried and true: I download the data into Excel, clean it up, save it, map it to Integration Manager, and after a few hit-and-miss tries, the data has been successfully integrated. I can move on to something more exciting.
My client runs about 2,500 payroll checks a week and there is a massive amount of employee turnover. In the legacy system (which is housed in a separate facility from the new implementation) employees were not inactivated and, after several years, there were literally hundreds of thousands of employee master records.
As in Dynamics GP, the data I needed was spread across several tables and had to be exported. No problem, I thought, and dumped out the historical payroll transactions via a SQL table export. I would do the same thing with the entire employee master table, do a VLOOKUP to parse out the employees who hadn't been paid this year, and there you go: I would have what I needed to import into my new GP database.
Who knew there was a limit to how much data Excel could handle? Yes, I know, there is a maximum number of rows, and Excel 2007 can hold a lot! Apparently not enough for my employee master and state tax file. After several attempts (and a lot of swearing) I had the data in separate workbooks. Now to get rid of the employees I did not need. Not so easy. The calculation speed was painfully slow, and while I could import data into all one million rows, Excel couldn't run the formula due to memory constraints (so I am told). Twelve hours later I was completely frustrated and at my wits' end.
This is where the story gets good. My fairy godmother (a.k.a. my boss) tells me to quit pulling my hair out and call Tom Celvi. He is a SQL wizard! I am a bean-counter and application consultant. I have always thought of SQL as something to be respected and a bit feared. After all, it is the “house” for my GP data.
Silly me! There is a whole other world out there that I owe myself to learn more about. Tom has shown me that SQL is a very powerful tool and can be used for other things besides housing my precious GP databases.
Tom created a database within the client's SQL 2008 environment, imported all my raw, ugly data, and through a series of events that I can only explain as pure alchemy, he moved the cleaned-up data into my GP database. I had my data in all the correct payroll tables, along with only the 12,000 current employees!
This took Tom 10 hours in total, and I bet he didn't swear once! I was then able to successfully integrate the current-year payroll transactions via Integration Manager, and my client is live and processing payroll.
Under lessons learned:
- SQL doesn’t have to be feared, it can be your friend (but should be respected)
- Quality over quantity is overrated – strive for both, because Excel does have its limits.
- hang out with a SQL guru, it will save on hair dye
The client thanked me profusely and said they never heard of Microsoft giving anything away for free. I agreed but told them it’s not exactly free as I would charge them for my time which ended up being 3 hours when all was said and done.
Here is the process I used to update the client from MSDE to SQL Server Express Edition 2005:
- Call MBS Sales Ops ((800) 456-0025) and get SQL keys for GP. These should be free but if you don’t have them GP will give you nasty error messages. Do this first. I actually did this last and it took a day to get the stupid keys. When you have the keys go to step 2.
- Download SQL Server Express Management Studio 2005
- Install the SQL Server Express Management Studio (Do this first if you want to use it to create your backups.)
- Make a backup of the company and Dynamics databases
- Run Capture login script on current install of MSDE. See Tech doc 878449. Script is found here.
- Download SQL Server Express Edition 2005
- Install SQL Server Express Edition in mixed mode. (You may need to allow remote connections in SQL setup if it doesn’t default.) You can install SQL in one of two ways: upgrade the existing instance of MSDE (I tried this and it failed on the upgrade), or install a new instance (the default instance name is SQLEXPRESS; call it whatever you like). You will then need to restore the Dynamics and company databases to the new instance.
- Run the results of step 5 against the new instance of SQL to recreate your users.
- Enter into Dynamics the new reg keys that you got in step 1. Use Dynamics Utilities on any version prior to 10.0. On version 10.0 you will need to run this script (delete sy003500), then enter the reg keys in the application. See tech doc 943965.
- Update ODBC to point to new instance of SQL
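For the backup and restore in the steps above, the T-SQL looks something like the sketch below. The paths, logical file names, and backup location are examples only; check your database's logical names with `RESTORE FILELISTONLY` before restoring.

```sql
-- Example only: adjust paths and file names to your environment.
-- On the old MSDE instance:
BACKUP DATABASE DYNAMICS TO DISK = 'C:\Backups\DYNAMICS.bak'

-- On the new SQL Express instance, inspect logical file names first:
RESTORE FILELISTONLY FROM DISK = 'C:\Backups\DYNAMICS.bak'

-- Then restore, relocating the files if the data paths differ:
RESTORE DATABASE DYNAMICS FROM DISK = 'C:\Backups\DYNAMICS.bak'
WITH MOVE 'GPSDYNAMICSDat.mdf' TO 'C:\SQLData\GPSDYNAMICSDat.mdf',
     MOVE 'GPSDYNAMICSLog.ldf' TO 'C:\SQLData\GPSDYNAMICSLog.ldf'
```

Repeat the same backup/restore for each company database.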
This is similar to the procedure of moving SQL to a new server.
Recently we discussed best practices for upgrades in our company. One of the big challenges in an upgrade is installing a large number of Dynamics workstations at the appropriate time with as little user interruption as possible. Installing Dynamics has always been a pain. MBS has approached this issue with their mass deployment tools, but they never work, and it usually ends up being easier to do the installation manually. Depending on the installation, you could spend 15–40 minutes per workstation. Launching the installs simultaneously (three at a time in the same area, for example) may cut the time a bit, but it is still a huge pain.
This led to a series of responses from our technical team. Here is a summary of the response from one of our team members, Jason Young:
One of the tricks I have used many times to minimize downtime for large rollouts is to leverage imaging tools like Acronis True Image. This tool lets you take a picture of a server or servers and redeploy them in a virtualized environment or on dissimilar physical hardware. In essence, you can take the customer's environment back to the office for the test upgrade without risk of downtime or impacting the customer's day-to-day production.
Imaging gives you some big advantages:
- You don’t change the state of the customer’s environment until testing is successful and you are ready for the production rollout. This earns you huge kudos in the eyes of local IT staff.
- You always have a clean rollback image.
- You don’t have to bother local IT staff because you are working with an image offsite.
- You can couple this type of test upgrade with a Terminal Server (which we have and is easy to build) and now you can do your user acceptance testing via web. Again, no customer impact.
For the production rollout, it is always optimal to do a parallel installation, but the obvious drawback is the cost of hardware. For customers without additional server resources, there is another trick that can minimize downtime and customer impact: for customers with a single back-end server (a single SQL server), you install a named instance of SQL and restore the production data to that new named instance.
Here are the advantages:
1. A named instance of SQL means you have to use a different DSN name than the default instance. This means you can pre-install workstations with the new version and point them at the new instance without worry of an accidental upgrade.
2. The current state is not affected so you have a rollback in the event of an upgrade failure. You don’t have to do any type of restore because you never changed the data in the default instance of SQL.
3. You are basically taking a single server and doing a parallel application upgrade. So, it’s an in-place upgrade in terms of hardware but parallel in terms of the application and data.
End of summary.
Discussion: I like the separate-instance-of-SQL approach. I do this with different versions of Dynamics: I have 9.0 on one instance and 10.0 on another. You’d have to be careful that users do not launch and continue working in the older version instead of the newer one, but that can be overcome with training, or by simply stopping the older instance of SQL when the time is right (you probably need to set the older instance to start manually in case of a server reboot).
Anything you do to make this task easier?
Just had someone call wanting to know how much they owed to all their merchandising vendors. They did not have the vendors assigned to a class. I used SQL to look up the vendors and found each vendor had some default accounts listed on the card. I used three queries to get what they needed, after backing up the database in case I messed up. The queries are as follows:
- Select * from GL00100 – this shows the account index number needed to interpret the accounts in the PM00200 table.
- Select * from PM00200 where PMPRCHIX = '61' or PMPRCHIX = '66' – 61 and 66 were the account indexes listed in the first query.
- Update PM00200 set VNDCLSID = 'MERCH' where PMPRCHIX = '61' or PMPRCHIX = '66' – updated each vendor to be included in the MERCH class I set up in GP under Tools >> Setup >> Purchasing >> Vendor Class. In all we updated 400 vendors in a matter of minutes.
The highlighted values (the account indexes 61 and 66, and the class ID MERCH) would need to be changed to fit your needs.
Once this work was done we could run the AP trial balance report with a class ID restriction for MERCH vendors only.
Relatively easy solution to get the customer what they needed.