11/11/10
Wrong PO Number
Fortunately, there is help on the way! I have developed a script, based on a previous post, that will scan for the PONUMBER field in all tables in the company database. The script automatically produces another script in the Results pane that can be copied, pasted into a new Query window, and executed against the company database.
The following example shows the script; the new PO number (@newponumber) and old PO number (@oldponumber) variables make it easy for the person executing the change to plug in the correct values.
DECLARE @newponumber char(25), @oldponumber char(25)
SET @newponumber = 'PO1023'
SET @oldponumber = 'PO1001'
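-- build an UPDATE statement for every user table (xtype = 'U') that has a
-- PONUMBER column and contains at least one row (sysindexes.rowcnt <> 0)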
SELECT DISTINCT 'UPDATE ' + RTRIM(objs.name) + ' SET PONUMBER = ''' + RTRIM(@newponumber) + ''' WHERE PONUMBER = ''' + RTRIM(@oldponumber) + ''''
FROM syscolumns cols
INNER JOIN sysobjects objs ON (cols.id = objs.id)
INNER JOIN sysindexes indx on (cols.id = indx.id)
WHERE (cols.name = 'PONUMBER') and (objs.xtype = 'U') and (indx.rowcnt <> 0)
When this script is executed against plain vanilla GP v10, it produces the following results:
UPDATE POP10100 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
UPDATE POP10110 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
UPDATE POP10310 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
UPDATE POP10500 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
UPDATE POP30100 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
UPDATE POP30110 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
UPDATE POP30310 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
UPDATE POP40100 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
UPDATE SOP60100 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
(9 row(s) affected)
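One small addition of my own (not part of the original post): since the generated statements update the Purchase Order Processing tables directly, it is worth wrapping them in a transaction so the change can be rolled back if the affected row counts look wrong.
BEGIN TRANSACTION
UPDATE POP10100 SET PONUMBER = 'PO1023' WHERE PONUMBER = 'PO1001'
-- ...paste the remaining generated statements here...
-- review the row counts, then run COMMIT TRANSACTION (or ROLLBACK TRANSACTION to undo)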
http://dynamicsgpblogster.blogspot.com/2008/04/wrong-po-number.html
11/9/10
GL Reconciliation SQL Script – Payables
This script can be used as a base to reconcile the payables sub-ledger balance to the general ledger as of a certain date.
DECLARE @ASOFDATE AS DATETIME
SET @ASOFDATE = '2017-04-12'
SELECT Z.*
FROM ( SELECT A.[OPENYEAR] AS TRXYEAR,
A.[ORTRXSRC] AS APAUDITTRAIL,
A.[ORMSTRID] AS APVENDORNO,
A.[ORCTRNUM] AS APDOCUMENTNO,
A.[ORTRXTYP] AS APDOCUMENTTYPE,
A.[TRXDATE] AS TRANSACTIONDATE,
RTRIM(B.[ACTNUMST]) AS GPACCOUNTNO,
SUM([DEBITAMT] - [CRDTAMNT]) AS GLAMOUNT
FROM [dbo].[GL20000] A
INNER JOIN GL00105 B ON A.[ACTINDX] = B.[ACTINDX]
WHERE A.[ACTINDX] IN ( SELECT [PMAPINDX]
FROM [dbo].[PM00200] )
GROUP BY A.[OPENYEAR],
A.[ORTRXSRC],
A.[ORMSTRID],
A.[ORCTRNUM],
A.[ORTRXTYP],
A.[TRXDATE],
B.[ACTNUMST]
UNION
SELECT A.[HSTYEAR] AS TRXYEAR,
A.[ORTRXSRC] AS APAUDITTRAIL,
A.[ORMSTRID] AS APVENDORNO,
A.[ORCTRNUM] AS APDOCUMENTNO,
A.[ORTRXTYP] AS APDOCUMENTTYPE,
A.[TRXDATE] AS TRANSACTIONDATE,
RTRIM(B.[ACTNUMST]) AS GPACCOUNTNO,
SUM([DEBITAMT] - [CRDTAMNT]) AS GLAMOUNT
FROM [dbo].[GL30000] A
INNER JOIN GL00105 B ON A.[ACTINDX] = B.[ACTINDX]
WHERE A.[ACTINDX] IN ( SELECT [PMAPINDX]
FROM [dbo].[PM00200] )
GROUP BY A.[HSTYEAR],
A.[ORTRXSRC],
A.[ORMSTRID],
A.[ORCTRNUM],
A.[ORTRXTYP],
A.[TRXDATE],
B.[ACTNUMST]
) Z
WHERE Z.[TRANSACTIONDATE] <= @ASOFDATE
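As a companion to the detail query above, here is a sketch of my own (built from the same tables and columns) that rolls the GL side up to a single balance per payables account as of the cutoff date, which is handy when comparing against the payables trial balance:
DECLARE @ASOFDATE AS DATETIME
SET @ASOFDATE = '2017-04-12'
SELECT RTRIM(B.[ACTNUMST]) AS GPACCOUNTNO,
SUM(A.[DEBITAMT] - A.[CRDTAMNT]) AS GLBALANCE
FROM ( SELECT [ACTINDX], [TRXDATE], [DEBITAMT], [CRDTAMNT] FROM [dbo].[GL20000]
UNION ALL
SELECT [ACTINDX], [TRXDATE], [DEBITAMT], [CRDTAMNT] FROM [dbo].[GL30000] ) A
INNER JOIN GL00105 B ON A.[ACTINDX] = B.[ACTINDX]
WHERE A.[ACTINDX] IN ( SELECT [PMAPINDX] FROM [dbo].[PM00200] )
AND A.[TRXDATE] <= @ASOFDATE
GROUP BY B.[ACTNUMST]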
http://cvakumar.com/msdynamics/2010/11/07/gl-reconciliation-sql-script-payables/
5/12/10
SQL Server APPLY operator
As I've said before, rarely a day goes by when I don't say "D'oh!" in surprise (and embarrassment) at being unaware of a tool or technology that I probably should have known about, but just didn't.
The SQL Server APPLY operator is one of those gems that I should have known about 5 years ago. I really need to start buying those "What's New In SQL Server" books, and actually reading them.
Imagine this Microsoft Dynamics GP scenario: You need to query vendors and include the vendor's PRIMARY address information, as well as their REMIT TO address information. But it's not quite that simple (it never is).
You find that the client has inconsistent vendor address IDs in their database. Sometimes they just use the address ID of "REMIT", sometimes "REMIT TO" and sometimes "REMIT ADDRESS", among others. And naturally, there are some vendors who don't have a separate REMIT TO address at all. In theory, some vendors may have two or more remit addresses, so you have to handle that situation as well--in which case you only want the TOP 1 remit address record.
While trying to figure out how to write such a query, I came across this article that discussed the OUTER APPLY operator.
Think of it as an OUTER JOIN, but without needing the ON keyword and subsequent matching columns from the two tables. It's also a bit like a subquery, but with the flexibility of a JOIN.
In short, it will make your inner SQL geek salivate.
Here is an example query that retrieves basic vendor information, along with the TOP 1 primary and remit to address records, allowing me to reference the fields in my query through a derived table.
Note that I used the OUTER APPLY version, since I want to return the vendor information even if there is no primary or remit to address. If you only want to return records that have a match in your apply operation, you can use CROSS APPLY, which works like an INNER JOIN. (hence the two scoops of yummy goodness)
SELECT
RTRIM(v.VENDORID) AS VENDORID,
RTRIM(v.VENDNAME) AS VENDNAME,
RTRIM(ISNULL(vpa.VNDCNTCT, '')) AS CONTACT,
RTRIM(ISNULL(vpa.ADDRESS1, '')) AS ADDRESS1,
RTRIM(ISNULL(vpa.ADDRESS2, '')) AS ADDRESS2,
RTRIM(ISNULL(vpa.CITY, '')) AS CITY,
RTRIM(ISNULL(vpa.STATE, '')) AS STATE,
RTRIM(ISNULL(vpa.ZIPCODE, '')) AS ZIPCODE,
RTRIM(ISNULL(vra.ADDRESS1, '')) AS REMITADDRESS1,
RTRIM(ISNULL(vra.ADDRESS2, '')) AS REMITADDRESS2,
RTRIM(ISNULL(vra.CITY, '')) AS REMITCITY,
RTRIM(ISNULL(vra.STATE, '')) AS REMITSTATE,
RTRIM(ISNULL(vra.ZIPCODE, '')) AS REMITZIPCODE,
RTRIM(ISNULL(vpa.PHNUMBR1, '')) AS PHONE1,
RTRIM(ISNULL(vpa.PHNUMBR2, '')) AS PHONE2,
RTRIM(ISNULL(vpa.FAXNUMBR, '')) AS FAX
FROM PM00200 v
OUTER APPLY (SELECT TOP 1 * FROM PM00300 vpa WHERE vpa.VENDORID = v.VENDORID AND vpa.ADRSCODE LIKE '%PRIMARY%') AS vpa
OUTER APPLY (SELECT TOP 1 * FROM PM00300 vra WHERE vra.VENDORID = v.VENDORID AND vra.ADRSCODE LIKE '%REMIT%') AS vra
I told you it was tasty! Bon Appétit!
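For comparison, here is a quick sketch of my own (not from the original post) using the CROSS APPLY flavor mentioned above; it returns only the vendors that actually have a REMIT-style address ID on file:
SELECT
RTRIM(v.VENDORID) AS VENDORID,
RTRIM(vra.ADRSCODE) AS REMITADDRESSID,
RTRIM(vra.ADDRESS1) AS REMITADDRESS1
FROM PM00200 v
CROSS APPLY (SELECT TOP 1 * FROM PM00300 vra WHERE vra.VENDORID = v.VENDORID AND vra.ADRSCODE LIKE '%REMIT%') AS vra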
Here are a few more articles on the topic:
http://msdn.microsoft.com/en-us/library/ms175156.aspx
http://www.sqlteam.com/article/using-cross-apply-in-sql-server-2005
http://decipherinfosys.wordpress.com/2007/10/08/apply-operator-in-sql-server-2005/
FROM: http://dynamicsgpland.blogspot.com/2010/05/sql-server-apply-operator.html
4/1/10
Formatting Dynamics GP Phone Numbers in a SQL Query
By default, Dynamics GP stores phone numbers as an unformatted 14-character string of digits: a 10-digit phone number followed by a 4-digit extension, with an empty value stored as all zeroes. This format is based on the GP Format Definition assigned to phone number fields in GP.
I recently had to write a SQL query that would be used to export vendor information to a CSV file. Because the system receiving this data has a free-form phone number field, we wanted to make sure that the phone number data coming from GP was formatted consistently.
I'm sure there are several other ways to do this, but this is the script I created to put the phone numbers in US phone format. It's pretty basic, but it gets the job done: a stored value of '42555512340000', for example, comes out as '(425) 555-1234'.
SELECT
CASE RTRIM(PHNUMBR1)
WHEN '00000000000000' THEN ''
WHEN '' THEN ''
ELSE '(' + SUBSTRING(PHNUMBR1, 1, 3) + ') ' + SUBSTRING(PHNUMBR1, 4, 3) + '-' + SUBSTRING(PHNUMBR1, 7, 4)
+ CASE WHEN RIGHT(RTRIM(PHNUMBR1), 4) <> '0000' THEN ' x' + RIGHT(RTRIM(PHNUMBR1), 4) ELSE '' END
END AS PHONE
FROM PM00200
If you have to store international phone numbers in GP, it probably makes sense to edit the "Format" resource in Modifier to use a Fill of Space and no format string.
Or if anyone has any other tips or tricks on querying and formatting GP phone numbers, post a comment!
10/14/09
Simple Table Backups with T-SQL
The following will select all of the data in SOP10100 into a new SOP10100_Bkup_03012009 table:
select *
into SOP10100_Bkup_03012009
from SOP10100
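A quick sanity check of my own (not from the original post): compare row counts between the live table and the backup right after the copy.
select
(select count(*) from SOP10100) as LiveRows,
(select count(*) from SOP10100_Bkup_03012009) as BackupRows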
Remember that inserts, updates, and deletes often fire off triggers or other logic that can alter data in other tables. This won't back up those dependent tables, so be mindful that a simple update to one table could affect data in many others.
http://mbsguru.blogspot.com/2009/03/simple-table-backups-with-t-sql.html
Custom Business Alerts
This alert, if scheduled to run periodically, will e-mail a list of users that have been logged into GP for longer than 12.5 hours (750 minutes). It's pretty simple!
IF EXISTS
(
select datediff(mi,logindat+logintim, getdate()) as DURATION,--convert(datetime, convert(varchar(15), GetDate(), 114), 114) - LOGINTIM as DURATION,
USERID,
CMPNYNAM,
LOGINDAT,
LOGINTIM
from DYNAMICS.dbo.ACTIVITY
where datediff(mi,logindat+logintim, getdate()) > 750
)
BEGIN
DECLARE @SQL varchar(8000)
SET @SQL = 'select datediff(mi,logindat+logintim, getdate()) as DURATION,
USERID,
CMPNYNAM,
LOGINDAT,
LOGINTIM
from DYNAMICS.dbo.ACTIVITY
where datediff(mi,logindat+logintim, getdate()) > 750'
print @SQL
EXEC master.dbo.xp_sendmail @recipients = 'youralias@yourcompany.com',
@subject = 'Users Logged in beyond limit',
@message = 'Attached is a list of users that have been logged in beyond the limit',
@query = @SQL,
@attach_results = 'TRUE',
@width = 250
END
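A note from me rather than the original author: xp_sendmail depends on the old SQL Mail feature, which was deprecated and later removed from SQL Server. On newer versions, the same alert can be sent with Database Mail by swapping the xp_sendmail call inside the block for something like the sketch below; the profile name is an assumption and must match a configured Database Mail profile.
EXEC msdb.dbo.sp_send_dbmail
@profile_name = 'GP Alerts', -- hypothetical Database Mail profile name
@recipients = 'youralias@yourcompany.com',
@subject = 'Users Logged in beyond limit',
@body = 'Attached is a list of users that have been logged in beyond the limit',
@query = @SQL,
@attach_query_result_as_file = 1,
@query_result_width = 250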
http://mbsguru.blogspot.com/2009/02/custom-business-alerts.html
Who has that record locked?
Run this against your company database to see which users have SOP documents locked:
select distinct l.row_id, t.dex_row_id, t.SOPNUMBE, a.USERID
from tempdb.dbo.Dex_Lock l
inner join SOP10100 t
on l.row_id = t.dex_row_id
inner join DYNAMICS.dbo.Activity a
on l.session_id = a.SQLSESID
where right(rtrim(table_path_name), 8) = 'SOP10100'
http://mbsguru.blogspot.com/2007/12/who-has-that-record-locked.html
SQL Nugget - shortcut to create INSERT into
A slick cat we'll call Mo gave this to me. I can't take credit for it.
set nocount on

DECLARE @sTableName varchar(128)
set @sTableName = '%TableName%'

select @sTableName as sTableName into #tmpTableName

DECLARE @lTableID int
set @lTableID = Null
SELECT @lTableID = [ID] from sysobjects where (objectproperty(id, N'IsTable') = 1) and (id = object_id(@sTableName))

if (@lTableID is Null)
begin
    print 'Table not found! Aborting.'
    return
end

SELECT [name], xtype, prec, scale, colorder, isnullable into #tmpColumns from syscolumns where ([id] = @lTableID) and (colstat = 0) order by colorder

alter table #tmpColumns ADD
    lRowID int not null identity,  -- add an identity column to the temp table
    sDefault varchar(40)           -- add a column to store the default that we want to enter for new rows
GO

UPDATE #tmpColumns set sDefault =
    case xtype
        when 34  then ''' '''          -- image
        when 35  then ''' '''          -- text
        when 36  then Null             -- unique identifier
        when 48  then '0'              -- tinyint
        when 52  then '0'              -- smallint
        when 56  then '0'              -- int
        when 58  then '''1/1/1900'''   -- smalldatetime
        when 59  then '0.0'            -- real
        when 60  then '0'              -- money
        when 61  then '''1/1/1900'''   -- datetime
        when 62  then '0.0'            -- float
        when 99  then ''' '''          -- ntext
        when 104 then '0'              -- bit
        when 106 then '.00'            -- decimal
        when 108 then '0.0'            -- numeric
        when 122 then '0.0'            -- smallmoney
        when 165 then '0'              --'convert(varbinary, '' '')' -- varbinary
        when 167 then ''' '''          -- varchar
        when 173 then '0'              --'convert(binary, '' '')' -- binary
        when 175 then ''' '''          -- char
        when 189 then Null             -- timestamp
        when 231 then ''' '''          -- nvarchar
        when 239 then ''' '''          -- nchar
    end

DELETE #tmpColumns where sDefault is Null
--select sDefault, xtype, [name] from #tmpColumns order by colorder, lRowID

DECLARE cur insensitive scroll cursor for select sDefault, [name] from #tmpColumns order by colorder, lRowID
OPEN cur

declare @sDefault varchar(40), @sName varchar(128)
declare @sWork varchar(200)
declare @sWork2 varchar(100)

select top 1 @sWork2 = sTableName from #tmpTableName
print 'INSERT into ' + @sWork2
print ' ('

FETCH first from cur into @sDefault, @sName
WHILE ( @@fetch_status = 0 )
begin
    set @sWork = char(9) + @sName
    FETCH next from cur into @sDefault, @sName
    -- if the fetch is good, add a ',' to the end
    if ( @@fetch_status = 0 ) set @sWork = @sWork + ','
    print @sWork
end
print ' )'

print 'select'

FETCH first from cur into @sDefault, @sName
WHILE ( @@fetch_status = 0 )
begin
    set @sWork = char(9) + @sDefault
    set @sWork2 = @sName
    -- get the next record from the cursor
    FETCH next from cur into @sDefault, @sName
    -- if the fetch is good, add a ',' to the end
    if ( @@fetch_status = 0 ) set @sWork = @sWork + ','
    set @sWork = @sWork + char(9) + '-- ' + @sWork2
    print @sWork
end

CLOSE cur
DEALLOCATE cur

DROP table #tmpColumns
DROP table #tmpTableName
http://mbsguru.blogspot.com/2006/04/sql-nugget-shortcut-to-create-insert.html
Find tables, with data, that have a specific column
Anyway, I had to figure out which tables to update so I wrote a query that would return to me all of the tables, with data, that have an ITEMNMBR column:
select distinct o.Name
from SysColumns c
inner join SysObjects o
on c.id = o.id
inner join SysIndexes i
on c.id = i.id
where c.name = 'ITEMNMBR'
and o.xtype = 'u'
and rowcnt <> 0
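Taking it one step further (my own twist, following the same pattern as the PO number script earlier in this collection), you can have the query generate SELECT statements so you can inspect the matching rows in each table; the item number used here is just a placeholder:
select distinct 'SELECT * FROM ' + rtrim(o.Name) + ' WHERE ITEMNMBR = ''ITEM001'''
from SysColumns c
inner join SysObjects o on c.id = o.id
inner join SysIndexes i on c.id = i.id
where c.name = 'ITEMNMBR'
and o.xtype = 'u'
and rowcnt <> 0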
http://mbsguru.blogspot.com/2007/12/find-tables-with-data-that-have.html
Deleting Empty Batches in SOP
DECLARE @INTERID varchar(10),
@CMPNYNAM varchar(31)
SET @INTERID = DB_Name()
SELECT @CMPNYNAM = CMPNYNAM from DYNAMICS.dbo.SY01500 where INTERID = @INTERID
DELETE SY00500
where BCHSOURC = 'Sales Entry'
and BACHNUMB not in (select BACHNUMB from SOP10100)
and BACHNUMB not in (select BACHNUMB from DYNAMICS.dbo.SY00800 where CMPNYNAM = @CMPNYNAM and TRXSOURC = 'Sales Transaction Entry')
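A pre-check of my own before running the DELETE: the same criteria turned into a SELECT, so you can review which batches will be removed (run it in the same batch so @CMPNYNAM is populated).
SELECT BACHNUMB, BCHSOURC
FROM SY00500
where BCHSOURC = 'Sales Entry'
and BACHNUMB not in (select BACHNUMB from SOP10100)
and BACHNUMB not in (select BACHNUMB from DYNAMICS.dbo.SY00800 where CMPNYNAM = @CMPNYNAM and TRXSOURC = 'Sales Transaction Entry')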
http://mbsguru.blogspot.com/2007/11/deleting-empty-batches-in-sop.html
9/2/09
SQL SERVER – Insert Data From One Table to Another Table – INSERT INTO SELECT – SELECT INTO TABLE
The following three questions are asked on this blog all the time.
How to insert data from one table to another table efficiently?
How to insert data from one table into another table using a WHERE condition?
How can I stop using a cursor to move data from one table to another?
There are two different ways to insert data from one table into another. I strongly suggest using either of these methods rather than a cursor; the performance of both is far superior. I prefer Method 1, as it works in every case.
Method 1 : INSERT INTO SELECT
This method is used when the table already exists in the database and data is to be inserted into it from another table. If the columns listed in the INSERT clause and the SELECT clause are the same, listing them is not required, but I always list them for readability and scalability purposes.
USE AdventureWorks
GO
----Create TestTable
CREATE TABLE TestTable (FirstName VARCHAR(100), LastName VARCHAR(100))
----INSERT INTO TestTable using SELECT
INSERT INTO TestTable (FirstName, LastName)
SELECT FirstName, LastName
FROM Person.Contact
WHERE EmailPromotion = 2
----Verify that Data in TestTable
SELECT FirstName, LastName
FROM TestTable
----Clean Up Database
DROP TABLE TestTable
GO
Method 2 : SELECT INTO
This method is used when the table does not exist yet and needs to be created as the data from one table is inserted into it. The new table is created with the same data types as the selected columns.
USE AdventureWorks
GO
----Create new table and insert into table using SELECT INTO
SELECT FirstName, LastName
INTO TestTable
FROM Person.Contact
WHERE EmailPromotion = 2
----Verify that Data in TestTable
SELECT FirstName, LastName
FROM TestTable
----Clean Up Database
DROP TABLE TestTable
GO
Both of the above methods work with temporary tables (global and local). If you want to insert multiple rows using only one INSERT statement, refer to the article SQL SERVER – Insert Multiple Records Using One Insert Statement – Use of UNION ALL.
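A rough sketch of what that referenced article covers (my example, reusing the TestTable from Method 1):
INSERT INTO TestTable (FirstName, LastName)
SELECT 'John', 'Doe'
UNION ALL
SELECT 'Jane', 'Smith'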
8/3/09
Monitor Windows User Log In and Log Off without software
This is a great little batch file that enables an admin to keep a log of when users log on, which computer they log on to, and when they log off.
1. Set up a Share on the network.
The batch file is actually two files. One runs at user log on and one runs at user log off. When the batch files run, they create a rolling log file with the details in a shared network folder.
1) Create a shared folder on the network. Mine is called Logs. Everyone should have full access to this share.
2) In that folder, create a folder called User and a folder called Computer.
2. Create the batch files.
Copy and Paste the following into two separate text files. When you save them, remember to change the file type to .bat
Name: Log On.bat
rem The following line creates a rolling log file of usage by workstation
echo Log In %Date% %TIME% %USERNAME% >> \\servername\Logs\Computer\%COMPUTERNAME%.log
rem The following line creates a rolling log file of usage by user
echo Log In %Date% %TIME% %COMPUTERNAME% >> \\servername\Logs\User\%USERNAME%.log
Name: Log Off.bat
rem The following line creates a rolling log file of usage by workstation
echo Log Off %Date% %TIME% %USERNAME% >> \\servername\Logs\Computer\%COMPUTERNAME%.log
rem The following line creates a rolling log file of usage by user
echo Log Off %Date% %TIME% %COMPUTERNAME% >> \\servername\Logs\User\%USERNAME%.log
Put these files in the Logs folder.
3. Add the batch files to group policy.
You do this on your User Container GPO.
User Configuration > Windows Settings > Scripts
Add Log In.bat to the Logon scripts box and Log Off.bat to the Logoff scripts box
7/9/09
How to calculate the number of days a vendor check takes to clear?

According to the help file, the field is defined as "View or change the average number of days that a check you issue to the vendor usually takes to clear your bank." In other words, you can manually update it if you need to track this data.
The user also pointed out that they perform daily bank reconciliations, that the Cleared Date column in the CM Transaction table (dbo.CM20200) reflects the date the check actually cleared at the bank, and that they would like to use this field to update the Days for Checks to Clear field in the Vendor Credit Summary window. Normally the Cleared Date column is only updated when the check is marked as cleared during a checkbook reconciliation, so it does not necessarily reflect the actual date the check cleared the bank; with daily bank reconciliations, however, this is not an issue, because the two would be one and the same.
So, if you perform daily bank reconciliations and would like to know the average days a vendor check takes to clear your bank, you could use the following T-SQL query to update this column value:
-- Created by Mariano Gomez, MVP
WITH AVGCTE(VENDORID, AVGDAYS) AS (
SELECT A.CMLinkID, AVG(DATEDIFF(DD, A.TRXDATE, A.clearedate)) AS AVGDAYS
FROM CM20200 A
GROUP BY A.CMLinkID
)
UPDATE PM00201 SET DYCHTCLR = AVGDAYS
FROM PM00201 INNER JOIN AVGCTE ON (PM00201.VENDORID = AVGCTE.VENDORID)
Now what to do with the script? You can schedule it as part of a SQL Server job that runs daily to update this value for you! This will give you accurate average days for each vendor, which in turn allows you to make better purchasing decisions and preserve your hard-earned cash.
6/28/09
You receive error "Someone is clearing company files and you cannot get into this company" when logging into a company - The Clear Data process

In an attempt to replicate this issue, the most logical place to start was to log into Dynamics GP as my 'sa' user, open the Clear Data window, and try to do something there.

Since Clear Data is a destructive process, the Dynamics GP Development team figured they had to find a way to prevent users from accessing the system while this process was being executed by, say, the system administrator. This is how the Clear Data Comp reserved user ID was born. When the window is first accessed by the system administrator, Dynamics GP checks for any users currently logged into the system. If there are no users, it will display the Clear Data window and create a record for the Clear Data Comp user ID in the system database's User Activity table (DYNAMICS.dbo.ACTIVITY). The following query shows the content of the ACTIVITY table.
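A minimal version of that query, assuming the standard ACTIVITY columns referenced elsewhere in these posts:
SELECT USERID, CMPNYNAM, LOGINDAT, LOGINTIM, SQLSESID
FROM DYNAMICS.dbo.ACTIVITY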

If a second user attempts to access the same company, the code will check for the Clear Data Comp user activity record before allowing the user to access the company. If the record is found, the system will issue the error message originally described.
Since this user ID is internal to the application, there are no locks placed or sessions logged in the tempdb..DEX_LOCK or tempdb..DEX_SESSION tables respectively. If you noted the query results, the Clear Data Comp user ID's SQL session ID is 0.
What happens when Dynamics GP crashes in the middle of running Clear Data?
If the system crashes in the middle of the Clear Data process, or SQL Server becomes unavailable, the ACTIVITY record for the Clear Data Comp user ID will not be properly released. Hence, when users attempt to log back into the company they will receive the same error message.
To correct this problem, the system administrator must log into SQL Server Management Studio and remove the record by running the following query:
-- created by Mariano Gomez, MVP
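-- run this against the DYNAMICS (system) database, where the ACTIVITY table lives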
DELETE FROM ACTIVITY WHERE USERID = 'Clear Data Comp'
Hope this helps in your troubleshooting efforts and in understanding another one of those 'old' Dynamics maintenance utilities.
6/1/09
Printer Clear Out VBS Script
' WMI connection setup (not shown in the excerpt above; this is the standard boilerplate the script relies on)
strComputer = "."
Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")

' Stop the Print Spooler service if it is running
Set colServiceList = objWMIService.ExecQuery _
    ("Select * from Win32_Service where Name = 'Spooler'")
For Each objService in colServiceList
    If objService.State = "Running" Then
        objService.StopService()
        Wscript.Sleep 5000
    End If
Next

' Delete every spooled file sitting in the printers folder
Set FileList = objWMIService.ExecQuery _
    ("ASSOCIATORS OF {Win32_Directory.Name='C:\Windows\system32\spool\printers'} Where " _
    & "ResultClass = CIM_DataFile")
For Each objFile In FileList
    objFile.Delete
Next

' Note: the Spooler service is left stopped; restart it (for example with objService.StartService()) once the queue is clear