PR39 Automation

BlackSVT03
Basic Member
Posts: 7
First, I want to say that I have no experience with Lawson. My knowledge consists of what I can make heads or tails of from the training workbooks of some of my colleagues. With that out of the way: I have been assigned the task of completely automating our one-time deductions. What we have now is a data dump from the cafeteria, gift shop, etc. A ProcessFlow reads the file, parses out the data, modifies it, and then creates a CSV file. Payroll then takes the CSV file and uses the Microsoft Add-Ins to upload the data to the ONETMDED table. We were told this is similar to using the PR39 job. How can I eliminate the step of payroll having to use the Add-Ins to upload the data? Thank you for your help.
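For context, that parse-and-reformat step can be sketched in a few lines of Python. The field layout below (pipe-delimited employee|deduction_code|amount) is purely hypothetical; the real dump format and output columns come from the cafeteria/gift-shop systems and the upload spec:

```python
import csv
import io

def build_onetime_ded_csv(dump_text, out_file):
    """Parse a pipe-delimited register dump (hypothetical layout:
    employee|deduction_code|amount) and write the CSV that payroll
    uploads to ONETMDED."""
    reader = csv.reader(io.StringIO(dump_text), delimiter="|")
    writer = csv.writer(out_file)
    # Hypothetical header row - the real columns come from the upload spec.
    writer.writerow(["EMPLOYEE", "DED-CODE", "DED-AMT"])
    for employee, ded_code, amount in reader:
        # Normalize the amount to two decimal places before upload.
        writer.writerow([employee.strip(), ded_code.strip(),
                         f"{float(amount):.2f}"])

# Example run against an inline sample dump:
sample = "1001|CAFE|12.5\n1002|GIFT|7"
buf = io.StringIO()
build_onetime_ded_csv(sample, buf)
print(buf.getvalue())
```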
sleow
Basic Member
Posts: 15
There is a PR539 One Time Deduction conversion program that will do the job for you. I am sure the CSV file is very close to what they are using for the Add-Ins upload. We use it for the same thing. Ours is not automated, but it would be fairly easy to schedule the job if your flow puts the CSV file in the right place.
BarbR
Veteran Member
Posts: 306
You want to build the PR539 conversion file format.
Define a PR539 job with CSV parameters that point to the file you place on the server, and your Payroll Dept staff can run the job to bring the data in.
We have ours fully automated. The cafeteria deduction system (Gempay) outputs the data, a scheduled ProcessFlow converts the Gempay output into the PR539 format and FTPs it to the Lawson server, and the PR Dept staff then run the pre-defined PR539 job whenever they want to pull the data in.
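A rough sketch of that hand-off, assuming a hypothetical PR539 column layout (check the PR539 conversion file spec for the real one) and placeholder server/path names; the FTP step is shown but commented out so the file writer can be exercised on its own:

```python
import csv
import os
import tempfile
# from ftplib import FTP  # uncomment to actually transfer the file

def write_pr539_file(records, path):
    """Write deduction records in a hypothetical PR539-style CSV layout.
    The real column order must come from the PR539 conversion file format."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for rec in records:
            writer.writerow([rec["company"], rec["employee"],
                             rec["ded_code"], rec["amount"]])

def push_to_lawson(path):
    """FTP the finished file to the Lawson server (placeholder names)."""
    # ftp = FTP("lawson-server.example.com")
    # ftp.login("lawsonuser", "secret")
    # with open(path, "rb") as f:
    #     ftp.storbinary("STOR /lawson/conv/pr539.csv", f)
    # ftp.quit()
    pass

records = [{"company": "100", "employee": "1001",
            "ded_code": "CAFE", "amount": "12.50"}]
out_path = os.path.join(tempfile.gettempdir(), "pr539.csv")
write_pr539_file(records, out_path)
```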
LisaN
Veteran Member
Posts: 53
We also have the PR539 automated; it is scheduled to run every Monday morning through 'recdef'. It sounds like you could get your flow modified to work the way BarbR described; just make sure your 'recdef' job is scheduled to run after the file is in place. (We scheduled our PR539 job to run a couple of hours after the file is placed, in case there were problems.)
Thanks
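The delayed start LisaN describes can also be made defensive: have the flow (or a pre-step of the scheduled job) confirm the file exists and was written recently before loading it. A minimal sketch with a placeholder path and threshold:

```python
import os
import time

def file_is_ready(path, max_age_hours=6):
    """Return True only if the conversion file exists and was modified
    within the last max_age_hours, so a stale file left over from a
    previous week is never loaded twice."""
    if not os.path.exists(path):
        return False
    age_seconds = time.time() - os.path.getmtime(path)
    return age_seconds <= max_age_hours * 3600

# Example: guard the Monday-morning run (placeholder path).
if file_is_ready("/lawson/conv/onetmded.csv"):
    print("file is fresh - safe to run PR539")
else:
    print("file missing or stale - skip the load and alert payroll")
```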
BlackSVT03
Basic Member
Posts: 7
Thank you all for your quick responses! PR539 it is.
Shane Jones
Veteran Member
Posts: 460
We have deduction automation, and I write directly to PR14 from a ProcessFlow. Sometimes the 500 jobs just create more complexity (just my opinion): needing another job to run, having it know where the file is going to be, having someone review the output of the 500 job, making sure it still runs after an upgrade, and so on.

I just tell the flow to do an AGS call to update and send an email to the (assigned) owner of the process showing each change made. Mine counts the records in the file and the AGS transactions to make sure it completed them all, and it changes the email subject if any had a result other than "change complete - continue". If you want complete automation, you need to make it so the user gets notified of everything performed.
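The record-count check and subject-line flip could be sketched like this. The result string "change complete - continue" is taken from the post above; the subject wording and function name are illustrative:

```python
def summarize_ags_run(file_record_count, ags_results):
    """Compare the number of input records against the AGS responses
    and build an email subject line that flags any failures."""
    ok = [r for r in ags_results if r == "change complete - continue"]
    failed = len(ags_results) - len(ok)
    missing = file_record_count - len(ags_results)  # records never attempted
    if failed == 0 and missing == 0:
        return f"PR39 load OK: {len(ok)} records applied"
    return f"PR39 load NEEDS REVIEW: {failed} failed, {missing} not attempted"

print(summarize_ags_run(3, ["change complete - continue"] * 3))
print(summarize_ags_run(3, ["change complete - continue",
                            "employee not found"]))
```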

While you indicate you are a novice, you are already using ProcessFlow to create the CSV, so why not use that data to write directly to PR39? Good luck with your process. Email me if you want to see my flow... Shanesmj@yahoo.com
Shane Jones
Tools: HR, Payroll, Benefits, PFI, Smart Office, BSI, Portal and Self-Service
Systems: Lawson, Open Hire, Kronos, Crystal Reporting, SumTotal Learning
** Teach others to fish...
BlackSVT03
Basic Member
Posts: 7
Fully automated with notifications is what I would love to have.

So I have it right now where I create the OTDLOADCSV file and the ProcessFlow drops it in the right directory. I've tested the job through the Portal and everything works great. But now I've moved on to the AGS call, and I am getting a "Security violation" message when testing it. When I check the log, this is the error I am getting:

ERROR [JobRequest] Exception getting RMId for Owner OSId: lawson

com.lawson.lawsec.authen.LSFSecurityAuthenException:Error getting resource string for identifier NO_IDENTITY_FOR_RDID, values {OSId, lawson}. Message: java.util.MissingResourceException: Can't find resource for bundle java.util.PropertyResourceBundle, key NO_IDENTITY_FOR_RDID

Any ideas? Thanks again.

PS: I saw on here that you can test it through a URL, /cgi-lawson/jobrun.exe?FUNC=RUN&USER=&JOB=&OUT=text, and it works.
Shane Jones
Veteran Member
Posts: 460
Same user account? Have you run it both locally and on the server? (Your security error makes me suspect you are running it under an account that does not have the correct security access.)


Since you are using a one-time deduction, it is likely that you are just adding records - consider using this:

_PDL=YOURPRODLINE&_TKN=PR39.1&_EVT=CHG&_RTN=DATA&_LFN=ALL&_TDS=IGNORE&FC=A&COMPANY=&EMP-EMPLOYEE=&LINE-FCr0=A&EMPLOYEEr0=&DED-CODEr0=&DED-AMTr0=&EFFECT-DATEr0=&DED-STATUSr0=C&CHECK-GRPr0=&DED-PRIORITYr0=&CHECK-DESCr0=&_DELIM=%09&_OUT=XML&_EOT=TRUE

I think it adds extra steps to create a ProcessFlow that parses a file just to save it to another file so that it can be used in a job. The flow already parses the data, so you might as well take the action at that point. Then you could create a message builder that collects all of the changes and results, line by line, to include in an email.
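For readability, an AGS string like Shane's PR39.1 template can be assembled from a parameter dict and URL-encoded before it is sent. In this sketch the product line, company, employee, and deduction values are placeholders, and the servlet URL in the comment varies by installation:

```python
from urllib.parse import urlencode

def build_pr39_ags(prodline, company, employee, ded_code, amount, effect_date):
    """Build the PR39.1 AGS query string from Shane's template above.
    _DELIM is a tab (%09) and the r0 suffix marks detail-line fields,
    as in his post; empty values are optional fields left blank."""
    params = {
        "_PDL": prodline, "_TKN": "PR39.1", "_EVT": "CHG",
        "_RTN": "DATA", "_LFN": "ALL", "_TDS": "IGNORE",
        "FC": "A", "COMPANY": company, "EMP-EMPLOYEE": employee,
        "LINE-FCr0": "A", "EMPLOYEEr0": employee,
        "DED-CODEr0": ded_code, "DED-AMTr0": amount,
        "EFFECT-DATEr0": effect_date, "DED-STATUSr0": "C",
        "CHECK-GRPr0": "", "DED-PRIORITYr0": "", "CHECK-DESCr0": "",
        "_DELIM": "\t", "_OUT": "XML", "_EOT": "TRUE",
    }
    return urlencode(params)

query = build_pr39_ags("PROD", "100", "1001", "CAFE", "12.50", "20110606")
print(query)
# The flow would then send this string to the AGS endpoint on your
# Lawson server (exact path depends on your installation).
```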
BlackSVT03
Basic Member
Posts: 7
I am able to run it both locally and on the server under the Lawson login.

I don't know if it's the right way or the wrong way, but I got it to work by changing the web program to "cgi-lawson/jobrun.exe?FUNC=RUN&USER=&JOB=&OUT=text", and it works perfectly.

Now on to further automation issues. I am trying to pull the file we are loading from the server it actually lives on, and I am running into "Command error output: Access is denied". I have tried running it line by line through SysCommand, and I have also put it in a batch file and tried to execute that. Here's the batch file.

rem Quote the UNC path - it contains spaces, so an unquoted pushd fails
pushd "\\dione\f$\automatic interfaces\copy of cbord\archive"
copy /y 06062011a.bak e:\lsftest\gen\bpm
popd
e:
cd \lsftest\gen\bpm
ren 06062011a.BAK CBord06062011.csv

I get the access denied right off the bat with the "pushd \\dione\f$\automatic interfaces\copy of cbord\archive" line. I have made the Lawson account a local admin on the server, I have shared the folder, and I gave Lawson rights. I'm starting to hate the security with this system.
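As an aside, the same copy-and-rename could be done inside the flow with Python's shutil, which sidesteps cmd quoting issues with spaces in the UNC path. The paths in the comment are the ones from the batch file above; note that whatever account executes this still needs read access to the share:

```python
import os
import shutil

def fetch_dump(src, dest_dir, new_name):
    """Copy the register dump from the remote share and give it its
    final name in one step (replaces the pushd/copy/ren batch file)."""
    os.makedirs(dest_dir, exist_ok=True)
    dest = os.path.join(dest_dir, new_name)
    shutil.copyfile(src, dest)
    return dest

# Example with the paths from the batch file:
# fetch_dump(r"\\dione\f$\automatic interfaces\copy of cbord\archive\06062011a.bak",
#            r"e:\lsftest\gen\bpm", "CBord06062011.csv")
```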
BlackSVT03
Basic Member
Posts: 7
Just as an FYI:

We figured out why we were getting "Access is Denied" while accessing another server. Lawson kept telling us to grant permissions to the user that runs the process, which, in our case, was wrong. While checking the Event Viewer (on the server we were pulling the files from), I noticed that a user named "LAWSONWEBTST$" was logging on. The reason is that ProcessFlow must be using the "laserv-lsftest" service, which is set up to log on as "Local System", so when you execute the ProcessFlow it accesses files as the server's machine account. We also had to grant permission along the complete file path: in our case, "automatic interfaces", "copy of cbord", and "archive" all had to be shared with permissions for the server (LAWSONWEBTST).