PHP to Construct cURL Script to Facilitate Data Streaming

Using PHP to construct a cURL script that facilitates data streaming from a secured server connection.
 
INTRODUCTION

In the spring of 2008 I was faced with quite a challenge. The previous year I had created a website for a financial services company that needed to process and display DST Fan Mail, financial data files from DST Systems, Inc. in Kansas City, Missouri, USA. The customer had to manually work through a series of steps to download the data files they needed from the DST Systems, Inc. web server. The website I created would then process the downloaded data to show monetary changes in the customer accounts over daily, weekly, monthly and quarterly time periods. Manually downloading the data was not a real solution. The employees were too busy with other responsibilities to perform this task consistently on a daily basis.

DST Systems, Inc. gave me the basic technical information for connecting to their secured server and streaming their financial data files back to the customer’s website. The problem was that they had no methodology for fully automating the entire process. I was disappointed that DST Systems, Inc. did not offer an in-house guide for fully automating the data file downloads; I had assumed they would have something available for people who wanted to fully automate DST Fan Mail downloads. I did not want to let this company down by expecting them to manually download the DST Fan Mail data files, so the burden was on me to orchestrate a way to fully automate the process.

CREATIVE THINKING IS IMPERATIVE
 
Doing it the conventional way was not going to fly with a project of this magnitude and complexity. I had to think creatively, that is, "outside the box," to get this project working as a fully automated deployment. I decided to implement a series of scripts that would run on the customer website crontab manager. Some would be PHP scripts, while others would be cURL scripts. They would all have to work together as one seamless, fully automated system to connect to the DST Fan Mail secured server and stream data back to the customer’s website. Once the data streaming was complete, the downloaded data would be processed and consolidated for future use via additional automated PHP scripts on the customer’s website.

Here are the names of the scripts I created, listed in their chronological run order on the customer website crontab manager (a sketch of how such a schedule might look follows the list):

  • dst_curl_save_info.sh {first in sequence - cURL}
  • dst_get_file_streaming_inputs.php {second in sequence - PHP}
  • dst_stream_back_file.sh {third in sequence - cURL}
  • import_transaction_data_for_dstfanmail.php {fourth in sequence - PHP}
  • import_transaction_data_for_dstfanmail_second.php {fifth in sequence - PHP}
  • import_transaction_data_for_dstfanmail_third.php {sixth in sequence - PHP}
  • import_transaction_data_for_dstfanmail_fourth.php {seventh in sequence - PHP}
  • import_transaction_data_for_dstfanmail_fifth.php {eighth in sequence - PHP}
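
To make the scheduling concrete, here is a hypothetical crontab layout for this sequence. The run times and paths are placeholders of my own; only the staggered, strictly ordered execution reflects the actual deployment.

  # hypothetical crontab entries - times and paths are placeholders.
  05 2 * * 1-5 /home/generic/dst_curl_save_info.sh
  15 2 * * 1-5 /usr/bin/php /home/generic/dst_get_file_streaming_inputs.php
  25 2 * * 1-5 /home/generic/dst_stream_back_file.sh
  35 2 * * 1-5 /usr/bin/php /home/generic/import_transaction_data_for_dstfanmail.php
  45 2 * * 1-5 /usr/bin/php /home/generic/import_transaction_data_for_dstfanmail_second.php
  55 2 * * 1-5 /usr/bin/php /home/generic/import_transaction_data_for_dstfanmail_third.php
  05 3 * * 1-5 /usr/bin/php /home/generic/import_transaction_data_for_dstfanmail_fourth.php
  15 3 * * 1-5 /usr/bin/php /home/generic/import_transaction_data_for_dstfanmail_fifth.php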

Below I will discuss the workings of each of the eight scripts that constitute this automated process.

A DIGITAL PAPER TRAIL

Before I begin to narrate the code, I would like to point out that much of the programming for this application development project involves updating two data tables in the MySQL database on the customer’s web server. One is a "good audit trail data table" that is updated when an operation in the code completes successfully. The other is a "bad audit trail data table" that is updated when an operation fails. As I explain the software, I will omit referencing most of the audit trail data table posts, since there are so many of them. I placed these in the code because it was important to everyone involved to see what succeeded and what failed in a fully automated processing environment. If there were snafus, we had to be able to refer back to a "digital paper trail" that told everyone where the problem occurred so the appropriate correction could be made. Also, some of the actual file names, login parameters and other customer specific references have been renamed from the original for the sake of the customer’s privacy and security.
 
FIRST SCRIPT TO RUN ON THE CRONTAB MANAGER (cURL)

This cURL script, "dst_curl_save_info.sh", is run to initially make contact with the DST Fan Mail web server in Kansas City, Missouri, USA. This is the "first phase" of the cURL script processing. It will stream back the "transaction id" as well as the list of "zip" files available for download. Each "zip" file contains a data text file of the same name as the "zip" file, but with a "txt" file extension. The "transaction id" and the list of downloadable "zip" files will be streamed back to the customer’s website in separate text files (dstresponse.txt and dstxml.txt) as noted below.

  #!/bin/sh

  # this cURL script queries the DST FAN Mail server in Kansas City, MO, USA
  # to retrieve the "transaction id", returned in the file "dstresponse.txt".
  # the transaction id will later be used to stream back a list of "zip"
  # archive files, each containing a text data file of financial market
  # transaction data. the data file within each "zip" archive has the same
  # filename as its archive counterpart; the difference is the filename
  # extension, "txt". the list of "zip" archive files available for streaming
  # will be returned in the file "dstxml.txt".

  # the first directive saves the response body (the XML file list) to dstxml.txt.
  curl -o dstxml.txt -H "Accept:*/*" -H "X-File-Requester: Test" -H "X-Dlua: Zm0wMDM0ODY6YnVybnM3MjQ=" -d "tx=RetrieveFile&cz=415171403" https://filetransfer.financialtrans.com/tf/FANMail

  # the second directive posts the same form data but saves only the response
  # headers, which carry the transaction id, to dstresponse.txt.
  curl --dump-header dstresponse.txt -H "Accept:*/*" -H "X-File-Requester: Test" -H "X-Dlua: Zm0wMDM0ODY6YnVybnM3MjQ=" -d "tx=RetrieveFile&cz=415171403" https://filetransfer.financialtrans.com/tf/FANMail

SECOND SCRIPT TO RUN ON THE CRONTAB MANAGER (PHP)
 

The second script to run on the customer website crontab manager, which is also the first PHP script to run, is where the mechanics for setting up the download of data files occur. First, the "transaction id" parameter is retrieved from the "dstresponse.txt" text file. Then the list of financial transaction data files available for download is retrieved from the "dstxml.txt" text file.

As the "dstxml.txt" text file is opened and explored, the script will tally the total number of transaction data files available for processing. This number is stored in the $kounter PHP variable. One thing to note here is that I am using a filtering process to count the total number of data files available for download. DST Systems, Inc. offers an array of different file types for downloading. For this project, I only needed 3 types of DST data files:

  1. Account position file, denoted by "03" in the 13 character filename.
  2. Direct financial activity file, denoted by "02" in the 13 character filename.
  3. Price refresher file, denoted by "17" in the 13 character filename.
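
Here is a minimal sketch of that retrieval and filtering, assuming the transaction id can be captured with a simple pattern match on the dumped headers and that the two-character type code sits at the front of the 13 character filename; the regular expressions and the type-code offset are illustrative assumptions of mine, not DST documentation.

  // sketch: capture the transaction id from the streamed-back response
  // headers, then count only the three needed DST file types. the patterns
  // and the type-code offset are assumptions for illustration.
  $saved_headers = $_SESSION['streamed_file_path']."dstresponse.txt";
  $saved_xml     = $_SESSION['streamed_file_path']."dstxml.txt";
  $tidxvar_new = "";
  foreach (file($saved_headers) as $header_line) {
      if (preg_match('/tidx=([A-Za-z0-9]+)/', $header_line, $m)) {
          $tidxvar_new = $m[1];
      }
  }
  // pull every 13 character filename out of the XML listing and keep only
  // the account position ("03"), direct activity ("02") and price
  // refresher ("17") types.
  $dst_text_files_preliminary = array();
  $kounter = 0;
  if (preg_match_all('/[A-Z0-9]{13}/', file_get_contents($saved_xml), $matches)) {
      foreach ($matches[0] as $fname) {
          $type = substr($fname, 0, 2);    // assumed position of the type code
          if ($type == "03" || $type == "02" || $type == "17") {
              $dst_text_files_preliminary[$kounter] = $fname;
              $kounter++;
          }
      }
  }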

Next, the script performs some miscellaneous cleanup chores to remove older, unneeded file entries from the data table I call already_processed_files.

Lastly, we are ready to use PHP to "reconstruct" a newly updated version of a pre-existing cURL script file I call "dst_stream_back_file.sh". This cURL script is regenerated using the parameters collected from the "first phase" of the cURL script processing, as illustrated below.

  $fstreamer = $_SESSION['maintain_file_path']."dst_stream_back_file.sh";
  // if the file '$_SESSION['streamed_file_path']."dstxml.txt"' exists, then
  // remove it.
  if ( is_file($saved_xml) ) {
      unlink($saved_xml);
  }
  // if the file '$_SESSION['streamed_file_path']."dstresponse.txt"' exists,
  // then remove it.
  if ( is_file($saved_headers) ) {
      unlink($saved_headers);
  }
  // open a file stream in write mode for the file
  // '$_SESSION['maintain_file_path']."dst_stream_back_file.sh"'. this will
  // programmatically construct the cURL script for streaming back the DST
  // 'zip' files that are available for download.
  $rh = fopen($fstreamer, "w");
  // if the file stream is valid, then proceed.
  if ($rh) {
      // write the shebang line that begins the cURL script.
      fputs($rh, "#!/bin/sh\n\n", 11);
      // loop while the counter variable '$counter' is less than the number
      // of DST FAN Mail 'zip' files which are available for download.
      $counter = 0;
      while ($counter < $kounter) {
          // if the character at offset 14 is not a '?', then proceed with
          // processing the filename for the cURL directive that will stream
          // this 'zip' archive file back to our server.
          if ( substr($dst_text_files_preliminary[$counter], 14, 1) != "?" ) {
              // format a zero-padded string version of the numeric
              // '$counter' variable to be stored in the variable '$serialno'.
              if ($counter < 10) {
                  $serialno = "00".strval($counter);
              }
              if ($counter >= 10 && $counter < 100) {
                  $serialno = "0".strval($counter);
              }
              if ($counter >= 100 && $counter < 1000) {
                  $serialno = strval($counter);
              }
              // construct the cURL directive that will stream back the DST
              // 'zip' file named in '$dst_text_files_preliminary[$counter]'.
              // the variables '$serialno' and '$tidxvar_new' are also used
              // to construct this directive.
              $stream_it = 'curl -o sentfile'.$serialno.'.zip -H "Accept:*/*" -H "X-File-Requester: test" -d "tx=RetrieveFile&cz=415171403&tidx='.$tidxvar_new.'&event=RetrieveFile&file='.$dst_text_files_preliminary[$counter].'" https://filetransfer.financialtrans.com/tf/FANMail';
              // write the cURL directive to the file stream denoted by the
              // handle '$rh', followed by 2 linefeeds.
              fputs($rh, $stream_it."\n\n", strlen($stream_it)+2);
              // next, make a note in the data table 'already_processed_files'
              // that this DST 'zip' file has now been processed.
              $strSQLInsert = "INSERT INTO already_processed_files (filename, date_processed)";
              $strSQLInsert = $strSQLInsert." VALUES ('$dst_text_files_preliminary[$counter]', '$todaysdate2')";
              mysql_query($strSQLInsert);
          }
          // increment and loop around until we have reached the number of
          // DST 'zip' files to be downloaded.
          $counter++;
      }
      // close the handle to the file stream.
      fclose($rh);
  }
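
One housekeeping detail worth noting: a freshly written shell script is not executable by default. Assuming the crontab entry invokes the regenerated script directly (an assumption about this particular hosting setup), a call along these lines would follow the fclose():

  // hypothetical: mark the regenerated cURL script as executable so the
  // crontab manager can run it in the next time slot.
  chmod($fstreamer, 0755);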

THIRD SCRIPT TO RUN ON THE CRONTAB MANAGER (cURL)

Here is the third crontab manager script and the "second phase" of the cURL script processing, which will stream the "zip" files from the DST FAN Mail web server to the customer’s website.

The cURL script below has been "constructed" with PHP coding from the second script run on the crontab manager to use the "transaction id" along with the downloadable list of "zip" files that were returned from the "first phase" of the cURL script processing.

Due to the file streaming protocol used by DST Systems, Inc., each data file that is currently available for download must be streamed back with its own cURL directive using the "transaction id" parameter. I notate this sequential download processing as {"n" sequential number}.zip, {"n+1" sequential number}.zip, {"n+2" sequential number}.zip, ..., {"$kounter" sequential number}.zip to stand in for the list of "zip" files that can be streamed back. In total, "$kounter" "zip" files will be streamed back to the customer’s website.

  # the programmatically "reconstructed" cURL script created by the previous
  # PHP script on the crontab manager will stream back a series of DST FAN
  # Mail "zip" archive files. the number of files is equal to $kounter, the
  # count of files to be downloaded from the previous script. below are
  # descriptions of some of the parameters used in the cURL directives that
  # follow:
  #
  # {"n" sequential number} = $serialno, the sequential number variable from
  # the above PHP script.
  # {transaction id} = $tidxvar_new, the transaction id variable from the
  # above PHP script.
  # the PHP variable $kounter is the number of DST FAN Mail "zip" archive
  # files to be downloaded with this instance of a secured server connection.
  # {"n" filename to be processed} = $dst_text_files_preliminary[$counter],
  # the file name to be processed in sequence. note: $serialno is the string
  # equivalent of $counter. the programmatically "reconstructed" cURL script
  # that streams back the "zip" archive files assumes the form below. each
  # cURL directive is executed in sequence for each "zip" archive file to be
  # streamed back from the DST FAN Mail server in Kansas City, MO, USA.

  #!/bin/sh

  curl -o sentfile{"n" sequential number}.zip -H "Accept:*/*" -H "X-File-Requester: test" -d "tx=RetrieveFile&cz=415171403&tidx={transaction id}&event=RetrieveFile&file={"n" filename to be processed}" https://filetransfer.financialtrans.com/tf/FANMail

  curl -o sentfile{"n+1" sequential number}.zip -H "Accept:*/*" -H "X-File-Requester: test" -d "tx=RetrieveFile&cz=415171403&tidx={transaction id}&event=RetrieveFile&file={"n+1" filename to be processed}" https://filetransfer.financialtrans.com/tf/FANMail

  curl -o sentfile{"n+2" sequential number}.zip -H "Accept:*/*" -H "X-File-Requester: test" -d "tx=RetrieveFile&cz=415171403&tidx={transaction id}&event=RetrieveFile&file={"n+2" filename to be processed}" https://filetransfer.financialtrans.com/tf/FANMail

  # ... one cURL directive per remaining "zip" file ...

  curl -o sentfile{"$kounter" sequential number}.zip -H "Accept:*/*" -H "X-File-Requester: test" -d "tx=RetrieveFile&cz=415171403&tidx={transaction id}&event=RetrieveFile&file={"$kounter" filename to be processed}" https://filetransfer.financialtrans.com/tf/FANMail

  # note: the {"$kounter" sequential number} and {"$kounter" filename to be
  # processed} parameters represent the last DST FAN Mail "zip" file to be
  # downloaded with this instance of a secured server connection.

FOURTH SCRIPT TO RUN ON THE CRONTAB MANAGER (PHP)

As the fourth script commences execution, all the data from the DST Fan Mail web server has now been streamed back to the customer’s website. At this stage, it needs to be combed through and processed for the customer’s intended purpose.

Initially, some data tables in the MySQL database on the customer’s website will be purged to prepare them for use. One of these is the txt_files_to_be_processed data table. Next, the downloaded "zip" files will be decompressed to extract the text data files, which carry the same file names but a "txt" file extension. The code below decompresses the downloaded "zip" files.

  // the following code will unzip the DST FAN Mail zip files into the
  // folder. then the zip files from the root drive on www.generic.com will
  // be removed.
  $_SESSION['zip_path'] = "/home/generic/";
  // if the path contained in the '$_SESSION['zip_path']' session variable
  // is valid, then proceed.
  if (is_dir($_SESSION['zip_path'])) {
      // the path is valid, so update the good audit trail table and proceed.
      $strSQLInsert = "INSERT INTO trans_audit_trail_good (success_description, audit_date)";
      $strSQLInsert = $strSQLInsert." VALUES ('web server path is valid...proceeding to next step...', '$todaysdate2')";
      mysql_query($strSQLInsert);
      // if the path specified in the '$_SESSION['zip_path']' session
      // variable can be accessed with the handle '$dh', then proceed.
      if ($dh = opendir($_SESSION['zip_path'])) {
          // the path is accessible, so update the good audit trail table
          // and proceed.
          $strSQLInsert = "INSERT INTO trans_audit_trail_good (success_description, audit_date)";
          $strSQLInsert = $strSQLInsert." VALUES ('$_SESSION[zip_path] is opened', '$todaysdate2')";
          mysql_query($strSQLInsert);
          // read each directory entry into the variable '$file'.
          while (false !== ($file = readdir($dh))) {
              // DST zip file names are 15 characters long.
              if ( strlen($file) == 15 ) {
                  // detect the last 4 characters in the file name to verify
                  // it is indeed a 'zip' file.
                  if ( substr($file, 11, 4) == '.zip' ) {
                      // it is a zip file, so decompress its contents into the
                      // directory specified by '$_SESSION['zip_path']'.
                      system('unzip -o ' . $_SESSION['zip_path'].$file);
                      // next, remove the zip file since the text file of the
                      // same name is now available and the archive is no
                      // longer needed.
                      unlink($_SESSION['zip_path'].$file);
                  }
              }
          }
          // close access to the directory.
          closedir($dh);
      }
      else {
          // the path could not be opened, so update the bad audit trail
          // table and abort processing.
          $strSQLInsert = "INSERT INTO trans_audit_trail_bad (failure_description, audit_date)";
          $strSQLInsert = $strSQLInsert." VALUES ('web server path is invalid...import operation was aborted at this point...', '$todaysdate2')";
          mysql_query($strSQLInsert);
          exit;
      }
  }

Next, the script checks whether all the needed data tables (there should be 15) are present in the MySQL database before further processing can continue. If they are not, processing is aborted at this point. Here is the code for it.

  // now that we are in the mysql database, let's check to see if all the
  // pertinent data tables exist.
  $resultx = mysql_query("SHOW TABLES");
  if (!$resultx) {
      // we can't show the data tables, so update the bad audit trail table
      // and abort processing.
      $strSQLInsert = "INSERT INTO trans_audit_trail_bad (failure_description, audit_date)";
      $strSQLInsert = $strSQLInsert." VALUES ('the tables in the database were not retrieved...import operation was aborted at this point...', '$todaysdate2')";
      mysql_query($strSQLInsert);
      exit;
  } else {
      // we can show the data tables, so update the good audit trail table
      // and proceed.
      $strSQLInsert = "INSERT INTO trans_audit_trail_good (success_description, audit_date)";
      $strSQLInsert = $strSQLInsert." VALUES ('the tables in the database were retrieved...proceeding to next step...', '$todaysdate2')";
      mysql_query($strSQLInsert);
      // the 15 data tables that must exist within the mysql database.
      $required_tables = array(
          'trans_audit_trail_good', 'trans_audit_trail_bad',
          'curr_day_trans', 'curr_week_trans', 'curr_month_trans',
          'curr_quarter_trans', 'dst_management_codes_counters',
          'txt_files_to_be_processed', 'trans_temp_dstfanmail',
          'trans_temp_dstfanmail_account_position_merge',
          'trans_temp_dstfanmail_price_refresher_merge',
          'dst_account_position_table', 'dst_price_refresher_temp_table',
          'trans_temp_dst_tr', 'trans_temp_dst_tr_parse'
      );
      // count how many of the required data tables are present.
      $w = 0;
      while ($rowx = mysql_fetch_row($resultx)) {
          if (in_array($rowx[0], $required_tables)) {
              $w++;
          }
      }
      // free the result resource.
      mysql_free_result($resultx);
  }
  if ($w < 15) {
      // fewer than the 15 required data tables were discovered, so update
      // the bad audit trail table and abort processing.
      $strSQLInsert = "INSERT INTO trans_audit_trail_bad (failure_description, audit_date)";
      $strSQLInsert = $strSQLInsert." VALUES ('all required tables in the generic database were not identified...import operation was aborted at this point...', '$todaysdate2')";
      mysql_query($strSQLInsert);
      exit;
  } else {
      // all 15 required data tables were discovered, so update the good
      // audit trail table and proceed.
      $strSQLInsert = "INSERT INTO trans_audit_trail_good (success_description, audit_date)";
      $strSQLInsert = $strSQLInsert." VALUES ('all required tables in the generic database were identified...proceeding to next step...', '$todaysdate2')";
      mysql_query($strSQLInsert);
  }

Assuming all data tables are present, the rest of this script proceeds with some data initialization routines before compiling a list of data files to be processed in a data table I call txt_files_to_be_processed.
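
Here is a minimal sketch of how that work list might be compiled, reusing the directory-scanning pattern from the unzip code above; the single 'filename' column is a placeholder of mine, not the actual table layout.

  // hypothetical sketch: scan the download directory for the decompressed
  // 'txt' data files and record each one in txt_files_to_be_processed.
  if ($dh = opendir($_SESSION['zip_path'])) {
      while (false !== ($file = readdir($dh))) {
          // the decompressed DST files are 15 characters long, ending in '.txt'.
          if ( strlen($file) == 15 && substr($file, 11, 4) == '.txt' ) {
              $strSQLInsert = "INSERT INTO txt_files_to_be_processed (filename)";
              $strSQLInsert = $strSQLInsert." VALUES ('$file')";
              mysql_query($strSQLInsert);
          }
      }
      closedir($dh);
  }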

FIFTH SCRIPT TO RUN ON THE CRONTAB MANAGER (PHP)

If the data table txt_files_to_be_processed has at least one record in it, then the script proceeds to process the three types of DST Fan Mail data files mentioned previously, as sketched below. Each file’s contents are appended to the appropriate data table and then queried for further processing.

If it is the account position file (type=03 in the file name), then the dst_account_position_table data table is used.

If it is the price refresher file (type=17 in the file name), then the dst_price_refresher_temp_table data table is used.

If it is the direct financial activity file (type=02 in the file name), then the trans_temp_dst_tr data table is used.
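
A minimal sketch of that dispatch, assuming the type code sits at the front of the 13 character filename and using a hypothetical helper import_data_file() to stand in for the actual append-and-query logic:

  // hypothetical sketch of the fifth script's dispatch: route each pending
  // data file to its staging table by the type code in its filename.
  // import_data_file() is a stand-in helper, and the type-code offset is an
  // assumption for illustration.
  $result = mysql_query("SELECT filename FROM txt_files_to_be_processed");
  while ($row = mysql_fetch_row($result)) {
      $type = substr($row[0], 0, 2);
      if ($type == "03") {
          // account position file
          import_data_file($row[0], "dst_account_position_table");
      } elseif ($type == "17") {
          // price refresher file
          import_data_file($row[0], "dst_price_refresher_temp_table");
      } elseif ($type == "02") {
          // direct financial activity file
          import_data_file($row[0], "trans_temp_dst_tr");
      }
  }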

SIXTH SCRIPT TO RUN ON THE CRONTAB MANAGER (PHP)

If the data table txt_files_to_be_processed has at least one record in it, then the script proceeds with the following. Each of these functions occurs within a nested "while loop" for all the records in the fetched result set, inside the main loop over the txt_files_to_be_processed data table.

First, the trans_temp_dstfanmail_price_refresher_merge data table is purged of all records. All the records are then queried from the dst_price_refresher_temp_table data table and the resulting rows are fetched for subsequent processing. Some miscellaneous operations are performed to designate insurance codes, net asset values, closing security prices, trade dates, etc. The trans_temp_dstfanmail_price_refresher_merge data table is then appended with this extracted financial information.

Next, the trans_temp_dstfanmail_account_position_merge data table is processed the same way: it is purged of all records, then appended with financial information extracted from the dst_account_position_table data table after the same miscellaneous designation operations.

Lastly, the trans_temp_dstfanmail data table is purged of all records, then appended with financial information extracted from the trans_temp_dstfanmail_account_position_merge data table.
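
As a minimal sketch of that purge-then-merge pattern, shown for the price refresher leg; the column names and fixed-width offsets are placeholders of mine, since the real record layout comes from the DST Fan Mail file specification:

  // hypothetical sketch of the purge-then-merge pattern applied to each
  // staging table. column names and substring offsets are placeholders.
  mysql_query("DELETE FROM trans_temp_dstfanmail_price_refresher_merge");
  $result = mysql_query("SELECT * FROM dst_price_refresher_temp_table");
  while ($row = mysql_fetch_row($result)) {
      // the miscellaneous designation operations happen here, e.g. slicing
      // the fund code, closing price and trade date out of the fixed-width
      // record.
      $fund_code   = substr($row[0], 0, 4);
      $close_price = substr($row[0], 4, 10);
      $trade_date  = substr($row[0], 14, 8);
      $strSQLInsert = "INSERT INTO trans_temp_dstfanmail_price_refresher_merge (fund_code, close_price, trade_date)";
      $strSQLInsert = $strSQLInsert." VALUES ('$fund_code', '$close_price', '$trade_date')";
      mysql_query($strSQLInsert);
  }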

SEVENTH SCRIPT TO RUN ON THE CRONTAB MANAGER (PHP)

If the txt_files_to_be_processed data table has at least one record in it, then the script proceeds with the following.

The trans_temp_dst_tr_parse data table is purged of all records. All records are then queried from the trans_temp_dst_tr data table. If the result set contains at least one row, a "for loop" is used to process it. Some miscellaneous operations are performed to designate customer codes, price, shares added/subtracted, social security number, ticker symbols, etc. The trans_temp_dst_tr_parse data table is then appended with this extracted financial information.

Next, the trans_temp_dstfanmail_account_position_merge data table is queried for all records to form a result set. A "while loop" navigates this result set as it performs a number of detailed operations on the data variables that will be used to update the cinvestments data table. More specifically, cinvestments is queried by a specific set of customer code and securities ticker symbol from the current row of the result set. The script gathers information such as number of shares held, sector, market capitalization, share price, name of investment, last date updated, and domestic/international designation. After these variables are filled with appropriate data, the cinvestments data table is purged of the record for that same customer code and securities ticker symbol.

After some more miscellaneous variable manipulation, the trans_temp_dstfanmail_price_refresher_merge data table is queried for a specific set of CUSIP number and fund code. If the result set is viable, the variables for fund name, current price and date are filled in with the result set field data.

Now the trans_temp_dst_tr_parse data table is queried for the customer code and ticker symbol from the main "while loop" result set. If the queried result set is viable and contains at least one record, the script proceeds with further processing, designating variables to hold things such as number of shares, shares added or subtracted, transaction date and purchase price using a variety of conditional logic.

After more miscellaneous variable manipulation, a new record is appended to the cinvestments data table using the variables that were set during the processing of this seventh script in the customer website crontab sequence.

The main purpose of this script is to consolidate the newly arriving data from the downloads into the customer investments data table cinvestments, keyed by customer code and security ticker symbol, for each pass through the txt_files_to_be_processed data table.
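
A minimal sketch of that consolidation for one customer code / ticker pair; the column names and the $ccode, $ticker, $new_shares and $current_price variables are placeholders for the values gathered in the steps above:

  // hypothetical sketch of the seventh script's consolidation: purge the
  // old cinvestments record for this customer code / ticker pair, then
  // append a record rebuilt from the freshly downloaded data.
  $strSQLDelete = "DELETE FROM cinvestments";
  $strSQLDelete = $strSQLDelete." WHERE customer_code = '$ccode' AND ticker = '$ticker'";
  mysql_query($strSQLDelete);
  $strSQLInsert = "INSERT INTO cinvestments (customer_code, ticker, shares_held, share_price, last_updated)";
  $strSQLInsert = $strSQLInsert." VALUES ('$ccode', '$ticker', '$new_shares', '$current_price', '$todaysdate2')";
  mysql_query($strSQLInsert);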

EIGHTH SCRIPT TO RUN ON THE CRONTAB MANAGER (PHP)

If the data table "txt_files_to_be_processed" has at least one record in it, then it will move ahead with updating the data tables that reflect account value changes by day, week, month and quarter. First, all records will be queried from the "trans_temp_dstfanmail" data table. Next, it will fetch the rows from this result set and proceed as follows.

The "curr_day_trans" data table will be appended with the fetched row results. The new row data will include: customer code, ticker symbol, date, close price, management code, total shares held and the number of shares added or subtracted. This operation is performed within a "while loop" for all the fetched rows. The "cinvestments" data table is queried by a specific set of customer code and ticker symbol to help generate the number of shares held and changes in those shares held, which are used to help append each new record to the "curr_day_trans" data table along with the data from the result set of "trans_temp_dstfanmail" data table.

Next, the curr_week_trans data table will be updated with the fetched row results. This is similar to the processing described above for the curr_day_trans data table, but a little more involved. Where curr_day_trans is simply appended with a new record for each pass through the fetched result set, the curr_week_trans processing is conditional. The curr_week_trans data table is queried by a specific set of date, customer code and ticker symbol. If this query returns no records, curr_week_trans is appended with a record containing the same fields as in the curr_day_trans processing. If it returns one record, curr_week_trans is instead updated with the fetched row data for close price, total shares held and the number of shares added or subtracted where the date, customer code and ticker symbol match their counterparts from the fetched row. As before, the cinvestments data table is queried by customer code and ticker symbol to help generate the number of shares held and the changes in those shares.

The curr_month_trans and curr_quarter_trans data tables are then updated with the fetched row results in exactly the same conditional fashion: each is queried by date, customer code and ticker symbol, appended with a new record when no match exists, and updated in place with the close price and share figures when one record matches, again drawing on the cinvestments data table for the number of shares held and the changes in those shares.
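
A minimal sketch of that insert-or-update logic, shown for curr_week_trans; the column names and the right-hand variables are placeholders for the fetched row data:

  // hypothetical sketch of the conditional append/update used for the
  // weekly, monthly and quarterly tables. columns and variables are
  // placeholders.
  $query = "SELECT COUNT(*) FROM curr_week_trans WHERE trans_date = '$tdate' AND customer_code = '$ccode' AND ticker = '$ticker'";
  $result = mysql_query($query);
  $row = mysql_fetch_row($result);
  if ($row[0] == 0) {
      // no record yet for this date / customer / ticker, so append one.
      $strSQL = "INSERT INTO curr_week_trans (customer_code, ticker, trans_date, close_price, shares_held, shares_changed)";
      $strSQL = $strSQL." VALUES ('$ccode', '$ticker', '$tdate', '$cprice', '$sheld', '$schanged')";
  } else {
      // a record already exists, so refresh its figures in place.
      $strSQL = "UPDATE curr_week_trans SET close_price = '$cprice', shares_held = '$sheld', shares_changed = '$schanged'";
      $strSQL = $strSQL." WHERE trans_date = '$tdate' AND customer_code = '$ccode' AND ticker = '$ticker'";
  }
  mysql_query($strSQL);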

The purpose of appending new records and updating existing ones in these four data tables is to reflect changes in client account value over the four time periods (daily, weekly, monthly and quarterly). As part of this project, I created a financial client website "front end" interface that lets authorized people log in and see their individual account value changes. This gives them an idea of how their investment portfolios are doing from four different vantage points in time.

CONCLUSION

The two cURL scripts and six PHP scripts that run in a set chronological order on the customer website crontab manager are like pieces of a puzzle. They fit perfectly together to form a seamless automated processing sequence. This is a prized asset in any business setting, because employees are simply too busy to manually run a sequence of scripts consistently each and every business day. Automating these tasks is the best option.

In addition to PHP software development, I am also versed in Corel Paradox, C#.NET, and computer repair. Please visit my website to learn more about the services I can offer your organization.
