The "SalesForce.com Destination" is used to send data to a SalesForce.com object. See the SalesForce.com Connection Manager page to learn more about setting up the connection manager.
• Connection - Here you will select an existing SalesForce.com connection, or create a new one.
• SalesForce Object - After selecting the connection, you will select an object within your SalesForce.com account where the data will be inserted.
o Insert - Use this option to insert data into the SalesForce object.
o Upsert - Use this option to insert data into the SalesForce object if the data does not exist or update the data if it already exists. Upsert requires that an external ID be added to your SalesForce object in your SalesForce.com account.
o Update - Update your data in the SalesForce object based on the ID column from the SalesForce object. This means that the ID from the SalesForce object needs to exist in the local source data. To retrieve the ID column data, use the TF SalesForce Source adapter, or retrieve and update the ID using the output from the SalesForce Destination.
o Delete - Delete is used to delete data from the SalesForce object. As with Update, the ID column of the SalesForce object is used to identify the rows to delete.
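As a rough illustration of the four action semantics described above (not Task Factory's internals), the sketch below models Insert, Upsert, Update, and Delete against an in-memory store; the field name External_Id__c and all function names are hypothetical:

```python
import uuid

# Hypothetical in-memory stand-in for a SalesForce object; keys are record IDs.
store = {}

def insert(record):
    """Insert always creates a new record and returns its generated ID."""
    new_id = str(uuid.uuid4())
    store[new_id] = dict(record)
    return new_id

def upsert(record, external_id_field):
    """Update the record whose external ID matches; otherwise insert it."""
    for rec_id, rec in store.items():
        if rec.get(external_id_field) == record[external_id_field]:
            rec.update(record)
            return rec_id
    return insert(record)

def update(rec_id, record):
    """Update requires the Salesforce record ID to already exist in the source data."""
    store[rec_id].update(record)

def delete(rec_id):
    """Delete also identifies the record by its Salesforce ID."""
    del store[rec_id]
```

Note how Upsert is the only action keyed by the external ID, which is why the component requires one to be defined on the object before Upsert can be used.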
• Upsert External ID - If you selected the Upsert action you must select your External ID here.
• Batch Size - Users can configure a custom batch size (200 or less in Normal processing mode; the Bulk modes support larger batches).
• Assignment Rule - Users select an assignment rule created for their object(s). Rules can be ignored by selecting [No Assignment Rule].
• Process Mode -
o Normal - Uses normal processing which can use up to 200 batch size.
o Bulk - Uses bulk processing that can achieve up to 10,000 batch size. Bulk mode uses more network traffic but less IO because no compression occurs.
o BulkZip - Also uses bulk processing but compresses the bulk CSV file before sending. BulkZip uses more IO for compression but less network traffic.
• Bulk Concurrency - When using a bulk processing mode, users can choose between parallel (batches processed concurrently, based on the batch size) and serial (one batch at a time) modes.
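The interaction between Batch Size and Process Mode can be sketched as follows; the caps (200 for normal, 10,000 for bulk) come from the settings described above, while the function name is illustrative only:

```python
def make_batches(rows, batch_size, bulk=False):
    """Split rows into batches, capping the size per the selected processing mode."""
    cap = 10_000 if bulk else 200  # caps per the Process Mode settings above
    size = min(batch_size, cap)
    return [rows[i:i + size] for i in range((0), len(rows), size)]
```

For example, 500 rows with a batch size of 1,000 still yield three batches in normal mode (200, 200, 100), but a single batch in a bulk mode.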
• Map SalesForce Destination Columns - Once you have an object selected, the columns here will be filled. Any column names that match will be automatically mapped. To map more columns, click on the Input column that will be mapped to the destination column.
• How to handle errors - There are three options to handle errors:
o Fail Component - The component will fail upon the first data error that is thrown.
o Redirect row to error output - The rows of data that failed will be sent to the error output, which can then be used to handle the data errors.
o Ignore Failure - The failures are ignored but reported in the Execution Results log.
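The three error-handling options behave roughly like this sketch (the names and structure are illustrative, not the component's actual code):

```python
def process_rows(rows, write, mode):
    """Apply `write` to each row, handling failures per the selected error mode."""
    error_output, log = [], []
    for row in rows:
        try:
            write(row)
        except Exception as exc:
            if mode == "fail":
                raise                         # Fail Component: stop on the first error
            elif mode == "redirect":
                error_output.append(row)      # Redirect row to error output
            else:
                log.append(str(exc))          # Ignore Failure: log and continue
    return error_output, log
```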
• Refresh Salesforce Columns - Pressing this button allows the component to refresh the metadata and update any changes made in Salesforce after the destination was opened.
• Wait For Bulk Results - This option can be used along with a bulk processing mode. When selected, the component sends each batch of rows to a job, waits for the results of the operation (insert, update, upsert, or delete), and then outputs the results to the success output with the IDs generated by Salesforce when records are created or updated. If this option is not selected, the component does not wait for the results and processes the execution as quickly as possible; the Salesforce IDs are not returned (the batch ID is returned instead for all rows in the success output).
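The difference this option makes to the success output can be sketched like this (a simulation, not the Bulk API itself): when waiting, each row is paired with its own Salesforce-generated record ID; when not waiting, every row carries the batch ID instead.

```python
import uuid

def submit_batch(rows, wait_for_results):
    """Simulate submitting a batch to a bulk job and shaping the success output."""
    batch_id = "batch-" + str(uuid.uuid4())[:8]
    if wait_for_results:
        # Waiting: each row is paired with the ID generated for that record.
        return [(row, str(uuid.uuid4())) for row in rows]
    # Not waiting: every row carries the same batch ID, not a record ID.
    return [(row, batch_id) for row in rows]
```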
• Use Legacy Output Mode - In legacy versions of Task Factory, users could update the unique IDs created by Salesforce.com by connecting a destination component to the Salesforce Destination's error output. This legacy output also returned errors (when error handling was enabled), which could be confusing when sifting through multiple updates mixed with errors. The current version defaults to this option turned off, directing the unique IDs to the success output instead. This change allows users to separate the data returned from Salesforce.com from the error output.
Please see the Error Row Handling page for more information about this functionality.