
Sugar Integrate


Over the next four weeks, Sugar Integrate will be undertaking security enhancements that will benefit all users of the platform. As a direct result, all user accounts must update their passwords on or before the deadlines below to comply with our new, stricter password requirements. This has no impact on using the Platform APIs or existing Authorization Header tokens, but users who do not reset their passwords by the deadlines will be locked out of the Sugar Integrate user interface until they reset their passwords with the assistance of the Support team.

 

Required Action:

All basic auth user passwords (i.e. user + password, non-SSO accounts) must be updated to passwords rated "Excellent" in the UI, meaning they comply with the following standards:

  • at least 10 characters
  • at least 3 of the following 4 types of characters:
    • a lower-case letter
    • an upper-case letter
    • a number
    • a special character (such as !@#$%^&*)
  • no more than 2 identical characters in a row (e.g. "111" is not allowed)
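To make the policy concrete, here is a minimal sketch of the rules as a validation function (the function name and the exact interpretation of the rules are mine, not part of Sugar Integrate):

```javascript
// Sketch of the password policy above (function name is illustrative).
function isCompliant(password) {
  // at least 10 characters
  if (password.length < 10) return false;
  // no more than 2 identical characters in a row (so "111" fails)
  if (/(.)\1\1/.test(password)) return false;
  // at least 3 of the 4 character classes
  const classes = [/[a-z]/, /[A-Z]/, /[0-9]/, /[!@#$%^&*]/];
  return classes.filter((re) => re.test(password)).length >= 3;
}
```

For example, "Abcdef123!" passes all three checks, while "abcdefghij" fails the character-class rule.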

 

How to Reset a Password in Sugar Integrate

Reset all non-Excellent user passwords either by logging into the Sugar Integrate UI (Reference Link) or by calling POST /authentication/passwords (Reference Link) with the user's auth headers. A third option is to navigate to the reset password screen for the appropriate environment. Here are those links:

 

Note that users who already have Excellent passwords do not need to take any action. Users with weak passwords who do not update them by the specified deadlines will be unable to log in to the UI and will need to go through the “Forgot Password” flow to regain access to their accounts.

 

Deadlines:

Staging - August 14, 2020

Production - August 28, 2020

A common request we hear is for a solution to store files that are attached to records in an external location. For example: "How can I store uploaded documents in Box.com and include a reference to the file in Sugar?" I have created a high-level example that I'd like to show you.

 

Let's talk about the proposed solution. The first thing we'll do is poll for new events on the Documents module in Sugar. For this exercise I will only deal with Documents and Opportunities, but most solutions will benefit from expanding this to other module types. When a new Document is added to Sugar Sell, we will:

  1. Verify it is of Document type
  2. Retrieve the details of the Document (like name, filename, etc)
  3. Retrieve the details of the Document's parent (we're only checking Opportunities for this exercise)
  4. Create a directory in Box.com using the name of the parent Opportunity as the directory's name. Note that I was able to switch this to Google Drive with about 10 minutes of changes (really, I just needed to authenticate the Google Drive adapter and change the base URL for the final link from app.box.com/file/{id})
  5. Copy the file from the Sugar Sell Document into the newly created directory in Box.com
    1. We will use the same filename
    2. If the file already exists, overwrite the existing file (Box.com has versioning so the overwrite is less of a dangerous operation)
  6. Generate a direct URL to the file in Box.com
  7. Write the URL to a custom field in the Sugar Sell Opportunity record
  8. Delete the original Document record from Sugar Sell

 

Setup

To begin, we must do some configuration in Sugar Sell. First we need to add a custom textarea field to the Opportunities module via Studio. A generic textarea field is perfect for this exercise - we can append each new file URL to the existing value in the field and use "\r\n" to create a new line each time. PLUS, in v10.1, we added URL linking in textareas. So, the links will be functional without adding any additional HTML.
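The append logic is simple; a sketch (the function name is mine, for illustration):

```javascript
// Append a new file URL to the textarea's existing value,
// using "\r\n" so each link lands on its own line.
function appendLink(existingValue, newUrl) {
  return existingValue ? existingValue + "\r\n" + newUrl : newUrl;
}
```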

 

In Studio in Sugar's admin screen, add this new textarea field to the Opportunities module (see: Creating Fields support document). I called mine externaldoclink_c. After creating the field, remember to add it to the record view layout so that we can see it when viewing an Opportunity.

 

While you are in Sugar Sell admin, there are a few other things that we nearly always recommend when creating an integration with a Sugar product. All of these will help avoid authentication conflicts between logged-in instances of Sugar and the REST calls from Sugar Integrate:

  • turn off IP Validation in the System Settings section of Sugar Admin. 
  • create a custom platform in the Developer Tools section of Sugar Admin
  • create a custom user that will be used exclusively for API calls

 

Now that Sugar is configured, we can move over to Sugar Integrate. We will need an authenticated instance of the Sugar Sell adapter and one of the Box.com adapter. In the Sugar Sell adapter, be sure to enable eventing and poll for the Documents module - I chose 1 minute intervals for testing but this is probably too often in reality.

 

This solution needed a couple of endpoints that weren't available by default in the Sugar Sell adapter. I have made a request to have them added as I think they are important. For now, however, we will have to add them manually. 

 

The first new endpoint will allow us to get the details of the file, like the filename. Without this endpoint, we would only know the GUID of the document record. So, we will add a new endpoint called /files/{objectId} that will map to the Sugar endpoint /Documents/{objectId}/file/filename.

 

  1. In Sugar Integrate, navigate to the Resources tab of the Sugar Sell Adapter.

  2. Click on "Add a new Endpoint" in any of the sections of the swagger view.

  3. Here, we will enter the details of the new endpoint. Use this screenshot for reference.
      1. Be sure to choose GET as the method for both fields at the top. 
      2. The first field, which represents the Sugar Integrate endpoint, will be set to /files/{objectId}
      3. The second field, which represents the Sugar Sell endpoint, will be set to /Documents/{objectId}/file/filename
      4. Set the Response Content Type field to "application/octet-stream" so that we will receive an actual file and not just JSON metadata describing the file.
      5. Then we need to define the objectId parameter that we put into the endpoints
      6. Click the Save icon in the far right corner of this endpoint section when your settings are complete  

     

     

    The second new endpoint that we need to add is for retrieving the opportunity that this document is associated with. We'll make it generic enough so that it can be used in the future for any type of parent record for the document.

    1. If you are not still in the resources tab of the Sugar Sell adapter, navigate to there now.
    2. Click on "Add a new Endpoint" on any of the sections of the swagger view

    3. Here, we will enter the details of the new endpoint. Use this screenshot for reference.
      1. Be sure to choose GET as the method for both fields at the top. 
      2. The first field, which represents the Sugar Integrate endpoint, will be set to /{objectName}/{objectId}/{childObjectName}. When we call this endpoint, we will replace those parameters like so: /Documents/123e4567-your-GUID-here-abcdef987654/opportunities
      3. The second field, which represents the Sugar Sell endpoint, will be set to /{objectName}/{objectId}/link/{childObjectName}
      4. Leave the Response Content Type field as "application/json" because we want to receive JSON metadata describing the file.
      5. Then we need to define the parameters that we put into the endpoints
        1. objectName - string
        2. objectId - string
        3. childObjectName - string
      6. Click the Save icon in the far right corner of this endpoint section when your settings are complete

     

     

    If you'd like to test these new additions, simply click on your instance of the adapter on the left side panel and then the "Try it out" button will appear on the right of your endpoint panels. You will need a Document Id from Sugar Sell to complete either test. The Id can be found by navigating to the Document in Sugar and copying the GUID from the URL string in the address bar.

     

     

    The Procedure

    Now that everything is configured, it is time to create our procedure. I'm attaching a JSON file with my procedure to this article. You may import that to test the functionality, or continue following along in this exercise to create your own version. Remember to unzip the file before attempting to import it into Sugar Integrate.

     

    This procedure has 2 configuration variables already set up: sourceAdapter and destinationAdapter. So, when you create an instance of the procedure, you will set those values to your authenticated Sugar Sell and Box.com adapters respectively. If you are re-creating this procedure from scratch, remember to create these 2 Adapter Instance type variables.

     

    Whether you are importing the procedure or starting from scratch, it is important to understand the steps, so let's walk through them one by one.

     

    1. Trigger - every procedure has a trigger as its first step. This procedure is using an event-based trigger. We configured our Sugar Sell adapter to poll for Documents, so those events will trigger this procedure when we set it up to use our Sugar Sell adapter instance.
    2. isItDocument - This is a JS Filter type step. Here, I am simply verifying that this is a Document and that it is new (not an update or delete). This is fine for a demo, but for a real-life scenario, we should write some code and add some steps to handle the other possibilities. All JavaScript steps must call the done() function to move on, and a JS Filter step is expected to send true or false to done(). If the object in the trigger is a Document and it is new, we want the procedure to continue running, so we send true to done(). If it is not a Document, or not newly created, we want the procedure to stop running and move to the onFailure step, so we send false to done().
    3. parseValues - This is a JS Script step that will grab the values from the trigger that we need for subsequent steps. As you can see, I only needed the GUID of the Document from the trigger. So I grabbed that and sent it to the done() function so it will be available in the next step.
    4. getFileDetails - This is an API Adapter step. We are using this to make another call to Sugar Sell so that we can get the details of the file. This is where we call the first custom endpoint that we created above. So, the Adapter Instance Variable will be set to our sourceAdapter config variable. The method of our request will be GET. The API endpoint we are calling is the /files/{objectId} one that we created. We must replace {objectId} with the fileID from the previous step by using the dollar sign, curly brackets notation for steps: ${steps.parseValues.fileID}.
    5. getParent - This is another API Adapter step. This one is going to call the other custom endpoint that we created earlier so that we can retrieve the Opportunities record associated with this Document. We will again use the sourceAdapter config variable and set the Method field to GET. The endpoint this time is /Documents/${steps.parseValues.fileID}/opportunities because we want to find opportunities linked to the Document with the fileID from the parseValues step.

    6. createMetadata - This step is a JS Script step. Now we have the file and the parent Opportunity record. The next task is to copy the file to our destination (Box.com) and save the URL to the file in the Opportunity record. To do that, we need to create some parameters for the coming steps. Note the conditional in this step. I am checking to see if there is, in fact, a parent Opportunity for this Document. If not, I don't want to proceed (for this demo). I am also assuming that there is only one associated Opportunity with this Document. This is safe because in my use case, we are only proceeding with the procedure if the Document is new and not already in the system. This additional conditional verifies that it was also created as a Document associated with an Opportunity.
      1. downloadHeaders - JSON object that defines what we want to receive. In our case, we are pulling down a file so application/octet-stream is correct.
      2. uploadHeaders - JSON object that defines what we are going to send in the filestream step. For that, we are using multipart-form-data so we can send the file and additional data
      3. destinationPath - JSON object defining the location of the file on the destination platform. I also set the overwrite parameter to true because I am using Box.com and it has pretty decent versioning. Your use case may differ.
      4. newDirectory - A JSON object defining the directory to create on the destination platform. This is a concatenation of the base directory I wanted to use (offloadedimages) and the name of the parent Opportunity. This way, I can see in Box.com all of the files associated with an Opportunity.
      5. parentId - This one probably isn't necessary as we could reference it later via ${steps.getParent.response.body.records[0].id}. But, putting it into a simpler variable and exposing it through the done() function makes it nice and neat to reference later. 


    7. createDirectory - This is another API Adapter step. It is probably not necessary, as the Stream File step will create a directory if it doesn't already exist, but I like to break things into small chunks and I wanted some error handling around this step. So, I made a separate step to create the directory. Note that I set the Adapter Instance Variable to our destinationAdapter variable (Box.com). The API Method is POST because we are sending data TO the Box.com endpoint. The endpoint we are calling is simply /folders/. For this step, we need to send a request body object, which we defined in the previous step as newDirectory. If we click on the Show Advanced button, we will reveal the Body field. This is where we put our reference to the object from the createMetadata step.

    8. streamTheFile - This is a File Stream step. This is where the magic happens. This single step will download our file from Sugar Sell and then upload it to Box.com. All we have to do is fill in the values.
      1. Download Adapter Instance Variable - the sourceAdapter configuration variable. This says that we will be downloading from Sugar Sell
      2. Download Method - We are simply making a GET call
      3. Download API - This is the endpoint for Sugar Sell that we want to make the GET call to. In our case it is the custom endpoint we created that gives us the file itself with the Document ID we grabbed in the parseValues step: /files/${steps.parseValues.fileID}
      4. Upload Adapter Instance Variable - the destinationAdapter configuration variable. This says that we will be uploading to Box.com
      5. Upload Method - To send the file, we will make a POST call to the /files endpoint of the Box.com adapter
      6. Download Headers - the JSON object that we defined in the createMetadata step
      7. Upload Headers - the JSON object that we defined in the createMetadata step
      8. Upload Query - the JSON object with the destinationPath that we defined in the createMetadata step

    9. makeTheExternalLink - A JS Script step. In this step, I am using the response from the Stream File step to create the URL of the file that has been saved in Box.com. For this demo, the custom field is a textarea to which I want to append each new external file URL. The result of this step is a JSON object defining the value to be stored in our custom field in Sugar Sell.

    10. addExternalLinkToCRM - This is an API Adapter step that will update the Opportunity record in Sugar Sell with the new value we created in the previous step. Since we are updating Sugar Sell, our Adapter Instance Variable will be set to the sourceAdapter config variable. We are doing an update, so I'm calling the PATCH method of the opportunities/{id} endpoint. Therefore, our endpoint path is /opportunities/${steps.createMetadata.parentId}. The last thing we need is the body of this PATCH call. Since in the previous step we passed only the postBody object to the done() function, we do not have to specify which value to use. It just needs to be set to the result of that step: ${steps.makeTheExternalLink}.

    11. deleteOriginalDocument - Another API Adapter step. This is our final step. Now that we have copied the file over and updated the parent Opportunity with the external link, we have no need for the original Document or file. So, this step will delete it from the DB and file server. Our Adapter Instance Variable is the sourceAdapter because we are removing the file from Sugar Sell. We will make a DELETE call to /documents/{id} and that's it!
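As a rough illustration, the two simplest JavaScript steps above might look something like this. This is a sketch, not the exact code in the attached procedure: the trigger payload shape is an assumption, and in the platform each step would pass its result to done() rather than return it.

```javascript
// isItDocument - JS Filter step: continue only for newly created Documents.
// (The event payload shape here is assumed for illustration.)
function isItDocument(trigger) {
  const event = trigger.event || {};
  return event.objectType === "Documents" && event.eventType === "CREATED";
}

// makeTheExternalLink - JS Script step: build the PATCH body that appends
// the new Box.com URL to the custom textarea field created in Studio.
function makeTheExternalLink(existingValue, boxFileId) {
  const url = "https://app.box.com/file/" + boxFileId;
  return { externaldoclink_c: existingValue ? existingValue + "\r\n" + url : url };
}
```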

    There you have it. That is our procedure. I have a few onFailure steps in there that simply console.log an "All done." message. Typically, this should be more robust but for demo purposes, it is just fine.

     

    I know this is a lot, but taken in small steps it should make perfect sense. Is this the best solution? Not by far, but it does demonstrate the concept. A more complete solution would be a custom version of the Documents module that never stores the file on the Sugar side at all - it would simply offload it to the external storage location. There would also have to be much more robust error handling throughout the procedure. My hope is that this will get you started. If (when) you do create something inspired by this post, please share it in the comments.

    A consistent challenge for integration developers is managing flat file integrations with systems where direct access is not natively provided by existing functionality or permitted for security reasons. In this guide, you'll see how to take an existing CSV file in an SFTP location and upload/stream its content to Sugar products — a necessary step before you can quickly transform the contained data with a common resource transformation.

     

    Overview

    In this example, we'll do the following:

    1. Take a .csv file from an SFTP location (using the SFTP adapter)
      1. This file will contain rows of records made up of appropriate Sugar Integrate field names (for Account records: billing_address_street, industry, email1, etc). Any fields included in the CSV that are not in Sugar or not in the Sugar record type that you are creating will be skipped and not imported.
      2. Here is a CSV with sample data or you could use a data generation utility like mockaroo.com to generate your own. Unzip the file before uploading to your SFTP server - you only need the .csv file.
    2. Upload file data to a CRM system (Sugar Sell)
      1. The data will be parsed so that a record can be created for each line in the CSV file, and the data will be mapped appropriately in Sugar Sell. When completed, you should have new records in Sugar Sell in the module that you specified.

    Although this tutorial specifically utilizes a source SFTP location and uploads to Sugar Sell, both the source resources and endpoint can be changed or extended to suit your use case. For example, you could select top-level sources from nearly any bulk-supported endpoint and upload the resulting downloaded data to any other endpoint that can take in data from a CSV file.

     

    Prerequisites

    In order to follow along with and perform this use case, you'll need the following:

    • Sugar Integrate account (request a trial account here)
    • Sugar Sell admin account (Sugar Enterprise or Serve will work also)
    • Configured SFTP server and valid credentials
    • Authenticated instances of the SFTP and Sugar Sell adapters
    • A CSV file with data to be imported into Sugar Sell on your SFTP server. Remember to actually upload the file to your SFTP server before trying this process! Here is a CSV with sample Account data
    • The JSON configuration file of the sample procedure we created for this article

     

    Creating and Configuring the Procedure Instances

    With Sugar Integrate, you can create procedures from scratch, use existing procedures as a starting point for new ones, or import JSON files of an existing procedure. Here, we'll import the JSON you downloaded above and configure the instances to work properly with each other.

     

    Importing JSON to Create the Procedures

    Your existing procedures are available from the Procedures tab on the left-hand side of our platform's interface. Click Build New Procedure and then select Import to upload the JSON for the procedure we're going to use (see our docs for more details about importing JSON).

     

     

    After importing the JSON for the procedure, it's time to create the procedure instances.

     

    Creating the Procedure Instances

    Remember to configure your SFTP and Sugar Sell adapter instances before starting this step. Once you have those instances, navigate to the Procedure catalog, hover over the procedure, and click Create Instance.

     

     

    On the Instance Creation page:

    1. Enter a name for the procedure instance. Here, we use "Instance 1".
    2. Click the plus sign under targetInstance and select your Sugar Sell adapter instance.
    3. Click the plus sign under sourceInstance and select your SFTP adapter instance.

    It should look something like this:

     

     

    After completing the configuration, click Create Instance.

     

    Triggering the Procedure

    Now that the procedure instance is created and configured, we'll manually trigger the procedure. We will do this in the Edit screen for the procedure. Navigate to the Procedure catalog, hover over the procedure, and click Open.

     

     

    On the console that opens, click Try It Out and then click Select Instance.

     

    Select Instance 1 (or the instance with the name you set up above).

     

    Now click Select Trigger on the console.

     

    In the Chooser window, specify the input JSON to use as the trigger to kick off the procedure. Your trigger should contain the path to the .CSV file you want to upload to the CRM system and the destination object as shown below:
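A trigger payload along these lines would work (the values are the examples used below):

```javascript
// Example manual-trigger JSON for this procedure.
const trigger = {
  pathToFile: "/home/Public/test-docs/MockData-Accounts.csv",
  destinationObject: "accounts"
};
```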

     

    The pathToFile field should be relative to the root of your SFTP server (ex: "/home/Public/test-docs/MockData-Accounts.csv"). The destinationObject is a string that represents the module type you would like to import this data into. Each CRM will have a different set of possible destination objects. So how do you know what value to put for this parameter? I suggest using the API docs for the CRM adapter instance. You can invoke the GET /objects endpoint to retrieve a list of all possible object types (and their spelling and capitalizations) from the response. Remember to select your instance from the left side list so that you may use the Try It Out button. 

     

    Since we are looking to create Account records from our data, I see from the response of the GET /objects call that we need to set destinationObject equal to "accounts" (all lowercase and plural). Now that the input JSON for our trigger is all set, all that's left is to click Run.

     

    Verifying the Procedure Execution

    After giving the procedure some time to complete, we can view details of the procedure executions to ensure they were successfully completed.

    Navigate to the Procedure instances page, hover over the instance, and click Executions.

     

    In the Procedure Executions column, click your procedure execution to view its steps.

     

    Here, we see all three steps of the procedure were successful. You can also click on an individual step to see more details. Once the procedure is executed, the data has been uploaded to Sugar Sell. Log in to Sugar Sell to see your newly created records.

     

    That's it! The process is complete. 

     

    So, what did this procedure actually do?

    As you can see, the procedure has only 3 steps: the trigger, a JavaScript step that sets up values, and the final File Stream step. That last one is where the magic happens. A File Stream step essentially does a download from one platform and an upload to another. In the JavaScript step, called generateMetadata, we are simply defining values to be used in the Stream File step. Here's a bulleted list of the parameters used in the Stream File step, what they mean, and how they are defined:

    • Download Adapter Instance Variable
      • This is the Adapter Instance that you have authenticated for the source data. In our case, it should be an SFTP Adapter authenticated to your SFTP server where the CSV data file resides
      • Our procedure variable is called sourceInstance so we use the curly bracket notation to specify we are using a config variable called sourceInstance
    • Download Method
      • To grab the file from the SFTP server, we will invoke a GET endpoint
    • Download API
      • This is the path to the desired endpoint. Since we are looking to grab a file from the SFTP server, we are going to use the /files endpoint
    • Upload Adapter Instance Variable
      • Similarly to the Download Adapter Instance Variable, this is the config variable called targetInstance that we assigned when making the Procedure instance
      • We again need to use the curly bracket syntax
    • Upload Method
      • To send the file to the CRM, we will invoke a POST endpoint
    • Upload API
      • Which endpoint are we calling for the POST? In this example, we are going to call the bulk endpoint for the object type we are trying to create. In the Trigger Event JSON, we assigned pathToFile and destinationObject values to be used later, and in order to use those values properly, we needed to create properly formatted parameters. In our generateMetadata procedure step, we grabbed the destinationObject and assigned it to a variable called objectName. So, our Upload API value should be /bulk/${steps.generateMetadata.objectName}, which grabs the value of the objectName variable from the step called generateMetadata
    • Download Headers
      • This parameter is hidden by default. Click Show Advanced to reveal these additional parameters
      • We defined a downloadHeaders variable in the generateMetadata step that sets the acceptable mime type of files to download to 'text/csv' as a JSON object
    • Download Query
      • In the generateMetadata step, we also defined a variable called path that contains a JSON object with the value from the pathToFile value we defined in the trigger event JSON
      • We again need to use the curly bracket syntax
    • Upload Headers
      • We defined an uploadHeaders variable in the generateMetadata step that sets the content type for the upload to 'multipart/form-data' as a JSON object
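Putting those together, the generateMetadata step might look roughly like this (the exact variable shapes are assumptions based on the descriptions above):

```javascript
// generateMetadata - JS Script step: build the values the File Stream
// step needs from the trigger JSON.
function generateMetadata(trigger) {
  return {
    objectName: trigger.destinationObject,                   // e.g. "accounts"
    downloadHeaders: { accept: "text/csv" },                 // what we pull from SFTP
    path: { path: trigger.pathToFile },                      // Download Query value
    uploadHeaders: { "Content-Type": "multipart/form-data" } // what we send to the CRM
  };
}
```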

     

    I hope that helps you understand what it takes to grab a CSV file from an SFTP server and send it to a CRM as individual records. Play around with it and see what improvements you can make. I'd love to hear about it in the comments!

    Sugar Integrate makes integrating with (or migrating data between) systems so much easier than doing it manually through code. Even with this incredible IPaaS, however, some integrations can still feel daunting. So, the Sugar Integrate team has begun identifying some common integration and migration use cases that we think make sense to templatize (is that a word?).

     

    Today, we have published our first 2 Sugar Integrate template solutions. Each solution is a zip file containing almost everything you need to complete an advanced migration. We identified, created, and mapped the Common Resources. The formulas (and their sub formulas) have been completely written. All you need to do is install the components into Sugar Integrate, prepare your external systems (Sugar Sell, Salesforce, etc), and run the procedures.

     

    Don't worry. We've also created installation and implementation guides to go along with the templates.

     

    So, what do these particular templates do?

     

    Let's start with the CRM Data Migrator for Sales template. We designed this template to facilitate the migration of records from a CRM (the documentation focuses on Salesforce) to Sugar Sell or Enterprise. It will copy ALL objects into Sugar - including custom modules. To move the custom modules, however, you will need to make some mappings on your own (because we don't know what your custom modules are called or where you want them to live in Sugar).

     

    Take a look at one record that was migrated from Salesforce to Sugar Sell:

     

    The second template we have created is the Opportunity to Cash template. This template automates the opportunity or quote to cash process between Sugar Sell or Sugar Enterprise and an ERP system. It supports synchronizing the account/contact data and financial history (such as invoices and payments) from the ERP system to Sugar. The template also supports automatic order or estimate creation in ERP when a quote is accepted or an opportunity closes in Sugar Sell or Sugar Enterprise.

     

    Here's an example of data that was migrated from QuickBooks Online to Sugar Sell:

     

    To access the templates, navigate to the Software Downloads page and select "Integrate" from the dropdown. Then simply select the correct version and package to download and follow the instructions in the Installation Guide.

     

    We really hope that you find these templates useful. We plan to create more so keep an eye out for future announcements of new templates.

     

    For more information on the new templates, please refer to the Sugar Integrate Template Release Notes. Here's a list of additional resources that may be of help for your migrations or integrations:

    Here’s a step-by-step guide on what a Common Resource is and how to create and map transformations within Sugar Integrate.

     

    What is a Common Resource?

    A Common Resource is simply a way of normalizing data between disparate systems. So, for example, if I want to pass data between Sugar Sell and QuickBooks, I need to ensure that each application recognizes the data from the other one. Both of these systems have multiple address fields - which should I use? I could make a common resource to map the billing address of Sugar to the shipping address of QuickBooks. Maybe I need to combine first name and last name fields into a single full name field. What if one system has a field that takes values like "Hot", "Warm", or "Cold" but the other one uses a letter grade system? This is another great case for making a Common Resource that tells Sugar Integrate how to translate those values between the systems.
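The "Hot"/"Warm"/"Cold" case can be sketched as a simple value map (the letter grades here are an assumed target scheme):

```javascript
// Illustrative value translation for a Common Resource transformation.
const ratingToGrade = { Hot: "A", Warm: "B", Cold: "C" };

function translateRating(rating) {
  // return null for values the map doesn't know about
  return ratingToGrade[rating] || null;
}
```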

     

    Every common resource within Sugar Integrate is accessed using a consistent RESTful API with a JSON payload regardless of the protocol used at the endpoint. The Sugar Integrate pre-built Adapters do the work of mapping the normalized API call to each application’s API endpoints, as well as transforming SOAP, XML and other API protocols to REST/JSON. A common resource allows you to take advantage of our “one-to-many” integration approach by mapping a single resource to multiple adapter instances.

     

    Creating a Common Resource

    You can find all your common resources under the Resources tab on the left side of the platform. You'll find your existing common resources on the bottom left, and you can also create new ones using the "Build New Common Resource" button.

     

     

    For example, we will define the resource myContacts as email, first name, and last name. In the drop-down menus, several types are supported - string, boolean, number, and date - and the resource can additionally support nested objects.
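Expressed as a plain object, the myContacts definition amounts to three string-typed fields (the field names follow the example above):

```javascript
// Field-name to type map for the myContacts common resource.
const myContacts = {
  email: "string",
  firstName: "string",
  lastName: "string"
};
```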

     

    Build a Common Resource In Sugar Integrate

    A great place to start when creating a common resource is to define the object and match it to your origin system. If you're writing an integration to your application, and inside of your application you already have an existing concept of a contact, then that's how you would go about defining this contact when creating a new resource.

     

     

    Mapping Transformations

    The next part of any common resource is building a transformation. Take the field and the common resource you just defined for your application, and map it to the field and resource at an endpoint.

     

    On the right you’ll be able to see the mapped transformations and the option to create new transformations for your common resource. In our example we’ve already created a transformation for Sugar Sell for myContacts.

     

     

    In our transformations screen, the drop down menus populate all the fields that are available in that destination resource, pulled via discovery APIs. In the above example, we’ve taken the firstName field in the common resource and mapped it over to the last_name field in the Sugar Sell contact resource. You can also edit these as free text fields.


    Some fields may not show up in the drop-down menu - custom fields for a specific record are a good example. You can still make the mapping, even if the field doesn't show up in the discovery object. From this window you can press Try It Out (the play button), which will run the transformation and show you how your object is getting transformed.


    In the original tab, you can see the original payload as it's coming back from Sugar Sell. The contact object that comes back from Sugar Sell is difficult to work with because it has a lot of nested properties. Critical fields, like email_address, are sometimes nested several objects deep. Compare this to the transformation, where you receive a clean response containing only the fields you are requesting.
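    Conceptually, the transformation is flattening the deeply nested raw payload into just the fields the common resource defines. Here is a sketch of that idea - the nested structure below is illustrative, not the exact Sugar Sell payload shape:

```javascript
// Sketch: flatten a nested raw contact payload into the common resource shape.
// The raw structure assumed here is illustrative, not the exact Sugar Sell schema.
function flattenContact(raw) {
  var emails = raw.email || [];
  return {
    firstName: raw.first_name || null,
    lastName: raw.last_name || null,
    // email_address can be nested several objects deep in the raw payload
    email: emails.length > 0 ? emails[0].email_address : null
  };
}
```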

     

    Comparison of original payload versus transformation:

     

    How To Add Docs To Your Common Resource

    Under the advanced settings, you can toggle on 'Add to API Docs.' When this is set to true, the common resource is added to the API docs for this adapter. Note that this setting is TRUE by default.

     

     

    Now, when you return to the Adapters tab (top left) and open the Adapter Instances, you can view the new resource in the API docs for that Adapter.

     

     

    Example of a Common Resource added to the Sugar Sell Adapter API Docs:

     

    Other Configurations For Resources

    The cog at the top right corner of your common resource also gives you access to the 'Remove Unmapped Fields' setting. If you set this to false by turning the toggle off, save, and then run your transformation one more time, you will see the original payload object again. Notice that the newly mapped fields, firstName and lastName, are now included where they were not before.


    So what's happening here? When you turn off Remove Unmapped Fields, the platform returns all the fields in your original object, even those that haven't been mapped. For any field that has been mapped, the keys are changed to the keys in the common resource.


    This is a great solution if you want to pull out a few convenient properties, such as first name and last name, but still want to keep a record of all the other data that comes with the contact resource.

    There is a lot you can do with Sugar Integrate. In most of the demos we have released, our examples show a Procedure that is triggered by an event in a platform like Sugar Sell. But what about when we want a Procedure to be called when a user clicks a button in a platform? (It could be any platform, but for this article we'll look at Sugar Sell.)

     

    Converting an event-based procedure to manual

    I had a procedure already set up that sends a message to Slack whenever a new lead is added to Sugar. So, you know, it is event-based. I decided that I wanted to use that same procedure but have it triggered with a button press inside of Sugar Sell. In Sugar Integrate, only manual procedures can be called externally. It turns out, there isn't a simple way to convert a trigger-based procedure into a manual one. So, I improvised. There are at least 2 ways to convert a trigger. Each method is a bit of a work-around. You decide which method you prefer.

     

    Method 1 - Export, Edit, Import

    If you are comfortable editing JSON, this method may be for you. The first thing I did was export my existing event-based procedure. Then, I opened the JSON file in a text editor and located the triggers object. It looked like this:

    "triggers": [{
         "id": 8068,
         "onSuccess": ["js-step"],
         "onFailure": [],
         "type": "event",
         "async": true,
         "name": "trigger",
         "properties": {
              "elementInstanceId": "trigger1"
         }
    }],

     

    I can see that the type is set to "event". That's the first value to change. The other change is to remove the values in the "properties" object because they are not necessary for a manual trigger. So, my new trigger will be:

    "triggers": [{
         "id": 8068,
         "onSuccess": ["js-step"],
         "onFailure": [],
         "type": "manual",
         "async": true,
         "name": "trigger",
         "properties": {}
    }],

     

    Now, I can create a new procedure template by uploading this JSON. It will create the full procedure with all of the steps and variables. The only difference will be that the trigger is now manual.
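    If you do this often, the Method 1 edit can itself be scripted. Here is a sketch in plain JavaScript (the makeManual helper is my own name, not part of the platform): it takes the exported procedure object, flips every trigger to manual, and clears the trigger properties.

```javascript
// Convert an exported event-based procedure object to a manual one.
// Works on a deep copy so the original export is left untouched.
function makeManual(procedure) {
  const copy = JSON.parse(JSON.stringify(procedure));
  (copy.triggers || []).forEach(function (trigger) {
    trigger.type = "manual";  // was "event"
    trigger.properties = {};  // elementInstanceId is not needed for a manual trigger
  });
  return copy;
}
```

    Run it against the exported file's parsed contents, then upload the result as a new procedure template.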

     

    Method 2 - Delete Original Trigger, Add a New One

    The second method is all in the UI. First, open your procedure and take note of the first step AFTER the trigger. Now you can click on the trigger step and choose "Edit". In the edit screen, there is an option for "Delete". When you click on Delete (and then confirm), you will be shown the trigger type select screen. Choose your new Trigger type - in this case we would select "Manual". After making a selection, you will be brought back to the procedure editing screen. OH NO... all of my steps are gone! Not to worry. This is why we took note of the first step.

     

    Click on the Plus sign to add a new step. From the "Add Procedure Step" screen, switch to "Add from Existing" and select the step with the same name that you took note of. After selecting the step to add, you'll see that all of the original child steps have been added back in.

     

    So what do I need to send in as the data? All we need are the minimum values required for an event-triggered formula: a payload that contains the 'events' object. The JSON should look like this:

    {
         "events": [{
              "objectType": "leads",
              "objectId": "37cf6e1e-9130-11ea-92bb-02d60046d9de",
              "eventType": "CREATED"
         }]
    }

     

    What data did I send in? From my previous procedure, I know that objectType should equal "leads". objectId comes from the GUID of the lead in Sugar. Since it is a new lead, the eventType will be "CREATED". That should do it!

     

    Wait, wait, wait… My first step in the procedure was looking for data at trigger.event.objectType. But when I send in this JSON object, I get an error telling me that trigger.event is undefined. By logging out the trigger object (with console.log), I can see that it now looks like this:

    [
      {
        "args": {
          "events": [
            {
              "date": "2019-02-10T19:07:00Z",
              "eventType": "CREATED",
              "objectId": "37cf6e1e-9130-11ea-92bb-02d60046d9de",
              "objectType": "leads"
            }
          ]
        }
      },
      "new applicant"
    ]

     

    No problem, I will update my step to now look for trigger.args.events[0].objectType. I chose to use the 0 index on events because I control the data and I am planning on only sending one lead in at a time. So, all I need is the first event in the array.
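    In the JS step itself, that lookup can be written defensively. Here is a sketch of how I read the first event out of the manual trigger payload (the helper name is mine):

```javascript
// Read the first event from a manual trigger's payload.
// The payload arrives under trigger.args, as seen in the console.log output above.
function readFirstEvent(trigger) {
  var events = (trigger.args && trigger.args.events) || [];
  if (events.length === 0) {
    throw new Error("No events found in trigger payload");
  }
  return events[0]; // e.g. { objectType: "leads", objectId: "...", eventType: "CREATED" }
}
```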

     

    After making these updates, my procedure is working as I had hoped - doing exactly what it was doing as an event-based procedure but via a manual trigger.

     

    That's fine in the debugger. But how do I set this up so that I can use the procedure as a resource (call it externally)?

     

    Calling the Procedure from Postman

    There are currently two ways to call this procedure externally. One is by posting to /formulas/instances/{instanceId}/executions. This method is asynchronous and usually completes within 15 minutes. The other method uses a feature that is currently in beta but executes the procedure immediately. I'm going to try the beta method. The standard method only sends a response telling me that the request was received, while the synchronous method shows me my results directly from the procedure. I also like this new method because I can give my endpoint a custom name.

     

    In the procedure editor, I can hit the Edit button in the top right and show the advanced options. In the section titled "Execute Procedure via API (BETA)" I will select POST and create an endpoint name. For my purposes, I called it "shaheen". I chose this name because there's no way I will forget that name and I know it won't conflict with any other endpoint.

     

    This is important to remember because each Procedure that you open up for external use is added as an endpoint to the Sugar Integrate API. This API is technically shared with other users and organizations. So, if someone in your organization knows your procedure instance ID, they can call this endpoint. Don't worry, no one outside of your organization will be able to; they will get an "invalid organization or user secret" error.

     

    Now all that's left to do is try calling this new endpoint externally. Let's move into Postman. I know I will need to set up some headers. In Sugar Integrate, I can grab my org and user secret by clicking on the profile icon in the bottom left of the application screen.

     

    I will also need the ID of the procedure instance. That can be found in the instances section of Sugar Integrate.

     

    For Postman, I must add 2 headers: Authorization and Elements-Formula-Instance-Id (these are case-sensitive). The value for Elements-Formula-Instance-Id is simply the Procedure instance ID that we just located. The Authorization value should look like this:

    User xxxxYOUR_USER_SECRET_FROM_INTEGRATExxxx=, Organization xxxxYOUR_ORGANIZATION_SECRET_FROM_INTEGRATExxxx

     

    Now I can add the body which is the JSON we used earlier:

    {
         "events": [{
              "objectType": "leads",
              "objectId": "37cf6e1e-9130-11ea-92bb-02d60046d9de",
              "eventType": "CREATED"
         }]
    }

    The URL of the endpoint in Sugar Integrate is https://api-us.integrate.sugarapps.com/elements/api-v2/shaheen. Yours will look similar, ending with the endpoint name you chose.
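    Outside of Postman, the same request can be made from any HTTP client. Here is a sketch that builds the request options - the helper name is mine, and the secrets and instance ID are placeholders:

```javascript
// Build the request options for calling the manual procedure endpoint.
// userSecret, orgSecret, and instanceId are placeholders - never hard-code real secrets.
function buildExecutionRequest(userSecret, orgSecret, instanceId, payload) {
  return {
    method: "POST",
    headers: {
      "Authorization": "User " + userSecret + ", Organization " + orgSecret,
      "Elements-Formula-Instance-Id": String(instanceId),
      "Content-Type": "application/json"
    },
    body: JSON.stringify(payload)
  };
}

// Usage sketch:
// fetch("https://api-us.integrate.sugarapps.com/elements/api-v2/shaheen",
//       buildExecutionRequest("YOUR_USER_SECRET", "YOUR_ORG_SECRET", "1422812", evtData))
//   .then(function (res) { return res.json(); });
```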

     

    That's it. We should be able to make the request now. When I hit "send", I will receive a success response that looks something like this:

     

    There should also be a message that was sent to Slack (because that's what this procedure does). And, finally I can look at the executions for the Procedure instance and see a successful one.

     

     

    Calling the Procedure from Sugar Sell

    I'm feeling good about our external manual procedure. All that's left is to create the button in Sugar and have it call this new endpoint.

     

    To do so, I have created a Module Loadable Package (MLP). The package is attached to this article. The package will copy 3 files into the custom directory:

    1. record.php - which is an altered copy of /modules/Leads/clients/base/views/record/record.php that adds the button to the record view of the Leads module
    2. record.js - the controller that will contain the JavaScript code for the button's action
    3. en_us.slack-button.php - a language file for the text of our new button

     

    In record.php, I added the following item to the buttons array:

    array(
        'type' => 'divider',
    ),
    array(
        'type' => 'rowaction',
        'event' => 'button:slack_button:click',
        'name' => 'slack_button',
        'label' => 'LBL_SLACK_BUTTON_LABEL',
        'acl_action' => 'view',
    ),

    This creates a button in the record view dropdown menu for a Lead. The text for it will be defined as LBL_SLACK_BUTTON_LABEL in our language file. When clicked, the button will call the slack_button function from our controller. Oh, and I added a divider to make it look nicer. If we upload our MLP with just the record.php and language file, we would see our new menu item at the bottom of the menu with a divider like this:

     

    Now, let's hook up the button. In the JS controller file, I added a listener to the init function for the new button.

    this.context.on('button:slack_button:click', this.slack_button, this);

    Then, I added the actual function that the listener is referring to.

     

        slack_button: function() {
            //example of getting field data from current record
            var LeadID = this.model.get('id');
            var evtData = {
                 "events": [
                     {
                         "objectType": "leads",
                         "objectId": LeadID,
                         "date": "2019-02-10T19:07:00Z",
                         "eventType": "CREATED"
                     }
                 ]
            };

            $.ajax({
                url : "https://api-us.integrate.sugarapps.com/elements/api-v2/shaheen",
                type: "POST",
                data : JSON.stringify(evtData),
                contentType : "application/json",
                dataType: "json",
                beforeSend: function (xhr) {
                    xhr.setRequestHeader("Authorization", "User xxxxYOUR_USER_SECRET_FROM_INTEGRATExxxx=, Organization xxxxYOUR_ORGANIZATION_SECRET_FROM_INTEGRATExxxx");
                    xhr.setRequestHeader("Elements-Formula-Instance-Id", "1422812");
                },
                success: function(data, textStatus, jqXHR) {
                    app.alert.show('slack-button-pressed', {
                        level: 'success',
                        messages: 'Successful API call for Applicant with ID# ' + LeadID + '.',
                        autoClose: false
                    });
                },
                error: function (jqXHR, textStatus, errorThrown) {
                    app.alert.show('slack-button-pressed', {
                        level: 'error',
                        messages: 'API call failed for Applicant with ID# ' + LeadID + '.',
                        autoClose: false
                    });
                }
            });

        }

    This function first grabs the GUID of the current Lead record by calling this.model.get('id'). Then I constructed the JSON for the body of the request and assigned it to a variable called evtData. Now all we need is a jQuery AJAX call. Sugar Integrate allows cross-domain requests, so it is not necessary to proxy the request through the back end - BUT you really should. In this example, all of the code is written in JavaScript, which means my org and user secrets are exposed to anyone who wants to inspect the code. I do not recommend this practice. But, for this example exercise, it was the simplest method.

     

    I should point out a few things about the AJAX request we are making. The first 2 properties should make sense: the URL of the endpoint is the same as we used in Postman, and the method is POST. For the data, it is important to set the contentType to application/json and the dataType to json. To ensure that the data is formatted correctly, remember to call JSON.stringify on the data object.

     

    For our headers (like in Postman), we will use the beforeSend function. Follow the example above to add the 2 headers that we used in Postman. 

     

    In the event handlers (success and error), I simply made calls to app.alert.show that will display a success or failure message.

     

    Upload and install the MLP, and we should have a working button that will trigger our procedure from Sugar Integrate. All the same tests apply: see the success/error message in Sugar Sell, verify there's a new message in Slack, look at the executions for the instance in Sugar Integrate.

     

    That is it! We have successfully converted a procedure in Sugar Integrate to a manual trigger that is accessible from the outside. We then added a button in Sugar Sell to call that procedure. It took some work to get here. I had to console.log in the procedure for debugging and use the API docs often. But now that I know how to do it, it's not that difficult.

     

    Enabling Debug Logging

    Oh, speaking of debugging, console.logs don't show up in the procedure debugger by default. You have to turn on debugLogging for EACH procedure template. I've touched on this in a couple videos. Here's a quick explanation:

    1. Go to the API docs for the Sugar Integrate Platform (top right of the entire screen) - not the API docs for any particular Adapter or Procedure
    2. Choose Procedures from the submenu on the left
    3. Choose the PATCH endpoint
    4. Click the TRY IT OUT button
    5. Enter the procedure ID
    6. Use this as the body: {"debugLoggingEnabled": true}
    7. Execute
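    The steps above boil down to a single PATCH request. Here is a sketch of the headers and body that the TRY IT OUT button sends on your behalf - the helper name is mine and the secrets are placeholders; the exact request URL is whatever the Platform API docs show for the PATCH endpoint:

```javascript
// Build the PATCH request that enables debug logging for one procedure template.
// userSecret and orgSecret are placeholders - never hard-code real secrets.
function buildDebugLoggingPatch(userSecret, orgSecret) {
  return {
    method: "PATCH",
    headers: {
      "Authorization": "User " + userSecret + ", Organization " + orgSecret,
      "Content-Type": "application/json"
    },
    body: JSON.stringify({ debugLoggingEnabled: true }) // step 6's body
  };
}
```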

     

    Now you will see more verbose logging in the Procedure debugger for this instance. Hope that helps!