Salesforce Alexa Integration


Wouldn't it be interesting if Alexa could fetch the details of new cases from Salesforce?

Yes! You heard it right. In this article, I will walk you through the steps to integrate Salesforce with Amazon Alexa.

The scope of this document is restricted to showcasing one of the capabilities, not solving a particular business use case.

What is Alexa? Amazon Alexa, also known simply as Alexa, is a virtual assistant technology developed by Amazon. First used in the Amazon Echo smart speakers, it is capable of voice interaction, music playback, and more. Users can extend Alexa's capabilities by installing "skills": additional functionality developed by third-party vendors (more commonly called apps in other settings), such as weather programs and audio features.

Design - We will create a Java application that connects to Salesforce and Amazon Alexa through their respective APIs. We will use AWS Lambda to run our code; AWS Lambda lets you run code without provisioning or managing servers. We will create a custom Alexa skill (using the Alexa developer console) that can be enabled on Alexa-supported devices and takes user inputs/commands. The Alexa skill will handle the user request and forward it to the AWS Lambda function to get the response.

Prerequisites
API Connectivity - We will use the Salesforce SOAP API and the Amazon Alexa Skills Kit (ASK) SDK to establish a connection between Salesforce and Alexa. You can also connect to Salesforce via the other APIs it provides, such as the REST API.
Eclipse IDE + Java project - Let us start with a backend Java application that can handle requests from Alexa and connect to Salesforce to retrieve the appropriate data for user commands.
Prerequisites for the Salesforce connection:
  • Log in to Salesforce
  • Go to Setup and type API in the Quick Find search
  • Click on Enterprise WSDL to download the WSDL file. You will need to generate the required JAR files using the Web Service Connector (WSC) from Salesforce. More information on the Web Service Connector here.
Once you have the enterprise.jar file ready, you will need to import it into your Gradle project. You will also need the dependencies for the Alexa integration as below (more details about the dependencies here): the ask-sdk and aws-lambda-java dependencies for Alexa connectivity, plus the enterprise JAR (the external JAR you created in the step above) and force-wsc for Salesforce connectivity.

        
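For reference, a minimal build.gradle sketch covering these dependencies might look like the following. The versions, the libs/enterprise.jar path, and the fat-jar configuration are assumptions; adjust them to your project (the fat JAR is what we will later upload for the Lambda function).

dependencies {
    // Alexa Skills Kit SDK and AWS Lambda support (versions shown are examples only)
    implementation 'com.amazon.alexa:ask-sdk:2.20.2'
    implementation 'com.amazonaws:aws-lambda-java-core:1.2.1'

    // Salesforce Web Service Connector plus the enterprise JAR generated from the Enterprise WSDL
    implementation 'com.force.api:force-wsc:52.0.0'
    implementation files('libs/enterprise.jar')
}

// Bundle the runtime dependencies into the JAR we upload for the Lambda function
// (the Gradle Shadow plugin is a common alternative to this hand-rolled fat jar).
jar {
    duplicatesStrategy = DuplicatesStrategy.EXCLUDE
    from {
        configurations.runtimeClasspath.collect { it.isDirectory() ? it : zipTree(it) }
    }
}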

For this use case, we will create the four Java classes below (minimal sketches of each follow the list).

  • LaunchRequestHandler.java - Handles the initial user request
  • CaseRequestHandler.java - Handles requests for Salesforce case information
  • QuerySalesforce.java - Contains the logic to retrieve case information from Salesforce
  • SalesforceAlexaSkillStreamHandler.java - The Lambda entry point that registers the two request handlers above
        
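A minimal sketch of the two request handlers, built on the ASK SDK v2 for Java, could look like the code below (each class goes in its own file). The intent name "CaseIntent", the speech text, and the QuerySalesforce.getNewCaseSummary helper are illustrative assumptions; align them with your own interaction model and code.

// LaunchRequestHandler.java - responds when the skill is opened without a specific question
import com.amazon.ask.dispatcher.request.handler.HandlerInput;
import com.amazon.ask.dispatcher.request.handler.RequestHandler;
import com.amazon.ask.model.LaunchRequest;
import com.amazon.ask.model.Response;
import java.util.Optional;
import static com.amazon.ask.request.Predicates.requestType;

public class LaunchRequestHandler implements RequestHandler {
    @Override
    public boolean canHandle(HandlerInput input) {
        return input.matches(requestType(LaunchRequest.class));
    }

    @Override
    public Optional<Response> handle(HandlerInput input) {
        String speechText = "Welcome. You can ask me about your new Salesforce cases.";
        return input.getResponseBuilder()
                .withSpeech(speechText)
                .withReprompt(speechText)
                .build();
    }
}

// CaseRequestHandler.java - answers the case intent by delegating to QuerySalesforce
import com.amazon.ask.dispatcher.request.handler.HandlerInput;
import com.amazon.ask.dispatcher.request.handler.RequestHandler;
import com.amazon.ask.model.Response;
import java.util.Optional;
import static com.amazon.ask.request.Predicates.intentName;

public class CaseRequestHandler implements RequestHandler {
    @Override
    public boolean canHandle(HandlerInput input) {
        return input.matches(intentName("CaseIntent"));
    }

    @Override
    public Optional<Response> handle(HandlerInput input) {
        String speechText = QuerySalesforce.getNewCaseSummary();
        return input.getResponseBuilder()
                .withSpeech(speechText)
                .build();
    }
}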


        
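The Salesforce query logic and the stream handler that ties everything together might be sketched as follows. The SOQL filter, the environment variable names, and the placeholder skill ID are assumptions for illustration; EnterpriseConnection, Connector, and QueryResult come from the enterprise.jar generated earlier, and ConnectorConfig comes from force-wsc.

// QuerySalesforce.java - logs in via the SOAP (enterprise) API and counts new cases
import com.sforce.soap.enterprise.Connector;
import com.sforce.soap.enterprise.EnterpriseConnection;
import com.sforce.soap.enterprise.QueryResult;
import com.sforce.ws.ConnectionException;
import com.sforce.ws.ConnectorConfig;

public class QuerySalesforce {

    public static String getNewCaseSummary() {
        ConnectorConfig config = new ConnectorConfig();
        // Credentials are read from Lambda environment variables in this sketch;
        // the security token is appended to the password, as the SOAP login requires.
        config.setUsername(System.getenv("SF_USERNAME"));
        config.setPassword(System.getenv("SF_PASSWORD") + System.getenv("SF_SECURITY_TOKEN"));
        try {
            EnterpriseConnection connection = Connector.newConnection(config);
            QueryResult result = connection.query(
                    "SELECT Id, CaseNumber, Subject FROM Case WHERE Status = 'New'");
            return "You have " + result.getSize() + " new cases.";
        } catch (ConnectionException e) {
            return "Sorry, I could not reach Salesforce right now.";
        }
    }
}

// SalesforceAlexaSkillStreamHandler.java - the Lambda entry point that registers both handlers
import com.amazon.ask.Skill;
import com.amazon.ask.Skills;
import com.amazon.ask.SkillStreamHandler;

public class SalesforceAlexaSkillStreamHandler extends SkillStreamHandler {

    private static Skill getSkill() {
        return Skills.standard()
                .addRequestHandlers(new LaunchRequestHandler(), new CaseRequestHandler())
                // Replace with the skill ID you note down from the Alexa developer console.
                .withSkillId("amzn1.ask.skill.REPLACE-WITH-YOUR-SKILL-ID")
                .build();
    }

    public SalesforceAlexaSkillStreamHandler() {
        super(getSkill());
    }
}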


For details on the Amazon-related classes, such as the LaunchRequestHandler class, refer to this document. Compile all the classes and run the JAR build as a Gradle task to generate the JAR file. We will upload this JAR, as our Alexa skill's backend logic, to an Amazon S3 location in some time. You can also use a Java Maven project to build the executable JAR with the above-mentioned dependencies.

Custom Alexa Skill - Next, we will create a custom Alexa skill that will redirect the user request to the Java application created above. Log in to the Alexa developer console from here and click on Create Skill. You can give a skill name of your choice and choose the Custom model. As we will be using an AWS Lambda function to host our backend in Java, select the ‘Provision your own’ option for the question ‘Choose a method to host your skill's backend resources’.

On the left accordion, click Invocation to change the skill invocation name. I have kept it as ‘geeksoft salesforce’, as in the screenshot below. Alexa users say a skill's invocation name to begin an interaction with a particular custom skill. For example, with the invocation name "geeksoft salesforce", users can say:

                 User: “Alexa, ask geeksoft salesforce how many new cases are assigned to me”

Remember to hit the ‘Save Model’ button at the top to save your changes. This saves your data but does not build the model; we will look at ‘Build Model’ in a moment.

You will need to create an intent that can respond to Salesforce queries. If you have already worked with Einstein Bots on Salesforce Service Cloud, intents and model building should not be a surprise to you. More about Salesforce bot intents here.

Click on Interaction Model on the left accordion and then click on Intents. Click ‘Add Intent’ as in the screenshot below to create one. An intent represents an action that fulfills a user's spoken request.


For the Salesforce interaction, I have created a Case intent as in the screenshot below. You can add a good number of utterances; I have included only 5, which should be enough to start with. Utterances are the set of likely spoken phrases mapped to an intent.

          
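For reference, the intent fragment as it would appear in the skill's JSON Editor might look like the following; the intent name and the sample utterances are illustrative only.

{
  "name": "CaseIntent",
  "slots": [],
  "samples": [
    "how many new cases are assigned to me",
    "how many new cases do I have",
    "are there any new cases for me",
    "tell me about my new cases",
    "give me my new case count"
  ]
}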

Next, you need to retrieve a unique ID for your skill. Select Endpoint from the left accordion and then select AWS Lambda ARN. Note down your skill ID, which you will need at a later point when linking the Alexa skill to the AWS Lambda function. The endpoint will receive POST requests when a user interacts with your Alexa skill.


Once you have saved the endpoint, click on Interaction Model on the left accordion and then hit the Build Model button. Alexa will run background jobs to get your AI interaction model ready. This should feel similar to building an Einstein bot model in Salesforce. Your changes will be validated and saved while the interaction model builds.


AWS Lambda function - Once you are done with the above steps, you need to log in to the AWS account to create an IAM role and a Lambda function. An IAM role is an AWS identity with permission policies that determine what the identity can and cannot do in AWS. Log in here using your credentials, and then click on IAM under Services, as in the screenshot below.


Click on roles to create a new role.