{"id":1393,"date":"2021-12-18T01:38:41","date_gmt":"2021-12-18T01:38:41","guid":{"rendered":"https:\/\/salarydistribution.com\/machine-learning\/2021\/12\/18\/live-call-analytics-for-your-contact-center-with-amazon-language-ai-services\/"},"modified":"2021-12-18T01:38:41","modified_gmt":"2021-12-18T01:38:41","slug":"live-call-analytics-for-your-contact-center-with-amazon-language-ai-services","status":"publish","type":"post","link":"https:\/\/salarydistribution.com\/machine-learning\/2021\/12\/18\/live-call-analytics-for-your-contact-center-with-amazon-language-ai-services\/","title":{"rendered":"Live call analytics for your contact center with Amazon language AI services"},"content":{"rendered":"<div id=\"\">\n<p>Your contact center connects your business to your community, enabling customers to order products, callers to request support, clients to make appointments, and much more. When calls go well, callers retain a positive image of your brand, and are likely to return and recommend you to others. And the converse, of course, is also true.<\/p>\n<p>Naturally, you want to do what you can to ensure that your callers have a good experience. There are two aspects to this:<\/p>\n<ul>\n<li><strong>Help supervisors assess the quality of your caller\u2019s experiences in real time<\/strong> \u2013 For example, your supervisors need to know if initially unhappy callers become happier as the call progresses. And if not, why? What actions can be taken, before the call ends, to assist the agent to improve the customer experience for calls that aren\u2019t going well?<\/li>\n<li><strong>Help agents optimize the quality of your caller\u2019s experiences<\/strong> \u2013 For example, can you deploy live call transcription? 
This removes the need for your agents to take notes during calls, freeing them to focus more attention on providing positive customer interactions.<\/li>\n<\/ul>\n<p><a href=\"https:\/\/aws.amazon.com\/connect\/contact-lens\/\" target=\"_blank\" rel=\"noopener noreferrer\">Contact Lens for Amazon Connect<\/a> provides real-time supervisor and agent assist features that could be just what you need, but you may not yet be using <a href=\"https:\/\/aws.amazon.com\/connect\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Connect<\/a>. You need a solution that works with your existing contact center.<\/p>\n<p>Amazon Machine Learning (ML) services like <a href=\"https:\/\/aws.amazon.com\/transcribe\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Transcribe<\/a> and <a href=\"https:\/\/aws.amazon.com\/comprehend\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Comprehend<\/a> provide feature-rich APIs that you can use to transcribe and extract insights from your contact center audio at scale. Although you could build your own custom call analytics solution using these services, that requires time and resources. 
In this post, we introduce our new sample solution for live call analytics.<\/p>\n<h2>Solution overview<\/h2>\n<p>Our new sample solution, Live Call Analytics (LCA), does most of the heavy lifting associated with providing an end-to-end solution that can plug into your contact center and provide the intelligent insights that you need.<\/p>\n<p>It has a call summary user interface, as shown in the following screenshot.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image001.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone wp-image-31796 size-full\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image001.png\" alt=\"\" width=\"1081\" height=\"368\"><\/a><\/p>\n<p>It also has a call detail user interface.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image003.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31798\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image003.png\" alt=\"\" width=\"891\" height=\"576\"><\/a><\/p>\n<p>LCA currently supports the following features:<\/p>\n<ul>\n<li>Accurate streaming transcription with support for personally identifiable information (PII) redaction and custom vocabulary<\/li>\n<li>Sentiment detection<\/li>\n<li>Automatic scaling to handle call volume changes<\/li>\n<li>Call recording and archiving<\/li>\n<li>A dynamically updated web user interface for supervisors and agents:\n<ul>\n<li>A call summary page that displays a list of in-progress and completed calls, with call timestamps, metadata, and summary statistics like duration, and sentiment trend<\/li>\n<li>Call detail pages showing live turn-by-turn 
transcription of the caller\/agent dialog, turn-by-turn sentiment, and sentiment trend<\/li>\n<\/ul>\n<\/li>\n<li>Standards-based telephony integration with your contact center using Session Recording Protocol (SIPREC)<\/li>\n<li>A built-in standalone demo mode that allows you to quickly install and try out LCA for yourself, without needing to integrate with your contact center telephony<\/li>\n<li>Easy-to-install resources with a single <a href=\"https:\/\/aws.amazon.com\/cloudformation\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS CloudFormation<\/a> template<\/li>\n<\/ul>\n<p>This is just the beginning! We expect to add many more exciting features over time, based on your feedback.<\/p>\n<h2>Deploy the CloudFormation stack<\/h2>\n<p>Start your LCA experience by using AWS CloudFormation to deploy the sample solution with the built-in demo mode enabled.<\/p>\n<p>The demo mode downloads, builds, and installs a small virtual PBX server on an <a href=\"http:\/\/aws.amazon.com\/ec2\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Elastic Compute Cloud<\/a> (Amazon EC2) instance in your AWS account (using the free open-source <a href=\"https:\/\/www.asterisk.org\/get-started\/\" target=\"_blank\" rel=\"noopener noreferrer\">Asterisk<\/a> project) so you can make test phone calls right away and see the solution in action. You can integrate it with your contact center later after evaluating the solution\u2019s functionality for your unique use case.<\/p>\n<ol>\n<li>Use the appropriate <strong>Launch Stack<\/strong> button for the AWS Region in which you\u2019ll use the solution. We expect to add support for additional Regions over time.\n<ul>\n<li>US East (N. 
Virginia) us-east-1\u00a0 <a href=\"https:\/\/us-east-1.console.aws.amazon.com\/cloudformation\/home?region=us-east-1#\/stacks\/create\/review?templateURL=https:\/\/s3.us-east-1.amazonaws.com\/aws-ml-blog-us-east-1\/artifacts\/lca\/lca-main.yaml&amp;stackName=LiveCallAnalytics&amp;param_installDemoAsteriskServer=true\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone wp-image-15948 size-full\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2020\/09\/16\/2-LaunchStack.jpg\" alt=\"\" width=\"107\" height=\"20\"><\/a><\/li>\n<li>US West (Oregon) us-west-2\u00a0 <\/li>\n<\/ul>\n<\/li>\n<\/ol>\n<ol start=\"2\">\n<li>For <strong>Stack name<\/strong>, use the default value, <code>LiveCallAnalytics<\/code>.<\/li>\n<li>For <strong>Install Demo Asterisk Server<\/strong>, use the default value, <code>true<\/code>.<\/li>\n<li>For<strong> Allowed CIDR Block for Demo Softphone<\/strong>, use the IP address of your local computer with a network mask of <code>\/32<\/code>.<\/li>\n<\/ol>\n<p>To find your computer\u2019s IP address, you can use the website <a href=\"https:\/\/checkip.amazonaws.com\/\" target=\"_blank\" rel=\"noopener noreferrer\">checkip.amazonaws.com<\/a>.<\/p>\n<p>Later, you can optionally install a softphone application on your computer, which you can register with LCA\u2019s demo Asterisk server. This allows you to experiment with LCA using real two-way phone calls.<\/p>\n<p>If that seems like too much hassle, don\u2019t worry! Simply leave the default value for this parameter and elect not to register a softphone later. You will still be able to test the solution. 
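If you'd rather script the CIDR parameter, the /32 value is just your public IPv4 address with a single-host network mask. Here's a minimal sketch (the helper function name is mine, not part of LCA; checkip.amazonaws.com returns your public IP as plain text):

```python
import ipaddress

def single_host_cidr(ip: str) -> str:
    # Validate the address and return it as a /32 (single-host) CIDR block.
    return f"{ipaddress.IPv4Address(ip)}/32"

# To look up your public IP first (requires internet access):
#   import urllib.request
#   ip = urllib.request.urlopen("https://checkip.amazonaws.com").read().decode().strip()
print(single_host_cidr("203.0.113.25"))  # -> 203.0.113.25/32
```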
When the demo Asterisk server doesn\u2019t detect a registered softphone, it automatically simulates the agent side of the conversation using a built-in audio recording.<\/p>\n<ol start=\"4\">\n<li>For <strong>Allowed CIDR List for SIPREC Integration<\/strong>, leave the default value.<\/li>\n<\/ol>\n<p>This parameter isn\u2019t used for demo mode installation. Later, when you want to integrate LCA with your contact center audio stream, you use this parameter to specify the IP address of your SIPREC source hosts, such as your Session Border Controller (SBC) servers.<\/p>\n<ol start=\"5\">\n<li>For <strong>Authorized Account Email Domain<\/strong>, use the domain name part of your corporate email address (this allows others with email addresses in the same domain to sign up for access to the UI).<\/li>\n<li>For <strong>Call Audio Recordings Bucket Name<\/strong>, leave the value blank to have an <a href=\"http:\/\/aws.amazon.com\/s3\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Simple Storage Service<\/a> (Amazon S3) bucket for your call recordings automatically created for you. 
Otherwise, use the name of an existing S3 bucket where you want your recordings to be stored.<\/li>\n<li>For all other parameters, use the default values.<\/li>\n<\/ol>\n<p>If you want to customize the settings later, for example to apply PII redaction or custom vocabulary to improve accuracy, you can update the stack for these parameters.<\/p>\n<ol start=\"8\">\n<li>Check the two acknowledgement boxes, and choose <strong>Create stack<\/strong>.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image006.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31799\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image006.png\" alt=\"\" width=\"599\" height=\"229\"><\/a><\/li>\n<\/ol>\n<p>The main CloudFormation stack uses nested stacks to create the following resources in your AWS account:<\/p>\n<ul>\n<li>S3 buckets to hold build artifacts and call recordings<\/li>\n<li>An EC2 instance (t4g.large) with the demo Asterisk server installed, with VPC, security group, Elastic IP address, and internet gateway<\/li>\n<li>An <a href=\"https:\/\/aws.amazon.com\/chime\/features\/voice-connector\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Chime Voice Connector<\/a>, configured to stream audio to <a href=\"https:\/\/aws.amazon.com\/kinesis\/video-streams\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Kinesis Video Streams<\/a><\/li>\n<li>An <a href=\"http:\/\/aws.amazon.com\/ecs\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Elastic Container Service<\/a> (Amazon ECS) instance that runs containers in <a href=\"https:\/\/aws.amazon.com\/fargate\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS Fargate<\/a> to relay streaming audio from Kinesis Video Streams to <a href=\"https:\/\/aws.amazon.com\/transcribe\/\" target=\"_blank\" 
rel=\"noopener noreferrer\">Amazon Transcribe<\/a> and record transcription segments in <a href=\"https:\/\/aws.amazon.com\/dynamodb\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon DynamoDB<\/a>, with VPC, NAT gateways, Elastic IP addresses, and internet gateway<\/li>\n<li>An <a href=\"https:\/\/aws.amazon.com\/lambda\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS Lambda<\/a> function to create and store final stereo call recordings<\/li>\n<li>A DynamoDB table to store call and transcription data, with Lambda stream processing that adds analytics to the live call data<\/li>\n<li>The <a href=\"https:\/\/aws.amazon.com\/appsync\" target=\"_blank\" rel=\"noopener noreferrer\">AWS AppSync<\/a> API, which provides a GraphQL endpoint to support queries and real-time updates<\/li>\n<li>Website components including S3 bucket, <a href=\"https:\/\/aws.amazon.com\/cloudfront\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon CloudFront<\/a> distribution, and <a href=\"https:\/\/aws.amazon.com\/cognito\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Cognito<\/a> user pool<\/li>\n<li>Other miscellaneous supporting resources, including <a href=\"https:\/\/aws.amazon.com\/iam\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS Identity and Access Management<\/a> (IAM) roles and policies (using least privilege best practices), <a href=\"https:\/\/aws.amazon.com\/vpc\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Virtual Private Cloud<\/a> (Amazon VPC) resources, <a href=\"https:\/\/aws.amazon.com\/eventbridge\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon EventBridge<\/a> event rules, and <a href=\"https:\/\/aws.amazon.com\/cloudwatch\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon CloudWatch<\/a> log groups.<\/li>\n<\/ul>\n<p>The stacks take about 20 minutes to deploy. 
The main stack status shows CREATE_COMPLETE when everything is deployed.<\/p>\n<p><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image007.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31800\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image007.png\" alt=\"\" width=\"330\" height=\"100\"><\/a><\/p>\n<h2>Create a user account<\/h2>\n<p>We now open the web user interface and create a user account.<\/p>\n<ol>\n<li>On the AWS CloudFormation console, choose the main stack, <code>LiveCallAnalytics<\/code>, and choose the <strong>Outputs<\/strong> tab.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image008.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31801\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image008.png\" alt=\"\" width=\"836\" height=\"465\"><\/a><\/li>\n<li>Open your web browser to the URL shown as <code>CloudfrontEndpoint<\/code> in the outputs.<\/li>\n<\/ol>\n<p>You\u2019re directed to the login page.<\/p>\n<ol start=\"3\">\n<li>Choose <strong>Create account<\/strong>.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image010.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31802\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image010.png\" alt=\"\" width=\"210\" height=\"220\"><\/a><\/li>\n<li>For <strong>Username<\/strong>, use your email address that belongs to the email address domain you provided 
earlier.<\/li>\n<li>For <strong>Password<\/strong>, use a sequence that has a length of at least 8 characters, and contains uppercase and lowercase characters, plus numbers and special characters.<\/li>\n<li>Choose <strong>CREATE ACCOUNT. <\/strong><\/li>\n<\/ol>\n<p>The <strong>Confirm Sign up <\/strong>page appears.<\/p>\n<p><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image011.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31812\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image011.png\" alt=\"\" width=\"191\" height=\"194\"><\/a><br \/>Your confirmation code has been emailed to the email address you used as your username. Check your inbox for an email from <code>no-reply@verificationemail.com<\/code> with subject \u201cAccount Verification.\u201d<\/p>\n<ol start=\"7\">\n<li>For <strong>Confirmation Code<\/strong>, copy and paste the code from the email.<\/li>\n<li>Choose <strong>CONFIRM. <\/strong><\/li>\n<\/ol>\n<p>You\u2019re now logged in to LCA.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image012.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31803\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image012.png\" alt=\"\" width=\"882\" height=\"327\"><\/a><\/p>\n<h2>Make a test phone call<\/h2>\n<p>Call the number shown as <code>DemoPBXPhoneNumber<\/code> in the AWS CloudFormation outputs for the main <code>LiveCallAnalytics<\/code> stack.<\/p>\n<p>You haven\u2019t yet registered a softphone app, so the demo Asterisk server picks up the call and plays a recording. 
Listen to the recording, and answer the questions when prompted. Your call is streamed to the LCA application, and is recorded, transcribed, and analyzed. When you log in to the UI later, you can see a record of this call.<\/p>\n<h2>Optional: Install and register a softphone<\/h2>\n<p>If you want to use LCA with live two-person phone calls instead of the demo recording, you can register a softphone application with your new demo Asterisk server.<\/p>\n<p>The following <a href=\"https:\/\/github.com\/aws-samples\/amazon-transcribe-live-call-analytics\/blob\/main\/lca-chimevc-stack\/Asterisk.md#optional-client-configuration\" target=\"_blank\" rel=\"noopener noreferrer\">README<\/a> has step-by-step instructions for downloading, installing, and registering a free (for non-commercial use) softphone on your local computer. The registration is successful only if <strong>Allowed CIDR Block for Demo Softphone <\/strong>correctly reflects your local machine\u2019s IP address. If you got it wrong, or if your IP address has changed, you can choose the <code>LiveCallAnalytics<\/code> stack in AWS CloudFormation, and choose <strong>Update<\/strong> to provide a new value for <strong>Allowed CIDR Block for Demo Softphone.<\/strong><\/p>\n<p>If you still can\u2019t successfully register your softphone, and you are connected to a VPN, disconnect and update <strong>Allowed CIDR Block for Demo Softphone<\/strong>\u2014corporate VPNs can restrict IP voice traffic.<\/p>\n<p>When your softphone is registered, call the phone number again. Now, instead of playing the default recording, the demo Asterisk server causes your softphone to ring. Answer the call on the softphone, and have a two-way conversation with yourself! 
Better yet, ask a friend to call your Asterisk phone number, so you can simulate a contact center call by role playing as caller and agent.<\/p>\n<h2>Explore live call analysis features<\/h2>\n<p>Now, with LCA successfully installed in demo mode, you\u2019re ready to explore the call analysis features.<\/p>\n<ol>\n<li>Open the LCA web UI using the URL shown as <code>CloudfrontEndpoint<\/code> in the main stack outputs.<\/li>\n<\/ol>\n<p>We suggest bookmarking this URL\u2014you\u2019ll use it often!<\/p>\n<ol start=\"2\">\n<li>Make a test phone call to the demo Asterisk server (as you did earlier).\n<ol type=\"a\">\n<li>If you registered a softphone, it rings on your local computer. Answer the call, or better, have someone else answer it, and use the softphone to play the agent role in the conversation.<\/li>\n<li>If you didn\u2019t register a softphone, the Asterisk server demo audio plays the role of agent.<\/li>\n<\/ol>\n<\/li>\n<\/ol>\n<p>Your phone call almost immediately shows up at the top of the call list on the UI, with the status <code>In progress<\/code>.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image014.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31804\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image014.png\" alt=\"\" width=\"927\" height=\"203\"><\/a><\/p>\n<p>The call has the following details:<\/p>\n<ul>\n<li><strong>Call ID<\/strong> \u2013 A unique identifier for this telephone call<\/li>\n<li><strong>Initiation Timestamp<\/strong> \u2013 Shows the time the telephone call started<\/li>\n<li><strong>Caller Phone Number<\/strong> \u2013 Shows the number of the phone from which you made the call<\/li>\n<li><strong>Status<\/strong> \u2013 Indicates that the call is in progress<\/li>\n<li><strong>Caller Sentiment<\/strong> 
\u2013 The average caller sentiment<\/li>\n<li><strong>Caller Sentiment Trend<\/strong> \u2013 Whether the caller\u2019s sentiment is trending more positive or more negative<\/li>\n<li><strong>Duration<\/strong> \u2013 The elapsed time since the start of the call<\/li>\n<\/ul>\n<ol start=\"3\">\n<li>Choose the call ID of your <code>In progress<\/code> call to open the live call detail page.<\/li>\n<\/ol>\n<p><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image016.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31805\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image016.png\" alt=\"\" width=\"980\" height=\"587\"><\/a><\/p>\n<p>As you talk on the phone from which you made the call, your voice and the voice of the agent are transcribed in real time and displayed in the auto-scrolling <strong>Call Transcript<\/strong> pane.<\/p>\n<p>Each turn of the conversation (customer and agent) is annotated with a sentiment indicator. 
As the call continues, the sentiment for both caller and agent is aggregated over a rolling time window, so it\u2019s easy to see if sentiment is trending in a positive or negative direction.<\/p>\n<ol start=\"4\">\n<li>End the call.<\/li>\n<li>Navigate back to the call list page by choosing <strong>Calls<\/strong> at the top of the page.<\/li>\n<\/ol>\n<p><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image018.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31806\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image018.png\" alt=\"\" width=\"391\" height=\"90\"><\/a><\/p>\n<p>Your call is now displayed in the list with the status Done<em>.<\/em><\/p>\n<ol start=\"6\">\n<li>To display call details for any call, choose the call ID to open the details page, or select the call to display the <strong>Calls<\/strong> list and <strong>Call Details<\/strong> pane on the same page.<\/li>\n<\/ol>\n<p>You can change the orientation to a side-by-side layout using the <strong>Call Details<\/strong> settings tool (gear icon).<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image019.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31807\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image019.png\" alt=\"\" width=\"869\" height=\"497\"><\/a><\/p>\n<p>You can make a few more phone calls to become familiar with how the application works. With the softphone installed, ask someone else to call your Asterisk demo server phone number: pick up their call on your softphone and talk with them while watching the turn-by-turn transcription update in real time. 
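The rolling-window sentiment trend you're watching works conceptually like the following sketch (the numeric scores assigned to sentiment labels are illustrative assumptions, not LCA's exact algorithm):

```python
from collections import deque

# Map categorical sentiment labels (like those a service such as Amazon
# Comprehend returns) to signed scores; the mapping is illustrative.
SCORES = {"POSITIVE": 1.0, "NEUTRAL": 0.0, "MIXED": 0.0, "NEGATIVE": -1.0}

class SentimentTrend:
    """Average sentiment over the last `window` turns for one speaker."""

    def __init__(self, window: int = 5):
        self.turns = deque(maxlen=window)

    def add_turn(self, label: str) -> float:
        # Append the newest turn's score and return the windowed average.
        self.turns.append(SCORES[label])
        return sum(self.turns) / len(self.turns)

caller = SentimentTrend(window=3)
for label in ["NEGATIVE", "NEGATIVE", "NEUTRAL", "POSITIVE", "POSITIVE"]:
    trend = caller.add_turn(label)
print(round(trend, 2))  # last window is NEUTRAL, POSITIVE, POSITIVE -> 0.67
```

A caller who starts out unhappy but ends positive shows a rising trend, which is exactly the signal a supervisor wants to see.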
Observe the low latency. Assess the accuracy of transcriptions and sentiment annotation\u2014you\u2019ll likely find that it\u2019s not perfect, but it\u2019s close! Transcriptions are less accurate when you use technical or domain-specific jargon, but you can use <a href=\"https:\/\/www.youtube.com\/watch?v=oBgSJ7bsP2U\" target=\"_blank\" rel=\"noopener noreferrer\">custom vocabulary<\/a> to teach Amazon Transcribe new words and terms.<\/p>\n<h2>Processing flow overview<\/h2>\n<p>How did LCA transcribe and analyze your test phone calls? Let\u2019s take a quick look at how it works.<\/p>\n<p>The following diagram shows the main architectural components and how they fit together at a high level.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image021.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31808\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image021.png\" alt=\"\" width=\"1066\" height=\"542\"><\/a><\/p>\n<p>The demo Asterisk server is configured to use Voice Connector, which provides the phone number and SIP trunking needed to route inbound and outbound calls. When you configure LCA to integrate with your contact center instead of the demo Asterisk server, Voice Connector is configured to integrate instead with your existing contact center using <a href=\"https:\/\/docs.aws.amazon.com\/chime\/latest\/ag\/start-kinesis-vc.html#siprec\" target=\"_blank\" rel=\"noopener noreferrer\">SIP-based media recording (SIPREC) or network-based recording (NBR)<\/a>. In both cases, Voice Connector streams audio to Kinesis Video Streams using two streams per call, one for the caller and one for the agent.<\/p>\n<p>When a new video stream is initiated, an event is fired using EventBridge. 
This event triggers a Lambda function, which uses an <a href=\"https:\/\/aws.amazon.com\/sqs\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Simple Queue Service<\/a> (Amazon SQS) queue to initiate a new call processing job in Fargate, a serverless compute service for containers. A single container instance processes multiple calls simultaneously. AWS auto scaling provisions and de-provisions additional containers dynamically as needed to handle changing call volumes.<\/p>\n<p>The Fargate container immediately creates a streaming connection with Amazon Transcribe and starts consuming and relaying audio fragments from Kinesis Video Streams to Amazon Transcribe.<\/p>\n<p>The container writes the streaming transcription results in real time to a DynamoDB table.<\/p>\n<p>A Lambda function, the Call Event Stream Processor, fed by DynamoDB streams, processes and enriches call metadata and transcription segments. The event processor function interfaces with AWS AppSync to persist changes (mutations) in DynamoDB and to send real-time updates to logged in web clients.<\/p>\n<p>The LCA web UI assets are hosted on Amazon S3 and served via CloudFront. Authentication is provided by Amazon Cognito. In demo mode, user identities are configured in an Amazon Cognito user pool. 
In a production setting, you would likely configure Amazon Cognito to integrate with your existing identity provider (IdP) so authorized users can log in with their corporate credentials.<\/p>\n<p>When the user is authenticated, the web application establishes a secure GraphQL connection to the AWS AppSync API, and subscribes to receive real-time events such as new calls and call status changes for the calls list page, and new or updated transcription segments and computed analytics for the call details page.<\/p>\n<p>The entire processing flow, from ingested speech to live webpage updates, is event driven, and so the end-to-end latency is small\u2014typically just a few seconds.<\/p>\n<h2>Monitoring and troubleshooting<\/h2>\n<p>AWS CloudFormation reports deployment failures and causes on the relevant stack <strong>Events<\/strong> tab. See <a href=\"https:\/\/docs.aws.amazon.com\/AWSCloudFormation\/latest\/UserGuide\/troubleshooting.html\" target=\"_blank\" rel=\"noopener noreferrer\">Troubleshooting CloudFormation<\/a> for help with common deployment problems. Look out for deployment failures caused by <a href=\"https:\/\/docs.aws.amazon.com\/AWSCloudFormation\/latest\/UserGuide\/troubleshooting.html#troubleshooting-errors-limit-exceeded\" target=\"_blank\" rel=\"noopener noreferrer\">limit exceeded<\/a> errors; the LCA stacks create resources such as NAT gateways, Elastic IP addresses, and other resources that are subject to default account and Region Service Quotas.<\/p>\n<p>Amazon Transcribe has a default limit of 25 concurrent transcription streams, which limits LCA to 12 concurrent calls (two streams per call). 
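The arithmetic is simply the stream quota divided by the two streams each call consumes:

```python
def max_concurrent_calls(stream_quota: int, streams_per_call: int = 2) -> int:
    # Each LCA call opens one streaming transcription session per channel
    # (caller and agent), so capacity is the quota divided by two.
    return stream_quota // streams_per_call

print(max_concurrent_calls(25))   # default quota -> 12 concurrent calls
print(max_concurrent_calls(100))  # after a quota increase -> 50
```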
Request an increase for the <a href=\"https:\/\/console.aws.amazon.com\/servicequotas\/home\/services\/transcribe\/quotas\/L-CDB96031\" target=\"_blank\" rel=\"noopener noreferrer\">number of concurrent HTTP\/2 streams for streaming transcription<\/a> if you need to handle a larger number of concurrent calls.<\/p>\n<p>LCA provides runtime monitoring and logs for each component using CloudWatch:<\/p>\n<ul>\n<li><strong>Call trigger Lambda function<\/strong> \u2013 On the Lambda console, open the <code>LiveCallAnalytics-AISTACK-transcribingFargateXXX<\/code> function. Choose the <strong>Monitor<\/strong> tab to see function metrics. Choose <strong>View logs in CloudWatch<\/strong> to inspect function logs.<\/li>\n<li><strong>Call processing Fargate task<\/strong> \u2013 On the Amazon ECS console, choose the <code>LiveCallAnalytics<\/code> cluster. Open the <code>LiveCallAnalytics<\/code> service to see container health metrics. Choose the <strong>Logs<\/strong> tab to inspect container logs.<\/li>\n<li><strong>Call Event Stream Processor Lambda function<\/strong> \u2013 On the Lambda console, open the <code>LiveCallAnalytics-AISTACK-CallEventStreamXXX<\/code> function. Choose the <strong>Monitor<\/strong> tab to see function metrics. Choose <strong>View logs in CloudWatch<\/strong> to inspect function logs.<\/li>\n<li><strong>AWS AppSync API<\/strong> \u2013 On the AWS AppSync console, open the <code>CallAnalytics-LiveCallAnalytics-XXX<\/code> API. Choose <strong>Monitoring<\/strong> in the navigation pane to see API metrics. Choose <strong>View logs in CloudWatch<\/strong> to inspect AppSyncAPI logs.<\/li>\n<\/ul>\n<h2>Cost assessment<\/h2>\n<p>This solution has hourly cost components and usage cost components.<\/p>\n<p>The hourly costs add up to about $0.15 per hour, or $0.22 per hour with the demo Asterisk server enabled. 
For more information about the services that incur an hourly cost, see <a href=\"https:\/\/aws.amazon.com\/fargate\/pricing\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS Fargate Pricing<\/a>, <a href=\"https:\/\/aws.amazon.com\/vpc\/pricing\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon VPC pricing<\/a> (for the NAT gateway), and <a href=\"https:\/\/aws.amazon.com\/ec2\/pricing\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon EC2 pricing<\/a> (for the demo Asterisk server).<\/p>\n<p>The hourly cost components comprise the following:<\/p>\n<ul>\n<li><strong>Fargate container<\/strong> \u2013 2vCPU at $0.08\/hour and 4 GB memory at $0.02\/hour = $0.10\/hour<\/li>\n<li><strong>NAT gateways<\/strong> \u2013 Two at $0.09\/hour<\/li>\n<li><strong>EC2 instance<\/strong> \u2013 t4g.large at $0.07\/hour (for demo Asterisk server)<\/li>\n<\/ul>\n<p>The usage costs add up to about $0.30 for a 5-minute call, although this can vary based on total usage, because usage affects Free Tier eligibility and volume tiered pricing for many services. 
For more information about the services that incur usage costs, see the pricing pages of the AWS services used by the solution.<\/p>\n<p>To explore LCA costs for yourself, use <a href=\"https:\/\/aws.amazon.com\/aws-cost-management\/aws-cost-explorer\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS Cost Explorer<\/a> or choose <a href=\"https:\/\/console.aws.amazon.com\/billing\/home#\/bills\" target=\"_blank\" rel=\"noopener noreferrer\">Bill Details<\/a> on the <a href=\"https:\/\/console.aws.amazon.com\/billing\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS Billing Dashboard<\/a> to see your month-to-date spend by service.<br \/><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image023.png\" target=\"_blank\" rel=\"noopener noreferrer\"><img decoding=\"async\" loading=\"lazy\" class=\"alignnone size-full wp-image-31809\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/ML-5911-image023.png\" alt=\"\" width=\"929\" height=\"575\"><\/a><\/p>\n<h2>Integrate with your contact center<\/h2>\n<p>To deploy LCA to analyze real calls to your contact center using AWS CloudFormation, update the existing <code>LiveCallAnalytics<\/code> demo stack, changing the parameters to disable demo mode.<\/p>\n<p>Alternatively, delete the existing <code>LiveCallAnalytics<\/code> demo stack, and deploy a new <code>LiveCallAnalytics<\/code> stack (using the Launch Stack buttons from the deployment section earlier in this post).<\/p>\n<p>You could also deploy a new <code>LiveCallAnalytics<\/code> stack in a different AWS account or Region.<\/p>\n<p>Use these parameters to configure LCA for contact center integration:<\/p>\n<ol>\n<li>For <strong>Install Demo Asterisk Server<\/strong>, enter <code>false<\/code>.<\/li>\n<li>For <strong>Allowed CIDR Block for Demo Softphone<\/strong>, leave the default value.<\/li>\n<li>For <strong>Allowed CIDR List for SIPREC Integration<\/strong>, use the CIDR blocks of your SIPREC source 
hosts, such as your SBC servers. Use commas to separate CIDR blocks if you enter more than one.<\/li>\n<\/ol>\n<p>When you deploy LCA, a Voice Connector is created for you. Use the Voice Connector documentation as guidance to configure this Voice Connector and your PBX\/SBC for <a href=\"https:\/\/docs.aws.amazon.com\/chime\/latest\/ag\/start-kinesis-vc.html#siprec\" target=\"_blank\" rel=\"noopener noreferrer\">SIP-based media recording (SIPREC) or network-based recording (NBR)<\/a>. The Voice Connector <a href=\"https:\/\/aws.amazon.com\/chime\/voice-connector\/resources\/#Configuration_Guides\" target=\"_blank\" rel=\"noopener noreferrer\">Resources page<\/a> provides some vendor-specific example configuration guides, including:<\/p>\n<ul>\n<li>SIPREC Configuration Guide: Cisco Unified Communications Manager (CUCM) and Cisco Unified Border Element (CUBE)<\/li>\n<li>SIPREC Configuration Guide: Avaya Aura Communication Manager and Session Manager with Sonus SBC 521<\/li>\n<\/ul>\n<p>The LCA GitHub repository has additional vendor-specific notes that you may find helpful; see <a href=\"https:\/\/github.com\/aws-samples\/amazon-transcribe-live-call-analytics\/blob\/main\/lca-chimevc-stack\/SIPREC.md\" target=\"_blank\" rel=\"noopener noreferrer\">SIPREC.md<\/a>.<\/p>\n<h2>Customize your deployment<\/h2>\n<p>Use the following CloudFormation template parameters when creating or updating your stack to customize your LCA deployment:<\/p>\n<ul>\n<li>To use your own S3 bucket for call recordings, use <strong>Call Audio Recordings Bucket Name<\/strong> and <strong>Audio File Prefix<\/strong>.<\/li>\n<li>To redact PII from the transcriptions, set <strong>IsContentRedactionEnabled<\/strong> to <code>true<\/code>. For more information, see <a href=\"https:\/\/docs.aws.amazon.com\/transcribe\/latest\/dg\/pii-redaction-stream.html\" target=\"_blank\" rel=\"noopener noreferrer\">Redacting or identifying PII in a real-time stream<\/a>.<\/li>\n<li>To improve transcription accuracy for technical and domain-specific acronyms and jargon, set <strong>UseCustomVocabulary<\/strong> to the name of a custom vocabulary that you already created in Amazon Transcribe. For more information, see <a href=\"https:\/\/docs.aws.amazon.com\/transcribe\/latest\/dg\/custom-vocabulary.html\" target=\"_blank\" rel=\"noopener noreferrer\">Custom vocabularies<\/a>.<\/li>\n<\/ul>\n<p>LCA is an open-source project. You can fork the <a href=\"https:\/\/github.com\/aws-samples\/amazon-transcribe-live-call-analytics\" target=\"_blank\" rel=\"noopener noreferrer\">LCA GitHub repository<\/a>, enhance the code, and send us pull requests so we can incorporate and share your improvements!<\/p>\n<h2>Clean up<\/h2>\n<p>When you\u2019re finished experimenting with this solution, clean up your resources by opening the AWS CloudFormation console and deleting the <code>LiveCallAnalytics<\/code> stacks that you deployed. This deletes resources that were created by deploying the solution. The recording S3 buckets, DynamoDB table, and CloudWatch Log groups are retained after the stack is deleted to avoid deleting your data.<\/p>\n<h2>Post Call Analytics: Companion solution<\/h2>\n<p>Our companion solution, Post Call Analytics (PCA), offers additional insights and analytics capabilities by using the <a href=\"https:\/\/aws.amazon.com\/transcribe\/call-analytics\/\" target=\"_blank\" rel=\"noopener noreferrer\">Amazon Transcribe Call Analytics<\/a> batch API to detect common issues, interruptions, silences, speaker loudness, call categories, and more. Unlike LCA, which transcribes and analyzes streaming audio in real time, PCA transcribes and analyzes your call recordings after the call has ended. 
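<\/p>
<p>Stack parameters like the ones above can also be changed programmatically with a CloudFormation stack update. The following sketch uses boto3 to point LCA\u2019s recording bucket at another bucket, such as PCA\u2019s ingestion bucket. The parameter keys <code>CallAudioRecordingsBucketName<\/code> and <code>AudioFilePrefix<\/code> are assumptions inferred from the console labels in this post, so verify them against the deployed template before running anything like this.<\/p>

```python
# Sketch: update an existing LiveCallAnalytics stack so call recordings
# are written to a different S3 bucket (e.g., PCA's ingestion bucket).
# CAUTION: the parameter keys below are assumptions inferred from the
# console labels; confirm them against the deployed template first.

def build_parameter_overrides(bucket_name: str, prefix: str) -> list:
    # Only the parameters being changed; see the note below about
    # carrying the remaining parameters forward with UsePreviousValue.
    return [
        {'ParameterKey': 'CallAudioRecordingsBucketName',  # assumed key
         'ParameterValue': bucket_name},
        {'ParameterKey': 'AudioFilePrefix',                # assumed key
         'ParameterValue': prefix},
    ]

def update_lca_stack(bucket_name: str, prefix: str = 'lca-audio') -> None:
    import boto3  # imported here so the helper above works without the SDK
    cfn = boto3.client('cloudformation')
    cfn.update_stack(
        StackName='LiveCallAnalytics',
        UsePreviousTemplate=True,  # reuse the template already deployed
        Parameters=build_parameter_overrides(bucket_name, prefix),
        Capabilities=['CAPABILITY_IAM', 'CAPABILITY_AUTO_EXPAND'],
    )

if __name__ == '__main__':
    for p in build_parameter_overrides('my-pca-ingestion-bucket', 'lca-audio'):
        print(p['ParameterKey'], '->', p['ParameterValue'])
```

<p>In a real update you would also list every other template parameter with <code>UsePreviousValue<\/code> set to true, so that parameters you are not changing keep their current values instead of reverting to template defaults.<\/p>
<p>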
Configure LCA to store call recordings in PCA\u2019s ingestion S3 bucket, and use the two solutions together to get the best of both worlds. For more information, see <a href=\"https:\/\/aws.amazon.com\/blogs\/machine-learning\/post-call-analytics-for-your-contact-center-with-amazon-language-ai-services\/\" target=\"_blank\" rel=\"noopener noreferrer\">Post call analytics for your contact center with Amazon language AI services<\/a>.<\/p>\n<h2>Conclusion<\/h2>\n<p>The Live Call Analytics (LCA) sample solution offers a scalable, cost-effective approach to live call analysis, with features that help supervisors and agents stay focused on your callers\u2019 experience. It uses Amazon ML services like Amazon Transcribe and Amazon Comprehend to transcribe and extract real-time insights from your contact center audio.<\/p>\n<p>The sample LCA application is provided as open source\u2014use it as a starting point for your own solution, and help us make it better by contributing back fixes and features via GitHub pull requests. For expert assistance, <a href=\"https:\/\/aws.amazon.com\/professional-services\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS Professional Services<\/a> and other <a href=\"https:\/\/aws.amazon.com\/partners\/\" target=\"_blank\" rel=\"noopener noreferrer\">AWS Partners<\/a> are here to help.<\/p>\n<p>We\u2019d love to hear from you. 
Let us know what you think in the comments section, or use the issues forum in the <a href=\"https:\/\/github.com\/aws-samples\/amazon-transcribe-live-call-analytics\" target=\"_blank\" rel=\"noopener noreferrer\">LCA GitHub repository<\/a>.<\/p>\n<hr>\n<h3>About the Authors<\/h3>\n<p><strong><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/02\/10\/Bob-Strahan-p.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-full wp-image-21654 alignleft\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/02\/10\/Bob-Strahan-p.png\" alt=\"Bob Strahan\" width=\"100\" height=\"133\"><\/a>Bob Strahan<\/strong> is a Principal Solutions Architect in the AWS Language AI Services team.<\/p>\n<p><strong><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/10\/08\/OliverAtoa_pic_resized.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-full wp-image-29077 alignleft\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/10\/08\/OliverAtoa_pic_resized.png\" alt=\"\" width=\"100\" height=\"120\"><\/a>Oliver Atoa<\/strong> is a Principal Solutions Architect in the AWS Language AI Services team.<\/p>\n<p><strong><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/sagar-blog-bio-image-cropped.png\"><img decoding=\"async\" loading=\"lazy\" class=\"size-full wp-image-31813 alignleft\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/sagar-blog-bio-image-cropped.png\" alt=\"\" width=\"100\" height=\"123\"><\/a>Sagar Khasnis<\/strong> is a Senior Solutions Architect focused on building productivity applications. He is passionate about building innovative solutions using AWS services to help customers achieve their business objectives. 
In his free time, you can find him reading biographies, hiking, working out at a fitness studio, and geeking out on his personal rig at home.<\/p>\n<p><strong><a href=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/court-blog-bio-image.jpg\"><img decoding=\"async\" loading=\"lazy\" class=\"size-full wp-image-31814 alignleft\" src=\"https:\/\/d2908q01vomqb2.cloudfront.net\/f1f836cb4ea6efb2a0b1b99f41ad8b103eff4b59\/2021\/12\/16\/court-blog-bio-image.jpg\" alt=\"\" width=\"100\" height=\"100\"><\/a>Court Schuett<\/strong> is a Chime Specialist Solutions Architect with a background in telephony and now likes to build things that build things.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>https:\/\/aws.amazon.com\/blogs\/machine-learning\/live-call-analytics-for-your-contact-center-with-amazon-language-ai-services\/<\/p>\n","protected":false},"author":0,"featured_media":1394,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[3],"tags":[],"_links":{"self":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/1393"}],"collection":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/types\/post"}],"replies":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/comments?post=1393"}],"version-history":[{"count":0,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/posts\/1393\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media\/1394"}],"wp:attachment":[{"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/media?parent=1393"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http
s:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/categories?post=1393"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/salarydistribution.com\/machine-learning\/wp-json\/wp\/v2\/tags?post=1393"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}