
Chapter 7: Performance Considerations

• Set up an Azure Function app for processing files into Cosmos DB document storage.

• Add reporting from Power BI for visualizing the data from Cosmos DB.

• Create an embedded report from the Power BI visualization that can be displayed on a dashboard.

You also sketch out a rough diagram of how the solution might look when using all components. Figure 7-2 illustrates the initial design.

Figure 7-2. API Testing solution overview

EXERCISE: THE APACHE BENCH API TEST SOLUTION

While the individual components of this solution do not seem overly complicated, making them all work together will prove a bit more challenging than you first thought. Following the checklist you put together, you start assembling the components and testing the solution.

1. Create a new folder for your solution on your file system named ab-api-tests. Inside that folder, create folders for your major component types: docker, functions, reporting.

2. Within the docker folder, create directories for your containers: ab, azure-cli, mongo (for local persistence testing), and the function app. Within each of the new directories, create a new Dockerfile.
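The layout from steps 1 and 2 can be sketched in a couple of commands. The functionapp directory name is an assumption; the text only says "the function app":

```shell
# Sketch of the folder layout from steps 1 and 2.
# "functionapp" is a placeholder name for the function app directory.
mkdir -p ab-api-tests/{functions,reporting}
mkdir -p ab-api-tests/docker/{ab,azure-cli,mongo,functionapp}
# Empty Dockerfiles to be filled in by steps 3 and 4.
touch ab-api-tests/docker/{ab,azure-cli,mongo,functionapp}/Dockerfile
```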


3. In the Dockerfile for the custom ab container, use the following to bootstrap the container:

FROM jess/ab
RUN apk --update-cache add bash python libffi openssl gnuplot curl zip
VOLUME [ "/opt/ab/data" ]
WORKDIR /tmp/ab
COPY Urls.txt .
COPY exec.sh .
ENTRYPOINT [ "/bin/bash", "/tmp/ab/exec.sh" ]

4. In azure-cli, use the following to bootstrap the container. Please note that the two ENV variables listed will be used during container execution to upload the resulting zip file from the ab container to a pre-created storage account. Be sure to create the same mount point to allow the CLI container to access the zip file created by the ab container.

FROM azuresdk/azure-cli-python
VOLUME [ "/opt/ab/data" ]
ENV AZURE_STORAGE_ACCOUNT=<<acct-name>>
ENV AZURE_STORAGE_KEY=<<SAS KEY>>
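The upload command itself is not shown at this point. A minimal sketch of a script the azure-cli container could run, assuming the two ENV variables above are set and a blob container already exists in the account: the script name upload.sh and the container name results are assumptions, while az storage blob upload is the actual Azure CLI command.

```shell
# Write a hypothetical upload.sh for the azure-cli container. It pushes
# every zip from the shared volume to blob storage, authenticating via
# AZURE_STORAGE_ACCOUNT / AZURE_STORAGE_KEY from the ENV lines above.
# The "results" blob container name is an assumption.
cat > upload.sh <<'EOF'
#!/bin/bash
for f in /opt/ab/data/*.zip; do
  az storage blob upload \
    --container-name results \
    --file "$f" \
    --name "$(basename "$f")"
done
EOF
chmod +x upload.sh
```

Baking this script in with a COPY and an ENTRYPOINT line, as the ab Dockerfile does for exec.sh, would let the container upload and exit unattended.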

5. For testing purposes, create a file called Urls.txt that contains the route of each API you'd like to test. Ensure it is in the same directory as the ab Dockerfile. Enter two or three test URLs into that file.

6. Construct a bash script that will execute the Apache Bench tests and save the output along with a file that contains information about the URL being tested, the name of the data file, the unique ID of the test run, and the timestamp for when the test was run. Name the file exec.sh (as mentioned in the Dockerfile in step 3). Allow the script to accept the number of requests to use along with the number of concurrent requests to use. The script should read similar to the following:

#!/bin/bash

timestamp() {
    date "+%Y-%m-%d %H:%M:%S"
}

getrunid() {
    python -c 'import uuid; print(uuid.uuid4())'
}

echo "${2} concurrent connections requested"
echo "${1} requests requested"

i=0
rm -f /opt/ab/data/links.txt
touch /opt/ab/data/links.txt
runtime=`timestamp`
runid=`getrunid`

# Run ab once per URL, writing gnuplot-ready data to a numbered .dat file
# and recording the URL, data file, run id, and timestamp in links.txt.
while IFS= read -r line
do
    ((i++))
    ab -n "$1" -c "$2" -r -g "/opt/ab/data/out${i}.dat" "$line"
    echo "${line},out${i}.dat,${runid},${runtime}" >> /opt/ab/data/links.txt
done < "./Urls.txt"

cd /opt/ab/data
zip "${runid}.zip" o*.dat
zip "${runid}.zip" links.txt
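For reference, each run leaves links.txt holding one comma-separated row per URL tested, which is the metadata the downstream processing will rely on. A quick simulation of the row format (the URL is a placeholder, and reading the kernel's random UUID stands in for the script's python one-liner):

```shell
# Build one links.txt row the way exec.sh does:
# URL, data file name, run id, run timestamp.
runid="$(cat /proc/sys/kernel/random/uuid 2>/dev/null || uuidgen)"
runtime="$(date '+%Y-%m-%d %H:%M:%S')"
printf '%s,out1.dat,%s,%s\n' \
  "https://example.com/api/health" "$runid" "$runtime" > links-sample.txt
cat links-sample.txt
```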

7. Create a storage account in Azure that will house the uploaded zip files. You may use PowerShell or the Azure CLI to perform this task. An example of the Azure CLI command is shown as follows:

az storage account create -g <<RESOURCE_GROUP>> -n <<STORAGE_ACCOUNT_NAME>>
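One detail the step doesn't call out: the upload from step 4 also needs a blob container inside that account. A hedged sketch, written as a small setup script; the results container name is an assumption, while az storage container create is the actual CLI command.

```shell
# Hypothetical follow-up to the storage account creation: make the blob
# container that will receive the zip uploads. Assumes the same
# AZURE_STORAGE_ACCOUNT / AZURE_STORAGE_KEY values used by the containers.
cat > create-container.sh <<'EOF'
#!/bin/bash
az storage container create --name results
EOF
chmod +x create-container.sh
```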

8. Build the Docker image by going into the ab directory and typing:

docker build . -t ab:latest

Do the same for the azure-cli directory, changing the name of the tag from ab to azure-cli.
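With both images built, a local smoke test could look like the following sketch. The abdata volume name and the 1000/10 request and concurrency values are arbitrary choices (exec.sh reads them as $1 and $2), and the second command assumes the azure-cli image has been given an upload entrypoint:

```shell
# Hypothetical run-tests.sh: run the ab image (1000 requests at
# concurrency 10), then the azure-cli image against the same named
# volume so it can pick up and upload the resulting zip.
cat > run-tests.sh <<'EOF'
#!/bin/bash
docker run --rm -v abdata:/opt/ab/data ab:latest 1000 10
docker run --rm -v abdata:/opt/ab/data azure-cli:latest
EOF
chmod +x run-tests.sh
```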


