A/B testing is a user research technique used to compare two versions of a design (A and B) and determine which one performs better.
We'll be analyzing the following metrics to see which version performs better:
- Time to Completion: the time it takes to complete the desired task (in this case, put >$150 worth of cacti in your cart)
- Time to Click: the time it takes for the first click to occur on the interface.
- Return Rate: the number of times a user returns to the main shopping page (e.g. going to the cart page and then back to the main shopping page counts as one return)
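For example (the numbers here are made up purely for illustration): if a page finishes loading at timestamp 10,000 ms, the user's first click lands at 12,500 ms, and the cart crosses $150 at 55,000 ms after two trips back to the shopping page, then Time to Click is 2.5 seconds, Time to Completion is 45 seconds, and the Return Rate is 2.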
This project performs simple A/B testing on two versions of a website. The metrics you collect can then be used to see whether the baseline (A) or the variant (B) performed better.
Next studio, you'll be acting as users for each other's websites, which will give you the data you need to perform the analysis. The users will be given the task "Add >$150 worth of cacti to your cart. You can see the total by clicking on the cart icon." Keep this in mind when making your changes!
- Modify Version A and B of the website by editing the `A.html` and `B.html` files found in the `/templates` folder. Your changes do not have to be drastic, but think about how they will affect the above metrics. Feel free to change the items from cacti to something else, but you must keep the ids the same.
- You can open the HTML files directly to check your changes, but since this is a Flask app, page navigation will only work if you run the app locally, using the following commands from your 1300ABTesting folder:
export FLASK_APP=app.py
flask run
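(If you're on Windows, the `export` line won't work as written: use `set FLASK_APP=app.py` in Command Prompt, or `$env:FLASK_APP = "app.py"` in PowerShell, then `flask run` as usual.)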
- Deploy the app on Heroku following our deployment guide. Ignore the cs1300 template - you'll be deploying this git repo.
- Verify that navigating to your URL will open version A 50% of the time, and version B the other 50%. Click around and check if the cart page is working.
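If you'd like a quicker sanity check of the 50/50 split than repeatedly refreshing the page, a small script along these lines can fetch your deployed URL several times and count which version comes back. This is only a sketch: the URL is a placeholder, and it assumes the served HTML contains something unique to each version (the markers below use the element ids `mp1` and `ca1` suggested by the sample logs further down, so check your own templates and adjust them). Run this before your real user sessions so the automated requests don't get mixed in with your data.

```python
# Rough sketch: fetch the deployed app repeatedly and count which version is served.
# ASSUMPTIONS: the markers below appear in the served HTML for each version;
# check your own A.html / B.html and change them if they don't.
import urllib.request
from collections import Counter

URL = "https://your-app-name.herokuapp.com/"  # placeholder: replace with your deployed URL
MARKERS = {"A": 'id="mp1"', "B": 'id="ca1"'}

counts = Counter()
for _ in range(20):
    html = urllib.request.urlopen(URL).read().decode("utf-8", errors="replace")
    for version, marker in MARKERS.items():
        if marker in html:
            counts[version] += 1
            break

print(counts)  # expect a roughly even split between A and B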
- Check that the logs are working properly. From the 1300ABTesting folder, run the following command to extract the logs into a text file, `mylog.txt`:
Mac: `heroku logs --app=<your-app-name> -n 1500 > mylog.txt`
Windows: `heroku logs --app=<your-app-name> -n 1500 | Out-File mylog.txt`
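Replace `<your-app-name>` with the name of your Heroku app. The `-n 1500` flag asks Heroku for its most recent 1,500 log lines (roughly the most its log buffer keeps), so run this after your users have finished their sessions.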
After you hit enter, you'll see a blank line at the bottom of your terminal, which means the command is running. Once it has finished, open the text file it created (`mylog.txt` in the example above): it contains a lot of information we don't need. To get just the lines of data we want for this project, run:
Mac: `grep AB_TEST mylog.txt > myfilteredlog.txt`
Windows: `Get-Content mylog.txt | Select-String -Pattern "AB_TEST" | Out-File myfilteredlog.txt`
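If neither of those commands cooperates on your machine, the same filtering can be done with a few lines of Python. This is just an equivalent of the commands above, not an extra required step:

```python
# Cross-platform alternative to the grep / Select-String commands above:
# copy only the AB_TEST lines from mylog.txt into myfilteredlog.txt.
with open("mylog.txt", encoding="utf-8", errors="replace") as src, \
        open("myfilteredlog.txt", "w", encoding="utf-8") as dst:
    for line in src:
        if "AB_TEST" in line:
            dst.write(line)
```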
Open `myfilteredlog.txt` and verify that you have logs. They should look something like the sample output included below.
- IMPORTANT: Fill out this form with your deployed URL.
- Do not copy and paste code between `A.html` and `B.html`, because this could interfere with logging and ruin your analysis!
- Feel free to change the items from cacti to something else, but you must keep the ids the same.
2017-09-22T04:15:05.318219+00:00 app[web.1]: AB_TESTING: A 1506053696735 1506053705304 mp1 1506053696475
2017-09-22T04:15:07.288130+00:00 app[web.1]: AB_TESTING: A 1506053696735 1506053707279 mp2 1506053696475
2017-09-22T04:15:08.353015+00:00 app[web.1]: AB_TESTING: A 1506053696735 1506053708343 mp2 1506053696475
2017-09-22T04:15:09.444328+00:00 app[web.1]: AB_TESTING: A 1506053696735 1506053709431 mp1 1506053696475
2017-09-22T04:15:14.822574+00:00 app[web.1]: AB_TESTING: B 1506053712624 1506053714784 ca1 1506053712526
2017-09-22T04:15:16.682459+00:00 app[web.1]: AB_TESTING: B 1506053712624 1506053716671 ca2 1506053712526
2017-09-22T04:15:17.769221+00:00 app[web.1]: AB_TESTING: B 1506053712624 1506053717759 ca1 1506053712526
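Once you have real logs, you'll need to turn the raw lines into the three metrics above. The sketch below is one possible starting point, not the required approach. It guesses at the meaning of the fields from the sample output (version, a page-load timestamp in milliseconds, the event timestamp, the id of the element involved, and a session identifier); confirm the actual field order, and which element ids mark task completion or a return to the shopping page, by reading app.py and the templates before trusting any numbers it prints.

```python
# Rough sketch for summarizing myfilteredlog.txt (field meanings are assumptions;
# confirm them against app.py before relying on these numbers).
# Assumed format after "AB_TESTING:":
#   <version> <page_load_ms> <event_ms> <element_id> <session_id>
from collections import defaultdict

sessions = defaultdict(list)  # session_id -> list of (version, page_load_ms, event_ms, element_id)

with open("myfilteredlog.txt", encoding="utf-8", errors="replace") as f:
    for line in f:
        if "AB_TESTING:" not in line:
            continue
        fields = line.split("AB_TESTING:", 1)[1].split()
        version, page_load, event, element, session = (
            fields[0], int(fields[1]), int(fields[2]), fields[3], fields[4]
        )
        sessions[session].append((version, page_load, event, element))

for session, events in sessions.items():
    events.sort(key=lambda e: e[2])                    # order by event timestamp
    version = events[0][0]
    page_load = events[0][1]
    time_to_click = (events[0][2] - page_load) / 1000  # first logged event vs. page load
    session_length = (events[-1][2] - page_load) / 1000
    # Time to Completion and Return Rate need to know which element ids mean
    # "cart total reached" or "back to the shopping page"; fill those in from your templates.
    print(f"{session} ({version}): first event after {time_to_click:.1f}s, "
          f"last event after {session_length:.1f}s, {len(events)} events logged")
```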