
Kobiton is a service to test mobile apps on real devices. QASymphony offers software testing and QA tools.
Special Guest: Josh Lieberman.
Sponsored By: Test & Code Patreon Supporters: Thank you to Patreon Supporters
Today we have an interview with Casey Rosenthal of Netflix.
One of the people making sure Netflix runs smoothly is Casey Rosenthal.
He is the manager for the Traffic, Intuition, and Chaos teams at Netflix.
He's got a great perspective on quality and large systems.
We talk about
Chaos Engineering
Experimentation vs Testing
Testing Strategy
Visualization of large amounts of data representing Steady State
Special Guest: Casey Rosenthal.
Sponsored By: Nerdlettering: Love Python? Show It With Some Python Swag. Custom-made Mugs and Accessories for Pythonistas, by Pythonistas. Promo Code: TESTCODE. Kobiton: Test your Mobile App on Real Devices for Free with Kobiton. Sign up at kobiton.com/testandcode to start testing in minutes.
What is the difference between a unit test, an integration test, and a system test? Mahmoud Hashemi helps me define these terms, as well as discuss the role of all testing variants in software development. TDD testing pyramid vs […]
The post 27: Mahmoud Hashemi : unit, integration, and system testing appeared first on Python Testing.
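The unit/integration/system distinction from the episode can be sketched in a few lines of pytest-style code. The `parse_price` and `apply_tax` functions here are made up for illustration, not from the episode:

```python
# Hypothetical components to illustrate test levels.

def parse_price(text):
    """Turn a string like ' $10.00 ' into a float."""
    return float(text.strip().lstrip("$"))

def apply_tax(price, rate=0.1):
    """Add tax and round to cents."""
    return round(price * (1 + rate), 2)

# Unit tests: each component exercised in isolation.
def test_parse_price():
    assert parse_price(" $10.00 ") == 10.0

def test_apply_tax():
    assert apply_tax(10.0) == 11.0

# Integration test: the components working together.
def test_parse_then_tax():
    assert apply_tax(parse_price("$10.00")) == 11.0
```

A system test would sit one level further out, driving the whole application through its real interface rather than calling functions directly.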
Interview with Sam Van Oort about pyresttest, "A REST testing and API microbenchmarking tool"
pyresttest
A question in the Test & Code Slack channel was raised about testing REST APIs. There were answers such as pytest + requests, of course, but there was also a mention of pyresttest, https://github.com/svanoort/pyresttest, which I hadn't heard of. I checked out the GitHub repo and was struck by how user-friendly the user-facing test definitions were. So I contacted the developer, Sam Van Oort, and asked him to come on the show to tell me about this tool and why he developed it.
Here's the "What is it?" section from the pyresttest README:
A REST testing and API microbenchmarking tool
Tests are defined in basic YAML or JSON config files, no code needed
Minimal dependencies (pycurl, pyyaml, optionally future), making it easy to deploy on-server for smoketests/healthchecks
Supports generate/extract/validate mechanisms to create full test scenarios
Returns exit codes on failure, to slot into automated configuration management/orchestration tools (also supplies parseable logs)
Logic is written and extensible in Python
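To make the extract/validate idea from the feature list concrete, here is an illustrative sketch (not pyresttest's actual code) of validating a JSON-decoded response body against the fields a test expects:

```python
# Toy validator in the spirit of pyresttest's validate step.
# The field names and endpoint are made up for illustration.

def validate_payload(payload, required):
    """Return a list of error strings; an empty list means the payload passes."""
    errors = []
    for field, expected_type in required.items():
        if field not in payload:
            errors.append("missing field: " + field)
        elif not isinstance(payload[field], expected_type):
            errors.append("wrong type for field: " + field)
    return errors

# e.g. a /users/1 response expected to carry an int id and a str name
errors = validate_payload({"id": 1, "name": "sam"}, {"id": int, "name": str})
```

In pyresttest itself this kind of check is declared in the YAML test file rather than written as Python, which is what makes the tool usable without code.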
Support
Special thanks to my wonderful Patreon supporters and those who have supported the show by purchasing Python Testing with unittest, nose, pytest.
Interview with Dave Hunt @davehunt82.
We Cover:
Selenium Driver
pytest
pytest plugins:
pytest-selenium
pytest-html
pytest-variables
tox
Dave Hunt’s “help wanted” list on GitHub
Mozilla
Also:
fixtures
xfail
CI and xfail and html reports
CI and capturing
pytest code sprint
working remotely for Mozilla
pytest is an extremely popular test framework used by many projects and companies.
In this episode, I interview Raphael Pierzina (@hackebrot), a core contributor to both pytest and cookiecutter. We discuss how Raphael got involved with both projects, his involvement in cookiecutter, pytest, "adopt pytest month", the pytest code sprint, and of course some of the cool new features in pytest 3.
Links:
Raphael Pierzina on twitter (@hackebrot)
pytest - http://doc.pytest.org
cookiecutter - https://github.com/audreyr/cookiecutter
cookiecutter-pytest-plugin - https://github.com/pytest-dev/cookiecutter-pytest-plugin
Kent Beck's twitter profile says "Programmer, author, father, husband, goat farmer". But I know him best from his work on extreme programming, test first programming, and test driven development. He's the one. The reason you know about TDD is because of Kent Beck.
I first ran across writings from Kent Beck as I started exploring Extreme Programming in the early 2000s.
Although I don't agree with all of the views he's expressed in his long and prolific career, I respect him as one of the best sources of information about software development, engineering practices, and software testing.
Along with Test First Programming and Test Driven Development, Kent started an automated test framework that turned into JUnit. JUnit and its model of setup and teardown wrapping test functions, as well as base-test-class driven test frameworks, became what we now know as xUnit style frameworks, which includes Python's unittest.
He discussed this history and a lot more on episode 167 of Software Engineering Radio. The episode is titled "The History of JUnit and the Future of Testing with Kent Beck", and is from Sept 26, 2010.
http://www.se-radio.net/2010/09/episode-167-the-history-of-junit-and-the-future-of-testing-with-kent-beck/
I urge you to download it and listen to the whole thing. It's a great interview, still relevant, and applicable to testing in any language, including Python.
What I've done in this podcast is take a handful of clips from the interview (with permission from IEEE and SERadio), and discuss the clips and my opinions a bit.
The lessons are:
Your tests should tell a story.
Be careful of DRY, inheritance, and other software development practices that might get in the way of keeping your tests easy to understand.
All tests should help differentiate good programs from bad programs and not be redundant.
Test at multiple levels and multiple scales where it makes sense.
Differentiating between TDD, BDD, ATDD, etc. isn't as important as testing your software to learn about it. Who cares what you call it?
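The first two lessons, tests telling a story without DRY getting in the way, can be sketched side by side. The `apply_discount` function is a made-up example:

```python
# Hypothetical example of "tests should tell a story".

def apply_discount(price, percent):
    return price * (1 - percent / 100)

# Harder to follow: the expected value is computed with logic that mirrors
# the code under test, so the test proves little and tells no story.
def test_discount_computed():
    price, percent = 200, 25
    assert apply_discount(price, percent) == price * (1 - percent / 100)

# Tells a story: concrete inputs, expected result stated outright.
def test_quarter_discount_on_200_is_150():
    assert apply_discount(200, 25) == 150.0
```

The second test is slightly repetitive, but a reader can verify it at a glance, which is the point of Kent's caution about DRY in tests.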
How do you convert manual tests to automated tests?
This episode looks at the differences between manual and automated tests and presents two strategies for converting manual tests to automated ones.
A listener requested that I start covering some terminology.
I think it's a great idea.
Covered in this episode:
Test Fixtures
Subcutaneous Testing
End to End Testing (System Testing)
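For the test-fixture term, here is a minimal sketch in unittest terms: setUp and tearDown provide the precondition and cleanup around each test. The user list is a made-up stand-in for a real resource such as a database connection:

```python
import unittest

# Minimal fixture sketch: setUp/tearDown wrap every test method.

class TestUserStore(unittest.TestCase):
    def setUp(self):
        # fixture setup: runs before every test method
        self.users = ["alice"]

    def tearDown(self):
        # fixture teardown: runs after every test method
        self.users.clear()

    def test_user_present(self):
        self.assertIn("alice", self.users)
```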
I also discuss:
A book rewrite
Progress on transcripts
A story from the slack channel
I talk with Michael about:
Episodes of his show having to do with testing.
His transition from employee to podcast host and online training entrepreneur.
His Python training courses.
The Pyramid Web framework.
Courses by Michael
Explore Python Jumpstart by Building 10 Apps
Explore Write Pythonic Code Like a Seasoned Developer
Python for Entrepreneurs
Testing related podcast Episodes from Talk Python To Me:
episode 10: Harry Percival, TDD for the Web in Python, and PythonAnywhere
PythonAnywhere
Harry's book, TDD with Python
episode 45: Brian Okken, Pragmatic testing and the Testing Column
Talk Python To Me podcast
episode 63: Austin Bingham, Mutation Testing, Cosmic Ray
Cosmic Ray
episode 67: David MacIver, Hypothesis
Hypothesis
Interview with Robert Collins, current core maintainer of Python's unittest module.
Some of the topics covered
How did Robert become the maintainer of unittest?
unittest2 as a rolling backport of unittest
test and class parametrization with subtest and testscenarios
Which extension to unittest most closely resembles Pytest fixtures?
Comparing Pytest and unittest
Will unittest ever get assert rewriting?
Future changes to unittest
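The subtest-style parametrization mentioned in the topic list can be sketched like this: one test method checks several inputs, and subTest reports each failing input separately instead of stopping at the first failure:

```python
import unittest

# subTest sketch: each n gets its own pass/fail report within one test method.

class TestDoubling(unittest.TestCase):
    def test_doubles_are_even(self):
        for n in (0, 1, 7, 12):
            with self.subTest(n=n):
                self.assertEqual((n * 2) % 2, 0)
```

testscenarios, also discussed in the episode, extends the idea to whole-class parametrization.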
I've been re-studying unittest recently and I mostly wanted to ask Robert a bunch of clarifying questions.
This is an intermediate to advanced discussion of unittest.
Many great features of unittest go by quickly in this talk.
Please let me know if there's something you'd like me to cover in more depth as a blog post or a future episode.
Links
unittest
unittest2
pip
mock
testtools
fixtures
testscenarios
subunit
pypiserver
devpi
testresources
TIP (testing in python) mailing list
In this episode, I interview with Joe Stump, cofounder of Sprintly (https://sprint.ly), to give the startup perspective to development and testing.
Joe has spent his career in startups.
He's also been involved with hiring and talent acquisition for several startups.
We talk about testing, continuous integration, code reviews, deployment, tolerance to defects, and how some of those differ between large companies and small companies and startups.
Then we get into hiring. Specifically, finding and evaluating good engineers, and then getting them to be interested in working for you.
If you ever want to grow your team size, you need to listen to this.
Sponsored By: Rollbar: Full-stack error tracking for all apps in any language.
The Travis Foundation. Interview with Laura Gaetano
Links and things we talked about:
Travis Foundation
Open Source Grants
The Foundation's support of Katrina Owen from exercism.io
Exercism.io
Rails Girls summer of code
Diversity Tickets
Conference support
Speakerinnen
Prompt
This is a small episode.
I'm changing the name from the "Python Test Podcast" to "Test & Code".
I just want to discuss the reasons behind this change, and take a peek at what's coming up in the future for this podcast.
Links
The Waterfall Model and "Managing the Development of Large Software Systems"
Josh Kalderimis from Travis CI
An introduction to Lean Software Development
This is a quick intro to the concepts of Lean Software Development.
I'm starting a journey of trying to figure out how to apply lean principles to software development in the context of 2016/2017.
Links
Lean Software Development book by Mary & Tom Poppendieck
wikipedia entry for Lean Software Development
Patreon supporters of the show
Talk Python to Me Podcast
Python Jumpstart by Building 10 Apps - video course
pytest sprint
pytest.org
pytest/tox indiegogo campaign
Interview with Josh Kalderimis from Travis CI.
Josh is a co-founder and Chief Post-It Officer at Travis CI.
Topics
What is Continuous Integration, CI
What is Travis CI
Some history of the company
travis-ci.org vs travis-ci.com and merging the two
Enterprise and the importance of security
Feature questions
Travis vs Jenkins
Travis notification through Slack
Reporting history of Travis results
Dealing with pytest results status other than pass/fail
Capturing std out and stderr logging from tests
Build artifacts
Tox and Travis
Using Selenium
What does a Chief Post-It Officer do
Differentiation between Travis and other CI options
Using Slack to keep remote teams communicating well
Travis team
Funding open source projects
Travis Foundation
Rails Girls Summer of Code
Open source grants
Mustaches and beards
Shite shirts
New Zealand
What does Team Periwinkle do
Links
Jeff Knupp's Open Sourcing a Python Project the Right Way
Sven's blog post when Travis started
Sven's mustache and Josh's beard
Travis CI for open source
Travis CI for private repositories and enterprise
Slack
Travis Foundation
Rails Girls Summer of Code
Talk Python to Me Podcast
Testing apps that use requests without using mock.
Interview with Ian Cordasco (@sigmavirus24)
Topics:
Betamax - python library for replaying requests interactions for use in testing.
requests
github3.py
Pycon 2015 talk: Ian Cordasco - Cutting Off the Internet: Testing Applications that Use Requests - PyCon 2015
Pytest and using Betamax with pytest fixtures
The utility (or uselessness) of teaching programming with Java (My own rant mainly)
Rackspace and Ian’s role at Rackspace and OpenStack
Python Code Quality Authority: flake8, pep8, mccabe, pylint, astroid, …
Static code analysis and what to use which tool when.
Raymond Hettinger - Beyond PEP 8 -- Best practices for beautiful intelligible code - PyCon 2015
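Betamax records HTTP interactions to "cassettes" and replays them, so tests using requests stay off the network. This toy sketch is not Betamax's actual API; it just illustrates the record/replay idea:

```python
# Toy record/replay cache (not Betamax itself) illustrating the idea of
# replaying HTTP interactions so tests don't hit the network.

class ReplayingClient:
    def __init__(self, real_get):
        self.real_get = real_get   # the real transport, e.g. an HTTP GET
        self.cassette = {}         # url -> recorded response body

    def get(self, url):
        if url not in self.cassette:
            self.cassette[url] = self.real_get(url)  # record on first use
        return self.cassette[url]                    # replay afterwards

network_calls = []

def fake_network_get(url):
    network_calls.append(url)
    return "payload for " + url

client = ReplayingClient(fake_network_get)
first = client.get("https://api.example.com/user")
second = client.get("https://api.example.com/user")  # served from the cassette
```

In real Betamax the cassette is serialized to disk, so replays work across test runs and on CI machines with no network access.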
Links:
Testing Python-Requests with Betamax
Cutting Off the Internet: Testing Applications that Use Requests - PyCon 2015
github3.py
requests
Rackspace
Openstack
Python Code Quality Authority and documentation
GitLab
Raymond Hettinger - Beyond PEP 8 -- Best practices for beautiful intelligible code - PyCon 2015
Other Betamax resources:
Betamaxing Boto3
Using Betamax with pytest fixtures
Isolated @memoize
In this episode I interview Ned Batchelder.
I know that coverage.py is very important to a lot of people to understand how much of their code is being covered by their test suites.
Since I'm far from an expert on coverage, I asked Ned to discuss it on the show.
I'm also quite a fan of Ned's 2014 PyCon talk "Getting Started Testing", so I definitely asked him about that.
We also discuss edX, Python user groups, PyCon talks, and more.
Some of what's covered (pun intended) in this episode:
coverage.py
types of coverage
Line coverage
branch coverage
Behavior coverage
Data coverage
How Ned became the owner of coverage.py
Running tests from coverage.py vs running coverage from test runner.
edX
what is it
what Ned's role is
Ned's blog
Ned's PyCon 2014 talk "Getting Started Testing"
Teaching testing and the difficulty of the classes being part of unittest
fixtures package
some of the difficulties of teaching unittest because of its class-based system.
the history of classes in unittest coming from Java's JUnit implementation
Boston's Python Group
PyCon in Portland
Ned will be doing a talk there: "Machete mode debugging".
Practicing PyCon talks at local group meetings.
At the very least, practice it in front of a live audience.
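One distinction from the topic list above, line coverage vs. branch coverage, can be sketched in a few lines. Calling `clamp(10, 5)` alone executes every line of `clamp`, yet the False branch of the `if` is never taken; full branch coverage also needs the second call:

```python
# Line vs. branch coverage sketch (clamp is a made-up example).

def clamp(value, limit):
    if value > limit:
        value = limit
    return value

results = [clamp(10, 5), clamp(3, 5)]  # True branch, then False branch
```

Running coverage.py with `--branch` reports the missed branch even when plain line coverage shows 100%.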
Links:
Ned Batchelder
Coverage
Coverage documentation
django-nose
pytest-django
edX
open edX
Boston Python User Group
Portland Python User Group - I need to go to these
PyCon 2016 - Planning on attending, it's in Portland. Yay!
Getting Started Testing - Ned's 2014 Pycon talk
How pytest, unittest, and nose deal with assertions.
Telling developers how and why their tests failed is a difficult job for a test framework.
In this episode I talk about assert helper functions and the 3 methods pytest uses to get around having users need to use assert helper functions.
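The contrast can be sketched side by side. unittest reports useful failure detail through assert helper methods; pytest instead rewrites plain assert statements so the operand values still show up on failure:

```python
import unittest

# unittest style: the helper receives both operands, so it can show them
# in the failure message.
class TestWithHelpers(unittest.TestCase):
    def test_sum(self):
        self.assertEqual(sum([1, 2, 3]), 6)

# pytest style: a plain assert, with operand values recovered by pytest's
# assertion rewriting rather than by a helper method.
def test_sum_plain_assert():
    assert sum([1, 2, 3]) == 6
```

Both tests check the same thing; the difference only shows up in how a failure is reported.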
Given-When-Then is borrowed from BDD and is my favorite structure for test case design.
It doesn’t matter if you are using pytest, unittest, nose, or something completely different, this episode will help you write better tests.
The Given-When-Then structure for test method/function development.
How and why to utilize fixtures for your given or precondition code.
Similarities with other structure descriptions:
Setup-Test-Teardown
Setup-Exercise-Verify-Teardown
Arrange-Act-Assert
Preconditions-Trigger-Postconditions.
Benefits
Communicate the purpose of your test more clearly
Focus your thinking while writing the test
Make test writing faster
Make it easier to re-use parts of your test
Highlight the assumptions you are making about the test preconditions
Highlight what outcomes you are expecting and testing against.
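Here is a Given-When-Then shaped test, sketched with a made-up ShoppingCart class:

```python
# Hypothetical class under test for the Given-When-Then sketch.

class ShoppingCart:
    def __init__(self):
        self.items = []

    def add(self, name, price):
        self.items.append((name, price))

    def total(self):
        return sum(price for _, price in self.items)

def test_total_of_two_items():
    # Given: a cart holding two items
    cart = ShoppingCart()
    cart.add("apple", 2)
    cart.add("bread", 3)
    # When: the total is computed
    total = cart.total()
    # Then: it is the sum of the item prices
    assert total == 5
```

When the Given section grows, it is a natural candidate to move into a fixture so each test reads as just its When and Then.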
Links discussed in the show:
Mechanics of pytest, unittest, nose
unittest fixture reference
nose fixture reference
pytest fixtures (series of posts starting here)
pytest style fixtures
pytest parameterized fixtures