 Episodes
Will McGugan has brought a lot of color to Python CLIs with Rich. Then Textual started rethinking full command line applications, including layout with CSS. And now Textualize, a new startup, is bringing CLI apps to the web. Special Guest: Will McGugan.
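For a flavor of the terminal styling Rich makes possible, here's a minimal sketch of my own (illustrative rows, not from the episode) using Rich's console and table API:

```python
# A tiny Rich demo (illustrative only): styled text and a simple table.
from rich.console import Console
from rich.table import Table

console = Console()
console.print("Hello, [bold magenta]Textual[/bold magenta]!", ":sparkles:")

table = Table(title="CLI apps")  # illustrative content only
table.add_column("Project")
table.add_column("What it does")
table.add_row("Rich", "color and formatting in the terminal")
table.add_row("Textual", "full TUI applications, styled with CSS")
console.print(table)
```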
When you are teaching someone web development skills, when is the right time to start teaching code quality and testing practices? Karl Stolley believes it's never too early. Let's hear how he incorporates code quality in his courses. Our discussion includes:
- starting people off with good dev practices and tools
- linting
- HTML and CSS validation
- visual regression testing
- using local dev servers, including HTTPS
- incorporating testing with git hooks (a minimal hook sketch follows below)
- testing to aid in CSS optimization and refactoring
- Backstop
- Nightwatch
- BrowserStack
- the three-legged stool of learning and progressing as a developer: testing, version control, and documentation

Karl is also writing a book on WebRTC, so we jump into that a bit too. Special Guest: Karl Stolley.
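Here is my own hedged sketch of the "testing with git hooks" idea, not Karl's course setup: a pre-commit hook is just an executable script in .git/hooks, so a small Python script can run your checks and block the commit when they fail.

```python
#!/usr/bin/env python3
# Hypothetical .git/hooks/pre-commit (make it executable with chmod +x):
# run the test suite (or a linter/validator) before every commit.
import subprocess
import sys

result = subprocess.run(["pytest", "-q"])  # swap in your linter or validator here
sys.exit(result.returncode)  # a non-zero exit code aborts the commit
```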
Being productive is obviously a good thing. Can we measure it? Should we measure it? There have been failed attempts in the past, like counting lines of code. Currently, there are new tools to measure productivity, like using git metrics. Nick Hodges joins the show to discuss the good and the bad of developer and team productivity, including how we can improve productivity. Special Guest: Nick Hodges.
Django has a handful of console commands to help manage and develop sites. django-rich (https://pypi.org/project/django-rich/) adds color and nice formatting. Super cool. In a recent release, django-rich also adds nice colorized tracebacks to the Django test runner. Special Guests: Adam Johnson and David Smith.
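For context, a minimal sketch of how django-rich hooks into Django's test runner, based on its project page; double-check the exact runner path against the django-rich docs:

```python
# settings.py (sketch): point Django's test runner at django-rich's runner
# so `manage.py test` output and tracebacks get Rich formatting.
TEST_RUNNER = "django_rich.test.RichRunner"
```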
Twisted was supporting asynchronous, event-driven applications long before asyncio existed. Twisted, and Glyph, have also been encouraging automated tests for a very long time. Twisted uses a testing technique that should be usable by other applications, even those using asyncio or other event-driven architectures. Full Transcript (https://pythontest.com/testandcode/184-twisted-testing-event-driven-async-apps/) Special Guest: Glyph.
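One flavor of that event-driven testing style, sketched by me and not necessarily the exact technique discussed in the episode, is driving time by hand with Twisted's Clock instead of a real reactor:

```python
# Sketch: test time-based, event-driven code deterministically with a fake reactor clock.
from twisted.internet.task import Clock

def schedule_ping(reactor, results):
    # Schedule a callback 5 seconds in the future on whatever "reactor" we're given.
    reactor.callLater(5, results.append, "ping")

def test_ping_fires_after_five_seconds():
    clock = Clock()          # fake reactor: time only moves when we say so
    results = []
    schedule_ping(clock, results)
    assert results == []     # nothing has happened yet
    clock.advance(5)         # jump time forward synchronously
    assert results == ["ping"]
```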
Ryan Cheley joins me today to talk about some challenges of managing software teams, and how to handle them. We end up talking about a lot of skills that are excellent for software engineers as well as managers. Some topics discussed:
- handling code reviews
- asking good questions
- being honest about what you can't do with current resources and data
- discussing tradeoffs and offering solutions that can be completed faster than the ideal solution
- balancing engineering and managing
- making sure documentation happens
- remote teams
- encouraging collaboration
- encouraging non-work-related conversations
- watching out for overworking

Full Transcript (https://pythontest.com/testandcode/183-managing-software-teams/) Special Guest: Ryan Cheley.
Don't you just love technical interviews, with someone who just saw your resume or CV 5 minutes ago asking you to write some code on a whiteboard? Probably code that has nothing to do with anything you've done before or anything you will do at the company. No? Neither does Nathan Aschbacher. So when he started building the team at his company, he decided to do things differently. Hiring is one of the essential processes for building a great team. However, it's a high noise, low signal process. Nathan Aschbacher has a relatively unorthodox tech hiring approach. He's trying to make it very humane, with a better signal to noise ratio. Nathan is not interested in bizarre interview processes where the interviewer doesn't know anything about the interviewee beforehand, all people are asked the same questions, and people are asked to code on whiteboards. Instead, he states: "If the goal is to figure out whether the person can do the work with your team, and you're trying to build the team that you are adding this person to, they need to know what the team is like, and determine if they want to be part of the team, and the team needs to know what the person is like and if they would be additive to the team." So what's Nathan's process?
- Screening resumes and CVs, looking for internal motivation to become an expert at something.
- A basic phone screen, very informal.
- A couple of 2-3 hour pairings with someone on the team, working on whatever they are working on.
- Debriefing both the candidate and the team afterwards.
- Giving the candidate an opportunity for a second impression and following up on difficulties during the pairings.

We discuss the process, and also:
- trying to remove the barriers to team integration
- treating people as humans

And of course, there's the story of how Nathan ended up interviewing someone with zoo experience and no technical experience for a technical role. Of course, it was a misunderstanding of a job requirement around experience with ZooKeeper. But it's a good story. Full Transcript (https://pythontest.com/testandcode/182-unorthodox-tech-interview/) Special Guest: Nathan Aschbacher.
We talk with Adam Johnson about his new book, "Boost Your Django DX". Developer experience includes tools and practices to make developers more effective and efficient, and just plain make software development more fun and satisfying. One of the things I love about this book is that it's not just for Django devs. I'd guess that about half the book is about topics that all Python developers would find useful, from virtual environments to linters to testing. But of course, also tons of tips and tools for working with Django. Full Transcript (https://pythontest.com/testandcode/181-boost-your-django-dx/) Special Guest: Adam Johnson.
180: Lean TDD

2022-02-21 (26:05)

Lean TDD is an attempt to reconcile some conflicting aspects of Test Driven Development and Lean Software Development. I've mentioned Lean TDD on the podcast a few times and even tried to do a quick outline at the end of episode 162 (https://testandcode.com/162). This episode is a more complete outline, or at least a first draft. If you feel you've got a good understanding of TDD, and it's working well for you, that's great. Keep doing what you're doing. There are no problems. For me, the normal way TDD is taught just doesn't work. So I'm trying to come up with a spin on some old ideas to make it work for me. I'm hoping it works for you as well. I'm calling the new thing Lean TDD. It's inspired by decades of experience writing software and influence from dozens of sources, including Pragmatic Programmer, Lean Software Development, Test-Driven Development by Example, and many blog posts and wiki articles. The main highlights, however, come from the collision of ideas between Lean and TDD and how I've tried to resolve the seemingly opposing processes. Full Transcript (https://pythontest.com/testandcode/180-lean-tdd/)
179: Exploratory Testing

2022-02-09 (11:39)

Exploratory testing is absolutely an essential part of a testing strategy. This episode discusses what exploratory testing is, its benefits, and how it fits within a framework of relying on automated tests for most of our testing. Full Transcript (https://pythontest.com/testandcode/179-exploratory-testing/)
"There are five practical reasons that we write tests. Whether we realize it or not, our personal testing philosophy is based on how we judge the relative importance of these reasons." - Sarah Mei This episode discusses the factors. Sarah's order: Verify the code is working correctly Prevent future regressions Document the code’s behavior Provide design guidance Support refactoring Brian's order: Verify the code is working correctly Prevent future regressions Support refactoring Provide design guidance Document the code’s behavior The episode includes reasons why I've re-ordered them. Full Transcript (https://pythontest.com/testandcode/178-factors-automated-software-testing/)
A recent Twitter thread by Simon Willison reminded me that I've been meaning to do an episode on the testing trophy. This discussion is about the distinction between unit and integration tests, what those terms mean, and where we should spend our testing time. Full Transcript (https://pythontest.com/testandcode/177-unit-test-integration-test-testing-trophy/)
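To make the unit vs. integration distinction concrete, here is a small pytest-flavored sketch of my own (not from the episode): the first test exercises one function in isolation, the second exercises several pieces working together through a real boundary (the filesystem).

```python
# Sketch: the same feature tested at two levels.
import json
from pathlib import Path

def render_greeting(name: str) -> str:
    return f"Hello, {name}!"

def save_greeting(name: str, path: Path) -> None:
    path.write_text(json.dumps({"greeting": render_greeting(name)}))

def test_render_greeting_unit():
    # Unit test: one function, no I/O, no collaborators.
    assert render_greeting("Simon") == "Hello, Simon!"

def test_save_greeting_integration(tmp_path):
    # Integration test: function + JSON serialization + real file I/O,
    # using pytest's tmp_path fixture for a throwaway directory.
    out = tmp_path / "greeting.json"
    save_greeting("Simon", out)
    assert json.loads(out.read_text()) == {"greeting": "Hello, Simon!"}
```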
The idea of having a software as a service product sounds great, doesn't it? Solve a problem with software. Have a nice looking landing page and website. Get paying customers. Eventually have it make enough revenue so you can turn it into your primary source of income. There's a lot of software talent out there. We could solve lots of problems. But going from idea to product to first customer is non-trivial. Especially as a side hustle. This episode discusses some of the hurdles from idea to first customer. Brandon Braner is building Released.sh. It's a cool idea, but it's not done yet. Brandon and I talk about building side projects:
- finding a target audience
- limiting scope to something doable by one person
- building a great looking landing page
- finding time to work on things
- prioritizing and planning
- learning while building
- even utilizing third party services to allow you to launch faster
- and last, but not least, having fun

Full Transcript (https://pythontest.com/testandcode/176-saas-side-projects/) Special Guest: Brandon Braner.
175: Who Should Do QA?

2022-01-12 (13:06)

Who should do QA? How does that change with different projects and teams? What does "doing QA" mean, anyway? Answering these questions is the goal of this episode. Full Transcript (https://pythontest.com/testandcode/175-who-should-do-qa/)
In this episode, I talk with Paul Ganssle about a fun workflow that he calls pseudo-TDD. Pseudo-TDD is a way to keep your commit history clean and your tests passing with each commit. This workflow includes using pytest xfail and some semi-advanced version control features. Some strict forms of TDD include something like this:
- write a failing test that demonstrates a lacking feature or defect
- write the source code to get the test to pass
- refactor if necessary
- repeat

In reality, at least for me, the software development process is way messier than this, and not so smooth and linear. Paul's workflow allows you to develop non-linearly, but commit cleanly. Full Transcript (https://pythontest.com/testandcode/174-pseudo-tdd/) Special Guest: Paul Ganssle.
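For readers who haven't used it, here is a minimal sketch of pytest's xfail marker, the building block this workflow leans on (my example, not Paul's code; slugify is a made-up function):

```python
# Sketch: mark a test for a not-yet-implemented feature as "expected to fail".
# The suite stays green (the test reports XFAIL), and with strict=True the
# test turns into a hard failure once the feature starts passing,
# reminding you to remove the marker.
import pytest

def slugify(title: str) -> str:
    # hypothetical function; lowercasing works, hyphenation isn't written yet
    return title.lower()

@pytest.mark.xfail(reason="hyphenation not implemented yet", strict=True)
def test_slugify_replaces_spaces():
    assert slugify("Lean TDD") == "lean-tdd"
```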
173: Why NOT unittest?

2021-12-17 (23:30)

In the preface of "Python Testing with pytest" I list some reasons to use pytest, under a section called "why pytest?". Someone recently asked me a different but related question: "Why NOT unittest?" unittest is an xUnit-style framework. For me, xUnit-style frameworks are fatally flawed for software testing. That's what this episode is about: my opinion of "Why NOT unittest?" or, more broadly, "What are the fatal flaws of xUnit?" Full Transcript (https://pythontest.com/testandcode/173-why-not-unittest/)
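To illustrate the style difference being argued about, here is a hedged side-by-side sketch of mine (not the episode's argument): the same check written xUnit-style with unittest and as a plain pytest test.

```python
# The same assertion, xUnit-style vs. pytest-style.
import unittest

class TestSumUnittest(unittest.TestCase):
    # xUnit style: a class, inherited assert methods, setUp/tearDown for fixtures.
    def test_sum(self):
        self.assertEqual(sum([1, 2, 3]), 6)

# pytest style: a plain function and a plain assert; fixtures are injected
# as arguments instead of living in the class hierarchy.
def test_sum():
    assert sum([1, 2, 3]) == 6
```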
A prototype is a preliminary model of something, from which other forms are developed or copied. In software, we think of prototypes as early things, or a proof of concept. We don't often think of prototyping during daily software development or maintenance. I think we should. This episode is about growing better designed software with the help of a prototype mindset. Full Transcript (https://pythontest.com/testandcode/172-designing-better-software-prototype-mindset/)
Paul Ganssle is a software developer at Google, a core Python dev, and an open source maintainer for many projects. He has some thoughts about pytest's xfail. He was an early skeptic of using xfail, and is now a proponent of the feature. In this episode, we talk about some open source workflows that are possible because of xfail. Full Transcript (https://pythontest.com/testandcode/171-use-pytest-xfail/) Special Guest: Paul Ganssle.
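Related to the earlier xfail sketch, projects often enforce the strict behavior suite-wide; a minimal configuration sketch (the file layout is assumed, but xfail_strict is pytest's documented ini option):

```ini
# pytest.ini (sketch): treat any unexpectedly passing xfail test as a failure,
# so stale xfail markers get noticed and cleaned up.
[pytest]
xfail_strict = true
```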
Prayson Daniel, a principal data scientist, discusses testing machine learning pipelines with pytest. Prayson is using pytest for some pretty cool stuff, including:
- unit tests, of course
- testing pipeline stages
- counterfactual testing
- performance testing

All with pytest. So cool. Full Transcript (https://pythontest.com/testandcode/170-pytest-data-science-machine-learning/) Special Guest: Prayson Daniel.
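As a flavor of what "testing pipeline stages" can look like, here is my own minimal sketch with a made-up preprocessing stage, not Prayson's pipeline:

```python
# Sketch: a single preprocessing stage tested in isolation with pytest.
import math
import pytest

def normalize(values: list[float]) -> list[float]:
    # Hypothetical pipeline stage: scale values into the range [0, 1].
    lo, hi = min(values), max(values)
    span = hi - lo or 1.0  # avoid division by zero for constant input
    return [(v - lo) / span for v in values]

@pytest.mark.parametrize("raw", [[1.0, 2.0, 3.0], [10.0, 10.0], [-5.0, 0.0, 5.0]])
def test_normalize_stays_in_unit_range(raw):
    out = normalize(raw)
    assert all(0.0 <= v <= 1.0 for v in out)
    # the maximum should hit 1.0 unless the input was constant
    assert math.isclose(max(out), 1.0) or len(set(raw)) == 1
```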
Performance monitoring and error detection are just as important with services and microservices as with any system, but with added complexity. Omri Sass joins the show to explain telemetry and monitoring of services and of systems with services. Full Transcript (https://pythontest.com/testandcode/169-service-microservice-performance-monitoring/) Special Guest: Omri Sass.
Comments (7)

Antonio Andrade (Jun 19th): a bit hard to listen 😅

Max Ong Zong Bao (Jan 3rd): I would be interested to invite you as keynote for PyCon Singapore in June and I would love to know more on your PyTest online courses when it releases.

Александр Михеев (Dec 9th): Such a great episode! I've even listened to it twice

Eduardo Costa (Mar 15th): I enjoyed this episode. Hope more episodes on this subject.

Leora Juster (Jan 12th): react tables

GreatBahram (Dec 16th): another great episode

Antonio Andrade (Dec 9th): Thanks for sharing these good tips