As a Python developer, I use a lot of different packages, but there are a few that I always return to. Maybe it's the type of work I do, but these Python packages, although they aren't tied up with string, are a few of my favorite things.
VADER
If you want to take a look at how people are talking about your brand, VaderSentiment is for you. This package performs sentiment analysis on text and tells you whether the text is positive, negative or neutral, and to what degree.
How does it work?
The package scans text to see if any of the words are present in its lexicon (a fancy word for dictionary) and calculates a sentiment score from the matches.
This package is great for social posts because it can pick up on emoticons, WORDS IN ALL CAPS and the context around a word. I use it to analyze tweets for different brands to see how people are talking about them.
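Here's a minimal sketch of how I use it (the tweet text is made up, but SentimentIntensityAnalyzer and polarity_scores are the package's actual API):

from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

# polarity_scores returns neg/neu/pos proportions plus a 'compound'
# score normalized between -1 (most negative) and +1 (most positive)
scores = analyzer.polarity_scores("The new update is AMAZING!! :)")

# The documented convention: compound >= 0.05 is positive,
# <= -0.05 is negative, anything in between is neutral
if scores['compound'] >= 0.05:
    print("positive")
elif scores['compound'] <= -0.05:
    print("negative")
else:
    print("neutral")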
You can find a really great example of how to use VADER on t-redactly.io.
Selenium
This is by far my favorite web scraping tool. Where Beautiful Soup fails, Selenium comes in. Selenium automates web browsers, so aside from scraping, you can also automate web applications for testing purposes.
Getting it set up can be a bit tricky (you have to download a driver for your preferred browser, etc.), but once you have an initial script written out, it's easy to adapt it for almost anything you want to scrape online. Just highlight anything you want on a website, inspect the webpage and copy the XPath from the source code.
Sample code:
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.common.exceptions import NoSuchElementException

# Requires the driver for your browser (here, ChromeDriver) on your PATH
driver = webdriver.Chrome()
driver.get("url")

# Paste the XPath you copied from the browser's inspector
try:
    data = driver.find_element(By.XPATH, 'XPATH HERE')
    print(data.text)
except NoSuchElementException:
    print("Element not found")

driver.quit()
I haven't found a perfect guide to Selenium, but this one does the best job of explaining how to set it up.
Seaborn
There are lots of visualization tools out there you can use for graphs, but nothing beats Seaborn. It's built on top of matplotlib, so all you MPL lovers out there can rest easy.
I personally really love their correlation matrix, but you can find just about any visualization, along with an array of beautiful color themes. If you're not a designer but want your visualizations to look pretty, this is an easy way to do it.
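For example, here's a minimal sketch of the kind of correlation matrix I mean (the DataFrame is made-up data; heatmap and the 'coolwarm' palette are real Seaborn features):

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

# Made-up example data: three columns of random values
rng = np.random.default_rng(42)
df = pd.DataFrame(rng.normal(size=(100, 3)), columns=['clicks', 'visits', 'sales'])

# df.corr() computes pairwise correlations; heatmap draws the matrix
sns.heatmap(df.corr(), annot=True, cmap='coolwarm', vmin=-1, vmax=1)
plt.show()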
More examples and full documentation can be found at seaborn.pydata.org
Requests
I don't know how I would function without Requests. I do so much data collection from APIs that don't have Python clients. Even Google's PageSpeed Insights API doesn't have an official Python client, so I have to rely on Requests a lot of the time. Every API is different, but you can build a really simple way to call one without using your terminal.
It's so simple: just specify the URL you want to use, a key if you need one, and the basic parameters to make the call.
import requests

API_URL = 'https://www.googleapis.com/pagespeedonline/v2/runPagespeed'
API_KEY = 'Your API Key'
URL_TO_GET = 'url'

params = {
    'url': URL_TO_GET,
    'strategy': 'mobile',
    'key': API_KEY,
}

# requests encodes the params into the query string for you
# (renamed from 're' so it doesn't shadow Python's built-in re module)
response = requests.get(API_URL, params=params)
Some APIs aren't the easiest to figure out. They have parameters they don't tell you about, or the URL isn't very clear, or their instructions just plain suck. I found a converter that takes cURL and converts it to a Python request, and it has been life changing. If you're interested, take a look here. All you have to do is copy the cURL command and paste the converted code into your script.
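To give you a feel for what that looks like, a cURL command like this (the endpoint and token here are made up for illustration):

curl -H 'Authorization: Bearer MY_TOKEN' 'https://api.example.com/v1/items?limit=10'

comes out roughly as:

import requests

# Same header and query parameter as the cURL command above
headers = {'Authorization': 'Bearer MY_TOKEN'}
params = {'limit': '10'}

response = requests.get('https://api.example.com/v1/items', headers=headers, params=params)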
So there you have it, just a few of my favorite things! What are your favorite packages? Let me know so I can check them out!